Seo, Hogyu David; Lee, Daeyoup
2018-05-15
Random mutagenesis of a target gene is commonly used to identify mutations that yield the desired phenotype. Of the methods that may be used to achieve random mutagenesis, error-prone PCR is a convenient and efficient strategy for generating a diverse pool of mutants (i.e., a mutant library). Error-prone PCR is the method of choice when a researcher seeks to mutate a pre-defined region, such as the coding region of a gene, while leaving other genomic regions unaffected. After the mutant library is amplified by error-prone PCR, it must be cloned into a suitable plasmid. The size of the library generated by error-prone PCR is constrained by the efficiency of the cloning step. However, in the fission yeast Schizosaccharomyces pombe, the cloning step can be replaced by a highly efficient one-step fusion PCR to generate constructs for transformation. Mutants of desired phenotypes may then be selected using appropriate reporters. Here, we describe this strategy in detail, taking as an example a reporter inserted at centromeric heterochromatin.
Creel, Scott; Spong, Goran; Sands, Jennifer L; Rotella, Jay; Zeigle, Janet; Joe, Lawrence; Murphy, Kerry M; Smith, Douglas
2003-07-01
Determining population sizes can be difficult, but is essential for conservation. By counting distinct microsatellite genotypes, DNA from noninvasive samples (hair, faeces) allows estimation of population size. Problems arise because genotypes from noninvasive samples are error-prone, but genotyping errors can be reduced by multiple polymerase chain reaction (PCR). For faecal genotypes from wolves in Yellowstone National Park, error rates varied substantially among samples, often above the 'worst-case threshold' suggested by simulation. Consequently, a substantial proportion of multilocus genotypes held one or more errors, despite multiple PCR. These genotyping errors created several genotypes per individual and caused overestimation (up to 5.5-fold) of population size. We propose a 'matching approach' to eliminate this overestimation bias.
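The abstract does not spell out the proposed 'matching approach', but the underlying idea — collapse near-identical multilocus genotypes that plausibly arose from genotyping error before counting individuals — can be sketched as follows (the data layout, one-allele threshold, and greedy clustering are illustrative assumptions, not the authors' estimator):

```python
def allele_mismatches(g1, g2):
    """Count allele differences between two multilocus genotypes.

    Each genotype is a tuple of (allele_a, allele_b) pairs, one pair per
    microsatellite locus; alleles within a locus are compared after sorting.
    """
    diffs = 0
    for locus1, locus2 in zip(g1, g2):
        for a, b in zip(sorted(locus1), sorted(locus2)):
            diffs += a != b
    return diffs

def count_individuals(genotypes, max_mismatch=1):
    """Greedy 'matching' pass: genotypes differing at no more than
    `max_mismatch` alleles are merged into one individual, on the
    assumption that such near-matches reflect genotyping error rather
    than distinct animals."""
    clusters = []
    for g in genotypes:
        for cluster in clusters:
            if allele_mismatches(g, cluster[0]) <= max_mismatch:
                cluster.append(g)
                break
        else:
            clusters.append([g])
    return len(clusters)

# Three faecal samples typed at two loci; the second differs from the
# first by a single allele (a likely PCR error) and is merged with it.
samples = [((120, 124), (88, 90)),
           ((120, 124), (88, 92)),
           ((118, 126), (90, 94))]
print(count_individuals(samples))  # -> 2, not 3
```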
A false positive food chain error associated with a generic predator gut content ELISA
USDA-ARS's Scientific Manuscript database
Conventional prey-specific gut content ELISA and PCR assays are useful for identifying predators of insect pests in nature. However, these assays are prone to yielding certain types of food chain errors. For instance, it is possible that prey remains can pass through the food chain as the result of ...
Random mutagenesis of BoNT/E Hc nanobody to construct a secondary phage-display library.
Shahi, B; Mousavi Gargari, S L; Rasooli, I; Rajabi Bazl, M; Hoseinpoor, R
2014-08-01
To construct a secondary mutant phage-display library of a recombinant single variable domain (VHH) against botulinum neurotoxin E by error-prone PCR. The gene coding for a specific VHH, derived from a camel immunized with the binding domain of botulinum neurotoxin E (BoNT/E), was amplified by error-prone PCR. Several biopanning rounds were used to screen the phage-displayed BoNT/E Hc nanobodies. The final nanobody, SHMR4, recognized BoNT/E toxin with increased affinity and showed no cross-reactivity with other antigens, in particular the related BoNT toxins. The constructed nanobody could be a suitable candidate for VHH-based biosensor production to detect Clostridium botulinum type E. Diagnosis and treatment of botulinum neurotoxin intoxication are important, and generating high-affinity antibodies from secondary libraries in an affinity-maturation step supports the development of reagents for precise diagnosis and therapy. © 2014 The Society for Applied Microbiology.
Digital Droplet PCR: CNV Analysis and Other Applications.
Mazaika, Erica; Homsy, Jason
2014-07-14
Digital droplet PCR (ddPCR) is an assay that combines state-of-the-art microfluidics technology with TaqMan-based PCR to achieve precise target DNA quantification at high levels of sensitivity and specificity. Because quantification is achieved without the need for standard curves, in an easy-to-interpret, unambiguous digital readout, ddPCR is far simpler, faster, and less error-prone than real-time qPCR. The basic protocol can be modified with minor adjustments to suit a wide range of applications, such as CNV analysis, rare variant detection, SNP genotyping, and transcript quantification. This unit describes the ddPCR workflow in detail for the Bio-Rad QX100 system, but the theory and data interpretation are generalizable to any ddPCR system. Copyright © 2014 John Wiley & Sons, Inc.
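As a rough illustration of the Poisson partitioning statistics behind the 'digital' readout (a sketch only: the QX100 software applies its own processing, and the ~0.85 nL droplet volume is an assumed nominal value):

```python
import math

def ddpcr_concentration(positive, total, droplet_nl=0.85):
    """Estimate target copies per microliter from droplet counts.

    Target molecules partition into droplets at random, so the fraction
    of negative droplets follows the Poisson zero term:
        P(negative) = exp(-lambda)  =>  lambda = -ln(negatives / total)
    where lambda is the mean number of copies per droplet.
    """
    negatives = total - positive
    if negatives == 0:
        raise ValueError("all droplets positive; sample too concentrated")
    lam = -math.log(negatives / total)   # mean copies per droplet
    return lam / (droplet_nl * 1e-3)     # copies per microliter of reaction

# e.g. 12,000 positive droplets out of 18,000 accepted:
print(ddpcr_concentration(12000, 18000))  # ~1292 copies/µL
```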
Computationally mapping sequence space to understand evolutionary protein engineering.
Armstrong, Kathryn A; Tidor, Bruce
2008-01-01
Evolutionary protein engineering has been dramatically successful, producing a wide variety of new proteins with altered stability, binding affinity, and enzymatic activity. However, the success of such procedures is often unreliable, and the impact of the choice of protein, engineering goal, and evolutionary procedure is not well understood. We have created a framework for understanding aspects of the protein engineering process by computationally mapping regions of feasible sequence space for three small proteins using structure-based design protocols. We then tested the ability of different evolutionary search strategies to explore these sequence spaces. The results point to a non-intuitive relationship between the error-prone PCR mutation rate and the number of rounds of replication. The evolutionary relationships among feasible sequences reveal hub-like sequences that serve as particularly fruitful starting sequences for evolutionary search. Moreover, genetic recombination procedures were examined, and tradeoffs relating sequence diversity and search efficiency were identified. This framework allows us to consider the impact of protein structure on the allowed sequence space and therefore on the challenges that each protein presents to error-prone PCR and genetic recombination procedures.
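The non-intuitive tradeoff between error-prone PCR mutation rate and number of rounds can be made concrete with a toy Poisson model of per-clone mutation load (a simplifying sketch that assumes mutations accumulate independently at a fixed rate per round; it is not the authors' structure-based framework):

```python
import numpy as np

def mutation_load(rate_per_kb, gene_kb, rounds, n_clones=100_000, seed=0):
    """Distribution of mutation counts per clone after `rounds` of
    error-prone PCR at `rate_per_kb` mutations per kb per round,
    assuming independent (Poisson) mutation arrivals."""
    rng = np.random.default_rng(seed)
    lam = rate_per_kb * gene_kb * rounds   # expected mutations per clone
    counts = rng.poisson(lam, n_clones)
    return np.bincount(counts) / n_clones  # fraction of clones per count

# Fraction of clones left unmutated after 3 rounds at 2 mutations/kb on a
# 0.9 kb gene: mutation_load(2, 0.9, 3)[0] -> about exp(-5.4) ≈ 0.0045;
# high rates or many rounds quickly bury the starting sequence.
```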
Random mutagenesis by error-prone pol plasmid replication in Escherichia coli.
Alexander, David L; Lilly, Joshua; Hernandez, Jaime; Romsdahl, Jillian; Troll, Christopher J; Camps, Manel
2014-01-01
Directed evolution is an approach that mimics natural evolution in the laboratory with the goal of modifying existing enzymatic activities or of generating new ones. The identification of mutants with desired properties involves the generation of genetic diversity coupled with a functional selection or screen. Genetic diversity can be generated using PCR or using in vivo methods such as chemical mutagenesis or error-prone replication of the desired sequence in a mutator strain. In vivo mutagenesis methods facilitate iterative selection because they do not require cloning, but generally produce a low mutation density with mutations not restricted to specific genes or areas within a gene. For this reason, this approach is typically used to generate new biochemical properties when large numbers of mutants can be screened or selected. Here we describe protocols for an advanced in vivo mutagenesis method that is based on error-prone replication of a ColE1 plasmid bearing the gene of interest. Compared to other in vivo mutagenesis methods, this plasmid-targeted approach allows increased mutation loads and facilitates iterative selection approaches. We also describe the mutation spectrum for this mutagenesis methodology in detail, and, using cycle 3 GFP as a target for mutagenesis, we illustrate the phenotypic diversity that can be generated using our method. In sum, error-prone Pol I replication is a mutagenesis method that is ideally suited for the evolution of new biochemical activities when a functional selection is available.
Statistical approaches to account for false-positive errors in environmental DNA samples.
Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid
2016-05-01
Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. © 2015 John Wiley & Sons Ltd.
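For readers wanting a starting point, here is a minimal sketch of the basic two-state occupancy model with false positives that this literature builds on (toy data and parameterization are mine; as the abstract notes, such models need constraints or ancillary data to be identifiable):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import binom

def negloglik(params, y, K):
    """Two-state site-occupancy model with false positives: a site is
    occupied with probability psi; each of K PCR replicates detects DNA
    with probability p11 at occupied sites and yields a false positive
    with probability p10 at unoccupied sites."""
    psi, p11, p10 = expit(params)  # parameters estimated on the logit scale
    lik = psi * binom.pmf(y, K, p11) + (1 - psi) * binom.pmf(y, K, p10)
    return -np.sum(np.log(lik))

# Toy data: detections per site out of K = 6 PCR replicates.
y = np.array([6, 5, 0, 1, 0, 6, 4, 0, 0, 1])
fit = minimize(negloglik, x0=np.array([0.0, 1.0, -3.0]), args=(y, 6))
psi_hat, p11_hat, p10_hat = expit(fit.x)
# NB: without a constraint such as p11 > p10 (or the ancillary detection
# data the authors advocate), the two detection states are label-swappable;
# the asymmetric starting values only nudge the optimizer toward the
# intended labelling.
```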
Mao, Chanjuan; Xie, Hongjie; Chen, Shiguo; Valverde, Bernal E; Qiang, Sheng
2017-09-01
Liriope spicata (Thunb.) Lour has a unique LsEPSPS structure contributing to the highest-ever-recognized natural glyphosate tolerance. The transformed LsEPSPS confers increased glyphosate resistance to E. coli and A. thaliana. However, the increased glyphosate-resistance level is not high enough to be of commercial value. Therefore, LsEPSPS was subjected to error-prone PCR to screen for mutant EPSPS genes capable of endowing higher resistance levels. A mutant designated ELs-EPSPS, carrying five mutated amino acids (37Val, 67Asn, 277Ser, 351Gly and 422Gly), was selected for its ability to confer improved resistance to glyphosate. Expression of ELs-EPSPS in recombinant E. coli BL21 (DE3) strains enhanced resistance to glyphosate in comparison to both the LsEPSPS-transformed and untransformed controls. Furthermore, transgenic ELs-EPSPS A. thaliana was about 5.4-fold and 2-fold more resistant to glyphosate than the wild-type and the Ls-EPSPS-transgenic plants, respectively. Therefore, the mutated ELs-EPSPS gene has potential value for the development of glyphosate-resistant crops. Copyright © 2016 Elsevier Inc. All rights reserved.
A multiplexed droplet digital PCR assay performs better than qPCR on inhibition prone samples.
Sedlak, Ruth Hall; Kuypers, Jane; Jerome, Keith R
2014-12-01
We demonstrate the development of a multiplex droplet digital PCR assay for human cytomegalovirus (CMV), human adenovirus species F, and an internal plasmid control that may be useful for PCR inhibition-prone clinical samples. This assay performs better on inhibition-prone stool samples than a quantitative PCR assay for CMV and is the first published clinical virology droplet digital PCR assay to incorporate an internal control. Copyright © 2014 Elsevier Inc. All rights reserved.
Huang, Suzhen; Xue, Tingli; Wang, Zhiquan; Ma, Yuanyuan; He, Xueting; Hong, Jiefang; Zou, Shaolan; Song, Hao; Zhang, Minhua
2018-04-01
A furfural-tolerant strain is essential for the fermentative production of biofuels or chemicals from lignocellulosic biomass. In this study, Zymomonas mobilis CP4 was for the first time subjected to error-prone PCR-based whole-genome shuffling, yielding the mutants F211 and F27, which could tolerate 3 g/L furfural. Under various furfural stress conditions, the mutant F211 grew rapidly once the furfural concentration dropped to 1 g/L. The two mutants also showed higher tolerance than the control strain CP4 to high concentrations of glucose. Genome resequencing revealed that F211 and F27 carried 12 and 13 single-nucleotide polymorphisms, respectively. Activity assays demonstrated that NADH-dependent furfural reductase activity increased under furfural stress in both mutant F211 and CP4, but peaked earlier in the mutant than in the control. The furfural level in F211 cultures also decreased more rapidly. These results indicate that the increased furfural tolerance of the mutants may result from enhanced NADH-dependent furfural reductase activity during early log phase, which could accelerate furfural detoxification. In all, we obtained Z. mobilis mutants with enhanced tolerance of furfural and of high glucose concentrations, and provide valuable clues to the mechanism of furfural tolerance and for strain development.
Ramos-Alemán, Fabiola; González-Jasso, Eva; Pless, Reynaldo C
2018-02-15
Several alkali chlorides were compared for their use in reverse transcription (RT) and PCR of different types of nucleic acid templates. On a test region of biological DNA incapable of forming G quadruplex (G4) structures, Taq DNA polymerase showed similar PCR performance with 50 mM KCl, CsCl, LiCl, and NaCl. In contrast, on a synthetic model polydeoxyribonucleotide prone to G4 formation, good PCR amplification was obtained with 50 mM CsCl, but little or none with LiCl or KCl. Similarly, in RT of a G4-prone model polyribonucleotide, MMLV reverse transcriptase produced a good yield with 50 mM CsCl, mediocre yields with LiCl or without added alkali chloride, and a poor yield with 50 mM KCl. The full RT-PCR assay starting from the G4-prone polyribonucleotide showed good results with CsCl in both stages, poor results with LiCl, and no product formation with KCl. The model polynucleotides showed fast G quadruplex formation under PCR or RT conditions with 50 mM KCl, but not with CsCl or LiCl. The results argue for the use of CsCl instead of KCl for RT and PCR of G4-prone sequences. No advantage was observed when using the 7-deaza-type nucleotide analog c7dGTP in PCR amplification of the G4-prone polydeoxyribonucleotide. Copyright © 2017 Elsevier Inc. All rights reserved.
Saracevic, Andrea; Simundic, Ana-Maria; Celap, Ivana; Luzanic, Valentina
2013-07-01
Rigat and colleagues were the first to develop a rapid PCR-based assay for identifying the angiotensin-converting enzyme insertion/deletion (I/D) polymorphism. Because of the large length difference between the insertion (I) and deletion (D) alleles, the PCR method is prone to mistyping: preferential amplification of the D allele can make I/D heterozygotes appear as D/D homozygotes. The aim of this study was to investigate whether this preferential amplification can be suppressed by amplifying a longer DNA fragment in a so-called Long PCR protocol. We also aimed to compare the results of genotyping using five different PCR protocols and to estimate the mistyping rate. The study included 200 samples, which were genotyped using the standard method of our laboratory, a stepdown PCR, a PCR protocol including 4% DMSO, a PCR using insertion-specific primers, and the new Long PCR method. The results show that accurate ACE I/D polymorphism genotyping can be accomplished with both the standard and the Long PCR methods; indeed, in our hands, accurate genotyping was accomplished regardless of the method used. Therefore, if the standard method is optimized carefully, accurate results can be obtained with this simple, inexpensive and rapid PCR protocol.
Structure-Function Analysis of Chloroplast Proteins via Random Mutagenesis Using Error-Prone PCR.
Dumas, Louis; Zito, Francesca; Auroy, Pascaline; Johnson, Xenie; Peltier, Gilles; Alric, Jean
2018-06-01
Site-directed mutagenesis of chloroplast genes was developed three decades ago and has greatly advanced the field of photosynthesis research. Here, we describe a new approach for generating random chloroplast gene mutants that combines error-prone polymerase chain reaction of a gene of interest with chloroplast complementation of the knockout Chlamydomonas reinhardtii mutant. As a proof of concept, we targeted a 300-bp sequence of the petD gene that encodes subunit IV of the thylakoid membrane-bound cytochrome b6f complex. By sequencing chloroplast transformants, we revealed 149 mutations in the 300-bp target petD sequence that resulted in 92 amino acid substitutions in the 100-residue target subunit IV sequence. Our results show that this method is suited to the study of highly hydrophobic, multisubunit, and chloroplast-encoded proteins containing cofactors such as hemes, iron-sulfur clusters, and chlorophyll pigments. Moreover, we show that mutant screening and sequencing can be used to study photosynthetic mechanisms or to probe the mutational robustness of chloroplast-encoded proteins, and we propose that this method is a valuable tool for the directed evolution of enzymes in the chloroplast. © 2018 American Society of Plant Biologists. All rights reserved.
Regulation of error-prone translesion synthesis by Spartan/C1orf124
Kim, Myoung Shin; Machida, Yuka; Vashisht, Ajay A.; Wohlschlegel, James A.; Pang, Yuan-Ping; Machida, Yuichi J.
2013-01-01
Translesion synthesis (TLS) employs low fidelity polymerases to replicate past damaged DNA in a potentially error-prone process. Regulatory mechanisms that prevent TLS-associated mutagenesis are unknown; however, our recent studies suggest that the PCNA-binding protein Spartan plays a role in suppression of damage-induced mutagenesis. Here, we show that Spartan negatively regulates error-prone TLS that is dependent on POLD3, the accessory subunit of the replicative DNA polymerase Pol δ. We demonstrate that the putative zinc metalloprotease domain SprT in Spartan directly interacts with POLD3 and contributes to suppression of damage-induced mutagenesis. Depletion of Spartan induces complex formation of POLD3 with Rev1 and the error-prone TLS polymerase Pol ζ, and elevates mutagenesis that relies on POLD3, Rev1 and Pol ζ. These results suggest that Spartan negatively regulates POLD3 function in Rev1/Pol ζ-dependent TLS, revealing a previously unrecognized regulatory step in error-prone TLS. PMID:23254330
Gwon, Hui-Jeong; Baik, Sang-Ho
2010-01-01
Diastereoselectivity-enhanced mutants of L-threonine aldolase (L-TA) for L-threo-3,4-dihydroxyphenylserine (L-threo-DOPS) synthesis were isolated by error-prone PCR followed by high-throughput screening. The most improved mutant, T3-3mm2, showed a 4-fold increase over the wild-type L-TA. When aldol condensation activity was examined using whole cells of T3-3mm2, the diastereomeric excess was maintained at a constant 55% during 80-h batch reactions, yielding 3.8 mg L-threo-DOPS/ml.
Malm, Magdalena; Kronqvist, Nina; Lindberg, Hanna; Gudmundsdotter, Lindvi; Bass, Tarek; Frejd, Fredrik Y; Höidén-Guthenberg, Ingmarie; Varasteh, Zohreh; Orlova, Anna; Tolmachev, Vladimir; Ståhl, Stefan; Löfblom, John
2013-01-01
The HER3 receptor is implicated in the progression of various cancers as well as in resistance to several currently used drugs, and is hence a potential target for development of new therapies. We have previously generated Affibody molecules that inhibit heregulin-induced signaling of the HER3 pathways. The aim of this study was to improve the affinity of the binders, in the hope of increasing receptor-inhibition efficacy and enabling a high receptor-mediated uptake in tumors. We explored a novel strategy for affinity maturation of Affibody molecules that is based on alanine scanning followed by design of library diversification to mimic the result of an error-prone PCR reaction, but with full control over mutated positions and thus fewer biases. Using bacterial surface display and flow-cytometric sorting of the maturation library, the affinity for HER3 was improved more than 30-fold, down to 21 pM. This affinity is among the highest reported for Affibody molecules, and we believe that the maturation strategy should be generally applicable for improvement of affinity proteins. The new binders also demonstrated improved thermal stability as well as complete refolding after denaturation. Moreover, inhibition of ligand-induced proliferation of HER3-positive breast cancer cells was improved by more than two orders of magnitude compared to the previously best-performing clone. Radiolabeled Affibody molecules showed specific targeting of a number of HER3-positive cell lines in vitro as well as targeting of HER3 in in vivo mouse models, and represent promising candidates for future development of targeted therapies and diagnostics.
Cooperstein, Robert; Young, Morgan
2014-01-01
Upright examination procedures like radiology, thermography, manual muscle testing, and spinal motion palpation may lead to spinal interventions with the patient prone. The reliability and accuracy of mapping upright examination findings to the prone position is unknown. This study had 2 primary goals: (1) investigate how erroneous spine-scapular landmark associations may lead to errors in treating and charting spine levels; and (2) study the interexaminer reliability of a novel method for mapping upright spinal sites to the prone position. Experiment 1 was a thought experiment exploring the consequences of depending on the erroneous landmark association of the inferior scapular tip with the T7 spinous process upright and the T6 spinous process prone (relatively recent studies suggest these levels are T8 and T9, respectively). This allowed deduction of targeting and charting errors. In experiment 2, 10 examiners (2 experienced, 8 novice) used an index finger to maintain contact with a mid-thoracic spinous process as each of 2 participants slowly moved from the upright to the prone position. Interexaminer reliability was assessed by computing the intraclass correlation coefficient, the standard error of the mean, the root mean squared error, and the absolute value of each examiner's mean difference from the 10-examiner mean for each of the 2 participants. The thought experiment suggested that using the (inaccurate) scapular tip landmark rule would result in a 3-level targeting and charting error when radiological findings are mapped to the prone position. Physical upright examination procedures like motion palpation would result in a 2-level targeting error for intervention, and a 3-level error for charting. The reliability experiment showed examiners accurately maintained contact with the same thoracic spinous process as the participant went from upright to prone, ICC (2,1) = 0.83. As manual therapists, the authors have emphasized how targeting errors may impact upon manual care of the spine. Practitioners in other fields that need to accurately locate spinal levels, such as acupuncture and anesthesiology, would also be expected to draw important conclusions from these findings.
Designing an algorithm to preserve privacy for medical record linkage with error-prone data.
Pal, Doyel; Chen, Tingting; Zhong, Sheng; Khethavath, Praveen
2014-01-20
Linking medical records across different medical service providers is important to the enhancement of health care quality and public health surveillance. In records linkage, protecting the patients' privacy is a primary requirement. In real-world health care databases, records may well contain errors for various reasons, such as typos. Linking the error-prone data and preserving data privacy at the same time are very difficult. Existing privacy-preserving solutions for this problem are restricted to textual data. To enable different medical service providers to link their error-prone data in a private way, our aim was to provide a holistic solution by designing and developing a medical record linkage system for medical service providers. To initiate a record linkage, one provider selects one of its collaborators in the Connection Management Module, chooses some attributes of the database to be matched, and establishes the connection with the collaborator after the negotiation. In the Data Matching Module, for error-free data, our solution offers two different choices of cryptographic schemes. For error-prone numerical data, we propose a newly designed privacy-preserving linking algorithm, named the Error-Tolerant Linking Algorithm, which allows error-prone data to be correctly matched if the distance between two records is below a threshold. We designed and developed a comprehensive and user-friendly software system that provides privacy-preserving record linkage functions for medical service providers and meets the regulations of the Health Insurance Portability and Accountability Act. It does not require a third party, and it is secure in that neither entity can learn the records in the other's database. Moreover, our novel Error-Tolerant Linking Algorithm implemented in this software works well with error-prone numerical data. We theoretically proved the correctness and security of the Error-Tolerant Linking Algorithm, and we have fully implemented the software. The experimental results showed that it is reliable and efficient. The design of our software is open, so existing textual matching methods can be easily integrated into the system. Designing algorithms that enable medical record linkage for error-prone numerical data and protect data privacy at the same time is difficult. Our proposed solution does not need a trusted third party and is secure in that, during the linking process, neither entity can learn the records in the other's database.
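The privacy-preserving protocol is the paper's actual contribution and does not reduce to a short snippet, but the plaintext matching rule the Error-Tolerant Linking Algorithm builds on — declare two numerical records the same patient when their distance falls below a threshold — looks roughly like this (field layout and threshold are invented for illustration):

```python
import math

def error_tolerant_match(record_a, record_b, threshold):
    """Error-tolerant matching on numerical attributes: two records are
    declared a match if their Euclidean distance is below `threshold`,
    so small errors (e.g. typos) do not break the link. The paper's
    contribution is performing this comparison *privately*; this sketch
    shows only the underlying plaintext rule."""
    return math.dist(record_a, record_b) < threshold

# Hypothetical numerical fields: (height_cm, weight_kg, birthdate as yyyymmdd).
a = (170.0, 72.5, 19800312.0)
b = (170.0, 72.3, 19800312.0)   # slight weight discrepancy, same patient
print(error_tolerant_match(a, b, threshold=1.0))  # -> True
```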
A continuous quality improvement project to reduce medication error in the emergency department.
Lee, Sara Bc; Lee, Larry Ly; Yeung, Richard Sd; Chan, Jimmy Ts
2013-01-01
Medication errors are a common source of adverse healthcare incidents, particularly in the emergency department (ED), which has a number of features that make it prone to medication errors. This project aimed to reduce medication errors and improve the health and economic outcomes of clinical care in a Hong Kong ED. In 2009, a task group was formed to identify problems that potentially endanger medication safety and to develop strategies to eliminate them. Responsible officers were assigned to look after seven error-prone areas. Strategies were proposed, discussed, endorsed and promulgated to eliminate the problems identified. The number of medication incidents (MI) fell from 16 before the improvement work to 6 after it. This project successfully established a concrete organizational structure to safeguard error-prone areas of medication safety in a sustainable manner.
Effects of Shame and Guilt on Error Reporting Among Obstetric Clinicians.
Zabari, Mara Lynne; Southern, Nancy L
2018-04-17
To understand how the experiences of shame and guilt, coupled with organizational factors, affect error reporting by obstetric clinicians. Descriptive cross-sectional. A sample of 84 obstetric clinicians from three maternity units in Washington State. In this quantitative inquiry, a variant of the Test of Self-Conscious Affect was used to measure proneness to guilt and shame. In addition, we developed questions to assess attitudes regarding concerns about damaging one's reputation if an error was reported and the choice to keep an error to oneself. Both assessments were analyzed separately and then correlated to identify relationships between constructs. Interviews were used to identify organizational factors that affect error reporting. As a group, mean scores indicated that obstetric clinicians would not choose to keep errors to themselves. However, bivariate correlations showed that proneness to shame was positively correlated to concerns about one's reputation if an error was reported, and proneness to guilt was negatively correlated with keeping errors to oneself. Interview data analysis showed that Past Experience with Responses to Errors, Management and Leadership Styles, Professional Hierarchy, and Relationships With Colleagues were influential factors in error reporting. Although obstetric clinicians want to report errors, their decisions to report are influenced by their proneness to guilt and shame and perceptions of the degree to which organizational factors facilitate or create barriers to restore their self-images. Findings underscore the influence of the organizational context on clinicians' decisions to report errors. Copyright © 2018 AWHONN, the Association of Women’s Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.
Garcia, Tanya P; Ma, Yanyuan
2017-10-01
We develop consistent and efficient estimation of parameters in general regression models with mismeasured covariates. We assume the model error and covariate distributions are unspecified, and the measurement error distribution is a general parametric distribution with unknown variance-covariance. We construct root-n consistent, asymptotically normal and locally efficient estimators using the semiparametric efficient score. We do not estimate any unknown distribution or model error heteroskedasticity. Instead, we form the estimator under possibly incorrect working distribution models for the model error, error-prone covariate, or both. Empirical results demonstrate robustness to different incorrect working models in homoscedastic and heteroskedastic models with error-prone covariates.
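The semiparametric estimators themselves are beyond a short example, but a toy simulation shows the problem they solve: naive least squares with an error-prone covariate suffers classical attenuation bias (this sketch illustrates the bias, not the authors' corrected estimator):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(size=n)                  # true covariate
w = x + rng.normal(scale=0.8, size=n)   # error-prone measurement of x
y = 2.0 * x + rng.normal(size=n)        # outcome; true slope = 2

# The naive OLS slope of y on w is attenuated toward zero by the
# reliability factor var(x) / (var(x) + var(u)) = 1 / (1 + 0.8**2) ≈ 0.61.
naive_slope = np.cov(w, y)[0, 1] / np.var(w, ddof=1)
print(naive_slope)  # ≈ 2 * 0.61 ≈ 1.22, far from the true slope of 2
```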
Easy preparation of a large-size random gene mutagenesis library in Escherichia coli.
You, Chun; Percival Zhang, Y-H
2012-09-01
A simple and fast protocol for the preparation of a large-size mutant library for directed evolution in Escherichia coli was developed based on the DNA multimers generated by prolonged overlap extension polymerase chain reaction (POE-PCR). This protocol comprised the following: (i) a linear DNA mutant library was generated by error-prone PCR or shuffling, and a linear vector backbone was prepared by regular PCR; (ii) DNA multimers were generated from these two DNA templates by POE-PCR; and (iii) the multimers were digested with a single restriction enzyme and ligated into circular plasmids, followed by transformation into E. coli. Because the ligation efficiency of a single DNA fragment is several orders of magnitude higher than that of the two DNA fragments required in typical mutant library construction, it was very easy to generate a mutant library with a size of more than 10^7 protein mutants per 50 μl of POE-PCR product. Via this method, four new fluorescent protein mutants were obtained based on monomeric cherry fluorescent protein. This new protocol was simple and fast because it did not require labor-intensive optimization of restriction enzyme digestion and ligation, did not involve special plasmid design, and enabled construction of a large-size mutant library for directed enzyme evolution within 1 day. Copyright © 2012 Elsevier Inc. All rights reserved.
Validation, Edits, and Application Processing Phase II and Error-Prone Model Report.
ERIC Educational Resources Information Center
Gray, Susan; And Others
The impact of quality assurance procedures on the correct award of Basic Educational Opportunity Grants (BEOGs) for 1979-1980 was assessed, and a model for detecting error-prone applications early in processing was developed. The Bureau of Student Financial Aid introduced new comments into the edit system in 1979 and expanded the pre-established…
Meiotic Divisions: No Place for Gender Equality.
El Yakoubi, Warif; Wassmann, Katja
2017-01-01
In multicellular organisms the fusion of two gametes with a haploid set of chromosomes leads to the formation of the zygote, the first cell of the embryo. Accurate execution of the meiotic cell division to generate a female and a male gamete is required for the generation of healthy offspring harboring the correct number of chromosomes. Unfortunately, meiosis is error-prone. This has severe consequences for fertility and, under certain circumstances, for the health of the offspring. In humans, female meiosis is extremely error-prone. In this chapter we compare male and female meiosis in humans to illustrate why and at what frequency errors occur, and describe how this affects pregnancy outcome and the health of the individual. We first introduce key notions of cell division in meiosis and how they differ from mitosis, followed by a detailed description of the events that are prone to errors during the meiotic divisions.
Two-Step Fair Scheduling of Continuous Media Streams over Error-Prone Wireless Channels
NASA Astrophysics Data System (ADS)
Oh, Soohyun; Lee, Jin Wook; Park, Taejoon; Jo, Tae-Chang
In wireless cellular networks, streaming of continuous media (with strict QoS requirements) over wireless links is challenging due to their inherent unreliability characterized by location-dependent, bursty errors. To address this challenge, we present a two-step scheduling algorithm for a base station to provide streaming of continuous media to wireless clients over the error-prone wireless links. The proposed algorithm is capable of minimizing the packet loss rate of individual clients in the presence of error bursts, by transmitting packets in the round-robin manner and also adopting a mechanism for channel prediction and swapping.
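A heavily simplified sketch of the round-robin-with-swapping idea (real channel prediction and the paper's two-step structure are omitted; `channel_ok` stands in for the prediction mechanism):

```python
def schedule(clients, channel_ok, slots):
    """Round-robin packet scheduling over error-prone wireless links:
    visit clients cyclically, but when a client's channel is predicted
    bad for the current slot, swap its turn to the next client whose
    channel is predicted good, reducing packets lost to error bursts."""
    order = []
    for t in range(slots):
        for offset in range(len(clients)):
            c = clients[(t + offset) % len(clients)]
            if channel_ok(c, t):       # predicted-good channel: transmit
                order.append(c)
                break
        else:
            order.append(None)         # no client transmittable this slot
    return order

# e.g. "B" suffers an error burst in slot 1, so its turn is given to "A":
print(schedule(["A", "B"], lambda c, t: not (c == "B" and t == 1), 4))
# -> ['A', 'A', 'A', 'B']
```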
Suzuki, Hirokazu; Kobayashi, Jyumpei; Wada, Keisuke; Furukawa, Megumi; Doi, Katsumi
2015-01-01
Thermostability is an important property of enzymes utilized for practical applications because it allows long-term storage and use as catalysts. In this study, we constructed an error-prone strain of the thermophile Geobacillus kaustophilus HTA426 and investigated thermoadaptation-directed enzyme evolution using the strain. A mutation frequency assay using the antibiotics rifampin and streptomycin revealed that G. kaustophilus had substantially higher mutability than Escherichia coli and Bacillus subtilis. The predominant mutations in G. kaustophilus were A · T→G · C and C · G→T · A transitions, implying that the high mutability of G. kaustophilus was attributable in part to high-temperature-associated DNA damage during growth. Among the genes that may be involved in DNA repair in G. kaustophilus, deletions of the mutSL, mutY, ung, and mfd genes markedly enhanced mutability. These genes were subsequently deleted to construct an error-prone thermophile that showed much higher (700- to 9,000-fold) mutability than the parent strain. The error-prone strain was auxotrophic for uracil because it lacked the intrinsic pyrF gene. Although the strain harboring Bacillus subtilis pyrF was also essentially auxotrophic, cells became prototrophic after 2 days of culture under uracil starvation, generating B. subtilis PyrF variants with a half-denaturation temperature enhanced by >10°C. These data suggest that this error-prone strain is a promising host for thermoadaptation-directed evolution to generate thermostable variants from thermolabile enzymes. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Keller, Mark; Naue, Jana; Zengerle, Roland; von Stetten, Felix; Schmidt, Ulrike
2015-01-01
Nested PCR remains a labor-intensive and error-prone biomolecular analysis. Laboratory workflow automation by precise control of minute liquid volumes in centrifugal microfluidic Lab-on-a-Chip systems holds great potential for such applications. However, the majority of these systems require costly custom-made processing devices. Our idea is to augment a standard laboratory device, here a centrifugal real-time PCR thermocycler, with inbuilt liquid handling capabilities for automation. We have developed a microfluidic disk segment enabling an automated nested real-time PCR assay for identification of common European animal groups adapted to forensic standards. For the first time we utilize a novel combination of fluidic elements, including pre-storage of reagents, to automate the assay at constant rotational frequency of an off-the-shelf thermocycler. It provides a universal duplex pre-amplification of short fragments of the mitochondrial 12S rRNA and cytochrome b genes, animal-group-specific main amplifications, and melting curve analysis for differentiation. The system was characterized with respect to assay sensitivity, specificity, risk of cross-contamination, and detection of minor components in mixtures. In 92.2% of the performed tests, sample handling was fluidically failure-free, and these runs were used for evaluation. Altogether, augmentation of the standard real-time thermocycler with a self-contained centrifugal microfluidic disk segment resulted in an accelerated and automated analysis, reducing hands-on time and circumventing the risk of contamination associated with regular nested PCR protocols.
Update: Validation, Edits, and Application Processing. Phase II and Error-Prone Model Report.
ERIC Educational Resources Information Center
Gray, Susan; And Others
An update to the Validation, Edits, and Application Processing and Error-Prone Model Report (Section 1, July 3, 1980) is presented. The objective is to present the most current data obtained from the June 1980 Basic Educational Opportunity Grant applicant and recipient files and to determine whether the findings reported in Section 1 of the July…
Gole, Markus; Köchel, Angelika; Schäfer, Axel; Schienle, Anne
2012-03-01
The goal of the present study was to investigate threat engagement, disengagement, and sensitivity biases in individuals suffering from pathological worry. Twenty participants high in worry proneness and 16 control participants low in worry proneness completed an emotional go/no-go task with worry-related threat words and neutral words. Shorter reaction times (i.e., a threat engagement bias), smaller omission error rates (i.e., a threat sensitivity bias), and larger commission error rates (i.e., a threat disengagement bias) emerged only in the high worry group when worry-related words constituted the go stimuli and neutral words the no-go stimuli. Smaller omission error rates and larger commission error rates were also observed in the high worry group relative to the low worry group when worry-related go stimuli and neutral no-go stimuli were used. The obtained results await replication in a generalized anxiety disorder sample, and future samples should also include men. Our data suggest that worry-prone individuals are threat-sensitive, engage more rapidly with aversion, and disengage with more difficulty. Copyright © 2011 Elsevier Ltd. All rights reserved.
Smailes, David; Meins, Elizabeth; Fernyhough, Charles
2015-01-01
People who experience intrusive thoughts are at increased risk of developing hallucinatory experiences, as are people who have weak reality discrimination skills. No study has yet examined whether these two factors interact to make a person especially prone to hallucinatory experiences. The present study examined this question in a non-clinical sample. Participants were 160 students, who completed a reality discrimination task, as well as self-report measures of cannabis use, negative affect, intrusive thoughts and auditory hallucination-proneness. The possibility of an interaction between reality discrimination performance and level of intrusive thoughts was assessed using multiple regression. The number of reality discrimination errors and level of intrusive thoughts were independent predictors of hallucination-proneness. The reality discrimination errors × intrusive thoughts interaction term was significant, with participants who made many reality discrimination errors and reported high levels of intrusive thoughts being especially prone to hallucinatory experiences. Hallucinatory experiences are more likely to occur in people who report high levels of intrusive thoughts and have weak reality discrimination skills. If applicable to clinical samples, these findings suggest that improving patients' reality discrimination skills and reducing the number of intrusive thoughts they experience may reduce the frequency of hallucinatory experiences.
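The moderation analysis described is a standard multiple regression with an interaction term; a minimal sketch on simulated stand-in data (variable names and effect sizes are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 160
df = pd.DataFrame({
    "errors": rng.poisson(3, n),        # reality discrimination errors
    "intrusive": rng.normal(0, 1, n),   # intrusive-thoughts score (z-scored)
})
# Simulate an interaction like the one reported: intrusive thoughts matter
# most for people who also make many reality discrimination errors.
df["hallucination"] = (0.3 * df.errors + 0.4 * df.intrusive
                       + 0.25 * df.errors * df.intrusive
                       + rng.normal(0, 1, n))

# 'errors * intrusive' expands to both main effects plus the interaction;
# the errors:intrusive coefficient tests the moderation hypothesis.
fit = smf.ols("hallucination ~ errors * intrusive", data=df).fit()
print(fit.params)
```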
Comparing errors in ED computer-assisted vs conventional pediatric drug dosing and administration.
Yamamoto, Loren; Kanemori, Joan
2010-06-01
Compared to fixed-dose single-vial drug administration in adults, pediatric drug dosing and administration requires a series of calculations, all of which are potentially error-prone. The purpose of this study was to compare error rates and task completion times for common pediatric medication scenarios using computer program assistance vs conventional methods. Two versions of a 4-part paper-based test were developed. Each part consisted of a set of medication administration and/or dosing tasks. Emergency department and pediatric intensive care unit nurse volunteers completed these tasks using both methods (sequence assigned to start with a conventional or a computer-assisted approach). Completion times, errors, and the reason for each error were recorded. Thirty-eight nurses completed the study. Summing the completion of all 4 parts, the mean conventional total time was 1243 seconds vs a mean computer program total time of 879 seconds (P < .001). The conventional manual method had a mean of 1.8 errors vs the computer program with a mean of 0.7 errors (P < .001). Of the 97 total errors, 36 were due to misreading the drug concentration on the label, 34 were due to calculation errors, and 8 were due to misplaced decimals. Of the 36 label interpretation errors, 18 (50%) occurred with digoxin or insulin. Computerized assistance reduced errors and the time required for drug administration calculations. A pattern of errors emerged, showing that reading and interpreting certain drug labels was more error-prone. Optimizing the layout of drug labels could reduce the error rate for error-prone labels. Copyright (c) 2010 Elsevier Inc. All rights reserved.
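The chain of arithmetic such a program automates looks roughly like this (a sketch with invented numbers, not clinical guidance):

```python
def infusion_volume_ml(dose_mg_per_kg, weight_kg, vial_mg, vial_ml):
    """Weight-based pediatric dosing: convert an ordered mg/kg dose into
    the volume to draw from a vial of known concentration. This chain of
    small calculations (dose, label concentration, volume) is exactly the
    error-prone sequence the study targets; all values are illustrative."""
    dose_mg = dose_mg_per_kg * weight_kg
    concentration = vial_mg / vial_ml        # mg per mL on the label
    return round(dose_mg / concentration, 2) # mL to administer

# e.g. 0.01 mg/kg for a 12 kg child from a 0.25 mg / 1 mL digoxin-style vial:
print(infusion_volume_ml(0.01, 12, 0.25, 1))  # -> 0.48 mL
```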
ERIC Educational Resources Information Center
Saavedra, Pedro; And Others
Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…
Olsen, Morten Tange; Bérubé, Martine; Robbins, Jooke; Palsbøll, Per J
2012-09-06
Telomeres, the protective cap of chromosomes, have emerged as powerful markers of biological age and life history in model and non-model species. The qPCR method is one of the most common approaches to telomere length estimation, but has received recent critique for being too error-prone and yielding unreliable results. This critique coincides with an increasing awareness of the potentials and limitations of the qPCR technique in general and the proposal of a general set of guidelines (MIQE) for standardization of the experimental, analytical, and reporting steps of qPCR. In order to evaluate the utility of the qPCR method for telomere length estimation in non-model species, we carried out four different qPCR assays directed at humpback whale telomeres, and subsequently performed a rigorous quality control to evaluate the performance of each assay. Performance differed substantially among assays, and only one assay was found useful for telomere length estimation in humpback whales. The most notable factors causing these inter-assay differences were primer design and the choice between singleplex and multiplex assays. Inferred amplification efficiencies differed by up to 40% depending on assay and quantification method; however, this variation only affected telomere length estimates in the worst-performing assays. Our results suggest that seemingly well-performing qPCR assays may contain biases that will only be detected by extensive quality control. Moreover, we show that the qPCR method for telomere length estimation can be highly precise and accurate, and thus suitable for telomere measurement in non-model species, if effort is devoted to optimization at all experimental and analytical steps. We conclude by highlighting a set of quality controls which may serve for further standardization of the qPCR method for telomere length estimation, and discuss some of the factors that may cause variation in qPCR experiments.
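One quantification step where the reported efficiency differences bite is the efficiency-corrected relative T/S ratio; a minimal Pfaffl-style sketch (example efficiencies and Cq values are invented):

```python
def relative_ts_ratio(e_tel, cq_tel_sample, cq_tel_ref,
                      e_scg, cq_scg_sample, cq_scg_ref):
    """Efficiency-corrected relative telomere length (T/S ratio),
    Pfaffl-style: each amplicon's fold-change uses its own measured
    efficiency E (E = 2 means perfect doubling per cycle).
    `tel` = telomere assay, `scg` = single-copy reference gene."""
    tel_fold = e_tel ** (cq_tel_ref - cq_tel_sample)   # lower Cq = more template
    scg_fold = e_scg ** (cq_scg_ref - cq_scg_sample)
    return tel_fold / scg_fold

# e.g. telomere assay at 92% efficiency (E = 1.92), reference at E = 1.98;
# assuming perfect doubling (E = 2) for both would shift the estimate,
# which is why inter-assay efficiency differences matter.
print(relative_ts_ratio(1.92, 18.2, 19.0, 1.98, 22.4, 22.3))
```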
Selection of stable scFv antibodies by phage display.
Brockmann, Eeva-Christine
2012-01-01
ScFv fragments are popular recombinant antibody formats but often suffer from limited stability. Phage display is a powerful tool in antibody engineering and applicable also for stability selection. ScFv variants with improved stability can be selected from large randomly mutated phage displayed libraries with a specific antigen after the unstable variants have been inactivated by heat or GdmCl. Irreversible scFv denaturation, which is a prerequisite for efficient selection, is achieved by combining denaturation with reduction of the intradomain disulfide bonds. Repeated selection cycles of increasing stringency result in enrichment of stabilized scFv fragments. Procedures for constructing a randomly mutated scFv library by error-prone PCR and phage display selection for enrichment of stable scFv antibodies from the library are described here.
Development of a Dependency Theory Toolbox for Database Design.
1987-12-01
…to design and study relational databases exists in the form of published algorithms and theorems. However, hand simulating these algorithms can be a tedious and error-prone chore. Therefore, a toolbox of algorithms and…
Absence of Mutagenic Activity of Hycanthone in Serratia marcescens,
1986-05-29
…repair system but is enhanced by the plasmid pKM101, which mediates the inducible error-prone repair system. Hycanthone, like proflavin, intercalates between the stacked bases… Roth (1974) have suggested that proflavin, which has a planar triple-ring structure similar to hycanthone, interacts with DNA, which upon replication…
ERIC Educational Resources Information Center
Saavedra, Pedro; Kuchak, JoAnn
An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…
Chakravarti, D; Mailander, P C; Li, K M; Higginbotham, S; Zhang, H L; Gross, M L; Meza, J L; Cavalieri, E L; Rogan, E G
2001-11-29
Treatment of SENCAR mouse skin with dibenzo[a,l]pyrene results in abundant formation of abasic sites that undergo error-prone excision repair, forming oncogenic H-ras mutations in the early preneoplastic period. To examine whether the abundance of abasic sites causes repair infidelity, we treated SENCAR mouse skin with estradiol-3,4-quinone (E2-3,4-Q) and determined adduct levels 1 h after treatment, as well as mutation spectra in the H-ras gene between 6 h and 3 days after treatment. E2-3,4-Q formed predominantly (≥99%) the rapidly depurinating 4-hydroxy estradiol (4-OHE2)-1-N3Ade adduct and the slower-depurinating 4-OHE2-1-N7Gua adduct. Between 6 h and 3 days, E2-3,4-Q induced abundant A to G mutations in H-ras DNA, frequently in the context of a 3'-G residue. Using a T·G-DNA glycosylase (TDG)-PCR assay, we determined that the early A to G mutations (6 and 12 h) were in the form of G·T heteroduplexes, suggesting misrepair at A-specific depurination sites. Since G-specific mutations were infrequent in the spectra, it appears that the slow rate of depurination of the N7Gua adducts during active repair may not generate a threshold level of G-specific abasic sites to affect repair fidelity. These results also suggest that E2-3,4-Q, a suspected endogenous carcinogen, is a genotoxic compound and could cause mutations.
Goomber, Shelly; Kumar, Arbind; Kaur, Jagdeep
2016-01-15
Cold-adapted enzymes have applications in detergent, textile, food, bioremediation, and biotechnology processes. Bacillus lipases are 'generally recognized as safe' (GRAS) and hence are industrially attractive. Bacillus lipases of subfamily 1.4 have the lowest molecular weight and unfold reversibly owing to the absence of disulphide bonds; they are therefore widely used to study the energetics of protein stability, represented by the unfolding of the native protein to the fully unfolded state. In the present study, the metagenomically isolated Bacillus lipase LipJ was laboratory-evolved for cold adaptation by error-prone PCR. A library of variants was screened for high relative activity at the low temperature of 10°C compared with the native protein LipJ. A point mutant, Phe19→Leu, was found to be cold-active and was selected for extensive biochemical and biophysical characterization. Variant F19L showed maximum activity at 10°C, where the parent protein LipJ retained only 20% relative activity. The psychrophilic nature of F19L was established by its roughly 50% relative activity at 5°C, a temperature at which the native protein was essentially inactive. Variant F19L showed no activity at 40°C and above, establishing its thermolabile nature. Thermostability studies showed the mutant to be unstable above 20°C, with a three-fold decrease in half-life at 30°C compared with the native protein. Far-UV CD and intrinsic fluorescence studies demonstrated an unstable tertiary structure of the point variant F19L, leading to its unfolding at the low temperature of 20°C. The cold adaptation of mutant F19L is accompanied by increased specific activity; the mutant was catalytically more efficient, with a 1.3-fold increase in kcat. Homology structure modelling predicted disruption of the inter-secondary-structure hydrophobic core formed by the aromatic ring of Phe19 with nonpolar residues at β3, β4, β5, β6, and αF. Increased local flexibility of variant F19L explains the molecular basis of its psychrophilic nature. Copyright © 2015 Elsevier B.V. All rights reserved.
Enhancement of cellulosome-mediated deconstruction of cellulose by improving enzyme thermostability.
Moraïs, Sarah; Stern, Johanna; Kahn, Amaranta; Galanopoulou, Anastasia P; Yoav, Shahar; Shamshoum, Melina; Smith, Matthew A; Hatzinikolaou, Dimitris G; Arnold, Frances H; Bayer, Edward A
2016-01-01
The concerted action of three complementary cellulases from Clostridium thermocellum, engineered to be stable at elevated temperatures, was examined on a cellulosic substrate and compared to that of the wild-type enzymes. Exoglucanase Cel48S and endoglucanase Cel8A, both key elements of the natural cellulosome from this bacterium, were engineered previously for increased thermostability, either by SCHEMA, a structure-guided, site-directed protein recombination method, or by consensus-guided mutagenesis combined with random mutagenesis using error-prone PCR, respectively. A thermostable β-glucosidase BglA mutant was also selected from a library generated by error-prone PCR, to assist the two cellulases in their methodical deconstruction of crystalline cellulose. The effects of a thermostable scaffoldin versus those of a largely mesophilic scaffoldin were also examined. By improving the stability of the enzyme subunits and the structural component, we aimed to improve cellulosome-mediated deconstruction of cellulosic substrates. The results demonstrate that the combination of thermostable enzymes as free enzymes and a thermostable scaffoldin was more active on the cellulosic substrate than the wild-type enzymes. Significantly, "thermostable" designer cellulosomes exhibited a 1.7-fold enhancement in cellulose degradation compared to the action of conventional designer cellulosomes that contain the respective wild-type enzymes. For designer cellulosome formats, the use of the thermostabilized scaffoldin proved critical for enhanced enzymatic performance under conditions of high temperatures. Simple improvement in the activity of a given enzyme does not guarantee its suitability for use in an enzyme cocktail or as a designer cellulosome component. The true merit of improvement resides in its ultimate contribution to synergistic action, which can only be determined experimentally. The relevance of the mutated thermostable enzymes employed in this study as components in multienzyme systems has thus been confirmed using designer cellulosome technology. Enzyme integration via a thermostable scaffoldin is critical to the ultimate stability of the complex at higher temperatures. Engineering of thermostable cellulases and additional lignocellulosic enzymes may prove a determinant parameter for development of state-of-the-art designer cellulosomes for their employment in the conversion of cellulosic biomass to soluble sugars. Graphical abstract: Conversion of conventional designer cellulosomes into thermophilic designer cellulosomes.
Somatic immunoglobulin hypermutation
Diaz, Marilyn; Casali, Paolo
2015-01-01
Immunoglobulin hypermutation provides the structural correlate for the affinity maturation of the antibody response. Characteristic modalities of this mechanism include a preponderance of point-mutations with prevalence of transitions over transversions, and the mutational hotspot RGYW sequence. Recent evidence suggests a mechanism whereby DNA-breaks induce error-prone DNA synthesis in immunoglobulin V(D)J regions by error-prone DNA polymerases. The nature of the targeting mechanism and the trans-factors effecting such breaks and their repair remain to be determined. PMID:11869898
Nickerson, Naomi H; Li, Ying; Benjamin, Simon C
2013-01-01
A scalable quantum computer could be built by networking together many simple processor cells, thus avoiding the need to create a single complex structure. The difficulty is that realistic quantum links are very error prone. A solution is for cells to repeatedly communicate with each other and so purify any imperfections; however, prior studies suggest that the cells themselves must then have prohibitively low internal error rates. Here we describe a method by which even error-prone cells can perform purification: groups of cells generate shared resource states, which then enable stabilization of topologically encoded data. Given a realistically noisy network (≥10% error rate) we find that our protocol can succeed provided that intra-cell error rates for initialisation, state manipulation and measurement are below 0.82%. This level of fidelity is already achievable in several laboratory systems.
Brébion, Gildas; Larøi, Frank; Van der Linden, Martial
2010-10-01
Hallucinations in patients with schizophrenia have been associated with a liberal response bias in signal detection and recognition tasks and with various types of source-memory error. We investigated the associations of hallucination proneness with free-recall intrusions and false recognitions of words in a nonclinical sample. A total of 81 healthy individuals were administered a verbal memory task involving free recall and recognition of one nonorganizable and one semantically organizable list of words. Hallucination proneness was assessed by means of a self-rating scale. Global hallucination proneness was associated with free-recall intrusions in the nonorganizable list and with a response bias reflecting tendency to make false recognitions of nontarget words in both types of list. The verbal hallucination score was associated with more intrusions and with a reduced tendency to make false recognitions of words. The associations between global hallucination proneness and two types of verbal memory error in a nonclinical sample corroborate those observed in patients with schizophrenia and suggest that common cognitive mechanisms underlie hallucinations in psychiatric and nonclinical individuals.
How Alterations in the Cdt1 Expression Lead to Gene Amplification in Breast Cancer
2011-07-01
absence of extrinsic DNA damage. We measured the TLS activity by measuring the mutation frequency in a supF gene (in a shuttle vector) subjected to UV-induced DNA damage before its introduction into the cells. Error-prone TLS activity will mutate the supF gene, which is scored by a blue-white colony… (Figure 4A). Sequencing of the mutant supF genes revealed a mutation spectrum consistent with error-prone TLS (Supplemental Table 1). Significantly…
An optimized one-tube, semi-nested PCR assay for Paracoccidioides brasiliensis detection.
Pitz, Amanda de Faveri; Koishi, Andrea Cristine; Tavares, Eliandro Reis; Andrade, Fábio Goulart de; Loth, Eduardo Alexandre; Gandra, Rinaldo Ferreira; Venancio, Emerson José
2013-01-01
Herein, we report a one-tube, semi-nested polymerase chain reaction (OTsn-PCR) assay for the detection of Paracoccidioides brasiliensis. We developed the OTsn-PCR assay for the detection of P. brasiliensis in clinical specimens and compared it with other PCR methods. The OTsn-PCR assay was positive for all clinical samples, and its detection limit was better than or equivalent to that of the other nested or semi-nested PCR methods for P. brasiliensis detection. The OTsn-PCR assay described in this paper has a detection limit similar to other reactions for the molecular detection of P. brasiliensis, but the approach is faster and less prone to contamination than conventional nested or semi-nested PCR assays.
Kim, Jae-Eung; Huang, Rui; Chen, Hui; You, Chun; Zhang, Y-H Percival
2016-09-01
A foolproof protocol was developed for the construction of mutant DNA libraries for directed protein evolution. First, a library of linear mutant genes was generated by error-prone PCR or molecular shuffling, and a linear vector backbone was prepared by high-fidelity PCR. Second, the amplified insert and vector fragments were assembled by overlap-extension PCR with a pair of 5'-phosphorylated primers. Third, full-length linear plasmids with phosphorylated 5'-ends were self-ligated with T4 ligase, yielding circular plasmids encoding mutant variants suitable for high-efficiency transformation. Self-made competent Escherichia coli BL21(DE3) showed a transformation efficiency of 2.4 × 10^5 cfu/µg of the self-ligated circular plasmid. Using this method, three mutants of the mCherry fluorescent protein were found to alter their colors and fluorescence intensities under visible and UV light, respectively. Also, one mutant of 6-phosphogluconate dehydrogenase from the thermophilic bacterium Moorella thermoacetica showed a 3.5-fold improvement in catalytic efficiency (kcat/Km) on NAD+ compared to the wild type. This protocol is DNA-sequence independent and does not require restriction enzymes, a special E. coli host, or labor-intensive optimization. In addition, it can be used for subcloning relatively long DNA sequences into any position of plasmids. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
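The reported transformation efficiency matters because it caps the diversity such a protocol can actually deliver. A minimal Python sketch of the arithmetic, under a uniform-sampling Poisson assumption (the variant count and target completeness below are illustrative, not from the study):

    import math

    def transformation_yield(colonies, dna_ug):
        # Transformation efficiency in cfu per microgram of circular plasmid.
        return colonies / dna_ug

    def clones_needed(n_variants, completeness=0.95):
        # Clones required so that each of n_variants is sampled at least once
        # with the given probability, assuming uniform Poisson sampling:
        # N = V * ln(V / (1 - P)).
        return math.ceil(n_variants * math.log(n_variants / (1.0 - completeness)))

    # At 2.4e5 cfu/ug (the efficiency reported above), covering a hypothetical
    # 1e4-variant epPCR library to 95% completeness needs roughly 0.5 ug of
    # self-ligated plasmid.
    print(clones_needed(10_000))            # ~122,000 clones
    print(clones_needed(10_000) / 2.4e5)    # ~0.5 ug of plasmid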
Proximal antecedents and correlates of adopted error approach: a self-regulatory perspective.
Van Dyck, Cathy; Van Hooft, Edwin; De Gilder, Dick; Liesveld, Lillian
2010-01-01
The current study aims to further investigate earlier established advantages of an error mastery approach over an error aversion approach. The two main purposes of the study relate to (1) self-regulatory traits (i.e., goal orientation and action-state orientation) that may predict which error approach (mastery or aversion) is adopted, and (2) proximal, psychological processes (i.e., self-focused attention and failure attribution) that relate to adopted error approach. In the current study participants' goal orientation and action-state orientation were assessed, after which they worked on an error-prone task. Results show that learning goal orientation related to error mastery, while state orientation related to error aversion. Under a mastery approach, error occurrence did not result in cognitive resources "wasted" on self-consciousness. Rather, attention went to internal-unstable, thus controllable, improvement oriented causes of error. Participants that had adopted an aversion approach, in contrast, experienced heightened self-consciousness and attributed failure to internal-stable or external causes. These results imply that when working on an error-prone task, people should be stimulated to take on a mastery rather than an aversion approach towards errors.
Improved acid tolerance of Lactobacillus pentosus by error-prone whole genome amplification.
Ye, Lidan; Zhao, Hua; Li, Zhi; Wu, Jin Chuan
2013-05-01
Acid tolerance of Lactobacillus pentosus ATCC 8041 was improved by error-prone amplification of its genomic DNA using random primers and Taq DNA polymerase. The resulting amplification products were transferred into wild-type L. pentosus by electroporation and the transformants were screened for growth on low-pH agar plates. After only one round of mutation, one mutant (MT3) was identified that was able to completely consume 20 g/L of glucose to produce lactic acid at a yield of 95% in 1L MRS medium at pH 3.8 within 36 h, whereas no growth or lactic acid production was observed for the wild-type strain under the same conditions. The acid tolerance of mutant MT3 remained genetically stable for at least 25 subcultures. Therefore, the error-prone whole genome amplification technique is a very powerful tool for improving phenotypes of this lactic acid bacterium and may also be applicable for other microorganisms. Copyright © 2012 Elsevier Ltd. All rights reserved.
Lakhin, A V; Efremova, A S; Makarova, I V; Grishina, E E; Shram, S I; Tarantul, V Z; Gening, L V
2013-01-01
DNA polymerase iota (Pol iota), which has several peculiar features and is characterized by extremely error-prone DNA synthesis, belongs to the group of enzymes preferentially activated by Mn2+ instead of Mg2+. In this work, the effect of Mn2+ on DNA synthesis was tested in cell extracts from (a) normal human and murine tissues, (b) a human tumor (uveal melanoma), and (c) the cultured human tumor cell lines SKOV-3 and HL-60. Each group displayed characteristic features of Mn-dependent DNA synthesis. The changes in Mn-dependent DNA synthesis caused by malignant transformation of normal tissues are described. It was also shown that the error-prone DNA synthesis catalyzed by Pol iota in extracts of all cell types was efficiently suppressed by IKL5, an RNA aptamer against Pol iota that we obtained in earlier work. These results suggest that IKL5 might be used to suppress the enhanced activity of Pol iota in tumor cells.
Parental mosaicism is a pitfall in preimplantation genetic diagnosis of dominant disorders.
Steffann, Julie; Michot, Caroline; Borghese, Roxana; Baptista-Fernandes, Marcia; Monnot, Sophie; Bonnefont, Jean-Paul; Munnich, Arnold
2014-05-01
PCR amplification of single cells is prone to allele drop-out (PCR failure of one allele), a cause of misdiagnosis in preimplantation genetic diagnosis (PGD). Owing to this error risk, PGD usually relies on both direct and indirect genetic analyses. When the affected partner is a sporadic case of a dominant disorder, building haplotypes requires spermatozoon or polar-body testing prior to PGD, but these procedures are costly and time-consuming. A couple requested PGD because the male partner suffered from dominant Cowden syndrome (CS). He was a sporadic case, but the couple had a first unaffected child, and the non-mutated paternal haplotype was tentatively deduced. The couple had a second spontaneous pregnancy, and the fetus was found to carry the at-risk haplotype but not the PTEN mutation. The mutation was present in blood from the affected father, but at a low level, confirming somatic mosaicism. Ignoring the possibility of mosaicism in the CS patient could have led to selection of affected embryos. This observation emphasizes the risk of PGD in families at risk of transmitting an autosomal-dominant disorder when the affected partner is a sporadic case.
WISC-R Examiner Errors: Cause for Concern.
ERIC Educational Resources Information Center
Slate, John R.; Chick, David
1989-01-01
Clinical psychology graduate students (N=14) administered Wechsler Intelligence Scale for Children-Revised. Found numerous scoring and mechanical errors that influenced full-scale intelligence quotient scores on two-thirds of protocols. Particularly prone to error were Verbal subtests of Vocabulary, Comprehension, and Similarities. Noted specific…
Delaney, Michael P; Stevens, Paul E; Witham, Helen J; Judge, Caroline; Eaglestone, Gillian L; Carter, Joanne L; Bassett, Paul; Lamb, Edmund J
2016-01-01
♦ Small solute clearance, especially that derived from residual renal function (RRF), is an independent risk factor for death in peritoneal dialysis (PD) patients. Assessment of solute clearance is time-consuming and prone to multiple errors. Cystatin C is a small protein which has been used as a glomerular filtration rate (GFR) marker. We investigated whether serum cystatin C concentrations are related to mortality in patients receiving PD. ♦ New and prevalent PD patients (n = 235) underwent assessment of Kt/Vurea, RRF, weekly creatinine clearance (CCr), normalized protein catabolic rate (nPCR), and a peritoneal equilibration test (PET) at intervals. Blood was collected simultaneously for cystatin C measurement. Patients were followed for a median of 1,429 days (range 12 to 2,964 days) until death or study closure. Cause of death was recorded where given. Cox regression was performed to determine whether cystatin C had prognostic value, either independently or with adjustment for other factors (age, sex, dialysis modality, diabetic status, cardiovascular comorbidity, Kt/V, CCr, RRF, nPCR, and the 4 h dialysate-to-plasma creatinine ratio (4 h D/Pcr) during the PET). The primary outcomes were all-cause mortality and treatment failure. ♦ There were 93 deaths. Increasing age and 4 h D/Pcr ratio, decreased RRF, and presence of diabetes were significantly (p < 0.05) negatively associated with survival and treatment failure. Serum cystatin C was not related to either outcome. ♦ Serum cystatin C concentration does not predict mortality or treatment failure in patients receiving PD. Copyright © 2016 International Society for Peritoneal Dialysis.
Cheng, Maria; Yoshiyasu, Hayato; Okano, Kenji; Ohtake, Hisao; Honda, Kohsuke
2016-01-01
Acetolactate synthase and pyruvate decarboxylase are thiamine pyrophosphate-dependent enzymes that convert pyruvate into acetolactate and acetaldehyde, respectively. Although the former are encoded in the genomes of many thermophiles and hyperthermophiles, the latter has been found only in mesophilic organisms. In this study, the reaction specificity of acetolactate synthase from Thermus thermophilus was redirected to catalyze acetaldehyde formation to develop a thermophilic pyruvate decarboxylase. Error-prone PCR and mutant library screening led to the identification of a quadruple mutant with 3.1-fold higher acetaldehyde-forming activity than the wild-type. Site-directed mutagenesis experiments revealed that the increased activity of the mutant was due to H474R amino acid substitution, which likely generated two new hydrogen bonds near the thiamine pyrophosphate-binding site. These hydrogen bonds might result in the better accessibility of H+ to the substrate-cofactor-enzyme intermediate and a shift in the reaction specificity of the enzyme. PMID:26731734
Adaptive Constructive Processes and the Future of Memory
ERIC Educational Resources Information Center
Schacter, Daniel L.
2012-01-01
Memory serves critical functions in everyday life but is also prone to error. This article examines adaptive constructive processes, which play a functional role in memory and cognition but can also produce distortions, errors, and illusions. The article describes several types of memory errors that are produced by adaptive constructive processes…
Error Rate Comparison during Polymerase Chain Reaction by DNA Polymerase
McInerney, Peter; Adams, Paul; Hadi, Masood Z.
2014-01-01
As larger-scale cloning projects become more prevalent, there is an increasing need for comparisons among high-fidelity DNA polymerases used for PCR amplification. All polymerases marketed for PCR applications are tested for fidelity properties (i.e., error rate determination) by vendors, and numerous literature reports have addressed PCR enzyme fidelity. Nonetheless, it is often difficult to make direct comparisons among different enzymes due to numerous methodological and analytical differences from study to study. We have measured the error rates for 6 DNA polymerases commonly used in PCR applications, including 3 polymerases typically used for cloning applications requiring high fidelity. Error rate measurement values reported here were obtained by direct sequencing of cloned PCR products. The strategy employed here allows interrogation of error rate across a very large DNA sequence space, since 94 unique DNA targets were used as templates for PCR cloning. Among the six enzymes included in the study (Taq polymerase, AccuPrime-Taq High Fidelity, KOD Hot Start, cloned Pfu polymerase, Phusion Hot Start, and Pwo polymerase), we find the lowest error rates with Pfu, Phusion, and Pwo polymerases. Error rates are comparable for these 3 enzymes and are >10x lower than the error rate observed with Taq polymerase. Mutation spectra are reported, with the 3 high-fidelity enzymes displaying broadly similar types of mutations. For these enzymes, transition mutations predominate, with little bias observed for type of transition.
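Polymerase error rates of this kind are conventionally normalized per base per template doubling, since errors accumulate over the exponential amplification. A hedged Python sketch of that normalization (the input numbers are invented for illustration; the study's own counts are in the paper):

    import math

    def per_base_error_rate(mutations, bases_sequenced, fold_amplification):
        # Errors per base per template doubling:
        # rate = (observed mutations / bases sequenced) / d,
        # where d = log2(fold amplification) is the number of doublings.
        doublings = math.log2(fold_amplification)
        return (mutations / bases_sequenced) / doublings

    # e.g. 25 mutations found in 1 Mb of sequenced clones after ~1e6-fold
    # amplification (~20 doublings):
    print(per_base_error_rate(25, 1_000_000, 1e6))   # ~1.3e-6 errors/base/doubling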
Error minimization algorithm for comparative quantitative PCR analysis: Q-Anal.
O'Connor, William; Runquist, Elizabeth A
2008-07-01
Current methods for comparative quantitative polymerase chain reaction (qPCR) analysis, the threshold and extrapolation methods, either make assumptions about PCR efficiency that require an arbitrary threshold selection process or extrapolate to estimate relative levels of messenger RNA (mRNA) transcripts. Here we describe an algorithm, Q-Anal, that blends elements from current methods to bypass assumptions regarding PCR efficiency and improve the threshold selection process to minimize error in comparative qPCR analysis. This algorithm uses iterative linear regression to identify the exponential phase for both target and reference amplicons and then selects, by minimizing linear regression error, a fluorescence threshold where efficiencies for both amplicons have been defined. From this defined fluorescence threshold, the cycle time (Ct) and the error for both amplicons are calculated and used to determine the expression ratio. Ratios in complementary DNA (cDNA) dilution assays from qPCR data were analyzed by the Q-Anal method and compared with the threshold method and an extrapolation method. Dilution ratios determined by the Q-Anal and threshold methods were 86 to 118% of the expected cDNA ratios, but relative errors for the Q-Anal method were 4 to 10%, compared with 4 to 34% for the threshold method. In contrast, ratios determined by an extrapolation method were 32 to 242% of the expected cDNA ratios, with relative errors of 67 to 193%. Q-Anal will be a valuable and quick method for minimizing error in comparative qPCR analysis.
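A compact numpy sketch of the approach as described above: fit each amplicon's exponential phase by scanning log-linear regression windows, place a shared fluorescence threshold where both fits are defined, and derive Ct values and efficiencies from the fits. This is our illustration of the idea, not the published implementation; Q-Anal's exact window search, threshold-selection criterion, and error propagation may differ, and all function names are ours.

    import numpy as np

    def fit_exponential_phase(fluor, min_len=4):
        # Scan all cycle windows of length >= min_len and keep the log-linear
        # fit with the smallest RMSE and a positive slope (i.e. amplification).
        logf = np.log(np.asarray(fluor, dtype=float))
        c = np.arange(len(logf), dtype=float)
        best = None
        for i in range(len(logf) - min_len + 1):
            for j in range(i + min_len, len(logf) + 1):
                slope, intercept = np.polyfit(c[i:j], logf[i:j], 1)
                rmse = np.sqrt(np.mean((logf[i:j] - (slope * c[i:j] + intercept)) ** 2))
                if slope > 0 and (best is None or rmse < best[0]):
                    best = (rmse, slope, intercept, i, j)
        return best  # (rmse, slope, intercept, window start, window end)

    def expression_ratio(target_fluor, reference_fluor):
        _, s_t, b_t, i_t, j_t = fit_exponential_phase(target_fluor)
        _, s_r, b_r, i_r, j_r = fit_exponential_phase(reference_fluor)
        # Shared log-fluorescence threshold inside both fitted windows
        # (Q-Anal instead selects it by minimizing the regression error).
        lo = max(s_t * i_t + b_t, s_r * i_r + b_r)
        hi = min(s_t * (j_t - 1) + b_t, s_r * (j_r - 1) + b_r)
        log_thr = 0.5 * (lo + hi)
        ct_t = (log_thr - b_t) / s_t   # cycle where the target fit crosses threshold
        ct_r = (log_thr - b_r) / s_r
        e_t = np.exp(s_t) - 1.0        # per-cycle efficiency: F = F0 * (1 + E)^cycle
        e_r = np.exp(s_r) - 1.0
        # Initial-template ratio of target to reference at the common threshold.
        return (1.0 + e_r) ** ct_r / (1.0 + e_t) ** ct_t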
Rajeev, K R; Menon, Smrithy S; Beena, K; Holla, Raghavendra; Kumar, R Rajaneesh; Dinesh, M
2014-01-01
A prospective study was undertaken to evaluate the influence of patient positioning on set up variations, to determine the planning target volume (PTV) margins, and to evaluate the clinically relevant volume of small bowel (SB) within the irradiated volume. Between December 2011 and April 2012, a computed tomography (CT) scan was done either in the supine position or in the prone position using a belly board (BB) for each of 20 consecutive patients. All patients had histologically proven rectal cancer and received either post- or pre-operative pelvic irradiation. Using a three-dimensional planning system, the dose-volume histogram for SB was defined in each axial CT slice. The total dose was 46-50 Gy (2 Gy/fraction), delivered using the 4-field box technique. The set up variation of the study group was assessed from data obtained with the electronic portal imaging device of the linear accelerator, and the shifts along the X, Y, and Z directions were noted. Both systematic and random errors were calculated, and from these values the PTV margin was derived. The systematic errors of patients treated in the supine position were 0.87 (X-mm), 0.66 (Y-mm), 1.6 (Z-mm) and in the prone position were 1.3 (X-mm), 0.59 (Y-mm), 1.17 (Z-mm). The random errors of patients treated in the supine position were 1.81 (X-mm), 1.73 (Y-mm), 1.83 (Z-mm) and in the prone position were 2.02 (X-mm), 1.21 (Y-mm), 3.05 (Z-mm). The calculated PTV margins in the supine position were 3.45 (X-mm), 2.87 (Y-mm), 5.31 (Z-mm) and in the prone position were 4.91 (X-mm), 2.32 (Y-mm), 5.08 (Z-mm). The mean volume of the peritoneal cavity was 648.65 cm³ in the prone position and 1197.37 cm³ in the supine position. The prone position using a BB device was more effective in reducing the irradiated SB volume in rectal cancer patients. There were no significant variations in the daily set up for patients treated in either the supine or the prone position.
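The reported margins are consistent with the widely used van Herk recipe, margin ≈ 2.5Σ + 0.7σ, where Σ is the systematic and σ the random set-up error; the attribution is our assumption, since the abstract does not name its margin formula. A quick Python check against the numbers above:

    def ptv_margin(sys_mm, rand_mm):
        # van Herk margin recipe: 2.5 * systematic + 0.7 * random error (mm).
        return 2.5 * sys_mm + 0.7 * rand_mm

    supine = {"X": (0.87, 1.81), "Y": (0.66, 1.73), "Z": (1.60, 1.83)}
    prone = {"X": (1.30, 2.02), "Y": (0.59, 1.21), "Z": (1.17, 3.05)}
    for name, errors in (("supine", supine), ("prone", prone)):
        for axis, (s, r) in errors.items():
            print(name, axis, round(ptv_margin(s, r), 2))
    # Reproduces most of the reported margins to within rounding
    # (e.g. supine: 3.44/2.86/5.28 mm vs the reported 3.45/2.87/5.31 mm).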
Boehme, Philip; Stellberger, Thorsten; Solanki, Manish; Zhang, Wenli; Schulz, Eric; Bergmann, Thorsten; Liu, Jing; Doerner, Johannes; Baiker, Armin E.
2015-01-01
High-capacity adenoviral vectors (HCAdVs) are promising tools for gene therapy as well as for genetic engineering. However, one limitation of the HCAdV vector system is the complex, time-consuming, and labor-intensive production process and the following quality control procedure. Since HCAdVs are deleted for all viral coding sequences, a helper virus (HV) is needed in the production process to provide the sequences for all viral proteins in trans. For the purification procedure of HCAdV, cesium chloride density gradient centrifugation is usually performed followed by buffer exchange using dialysis or comparable methods. However, performing these steps is technically difficult, potentially error-prone, and not scalable. Here, we establish a new protocol for small-scale production of HCAdV based on commercially available adenovirus purification systems and a standard method for the quality control of final HCAdV preparations. For titration of final vector preparations, we established a droplet digital polymerase chain reaction (ddPCR) that uses a standard end-point PCR in small droplets of defined volume. By using different probes, this method is capable of detecting and quantifying HCAdV and HV in one reaction independent of reference material, rendering this method attractive for accurately comparing viral titers between different laboratories. In summary, we demonstrate that it is possible to produce HCAdV in a small scale of sufficient quality and quantity to perform experiments in cell culture, and we established a reliable protocol for vector titration based on ddPCR. Our method significantly reduces time and required equipment to perform HCAdV production. In the future the ddPCR technology could be advantageous for titration of other viral vectors commonly used in gene therapy. PMID:25640117
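ddPCR titration rests on Poisson statistics: with droplets of known volume, the fraction of positive droplets alone fixes the absolute concentration, which is why no reference material is needed. A minimal Python sketch under stated assumptions (the droplet volume below is a typical value for commercial systems, and the counts are invented):

    import math

    def copies_per_microliter(n_positive, n_total, droplet_volume_nl=0.85):
        # Poisson: mean copies per droplet lambda = -ln(1 - positive fraction);
        # dividing by droplet volume gives copies per microliter of reaction.
        p = n_positive / n_total
        lam = -math.log(1.0 - p)
        return lam / (droplet_volume_nl * 1e-3)   # nl -> ul

    # Two probe channels read from the same droplets (e.g. HCAdV vs. HV)
    # yield two independent titers:
    print(copies_per_microliter(2500, 20000))   # ~157 copies/ul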
Bill, Anke; Rosethorne, Elizabeth M; Kent, Toby C; Fawcett, Lindsay; Burchell, Lynn; van Diepen, Michiel T; Marelli, Anthony; Batalov, Sergey; Miraglia, Loren; Orth, Anthony P; Renaud, Nicole A; Charlton, Steven J; Gosling, Martin; Gaither, L Alex; Groot-Kormelink, Paul J
2014-01-01
The human prostacyclin receptor (hIP receptor) is a seven-transmembrane G protein-coupled receptor (GPCR) that plays a critical role in vascular smooth muscle relaxation and platelet aggregation. hIP receptor dysfunction has been implicated in numerous cardiovascular abnormalities, including myocardial infarction, hypertension, thrombosis and atherosclerosis. Genomic sequencing has discovered several genetic variations in the PTGIR gene coding for hIP receptor; however, its structure-function relationship has not been sufficiently explored. Here we set out to investigate the applicability of high-throughput random mutagenesis to study the structure-function relationship of the hIP receptor. While chemical mutagenesis was not suitable to generate a mutagenesis library with sufficient coverage, our data demonstrate error-prone PCR (epPCR) mediated mutagenesis as a valuable method for the unbiased screening of residues regulating hIP receptor function and expression. Here we describe the generation and functional characterization of an epPCR-derived mutagenesis library comprising >4000 mutants of the hIP receptor. We introduce next generation sequencing as a useful tool to validate the quality of mutagenesis libraries by providing information about the coverage, mutation rate and mutational bias. We identified 18 mutants of the hIP receptor that were expressed at the cell surface but demonstrated impaired receptor function. A total of 38 non-synonymous mutations were identified within the coding region of the hIP receptor, mapping to 36 distinct residues, including several mutations previously reported to affect the signaling of the hIP receptor. Thus, our data demonstrate epPCR-mediated random mutagenesis as a valuable and practical method to study the structure-function relationship of GPCRs.
One-step random mutagenesis by error-prone rolling circle amplification
Fujii, Ryota; Kitaoka, Motomitsu; Hayashi, Kiyoshi
2004-01-01
In vitro random mutagenesis is a powerful tool for altering properties of enzymes. We describe here a novel random mutagenesis method using rolling circle amplification, named error-prone RCA. This method consists of only one DNA amplification step followed by transformation of the host strain, without treatment with any restriction enzymes or DNA ligases, and results in a randomly mutated plasmid library with 3–4 mutations per kilobase. Specific primers or special equipment, such as a thermal-cycler, are not required. This method permits rapid preparation of randomly mutated plasmid libraries, enabling random mutagenesis to become a more commonly used technique. PMID:15507684
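At the reported 3-4 mutations per kilobase, nearly every plasmid carrying a typical gene acquires at least one change, which is why a single amplification step suffices. A back-of-the-envelope Poisson sketch in Python (the ORF length is illustrative):

    import math

    def fraction_mutated(orf_len_bp, rate_per_kb=3.5):
        # Expected mutations lam = rate * length; P(>=1 mutation) = 1 - exp(-lam).
        lam = rate_per_kb * orf_len_bp / 1000.0
        return 1.0 - math.exp(-lam)

    print(fraction_mutated(900))   # ~0.96: about 96% of a 0.9 kb ORF library mutated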
Producing good font attribute determination using error-prone information
NASA Astrophysics Data System (ADS)
Cooperman, Robert
1997-04-01
A method is described for estimating font attributes in an OCR system using detectors of individual attributes that are error-prone. For an OCR system to preserve the appearance of a scanned document, it needs accurate detection of font attributes. However, OCR environments have noise and other sources of error, which tend to make font attribute detection unreliable. Certain assumptions about font use can greatly enhance accuracy: attributes such as boldness and italics are more likely to change between neighboring words, while attributes such as serifness are less likely to change within the same paragraph. Furthermore, the document as a whole tends to have a limited number of sets of font attributes. These assumptions allow a better use of context than the raw data, or than simpler methods that would oversmooth the data.
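Those assumptions translate directly into simple context rules. A hedged Python sketch of that style of smoothing (our illustration of the idea, not the paper's algorithm): vote serifness at paragraph level, then snap each word's attribute tuple to one of the document's few common attribute sets.

    from collections import Counter

    def smooth_serifness(word_is_serif):
        # Serifness rarely changes within a paragraph: take the majority vote.
        majority = Counter(word_is_serif).most_common(1)[0][0]
        return [majority] * len(word_is_serif)

    def snap_to_common_sets(word_attrs, k=4):
        # Documents tend to use few (bold, italic, serif) combinations:
        # map rare detections onto the k most frequent attribute sets.
        common = [a for a, _ in Counter(word_attrs).most_common(k)]
        def nearest(attr):
            return min(common, key=lambda c: sum(x != y for x, y in zip(attr, c)))
        return [a if a in common else nearest(a) for a in word_attrs]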
Evolving artificial metalloenzymes via random mutagenesis
NASA Astrophysics Data System (ADS)
Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.
2018-03-01
Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.
Improving furfural tolerance of Zymomonas mobilis by rewiring a sigma factor RpoD protein.
Tan, Fu-Rong; Dai, Li-Chun; Wu, Bo; Qin, Han; Shui, Zong-Xia; Wang, Jing-Li; Zhu, Qi-Li; Hu, Qi-Chun; Ruan, Zhi-Yong; He, Ming-Xiong
2015-06-01
Furfural from lignocellulosic hydrolysates is the key inhibitor of bio-ethanol fermentation. In this study, we report a strategy for improving furfural tolerance in Zymomonas mobilis at the transcriptional level by engineering its global transcription sigma factor (σ70, RpoD) protein. Three furfural-tolerant RpoD mutants (ZM4-MF1, ZM4-MF2, and ZM4-MF3) were identified from error-prone PCR libraries. The most furfural-tolerant strain, ZM4-MF2, reached a maximal cell density (OD600) of about 2.0 after approximately 30 h, while the control strain ZM4-rpoD reached its highest cell density of about 1.3 under the same conditions. ZM4-MF2 also consumed glucose faster and yielded more ethanol; expression levels and the activities of key Entner-Doudoroff (ED) pathway enzymes were also compared with those of the control strain under furfural stress. Our results suggest that global transcription machinery engineering could potentially be used to improve stress tolerance and ethanol production in Z. mobilis.
Teaching Statistics Online Using "Excel"
ERIC Educational Resources Information Center
Jerome, Lawrence
2011-01-01
As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…
Anticipating cognitive effort: roles of perceived error-likelihood and time demands.
Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F
2017-11-13
Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement, but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement, but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall similar pattern of judgments to Experiments 1 through 3. However, both judgments of error-likelihood and time demand similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.
Situating Student Errors: Linguistic-to-Algebra Translation Errors
ERIC Educational Resources Information Center
Adu-Gyamfi, Kwaku; Bossé, Michael J.; Chandler, Kayla
2015-01-01
While it is well recognized that students are prone to difficulties when performing linguistic-to-algebra translations, the nature of students' difficulties remain an issue of contention. Moreover, the literature indicates that these difficulties are not easily remediated by domain-specific instruction. Some have opined that this is the case…
Errors of Inference in Structural Equation Modeling
ERIC Educational Resources Information Center
McCoach, D. Betsy; Black, Anne C.; O'Connell, Ann A.
2007-01-01
Although structural equation modeling (SEM) is one of the most comprehensive and flexible approaches to data analysis currently available, it is nonetheless prone to researcher misuse and misconceptions. This article offers a brief overview of the unique capabilities of SEM and discusses common sources of user error in drawing conclusions from…
Single-Cell RT-PCR in Microfluidic Droplets with Integrated Chemical Lysis.
Kim, Samuel C; Clark, Iain C; Shahi, Payam; Abate, Adam R
2018-01-16
Droplet microfluidics can identify and sort cells using digital reverse transcription polymerase chain reaction (RT-PCR) signals from individual cells. However, current methods require multiple microfabricated devices for enzymatic cell lysis and PCR reagent addition, making the process complex and prone to failure. Here, we describe a new approach that integrates all components into a single device. The method enables controlled exposure of isolated single cells to a high pH buffer, which lyses cells and inactivates reaction inhibitors but can be instantly neutralized with RT-PCR buffer. Using our chemical lysis approach, we distinguish individual cells' gene expression with data quality equivalent to more complex two-step workflows. Our system accepts cells and produces droplets ready for amplification, making single-cell droplet RT-PCR faster and more reliable.
Exploring the relationship between boredom and sustained attention.
Malkovsky, Ela; Merrifield, Colleen; Goldberg, Yael; Danckert, James
2012-08-01
Boredom is a common experience, prevalent in neurological and psychiatric populations, yet its cognitive characteristics remain poorly understood. We explored the relationship between boredom proneness, sustained attention and adult symptoms of attention deficit hyperactivity disorder (ADHD). The results showed that high boredom-prone individuals (HBP) performed poorly on measures of sustained attention and showed increased symptoms of ADHD and depression. The results also showed that HBP individuals can be characterised as either apathetic-in which the individual is unconcerned with his/her environment, or as agitated-in which the individual is motivated to engage in meaningful activities, although attempts to do so fail to satisfy. Apathetic boredom proneness was associated with attention lapses, whereas agitated boredom proneness was associated with decreased sensitivity to errors of sustained attention, and increased symptoms of adult ADHD. Our results suggest there is a complex relationship between attention and boredom proneness.
The Concept of Accident Proneness: A Review
Froggatt, Peter; Smiley, James A.
1964-01-01
The term accident proneness was coined by psychological research workers in 1926. Since then its concept—that certain individuals are always more likely than others to sustain accidents, even though exposed to equal risk—has been questioned but seldom seriously challenged. This article describes much of the work and theory on which this concept is based, details the difficulties encountered in obtaining valid information and the interpretative errors that can arise from the examination of imperfect data, and explains why accident proneness became so readily accepted as an explanation of the facts. A recent hypothesis of accident causation, namely that a person's accident liability may vary from time to time, is outlined, and the respective abilities of this and of accident proneness to accord with data from the more reliable literature are examined. The authors conclude that the hypothesis of individual variation in liability is more realistic and in better agreement with the data than is accident proneness. PMID:14106130
Baek, Seung Cheol; Ho, Thien-Hoang; Lee, Hyun Woo; Jung, Won Kyeong; Gang, Hyo-Seung; Kang, Lin-Woo; Kim, Hoon
2017-05-01
β-1,3-1,4-Glucanase (BGlc8H) from Paenibacillus sp. X4 was mutated by error-prone PCR or truncated using termination primers to improve its enzyme properties. The crystal structure of BGlc8H was determined at a resolution of 1.8 Å to study the possible roles of the mutated residues and truncated regions of the enzyme. In mutation experiments, three clones (EP 2-6, 2-10, and 5-28) were finally selected that exhibited higher specific activities than the wild type when measured using their crude extracts. The corresponding enzyme variants BG2-6, BG2-10, and BG5-28 carried two, two, and six amino acid substitutions, respectively. These enzymes were purified to homogeneity by Hi-Trap Q and CHT-II chromatography. The specific activity of BG5-28 was 2.11-fold higher than that of the wild-type BGwt, whereas those of BG2-6 and BG2-10 were 0.93- and 1.19-fold that of the wild type, respectively. The optimum pH values and temperatures of the variants were nearly the same as those of BGwt (pH 5.0 and 40 °C, respectively). However, the half-life of enzyme activity and the catalytic efficiency (kcat/Km) of BG5-28 were 1.92- and 2.12-fold greater than those of BGwt at 40 °C, respectively. The catalytic efficiency of BG5-28 increased to 3.09-fold that of BGwt at 60 °C. These increases in the thermostability and catalytic efficiency of BG5-28 might be useful for the hydrolysis of β-glucans to produce fermentable sugars. Of the six mutated residues of BG5-28, five were present in the mature BGlc8H protein; two of these were located in the core scaffold of BGlc8H, and the remaining three were in the loop regions forming the substrate-binding pocket. In truncation experiments, three C-terminally truncated forms of BGlc8H were made, comprising 360, 286, and 215 amino acid residues instead of the 409 residues of the wild type. No enzyme activity was observed for these truncated enzymes, suggesting that the complete scaffold of the α6/α6-double-barrel structure is essential for enzyme activity.
Wang, Shijun; Yao, Jianhua; Liu, Jiamin; Petrick, Nicholas; Van Uitert, Robert L.; Periaswamy, Senthil; Summers, Ronald M.
2009-01-01
Purpose: In computed tomographic colonography (CTC), a patient will be scanned twice—once supine and once prone—to improve the sensitivity for polyp detection. To assist radiologists in CTC reading, in this paper we propose an automated method for colon registration from supine and prone CTC scans. Methods: We propose a new colon centerline registration method for prone and supine CTC scans using correlation optimized warping (COW) and canonical correlation analysis (CCA) based on the anatomical structure of the colon. Four anatomical salient points on the colon are first automatically distinguished. Then correlation optimized warping is applied to the segments defined by the anatomical landmarks to improve the global registration based on local correlation of segments. The COW method was modified by embedding canonical correlation analysis to allow multiple features along the colon centerline to be used in our implementation. Results: We tested the COW algorithm on a CTC data set of 39 patients with 39 polyps (19 training and 20 test cases) to verify the effectiveness of the proposed COW registration method. Experimental results on the test set show that the COW method significantly reduces the average estimation error in a polyp location between supine and prone scans by 67.6%, from 46.27±52.97 mm to 14.98±11.41 mm, compared to the normalized distance along the colon centerline algorithm (p<0.01). Conclusions: The proposed COW algorithm is more accurate for the colon centerline registration compared to the normalized distance along the colon centerline method and the dynamic time warping method. Comparison results showed that the feature combination of z-coordinate and curvature achieved the lowest registration error compared to the other feature combinations used by COW. The proposed method is tolerant to centerline errors because anatomical landmarks help prevent the propagation of errors across the entire colon centerline. PMID:20095272
Eisenberg, Dan T A; Kuzawa, Christopher W; Hayes, M Geoffrey
2015-01-01
Telomere length (TL) is commonly measured using quantitative PCR (qPCR). Although easier than the Southern blot of terminal restriction fragments (TRF) method of TL measurement, one drawback of qPCR is that it introduces greater measurement error and thus reduces the statistical power of analyses. To address a potential source of measurement error, we consider the effect of well position on qPCR TL measurements. qPCR TL data from 3,638 people run on a Bio-Rad iCycler iQ are reanalyzed here. To evaluate measurement validity, correspondence with TRF measurements, with age, and between mother and offspring is examined. First, we present evidence for systematic variation in qPCR TL measurements in relation to thermocycler well position. Controlling for these well-position effects consistently improves measurement validity and yields estimated improvements in statistical power equivalent to increasing sample size by 16%. We additionally evaluated the linearity of the relationships between telomere and single-copy-gene control amplicons and between qPCR and TRF measures. We find that, unlike some previous reports, our data exhibit linear relationships. We introduce the standard error in percent, a superior method for quantifying measurement error compared with the commonly used coefficient of variation. Using this measure, we find that excluding samples with high measurement error does not improve measurement validity in our study. Future studies using block-based thermocyclers should consider well-position effects. Since additional information can be gleaned from well-position corrections, rerunning analyses of previous results with well-position correction could serve as an independent test of the validity of those results. © 2015 Wiley Periodicals, Inc.
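Both ideas here are easy to state concretely: remove systematic well-position effects before analysis, and report replicate error as a standard error in percent rather than a coefficient of variation. A minimal pandas/numpy sketch under simple assumptions (additive row/column effects; the column names are hypothetical, and the study's own correction may differ):

    import numpy as np
    import pandas as pd

    def well_position_corrected(df):
        # Residualize T/S ratios on plate row and column means (additive model),
        # so systematic well-position effects do not masquerade as biology.
        out = df.copy()
        grand = out["ts_ratio"].mean()
        row_eff = out.groupby("row")["ts_ratio"].transform("mean") - grand
        col_eff = out.groupby("col")["ts_ratio"].transform("mean") - grand
        out["ts_corrected"] = out["ts_ratio"] - row_eff - col_eff
        return out

    def standard_error_percent(replicates):
        # SE% = 100 * (SD / sqrt(n)) / mean; unlike the CV, it shrinks with
        # the number of replicates actually averaged per sample.
        reps = np.asarray(replicates, dtype=float)
        return 100.0 * reps.std(ddof=1) / np.sqrt(len(reps)) / reps.mean()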
High-Throughput Screening Assay for Laccase Engineering toward Lignosulfonate Valorization
Rodríguez-Escribano, David; de Salas, Felipe; Camarero, Susana
2017-01-01
Lignin valorization is a pending issue for the integrated conversion of lignocellulose in consumer goods. Lignosulfonates (LS) are the main technical lignins commercialized today. However, their molecular weight should be enlarged to meet application requirements as additives or dispersing agents. Oxidation of lignosulfonates with fungal oxidoreductases, such as laccases, can increase the molecular weight of lignosulfonates by the cross-linking of lignin phenols. To advance in this direction, we describe here the development of a high-throughput screening (HTS) assay for the directed evolution of laccases, with lignosulfonate as substrate and the Folin–Ciocalteau reagent (FCR), to detect the decrease in phenolic content produced upon polymerization of lignosulfonate by the enzyme. Once the reaction conditions were adjusted to the 96-well-plate format, the enzyme for validating the assay was selected from a battery of high-redox-potential laccase variants functionally expressed in S. cerevisiae (the preferred host for the directed evolution of fungal oxidoreductases). The colorimetric response (absorbance at 760 nm) correlated with laccase activity secreted by the yeast. The HTS assay was reproducible (coefficient of variation (CV) = 15%) and sensitive enough to detect subtle differences in activity among yeast clones expressing a laccase mutant library obtained by error-prone PCR (epPCR). The method is therefore feasible for screening thousands of clones during the precise engineering of laccases toward valorization of lignosulfonates. PMID:28820431
Störmer, M; Cassens, U; Kleesiek, K; Dreier, J
2007-02-01
Bacteria show differences in their growth kinetics depending on the type of blood component. On storage at 22 degrees C, platelet concentrates (PCs) seem to be more prone to bacterial multiplication than red cell concentrates. Knowledge of the potential for bacterial proliferation in blood components stored at a range of temperatures is essential before considering implementation of a detection strategy. The efficacy of bacterial detection was determined, using real-time reverse transcriptase-polymerase chain reaction (RT-PCR), following bacterial growth in blood components obtained from a deliberately contaminated whole-blood (WB) unit. Cultivation was used as the reference method. WB was spiked with 2 colony-forming units mL(-1) of Staphylococcus epidermidis or Klebsiella pneumoniae, kept for 15 h at room temperature, and then processed into components. Samples were drawn from each blood component at intervals throughout the whole separation process. Nucleic acids were extracted using an automated high-volume extraction method. The 15-h storage revealed an insignificant increase in bacterial titre. No bacterial growth was detected in red blood cell or plasma units. K. pneumoniae showed rapid growth in the pooled PC and could be detected immediately after preparation using RT-PCR. S. epidermidis grew slowly and was detected 24 h after separation. These experiments show that, to minimize sampling error, sampling of PCs for bacterial screening is informative at the earliest 24 h after preparation.
Pirpinia, Kleopatra; Bosman, Peter A N; Loo, Claudette E; Winter-Warnars, Gonneke; Janssen, Natasja N Y; Scholten, Astrid N; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja
2017-06-23
Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. Here, patient-specific, multi-objective weight optimization is needed; it obtained a mean TRE of 13.6 mm without guidance information, reduced to 7.3 mm with guidance information, while also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.
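Navigating trade-offs as described relies on extracting the non-dominated set from candidate outcomes. A generic sketch for two minimization objectives follows; the numbers are invented, and this is not the authors' evolutionary algorithm.

```python
def pareto_front(points):
    """Return the non-dominated points, all objectives to be minimized."""
    return [p for p in points
            if not any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                       for q in points)]

# Hypothetical (dissimilarity, deformation magnitude) pairs from random weights.
candidates = [(0.30, 4.1), (0.25, 5.0), (0.40, 2.2), (0.31, 4.0), (0.25, 6.3)]
print(pareto_front(candidates))   # (0.25, 6.3) is dominated by (0.25, 5.0)
```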
1980-03-01
interpreting/smoothing data containing a significant percentage of gross errors, and thus is ideally suited for applications in automated image ... analysis where interpretation is based on the data provided by error-prone feature detectors. A major portion of the paper describes the application of
Zhou, Shuntai; Jones, Corbin; Mieczkowski, Piotr
2015-01-01
Validating the sampling depth and reducing sequencing errors are critical for studies of viral populations using next-generation sequencing (NGS). We previously described the use of Primer ID to tag each viral RNA template with a block of degenerate nucleotides in the cDNA primer. We now show that low-abundance Primer IDs (offspring Primer IDs) are generated due to PCR/sequencing errors. These artifactual Primer IDs can be removed using a cutoff model for the number of reads required to make a template consensus sequence. We have modeled the fraction of sequences lost due to Primer ID resampling. For a typical sequencing run, less than 10% of the raw reads are lost to offspring Primer ID filtering and resampling. The remaining raw reads are used to correct for PCR resampling and sequencing errors. We also demonstrate that Primer ID reveals bias intrinsic to PCR, especially at low template input or utilization. cDNA synthesis and PCR convert ca. 20% of RNA templates into recoverable sequences, and 30-fold sequence coverage recovers most of these template sequences. We have directly measured the residual error rate to be around 1 in 10,000 nucleotides. We use this error rate and the Poisson distribution to define the cutoff to identify preexisting drug resistance mutations at low abundance in an HIV-infected subject. Collectively, these studies show that >90% of the raw sequence reads can be used to validate template sampling depth and to dramatically reduce the error rate in assessing a genetically diverse viral population using NGS. IMPORTANCE: Although next-generation sequencing (NGS) has revolutionized sequencing strategies, it suffers from serious limitations in defining sequence heterogeneity in a genetically diverse population, such as HIV-1, due to PCR resampling and PCR/sequencing errors. The Primer ID approach reveals the true sampling depth and greatly reduces errors. Knowing the sampling depth allows the construction of a model of how to maximize the recovery of sequences from input templates and to reduce resampling of the Primer ID so that appropriate multiplexing can be included in the experimental design. With the defined sampling depth and measured error rate, we are able to assign cutoffs for the accurate detection of minority variants in viral populations. This approach allows the power of NGS to be realized without having to guess about sampling depth or to ignore the problem of PCR resampling, while also being able to correct most of the errors in the data set. PMID:26041299
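The cutoff logic can be sketched from the reported residual error rate (~1 in 10,000 nucleotides) and the Poisson distribution; the template count and false-call target below are assumed values, not the authors' exact model.

```python
from math import exp, factorial

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1.0 - sum(exp(-lam) * lam ** i / factorial(i) for i in range(k))

error_rate = 1e-4          # residual errors per nucleotide (reported above)
n_templates = 5000         # assumed template consensus sequences at a position
lam = error_rate * n_templates     # expected spurious mutant templates by error alone

# Smallest count k such that >= k mutant templates is unlikely to arise by error.
k = 1
while poisson_sf(k, lam) > 1e-3:   # assumed per-position false-call target
    k += 1
print(f"call a minority variant only if seen in >= {k} of {n_templates} templates")
```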
Roon, David A.; Waits, L.P.; Kendall, K.C.
2005-01-01
Non-invasive genetic sampling (NGS) is becoming a popular tool for population estimation. However, multiple NGS studies have demonstrated that polymerase chain reaction (PCR) genotyping errors can bias demographic estimates. These errors can be detected by comprehensive data filters such as the multiple-tubes approach, but this approach is expensive and time consuming as it requires three to eight PCR replicates per locus. Thus, researchers have attempted to correct PCR errors in NGS datasets using non-comprehensive error checking methods, but these approaches have not been evaluated for reliability. We simulated NGS studies with and without PCR error and 'filtered' datasets using non-comprehensive approaches derived from published studies and calculated mark-recapture estimates using CAPTURE. In the absence of data filtering, simulated error resulted in serious inflations in CAPTURE estimates; some estimates exceeded N by ≥ 200%. When data filters were used, CAPTURE estimate reliability varied with per-locus error (Ep). At Ep = 0.01, CAPTURE estimates from filtered data displayed < 5% deviance from error-free estimates. When Ep was 0.05 or 0.09, some CAPTURE estimates from filtered data displayed biases in excess of 10%. Biases were positive at high sampling intensities; negative biases were observed at low sampling intensities. We caution researchers against using non-comprehensive data filters in NGS studies, unless they can achieve baseline per-locus error rates below 0.05 and, ideally, near 0.01. However, we suggest that data filters can be combined with careful technique and thoughtful NGS study design to yield accurate demographic information. © 2005 The Zoological Society of London.
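The inflation mechanism is easy to reproduce in a toy simulation: introduce per-locus miscalls into repeated captures of the same individuals and count distinct multilocus genotypes. This stands in for, but is not, the authors' CAPTURE pipeline; all parameters are invented.

```python
import random

random.seed(1)
n_ind, n_loci, n_captures, err = 50, 8, 3, 0.05   # toy parameters

# True multilocus genotypes, one allele code per locus for simplicity.
truth = [tuple(random.randint(0, 9) for _ in range(n_loci)) for _ in range(n_ind)]

observed = set()
for g in truth:
    for _ in range(n_captures):                   # each animal sampled 3 times
        observed.add(tuple(a if random.random() > err else (a + 1) % 10
                           for a in g))           # per-locus miscall creates a "ghost"

print(f"true N = {n_ind}, distinct observed genotypes = {len(observed)}")
```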
Bentley, Johanne; Diggle, Christine P.; Harnden, Patricia; Knowles, Margaret A.; Kiltie, Anne E.
2004-01-01
In human cells DNA double strand breaks (DSBs) can be repaired by the non-homologous end-joining (NHEJ) pathway. In a background of NHEJ deficiency, DSBs with mismatched ends can be joined by an error-prone mechanism involving joining between regions of nucleotide microhomology. The majority of joins formed from a DSB with partially incompatible 3′ overhangs by cell-free extracts from human glioblastoma (MO59K) and urothelial (NHU) cell lines were accurate and produced by the overlap/fill-in of mismatched termini by NHEJ. However, repair of DSBs by extracts prepared from tissue of four high-grade bladder carcinomas resulted in no accurate join formation. Junctions were formed by the non-random deletion of terminal nucleotides and showed a preference for annealing at a microhomology of 8 nt buried within the DNA substrate; this process was not dependent on functional Ku70, DNA-PK or XRCC4. Junctions were repaired in the same manner in MO59K extracts in which accurate NHEJ was inactivated by inhibition of Ku70 or DNA-PKcs. These data indicate that bladder tumour extracts are unable to perform accurate NHEJ, such that error-prone joining predominates. Therefore, in high-grade tumours mismatched DSBs are repaired by a highly mutagenic, microhomology-mediated, alternative end-joining pathway, a process that may contribute to the genomic instability observed in bladder cancer. PMID:15466592
Niemann, Dorothee; Bertsche, Astrid; Meyrath, David; Koepf, Ellen D; Traiser, Carolin; Seebald, Katja; Schmitt, Claus P; Hoffmann, Georg F; Haefeli, Walter E; Bertsche, Thilo
2015-01-01
The aim was to prevent medication errors in drug handling in a paediatric ward. One in five preventable adverse drug events in hospitalised children is caused by medication errors. Errors in drug prescription have been studied frequently, but data regarding drug handling, including drug preparation and administration, are scarce. A three-step intervention study including a monitoring procedure was used to detect and prevent medication errors in drug handling. After approval by the ethics committee, pharmacists monitored drug handling by nurses on an 18-bed paediatric ward in a university hospital prior to and following each intervention step. They also conducted a questionnaire survey aimed at identifying knowledge deficits. Each intervention step targeted different causes of errors: the handout mainly addressed knowledge deficits, the training course addressed errors caused by rule violations and slips, and the reference book addressed knowledge-, memory- and rule-based errors. The number of patients who were subjected to at least one medication error in drug handling decreased from 38/43 (88%) to 25/51 (49%) following the third intervention, and the overall frequency of errors decreased from 527 errors in 581 processes (91%) to 116/441 (26%). Issuing the handout reduced medication errors caused by knowledge deficits regarding, for instance, the correct 'volume of solvent for IV drugs' from 49% to 25%. Paediatric drug handling is prone to errors. A three-step intervention effectively decreased the high frequency of medication errors by addressing the diversity of their causes. Worldwide, nurses are in charge of drug handling, which constitutes an error-prone but often-neglected step in drug therapy. Detection and prevention of errors in daily routine are necessary for a safe and effective drug therapy. Our three-step intervention reduced errors and is suitable to be tested in other wards and settings. © 2014 John Wiley & Sons Ltd.
Yeow, Jianwei; Wang, Ivy; Zhang, Hongfang; Song, Hao; Jiang, Rongrong
2013-01-01
A major challenge in bioethanol fermentation is the low tolerance of the microbial host towards the end product, bioethanol. Here we report improving the ethanol tolerance of E. coli at the transcriptional level by engineering its global transcription factor, the cAMP receptor protein (CRP), which is known to regulate over 400 genes in E. coli. Three ethanol-tolerant CRP mutants (E1–E3) were identified from error-prone PCR libraries. The best ethanol-tolerant strain, E2 (M59T), had a growth rate of 0.08 h−1 in 62 g/L ethanol, higher than that of the control at 0.06 h−1. The M59T mutation was then integrated into the genome to create variant iE2. When exposed to 150 g/L ethanol, the survival of iE2 after 15 min was about 12%, while that of BW25113 was <0.01%. Quantitative real-time reverse transcription PCR (RT-PCR) analysis of 444 CRP-regulated genes using OpenArray® technology revealed that 203 genes were differentially expressed in iE2 in the absence of ethanol, whereas 92 displayed differential expression when facing ethanol stress. These genes belong to various functional groups, including central intermediary metabolism (aceE, acnA, sdhD, sucA), iron ion transport (entH, entD, fecA, fecB), and general stress response (osmY, rpoS). Six up-regulated and twelve down-regulated common genes were found in both iE2 and E2 under ethanol stress, whereas over one hundred common genes showed differential expression in the absence of ethanol. Based on the RT-PCR results, entA, marA or bhsA was knocked out in iE2, and the resulting strains became more sensitive towards ethanol. PMID:23469036
De novo centriole formation in human cells is error-prone and does not require SAS-6 self-assembly.
Wang, Won-Jing; Acehan, Devrim; Kao, Chien-Han; Jane, Wann-Neng; Uryu, Kunihiro; Tsou, Meng-Fu Bryan
2015-11-26
Vertebrate centrioles normally propagate through duplication, but in the absence of preexisting centrioles, de novo synthesis can occur. Consistently, centriole formation is thought to strictly rely on self-assembly, involving self-oligomerization of the centriolar protein SAS-6. Here, through reconstitution of de novo synthesis in human cells, we surprisingly found that normal looking centrioles capable of duplication and ciliation can arise in the absence of SAS-6 self-oligomerization. Moreover, whereas canonically duplicated centrioles always form correctly, de novo centrioles are prone to structural errors, even in the presence of SAS-6 self-oligomerization. These results indicate that centriole biogenesis does not strictly depend on SAS-6 self-assembly, and may require preexisting centrioles to ensure structural accuracy, fundamentally deviating from the current paradigm.
Comparing errors in Medicaid reporting across surveys: evidence to date.
Call, Kathleen T; Davern, Michael E; Klerman, Jacob A; Lynch, Victoria
2013-04-01
Objective: To synthesize evidence on the accuracy of Medicaid reporting across state and federal surveys. Data sources: All available validation studies. Study design: Comparison of results from existing research to understand variation in reporting across surveys. Data collection methods: Synthesis of all available studies validating survey reports of Medicaid coverage. Principal findings: Across all surveys, reporting some type of insurance coverage is better than reporting Medicaid specifically. Therefore, estimates of uninsurance are less biased than estimates of specific sources of coverage. The CPS stands out as being particularly inaccurate. Conclusions: Measuring health insurance coverage is prone to some level of error, yet survey overstatements of uninsurance are modest in most surveys. Accounting for all forms of bias is complex. Researchers should consider adjusting estimates of Medicaid and uninsurance in surveys prone to high levels of misreporting. © Health Research and Educational Trust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thanh, L.T.; Man, Nguyen Thi; Morris, G.E.
1995-08-28
We have produced a new panel of 20 monoclonal antibodies (mAbs) against a region of the dystrophin protein corresponding to a deletion-prone region of the Duchenne muscular dystrophy gene (exons 45-50). We show that immunohistochemistry or Western blotting with these "exon-specific" mAbs can provide a valuable addition to Southern blotting or PCR methods for the accurate identification of genetic deletions in Becker muscular dystrophy patients. The antibodies were mapped to the following exons: exon 45 (2 mAbs), exon 46 (6), exon 47 (1), exons 47/48 (4), exons 48-50 (6), and exon 50 (1). PCR amplification of single exons or groups of exons was used both to produce specific dystrophin immunogens and to map the mAbs obtained. PCR-mediated mutagenesis was also used to identify regions of dystrophin important for mAb binding. Because the mAbs can be used to characterize the dystrophin produced by individual muscle fibres, they will also be useful for studying "revertant" fibres in Duchenne muscle and for monitoring the results of myoblast therapy trials in MD patients with deletions in this region of the dystrophin gene. 27 refs., 7 figs., 3 tabs.
ERIC Educational Resources Information Center
Al Baghal, Tarek
2017-01-01
Prior studies suggest memories are potentially error prone. Proactive dependent interviewing (PDI) is a possible method to reduce errors in reports of change in longitudinal studies, reminding respondents of previous answers while asking if there has been any change since the last survey. However, little research has been conducted on the impact…
The Error Prone Model and the Basic Grants Validation Selection System. Draft Final Report.
ERIC Educational Resources Information Center
System Development Corp., Falls Church, VA.
An evaluation of existing and proposed mechanisms to ensure data accuracy for the Pell Grant program is reported, and recommendations for efficient detection of fraud and error in the program are offered. One study objective was to examine the existing system of pre-established criteria (PEC), which are validation criteria that select students on…
Estimation of a cover-type change matrix from error-prone data
Steen Magnussen
2009-01-01
Coregistration and classification errors seriously compromise per-pixel estimates of land cover change. A more robust estimation of change is proposed in which adjacent pixels are grouped into 3×3 clusters and treated as a unit of observation. A complete change matrix is recovered in a two-step process. The diagonal elements of a change matrix are recovered from...
EEG and chaos: Description of underlying dynamics and its relation to dissociative states
NASA Technical Reports Server (NTRS)
Ray, William J.
1994-01-01
The goal of this work is the identification of states, especially as related to the process of error production and lapses of awareness as might be experienced during aviation. Given the need for further articulation of the characteristics of an 'error-prone state' or 'hazardous state of awareness,' this NASA grant focused on basic groundwork for the study of the psychophysiology of these states. Specifically, the purpose of this grant was to establish the necessary methodology for addressing three broad questions. The first is how the error-prone state should be conceptualized, and whether it is similar to a dissociative state, a hypnotic state, or absent-mindedness. Over 1200 subjects completed a variety of psychometric measures reflecting internal states and proneness to mental lapses and absent-mindedness; the study suggests a consistency in the patterns displayed by individuals who self-report dissociative experiences, such that those who score high on measures of dissociation also score high on measures of absent-mindedness, errors, and absorption, but not on scales of hypnotizability. The second broad question is whether some individuals are more prone to enter these states than others. In a second study, 14 young adults who scored either high or low on the dissociative experiences scale performed a series of six tasks. This study suggests that high and low dissociative individuals arrive at the experiment in similar electrocortical states and perform cognitive tasks (e.g., mental math) in a similar manner; it is in the processing of internal emotional states that differences begin to emerge. The third question is whether recent research in nonlinear dynamics, i.e., chaos, offers an addition and/or alternative to traditional signal processing methods, i.e., fast Fourier transforms, and whether chaos procedures can be modified to offer additional information useful in identifying brain states. A preliminary review suggests that current nonlinear dynamical techniques, such as dimensional analysis, can be successfully applied to electrocortical activity. Using the data set developed in the study of the young adults, chaos analyses using the Farmer algorithm were performed; it is concluded that dimensionality measures reflect information not contained in traditional EEG Fourier analysis.
In-vitro engineering of novel bioactivity in the natural enzymes
NASA Astrophysics Data System (ADS)
Tiwari, Vishvanath
2016-10-01
Enzymes catalyze various biochemical functions with high efficiency and specificity. In-vitro design of an enzyme introduces novel bioactivity into this natural biomolecule and answers some vital questions, such as which residues are crucial for substrate binding, how the enzyme evolved, and what determines cofactor specificity. Enzyme engineering technology involves directed evolution, rational design, semi-rational design and structure-based design using chemical modifications. Combined computational and in-vitro evolution approaches likewise help in the artificial design of novel bioactivity in natural enzymes. DNA shuffling, error-prone PCR and the staggered extension process are used to artificially redesign the active site of an enzyme, which can alter its efficiency and specificity. Modification of an enzyme can lead to the discovery of new paths of molecular evolution, the design of more efficient enzymes, the location of active sites and crucial residues, and shifts in substrate and cofactor specificity. The methods and thermodynamics of in-vitro enzyme design are also discussed. Similarly, engineered thermophilic and psychrophilic enzymes can attain the substrate specificity and activity of mesophilic enzymes, which may also be beneficial for industry and therapeutics.
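As a back-of-the-envelope aid to library design, the mutation load per clone in error-prone PCR is roughly Poisson-distributed, with mean equal to (error rate per nucleotide per duplication) × (gene length) × (number of template doublings). All rates below are assumed, illustrative values.

```python
from math import exp, factorial, log2

error_rate = 2e-4            # assumed substitutions per nt per duplication (e.g. Taq + Mn2+)
gene_len = 900               # assumed target length, nt
doublings = log2(1000)       # ~10 effective doublings for 1000-fold amplification

lam = error_rate * gene_len * doublings    # mean mutations per clone (Poisson mean)
for k in range(5):
    p = exp(-lam) * lam ** k / factorial(k)
    print(f"P({k} mutations per clone) = {p:.3f}")
```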
[A new method of processing quantitative PCR data].
Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun
2003-05-01
Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, PE found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable; on this basis they developed a quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too great to satisfy those needs, and a better quantitative PCR technique is required. The mathematical model presented here draws on results from related fields and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number, and other reaction conditions, and accurately reflects how PCR product molecules accumulate. Accurate quantitative PCR analysis can be performed using this functional relation: the accumulated PCR product quantity can be obtained from the initial template number. When this model is used for quantitative PCR analysis, the resulting error depends only on the accuracy of the fluorescence intensity measurement, that is, on the instrument used. For example, when the fluorescence intensity is accurate to six digits and the initial template number is between 100 and 1,000,000, the quantitative accuracy exceeds 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing data with this quantitative analysis system yields results up to 80 times more accurate than the CT method.
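For context, the exponential relation underlying both the CT method and the model described is X(n) = X0(1 + E)^n, with per-cycle efficiency E. The sketch below shows the simple threshold inversion; the threshold copy number and efficiency are assumed values.

```python
def initial_copies(ct, threshold_copies=1e10, efficiency=0.95):
    """Invert X(ct) = X0 * (1 + E)**ct to recover the starting copy number X0."""
    return threshold_copies / (1 + efficiency) ** ct

# Example: a measured Ct of 24.3 under the assumed threshold and efficiency.
print(f"{initial_copies(24.3):.3g} starting copies")   # ~890 for these inputs
```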
Weeden, Clare E.; Chen, Yunshun; Ma, Stephen B.; Hu, Yifang; Ramm, Georg; Sutherland, Kate D.; Smyth, Gordon K.
2017-01-01
Lung squamous cell carcinoma (SqCC), the second most common subtype of lung cancer, is strongly associated with tobacco smoking and exhibits genomic instability. The cellular origins and molecular processes that contribute to SqCC formation are largely unexplored. Here we show that human basal stem cells (BSCs) isolated from heavy smokers proliferate extensively, whereas their alveolar progenitor cell counterparts have limited colony-forming capacity. We demonstrate that this difference arises in part because of the ability of BSCs to repair their DNA more efficiently than alveolar cells following ionizing radiation or chemical-induced DNA damage. Analysis of mice harbouring a mutation in the DNA-dependent protein kinase catalytic subunit (DNA-PKcs), a key enzyme in DNA damage repair by nonhomologous end joining (NHEJ), indicated that BSCs preferentially repair their DNA by this error-prone process. Interestingly, polyploidy, a phenomenon associated with genetically unstable cells, was only observed in the human BSC subset. Expression signature analysis indicated that BSCs are the likely cells of origin of human SqCC and that high levels of NHEJ genes in SqCC are correlated with increasing genomic instability. Hence, our results favour a model in which heavy smoking promotes proliferation of BSCs, and their predilection for error-prone NHEJ could lead to the high mutagenic burden that culminates in SqCC. Targeting DNA repair processes may therefore have a role in the prevention and therapy of SqCC. PMID:28125611
Sauer, Juergen; Chavaillaz, Alain; Wastell, David
2016-06-01
This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.
Peripheral blood antigen presenting cell responses in otitis-prone and non-otitis-prone infants.
Surendran, Naveen; Nicolosi, Ted; Kaur, Ravinder; Pichichero, Michael E
2016-01-01
Stringently defined otitis-prone (sOP) children represent a new classification of the otitis-prone condition. Previous studies showed dysfunction in Ab, B-cell memory and T-cell memory responses. We sought to determine whether there are defects in numbers, phenotype and/or function of professional APC in the peripheral blood of sOP infants. APC phenotypic counts, MHC II expression and intracellular cytokine levels were determined in response to TLR7/8 (R848) stimulation by flow cytometry. Innate immune mRNA expression was measured using RT-PCR and cytokines were measured using Luminex technology. Significant (P < 0.05) increases in the phenotypic counts of monocytes and conventional dendritic cells but not plasmacytoid DCs were observed in sOP compared with non-otitis-prone (NOP) age-matched infants. No significant differences in APC activation or function were observed. Expression of various TLRs, intracellular signaling molecules and downstream cytokines was also not found to be significantly different between sOP and NOP infants. Higher numbers of APCs in sOP infants suggest the possibility of a persistent mucosal inflammatory status. Transcriptional and cytokine profiles of PBMCs among sOP infants suggest their systemic innate responses are not different compared to NOP infants. © The Author(s) 2015.
STARS Proceedings (3-4 December 1991)
1991-12-04
PROJECT PROCESS OBJECTIVES & ASSOCIATED METRICS: Prioritize ECPs: complexity & error-history measures. Make vs Buy decisions: Effort & Quality (or ... history measures; error-proneness and past histories of trouble with particular modules are very useful measures. Make vs Buy decisions: Does the ... Effort offset the gain in Quality relative to buy ... Effort and Quality (or defect rate) histories give helpful indications of how to make this decision
Defense Mapping Agency (DMA) Raster-to-Vector Analysis
1984-11-30
model) to pinpoint critical deficiencies and understand trade-offs between alternative solutions. This may be exemplified by the allocation of human ...process, prone to errors (i.e., human operator eye/motor control limitations), and its time-consuming nature (as a function of data density). It should...achieved through the facilities of computer interactive graphics. Each error or anomaly is individually identified by a human operator and corrected
Inducible DNA-repair systems in yeast: competition for lesions.
Mitchel, R E; Morrison, D P
1987-03-01
DNA lesions may be recognized and repaired by more than one DNA-repair process. If two repair systems with different error frequencies have overlapping lesion specificity and one or both is inducible, the resulting variable competition for the lesions can change the biological consequences of these lesions. This concept was demonstrated by observing mutation in yeast cells (Saccharomyces cerevisiae) exposed to combinations of mutagens under conditions which influenced the induction of error-free recombinational repair or error-prone repair. Total mutation frequency was reduced in a manner proportional to the dose of 60Co-gamma- or 254 nm UV radiation delivered prior to or subsequent to an MNNG exposure. Suppression was greater per unit radiation dose in cells gamma-irradiated in O2 as compared to N2. A rad3 (excision-repair) mutant gave results similar to wild-type but mutation in a rad52 (rec-) mutant exposed to MNNG was not suppressed by radiation. Protein-synthesis inhibition with heat shock or cycloheximide indicated that it was the mutation due to MNNG and not that due to radiation which had changed. These results indicate that MNNG lesions are recognized by both the recombinational repair system and the inducible error-prone system, but that gamma-radiation induction of error-free recombinational repair resulted in increased competition for the lesions, thereby reducing mutation. Similarly, gamma-radiation exposure resulted in a radiation dose-dependent reduction in mutation due to MNU, EMS, ENU and 8-MOP + UVA, but no reduction in mutation due to MMS. These results suggest that the number of mutational MMS lesions recognizable by the recombinational repair system must be very small relative to those produced by the other agents. MNNG induction of the inducible error-prone systems however, did not alter mutation frequencies due to ENU or MMS exposure but, in contrast to radiation, increased the mutagenic effectiveness of EMS. These experiments demonstrate that in this lower eukaryote, mutagen exposure does not necessarily result in a fixed risk of mutation, but that the risk can be markedly influenced by a variety of external stimuli including heat shock or exposure to other mutagens.
Haliasos, N; Rezajooi, K; O'Neill, K S; Van Dellen, J; Hudovsky, Anita; Nouraei, Sar
2010-04-01
Clinical coding is the translation of documented clinical activities during an admission into a codified language. Healthcare Resource Groups (HRGs) are derived from coding data and are used to calculate payment to hospitals in England, Wales and Scotland and to conduct national audit and benchmarking exercises. Coding is an error-prone process, and an understanding of its accuracy within neurosurgery is critical for financial, organizational and clinical governance purposes. We undertook a multidisciplinary audit of neurosurgical clinical coding accuracy. Neurosurgeons trained in coding assessed the accuracy of 386 patient episodes. Where clinicians felt a coding error was present, the case was discussed with an experienced clinical coder. Concordance between the initial coder-only clinical coding and the final clinician-coder multidisciplinary coding was assessed. At least one coding error occurred in 71/386 patients (18.4%). There were 36 diagnosis and 93 procedure errors, and in 40 cases the initial HRG changed (10.4%). Financially, this translated to a £111 revenue loss per patient episode, projecting to £171,452 of annual loss to the department. 85% of all coding errors were due to an accumulation of coding changes that occurred only once in the whole data set. Neurosurgical clinical coding is error-prone. This is financially disadvantageous, and with coding data being the source of comparisons within and between departments, coding inaccuracies paint a distorted picture of departmental activity and subspecialism in audit and benchmarking. Clinical engagement improves accuracy and is encouraged within a clinical governance framework.
Clustered Mutation Signatures Reveal that Error-Prone DNA Repair Targets Mutations to Active Genes.
Supek, Fran; Lehner, Ben
2017-07-27
Many processes can cause the same nucleotide change in a genome, making the identification of the mechanisms causing mutations a difficult challenge. Here, we show that clustered mutations provide a more precise fingerprint of mutagenic processes. Of nine clustered mutation signatures identified from >1,000 tumor genomes, three relate to variable APOBEC activity and three are associated with tobacco smoking. An additional signature matches the spectrum of translesion DNA polymerase eta (POLH). In lymphoid cells, these mutations target promoters, consistent with AID-initiated somatic hypermutation. In solid tumors, however, they are associated with UV exposure and alcohol consumption and target the H3K36me3 chromatin of active genes in a mismatch repair (MMR)-dependent manner. These regions normally have a low mutation rate because error-free MMR also targets H3K36me3 chromatin. Carcinogens and error-prone repair therefore redistribute mutations to the more important regions of the genome, contributing a substantial mutation load in many tumors, including driver mutations. Copyright © 2017 Elsevier Inc. All rights reserved.
A hydrostatic weighing method using total lung capacity and a small tank.
Warner, J G; Yeater, R; Sherwood, L; Weber, K
1986-01-01
The purpose of this study was to establish the validity and reliability of a hydrostatic weighing method using total lung capacity (measuring vital capacity with a respirometer at the time of weighing), the prone position, and a small oblong tank. The validity of the method was established by comparing the TLC prone (tank) method against three hydrostatic weighing methods administered in a pool. The three methods included residual volume seated, TLC seated and TLC prone. Eighty male and female subjects were underwater weighed using each of the four methods. Validity coefficients for per cent body fat between the TLC prone (tank) method and the RV seated (pool), TLC seated (pool) and TLC prone (pool) methods were .98, .99 and .99, respectively. A randomised complete block ANOVA found significant differences between the RV seated (pool) method and each of the three TLC methods with respect to both body density and per cent body fat. The differences were negligible with respect to HW error. Reliability of the TLC prone (tank) method was established by weighing twenty subjects three different times with ten-minute intervals between testing. Multiple correlations yielded reliability coefficients for body density and per cent body fat of .99 and .99, respectively. It was concluded that the TLC prone (tank) method is valid, reliable and a favourable method of hydrostatic weighing. PMID:3697596
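For reference, the density computation common to all four weighing methods follows Archimedes' principle, with the measured lung volume (RV for the seated pool method, TLC for the others) subtracted from the displaced volume. The constants and readings below are illustrative, not from the study.

```python
def body_density(mass_air_kg, mass_water_kg, lung_volume_l,
                 water_density=0.9957, gi_gas_l=0.1):
    """Body density via Archimedes' principle; subtract lung gas (RV or TLC,
    depending on the weighing method) and an assumed gastrointestinal gas volume."""
    displaced_l = (mass_air_kg - mass_water_kg) / water_density
    return mass_air_kg / (displaced_l - lung_volume_l - gi_gas_l)

def siri_percent_fat(density_kg_per_l):
    """Siri two-compartment conversion from body density to percent body fat."""
    return 495.0 / density_kg_per_l - 450.0

# At TLC most subjects float, so the underwater scale reading can be negative.
db = body_density(75.0, -2.5, lung_volume_l=7.0)   # illustrative values
print(f"density = {db:.4f} kg/L, body fat = {siri_percent_fat(db):.1f}%")
```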
Stem revenue losses with effective CDM management.
Alwell, Michael
2003-09-01
Effective CDM management not only minimizes revenue losses due to denied claims, but also helps eliminate administrative costs associated with correcting coding errors. Accountability for CDM management should be assigned to a single individual, who ideally reports to the CFO or high-level finance director. If your organization is prone to making billing errors due to CDM deficiencies, you should consider purchasing CDM software to help you manage your CDM.
Baumann, Claudia; Wang, Xiaotian; Yang, Luhan; Viveiros, Maria M
2017-04-01
Mouse oocytes lack canonical centrosomes and instead contain unique acentriolar microtubule-organizing centers (aMTOCs). To test the function of these distinct aMTOCs in meiotic spindle formation, pericentrin (Pcnt), an essential centrosome/MTOC protein, was knocked down exclusively in oocytes by using a transgenic RNAi approach. Here, we provide evidence that disruption of aMTOC function in oocytes promotes spindle instability and severe meiotic errors that lead to pronounced female subfertility. Pcnt-depleted oocytes from transgenic (Tg) mice were ovulated at the metaphase-II stage, but show significant chromosome misalignment, aneuploidy and premature sister chromatid separation. These defects were associated with loss of key Pcnt-interacting proteins (γ-tubulin, Nedd1 and Cep215) from meiotic spindle poles, altered spindle structure and chromosome-microtubule attachment errors. Live-cell imaging revealed disruptions in the dynamics of spindle assembly and organization, together with chromosome attachment and congression defects. Notably, spindle formation was dependent on Ran GTPase activity in Pcnt-deficient oocytes. Our findings establish that meiotic division is highly error-prone in the absence of Pcnt and disrupted aMTOCs, similar to what reportedly occurs in human oocytes. Moreover, these data underscore crucial differences between MTOC-dependent and -independent meiotic spindle assembly. © 2017. Published by The Company of Biologists Ltd.
Qualitative and quantitative assessment of Illumina's forensic STR and SNP kits on MiSeq FGx™.
Sharma, Vishakha; Chow, Hoi Yan; Siegel, Donald; Wurmbach, Elisa
2017-01-01
Massively parallel sequencing (MPS) is a powerful tool transforming DNA analysis in multiple fields ranging from medicine, to environmental science, to evolutionary biology. In forensic applications, MPS offers the ability to significantly increase the discriminatory power of human identification as well as aid in mixture deconvolution. However, before the benefits of any new technology can be realized, its quality, consistency, sensitivity, and specificity must be rigorously evaluated in order to gain a detailed understanding of the technique, including sources of error, error rates, and other restrictions/limitations. This extensive study assessed the performance of Illumina's MiSeq FGx MPS system and ForenSeq™ kit in nine experimental runs including 314 reaction samples. In-depth data analysis evaluated the consequences of different assay conditions on test results. Variables included: sample numbers per run, targets per run, DNA input per sample, and replications. Results are presented as heat maps revealing patterns for each locus. Data analysis focused on read numbers (allele coverage), drop-outs, drop-ins, and sequence analysis. The study revealed that loci with high read numbers performed better and resulted in fewer drop-outs and well-balanced heterozygous alleles. Several loci were prone to drop-outs, which led to falsely typed homozygotes and therefore to genotype errors. Sequence analysis of allele drop-in typically revealed a single nucleotide change (deletion, insertion, or substitution). Analyses of sequences, no-template controls, and spurious alleles suggest no contamination during library preparation, pooling, and sequencing, but indicate that sequencing or PCR errors may have occurred due to DNA polymerase infidelities. Finally, we found that utilizing Illumina's FGx System at recommended conditions does not guarantee 100% outcomes for all samples tested, including the positive control, and required manual editing due to low read numbers and/or allele drop-in. These findings are important for progressing towards implementation of MPS in forensic DNA testing.
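Coverage and balance checks of the sort applied here reduce to simple read-count rules. The sketch below flags possible drop-outs and imbalanced heterozygotes; the counts and analytical thresholds are invented, not taken from the study.

```python
# Hypothetical per-locus allele read counts from an MPS STR profile.
locus_reads = {"D3S1358": {"15": 410, "16": 355},
               "TH01":    {"6": 522},
               "vWA":     {"17": 36, "18": 401}}

MIN_READS, MIN_BALANCE = 30, 0.6     # assumed analytical thresholds

for locus, alleles in locus_reads.items():
    counts = sorted(alleles.values())
    if any(c < MIN_READS for c in counts):
        print(f"{locus}: low coverage, possible drop-out or drop-in")
    elif len(counts) == 2 and counts[0] / counts[1] < MIN_BALANCE:
        print(f"{locus}: imbalanced heterozygote ({counts[0]}/{counts[1]}); review")
    else:
        print(f"{locus}: pass")
```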
Mehle, Nataša; Dobnik, David; Ravnikar, Maja; Pompe Novak, Maruša
2018-05-03
RNA viruses have a great potential for high genetic variability and rapid evolution, generated by mutation and recombination under selection pressure. This is also the case for Potato virus Y (PVY), which comprises a high diversity of different recombinant and non-recombinant strains. Consequently, it is hard to develop a reverse transcription real-time quantitative PCR (RT-qPCR) with the same amplification efficiency for all PVY strains, which would enable their balanced quantification; this is especially needed in mixed infections and other studies of pathogenesis. To achieve this, we initially transferred the PVY universal RT-qPCR assay to a reverse transcription droplet digital PCR (RT-ddPCR) format. RT-ddPCR is an absolute quantification method, where a calibration curve is not needed, and it is less prone to inhibition. The RT-ddPCR developed and validated in this study achieved a dynamic range of quantification over five orders of magnitude and, in terms of sensitivity, was comparable to, or even better than, RT-qPCR. RT-ddPCR showed lower measurement variability. We have shown that RT-ddPCR can be used as a reference tool for the evaluation of different RT-qPCR assays. In addition, it can be used for quantification of RNA based on in-house reference materials that can then be used as calibrators in diagnostic laboratories.
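The calibration-free quantification in ddPCR follows from Poisson occupancy of droplets: if a fraction p of droplets is positive, the mean copies per droplet is λ = -ln(1 - p). A minimal sketch, with droplet volume assumed at ~0.85 nL (the value commonly cited for Bio-Rad systems) and invented counts:

```python
from math import log

def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
    """Absolute target concentration from droplet counts via Poisson occupancy."""
    p = n_positive / n_total
    lam = -log(1.0 - p)                      # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nL -> uL

print(f"{ddpcr_copies_per_ul(4200, 15000):.0f} copies/uL of reaction")
```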
Isolation and characterization of high affinity aptamers against DNA polymerase iota.
Lakhin, Andrei V; Kazakov, Andrei A; Makarova, Alena V; Pavlov, Yuri I; Efremova, Anna S; Shram, Stanislav I; Tarantul, Viacheslav Z; Gening, Leonid V
2012-02-01
Human DNA polymerase iota (Pol ι) is an extremely error-prone enzyme whose fidelity depends on the sequence context of the template. Using the in vitro systematic evolution of ligands by exponential enrichment (SELEX) procedure, we obtained an oligoribonucleotide with high affinity for human Pol ι, named aptamer IKL5. We determined its dissociation constant with a homogeneous preparation of Pol ι and predicted its putative secondary structure. The aptamer IKL5 specifically inhibited the DNA-polymerase activity of the purified enzyme Pol ι, but did not inhibit the DNA-polymerase activities of human DNA polymerases beta and kappa. IKL5 also suppressed the error-prone DNA-polymerase activity of Pol ι in cellular extracts of the tumor cell line SKOV-3. The aptamer IKL5 is useful for studies of the biological role of Pol ι and as a potential drug to suppress the elevated activity of this enzyme in malignant cells.
Design and optimization of reverse-transcription quantitative PCR experiments.
Tichopad, Ales; Kitchen, Rob; Riedmaier, Irmgard; Becker, Christiane; Ståhlberg, Anders; Kubista, Mikael
2009-10-01
Quantitative PCR (qPCR) is a valuable technique for accurately and reliably profiling and quantifying gene expression. Typically, samples obtained from the organism of study have to be processed via several preparative steps before qPCR. We estimated the errors of sample withdrawal and extraction, reverse transcription (RT), and qPCR that are introduced into measurements of mRNA concentrations. We performed hierarchically arranged experiments with 3 animals, 3 samples, 3 RT reactions, and 3 qPCRs and quantified the expression of several genes in solid tissue, blood, cell culture, and single cells. A nested ANOVA design was used to model the experiments, and relative and absolute errors were calculated with this model for each processing level in the hierarchical design. We found that intersubject differences became easily confounded by sample heterogeneity for single cells and solid tissue. In cell cultures and blood, the noise from the RT and qPCR steps contributed substantially to the overall error because the sampling noise was less pronounced. We recommend the use of sample replicates preferentially to any other replicates when working with solid tissue, cell cultures, and single cells, and we recommend the use of RT replicates when working with blood. We show how an optimal sampling plan can be calculated for a limited budget.
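The replicate-allocation logic follows from summing variance components down the hierarchy: with components σ²_sample, σ²_RT and σ²_qPCR, the variance of a subject mean under n_s samples, n_r RTs per sample and n_q qPCRs per RT is σ²_sample/n_s + σ²_RT/(n_s·n_r) + σ²_qPCR/(n_s·n_r·n_q). The brute-force search below finds the lowest-variance plan within a fixed budget; all variance components and costs are invented for illustration.

```python
from itertools import product

var_sample, var_rt, var_qpcr = 0.20, 0.05, 0.02    # assumed variance components
cost_sample, cost_rt, cost_qpcr = 10.0, 2.0, 1.0   # assumed unit costs
budget = 60.0

best = None
for ns, nr, nq in product(range(1, 7), repeat=3):
    cost = ns * (cost_sample + nr * (cost_rt + nq * cost_qpcr))
    if cost > budget:
        continue
    var = var_sample / ns + var_rt / (ns * nr) + var_qpcr / (ns * nr * nq)
    if best is None or var < best[0]:
        best = (var, ns, nr, nq, cost)

var, ns, nr, nq, cost = best
print(f"best plan: {ns} samples x {nr} RT x {nq} qPCR, var = {var:.4f}, cost = {cost}")
```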
A modified error correction protocol for CCITT signalling system no. 7 on satellite links
NASA Astrophysics Data System (ADS)
Kreuer, Dieter; Quernheim, Ulrich
1991-10-01
Comité Consultatif International Télégraphique et Téléphonique (CCITT) Signalling System No. 7 (SS7) provides a level 2 error correction protocol particularly suited for links with propagation delays higher than 15 ms. Not being originally designed for satellite links, however, the so-called Preventive Cyclic Retransmission (PCR) method only performs well on satellite channels when traffic is low. A modified level 2 error control protocol, termed the Fix Delay Retransmission (FDR) method, is suggested which performs better at high loads, thus providing a more efficient use of the limited carrier capacity. Both the PCR and the FDR methods are investigated by means of simulation, with results concerning throughput, queueing delay, and system delay. The FDR method exhibits higher capacity and shorter delay than the PCR method.
Nelson, Lindsay D.; Patrick, Christopher J.; Bernat, Edward M.
2010-01-01
The externalizing dimension is viewed as a broad dispositional factor underlying risk for numerous disinhibitory disorders. Prior work has documented deficits in event-related brain potential (ERP) responses in individuals prone to externalizing problems. Here, we constructed a direct physiological index of externalizing vulnerability from three ERP indicators and evaluated its validity in relation to criterion measures in two distinct domains: psychometric and physiological. The index was derived from three ERP measures that covaried in their relations with externalizing proneness: the error-related negativity and two variants of the P3. Scores on this ERP composite predicted psychometric criterion variables and accounted for externalizing-related variance in P3 response from a separate task. These findings illustrate how a diagnostic construct can be operationalized as a composite (multivariate) psychophysiological variable (phenotype). PMID:20573054
The application of Aronson's taxonomy to medication errors in nursing.
Johnson, Maree; Young, Helen
2011-01-01
Medication administration is a frequent nursing activity that is prone to error. In this study of 318 self-reported medication incidents (including near misses), very few resulted in patient harm: 7% required intervention or prolonged hospitalization, or caused temporary harm. Aronson's classification system provided an excellent framework for analysis of the incidents, with a close connection between the type of error and the change strategy to minimize medication incidents. Taking a behavioral approach to medication error classification has yielded helpful strategies for nurses, such as nurse-call cards on patient lockers when patients are absent and checking of medication sign-off by outgoing and incoming staff at handover.
Hagen, Ralf Matthias; Hinz, Rebecca; Tannich, Egbert; Frickmann, Hagen
2015-06-01
We compared the performance of an in-house and a commercial malaria polymerase chain reaction (PCR) assay using freeze-thawed hemolytic blood samples. A total of 116 freeze-thawed ethylenediamine tetraacetic acid (EDTA) blood samples from patients with suspected malaria were analyzed by an in-house as well as by a commercially available real-time PCR. Concordant malaria-negative PCR results were reported for 39 samples and malaria-positive PCR results for 67 samples. The in-house assay further detected one case of Plasmodium falciparum infection that was negative in the commercial assay, as well as five cases of P. falciparum malaria and three cases of Plasmodium vivax malaria that showed sample inhibition in the commercial assay. The commercial malaria assay was positive in spite of a negative in-house PCR result in one case. In all concordant results, cycle threshold (Ct) values of P. falciparum-positive samples were lower in the commercial PCR than in the in-house assay. Although the Ct values of the commercial PCR kit suggest higher sensitivity in case of concordant results, it is prone to inhibition when applied to hemolytic freeze-thawed blood samples. The number of misidentifications was, however, identical for both real-time PCR assays.
Improving travel information products via robust estimation techniques : final report, March 2009.
DOT National Transportation Integrated Search
2009-03-01
Traffic-monitoring systems, such as those using loop detectors, are prone to coverage gaps, arising from sensor noise, processing errors and : transmission problems. Such gaps adversely affect the accuracy of Advanced Traveler Information Systems. Th...
Belief-bias reasoning in non-clinical delusion-prone individuals.
Anandakumar, T; Connaughton, E; Coltheart, M; Langdon, R
2017-03-01
It has been proposed that people with delusions have difficulty inhibiting beliefs (i.e., "doxastic inhibition") so as to reason about them as if they might not be true. We used a continuity approach to test this proposal in non-clinical adults scoring high and low in psychometrically assessed delusion-proneness. High delusion-prone individuals were expected to show greater difficulty than low delusion-prone individuals on "conflict" items of a "belief-bias" reasoning task (i.e. when required to reason logically about statements that conflicted with reality), but not on "non-conflict" items. Twenty high delusion-prone and twenty low delusion-prone participants (according to the Peters et al. Delusions Inventory) completed a belief-bias reasoning task and tests of IQ, working memory and general inhibition (Excluded Letter Fluency, Stroop and Hayling Sentence Completion). High delusion-prone individuals showed greater difficulty than low delusion-prone individuals on the Stroop and Excluded Letter Fluency tests of inhibition, but no greater difficulty on the conflict versus non-conflict items of the belief-bias task. They did, however, make significantly more errors overall on the belief-bias task, despite controlling for IQ, working memory and general inhibitory control. The study had a relatively small sample size and used non-clinical participants to test a theory of cognitive processing in individuals with clinically diagnosed delusions. Results failed to support a role for doxastic inhibitory failure in non-clinical delusion-prone individuals. These individuals did, however, show difficulty with conditional reasoning about statements that may or may not conflict with reality, independent of any general cognitive or inhibitory deficits. Copyright © 2016 Elsevier Ltd. All rights reserved.
Perspective-taking abilities in the balance between autism tendencies and psychosis proneness.
Abu-Akel, Ahmad M; Wood, Stephen J; Hansen, Peter C; Apperly, Ian A
2015-06-07
Difficulties with the ability to appreciate the perspective of others (mentalizing) is central to both autism and schizophrenia spectrum disorders. While the disorders are diagnostically independent, they can co-occur in the same individual. The effect of such co-morbidity is hypothesized to worsen mentalizing abilities. The recent influential 'diametric brain theory', however, suggests that the disorders are etiologically and phenotypically diametrical, predicting opposing effects on one's mentalizing abilities. To test these contrasting hypotheses, we evaluated the effect of psychosis and autism tendencies on the perspective-taking (PT) abilities of 201 neurotypical adults, on the assumption that autism tendencies and psychosis proneness are heritable dimensions of normal variation. We show that while both autism tendencies and psychosis proneness induce PT errors, their interaction reduced these errors. Our study is, to our knowledge, the first to observe that co-occurring autistic and psychotic traits can exert opposing influences on performance, producing a normalizing effect possibly by way of their diametrical effects on socio-cognitive abilities. This advances the notion that some individuals may, to some extent, be buffered against developing either illness or present fewer symptoms owing to a balanced expression of autistic and psychosis liability. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
DOT National Transportation Integrated Search
2011-03-01
Traffic Management applications such as ramp metering, incident detection, travel time prediction, and vehicle : classification greatly depend on the accuracy of data collected from inductive loop detectors, but these data are : prone to various erro...
Jeremiah, S S; Balaji, V; Anandan, S; Sahni, R D
2014-01-01
The modified Hodge test (MHT) is widely used as a screening test for the detection of carbapenemases in Gram-negative bacteria. This test has several pitfalls in terms of validity and interpretation, and it has very low sensitivity for detecting the New Delhi metallo-β-lactamase (NDM). Considering the degree of dissemination of the NDM and the growing pandemic of carbapenem resistance, a more accurate alternative test is urgently needed. This study compared the performance of the MHT with the commercially available Neo-Sensitabs - Carbapenemases/Metallo-β-Lactamase (MBL) Confirmative Identification pack to determine whether the latter could be an efficient alternative to the former. A total of 105 isolates of Klebsiella pneumoniae resistant to imipenem and meropenem, collected prospectively over a period of 2 years, were included in the study. The study isolates were tested with the MHT, the Neo-Sensitabs - Carbapenemases/MBL Confirmative Identification pack, and polymerase chain reaction (PCR) for detecting the blaNDM-1 gene. Among the 105 isolates, the MHT identified 100 isolates as carbapenemase producers. Of the five isolates negative by the MHT, four were found to produce MBLs by the Neo-Sensitabs. The Neo-Sensitabs did not have any false negatives when compared against the PCR. The MHT can give false negative results, which lead to failure in detecting carbapenemase producers. Considering the other pitfalls of the MHT as well, the Neo-Sensitabs - Carbapenemases/MBL Confirmative Identification pack could be a more efficient alternative for detection of carbapenemase production in Gram-negative bacteria.
First order error corrections in common introductory physics experiments
NASA Astrophysics Data System (ADS)
Beckey, Jacob; Baker, Andrew; Aravind, Vasudeva; Clarion Team
As a part of introductory physics courses, students perform different standard lab experiments. Almost all of these experiments are prone to errors owing to factors like friction, misalignment of equipment, air drag, etc. Usually these types of errors are ignored by students, and not much thought is paid to their sources. However, paying attention to the factors that give rise to errors helps students make better physics models and understand the physical phenomena behind experiments in more detail. In this work, we explore common causes of errors in introductory physics experiments and suggest changes that will mitigate the errors, or suggest models that take the sources of these errors into consideration. This work helps students build better, more refined physical models and understand physics concepts in greater detail. We thank Clarion University for an undergraduate student grant providing financial support for this project.
Kam, Winnie W Y; Lake, Vanessa; Banos, Connie; Davies, Justin; Banati, Richard
2013-05-30
Quantitative polymerase chain reaction (qPCR) has been widely used to quantify changes in gene copy numbers after radiation exposure. Here, we show that gamma irradiation ranging from 10 to 100 Gy of cells and cell-free DNA samples significantly affects the measured qPCR yield, due to radiation-induced fragmentation of the DNA template and, therefore, introduces errors into the estimation of gene copy numbers. The radiation-induced DNA fragmentation and, thus, measured qPCR yield varies with temperature not only in living cells, but also in isolated DNA irradiated under cell-free conditions. In summary, the variability in measured qPCR yield from irradiated samples introduces a significant error into the estimation of both mitochondrial and nuclear gene copy numbers and may give spurious evidence for polyploidization.
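A back-of-the-envelope model shows why fragmentation biases copy-number estimates. Assuming the standard relation quantity ∝ (1 + E)^(−Ct) for amplification efficiency E, a fraction f of fragmented (non-amplifiable) templates shifts Ct upward and deflates the apparent copy number by a factor of (1 − f). The efficiency and fragmentation fractions below are illustrative values, not measurements from the study.

```python
# Toy illustration: radiation-induced fragmentation removes a fraction f of
# amplifiable templates, shifting Ct and biasing copy-number estimates.
import numpy as np

E = 0.95                              # assumed per-cycle amplification efficiency
f = np.array([0.0, 0.3, 0.6, 0.9])    # assumed fractions of fragmented templates

ct_shift = -np.log(1 - f) / np.log(1 + E)   # extra cycles needed to reach threshold
apparent = 1 - f                            # apparent / true copy-number ratio
for fi, dct, app in zip(f, ct_shift, apparent):
    print(f"fragmented={fi:.0%}  dCt=+{dct:.2f}  apparent/true={app:.2f}")
```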
Honda, Kohsuke; Inoue, Mizuha; Ono, Tomohiro; Okano, Kenji; Dekishima, Yasumasa; Kawabata, Hiroshi
2017-06-01
Directed evolution of enantio-selective carbonyl reductase from Ogataea minuta was conducted to improve the operational stability of the enzyme. A mutant library was constructed by an error-prone PCR and screened using a newly developed colorimetric assay. The stability of a mutant with two amino acid substitutions was significantly higher than that of the wild type at 50°C in the presence of dimethyl sulfoxide. Site-directed mutagenesis analysis showed that the improved stability of the enzyme can be attributed to the amino acid substitution of V166A. The half-lives of the V166A mutant were 11- and 6.1-times longer than those of the wild type at 50°C in the presence and absence, respectively, of 20% (v/v) dimethyl sulfoxide. No significant differences in the substrate specificity and enantio-selectivity of the enzyme were observed. The mutant enzyme converted 60 mM 2,2,2-trifluoroacetophenone to (R)-(-)-α-(trifluoromethyl)benzyl alcohol in a molar yield of 71% whereas the conversion yield with an equivalent concentration of the wild-type enzyme was 27%. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Leferink, Nicole G. H.; Hendriks, Annemarie; Brouns, Stan J. J.; Hennemann, Hans-Georg; Dauβmann, Thomas; van der Oost, John
2008-01-01
There is considerable interest in the use of enantioselective alcohol dehydrogenases for the production of enantio- and diastereomerically pure diols, which are important building blocks for pharmaceuticals, agrochemicals and fine chemicals. Due to the need for a stable alcohol dehydrogenase with activity at low-temperature process conditions (30°C) for the production of (2S,5S)-hexanediol, we have improved an alcohol dehydrogenase from the hyperthermophilic archaeon Pyrococcus furiosus (AdhA). A stable S-selective alcohol dehydrogenase with increased activity at 30°C on the substrate 2,5-hexanedione was generated by laboratory evolution on the thermostable alcohol dehydrogenase AdhA. One round of error-prone PCR and screening of ∼1,500 mutants was performed. The maximum specific activity of the best performing mutant with 2,5-hexanedione at 30°C was tenfold higher compared to the activity of the wild-type enzyme. A 3D-model of AdhA revealed that this mutant has one mutation in the well-conserved NADP(H)-binding site (R11L), and a second mutation (A180V) near the catalytic and highly conserved threonine at position 183. PMID:18452026
Accurate and predictive antibody repertoire profiling by molecular amplification fingerprinting.
Khan, Tarik A; Friedensohn, Simon; Gorter de Vries, Arthur R; Straszewski, Jakub; Ruscheweyh, Hans-Joachim; Reddy, Sai T
2016-03-01
High-throughput antibody repertoire sequencing (Ig-seq) provides quantitative molecular information on humoral immunity. However, Ig-seq is compromised by biases and errors introduced during library preparation and sequencing. By using synthetic antibody spike-in genes, we determined that primer bias from multiplex polymerase chain reaction (PCR) library preparation resulted in antibody frequencies with only 42 to 62% accuracy. Additionally, Ig-seq errors resulted in antibody diversity measurements being overestimated by up to 5000-fold. To rectify this, we developed molecular amplification fingerprinting (MAF), which uses unique molecular identifier (UID) tagging before and during multiplex PCR amplification, which enabled tagging of transcripts while accounting for PCR efficiency. Combined with a bioinformatic pipeline, MAF bias correction led to measurements of antibody frequencies with up to 99% accuracy. We also used MAF to correct PCR and sequencing errors, resulting in enhanced accuracy of full-length antibody diversity measurements, achieving 98 to 100% error correction. Using murine MAF-corrected data, we established a quantitative metric of recent clonal expansion-the intraclonal diversity index-which measures the number of unique transcripts associated with an antibody clone. We used this intraclonal diversity index along with antibody frequencies and somatic hypermutation to build a logistic regression model for prediction of the immunological status of clones. The model was able to predict clonal status with high confidence but only when using MAF error and bias corrected Ig-seq data. Improved accuracy by MAF provides the potential to greatly advance Ig-seq and its utility in immunology and biotechnology.
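The core of the UID/UMI idea can be shown in a few lines. The sketch below groups reads by their UMI and takes a per-position majority vote, which removes most PCR and sequencing errors before frequencies are counted. It is a toy distillation, not the MAF bioinformatic pipeline, and the UMIs and reads are invented.

```python
# Minimal sketch of UMI-based error correction: reads sharing a UMI derive
# from one transcript, so a per-position majority vote yields a consensus.
from collections import Counter, defaultdict

def consensus(reads):
    """Majority base at each position across equal-length reads
    (ties are broken arbitrarily by Counter ordering)."""
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*reads))

reads_by_umi = defaultdict(list)
for umi, read in [("ACGT", "TTGCA"), ("ACGT", "TTGCA"), ("ACGT", "TTACA"),
                  ("GGTA", "CCGTT"), ("GGTA", "CCGTT")]:
    reads_by_umi[umi].append(read)

corrected = {umi: consensus(rs) for umi, rs in reads_by_umi.items()}
print(corrected)   # {'ACGT': 'TTGCA', 'GGTA': 'CCGTT'} -- the G->A error is gone
```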
An automated calibration method for non-see-through head mounted displays.
Gilson, Stuart J; Fitzgibbon, Andrew W; Glennerster, Andrew
2011-08-15
Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet, existing calibration methods are time consuming and depend on human judgements, making them error prone, and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside a HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated a HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements. Copyright © 2011 Elsevier B.V. All rights reserved.
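The final step, handing the re-expressed correspondences to established camera-calibration routines, can be sketched with stock OpenCV. The toy below synthesizes marker positions and their projections through an assumed "true" camera and recovers the intrinsics with cv2.calibrateCamera; in the real method, the image points would come from the tracked calibration object re-expressed in HMD grid coordinates.

```python
# Synthetic calibration sketch: project a known planar marker grid through
# an assumed camera for several poses, then recover the intrinsics.
import numpy as np
import cv2

rng = np.random.default_rng(1)
K_true = np.array([[800., 0., 640.], [0., 800., 512.], [0., 0., 1.]])

# Planar 5x4 marker grid, 5 cm pitch, in the object frame (z = 0).
grid = np.mgrid[0:5, 0:4].T.reshape(-1, 2)
obj = np.hstack([grid * 0.05, np.zeros((len(grid), 1))]).astype(np.float32)

object_points, image_points = [], []
for _ in range(10):                        # ten simulated views of the object
    rvec = rng.normal(0.0, 0.2, 3)         # small random rotation
    tvec = np.array([0.0, 0.0, 1.0]) + rng.normal(0.0, 0.05, 3)
    img, _ = cv2.projectPoints(obj, rvec, tvec, K_true, None)
    object_points.append(obj)
    image_points.append(img.reshape(-1, 2).astype(np.float32))

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, (1280, 1024), None, None)
print("RMS reprojection error (px):", rms)   # ~0 for these noise-free points
print("recovered intrinsics:\n", K)          # close to K_true
```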
Grunting's competitive advantage: Considerations of force and distraction.
Sinnett, Scott; Maglinti, Cj; Kingstone, Alan
2018-01-01
Grunting is pervasive in many athletic contests, and empirical evidence suggests that it may result in one exerting more physical force. It may also distract one's opponent. That grunts can distract was supported by a study showing that it led to an opponent being slower and more error prone when viewing tennis shots. An alternative explanation was that grunting masks the sound of a ball being hit. The present study provides evidence against this alternative explanation by testing the effect of grunting in a sport-mixed martial arts-where distraction, rather than masking, is the most likely mechanism. We first confirmed that kicking force is increased when a grunt is performed (Experiment 1), and then adapted methodology used in the tennis study to mixed martial arts (Experiment 2). Lifting the foot to kick is a silent act, and therefore there is nothing for a grunt to mask, i.e., its effect on an opponent's response time and/or accuracy can likely be attributed to attentional distraction. Participants viewed videos of a trained mixed martial artist kicking that included, or did not include, a simulated grunt. The task was to determine as quickly as possible whether the kick was traveling upward or downward. Overall, and replicating the tennis finding, the present results indicate that a participant's response to a kick was delayed and more error prone when a simulated grunt was present. The present findings indicate that simulated grunting may distract an opponent, leading to slower and more error prone responses. The implications for martial arts in particular, and the broader question of whether grunting should be perceived as 'cheating' in sports, are examined.
Lo, Te-Wen; Pickle, Catherine S; Lin, Steven; Ralston, Edward J; Gurling, Mark; Schartner, Caitlin M; Bian, Qian; Doudna, Jennifer A; Meyer, Barbara J
2013-10-01
Exploitation of custom-designed nucleases to induce DNA double-strand breaks (DSBs) at genomic locations of choice has transformed our ability to edit genomes, regardless of their complexity. DSBs can trigger either error-prone repair pathways that induce random mutations at the break sites or precise homology-directed repair pathways that generate specific insertions or deletions guided by exogenously supplied DNA. Prior editing strategies using site-specific nucleases to modify the Caenorhabditis elegans genome achieved only the heritable disruption of endogenous loci through random mutagenesis by error-prone repair. Here we report highly effective strategies using TALE nucleases and RNA-guided CRISPR/Cas9 nucleases to induce error-prone repair and homology-directed repair to create heritable, precise insertion, deletion, or substitution of specific DNA sequences at targeted endogenous loci. Our robust strategies are effective across nematode species diverged by 300 million years, including necromenic nematodes (Pristionchus pacificus), male/female species (Caenorhabditis species 9), and hermaphroditic species (C. elegans). Thus, genome-editing tools now exist to transform nonmodel nematode species into genetically tractable model organisms. We demonstrate the utility of our broadly applicable genome-editing strategies by creating reagents generally useful to the nematode community and reagents specifically designed to explore the mechanism and evolution of X chromosome dosage compensation. By developing an efficient pipeline involving germline injection of nuclease mRNAs and single-stranded DNA templates, we engineered precise, heritable nucleotide changes both close to and far from DSBs to gain or lose genetic function, to tag proteins made from endogenous genes, and to excise entire loci through targeted FLP-FRT recombination.
Ketkar, Amit; Zafar, Maroof K; Banerjee, Surajit; Marquez, Victor E; Egli, Martin; Eoff, Robert L
2012-06-27
Y-family DNA polymerases participate in replication stress and DNA damage tolerance mechanisms. The properties that allow these enzymes to copy past bulky adducts or distorted template DNA can result in a greater propensity for them to make mistakes. Of the four human Y-family members, human DNA polymerase iota (hpol ι) is the most error-prone. In the current study, we elucidate the molecular basis for improving the fidelity of hpol ι through use of the fixed-conformation nucleotide North-methanocarba-2'-deoxyadenosine triphosphate (N-MC-dATP). Three crystal structures were solved of hpol ι in complex with DNA containing a template 2'-deoxythymidine (dT) paired with an incoming dNTP or modified nucleotide triphosphate. The ternary complex of hpol ι inserting N-MC-dATP opposite dT reveals that the adenine ring is stabilized in the anti orientation about the pseudo-glycosyl torsion angle, which mimics precisely the mutagenic arrangement of dGTP:dT normally preferred by hpol ι. The stabilized anti conformation occurs without notable contacts from the protein but likely results from constraints imposed by the bicyclo[3.1.0]hexane scaffold of the modified nucleotide. Unmodified dATP and South-MC-dATP each adopt syn glycosyl orientations to form Hoogsteen base pairs with dT. The Hoogsteen orientation exhibits weaker base-stacking interactions and is less catalytically favorable than anti N-MC-dATP. Thus, N-MC-dATP corrects the error-prone nature of hpol ι by preventing the Hoogsteen base-pairing mode normally observed for hpol ι-catalyzed insertion of dATP opposite dT. These results provide a previously unrecognized means of altering the efficiency and the fidelity of a human translesion DNA polymerase.
Software for Quantifying and Simulating Microsatellite Genotyping Error
Johnson, Paul C.D.; Haydon, Daniel T.
2007-01-01
Microsatellite genetic marker data are exploited in a variety of fields, including forensics, gene mapping, kinship inference and population genetics. In all of these fields, inference can be thwarted by failure to quantify and account for data errors, and kinship inference in particular can benefit from separating errors into two distinct classes: allelic dropout and false alleles. Pedant is MS Windows software for estimating locus-specific maximum likelihood rates of these two classes of error. Estimation is based on comparison of duplicate error-prone genotypes: neither reference genotypes nor pedigree data are required. Other functions include: plotting of error rate estimates and confidence intervals; simulations for performing power analysis and for testing the robustness of error rate estimates to violation of the underlying assumptions; and estimation of expected heterozygosity, which is a required input. The program, documentation and source code are available from http://www.stats.gla.ac.uk/~paulj/pedant.html. PMID:20066126
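The duplicate-genotype logic can be illustrated with a naive moment estimator; Pedant itself fits a full maximum-likelihood model, so this is only a sketch of the underlying signal. Assuming each allele of a heterozygote drops out independently with probability d, the discordance rate between duplicate typings pins down d (double dropouts and allele identities are ignored here for simplicity).

```python
# Toy dropout estimation from duplicate genotypes of heterozygotes.
import numpy as np

rng = np.random.default_rng(2)
d_true = 0.15       # assumed per-allele dropout probability
n = 5000            # heterozygous individuals, each typed twice

def false_homozygote(n, d):
    """True where one typing of a heterozygote loses exactly one allele
    and is therefore mis-scored as a homozygote."""
    drop = rng.random((n, 2)) < d
    return drop[:, 0] ^ drop[:, 1]

t1, t2 = false_homozygote(n, d_true), false_homozygote(n, d_true)
m = (t1 ^ t2).mean()                 # observed het/hom discordance rate

# m = 2x(1-x) with x = P(false homozygote) = 2d(1-d); invert twice.
x = (1 - np.sqrt(1 - 2 * m)) / 2
d_hat = (1 - np.sqrt(max(1 - 2 * x, 0.0))) / 2
print(f"true d = {d_true}, estimated d = {d_hat:.3f}")
```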
Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik; Rudinger, Kenneth; Mizrahi, Jonathan; Fortier, Kevin; Maunz, Peter
2017-01-01
Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4). PMID:28198466
Sources of PCR-induced distortions in high-throughput sequencing data sets
Kebschull, Justus M.; Zador, Anthony M.
2015-01-01
PCR permits the exponential and sequence-specific amplification of DNA, even from minute starting quantities. PCR is a fundamental step in preparing DNA samples for high-throughput sequencing. However, there are errors associated with PCR-mediated amplification. Here we examine the effects of four important sources of error—bias, stochasticity, template switches and polymerase errors—on sequence representation in low-input next-generation sequencing libraries. We designed a pool of diverse PCR amplicons with a defined structure, and then used Illumina sequencing to search for signatures of each process. We further developed quantitative models for each process, and compared predictions of these models to our experimental data. We find that PCR stochasticity is the major force skewing sequence representation after amplification of a pool of unique DNA amplicons. Polymerase errors become very common in later cycles of PCR but have little impact on the overall sequence distribution as they are confined to small copy numbers. PCR template switches are rare and confined to low copy numbers. Our results provide a theoretical basis for removing distortions from high-throughput sequencing data. In addition, our findings on PCR stochasticity will have particular relevance to quantification of results from single cell sequencing, in which sequences are represented by only one or a few molecules. PMID:26187991
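The stochasticity mechanism is well captured by a Galton-Watson branching process in which each molecule duplicates with probability p per cycle. The Monte Carlo sketch below (illustrative parameters, not the paper's fitted model) shows how amplicons that each start from a single molecule end up unevenly represented after amplification:

```python
# PCR stochasticity as a branching process: every template copies with
# per-cycle efficiency p; single-molecule starts yield widely varying yields.
import numpy as np

rng = np.random.default_rng(3)
p, cycles, amplicons = 0.8, 15, 10000      # assumed efficiency, cycles, pool size

counts = np.ones(amplicons, dtype=np.int64)
for _ in range(cycles):
    counts += rng.binomial(counts, p)       # each molecule duplicates w.p. p

frac = counts / counts.sum()                # representation after amplification
print("mean copies:", counts.mean())        # ~ (1 + p)**cycles
print("CV of representation:", frac.std() / frac.mean())
```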
Multi-template polymerase chain reaction.
Kalle, Elena; Kubista, Mikael; Rensing, Christopher
2014-12-01
PCR is a formidable and potent technology that serves as an indispensable tool in a wide range of biological disciplines. However, due to its ease of use and a frequent lack of rigorous standards, many PCR applications can lead to highly variable, inaccurate, and ultimately meaningless results. Thus, rigorous method validation must precede the broad adoption of PCR to any new application. Multi-template samples possess particular features that make their PCR analysis prone to artifacts and biases: multiple homologous templates present in copy numbers that vary within several orders of magnitude. Such conditions are a breeding ground for chimeras and heteroduplexes. Differences in template amplification efficiencies and template competition for reaction compounds undermine correct preservation of the original template ratio. In addition, the presence of inhibitors aggravates all of the above-mentioned problems. Inhibitors might also have ambivalent effects on the different templates within the same sample. Yet, no standard approaches exist for monitoring inhibitory effects in multi-template PCR, which is crucial for establishing compatibility between samples.
Ussowicz, Marek; Rybka, Blanka; Wendycz-Domalewska, Danuta; Ryczan, Renata; Gorczyńska, Ewa; Kałwak, Krzysztof; Woźniak, Mieczysław
2010-01-01
After stem cell transplantation, human patients are prone to life-threatening opportunistic infections with a plethora of microorganisms. We report a retrospective study on 116 patients (98 children, 18 adults) who were transplanted in a pediatric bone marrow transplantation unit. Blood, urine and stool samples were collected and monitored for adenovirus (AdV) DNA using polymerase chain reaction (PCR) and real-time PCR (RT-PCR) on a regular basis. AdV DNA was detected in 52 (44.8%) patients, with mortality reaching 19% in this subgroup. Variables associated with adenovirus infection were transplantations from matched unrelated donors and older age of the recipient. An increased seasonal occurrence of adenoviral infections was observed in autumn and winter. Analysis of immune reconstitution showed a higher incidence of AdV infections during periods of low T-lymphocyte count. This study also showed a strong interaction between co-infections of AdV and BK polyomavirus in patients undergoing hematopoietic stem cell transplantations. PMID:20848295
Effect of lethality on the extinction and on the error threshold of quasispecies.
Tejero, Hector; Marín, Arturo; Montero, Francisco
2010-02-21
In this paper the effect of lethality on error threshold and extinction has been studied in a population of error-prone self-replicating molecules. For given lethality and a simple fitness landscape, three dynamic regimes can be obtained: quasispecies, error catastrophe, and extinction. Using a simple model in which molecules are classified as master, lethal and non-lethal mutants, it is possible to obtain the mutation rates of the transitions between the three regimes analytically. The numerical resolution of the extended model, in which molecules are classified depending on their Hamming distance to the master sequence, confirms the results obtained in the simple model and shows how an error catastrophe regime changes when lethality is taken in account. (c) 2009 Elsevier Ltd. All rights reserved.
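For reference, the error threshold for the classic lethality-free single-peak landscape can be stated compactly; the paper's contribution is to extend this picture with a lethal mutant class and an extinction transition, which the standard formula below does not include.

```latex
% Classic (lethality-free) single-peak quasispecies error threshold:
% \sigma = fitness advantage of the master sequence, q = per-base copying
% fidelity, L = sequence length, \mu = 1 - q = per-base error rate.
\[
  Q = q^{L} > \frac{1}{\sigma}
  \quad\Longleftrightarrow\quad
  \mu = 1 - q \;<\; 1 - \sigma^{-1/L} \;\approx\; \frac{\ln\sigma}{L},
\]
% Above this threshold the master sequence is lost (error catastrophe);
% adding a lethal class, as in the paper, introduces a further extinction
% boundary between the error-catastrophe and extinction regimes.
```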
Ability/Motivation Interactions in Complex Skill Acquisition
1988-04-28
attentional resources. Finally, in the declarative knowledge phase, performance is slow and error prone. Once the learner has come to an adequate cognitive...mediation by the learner. After a substantial amount of consistent task practice, skilled performance becomes fast, accurate, and the task can often be
DNA polymerase η mutational signatures are found in a variety of different types of cancer.
Rogozin, Igor B; Goncearenco, Alexander; Lada, Artem G; De, Subhajyoti; Yurchenko, Vyacheslav; Nudelman, German; Panchenko, Anna R; Cooper, David N; Pavlov, Youri I
2018-01-01
DNA polymerase (pol) η is a specialized error-prone polymerase with at least two quite different and contrasting cellular roles: to mitigate the genetic consequences of solar UV irradiation, and to promote somatic hypermutation in the variable regions of immunoglobulin genes. Misregulation and mistargeting of pol η can compromise genome integrity. We explored whether the mutational signature of pol η could be found in datasets of human somatic mutations derived from normal and cancer cells. A substantial excess of single and tandem somatic mutations within known pol η mutable motifs was noted in skin cancer as well as in many other types of human cancer, suggesting that somatic mutations in A:T bases generated by DNA polymerase η are a common feature of tumorigenesis. Another peculiarity of pol η mutational signatures, mutations in YCG motifs, led us to speculate that error-prone DNA synthesis opposite methylated CpG dinucleotides by misregulated pol η in tumors might constitute an additional mechanism of cytosine demethylation in this hypermutable dinucleotide.
Inducible error-prone repair in B. subtilis. Final report, September 1, 1979-June 30, 1981
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yasbin, R. E.
1981-06-01
The research performed under this contract has been concentrated on the relationship between inducible DNA repair systems, mutagenesis and the competent state in the gram positive bacterium Bacillus subtilis. The following results have been obtained from this research: (1) competent Bacillus subtilis cells have been developed into a sensitive tester system for carcinogens; (2) competent B. subtilis cells have an efficient excision-repair system, however, this system will not function on bacteriophage DNA taken into the cell via the process of transfection; (3) DNA polymerase III is essential in the mechanism of the process of W-reactivation; (4) B. subtilis strains cured of their defective prophages have been isolated and are now being developed for gene cloning systems; (5) protoplasts of B. subtilis have been shown capable of acquiring DNA repair enzymes (i.e., enzyme therapy); and (6) a plasmid was characterized which enhanced inducible error-prone repair in a gram positive organism.
A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction
NASA Astrophysics Data System (ADS)
Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf
2018-07-01
Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real time. Unfortunately, low-cost depth sensors are prone to producing undesirable estimation noise in depth measurements, which results in depth outliers or introduces surface deformations into the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects, and therefore additional constraints such as steady sensor movement and high frame rates are required for high-quality 3D models. In this paper we propose a generic 3D fusion framework with a controlled regularization parameter which inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high-quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state-of-the-art techniques in terms of both absolute reconstruction error and processing time.
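The conventional baseline such frameworks build on is the weighted running-average (TSDF) update used in KinectFusion-style pipelines. A minimal sketch with invented grid sizes and weights is shown below; the paper's regularization term would enter this update and is not reproduced here.

```python
# Conventional per-voxel TSDF fusion: weighted running average of truncated
# signed-distance observations.
import numpy as np

rng = np.random.default_rng(8)

def tsdf_update(D, W, d_new, w_new=1.0, w_max=64.0):
    """Fuse one new signed-distance observation into the running
    per-voxel weighted average D with accumulated weights W."""
    D_out = (W * D + w_new * d_new) / (W + w_new)
    W_out = np.minimum(W + w_new, w_max)   # cap so stale data can be overwritten
    return D_out, W_out

D = np.zeros((4, 4, 4))                     # toy voxel grid
W = np.zeros((4, 4, 4))
for _ in range(5):                          # five noisy views of a surface at 0.3
    obs = 0.3 + rng.normal(0.0, 0.05, D.shape)
    D, W = tsdf_update(D, W, obs)
print(round(float(D.mean()), 3))            # ~0.3: averaging suppresses the noise
```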
Somatic stem cells and the kinetics of mutagenesis and carcinogenesis
Cairns, John
2002-01-01
There is now strong experimental evidence that epithelial stem cells arrange their sister chromatids at mitosis such that the same template DNA strands stay together through successive divisions; DNA labeled with tritiated thymidine in infancy is still present in the stem cells of adult mice even though these cells are incorporating (and later losing) bromodeoxyuridine [Potten, C. S., Owen, G., Booth, D. & Booth, C. (2002) J. Cell Sci. 115, 2381–2388]. But a cell that preserves “immortal strands” will avoid the accumulation of replication errors only if it inhibits those pathways for DNA repair that involve potentially error-prone resynthesis of damaged strands, and this appears to be a property of intestinal stem cells because they are extremely sensitive to the lethal effects of agents that damage DNA. It seems that the combination, in the stem cell, of immortal strands and the choice of death rather than error-prone repair makes epithelial stem cell systems resistant to short exposures to DNA-damaging agents, because the stem cell accumulates few if any errors, and any errors made by the daughters are destined to be discarded. This paper discusses these issues and shows that they lead to a model that explains the strange kinetics of mutagenesis and carcinogenesis in adult mammalian tissues. Coincidentally, the model also can explain why cancers arise even though the spontaneous mutation rate of differentiated mammalian cells is not high enough to generate the multiple mutations needed to form a cancer and why loss of nucleotide-excision repair does not significantly increase the frequency of the common internal cancers. PMID:12149477
Hybrid learning in signalling games
NASA Astrophysics Data System (ADS)
Barrett, Jeffrey A.; Cochran, Calvin T.; Huttegger, Simon; Fujiwara, Naoki
2017-09-01
Lewis-Skyrms signalling games have been studied under a variety of low-rationality learning dynamics. Reinforcement dynamics are stable but slow and prone to evolving suboptimal signalling conventions. A low-inertia trial-and-error dynamic like win-stay/lose-randomise is fast and reliable at finding perfect signalling conventions but unstable in the context of noise or agent error. Here we consider a low-rationality hybrid of reinforcement and win-stay/lose-randomise learning that exhibits the virtues of both. This hybrid dynamics is reliable, stable and exceptionally fast.
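Win-stay/lose-randomise itself takes only a few lines to simulate. The sketch below runs the pure rule on a minimal 2-state/2-signal/2-act Lewis game; it is not the authors' hybrid dynamics, and the payoff structure is the standard common-interest one.

```python
# Win-stay/lose-randomise on a 2x2 Lewis signalling game: both agents keep
# the rule they just used after a success and re-randomise it after a failure.
import random

random.seed(4)
sender = {s: random.choice([0, 1]) for s in (0, 1)}    # state  -> signal
receiver = {m: random.choice([0, 1]) for m in (0, 1)}  # signal -> act

wins = 0
for _ in range(10000):
    state = random.choice([0, 1])
    signal = sender[state]
    act = receiver[signal]
    if act == state:
        wins += 1                       # win: keep both rules unchanged
    else:                               # lose: re-randomise the rules just used
        sender[state] = random.choice([0, 1])
        receiver[signal] = random.choice([0, 1])

print("success rate:", wins / 10000)    # approaches 1 once a convention locks in
print("sender:", sender, "receiver:", receiver)
```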
Reduced vision selectively impairs spatial updating in fall-prone older adults.
Barrett, Maeve M; Doheny, Emer P; Setti, Annalisa; Maguinness, Corrina; Foran, Timothy G; Kenny, Rose Anne; Newell, Fiona N
2013-01-01
The current study examined the role of vision in spatial updating and its potential contribution to an increased risk of falls in older adults. Spatial updating was assessed using a path integration task in fall-prone and healthy older adults. Specifically, participants conducted a triangle completion task in which they were guided along two sides of a triangular route and were then required to return, unguided, to the starting point. During the task, participants could either clearly view their surroundings (full vision) or visuo-spatial information was reduced by means of translucent goggles (reduced vision). Path integration performance was measured by calculating the distance and angular deviation from the participant's return point relative to the starting point. Gait parameters for the unguided walk were also recorded. We found equivalent performance across groups on all measures in the full vision condition. In contrast, in the reduced vision condition, where participants had to rely on interoceptive cues to spatially update their position, fall-prone older adults made significantly larger distance errors relative to healthy older adults. However, there were no other performance differences between fall-prone and healthy older adults. These findings suggest that fall-prone older adults, compared to healthy older adults, have greater difficulty in reweighting other sensory cues for spatial updating when visual information is unreliable.
Kanno, Alex I; Goulart, Cibelly; Rofatto, Henrique K; Oliveira, Sergio C; Leite, Luciana C C; McFadden, Johnjoe
2016-04-01
The expression of many antigens, stimulatory molecules, or even metabolic pathways in mycobacteria such as Mycobacterium bovis BCG or M. smegmatis was made possible through the development of shuttle vectors, and several recombinant vaccines have been constructed. However, gene expression in any of these systems relied mostly on the selection of natural promoters expected to provide the required level of expression by trial and error. To establish a systematic selection of promoters with a range of strengths, we generated a library of mutagenized promoters through error-prone PCR of the strong PL5 promoter, originally from mycobacteriophage L5. These promoters were cloned upstream of the enhanced green fluorescent protein reporter gene, and recombinant M. smegmatis bacteria exhibiting a wide range of fluorescence levels were identified. A set of promoters was selected and identified as having high (pJK-F8), intermediate (pJK-B7, pJK-E6, pJK-D6), or low (pJK-C1) promoter strengths in both M. smegmatis and M. bovis BCG. The sequencing of the promoter region demonstrated that it was extensively modified (6 to 11%) in all of the plasmids selected. To test the functionality of the system, two different expression vectors were demonstrated to allow corresponding expression levels of the Schistosoma mansoni antigen Sm29 in BCG. The approach used here can be used to adjust expression levels for synthetic and/or systems biology studies or for vaccine development to maximize the immune response. Copyright © 2016 Kanno et al.
Wang, Dan; Silkie, Sarah S; Nelson, Kara L; Wuertz, Stefan
2010-09-01
Cultivation- and library-independent, quantitative PCR-based methods have become the method of choice in microbial source tracking. However, these qPCR assays are not 100% specific and sensitive for the target sequence in their respective hosts' genome. The factors that can lead to false positive and false negative information in qPCR results are well defined. It is highly desirable to have a way of removing such false information to estimate the true concentration of host-specific genetic markers and help guide the interpretation of environmental monitoring studies. Here we propose a statistical model based on the Law of Total Probability to predict the true concentration of these markers. The distributions of the probabilities of obtaining false information are estimated from representative fecal samples of known origin. Measurement error is derived from the sample precision error of replicated qPCR reactions. Then, the Monte Carlo method is applied to sample from these distributions of probabilities and measurement error. The set of equations given by the Law of Total Probability allows one to calculate the distribution of true concentrations, from which their expected value, confidence interval and other statistical characteristics can be easily evaluated. The output distributions of predicted true concentrations can then be used as input to watershed-wide total maximum daily load determinations, quantitative microbial risk assessment and other environmental models. This model was validated by both statistical simulations and real world samples. It was able to correct the intrinsic false information associated with qPCR assays and output the distribution of true concentrations of Bacteroidales for each animal host group. Model performance was strongly affected by the precision error. It could perform reliably and precisely when the standard deviation of the precision error was small (≤ 0.1). Further improvement on the precision of sample processing and qPCR reaction would greatly improve the performance of the model. This methodology, built upon Bacteroidales assays, is readily transferable to any other microbial source indicator where a universal assay for fecal sources of that indicator exists. Copyright © 2010 Elsevier Ltd. All rights reserved.
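The shape of the proposed computation can be sketched as follows: draw the false-positive fraction, sensitivity, and precision error from assumed distributions, invert a simple measurement model, and read off the distribution of true concentrations. Every distribution and parameter below is invented for illustration; the paper estimates them from reference fecal samples and replicated qPCR reactions.

```python
# Monte Carlo correction of a qPCR marker measurement in the spirit of the
# Law of Total Probability model described above (schematic, not the
# paper's exact equations).
import numpy as np

rng = np.random.default_rng(5)
measured = 1e4                         # measured marker copies (assumed units)
n = 100000                             # Monte Carlo draws

sens = rng.beta(80, 20, n)             # P(true marker detected), centered ~0.8
fp_frac = rng.beta(2, 38, n)           # fraction of signal that is false positive
log_noise = rng.normal(0, 0.1, n)      # sample-precision error (log10 scale)

# Assumed model: measured = true * sens / (1 - fp_frac) * 10**log_noise
true_conc = measured * (1 - fp_frac) / sens / 10 ** log_noise
lo, med, hi = np.percentile(true_conc, [2.5, 50, 97.5])
print(f"true concentration: median {med:.0f}, 95% interval [{lo:.0f}, {hi:.0f}]")
```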
Wu, Da-lin; Ling, Han-xin; Tang, Hao
2004-11-01
To evaluate the accuracy of PCR with sequence-specific primers (PCR-SSP) for HLA class I genotyping and to analyze the causes of errors in genotyping, DNA samples were obtained from 34 clinical patients, and serological typing with monoclonal antibodies (mAb) and HLA-A and -B antigen genotyping with PCR-SSP were performed. HLA-A and -B alleles were successfully typed in the 34 clinical samples by mAb and PCR-SSP. No false-positive or false-negative results were found with PCR-SSP, whereas the rates of erroneous or missed assignments were markedly higher in serological detection: 23.5% for HLA-A and 26.5% for HLA-B. Errors or confusion were most likely to occur with the antigens A2 and A68, A32 and A33, and B5, B60 and B61. DNA typing of HLA class I (A and B antigens) by PCR-SSP has high resolution, high specificity, and good reproducibility, making it more suitable for clinical application than serological typing.
Adopting Extensible Business Reporting Language (XBRL): A Grounded Theory
ERIC Educational Resources Information Center
Cruz, Marivic
2010-01-01
In 2007 and 2008, government challenges consisted of error-prone, manually intensive, and inefficient environments for financial reporting. Banking regulators worldwide faced issues with respect to transparency, timeliness, quality, and managing risks associated with accounting opacity. The general problem was the existing reporting standards and…
2014-10-02
intervals (Neil, Tailor, Marquez, Fenton, & Hear, 2007). This is cumbersome, error prone and usually inaccurate. Even though a universal framework...Science. Neil, M., Tailor, M., Marquez, D., Fenton, N., & Hear. (2007). Inference in Bayesian networks using dynamic discretisation. Statistics
Kingma, Sandra D K; Li, Nan; Sun, Frank; Valladares, Ricardo B; Neu, Joe; Lorca, Graciela L
2011-06-01
Lactobacillus johnsonii (Ljo) N6.2 has been shown to mitigate the development of type 1 diabetes when administered to diabetes-prone rats. The specific mechanisms underlying this observed response remain under investigation. The objective of this study was to assess the effect of Ljo N6.2 on mucosal inflammatory response using differentiated Caco-2 monolayers. The mRNA expression levels of CCL20, CXCL8, and CXCL10 chemokines were determined by qRT-PCR. Ljo at 10^11 CFU/L induced a strong response in all chemokines examined. To assess the specific host-signaling pathways involved, we performed RT-PCR amplification of Toll-like receptors (TLR) and nucleotide-binding oligomerization domain-like receptors. TLR7 and TLR9 expression levels were induced 4.2- and 9-fold, respectively, whereas other TLR and nucleotide-binding oligomerization domain receptors were not modified. A similar effect was observed in Caco-2 monolayers treated with Ljo cell-free extract or purified nucleic acids (NA). Increased levels of IFN type 1 and IFN regulators Stat1 and IRF7 followed the upregulation of TLR9. Activation of TLR9 was also evidenced by increased Frizzled 5 expression in Ljo-treated Caco-2 cells and an increase in the number of Paneth cells in Ljo-fed, diabetes-prone rats. These results are in agreement with the polarizing-tolerizing mechanism recently described in which the apical stimulation of TLR9 in intestinal epithelial cells leads to a higher state of immunologic alertness. Furthermore, these results suggest that live probiotics could be, in the future, replaced with select cellular components.
Linger, Michele L; Ray, Glen E; Zachar, Peter; Underhill, Andrea T; LoBello, Steven G
2007-10-01
Studies of graduate students learning to administer the Wechsler scales have generally shown that training is not associated with the development of scoring proficiency. Many studies report on the reduction of aggregated administration and scoring errors, a strategy that does not highlight the reduction of errors on subtests identified as most prone to error. This study evaluated the development of scoring proficiency specifically on the Wechsler (WISC-IV and WAIS-III) Vocabulary, Comprehension, and Similarities subtests during training by comparing a set of 'early test administrations' to 'later test administrations.' Twelve graduate students enrolled in an intelligence-testing course participated in the study. Scoring errors (e.g., incorrect point assignment) were evaluated on the students' actual practice administration test protocols. Errors on all three subtests declined significantly when scoring errors on 'early' sets of Wechsler scales were compared to those made on 'later' sets. However, correcting these subtest scoring errors did not cause significant changes in subtest scaled scores. Implications for clinical instruction and future research are discussed.
Simplified stereo-optical ultrasound plane calibration
NASA Astrophysics Data System (ADS)
Hoßbach, Martin; Noll, Matthias; Wesarg, Stefan
2013-03-01
Image guided therapy is a natural concept and commonly used in medicine. In anesthesia, a common task is the injection of an anesthetic close to a nerve under freehand ultrasound guidance. Several guidance systems exist that use electromagnetic tracking of the ultrasound probe as well as the needle, providing the physician with a precise projection of the needle into the ultrasound image. This, however, requires additional expensive devices. We suggest using optical tracking with miniature cameras attached to a 2D ultrasound probe to achieve a higher acceptance among physicians. The purpose of this paper is to present an intuitive method to calibrate freehand ultrasound needle guidance systems employing a rigid stereo camera system. State-of-the-art methods are based on a complex series of error-prone coordinate system transformations, which makes them susceptible to error accumulation. By reducing the number of calibration steps to a single calibration procedure, we provide a calibration method that is equivalent, yet not prone to error accumulation. It requires a linear calibration object and is validated on three datasets utilizing different calibration objects: a 6 mm metal bar and a 1.25 mm biopsy needle were used for the experiments. Compared to existing calibration methods for freehand ultrasound needle guidance systems, we are able to achieve higher accuracy results while additionally reducing the overall calibration complexity.
Efficient Variational Quantum Simulator Incorporating Active Error Minimization
NASA Astrophysics Data System (ADS)
Li, Ying; Benjamin, Simon C.
2017-04-01
One of the key applications for quantum computers will be the simulation of other quantum systems that arise in chemistry, materials science, etc., in order to accelerate the process of discovery. It is important to ask the following question: Can this simulation be achieved using near-future quantum processors, of modest size and under imperfect control, or must it await the more distant era of large-scale fault-tolerant quantum computing? Here, we propose a variational method involving closely integrated classical and quantum coprocessors. We presume that all operations in the quantum coprocessor are prone to error. The impact of such errors is minimized by boosting them artificially and then extrapolating to the zero-error case. In comparison to a more conventional optimized Trotterization technique, we find that our protocol is efficient and appears to be fundamentally more robust against error accumulation.
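The boost-and-extrapolate step described in this abstract lends itself to a compact illustration. The Python sketch below is not the authors' code: `noisy_expectation` is an invented stand-in for a quantum coprocessor measurement whose result decays with the (artificially boosted) error rate, and a Richardson-style polynomial fit is evaluated at zero error.

```python
import numpy as np

def extrapolate_to_zero_error(noise_scales, expectations, degree=1):
    """Fit <O>(lambda) against the artificial error boost lambda and
    evaluate the fit at lambda = 0 (Richardson-style extrapolation)."""
    coeffs = np.polyfit(noise_scales, expectations, degree)
    return np.polyval(coeffs, 0.0)

# Toy stand-in for a quantum coprocessor: the measured expectation value
# decays with the boosted error rate and carries shot noise.
TRUE_VALUE = 0.73
rng = np.random.default_rng(0)

def noisy_expectation(scale, base_error=0.02, shots=10_000):
    mean = TRUE_VALUE * np.exp(-base_error * scale * 10)
    return mean + rng.normal(0.0, 1.0 / np.sqrt(shots))

scales = [1.0, 1.5, 2.0]                  # errors are boosted, never reduced
values = [noisy_expectation(s) for s in scales]
print("raw estimate:      ", round(values[0], 3))
print("mitigated estimate:", round(extrapolate_to_zero_error(scales, values), 3))
# The mitigated value lands much nearer 0.73; linear extrapolation of an
# exponential decay is only approximate, as on real devices.
```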
NASA Astrophysics Data System (ADS)
Debchoudhury, Shantanab; Earle, Gregory
2017-04-01
Retarding Potential Analyzers (RPA) have a rich flight heritage. Standard curve-fitting analysis techniques exist that can infer state variables in the ionospheric plasma environment from RPA data, but the estimation process is prone to errors arising from a number of sources. Previous work has focused on the effects of grid geometry on uncertainties in estimation; however, no prior study has quantified the estimation errors due to additive noise. In this study, we characterize the errors in estimation of thermal plasma parameters by adding noise to the simulated data derived from the existing ionospheric models. We concentrate on low-altitude, mid-inclination orbits since a number of nano-satellite missions are focused on this region of the ionosphere. The errors are quantified and cross-correlated for varying geomagnetic conditions.
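The noise-injection procedure this abstract describes can be emulated in a few lines. The following is a hedged sketch, not the authors' code: a simplified sigmoid I-V characteristic (`rpa_current`) stands in for a full RPA model, Gaussian noise is added to synthetic sweeps, and the spread of the fitted plasma parameters quantifies the estimation error.

```python
import numpy as np
from scipy.optimize import curve_fit

def rpa_current(V, I0, V0, kT):
    # Simplified sigmoid I-V characteristic: ion current retarded around
    # potential V0 with thermal spread kT (eV). Real RPA models are richer.
    return I0 / (1.0 + np.exp((V - V0) / kT))

rng = np.random.default_rng(1)
V = np.linspace(0.0, 10.0, 60)
truth = (1e-6, 4.0, 0.3)                  # amps, volts, eV -- illustrative
clean = rpa_current(V, *truth)

fits = []
for _ in range(500):                      # Monte Carlo over additive noise
    noisy = clean + rng.normal(0.0, 0.02 * truth[0], V.size)
    popt, _ = curve_fit(rpa_current, V, noisy, p0=(8e-7, 3.0, 0.5))
    fits.append(popt)

rel_err = np.std(fits, axis=0) / np.abs(truth)
print("relative 1-sigma estimation errors (I0, V0, kT):", rel_err.round(4))
```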
Verzotto, Davide; M Teo, Audrey S; Hillmer, Axel M; Nagarajan, Niranjan
2016-01-01
Resolution of complex repeat structures and rearrangements in the assembly and analysis of large eukaryotic genomes is often aided by a combination of high-throughput sequencing and genome-mapping technologies (for example, optical restriction mapping). In particular, mapping technologies can generate sparse maps of large DNA fragments (150 kilo base pairs (kbp) to 2 Mbp) and thus provide a unique source of information for disambiguating complex rearrangements in cancer genomes. Despite their utility, combining high-throughput sequencing and mapping technologies has been challenging because of the lack of efficient and sensitive map-alignment algorithms for robustly aligning error-prone maps to sequences. We introduce a novel seed-and-extend glocal (short for global-local) alignment method, OPTIMA (and a sliding-window extension for overlap alignment, OPTIMA-Overlap), which is the first to create indexes for continuous-valued mapping data while accounting for mapping errors. We also present a novel statistical model, agnostic with respect to technology-dependent error rates, for conservatively evaluating the significance of alignments without relying on expensive permutation-based tests. We show that OPTIMA and OPTIMA-Overlap outperform other state-of-the-art approaches (1.6-2 times more sensitive) and are more efficient (170-200 %) and precise in their alignments (nearly 99 % precision). These advantages are independent of the quality of the data, suggesting that our indexing approach and statistical evaluation are robust, provide improved sensitivity and guarantee high precision.
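The indexing idea behind OPTIMA, making continuous-valued and error-prone map data hashable by seeding on runs of quantized fragment sizes, can be caricatured as follows. This is a deliberately simplified sketch with invented data: single quantized k-mers rather than OPTIMA's composite seeds, and no extension or statistical scoring.

```python
from collections import defaultdict

def quantize(frags_kbp, bin_kbp=2.0):
    # Coarse bins absorb sizing error in continuous-valued map data.
    return tuple(int(round(f / bin_kbp)) for f in frags_kbp)

def index_reference(ref_frags, k=3, bin_kbp=2.0):
    index = defaultdict(list)
    q = quantize(ref_frags, bin_kbp)
    for i in range(len(q) - k + 1):
        index[q[i:i + k]].append(i)
    return index

def seed_hits(query_frags, index, k=3, bin_kbp=2.0):
    q = quantize(query_frags, bin_kbp)
    hits = []
    for j in range(len(q) - k + 1):
        for i in index.get(q[j:j + k], []):
            hits.append((i, j))      # candidate anchor: ref pos i, query pos j
    return hits

ref = [12.1, 33.4, 8.0, 21.7, 15.2, 40.9, 6.3]   # fragment sizes in kbp
qry = [33.8, 8.4, 22.1]                           # error-prone sub-map of ref
print(seed_hits(qry, index_reference(ref)))       # -> [(1, 0)]
```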
List of Error-Prone Abbreviations, Symbols, and Dose Designations
... unit dose (e.g., diltiazem 125 mg IV infusion “UD” misinterpreted as meaning to give the entire infusion as a unit [bolus] dose) Use “as directed” ... Names Intended Meaning Misinterpretation Correction “Nitro” drip nitroglycerin infusion Mistaken as sodium nitroprusside infusion Use complete drug ...
Improving Advising Using Technology and Data Analytics
ERIC Educational Resources Information Center
Phillips, Elizabeth D.
2013-01-01
Traditionally, the collegiate advising system provides each student with a personal academic advisor who designs a pathway to the degree for that student in face-to-face meetings. Ideally, this is a supportive mentoring relationship. In truth, however, this system is highly inefficient, error prone, expensive, and a source of ubiquitous student…
Finite element modeling of light propagation in fruit under illumination of continuous-wave beam
USDA-ARS's Scientific Manuscript database
Spatially-resolved spectroscopy provides a means for measuring the optical properties of biological tissues, based on analytical solutions to diffusion approximation for semi-infinite media under the normal illumination of infinitely small size light beam. The method is, however, prone to error in m...
Finite element simulation of light transfer in turbid media under structured illumination
USDA-ARS's Scientific Manuscript database
The spatial-frequency domain (SFD) imaging technique allows estimation of the optical properties of biological tissues in a wide field of view. The technique is, however, prone to error in measurement because the two crucial assumptions used for deriving the analytical solution to diffusion approximation ...
Propensity Score Weighting with Error-Prone Covariates
ERIC Educational Resources Information Center
McCaffrey, Daniel F.; Lockwood, J. R.; Setodji, Claude M.
2011-01-01
Inverse probability weighting (IPW) estimates are widely used in applications where data are missing due to nonresponse or censoring or in observational studies of causal effects where the counterfactuals cannot be observed. This extensive literature has shown the estimators to be consistent and asymptotically normal under very general conditions,…
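Although the abstract is truncated, the baseline estimator it concerns is easy to state. The sketch below uses synthetic data (not the authors' setup) to compute an IPW estimate of a treatment effect twice, once with the true covariate and once with an error-prone version, illustrating why measurement error in the weights matters.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5_000
x = rng.normal(size=(n, 1))                    # true confounder
p_treat = 1 / (1 + np.exp(-x[:, 0]))           # true propensity score
t = rng.binomial(1, p_treat)
y = 2.0 * t + x[:, 0] + rng.normal(size=n)     # true treatment effect = 2

x_obs = x + rng.normal(0, 0.8, size=x.shape)   # error-prone covariate

def ipw_ate(features):
    # Fit a propensity model, then weight outcomes by inverse probabilities.
    e = LogisticRegression().fit(features, t).predict_proba(features)[:, 1]
    w1, w0 = t / e, (1 - t) / (1 - e)
    return np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)

print("ATE with true covariate :", round(ipw_ate(x), 3))     # near 2.0
print("ATE with noisy covariate:", round(ipw_ate(x_obs), 3)) # residual bias
```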
ERIC Educational Resources Information Center
Sylwester, Robert
1994-01-01
Studies show our emotional system is a complex, widely distributed, and error-prone system that defines our basic personality early in life and is quite resistant to change. This article describes our emotional system's major parts (the peptides that carry emotional information and the body and brain structures that activate and regulate emotions)…
Online Hand Holding in Fixing Computer Glitches
ERIC Educational Resources Information Center
Goldsborough, Reid
2005-01-01
According to most surveys, computer manufacturers such as HP put out reliable products, and computers in general are less troublesome than in the past. But personal computers are still prone to bugs, conflicts, viruses, spyware infestations, hacker and phishing attacks, and--most of all--user error. Unfortunately, technical support from computer…
Thermally multiplexed polymerase chain reaction.
Phaneuf, Christopher R; Pak, Nikita; Saunders, D Curtis; Holst, Gregory L; Birjiniuk, Joav; Nagpal, Nikita; Culpepper, Stephen; Popler, Emily; Shane, Andi L; Jerris, Robert; Forest, Craig R
2015-07-01
Amplification of multiple unique genetic targets using the polymerase chain reaction (PCR) is commonly required in molecular biology laboratories. Such reactions are typically performed either serially or by multiplex PCR. Serial reactions are time consuming, and multiplex PCR, while powerful and widely used, can be prone to amplification bias, PCR drift, and primer-primer interactions. We present a new thermocycling method, termed thermal multiplexing, in which a single heat source is uniformly distributed and selectively modulated for independent temperature control of an array of PCR reactions. Thermal multiplexing allows amplification of multiple targets simultaneously, with each reaction segregated and performed at optimal conditions. We demonstrate the method using a microfluidic system consisting of an infrared laser thermocycler, a polymer microchip featuring 1 μl, oil-encapsulated reactions, and closed-loop pulse-width modulation control. Heat transfer modeling is used to characterize thermal performance limitations of the system. We validate the model and perform two reactions simultaneously with widely varying annealing temperatures (48 °C and 68 °C), demonstrating excellent amplification. In addition, to demonstrate microfluidic infrared PCR using clinical specimens, we successfully amplified and detected both influenza A and B from human nasopharyngeal swabs. Thermal multiplexing is scalable and applicable to challenges such as pathogen detection where patients presenting non-specific symptoms need to be efficiently screened across a viral or bacterial panel.
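The closed-loop duty-cycle idea can be illustrated with a lumped thermal model. The following sketch is purely conceptual (all parameters are invented and it is not the authors' heat-transfer model): a single power source is time-shared between two wells, each held near its own annealing temperature by a proportional duty-cycle controller.

```python
import numpy as np

# Lumped thermal model per well: C*dT/dt = u*P - k*(T - T_amb), where u is
# the PWM duty-cycle share of the single heat source. Parameters invented.
P, C, k, T_amb = 2.0, 0.5, 0.02, 25.0
setpoints = np.array([48.0, 68.0])        # two reactions, one heat source
T = np.full(2, T_amb)
dt, gain = 0.05, 0.5

for _ in range(20_000):
    u = np.clip(gain * (setpoints - T), 0.0, 1.0)  # proportional duty cycles
    u /= max(1.0, u.sum())               # shared source: duties sum to <= 1
    T += dt * (u * P - k * (T - T_amb)) / C

# Pure proportional control leaves a small offset (about 1 degree C here).
print("steady-state well temperatures:", T.round(1))
```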
Wetherbee, G.A.; Latysh, N.E.; Gordon, J.D.
2005-01-01
Data from the U.S. Geological Survey (USGS) collocated-sampler program for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) are used to estimate the overall error of NADP/NTN measurements. Absolute errors are estimated by comparison of paired measurements from collocated instruments. Spatial and temporal differences in absolute error were identified and are consistent with longitudinal distributions of NADP/NTN measurements and spatial differences in precipitation characteristics. The magnitude of error for calcium, magnesium, ammonium, nitrate, and sulfate concentrations, specific conductance, and sample volume is of minor environmental significance to data users. Data collected after a 1994 sample-handling protocol change are prone to less absolute error than data collected prior to 1994. Absolute errors are smaller during non-winter months than during winter months for selected constituents at sites where frozen precipitation is common. Minimum resolvable differences are estimated for different regions of the USA to aid spatial and temporal watershed analyses.
The importance of robust error control in data compression applications
NASA Technical Reports Server (NTRS)
Woolley, S. I.
1993-01-01
Data compression has become an increasingly popular option as advances in information technology have placed further demands on data storage capabilities. With compression ratios as high as 100:1 the benefits are clear; however, the inherent intolerance of many compression formats to error events should be given careful consideration. If we consider that efficiently compressed data will ideally contain no redundancy, then the introduction of a channel error must result in a change of understanding from that of the original source. While the prefix property of codes such as Huffman enables resynchronisation, this is not sufficient to arrest propagating errors in an adaptive environment. Arithmetic, Lempel-Ziv, discrete cosine transform (DCT) and fractal methods are similarly prone to error-propagating behaviors. It is, therefore, essential that compression implementations provide sufficiently robust error control in order to maintain data integrity. Ideally, this control should be derived from a full understanding of the prevailing error mechanisms and their interaction with both the system configuration and the compression schemes in use.
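The fragility described here is easy to demonstrate. The short example below compresses redundant text with zlib (a DEFLATE implementation combining LZ77 and Huffman coding), flips a single bit mid-stream, and shows that decompression fails outright rather than degrading gracefully.

```python
import zlib

text = (b"Efficiently compressed data carries almost no redundancy, "
        b"so one corrupted bit can destroy everything that follows. ") * 20
packed = bytearray(zlib.compress(text))

packed[len(packed) // 2] ^= 0x01       # flip one bit mid-stream

try:
    zlib.decompress(bytes(packed))
except zlib.error as exc:
    # Either the Huffman/LZ77 stream becomes undecodable or the trailing
    # Adler-32 integrity check fails -- the error has propagated.
    print("decompression failed:", exc)
```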
Self-Interaction Error in Density Functional Theory: An Appraisal.
Bao, Junwei Lucas; Gagliardi, Laura; Truhlar, Donald G
2018-05-03
Self-interaction error (SIE) is considered to be one of the major sources of error in most approximate exchange-correlation functionals for Kohn-Sham density-functional theory (KS-DFT), and it is large with all local exchange-correlation functionals and with some hybrid functionals. In this work, we consider systems conventionally considered to be dominated by SIE. For these systems, we demonstrate that by using multiconfiguration pair-density functional theory (MC-PDFT), the error of a translated local density-functional approximation is significantly reduced (by a factor of 3) when using an MCSCF density and on-top density, as compared to using KS-DFT with the parent functional; the error in MC-PDFT with local on-top functionals is even lower than the error in some popular KS-DFT hybrid functionals. Density-functional theory, either in MC-PDFT form with local on-top functionals or in KS-DFT form with some functionals having 50% or more nonlocal exchange, has smaller errors for SIE-prone systems than does CASSCF, which has no SIE.
The Diagnosis of Error in Histories of Science
NASA Astrophysics Data System (ADS)
Thomas, William
Whether and how to diagnose error in the history of science is a contentious issue. For many scientists, diagnosis is appealing because it allows them to discuss how knowledge can progress most effectively. Many historians disagree. They consider diagnosis inappropriate because it may discard features of past actors' thought that are important to understanding it, and may have even been intellectually productive. Ironically, these historians are apt to diagnose flaws in scientists' histories as proceeding from a misguided desire to idealize scientific method, and from their attendant identification of deviations from the ideal as, ipso facto, a paramount source of error in historical science. While both views have some merit, they should be reconciled if a more harmonious and productive relationship between the disciplines is to prevail. In To Explain the World, Steven Weinberg narrates the slow but definite emergence of what we call science from long traditions of philosophical and mathematical thought. This narrative follows in a historiographical tradition charted by historians such as Alexandre Koyre and Rupert Hall about sixty years ago. It is essentially a history of the emergence of reliable (if fallible) scientific method from more error-prone thought. While some historians such as Steven Shapin view narratives of this type as fundamentally error-prone, I do not view such projects as a priori illegitimate. They are, however, perhaps more difficult than Weinberg supposes. In this presentation, I will focus on two of Weinberg's strong historical claims: that physics became detached from religion as early as the beginning of the eighteenth century, and that physics proved an effective model for placing other fields on scientific grounds. While I disagree with these claims, they represent at most an overestimation of vintage science's interest in discarding theological questions, and an overestimation of that science's ability to function at all reliably.
Surface driven biomechanical breast image registration
NASA Astrophysics Data System (ADS)
Eiben, Björn; Vavourakis, Vasileios; Hipwell, John H.; Kabus, Sven; Lorenz, Cristian; Buelow, Thomas; Williams, Norman R.; Keshtgar, M.; Hawkes, David J.
2016-03-01
Biomechanical modelling enables large deformation simulations of breast tissues under different loading conditions to be performed. Such simulations can be utilised to transform prone Magnetic Resonance (MR) images into a different patient position, such as upright or supine. We present a novel integration of biomechanical modelling with a surface registration algorithm which optimises the unknown material parameters of a biomechanical model and performs a subsequent regularised surface alignment. This allows deformations induced by effects other than gravity, such as those due to contact of the breast and MR coil, to be reversed. Correction displacements are applied to the biomechanical model enabling transformation of the original pre-surgical images to the corresponding target position. The algorithm is evaluated for the prone-to-supine case using prone MR images and the skin outline of supine Computed Tomography (CT) scans for three patients. A mean target registration error (TRE) of 10.9 mm for internal structures is achieved. For the prone-to-upright scenario, an optical 3D surface scan of one patient is used as a registration target and the nipple distances after alignment between the transformed MRI and the surface are 10.1 mm and 6.3 mm respectively.
Park, Se-yeon; Yoo, Won-gyu
2013-10-01
The aim of this study was to compare muscular activation during five different normalization techniques that induced maximal isometric contraction of the latissimus dorsi. Sixteen healthy men participated in the study. Each participant performed three repetitions each of five types of isometric exertion: (1) conventional shoulder extension in the prone position, (2) caudal shoulder depression in the prone position, (3) body lifting with shoulder depression in the seated position, (4) trunk bending to the right in the lateral decubitus position, and (5) downward bar pulling in the seated position. In most participants, maximal activation of the latissimus dorsi was observed during conventional shoulder extension in the prone position; the percentage of maximal voluntary contraction was significantly greater for this exercise than for all other normalization techniques except downward bar pulling in the seated position. Although differences in electrode placement among various electromyographic studies represent a limitation, normalization techniques for the latissimus dorsi are recommended to minimize error in assessing maximal muscular activation of the latissimus dorsi through the combined use of shoulder extension in the prone position and downward pulling. Copyright © 2013 Elsevier Ltd. All rights reserved.
Kunakorn, M; Raksakai, K; Pracharktam, R; Sattaudom, C
1999-03-01
Our experiences from 1993 to 1997 in the development and use of IS6110-based PCR for the diagnosis of extrapulmonary tuberculosis in a routine clinical setting revealed that error-correcting processes can improve existing diagnostic methodology. The reamplification method initially used had a sensitivity of 90.91% and a specificity of 93.75%. The concern was focused on the false positive results of this method caused by product-carryover contamination. This method was changed to single round PCR with carryover prevention by uracil DNA glycosylase (UDG), resulting in a 100% specificity but only 63% sensitivity. Dot blot hybridization was added after the single round PCR, increasing the sensitivity to 87.50%. However, false positivity resulted from the nonspecific dot blot hybridization signal, reducing the specificity to 89.47%. The hybridization of PCR was changed to a Southern blot with a new oligonucleotide probe giving the sensitivity of 85.71% and raising the specificity to 99.52%. We conclude that the PCR protocol for routine clinical use should include UDG for carryover prevention and hybridization with specific probes to optimize diagnostic sensitivity and specificity in extrapulmonary tuberculosis testing.
Zhu, Jianjie; Chen, Lanxin; Mao, Yong; Zhou, Huan
2013-01-01
Allele-specific amplification on the basis of polymerase chain reaction (PCR) has been widely used for single-nucleotide polymorphism (SNP) genotyping. However, the extraction of PCR-compatible genomic DNA from whole blood is usually required. This process is complicated and tedious, and is prone to cause cross-contamination between samples. To facilitate direct PCR amplification from whole blood without the extraction of genomic DNA, we optimized the pH value of PCR solution and the concentrations of magnesium ions and facilitator glycerol. Then, we developed multiplex allele-specific amplifications from whole blood and applied them to a case–control study. In this study, we successfully established triplex, five-plex, and eight-plex allele-specific amplifications from whole blood for determining the distribution of genotypes and alleles of 14 polymorphisms in 97 gastric cancer patients and 141 healthy controls. Statistical analysis results showed significant association of SNPs rs9344, rs1799931, and rs1800629 with the risk of gastric cancer. This method is accurate, time-saving, cost-effective, and easy-to-do, especially suitable for clinical prediction of disease susceptibility. PMID:23072573
Begg, Graham S; Cullen, Danny W; Iannetta, Pietro P M; Squire, Geoff R
2007-02-01
Testing of seed and grain lots is essential in the enforcement of GM labelling legislation and needs reliable procedures for which associated errors have been identified and minimised. In this paper we consider the testing of oilseed rape seed lots obtained from the harvest of a non-GM crop known to be contaminated by volunteer plants from a GM herbicide tolerant variety. The objective was to identify and quantify the error associated with the testing of these lots from the initial sampling to completion of the real-time PCR assay with which the level of GM contamination was quantified. The results showed that, under the controlled conditions of a single laboratory, the error associated with the real-time PCR assay was negligible in comparison with sampling error, which was exacerbated by heterogeneity in the distribution of GM seeds, most notably at a small scale, i.e. 25 cm³. Sampling error was reduced by one to two thirds on the application of appropriate homogenisation procedures.
2017-01-01
Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
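The directional network heuristic popularized by UMI-tools can be sketched compactly. In the simplified version below (invented reads; transitive cluster growth is omitted), a low-count UMI is collapsed into a neighbor at Hamming distance 1 whose count is at least 2n − 1.

```python
from collections import Counter

def hamming1(a, b):
    return sum(x != y for x, y in zip(a, b)) == 1

def directional_count(umi_counts):
    """Estimate the number of true molecules behind an error-prone UMI set.

    Edge hi -> lo when the UMIs differ at one base and
    count(hi) >= 2 * count(lo) - 1, echoing UMI-tools' directional method
    (transitive cluster growth is omitted in this sketch).
    """
    umis = sorted(umi_counts, key=umi_counts.get, reverse=True)
    absorbed = set()
    for i, hi in enumerate(umis):
        for lo in umis[i + 1:]:
            if lo not in absorbed and hamming1(hi, lo) \
                    and umi_counts[hi] >= 2 * umi_counts[lo] - 1:
                absorbed.add(lo)
    return len(umis) - len(absorbed)

reads = ["ACGT"] * 50 + ["ACGA"] * 2 + ["TTCG"] * 30 + ["GACG"] * 1
print(directional_count(Counter(reads)))
# -> 3: ACGA is explained as a PCR/sequencing error of ACGT, while GACG,
#    though rare, is not within one mismatch of any richer UMI and is kept.
```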
Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error
Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee
2017-01-01
Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to the errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146
Dual processing and diagnostic errors.
Norman, Geoff
2009-09-01
In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process, called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these remain available and retrievable individually. I then review studies of clinical reasoning based on these theories, and show that the two processes are equally effective; System 1, despite its reliance on idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions directed at encouraging the clinician to explicitly use both strategies can lead to consistent reduction in error rates.
Path-following in model predictive rollover prevention using front steering and braking
NASA Astrophysics Data System (ADS)
Ghazali, Mohammad; Durali, Mohammad; Salarieh, Hassan
2017-01-01
In this paper vehicle path-following in the presence of rollover risk is investigated. Vehicles with a high centre of mass are prone to roll instability, and untripped rollover risk increases for high centre of gravity vehicles under high-friction road conditions. Previous studies introduce strategies to handle the short-duration rollover condition; in these studies, however, trajectory tracking is affected and not thoroughly investigated. This paper focuses on the tracking error that arises from rollover prevention. A lower level model predictive front steering controller is adopted to deal with rollover and tracking error as a priority sequence. A brake control is included in the lower level controller which directly obeys an upper level controller (ULC) command. The ULC manages vehicle speed with regard primarily to tracking error. Simulation results show that the proposed control framework maintains roll stability while tracking error is confined to a predefined error limit.
Classification-Based Spatial Error Concealment for Visual Communications
NASA Astrophysics Data System (ADS)
Chen, Meng; Zheng, Yefeng; Wu, Min
2006-12-01
In an error-prone transmission environment, error concealment is an effective technique to reconstruct the damaged visual content. Due to large variations of image characteristics, different concealment approaches are necessary to accommodate the different nature of the lost image content. In this paper, we address this issue and propose using classification to integrate the state-of-the-art error concealment techniques. The proposed approach takes advantage of multiple concealment algorithms and adaptively selects the suitable algorithm for each damaged image area. With growing awareness that the design of sender and receiver systems should be jointly considered for efficient and reliable multimedia communications, we propose a set of classification-based block concealment schemes, including receiver-side classification, sender-side attachment, and sender-side embedding. Our experimental results provide extensive performance comparisons and demonstrate that the proposed classification-based error concealment approaches outperform the conventional approaches.
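A toy version of the classify-then-conceal control flow is sketched below. The classifier and both interpolators are deliberately crude stand-ins (a gradient-energy threshold selecting between vertical continuation and a separable bilinear blend), intended only to show the pipeline shape the paper advocates, not its actual algorithms.

```python
import numpy as np

def conceal_block(img, r, c, B=8):
    """Conceal a lost BxB block at (r, c), choosing the interpolator by a
    toy smooth-vs-edge classification of the surrounding boundary pixels."""
    top = img[r - 1, c:c + B].astype(float)
    bottom = img[r + B, c:c + B].astype(float)
    left = img[r:r + B, c - 1].astype(float)
    right = img[r:r + B, c + B].astype(float)

    ring = np.concatenate([top, bottom, left, right])
    is_edge = np.std(np.diff(ring)) > 10.0          # crude classifier

    w = (np.arange(B) + 1.0) / (B + 1.0)
    v = np.outer(1 - w, top) + np.outer(w, bottom)  # vertical interpolation
    if is_edge:
        img[r:r + B, c:c + B] = v                   # follow vertical structure
    else:
        h = np.outer(left, 1 - w) + np.outer(right, w)
        img[r:r + B, c:c + B] = (v + h) / 2.0       # smooth bilinear blend
    return img

img = np.tile(np.linspace(0, 255, 32), (32, 1))     # smooth ramp test image
conceal_block(img, 12, 12)                           # fills the lost block
```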
A water-vapor radiometer error model. [for ionosphere in geodetic microwave techniques
NASA Technical Reports Server (NTRS)
Beckman, B.
1985-01-01
The water-vapor radiometer (WVR) is used to calibrate unpredictable delays in the wet component of the troposphere in geodetic microwave techniques such as very-long-baseline interferometry (VLBI) and Global Positioning System (GPS) tracking. Based on experience with Jet Propulsion Laboratory (JPL) instruments, the current level of accuracy in wet-troposphere calibration limits the accuracy of local vertical measurements to 5-10 cm. The goal for the near future is 1-3 cm. Although the WVR is currently the best calibration method, many instruments are prone to systematic error. In this paper, a treatment of WVR data is proposed and evaluated. This treatment reduces the effect of WVR systematic errors by estimating parameters that specify an assumed functional form for the error. The assumed form of the treatment is evaluated by comparing the results of two similar WVR's operating near each other. Finally, the observability of the error parameters is estimated by covariance analysis.
USDA-ARS's Scientific Manuscript database
Food preparation skills may encourage healthy eating. Traditional assessment of child food preparation employs self- or parent proxy-reporting methods, which are prone to error. The eButton is a wearable all-day camera that has promise as an objective, passive method for measuring child food prepara...
ERIC Educational Resources Information Center
Farri, Oladimeji Feyisetan
2012-01-01
Large quantities of redundant clinical data are usually transferred from one clinical document to another, making the review of such documents cognitively burdensome and potentially error-prone. Inadequate designs of electronic health record (EHR) clinical document user interfaces probably contribute to the difficulties clinicians experience while…
ATS-PD: An Adaptive Testing System for Psychological Disorders
ERIC Educational Resources Information Center
Donadello, Ivan; Spoto, Andrea; Sambo, Francesco; Badaloni, Silvana; Granziol, Umberto; Vidotto, Giulio
2017-01-01
The clinical assessment of mental disorders can be a time-consuming and error-prone procedure, consisting of a sequence of diagnostic hypothesis formulation and testing aimed at restricting the set of plausible diagnoses for the patient. In this article, we propose a novel computerized system for the adaptive testing of psychological disorders.…
Towards New Multiplatform Hybrid Online Laboratory Models
ERIC Educational Resources Information Center
Rodriguez-Gil, Luis; García-Zubia, Javier; Orduña, Pablo; López-de-Ipiña, Diego
2017-01-01
Online laboratories have traditionally been split between virtual labs, with simulated components; and remote labs, with real components. The former tend to provide less realism but to be easily scalable and less expensive to maintain, while the latter are fully real but tend to require a higher maintenance effort and be more error-prone. This…
ERIC Educational Resources Information Center
Ruller, Roberto; Silva-Rocha, Rafael; Silva, Artur; Schneider, Maria Paula Cruz; Ward, Richard John
2011-01-01
Protein engineering is a powerful tool, which correlates protein structure with specific functions, both in applied biotechnology and in basic research. Here, we present a practical teaching course for engineering the green fluorescent protein (GFP) from "Aequorea victoria" by a random mutagenesis strategy using error-prone polymerase…
Accuracy of an IFSAR-derived digital terrain model under a conifer forest canopy.
Hans-Erik Andersen; Stephen E. Reutebuch; Robert J. McGaughey
2005-01-01
Accurate digital terrain models (DTMs) are necessary for a variety of forest resource management applications, including watershed management, timber harvest planning, and fire management. Traditional methods for acquiring topographic data typically rely on aerial photogrammetry, where measurement of the terrain surface below forest canopy is difficult and error prone...
NASA Astrophysics Data System (ADS)
Banáš, Pavel; Otyepka, Michal; Jeřábek, Petr; Petřek, Martin; Damborský, Jiří
2006-06-01
1,2,3-Trichloropropane (TCP) is a highly toxic, recalcitrant byproduct of epichlorohydrin manufacture. Haloalkane dehalogenase (DhaA) from Rhodococcus sp. hydrolyses the carbon-halogen bond in various halogenated compounds including TCP, but with low efficiency (k_cat/K_m = 36 s⁻¹ M⁻¹). A Cys176Tyr-DhaA mutant with a threefold higher catalytic efficiency for TCP dehalogenation has been previously obtained by error-prone PCR. We have used molecular simulations and quantum mechanical calculations to elucidate the molecular mechanisms involved in the improved catalysis of the mutant, and enantioselectivity of DhaA toward TCP. The Cys176Tyr mutation modifies the protein access and export routes. Substitution of the Cys residue by the bulkier Tyr narrows the upper tunnel, making the second tunnel "slot" the preferred route. TCP can adopt two major orientations in the DhaA enzyme, in one of which the halide-stabilizing residue Asn41 forms a hydrogen bond with the terminal halogen atom of the TCP molecule, while in the other it bonds with the central halogen atom. The differences in these binding patterns explain the preferential formation of the (R)- over the (S)-enantiomer of 2,3-dichloropropane-1-ol in the reaction catalyzed by the enzyme.
Verhey, Theodore B; Castellanos, Mildred; Chaconas, George
2018-05-29
The Lyme disease spirochete, Borrelia burgdorferi, uses antigenic variation as a strategy to evade the host's acquired immune response. New variants of surface-localized VlsE are generated efficiently by unidirectional recombination from 15 unexpressed vls cassettes into the vlsE locus. Using algorithms to analyze switching from vlsE sequencing data, we characterize a population of over 45,000 inferred recombination events generated during mouse infection. We present evidence for clustering of these recombination events within the population and along the vlsE gene, a role for the direct repeats flanking the variable region in vlsE, and the importance of sequence homology in determining the location of recombination, despite RecA's dispensability. Finally, we report that non-templated sequence variation is strongly associated with recombinational switching and occurs predominantly at the 5' end of conversion tracts. This likely results from an error-prone repair mechanism operational during recombinational switching that elevates the mutation rate > 5,000-fold in switched regions. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
Error-prone bypass of O6-methylguanine by DNA polymerase of Pseudomonas aeruginosa phage PaP1.
Gu, Shiling; Xiong, Jingyuan; Shi, Ying; You, Jia; Zou, Zhenyu; Liu, Xiaoying; Zhang, Huidong
2017-09-01
O6-Methylguanine (O6-MeG) is highly mutagenic and is commonly found in DNA exposed to methylating agents; it generally leads to G:C to A:T mutagenesis. To study DNA replication encountering O6-MeG by the DNA polymerase (gp90) of P. aeruginosa phage PaP1, we analyzed steady-state and pre-steady-state kinetics of nucleotide incorporation opposite O6-MeG by gp90 exo⁻. O6-MeG partially inhibited full-length extension by gp90 exo⁻. O6-MeG greatly reduces dNTP incorporation efficiency, resulting in 67-fold preferential error-prone incorporation of dTTP over dCTP. Gp90 exo⁻ extends beyond T:O6-MeG 2-fold more efficiently than C:O6-MeG. Incorporation of dCTP opposite G and incorporation of dCTP or dTTP opposite O6-MeG show fast burst phases. The pre-steady-state incorporation efficiency (k_pol/K_d,dNTP) is decreased in the order dCTP:G > dTTP:O6-MeG > dCTP:O6-MeG. The presence of O6-MeG at the template does not affect the binding affinity of polymerase to DNA, but it weakened their binding in the presence of dCTP and Mg²⁺. Misincorporation of dTTP opposite O6-MeG further weakens the binding affinity of polymerase to DNA. The priority of dTTP incorporation opposite O6-MeG originates from the fact that dTTP can induce a faster conformational change step and a faster chemical step than dCTP. This study reveals that gp90 bypasses O6-MeG in an error-prone manner and provides further understanding of DNA replication encountering mutagenic alkylation DNA damage for P. aeruginosa phage PaP1. Copyright © 2017 Elsevier B.V. All rights reserved.
Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay
2016-04-01
Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23 041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
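The D-optimality criterion admits a simple greedy heuristic via the matrix determinant lemma: det(M + w·x·xᵀ) = det(M)·(1 + w·xᵀM⁻¹x), so each step picks the record with the largest weighted leverage. The sketch below illustrates that idea under invented data; it is not the authors' DSCVR algorithm.

```python
import numpy as np

def greedy_d_optimal(X, beta_pilot, n_select, ridge=1e-6):
    """Greedily pick records to validate so the Fisher information matrix
    of a logistic model has maximal determinant (a D-optimality heuristic)."""
    p = 1.0 / (1.0 + np.exp(-X @ beta_pilot))
    w = p * (1.0 - p)                       # per-record information weight
    M = ridge * np.eye(X.shape[1])          # running information matrix
    chosen = []
    for _ in range(n_select):
        Minv = np.linalg.inv(M)
        # determinant lemma: gain of record i grows with w_i * x_i^T M^-1 x_i
        gains = w * np.einsum("ij,jk,ik->i", X, Minv, X)
        gains[chosen] = -np.inf             # never validate a record twice
        best = int(np.argmax(gains))
        chosen.append(best)
        M += w[best] * np.outer(X[best], X[best])
    return chosen

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 4))              # invented EMR predictor matrix
picks = greedy_d_optimal(X, beta_pilot=np.array([0.5, -1.0, 0.2, 0.0]),
                         n_select=25)
print("first chart-review picks:", picks[:5])
```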
Eyre-Walker, Adam; Stoletzki, Nina
2013-10-01
The assessment of scientific publications is an integral part of the scientific process. Here we investigate three methods of assessing the merit of a scientific paper: subjective post-publication peer review, the number of citations gained by a paper, and the impact factor of the journal in which the article was published. We investigate these methods using two datasets in which subjective post-publication assessments of scientific publications have been made by experts. We find that there are moderate, but statistically significant, correlations between assessor scores, when two assessors have rated the same paper, and between assessor score and the number of citations a paper accrues. However, we show that assessor score depends strongly on the journal in which the paper is published, and that assessors tend to over-rate papers published in journals with high impact factors. If we control for this bias, we find that the correlation between assessor scores and between assessor score and the number of citations is weak, suggesting that scientists have little ability to judge either the intrinsic merit of a paper or its likely impact. We also show that the number of citations a paper receives is an extremely error-prone measure of scientific merit. Finally, we argue that the impact factor is likely to be a poor measure of merit, since it depends on subjective assessment. We conclude that the three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased, and expensive method by which to assess merit. We argue that the impact factor may be the most satisfactory of the methods we have considered, since it is a form of pre-publication review. However, we emphasise that it is likely to be a very error-prone measure of merit that is qualitative, not quantitative.
Delviks-Frankenberry, Krista A; Nikolaitchik, Olga A; Burdick, Ryan C; Gorelick, Robert J; Keele, Brandon F; Hu, Wei-Shau; Pathak, Vinay K
2016-05-01
Although the predominant effect of host restriction APOBEC3 proteins on HIV-1 infection is to block viral replication, they might inadvertently increase retroviral genetic variation by inducing G-to-A hypermutation. Numerous studies have disagreed on the contribution of hypermutation to viral genetic diversity and evolution. Confounding factors contributing to the debate include the extent of lethal (stop codon) and sublethal hypermutation induced by different APOBEC3 proteins, the inability to distinguish between G-to-A mutations induced by APOBEC3 proteins and error-prone viral replication, the potential impact of hypermutation on the frequency of retroviral recombination, and the extent to which viral recombination occurs in vivo, which can reassort mutations in hypermutated genomes. Here, we determined the effects of hypermutation on the HIV-1 recombination rate and its contribution to genetic variation through recombination to generate progeny genomes containing portions of hypermutated genomes without lethal mutations. We found that hypermutation did not significantly affect the rate of recombination, and recombination between hypermutated and wild-type genomes only increased the viral mutation rate by 3.9 × 10^-5 mutations/bp/replication cycle in heterozygous virions, which is similar to the HIV-1 mutation rate. Since copackaging of hypermutated and wild-type genomes occurs very rarely in vivo, recombination between hypermutated and wild-type genomes does not significantly contribute to the genetic variation of replicating HIV-1. We also analyzed previously reported hypermutated sequences from infected patients and determined that the frequency of sublethal mutagenesis for A3G and A3F is negligible (4 × 10^-21 and 1 × 10^-11, respectively) and its contribution to viral mutations is far below mutations generated during error-prone reverse transcription. Taken together, we conclude that the contribution of APOBEC3-induced hypermutation to HIV-1 genetic variation is substantially lower than that from mutations during error-prone replication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, A; Foster, J; Chu, W
2015-06-15
Purpose: Many cancer centers treat colorectal patients in the prone position on a belly board to minimize dose to the small bowel. That may potentially result in patient setup instability with corresponding impact on dose delivery accuracy for highly conformal techniques such as IMRT/VMAT. Two aims of this work are 1) to investigate setup accuracy of rectum patients treated in the prone position on a belly board using CBCT and 2) to evaluate dosimetric impact on bladder and small bowel of treating rectum patients in supine vs. prone position. Methods: For the setup accuracy study, 10 patients were selected. Weekly CBCTs were acquired and matched to bone. The CBCT-determined shifts were recorded. For the dosimetric study, 7 prone-setup patients and 7 supine-setup patients were randomly selected from our clinical database. Various clinically relevant dose volume histogram values were recorded for the small bowel and bladder. Results: The CBCT-determined rotational shifts had a wide variation. For the dataset acquired at the time of this writing, the ranges of rotational setup errors for pitch, roll, and yaw were [−3.6° 4.7°], [−4.3° 3.2°], and [−1.4° 1.4°]. For the dosimetric study: the small bowel V(45Gy) and mean dose for the prone position was 5.6±12.1% and 18.4±6.2Gy (ranges indicate standard deviations); for the supine position the corresponding dose values were 12.9±15.8% and 24.7±8.8Gy. For the bladder, the V(30Gy) and mean dose for prone position were 68.7±12.7% and 38.4±3.3Gy; for supine position these dose values were 77.1±13.7% and 40.7±3.1Gy. Conclusion: There is evidence of significant rotational instability in the prone position. The OAR dosimetry study indicates that there are some patients that may still benefit from the prone position, though many patients can be safely treated supine.
Commission errors of active intentions: the roles of aging, cognitive load, and practice.
Boywitt, C Dennis; Rummel, Jan; Meiser, Thorsten
2015-01-01
Performing an intended action when it needs to be withheld, for example, when a temporarily prescribed medication is incompatible with other medication, is referred to as a commission error of prospective memory (PM). While recent research indicates that older adults are especially prone to commission errors for finished intentions, there is a lack of research on the effects of aging on commission errors for still active intentions. The present research investigates conditions which might contribute to older adults' propensity to perform planned intentions under inappropriate conditions. Specifically, disproportionally higher rates of commission errors for still active intentions were observed in older than in younger adults with both salient (Experiment 1) and non-salient (Experiment 2) target cues. Practicing the PM task in Experiment 2, however, helped execution of the intended action in terms of higher PM performance at faster ongoing-task response times but did not increase the rate of commission errors. The results have important implications for the understanding of older adults' PM commission errors and the processes involved in these errors.
The Significance of the Record Length in Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Senarath, S. U.
2013-12-01
Of all of the potential natural hazards, flood is the most costly in many regions of the world. For example, floods cause over a third of Europe's average annual catastrophe losses and affect about two thirds of the people impacted by natural catastrophes. Increased attention is being paid to determining flow estimates associated with pre-specified return periods so that flood-prone areas can be adequately protected against floods of particular magnitudes or return periods. Flood frequency analysis, which is conducted by using an appropriate probability density function that fits the observed annual maximum flow data, is frequently used for obtaining these flow estimates. Consequently, flood frequency analysis plays an integral role in determining the flood risk in flood prone watersheds. A long annual maximum flow record is vital for obtaining accurate estimates of discharges associated with high return period flows. However, in many areas of the world, flood frequency analysis is conducted with limited flow data or short annual maximum flow records. These inevitably lead to flow estimates that are subject to error. This is especially the case with high return period flow estimates. In this study, several statistical techniques are used to identify errors caused by short annual maximum flow records. The flow estimates used in the error analysis are obtained by fitting a log-Pearson III distribution to the flood time-series. These errors can then be used to better evaluate the return period flows in data limited streams. The study findings, therefore, have important implications for hydrologists, water resources engineers and floodplain managers.
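A minimal log-Pearson III workflow, with a bootstrap showing how record length inflates the uncertainty of the 100-year estimate, might look as follows. The annual maxima are invented for illustration; scipy's pearson3 distribution is fit to the log-transformed flows.

```python
import numpy as np
from scipy import stats

# Annual maximum flows (m^3/s) -- an invented 20-year record, not gauge data.
ams = np.array([412, 530, 298, 871, 640, 455, 1020, 380, 715, 560,
                490, 333, 780, 605, 910, 440, 525, 690, 350, 815], float)
log_q = np.log10(ams)

skew, loc, scale = stats.pearson3.fit(log_q)       # log-Pearson III fit
for T in (10, 50, 100):
    q_T = 10 ** stats.pearson3.ppf(1 - 1 / T, skew, loc, scale)
    print(f"{T:>3}-yr flow estimate: {q_T:7.0f} m^3/s")

# Bootstrap: shorter records widen the 100-yr estimate dramatically.
rng = np.random.default_rng(4)
for n in (10, 20):
    est = []
    for _ in range(200):
        s = rng.choice(log_q, size=n, replace=True)
        sk, lo, sc = stats.pearson3.fit(s)
        est.append(10 ** stats.pearson3.ppf(0.99, sk, lo, sc))
    lo5, hi95 = np.percentile(est, [5, 95])
    print(f"record length {n}: 100-yr flow 5th-95th pct = {lo5:.0f}-{hi95:.0f}")
```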
Errors Affect Hypothetical Intertemporal Food Choice in Women
Sellitto, Manuela; di Pellegrino, Giuseppe
2014-01-01
Growing evidence suggests that the ability to control behavior is enhanced in contexts in which errors are more frequent. Here we investigated whether pairing desirable food with errors could decrease impulsive choice during hypothetical temporal decisions about food. To this end, healthy women performed a Stop-signal task in which one food cue predicted high-error rate, and another food cue predicted low-error rate. Afterwards, we measured participants’ intertemporal preferences during decisions between smaller-immediate and larger-delayed amounts of food. We expected reduced sensitivity to smaller-immediate amounts of food associated with high-error rate. Moreover, taking into account that deprivational states affect sensitivity for food, we controlled for participants’ hunger. Results showed that pairing food with high-error likelihood decreased temporal discounting. This effect was modulated by hunger, indicating that, the lower the hunger level, the more participants showed reduced impulsive preference for the food previously associated with a high number of errors as compared with the other food. These findings reveal that errors, which are motivationally salient events that recruit cognitive control and drive avoidance learning against error-prone behavior, are effective in reducing impulsive choice for edible outcomes. PMID:25244534
Skills, rules and knowledge in aircraft maintenance: errors in context
NASA Technical Reports Server (NTRS)
Hobbs, Alan; Williamson, Ann
2002-01-01
Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.
Hasse, Katelyn; Neylon, John; Sheng, Ke; Santhanam, Anand P
2016-03-01
Breast elastography is a critical tool for improving the targeted radiotherapy treatment of breast tumors. Current breast radiotherapy imaging protocols only involve prone and supine CT scans. There is a lack of knowledge on the quantitative accuracy with which breast elasticity can be systematically measured using only prone and supine CT datasets. The purpose of this paper is to describe a quantitative elasticity estimation technique for breast anatomy using only these supine/prone patient postures. Using biomechanical, high-resolution breast geometry obtained from CT scans, a systematic assessment was performed in order to determine the feasibility of this methodology for clinically relevant elasticity distributions. A model-guided inverse analysis approach is presented in this paper. A graphics processing unit (GPU)-based linear elastic biomechanical model was employed as a forward model for the inverse analysis with the breast geometry in a prone position. The elasticity estimation was performed using a gradient-based iterative optimization scheme and a fast-simulated annealing (FSA) algorithm. Numerical studies were conducted to systematically analyze the feasibility of elasticity estimation. For simulating gravity-induced breast deformation, the breast geometry was anchored at its base, resembling the chest-wall/breast tissue interface. Ground-truth elasticity distributions were assigned to the model, representing tumor presence within breast tissue. Model geometry resolution was varied to estimate its influence on convergence of the system. A priori information was approximated and utilized to record the effect on time and accuracy of convergence. The role of the FSA process was also recorded. A novel error metric that combined elasticity and displacement error was used to quantify the systematic feasibility study. For the authors' purposes, convergence was considered to be obtained when each voxel of tissue was within 1 mm of ground-truth deformation. The authors' analyses showed that a ∼97% model convergence was systematically observed with no a priori information. Varying the model geometry resolution showed no significant accuracy improvements. The GPU-based forward model enabled the inverse analysis to be completed within 10-70 min. Using a priori information about the underlying anatomy, the computation time decreased by as much as 50%, while accuracy improved from 96.81% to 98.26%. The use of FSA was observed to allow the iterative estimation methodology to converge more precisely. By utilizing a forward iterative approach to solve the inverse elasticity problem, this work indicates the feasibility and potential of the fast reconstruction of breast tissue elasticity using supine/prone patient postures.
A Semantic Analysis of XML Schema Matching for B2B Systems Integration
ERIC Educational Resources Information Center
Kim, Jaewook
2011-01-01
One of the most critical steps to integrating heterogeneous e-Business applications using different XML schemas is schema matching, which is known to be costly and error-prone. Many automatic schema matching approaches have been proposed, but the challenge is still daunting because of the complexity of schemas and immaturity of technologies in…
A Logically Centralized Approach for Control and Management of Large Computer Networks
ERIC Educational Resources Information Center
Iqbal, Hammad A.
2012-01-01
Management of large enterprise and Internet service provider networks is a complex, error-prone, and costly challenge. It is widely accepted that the key contributors to this complexity are the bundling of control and data forwarding in traditional routers and the use of fully distributed protocols for network control. To address these…
ERIC Educational Resources Information Center
Dougherty, Michael R.; Sprenger, Amber
2006-01-01
This article introduces 2 new sources of bias in probability judgment, discrimination failure and inhibition failure, which are conceptualized as arising from an interaction between error prone memory processes and a support theory like comparison process. Both sources of bias stem from the influence of irrelevant information on participants'…
Pre-Modeling Ensures Accurate Solid Models
ERIC Educational Resources Information Center
Gow, George
2010-01-01
Successful solid modeling requires a well-organized design tree. The design tree is a list of all the object's features and the sequential order in which they are modeled. The solid-modeling process is faster and less prone to modeling errors when the design tree is a simple and geometrically logical definition of the modeled object. Few high…
An Evaluation of a New Printing Instrument to Aid in Identifying the Failure-prone Preschool Child.
ERIC Educational Resources Information Center
Simner, Marvin L.
Involving 619 preschool children, a longitudinal investigation evaluated a new test for identifying preschool children who produce an excessive number of form errors in printing. All children participating were fluent in English and were in the appropriate grades for their ages, either pre-kindergarten or kindergarten, when they were given the…
Computer programs for optical dendrometer measurements of standing tree profiles
Jacob R. Beard; Thomas G. Matney; Emily B. Schultz
2015-01-01
Tree profile equations are effective volume predictors. Diameter data for building these equations are collected from felled trees using diameter tapes and calipers or from standing trees using optical dendrometers. Developing and implementing a profile function from the collected data is a tedious and error prone task. This study created a computer program, Profile...
Conducting Web-Based Surveys. ERIC Digest.
ERIC Educational Resources Information Center
Solomon, David J.
Web-based surveying is very attractive for many reasons, including reducing the time and cost of conducting a survey and avoiding the often error prone and tedious task of data entry. At this time, Web-based surveys should still be used with caution. The biggest concern at present is coverage bias or bias resulting from sampled people either not…
Practicable group testing method to evaluate weight/weight GMO content in maize grains.
Mano, Junichi; Yanaka, Yuka; Ikezu, Yoko; Onishi, Mari; Futo, Satoshi; Minegishi, Yasutaka; Ninomiya, Kenji; Yotsuyanagi, Yuichi; Spiegelhalter, Frank; Akiyama, Hiroshi; Teshima, Reiko; Hino, Akihiro; Naito, Shigehiro; Koiwa, Tomohiro; Takabatake, Reona; Furui, Satoshi; Kitta, Kazumi
2011-07-13
Because of the increasing use of maize hybrids with genetically modified (GM) stacked events, the established and commonly used bulk sample methods for PCR quantification of GM maize in non-GM maize are prone to overestimate the GM organism (GMO) content, compared to the actual weight/weight percentage of GM maize in the grain sample. As an alternative method, we designed and assessed a group testing strategy in which the GMO content is statistically evaluated based on qualitative analyses of multiple small pools, consisting of 20 maize kernels each. This approach enables the GMO content evaluation on a weight/weight basis, irrespective of the presence of stacked-event kernels. To enhance the method's user-friendliness in routine application, we devised an easy-to-use PCR-based qualitative analytical method comprising a sample preparation step in which 20 maize kernels are ground in a lysis buffer and a subsequent PCR assay in which the lysate is directly used as a DNA template. This method was validated in a multilaboratory collaborative trial.
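The core group-testing arithmetic can be sketched as follows: if x of n pools of k kernels test positive, the per-kernel GM proportion p satisfies 1-(1-p)^k = x/n. The function below implements that point estimate on a kernel-count basis; the paper's weight/weight evaluation additionally involves statistical confidence bounds, and the pool counts here are hypothetical.

```python
def gmo_content(x_positive_pools, n_pools, kernels_per_pool=20):
    # maximum-likelihood point estimate of the per-kernel GM proportion
    if x_positive_pools >= n_pools:
        raise ValueError("all pools positive: p is only bounded from below")
    frac = x_positive_pools / n_pools
    return 1.0 - (1.0 - frac) ** (1.0 / kernels_per_pool)

print(f"{gmo_content(4, 10):.4%}")   # 4 of 10 pools positive -> ~2.52%
```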
Posterior capsular rent: Prevention and management.
Chakrabarti, Arup; Nazm, Nazneen
2017-12-01
This review article deals with a potentially sight-threatening complication - rupture of the posterior capsule - during cataract surgery. Cataract surgery is the most commonly performed surgical procedure in ophthalmology and, despite tremendous technical and technological advancements, posterior capsular rent (PCR) still occurs. PCR occurs in the hands of both experienced senior surgeons and neophyte surgeons, although with a higher frequency in the latter group. Additionally, certain types of cataracts are prone to this development. If managed properly and in a timely manner, the eventual outcome may be no different from that of an uncomplicated case. However, improper management may lead to serious complications with a higher incidence of permanent visual disability. The article covers the management of posterior capsular rent from two perspectives: (1) identifying patients at higher risk and measures to manage such patients through surgical discipline, and (2) intraoperative management of posterior capsular rent in various case scenarios to minimize long-term complications. This review is written for experienced and not-so-experienced eye surgeons alike to understand and manage PCR.
Barakat, Fareed H; Luthra, Rajyalakshmi; Yin, C Cameron; Barkoh, Bedia A; Hai, Seema; Jamil, Waqar; Bhakta, Yaminiben I; Chen, Su; Medeiros, L Jeffrey; Zuo, Zhuang
2011-08-01
Nucleophosmin 1 (NPM1) is the most commonly mutated gene in acute myeloid leukemia. Detection of NPM1 mutations is useful for stratifying patients for therapy, predicting prognosis, and assessing for minimal residual disease. Several methods have been developed to rapidly detect NPM1 mutations in genomic DNA and/or messenger RNA specimens. Our objective was to directly compare a quantitative real-time polymerase chain reaction (qPCR) assay with a widely used capillary electrophoresis assay for detecting NPM1 mutations. We adopted and modified a qPCR assay designed to detect the 6 most common NPM1 mutations and performed the assay in parallel with the capillary electrophoresis assay in 207 bone marrow aspirate or peripheral blood samples from patients with a range of hematolymphoid neoplasms. The qPCR assay demonstrated a higher analytical sensitivity than the capillary electrophoresis assay (1/1,000 versus 1/40, respectively). The capillary electrophoresis assay generated 10 equivocal results that needed to be repeated, whereas the qPCR assay generated only 1 equivocal result. After test conditions were optimized, the qPCR and capillary electrophoresis methods produced 100% concordant results (85 positive and 122 negative). Given its higher analytical sensitivity and specificity, the qPCR assay is less likely to generate equivocal results than the capillary electrophoresis assay. Moreover, the qPCR assay is quantitative, faster, cheaper, less prone to contamination, and well suited for monitoring minimal residual disease.
A Digital PCR-Based Method for Efficient and Highly Specific Screening of Genome Edited Cells
Berman, Jennifer R.; Postovit, Lynne-Marie
2016-01-01
The rapid adoption of gene editing tools such as CRISPRs and TALENs for research and eventually therapeutics necessitates assays that can rapidly detect and quantitate the desired alterations. Currently, the most commonly used assay employs “mismatch nucleases” T7E1 or “Surveyor” that recognize and cleave heteroduplexed DNA amplicons containing mismatched base-pairs. However, this assay is prone to false positives due to cancer-associated mutations and/or SNPs and requires large amounts of starting material. Here we describe a powerful alternative wherein droplet digital PCR (ddPCR) can be used to distinguish homozygous from heterozygous mutations with superior levels of both precision and sensitivity. We use this assay to detect knockout-inducing alterations to stem cell associated proteins, NODAL and SFRP1, generated using either TALENs or an “all-in-one” CRISPR/Cas plasmid that we have modified for one-step cloning and blue/white screening of transformants. Moreover, we highlight how ddPCR can be used to assess the efficiency of varying TALEN-based strategies. Collectively, this work highlights how ddPCR-based screening can be paired with CRISPR and TALEN technologies to enable sensitive, specific, and streamlined approaches to gene editing and validation. PMID:27089539
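For readers unfamiliar with the underlying droplet arithmetic, the sketch below shows the standard Poisson correction by which ddPCR converts positive/negative droplet counts into concentrations; the droplet volume is the nominal ~0.85 nL of common commercial systems, and the counts are hypothetical rather than taken from this study.

```python
import math

def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
    # Poisson correction: mean copies per droplet from the negative fraction
    lam = -math.log((n_total - n_positive) / n_total)
    return lam / (droplet_volume_nl * 1e-3)      # copies per microliter

# hypothetical counts for an edited-allele assay and a reference assay
edited = ddpcr_copies_per_ul(1500, 15000)
reference = ddpcr_copies_per_ul(3000, 15000)
print(f"illustrative edited-allele fraction ~ {edited / (edited + reference):.2f}")
```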
NASA Astrophysics Data System (ADS)
Su, Tengfei
2018-04-01
In this paper, an unsupervised evaluation scheme for remote sensing image segmentation is developed. Based on a method called under- and over-segmentation aware (UOA), the new approach is improved by overcoming the defect in the part of estimating over-segmentation error. Two cases of such error-prone defect are listed, and edge strength is employed to devise a solution to this issue. Two subsets of high resolution remote sensing images were used to test the proposed algorithm, and the experimental results indicate its superior performance, which is attributed to its improved OSE detection model.
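The over-segmentation check can be pictured as measuring image gradients along the shared boundary of adjacent segments: a boundary lying on weak gradients suggests the two segments were split in error. The sketch below is an invented, simplified rendering of that idea, not the UOA formulation itself.

```python
import numpy as np

def boundary_strength(gradient_mag, labels, a, b):
    # mean gradient magnitude where segment a touches segment b
    # (one axis of 4-connectivity shown for brevity)
    touching = (labels[:, :-1] == a) & (labels[:, 1:] == b)
    strengths = gradient_mag[:, :-1][touching]
    return strengths.mean() if strengths.size else np.nan

labels = np.zeros((6, 8), dtype=int); labels[:, 4:] = 1   # synthetic labels
gradient_mag = np.full((6, 8), 0.05)                      # weak edges only
score = boundary_strength(gradient_mag, labels, 0, 1)
print(score < 0.2)   # weak shared boundary -> likely over-segmentation error
```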
Kranz, J; Sommer, K-J; Steffens, J
2014-05-01
Patient safety and risk/complication management rank among the current megatrends in modern medicine, which has undoubtedly become more complex. In the time-critical, error-prone, and difficult situations that recur in everyday clinical practice, guidelines are ill-suited to guiding rapid and intelligent action. The establishment and consistent use of standard operating procedures, as in commercial aviation, offers a possible strategic approach. These medical decision-making aids - quick reference cards - are short, optimized instructions that enable a standardized response in the event of a medical incident.
Skull registration for prone patient position using tracked ultrasound
NASA Astrophysics Data System (ADS)
Underwood, Grace; Ungi, Tamas; Baum, Zachary; Lasso, Andras; Kronreif, Gernot; Fichtinger, Gabor
2017-03-01
PURPOSE: Tracked navigation has become prevalent in neurosurgery. Problems with registration of a patient and a preoperative image arise when the patient is in a prone position. Surfaces accessible to optical tracking on the back of the head are unreliable for registration. We investigated the accuracy of surface-based registration using points accessible through tracked ultrasound. Ultrasound gives access to bone surfaces that are not available through optical tracking. Tracked ultrasound could eliminate the need (i) to work under the table for registration and (ii) to adjust the tracker between registration and surgery. In addition, tracked ultrasound could provide a non-invasive alternative to a registration method involving screw implantation. METHODS: A phantom study was performed to test the feasibility of tracked ultrasound for registration. An initial registration was performed to partially align the preoperative computed tomography data and the skull phantom, using anatomical landmarks. Surface points accessible by tracked ultrasound were then collected and used to perform an iterative closest point (ICP) registration. RESULTS: When the surface registration was compared to a ground-truth landmark registration, the average TRE was 1.6 ± 0.1 mm and the average distance of points off the skull surface was 0.6 ± 0.1 mm. CONCLUSION: The use of tracked ultrasound is feasible for registration of patients in the prone position and eliminates the need to perform registration under the table. The translational component of the error was minimal; the TRE is therefore due mainly to a rotational error component.
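A minimal version of the surface-registration step (rigid point-to-point ICP with an SVD-based update) is sketched below on synthetic data; the real pipeline operates on tracked-ultrasound bone points after landmark initialization, which this toy example replaces with random point clouds.

```python
import numpy as np

def best_rigid(src, dst):
    # least-squares rigid transform (Kabsch): returns R, t with R@src+t ~ dst
    cs, cd = src.mean(0), dst.mean(0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def icp(points, surface, iters=30):
    R_tot, t_tot, cur = np.eye(3), np.zeros(3), points.copy()
    for _ in range(iters):
        d = ((cur[:, None, :] - surface[None, :, :]) ** 2).sum(-1)
        R, t = best_rigid(cur, surface[d.argmin(1)])  # closest-point matches
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot

rng = np.random.default_rng(1)
surface = rng.normal(size=(500, 3))                   # stand-in skull surface
th = 0.14
R_true = np.array([[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0],
                   [0, 0, 1.0]])
points = surface[:40] @ R_true.T + np.array([0.1, -0.05, 0.02])
R, t = icp(points, surface)
print(np.round(R @ R_true, 2))                        # ~identity on success
```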
Comparison of three PCR-based assays for SNP genotyping in sugar beet
USDA-ARS?s Scientific Manuscript database
Background: PCR allelic discrimination technologies have broad applications in the detection of single nucleotide polymorphisms (SNPs) in genetics and genomics. The use of fluorescence-tagged probes is the leading method for targeted SNP detection, but assay costs and error rates could be improved t...
Acharya, Kamal R.; Dhand, Navneet K.; Whittington, Richard J.; Plain, Karren M.
2017-01-01
Molecular tests such as polymerase chain reaction (PCR) are increasingly being applied for the diagnosis of Johne's disease, a chronic intestinal infection of ruminants caused by Mycobacterium avium subspecies paratuberculosis (MAP). Feces, as the primary test sample, presents challenges in terms of effective DNA isolation, with potential for PCR inhibition and ultimately for reduced analytical and diagnostic sensitivity. However, limited evidence is available regarding the magnitude and diagnostic implications of PCR inhibition for the detection of MAP in feces. This study aimed to investigate the presence and diagnostic implications of PCR inhibition in a quantitative PCR assay for MAP (High-throughput Johne's test), to characterize the samples prone to inhibition, and to identify measures that can be taken to overcome it. In a study of fecal samples derived from a high-prevalence, endemically infected cattle herd, 19.94% of fecal DNA extracts showed some evidence of inhibition. Relief of inhibition by a five-fold dilution of the DNA extract led to an average 3.3-fold increase in DNA quantification, which in turn increased the test sensitivity of the qPCR from 55 to 80% compared to fecal culture. DNA extracts with higher DNA and protein content had 19.33 and 10.94 times higher odds of showing inhibition, respectively. The results suggest that the current test protocol is sensitive for herd-level diagnosis of Johne's disease but that test sensitivity and individual-level diagnosis could be enhanced by relief of PCR inhibition, achieved by five-fold dilution of the DNA extract. Furthermore, qualitative and quantitative parameters derived from absorbance measures of DNA extracts could be useful for predicting inhibitory fecal samples. PMID:28210245
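The inhibition arithmetic above can be checked with qPCR cycle values: under perfect doubling, a five-fold dilution should raise Cq by log2(5) ≈ 2.32 cycles, and any shortfall in the observed shift translates into the fold by which the neat extract underestimated the target. The Cq values below are hypothetical, chosen to reproduce a ~3.3-fold effect like the one reported.

```python
import math

E = 2.0                                  # assumed per-cycle amplification
cq_neat, cq_diluted = 30.0, 30.6         # hypothetical Cq results
expected_shift = math.log(5, E)          # ~2.32 cycles for a 5x dilution
fold_underestimate = E ** (expected_shift - (cq_diluted - cq_neat))
print(f"neat extract underestimated target ~{fold_underestimate:.1f}-fold")
```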
Cheng, Hong; Macaluso, Maurizio; Vermund, Sten H.; Hook, Edward W.
2001-01-01
Published estimates of the sensitivity and specificity of PCR and ligase chain reaction (LCR) for detecting Chlamydia trachomatis are potentially biased because of study design limitations (confirmation of test results was limited to subjects who were PCR or LCR positive but culture negative). Relative measures of test accuracy are less prone to bias in incomplete study designs. We estimated the relative sensitivity (RSN) and relative false-positive rate (RFP) for PCR and LCR versus cell culture among 1,138 asymptomatic men and evaluated the potential bias of the RSN and RFP estimates. PCR and LCR testing of urine was compared to culture of urethral specimens. Discordant results (PCR or LCR positive, but culture negative) were confirmed using a testing sequence that included the other DNA amplification test, direct fluorescent antibody testing, and a DNA amplification test to detect chlamydial major outer membrane protein. The RSN estimates for PCR and LCR were 1.45 (95% confidence interval [CI] = 1.3 to 1.7) and 1.49 (95% CI = 1.3 to 1.7), respectively, indicating that both methods are more sensitive than culture. Very few false-positive results were found, indicating that the specificities of PCR, LCR, and culture are all high. The potential biases in the RSN and RFP estimates were <5% and <20%, respectively. The estimation of bias is based on the most plausible, and probably conservative, parameter settings. If the sensitivity of culture is between 60 and 65%, then the true sensitivity of PCR and LCR is between 90 and 97%. Our findings indicate that PCR and LCR are significantly more sensitive than culture, while the three tests have similar specificities. PMID:11682509
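The relative-accuracy arithmetic reduces to ratios of detections among confirmed infections, as in the hedged sketch below; the counts are hypothetical stand-ins, not the study's data.

```python
# relative sensitivity (RSN): ratio of confirmed true positives detected
pcr_pos, culture_pos = 93, 64            # hypothetical detections
rsn = pcr_pos / culture_pos
print(f"RSN = {rsn:.2f}")                # ~1.45: PCR finds ~45% more cases
# If culture sensitivity were ~64%, PCR sensitivity would be about
# 0.64 * 1.45 ~ 0.93, consistent with the 90-97% range quoted above.
```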
13Check_RNA: A tool to evaluate 13C chemical shifts assignments of RNA.
Icazatti, A A; Martin, O A; Villegas, M; Szleifer, I; Vila, J A
2018-06-19
Chemical shifts (CS) are an important source of structural information for macromolecules such as RNA. In addition to the scarce availability of CS for RNA, the observed values are prone to errors due to wrong re-calibration or mis-assignments. Different groups have dedicated their efforts to correcting systematic errors in RNA CS. Despite this, there are no automated, freely available algorithms for correcting the assignments of RNA 13C CS before their deposition to the BMRB, or for re-referencing already deposited CS with systematic errors. Based on an existing method, we have implemented an open-source Python module to correct systematic errors in 13C CS (from here on 13Cexp) of RNAs and return the results in 3 formats, including the nmrstar one. This software is available on GitHub at https://github.com/BIOS-IMASL/13Check_RNA under a MIT license. Supplementary data are available at Bioinformatics online.
Saldanha, J; Silvy, M; Beaufils, N; Arlinghaus, R; Barbany, G; Branford, S; Cayuela, J-M; Cazzaniga, G; Gonzalez, M; Grimwade, D; Kairisto, V; Miyamura, K; Lawler, M; Lion, T; Macintyre, E; Mahon, F-X; Muller, M C; Ostergaard, M; Pfeifer, H; Saglio, G; Sawyers, C; Spinelli, O; van der Velden, V H J; Wang, J Q; Zoi, K; Patel, V; Phillips, P; Matejtschuk, P; Gabert, J
2007-07-01
Monitoring of BCR-ABL transcripts has become established practice in the management of chronic myeloid leukemia. However, nucleic acid amplification techniques are prone to variations that limit the reliability of real-time quantitative PCR (RQ-PCR) for clinical decision making, highlighting the need for standardization of assays and of the reporting of minimal residual disease (MRD) data. We evaluated a lyophilized preparation of a leukemic cell line (K562) as a potential quality control reagent. This was found to be relatively stable, yielding comparable respective levels of ABL, GUS and BCR-ABL transcripts as determined by RQ-PCR before and after accelerated degradation experiments, as well as following 5 years of storage at -20 degrees C. Vials of freeze-dried cells were sent at ambient temperature to 22 laboratories on four continents, with RQ-PCR analyses detecting BCR-ABL transcripts at levels comparable to those observed in primary patient samples. Our results suggest that freeze-dried cells can be used as quality control reagents with a range of analytical instruments and could enable the development of urgently needed international standards simulating clinically relevant levels of MRD.
Corrected score estimation in the proportional hazards model with misclassified discrete covariates
Zucker, David M.; Spiegelman, Donna
2013-01-01
We consider Cox proportional hazards regression when the covariate vector includes error-prone discrete covariates along with error-free covariates, which may be discrete or continuous. The misclassification in the discrete error-prone covariates is allowed to be of any specified form. Building on the work of Nakamura and his colleagues, we present a corrected score method for this setting. The method can handle all three major study designs (internal validation design, external validation design, and replicate measures design), both functional and structural error models, and time-dependent covariates satisfying a certain ‘localized error’ condition. We derive the asymptotic properties of the method and indicate how to adjust the covariance matrix of the regression coefficient estimates to account for estimation of the misclassification matrix. We present the results of a finite-sample simulation study under Weibull survival with a single binary covariate having known misclassification rates. The performance of the method described here was similar to that of related methods we have examined in previous works. Specifically, our new estimator performed as well as or, in a few cases, better than the full Weibull maximum likelihood estimator. We also present simulation results for our method for the case where the misclassification probabilities are estimated from an external replicate measures study. Our method generally performed well in these simulations. The new estimator has a broader range of applicability than many other estimators proposed in the literature, including those described in our own earlier work, in that it can handle time-dependent covariates with an arbitrary misclassification structure. We illustrate the method on data from a study of the relationship between dietary calcium intake and distal colon cancer. PMID:18219700
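The misclassification setup can be summarized in a short sketch (our rendering of the standard matrix formulation, not the authors' full estimator): with a known or estimated misclassification matrix, indicator vectors of the observed covariate can be corrected so that their conditional expectation equals the true indicators.

```latex
% W = observed (error-prone) category, X = true category, e(x) = indicator
% vector of category x, M = misclassification matrix.
\[
  M_{jk} = \Pr(W = j \mid X = k), \qquad
  \mathbb{E}\{e(W) \mid X\} = M\, e(X),
\]
\[
  e^{*}(W) := M^{-1} e(W)
  \quad\Longrightarrow\quad
  \mathbb{E}\{e^{*}(W) \mid X\} = e(X),
\]
% so corrected terms built from e^{*}(W) can replace e(X) in the partial-
% likelihood score, yielding a corrected score equation.
```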
Intransparent German number words complicate transcoding - a translingual comparison with Japanese.
Moeller, Korbinian; Zuber, Julia; Olsen, Naoko; Nuerk, Hans-Christoph; Willmes, Klaus
2015-01-01
Superior early numerical competencies of children in several Asian countries have (amongst others) been attributed to the higher transparency of their number word systems. Here, we directly investigated this claim by evaluating whether Japanese children's transcoding performance when writing numbers to dictation (e.g., "twenty five" → 25) was less error prone than that of German-speaking children - both in general as well as when considering language-specific attributes of the German number word system such as the inversion property, in particular. In line with this hypothesis we observed that German-speaking children committed more transcoding errors in general than their Japanese peers. Moreover, their error pattern reflected the specific inversion intransparency of the German number-word system. Inversion errors in transcoding represented the most prominent error category in German-speaking children, but were almost absent in Japanese-speaking children. We conclude that the less transparent German number-word system complicates the acquisition of the correspondence between symbolic Arabic numbers and their respective verbal number words.
ERIC Educational Resources Information Center
Rast, Philippe; Zimprich, Daniel; Van Boxtel, Martin; Jolles, Jellemer
2009-01-01
The Cognitive Failures Questionnaire (CFQ) is designed to assess a person's proneness to committing cognitive slips and errors in the completion of everyday tasks. Although the CFQ is a widely used instrument, its factor structure remains an issue of scientific debate. The present study used data of a representative sample (N = 1,303, 24-83 years…
Ground-based digital imagery for tree stem analysis
Neil Clark; Daniel L. Schmoldt; Randolph H. Wynne; Matthew F. Winn; Philip A. Araman
2000-01-01
In the USA, a subset of permanent forest sample plots within each geographic region are intensively measured to obtain estimates of tree volume and products. The detailed field measurements required for this type of sampling are both time consuming and error prone. We are attempting to reduce both of these factors with the aid of a commercially-available solid-state...
USDA-ARS?s Scientific Manuscript database
We investigated measurement error in the self-reported diets of US Hispanics/Latinos, who are prone to obesity and related comorbidities, by background (Central American, Cuban, Dominican, Mexican, Puerto Rican, and South American) in 2010–2012. In 477 participants aged 18–74 years, doubly labeled w...
ERIC Educational Resources Information Center
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Mattias
2017-01-01
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…
Aboussekhra, A; Chanet, R; Zgaga, Z; Cassier-Chauvat, C; Heude, M; Fabre, F
1989-09-25
A new type of radiation-sensitive mutant of S. cerevisiae is described. The recessive radH mutation sensitizes haploids in the G1, but not the G2, mitotic phase to the lethal effect of UV radiation. Homozygous diploids are as sensitive as G1 haploids. UV-induced mutagenesis is depressed, while the induction of gene conversion is increased. The mutation is believed to channel the repair of lesions engaged in the mutagenic pathway into a recombination process, successful if the events involve sister chromatids but lethal if they involve homologous chromosomes. The sequence of the RADH gene reveals that it may code for a DNA helicase with an Mr of 134 kDa. All the consensus domains of known DNA helicases are present. Besides these consensus regions, strong homologies with the Rep and UvrD helicases of E. coli were found. The RadH putative helicase appears to belong to the set of proteins involved in the error-prone repair mechanism, at least for UV-induced lesions, and could act in coordination with the Rev3 error-prone DNA polymerase.
Whitman, Richard L.; Ge, Zhongfu; Nevers, Meredith B.; Boehm, Alexandria B.; Chern, Eunice C.; Haugland, Richard A.; Lukasik, Ashley M.; Molina, Marirosa; Przybyla-Kelly, Kasia; Shively, Dawn A.; White, Emily M.; Zepp, Richard G.; Byappanahalli, Muruleedhara N.
2010-01-01
The quantitative polymerase chain reaction (qPCR) method provides rapid estimates of fecal indicator bacteria densities that have been shown to be useful in the assessment of water quality. Primarily because this method provides faster results than standard culture-based methods, the U.S. Environmental Protection Agency is currently considering its use as a basis for revised ambient water quality criteria. In anticipation of this possibility, we sought to examine the relationship between qPCR-based and culture-based estimates of enterococci in surface waters. Using data from several research groups, we compared enterococci estimates by the two methods in water samples collected from 37 sites across the United States. A consistent linear pattern in the relationship between cell equivalents (CCE), based on the qPCR method, and colony-forming units (CFU), based on the traditional culture method, was significant at higher target densities (log10 CFU > 2.0/100 mL), while uncertainty increases at lower CFU values. It was further noted that the relative error in replicated qPCR estimates was generally higher than that in replicated culture counts even at relatively high target levels, suggesting a greater need for replicated analyses in the qPCR method to reduce relative error. Further studies evaluating the relationship between culture and qPCR should take into account analytical uncertainty as well as potential differences in results of these methods that may arise from sample variability, different sources of pollution, and environmental factors.
Magellan spacecraft and memory state tracking: Lessons learned, future thoughts
NASA Technical Reports Server (NTRS)
Bucher, Allen W.
1993-01-01
Numerous studies have been dedicated to improving the two main elements of Spacecraft Mission Operations: Command and Telemetry. As a result, not much attention has been given to other tasks that can become tedious, repetitive, and error prone. One such task is Spacecraft and Memory State Tracking, the process by which the status of critical spacecraft components, parameters, and the contents of on-board memory are managed on the ground to maintain knowledge of spacecraft and memory states for future testing, anomaly investigation, and on-board memory reconstruction. The task of Spacecraft and Memory State Tracking has traditionally been a manual task allocated to Mission Operations Procedures. During nominal Mission Operations this job is tedious and error prone. Because the task is not complex and can be accomplished manually, the worth of a sophisticated software tool is often questioned. However, in the event of an anomaly which alters spacecraft components autonomously or a memory anomaly such as a corrupt memory or flight software error, an accurate ground image that can be reconstructed quickly is a priceless commodity. This study explores the process of Spacecraft and Memory State Tracking used by the Magellan Spacecraft Team highlighting its strengths as well as identifying lessons learned during the primary and extended missions, two memory anomalies, and other hardships encountered due to incomplete knowledge of spacecraft states. Ideas for future state tracking tools that require minimal user interaction and are integrated into the Ground Data System will also be discussed.
Huang, Chengqiang; Yang, Youchang; Wu, Bo; Yu, Weize
2018-06-01
The sub-pixel arrangement of the RGBG panel differs from that of an RGB-format image, so an algorithm that converts RGB to RGBG is needed to display an RGB image on an RGBG panel. However, published approaches to this conversion still lose substantial information, even though they weaken color fringing artifacts. In this paper, an RGB-to-RGBG conversion algorithm with adaptive weighting factors based on edge detection and minimal square error (EDMSE) is proposed. The main points of innovation include the following: (1) edge detection is first used to distinguish image details with serious color fringing artifacts from image details that are prone to being lost in the RGB-RGBG conversion; (2) for image details with serious color fringing artifacts, a weighting factor of 0.5 is applied to weaken the artifacts; and (3) for image details that are prone to being lost in the conversion, a special mechanism to minimize square error is applied. Experiments show that color fringing artifacts are slightly reduced by EDMSE, and the MSE values of the processed image are 19.6% and 7% smaller than those of images processed by the direct-assignment and weighting-factor algorithms, respectively. The proposed algorithm is implemented on a field programmable gate array to enable image display on the RGBG panel.
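The two-branch structure of the conversion can be sketched as below: a crude horizontal-gradient edge test selects the 0.5/0.5 blend (to suppress fringing), while non-edge pixels favor their own value. The edge test and the 0.8/0.2 fallback weights are invented stand-ins for the paper's square-error minimization.

```python
import numpy as np

def rgb_to_rgbg(img, edge_thresh=0.25):
    # img: (H, W, 3) float array in [0, 1]
    h, w, _ = img.shape
    g = img[..., 1]
    edge = np.zeros((h, w), bool)
    edge[:, 1:-1] = np.abs(g[:, 2:] - g[:, :-2]) > edge_thresh
    out = img.copy()
    for x in range(w):
        xn = min(x + 1, w - 1)               # neighbor sharing the subpixel
        c = 2 if x % 2 == 0 else 0           # RG pixels lack B; BG pixels lack R
        wgt = np.where(edge[:, x], 0.5, 0.8) # 0.5/0.5 blend only on edges
        out[:, x, c] = wgt * img[:, x, c] + (1 - wgt) * img[:, xn, c]
    return out

print(rgb_to_rgbg(np.random.rand(4, 8, 3)).shape)   # (4, 8, 3)
```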
Chancey, Eric T; Bliss, James P; Yamani, Yusuke; Handley, Holly A H
2017-05-01
This study provides a theoretical link between trust and the compliance-reliance paradigm. We propose that for trust mediation to occur, the operator must be presented with a salient choice, and there must be an element of risk for dependence. Research suggests that false alarms and misses affect dependence via two independent processes, hypothesized as trust in signals and trust in nonsignals. These two trust types manifest in categorically different behaviors: compliance and reliance. Eighty-eight participants completed a primary flight task and a secondary signaling system task. Participants evaluated their trust according to the informational bases of trust: performance, process, and purpose. Participants were in a high- or low-risk group. Signaling systems varied by reliability (90%, 60%) within subjects and error bias (false alarm prone, miss prone) between subjects. False-alarm rate affected compliance but not reliance. Miss rate affected reliance but not compliance. Mediation analyses indicated that trust mediated the relationship between false-alarm rate and compliance. Bayesian mediation analyses favored evidence indicating trust did not mediate miss rate and reliance. Conditional indirect effects indicated that factors of trust mediated the relationship between false-alarm rate and compliance (i.e., purpose) and reliance (i.e., process) but only in the high-risk group. The compliance-reliance paradigm is not the reflection of two types of trust. This research could be used to update training and design recommendations that are based upon the assumption that trust causes operator responses regardless of error bias.
Using Block-local Atomicity to Detect Stale-value Concurrency Errors
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Biere, Armin
2004-01-01
Data races do not cover all kinds of concurrency errors. This paper presents a data-flow-based technique to find stale-value errors, which are not found by low-level and high-level data race algorithms. Stale values denote copies of shared data where the copy is no longer synchronized. The algorithm to detect such values works as a consistency check that does not require any assumptions or annotations of the program. It has been implemented as a static analysis in JNuke. The analysis is sound and requires only a single execution trace if implemented as a run-time checking algorithm. Being based on an analysis of Java bytecode, it encompasses the full program semantics, including arbitrarily complex expressions. Related techniques are more complex and more prone to over-reporting.
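A stale-value error of the kind the analysis targets can be transcribed to a toy Python example: each shared-variable access is individually locked, so no data race exists, yet a copied value goes stale between critical sections and updates are lost.

```python
import threading

balance = 100
lock = threading.Lock()

def withdraw(amount):
    global balance
    with lock:
        local = balance            # copy of shared state
    # lock released: 'local' may now be stale
    with lock:
        balance = local - amount   # writes back a possibly stale value

threads = [threading.Thread(target=withdraw, args=(10,)) for _ in range(8)]
for t in threads: t.start()
for t in threads: t.join()
print(balance)   # 20 if serialized; may be higher when updates are lost
```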
Using Audit Information to Adjust Parameter Estimates for Data Errors in Clinical Trials
Shepherd, Bryan E.; Shaw, Pamela A.; Dodd, Lori E.
2013-01-01
Background: Audits are often performed to assess the quality of clinical trial data, but beyond detecting fraud or sloppiness, the audit data are generally ignored. In earlier work using data from a non-randomized study, Shepherd and Yu (2011) developed statistical methods to incorporate audit results into study estimates and demonstrated that audit data could be used to eliminate bias. Purpose: In this manuscript we examine the usefulness of audit-based error-correction methods in clinical trial settings where a continuous outcome is of primary interest. Methods: We demonstrate the bias of multiple linear regression estimates in general settings with an outcome that may have errors and a set of covariates for which some may have errors and others, including treatment assignment, are recorded correctly for all subjects. We study this bias under different assumptions, including independence between treatment assignment, covariates, and data errors (conceivable in a double-blinded randomized trial) and independence between treatment assignment and covariates but not data errors (possible in an unblinded randomized trial). We review moment-based estimators to incorporate the audit data and propose new multiple imputation estimators. The performance of estimators is studied in simulations. Results: When treatment is randomized and unrelated to data errors, estimates of the treatment effect using the original error-prone data (i.e., ignoring the audit results) are unbiased. In this setting, both moment and multiple imputation estimators incorporating audit data are more variable than standard analyses using the original data. In contrast, in settings where treatment is randomized but correlated with data errors, and in settings where treatment is not randomized, standard treatment effect estimates will be biased. And in all settings, parameter estimates for the original, error-prone covariates will be biased. Treatment and covariate effect estimates can be corrected by incorporating audit data using either the multiple imputation or moment-based approaches. Bias, precision, and coverage of confidence intervals improve as the audit size increases. Limitations: The extent of bias and the performance of methods depend on the extent and nature of the error as well as the size of the audit. This work only considers methods for the linear model. Settings much different than those considered here need further study. Conclusions: In randomized trials with continuous outcomes and treatment assignment independent of data errors, standard analyses of treatment effects will be unbiased and are recommended. However, if treatment assignment is correlated with data errors or other covariates, naive analyses may be biased. In these settings, and when covariate effects are of interest, approaches for incorporating audit results should be considered. PMID:22848072
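A toy simulation makes the attenuation-and-correction logic concrete: an error-prone covariate biases its estimated slope toward zero, and an audited subsample with true values supports a regression-calibration-style fix. This mirrors the spirit, not the exact estimators, of the methods above.

```python
import numpy as np

rng = np.random.default_rng(2)
n, audit_n = 5000, 500
x = rng.normal(size=n)                      # true covariate
w = x + rng.normal(size=n)                  # error-prone recorded version
y = 2.0 * x + rng.normal(size=n)            # outcome; true slope = 2

naive = np.polyfit(w, y, 1)[0]              # attenuated (~1.0 here)

audit = rng.choice(n, audit_n, replace=False)
lam = np.polyfit(w[audit], x[audit], 1)[0]  # reliability from audited pairs
print(f"naive={naive:.2f}  corrected={naive / lam:.2f}  (truth 2.0)")
```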
Non-stochastic sampling error in quantal analyses for Campylobacter species on poultry products
USDA-ARS?s Scientific Manuscript database
Using primers and fluorescent probes specific for the most common foodborne Campylobacter species (C. jejuni = Cj and C. coli = Cc), we developed a multiplex, most probable number (MPN) assay using quantitative PCR (qPCR) as the determinant for binomial detection: number of p positives out of n = 6 ...
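For a single-dilution design like this (p positive reactions out of n = 6), the MPN arithmetic reduces to the Poisson maximum-likelihood estimate sketched below; treating qPCR positives as binomial detections is our reading of the truncated abstract.

```python
import math

def mpn_per_volume(p_positive, n=6):
    # MLE of mean organisms per reaction volume from p positives of n
    if p_positive >= n:
        raise ValueError("all reactions positive: MPN unbounded")
    return -math.log((n - p_positive) / n)

for p in range(6):
    print(p, f"{mpn_per_volume(p):.3f} organisms per reaction volume")
```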
Dell, Gary S.; Martin, Nadine; Schwartz, Myrna F.
2010-01-01
Lexical access in language production, and particularly pathologies of lexical access, are often investigated by examining errors in picture naming and word repetition. In this article, we test a computational approach to lexical access, the two-step interactive model, by examining whether the model can quantitatively predict the repetition-error patterns of 65 aphasic subjects from their naming errors. The model’s characterizations of the subjects’ naming errors were taken from the companion paper to this one (Schwartz, Dell, N. Martin, Gahl & Sobel, 2006), and their repetition was predicted from the model on the assumption that naming involves two error-prone steps, word and phonological retrieval, whereas repetition only creates errors in the second of these steps. A version of the model in which lexical-semantic and lexical-phonological connections could be independently lesioned was generally successful in predicting repetition for the aphasics. An analysis of the few cases in which model predictions were inaccurate revealed the role of input phonology in the repetition task. PMID:21085621
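The model's central assumption has a simple probabilistic reading, sketched below with hypothetical per-step accuracies: naming succeeds only if both retrieval steps succeed, while repetition depends on the phonological step alone.

```python
# hypothetical per-step success probabilities for one patient
p_word, p_phon = 0.70, 0.85

p_naming = p_word * p_phon      # both steps must succeed in naming
p_repetition = p_phon           # repetition bypasses word retrieval

print(f"predicted naming accuracy:     {p_naming:.2f}")
print(f"predicted repetition accuracy: {p_repetition:.2f}")
# Repetition should exceed naming whenever word retrieval is damaged,
# the pattern the article tests across 65 aphasic subjects.
```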
Portable and Error-Free DNA-Based Data Storage.
Yazdi, S M Hossein Tabatabaei; Gabrys, Ryan; Milenkovic, Olgica
2017-07-10
DNA-based data storage is an emerging nonvolatile memory technology of potentially unprecedented density, durability, and replication efficiency. The basic system implementation steps include synthesizing DNA strings that contain user information and subsequently retrieving them via high-throughput sequencing technologies. Existing architectures enable reading and writing but do not offer random-access and error-free data recovery from low-cost, portable devices, which is crucial for making the storage technology competitive with classical recorders. Here we show for the first time that a portable, random-access platform may be implemented in practice using nanopore sequencers. The novelty of our approach is to design an integrated processing pipeline that encodes data to avoid costly synthesis and sequencing errors, enables random access through addressing, and leverages efficient portable sequencing via new iterative alignment and deletion error-correcting codes. Our work represents the only known random access DNA-based data storage system that uses error-prone nanopore sequencers, while still producing error-free readouts with the highest reported information rate/density. As such, it represents a crucial step towards practical employment of DNA molecules as storage media.
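As a flavor of the deletion-correcting machinery such pipelines rely on, the sketch below implements a binary Varshamov-Tenengolts code, the classic single-deletion-correcting construction; real DNA storage codes work over a 4-letter alphabet and handle more error types, so this is only a conceptual miniature.

```python
from itertools import product

def vt_syndrome(bits):
    # VT checksum: sum of i * b_i (1-indexed) modulo n + 1
    return sum(i * b for i, b in enumerate(bits, 1)) % (len(bits) + 1)

def vt_codewords(n, a=0):
    return [w for w in product((0, 1), repeat=n) if vt_syndrome(w) == a]

def correct_one_deletion(received, n, a=0):
    # brute-force reinsertion; the VT property guarantees a unique match
    hits = {received[:i] + (b,) + received[i:]
            for i in range(n) for b in (0, 1)}
    hits = {w for w in hits if vt_syndrome(w) == a}
    return hits.pop() if len(hits) == 1 else None

word = vt_codewords(8)[5]
received = word[:3] + word[4:]                 # one symbol deleted
print(word, correct_one_deletion(received, 8) == word)
```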
DNA assembly with error correction on a droplet digital microfluidics platform.
Khilko, Yuliya; Weyman, Philip D; Glass, John I; Adams, Mark D; McNeil, Melanie A; Griffin, Peter B
2018-06-01
Custom synthesized DNA is in high demand for synthetic biology applications. However, current technologies to produce these sequences using assembly from DNA oligonucleotides are costly and labor-intensive. The automation and reduced sample volumes afforded by microfluidic technologies could significantly decrease the materials and labor costs associated with DNA synthesis. The purpose of this study was to develop a gene assembly protocol utilizing a digital microfluidic device. Toward this goal, we adapted bench-scale oligonucleotide assembly methods followed by enzymatic error correction to the Mondrian™ digital microfluidic platform. We optimized Gibson assembly, polymerase chain reaction (PCR), and enzymatic error correction reactions in a single protocol to assemble 12 oligonucleotides into a 339-bp double-stranded DNA sequence encoding part of the human influenza virus hemagglutinin (HA) gene. The reactions were scaled down to 0.6-1.2 μL. Initial microfluidic assembly methods were successful and had an error frequency of approximately 4 errors/kb, with errors originating from the original oligonucleotide synthesis. Relative to conventional benchtop procedures, PCR optimization required additional amounts of MgCl2, Phusion polymerase, and PEG 8000 to achieve amplification of the assembly and error correction products. After one round of error correction, the error frequency was reduced to an average of 1.8 errors/kb. We demonstrated that DNA assembly from oligonucleotides and error correction could be completely automated on a digital microfluidic (DMF) platform. The results demonstrate that enzymatic reactions in droplets show a strong dependence on surface interactions, and successful on-chip implementation required supplementation with surfactants, molecular crowding agents, and an excess of enzyme. Enzymatic error correction of assembled fragments improved sequence fidelity by 2-fold, which was a significant improvement but somewhat lower than expected compared to bench-top assays, suggesting an additional capacity for optimization.
Hunter, Margaret; Dorazio, Robert M.; Butterfield, John S.; Meigs-Friend, Gaia; Nico, Leo; Ferrante, Jason A.
2017-01-01
A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species’ presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty – indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications also could benefit from a standardized LOD such as GMO food analysis, and forensic and clinical diagnostics.
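An empirical version of the detection-limit idea can be sketched from replicate detection data: estimate the detection probability at each nominal concentration and take the lowest concentration meeting a 95% criterion. The data below are synthetic, and the published method's statistical criteria are more formal than this shortcut.

```python
import numpy as np

conc = np.array([0.5, 1, 2, 5, 10, 50])       # copies per reaction (nominal)
detected = np.array([3, 7, 14, 19, 20, 20])   # positives out of 20 replicates
p = detected / 20

candidates = conc[p >= 0.95]
lod95 = candidates.min() if candidates.size else None
print("empirical LOD95 ~", lod95, "copies/reaction")
# Poisson-limited detection predicts p = 1 - exp(-c); at 5 copies/reaction
# that bound is ~0.993, so the synthetic assay is near the physical limit.
print("Poisson bound at LOD95:", 1 - np.exp(-lod95))
```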
Spatial calibration of an optical see-through head mounted display
Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew
2010-01-01
We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125
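The underlying photogrammetric step is standard camera calibration; a generic OpenCV rendering is sketched below (checkerboard images and filenames are hypothetical, and the paper's HMD-specific handling of display features is not shown).

```python
import numpy as np
import cv2

pattern = (7, 6)                                   # inner checkerboard corners
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for fname in ["view1.png", "view2.png", "view3.png"]:   # hypothetical images
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# recovers intrinsics (focal length, optic centre) and per-view extrinsics
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("re-projection RMS error:", rms)
print("intrinsic matrix K:\n", K)
```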
Samsiah, A; Othman, Noordin; Jamshed, Shazia; Hassali, Mohamed Azmi; Wan-Mohaina, W M
2016-12-01
Reporting and analysing the data on medication errors (MEs) is important and contributes to a better understanding of the error-prone environment. This study aims to examine the characteristics of errors submitted to the National Medication Error Reporting System (MERS) in Malaysia. A retrospective review of reports received from 1 January 2009 to 31 December 2012 was undertaken. Descriptive statistics method was applied. A total of 17,357 MEs reported were reviewed. The majority of errors were from public-funded hospitals. Near misses were classified in 86.3 % of the errors. The majority of errors (98.1 %) had no harmful effects on the patients. Prescribing contributed to more than three-quarters of the overall errors (76.1 %). Pharmacists detected and reported the majority of errors (92.1 %). Cases of erroneous dosage or strength of medicine (30.75 %) were the leading type of error, whilst cardiovascular (25.4 %) was the most common category of drug found. MERS provides rich information on the characteristics of reported MEs. Low contribution to reporting from healthcare facilities other than government hospitals and non-pharmacists requires further investigation. Thus, a feasible approach to promote MERS among healthcare providers in both public and private sectors needs to be formulated and strengthened. Preventive measures to minimise MEs should be directed to improve prescribing competency among the fallible prescribers identified.
Matsui, Daisuke; Okazaki, Seiji; Matsuda, Motoki; Asano, Yasuhisa
2015-02-20
Microbial NAD(+)-dependent L-tryptophan dehydrogenase (TrpDH, EC 1.4.1.19), which catalyzes the reversible oxidative deamination and reductive amination between L-tryptophan and indole-3-pyruvic acid, was found in the scytonemin biosynthetic pathway of Nostoc punctiforme ATCC29133. TrpDH exhibited high specificity toward L-tryptophan, but its instability was a drawback for L-tryptophan determination. The thermostable mutant enzyme TrpDH L59F/D168G/A234D/I296N was obtained by screening Escherichia coli transformants harboring various mutant genes, generated by error-prone PCR, using complementation in an L-tryptophan auxotroph of E. coli. The specific activity and stability of this mutant enzyme were higher than those of the wild-type enzyme. Of these four mutation points, the residues Asp168 and Ile296 contributed to the increased enzyme stability, and the Leu59 and Ala234 residues to the increased specific activity. Growth of the strain harboring the gene for the quadruple-mutant enzyme was accelerated by the enhanced performance. In the present study, we demonstrated that TrpDH L59F/D168G/A234D/I296N is suitable for the determination of L-tryptophan in human plasma. Copyright © 2015 Elsevier B.V. All rights reserved.
Pydna: a simulation and documentation tool for DNA assembly strategies using python.
Pereira, Filipa; Azevedo, Flávio; Carvalho, Ângela; Ribeiro, Gabriela F; Budde, Mark W; Johansson, Björn
2015-05-02
Recent advances in synthetic biology have provided tools to efficiently construct complex DNA molecules which are an important part of many molecular biology and biotechnology projects. The planning of such constructs has traditionally been done manually using a DNA sequence editor which becomes error-prone as scale and complexity of the construction increase. A human-readable formal description of cloning and assembly strategies, which also allows for automatic computer simulation and verification, would therefore be a valuable tool. We have developed pydna, an extensible, free and open source Python library for simulating basic molecular biology DNA unit operations such as restriction digestion, ligation, PCR, primer design, Gibson assembly and homologous recombination. A cloning strategy expressed as a pydna script provides a description that is complete, unambiguous and stable. Execution of the script automatically yields the sequence of the final molecule(s) and that of any intermediate constructs. Pydna has been designed to be understandable for biologists with limited programming skills by providing interfaces that are semantically similar to the description of molecular biology unit operations found in literature. Pydna simplifies both the planning and sharing of cloning strategies and is especially useful for complex or combinatorial DNA molecule construction. An important difference compared to existing tools with similar goals is the use of Python instead of a specifically constructed language, providing a simulation environment that is more flexible and extensible by the user.
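A minimal pydna-style session, adapted from the introductory example in the pydna documentation, is shown below; the module paths and method names (Dseqrecord, Assembly, assemble_circular) match recent pydna releases but may differ in older versions.

```python
from pydna.dseqrecord import Dseqrecord
from pydna.assembly import Assembly

# three fragments sharing ~14 bp terminal overlaps (toy sequences)
a = Dseqrecord("acgatgctatactgCCCCCtgtgctgtgctcta")
b = Dseqrecord("tgtgctgtgctctaTTTTTtattctggctgtatc")
c = Dseqrecord("tattctggctgtatcGGGGGtacgatgctatactg")

asm = Assembly((a, b, c), limit=14)       # limit = minimum overlap length
candidate = asm.assemble_circular()[0]    # simulated recombination product
print(len(candidate))
print(candidate.figure())                 # ASCII assembly figure
```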
Reznicek, O; Facey, S J; de Waal, P P; Teunissen, A W R H; de Bont, J A M; Nijland, J G; Driessen, A J M; Hauer, B
2015-07-01
Saccharomyces cerevisiae does not express any xylose-specific transporters. To enhance the xylose uptake of S. cerevisiae, directed evolution of the Gal2 transporter was performed. Three rounds of error-prone PCR were used to generate mutants with improved xylose-transport characteristics. After developing a fast and reliable high-throughput screening assay based on flow cytometry, eight mutants showing improved xylose uptake compared with wild-type Gal2 were obtained from 41,200 single yeast cells. Gal2 variant 2.1, harbouring five amino acid substitutions, showed an increased affinity towards xylose with a faster overall metabolism of glucose and xylose. Another Gal2 variant, 3.1, carrying an additional amino acid substitution, revealed impaired growth on glucose but not on xylose. Random mutagenesis of S. cerevisiae Gal2 thus led to an increased xylose uptake capacity and decreased glucose affinity, allowing improved co-consumption. Random mutagenesis is a powerful tool to evolve sugar transporters like Gal2 towards co-consumption of new substrates. Using a high-throughput screening system based on flow cytometry, various mutants were identified with improved xylose-transport characteristics. The Gal2 variants in this work are a promising starting point for further engineering to improve xylose uptake from mixed sugars in biomass. © 2015 The Society for Applied Microbiology.
Varriale, Simona; Cerullo, Gabriella; Antonopoulou, Io; Christakopoulos, Paul; Rova, Ulrika; Tron, Thierry; Fauré, Régis; Jütten, Peter; Piechot, Alexander; Brás, Joana L A; Fontes, Carlos M G A; Faraco, Vincenza
2018-06-01
The chemical syntheses currently employed for industrial purposes, including in the manufacture of cosmetics, present limitations such as unwanted side reactions and the need for harsh chemical reaction conditions. To overcome these drawbacks, novel enzymes are being developed to catalyze the targeted bioconversions. In the present study, a methodology for the construction and automated screening of a library of evolved variants of a Type B feruloyl esterase from Myceliophthora thermophila (MtFae1a) was developed and applied to the generation of 30,000 mutants and their screening for variants with higher activity than the wild-type enzyme. The library was generated by error-prone PCR of mtfae1a cDNA and expressed in Saccharomyces cerevisiae. Screening for extracellular enzymatic activity towards 4-nitrocatechol-1-yl ferulate, a new substrate developed ad hoc for high-throughput assays of feruloyl esterases, led to the selection of 30 improved enzyme variants. The best four variants and the wild-type MtFae1a were investigated in docking experiments with hydroxycinnamic acid esters using a 3D structural model of MtFae1a. These variants were also used as biocatalysts in transesterification reactions leading to different target products in detergentless microemulsions and showed enhanced synthetic activities, although the screening strategy had been based on improved hydrolytic activity.
Huang, Renhua; Fang, Pete; Kay, Brian K
2012-09-01
Site-directed mutagenesis is routinely performed in protein engineering experiments. One method, termed Kunkel mutagenesis, is frequently used for constructing libraries of peptide or protein variants in M13 bacteriophage, followed by affinity selection of phage particles. To make this method more efficient, the following two modifications were introduced: culture was incubated at 25°C for phage replication, which yielded two- to sevenfold more single-stranded DNA template compared to growth at 37°C, and restriction endonuclease recognition sites were used to remove non-recombinants. With both of the improvements, we could construct primary libraries of high complexity and that were 99-100% recombinant. Finally, with a third modification to the standard protocol of Kunkel mutagenesis, two secondary (mutagenic) libraries of a fibronectin type III (FN3) monobody were constructed with DNA segments that were amplified by error-prone and asymmetric PCR. Two advantages of this modification are that it bypasses the lengthy steps of restriction enzyme digestion and ligation, and that the pool of phage clones, recovered after affinity selection, can be used directly to generate a secondary library. Screening one of the two mutagenic libraries yielded variants that bound two- to fourfold tighter to human Pak1 kinase than the starting clone. The protocols described in this study should accelerate the discovery of phage-displayed recombinant affinity reagents. Copyright © 2012 Elsevier Inc. All rights reserved.
Functional dissection of the alphavirus capsid protease: sequence requirements for activity.
Thomas, Saijo; Rai, Jagdish; John, Lijo; Günther, Stephan; Drosten, Christian; Pützer, Brigitte M; Schaefer, Stephan
2010-11-18
The alphavirus capsid is multifunctional and plays a key role in the viral life cycle. The nucleocapsid domain is released by the self-cleavage activity of the serine protease domain within the capsid. All alphaviruses analyzed to date show this autocatalytic cleavage. Here we have analyzed the sequence requirements for the cleavage activity of the Chikungunya virus capsid protease of the genus Alphavirus. Among alphaviruses, the C-terminal amino acid tryptophan (W261) is conserved and was found to be important for cleavage. Mutating tryptophan to alanine (W261A) completely inactivated the protease, whereas other amino acids near W261 had no effect on the activity of this protease. The serine protease inhibitor AEBSF, however, did not inhibit the activity. Through error-prone PCR we found that isoleucine 227 is important for effective activity. The loss of activity was analyzed further by molecular modelling and comparison of the WT and mutant structures. It was found that a lysine introduced at position 227 is spatially very close to the catalytic triad and may disrupt electrostatic interactions in the catalytic site and thus inactivate the enzyme. We also examined other sequence requirements for this protease activity.
Miyaoka, Yuichiro; Berman, Jennifer R; Cooper, Samantha B; Mayerl, Steven J; Chan, Amanda H; Zhang, Bin; Karlin-Neumann, George A; Conklin, Bruce R
2016-03-31
Precise genome-editing relies on the repair of sequence-specific nuclease-induced DNA nicking or double-strand breaks (DSBs) by homology-directed repair (HDR). However, nonhomologous end-joining (NHEJ), an error-prone repair, acts concurrently, reducing the rate of high-fidelity edits. The identification of genome-editing conditions that favor HDR over NHEJ has been hindered by the lack of a simple method to measure HDR and NHEJ directly and simultaneously at endogenous loci. To overcome this challenge, we developed a novel, rapid, digital PCR-based assay that can simultaneously detect one HDR or NHEJ event out of 1,000 copies of the genome. Using this assay, we systematically monitored genome-editing outcomes of CRISPR-associated protein 9 (Cas9), Cas9 nickases, catalytically dead Cas9 fused to FokI, and transcription activator-like effector nuclease at three disease-associated endogenous gene loci in HEK293T cells, HeLa cells, and human induced pluripotent stem cells. Although it is widely thought that NHEJ generally occurs more often than HDR, we found that more HDR than NHEJ was induced under multiple conditions. Surprisingly, the HDR/NHEJ ratios were highly dependent on gene locus, nuclease platform, and cell type. The new assay system, and our findings based on it, will enable mechanistic studies of genome-editing and help improve genome-editing technology.
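Beneath an assay like this sits a small, standard calculation: droplet digital PCR counts positive and negative droplets for each probe and converts the positive fraction into a concentration via a Poisson correction. The sketch below illustrates only that generic arithmetic, not the authors' pipeline; the droplet counts and nominal droplet volume are hypothetical.

```python
import math

def copies_per_ul(n_positive: int, n_total: int, droplet_ul: float = 0.00085) -> float:
    """Poisson-corrected ddPCR concentration.

    A droplet can hold more than one template, so the positive fraction p
    underestimates the mean copies per droplet; Poisson gives
    lambda = -ln(1 - p).
    """
    p = n_positive / n_total
    lam = -math.log(1.0 - p)       # mean copies per droplet
    return lam / droplet_ul        # copies per microliter

# Hypothetical counts for HDR- and NHEJ-specific probes in one well.
hdr = copies_per_ul(220, 15000)
nhej = copies_per_ul(180, 15000)
print(f"HDR {hdr:.1f} copies/uL, NHEJ {nhej:.1f} copies/uL, "
      f"HDR/NHEJ ratio {hdr / nhej:.2f}")
```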
Cloning, expression and mutation of a triazophos hydrolase gene from Burkholderia sp. SZL-1.
Zhang, Hao; Li, Qiang; Guo, Su-Hui; Cheng, Ming-Gen; Zhao, Meng-Jun; Hong, Qing; Huang, Xing
2016-06-01
Triazophos is a broad-spectrum and highly effective insecticide, and triazophos residues have been frequently detected in the environment. A triazophos-degrading bacterium, Burkholderia sp. SZL-1, was isolated from long-term triazophos-polluted soil. Strain SZL-1 hydrolyzes triazophos to 1-phenyl-3-hydroxy-1,2,4-triazole, which is further utilized as a carbon source for growth. The triazophos hydrolase gene trhA, cloned from strain SZL-1, was expressed and the enzyme purified to homogeneity using Ni-nitrilotriacetic acid affinity chromatography. TrhA is 55 kDa and displays maximum activity at 25°C and pH 8.0; it retains nearly 60% of its activity after 30 min at temperatures from 15°C to 50°C. TrhA was mutated by sequential error-prone PCR and screened for improved triazophos-degrading activity. One purified variant protein (Val89Gly), named TrhA-M1, showed up to a 3-fold improvement in specific activity against triazophos, and its kcat and kcat/Km were improved up to 2.3- and 8.28-fold, respectively, compared to the wild-type enzyme. These results provide potential material for remediation of contaminated soil and for research on the genetic structure of hydrolases. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
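The reported fold improvements follow directly from standard Michaelis-Menten constants. As a quick worked illustration (the abstract does not give the underlying kcat and Km values, so the numbers below are invented solely to reproduce the reported ratios):

```python
def catalytic_summary(kcat: float, km: float) -> tuple[float, float]:
    """Return (kcat, specificity constant kcat/Km)."""
    return kcat, kcat / km

# Hypothetical values chosen only to reproduce the reported fold changes.
wt_kcat, wt_eff = catalytic_summary(kcat=10.0, km=100.0)   # kcat/Km = 0.10
m1_kcat, m1_eff = catalytic_summary(kcat=23.0, km=27.78)   # kcat/Km ~ 0.83

print(f"kcat improvement:    {m1_kcat / wt_kcat:.1f}-fold")   # ~2.3
print(f"kcat/Km improvement: {m1_eff / wt_eff:.2f}-fold")     # ~8.28
```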
NASA Astrophysics Data System (ADS)
Branchini, Bruce R.; Southworth, Tara L.; Khattak, Neelum F.; Murtiashaw, Martha H.; Fleet, Sarah E.
2004-06-01
Firefly luciferase, which emits yellow-green (557 nm) light, and the corresponding cDNA have been used successfully as a bioluminescence reporter of gene expression. One particularly exciting application is in the area of in vivo bioluminescence imaging. Our interest is in developing improved reagents by identifying Photinus pyralis luciferase mutants that efficiently emit red bioluminescence. In this way, the proven advantages of the P. pyralis protein can be combined with the potential advantages of a red-shifted emitter. Using site-directed mutagenesis techniques, we have identified many mutants emitting red bioluminescence. Unfortunately, these enzymes generally have significantly decreased bioluminescence activity. Interestingly, we discovered a mutation, Ile351Ala, that produced a moderate 16 nm red-shift, while maintaining excellent bioluminescence activity. We then undertook a random mutagenesis approach to identify luciferase mutants that emit further red-shifted bioluminescence with minimal loss of activity. Libraries of mutants were created using an error-prone PCR method and the Ile351Ala luciferase mutant as the template DNA. The libraries were screened by in vivo bacterial assays and the promising mutants were purified to enable accurate determination of bioluminescence emission spectra and total bioluminescence activity. We will report the characterization results, including the identification of the randomly altered amino acids, of several mutants that catalyze bioluminescence with emission maxima of approximately 600 nm.
Methodological flaws introduce strong bias into molecular analysis of microbial populations.
Krakat, N; Anjum, R; Demirel, B; Schröder, P
2017-02-01
In this study, we report how different cell disruption methods, PCR primers and in silico analyses can seriously bias the results of microbial population studies, with consequences for the credibility and reproducibility of the findings. Our results emphasize the pitfalls of commonly used experimental methods that can seriously weaken the interpretation of results. Four different cell lysis methods, three commonly used primer pairs and various computer-based analyses were applied to investigate the microbial diversity of a fermentation sample composed of chicken dung. The fault-prone, but still frequently used, amplified rRNA gene restriction analysis was chosen to identify common weaknesses. In contrast to other studies, we focused on the complete analytical process, from cell disruption to in silico analysis, and identified potential error rates. This revealed wide disagreement between the experimental approaches, leading to very different inferred community structures depending on the approach chosen. The interpretation of microbial diversity data remains a challenge. In order to accurately investigate the taxonomic diversity and structure of prokaryotic communities, we suggest a multi-level approach combining DNA-based and DNA-independent techniques. The identified weaknesses of commonly used methods to study microbial diversity can be overcome by such a multi-level approach, which produces more reliable data about the fate and behaviour of microbial communities in engineered habitats such as biogas plants, so that the best performance can be ensured. © 2016 The Society for Applied Microbiology.
Hochman, Eldad Yitzhak; Orr, Joseph M; Gehring, William J
2014-02-01
Cognitive control in the posterior medial frontal cortex (pMFC) is formulated in models that emphasize adaptive behavior driven by a computation evaluating the degree of difference between 2 conflicting responses. These functions are manifested by an event-related brain potential component coined the error-related negativity (ERN). We hypothesized that the ERN represents a regulative rather than evaluative pMFC process, exerted over the error motor representation, expediting the execution of a corrective response. We manipulated the motor representations of the error and the correct response to varying degrees. The ERN was greater when 1) the error response was more potent than when the correct response was more potent, 2) more errors were committed, 3) fewer and slower corrections were observed, and 4) the error response shared fewer motor features with the correct response. In their current forms, several prominent models of the pMFC cannot be reconciled with these findings. We suggest that a prepotent, unintended error is prone to reach the manual motor processor responsible for response execution before a nonpotent, intended correct response. In this case, the correct response is a correction and its execution must wait until the error is aborted. The ERN may reflect pMFC activity that aimed to suppress the error.
Comprehensive analysis of a medication dosing error related to CPOE.
Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L
2005-01-01
This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.
ERIC Educational Resources Information Center
Murphy, Jeremy W.; Foxe, John J.; Molholm, Sophie
2016-01-01
The ability to attend to one among multiple sources of information is central to everyday functioning. Just as central is the ability to switch attention among competing inputs as the task at hand changes. Such processes develop surprisingly slowly, such that even into adolescence, we remain slower and more error prone at switching among tasks…
Real-time monitoring of clinical processes using complex event processing and transition systems.
Meinecke, Sebastian
2014-01-01
Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events, identified via the behaviour of IT systems, using Complex Event Processing. Furthermore, we map these events onto transition systems to monitor crucial clinical processes in real time, in order to prevent and detect erroneous situations.
"Truth be told" - Semantic memory as the scaffold for veridical communication.
Hayes, Brett K; Ramanan, Siddharth; Irish, Muireann
2018-01-01
Theoretical accounts placing episodic memory as central to constructive and communicative functions neglect the role of semantic memory. We argue that the decontextualized nature of semantic schemas largely circumvents the computational bottleneck and error-prone nature of episodic memory. Rather, neuroimaging and neuropsychological evidence of episodic-semantic interactions suggests that an integrative framework more accurately captures the mechanisms underpinning social communication.
Automated lattice data generation
NASA Astrophysics Data System (ADS)
Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.
2018-03-01
The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
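For readers unfamiliar with workflow managers, the essential service is small: hold a dependency graph of tasks and dispatch them in a valid order. The sketch below is a generic, minimal illustration; the task names are hypothetical and the API shown is not Taxi's.

```python
from graphlib import TopologicalSorter  # Python 3.9+

def run(task_name: str) -> None:
    # Stand-in for launching a real job (e.g., gauge generation or a measurement).
    print(f"running {task_name}")

# Hypothetical lattice workflow: generate configurations, then measure on each.
deps = {
    "gen_cfg_100": set(),
    "gen_cfg_200": {"gen_cfg_100"},       # stream continues from cfg 100
    "measure_cfg_100": {"gen_cfg_100"},
    "measure_cfg_200": {"gen_cfg_200"},
}

for task in TopologicalSorter(deps).static_order():
    run(task)
```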
ERIC Educational Resources Information Center
Frings, Christian; Spence, Charles
2011-01-01
Negative priming (NP) refers to the finding that people's responses to probe targets previously presented as prime distractors are usually slower and more error prone than to unrepeated stimuli. In a typical NP experiment, each probe target is accompanied by a distractor. It is an accepted, albeit puzzling, finding that the NP effect depends on…
An abstract specification language for Markov reliability models
NASA Technical Reports Server (NTRS)
Butler, R. W.
1985-01-01
Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem utilizing an abstract model definition language is presented. This high-level language is described in a nonformal manner and illustrated by example.
An abstract language for specifying Markov reliability models
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
1986-01-01
Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem utilizing an abstract model definition language is presented. This high-level language is described in a nonformal manner and illustrated by example.
ERIC Educational Resources Information Center
Foreman, David; Morton, Stephanie; Ford, Tamsin
2009-01-01
Background: The clinical diagnosis of ADHD is time-consuming and error-prone. Secondary care referral results in long waiting times, but primary care staff may not provide reliable diagnoses. The Development And Well-Being Assessment (DAWBA) is a standardised assessment for common child mental health problems, including attention…
Measuring Diameters Of Large Vessels
NASA Technical Reports Server (NTRS)
Currie, James R.; Kissel, Ralph R.; Oliver, Charles E.; Smith, Earnest C.; Redmon, John W., Sr.; Wallace, Charles C.; Swanson, Charles P.
1990-01-01
Computerized apparatus produces accurate results quickly. Apparatus measures diameter of tank or other large cylindrical vessel, without prior knowledge of exact location of cylindrical axis. Produces plot of inner circumference, estimate of true center of vessel, data on radius, diameter of best-fit circle, and negative and positive deviations of radius from circle at closely spaced points on circumference. Eliminates need for time-consuming and error-prone manual measurements.
Sherrer, Shanen M.; Taggart, David J.; Pack, Lindsey R.; Malik, Chanchal K.; Basu, Ashis K.; Suo, Zucai
2012-01-01
N-(deoxyguanosin-8-yl)-1-aminopyrene (dGAP) is the predominant nitro polyaromatic hydrocarbon product generated when the air pollutant 1-nitropyrene reacts with DNA. Previous studies have shown that dGAP induces genetic mutations in bacterial and mammalian cells. One potential source of these mutations is the error-prone bypass of dGAP lesions catalyzed by the low-fidelity Y-family DNA polymerases. To provide a comparative analysis of the mutagenic potential of translesion DNA synthesis (TLS) of dGAP, we employed short oligonucleotide sequencing assays (SOSAs) with the model Y-family DNA polymerase from Sulfolobus solfataricus, DNA polymerase IV (Dpo4), and the human Y-family DNA polymerases eta (hPolη), kappa (hPolκ), and iota (hPolι). Relative to undamaged DNA, all four enzymes generated far more mutations (base deletions, insertions, and substitutions) with a DNA template containing a site-specifically placed dGAP. Opposite dGAP and at the immediate downstream template position, the most frequent mutations made by the three human enzymes were base deletions, and the most frequent base substitutions were dAs for all enzymes. Based on the SOSA data, Dpo4 was the least error-prone of the four Y-family DNA polymerases during the TLS of dGAP. Among the three human Y-family enzymes, hPolκ made the fewest mutations at all template positions except opposite the lesion site. hPolκ was significantly less error-prone than hPolι and hPolη during the extension of dGAP bypass products. Interestingly, the most frequent mutations created by hPolι at all template positions were base deletions. Although hRev1, the fourth human Y-family enzyme, could not extend dGAP bypass products in our standing-start assays, it preferentially incorporated dCTP opposite the bulky lesion. Collectively, these mutagenic profiles suggest that hPolκ and hRev1 are the most suitable human Y-family DNA polymerases to perform TLS of dGAP in humans. PMID:22917544
Experimental investigation of observation error in anuran call surveys
McClintock, B.T.; Bailey, L.L.; Pollock, K.H.; Simons, T.R.
2010-01-01
Occupancy models that account for imperfect detection are often used to monitor anuran and songbird species occurrence. However, presence-absence data arising from auditory detections may be more prone to observation error (e.g., false-positive detections) than sampling approaches utilizing physical captures or sightings of individuals. We conducted realistic, replicated field experiments using a remote broadcasting system to simulate simple anuran call surveys and to investigate potential factors affecting observation error in these studies. Distance, time, ambient noise, and observer abilities were the most important factors explaining false-negative detections. Distance and observer ability were the best overall predictors of false-positive errors, but ambient noise and competing species also affected error rates for some species. False-positive errors made up 5% of all positive detections, with individual observers exhibiting false-positive rates between 0.5% and 14%. Previous research suggests false-positive errors of these magnitudes would induce substantial positive biases in standard estimators of species occurrence, and we recommend practices to mitigate false positives when developing occupancy monitoring protocols that rely on auditory detections. These recommendations include additional observer training, limiting the number of target species, and establishing distance and ambient noise thresholds during surveys. © 2010 The Wildlife Society.
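The direction of the reported bias is easy to reproduce in a toy simulation: a naive estimator that treats every auditory detection as true inflates occupancy even at modest false-positive rates. The rates below are hypothetical, and the estimator is deliberately simplistic compared with the occupancy models discussed in the paper.

```python
import random

random.seed(1)
psi, p_det, p_fp = 0.3, 0.5, 0.05   # true occupancy, detection, false-positive rates
n_sites, n_visits = 500, 3

naive_occupied = 0
for _ in range(n_sites):
    occupied = random.random() < psi
    detected = any(
        random.random() < (p_det if occupied else p_fp)
        for _ in range(n_visits)
    )
    naive_occupied += detected

# Expected naive estimate ~0.36 versus true occupancy 0.30.
print(f"true occupancy {psi:.2f}, naive estimate {naive_occupied / n_sites:.2f}")
```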
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
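The quantification core that a tool like this wraps in QC logic is the standard-curve calculation: regress Cq on log10 quantity for the dilution series, interpolate unknowns, and normalize to a reference gene. A minimal sketch with invented numbers (quantGenius adds efficiency checks, LOD/LOQ handling, and inhibition control on top of this):

```python
import numpy as np

def fit_standard_curve(cq, log10_qty):
    """Least-squares fit Cq = slope * log10(qty) + intercept."""
    slope, intercept = np.polyfit(log10_qty, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% efficiency
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    return 10 ** ((cq - intercept) / slope)

# Hypothetical 10-fold dilution series and sample Cq values.
std_cq = np.array([17.1, 20.5, 23.9, 27.2, 30.6])
std_qty = np.array([5.0, 4.0, 3.0, 2.0, 1.0])        # log10 copies
slope, intercept, eff = fit_standard_curve(std_cq, std_qty)

target = quantify(24.8, slope, intercept)
reference = quantify(22.3, slope, intercept)          # reference gene
print(f"amplification efficiency {eff:.2f}")
print(f"normalized expression {target / reference:.3f}")
```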
Evolution of gossip-based indirect reciprocity on a bipartite network
Giardini, Francesca; Vilone, Daniele
2016-01-01
Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated, and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission. PMID:27885256
ADEPT, a dynamic next generation sequencing data error-detection program with trimming
Feng, Shihai; Lo, Chien-Chi; Li, Po-E; ...
2016-02-29
Illumina is the most widely used next-generation sequencing technology and produces millions of short reads that contain errors. These sequencing errors constitute a major problem in applications such as de novo genome assembly, metagenomics analysis and single nucleotide polymorphism discovery. In this study, we present ADEPT, a dynamic error-detection method based on the quality scores of each nucleotide and its neighboring nucleotides, together with their positions within the read, compared against the position-specific quality score distribution of all bases within the sequencing run. This method greatly improves upon other available methods in terms of the true positive rate of error discovery without affecting the false positive rate, particularly within the middle of reads. We conclude that ADEPT is the only tool to date that dynamically assesses errors within reads by comparing position-specific and neighboring base quality scores with the distribution of quality scores for the dataset being analyzed. The result is a method that is less prone to position-dependent under-prediction, which is one of the most prominent issues in error prediction. The outcome is that ADEPT improves upon prior efforts in identifying true errors, primarily within the middle of reads, while reducing the false positive rate.
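As a rough illustration of the core idea, judging each base's quality (and its neighbors') against the run-wide, position-specific quality distribution, consider the hedged sketch below. The z-score thresholding rule is invented for illustration; ADEPT's actual statistics are described in the paper.

```python
import statistics

def flag_errors(read_quals, run_quals_by_pos, z_cut=-2.0, window=1):
    """Flag positions whose neighborhood quality falls far below the run's
    position-specific mean (z-score below z_cut). Simplified stand-in for
    ADEPT's dynamic per-position comparison."""
    flags = []
    for i in range(len(read_quals)):
        lo, hi = max(0, i - window), min(len(read_quals), i + window + 1)
        q = statistics.mean(read_quals[lo:hi])        # neighborhood quality
        mu = statistics.mean(run_quals_by_pos[i])     # run-wide mean at position i
        sd = statistics.pstdev(run_quals_by_pos[i]) or 1.0
        flags.append((q - mu) / sd < z_cut)
    return flags

# Hypothetical read and per-position run distributions.
read = [34, 35, 12, 33, 36]
run_dist = [[33, 35, 36, 34]] * 5
print(flag_errors(read, run_dist, window=0))   # only position 2 is flagged
```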
Dietary Assessment in Food Environment Research
Kirkpatrick, Sharon I.; Reedy, Jill; Butler, Eboneé N.; Dodd, Kevin W.; Subar, Amy F.; Thompson, Frances E.; McKinnon, Robin A.
2015-01-01
Context: The existing evidence on food environments and diet is inconsistent, potentially due in part to heterogeneity in measures used to assess diet. The objective of this review, conducted in 2012–2013, was to examine measures of dietary intake utilized in food environment research. Evidence acquisition: Included studies were published from January 2007 through June 2012 and assessed relationships between at least one food environment exposure and at least one dietary outcome. Fifty-one articles were identified using PubMed, Scopus, Web of Knowledge, and PsycINFO; references listed in the papers reviewed and relevant review articles; and the National Cancer Institute's Measures of the Food Environment website. The frequency of the use of dietary intake measures and assessment of specific dietary outcomes was examined, as were patterns of results among studies using different dietary measures. Evidence synthesis: The majority of studies used brief instruments, such as screeners or one or two questions, to assess intake. Food frequency questionnaires were used in about a third of studies, one in ten used 24-hour recalls, and fewer than one in twenty used diaries. Little consideration of dietary measurement error was evident. Associations between the food environment and diet were more consistently in the expected direction in studies using less error-prone measures. Conclusions: There is a tendency toward the use of brief dietary assessment instruments with low cost and burden rather than more detailed instruments that capture intake with less bias. Use of error-prone dietary measures may lead to spurious findings and reduced power to detect associations. PMID:24355678
Clinical errors that can occur in the treatment decision-making process in psychotherapy.
Park, Jake; Goode, Jonathan; Tompkins, Kelley A; Swift, Joshua K
2016-09-01
Clinical errors occur in the psychotherapy decision-making process whenever a less-than-optimal treatment or approach is chosen when working with clients. A less-than-optimal approach may be one that a client is unwilling to try or fully invest in based on his/her expectations and preferences, or one that may have little chance of success based on contraindications and/or limited research support. The "doctor knows best" and "independent choice" models are two decision-making models that are frequently used within psychology, but both are associated with an increased likelihood of errors in the treatment decision-making process. In particular, these models fail to integrate all three components of the definition of evidence-based practice in psychology (American Psychological Association, 2006). In this article we describe both models and provide examples of clinical errors that can occur in each. We then introduce the shared decision-making model as an alternative that is less prone to clinical errors. PsycINFO Database Record (c) 2016 APA, all rights reserved
Guan, Yongtao; Li, Yehua; Sinha, Rajita
2011-01-01
In a cocaine dependence treatment study, we use linear and nonlinear regression models to model posttreatment cocaine craving scores and first cocaine relapse time. A subset of the covariates are summary statistics derived from baseline daily cocaine use trajectories, such as baseline cocaine use frequency and average daily use amount. These summary statistics are subject to estimation error and can therefore cause biased estimators for the regression coefficients. Unlike classical measurement error problems, the error we encounter here is heteroscedastic with an unknown distribution, and there are no replicates for the error-prone variables or instrumental variables. We propose two robust methods to correct for the bias: a computationally efficient method-of-moments-based method for linear regression models and a subsampling extrapolation method that is generally applicable to both linear and nonlinear regression models. Simulations and an application to the cocaine dependence treatment data are used to illustrate the efficacy of the proposed methods. Asymptotic theory and variance estimation for the proposed subsampling extrapolation method and some additional simulation results are described in the online supplementary material. PMID:21984854
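For the linear case, the classical attenuation effect and a moment-based correction can be sketched in a few lines. This is the textbook errors-in-variables correction assuming the measurement-error variance is known, not the authors' estimator (which avoids that assumption), and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 5000, 2.0
x = rng.normal(size=n)                    # true covariate (e.g., true use frequency)
w = x + rng.normal(scale=0.7, size=n)     # error-prone summary statistic
y = beta * x + rng.normal(size=n)

naive = np.cov(w, y)[0, 1] / np.var(w, ddof=1)     # attenuated toward zero
sigma_u2 = 0.49                                    # error variance (assumed known here)
corrected = np.cov(w, y)[0, 1] / (np.var(w, ddof=1) - sigma_u2)

print(f"naive {naive:.2f}, moment-corrected {corrected:.2f}, truth {beta}")
```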
Chemoprevention by Elimination of Cancer-Prone, Mutant p53-Containing Breast Cells
2009-09-01
Briefly, 50 μL reaction mixtures were used for each reaction, which contained 2× QuantiTect SYBR Green RT-PCR Master Mix, 0.5 μL QuantiTect RT mix, 0.5 μmol ...cysteine-free DMEM, containing 5% dialyzed FCS and 50 μmol/L MG132. Cells were then labeled with 100 μCi/mL of [35S]-methionine (MP Biochemicals) for 5 min...with much higher drug doses, up to 10 μmol/L, to achieve a moderate effect. Interestingly, digoxin or ouabain failed to reduce the endogenous wt p53
Variations of Human DNA Polymerase Genes as Biomarkers of Prostate Cancer Progression
2013-07-01
Keywords: discovery, cancer genetics. ...variations identified (including all single and double mutant combinations of the Triple mutant), and some POLK mutants • Discovery of a novel ...Athens, Greece, 07/10. Makridakis N. Error-prone polymerase mutations and prostate cancer progression, COBRE/Cancer Genetics group seminar, Tulane
The expanding polymerase universe.
Goodman, M F; Tippin, B
2000-11-01
Over the past year, the number of known prokaryotic and eukaryotic DNA polymerases has exploded. Many of these newly discovered enzymes copy aberrant bases in the DNA template over which 'respectable' polymerases fear to tread. The next step is to unravel their functions, which are thought to range from error-prone copying of DNA lesions, somatic hypermutation and avoidance of skin cancer, to restarting stalled replication forks and repairing double-stranded DNA breaks.
Error rates and resource overheads of encoded three-qubit gates
NASA Astrophysics Data System (ADS)
Takagi, Ryuji; Yoder, Theodore J.; Chuang, Isaac L.
2017-10-01
A non-Clifford gate is required for universal quantum computation, and, typically, this is the most error-prone and resource-intensive logical operation on an error-correcting code. Small, single-qubit rotations are popular choices for this non-Clifford gate, but certain three-qubit gates, such as Toffoli or controlled-controlled-Z (ccz), are equivalent options that are also more suited for implementing some quantum algorithms, for instance, those with coherent classical subroutines. Here, we calculate error rates and resource overheads for implementing logical ccz with pieceable fault tolerance, a nontransversal method for implementing logical gates. We provide a comparison with a nonlocal magic-state scheme on a concatenated code and a local magic-state scheme on the surface code. We find the pieceable fault-tolerance scheme particularly advantaged over magic states on concatenated codes and in certain regimes over magic states on the surface code. Our results suggest that pieceable fault tolerance is a promising candidate for fault tolerance in a near-future quantum computer.
Hunter, Margaret E; Dorazio, Robert M; Butterfield, John S S; Meigs-Friend, Gaia; Nico, Leo G; Ferrante, Jason A
2017-03-01
A set of universal guidelines is needed to determine the limit of detection (LOD) in PCR-based analyses of low-concentration DNA. In particular, environmental DNA (eDNA) studies require sensitive and reliable methods to detect rare and cryptic species through shed genetic material in environmental samples. Current strategies for assessing detection limits of eDNA are either too stringent or subjective, possibly resulting in biased estimates of species' presence. Here, a conservative LOD analysis grounded in analytical chemistry is proposed to correct for overestimated DNA concentrations predominantly caused by the concentration plateau, a nonlinear relationship between expected and measured DNA concentrations. We have used statistical criteria to establish formal mathematical models for both quantitative and droplet digital PCR. To assess the method, a new Grass Carp (Ctenopharyngodon idella) TaqMan assay was developed and tested on both PCR platforms using eDNA in water samples. The LOD adjustment reduced Grass Carp occupancy and detection estimates while increasing uncertainty-indicating that caution needs to be applied to eDNA data without LOD correction. Compared to quantitative PCR, digital PCR had higher occurrence estimates due to increased sensitivity and dilution of inhibitors at low concentrations. Without accurate LOD correction, species occurrence and detection probabilities based on eDNA estimates are prone to a source of bias that cannot be reduced by an increase in sample size or PCR replicates. Other applications also could benefit from a standardized LOD such as GMO food analysis and forensic and clinical diagnostics. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
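One common way to formalize an LOD of this kind, the lowest concentration detected with, say, 95% probability, is to fit a detection-probability curve to replicate outcomes across a dilution series and invert it at 0.95. The sketch below uses a plain logistic model and a crude grid-search fit on hypothetical data; the paper's models for qPCR and ddPCR are derived more carefully.

```python
import numpy as np

# Hypothetical replicate outcomes: log10 copies/reaction -> detections out of 20.
log10_conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
detected = np.array([2, 7, 14, 19, 20])
replicates = 20

def nll(params):
    """Binomial negative log-likelihood of a logistic detection curve."""
    a, b = params
    p = 1.0 / (1.0 + np.exp(-(a + b * log10_conc)))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(detected * np.log(p) + (replicates - detected) * np.log(1 - p))

# Crude grid-search MLE (keeps the sketch dependency-free).
grid_a = np.linspace(-10, 5, 301)
grid_b = np.linspace(0.1, 10, 200)
a, b = min(((a, b) for a in grid_a for b in grid_b), key=nll)

lod95 = (np.log(0.95 / 0.05) - a) / b     # solve logit(0.95) = a + b * x
print(f"LOD95 ~ 10^{lod95:.2f} copies/reaction")
```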
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Scotti, S. J.
1989-01-01
The nonlinear mathematical programming method (formal optimization) has had many applications in engineering design. A figure illustrates the use of optimization techniques in the design process. The design process begins with the design problem, such as the classic example of the two-bar truss designed for minimum weight, as seen in the leftmost part of the figure. If formal optimization is to be applied, the design problem must be recast in the form of an optimization problem consisting of an objective function, design variables, and constraint function relations. The middle part of the figure shows the two-bar truss design posed as an optimization problem: the total truss weight is the objective function, the tube diameter and truss height are design variables, and stress and Euler buckling are considered as constraint function relations. Lastly, the designer develops or obtains analysis software containing a mathematical model of the object being optimized, and then interfaces the analysis routine with existing optimization software such as CONMIN, ADS, or NPSOL. This final stage of software development can be both tedious and error-prone. The Sizing and Optimization Language (SOL), a special-purpose computer language whose goal is to make the software implementation phase of optimum design easier and less error-prone, is presented.
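To make the two-bar truss example concrete, here is how such a problem might be posed to a present-day off-the-shelf optimizer. The geometry, loads, and constraint expressions below are schematic stand-ins for the classic formulation (the report itself expresses the problem in SOL); the point is the mapping from design problem to objective, variables, and constraints.

```python
import math
from scipy.optimize import minimize

# Hypothetical two-bar truss: thin-wall tubes of diameter d and height h (inches),
# half-span B, apex load P, wall thickness t, density rho, allowable stress s_max.
B, P, t, rho, s_max = 30.0, 33000.0, 0.1, 0.3, 100000.0
E = 30e6  # Young's modulus, psi

def weight(x):
    d, h = x
    L = math.hypot(B, h)                      # member length
    return 2 * rho * math.pi * d * t * L      # objective: total truss weight

def stress_margin(x):
    d, h = x
    L = math.hypot(B, h)
    stress = P * L / (2 * h * math.pi * d * t)
    return s_max - stress                     # feasible when >= 0

def buckling_margin(x):
    d, h = x
    L = math.hypot(B, h)
    stress = P * L / (2 * h * math.pi * d * t)
    euler = math.pi**2 * E * (d**2 + t**2) / (8 * L**2)
    return euler - stress                     # Euler buckling constraint

res = minimize(weight, x0=[3.0, 30.0], method="SLSQP",
               bounds=[(0.5, 10.0), (5.0, 100.0)],
               constraints=[{"type": "ineq", "fun": stress_margin},
                            {"type": "ineq", "fun": buckling_margin}])
print(f"d = {res.x[0]:.2f} in, h = {res.x[1]:.2f} in, weight = {res.fun:.1f} lb")
```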
CTF Preprocessor User's Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria; Salko, Robert K.
2016-05-26
This document describes how a user should go about using the CTF pre-processor tool to create an input deck for modeling rod-bundle geometry in CTF. The tool was designed to generate input decks in a quick and less error-prone manner for CTF. The pre-processor is a completely independent utility, written in Fortran, that takes a reduced amount of input from the user. The information that the user must supply is basic information on bundle geometry, such as rod pitch, clad thickness, and axial location of spacer grids--the pre-processor takes this basic information and determines channel placement and connection information to be written to the input deck, which is the most time-consuming and error-prone segment of creating a deck. Creation of the model is also more intuitive, as the user can specify assembly and water-tube placement using visual maps instead of having to place them by determining channel/channel and rod/channel connections. As an example of the benefit of the pre-processor, a quarter-core model that contains 500,000 scalar-mesh cells was read into CTF from an input deck containing 200,000 lines of data. This 200,000-line input deck was produced automatically from a set of pre-processor decks that contained only 300 lines of data.
Overconfidence across the psychosis continuum: a calibration approach.
Balzan, Ryan P; Woodward, Todd S; Delfabbro, Paul; Moritz, Steffen
2016-11-01
An 'overconfidence in errors' bias has been consistently observed in people with schizophrenia relative to healthy controls, however, the bias is seldom found to be associated with delusional ideation. Using a more precise confidence-accuracy calibration measure of overconfidence, the present study aimed to explore whether the overconfidence bias is greater in people with higher delusional ideation. A sample of 25 participants with schizophrenia and 50 non-clinical controls (25 high- and 25 low-delusion-prone) completed 30 difficult trivia questions (accuracy <75%); 15 'half-scale' items required participants to indicate their level of confidence for accuracy, and the remaining 'confidence-range' items asked participants to provide lower/upper bounds in which they were 80% confident the true answer lay within. There was a trend towards higher overconfidence for half-scale items in the schizophrenia and high-delusion-prone groups, which reached statistical significance for confidence-range items. However, accuracy was particularly low in the two delusional groups and a significant negative correlation between clinical delusional scores and overconfidence was observed for half-scale items within the schizophrenia group. Evidence in support of an association between overconfidence and delusional ideation was therefore mixed. Inflated confidence-accuracy miscalibration for the two delusional groups may be better explained by their greater unawareness of their underperformance, rather than representing genuinely inflated overconfidence in errors.
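The calibration arithmetic behind both item types is compact enough to sketch. With hypothetical responses, overconfidence on half-scale items is mean confidence minus accuracy, and calibration on confidence-range items is the fraction of true answers captured by the stated 80% intervals:

```python
# Half-scale items: overconfidence = mean confidence - proportion correct.
confidence = [0.9, 0.8, 0.95, 0.7, 0.85, 0.9, 0.75, 0.8]   # hypothetical ratings
correct = [1, 0, 1, 0, 0, 1, 0, 0]
bias = sum(confidence) / len(confidence) - sum(correct) / len(correct)
print(f"half-scale calibration bias = {bias:+.2f}")  # positive = overconfident

# Confidence-range items: an 80%-confidence interval is well calibrated
# if ~80% of true answers fall inside the stated bounds.
intervals = [(1900, 1950), (10, 30), (100, 300), (5, 15)]  # hypothetical bounds
truths = [1969, 25, 450, 12]
hits = sum(lo <= t <= hi for (lo, hi), t in zip(intervals, truths))
print(f"interval hit rate = {hits / len(truths):.0%} (target 80%)")
```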
A Novel Way to Relate Ontology Classes
Choksi, Ami T.; Jinwala, Devesh C.
2015-01-01
The existing ontologies in the semantic web typically have anonymous union and intersection classes. The anonymous classes are limited in scope and may not be part of the whole inference process. The tools, namely Pellet, Jena, and Protégé, interpret collection classes as (a) equivalent classes/subclasses of a union class and (b) superclasses of an intersection class. As a result, there is a possibility that the tools will produce error-prone inference results for relations, namely sub-, union, intersection, and equivalent relations, and those dependent on these relations, namely complement. Verifying whether a class is the complement of another involves the use of sub- and equivalent relations. Motivated by the same, we (i) refine the test data set of the conference ontology by adding named union and intersection classes and (ii) propose a match algorithm to (a) calculate corrected subclass lists, (b) correctly relate intersection and union classes with their collection classes, and (c) match union, intersection, sub-, complement, and equivalent classes in a proper sequence, to avoid error-prone match results. We compare the results of our algorithms with those of a candidate reasoner, namely the Pellet reasoner. To the best of our knowledge, ours is a unique attempt at establishing a novel way to relate ontology classes. PMID:25984560
Coral-Vazquez, R; Arenas, D; Cisneros, B; Peñaloza, L; Salamanca, F; Kofman, S; Mercado, R; Montañez, C
1997-06-13
We have analyzed 59 unrelated Mexican Duchenne/Becker muscular dystrophy (DMD/BMD) patients using PCR analysis of the two deletion-prone regions of the DMD gene. Thirty-one (52%) of the patients had a deletion of one or several exons. Most of the alterations (87%) were clustered in exons 44-52, the highest percentage reported to date. In order to improve molecular diagnosis in the Mexican population, we designed a new multiplex assay to PCR-amplify exons 44-52. This assay allowed the identification of a greater number of deletions in this region compared with the 9-plex and 5-plex assays previously described, and allowed most of the deletion end boundaries to be determined. This is a reliable alternative for the initial screening of DMD patients in the Mexican population.
Allegrini, Franco; Braga, Jez W B; Moreira, Alessandro C O; Olivieri, Alejandro C
2018-06-29
A new multivariate regression model, named Error Covariance Penalized Regression (ECPR) is presented. Following a penalized regression strategy, the proposed model incorporates information about the measurement error structure of the system, using the error covariance matrix (ECM) as a penalization term. Results are reported from both simulations and experimental data based on replicate mid and near infrared (MIR and NIR) spectral measurements. The results for ECPR are better under non-iid conditions when compared with traditional first-order multivariate methods such as ridge regression (RR), principal component regression (PCR) and partial least-squares regression (PLS). Copyright © 2018 Elsevier B.V. All rights reserved.
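In spirit, the closed form of such a penalized regression resembles generalized ridge with the penalty matrix built from the error covariance, i.e., b = (X'X + lambda*Sigma)^-1 X'y. The sketch below shows only that general shape on simulated data; the published ECPR construction and tuning procedure differ in detail.

```python
import numpy as np

def ecpr_like_fit(X, y, error_cov, lam):
    """Generalized-ridge sketch: penalize directions with large
    measurement-error covariance. Not the published ECPR algorithm,
    just its general shape."""
    return np.linalg.solve(X.T @ X + lam * error_cov, X.T @ y)

rng = np.random.default_rng(2)
n, p = 50, 20
beta = np.zeros(p)
beta[:3] = [1.0, -2.0, 0.5]
X_true = rng.normal(size=(n, p))
noise_cov = 0.2 * np.eye(p)                  # iid stand-in for a real ECM
X = X_true + rng.multivariate_normal(np.zeros(p), noise_cov, size=n)
y = X_true @ beta + rng.normal(scale=0.1, size=n)

print(np.round(ecpr_like_fit(X, y, noise_cov, lam=5.0)[:5], 2))
```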
NASA Astrophysics Data System (ADS)
Bezan, Scott; Shirani, Shahram
2006-12-01
To reliably transmit video over error-prone channels, the data should be both source and channel coded. When multiple channels are available for transmission, the problem extends to that of partitioning the data across these channels. The condition of transmission channels, however, varies with time. Therefore, the error protection added to the data at one instant of time may not be optimal at the next. In this paper, we propose a method for adaptively adding error correction code in a rate-distortion (RD) optimized manner using rate-compatible punctured convolutional codes to an MJPEG2000 constant rate-coded frame of video. We perform an analysis on the rate-distortion tradeoff of each of the coding units (tiles and packets) in each frame and adapt the error correction code assigned to the unit taking into account the bandwidth and error characteristics of the channels. This method is applied to both single and multiple time-varying channel environments. We compare our method with a basic protection method in which data is either not transmitted, transmitted with no protection, or transmitted with a fixed amount of protection. Simulation results show promising performance for our proposed method.
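The allocation step can be phrased as a small Lagrangian search: for each coding unit, choose the protection level minimizing distortion plus lambda times rate, then adjust lambda until the total rate fits the channel budget. A toy version with invented per-unit rate-distortion tables (the paper's RCPC code rates and distortion model are more detailed):

```python
# Each unit: list of (rate_bits, expected_distortion) per protection level,
# from none (index 0) to strongest. Numbers are illustrative only.
units = [
    [(100, 40.0), (130, 12.0), (160, 5.0)],
    [(80, 25.0), (104, 9.0), (128, 4.0)],
    [(120, 60.0), (156, 20.0), (192, 8.0)],
]

def allocate(units, lam):
    """Pick per-unit protection level minimizing D + lam * R."""
    choice = [min(range(len(u)), key=lambda k: u[k][1] + lam * u[k][0])
              for u in units]
    rate = sum(u[k][0] for u, k in zip(units, choice))
    dist = sum(u[k][1] for u, k in zip(units, choice))
    return choice, rate, dist

# Sweep lambda until the total rate fits the channel budget.
budget = 420
for lam in [0.0, 0.05, 0.1, 0.2, 0.5, 1.0]:
    choice, rate, dist = allocate(units, lam)
    if rate <= budget:
        print(f"lambda={lam}: levels={choice}, rate={rate}, distortion={dist}")
        break
```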
Utility of PCR in diagnosing pulmonary tuberculosis.
Bennedsen, J; Thomsen, V O; Pfyffer, G E; Funke, G; Feldmann, K; Beneke, A; Jenkins, P A; Hegginbothom, M; Fahr, A; Hengstler, M; Cleator, G; Klapper, P; Wilkins, E G
1996-06-01
At present, the rapid diagnosis of pulmonary tuberculosis rests with microscopy. However, this technique is insensitive and many cases of pulmonary tuberculosis cannot be initially confirmed. Nucleic acid amplification techniques are extremely sensitive, but when they are applied to tuberculosis diagnosis, they have given variable results. Investigators at six centers in Europe compared a standardized PCR system (Amplicor; Roche) against conventional culture methods. Defined clinical information was collected. Discrepant samples were retested, and inhibition assays and backup amplification with a separate primer pair were performed. Mycobacterium tuberculosis complex organisms were recovered from 654 (9.1%) of 7,194 samples and 293 (7.8%) of 3,738 patients. Four hundred fifty-two of the M. tuberculosis isolates from 204 patients were smear positive and culture positive. Among the culture-positive specimens, PCR had a sensitivity of 91.4% for smear-positive specimens and 60.9% for smear-negative specimens, with a specificity of 96.1%. Analysis of 254 PCR-positive, culture-negative specimens with discrepant results revealed that 130 were from patients with recently diagnosed tuberculosis and 94 represented a presumed laboratory error. Similar analysis of 118 PCR-negative, culture-positive specimens demonstrated that 27 discrepancies were due to presumed uneven aliquot distribution and 11 were due to presumed laboratory error; PCR inhibitors were detected in 8 specimens. Amplicor enables laboratories with little previous experience with nucleic acid amplification to perform PCR. Disease in more than 60% of the patients with tuberculosis with smear-negative, culture-positive specimens can be diagnosed at the time of admission, and potentially all patients with smear-positive specimens can immediately be confirmed as being infected with M. tuberculosis, leading to improved clinical management.
Bernat, Edward M; Nelson, Lindsay D; Steele, Vaughn R; Gehring, William J; Patrick, Christopher J
2011-05-01
Externalizing is a broad construct that reflects propensity toward a variety of impulse control problems, including antisocial personality disorder and substance use disorders. Two event-related potential responses known to be reduced among individuals high in externalizing proneness are the P300, which reflects postperceptual processing of a stimulus, and the error-related negativity (ERN), which indexes performance monitoring based on endogenous representations. In the current study, the authors used a simulated gambling task to examine the relation between externalizing proneness and the feedback-related negativity (FRN), a brain response that indexes performance monitoring related to exogenous cues, which is thought to be highly related to the ERN. Time-frequency (TF) analysis was used to disentangle the FRN from the accompanying P300 response to feedback cues by parsing the overall feedback-locked potential into distinctive theta (4-7 Hz) and delta (<3 Hz) TF components. Whereas delta-P300 amplitude was reduced among individuals high in externalizing proneness, theta-FRN response was unrelated to externalizing. These findings suggest that in contrast with previously reported deficits in endogenously based performance monitoring (as indexed by the ERN), individuals prone to externalizing problems show intact monitoring of exogenous cues (as indexed by the FRN). The results also contribute to a growing body of evidence indicating that the P300 is attenuated across a broad range of task conditions in high-externalizing individuals.
Jordana-Lluch, Elena; Rivaya, Belén; Marcó, Clara; Giménez, Montserrat; Quesada, Mª Dolores; Escobedo, Agustín; Batlle, Montserrat; Martró, Elisa; Ausina, Vicente
2017-02-01
Onco-haematological patients are prone to develop infections, and antibiotic prophylaxis may lead to negative blood cultures. Thus, the microbiological diagnosis and subsequent administration of a targeted antimicrobial therapy are often difficult. The goal of this study was to evaluate the usefulness of IRIDICA (PCR/ESI-MS technology) for the molecular diagnosis of bloodstream infections in this patient group. A total of 463 whole blood specimens from different sepsis episodes in 429 patients were analysed using the PCR/ESI-MS platform, comparing the results with those of blood culture and other clinically relevant information. The sensitivity of PCR/ESI-MS by specimen (excluding polymicrobial infections, n = 25) in comparison with blood culture was 64.3% overall, 69.0% in oncological patients, and 59.3% in haematological patients. When comparing with a clinical infection criterion, overall sensitivity rose to 74.7%, being higher in oncological patients (80.0%) than in haematological patients (67.7%). Thirty-one microorganisms isolated by culture were not detected by IRIDICA, whereas 42 clinically relevant pathogens not isolated by culture were detected molecularly. PCR/ESI-MS offers a reliable identification of pathogens directly from whole blood. While additional studies are needed to confirm our findings, the system showed a lower sensitivity in onco-haematological patients in comparison with previously reported results in patients from the Intensive Care Unit. Copyright © 2016 The British Infection Association. Published by Elsevier Ltd. All rights reserved.
Adaptive constructive processes and the future of memory.
Schacter, Daniel L
2012-11-01
Memory serves critical functions in everyday life but is also prone to error. This article examines adaptive constructive processes, which play a functional role in memory and cognition but can also produce distortions, errors, and illusions. The article describes several types of memory errors that are produced by adaptive constructive processes and focuses in particular on the process of imagining or simulating events that might occur in one's personal future. Simulating future events relies on many of the same cognitive and neural processes as remembering past events, which may help to explain why imagination and memory can be easily confused. The article considers both pitfalls and adaptive aspects of future event simulation in the context of research on planning, prediction, problem solving, mind-wandering, prospective and retrospective memory, coping and positivity bias, and the interconnected set of brain regions known as the default network. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Murphy, Helen R; Lee, Seulgi; da Silva, Alexandre J
2017-07-01
Cyclospora cayetanensis is a protozoan parasite that causes human diarrheal disease associated with the consumption of fresh produce or water contaminated with C. cayetanensis oocysts. In the United States, foodborne outbreaks of cyclosporiasis have been linked to various types of imported fresh produce, including cilantro and raspberries. An improved method was developed for identification of C. cayetanensis in produce at the U.S. Food and Drug Administration. The method relies on a 0.1% Alconox produce wash solution for efficient recovery of oocysts, a commercial kit for DNA template preparation, and an optimized TaqMan real-time PCR assay with an internal amplification control for molecular detection of the parasite. A single laboratory validation study was performed to assess the method's performance and compare the optimized TaqMan real-time PCR assay and a reference nested PCR assay by examining 128 samples. The samples consisted of 25 g of cilantro or 50 g of raspberries seeded with 0, 5, 10, or 200 C. cayetanensis oocysts. Detection rates for cilantro seeded with 5 and 10 oocysts were 50.0 and 87.5%, respectively, with the real-time PCR assay and 43.7 and 94.8%, respectively, with the nested PCR assay. Detection rates for raspberries seeded with 5 and 10 oocysts were 25.0 and 75.0%, respectively, with the real-time PCR assay and 18.8 and 68.8%, respectively, with the nested PCR assay. All unseeded samples were negative, and all samples seeded with 200 oocysts were positive. Detection rates using the two PCR methods were statistically similar, but the real-time PCR assay is less laborious and less prone to amplicon contamination and allows monitoring of amplification and analysis of results, making it more attractive to diagnostic testing laboratories. The improved sample preparation steps and the TaqMan real-time PCR assay provide a robust, streamlined, and rapid analytical procedure for surveillance, outbreak response, and regulatory testing of foods for detection of C. cayetanensis.
Design and Sampling Plan Optimization for RT-qPCR Experiments in Plants: A Case Study in Blueberry.
Die, Jose V; Roman, Belen; Flores, Fernando; Rowland, Lisa J
2016-01-01
The qPCR assay has become a routine technology in plant biotechnology and agricultural research. It is unlikely to be improved much further technically, but challenges remain in minimizing the variability of results and in transparency when reporting technical data in support of the conclusions of a study. A number of aspects of the pre- and post-assay workflow contribute to variability of results. Here, by studying the introduction of error in qPCR measurements at different stages of the workflow, we describe the most important causes of technical variability in a case study using blueberry. We found that the stage at which increasing the number of replicates is most beneficial depends on the tissue used. For example, we would recommend the use of more RT replicates when working with leaf tissue, while more sampling (RNA extraction) replicates would be recommended when working with stems or fruits. The use of more qPCR replicates provides the least benefit, as it is the most reproducible step. By knowing the distribution of error over an entire experiment and the costs at each step, we have developed a script to identify the optimal sampling plan within the limits of a given budget. These findings should help plant scientists improve the design of qPCR experiments and refine their laboratory practices in order to conduct qPCR assays in a more reliable manner and produce more consistent and reproducible data.
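The optimization the abstract alludes to is a nested-design variance calculation: variance components for sampling, RT, and qPCR determine the variance of the final mean for any replicate plan, and per-replicate costs determine its price, so plans under a budget can simply be enumerated. A hedged sketch with invented variance components and costs (the published script uses measured values):

```python
from itertools import product

# Invented variance components and per-replicate costs (arbitrary currency units).
var_s, var_rt, var_q = 0.30, 0.10, 0.02   # sampling, RT, qPCR variances
cost_s, cost_rt, cost_q = 20.0, 5.0, 1.0
budget = 120.0

def plan_variance(ns, nrt, nq):
    """Variance of the grand mean for a nested ns x nrt x nq design."""
    return var_s / ns + var_rt / (ns * nrt) + var_q / (ns * nrt * nq)

def plan_cost(ns, nrt, nq):
    return ns * (cost_s + nrt * (cost_rt + nq * cost_q))

best = min(
    (plan for plan in product(range(1, 9), repeat=3) if plan_cost(*plan) <= budget),
    key=lambda plan: plan_variance(*plan),
)
print(f"best plan (samples, RT, qPCR) = {best}, "
      f"variance = {plan_variance(*best):.4f}, cost = {plan_cost(*best):.0f}")
```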
Jin, Mengtong; Sun, Wenshuo; Li, Qin; Sun, Xiaohong; Pan, Yingjie; Zhao, Yong
2014-04-04
We evaluated the differences among three standard curves for quantifying viable Vibrio parahaemolyticus in samples by real-time reverse-transcriptase PCR (real-time RT-PCR). Standard curve A was established from 10-fold dilutions of cDNA reverse-transcribed from RNA synthesized in vitro. Standard curves B and C were established from 10-fold dilutions of cDNA synthesized from RNA isolated from V. parahaemolyticus in pure culture (10^8 CFU/mL) and in shrimp samples (10^6 CFU/g), respectively (standard curves A and C are proposed here for the first time). Each of the three standard curves was used to quantify V. parahaemolyticus in six samples (two pure-culture V. parahaemolyticus samples, two artificially contaminated cooked Litopenaeus vannamei samples and two artificially contaminated Litopenaeus vannamei samples). We then compared the quantitative results with plate-counting results and analysed the differences. All three standard curves showed a strong linear relationship between the fractional cycle number and V. parahaemolyticus concentration (R^2 > 0.99). The quantitative results of real-time PCR were significantly (p < 0.05) lower than the plate-counting results. The relative errors compared with plate counting ranked standard curve A (30.0%) > standard curve C (18.8%) > standard curve B (6.9%). The average differences between standard curve A and standard curves B and C were -2.25 and -0.75 log CFU/mL, respectively, with mean relative errors of 48.2% and 15.9%; the average difference between standard curves B and C ranged from 1.47 to 1.53 log CFU/mL, with average relative errors of 19.0%-23.8%. Standard curve B could be applied in real-time RT-PCR to quantify viable microorganisms in samples.
Error-free versus mutagenic processing of genomic uracil--relevance to cancer.
Krokan, Hans E; Sætrom, Pål; Aas, Per Arne; Pettersen, Henrik Sahlin; Kavli, Bodil; Slupphaug, Geir
2014-07-01
Genomic uracil is normally processed essentially error-free by base excision repair (BER), with mismatch repair (MMR) as an apparent backup for U:G mismatches. Nuclear uracil-DNA glycosylase UNG2 is the major enzyme initiating BER of uracil in U:A pairs as well as U:G mismatches. Deficiency in UNG2 results in several-fold increases in genomic uracil in mammalian cells. Thus, the alternative uracil-removing glycosylases SMUG1, TDG and MBD4 cannot efficiently complement UNG2-deficiency. A major function of SMUG1 is probably to remove 5-hydroxymethyluracil from DNA, with general back-up for UNG2 as a minor function. TDG and MBD4 remove deamination products U or T mismatched to G in CpG/mCpG contexts, but may have equally or more important functions in development, epigenetics and gene regulation. Genomic uracil was previously thought to arise only from spontaneous cytosine deamination and incorporation of dUMP, generating U:G mismatches and U:A pairs, respectively. However, the identification of activation-induced cytidine deaminase (AID) and other APOBEC family members as DNA-cytosine deaminases has spurred renewed interest in the processing of genomic uracil. Importantly, AID triggers the adaptive immune response involving error-prone processing of U:G mismatches, but also contributes to B-cell lymphomagenesis. Furthermore, mutational signatures in a substantial fraction of other human cancers are consistent with APOBEC-induced mutagenesis, with U:G mismatches as prime suspects. Mutations can be caused by replicative polymerases copying uracil in U:G mismatches, or by translesion polymerases that insert incorrect bases opposite abasic sites after uracil-removal. In addition, kataegis, localized hypermutations in one strand in the vicinity of genomic rearrangements, requires APOBEC protein, UNG2 and translesion polymerase REV1. What mechanisms govern error-free versus error-prone processing of uracil in DNA remains unclear. In conclusion, genomic uracil is an essential intermediate in adaptive immunity and innate antiviral responses, but may also be a fundamental cause of a wide range of malignancies. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
The High Altitude Pollution Program (1976-1982).
1984-01-01
ground, where air pollution problems arise due to ground level emissions from, for example, automobiles and power plants) to about 25 km above the...downward and poleward. Near the ground, in areas such as cities prone to air pollution, ozone is produced by nitrogen dioxide photolysis and reaction...Spectrophotometer Total Ozone Measurement Errors caused by Interfering Absorbing Species Such as SO2, NO2 and Photochemically Produced O3 in Polluted Air," NOAA
Vap, Linda; Bohn, Andrea A
2015-01-01
Interpretation of camelid hematology results is similar to that of other mammals. Obtaining accurate results and using appropriate reference intervals can be a bit problematic, particularly when evaluating the erythron. Camelid erythrocytes vary from other mammals in that they are small, flat, and elliptical. This variation makes data obtained from samples collected from these species prone to error when using some automated instruments. Normal and abnormal findings in camelid blood are reviewed as well as how to ensure accurate results.
Coordinating Robot Teams for Disaster Relief
2015-05-01
eventually guide vehicles in cooperation with its Operator(s), but in this paper we assume static mission goals, a fixed number of vehicles, and a...is tedious and error prone. Kress-Gazit et al. (2009) instead synthesize an FSA from an LTL specification using a game theory approach (Bloem et al...helping an Operator coordinate a team of vehicles in Disaster Relief. Acknowledgements Thanks to OSD ASD (R&E) for sponsoring this research. The
Vienna Fortran - A Language Specification. Version 1.1
1992-03-01
other computer architectures is the fact that the memory is physically distributed among the processors; the time required to access a non-local...datum may be an order of magnitude higher than the time taken to access locally stored data. This has important consequences for program efficiency. In...machine in many aspects. It is tedious, time-consuming and error prone. It has led to particularly slow software development cycles and, in consequence
Toward an Operational Definition of Workload: A Workload Assessment of Aviation Maneuvers
2010-08-01
and evaluated by the learner. With practice, the learner moves into the second phase, where optimal strategies are strengthened. The final stage of...The first phase demands a great amount of resources as performance is slow and prone to errors. During this phase, strategies are being formulated...asked to assess mental, physical, visual, aural, and verbal demands of each task. The new assessment is a cost-effective method of assessing workload
NASA Astrophysics Data System (ADS)
Raleigh, M. S.; Lundquist, J. D.; Clark, M. P.
2015-07-01
Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology and which error characteristics are most important. Here we employ global sensitivity analysis to explore how (1) different error types (i.e., bias, random errors), (2) different error probability distributions, and (3) different error magnitudes influence physically based simulations of four snow variables (snow water equivalent, ablation rates, snow disappearance, and sublimation). We use the Sobol' global sensitivity analysis, which is typically used for model parameters but adapted here for testing model sensitivity to coexisting errors in all forcings. We quantify the Utah Energy Balance model's sensitivity to forcing errors with 1 840 000 Monte Carlo simulations across four sites and five different scenarios. Model outputs were (1) consistently more sensitive to forcing biases than random errors, (2) generally less sensitive to forcing error distributions, and (3) critically sensitive to different forcings depending on the relative magnitude of errors. For typical error magnitudes found in areas with drifting snow, precipitation bias was the most important factor for snow water equivalent, ablation rates, and snow disappearance timing, but other forcings had a more dominant impact when precipitation uncertainty was due solely to gauge undercatch. Additionally, the relative importance of forcing errors depended on the model output of interest. Sensitivity analysis can reveal which forcing error characteristics matter most for hydrologic modeling.
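To make the variance-based approach concrete, here is a minimal Sobol' sensitivity sketch using the SALib library on a toy two-forcing function; the toy model, error bounds, and sample size are illustrative stand-ins, not the Utah Energy Balance model or the study's error scenarios.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

def toy_ablation(temp_bias, precip_bias):
    """Toy stand-in for a snow model output driven by two forcing errors."""
    return 2.0 * temp_bias - 1.2 * precip_bias + 0.3 * temp_bias * precip_bias

problem = {
    "num_vars": 2,
    "names": ["temp_bias", "precip_bias"],
    "bounds": [[-2.0, 2.0], [-0.5, 0.5]],  # illustrative error magnitudes
}

# Saltelli sampling builds the input matrix needed for Sobol' indices.
X = saltelli.sample(problem, 1024)
Y = np.array([toy_ablation(t, p) for t, p in X])

Si = sobol.analyze(problem, Y)
print("first-order indices:", dict(zip(problem["names"], Si["S1"].round(3))))
print("total-order indices:", dict(zip(problem["names"], Si["ST"].round(3))))
```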
Identification and correction of systematic error in high-throughput sequence data
2011-01-01
Background A feature common to all DNA sequencing technologies is the presence of base-call errors in the sequenced reads. The implications of such errors are application specific, ranging from minor informatics nuisances to major problems affecting biological inferences. Recently developed "next-gen" sequencing technologies have greatly reduced the cost of sequencing, but have been shown to be more error prone than previous technologies. Both position specific (depending on the location in the read) and sequence specific (depending on the sequence in the read) errors have been identified in Illumina and Life Technologies sequencing platforms. We describe a new type of systematic error that manifests as statistically unlikely accumulations of errors at specific genome (or transcriptome) locations. Results We characterize and describe systematic errors using overlapping paired reads from high-coverage data. We show that such errors occur in approximately 1 in 1000 base pairs, and that they are highly replicable across experiments. We identify motifs that are frequent at systematic error sites, and describe a classifier that distinguishes heterozygous sites from systematic error. Our classifier is designed to accommodate data from experiments in which the allele frequencies at heterozygous sites are not necessarily 0.5 (such as in the case of RNA-Seq), and can be used with single-end datasets. Conclusions Systematic errors can easily be mistaken for heterozygous sites in individuals, or for SNPs in population analyses. Systematic errors are particularly problematic in low coverage experiments, or in estimates of allele-specific expression from RNA-Seq data. Our characterization of systematic error has allowed us to develop a program, called SysCall, for identifying and correcting such errors. We conclude that correction of systematic errors is important to consider in the design and interpretation of high-throughput sequencing experiments. PMID:22099972
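The SysCall classifier itself is not reproduced here, but its core intuition, that errors accumulate at specific positions far beyond what a uniform error rate allows, can be sketched with a simple binomial test; the per-base error rate and significance threshold below are illustrative assumptions.

```python
from scipy.stats import binom

PER_BASE_ERROR = 0.001  # illustrative uniform sequencing error rate
ALPHA = 1e-6            # illustrative flagging threshold

def is_systematic(coverage, mismatches,
                  error_rate=PER_BASE_ERROR, alpha=ALPHA):
    """Flag a genome position whose mismatch count is statistically
    unlikely under a uniform per-base error model."""
    # Survival function: P(X >= mismatches), X ~ Binomial(coverage, rate).
    p_value = binom.sf(mismatches - 1, coverage, error_rate)
    return p_value < alpha

# 12 mismatches in 100 reads is wildly unlikely at a 0.1% error rate, so
# the site is either a real variant or a systematic error.
print(is_systematic(coverage=100, mismatches=12))  # True
print(is_systematic(coverage=100, mismatches=1))   # False
```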
Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.
Heredia, Nicholas J
2018-01-01
Digital PCR is a valuable tool to quantify next-generation sequencing (NGS) libraries precisely and accurately. Accurate library quantification enables accurate loading of libraries onto the sequencer, improving sequencing performance by reducing under- and overloading errors. It also enables uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity across the indexed/barcoded samples. The advantages of the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification as well as size quality assessment, enabling users to QC their sequencing libraries with confidence.
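Digital PCR concentrations are derived from the fraction of negative droplets via Poisson statistics; the sketch below shows that standard calculation, with the droplet volume given as a nominal assumed value (~0.85 nL) rather than an instrument-certified constant.

```python
import math

NOMINAL_DROPLET_UL = 0.00085  # assumed droplet volume (~0.85 nL)

def ddpcr_concentration(n_negative, n_total, droplet_ul=NOMINAL_DROPLET_UL):
    """Copies per microliter from droplet counts. The mean copies per
    droplet (lambda) follows from the Poisson zero class:
    P(0) = n_negative / n_total = exp(-lambda)."""
    lam = -math.log(n_negative / n_total)
    return lam / droplet_ul

# Example: 14,000 negative droplets out of 18,000 accepted droplets.
print(f"{ddpcr_concentration(14000, 18000):.0f} copies/uL")
```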
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source-code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
Qiu, Zilong; Jiang, Rongrong
2017-01-01
Classical strain engineering methods often have limitations in altering multigenetic cellular phenotypes. Here we try to improve Saccharomyces cerevisiae ethanol tolerance and productivity by reprogramming its transcription profile through rewiring its key transcription component RNA polymerase II (RNAP II), which plays a central role in synthesizing mRNAs. This is the first report of using a directed evolution method to engineer RNAP II to alter S. cerevisiae strain phenotypes. Error-prone PCR was employed to engineer the subunit Rpb7 of RNAP II to improve yeast ethanol tolerance and production. Based on previous studies and the presumption that improved ethanol resistance would lead to enhanced ethanol production, we first isolated variant M1 with much improved resistance towards 8 and 10% ethanol. The ethanol titer of M1 was ~122 g/L (96.58% of the theoretical yield) under laboratory very high gravity (VHG) fermentation, a 40% increase compared to the control. DNA microarray assay showed that 369 genes were differentially expressed in M1 after 12 h of VHG fermentation; these genes are involved in glycolysis, alcoholic fermentation, oxidative stress response, etc. This is the first study to demonstrate the possibility of engineering eukaryotic RNAP to alter the global transcription profile and improve strain phenotypes. Targeting subunit Rpb7 of RNAP II brought differential expression in hundreds of genes in S. cerevisiae, which finally led to improvement in yeast ethanol tolerance and production.
Anbar, Michael; Gul, Ozgur; Lamed, Raphael; Sezerman, Ugur O.
2012-01-01
The use of thermostable cellulases is advantageous for the breakdown of lignocellulosic biomass toward the commercial production of biofuels. Previously, we have demonstrated the engineering of an enhanced thermostable family 8 cellulosomal endoglucanase (EC 3.2.1.4), Cel8A, from Clostridium thermocellum, using random error-prone PCR and a combination of three beneficial mutations, dominated by an intriguing serine-to-glycine substitution (M. Anbar, R. Lamed, E. A. Bayer, ChemCatChem 2:997–1003, 2010). In the present study, we used a bioinformatics-based approach involving sequence alignment of homologous family 8 glycoside hydrolases to create a library of consensus mutations in which residues of the catalytic module are replaced at specific positions with the most prevalent amino acids in the family. One of the mutants (G283P) displayed a higher thermal stability than the wild-type enzyme. Introducing this mutation into the previously engineered Cel8A triple mutant resulted in an optimized enzyme, increasing the half-life of activity by 14-fold at 85°C. Remarkably, no loss of catalytic activity was observed compared to that of the wild-type endoglucanase. The structural changes were simulated by molecular dynamics analysis, and specific regions were identified that contributed to the observed thermostability. Intriguingly, most of the proteins used for sequence alignment in determining the consensus residues were derived from mesophilic bacteria, with optimal temperatures well below that of C. thermocellum Cel8A. PMID:22389377
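The consensus approach, replacing a wild-type residue with the amino acid most prevalent at that position across homologs, can be sketched as below; the toy alignment and positions are invented for illustration and are not the family 8 glycoside hydrolase data.

```python
from collections import Counter

# Toy multiple sequence alignment (rows = homologs, '-' = gap);
# the first row stands in for the wild-type enzyme.
alignment = [
    "MKALGTSPDG",  # wild type
    "MKALGPSPDG",
    "MRALGPSPEG",
    "MKALGPSADG",
]

wild_type = alignment[0]
consensus_mutations = []
for pos in range(len(wild_type)):
    column = [seq[pos] for seq in alignment if seq[pos] != "-"]
    consensus, _ = Counter(column).most_common(1)[0]
    if consensus != wild_type[pos]:
        # e.g. 'T6P': wild-type T at position 6 -> family consensus P
        consensus_mutations.append(f"{wild_type[pos]}{pos + 1}{consensus}")

print(consensus_mutations)  # ['T6P'] for this toy alignment
```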
Moghaddam, Samira Mafi; Song, Qijian; Mamidi, Sujan; Schmutz, Jeremy; Lee, Rian; Cregan, Perry; Osorno, Juan M; McClean, Phillip E
2014-01-01
Next generation sequence data provides valuable information and tools for genetic and genomic research and offers new insights useful for marker development. This data is useful for the design of accurate and user-friendly molecular tools. Common bean (Phaseolus vulgaris L.) is a diverse crop in which separate domestication events happened in each gene pool followed by race and market class diversification that has resulted in different morphological characteristics in each commercial market class. This has led to essentially independent breeding programs within each market class which in turn has resulted in limited within market class sequence variation. Sequence data from selected genotypes of five bean market classes (pinto, black, navy, and light and dark red kidney) were used to develop InDel-based markers specific to each market class. Design of the InDel markers was conducted through a combination of assembly, alignment and primer design software using 1.6× to 5.1× coverage of Illumina GAII sequence data for each of the selected genotypes. The procedure we developed for primer design is fast, accurate, less error prone, and higher throughput than when they are designed manually. All InDel markers are easy to run and score with no need for PCR optimization. A total of 2687 InDel markers distributed across the genome were developed. To highlight their usefulness, they were employed to construct a phylogenetic tree and a genetic map, showing that InDel markers are reliable, simple, and accurate.
Bora, Bandana; Gogoi, Debananda; Tripathy, Debabrata; Kurkalang, Sillarine; Ramani, Sheetal; Chatterjee, Anupam; Mukherjee, Ashis K
2018-05-01
An N-terminal truncated fibrino(geno)lytic serine protease gene encoding a ~42 kDa protein from Bacillus cereus strain AB01 was produced by error-prone PCR, cloned into the pET19b vector, and expressed in E. coli BL21 (DE3) cells. The deletion of 24 amino acid residues from the N-terminus of wild-type Bacifrinase improved the catalytic activity of the truncated enzyme [Bacifrinase (ΔN24)]. The anticoagulant potency of Bacifrinase (ΔN24) was comparable to that of nattokinase and warfarin, and the results showed that its anticoagulant action is contributed by progressive defibrinogenation and antiplatelet activities. At the tested concentration of 2.0 μM, Bacifrinase (ΔN24) did not show in vitro cytotoxicity or chromosomal aberrations in human embryonic kidney (HEK-293) cells or human peripheral blood lymphocytes (HPBL). At a dose of 2 mg/kg, Bacifrinase (ΔN24) did not show toxicity, adverse pharmacological effects, tissue necrosis or hemorrhagic effect 72 h after its administration in Swiss albino mice. However, at tested doses of 0.125 to 0.5 mg/kg, it demonstrated a significant anticoagulant effect as well as defibrinogenation 6 h after administration in mice. We propose that Bacifrinase (ΔN24) may serve as a prototype for the development of a potent drug to prevent hyperfibrinogenemia-related disorders. Copyright © 2018 Elsevier B.V. All rights reserved.
Zhang, N; Stewart, B G; Moore, J C; Greasham, R L; Robinson, D K; Buckland, B C; Lee, C
2000-10-01
Toluene dioxygenase (TDO) from Pseudomonas putida F1 converts indene to a mixture of cis-indandiol (racemic), 1-indenol, and 1-indanone. The desired product, cis-(1S,2R)-indandiol, is a potential key intermediate in the chemical synthesis of indinavir sulfate (Crixivan), Merck's HIV-1 protease inhibitor for the treatment of AIDS. To reduce the undesirable byproducts 1-indenol and 1-indanone formed during indene bioconversion, the recombinant TDO expressed in Escherichia coli was evolved by directed evolution using the error-prone polymerase chain reaction (epPCR) method. High-throughput fluorometric and spectrophotometric assays were developed for rapid screening of the mutant libraries in a 96-well format. Mutants with reduced 1-indenol by-product formation were identified, and the individual indene bioconversion product profiles of the selected mutants were confirmed by HPLC. Changes in the amino acid sequence of the mutant enzymes were identified by analyzing the nucleotide sequence of the genes. A mutant with the most desirable product profile from each library, defined as the most reduced 1-indenol concentration and with the highest cis-(1S,2R)-indandiol enantiomeric excess, was used to perform each subsequent round of mutagenesis. After three rounds of mutagenesis and screening, mutant 1C4-3G was identified to have a threefold reduction in 1-indenol formation over the wild type (20% vs 60% of total products) and a 40% increase of product (cis-indandiol) yield.
He, Dong; Luo, Wen; Wang, Zhiyuan; Lv, Pengmei; Yuan, Zhenhong; Huang, Shaowei; Xv, Jingliang
2017-07-01
Directed evolution has proved an effective way to improve the stability of proteins, but high-throughput screening assays for directed evolution that simultaneously improve two or more properties are still rare. In this study, we aimed to establish a membrane-blot assay for use in the high-throughput screening of Rhizomucor miehei lipases (RMLs). With the assistance of the membrane-blot screening assay, a mutant E47K, named G10, that showed improved thermal stability was detected in the first round of error-prone PCR. Using G10 as the parent, two variants, G10-11 and G10-20, that showed improved thermal stability and methanol tolerance without loss of activity compared to the wild-type RML were obtained. The T50(60) value of G10-11 and G10-20 increased by 12°C and 6.5°C, respectively. After incubation for 1 h, the residual activity of G10-11 and G10-20 was 63.45% and 74.33%, respectively, in 50% methanol, and 15.98% and 30.22%, respectively, in 80% methanol. Thus, we successfully developed a membrane-blot assay that can be used for the high-throughput screening of RMLs with improved thermostability and methanol tolerance. Based on our findings, we believe that this newly developed membrane-blot assay will have potential applications in directed evolution in the future. Copyright © 2017 Elsevier Inc. All rights reserved.
Ohki, Taku; Shibata, Naoki; Higuchi, Yoshiki; Kawashima, Yasuyuki; Takeo, Masahiro; Kato, Dai-ichiro; Negoro, Seiji
2009-01-01
Promiscuous 6-aminohexanoate-linear dimer (Ald)-hydrolytic activity originally obtained in a carboxylesterase with a β-lactamase fold was enhanced about 80-fold by directed evolution using error-prone PCR and DNA shuffling. Kinetic studies of the mutant enzyme (Hyb-S4M94) demonstrated that the enzyme had acquired an increased affinity (Km = 15 mM) and turnover (kcat = 3.1 s−1) for Ald, and that a catalytic center suitable for nylon-6 byproduct hydrolysis had been generated. Construction of various mutant enzymes revealed that the enhanced activity in the newly evolved enzyme is due to the substitutions R187S/F264C/D370Y. Crystal structures of Hyb-S4M94 with bound substrate suggested that catalytic function for Ald was improved by hydrogen-bonding/hydrophobic interactions between the Ald—COOH and Tyr370, a hydrogen-bonding network from Ser187 to , and interaction between and Gln27-Oɛ derived from another subunit in the homo-dimeric structure. In wild-type Ald-hydrolase (NylB), Ald-hydrolytic activity is thought to be optimized by the substitutions G181D/H266N, which improve an electrostatic interaction with (Kawashima et al., FEBS J 2009; 276:2547–2556). We propose here that there exist at least two alternative modes for optimizing the Ald-hydrolytic activity of a carboxylesterase with a β-lactamase fold. PMID:19521995
Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay
Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming
2011-01-01
Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
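Plasmid standards for qPCR are typically converted from a mass measurement to a copy number before serial dilution; the sketch below shows that standard conversion, using the common approximation of 660 g/mol per base pair of double-stranded DNA.

```python
AVOGADRO = 6.022e23  # molecules per mole
BP_WEIGHT = 660.0    # g/mol per base pair, common dsDNA approximation

def plasmid_copies(mass_ng, plasmid_length_bp):
    """Convert a plasmid mass (ng) into an absolute molecule count."""
    grams = mass_ng * 1e-9
    moles = grams / (plasmid_length_bp * BP_WEIGHT)
    return moles * AVOGADRO

# Example: 1 ng of a 3,000 bp plasmid standard (~3.0e8 copies).
print(f"{plasmid_copies(1.0, 3000):.2e} copies")
```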
Error-proneness as a handicap signal.
De Jaegher, Kris
2003-09-21
This paper describes two discrete signalling models in which the error-proneness of signals can serve as a handicap signal. In the first model, the direct handicap of sending a high-quality signal is not large enough to ensure that a low-quality signaller will not send it. However, if the receiver sometimes mistakes a high-quality signal for a low-quality one, then there is an indirect handicap to sending a high-quality signal. The total handicap of sending such a signal may then still be large enough that a low-quality signaller would not want to send it. In the second model, there is no direct handicap of sending signals, so that nothing would seem to stop a signaller from always sending a high-quality signal. However, the receiver sometimes fails to detect signals, and this causes an indirect handicap of sending a high-quality signal that still stops the low-quality signaller from sending such a signal. The conditions for honesty are that the probability of an error of detection is higher for a high-quality than for a low-quality signal, and that the receiver who does not detect a signal adopts a response that is bad for the signaller. In both models, we thus obtain the result that signal accuracy should not lie above a certain level in order for honest signalling to be possible. Moreover, we show that the maximal achievable accuracy is higher the lower the degree of conflict between signaller and receiver. Finally, we show that it may be the conditions for honest signalling that constrain signal accuracy, rather than the signaller trying to make honest signals as effective as possible given receiver psychology, or the signaller adapting the accuracy of honest signals depending on his interests.
Dynamic power scheduling system for JPEG2000 delivery over wireless networks
NASA Astrophysics Data System (ADS)
Martina, Maurizio; Vacca, Fabrizio
2003-06-01
The diffusion of third-generation mobile terminals is encouraging the development of new multimedia-based applications. The reliable transmission of audiovisual content will gain major interest, being one of the most valuable services. Nevertheless, the mobile scenario is severely power-constrained: high compression ratios and refined energy management strategies are highly advisable. JPEG2000 as the source encoding stage assures excellent performance with extremely good visual quality. However, the limited power budget imposes limits on the computational effort in order to save as much power as possible. Since the wireless environment is error-prone, strong error-resilience features also need to be employed. This paper investigates the trade-off between quality and power in such a challenging environment.
Identification of User Facility Related Publications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patton, Robert M; Stahl, Christopher G; Wells, Jack C
2012-01-01
Scientific user facilities provide physical resources and technical support that enable scientists to conduct experiments or simulations pertinent to their respective research. One metric for evaluating the scientific value or impact of a facility is the number of publications by users as a direct result of using that facility. Unfortunately, for a variety of reasons, capturing accurate values for this metric proves time consuming and error-prone. This work describes a new approach that leverages automated browser technology combined with text analytics to reduce the time and error involved in identifying publications related to user facilities. With this approach, scientific user facilities gain more accurate measures of their impact as well as insight into policy revisions for user access.
Interactions and Localization of Escherichia coli Error-Prone DNA Polymerase IV after DNA Damage.
Mallik, Sarita; Popodi, Ellen M; Hanson, Andrew J; Foster, Patricia L
2015-09-01
Escherichia coli's DNA polymerase IV (Pol IV/DinB), a member of the Y family of error-prone polymerases, is induced during the SOS response to DNA damage and is responsible for translesion bypass and adaptive (stress-induced) mutation. In this study, the localization of Pol IV after DNA damage was followed using fluorescent fusions. After exposure of E. coli to DNA-damaging agents, fluorescently tagged Pol IV localized to the nucleoid as foci. Stepwise photobleaching indicated ∼60% of the foci consisted of three Pol IV molecules, while ∼40% consisted of six Pol IV molecules. Fluorescently tagged Rep, a replication accessory DNA helicase, was recruited to the Pol IV foci after DNA damage, suggesting that the in vitro interaction between Rep and Pol IV reported previously also occurs in vivo. Fluorescently tagged RecA also formed foci after DNA damage, and Pol IV localized to them. To investigate if Pol IV localizes to double-strand breaks (DSBs), an I-SceI endonuclease-mediated DSB was introduced close to a fluorescently labeled LacO array on the chromosome. After DSB induction, Pol IV localized to the DSB site in ∼70% of SOS-induced cells. RecA also formed foci at the DSB sites, and Pol IV localized to the RecA foci. These results suggest that Pol IV interacts with RecA in vivo and is recruited to sites of DSBs to aid in the restoration of DNA replication. DNA polymerase IV (Pol IV/DinB) is an error-prone DNA polymerase capable of bypassing DNA lesions and aiding in the restart of stalled replication forks. In this work, we demonstrate in vivo localization of fluorescently tagged Pol IV to the nucleoid after DNA damage and to DNA double-strand breaks. We show colocalization of Pol IV with two proteins: Rep DNA helicase, which participates in replication, and RecA, which catalyzes recombinational repair of stalled replication forks. Time course experiments suggest that Pol IV recruits Rep and that RecA recruits Pol IV. These findings provide in vivo evidence that Pol IV aids in maintaining genomic stability not only by bypassing DNA lesions but also by participating in the restoration of stalled replication forks. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Bordelon, Hali; Ricks, Keersten M.; Pask, Megan E.; Russ, Patricia K.; Solinas, Francesca; Baglia, Mark L.; Short, Philip A.; Nel, Andrew; Blackburn, Jonathan; Dheda, Keertan; Zamudio, Carlos; Cáceres, Tatiana; Wright, David W.; Haselton, Frederick R.; Pettit, April C.
2017-01-01
Urine samples are increasingly used for diagnosing infections including Escherichia coli, Ebola virus, and Zika virus. However, extraction and concentration of nucleic acid biomarkers from urine is necessary for many molecular detection strategies such as polymerase chain reaction (PCR). Since urine samples typically have large volumes with dilute biomarker concentrations, they are prone to false negatives; an additional impediment for urine-based diagnostics is therefore the establishment of appropriate controls, particularly to rule out false negatives. In this study, a mouse glyceraldehyde 3-phosphate dehydrogenase (GAPDH) DNA target was added to retrospectively collected urine samples from tuberculosis (TB)-infected and TB-uninfected patients to indicate extraction of intact DNA and removal of PCR inhibitors from urine samples. We tested this design on surrogate urine samples, retrospective 1 milliliter (mL) urine samples from patients in Lima, Peru, and retrospective 5 mL urine samples from patients in Cape Town, South Africa. Extraction/PCR control DNA was detectable in 97% of clinical samples with no statistically significant differences among groups. Despite the inclusion of this control, there was no difference in the amount of TB IS6110 Tr-DNA detected between TB-infected and TB-uninfected groups except for samples from known HIV-infected patients. We found an increase in TB IS6110 Tr-DNA in TB/HIV co-infected patients compared to TB-uninfected/HIV-infected patients (N=18, p=0.037). The inclusion of an extraction/PCR control DNA to indicate successful DNA extraction and removal of PCR inhibitors should be easily adaptable as a sample preparation control for other acellular sample types. PMID:28285168
The rate of phosphocreatine hydrolysis and resynthesis in exercising muscle in humans using 31P-MRS.
Yoshida, Takayoshi
2002-09-01
Time-resolved 31-phosphorus nuclear magnetic resonance spectroscopy (31P-MRS) of the biceps femoris muscles was performed during exercise and recovery in six healthy sedentary male subjects (maximal oxygen uptake; 46.6 +/- 1.7 (SEM) ml.kg-1.min-1), 5 male sprinters (56.2 +/- 2.5), and 5 male long-distance runners (73.6 +/- 2.2). Each performed 4 min of knee flexion exercises at absolute values of 1.63 W and 4.90 W, followed by 5 min of recovery in a prone position in a 2.1 T superconducting magnet with a 67 cm bore. 31P-MRS spectra were recorded every 12.8 s during the rest-exercise-recovery sequence. Computer-aided contour analysis and pixel imaging of phosphocreatine peaks (PCr) and inorganic phosphate (Pi) were performed. The work loads in the present study were selected as mild exercise (1.63 W) and heavy exercise (4.90 W), corresponding to 18-23% and 54-70% of maximal exercise intensity. Long-distance runners showed a significantly smaller decrement in PCr and less acidification at a given exercise intensity compared to those shown by sedentary subjects. The transient responses of PCr and Pi during recovery were characterized by first-order kinetics. After exercise, the recovery rates of PCr and Pi were significantly faster in long-distance runners than in sedentary subjects (P < 0.05). Since it is postulated that PCr resynthesis is controlled by aerobic metabolism and mitochondrial creatine kinase, it is suggested that the faster PCr and Pi recovery rates and decreased acidification seen in long-distance runners during and after exercise might be attributed to their greater capacity for aerobic metabolism.
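The first-order recovery kinetics referred to above amount to fitting a mono-exponential to the post-exercise PCr time course; the sketch below does this with SciPy on invented data sampled every 12.8 s as in the protocol, so the signal values and time constant are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def pcr_recovery(t, pcr_end, delta, tau):
    """Mono-exponential recovery: PCr(t) = pcr_end - delta * exp(-t/tau)."""
    return pcr_end - delta * np.exp(-t / tau)

# Invented recovery time course sampled every 12.8 s.
t = np.arange(0.0, 300.0, 12.8)
truth = pcr_recovery(t, pcr_end=100.0, delta=40.0, tau=45.0)
noisy = truth + np.random.default_rng(0).normal(0.0, 1.5, t.size)

(pcr_end, delta, tau), _ = curve_fit(pcr_recovery, t, noisy,
                                     p0=[90.0, 30.0, 60.0])
print(f"fitted recovery time constant tau ~ {tau:.1f} s")
```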
ATYPICAL CHLAMYDIACEAE IN WILD POPULATIONS OF HAWKS ( BUTEO SPP.) IN CALIFORNIA.
Luján-Vega, Charlene; Hawkins, Michelle G; Johnson, Christine K; Briggs, Christopher; Vennum, Chris; Bloom, Peter H; Hull, Joshua M; Cray, Carolyn; Pesti, Denise; Johnson, Lisa; Ciembor, Paula; Ritchie, Branson R
2018-03-01
Chlamydiaceae bacteria infect many vertebrate hosts, and previous reports based on polymerase chain reaction (PCR) assays and serologic assays that are prone to cross-reaction among chlamydial organisms have been used to describe the prevalence of either DNA fragments or antibodies to Chlamydia spp. in wild raptor populations. This study reports the PCR-based prevalence, in wild populations of hawks (Buteo spp.) in California, of Chlamydiaceae DNA that does not 100% match any known avian or mammalian Chlamydiaceae. Multimucosal swab samples (n = 291) for quantitative PCR (qPCR) and plasma (n = 78) for serology were collected from wild hawks. All available plasma samples (n = 78) were negative for antibodies using a C. psittaci-specific elementary body agglutination test (EBA), and all 51 available samples were negative for IgY antibodies using an indirect immunofluorescent assay. The overall prevalence of Chlamydiaceae DNA in the wild Buteo species sampled was 1.37% (4/291) by qPCR-based analysis. Two fledgling Swainson's hawks (Buteo swainsoni) and two juvenile red-tailed hawks (Buteo jamaicensis) were positive by qPCR-based assay for an atypical chlamydial sequence that did not 100% match any known C. psittaci genotype. Positive swab samples from these four birds were sequenced based on the ompA gene and compared by high-resolution melt analysis with all known avian and mammalian Chlamydiaceae. The amplicon sequence did not 100% match any known avian chlamydial sequence; however, it was most similar (98.6%) to C. psittaci M56, a genotype that is typically found in muskrats and hares. Culture and full-genome sequence analysis of Chlamydia spp. isolated from diseased hawks will be necessary to classify this organism and to better understand its epizootiology and potential health impact on wild Buteo populations in California.
Jonsen, Ian
2016-02-08
State-space models provide a powerful way to scale up inference of movement behaviours from individuals to populations when the inference is made across multiple individuals. Here, I show how a joint estimation approach that assumes individuals share identical movement parameters can lead to improved inference of behavioural states associated with different movement processes. I use simulated movement paths with known behavioural states to compare estimation error between nonhierarchical and joint estimation formulations of an otherwise identical state-space model. Behavioural state estimation error was strongly affected by the degree of similarity between movement patterns characterising the behavioural states, with less error when movements were strongly dissimilar between states. The joint estimation model improved behavioural state estimation relative to the nonhierarchical model for simulated data with heavy-tailed Argos location errors. When applied to Argos telemetry datasets from 10 Weddell seals, the nonhierarchical model estimated highly uncertain behavioural state switching probabilities for most individuals whereas the joint estimation model yielded substantially less uncertainty. The joint estimation model better resolved the behavioural state sequences across all seals. Hierarchical or joint estimation models should be the preferred choice for estimating behavioural states from animal movement data, especially when location data are error-prone.
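The paper's state-space models are considerably richer than anything shown here, but the idea of recovering latent behavioural states from movement data can be illustrated with a simple Gaussian hidden Markov model from the hmmlearn package; the two-state simulation and all parameters below are invented for illustration.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)

# Simulate a sticky two-state behavioural sequence (0 = resident,
# 1 = transit), then draw step lengths from state-specific Gaussians.
n = 500
states = np.zeros(n, dtype=int)
for i in range(1, n):
    states[i] = states[i - 1] if rng.random() < 0.9 else 1 - states[i - 1]
means, sds = np.array([0.5, 3.0]), np.array([0.3, 1.0])
steps = rng.normal(means[states], sds[states]).reshape(-1, 1)

# Fit a 2-state Gaussian HMM and decode the most likely state sequence.
model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100)
model.fit(steps)
decoded = model.predict(steps)

# The HMM labels states arbitrarily, so align labels before scoring.
accuracy = max(np.mean(decoded == states), np.mean(decoded != states))
print(f"behavioural state estimation accuracy ~ {accuracy:.2f}")
```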
Causal inference with measurement error in outcomes: Bias analysis and estimation methods.
Shu, Di; Yi, Grace Y
2017-01-01
Inverse probability weighting estimation has been popularly used to consistently estimate the average treatment effect. Its validity, however, is challenged by the presence of error-prone variables. In this paper, we explore the inverse probability weighting estimation with mismeasured outcome variables. We study the impact of measurement error for both continuous and discrete outcome variables and reveal interesting consequences of the naive analysis which ignores measurement error. When a continuous outcome variable is mismeasured under an additive measurement error model, the naive analysis may still yield a consistent estimator; when the outcome is binary, we derive the asymptotic bias in a closed-form. Furthermore, we develop consistent estimation procedures for practical scenarios where either validation data or replicates are available. With validation data, we propose an efficient method for estimation of average treatment effect; the efficiency gain is substantial relative to usual methods of using validation data. To provide protection against model misspecification, we further propose a doubly robust estimator which is consistent even when either the treatment model or the outcome model is misspecified. Simulation studies are reported to assess the performance of the proposed methods. An application to a smoking cessation dataset is presented.
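For reference, the basic inverse probability weighting estimator discussed above (ignoring measurement error) can be written in a few lines; the data-generating process, true effect, and sample size below are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Simulated confounder, treatment, and outcome; true ATE is 2.0.
x = rng.normal(size=n)
p_treat = 1.0 / (1.0 + np.exp(-0.5 * x))  # true propensity depends on x
t = rng.binomial(1, p_treat)
y = 2.0 * t + x + rng.normal(size=n)

# Estimate propensity scores, then form the IPW estimate of the ATE.
X = x.reshape(-1, 1)
e = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]
ate_ipw = np.mean(t * y / e) - np.mean((1 - t) * y / (1 - e))
print(f"IPW ATE estimate ~ {ate_ipw:.2f} (truth 2.0)")
```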
IPTV multicast with peer-assisted lossy error control
NASA Astrophysics Data System (ADS)
Li, Zhi; Zhu, Xiaoqing; Begen, Ali C.; Girod, Bernd
2010-07-01
Emerging IPTV technology uses source-specific IP multicast to deliver television programs to end-users. To provide reliable IPTV services over the error-prone DSL access networks, a combination of multicast forward error correction (FEC) and unicast retransmissions is employed to mitigate the impulse noises in DSL links. In existing systems, the retransmission function is provided by the Retransmission Servers sitting at the edge of the core network. In this work, we propose an alternative distributed solution where the burden of packet loss repair is partially shifted to the peer IP set-top boxes. Through the Peer-Assisted Repair (PAR) protocol, we demonstrate how packet repairs can be delivered in a timely, reliable and decentralized manner using a combination of server-peer coordination and redundancy of repairs. We also show that this distributed protocol can be seamlessly integrated with an application-layer source-aware error protection mechanism called forward and retransmitted Systematic Lossy Error Protection (SLEP/SLEPr). Simulations show that this joint PAR-SLEP/SLEPr framework not only effectively mitigates the bottleneck experienced by the Retransmission Servers, thus greatly enhancing the scalability of the system, but also efficiently improves resistance to impulse noise.
On Neglecting Chemical Exchange Effects When Correcting in Vivo 31P MRS Data for Partial Saturation
NASA Astrophysics Data System (ADS)
Ouwerkerk, Ronald; Bottomley, Paul A.
2001-02-01
Signal acquisition in most MRS experiments requires a correction for partial saturation that is commonly based on a single exponential model for T1 that ignores effects of chemical exchange. We evaluated the errors in 31P MRS measurements introduced by this approximation in two-, three-, and four-site chemical exchange models under a range of flip-angles and pulse sequence repetition times (TR) that provide near-optimum signal-to-noise ratio (SNR). In two-site exchange, such as the creatine-kinase reaction involving phosphocreatine (PCr) and γ-ATP in human skeletal and cardiac muscle, errors in saturation factors were determined for the progressive saturation method and the dual-angle method of measuring T1. The analysis shows that these errors are negligible for the progressive saturation method if the observed T1 is derived from a three-parameter fit of the data. When T1 is measured with the dual-angle method, errors in saturation factors are less than 5% for all conceivable values of the chemical exchange rate and flip-angles that deliver useful SNR per unit time over the range T1/5 ≤ TR ≤ 2T1. Errors are also less than 5% for three- and four-site exchange when TR ≥ T1*/2, the so-called "intrinsic" T1's of the metabolites. The effect of changing metabolite concentrations and chemical exchange rates on observed T1's and saturation corrections was also examined with a three-site chemical exchange model involving ATP, PCr, and inorganic phosphate in skeletal muscle undergoing up to 95% PCr depletion. Although the observed T1's were dependent on metabolite concentrations, errors in saturation corrections for TR = 2 s could be kept within 5% for all exchanging metabolites using a simple interpolation of two dual-angle T1 measurements performed at the start and end of the experiment. Thus, the single-exponential model appears to be reasonably accurate for correcting 31P MRS data for partial saturation in the presence of chemical exchange. Even in systems where metabolite concentrations change, accurate saturation corrections are possible without much loss in SNR.
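For reference, the single-exponential partial-saturation model discussed above has a closed-form steady-state signal for a spoiled pulse-acquire sequence; the sketch below computes the resulting saturation correction factor, with flip angle, TR, and T1 values chosen purely for illustration and chemical exchange deliberately ignored, as in the approximation the abstract evaluates.

```python
import math

def saturation_factor(flip_deg, tr_s, t1_s):
    """Steady-state signal relative to a fully relaxed acquisition at the
    same flip angle, assuming mono-exponential T1 recovery (no exchange):
        S/S0 = (1 - E1) / (1 - E1 * cos(a)),  E1 = exp(-TR/T1).
    Dividing a measured signal by this factor corrects for partial
    saturation under the single-exponential model."""
    a = math.radians(flip_deg)
    e1 = math.exp(-tr_s / t1_s)
    return (1.0 - e1) / (1.0 - e1 * math.cos(a))

# Illustrative values: 60 degree flip, TR = 2 s, T1 = 4 s.
print(f"saturation factor ~ {saturation_factor(60.0, 2.0, 4.0):.3f}")
```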
Accurate typing of short tandem repeats from genome-wide sequencing data and its applications.
Fungtammasan, Arkarachai; Ananda, Guruprasad; Hile, Suzanne E; Su, Marcia Shu-Wei; Sun, Chen; Harris, Robert; Medvedev, Paul; Eckert, Kristin; Makova, Kateryna D
2015-05-01
Short tandem repeats (STRs) are implicated in dozens of human genetic diseases and contribute significantly to genome variation and instability. Yet profiling STRs from short-read sequencing data is challenging because of their high sequencing error rates. Here, we developed STR-FM, short tandem repeat profiling using flank-based mapping, a computational pipeline that can detect the full spectrum of STR alleles from short-read data, can adapt to emerging read-mapping algorithms, and can be applied to heterogeneous genetic samples (e.g., tumors, viruses, and genomes of organelles). We used STR-FM to study STR error rates and patterns in publicly available human and in-house generated ultradeep plasmid sequencing data sets. We discovered that STRs sequenced with a PCR-free protocol have up to ninefold fewer errors than those sequenced with a PCR-containing protocol. We constructed an error correction model for genotyping STRs that can distinguish heterozygous alleles containing STRs with consecutive repeat numbers. Applying our model and pipeline to Illumina sequencing data with 100-bp reads, we could confidently genotype several disease-related long trinucleotide STRs. Utilizing this pipeline, for the first time we determined the genome-wide STR germline mutation rate from a deeply sequenced human pedigree. Additionally, we built a tool that recommends minimal sequencing depth for accurate STR genotyping, depending on repeat length and sequencing read length. The required read depth increases with STR length and is lower for a PCR-free protocol. This suite of tools addresses the pressing challenges surrounding STR genotyping, and thus is of wide interest to researchers investigating disease-related STRs and STR evolution. © 2015 Fungtammasan et al.; Published by Cold Spring Harbor Laboratory Press.
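The depth-recommendation idea can be illustrated with a simplified majority-vote model: given a per-read probability of mistyping the STR, find the smallest depth at which the true allele wins a strict majority of reads with high probability. The error rates and target confidence below are invented, and real genotyping (especially of heterozygotes) needs a richer model than this sketch.

```python
from math import floor
from scipy.stats import binom

def min_depth(per_read_error, target=0.99, max_depth=300):
    """Smallest read depth at which a strict majority of reads shows the
    true allele with probability >= target (simplified homozygous model)."""
    p_correct = 1.0 - per_read_error
    for depth in range(1, max_depth + 1):
        # P(more than half of the reads report the true allele)
        p_majority = binom.sf(floor(depth / 2), depth, p_correct)
        if p_majority >= target:
            return depth
    return None

# Longer STRs have higher per-read error rates, hence need more depth.
for err in (0.01, 0.10, 0.30):
    print(f"per-read error {err:.2f} -> minimum depth {min_depth(err)}")
```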
2017-02-15
Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone...information processors have been demonstrated experimentally using superconducting circuits1–3, electrons in semiconductors4–6, trapped atoms and...qubit quantum information processor has been realized14, and single-qubit gates have demonstrated randomized benchmarking (RB) infidelities as low as 10
The preliminary SOL (Sizing and Optimization Language) reference manual
NASA Technical Reports Server (NTRS)
Lucas, Stephen H.; Scotti, Stephen J.
1989-01-01
The Sizing and Optimization Language, SOL, a high-level special-purpose computer language has been developed to expedite application of numerical optimization to design problems and to make the process less error-prone. This document is a reference manual for those wishing to write SOL programs. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler and runtime library routines. An overview of SOL appears in NASA TM 100565.
NASA Technical Reports Server (NTRS)
Ayap, Shanti; Fisher, Forest; Gladden, Roy; Khanampompan, Teerapat
2008-01-01
This software tool saves time and reduces risk by automating two labor-intensive and error-prone post-processing steps required for every DKF [DSN (Deep Space Network) Keyword File] that MRO (Mars Reconnaissance Orbiter) produces, and is being extended to post-process the corresponding TSOE (Text Sequence Of Events) as well. The need for this post-processing step stems from limitations in the seq-gen modeling resulting in incorrect DKF generation that is then cleaned up in post-processing.
The Implications of Self-Reporting Systems for Maritime Domain Awareness
2006-12-01
(AIS), offer significant advantages compared with tracking vessels by conventional sensors, and that the availability of the information...reporting system for sea-going vessels that originated in Sweden in the early 1990s. It was designed primarily for safety of life at sea (SOLAS) and...report information is prone to human error and potential malicious altering, and the system itself was not designed with these vulnerabilities in mind
Livneh, Zvi
2006-09-01
To overcome DNA lesions that block replication the cell employs translesion DNA synthesis (TLS) polymerases, a group of low fidelity DNA polymerases that have the capacity to bypass a wide range of DNA lesions. This TLS process is also termed error-prone repair, due to its inherent mutagenic nature. We have recently shown that the tumor suppressor p53 and the cell cycle inhibitor p21 are global regulators of TLS. When these proteins are missing or nonfunctional, TLS gets out of control: its extent increases to very high levels, and its fidelity decreases, causing an overall increase in mutation load. This may be explained by the loss of selectivity in the bypass of specific DNA lesions by their cognate specialized polymerases, such that lesion bypass continues to a maximum, regardless of the price paid in increased mutations. The p53 and p21 proteins are also required for efficient UV light-induced monoubiquitination of PCNA, which is consistent with a model in which this modification of PCNA is necessary but not sufficient for the normal activity of TLS. This regulation suggests that TLS evolved in mammals as a system that balances gain in survival with a tolerable mutational cost, and that disturbing this balance causes a potentially harmful increase in mutations, which might play a role in carcinogenesis.
Choi, Jung-Suk; Dasari, Anvesh; Hu, Peter; Benkovic, Stephen J.; Berdis, Anthony J.
2016-01-01
This report evaluates the pro-mutagenic behavior of 8-oxo-guanine (8-oxo-G) by quantifying the ability of high-fidelity and specialized DNA polymerases to incorporate natural and modified nucleotides opposite this lesion. Although high-fidelity DNA polymerases such as pol δ and the bacteriophage T4 DNA polymerase replicate 8-oxo-G in an error-prone manner, they display remarkably low efficiencies for TLS compared to normal DNA synthesis. In contrast, pol η shows a combination of high efficiency and low fidelity when replicating 8-oxo-G. These combined properties are consistent with a pro-mutagenic role for pol η when replicating this DNA lesion. Studies using modified nucleotide analogs show that pol η relies heavily on hydrogen-bonding interactions during translesion DNA synthesis. However, nucleobase modifications such as alkylation of the N2 position of guanine significantly increase error-prone synthesis catalyzed by pol η when replicating 8-oxo-G. Molecular modeling studies demonstrate the existence of a hydrophobic pocket in pol η that participates in the increased utilization of certain hydrophobic nucleotides. A model is proposed for enhanced pro-mutagenic replication catalyzed by pol η that couples efficient incorporation of damaged nucleotides opposite oxidized DNA lesions created by reactive oxygen species. The biological implications of this model toward increasing mutagenic events in lung cancer are discussed. PMID:26717984
Seidman, M M; Bredberg, A; Seetharam, S; Kraemer, K H
1987-07-01
Mutagenesis was studied at the DNA-sequence level in human fibroblast and lymphoid cells by use of a shuttle vector plasmid, pZ189, containing a suppressor tRNA marker gene. In a series of experiments, 62 plasmids were recovered that had two to six base substitutions in the 160-base-pair marker gene. Approximately 20-30% of the mutant plasmids that were recovered after passing ultraviolet-treated pZ189 through a repair-proficient human fibroblast line contained these multiple mutations. In contrast, passage of ultraviolet-treated pZ189 through an excision-repair-deficient (xeroderma pigmentosum) line yielded only 2% multiple base substitution mutants. Introducing a single-strand nick in otherwise unmodified pZ189 adjacent to the marker, followed by passage through the xeroderma pigmentosum cells, resulted in about 66% multiple base substitution mutants. The multiple mutations were found in a 160-base-pair region containing the marker gene but were rarely found in an adjacent 170-base-pair region. Passing ultraviolet-treated or nicked pZ189 through a repair-proficient human B-cell line also yielded multiple base substitution mutations in 20-33% of the mutant plasmids. An explanation for these multiple mutations is that they were generated by an error-prone polymerase while filling gaps. These mutations share many of the properties displayed by mutations in the immunoglobulin hypervariable regions.
de Andrade, H H; Marques, E K; Schenberg, A C; Henriques, J A
1989-06-01
The induction of mitotic gene conversion and crossing-over in Saccharomyces cerevisiae diploid cells homozygous for the pso4-1 mutation was examined in comparison to the corresponding wild-type strain. The pso4-1 mutant strain was found to be completely blocked in mitotic recombination induced by photoaddition of mono- and bifunctional psoralen derivatives as well as by mono- (HN1) and bifunctional (HN2) nitrogen mustards or 254 nm UV radiation in both stationary and exponential phases of growth. Concerning the lethal effect, diploids homozygous for the pso4-1 mutation are more sensitive to all agents tested in any growth phase. However, this effect is more pronounced in the G2 phase of the cell cycle. These results imply that the ploidy effect and the resistance of budding cells are under the control of the PSO4 gene. On the other hand, the pso4-1 mutant is mutationally defective for all agents used. Therefore, the pso4-1 mutant has a generalized block in both recombination and mutation ability. This indicates that the PSO4 gene is involved in an error-prone repair pathway which relies on a recombinational mechanism, strongly suggesting an analogy between the pso4-1 mutation and the RecA or LexA mutation of Escherichia coli.
Efficient error correction for next-generation sequencing of viral amplicons.
Skums, Pavel; Dimitrova, Zoya; Campo, David S; Vaughan, Gilberto; Rossi, Livia; Forbi, Joseph C; Yokosawa, Jonny; Zelikovsky, Alex; Khudyakov, Yury
2012-06-25
Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm. PMID:22759430
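To make the counting idea concrete, below is a minimal Python sketch of the k-mer logic behind KEC-style correction: k-mers that occur frequently across the read set are trusted, and read positions covered only by rare k-mers are flagged as candidate errors. The k-mer length, count threshold, and toy reads are illustrative assumptions, not the published KEC algorithm.

    from collections import Counter

    K = 7           # k-mer length (illustrative)
    MIN_COUNT = 5   # k-mers rarer than this are treated as untrusted (illustrative)

    def kmer_counts(reads, k=K):
        """Count every k-mer across the read set."""
        counts = Counter()
        for read in reads:
            for i in range(len(read) - k + 1):
                counts[read[i:i + k]] += 1
        return counts

    def flag_error_positions(read, counts, k=K, min_count=MIN_COUNT):
        """Return positions covered only by low-frequency k-mers (candidate errors)."""
        supported = [False] * len(read)
        for i in range(len(read) - k + 1):
            if counts[read[i:i + k]] >= min_count:
                for j in range(i, i + k):
                    supported[j] = True
        return [p for p, ok in enumerate(supported) if not ok]

    reads = ["ACGTACGTACGTACGTACGT"] * 50 + ["ACGTACGTACCTACGTACGT"]  # last read has a miscall
    counts = kmer_counts(reads)
    print(flag_error_positions(reads[-1], counts))  # [10] -- the miscalled base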
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galavis, P; Barbee, D; Jozsef, G
2016-06-15
Purpose: Prone accelerated partial breast irradiation (APBI) results in dose reduction to the heart and lung. Flattening filter free (FFF) beams reduce out-of-field dose, owing to the reduced scatter once the flattening filter is removed, and shrink the buildup region. The aim of this work is to evaluate the dosimetric advantages of FFF beams for prone APBI target coverage and for reducing dose to organs at risk. Methods: Fifteen clinical prone APBI cases using flattened photon beams were retrospectively re-planned in the Eclipse TPS using FFF beams. FFF plans were designed to provide equivalent target coverage with similar hotspots, using the same field arrangements, resulting in comparable target DVHs. Both plans were transferred to a prone breast phantom and delivered on a Varian Edge linac. GafChromic film was placed in the coronal plane of the phantom, partially overlapping the treatment field and extending into OARs, to compare dose profiles from both plans. Results: FFF plans were comparable to the clinical plans, with maximum doses of (108.3±2.3)% and (109.2±2.4)% and mean doses of (104.5±1.0)% and (104.6±1.2)%, respectively. Similar mean doses to the heart and contralateral lungs were observed for both plans, whereas the mean dose to the contralateral breast was (2.79±1.18) cGy and (2.86±1.40) cGy for FFF and clinical plans, respectively. However, for both plans, the error between calculated and measured doses at 4 cm from the field edge was 10%. Conclusion: The results showed that FFF beams in prone APBI provide dosimetrically equivalent target coverage and improved coverage of superficial targets due to the softer energy spectrum. Film analysis showed that the TPS underestimates dose outside the field edges in both cases. The measured FFF plans showed less dose outside the beam, which might reduce the probability of secondary cancers in the contralateral breast.
Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.
Olson, Andrew P J; Graber, Mark L; Singh, Hardeep
2018-01-29
Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested but none widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.
Gut immune deficits in LEW.1AR1-iddm rats partially overcome by feeding a diabetes-protective diet.
Crookshank, Jennifer A; Patrick, Christopher; Wang, Gen-Sheng; Noel, J Ariana; Scott, Fraser W
2015-07-01
The gut immune system and its modification by diet have been implicated in the pathogenesis of type 1 diabetes (T1D). Therefore, we investigated gut immune status in non-diabetes-prone LEW.1AR1 and diabetes-prone LEW.1AR1-iddm rats and evaluated the effect of a low antigen, hydrolysed casein (HC)-based diet on gut immunity and T1D. Rats were weaned onto a cereal-based or HC-based diet and monitored for T1D. Strain and dietary effects on immune homeostasis were assessed in non-diabetic rats (50-60 days old) and rats with recent-onset diabetes using flow cytometry and immunohistochemistry. Immune gene expression was analysed in mesenteric lymph nodes (MLN) and jejunum using quantitative RT-PCR and PCR arrays. T1D was prevented in LEW.1AR1-iddm rats by feeding an HC diet. Diabetic LEW.1AR1-iddm rats had fewer lymphoid tissue T cells compared with LEW.1AR1 rats. The percentage of CD4(+) Foxp3(+) regulatory T (Treg) cells was decreased in pancreatic lymph nodes (PLN) of diabetic rats. The jejunum of 50-day LEW.1AR1-iddm rats contained fewer CD3(+) T cells, CD163(+) M2 macrophages and Foxp3(+) Treg cells. Ifng expression was increased in MLN and Foxp3 expression was decreased in the jejunum of LEW.1AR1-iddm rats; Ifng/Il4 was decreased in jejunum of LEW.1AR1-iddm rats fed HC. PCR arrays revealed decreased expression of M2-associated macrophage factors in 50-day LEW.1AR1-iddm rats. Wheat peptides stimulated T-cell proliferation and activation in MLN and PLN cells from diabetic LEW.1AR1-iddm rats. LEW.1AR1-iddm rats displayed gut immune cell deficits and decreased immunoregulatory capacity, which were partially corrected in animals fed a low antigen, protective HC diet consistent with other models of T1D. © 2015 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, H; Gao, Y; Liu, T
Purpose: To develop quantitative clinical guidelines for choosing between supine Deep Inspiratory Breath Hold (DIBH) and prone free-breathing treatments for breast patients, we applied 3D deformable phantoms in Monte Carlo simulations to predict the corresponding dose to the organs at risk (OARs). Methods: The RPI adult female phantom (two selected cup sizes: A and D) was used to represent the female patient and was simulated using the MCNP6 Monte Carlo code. Doses to OARs were investigated for supine DIBH and prone treatments, considering the two breast sizes. The fluence maps of the 6-MV opposed tangential fields were exported. In the Monte Carlo simulation, the fluence maps allow each simulated photon to be weighted in the final dose calculation. The relative error of all dose calculations was kept below 5% by simulating 3×10⁷ photons for each projection. Results: In terms of dosimetric accuracy, the RPI adult female phantom with cup size D in DIBH positioning matched a DIBH treatment plan of the patient. Based on the simulation results, for the cup size D phantom, prone positioning reduced the cardiac dose and the dose to other OARs, while the cup size A phantom benefits more from DIBH positioning. Comparing the simulation results for the cup size A and D phantoms, dose to OARs was generally higher for the large breast size due to increased scattering arising from a larger portion of the body lying in the primary beam. The lower heart dose registered for the large-breast phantom in prone positioning was due to the increased distance between the heart and the primary beam when the breast was pendulous. Conclusion: Our 3D deformable phantom appears to be an excellent tool for predicting dose to the OARs for the supine DIBH and prone positions, which might support quantitative clinical decisions. Further investigation will be conducted. National Institutes of Health R01EB015478.
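As a rough illustration of two mechanics mentioned above, the Python sketch below weights each simulated photon by a fluence-map value and tracks the relative standard error of the tally, the quantity kept below 5% in the study. The fluence map and the per-photon energy deposition are invented placeholders, not MCNP6 transport physics.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative 2D fluence map (beamlet weights), as exported from a TPS.
    fluence = rng.uniform(0.5, 1.5, size=(10, 10))
    fluence /= fluence.mean()

    n_photons = 100_000
    # Sample a beamlet per photon; the fluence value acts as a statistical weight.
    ix = rng.integers(0, 10, n_photons)
    iy = rng.integers(0, 10, n_photons)
    weights = fluence[ix, iy]

    # Toy per-photon energy deposition (placeholder for real transport physics).
    deposit = rng.exponential(1.0, n_photons) * weights

    dose = deposit.mean()
    # Relative standard error of the mean; more histories drive this down as 1/sqrt(N).
    rel_err = deposit.std(ddof=1) / (dose * np.sqrt(n_photons))
    print(f"dose estimate = {dose:.4f}, relative error = {rel_err:.4%}")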
Kuikka, Liisa; Pitkälä, Kaisu
2014-01-01
Objective. To study coping differences between young and experienced GPs in primary care who experience medical errors and uncertainty. Design. Questionnaire-based survey (self-assessment) conducted in 2011. Setting. Finnish primary practice offices in Southern Finland. Subjects. Finnish GPs engaged in primary health care from two different respondent groups: young (working experience ≤ 5 years, n = 85) and experienced (working experience > 5 years, n = 80). Main outcome measures. Outcome measures included experiences and attitudes expressed by the included participants towards medical errors and tolerance of uncertainty, their coping strategies, and factors that may influence (positively or negatively) sources of errors. Results. In total, 165/244 GPs responded (response rate: 68%). Young GPs expressed significantly more often fear of committing a medical error (70.2% vs. 48.1%, p = 0.004) and admitted more often than experienced GPs that they had committed a medical error during the past year (83.5% vs. 68.8%, p = 0.026). Young GPs were less prone to apologize to a patient for an error (44.7% vs. 65.0%, p = 0.009) and found, more often than their more experienced colleagues, on-site consultations and electronic databases useful for avoiding mistakes. Conclusion. Experienced GPs seem to better tolerate uncertainty and also seem to fear medical errors less than their young colleagues. Young and more experienced GPs use different coping strategies for dealing with medical errors. Implications. When GPs become more experienced, they seem to get better at coping with medical errors. Means to support these skills should be studied in future research. PMID:24914458
Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A
2018-04-15
For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
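A minimal sketch of the SIMEX idea applied to a Cox hazard ratio under classical error in log event times is shown below, assuming the third-party lifelines package for the Cox fits. The data-generating model, replicate counts, and quadratic extrapolant are illustrative; this does not reproduce the paper's actual extension. In this toy setup the naive fit is typically attenuated, and extrapolating the added-noise fits back to lambda = -1 pulls the estimate toward the truth.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n, beta, sigma_u = 2000, 0.5, 0.3

    x = rng.normal(size=n)
    t_true = rng.exponential(np.exp(-beta * x))          # hazard increases with x
    t_obs = t_true * np.exp(rng.normal(0, sigma_u, n))   # classical error on the log scale

    def fit_beta(times):
        """Fit a Cox model and return the log hazard ratio for x."""
        df = pd.DataFrame({"T": times, "E": 1, "x": x})  # no censoring in this toy example
        return CoxPHFitter().fit(df, duration_col="T", event_col="E").params_["x"]

    # SIMEX: add extra error of variance lam * sigma_u^2, refit, extrapolate to lam = -1.
    lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    betas = [np.mean([fit_beta(t_obs * np.exp(rng.normal(0, np.sqrt(l) * sigma_u, n)))
                      for _ in range(10)]) for l in lams]
    coefs = np.polyfit(lams, betas, 2)                   # quadratic extrapolant
    print("naive:", betas[0], "SIMEX-corrected:", np.polyval(coefs, -1.0))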
Calibration-free assays on standard real-time PCR devices
Debski, Pawel R.; Gewartowski, Kamil; Bajer, Seweryn; Garstecki, Piotr
2017-01-01
Quantitative Polymerase Chain Reaction (qPCR) is one of the central techniques in molecular biology and an important tool in medical diagnostics. While being the gold standard, qPCR techniques depend on reference measurements and are susceptible to large errors, caused by even small changes in reaction efficiency or conditions, that are typically not signalled by decreased precision. Digital PCR (dPCR) technologies should alleviate the need for calibration by providing absolute quantitation using binary (yes/no) signals from partitions, provided that the basic assumption of amplification of a single target molecule into a positive signal is met. Still, access to digital techniques is limited because they require new instruments. We show an analog-digital method that can be executed on standard (real-time) qPCR devices. It benefits from real-time readout, providing calibration-free assessment. The method combines the advantages of qPCR and dPCR and bypasses their drawbacks. The protocols provide for small, simplified partitioning that can be fitted within the standard well-plate format. We demonstrate that, with the use of synergistic assay design, standard qPCR devices are capable of absolute quantitation when normal qPCR protocols fail to provide accurate estimates. We list practical recipes for designing assays with the required parameters and for analyzing signals to estimate concentration. PMID:28327545
Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.
2018-01-01
Background: Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods: We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results: We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions: Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
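The core of the comparison step is simple enough to sketch in a few lines of Python: run the parallel versions, take one as comparator, and flag any output whose relative difference exceeds the ±5% materiality threshold. The version names follow the paper; the numbers are invented stand-ins for spreadsheet outputs.

    # Outputs of the same projection from three parallel model implementations
    # (illustrative numbers standing in for spreadsheet results).
    versions = {
        "named_single_cells": {"in_care": 1030.0, "on_treatment": 640.0},
        "column_row_refs":    {"in_care": 1260.0, "on_treatment": 640.0},
        "named_matrices":     {"in_care": 1000.0, "on_treatment": 640.0},
    }
    THRESHOLD = 0.05  # +/-5% defines a material difference

    reference = versions["named_matrices"]  # any version can serve as the comparator
    for name, outputs in versions.items():
        for key, value in outputs.items():
            diff = (value - reference[key]) / reference[key]
            if abs(diff) > THRESHOLD:
                print(f"{name}:{key} differs by {diff:+.1%} -> inspect for unintentional errors")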
Infralimbic EphB2 Modulates Fear Extinction in Adolescent Rats
Cruz, Emmanuel; Soler-Cedeño, Omar; Negrón, Geovanny; Criado-Marrero, Marangelie; Chompré, Gladys
2015-01-01
Adolescent rats are prone to impaired fear extinction, suggesting that mechanistic differences in extinction could exist in adolescent and adult rats. Since the infralimbic cortex (IL) is critical for fear extinction, we used PCR array technology to identify gene expression changes in IL induced by fear extinction in adolescent rats. Interestingly, the ephrin type B receptor 2 (EphB2), a tyrosine kinase receptor associated with synaptic development, was downregulated in IL after fear extinction. Consistent with the PCR array results, EphB2 levels of mRNA and protein were reduced in IL after fear extinction compared with fear conditioning, suggesting that EphB2 signaling in IL regulates fear extinction memory in adolescents. Finally, reducing EphB2 synthesis in IL with shRNA accelerated fear extinction learning in adolescent rats, but not in adult rats. These findings identify EphB2 in IL as a key regulator of fear extinction during adolescence, perhaps due to the increase in synaptic remodeling occurring during this developmental phase. PMID:26354908
Primer3_masker: integrating masking of template sequence with primer design software.
Kõressaar, Triinu; Lepamets, Maarja; Kaplinski, Lauris; Raime, Kairi; Andreson, Reidar; Remm, Maido
2018-06-01
Designing PCR primers for amplifying regions of eukaryotic genomes is a complicated task because the genomes contain a large number of repeat sequences and other regions unsuitable for amplification by PCR. We have developed a novel k-mer based masking method that uses a statistical model to detect and mask failure-prone regions on the DNA template prior to primer design. We implemented the software as a standalone software primer3_masker and integrated it into the primer design program Primer3. The standalone version of primer3_masker is implemented in C. The source code is freely available at https://github.com/bioinfo-ut/primer3_masker/ (standalone version for Linux and macOS) and at https://github.com/primer3-org/primer3/ (integrated version). Primer3 web application that allows masking sequences of 196 animal and plant genomes is available at http://primer3.ut.ee/. maido.remm@ut.ee. Supplementary data are available at Bioinformatics online.
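A minimal sketch of the masking idea follows: template positions covered by k-mers that are over-represented in a genome-wide k-mer table are lowercased, and the primer design step can then be configured (lowercase masking) to avoid masked sequence. The k-mer length, cutoff, and toy counts are illustrative assumptions standing in for primer3_masker's statistical model.

    K = 11
    MAX_GENOME_COUNT = 10  # k-mers more frequent than this mark failure-prone template

    def mask_template(template, genome_kmer_counts, k=K, cutoff=MAX_GENOME_COUNT):
        """Lowercase positions covered by over-represented k-mers; primer design can
        then be set to reject primers overlapping lowercase (masked) sequence."""
        masked = list(template.upper())
        for i in range(len(template) - k + 1):
            if genome_kmer_counts.get(template[i:i + k].upper(), 0) > cutoff:
                for j in range(i, i + k):
                    masked[j] = masked[j].lower()
        return "".join(masked)

    counts = {"AAAAAAAAAAA": 50_000}  # a repeat-like k-mer seen genome-wide
    print(mask_template("GCGTACGATCAAAAAAAAAAATCGGATTAC", counts))
    # -> GCGTACGATCaaaaaaaaaaaTCGGATTAC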
Chana, Narinder; Porat, Talya; Whittlesea, Cate; Delaney, Brendan
2017-03-01
Electronic prescribing has benefited from computerised clinical decision support systems (CDSSs); however, no published studies have evaluated the potential for a CDSS to support GPs in prescribing specialist drugs. To identify potential weaknesses and errors in the existing process of prescribing specialist drugs that could be addressed in the development of a CDSS. Semi-structured interviews with key informants followed by an observational study involving GPs in the UK. Twelve key informants were interviewed to investigate the use of CDSSs in the UK. Nine GPs were observed while performing case scenarios depicting requests from hospitals or patients to prescribe a specialist drug. Activity diagrams, hierarchical task analysis, and systematic human error reduction and prediction approach analyses were performed. The current process of prescribing specialist drugs by GPs is prone to error. Errors of omission due to lack of information were the most common errors, which could potentially result in a GP prescribing a specialist drug that should only be prescribed in hospitals, or prescribing a specialist drug without reference to a shared care protocol. Half of all possible errors in the prescribing process had a high probability of occurrence. A CDSS supporting GPs during the process of prescribing specialist drugs is needed. This could, first, support the decision making of whether or not to undertake prescribing, and, second, provide drug-specific parameters linked to shared care protocols, which could reduce the errors identified and increase patient safety. © British Journal of General Practice 2017.
Target Uncertainty Mediates Sensorimotor Error Correction
Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M.
2017-01-01
Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323
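The loss-function account in the abstract can be illustrated numerically: with an inverted-Gaussian score and a quadratic effort cost, the correction that maximizes expected score minus effort is only partial, and the optimal fraction corrected shrinks as target uncertainty grows. All parameters below are illustrative, not fitted to the study's data.

    import numpy as np

    s = 1.0   # width of the (inverted-Gaussian) scoring function
    k = 0.02  # effort cost per unit of squared correction (illustrative)
    e = 2.0   # perceived error induced by the feedback shift

    def optimal_correction(sigma):
        """Correction maximizing expected score minus effort; the target
        uncertainty sigma is folded into the expected score analytically."""
        c = np.linspace(0.0, e, 2001)
        expected_score = (s / np.sqrt(s**2 + sigma**2)
                          * np.exp(-(e - c)**2 / (2 * (s**2 + sigma**2))))
        return c[np.argmax(expected_score - k * c**2)]

    for sigma in (0.1, 0.5, 1.0, 2.0):
        print(f"sigma={sigma:.1f} -> corrects {optimal_correction(sigma)/e:.0%} of the error")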
Modeling the Error of the Medtronic Paradigm Veo Enlite Glucose Sensor.
Biagi, Lyvia; Ramkissoon, Charrise M; Facchinetti, Andrea; Leal, Yenny; Vehi, Josep
2017-06-12
Continuous glucose monitors (CGMs) are prone to inaccuracy due to time lags, sensor drift, calibration errors, and measurement noise. The aim of this study is to derive the model of the error of the second-generation Medtronic Paradigm Veo Enlite (ENL) sensor and compare it with the Dexcom SEVEN PLUS (7P), G4 PLATINUM (G4P), and advanced G4 for Artificial Pancreas studies (G4AP) systems. An enhanced methodology based on a previously employed technique was utilized to dissect the sensor error into several components. The dataset used included 37 inpatient sessions in 10 subjects with type 1 diabetes (T1D), in which CGMs were worn in parallel and blood glucose (BG) samples were analyzed every 15 ± 5 min. Calibration error and sensor drift of the ENL sensor were best described by a linear relationship related to the gain and offset. The mean time lag estimated by the model is 9.4 ± 6.5 min. The overall average mean absolute relative difference (MARD) of the ENL sensor was 11.68 ± 5.07%. Calibration error had the highest contribution to total error in the ENL sensor. This was also reported for the 7P, G4P, and G4AP. The model of the ENL sensor error will be useful for testing the in silico performance of CGM-based applications, e.g., the artificial pancreas, employing this kind of sensor.
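A minimal sketch of the decomposition on synthetic data: align the CGM trace to a lagged reference, fit the linear gain/offset component (calibration error and drift), and compute MARD. The signal model and parameters are invented for illustration; this is not the paper's estimation procedure.

    import numpy as np

    rng = np.random.default_rng(2)

    t = np.arange(0, 600, 15)                      # minutes; reference BG every 15 min
    bg = 120 + 40 * np.sin(t / 80)                 # synthetic reference blood glucose (mg/dL)
    lag = 9                                        # assumed sensor time lag (min)
    bg_lagged = 120 + 40 * np.sin((t - lag) / 80)  # plasma signal the sensor actually tracks
    cgm = 1.08 * bg_lagged - 6 + rng.normal(0, 4, t.size)  # gain, offset, measurement noise

    # Fit the linear calibration-error component: CGM ~ gain * BG(t - lag) + offset.
    gain, offset = np.polyfit(bg_lagged, cgm, 1)

    # MARD against the reference at the same clock time (so the lag also contributes).
    mard = np.mean(np.abs(cgm - bg) / bg) * 100
    print(f"gain={gain:.3f}, offset={offset:.1f} mg/dL, MARD={mard:.2f}%")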
Fidelity of DNA Replication in Normal and Malignant Human Breast Cells
1998-07-01
The DNA synthesome has been extensively demonstrated to carry out full-length DNA replication in vitro and to accurately depict the DNA replication process as it occurs in the intact cell. By examining the fidelity of the DNA replication process carried out by the DNA synthesome from a number of breast cell types, we have demonstrated for the first time that the cellular DNA replication machinery of malignant human breast cells is significantly more error-prone than that of non-malignant human breast cells.
Addition to the Lewis Chemical Equilibrium Program to allow computation from coal composition data
NASA Technical Reports Server (NTRS)
Sevigny, R.
1980-01-01
Changes made to the Lewis Chemical Equilibrium Program for the Coal Gasification Project are reported. The program was originally developed to model equilibrium combustion in rocket engines; it can be applied directly to the entrained-flow coal gasification process. The particular problem addressed is the reduction of coal composition data into a form suitable for the program, since the manual process is involved and error-prone. A related problem, relating the normal output of the program to parameters meaningful to the coal gasification process, is also addressed.
Effects of Non-Normal Outlier-Prone Error Distribution on Kalman Filter Track
1991-09-01
[OCR fragments from the report body; no abstract was recovered. The legible portions state that the GST (Generic Statistical Tracker) uses four target-motion models [Ref. 4], including a second-order Gauss-Markov target and a random-tour target, and that a procedure "not easily statistically interpretable" was used for the sake of comparison; the remainder is table residue and FORTRAN menu code for selecting a target-motion model.]
Computer Aided Software Engineering (CASE) Environment Issues.
1987-06-01
[OCR fragments from the report body; no clean abstract was recovered. The legible portions read approximately: tasks that tend to be error-prone and slow when done by humans are excellent candidates for automation using a computer (MacLennan); CASE resources include human resources, consisting of the people who use, and in the case of manual resources facilitate utilization of, the environment; CASE resources should provide the software engineering team ...]
Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual
1988-12-01
The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIVD will reduce the amount of time spent on tedious and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.
The OPL Access Control Policy Language
NASA Astrophysics Data System (ADS)
Alm, Christopher; Wolf, Ruben; Posegga, Joachim
Existing policy languages suffer from a limited ability to directly and elegantly express high-level access control principles such as history-based separation of duty [22], binding of duty [26], context constraints [24], Chinese wall properties [10], and obligations [20]. It is often difficult to extend a language in order to retrofit these features once they are required, or it becomes necessary to use complicated and complex language constructs to express such concepts. The latter, however, is cumbersome and error-prone for humans dealing with policy administration.
A filtering method to generate high quality short reads using illumina paired-end technology.
Eren, A Murat; Vineis, Joseph H; Morrison, Hilary G; Sogin, Mitchell L
2013-01-01
Consensus between independent reads improves the accuracy of genome and transcriptome analyses; however, lack of consensus between very similar sequences in metagenomic studies can, and often does, represent natural variation of biological significance. Machine-assigned quality scores, as commonly used on next-generation platforms, do not necessarily correlate with accuracy. Here, we describe using the overlap of paired-end, short sequence reads to identify error-prone reads in marker gene analyses and their contribution to spurious OTUs following clustering analysis using QIIME. Our approach can also reduce error in shotgun sequencing data generated from libraries with small, tightly constrained insert sizes. The open-source implementation of this algorithm in the Python programming language, with user instructions, can be obtained from https://github.com/meren/illumina-utils.
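A minimal sketch of the overlap-consensus idea: reverse-complement the second read of a pair, require perfect agreement over the expected overlap, and reject disagreeing pairs as error-prone. The fixed overlap length and perfect-match rule are simplifications of the published method.

    def revcomp(seq):
        """Reverse-complement a DNA sequence."""
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def merge_pair(r1, r2, overlap_len):
        """Merge a read pair if the expected overlap matches perfectly; else reject.
        r2 is given in sequencing orientation and is reverse-complemented first."""
        r2 = revcomp(r2)
        if r1[-overlap_len:] != r2[:overlap_len]:
            return None                     # disagreement in the overlap: error-prone pair
        return r1 + r2[overlap_len:]

    r1 = "ACGTTGCACCGGT"
    r2 = revcomp("CCGGTAATGC")              # overlaps r1 by 5 bases ("CCGGT")
    print(merge_pair(r1, r2, 5))            # ACGTTGCACCGGTAATGC
    print(merge_pair(r1, "AAAAAAAAAA", 5))  # None: rejected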
Gamification of Clinical Routine: The Dr. Fill Approach.
Bukowski, Mark; Kühn, Martin; Zhao, Xiaoqing; Bettermann, Ralf; Jonas, Stephan
2016-01-01
Gamification is used in clinical contexts in health care education. Furthermore, it has shown great promise for improving the performance of health care staff in their daily routine. In this work we focus on the medication sorting task, which is performed manually in hospitals. This task is very error-prone and needs to be performed daily, and medication errors are critical and can lead to serious complications. We present a real-world gamification of the medication sorting task in a patient's daily pill organizer. The player of the game needs to sort the correct medication into the correct dispenser slots and is rewarded or punished in real time. At the end of the game, a score is given and the user can register in a leaderboard.
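A minimal sketch of the scoring loop implied by this description, with invented slot names, drugs, and point values: each organizer slot is checked against the prescribed layout and rewarded or penalized.

    # Prescribed layout of the daily pill organizer (illustrative).
    prescription = {
        "morning": {"metformin": 2, "lisinopril": 1},
        "noon":    {"metformin": 1},
        "evening": {"metformin": 2, "simvastatin": 1},
    }

    def score_placement(placements):
        """Return the game score: +10 per correctly filled slot, -5 per wrong slot."""
        score = 0
        for slot, expected in prescription.items():
            placed = placements.get(slot, {})
            if placed == expected:
                score += 10   # reward in real time
            else:
                score -= 5    # punish: wrong drug or wrong count in this slot
        return score

    player = {
        "morning": {"metformin": 2, "lisinopril": 1},
        "noon":    {"metformin": 2},                  # one pill too many
        "evening": {"metformin": 2, "simvastatin": 1},
    }
    print(score_placement(player))  # 10 - 5 + 10 = 15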
Predicted Errors In Children's Early Sentence Comprehension
Gertner, Yael; Fisher, Cynthia
2012-01-01
Children use syntax to interpret sentences and learn verbs; this is syntactic bootstrapping. The structure-mapping account of early syntactic bootstrapping proposes that a partial representation of sentence structure, the set of nouns occurring with the verb, guides initial interpretation and provides an abstract format for new learning. This account predicts early successes, but also telltale errors: Toddlers should be unable to tell transitive sentences from other sentences containing two nouns. In testing this prediction, we capitalized on evidence that 21-month-olds use what they have learned about noun order in English sentences to understand new transitive verbs. In two experiments, 21-month-olds applied this noun-order knowledge to two-noun intransitive sentences, mistakenly assigning different interpretations to “The boy and the girl are gorping!” and “The girl and the boy are gorping!”. This suggests that toddlers exploit partial representations of sentence structure to guide sentence interpretation; these sparse representations are useful, but error-prone. PMID:22525312
Metrics to quantify the importance of mixing state for CCN activity
Ching, Joseph; Fast, Jerome; West, Matthew; ...
2017-06-21
It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75%. For more externally mixed populations (χ below 20%) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40% to about 150%, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.
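For concreteness, a minimal implementation of the Riemer and West (2013) index: per-particle Shannon diversities are mass-averaged into D_alpha, the bulk diversity is D_gamma, and chi = (D_alpha - 1)/(D_gamma - 1). The two-species populations are illustrative.

    import numpy as np

    def mixing_state_index(masses):
        """Riemer & West (2013): chi = (D_alpha - 1) / (D_gamma - 1).
        masses: (n_particles, n_species) array of per-particle species masses."""
        m = np.asarray(masses, dtype=float)
        w = m.sum(axis=1) / m.sum()            # particle mass fractions
        p = m / m.sum(axis=1, keepdims=True)   # within-particle species fractions
        h_i = -np.where(p > 0, p * np.log(np.where(p > 0, p, 1.0)), 0.0).sum(axis=1)
        d_alpha = np.exp((w * h_i).sum())      # average per-particle diversity
        pb = m.sum(axis=0) / m.sum()           # bulk species fractions
        d_gamma = np.exp(-np.where(pb > 0, pb * np.log(np.where(pb > 0, pb, 1.0)), 0.0).sum())
        return (d_alpha - 1.0) / (d_gamma - 1.0)

    external = [[1.0, 0.0], [0.0, 1.0]]   # two single-species particles
    internal = [[0.5, 0.5], [0.5, 0.5]]   # identical internally mixed particles
    print(mixing_state_index(external), mixing_state_index(internal))  # 0.0 and 1.0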
Landmark-based elastic registration using approximating thin-plate splines.
Rohr, K; Stiehl, H S; Sprengel, R; Buzug, T M; Weese, J; Kuhn, M H
2001-06-01
We consider elastic image registration based on a set of corresponding anatomical point landmarks and approximating thin-plate splines. This approach is an extension of the original interpolating thin-plate spline approach and allows landmark localization errors to be taken into account. The extension is important for clinical applications since landmark extraction is always prone to error. Our approach is based on a minimizing functional and can cope with isotropic as well as anisotropic landmark errors. In particular, in the latter case it is possible to include different types of landmarks, e.g., unique point landmarks as well as arbitrary edge points. Also, the scheme is general with respect to the image dimension and the order of smoothness of the underlying functional. Optimal affine transformations as well as interpolating thin-plate splines are special cases of this scheme. To localize landmarks we use a semi-automatic approach based on three-dimensional (3-D) differential operators. Experimental results are presented for two-dimensional as well as 3-D tomographic images of the human brain.
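A minimal sketch of the approximating (rather than interpolating) behavior, using SciPy's thin-plate-spline RBF interpolator: its smoothing term plays the role of the regularization that absorbs isotropic landmark localization error. The anisotropic case treated in the paper requires a more general functional than this; the landmark data are synthetic.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(3)

    # Corresponding 2D anatomical landmarks in source and target images, with
    # simulated isotropic localization error on the target positions.
    src = rng.uniform(0, 100, size=(20, 2))
    true_shift = np.array([5.0, -3.0])
    dst = src + true_shift + rng.normal(0, 1.0, size=src.shape)

    # smoothing=0 would interpolate exactly (overfitting noisy landmarks);
    # smoothing>0 approximates, absorbing the localization error.
    warp = RBFInterpolator(src, dst, kernel="thin_plate_spline", smoothing=10.0)

    query = np.array([[50.0, 50.0]])
    print("mapped point:", warp(query)[0], "expected near:", query[0] + true_shift)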
The Sizing and Optimization Language, (SOL): Computer language for design problems
NASA Technical Reports Server (NTRS)
Lucas, Stephen H.; Scotti, Stephen J.
1988-01-01
The Sizing and Optimization Language (SOL), a new high-level, special-purpose computer language, was developed to expedite the application of numerical optimization to design problems and to make the process less error-prone. SOL utilizes the ADS optimization software and provides a clear, concise syntax for describing an optimization problem, the OPTIMIZE description, which closely parallels the mathematical description of the problem. SOL offers language statements which can be used to model a design mathematically, with subroutines or code logic, and with existing FORTRAN routines. In addition, SOL provides error checking and clear output of the optimization results. Because of these language features, SOL is best suited to model and optimize a design concept when the model consists of mathematical expressions written in SOL. For such cases, SOL's unique syntax and error checking can be fully utilized. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler, runtime library routines, and a SOL reference manual.
Lockhart, Joseph J; Satya-Murti, Saty
2017-11-01
Cognitive effort is an essential part of both forensic and clinical decision-making. Errors occur in both fields because the cognitive process is complex and prone to bias. We performed a selective review of full-text English language literature on cognitive bias leading to diagnostic and forensic errors. Earlier work (1970-2000) concentrated on classifying and raising bias awareness. Recently (2000-2016), the emphasis has shifted toward strategies for "debiasing." While the forensic sciences have focused on the control of misleading contextual cues, clinical debiasing efforts have relied on checklists and hypothetical scenarios. No single generally applicable and effective bias reduction strategy has emerged so far. Generalized attempts at bias elimination have not been particularly successful. It is time to shift focus to the study of errors within specific domains, and how to best communicate uncertainty in order to improve decision making on the part of both the expert and the trier-of-fact. © 2017 American Academy of Forensic Sciences.
Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros
2013-01-01
Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors. PMID:24688709
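A minimal sketch of the pre-filtering step: scan alignment columns for polymorphic sites and report those whose minor variant is rare enough (at or below a count threshold) to be a suspected mis-call, so only those chromatogram peaks need manual inspection. The threshold semantics are simplified relative to ChromatoGate.

    from collections import Counter

    def flag_suspect_sites(msa, max_minor_count=1):
        """Return (column, minor_base, sequence_indices) for polymorphic MSA columns
        whose minor variant is rare enough to be a suspected sequencing error."""
        flags = []
        for col in range(len(msa[0])):
            counts = Counter(seq[col] for seq in msa)
            if len(counts) > 1:
                base, count = counts.most_common()[-1]   # rarest variant in the column
                if count <= max_minor_count:
                    rows = [i for i, seq in enumerate(msa) if seq[col] == base]
                    flags.append((col, base, rows))
        return flags

    msa = ["ACGTACGT",
           "ACGTACGT",
           "ACCTACGT",   # possible mis-call at column 2
           "ACGTACGT"]
    print(flag_suspect_sites(msa))  # [(2, 'C', [2])]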
Intravenous Chemotherapy Compounding Errors in a Follow-Up Pan-Canadian Observational Study.
Gilbert, Rachel E; Kozak, Melissa C; Dobish, Roxanne B; Bourrier, Venetia C; Koke, Paul M; Kukreti, Vishal; Logan, Heather A; Easty, Anthony C; Trbovich, Patricia L
2018-05-01
Intravenous (IV) compounding safety has garnered recent attention as a result of high-profile incidents, awareness efforts from the safety community, and increasingly stringent practice standards. New research with more-sensitive error detection techniques continues to reinforce that error rates with manual IV compounding are unacceptably high. In 2014, our team published an observational study that described three types of previously unrecognized and potentially catastrophic latent chemotherapy preparation errors in Canadian oncology pharmacies that would otherwise be undetectable. We expand on this research and explore whether additional potential human failures are yet to be addressed by practice standards. Field observations were conducted in four cancer center pharmacies in four Canadian provinces from January 2013 to February 2015. Human factors specialists observed and interviewed pharmacy managers, oncology pharmacists, pharmacy technicians, and pharmacy assistants as they carried out their work. Emphasis was on latent errors (potential human failures) that could lead to outcomes such as wrong drug, dose, or diluent. Given the relatively short observational period, no active failures or actual errors were observed. However, 11 latent errors in chemotherapy compounding were identified. In terms of severity, all 11 errors create the potential for a patient to receive the wrong drug or dose, which in the context of cancer care, could lead to death or permanent loss of function. Three of the 11 practices were observed in our previous study, but eight were new. Applicable Canadian and international standards and guidelines do not explicitly address many of the potentially error-prone practices observed. We observed a significant degree of risk for error in manual mixing practice. These latent errors may exist in other regions where manual compounding of IV chemotherapy takes place. Continued efforts to advance standards, guidelines, technological innovation, and chemical quality testing are needed.
Bronchoalveolar lavage for diagnosis of tuberculosis infection in elephants.
Hermes, R; Saragusty, J; Moser, I; Holtze, S; Nieter, J; Sachse, K; Voracek, T; Bernhard, A; Bouts, T; Göritz, F; Hildebrandt, T B
2018-03-01
Tuberculosis (TB) has been known to affect elephants for thousands of years. It was put into the spotlight when a few circus elephants were diagnosed as carrying Mycobacterium (M.) tuberculosis. Because of the zoonotic risk and high susceptibility to M. tuberculosis, periodic testing has since been enacted in captive breeding programmes. Presently, trunk wash is the recommended diagnostic procedure for TB. Trunk wash, however, puts the operator at risk, has low sensitivity, and is prone to contamination. Here, bronchoalveolar lavage is described for the first time for TB diagnosis in elephants. Bronchial, trunk and mouth fluids from 14 elephants were investigated using bacterial culture, M. tuberculosis complex (MTC)-specific real-time quantitative PCR (qPCR) and mycobacterial genus-specific qPCR to test for the overall presence of mycobacteria or mycobacterial DNA, including bacteria or DNA of closely related genera. Neither bacteria of the MTC nor their DNA were identified in any of the elephants. Yet, 25% of the cultures grew non-tuberculous mycobacteria (NTM) or closely related bacterial species. Furthermore, 85% of the samples contained DNA of NTM or closely related bacterial genera. This finding might explain continued false-positive results from various serological tests. From a zoonotic point of view, bronchoalveolar lavage is safer for the testing personnel, has a higher probability of capturing MTC and, through PCR, identifies NTM DNA in elephants. Yet, the necessary endoscopic equipment, animal sedation and access to a TB reference laboratory might pose challenging requirements in remote conditions in some elephant range countries.
Digital Assays Part I: Partitioning Statistics and Digital PCR.
Basu, Amar S
2017-08-01
A digital assay is one in which the sample is partitioned into many small containers such that each partition contains a discrete number of biological entities (0, 1, 2, 3, …). A powerful technique in the biologist's toolkit, digital assays bring a new level of precision in quantifying nucleic acids, measuring proteins and their enzymatic activity, and probing single-cell genotypes and phenotypes. Part I of this review begins with the benefits and Poisson statistics of partitioning, including sources of error. The remainder focuses on digital PCR (dPCR) for quantification of nucleic acids. We discuss five commercial instruments that partition samples into physically isolated chambers (cdPCR) or droplet emulsions (ddPCR). We compare the strengths of dPCR (absolute quantitation, precision, and ability to detect rare or mutant targets) with those of its predecessor, quantitative real-time PCR (dynamic range, larger sample volumes, and throughput). Lastly, we describe several promising applications of dPCR, including copy number variation, quantitation of circulating tumor DNA and viral load, RNA/miRNA quantitation with reverse transcription dPCR, and library preparation for next-generation sequencing. This review is intended to give a broad perspective to scientists interested in adopting digital assays into their workflows. Part II focuses on digital protein and cell assays.
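The partitioning arithmetic in Part I reduces to one Poisson formula: the fraction of negative partitions estimates e^(-lambda), so lambda = -ln(negatives/total) copies per partition. A minimal sketch, using a droplet volume of about 0.85 nL (typical of droplet dPCR) as an illustrative value:

    import math

    def dpcr_concentration(n_total, n_positive, partition_volume_nl):
        """Estimate target concentration (copies/uL) from a digital PCR readout."""
        n_negative = n_total - n_positive
        if n_negative == 0:
            raise ValueError("all partitions positive: too concentrated to quantify")
        lam = -math.log(n_negative / n_total)      # mean copies per partition (Poisson)
        return lam / partition_volume_nl * 1000.0  # copies per microliter

    # e.g., 20,000 droplets of ~0.85 nL with 6,000 positives
    print(f"{dpcr_concentration(20000, 6000, 0.85):.0f} copies/uL")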
Quantitative Real-Time PCR using the Thermo Scientific Solaris qPCR Assay
Ogrean, Christy; Jackson, Ben; Covino, James
2010-01-01
The Solaris qPCR Gene Expression Assay is a novel type of primer/probe set, designed to simplify the qPCR process while maintaining the sensitivity and accuracy of the assay. These primer/probe sets are pre-designed to >98% of the human and mouse genomes and feature significant improvements from previously available technologies. These improvements were made possible by virtue of a novel design algorithm, developed by Thermo Scientific bioinformatics experts. Several convenient features have been incorporated into the Solaris qPCR Assay to streamline the process of performing quantitative real-time PCR. First, the protocol is similar to commonly employed alternatives, so the methods used during qPCR are likely to be familiar. Second, the master mix is blue, which makes setting the qPCR reactions easier to track. Third, the thermal cycling conditions are the same for all assays (genes), making it possible to run many samples at a time and reducing the potential for error. Finally, the probe and primer sequence information are provided, simplifying the publication process. Here, we demonstrate how to obtain the appropriate Solaris reagents using the GENEius product search feature found on the ordering web site (www.thermo.com/solaris) and how to use the Solaris reagents for performing qPCR using the standard curve method. PMID:20567213
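A minimal sketch of the standard curve method referenced above: fit Cq against log10 input quantity over a dilution series, derive the amplification efficiency from the slope, and interpolate unknowns. The Cq values are invented for illustration.

    import numpy as np

    # Dilution series: known input quantities (copies) and measured Cq values.
    quantities = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
    cq = np.array([15.1, 18.5, 21.8, 25.2, 28.6])

    slope, intercept = np.polyfit(np.log10(quantities), cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% (perfect doubling)

    def quantify(cq_unknown):
        """Interpolate an unknown sample's quantity from its Cq."""
        return 10 ** ((cq_unknown - intercept) / slope)

    print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
    print(f"unknown at Cq=23.0 -> {quantify(23.0):,.0f} copies")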
Sequetyping: Serotyping Streptococcus pneumoniae by a Single PCR Sequencing Strategy
Leung, Marcus H.; Bryson, Kevin; Freystatter, Kathrin; Pichon, Bruno; Edwards, Giles; Gillespie, Stephen H.
2012-01-01
The introduction of pneumococcal conjugate vaccines necessitates continued monitoring of circulating strains to assess vaccine efficacy and replacement serotypes. Conventional serological methods are costly, labor-intensive, and prone to misidentification, while current DNA-based methods have limited serotype coverage requiring multiple PCR primers. In this study, a computer algorithm was developed to interrogate the capsulation locus (cps) of vaccine serotypes to locate primer pairs in conserved regions that border variable regions and could differentiate between serotypes. In silico analysis of cps from 92 serotypes indicated that a primer pair spanning the regulatory gene cpsB could putatively amplify 84 serotypes and differentiate 46. This primer set was specific to Streptococcus pneumoniae, with no amplification observed for other species, including S. mitis, S. oralis, and S. pseudopneumoniae. One hundred thirty-eight pneumococcal strains covering 48 serotypes were tested. Of 23 vaccine serotypes included in the study, most (19/22, 86%) were identified correctly at least to the serogroup level, including all of the 13-valent conjugate vaccine and other replacement serotypes. Reproducibility was demonstrated by the correct sequetyping of different strains of a serotype. This novel sequence-based method employing a single PCR primer pair is cost-effective and simple. Furthermore, it has the potential to identify new serotypes that may evolve in the future. PMID:22553238
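A minimal sketch of the design principle described: find windows that are identical across aligned cps sequences and flank a variable region, so a single primer pair amplifies all serotypes while the intervening sequence differentiates them. The toy alignment and window length are illustrative, not the paper's algorithm.

    def conserved_windows(alignment, win=6):
        """Yield (start, sequence) for windows identical across all aligned sequences."""
        length = len(alignment[0])
        for i in range(length - win + 1):
            window = alignment[0][i:i + win]
            if "-" not in window and all(seq[i:i + win] == window for seq in alignment):
                yield i, window

    # Toy alignment: conserved flanks around a serotype-differentiating variable region.
    aln = ["ATGGCATTCCGA" + "ACGT" + "TTGACCGGAATT",
           "ATGGCATTCCGA" + "GGCC" + "TTGACCGGAATT",
           "ATGGCATTCCGA" + "CAAT" + "TTGACCGGAATT"]

    windows = list(conserved_windows(aln))
    forward = windows[0]    # first conserved window: forward primer site
    reverse = windows[-1]   # last conserved window: reverse primer site
    print("forward primer site:", forward, "reverse primer site:", reverse)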
Mao, Meng; Austin, Andrew D; Johnson, Norman F; Dowton, Mark
2014-03-01
Recombination has been proposed as a possible mechanism to explain mitochondrial (mt) gene rearrangements, although the issue of whether mtDNA recombination occurs in animals has been controversial. In this study, we sequenced the entire mt genome of the megaspilid wasp Conostigmus sp., which possessed a highly rearranged mt genome. The sequence of the A+T-rich region contained a number of different types of repeats, similar to those reported previously in the nematode Meloidogyne javanica, in which recombination was discovered. In Conostigmus, we detected the end products of recombination: a range of minicircles. However, using isolated (cloned) fragments of the A+T-rich region, we established that some of these minicircles were found to be polymerase chain reaction (PCR) artifacts. It appears that regions with repeats are prone to PCR template switching or PCR jumping. Nevertheless, there is strong evidence that one minicircle is real, as amplification primers that straddle the putative breakpoint junction produce a single strong amplicon from genomic DNA but not from the cloned A+T-rich region. The results provide support for the direct link between recombination and mt gene rearrangement. Furthermore, we developed a model of recombination which is important for our understanding of mtDNA evolution.
Time-symmetric integration in astrophysics
NASA Astrophysics Data System (ADS)
Hernandez, David M.; Bertschinger, Edmund
2018-04-01
Calculating the long-term solution of ordinary differential equations, such as those of the N-body problem, is central to understanding a wide range of dynamics in astrophysics, from galaxy formation to planetary chaos. Because generally no analytic solution exists to these equations, researchers rely on numerical methods that are prone to various errors. In an effort to mitigate these errors, powerful symplectic integrators have been employed. But symplectic integrators can be severely limited because they are not compatible with adaptive stepping and thus they have difficulty in accommodating changing time and length scales. A promising alternative is time-reversible integration, which can handle adaptive time-stepping, but the errors due to time-reversible integration in astrophysics are less understood. The goal of this work is to study analytically and numerically the errors caused by time-reversible integration, with and without adaptive stepping. We derive the modified differential equations of these integrators to perform the error analysis. As an example, we consider the trapezoidal rule, a reversible non-symplectic integrator, and show that it gives secular energy error increase for a pendulum problem and for a Hénon-Heiles orbit. We conclude that using reversible integration does not guarantee good energy conservation and that, when possible, use of symplectic integrators is favoured. We also show that time-symmetry and time-reversibility are properties that are distinct for an integrator.
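A minimal sketch of the kind of numerical experiment described: the pendulum integrated with the implicit (time-reversible, non-symplectic) trapezoidal rule, the implicit stage solved by fixed-point iteration, and the energy error tracked over a long run. Step size and iteration count are illustrative choices.

    import numpy as np

    def f(y):
        """Pendulum dynamics: y = (theta, p), with H = p^2/2 - cos(theta)."""
        theta, p = y
        return np.array([p, -np.sin(theta)])

    def trapezoidal_step(y, h, iters=8):
        """Implicit trapezoidal rule, y' = y + h/2 (f(y) + f(y')), via fixed point."""
        y_new = y + h * f(y)              # explicit Euler predictor
        for _ in range(iters):
            y_new = y + 0.5 * h * (f(y) + f(y_new))
        return y_new

    def energy(y):
        theta, p = y
        return 0.5 * p**2 - np.cos(theta)

    y = np.array([2.5, 0.0])              # large-amplitude pendulum
    h, n_steps = 0.1, 100_000
    e0 = energy(y)
    for step in range(1, n_steps + 1):
        y = trapezoidal_step(y, h)
        if step % 20_000 == 0:
            print(f"t={step * h:8.0f}  energy error={energy(y) - e0:+.3e}")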
Schipler, Agnes; Iliakis, George
2013-09-01
Although the DNA double-strand break (DSB) is defined as a rupture in the double-stranded DNA molecule that can occur without chemical modification in any of the constituent building blocks, it is recognized that this form is restricted to enzyme-induced DSBs. DSBs generated by physical or chemical agents can include at the break site a spectrum of base alterations (lesions). The nature and number of such chemical alterations define the complexity of the DSB and are considered putative determinants for repair pathway choice and the probability that errors will occur during this processing. As the pathways engaged in DSB processing show distinct and frequently inherent propensities for errors, pathway choice also defines the error levels cells opt to accept. Here, we present a classification of DSBs on the basis of increasing complexity and discuss how complexity may affect processing, as well as how it may cause lethal or carcinogenic processing errors. By critically analyzing the characteristics of DSB repair pathways, we suggest that all repair pathways can in principle remove lesions clustering at the DSB but are likely to fail when they encounter clusters of DSBs that cause a local form of chromothripsis. In the same framework, we also analyze the rationale of DSB repair pathway choice.
Evaluation of exome variants using the Ion Proton Platform to sequence error-prone regions.
Seo, Heewon; Park, Yoomi; Min, Byung Joo; Seo, Myung Eui; Kim, Ju Han
2017-01-01
The Ion Proton sequencer from Thermo Fisher accurately determines sequence variants from target regions with a rapid turnaround time at a low cost. However, misleading variant-calling errors can occur. We performed a systematic evaluation and manual curation of read-level alignments for the 675 ultrarare variants that were reported by the Ion Proton sequencer in 27 whole-exome sequencing datasets but are not present in either the 1000 Genomes Project or the Exome Aggregation Consortium. We classified positive variant calls into 393 highly likely false positives, 126 likely false positives, and 156 likely true positives, which comprised 58.2%, 18.7%, and 23.1% of the variants, respectively. We identified four distinct error patterns of variant calling that may be bioinformatically corrected using different strategies: simplicity region, SNV cluster, peripheral sequence read, and base inversion. Local de novo assembly successfully corrected 201 (38.7%) of the 519 highly likely or likely false positives. We also demonstrate that the two sequencing kits from Thermo Fisher (the Ion PI Sequencing 200 kit V3 and the Ion PI Hi-Q kit) exhibit different error profiles across different error types. A refined calling algorithm and an improved polymerase may further improve the performance of the Ion Proton sequencing platform.
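Of the four named error patterns, the "SNV cluster" is the most straightforward to screen for computationally: flag calls that sit in a dense local cluster. The sketch below illustrates that filter; the window size and count threshold are assumptions for illustration, not the study's criteria.

```python
def flag_snv_clusters(positions, window=10, min_count=3):
    """Flag variant positions that fall in a dense local cluster
    (>= min_count calls within `window` bp of each other)."""
    positions = sorted(positions)
    flagged = set()
    left = 0
    for right, pos in enumerate(positions):
        while pos - positions[left] > window:
            left += 1
        if right - left + 1 >= min_count:
            flagged.update(positions[left:right + 1])
    return flagged

calls = [101, 105, 108, 500, 912, 915]          # hypothetical call positions
print(sorted(flag_snv_clusters(calls)))         # -> [101, 105, 108]
```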
Monteiro, Sandra; Norman, Geoff; Sherbino, Jonathan
2018-06-01
There is general consensus that clinical reasoning involves 2 stages: a rapid stage where 1 or more diagnostic hypotheses are advanced and a slower stage where these hypotheses are tested or confirmed. The rapid hypothesis generation stage is considered inaccessible for analysis or observation. Consequently, recent research on clinical reasoning has focused specifically on improving the accuracy of the slower, hypothesis confirmation stage. Three perspectives have developed in this line of research, and each proposes different error reduction strategies for clinical reasoning. This paper considers these 3 perspectives and examines their underlying assumptions. Additionally, this paper reviews the evidence, or lack thereof, behind each class of error reduction strategies. The first perspective takes an epidemiological stance, appealing to the benefits of incorporating population data and evidence-based medicine in everyday clinical reasoning. The second builds on the heuristics and biases research programme, appealing to a special class of dual process reasoning models that posits a rapid, error-prone cognitive process for problem solving alongside a slower, more logical cognitive process capable of correcting those errors. Finally, the third perspective borrows from an exemplar model of categorization that explicitly relates clinical knowledge and experience to diagnostic accuracy. © 2018 John Wiley & Sons, Ltd.
Impact of Standardized Communication Techniques on Errors during Simulated Neonatal Resuscitation.
Yamada, Nicole K; Fuerch, Janene H; Halamek, Louis P
2016-03-01
Current patterns of communication in high-risk clinical situations, such as resuscitation, are imprecise and prone to error. We hypothesized that the use of standardized communication techniques would decrease the errors committed by resuscitation teams during neonatal resuscitation. In a prospective, single-blinded, matched pairs design with block randomization, 13 subjects performed as a lead resuscitator in two simulated complex neonatal resuscitations. Two nurses assisted each subject during the simulated resuscitation scenarios. In one scenario, the nurses used nonstandard communication; in the other, they used standardized communication techniques. The performance of the subjects was scored to determine errors committed (defined relative to the Neonatal Resuscitation Program algorithm), time to initiation of positive pressure ventilation (PPV), and time to initiation of chest compressions (CC). In scenarios in which subjects were exposed to standardized communication techniques, there was a trend toward decreased error rate, time to initiation of PPV, and time to initiation of CC. While not statistically significant, there was a 1.7-second improvement in time to initiation of PPV and a 7.9-second improvement in time to initiation of CC. Should these improvements in human performance be replicated in the care of real newborn infants, they could improve patient outcomes and enhance patient safety.
qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch; Gallati, Sabina, E-mail: sabina.gallati@insel.ch; Schaller, Andre, E-mail: andre.schaller@insel.ch
2012-07-06
Highlights: Serial qPCR accurately determines the fragmentation state of any given DNA sample. Serial qPCR demonstrates the different preservation of the nuclear and mitochondrial genomes. Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus, handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure valid abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanisms of mitochondrial disorders presenting with abnormal mtDNA content to be defined more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue from the same sample. By extrapolation of measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA), we present an approach that may allow measurements in degraded samples to be corrected in the future. To our knowledge, this is the first demonstration of the different degradation behavior of the two genomes and the first systematic evaluation of the impact of DNA degradation on quantification of mtDNA copy number.
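Under a simple exponential survival model, the decay constant can be read off a serial-amplicon qPCR series as the slope of log quantity versus amplicon length, and extrapolating both genomes back to length zero removes the degradation bias in the mtDNA/nDNA ratio. The single-exponential model and all numbers below are assumptions for illustration, not the study's measurements.

```python
import numpy as np

def decay_constant(lengths_bp, quantities):
    """Fit N(L) = N0 * exp(-lambda * L): lambda is minus the slope of
    log(quantity) against amplicon length."""
    return -np.polyfit(lengths_bp, np.log(quantities), 1)[0]

# Hypothetical serial amplicons for a nuclear and a mitochondrial target,
# with the nuclear genome decaying faster (as the study reports).
L = np.array([80.0, 200.0, 400.0, 700.0])
n_dna = 1e5 * np.exp(-0.004 * L)
mt_dna = 8e6 * np.exp(-0.001 * L)

lam_n, lam_mt = decay_constant(L, n_dna), decay_constant(L, mt_dna)
raw = mt_dna[0] / n_dna[0]                                  # at the 80 bp amplicon
corrected = (mt_dna[0] * np.exp(lam_mt * L[0])) / (n_dna[0] * np.exp(lam_n * L[0]))
print(f"lambda_n = {lam_n:.4f}/bp, lambda_mt = {lam_mt:.4f}/bp")
print(f"mtDNA/nDNA ratio: raw {raw:.1f}, degradation-corrected {corrected:.1f}")
```

In this toy setup the raw ratio overestimates the true copy-number ratio because the nuclear reference decays faster, which is exactly the bias the serial assay is designed to expose.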
Sinner, Jim; Ellis, Joanne; Kandlikar, Milind; Halpern, Benjamin S.; Satterfield, Terre; Chan, Kai
2017-01-01
The elicitation of expert judgment is an important tool for assessment of risks and impacts in environmental management contexts, and is especially important as decision-makers face novel challenges where prior empirical research is lacking or insufficient. Evidence-driven elicitation approaches typically involve techniques to derive more accurate probability distributions under fairly specific contexts. Experts are, however, prone to overconfidence in their judgements. Group elicitations with diverse experts can reduce expert overconfidence by allowing cross-examination and reassessment of prior judgements, but groups are also prone to uncritical “groupthink” errors. When the problem context is underspecified, the probability that experts commit groupthink errors may increase. This study addresses, in a New Zealand case study, how structured workshops affect variability among experts and the certainty of individual responses. We find that experts’ risk estimates before and after a workshop differ, and that group elicitations provided greater consistency of estimates, yet also greater uncertainty among experts, when addressing prominent impacts to four different ecosystem services in coastal New Zealand. After group workshops, experts provided more consistent rankings of risks and more consistent best estimates of impact, through increased clarity in terminology and dampening of extreme positions, yet probability distributions for impacts widened. The results from this case study suggest that group elicitations have favorable consequences for the quality and uncertainty of risk judgments within and across experts, making group elicitation techniques invaluable tools in contexts of limited data. PMID:28767694
The p21 and PCNA partnership: a new twist for an old plot.
Prives, Carol; Gottifredi, Vanesa
2008-12-15
The contribution of error-prone DNA polymerases to the DNA damage response has been a subject of great interest in the last decade. Error-prone polymerases are required for translesion DNA synthesis (TLS), a process that involves synthesis past a DNA lesion. Under certain circumstances, TLS polymerases can achieve bypass with good efficiency and fidelity. However, they can also in some cases be mutagenic, and so negative regulators of TLS polymerases would have the important function of inhibiting their recruitment to undamaged DNA templates. Recent work from Livneh's group and our own has provided evidence regarding the role of the cyclin kinase inhibitor p21 as a negative regulator of TLS. Interestingly, both the cyclin-dependent kinase (CDK) and proliferating cell nuclear antigen (PCNA) binding domains of p21 are involved in different aspects of the modulation of TLS, affecting both the interaction between PCNA and the TLS-specific pol eta and the PCNA ubiquitination status. In line with this, p21 was shown to reduce the efficiency but increase the accuracy of TLS. Hence, in the absence of DNA damage, p21 may work to impede accidental loading of pol eta onto undamaged DNA and avoid consequential mutagenesis. After UV irradiation, when TLS plays a decisive role, p21 is progressively degraded. This might allow gradual release of replication fork blockage by TLS polymerases. For these reasons, in higher eukaryotes p21 might represent a key regulator of the equilibrium between mutagenesis and cell survival.
Java Performance for Scientific Applications on LLNL Computer Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kapfer, C; Wissink, A
2002-05-10
Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance, it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.
Choi, Jung-Suk; Dasari, Anvesh; Hu, Peter; Benkovic, Stephen J; Berdis, Anthony J
2016-02-18
This report evaluates the pro-mutagenic behavior of 8-oxo-guanine (8-oxo-G) by quantifying the ability of high-fidelity and specialized DNA polymerases to incorporate natural and modified nucleotides opposite this lesion. Although high-fidelity DNA polymerases such as pol δ and the bacteriophage T4 DNA polymerase replicate 8-oxo-G in an error-prone manner, they display remarkably low efficiencies for translesion DNA synthesis (TLS) compared to normal DNA synthesis. In contrast, pol η shows a combination of high efficiency and low fidelity when replicating 8-oxo-G. These combined properties are consistent with a pro-mutagenic role for pol η when replicating this DNA lesion. Studies using modified nucleotide analogs show that pol η relies heavily on hydrogen-bonding interactions during TLS. However, nucleobase modifications such as alkylation of the N2 position of guanine significantly increase error-prone synthesis catalyzed by pol η when replicating 8-oxo-G. Molecular modeling studies demonstrate the existence of a hydrophobic pocket in pol η that participates in the increased utilization of certain hydrophobic nucleotides. A model is proposed for enhanced pro-mutagenic replication catalyzed by pol η that couples efficient incorporation of damaged nucleotides opposite oxidized DNA lesions created by reactive oxygen species. The biological implications of this model toward increasing mutagenic events in lung cancer are discussed. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data
Larralde, Martin; Lawson, Thomas N.; Weber, Ralf J. M.; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R.; Steinbeck, Christoph; Salek, Reza M.
2017-01-01
Summary: Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. Availability and implementation: mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact: reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28402395
Rice, Stephen; McCarley, Jason S
2011-12-01
Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in operators' cognitive responses to different forms of automation error. The present experiments therefore examined the effects of automation false alarms and misses on human performance under conditions in which the different forms of error were matched in their perceptual characteristics. Young adult participants performed a simulated baggage x-ray screening task while assisted by an automated diagnostic aid. Judgments from the aid were rendered as text messages presented at the onset of each trial, and every trial was followed by a second text message providing response feedback. Thus, misses and false alarms from the aid were matched for their perceptual salience. Experiment 1 found that even under these conditions, false alarms from the aid produced poorer human performance and engendered lower automation use than misses from the aid. Experiment 2, however, found that the asymmetry between misses and false alarms was reduced when the aid's false alarms were framed as neutral messages rather than explicit misjudgments. Results suggest that automation false alarms and misses differ in their inherent cognitive salience and imply that changes in diagnosis framing may allow designers to encourage better use of imperfectly reliable automated aids.
PSO4: a novel gene involved in error-prone repair in Saccharomyces cerevisiae.
Henriques, J A; Vicente, E J; Leandro da Silva, K V; Schenberg, A C
1989-09-01
The haploid xs9 mutant, originally selected on the basis of a slight sensitivity to the lethal effect of X-rays, was found to be extremely sensitive to inactivation by 8-methoxypsoralen (8MOP) photoaddition, especially when cells are treated in the G2 phase of the cell cycle. As the xs9 mutation showed no allelism with any of the 3 known pso mutations, it was given the name pso4-1. Regarding inactivation, the pso4-1 mutant is also sensitive to mono- (HN1) and bi-functional (HN2) nitrogen mustards, is slightly sensitive to 254 nm UV radiation (UV), and shows nearly normal sensitivity to 3-carbethoxypsoralen (3-CPs) photoaddition and methyl methanesulfonate (MMS). Regarding mutagenesis, the pso4-1 mutation completely blocks reverse and forward mutations induced by either 8MOP or 3-CPs photoaddition, or by gamma-rays. In the cases of UV, HN1, HN2 or MMS treatments, while reversion induction is still completely abolished, forward mutagenesis is only partially inhibited for UV, HN1, or MMS, and is unaffected for HN2. Besides severely inhibiting induced mutagenesis, the pso4-1 mutation was found to be semi-dominant, to block sporulation, to abolish the diploid resistance effect, and to block induced mitotic recombination, which indicates that the PSO4 gene is involved in a recombinational pathway of error-prone repair, comparable to the E. coli SOS repair pathway.
Error Recovery in the Time-Triggered Paradigm with FTT-CAN.
Marques, Luis; Vasconcelos, Verónica; Pedreiras, Paulo; Almeida, Luís
2018-01-11
Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots.
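The server dimensioning at the heart of such a method reduces to a simple probabilistic calculation: with fault arrivals modeled as a Poisson process, pick the smallest retransmission budget whose probability of being exceeded within one recovery window stays below the reliability target. The sketch below illustrates that calculation; the fault rate, window length, and target are hypothetical numbers, not those of the article.

```python
import math

def retransmission_budget(fault_rate_hz, window_s, reliability):
    """Smallest number of retransmission slots k such that the probability
    of more than k faults in one recovery window (Poisson arrivals with
    mean lam = rate * window) does not exceed 1 - reliability."""
    lam = fault_rate_hz * window_s
    cdf, k = 0.0, 0
    while True:
        cdf += math.exp(-lam) * lam**k / math.factorial(k)
        if 1.0 - cdf <= 1.0 - reliability:
            return k
        k += 1

# Hypothetical numbers: 30 corrupted frames per hour, 100 ms recovery
# window, per-window reliability target of 1 - 1e-9.
print(retransmission_budget(30 / 3600, 0.1, 1 - 1e-9))  # -> 2 slots
```

Because the per-window fault expectation is tiny, a budget of a couple of slots already meets a very strict target, which is consistent with the article's point that regulated retransmission needs only a fraction of the bandwidth of statically preallocated slots.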
The accuracy of self-reported pregnancy-related weight: a systematic review.
Headen, I; Cohen, A K; Mujahid, M; Abrams, B
2017-03-01
Self-reported maternal weight is error-prone, and the context of pregnancy may impact error distributions. This systematic review summarizes error in self-reported weight across pregnancy and assesses implications for bias in associations between pregnancy-related weight and birth outcomes. We searched PubMed and Google Scholar through November 2015 for peer-reviewed articles reporting the accuracy of self-reported, pregnancy-related weight at four time points: prepregnancy, delivery, over gestation and postpartum. Included studies compared maternal self-report to anthropometric measurement or medical report of weights. Sixty-two studies met inclusion criteria. We extracted data on magnitude of error and misclassification. We assessed the impact of reporting error on bias in associations between pregnancy-related weight and birth outcomes. Women underreported prepregnancy (PPW: -2.94 to -0.29 kg) and delivery weight (DW: -1.28 to 0.07 kg), and over-reported gestational weight gain (GWG: 0.33 to 3 kg). The magnitude of error was small, ranged widely, and varied by prepregnancy weight class and race/ethnicity. Misclassification was moderate (PPW: 0-48.3%; DW: 39.0-49.0%; GWG: 16.7-59.1%) and inflated some estimates of population prevalence. However, reporting error did not largely bias associations between pregnancy-related weight and birth outcomes. Although measured weight is preferable, self-report is a cost-effective and practical measurement approach. Future researchers should develop bias correction techniques for self-reported pregnancy-related weight. © 2017 World Obesity Federation.
Correcting for Measurement Error in Time-Varying Covariates in Marginal Structural Models.
Kyle, Ryan P; Moodie, Erica E M; Klein, Marina B; Abrahamowicz, Michał
2016-08-01
Unbiased estimation of causal parameters from marginal structural models (MSMs) requires a fundamental assumption of no unmeasured confounding. Unfortunately, the time-varying covariates used to obtain inverse probability weights are often error-prone. Although substantial measurement error in important confounders is known to undermine control of confounders in conventional unweighted regression models, this issue has received comparatively limited attention in the MSM literature. Here we propose a novel application of the simulation-extrapolation (SIMEX) procedure to address measurement error in time-varying covariates, and we compare 2 approaches. The direct approach to SIMEX-based correction targets outcome model parameters, while the indirect approach corrects the weights estimated using the exposure model. We assess the performance of the proposed methods in simulations under different clinically plausible assumptions. The simulations demonstrate that measurement errors in time-dependent covariates may induce substantial bias in MSM estimators of causal effects of time-varying exposures, and that both proposed SIMEX approaches yield practically unbiased estimates in scenarios featuring low-to-moderate degrees of error. We illustrate the proposed approach in a simple analysis of the relationship between sustained virological response and liver fibrosis progression among persons infected with hepatitis C virus, while accounting for measurement error in γ-glutamyltransferase, using data collected in the Canadian Co-infection Cohort Study from 2003 to 2014. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
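The core SIMEX idea is easy to see in a toy regression: deliberately add extra measurement error at several multiples of the known error variance, watch the estimate attenuate, and extrapolate the trend back to the no-error point (lambda = -1). The sketch below illustrates the direct approach on a simple linear model; the data, error variance, and quadratic extrapolant are assumptions of this illustration, not the authors' marginal structural model analysis.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy setup: true covariate X, outcome Y = 2*X + noise; we only observe
# W = X + U, with measurement-error variance tau2 assumed known.
n, tau2 = 5000, 0.5
X = rng.normal(size=n)
Y = 2.0 * X + rng.normal(size=n)
W = X + rng.normal(scale=np.sqrt(tau2), size=n)

def slope(w, y):
    return np.polyfit(w, y, 1)[0]   # highest-degree coefficient = slope

# SIMEX: add extra error at levels lambda, average the attenuated
# estimates, then extrapolate back to lambda = -1 (no measurement error).
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = [np.mean([slope(W + rng.normal(scale=np.sqrt(lam * tau2), size=n), Y)
                for _ in range(50)])
       for lam in lambdas]

coef = np.polyfit(lambdas, est, 2)              # quadratic extrapolant
print("naive:", round(est[0], 3), " SIMEX:", round(np.polyval(coef, -1.0), 3))
```

The naive slope is attenuated toward zero by the measurement error, while the extrapolated SIMEX estimate lands close to the true coefficient of 2, mirroring the "practically unbiased" behavior the simulations report for low-to-moderate error.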
Multiple description distributed image coding with side information for mobile wireless transmission
NASA Astrophysics Data System (ADS)
Wu, Min; Song, Daewon; Chen, Chang Wen
2005-03-01
Multiple description coding (MDC) is a source coding technique that involves coding the source information into multiple descriptions and then transmitting them over different channels in a packet network or an error-prone wireless environment, to achieve graceful degradation if parts of the descriptions are lost at the receiver. In this paper, we propose a multiple description distributed wavelet zero-tree image coding system for mobile wireless transmission. We provide two innovations to achieve an excellent error-resilient capability. First, when MDC is applied to wavelet subband based image coding, it is possible to introduce correlation between the descriptions in each subband. We consider using such correlation, as well as a potentially error-corrupted description, as side information in the decoding, which formulates MDC decoding as a Wyner-Ziv decoding problem. If only part of the descriptions is lost, their correlation information is still available, and the proposed Wyner-Ziv decoder can recover the lost description by using the correlation information and the error-corrupted description as side information. Secondly, within each description, single-bitstream wavelet zero-tree coding is very vulnerable to channel errors: the first bit error may cause the decoder to discard all subsequent bits, whether or not they are correctly received. Therefore, we integrate multiple description scalar quantization (MDSQ) with a multiple wavelet-tree image coding method to reduce error propagation. We first group wavelet coefficients into multiple trees according to the parent-child relationship and then code them separately by the SPIHT algorithm to form multiple bitstreams. Such decomposition reduces error propagation and therefore improves the error-correcting capability of the Wyner-Ziv decoder. Experimental results show that the proposed scheme not only exhibits an excellent error-resilient performance but also demonstrates graceful degradation over the packet loss rate.
Gruet, Antoine; Dosnon, Marion; Vassena, Andrea; Lombard, Vincent; Gerlier, Denis; Bignon, Christophe; Longhi, Sonia
2013-09-23
To gain insights into the molecular determinants of the binding efficiency of intrinsically disordered proteins (IDPs), we used random mutagenesis. As a proof of concept, we chose the interaction between the intrinsically disordered C-terminal domain of the measles virus nucleoprotein (NTAIL) and the X domain (XD) of the viral phosphoprotein and assessed how amino acid substitutions introduced at random within NTAIL affect partner recognition. In contrast with directed evolution approaches, we did not apply any selection and used the gene library approach not for production purposes but to achieve a better understanding of the NTAIL/XD interaction. For that reason, and to differentiate our approach from similar approaches that make use of systematic (i.e., targeted) mutagenesis, we propose to call it "descriptive random mutagenesis" (DRM). NTAIL variants generated by error-prone PCR were picked at random in the absence of selection pressure and were characterized in terms of sequence and binding abilities toward XD. DRM not only identified determinants of the NTAIL/XD interaction that were in good agreement with previous work but also provided new insights. In particular, we discovered that the primary interaction site is poorly evolvable in terms of binding abilities toward XD. We also identified a critical NTAIL residue whose role in stabilizing the NTAIL/XD complex had previously escaped detection, and we identified NTAIL regulatory sites that dampen the interaction while being located outside the primary interaction site. The results show that DRM is a valuable approach for studying the binding abilities of IDPs. © 2013 Elsevier Ltd. All rights reserved.
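For intuition, the sketch below simulates picking unselected clones from a random mutant library, as in the DRM procedure. In the real protocol mutations arise at the nucleotide level during error-prone PCR; for brevity this sketch substitutes amino acids directly, and the sequence, mutation rate, and library size are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
AA = "ACDEFGHIKLMNPQRSTVWY"

def error_prone_library(seq, n_variants, mean_subs=2.0):
    """Simulate a DRM-style library: each clone carries a Poisson number of
    random substitutions, and no selection is applied afterwards."""
    library = []
    for _ in range(n_variants):
        variant = list(seq)
        k = min(len(seq), rng.poisson(mean_subs))
        for pos in rng.choice(len(seq), size=k, replace=False):
            choices = [a for a in AA if a != variant[pos]]
            variant[pos] = choices[rng.integers(len(choices))]
        library.append("".join(variant))
    return library

# Hypothetical 30-residue stretch standing in for NTAIL; clones picked at random.
wt = "MAEEQARHVKNGLECIRALKAEPIGSLAIE"
for v in error_prone_library(wt, n_variants=5):
    print(sum(a != b for a, b in zip(wt, v)), v)
```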
Ultrasensitive Genotypic Detection of Antiviral Resistance in Hepatitis B Virus Clinical Isolates▿ †
Fang, Jie; Wichroski, Michael J.; Levine, Steven M.; Baldick, Carl J.; Mazzucco, Charles E.; Walsh, Ann W.; Kienzle, Bernadette K.; Rose, Ronald E.; Pokornowski, Kevin A.; Colonno, Richard J.; Tenney, Daniel J.
2009-01-01
Amino acid substitutions that confer reduced susceptibility to antivirals arise spontaneously through error-prone viral polymerases and are selected as a result of antiviral therapy. Resistance substitutions first emerge in a fraction of the circulating virus population, below the limit of detection by nucleotide sequencing of either the population or limited sets of cloned isolates. These variants can expand under drug pressure to dominate the circulating virus population. To enhance detection of these viruses in clinical samples, we established a highly sensitive quantitative, real-time allele-specific PCR assay for hepatitis B virus (HBV) DNA. Sensitivity was accomplished using a high-fidelity DNA polymerase and oligonucleotide primers containing locked nucleic acid bases. Quantitative measurement of resistant and wild-type variants was accomplished using sequence-matched standards. Detection methodology that was not reliant on hybridization probes, and assay modifications, minimized the effect of patient-specific sequence polymorphisms. The method was validated using samples from patients chronically infected with HBV through parallel sequencing of large numbers of cloned isolates. Viruses with resistance to lamivudine and other l-nucleoside analogs and entecavir, involving 17 different nucleotide substitutions, were reliably detected at levels at or below 0.1% of the total population. The method worked across HBV genotypes. Longitudinal analysis of patient samples showed earlier emergence of resistance on therapy than was seen with sequencing methodologies, including some cases of resistance that existed prior to treatment. In summary, we established and validated an ultrasensitive method for measuring resistant HBV variants in clinical specimens, which enabled earlier, quantitative measurement of resistance to therapy. PMID:19433559
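The quantitative readout of such an assay comes down to comparing amplification cycles between the allele-specific and total-virus reactions. As a simplified illustration (the study quantified against sequence-matched standards rather than a bare delta-Ct, and the efficiency value here is an assumption), the fraction can be estimated as follows:

```python
def resistant_fraction(ct_mut, ct_total, eff=2.0):
    """Fraction of resistant template estimated from allele-specific qPCR:
    with equal amplification efficiency in both reactions, the fraction is
    eff ** -(ct_mut - ct_total)."""
    return eff ** -(ct_mut - ct_total)

# Hypothetical Cts: the mutant-specific reaction crosses threshold 10 cycles
# after the total-virus reaction -> ~0.1% resistant at eff = 2 (doubling),
# matching the order of the 0.1% detection level reported in the abstract.
print(f"{resistant_fraction(33.0, 23.0):.4%}")
```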
Nilsson, Ola B; Adedoyin, Justus; Rhyner, Claudio; Neimert-Andersson, Theresa; Grundström, Jeanette; Berndt, Kurt D; Crameri, Reto; Grönlund, Hans
2011-01-01
Allergy and asthma to cat (Felis domesticus) affects about 10% of the population in affluent countries. Immediate allergic symptoms are primarily mediated via IgE antibodies binding to B cell epitopes, whereas late phase inflammatory reactions are mediated via activated T cell recognition of allergen-specific T cell epitopes. Allergen-specific immunotherapy relieves symptoms and is the only treatment inducing long-lasting protection by induction of protective immune responses. The aim of this study was to produce an allergy vaccine designed with the combined features of attenuated T cell activation, reduced anaphylactic properties, retained molecular integrity and induction of efficient IgE blocking IgG antibodies for safer and more efficacious treatment of patients with allergy and asthma to cat. The template gene coding for rFel d 1 was used to introduce random mutations, which were subsequently expressed in large phage libraries. Despite mutations accumulated by up to 7 rounds of iterative error-prone PCR and biopanning, surface topology and structure were essentially maintained, using IgE antibodies from cat-allergic patients for phage enrichment. Four candidates were isolated, displaying similar or lower IgE binding, reduced anaphylactic activity as measured by their capacity to induce basophil degranulation and, importantly, a significantly lower T cell reactivity in lymphoproliferative assays compared to the original rFel d 1. In addition, all mutants showed the ability to induce blocking antibodies in immunized mice. The approach presented here provides a straightforward procedure to generate a novel type of allergy vaccine for safer and more efficacious treatment of allergic patients.
A mediator-adapted diaphorase variant for a glucose dehydrogenase-diaphorase biocatalytic system.
Sugiyama, Taiki; Goto, Yoshio; Matsumoto, Ryuhei; Sakai, Hideki; Tokita, Yuichi; Hatazawa, Tsuyonobu
2010-10-15
The biofuel cell is a next-generation energy conversion device that enables the use of safer, higher-energy-density fuels such as glucose. We have been developing a biofuel cell that comprises three enzymes: glucose dehydrogenase (GDH) and diaphorase (DI) on the anode, and bilirubin oxidase (BOD) on the cathode. In this work, we developed a DI variant suitable for our biofuel cell by using a directed molecular evolution method. A gene library of DI variants was constructed by using error-prone PCR, and the variant proteins were expressed in an Escherichia coli system. 8000 isolated variants were screened for activity against 2-amino-1,4-naphthoquinone (ANQ); 10 of them qualified and were then purified and assayed for activity against ANQ. The highest activity was observed in the G122D variant, in which the glycine residue at position 122 is substituted with aspartate. Enzymatic kinetic analyses show that the KM for ANQ in G122D is 1/3 of that in the wild type (G122D: 356 μM, wild type: 1.08 mM), whereas kcat and the KM for NADH are almost the same, clearly showing that the G122D mutation gives DI improved enzymatic activity at lower ANQ concentrations. The effect of this mutation was examined electrochemically in solution and in an immobilized layer. The results show that the G122D variant DI gave a higher current at lower ANQ concentration in solution, as well as under immobilized conditions where GDH is co-immobilized within the layer. Copyright © 2010 Elsevier B.V. All rights reserved.
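The reported KM shift implies that most of the gain appears at low mediator concentration, which a quick Michaelis-Menten comparison makes concrete. The kcat below is a placeholder (the abstract reports it as essentially unchanged but gives no value), and the rate law assumes simple saturation kinetics:

```python
def mm_rate(kcat, km, s):
    """Michaelis-Menten rate per enzyme: v = kcat * [S] / (KM + [S])."""
    return kcat * s / (km + s)

# KM values for ANQ from the abstract; kcat is a common placeholder since it
# is reported as essentially unchanged between the two enzymes.
km_wt, km_g122d, kcat = 1080e-6, 356e-6, 1.0   # molar units, arbitrary kcat
for s in (50e-6, 200e-6, 1000e-6):
    ratio = mm_rate(kcat, km_g122d, s) / mm_rate(kcat, km_wt, s)
    print(f"[ANQ] = {s*1e6:6.0f} uM   G122D/WT rate ratio = {ratio:.2f}")
```

At 50 µM the variant is nearly threefold faster, while the advantage shrinks toward unity as ANQ approaches saturation, consistent with the observation that G122D helps specifically at lower mediator concentrations.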
Zhan, Tao; Zhang, Kai; Chen, Yangyan; Lin, Yongjun; Wu, Gaobing; Zhang, Lili; Yao, Pei; Shao, Zongze; Liu, Ziduo
2013-01-01
Glyphosate, a broad-spectrum herbicide widely used in agriculture all over the world, inhibits 5-enolpyruvylshikimate-3-phosphate synthase in the shikimate pathway, and glycine oxidase (GO) has been reported to be able to catalyze the oxidative deamination of various amines and cleave the C-N bond in glyphosate. Here, in an effort to improve the catalytic activity of the glycine oxidase cloned from a glyphosate-degrading marine strain of Bacillus cereus (BceGO), we used a bacteriophage T7 lysis-based method for high-throughput screening of oxidase activity and engineered the gene encoding BceGO by directed evolution. Six mutants exhibiting enhanced activity toward glyphosate were screened from two rounds of error-prone PCR combined with site-directed mutagenesis, and the beneficial mutations of the six evolved variants were recombined by DNA shuffling. Four recombinants were generated and, compared with the wild-type BceGO, the most active mutant, B3S1, showed the highest activity, exhibiting a 160-fold increase in substrate affinity and a 326-fold enhancement in catalytic efficiency against glyphosate, with little difference between their pH and temperature stabilities. The role of these mutations was explored through structure modeling and molecular docking, revealing that the Arg51 mutation is near the active site and could be an important residue contributing to the stabilization of glyphosate binding, while the role of the remaining mutations is unclear. These results provide insight into the application of directed evolution in optimizing glycine oxidase function and have laid a foundation for the development of glyphosate-tolerant crops. PMID:24223901
Qu, Weina; Ge, Yan; Zhang, Qian; Zhao, Wenguo; Zhang, Kan
2015-07-01
Driver inattention is a significant cause of motor vehicle collisions and incidents. The purpose of this study was to translate the Attention-Related Driving Error Scale (ARDES) into Chinese and to verify its reliability and validity. A total of 317 drivers completed the Chinese version of the ARDES, the Dula Dangerous Driving Index (DDDI), the Attention-Related Cognitive Errors Scale (ARCES) and the Mindful Attention Awareness Scale (MAAS) questionnaires. Specific sociodemographic variables and traffic violations were also measured. Psychometric results confirm that the ARDES-China has adequate psychometric properties (Cronbach's alpha=0.88) to be a useful tool for evaluating proneness to attentional errors in the Chinese driving population. First, ARDES-China scores were positively correlated with both DDDI scores and number of accidents in the prior year; in addition, ARDES-China scores were a significant predictor of dangerous driving behavior as measured by DDDI. Second, we found that ARDES-China scores were strongly correlated with ARCES scores and negatively correlated with MAAS scores. Finally, different demographic groups exhibited significant differences in ARDES scores; in particular, ARDES scores varied with years of driving experience. Copyright © 2015 Elsevier Ltd. All rights reserved.
Study of Current Measurement Method Based on Circular Magnetic Field Sensing Array
Li, Zhenhua; Zhang, Siqiu; Wu, Zhengtian; Tao, Yuan
2018-01-01
Classic core-based instrument transformers are prone to magnetic saturation. This affects the measurement accuracy of such transformers and limits their applications in measuring large direct current (DC). Moreover, protection and control systems may exhibit malfunctions due to such measurement errors. This paper presents a more accurate method for current measurement based on a circular magnetic field sensing array. The proposed measurement approach utilizes multiple Hall sensors that are evenly distributed on a circle. The average value of all Hall sensors is regarded as the final measurement. The calculation model is established for the case of magnetic field interference from a parallel wire, and the simulation results show that the error decreases significantly when the number of Hall sensors n is greater than 8. The measurement error is less than 0.06% when the wire spacing is greater than 2.5 times the radius of the sensor array. A simulation study on an off-center primary conductor is conducted, and a Hall sensor compensation method is adopted to improve the accuracy. The simulation and test results indicate that the measurement error of the system is less than 0.1%. PMID:29734742
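The error behavior can be reproduced qualitatively with a small magnetostatic simulation: each Hall sensor reads the tangential field on a circle around the conductor, and averaging over n sensors is a discrete Ampère's-law integral that progressively cancels the contribution of an external wire. The geometry and currents below are hypothetical, chosen only to show the rapid fall-off of the interference error with n.

```python
import numpy as np

MU0 = 4 * np.pi * 1e-7   # vacuum permeability, T*m/A

def tangential_field(sensor_xy, tangent, wire_xy, current):
    """Tangential component, at each sensor, of the field of an infinite
    straight wire: |B| = mu0*I/(2*pi*d), direction perpendicular to r."""
    r = sensor_xy - wire_xy
    d2 = np.sum(r**2, axis=1)
    b = (MU0 / (2 * np.pi)) * current * \
        np.stack([-r[:, 1], r[:, 0]], axis=1) / d2[:, None]
    return np.sum(b * tangent, axis=1)

def measured_current(n, radius, wires):
    """Average tangential field over n equally spaced Hall sensors,
    converted back to current via Ampere's law for the circular path."""
    th = 2 * np.pi * np.arange(n) / n
    pos = radius * np.stack([np.cos(th), np.sin(th)], axis=1)
    tan = np.stack([-np.sin(th), np.cos(th)], axis=1)
    b = sum(tangential_field(pos, tan, np.array(w), i) for w, i in wires)
    return b.mean() * 2 * np.pi * radius / MU0

# Primary 1000 A DC at the center; interfering 1000 A wire 2.5 radii away.
wires = [((0.0, 0.0), 1000.0), ((0.125, 0.0), 1000.0)]   # radius = 0.05 m
for n in (4, 8, 16):
    err = (measured_current(n, 0.05, wires) - 1000.0) / 1000.0
    print(f"n = {n:2d}   relative error = {err:+.5%}")
```

In this toy geometry the interference error drops by orders of magnitude between n = 4 and n = 16, consistent with the paper's observation that the error becomes small once n exceeds 8.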
A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks
Costa, Daniel G.; Guedes, Luiz Affonso
2011-01-01
Visual sensor networks (VSNs) comprised of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. Those types of sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay and packet error rates. In such context, multimedia coding is required for data compression and error-resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSNs applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908
Sánchez-Durán, José A; Hidalgo-López, José A; Castellanos-Ramos, Julián; Oballe-Peinado, Óscar; Vidal-Verdú, Fernando
2015-08-19
Tactile sensors suffer from many types of interference and error, such as crosstalk, non-linearity, drift and hysteresis, so calibration should be carried out to compensate for these deviations. However, this procedure is difficult for sensors mounted on artificial hands for robots or prosthetics, for instance, where the sensor usually bends to cover a curved surface. Moreover, the calibration procedure should be repeated often because the correction parameters are easily altered by time and surrounding conditions. Furthermore, this intensive and complex calibration could be less critical, or at least simpler. This is because manipulation algorithms do not commonly use the whole data set from the tactile image, but only a few parameters such as the moments of the tactile image. These parameters may be less affected by common errors and interference, or at least their variations may be on the order of those caused by accepted limitations, such as reduced spatial resolution. This paper shows results from experiments supporting this idea. The experiments are carried out with a high-performance commercial sensor as well as with a low-cost error-prone sensor built with a common procedure in robotics.
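The moments in question are weighted sums over the pressure image, which is why they tolerate per-taxel errors: total load, contact centroid, and principal-axis orientation follow from the zeroth, first, and second moments. A minimal sketch, using a hypothetical 8x8 tactile frame:

```python
import numpy as np

def image_moments(p):
    """Zeroth/first/second moments of a tactile pressure image: total load,
    centroid, and orientation of the contact's principal axis."""
    y, x = np.mgrid[0:p.shape[0], 0:p.shape[1]]
    m00 = p.sum()
    cx, cy = (x * p).sum() / m00, (y * p).sum() / m00
    mu20 = ((x - cx) ** 2 * p).sum() / m00
    mu02 = ((y - cy) ** 2 * p).sum() / m00
    mu11 = ((x - cx) * (y - cy) * p).sum() / m00
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return m00, (cx, cy), theta

# Hypothetical 8x8 frame with an elongated contact blob plus taxel noise.
rng = np.random.default_rng(3)
frame = np.zeros((8, 8))
frame[2:6, 3:5] = 1.0 + 0.05 * rng.standard_normal((4, 2))
total, centroid, angle = image_moments(frame)
print(total, centroid, np.degrees(angle))
```

Because every moment averages over many taxels, small uncorrelated per-taxel errors largely cancel, which is the intuition behind relaxing the calibration requirements.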
[Factors Related to Presenteeism in Young and Middle-aged Nurses].
Yoshida, Mami; Miki, Akiko
2018-04-03
Presenteeism is considered to be not only a work-related stressor but also a factor in the development of workaholism and error proneness, often described as carelessness. Additionally, increasing health issues arising from aging suggest that presenteeism in middle-aged nurses may differ from that in young ones. Therefore, the present study aimed to identify and tease apart the factors involved in presenteeism among young and middle-aged nurses. An anonymous self-administered questionnaire survey was conducted among 2,006 nurses working at 10 hospitals. In total, 761 nurses aged <40 years and 536 nurses aged ≥40 years were enrolled in this study. Work Impairment Scores (WIS) on the Japanese version of the Stanford Presenteeism Scale were measured for presenteeism. Job stressors, workaholism, and error proneness were measured as related factors. Multiple regression analysis was conducted with the WIS as the dependent variable and the related factors as independent variables. Overall, 70.8% of the young nurses reported health problems, compared to 82.5% of the middle-aged nurses. However, WIS in young nurses was significantly higher than that in middle-aged ones (p < 0.001). WIS in young nurses showed a significant relationship with the degree of the stressor "difficulty of work" (β = 0.28, p < 0.001), the tendency to "work excessively" (β = 0.18, p < 0.001), which is a subscale of workaholism, and the error-proneness subscales "action slips" (β = 0.14, p < 0.01) and "cognitive narrowing" (β = 0.11, p < 0.05). Conversely, WIS in middle-aged nurses showed a significant relationship with "cognitive narrowing" (β = 0.29, p < 0.001), the tendency to "work excessively" (β = 0.17, p < 0.001), and the degree of the stressors "difficulty of work" (β = 0.12, p < 0.05) and "lack of communication" (β = 0.13, p < 0.01). These findings clarify that the increased health problems of middle-aged nurses do not necessarily lower their working capacity. Also, compared to young nurses, the tendency to err, rather than the degree of job stressors, was more strongly related to presenteeism in middle-aged nurses. It may be that middle-aged nurses come to realize their working ability is hindered through incidents resulting from narrowed attention. As fatigue and tension tend to cause narrowing of attention, it may be necessary to reduce such risks and adjust work environments so that mistakes can be avoided.
Microarray Analysis of Iris Gene Expression in Mice with Mutations Influencing Pigmentation
Trantow, Colleen M.; Cuffy, Tryphena L.; Fingert, John H.; Kuehn, Markus H.
2011-01-01
Purpose. Several ocular diseases involve the iris, notably including oculocutaneous albinism, pigment dispersion syndrome, and exfoliation syndrome. To screen for candidate genes that may contribute to the pathogenesis of these diseases, genome-wide iris gene expression patterns were comparatively analyzed from mouse models of these conditions. Methods. Iris samples from albino mice with a Tyr mutation, pigment dispersion–prone mice with Tyrp1 and Gpnmb mutations, and mice resembling exfoliation syndrome with a Lyst mutation were compared with samples from wild-type mice. All mice were strain (C57BL/6J), age (60 days old), and sex (female) matched. Microarrays were used to compare transcriptional profiles, and differentially expressed transcripts were described by functional annotation clustering using DAVID Bioinformatics Resources. Quantitative real-time PCR was performed to validate a subset of identified changes. Results. Compared with wild-type C57BL/6J mice, each disease context exhibited a large number of statistically significant changes in gene expression, including 685 transcripts differentially expressed in albino irides, 403 in pigment dispersion–prone irides, and 460 in exfoliative-like irides. Conclusions. Functional annotation clusterings were particularly striking among the overrepresented genes, with albino and pigment dispersion–prone irides both exhibiting overall evidence of crystallin-mediated stress responses. Exfoliative-like irides from mice with a Lyst mutation showed overall evidence of involvement of genes that influence immune system processes, lytic vacuoles, and lysosomes. These findings have several biologically relevant implications, particularly with respect to secondary forms of glaucoma, and represent a useful resource as a hypothesis-generating dataset. PMID:20739468
A probabilistic approach to remote compositional analysis of planetary surfaces
Lapotre, Mathieu G.A.; Ehlmann, Bethany L.; Minson, Sarah E.
2017-01-01
Reflected light from planetary surfaces provides information, including mineral/ice compositions and grain sizes, by study of albedo and absorption features as a function of wavelength. However, deconvolving the compositional signal in spectra is complicated by the nonuniqueness of the inverse problem. Trade-offs between mineral abundances and grain sizes in setting reflectance, instrument noise, and systematic errors in the forward model are potential sources of uncertainty, which are often unquantified. Here we adopt a Bayesian implementation of the Hapke model to determine sets of acceptable-fit mineral assemblages, as opposed to single best fit solutions. We quantify errors and uncertainties in mineral abundances and grain sizes that arise from instrument noise, compositional end members, optical constants, and systematic forward model errors for two suites of ternary mixtures (olivine-enstatite-anorthite and olivine-nontronite-basaltic glass) in a series of six experiments in the visible-shortwave infrared (VSWIR) wavelength range. We show that grain sizes are generally poorly constrained from VSWIR spectroscopy. Abundance and grain size trade-offs lead to typical abundance errors of ≤1 wt % (occasionally up to ~5 wt %), while ~3% noise in the data increases errors by up to ~2 wt %. Systematic errors further increase inaccuracies by a factor of 4. Finally, phases with low spectral contrast or inaccurate optical constants can further increase errors. Overall, typical errors in abundance are <10%, but sometimes significantly increase for specific mixtures, prone to abundance/grain-size trade-offs that lead to high unmixing uncertainties. These results highlight the need for probabilistic approaches to remote determination of planetary surface composition.
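A stripped-down version of the Bayesian inversion conveys how "sets of acceptable-fit" solutions arise: sample the posterior of an abundance under a forward model and a noise level, and report a distribution rather than a single best fit. The sketch below substitutes a linear two-endmember mixing model for the Hapke model and uses made-up endmember spectra, so it is purely illustrative of the probabilistic approach, not of the paper's forward model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear two-endmember mixing (stand-in for the Hapke forward model):
# observed reflectance = a*E1 + (1-a)*E2 + Gaussian noise.
wav = np.linspace(0.4, 2.5, 50)                 # VSWIR wavelengths, microns
E1 = 0.6 + 0.2 * np.sin(3 * wav)                # hypothetical endmember spectra
E2 = 0.4 + 0.1 * np.cos(2 * wav)
true_a, sigma = 0.3, 0.01
obs = true_a * E1 + (1 - true_a) * E2 + rng.normal(scale=sigma, size=wav.size)

def log_like(a):
    model = a * E1 + (1 - a) * E2
    return -0.5 * np.sum((obs - model) ** 2) / sigma**2

# Metropolis sampler over the abundance, uniform prior on [0, 1].
a, ll, samples = 0.5, log_like(0.5), []
for _ in range(20000):
    prop = a + rng.normal(scale=0.02)
    if 0.0 <= prop <= 1.0:
        llp = log_like(prop)
        if np.log(rng.uniform()) < llp - ll:
            a, ll = prop, llp
    samples.append(a)

post = np.array(samples[5000:])                 # discard burn-in
print(f"abundance: {post.mean():.3f} +/- {post.std():.3f} (true {true_a})")
```

The posterior spread plays the role of the quantified abundance uncertainty in the paper; noisier data or a less contrastive endmember simply widens it.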
Fabbretti, G
2010-06-01
Because of its complex nature, surgical pathology practice is prone to error. In this report, we describe our methods for reducing error as much as possible during the pre-analytical and analytical phases. This was achieved by revising procedures and by using computer technology and automation. Most mistakes result from human error in the identification and matching of patients and samples. To avoid faulty data interpretation, we employed a new comprehensive computer system that acquires all patient ID information directly from the hospital's database with remote order entry; it also provides label and request forms via the Web, where clinical information is required before sending the sample. Both patient and sample are identified directly and immediately at the site where the surgical procedures are performed. Barcode technology is used to input information at every step, and automation is used for sample blocks and slides to avoid errors that occur when information is recorded or transferred by hand. Quality control checks occur at every step of the process to ensure that none of the steps is left to chance and that no phase depends on a single operator. The system also provides statistical analysis of errors so that new strategies can be implemented to avoid their repetition. In addition, the staff receives frequent training on error avoidance and new developments. The results have been promising, with a very low error rate (0.27%); none of the errors compromised patient health, and all were detected before release of the diagnostic report.
Multistrip western blotting to increase quantitative data output.
Kiyatkin, Anatoly; Aksamitiene, Edita
2009-01-01
The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per single blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same loading of the sample, and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and therefore is beneficial to apply in biomedical diagnostics, systems biology, and cell signaling research.
Bartels, Daniel M; Pizarro, David A
2011-10-01
Researchers have recently argued that utilitarianism is the appropriate framework by which to evaluate moral judgment, and that individuals who endorse non-utilitarian solutions to moral dilemmas (involving active vs. passive harm) are committing an error. We report a study in which participants responded to a battery of personality assessments and a set of dilemmas that pit utilitarian and non-utilitarian options against each other. Participants who indicated greater endorsement of utilitarian solutions had higher scores on measures of psychopathy, Machiavellianism, and life meaninglessness. These results question the widely used methods by which lay moral judgments are evaluated, as these approaches lead to the counterintuitive conclusion that those individuals who are least prone to moral errors also possess a set of psychological characteristics that many would consider prototypically immoral. Copyright © 2011 Elsevier B.V. All rights reserved.
Enhanced analysis of real-time PCR data by using a variable efficiency model: FPK-PCR
Lievens, Antoon; Van Aelst, S.; Van den Bulcke, M.; Goetghebeur, E.
2012-01-01
Current methodology in real-time Polymerase chain reaction (PCR) analysis performs well provided PCR efficiency remains constant over reactions. Yet, small changes in efficiency can lead to large quantification errors. Particularly in biological samples, the possible presence of inhibitors forms a challenge. We present a new approach to single reaction efficiency calculation, called Full Process Kinetics-PCR (FPK-PCR). It combines a kinetically more realistic model with flexible adaptation to the full range of data. By reconstructing the entire chain of cycle efficiencies, rather than restricting the focus to a 'window of application', one extracts additional information and removes a level of arbitrariness. The maximal efficiency estimates returned by the model are comparable in accuracy and precision to both the gold standard of serial dilution and other single reaction efficiency methods. The cycle-to-cycle changes in efficiency, as described by the FPK-PCR procedure, stay considerably closer to the data than those from other S-shaped models. The assessment of individual cycle efficiencies returns more information than other single efficiency methods. It allows in-depth interpretation of real-time PCR data and reconstruction of the fluorescence data, providing quality control. Finally, by implementing a global efficiency model, reproducibility is improved as the selection of a window of application is avoided. PMID:22102586
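The following toy sketch illustrates the general idea of fitting a variable-efficiency amplification model to a whole fluorescence curve. The efficiency-decay law, parameter names, and values here are illustrative assumptions, not the FPK-PCR kinetic model itself.

```python
# Minimal sketch of a variable-efficiency qPCR fit: cycle efficiency is not
# constant but decays as product accumulates. The linear decay law below is
# an illustrative assumption, not the published FPK-PCR model.
import numpy as np
from scipy.optimize import least_squares

def simulate(f0, emax, cap, cycles=40):
    """Reconstruct fluorescence cycle by cycle from per-cycle efficiencies."""
    f = [f0]
    for _ in range(cycles):
        e = emax * (1 - f[-1] / cap)      # efficiency falls as product nears capacity
        f.append(f[-1] * (1 + max(e, 0.0)))
    return np.array(f)

rng = np.random.default_rng(1)
obs = simulate(1e-6, 0.95, 1.0) * (1 + rng.normal(0, 0.01, 41))  # noisy curve

def resid(p):
    return simulate(*p) - obs

fit = least_squares(resid, x0=[1e-5, 0.8, 0.9],
                    bounds=([1e-9, 0.0, 0.1], [1e-3, 1.0, 2.0]))
f0, emax, cap = fit.x
print(f"estimated maximal efficiency: {emax:.3f}")

# Per-cycle efficiencies can then be reconstructed for quality control,
# which is the kind of in-depth readout the abstract describes.
fhat = simulate(*fit.x)
eff = fhat[1:] / fhat[:-1] - 1
```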
Zubakov, Dmitry; Boersma, Anton W. M.; Choi, Ying; van Kuijk, Patricia F.; Wiemer, Erik A. C.
2010-01-01
MicroRNAs (miRNAs) are non-protein coding molecules with important regulatory functions; many have tissue-specific expression patterns. Their very small size in principle makes them less prone to degradation processes, unlike messenger RNAs (mRNAs), which were previously proposed as molecular tools for forensic body fluid identification. To identify suitable miRNA markers for forensic body fluid identification, we first screened total RNA samples derived from saliva, semen, vaginal secretion, and venous and menstrual blood for the expression of 718 human miRNAs using a microarray platform. All body fluids could be easily distinguished from each other on the basis of complete array-based miRNA expression profiles. Results from quantitative reverse transcription PCR (RT-PCR; TaqMan) assays for microarray candidate markers confirmed strong over-expression in the targeted body fluid of several miRNAs for venous blood and several others for semen. However, no candidate markers from array experiments for other body fluids such as saliva, vaginal secretion, or menstrual blood could be confirmed by RT-PCR. Time-wise degradation of venous blood and semen stains for at least 1 year under lab conditions did not significantly affect the detection sensitivity of the identified miRNA markers. The detection limit of the TaqMan assays tested for selected venous blood and semen miRNA markers required only subpicogram amounts of total RNA per single RT-PCR test, which is considerably less than usually needed for reliable mRNA RT-PCR detection. We therefore propose the application of several stable miRNA markers for the forensic identification of blood stains and several others for semen stain identification, using commercially available TaqMan assays. Additional work remains necessary in search for suitable miRNA markers for other forensically relevant body fluids. Electronic supplementary material The online version of this article (doi:10.1007/s00414-009-0402-3) contains supplementary material, which is available to authorized users. PMID:20145944
Residents' numeric inputting error in computerized physician order entry prescription.
Wu, Xue; Wu, Changxu; Zhang, Kan; Wei, Dong
2016-04-01
A computerized physician order entry (CPOE) system with embedded clinical decision support (CDS) can significantly reduce certain types of prescription error. However, prescription errors still occur. Various factors, such as the numeric inputting methods used in human-computer interaction (HCI), produce different error rates and types, but these have received relatively little attention. This study aimed to examine the effects of numeric inputting methods and urgency levels on numeric inputting errors in prescriptions, and to categorize the types of errors. Thirty residents participated in four prescribing tasks in which two factors were manipulated: numeric inputting method (numeric row in the main keyboard vs. numeric keypad) and urgency level (urgent vs. non-urgent situation). Multiple aspects of participants' prescribing behavior were also measured in sober (non-urgent) prescribing situations. The results revealed that in urgent situations, participants were prone to make mistakes when using the numeric row in the main keyboard. After controlling for performance in the sober prescribing situation, the effects of the input methods disappeared, and urgency was found to play a significant role in the generalized linear model. Most errors were of the omission or substitution types, but the proportions of transposition and intrusion errors were significantly higher than in previous research. Error rates were higher for the digits 3, 8, and 9, which occur less commonly in prescriptions, posing a risk to patient safety. Urgency played a more important role in CPOE numeric typing errors than typing skills and typing habits. Inputting with the numeric keypad is recommended because it yielded lower error rates in urgent situations. An alternative design could increase the sensitivity of the less frequently used keys and of decimals. To improve the usability of CPOE, numeric keyboard design and error detection could draw on the spatial incidence of errors found in this study. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Program For Engineering Electrical Connections
NASA Technical Reports Server (NTRS)
Billitti, Joseph W.
1990-01-01
DFACS is interactive multiuser computer-aided-engineering software tool for system-level electrical integration and cabling engineering. Purpose of program is to provide engineering community with centralized data base for putting in and gaining access to data on functional definition of system, details of end-circuit pinouts in systems and subsystems, and data on wiring harnesses. Objective is to provide instantaneous single point of information interchange, thus avoiding error-prone, time-consuming, and costly shuttling of data along multiple paths. Designed to operate on DEC VAX mini or micro computer using Version 5.0/03 of INGRES.
Learning class descriptions from a data base of spectral reflectance with multiple view angles
NASA Technical Reports Server (NTRS)
Kimes, Daniel S.; Harrison, Patrick R.; Harrison, P. A.
1992-01-01
A learning program has been developed which combines 'learning by example' with the generate-and-test paradigm to furnish a robust learning environment capable of handling error-prone data. The program is shown to be capable of learning class descriptions from positive and negative training examples of spectral and directional reflectance data taken from soil and vegetation. The program, which used AI techniques to automate very tedious processes, found the sequence of relationships that contained the information most important for distinguishing the classes.
System-on-Chip Data Processing and Data Handling Spaceflight Electronics
NASA Technical Reports Server (NTRS)
Kleyner, I.; Katz, R.; Tiggeler, H.
1999-01-01
This paper presents a methodology and a tool set which implements automated generation of moderate-size blocks of customized intellectual property (IP), thus effectively reusing prior work and minimizing the labor intensive, error-prone parts of the design process. Customization of components allows for optimization for smaller area and lower power consumption, which is an important factor given the limitations of resources available in radiation-hardened devices. The effects of variations in HDL coding style on the efficiency of synthesized code for various commercial synthesis tools are also discussed.
NASA Technical Reports Server (NTRS)
Johnson, S. C.
1986-01-01
Semi-Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. The ASSIST program allows the user to describe the semi-Markov model in a high-level language. Instead of specifying the individual states of the model, the user specifies the rules governing the behavior of the system and these are used by ASSIST to automatically generate the model. The ASSIST program is described and illustrated by examples.
Miskolci, Veronika; Spiering, Désirée; Cox, Dianne; Hodgson, Louis
2014-01-01
Cytokine stimulation of leukocytes often results in transient activation of the p21 Rho family of small GTPases. The role of these molecules during cell migration and chemotaxis is well established. The traditional approach to studying the activation dynamics of these proteins involves affinity pull-downs that are often cumbersome and prone to errors. Here, we describe a reagent and a simple "mix-and-measure" method useful for determining the activation status of endogenous Cdc42 GTPase from cell lysates.
Critical evaluation of sample pretreatment techniques.
Hyötyläinen, Tuulia
2009-06-01
Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.
Temperature-dependent spectral mismatch corrections
Osterwald, Carl R.; Campanelli, Mark; Moriarty, Tom; ...
2015-11-01
This study develops the mathematical foundation for a translation of solar cell short-circuit current from one thermal and spectral irradiance operating condition to another without the use of ill-defined and error-prone temperature coefficients typically employed in solar cell metrology. Using the partial derivative of quantum efficiency with respect to temperature, the conventional isothermal expression for spectral mismatch corrections is modified to account for changes of current due to temperature; this modification completely eliminates the need for short-circuit-current temperature coefficients. An example calculation is provided to demonstrate use of the new translation.
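For orientation, the conventional isothermal mismatch factor has the textbook form below; the second line is only a hedged first-order illustration of how a temperature derivative of the responsivity can enter, not the paper's derivation.

```latex
% Conventional isothermal spectral mismatch factor for a test and a reference
% device under the reference (E_ref) and the measured (E_meas) spectral
% irradiances, with spectral responsivities S. (Some sources define the
% reciprocal; the orientation of the ratio is a convention.)
M = \frac{\int E_{\mathrm{ref}}(\lambda)\,S_{\mathrm{ref}}(\lambda)\,d\lambda\;
          \int E_{\mathrm{meas}}(\lambda)\,S_{\mathrm{test}}(\lambda)\,d\lambda}
         {\int E_{\mathrm{ref}}(\lambda)\,S_{\mathrm{test}}(\lambda)\,d\lambda\;
          \int E_{\mathrm{meas}}(\lambda)\,S_{\mathrm{ref}}(\lambda)\,d\lambda}
% Illustrative first-order temperature translation of a responsivity, showing
% how a dQE/dT-type term can replace a lumped temperature coefficient:
S(\lambda,T) \approx S(\lambda,T_0)
   + \left.\frac{\partial S(\lambda,T)}{\partial T}\right|_{T_0}\,(T - T_0)
```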
Computer-aided programming for message-passing system; Problems and a solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, M.Y.; Gajski, D.D.
1989-12-01
As the number of processors and the complexity of problems to be solved increase, programming multiprocessing systems becomes more difficult and error-prone. Program development tools are necessary since programmers are not able to develop complex parallel programs efficiently. Parallel models of computation, parallelization problems, and tools for computer-aided programming (CAP) are discussed. As an example, a CAP tool that performs scheduling and inserts communication primitives automatically is described. It also generates the performance estimates and other program quality measures to help programmers in improving their algorithms and programs.
Polymerase chain reaction technology as analytical tool in agricultural biotechnology.
Lipp, Markus; Shillito, Raymond; Giroux, Randal; Spiegelhalter, Frank; Charlton, Stacy; Pinero, David; Song, Ping
2005-01-01
The agricultural biotechnology industry applies polymerase chain reaction (PCR) technology at numerous points in product development. Commodity and food companies as well as third-party diagnostic testing companies also rely on PCR technology for a number of purposes. The primary use of the technology is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of PCR analysis and its application to the testing of grains. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effect they may have on the accuracy of the PCR analytical results.
Medication administration errors in nursing homes using an automated medication dispensing system.
van den Bemt, Patricia M L A; Idzinga, Jetske C; Robertz, Hans; Kormelink, Dennis Groot; Pels, Neske
2009-01-01
OBJECTIVE To identify the frequency of medication administration errors as well as their potential risk factors in nursing homes using a distribution robot. DESIGN The study was a prospective, observational study conducted within three nursing homes in the Netherlands caring for 180 individuals. MEASUREMENTS Medication errors were measured using the disguised observation technique. Types of medication errors were described. The correlation between several potential risk factors and the occurrence of medication errors was studied to identify potential causes for the errors. RESULTS In total 2,025 medication administrations to 127 clients were observed. In these administrations 428 errors were observed (21.2%). The most frequently occurring types of errors were use of wrong administration techniques (especially incorrect crushing of medication and not supervising the intake of medication) and wrong time errors (administering the medication at least 1 h early or late). The potential risk factors female gender (odds ratio (OR) 1.39; 95% confidence interval (CI) 1.05-1.83), ATC medication class antibiotics (OR 11.11; 95% CI 2.66-46.50), medication crushed (OR 7.83; 95% CI 5.40-11.36), number of dosages/day/client (OR 1.03; 95% CI 1.01-1.05), nursing home 2 (OR 3.97; 95% CI 2.86-5.50), medication not supplied by distribution robot (OR 2.92; 95% CI 2.04-4.18), time classes "7-10 am" (OR 2.28; 95% CI 1.50-3.47) and "10 am-2 pm" (OR 1.96; 95% CI 1.18-3.27) and day of the week "Wednesday" (OR 1.46; 95% CI 1.03-2.07) are associated with a higher risk of administration errors. CONCLUSIONS Medication administration in nursing homes is prone to many errors. This study indicates that the handling of the medication after removing it from the robot packaging may contribute to this high error frequency, which may be reduced by training of nurse attendants, by automated clinical decision support and by measures to reduce workload.
Data entry errors and design for model-based tight glycemic control in critical care.
Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey
2012-01-01
Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which can be significantly prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized and model-based TGC method prior to pilot clinical trials. To minimize data entry error, two tests were carried out to optimize a method with errors less than the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with the sum comprising a total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of the magnitude of errors. These research tests were approved by the University of Canterbury Ethics Committee. The final data entry method tested reduced errors to less than 1-2%, a 60-80% reduction from reported values. The magnitude of errors was clinically significant, typically an offset of 10.0 mmol/liter or an error of an order of magnitude, but only for extreme values of BG < 2.0 mmol/liter or BG > 15.0-20.0 mmol/liter, both of which could easily be corrected with automated checking of extreme values for safety. The data entry method selected significantly reduced data entry errors in the limited design tests presented, and is in use on a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. © 2012 Diabetes Technology Society.
Microvariation Artifacts Introduced by PCR and Cloning of Closely Related 16S rRNA Gene Sequences†
Speksnijder, Arjen G. C. L.; Kowalchuk, George A.; De Jong, Sander; Kline, Elizabeth; Stephen, John R.; Laanbroek, Hendrikus J.
2001-01-01
A defined template mixture of seven closely related 16S-rDNA clones was used in a PCR-cloning experiment to assess and track sources of artifactual sequence variation in 16S rDNA clone libraries. At least 14% of the recovered clones contained aberrations. Artifact sources were polymerase errors, a mutational hot spot, and cloning of heteroduplexes and chimeras. These data may partially explain the high degree of microheterogeneity typical of sequence clusters detected in environmental clone libraries. PMID:11133483
Carlson Scholz, Jodi A; Garg, Rohit; Compton, Susan R; Allore, Heather G; Zeiss, Caroline J; Uchio, Edward M
2011-10-01
The arterivirus lactate dehydrogenase-elevating virus (LDV) causes life-long viremia in mice. Although LDV infection generally does not cause disease, infected mice that are homozygous for the Fv1(n) allele are prone to develop poliomyelitis when immunosuppressed, a condition known as age-dependent poliomyelitis. The development of age-dependent poliomyelitis requires coinfection with endogenous murine leukemia virus. Even though LDV is a common contaminant of transplantable tumors, clinical signs of poliomyelitis after inadvertent exposure to LDV have not been described in recent literature. In addition, LDV-induced poliomyelitis has not been reported in SCID or ICR mice. Here we describe the occurrence of poliomyelitis in ICR-SCID mice resulting from injection of LDV-contaminated basement membrane matrix. After exposure to LDV, a subset of mice presented with clinical signs including paresis, which was associated with atrophy of the hindlimb musculature, and tachypnea; in addition, some mice died suddenly with or without premonitory signs. Mice presenting within the first 6 mo after infection had regions of spongiosis, neuronal necrosis and astrocytosis of the ventral spinal cord, and less commonly, brainstem. Axonal degeneration of ventral roots prevailed in more chronically infected mice. LDV was identified by RT-PCR in 12 of 15 mice with typical neuropathology; positive anti-LDV immunolabeling was identified in all PCR-positive animals (n = 7) tested. Three of 8 mice with neuropathology but no clinical signs were LDV negative by RT-PCR. RT-PCR yielded murine leukemia virus in spinal cords of all mice tested, regardless of clinical presentation or neuropathology.
Intapan, Pewpan M; Thanchomnang, Tongjit; Lulitanond, Viraphong; Maleewong, Wanchai
2009-01-01
We developed a single-step real-time fluorescence resonance energy transfer (FRET) multiplex polymerase chain reaction (PCR) merged with melting curve analysis for the detection of Wuchereria bancrofti and Brugia malayi DNA in blood-fed mosquitoes. Real-time FRET multiplex PCR is based on fluorescence melting curve analysis of hybrids formed between two pairs of specific fluorophore-labeled probes and amplicons from two families of repeated DNA elements: the 188-bp SspI repeated sequence, specific to W. bancrofti, and the 153-bp HhaI repeated sequence, specific to the genus Brugia. Both W. bancrofti and B. malayi can be differentially detected in infected vectors by this process through their distinct fluorescence channels and melting temperatures. The assay could distinguish both human filarial DNAs in infected vectors from the DNAs of Dirofilaria immitis- and Plasmodium falciparum-infected human red blood cells and noninfected mosquitoes and human leukocytes. The technique showed 100% sensitivity and specificity and offers a rapid and reliable procedure for differentially identifying lymphatic filariasis. The introduced real-time FRET multiplex PCR can reduce labor time and reagent costs and is not prone to carry-over contamination. The test can be used to screen mosquito vectors in endemic areas and therefore should be a useful diagnostic tool for evaluating the infection rate of mosquito populations and for xenomonitoring in the community after eradication programs such as the Global Program to Eliminate Lymphatic Filariasis.
van de Vossenberg, B T L H; Ibáñez-Justicia, A; Metz-Verschure, E; van Veen, E J; Bruil-Dieters, M L; Scholte, E J
2015-05-01
Since 2009, the Netherlands Food and Consumer Product Safety Authority has carried out surveys focusing on, among other things, the presence of invasive mosquito species (IMS). Special attention is given to exotic container-breeding Aedes species Aedes aegypti (L.), Aedes albopictus (Skuse), Aedes atropalpus (Coquillett), and Aedes japonicus japonicus (Theobald). This study describes the implementation of real-time PCR tests described by Hill et al. (2008) for the identification of Ae. aegypti and Ae. albopictus, and the development of two novel real-time PCR tests for the identification of Ae. atropalpus and Ae. j. japonicus. Initial tests showed that elements of the Ae. aegypti and Ae. albopictus tests needed optimization. Method validation tests were performed to determine if the implemented and newly developed tests are fit for routine diagnostics. Performance criteria of analytical sensitivity, analytical specificity, selectivity, repeatability, and reproducibility were determined. In addition, experiments were performed to determine the influence of environmental conditions on the usability of DNA extracted from mosquito specimens trapped in BG-Sentinel traps. The real-time PCR tests were demonstrated to be sensitive, specific, repeatable, reproducible, and are less prone to false negative results compared to partial cytochrome c oxidase I gene sequencing owing to the DNA fragmentation caused by environmental influences. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Error baseline rates of five sample preparation methods used to characterize RNA virus populations.
Kugelman, Jeffrey R; Wiley, Michael R; Nagle, Elyse R; Reyes, Daniel; Pfeffer, Brad P; Kuhn, Jens H; Sanchez-Lockhart, Mariano; Palacios, Gustavo F
2017-01-01
Individual RNA viruses typically occur as populations of genomes that differ slightly from each other due to mutations introduced by the error-prone viral polymerase. Understanding the variability of RNA virus genome populations is critical for understanding virus evolution because individual mutant genomes may gain evolutionary selective advantages and give rise to dominant subpopulations, possibly even leading to the emergence of viruses resistant to medical countermeasures. Reverse transcription of virus genome populations followed by next-generation sequencing is the only available method to characterize variation for RNA viruses. However, both steps may lead to the introduction of artificial mutations, thereby skewing the data. To better understand how such errors are introduced during sample preparation, we determined and compared error baseline rates of five different sample preparation methods by analyzing in vitro transcribed Ebola virus RNA from an artificial plasmid-based system. These methods included: shotgun sequencing from plasmid DNA or in vitro transcribed RNA as a basic "no amplification" method, amplicon sequencing from the plasmid DNA or in vitro transcribed RNA as a "targeted" amplification method, sequence-independent single-primer amplification (SISPA) as a "random" amplification method, rolling circle reverse transcription sequencing (CirSeq) as an advanced "no amplification" method, and Illumina TruSeq RNA Access as a "targeted" enrichment method. The measured error frequencies indicate that RNA Access offers the best tradeoff between sensitivity and sample preparation error (1.4 × 10⁻⁵) of all compared methods. PMID:28182717
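The error-floor measurement itself is conceptually simple once the template sequence is known exactly: every mismatch in an aligned read is an artifact. A minimal sketch, with toy reads assumed already aligned to the reference, is:

```python
# Minimal sketch of measuring an error baseline against a known template:
# with the true sequence known (here, standing in for the in vitro
# transcribed RNA), every mismatch in an aligned read is a sample-prep or
# sequencing artifact, so the per-base mismatch frequency estimates the
# method's error floor. Reads are given as (start, sequence) pairs that are
# assumed already mapped to the reference.
from collections import Counter

reference = "ACGTACGGTCCTAGGACGT"   # stands in for the known template
aligned_reads = [(0, "ACGTACGGTC"), (4, "ACGGTCCTAGG"), (8, "TCCTAGGACGA")]

mismatches, bases = Counter(), 0
for start, seq in aligned_reads:
    for offset, base in enumerate(seq):
        pos = start + offset
        bases += 1
        if base != reference[pos]:
            mismatches[(pos, reference[pos], base)] += 1

rate = sum(mismatches.values()) / bases
print(f"error baseline: {rate:.2e} per sequenced base")
print("artifactual substitutions:", dict(mismatches))
```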
Intraoperative visualization and assessment of electromagnetic tracking error
NASA Astrophysics Data System (ADS)
Harish, Vinyas; Ungi, Tamas; Lasso, Andras; MacDonald, Andrew; Nanji, Sulaiman; Fichtinger, Gabor
2015-03-01
Electromagnetic tracking allows for increased flexibility in designing image-guided interventions; however, it is well understood that electromagnetic tracking is prone to error. Visualization and assessment of the tracking error should take place in the operating room with minimal interference with the clinical procedure. The goal was to achieve this ideal in an open-source software implementation in a plug-and-play manner, without requiring programming from the user. We use optical tracking as a ground truth. An electromagnetic sensor and optical markers are mounted onto a stylus device, pivot calibrated for both trackers. Electromagnetic tracking error is defined as the difference in tool tip position between electromagnetic and optical readings. Multiple measurements are interpolated with a thin-plate spline transform and visualized in real time in 3D Slicer. All tracked devices are used in a plug-and-play manner through the open-source SlicerIGT and PLUS extensions of the 3D Slicer platform. Tracking error was measured multiple times to assess reproducibility of the method, both with and without placing ferromagnetic objects in the workspace. Results from exhaustive grid sampling and freehand sampling were similar, indicating that a quick freehand sampling is sufficient to detect unexpected or excessive field distortion in the operating room. The software is available as a plug-in for the 3D Slicer platform. Results demonstrate potential for visualizing electromagnetic tracking error in real time for intraoperative environments in feasibility clinical trials in image-guided interventions.
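A minimal sketch of the error-mapping idea follows: paired optical (ground-truth) and electromagnetic readings of the stylus tip give scattered error vectors, which a thin-plate-spline interpolant extends over the workspace for rendering. All point values are synthetic, and scipy's general radial-basis interpolator stands in for the 3D Slicer implementation.

```python
# Minimal sketch: interpolate scattered EM tracking-error vectors over the
# workspace with a thin-plate-spline kernel. Positions and the simulated
# field distortion are synthetic stand-ins for real tracker readings.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
optical = rng.uniform(-100, 100, (50, 3))            # ground-truth tip positions (mm)
em = optical + rng.normal(0, 0.5, (50, 3)) \
     + 0.01 * optical[:, [0]]                        # distortion grows along x
error_vec = em - optical                             # EM tracking error per sample

field = RBFInterpolator(optical, error_vec, kernel="thin_plate_spline")

# Query the interpolated distortion on a coarse grid, e.g. for color-mapping.
g = np.linspace(-100, 100, 5)
grid = np.stack(np.meshgrid(g, g, g), axis=-1).reshape(-1, 3)
magnitude = np.linalg.norm(field(grid), axis=1)
print(f"max interpolated error on grid: {magnitude.max():.2f} mm")
```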
Reoccurrence of botulinum neurotoxin subtype A3 inducing food-borne botulism, Slovakia, 2015.
Mad'arová, Lucia; Dorner, Brigitte G; Schaade, Lars; Donáth, Vladimír; Avdičová, Mária; Fatkulinová, Milota; Strhársky, Jozef; Sedliačiková, Ivana; Klement, Cyril; Dorner, Martin B
2017-08-10
A case of food-borne botulism occurred in Slovakia in 2015. Clostridium botulinum type A was isolated from three nearly empty commercial hummus tubes. The product, which was sold in Slovakia and the Czech Republic, was withdrawn from the market and a warning was issued immediately through the European Commission's Rapid Alert System for Food and Feed (RASFF). Further investigation revealed the presence of botulinum neurotoxin (BoNT) subtype BoNT/A3, a very rare subtype implicated in only one previous outbreak (Loch Maree in Scotland, 1922). It is the most divergent subtype of BoNT/A, with 15.4% difference at the amino acid level compared with the prototype BoNT/A1. This makes it more prone to evading immunological and PCR-based detection. Testing laboratories are advised that this subtype has now been associated with food-borne botulism for the second time since the first outbreak almost 100 years ago, and should validate their immunological or PCR-based methods against this divergent subtype. This article is copyright of The Authors, 2017.
Infralimbic EphB2 Modulates Fear Extinction in Adolescent Rats.
Cruz, Emmanuel; Soler-Cedeño, Omar; Negrón, Geovanny; Criado-Marrero, Marangelie; Chompré, Gladys; Porter, James T
2015-09-09
Adolescent rats are prone to impaired fear extinction, suggesting that mechanistic differences in extinction could exist in adolescent and adult rats. Since the infralimbic cortex (IL) is critical for fear extinction, we used PCR array technology to identify gene expression changes in IL induced by fear extinction in adolescent rats. Interestingly, the ephrin type B receptor 2 (EphB2), a tyrosine kinase receptor associated with synaptic development, was downregulated in IL after fear extinction. Consistent with the PCR array results, EphB2 mRNA and protein levels were reduced in IL after fear extinction compared with fear conditioning, suggesting that EphB2 signaling in IL regulates fear extinction memory in adolescents. Finally, reducing EphB2 synthesis in IL with shRNA accelerated fear extinction learning in adolescent rats, but not in adult rats. These findings identify EphB2 in IL as a key regulator of fear extinction during adolescence, perhaps due to the increase in synaptic remodeling occurring during this developmental phase. Copyright © 2015 the authors.
Safeguarding the process of drug administration with an emphasis on electronic support tools
Seidling, Hanna M; Lampert, Anette; Lohmann, Kristina; Schiele, Julia T; Send, Alexander J F; Witticke, Diana; Haefeli, Walter E
2013-01-01
Aims The aim of this work is to understand the process of drug administration and identify points in the workflow that resulted in interventions by clinical information systems in order to improve patient safety. Methods To identify a generic way to structure the drug administration process we performed peer-group discussions and supplemented these discussions with a literature search for studies reporting errors in drug administration and strategies for their prevention. Results We concluded that the drug administration process might consist of up to 11 sub-steps, which can be grouped into the four sub-processes of preparation, personalization, application and follow-up. Errors in drug handling and administration are diverse and frequent and in many cases not caused by the patient him/herself, but by family members or nurses. Accordingly, different prevention strategies have been set in place with relatively few approaches involving e-health technology. Conclusions A generic structuring of the administration process and particular error-prone sub-steps may facilitate the allocation of prevention strategies and help to identify research gaps. PMID:24007450
NASA Astrophysics Data System (ADS)
Zhang, Guojian; Yu, Chengxin; Ding, Xinhua
2018-01-01
In this study, digital photography is used to monitor the instantaneous deformation of a masonry wall under seismic oscillation. To obtain higher measurement accuracy, the image matching-time baseline parallax method (IM-TBPM) is used to correct errors caused by changes in the intrinsic and extrinsic parameters of the digital cameras. Results show that the average errors of control point C5 are 0.79 mm, 0.44 mm, and 0.96 mm in the X, Z, and combined directions, respectively; those of control point C6 are 0.49 mm, 0.44 mm, and 0.71 mm, respectively. These results suggest that IM-TBPM can meet the accuracy requirements of instantaneous deformation monitoring. Under seismic oscillation, cracks first develop in the middle-to-lower portion of the masonry wall, and shear failure then occurs at the middle of the wall. This study provides a technical basis for analyzing the crack development pattern of masonry structures under seismic oscillation and has significant implications for improved construction of masonry structures in earthquake-prone areas.
Nunez-Iglesias, Juan; Kennedy, Ryan; Plaza, Stephen M.; Chakraborty, Anirban; Katz, William T.
2014-01-01
The aim in high-resolution connectomics is to reconstruct complete neuronal connectivity in a tissue. Currently, the only technology capable of resolving the smallest neuronal processes is electron microscopy (EM). Thus, a common approach to network reconstruction is to perform (error-prone) automatic segmentation of EM images, followed by manual proofreading by experts to fix errors. We have developed an algorithm and software library to not only improve the accuracy of the initial automatic segmentation, but also point out the image coordinates where it is likely to have made errors. Our software, called gala (graph-based active learning of agglomeration), improves the state of the art in agglomerative image segmentation. It is implemented in Python and makes extensive use of the scientific Python stack (numpy, scipy, networkx, scikit-learn, scikit-image, and others). We present here the software architecture of the gala library, and discuss several designs that we consider would be generally useful for other segmentation packages. We also discuss the current limitations of the gala library and how we intend to address them. PMID:24772079
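As a baseline for the agglomeration step gala improves on, the sketch below merges regions of an oversegmentation in order of weakest mean boundary probability. gala instead learns the merge priority from training data; this fixed-policy loop also skips re-pooling edge statistics after each merge, a deliberate simplification.

```python
# Minimal sketch of agglomerative segmentation over a region adjacency graph
# (RAG): merge the region pair with the weakest mean boundary probability
# until a threshold. gala *learns* this merge policy instead; here it is fixed.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
labels = np.repeat(np.repeat(np.array([[1, 2], [3, 4]]), 3, 0), 3, 1)  # 6x6 overseg
boundary = rng.random(labels.shape)            # boundary-probability map

g = nx.Graph()
h, w = labels.shape
for i in range(h):
    for j in range(w):
        for di, dj in ((0, 1), (1, 0)):        # right and down neighbors
            ni, nj = i + di, j + dj
            if ni < h and nj < w and labels[i, j] != labels[ni, nj]:
                u, v = int(labels[i, j]), int(labels[ni, nj])
                if not g.has_edge(u, v):
                    g.add_edge(u, v, total=0.0, n=0)
                g[u][v]["total"] += (boundary[i, j] + boundary[ni, nj]) / 2
                g[u][v]["n"] += 1

def mean_w(e):
    d = g[e[0]][e[1]]
    return d["total"] / d["n"]

THRESHOLD = 0.5                                # stop once boundaries look real
while g.number_of_edges() > 0:
    u, v = min(g.edges, key=mean_w)
    if mean_w((u, v)) > THRESHOLD:
        break
    labels[labels == v] = u                    # merge region v into region u
    # Note: edge statistics are not re-pooled after the merge (simplification).
    g = nx.contracted_nodes(g, u, v, self_loops=False)

print("regions remaining:", np.unique(labels))
```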
Combination with anti-tit-for-tat remedies problems of tit-for-tat.
Yi, Su Do; Baek, Seung Ki; Choi, Jung-Kyoo
2017-01-07
One of the most important questions in game theory concerns how mutual cooperation can be achieved and maintained in a social dilemma. In Axelrod's tournaments of the iterated prisoner's dilemma, Tit-for-Tat (TFT) demonstrated the role of reciprocity in the emergence of cooperation. However, the stability of TFT does not hold in the presence of implementation error, and a TFT population is prone to neutral drift to unconditional cooperation, which eventually invites defectors. We argue that a combination of TFT and anti-TFT (ATFT) overcomes these difficulties in a noisy environment, provided that ATFT is defined as choosing the opposite to the opponent's last move. According to this TFT-ATFT strategy, a player normally uses TFT; turns to ATFT upon recognizing his or her own error; returns to TFT either when mutual cooperation is recovered or when the opponent unilaterally defects twice in a row. The proposed strategy provides simple and deterministic behavioral rules for correcting implementation error in a way that cannot be exploited by the opponent, and suppresses the neutral drift to unconditional cooperation. Copyright © 2016 Elsevier Ltd. All rights reserved.
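The verbal rule above translates directly into code. In the sketch below (C=1, D=0), treating an implementation error as immediately recognized by the player who made it is our reading of the rule, and the mutual-cooperation rate is tracked instead of payoffs.

```python
# Minimal sketch of TFT-ATFT in a noisy iterated prisoner's dilemma (C=1, D=0).
# Mode transitions follow the verbal rule in the abstract; modeling an
# implementation error as instantly "recognized" by its maker is an assumption.
import random

def tft_atft(history, mode):
    """history: list of (my_actual_move, opp_actual_move); mode: 'TFT'/'ATFT'."""
    if not history:
        return 1, mode                            # open with cooperation
    my_last, opp_last = history[-1]
    if mode == "ATFT":
        if (my_last, opp_last) == (1, 1):
            mode = "TFT"                          # mutual cooperation recovered
        elif len(history) >= 2 and history[-1] == (1, 0) \
                and history[-2] == (1, 0):
            mode = "TFT"                          # opponent defected unilaterally twice
    if mode == "TFT":
        return opp_last, mode                     # reciprocate opponent's last move
    return 1 - opp_last, mode                     # ATFT: opposite of opponent's move

def run(rounds=2000, err=0.05, seed=4):
    rng = random.Random(seed)
    h1, h2, m1, m2, coop = [], [], "TFT", "TFT", 0
    for _ in range(rounds):
        a1, m1 = tft_atft(h1, m1)
        a2, m2 = tft_atft(h2, m2)
        if rng.random() < err:
            a1, m1 = 1 - a1, "ATFT"               # own error noticed -> switch mode
        if rng.random() < err:
            a2, m2 = 1 - a2, "ATFT"
        h1.append((a1, a2)); h2.append((a2, a1))
        coop += a1 == a2 == 1
    return coop / rounds

print(f"mutual cooperation under 5% noise: {run():.2f}")
```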
Safonova, Yana; Bonissone, Stefano; Kurpilyansky, Eugene; Starostina, Ekaterina; Lapidus, Alla; Stinson, Jeremy; DePalatis, Laura; Sandoval, Wendy; Lill, Jennie; Pevzner, Pavel A.
2015-01-01
The analysis of concentrations of circulating antibodies in serum (antibody repertoire) is a fundamental, yet poorly studied, problem in immunoinformatics. The two current approaches to the analysis of antibody repertoires [next generation sequencing (NGS) and mass spectrometry (MS)] present difficult computational challenges since antibodies are not directly encoded in the germline but are extensively diversified by somatic recombination and hypermutations. Therefore, the protein database required for the interpretation of spectra from circulating antibodies is custom for each individual. Although such a database can be constructed via NGS, the reads generated by NGS are error-prone and even a single nucleotide error precludes identification of a peptide by the standard proteomics tools. Here, we present the IgRepertoireConstructor algorithm that performs error-correction of immunosequencing reads and uses mass spectra to validate the constructed antibody repertoires. Availability and implementation: IgRepertoireConstructor is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from http://bioinf.spbau.ru/igtools. Contact: ppevzner@ucsd.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072509
NASA Astrophysics Data System (ADS)
Porter, J. M.; Jeffries, J. B.; Hanson, R. K.
2009-09-01
A novel three-wavelength mid-infrared laser-based absorption/extinction diagnostic has been developed for simultaneous measurement of temperature and vapor-phase mole fraction in an evaporating hydrocarbon fuel aerosol (vapor and liquid droplets). The measurement technique was demonstrated for an n-decane aerosol with D 50˜3 μ m in steady and shock-heated flows with a measurement bandwidth of 125 kHz. Laser wavelengths were selected from FTIR measurements of the C-H stretching band of vapor and liquid n-decane near 3.4 μm (3000 cm -1), and from modeled light scattering from droplets. Measurements were made for vapor mole fractions below 2.3 percent with errors less than 10 percent, and simultaneous temperature measurements over the range 300 K< T<900 K were made with errors less than 3 percent. The measurement technique is designed to provide accurate values of temperature and vapor mole fraction in evaporating polydispersed aerosols with small mean diameters ( D 50<10 μ m), where near-infrared laser-based scattering corrections are prone to error.
Fairfield, Beth; Mammarella, Nicola; Di Domenico, Alberto; D'Aurora, Marco; Stuppia, Liborio; Gatta, Valentina
2017-08-30
False memories are common memory distortions in everyday life and seem to increase with affectively connoted complex information. In line with recent studies showing a significant interaction between the noradrenergic system and emotional memory, we investigated whether healthy volunteer carriers of the deletion variant of the ADRA2B gene that codes for the α2b-adrenergic receptor are more prone to false memories than non-carriers. In this study, we collected genotype data from 212 healthy female volunteers; 91 ADRA2B carriers and 121 non-carriers. To assess gene effects on false memories for affective information, factorial mixed model analysis of variances (ANOVAs) were conducted with genotype as the between-subjects factor and type of memory error as the within-subjects factor. We found that although carriers and non-carriers made comparable numbers of false memory errors, they showed differences in the direction of valence biases, especially for inferential causal errors. Specifically, carriers produced fewer causal false memory errors for scripts with a negative outcome, whereas non-carriers showed a more general emotional effect and made fewer causal errors with both positive and negative outcomes. These findings suggest that putatively higher levels of noradrenaline in deletion carriers may enhance short-term consolidation of negative information and lead to fewer memory distortions when facing negative events. Copyright © 2017 Elsevier B.V. All rights reserved.
Survival analysis with error-prone time-varying covariates: a risk set calibration approach
Liao, Xiaomei; Zucker, David M.; Li, Yi; Spiegelman, Donna
2010-01-01
Occupational, environmental, and nutritional epidemiologists are often interested in estimating the prospective effect of time-varying exposure variables such as cumulative exposure or cumulative updated average exposure, in relation to chronic disease endpoints such as cancer incidence and mortality. From exposure validation studies, it is apparent that many of the variables of interest are measured with moderate to substantial error. Although the ordinary regression calibration approach is approximately valid and efficient for measurement error correction of relative risk estimates from the Cox model with time-independent point exposures when the disease is rare, it is not adaptable for use with time-varying exposures. By re-calibrating the measurement error model within each risk set, a risk set regression calibration (RRC) method is proposed for this setting. An algorithm for a bias-corrected point estimate of the relative risk using the RRC approach is presented, followed by the derivation of an estimate of its variance, resulting in a sandwich estimator. Emphasis is on methods applicable to the main study/external validation study design, which arises in important applications. Simulation studies under several assumptions about the error model were carried out, which demonstrated the validity and efficiency of the method in finite samples. The method was applied to a study of diet and cancer from Harvard's Health Professionals Follow-up Study (HPFS). PMID:20486928
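In symbols (our notation, not the paper's), the idea is:

```latex
% Cox model with a time-varying exposure X(t) observed only through an
% error-prone surrogate W(t). RRC replaces X(t) by a calibrated value
% re-estimated within each risk set; the paper's variance derivation
% (sandwich estimator) is not reproduced here.
\lambda\{t \mid X(t)\} = \lambda_0(t)\,\exp\{\beta\,X(t)\},
\qquad
X(t) \;\longrightarrow\; \hat{X}(t) = \hat{E}\{X(t) \mid W(t),\; T \ge t\}
```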
Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.
Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M
2014-01-01
Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors. PMID:25493918
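The core of such an automated consistency check is small. A minimal sketch in the spirit of this procedure follows; the rounding tolerance and the two-sided-test assumption are ours, not the exact criteria used in the study:

```python
# Minimal sketch of an automated p-value consistency check: recompute p from
# the reported test statistic and degrees of freedom, then flag mismatches,
# including "gross" ones that cross the alpha = .05 decision boundary.
from scipy import stats

def check(test, stat, df, reported_p, tol=0.005, alpha=0.05):
    if test == "t":
        recomputed = 2 * stats.t.sf(abs(stat), df)     # two-sided t test
    elif test == "F":
        recomputed = stats.f.sf(stat, df[0], df[1])    # df = (df_num, df_den)
    else:
        raise ValueError(f"unsupported test: {test}")
    inconsistent = abs(recomputed - reported_p) > tol
    gross = inconsistent and ((recomputed < alpha) != (reported_p < alpha))
    return recomputed, inconsistent, gross

# e.g. a reported result "t(28) = 2.20, p = .04"
p, bad, gross = check("t", 2.20, 28, 0.04)
print(f"recomputed p = {p:.3f}; inconsistent: {bad}; decision error: {gross}")
```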
Schipler, Agnes; Iliakis, George
2013-01-01
Although the DNA double-strand break (DSB) is defined as a rupture in the double-stranded DNA molecule that can occur without chemical modification in any of the constituent building blocks, it is recognized that this form is restricted to enzyme-induced DSBs. DSBs generated by physical or chemical agents can include at the break site a spectrum of base alterations (lesions). The nature and number of such chemical alterations define the complexity of the DSB and are considered putative determinants for repair pathway choice and the probability that errors will occur during this processing. As the pathways engaged in DSB processing show distinct and frequently inherent propensities for errors, pathway choice also defines the error levels cells opt to accept. Here, we present a classification of DSBs on the basis of increasing complexity and discuss how complexity may affect processing, as well as how it may cause lethal or carcinogenic processing errors. By critically analyzing the characteristics of DSB repair pathways, we suggest that all repair pathways can in principle remove lesions clustering at the DSB but are likely to fail when they encounter clusters of DSBs that cause a local form of chromothripsis. In the same framework, we also analyze the rationale behind DSB repair pathway choice. PMID:23804754
Assessing primary care data quality.
Lim, Yvonne Mei Fong; Yusof, Maryati; Sivasampu, Sheamini
2018-04-16
Purpose The purpose of this paper is to assess National Medical Care Survey data quality. Design/methodology/approach Data completeness and representativeness were computed for all observations while other data quality measures were assessed using a 10 per cent sample from the National Medical Care Survey database; i.e., 12,569 primary care records from 189 public and private practices were included in the analysis. Findings Data field completion ranged from 69 to 100 per cent. Error rates for data transfer from paper to web-based application varied between 0.5 and 6.1 per cent. Error rates arising from diagnosis and clinical process coding were higher than medication coding. Data fields that involved free text entry were more prone to errors than those involving selection from menus. The authors found that completeness, accuracy, coding reliability and representativeness were generally good, while data timeliness needs to be improved. Research limitations/implications Only data entered into a web-based application were examined. Data omissions and errors in the original questionnaires were not covered. Practical implications Results from this study provided informative and practicable approaches to improve primary health care data completeness and accuracy especially in developing nations where resources are limited. Originality/value Primary care data quality studies in developing nations are limited. Understanding errors and missing data enables researchers and health service administrators to prevent quality-related problems in primary care data.
Genotype identification of Math1/LacZ knockout mice based on real-time PCR with SYBR Green I dye.
Krizhanovsky, Valery; Golenser, Esther; Ben-Arie, Nissim
2004-07-30
Knockout mice are widely used in all fields of biomedical research. Determining the genotype of every newborn mouse is a tedious task, usually performed by Southern blot hybridization or Polymerase Chain Reaction (PCR). We describe here a quick and simple genotype identification assay based on real-time PCR and SYBR Green I dye, without using fluorescent primers. The discrimination between the wild type and targeted alleles is based on a PCR design that leads to a different melting temperature for each product. The identification of the genotype is obvious immediately after amplification, and no post-PCR manipulations are needed, reducing cost and time. Therefore, while real-time PCR amplification increases sensitivity, the fact that the reaction tubes are never opened after amplification reduces the risk of contamination and eliminates errors that are common during repeated handling of dozens of samples from the same mouse line. The protocol we provide was tested on Math1 knockout mice, but is general, and may be utilized for any knockout line and real-time thermocycler, without any further modification, accessories or special reagents. Copyright 2004 Elsevier B.V.
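The genotype call itself reduces to reading melt-peak positions. A minimal sketch follows, with hypothetical melting temperatures standing in for the published assay's values:

```python
# Minimal sketch of the genotype call: each allele's amplicon is designed to
# melt at a distinct temperature, so the melt-peak pattern identifies the
# genotype with no post-PCR handling. The Tm values below are hypothetical
# placeholders, not the published assay's values.
WT_TM, KO_TM, TOL = 84.0, 78.0, 1.0   # hypothetical melting temperatures (deg C)

def genotype(melt_peaks):
    wt = any(abs(t - WT_TM) <= TOL for t in melt_peaks)
    ko = any(abs(t - KO_TM) <= TOL for t in melt_peaks)
    if wt and ko:
        return "heterozygote (+/-)"
    if wt:
        return "wild type (+/+)"
    if ko:
        return "knockout (-/-)"
    return "no call - check amplification"

print(genotype([84.2]))          # wild type
print(genotype([78.3, 83.9]))    # heterozygote
```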
Leshem, Rotem
2016-02-01
This study examined the relationship between trait impulsivity and cognitive control, as measured by the Barratt Impulsiveness Scale (BIS) and a focused attention dichotic listening to words task, respectively. In the task, attention was manipulated in two attention conditions differing in their cognitive control demands: one in which attention was directed to one ear at a time for a whole block of trials (blocked condition) and another in which attention was switched pseudo-randomly between the two ears from trial to trial (mixed condition). Results showed that high impulsivity participants exhibited more false alarm and intrusion errors as well as a lesser ability to distinguish between stimuli in the mixed condition, as compared to low impulsivity participants. In the blocked condition, the performance levels of the two groups were comparable with respect to these measures. In addition, total BIS scores were correlated with intrusions and laterality index in the mixed but not the blocked condition. The findings suggest that high impulsivity individuals may be less prone to attentional difficulties when cognitive load is relatively low. In contrast, when attention switching is involved, high impulsivity is associated with greater difficulty in inhibiting responses and resolving cognitive conflict than is low impulsivity, as reflected in error-prone information processing. The conclusion is that trait impulsivity in a non-clinical population is manifested more strongly when attention switching is required than during maintained attention. This may have important implications for the conceptualization and treatment of impulsivity in both non-clinical and clinical populations.
Automated algorithm for mapping regions of cold-air pooling in complex terrain
NASA Astrophysics Data System (ADS)
Lundquist, Jessica D.; Pepin, Nicholas; Rochford, Caitlin
2008-11-01
In complex terrain, air in contact with the ground becomes cooled from radiative energy loss on a calm clear night and, being denser than the free atmosphere at the same elevation, sinks to valley bottoms. Cold-air pooling (CAP) occurs where this cooled air collects on the landscape. This article focuses on identifying locations on a landscape subject to considerably lower minimum temperatures than the regional average during conditions of clear skies and weak synoptic-scale winds, providing a simple automated method to map locations where cold air is likely to pool. Digital elevation models of regions of complex terrain were used to derive surfaces of local slope, curvature, and percentile elevation relative to surrounding terrain. Each pixel was classified as prone to CAP, not prone to CAP, or exhibiting no signal, based on the criterion that CAP occurs in regions with flat slopes in local depressions or valleys (negative curvature and low percentile). Along-valley changes in the topographic amplification factor (TAF) were then calculated to determine whether the cold air in the valley was likely to drain or pool. Results were checked against distributed temperature measurements in Loch Vale, Rocky Mountain National Park, Colorado; in the Eastern Pyrenees, France; and in Yosemite National Park, Sierra Nevada, California. Using CAP classification to interpolate temperatures across complex terrain resulted in improvements in root-mean-square errors compared to more basic interpolation techniques at most sites within the three areas examined, with average error reductions of up to 3°C at individual sites and about 1°C averaged over all sites in the study areas.
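A minimal sketch of the per-pixel classification step follows, using a synthetic DEM; the window size and thresholds are illustrative, and the along-valley TAF drainage test is omitted:

```python
# Minimal sketch of the automated CAP-mapping rule: from a DEM, compute slope,
# curvature, and each cell's elevation percentile within a neighborhood, then
# flag cells that are flat, concave, and locally low. Window size and
# thresholds are illustrative assumptions; the TAF drainage step is omitted.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
x, y = np.meshgrid(np.linspace(0, 4, 100), np.linspace(0, 4, 100))
dem = 100 * np.sin(x) * np.sin(y) + rng.normal(0, 1, x.shape)  # synthetic terrain (m)

cell = 30.0                                     # grid spacing (m)
gy, gx = np.gradient(dem, cell)
slope = np.degrees(np.arctan(np.hypot(gx, gy)))
curv = -ndimage.laplace(dem) / cell**2          # negative in hollows (sign convention)

def local_percentile(d, size=15):
    # Fraction of neighborhood cells lying below the center cell.
    return ndimage.generic_filter(d, lambda w: (w < w[w.size // 2]).mean(), size=size)

pct = local_percentile(dem)
cap_prone = (slope < 5) & (curv < 0) & (pct < 0.3)
print(f"{cap_prone.mean():.0%} of cells classified as CAP-prone")
```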
Maddukuri, Leena; Ketkar, Amit; Eddy, Sarah; Zafar, Maroof K.; Eoff, Robert L.
2014-01-01
Human DNA polymerase kappa (hpol κ) is the only Y-family member to preferentially insert dAMP opposite 7,8-dihydro-8-oxo-2′-deoxyguanosine (8-oxo-dG) during translesion DNA synthesis. We have studied the mechanism of action by which hpol κ activity is modulated by the Werner syndrome protein (WRN), a RecQ helicase known to influence repair of 8-oxo-dG. Here we show that WRN stimulates the 8-oxo-dG bypass activity of hpol κ in vitro by enhancing the correct base insertion opposite the lesion, as well as extension from dC:8-oxo-dG base pairs. Steady-state kinetic analysis reveals that WRN improves hpol κ-catalyzed dCMP insertion opposite 8-oxo-dG ∼10-fold and extension from dC:8-oxo-dG by 2.4-fold. Stimulation is primarily due to an increase in the rate constant for polymerization (kpol), as assessed by pre-steady-state kinetics, and it requires the RecQ C-terminal (RQC) domain. In support of the functional data, recombinant WRN and hpol κ were found to physically interact through the exo and RQC domains of WRN, and co-localization of WRN and hpol κ was observed in human cells treated with hydrogen peroxide. Thus, WRN limits the error-prone bypass of 8-oxo-dG by hpol κ, which could influence the sensitivity to oxidative damage that has previously been observed for Werner's syndrome cells. PMID:25294835
Detection of Methicillin-Resistant Coagulase-Negative Staphylococci by the Vitek 2 System
Johnson, Kristen N.; Andreacchio, Kathleen
2014-01-01
The accurate performance of the Vitek 2 GP67 card for detecting methicillin-resistant coagulase-negative staphylococci (CoNS) is not known. We prospectively determined the ability of the Vitek 2 GP67 card to accurately detect methicillin-resistant CoNS, with mecA PCR results used as the gold standard for a 4-month period in 2012. Included in the study were 240 consecutively collected nonduplicate CoNS isolates. Cefoxitin susceptibility by disk diffusion testing was determined for all isolates. We found that the three tested systems, Vitek 2 oxacillin and cefoxitin testing and cefoxitin disk susceptibility testing, lacked specificity and, in some cases, sensitivity for detecting methicillin resistance. The Vitek 2 oxacillin and cefoxitin tests had very major error rates of 4% and 8%, respectively, and major error rates of 38% and 26%, respectively. Disk cefoxitin testing gave the best performance, with very major and major error rates of 2% and 24%, respectively. The test performances were species dependent, with the greatest errors found for Staphylococcus saprophyticus. While the 2014 CLSI guidelines recommend reporting isolates that test resistant by the oxacillin MIC or cefoxitin disk test as oxacillin resistant, following such guidelines produces erroneous results, depending on the test method and bacterial species tested. Vitek 2 cefoxitin testing is not an adequate substitute for cefoxitin disk testing. For critical-source isolates, mecA PCR, rather than Vitek 2 or cefoxitin disk testing, is required for optimal antimicrobial therapy. PMID:24951799
Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.
Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly
2015-01-01
The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per single blotting cycle up to tenfold; allows concurrent measurement of up to nine different total and/or posttranslationally modified proteins from the same sample loading; and substantially improves data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and is therefore advantageous in biomedical diagnostics, systems biology, and cell signaling research.
(Quickly) Testing the Tester via Path Coverage
NASA Technical Reports Server (NTRS)
Groce, Alex
2009-01-01
The configuration complexity and code size of an automated testing framework may grow to the point that the tester itself becomes a significant software artifact, prone to poor configuration and implementation errors. Unfortunately, testing the tester by using old versions of the software under test (SUT) may be impractical or impossible: test framework changes may have been motivated by interface changes in the tested system, or fault detection may become too expensive in terms of computing time to justify running until errors are detected on older versions of the software. We propose the use of path coverage measures as a "quick and dirty" method for detecting many faults in complex test frameworks. We also note the possibility of using techniques developed to diversify state-space searches in model checking to diversify test focus, and an associated classification of tester changes into focus-changing and non-focus-changing modifications.
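The core idea, treating the multiset of executed paths as a fingerprint of tester behavior and comparing it across framework versions, can be sketched with Python's tracing hook; the path granularity (sequences of executed line numbers) is one choice among several:

```python
import sys
from collections import Counter

def executed_path(func, *args):
    """Run func under a line tracer and return the executed path as a
    hashable tuple of (code object name, line number) events."""
    events = []
    def tracer(frame, event, arg):
        if event == "line":
            events.append((frame.f_code.co_name, frame.f_lineno))
        return tracer
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return tuple(events)

def path_profile(func, inputs):
    """Distinct executed paths over a test suite; an unexpected change
    in this profile between tester versions flags a suspect change."""
    return Counter(executed_path(func, x) for x in inputs)
```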
Robust pupil center detection using a curvature algorithm
NASA Technical Reports Server (NTRS)
Zhu, D.; Moore, S. T.; Raphan, T.; Wall, C. C. (Principal Investigator)
1999-01-01
Determining the pupil center is fundamental for calculating eye orientation in video-based systems. Existing techniques are error prone and not robust because eyelids, eyelashes, corneal reflections or shadows in many instances occlude the pupil. We have developed a new algorithm which utilizes curvature characteristics of the pupil boundary to eliminate these artifacts. Pupil center is computed based solely on points related to the pupil boundary. For each boundary point, a curvature value is computed. Occlusion of the boundary induces characteristic peaks in the curvature function. Curvature values for normal pupil sizes were determined and a threshold was found which together with heuristics discriminated normal from abnormal curvature. Remaining boundary points were fit with an ellipse using a least squares error criterion. The center of the ellipse is an estimate of the pupil center. This technique is robust and accurately estimates pupil center with less than 40% of the pupil boundary points visible.
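A minimal sketch of the two steps described, discrete curvature along the ordered boundary followed by a least-squares conic fit to the unoccluded points, using NumPy; the curvature threshold is a hypothetical placeholder for the empirically determined one:

```python
import numpy as np

def curvature(x, y):
    """Discrete curvature of an ordered boundary:
    k = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / np.power(dx**2 + dy**2, 1.5)

def pupil_center(x, y, k_thresh=0.1):
    """Discard boundary points with abnormal curvature (occlusion by
    lids, lashes or reflections shows up as curvature peaks), then fit
    the conic A x^2 + B xy + C y^2 + D x + E y = 1 by least squares and
    return its center."""
    keep = np.abs(curvature(x, y)) < k_thresh
    x, y = x[keep], y[keep]
    M = np.column_stack([x**2, x * y, y**2, x, y])
    A, B, C, D, E = np.linalg.lstsq(M, np.ones_like(x), rcond=None)[0]
    # The conic's center is where its gradient vanishes.
    return np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])
```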
Spacecraft command verification: The AI solution
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.
1990-01-01
Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
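The rule-based idea can be illustrated with a toy constraint checker: each timing rule is a declarative function over a timestamped command sequence. The command names and the 30-second constraint below are hypothetical, not from CCC itself:

```python
from dataclasses import dataclass

@dataclass
class Command:
    name: str
    t: float  # seconds from start of sequence

def min_separation(first, second, gap):
    """Rule: each 'second' command must follow the most recent 'first'
    command by at least 'gap' seconds; returns the violating commands."""
    def check(seq):
        last, violations = None, []
        for cmd in seq:
            if cmd.name == first:
                last = cmd.t
            elif cmd.name == second and last is not None and cmd.t - last < gap:
                violations.append(cmd)
        return violations
    return check

# hypothetical constraint: 30 s heater warm-up before a thruster burn
rule = min_separation("HEATER_ON", "THRUSTER_BURN", gap=30.0)
print(rule([Command("HEATER_ON", 0.0), Command("THRUSTER_BURN", 12.5)]))
```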
Park, Hame; Lueckmann, Jan-Matthis; von Kriegstein, Katharina; Bitzer, Sebastian; Kiebel, Stefan J.
2016-01-01
Decisions in everyday life are prone to error. Standard models typically assume that errors during perceptual decisions are due to noise. However, it is unclear how noise in the sensory input affects the decision. Here we show that there are experimental tasks for which one can analyse the exact spatio-temporal details of a dynamic sensory noise and better understand variability in human perceptual decisions. Using a new experimental visual tracking task and a novel Bayesian decision making model, we found that the spatio-temporal noise fluctuations in the input of single trials explain a significant part of the observed responses. Our results show that modelling the precise internal representations of human participants helps predict when perceptual decisions go wrong. Furthermore, by modelling precisely the stimuli at the single-trial level, we were able to identify the underlying mechanism of perceptual decision making in more detail than standard models. PMID:26752272
Virginio, Luiz A; Ricarte, Ivan Luiz Marques
2015-01-01
Although Electronic Health Records (EHR) can offer benefits to the health care process, there is a growing body of evidence that these systems can also incur risks to patient safety when developed or used improperly. This work is a literature review to identify these risks from a software quality perspective. Therefore, the risks were classified based on the ISO/IEC 25010 software quality model. The risks identified were related mainly to the characteristics of "functional suitability" (i.e., software bugs) and "usability" (i.e., interface prone to user error). This work elucidates the fact that EHR quality problems can adversely affect patient safety, resulting in errors such as incorrect patient identification, incorrect calculation of medication dosages, and lack of access to patient data. Therefore, the risks presented here provide the basis for developers and EHR regulating bodies to pay attention to the quality aspects of these systems that can result in patient harm.
NASA Astrophysics Data System (ADS)
Partsinevelos, Panagiotis; Kallimani, Christina; Tripolitsiotis, Achilleas
2015-06-01
Rockfall incidents affect civil security and hamper the sustainable growth of hard-to-access mountainous areas due to casualties, injuries and infrastructure loss. Rockfall occurrences cannot be easily prevented, whereas previous studies of multiple-sensor rockfall early detection systems have focused on large-scale incidents. However, even a single rock may cause the loss of a human life along transportation routes; thus, it is highly important to establish methods for the early detection of small-scale rockfall incidents. Terrestrial photogrammetric techniques are prone to a series of errors leading to false alarm incidents, including vegetation, wind, and irrelevant changes in the scene under consideration. In this study, photogrammetric monitoring of rockfall-prone slopes is established and the resulting multi-temporal change imagery is processed in order to minimize false alarm incidents. Integration of remote sensing imagery analysis techniques is hereby applied to enhance early detection of a rockfall. Experimental data demonstrated that an operational system able to identify a 10-cm rock movement within a 10% false alarm rate is technically feasible.
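One common way to suppress such false alarms is to difference co-registered multi-temporal images and discard small connected change regions (vegetation flutter, sensor noise). The sketch below, with illustrative thresholds, is a generic stand-in for the processing chain described, not the authors' exact pipeline:

```python
import numpy as np
from scipy import ndimage

def rockfall_candidates(img_t0, img_t1, diff_thresh=30.0, min_pixels=25):
    """Flag connected regions of strong grey-value change between two
    co-registered images of the monitored slope; regions smaller than
    min_pixels are dropped as likely false alarms."""
    change = np.abs(img_t1.astype(float) - img_t0.astype(float)) > diff_thresh
    labels, n = ndimage.label(change)
    sizes = np.asarray(ndimage.sum(change, labels, index=range(1, n + 1)))
    keep = 1 + np.flatnonzero(sizes >= min_pixels)  # surviving label ids
    return np.isin(labels, keep)
```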
Smart Collision Avoidance and Hazard Routing Mechanism for Intelligent Transport Network
NASA Astrophysics Data System (ADS)
Singh, Gurpreet; Gupta, Pooja; Wahab, Mohd Helmy Abd
2017-08-01
The smart vehicular ad-hoc network is a network of vehicles designed for smooth movement and better management of vehicular connectivity across a given area. This research paper proposes a set of solutions for VANETs consisting of automatically driven vehicles, also called autonomous cars. Such vehicular networks are always prone to collisions due to natural or un-natural causes, which must be addressed before the large-scale deployment of autonomous transport systems. The newly designed intelligent transport movement control mechanism is based upon intelligent data propagation along with a vehicle collision and traffic jam prevention schema [8], which may help future designs of smart cities become more robust and less error-prone. In the proposed model, the focus is on designing a new dynamic and robust hazard routing protocol for intelligent vehicular networks to improve overall performance in various aspects. It is expected to reduce the overall transmission delay as well as the number of collisions or adversarial events across the vehicular network zone.
Sequencing artifacts in the type A influenza databases and attempts to correct them.
Suarez, David L; Chester, Nikki; Hatfield, Jason
2014-07-01
There are over 276 000 influenza gene sequences in public databases, with the quality of the sequences determined by the contributor. As part of a high school class project, influenza sequences with possible errors were identified in the public databases based on the size of the gene being longer than expected, with the hypothesis that these sequences would have an error. Students contacted sequence submitters, alerting them to the possible sequence issue(s) and requesting that the suspect sequence(s) be corrected as appropriate. Type A influenza viruses were screened, and gene segments longer than the accepted size were identified for further analysis. Attention was placed on sequences with additional nucleotides upstream or downstream of the highly conserved non-coding ends of the viral segments. A total of 1081 sequences were identified that met this criterion. Three types of errors were commonly observed: non-influenza primer sequence was not removed from the sequence; the PCR product was cloned and plasmid sequence was included in the sequence; and Taq polymerase added an adenine at the end of the PCR product. Internal insertions of nucleotide sequence were also commonly observed, but in many cases it was unclear whether the sequence was correct or actually contained an error. A total of 215 sequences, or 22.8% of the suspect sequences, were corrected in the public databases in the first year of the student project. Unfortunately, 138 additional sequences with possible errors were added to the databases in the second year. Additional awareness of the need for data integrity of sequences submitted to public databases is needed to fully reap the benefits of these large data sets. © 2014 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.
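The screening step reduces to a simple length filter over deposited records. In the sketch below the expected-length table is abbreviated and its values are illustrative, not authoritative segment lengths:

```python
# Illustrative maximum segment lengths (nt); a real screen would use
# the accepted sizes for each influenza A segment and subtype.
EXPECTED_MAX = {"M": 1027, "NS": 890, "NP": 1565}

def flag_oversized(records, segment):
    """Yield (accession, length) for deposited sequences longer than
    the accepted size for the given segment -- candidates for retained
    primer, vector sequence, or a Taq-added adenine."""
    limit = EXPECTED_MAX[segment]
    for accession, seq in records:
        if len(seq) > limit:
            yield accession, len(seq)

suspects = list(flag_oversized([("X1", "A" * 1040), ("X2", "A" * 1027)], "M"))
print(suspects)  # [('X1', 1040)]
```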
Cognitive fallacies and criminal investigations.
Ditrich, Hans
2015-03-01
The human mind is susceptible to inherent fallacies that often hamper fully rational action. Many such misconceptions have an evolutionary background and are thus difficult to avert. Deficits in the reliability of eye-witnesses are well known to legal professionals; however, less attention has been paid to such effects in crime investigators. In order to obtain an "inside view" on the role of cognitive misconceptions in criminalistic work, a list of fallacies from the literature was adapted to criminalistic settings. The statements on this list were rated by highly experienced crime scene investigators according to the assumed likelihood that these errors appear and the severity of their effects. Among others, selective perception, expectation and confirmation bias, anchoring/"pars pro toto" errors and "onus probandi"--shifting the burden of proof from the investigator to the suspect--were frequently considered to negatively affect criminal investigations. As a consequence, the following measures are proposed: alerting investigating officers in their training to cognitive fallacies and promoting the exchange of experiences in peer circles of investigators on a regular basis. Furthermore, the improvement of the organizational error culture and the establishment of a failure analysis system to identify and alleviate error-prone processes are suggested. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
Towards an evaluation framework for Laboratory Information Systems.
Yusof, Maryati M; Arifin, Azila
Laboratory testing and reporting are error-prone and redundant due to repeated, unnecessary requests and delayed or missed reactions to laboratory reports. The errors that occur may negatively affect the patient treatment process and clinical decision making. Evaluation of laboratory testing and the Laboratory Information System (LIS) may reveal the root causes, improve the testing process, and enhance the LIS in supporting that process. This paper discusses a new evaluation framework for LIS that encompasses the laboratory testing cycle and the socio-technical part of LIS. A literature review of the discourses, dimensions and evaluation methods of laboratory testing and LIS was conducted. A critical appraisal of the Total Testing Process (TTP) and the human, organization, technology-fit (HOT-fit) evaluation frameworks was undertaken in order to identify error incidents, their contributing factors and preventive actions pertinent to the laboratory testing process and LIS. A new evaluation framework for LIS using a comprehensive and socio-technical approach is outlined. A positive relationship between laboratory and clinical staff resulted in a smooth laboratory testing process, reduced errors and increased process efficiency, whilst effective use of the LIS streamlined the testing processes. The TTP-LIS framework could serve as an assessment as well as a problem-solving tool for the laboratory testing process and system. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.
Giese, Sven H; Zickmann, Franziska; Renard, Bernhard Y
2014-01-01
Accurate estimation, comparison and evaluation of read mapping error rates is a crucial step in the processing of next-generation sequencing data, as further analysis steps and interpretation assume the correctness of the mapping results. Current approaches are either focused on sensitivity estimation and thereby disregard specificity or are based on read simulations. Although continuously improving, read simulations are still prone to introduce a bias into the mapping error quantitation and cannot capture all characteristics of an individual dataset. We introduce ARDEN (artificial reference driven estimation of false positives in next-generation sequencing data), a novel benchmark method that estimates error rates of read mappers based on real experimental reads, using an additionally generated artificial reference genome. It allows a dataset-specific computation of error rates and the construction of a receiver operating characteristic curve. Thereby, it can be used for optimization of parameters for read mappers, selection of read mappers for a specific problem or for filtering alignments based on quality estimation. The use of ARDEN is demonstrated in a general read mapper comparison, a parameter optimization for one read mapper and an application example in single-nucleotide polymorphism discovery with a significant reduction in the number of false positive identifications. The ARDEN source code is freely available at http://sourceforge.net/projects/arden/.
[Prospective assessment of medication errors in critically ill patients in a university hospital].
Salazar L, Nicole; Jirón A, Marcela; Escobar O, Leslie; Tobar, Eduardo; Romero, Carlos
2011-11-01
Critically ill patients are especially vulnerable to medication errors (ME) due to their severe clinical situation and the complexities of their management. To determine the frequency and characteristics of ME and identify shortcomings in the processes of medication management in an Intensive Care Unit. During a 3-month period, an observational, prospective and randomized study was carried out in the ICU of a university hospital. Every step of the patients' medication management (prescription, transcription, dispensation, preparation and administration) was evaluated by an external trained professional. Steps with a higher frequency of ME and the therapeutic groups involved were identified. Medication errors were classified according to the National Coordinating Council for Medication Error Reporting and Prevention. In 52 of 124 patients evaluated, 66 ME were found among 194 drugs prescribed. In 34% of prescribed drugs, there was at least 1 ME during their use. Half of the ME occurred during medication administration, mainly due to problems in infusion rates and schedule times. Antibacterial drugs had the highest rate of ME. We found a 34% rate of ME per drug prescribed, which is in concordance with international reports. The identification of the steps more prone to ME in the ICU will allow the implementation of an intervention program to improve the quality and safety of medication management.
Björkstén, Karin Sparring; Bergqvist, Monica; Andersén-Karlsson, Eva; Benson, Lina; Ulfvarson, Johanna
2016-08-24
Many studies address the prevalence of medication errors, but few address medication errors serious enough to be regarded as malpractice. Other studies have analyzed the individual and system contributory factors leading to a medication error. Nurses have a key role in medication administration, and there are contradictory reports on nurses' work experience in relation to the risk and type of medication errors. All medication errors where a nurse was held responsible for malpractice (n = 585) during 11 years in Sweden were included. A qualitative content analysis and classification according to the type and the individual and system contributory factors was made. In order to test for possible differences between nurses' work experience and associations within and between the errors and contributory factors, Fisher's exact test was used, and Cohen's kappa (k) was performed to estimate the magnitude and direction of the associations. There were a total of 613 medication errors in the 585 cases, the most common being "Wrong dose" (41 %), "Wrong patient" (13 %) and "Omission of drug" (12 %). In 95 % of the cases, an average of 1.4 individual contributory factors was found; the most common being "Negligence, forgetfulness or lack of attentiveness" (68 %), "Proper protocol not followed" (25 %), "Lack of knowledge" (13 %) and "Practice beyond scope" (12 %). In 78 % of the cases, an average of 1.7 system contributory factors was found; the most common being "Role overload" (36 %), "Unclear communication or orders" (30 %) and "Lack of adequate access to guidelines or unclear organisational routines" (30 %). The errors "Wrong patient due to mix-up of patients" and "Wrong route" and the contributory factors "Lack of knowledge" and "Negligence, forgetfulness or lack of attentiveness" were more common among less experienced nurses. The experienced nurses were more prone to "Practice beyond scope of practice" and to make errors in spite of "Lack of adequate access to guidelines or unclear organisational routines". Medication errors regarded as malpractice in Sweden were of the same character as medication errors worldwide. A complex interplay between individual and system factors often contributed to the errors.
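Both tests named above are standard; a minimal sketch with entirely hypothetical counts shows how they slot together (SciPy for Fisher's exact test, scikit-learn for Cohen's kappa):

```python
from scipy.stats import fisher_exact
from sklearn.metrics import cohen_kappa_score

# hypothetical 2x2 table: one error type present/absent vs. experience
#                      less experienced  more experienced
table = [[30, 12],   # error present
         [270, 273]] # error absent
odds_ratio, p = fisher_exact(table)

# hypothetical per-case binary codings of one error and one factor
wrong_dose    = [1, 0, 1, 1, 0, 0, 1, 0]
role_overload = [1, 0, 1, 0, 0, 0, 1, 0]
kappa = cohen_kappa_score(wrong_dose, role_overload)
print(f"Fisher p = {p:.3f}, kappa = {kappa:.2f}")
```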
Enumeration of verocytotoxigenic Escherichia coli (VTEC) O157 and O26 in milk by quantitative PCR.
Mancusi, Rocco; Trevisani, Marcello
2014-08-01
Quantitative real-time polymerase chain reaction (qPCR) can be a convenient alternative to the Most Probable Number (MPN) methods to count VTEC in milk. The number of VTEC is normally very low in milk; therefore with the aim of increasing the method sensitivity a qPCR protocol that relies on preliminary enrichment was developed. The growth pattern of six VTEC strains (serogroups O157 and O26) was studied using enrichment in Buffered Peptone Water (BPW) with or without acriflavine for 4-24h. Milk samples were inoculated with these strains over a five Log concentration range between 0.24-0.50 and 4.24-4.50 Log CFU/ml. DNA was extracted from the enriched samples in duplicate and each extract was analysed in duplicate by qPCR using pairs of primers specific for the serogroups O157 and O26. When samples were pre-enriched in BPW at 37°C for 8h, the relationship between threshold cycles (CT values) and VTEC Log numbers was linear over a five Log concentration range. The regression of PCR threshold cycle numbers on VTEC Log CFU/ml had a slope coefficient equal to -3.10 (R(2)=0.96) which is indicative of a 10-fold difference of the gene copy numbers between samples (with a 100 ± 10% PCR efficiency). The same 10-fold proportion used for inoculating the milk samples with VTEC was observed, therefore, also in the enriched samples at 8h. A comparison of the CT values of milk samples and controls revealed that the strains inoculated in milk grew with 3 Log increments in the 8h enrichment period. Regression lines that fitted the qPCR and MPN data revealed that the error of the qPCR estimates is lower than the error of the estimated MPN (r=0.982, R(2)=0.965 vs. r=0.967, R(2)=0.935). The growth rates of VTEC strains isolated from milk should be comparatively assessed before qPCR estimates based on the regression model are considered valid. Comparative assessment of the growth rates can be done using spectrophotometric measurements of standardized cultures of isolates and reference strains cultured in BPW at 37°C for 8h. The method developed for the serogroups O157 and O26 can be easily adapted to the other VTEC serogroups that are relevant for human health. The qPCR method is less laborious and faster than the standard MPN method and has been shown to be a good technique for quantifying VTEC in milk. Copyright © 2014 Elsevier B.V. All rights reserved.
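The reported slope connects to amplification efficiency through the standard dilution-series relation E = 10^(-1/slope) - 1; a slope of -3.10 implies roughly 110% efficiency, consistent with the stated 100 ± 10%:

```python
def pcr_efficiency(slope):
    """Amplification efficiency from a standard-curve slope (CT
    regressed on log10 template amount): E = 10**(-1/slope) - 1;
    a slope of -3.32 corresponds to 100% (perfect doubling)."""
    return 10 ** (-1.0 / slope) - 1

print(f"{pcr_efficiency(-3.10):.0%}")  # ~110%
```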
Variability of ischiofemoral space dimensions with changes in hip flexion: an MRI study.
Johnson, Adam C; Hollman, John H; Howe, Benjamin M; Finnoff, Jonathan T
2017-01-01
The primary aim of this study was to determine if ischiofemoral space (IFS) dimensions vary with changes in hip flexion as a result of placing a bolster behind the knees during magnetic resonance imaging (MRI). A secondary aim was to determine if IFS dimensions vary between supine and prone hip neutral positions. The study employed a prospective design. Sports medicine center within a tertiary care institution. Five male and five female adult subjects (age mean = 29.2, range = 23-35; body mass index [BMI] mean = 23.5, range = 19.5-26.6) were recruited to participate in the study. An axial, T1-weighted MRI sequence of the pelvis was obtained of each subject in a supine position with their hips in neutral and flexed positions, and in a prone position with their hips in neutral position. Supine hip flexion was induced by placing a standard, 9-cm-diameter MRI knee bolster under the subject's knees. The order of image acquisition (supine hip neutral, supine hip flexed, prone hip neutral) was randomized. The IFS dimensions were then measured on a separate workstation. The investigator performing the IFS measurements was blinded to the subject position for each image. The main outcome measurements were the IFS dimensions acquired with MRI. The mean IFS dimensions in the prone position were 28.25 mm (SD 5.91 mm, standard error of the mean 1.32 mm). In the supine hip neutral position, the IFS dimensions were 25.1 (SD 5.6) mm. The mean difference between the two positions of 3.15 (3.6) mm was statistically significant (95% CI of the difference = 1.4 to 4.8 mm, t(19) = 3.911, p = 0.001). The mean IFS dimensions in the hip flexed position were 36.9 (SD 5.7) mm. The mean difference between the two supine positions of 11.8 (4.1) mm was statistically significant (95% CI of the difference = 9.9 to 13.7 mm, t(19) = 12.716, p < 0.001). Our findings demonstrate that the IFS measurements obtained with MRI are dependent upon patient positioning with respect to hip flexion and supine versus prone positions. This finding has implications when evaluating for ischiofemoral impingement, an entity resulting in hip and/or buttock pain secondary to impingement of the quadratus femoris muscle within a pathologically narrowed IFS. One will need to account for patient hip flexion and supine versus prone positioning when evaluating individuals with suspected ischiofemoral impingement.
NASA Astrophysics Data System (ADS)
Schlueter, S.; Sheppard, A.; Wildenschild, D.
2013-12-01
Imaging of fluid interfaces in three-dimensional porous media via x-ray microtomography is an efficient means to test thermodynamically derived predictions on the relationship between capillary pressure, fluid saturation and specific interfacial area (Pc-Sw-Anw) in partially saturated porous media. Various experimental studies exist to date that validate the uniqueness of the Pc-Sw-Anw relationship under static conditions, and with current technological progress direct imaging of moving interfaces under dynamic conditions is also becoming available. Image acquisition and subsequent image processing currently involve many steps, each prone to operator bias, like merging different scans of the same sample obtained at different beam energies into a single image or the generation of isosurfaces from the segmented multiphase image on which the interface properties are usually calculated. We demonstrate that with recent advancements in (i) image enhancement methods, (ii) multiphase segmentation methods and (iii) methods of structural analysis we can considerably decrease the time and cost of image acquisition and the uncertainty associated with the measurement of interfacial properties. In particular, we highlight three notorious problems in multiphase image processing and provide efficient solutions for each: (i) Due to noise, partial volume effects, and imbalanced volume fractions, automated histogram-based threshold detection methods frequently fail. However, these impairments can be mitigated with modern denoising methods, special treatment of gray value edges and adaptive histogram equalization, such that most of the standard methods for threshold detection (Otsu, fuzzy c-means, minimum error, maximum entropy) coincide at the same set of values. (ii) Partial volume effects due to blur may produce apparent water films around solid surfaces that alter the specific fluid-fluid interfacial area (Anw) considerably. In a synthetic test image, some local segmentation methods like Bayesian Markov random field, converging active contours and watershed segmentation reduced the error in Anw associated with apparent water films from 21% to 6-11%. (iii) The generation of isosurfaces from the segmented data usually requires a lot of postprocessing in order to smooth the surface and check for consistency errors. This can be avoided by calculating specific interfacial areas directly on the segmented voxel image by means of Minkowski functionals, which is highly efficient and less error prone.
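Point (i) can be sketched generically: denoise first, then require several automatic threshold detectors to agree before accepting a segmentation. The detectors below are scikit-image stand-ins for the ones named in the text, and the tolerance is a hypothetical choice:

```python
import numpy as np
from skimage import filters, restoration

def consensus_threshold(img, tol=0.05):
    """Denoise, apply several automatic threshold detectors, and accept
    their mean only if they agree to within tol of the image range."""
    den = restoration.denoise_tv_chambolle(img, weight=0.1)
    t = np.array([filters.threshold_otsu(den),
                  filters.threshold_li(den),
                  filters.threshold_yen(den)])
    if np.ptp(t) > tol * np.ptp(den):
        raise ValueError("detectors disagree; revisit denoising/edges")
    return float(t.mean())
```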
A Robust and Affordable Table Indexing Approach for Multi-isocenter Dosimetrically Matched Fields.
Yu, Amy; Fahimian, Benjamin; Million, Lynn; Hsu, Annie
2017-05-23
Purpose Radiotherapy treatment planning of extended volumes typically necessitates the use of multiple field isocenters and abutting dosimetrically matched fields in order to enable coverage beyond the field size limits. A common example includes total lymphoid irradiation (TLI) treatments, which are conventionally planned using dosimetric matching of the mantle, para-aortic/spleen, and pelvic fields. Due to the large irradiated volume and system limitations, such as field size and couch extension, a combination of couch shifts and sliding of patients must be correctly executed for accurate delivery of the plan. However, shifting of patients presents a substantial safety issue and has been shown to be prone to errors ranging from minor deviations to geometrical misses warranting a medical event. To address this complex setup and mitigate the safety issues relating to delivery, a practical technique for couch indexing of TLI treatments has been developed and evaluated through a retrospective analysis of couch position. Methods The indexing technique is based on the modification of the commonly available slide board to enable indexing of the patient position. Modifications include notching to enable coupling with indexing bars, and the addition of a headrest used to fixate the head of the patient relative to the slide board. For the clinical setup, a Varian Exact Couch™ (Varian Medical Systems, Inc, Palo Alto, CA) was utilized. Two groups of patients were treated: 20 patients with table indexing and 10 patients without. The standard deviations (SDs) of the couch positions in the longitudinal, lateral, and vertical directions through the entire treatment cycle for each patient were calculated, and differences between the two groups were analyzed with Student's t-test. Results The longitudinal direction showed the largest improvement. In the non-indexed group, the positioning SD ranged from 2.0 to 7.9 cm. With the indexing device, the positioning SD was reduced to a range of 0.4 to 1.3 cm (p < 0.05 with 95% confidence level). The lateral positioning was slightly improved (p < 0.05 with 95% confidence level), while no improvement was observed in the vertical direction. Conclusions The conventional matched-field TLI treatment is prone to geometric setup errors. The feasibility of fully indexed TLI treatments was validated and shown to result in a significant reduction of positioning and shifting errors.
NASA Astrophysics Data System (ADS)
Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy
2015-03-01
Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition, however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. We show that a careful selection of the camera-to-object and baseline distance reduces errors in occluded areas and that realistic ground truths help to quantify those errors.
Bigley, Andrew N; Xu, Chengfu; Henderson, Terry J; Harvey, Steven P; Raushel, Frank M
2013-07-17
The V-type nerve agents (VX and VR) are among the most toxic substances known. The high toxicity and environmental persistence of VX make the development of novel decontamination methods particularly important. The enzyme phosphotriesterase (PTE) is capable of hydrolyzing VX but with an enzymatic efficiency more than 5 orders of magnitude lower than with its best substrate, paraoxon. PTE has previously proven amenable to directed evolution for the improvement of catalytic activity against selected compounds through the manipulation of active-site residues. Here, a series of sequential two-site mutational libraries encompassing 12 active-site residues of PTE was created. The libraries were screened for catalytic activity against a new VX analogue, DEVX, which contains the same thiolate leaving group of VX coupled to a diethoxyphosphate core rather than the ethoxymethylphosphonate core of VX. The evolved catalytic activity with DEVX was enhanced 26-fold relative to wild-type PTE. Further improvements were facilitated by targeted error-prone PCR mutagenesis of loop-7, and additional PTE variants were identified with up to a 78-fold increase in the rate of DEVX hydrolysis. The best mutant hydrolyzed the racemic nerve agent VX with a value of kcat/Km = 7 × 10(4) M(-1) s(-1), a 230-fold improvement relative to wild-type PTE. The highest turnover number achieved by the mutants created for this investigation was 137 s(-1), an enhancement of 152-fold relative to wild-type PTE. The stereoselectivity for the hydrolysis of the two enantiomers of VX was relatively low. These engineered mutants of PTE are the best catalysts ever reported for the hydrolysis of nerve agent VX.
Yamakawa, Hiromoto; Hirai-Kimura, Rieko; Nakata, Yuriko; Nakata, Masaru; Kuroda, Masaharu; Yamaguchi, Takeshi
2017-04-01
α-Amylase is a starch-hydrolyzing enzyme (EC 3.2.1.1) indispensable for germination of cereal seeds, but it is also expressed during the ripening stage. Previous studies demonstrated that the enzyme is activated in developing rice seeds under extremely hot weather and triggers a loss of grain quality by hindering the accumulation of storage starch in the endosperm. Because inactive or, ideally, heat-labile α-amylases are desirable for breeding premium rice, we developed a method for rapid screening of inactive and temperature-sensitive mutants of the enzyme by combining random mutagenesis by error-prone PCR with an on-filter activity test of the recombinant enzyme expressed in Escherichia coli. This technique was applied to a major α-amylase in the developing seed, Amy3D, and the activity of the isolated mutant enzymes was verified with both the bacterially expressed recombinant proteins and the extract from endosperm overexpressing each of them. We identified several substitutions of amino acid residues (Leu28, Asp112, Cys149, Trp201, Asp204, Gly295, Leu300 and Cys342) that abolished activity, as well as a variety of heat-sensitive substitutions of Asp83, Asp187 and Glu252. Furthermore, variations of the heat-labile enzymes were created by combining these heat-sensitive mutations. The effects of the respective mutations and their relationship to the structure of the enzyme molecule are discussed. © The Author 2017. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Chong, Huiqing; Geng, Hefang; Zhang, Hongfang; Song, Hao; Huang, Lei; Jiang, Rongrong
2014-04-01
The limited isobutanol tolerance of Escherichia coli is a major drawback during fermentative isobutanol production. Departing from classical strain engineering approaches, this work set out to improve E. coli isobutanol tolerance at the transcriptional level by engineering the global transcription factor cAMP receptor protein (CRP). Random mutagenesis libraries were generated by error-prone PCR of crp, and the libraries were subjected to isobutanol stress for selection. Variant IB2 (S179P, H199R) was isolated and exhibited much better growth (0.18 h(-1)) than the control (0.05 h(-1)) in 1.2% (v/v) isobutanol (9.6 g/L). Genome-wide DNA microarray analysis revealed that 58 and 308 genes in IB2 had differential expression (>2-fold, p < 0.05) in the absence and presence of 1% (v/v) isobutanol, respectively. When challenged with isobutanol, genes related to acid resistance (gadABCE, hdeABD), nitrate reduction (narUZYWV), flagellar and fimbrial activity (lfhA, yehB, ycgR, fimCDF), and sulfate reduction and transport (cysIJH, cysC, cysN) were the major functional groups that were up-regulated, whereas most of the down-regulated genes encoded enzymes (tnaA) and transporters (proVWX, manXYZ). As demonstrated by single-gene knockout experiments, gadX, nirB, rhaS, hdeB, and ybaS were found to be associated with strain isobutanol resistance. The intracellular reactive oxygen species (ROS) level in IB2 was only half that of the control when facing stress, indicating that IB2 can withstand toxic isobutanol much better than the control. © 2013 Wiley Periodicals, Inc.
Dror, Adi; Shemesh, Einav; Dayan, Natali
2014-01-01
The abilities of enzymes to catalyze reactions in nonnatural environments of organic solvents have opened new opportunities for enzyme-based industrial processes. However, the main drawback of such processes is that most enzymes have a limited stability in polar organic solvents. In this study, we employed protein engineering methods to generate a lipase for enhanced stability in methanol, which is important for biodiesel production. Two protein engineering approaches, random mutagenesis (error-prone PCR) and structure-guided consensus, were applied in parallel on an unexplored lipase gene from Geobacillus stearothermophilus T6. A high-throughput colorimetric screening assay was used to evaluate lipase activity after an incubation period in high methanol concentrations. Both protein engineering approaches were successful in producing variants with elevated half-life values in 70% methanol. The best variant of the random mutagenesis library, Q185L, exhibited 23-fold-improved stability, yet its methanolysis activity was decreased by one-half compared to the wild type. The best variant from the consensus library, H86Y/A269T, exhibited 66-fold-improved stability in methanol along with elevated thermostability (+4.3°C) and a 2-fold-higher fatty acid methyl ester yield from soybean oil. Based on in silico modeling, we suggest that the Q185L substitution facilitates a closed lid conformation that limits access for both the methanol and substrate excess into the active site. The enhanced stability of H86Y/A269T was a result of formation of new hydrogen bonds. These improved characteristics make this variant a potential biocatalyst for biodiesel production. PMID:24362426
Wei, Qiong; Wang, Liqun; Wang, Qiang; Kruger, Warren D.; Dunbrack, Roland L.
2010-01-01
Predicting the phenotypes of missense mutations uncovered by large-scale sequencing projects is an important goal in computational biology. High-confidence predictions can be an aid in focusing experimental and association studies on those mutations most likely to be associated with causative relationships between mutation and disease. As an aid in developing these methods further, we have derived a set of random mutations of the enzymatic domains of human cystathionine beta synthase. This enzyme is a dimeric protein that catalyzes the condensation of serine and homocysteine to produce cystathionine. Yeast missing this enzyme cannot grow on medium lacking a source of cysteine, while transfection of functional human CBS into yeast strains missing endogenous enzyme can successfully complement for the missing gene. We used PCR mutagenesis with error-prone Taq polymerase to produce 948 colonies, and compared cell growth in the presence or absence of a cysteine source as a measure of CBS function. We were able to infer the phenotypes of 204 single-site mutants, 79 of them deleterious and 125 neutral. This set was used to test the accuracy of six publicly available prediction methods for phenotype prediction of missense mutations: SIFT, PolyPhen, PMut, SNPs3D, PhD-SNP, and nsSNPAnalyzer. The top methods are PolyPhen, SIFT, and nsSNPAnalyzer, which have similar performance. Using kernel discriminant functions, we found that the difference in position-specific scoring matrix values is more predictive than the wild-type PSSM score alone, and that the relative surface area in the biologically relevant complex is more predictive than that of the monomeric proteins. PMID:20455263
Mutation supply and the repeatability of selection for antibiotic resistance
NASA Astrophysics Data System (ADS)
van Dijk, Thomas; Hwang, Sungmin; Krug, Joachim; de Visser, J. Arjan G. M.; Zwart, Mark P.
2017-10-01
Whether evolution can be predicted is a key question in evolutionary biology. Here we set out to better understand the repeatability of evolution, which is a necessary condition for predictability. We explored experimentally the effect of mutation supply and the strength of selective pressure on the repeatability of selection from standing genetic variation. Different sizes of mutant libraries of antibiotic resistance gene TEM-1 β-lactamase in Escherichia coli, generated by error-prone PCR, were subjected to different antibiotic concentrations. We determined whether populations went extinct or survived, and sequenced the TEM gene of the surviving populations. The distribution of mutations per allele in our mutant libraries followed a Poisson distribution. Extinction patterns could be explained by a simple stochastic model that assumed the sampling of beneficial mutations was key for survival. In most surviving populations, alleles containing at least one known large-effect beneficial mutation were present. These genotype data also support a model which only invokes sampling effects to describe the occurrence of alleles containing large-effect driver mutations. Hence, evolution is largely predictable given cursory knowledge of mutational fitness effects, the mutation rate and population size. There were no clear trends in the repeatability of selected mutants when we considered all mutations present. However, when only known large-effect mutations were considered, the outcome of selection is less repeatable for large libraries, in contrast to expectations. We show experimentally that alleles carrying multiple mutations selected from large libraries confer higher resistance levels relative to alleles with only a known large-effect mutation, suggesting that the scarcity of high-resistance alleles carrying multiple mutations may contribute to the decrease in repeatability at large library sizes.
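The sampling argument has a simple closed form: if beneficial (large-effect) alleles occur at frequency p in the mutant library, the probability that a library of N independent draws contains at least one is 1 - (1 - p)^N, approximately 1 - e^(-Np). A sketch with hypothetical parameter values:

```python
import numpy as np

def survival_prob(library_size, p_beneficial):
    """P(at least one beneficial allele sampled) under independent
    draws: approximately 1 - exp(-N p) for small p."""
    return 1.0 - np.exp(-library_size * p_beneficial)

# hypothetical beneficial-allele frequency of 1e-4
for n in (10**3, 10**4, 10**5):
    print(f"N = {n:>6}: P(survive) = {survival_prob(n, 1e-4):.2f}")
```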
Next-generation sequencing: the future of molecular genetics in poultry production and food safety.
Diaz-Sanchez, S; Hanning, I; Pendleton, Sean; D'Souza, Doris
2013-02-01
The era of molecular biology and the automation of the Sanger chain-terminator sequencing method has led to discoveries and advances in diagnostics and biotechnology. The Sanger methodology dominated research for over two decades, leading to significant accomplishments and technological improvements in DNA sequencing. Next-generation high-throughput sequencing (HT-NGS) technologies were developed subsequently to overcome the limitations of this first-generation technology, offering higher speed, less labor, and lower cost. The various platforms developed include sequencing-by-synthesis 454 Life Sciences, Illumina (Solexa) sequencing, SOLiD sequencing (among others), and the Ion Torrent semiconductor sequencing technologies, which use different detection principles. As technology advances, progress toward third-generation sequencing technologies is being reported, including Nanopore sequencing and real-time monitoring of PCR activity through fluorescent resonant energy transfer. The advantages of these technologies include scalability, simplicity, improved DNA polymerase performance and yields, lower error rates, and greater economic feasibility, with the eventual goal of obtaining real-time results. These technologies can be directly applied to improve poultry production and enhance food safety. For example, sequence-based (determination of the gut microbial community, genes for metabolic pathways, or presence of plasmids) and function-based (screening for functions such as antibiotic resistance or vitamin production) metagenomic analyses can be carried out. Gut microbial flora/communities of poultry can be sequenced to determine the changes that affect health and disease along with the efficacy of methods to control pathogenic growth. Thus, the purpose of this review is to provide an overview of the principles of these current technologies and their potential application to improve poultry production and food safety as well as public health.
A plasmid-encoded UmuD homologue regulates expression of Pseudomonas aeruginosa SOS genes.
Díaz-Magaña, Amada; Alva-Murillo, Nayeli; Chávez-Moctezuma, Martha P; López-Meza, Joel E; Ramírez-Díaz, Martha I; Cervantes, Carlos
2015-07-01
The Pseudomonas aeruginosa plasmid pUM505 contains the umuDC operon that encodes proteins similar to error-prone repair DNA polymerase V. The umuC gene appears to be truncated and its product is probably not functional. The umuD gene, renamed umuDpR, possesses an SOS box overlapped with a Sigma factor 70 type promoter; accordingly, transcriptional fusions revealed that the umuDpR gene promoter is activated by mitomycin C (MMC). The predicted sequence of the UmuDpR protein displays 23 % identity with the Ps. aeruginosa SOS-response LexA repressor. The umuDpR gene caused increased MMC sensitivity when transferred to the Ps. aeruginosa PAO1 strain. As expected, the PAO1-derived knockout lexA- mutant PW6037 showed resistance to MMC; however, when the umuDpR gene was transferred to PW6037, the MMC resistance level was reduced. These data suggested that UmuDpR represses the expression of SOS genes, as LexA does. To test whether UmuDpR exerts regulatory functions, expression of PAO1 SOS genes was evaluated by reverse transcription quantitative PCR assays in the lexA- mutant with or without the pUC_umuD recombinant plasmid. Expression of the lexA, imuA and recA genes increased 3.4-5.3 times in the lexA- mutant, relative to transcription of the corresponding genes in the lexA+ strain, but decreased significantly in the lexA-/umuDpR transformant. These results confirmed that the UmuDpR protein is a repressor of Ps. aeruginosa SOS genes controlled by LexA. Electrophoretic mobility shift assays, however, did not show binding of UmuDpR to the 5' regions of SOS genes, suggesting an indirect mechanism of regulation.
In Vivo Myeloperoxidase Imaging and Flow Cytometry Analysis of Intestinal Myeloid Cells.
Hülsdünker, Jan; Zeiser, Robert
2016-01-01
Myeloperoxidase (MPO) imaging is a non-invasive method to detect cells that produce the enzyme MPO, which is most abundant in neutrophils, macrophages, and inflammatory monocytes. While lacking specificity for any of these three cell types, MPO imaging can provide guidance for further flow cytometry-based analysis of tissues where these cell types reside. Isolation of leukocytes from the intestinal tract is an error-prone procedure. Here, we describe a protocol for intestinal leukocyte isolation that works reliably in our hands and allows for flow cytometry-based analysis, in particular of neutrophils.
NASA Technical Reports Server (NTRS)
Trosin, J.
1985-01-01
Use of the Display AButments (DAB) program, which plots PAN AIR geometries, is presented. The DAB program creates hidden-line displays of PAN AIR geometries and labels specified geometry components, such as abutments, networks, and network edges. It is used to alleviate the very time-consuming and error-prone abutment-list checking phase of developing a valid PAN AIR geometry, and therefore represents a valuable tool for debugging complex PAN AIR geometry definitions. DAB is written in FORTRAN 77 and runs on a Digital Equipment Corporation VAX 11/780 under VMS. It utilizes a special color version of the SKETCH hidden-line analysis routine.
Evolution: the dialogue between life and death
NASA Astrophysics Data System (ADS)
Holliday, Robin
1997-12-01
Organisms have the ability to harness energy from the environment to create order and to reproduce. From early error-prone systems, natural selection acted to produce present-day organisms with high accuracy in the synthesis of macromolecules. The environment imposes strict limits on reproduction, so evolution is always accompanied by the discarding of a large proportion of the less fit cells or organisms. Sexual reproduction depends on an immortal germline and a soma which may be immortal or mortal. Higher animals living in hazardous environments have evolved aging and death of the soma for the benefit of the ongoing germline.
Zavil'gel'skiĭ, G B
2013-01-01
This review integrates 60 years of research on SOS repair and SOS mutagenesis in prokaryotes and eukaryotes, from Jean Weigle's experiment of 1953 (mutagenesis of bacteriophage lambda in UV-irradiated bacteria) to the latest achievements in the study of SOS mutagenesis across all living organisms--Eukarya, Archaea and Bacteria. A key role in establishing the biochemical basis of SOS mutagenesis belongs to the finding, in 1998-1999, that specific error-prone DNA polymerases (Pol V and others) catalyse translesion synthesis on damaged DNA. This review focuses on recent studies addressing new models for SOS-induced mutagenesis in Escherichia coli and Homo sapiens cells.
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
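For context, the static baseline here is McCabe's cyclomatic complexity, computed from a routine's control-flow graph as V(G) = E - N + 2P, where E is the number of edges, N the number of nodes, and P the number of connected components (P = 1 for a single routine); V(G) equals the number of linearly independent paths through the code, which is why it is often read as a rough proxy for testing effort.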
NASA Astrophysics Data System (ADS)
Houben, Georg J.; Blümel, Martin
2017-11-01
Porosity is a fundamental parameter in hydrogeology. The empirical method of Beyer and Schweiger (1969) allows the calculation of hydraulic conductivity and both the total and effective porosity from granulometric data. However, due to its graphical nature with type curves, it is tedious to apply and prone to reading errors. In this work, the type curves were digitized and emulated by mathematical functions. The latter were implemented into a spreadsheet and a visual basic program, allowing the fast automated application of the method for any number of samples.
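The digitize-and-fit approach can be sketched generically: fit a smooth function to each digitized type curve, then evaluate it for a sample's uniformity coefficient U = d60/d10. The digitized points and polynomial form below are hypothetical stand-ins for the published curves:

```python
import numpy as np

# hypothetical digitized (U, total porosity) pairs from one type curve
U_dig = np.array([1.5, 2.0, 3.0, 5.0, 8.0, 12.0, 20.0])
n_dig = np.array([0.42, 0.40, 0.37, 0.33, 0.30, 0.28, 0.26])
coef = np.polyfit(np.log10(U_dig), n_dig, deg=2)  # smooth emulation

def total_porosity(d60_mm, d10_mm):
    """Evaluate the fitted curve at the sample's uniformity coefficient,
    replacing the error-prone manual reading of the chart."""
    return float(np.polyval(coef, np.log10(d60_mm / d10_mm)))

print(f"{total_porosity(d60_mm=0.9, d10_mm=0.3):.2f}")  # U = 3
```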
Missed diagnostic opportunities within South Africa's early infant diagnosis program, 2010-2015.
Haeri Mazanderani, Ahmad; Moyo, Faith; Sherman, Gayle G
2017-01-01
Samples submitted for HIV PCR testing that fail to yield a positive or negative result represent missed diagnostic opportunities. We describe HIV PCR test rejections and indeterminate results, and the associated delay in diagnosis, within South Africa's early infant diagnosis (EID) program from 2010 to 2015. HIV PCR test data from January 2010 to December 2015 were extracted from the National Health Laboratory Service Corporate Data Warehouse, a central data repository of all registered test-sets within the public health sector in South Africa, by laboratory number, result, date, facility, and testing laboratory. Samples that failed to yield either a positive or negative result were categorized according to the rejection code on the laboratory information system, and descriptive analysis was performed using Microsoft Excel. Delay in diagnosis was calculated for patients who had a missed diagnostic opportunity registered between January 2013 and December 2015 by means of a patient-linking algorithm employing demographic details. Between 2010 and 2015, 2 178 582 samples were registered for HIV PCR testing of which 6.2% (n = 134 339) failed to yield either a positive or negative result, decreasing proportionally from 7.0% (n = 20 556) in 2010 to 4.4% (n = 21 388) in 2015 (p<0.001). Amongst 76 972 coded missed diagnostic opportunities, 49 585 (64.4%) were a result of pre-analytical error and 27 387 (35.6%) analytical error. Amongst 49 694 patients searched for follow-up results, 16 895 (34.0%) had at least one subsequent HIV PCR test registered after a median of 29 days (IQR: 13-57), of which 8.4% tested positive compared with 3.6% of all samples submitted for the same period. Routine laboratory data provides the opportunity for near real-time surveillance and quality improvement within the EID program. Delay in diagnosis and wastage of resources associated with missed diagnostic opportunities must be addressed and infants actively followed up as South Africa works towards elimination of mother-to-child transmission.
Carow, Katrin; Read, Christina; Häfner, Norman; Runnebaum, Ingo B; Corner, Adam; Dürst, Matthias
2017-10-30
Qualitative analyses showed that the presence of HPV mRNA in sentinel lymph nodes of cervical cancer patients with pN0 status is associated with significantly decreased recurrence free survival. To further address the clinical potential of the strategy and to define prognostic threshold levels it is necessary to use a quantitative assay. Here, we compare two methods of quantification: digital PCR and standard quantitative PCR. Serial dilutions of 5 ng-5 pg RNA (≙ 500-0.5 cells) of the cervical cancer cell line SiHa were prepared in 5 µg RNA of the HPV-negative human keratinocyte cell line HaCaT. Clinical samples consisted of 10 sentinel lymph nodes with varying HPV transcript levels. Reverse transcription of total RNA (5 µg RNA each) was performed in 100 µl and cDNA aliquots were analyzed by qPCR and dPCR. Digital PCR was run in the RainDrop® Digital PCR system (RainDance Technologies) using a probe-based detection of HPV E6/E7 cDNA PCR products with 11 µl template. qPCR was done using a Rotor Gene Q 5plex HRM (Qiagen) amplifying HPV E6/E7 cDNA in a SYBR Green format with 1 µl template. For the analysis of both, clinical samples and serial dilution samples, dPCR and qPCR showed comparable sensitivity. With regard to reproducibility, both methods differed considerably, especially for low template samples. Here, we found with qPCR a mean variation coefficient of 126% whereas dPCR enabled a significantly lower mean variation coefficient of 40% (p = 0.01). Generally, we saw with dPCR a substantial reduction of subsampling errors, which most likely reflects the large cDNA amounts available for analysis. Compared to real-time PCR, dPCR shows higher reliability. Thus, our HPV mRNA dPCR assay holds promise for the clinical evaluation of occult tumor cells in histologically tumor-free lymph nodes in future studies.
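dPCR's robustness to subsampling noise comes from counting partitions: the mean number of target copies per partition follows from the positive fraction via the Poisson correction λ = -ln(1 - k/n). A minimal sketch; the partition count and volume below are illustrative, not instrument specifications:

```python
import math

def dpcr_quantify(positive, partitions, partition_volume_ul=5e-6):
    """Poisson-corrected copies per partition (lambda = -ln(1 - p)) and
    the implied concentration in copies/uL."""
    lam = -math.log(1.0 - positive / partitions)
    return lam * partitions, lam / partition_volume_ul

total, conc = dpcr_quantify(positive=1200, partitions=1_000_000)
print(f"{total:.0f} copies total, {conc:.0f} copies/uL")
```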
Margot, H; Stephan, R; Guarino, S; Jagadeesan, B; Chilton, D; O'Mahony, E; Iversen, C
2013-08-01
The traditional culture-based detection of Salmonella spp. is both time- and labour-intensive. Salmonella is often a release criterion for the food industry, and time to result is therefore an important factor: storage of finished products and raw materials can be costly and may adversely impact available shelf-life. The application of real-time PCR for the detection of Salmonella spp. in food samples enables a potential time saving of up to four days. The advancement of real-time PCR, coupled with the development of commercially available systems in different formats, has made this technology accessible to laboratories in an industrial environment. Ideally, these systems are reliable and rapid as well as easy to use. The current study is a comparative evaluation of seven commercial real-time PCR systems for the detection of Salmonella. Forty-nine target and twenty-nine non-target strains were included in the study to assess inclusivity and exclusivity. The limit of detection of each method was determined in four different food products. All systems evaluated were able to correctly identify the 49 Salmonella strains. Nevertheless, false-positive results (Citrobacter spp.) were obtained with four of the seven systems. In milk powder and bouillon powder, the limit of detection was similar for all systems, suggesting a minimal matrix effect with these samples. Conversely, in black tea and cocoa powder some systems were prone to inhibition by matrix components: up to 100% of samples were inhibited using the proprietary extracts, but inhibition could be reduced considerably by application of a DNA clean-up kit. Copyright © 2013 Elsevier B.V. All rights reserved.
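The abstract does not spell out how the limits of detection were computed, but a common way to summarize such spiking experiments is to fit a dose-response curve to the fraction of replicates detected at each inoculation level and read off the concentration that would be detected 95% of the time (LOD95). The Python sketch below shows that approach under assumed, illustrative data; it is not the study's actual protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(log_cfu, midpoint, slope):
    # Probability of a positive result as a function of log10 inoculum.
    return 1.0 / (1.0 + np.exp(-slope * (log_cfu - midpoint)))

log_cfu = np.log10([0.5, 1, 2, 5, 10, 50])            # spiked cfu per sample, hypothetical
frac_pos = np.array([0.1, 0.3, 0.6, 0.9, 1.0, 1.0])   # fraction of replicates detected

(midpoint, slope), _ = curve_fit(logistic, log_cfu, frac_pos, p0=[0.5, 2.0])
# Solve logistic(x) = 0.95 for x, then convert back from log10 scale.
lod95 = 10 ** (midpoint + np.log(0.95 / 0.05) / slope)
print(f"LOD95 ≈ {lod95:.1f} cfu per sample")
```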
Detection of Rare Mutations in EGFR-ARMS-PCR-Negative Lung Adenocarcinoma by Sanger Sequencing.
Liang, Chaoyue; Wu, Zhuolin; Gan, Xiaohong; Liu, Yuanbin; You, You; Liu, Chenxian; Zhou, Chengzhi; Liang, Ying; Mo, Haiyun; Chen, Allen M; Zhang, Jiexia
2018-01-01
This study aimed to identify potential epidermal growth factor receptor (EGFR) gene mutations in non-small cell lung cancer that went undetected by the amplification refractory mutation system-Scorpion real-time PCR (ARMS-PCR). A total of 200 specimens were obtained from the First Affiliated Hospital of Guangzhou Medical University from August 2014 to August 2015. In total, 100 ARMS-negative and 100 ARMS-positive specimens were evaluated for EGFR gene mutations by Sanger sequencing. The methodology and sensitivity of each method and the outcomes of EGFR-tyrosine kinase inhibitor (TKI) therapy were analyzed. Among the 100 ARMS-PCR-positive samples, 90 were also positive by Sanger sequencing, while 10 were considered negative because the mutation abundance was less than 10%. Among the 100 ARMS-negative cases, three were positive for a rare EGFR mutation by Sanger sequencing. In the analysis of EGFR-TKI efficacy, progression-free survival (PFS) based on ARMS and Sanger sequencing results showed no difference. However, the PFS of patients with a high abundance of EGFR mutations was 12.4 months [95% confidence interval (CI), 11.6-12.4 months], significantly longer than that of patients with a low abundance of mutations detected by Sanger sequencing (95% CI, 10.7-11.3 months) (p < 0.001). The ARMS method demonstrated higher sensitivity than Sanger sequencing but was prone to missing mutations because of primer design. Sanger sequencing was able to detect rare EGFR mutations and was deemed applicable for confirming EGFR status. A clinical trial evaluating the efficacy of EGFR-TKIs in patients with rare EGFR mutations is needed. © Copyright: Yonsei University College of Medicine 2018
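As a sketch of how a PFS comparison like the one above could be reproduced, the snippet below draws Kaplan-Meier estimates and runs a log-rank test with the Python lifelines package. The package choice, file name, and column names are assumptions; the paper does not state its statistical software.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical columns: pfs_months, progressed (1 = event), abundance.
df = pd.read_csv("egfr_tki_pfs.csv")
high = df[df["abundance"] == "high"]
low = df[df["abundance"] == "low"]

km = KaplanMeierFitter()
km.fit(high["pfs_months"], event_observed=high["progressed"], label="high abundance")
print(km.median_survival_time_)  # the paper reports 12.4 months for this group

result = logrank_test(high["pfs_months"], low["pfs_months"],
                      event_observed_A=high["progressed"],
                      event_observed_B=low["progressed"])
print(result.p_value)
```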
Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T
2018-03-05
Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video-recorded, and a failure modes and effects analysis (FMEA) and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination rates. Fifty-one types of errors were identified, many with the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and the greatest numbers of error types (9 and 13, respectively). ɸ6 was detected on 10% of scrubs, and the fault tree predicted a 10.4% contamination rate, most likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves; the predicted rates were 7.3%, 19.4%, and 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect HCWs and put them at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.
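For orientation, the two human-factors quantities named above reduce to simple arithmetic: an FMEA risk index is conventionally a product of severity, occurrence, and detection scores (the paper's exact scoring scheme is not given here), and a fault tree combines basic-event probabilities through OR and AND gates. A minimal Python sketch with invented numbers:

```python
def risk_priority_number(severity: int, occurrence: int, detection: int) -> int:
    """Classic FMEA risk index (RPN) as a severity x occurrence x detection product."""
    return severity * occurrence * detection

def or_gate(probabilities) -> float:
    """Top event occurs if any independent basic event occurs."""
    p_none = 1.0
    for p in probabilities:
        p_none *= 1.0 - p
    return 1.0 - p_none

def and_gate(probabilities) -> float:
    """Top event occurs only if all independent basic events occur."""
    p_all = 1.0
    for p in probabilities:
        p_all *= p
    return p_all

# E.g., three independent doffing errors that can each transfer virus to
# scrubs would together predict a contamination rate of:
print(or_gate([0.05, 0.04, 0.02]))  # ~0.106, the same order as the 10.4% above
```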
Multiple Intravenous Infusions Phase 2b: Laboratory Study
Pinkney, Sonia; Fan, Mark; Chan, Katherine; Koczmara, Christine; Colvin, Christopher; Sasangohar, Farzan; Masino, Caterina; Easty, Anthony; Trbovich, Patricia
2014-01-01
Background: Administering multiple intravenous (IV) infusions to a single patient via infusion pump occurs routinely in health care, but there has been little empirical research examining the risks associated with this practice or ways to mitigate those risks. Objectives: To identify the risks associated with multiple IV infusions and assess the impact of interventions on nurses’ ability to safely administer them. Data Sources and Review Methods: Forty nurses completed infusion-related tasks in a simulated adult intensive care unit, with and without interventions (i.e., a repeated-measures design). Results: Errors were observed in completing common tasks associated with the administration of multiple IV infusions, including the following (all values are from baseline, i.e., current practice): setting up and programming multiple primary continuous IV infusions (e.g., 11.7% programming errors); identifying IV infusions (e.g., 7.7% line-tracing errors); managing dead volume (e.g., 96.0% flush-rate errors following IV syringe dose administration); setting up a secondary intermittent IV infusion (e.g., 11.3% secondary clamp errors); and administering an IV pump bolus (e.g., 11.5% programming errors). Of 10 interventions tested, 6 (1 practice, 3 technology, and 2 educational) significantly decreased or even eliminated errors compared to baseline. Limitations: The simulation of an adult intensive care unit at 1 hospital limited the ability to generalize results. The study results were representative of nurses who received training in the interventions but had little experience using them. The longitudinal effects of the interventions were not studied. Conclusions: Administering and managing multiple IV infusions is a complex and risk-prone activity. However, when a patient requires multiple IV infusions, targeted interventions can reduce identified risks. A combination of standardized practice, technology improvements, and targeted education is required. PMID: 26316919
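Because each nurse completed the tasks both at baseline and with an intervention (a repeated-measures design), a natural way to test whether an intervention reduced errors is McNemar's test on paired outcomes. The Python sketch below uses statsmodels with invented counts; the study's actual statistical analysis is not described in this abstract.

```python
from statsmodels.stats.contingency_tables import mcnemar

# 2x2 paired table over the 40 nurses for one task:
# rows = error at baseline (yes/no), columns = error with intervention (yes/no).
# Counts are illustrative only.
table = [[1, 9],   # erred at baseline: still erred / no longer erred
         [0, 30]]  # no baseline error: erred with intervention / never erred

result = mcnemar(table, exact=True)  # exact binomial version for small counts
print(result.pvalue)
```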
Complex contexts and relationships affect clinical decisions in group therapy.
Tasca, Giorgio A; Mcquaid, Nancy; Balfour, Louise
2016-09-01
Clinical errors tend to be underreported, even though examining them can provide important training and professional development opportunities. The group therapy context may be prone to clinician errors because of the added complexity within which therapists work and patients receive treatment. We discuss clinical errors that occurred within a group therapy in which a patient for whom group therapy was not appropriate was admitted to treatment and then not removed by the clinicians; this was countertherapeutic for both the patient and the group. Two clinicians were involved: a clinical supervisor who initially assessed and admitted the patient to the group, and a group therapist. To complicate matters, the group therapy occurred within the context of a clinical research trial. The errors, possible solutions, and recommendations are discussed within Reason's Organizational Accident Model (Reason, 2000). In particular, we discuss clinician errors in the context of countertransference and clinician heuristics, group therapy as a local work condition that complicates clinical decision-making, and the impact of the research context as a latent organizational factor. We also present clinical vignettes from the pregroup preparation, the group therapy, and supervision. Group therapists are more likely to avoid errors in clinical decisions if they engage in reflective practice about their internal experiences and about the impact of the context in which they work. Therapists must keep in mind the various levels of group functioning, especially the group-as-a-whole (i.e., group composition, cohesion, group climate, and safety), when making complex clinical decisions in order to optimize patient outcomes. (PsycINFO Database Record (c) 2016 APA, all rights reserved).