Sample records for "highly error prone"

  1. Threat engagement, disengagement, and sensitivity bias in worry-prone individuals as measured by an emotional go/no-go task.

    PubMed

    Gole, Markus; Köchel, Angelika; Schäfer, Axel; Schienle, Anne

    2012-03-01

    The goal of the present study was to investigate threat engagement, disengagement, and sensitivity biases in individuals suffering from pathological worry. Twenty participants high in worry proneness and 16 control participants low in worry proneness completed an emotional go/no-go task with worry-related threat words and neutral words. Shorter reaction times (i.e., a threat engagement bias), smaller omission error rates (i.e., a threat sensitivity bias), and larger commission error rates (i.e., a threat disengagement bias) emerged only in the high worry group when worry-related words constituted the go stimuli and neutral words the no-go stimuli. Smaller omission error rates as well as larger commission error rates were also observed in the high worry group relative to the low worry group when worry-related go stimuli and neutral no-go stimuli were used. These results await replication in a generalized anxiety disorder sample, and future samples should also include men. Our data suggest that worry-prone individuals are threat-sensitive, engage more rapidly with aversive material, and find it harder to disengage from it. Copyright © 2011 Elsevier Ltd. All rights reserved.
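
    For readers unfamiliar with go/no-go scoring, the sketch below shows how the omission and commission error rates discussed above are typically computed from trial-level data; the trial records and field layout are hypothetical, not the study's materials.

    ```python
    # Sketch: omission/commission error rates and go RTs from go/no-go trials.
    # Trial records are hypothetical; real datasets will differ.

    trials = [
        # (trial_type, responded, reaction_time_ms)
        ("go", True, 412.0),
        ("go", False, None),    # omission error: no response on a go trial
        ("nogo", True, 389.0),  # commission error: response on a no-go trial
        ("nogo", False, None),
    ]

    go = [t for t in trials if t[0] == "go"]
    nogo = [t for t in trials if t[0] == "nogo"]

    omission_rate = sum(1 for _, resp, _ in go if not resp) / len(go)
    commission_rate = sum(1 for _, resp, _ in nogo if resp) / len(nogo)
    go_rts = [rt for kind, resp, rt in trials if kind == "go" and resp]
    mean_go_rt = sum(go_rts) / len(go_rts)

    print(f"omission={omission_rate:.2f} commission={commission_rate:.2f} "
          f"mean go RT={mean_go_rt:.0f} ms")
    ```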

  2. Associations between intrusive thoughts, reality discrimination and hallucination-proneness in healthy young adults.

    PubMed

    Smailes, David; Meins, Elizabeth; Fernyhough, Charles

    2015-01-01

    People who experience intrusive thoughts are at increased risk of developing hallucinatory experiences, as are people who have weak reality discrimination skills. No study has yet examined whether these two factors interact to make a person especially prone to hallucinatory experiences. The present study examined this question in a non-clinical sample. Participants were 160 students, who completed a reality discrimination task, as well as self-report measures of cannabis use, negative affect, intrusive thoughts and auditory hallucination-proneness. The possibility of an interaction between reality discrimination performance and level of intrusive thoughts was assessed using multiple regression. The number of reality discrimination errors and level of intrusive thoughts were independent predictors of hallucination-proneness. The reality discrimination errors × intrusive thoughts interaction term was significant, with participants who made many reality discrimination errors and reported high levels of intrusive thoughts being especially prone to hallucinatory experiences. Hallucinatory experiences are more likely to occur in people who report high levels of intrusive thoughts and have weak reality discrimination skills. If applicable to clinical samples, these findings suggest that improving patients' reality discrimination skills and reducing the number of intrusive thoughts they experience may reduce the frequency of hallucinatory experiences.
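
    The moderation analysis described above can be reproduced in outline with an ordinary least-squares fit that includes a product term. The sketch below uses simulated data; the variable names and effect sizes are illustrative assumptions, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 160
    errors = rng.poisson(4, n).astype(float)   # reality discrimination errors
    intrusive = rng.normal(0, 1, n)            # intrusive-thoughts score
    # Simulated outcome with a genuine errors x intrusive interaction effect.
    hallu = (0.3 * errors + 0.4 * intrusive
             + 0.25 * errors * intrusive + rng.normal(0, 1, n))

    # Centre predictors so the main effects stay interpretable next to the product term.
    e_c = errors - errors.mean()
    i_c = intrusive - intrusive.mean()
    X = np.column_stack([np.ones(n), e_c, i_c, e_c * i_c])

    beta, *_ = np.linalg.lstsq(X, hallu, rcond=None)
    print(dict(zip(["intercept", "errors", "intrusive", "interaction"],
                   beta.round(3))))
    ```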

  3. Thermoadaptation-Directed Enzyme Evolution in an Error-Prone Thermophile Derived from Geobacillus kaustophilus HTA426

    PubMed Central

    Kobayashi, Jyumpei; Wada, Keisuke; Furukawa, Megumi; Doi, Katsumi

    2014-01-01

    Thermostability is an important property of enzymes utilized for practical applications because it allows long-term storage and use as catalysts. In this study, we constructed an error-prone strain of the thermophile Geobacillus kaustophilus HTA426 and investigated thermoadaptation-directed enzyme evolution using the strain. A mutation frequency assay using the antibiotics rifampin and streptomycin revealed that G. kaustophilus had substantially higher mutability than Escherichia coli and Bacillus subtilis. The predominant mutations in G. kaustophilus were A·T→G·C and C·G→T·A transitions, implying that the high mutability of G. kaustophilus was attributable in part to high-temperature-associated DNA damage during growth. Among the genes that may be involved in DNA repair in G. kaustophilus, deletions of the mutSL, mutY, ung, and mfd genes markedly enhanced mutability. These genes were subsequently deleted to construct an error-prone thermophile that showed much higher (700- to 9,000-fold) mutability than the parent strain. The error-prone strain was auxotrophic for uracil because it lacked the intrinsic pyrF gene. Although the strain harboring Bacillus subtilis pyrF was also essentially auxotrophic, cells became prototrophic after 2 days of culture under uracil starvation, generating B. subtilis PyrF variants whose half-denaturation temperature was enhanced by >10°C. These data suggest that this error-prone strain is a promising host for thermoadaptation-directed evolution to generate thermostable variants from thermolabile enzymes. PMID:25326311

  4. Thermoadaptation-directed enzyme evolution in an error-prone thermophile derived from Geobacillus kaustophilus HTA426.

    PubMed

    Suzuki, Hirokazu; Kobayashi, Jyumpei; Wada, Keisuke; Furukawa, Megumi; Doi, Katsumi

    2015-01-01

    Thermostability is an important property of enzymes utilized for practical applications because it allows long-term storage and use as catalysts. In this study, we constructed an error-prone strain of the thermophile Geobacillus kaustophilus HTA426 and investigated thermoadaptation-directed enzyme evolution using the strain. A mutation frequency assay using the antibiotics rifampin and streptomycin revealed that G. kaustophilus had substantially higher mutability than Escherichia coli and Bacillus subtilis. The predominant mutations in G. kaustophilus were A·T→G·C and C·G→T·A transitions, implying that the high mutability of G. kaustophilus was attributable in part to high-temperature-associated DNA damage during growth. Among the genes that may be involved in DNA repair in G. kaustophilus, deletions of the mutSL, mutY, ung, and mfd genes markedly enhanced mutability. These genes were subsequently deleted to construct an error-prone thermophile that showed much higher (700- to 9,000-fold) mutability than the parent strain. The error-prone strain was auxotrophic for uracil because it lacked the intrinsic pyrF gene. Although the strain harboring Bacillus subtilis pyrF was also essentially auxotrophic, cells became prototrophic after 2 days of culture under uracil starvation, generating B. subtilis PyrF variants whose half-denaturation temperature was enhanced by >10°C. These data suggest that this error-prone strain is a promising host for thermoadaptation-directed evolution to generate thermostable variants from thermolabile enzymes. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
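
    As a small illustration of the fold-change arithmetic behind the mutability comparison above: a mutation frequency is the fraction of viable cells yielding rifampin-resistant colonies, and the fold-increase is its ratio to the parent strain. The colony counts below are invented.

    ```python
    # Hypothetical colony counts; a real assay would use several replicates.
    def mutation_frequency(resistant_colonies, viable_cells):
        """Fraction of viable cells that acquired rifampin resistance."""
        return resistant_colonies / viable_cells

    parent = mutation_frequency(12, 2.0e8)           # parent strain
    error_prone = mutation_frequency(45_000, 1.5e8)  # multi-deletion strain

    print(f"parent: {parent:.2e}, error-prone: {error_prone:.2e}, "
          f"fold-increase: {error_prone / parent:,.0f}x")
    ```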

  5. Gene-targeted Random Mutagenesis to Select Heterochromatin-destabilizing Proteasome Mutants in Fission Yeast.

    PubMed

    Seo, Hogyu David; Lee, Daeyoup

    2018-05-15

    Random mutagenesis of a target gene is commonly used to identify mutations that yield the desired phenotype. Of the methods that may be used to achieve random mutagenesis, error-prone PCR is a convenient and efficient strategy for generating a diverse pool of mutants (i.e., a mutant library). Error-prone PCR is the method of choice when a researcher seeks to mutate a pre-defined region, such as the coding region of a gene, while leaving other genomic regions unaffected. After the mutant library is amplified by error-prone PCR, it must be cloned into a suitable plasmid. The size of the library generated by error-prone PCR is constrained by the efficiency of the cloning step. However, in the fission yeast Schizosaccharomyces pombe, the cloning step can be replaced by the use of a highly efficient one-step fusion PCR to generate constructs for transformation. Mutants of desired phenotypes may then be selected using appropriate reporters. Here, we describe this strategy in detail, taking as an example a reporter inserted at centromeric heterochromatin.
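
    As a rough sketch of the first step, the diversity generated by error-prone PCR can be mimicked in silico by substituting each base with a small per-base error probability; the sequence, rate, and library size below are illustrative assumptions only.

    ```python
    import random

    BASES = "ACGT"

    def error_prone_copy(template: str, error_rate: float = 0.005,
                         rng: random.Random = random.Random(42)) -> str:
        """Copy a DNA sequence, substituting each base with probability error_rate."""
        return "".join(
            rng.choice([b for b in BASES if b != base])
            if rng.random() < error_rate else base
            for base in template
        )

    template = "ATGGCTAAGGTT" * 30  # toy 360-bp coding region
    library = [error_prone_copy(template) for _ in range(1000)]
    mutants = sum(seq != template for seq in library)
    print(f"{mutants}/1000 clones carry at least one substitution")
    ```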

  6. Regulation of error-prone translesion synthesis by Spartan/C1orf124

    PubMed Central

    Kim, Myoung Shin; Machida, Yuka; Vashisht, Ajay A.; Wohlschlegel, James A.; Pang, Yuan-Ping; Machida, Yuichi J.

    2013-01-01

    Translesion synthesis (TLS) employs low fidelity polymerases to replicate past damaged DNA in a potentially error-prone process. Regulatory mechanisms that prevent TLS-associated mutagenesis are unknown; however, our recent studies suggest that the PCNA-binding protein Spartan plays a role in suppression of damage-induced mutagenesis. Here, we show that Spartan negatively regulates error-prone TLS that is dependent on POLD3, the accessory subunit of the replicative DNA polymerase Pol δ. We demonstrate that the putative zinc metalloprotease domain SprT in Spartan directly interacts with POLD3 and contributes to suppression of damage-induced mutagenesis. Depletion of Spartan induces complex formation of POLD3 with Rev1 and the error-prone TLS polymerase Pol ζ, and elevates mutagenesis that relies on POLD3, Rev1 and Pol ζ. These results suggest that Spartan negatively regulates POLD3 function in Rev1/Pol ζ-dependent TLS, revealing a previously unrecognized regulatory step in error-prone TLS. PMID:23254330

  7. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    PubMed

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

    Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall pattern of judgments similar to that of Experiments 1 through 3; however, judgments of error-likelihood and time demand both similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with consideration of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  8. Mapping intended spinal site of care from the upright to prone position: an interexaminer reliability study.

    PubMed

    Cooperstein, Robert; Young, Morgan

    2014-01-01

    Upright examination procedures like radiology, thermography, manual muscle testing, and spinal motion palpation may lead to spinal interventions with the patient prone. The reliability and accuracy of mapping upright examination findings to the prone position is unknown. This study had 2 primary goals: (1) investigate how erroneous spine-scapular landmark associations may lead to errors in treating and charting spine levels; and (2) study the interexaminer reliability of a novel method for mapping upright spinal sites to the prone position. Experiment 1 was a thought experiment exploring the consequences of depending on the erroneous landmark association of the inferior scapular tip with the T7 spinous process upright and T6 spinous process prone (relatively recent studies suggest these levels are T8 and T9, respectively). This allowed deduction of targeting and charting errors. In experiment 2, 10 examiners (2 experienced, 8 novice) used an index finger to maintain contact with a mid-thoracic spinous process as each of 2 participants slowly moved from the upright to the prone position. Interexaminer reliability was assessed by computing the intraclass correlation coefficient, standard error of the mean, root mean squared error, and the absolute value of the mean difference of each examiner from the 10-examiner mean for each of the 2 participants. The thought experiment suggested that using the (inaccurate) scapular tip landmark rule would result in a 3-level targeting and charting error when radiological findings are mapped to the prone position. Physical upright exam procedures like motion palpation would result in a 2-level targeting error for intervention, and a 3-level error for charting. The reliability experiment showed examiners accurately maintained contact with the same thoracic spinous process as the participant went from upright to prone, ICC(2,1) = 0.83. As manual therapists, the authors have emphasized how targeting errors may impact upon manual care of the spine. Practitioners in other fields that need to accurately locate spinal levels, such as acupuncture and anesthesiology, would also be expected to draw important conclusions from these findings.

  9. Mapping intended spinal site of care from the upright to prone position: an interexaminer reliability study

    PubMed Central

    2014-01-01

    Background Upright examination procedures like radiology, thermography, manual muscle testing, and spinal motion palpation may lead to spinal interventions with the patient prone. The reliability and accuracy of mapping upright examination findings to the prone position is unknown. This study had 2 primary goals: (1) investigate how erroneous spine-scapular landmark associations may lead to errors in treating and charting spine levels; and (2) study the interexaminer reliability of a novel method for mapping upright spinal sites to the prone position. Methods Experiment 1 was a thought experiment exploring the consequences of depending on the erroneous landmark association of the inferior scapular tip with the T7 spinous process upright and T6 spinous process prone (relatively recent studies suggest these levels are T8 and T9, respectively). This allowed deduction of targeting and charting errors. In experiment 2, 10 examiners (2 experienced, 8 novice) used an index finger to maintain contact with a mid-thoracic spinous process as each of 2 participants slowly moved from the upright to the prone position. Interexaminer reliability was assessed by computing the intraclass correlation coefficient, standard error of the mean, root mean squared error, and the absolute value of the mean difference of each examiner from the 10-examiner mean for each of the 2 participants. Results The thought experiment suggested that using the (inaccurate) scapular tip landmark rule would result in a 3-level targeting and charting error when radiological findings are mapped to the prone position. Physical upright exam procedures like motion palpation would result in a 2-level targeting error for intervention, and a 3-level error for charting. The reliability experiment showed examiners accurately maintained contact with the same thoracic spinous process as the participant went from upright to prone, ICC(2,1) = 0.83. Conclusions As manual therapists, the authors have emphasized how targeting errors may impact upon manual care of the spine. Practitioners in other fields that need to accurately locate spinal levels, such as acupuncture and anesthesiology, would also be expected to draw important conclusions from these findings. PMID:24904747
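
    For reference, the ICC(2,1) statistic reported above (two-way random effects, absolute agreement, single measures) can be computed from a subjects-by-raters table as follows; the toy ratings are invented, not the study's measurements.

    ```python
    import numpy as np

    def icc_2_1(ratings: np.ndarray) -> float:
        """ICC(2,1): two-way random effects, absolute agreement, single measures.
        ratings: shape (n_subjects, k_raters)."""
        n, k = ratings.shape
        grand = ratings.mean()
        ms_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
        ms_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
        sse = ((ratings - ratings.mean(axis=1, keepdims=True)
                - ratings.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_err = sse / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Toy example: 5 spinal sites, 3 examiners.
    data = np.array([[7, 7, 8],
                     [5, 6, 5],
                     [9, 9, 9],
                     [4, 5, 4],
                     [6, 6, 7]], dtype=float)
    print(round(icc_2_1(data), 3))
    ```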

  10. Designing an algorithm to preserve privacy for medical record linkage with error-prone data.

    PubMed

    Pal, Doyel; Chen, Tingting; Zhong, Sheng; Khethavath, Praveen

    2014-01-20

    Linking medical records across different medical service providers is important to the enhancement of health care quality and public health surveillance. In record linkage, protecting the patients' privacy is a primary requirement. In real-world health care databases, records may well contain errors due to various reasons such as typos. Linking the error-prone data and preserving data privacy at the same time are very difficult. Existing privacy preserving solutions for this problem are restricted to textual data. To enable different medical service providers to link their error-prone data in a private way, our aim was to provide a holistic solution by designing and developing a medical record linkage system for medical service providers. To initiate a record linkage, one provider selects one of its collaborators in the Connection Management Module, chooses some attributes of the database to be matched, and establishes the connection with the collaborator after the negotiation. In the Data Matching Module, for error-free data, our solution offered two different choices for cryptographic schemes. For error-prone numerical data, we proposed a newly designed privacy preserving linking algorithm, named the Error-Tolerant Linking Algorithm, which allows error-prone data to be correctly matched if the distance between the two records is below a threshold. We designed and developed a comprehensive and user-friendly software system that provides privacy preserving record linkage functions for medical service providers, which meets the regulation of the Health Insurance Portability and Accountability Act. It does not require a third party, and it is secure in that neither entity can learn the records in the other's database. Moreover, our novel Error-Tolerant Linking Algorithm implemented in this software can work well with error-prone numerical data. We theoretically proved the correctness and security of our Error-Tolerant Linking Algorithm. We have also fully implemented the software. The experimental results showed that it is reliable and efficient. The design of our software is open so that the existing textual matching methods can be easily integrated into the system. Designing algorithms to enable medical records linkage for error-prone numerical data and protect data privacy at the same time is difficult. Our proposed solution does not need a trusted third party and is secure in that in the linking process, neither entity can learn the records in the other's database.
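
    Stripped of its cryptographic layer, the error-tolerant rule described above reduces to a distance-threshold comparison on numerical attributes. A minimal plain-text sketch follows; the attribute values, threshold, and use of Euclidean distance are illustrative assumptions, and the privacy-preserving protocol itself is not reproduced here.

    ```python
    def error_tolerant_match(rec_a, rec_b, threshold=2.0):
        """Match two numerical records if their Euclidean distance is below
        threshold. In the actual system this comparison happens under
        encryption; here it is shown in the clear purely to illustrate the rule."""
        dist = sum((a - b) ** 2 for a, b in zip(rec_a, rec_b)) ** 0.5
        return dist < threshold

    # Hypothetical numeric attributes (e.g., birth year, lab values) with a typo.
    provider_1 = (1975, 120.0, 36.6)
    provider_2 = (1975, 121.0, 36.5)   # same patient, error-prone entry
    print(error_tolerant_match(provider_1, provider_2))  # True
    ```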

  11. A continuous quality improvement project to reduce medication error in the emergency department.

    PubMed

    Lee, Sara Bc; Lee, Larry Ly; Yeung, Richard Sd; Chan, Jimmy Ts

    2013-01-01

    Medication errors are a common source of adverse healthcare incidents, particularly in the emergency department (ED), which has a number of factors that make it prone to medication errors. This project aimed to reduce medication errors and improve the health and economic outcomes of clinical care in a Hong Kong ED. In 2009, a task group was formed to identify problems that potentially endanger medication safety and to develop strategies to eliminate them. Responsible officers were assigned to look after seven error-prone areas. Strategies were proposed, discussed, endorsed and promulgated to eliminate the problems identified. The number of medication incidents fell from 16 before the improvement work to 6 afterwards. This project successfully established a concrete organizational structure to safeguard error-prone areas of medication safety in a sustainable manner.

  12. DNA double strand break repair in human bladder cancer is error prone and involves microhomology-associated end-joining

    PubMed Central

    Bentley, Johanne; Diggle, Christine P.; Harnden, Patricia; Knowles, Margaret A.; Kiltie, Anne E.

    2004-01-01

    In human cells DNA double strand breaks (DSBs) can be repaired by the non-homologous end-joining (NHEJ) pathway. In a background of NHEJ deficiency, DSBs with mismatched ends can be joined by an error-prone mechanism involving joining between regions of nucleotide microhomology. The majority of joins formed from a DSB with partially incompatible 3′ overhangs by cell-free extracts from human glioblastoma (MO59K) and urothelial (NHU) cell lines were accurate and produced by the overlap/fill-in of mismatched termini by NHEJ. However, repair of DSBs by extracts prepared from tissue from four high-grade bladder carcinomas resulted in no accurate join formation. Junctions were formed by the non-random deletion of terminal nucleotides and showed a preference for annealing at a microhomology of 8 nt buried within the DNA substrate; this process was not dependent on functional Ku70, DNA-PK or XRCC4. Junctions were repaired in the same manner in MO59K extracts in which accurate NHEJ was inactivated by inhibition of Ku70 or DNA-PKcs. These data indicate that bladder tumour extracts are unable to perform accurate NHEJ, such that error-prone joining predominates. Therefore, in high-grade tumours mismatched DSBs are repaired by a highly mutagenic, microhomology-mediated, alternative end-joining pathway, a process that may contribute to the genomic instability observed in bladder cancer. PMID:15466592

  13. Designing an Algorithm to Preserve Privacy for Medical Record Linkage With Error-Prone Data

    PubMed Central

    Pal, Doyel; Chen, Tingting; Khethavath, Praveen

    2014-01-01

    Background Linking medical records across different medical service providers is important to the enhancement of health care quality and public health surveillance. In record linkage, protecting the patients' privacy is a primary requirement. In real-world health care databases, records may well contain errors due to various reasons such as typos. Linking the error-prone data and preserving data privacy at the same time are very difficult. Existing privacy preserving solutions for this problem are restricted to textual data. Objective To enable different medical service providers to link their error-prone data in a private way, our aim was to provide a holistic solution by designing and developing a medical record linkage system for medical service providers. Methods To initiate a record linkage, one provider selects one of its collaborators in the Connection Management Module, chooses some attributes of the database to be matched, and establishes the connection with the collaborator after the negotiation. In the Data Matching Module, for error-free data, our solution offered two different choices for cryptographic schemes. For error-prone numerical data, we proposed a newly designed privacy preserving linking algorithm, named the Error-Tolerant Linking Algorithm, which allows error-prone data to be correctly matched if the distance between the two records is below a threshold. Results We designed and developed a comprehensive and user-friendly software system that provides privacy preserving record linkage functions for medical service providers, which meets the regulation of the Health Insurance Portability and Accountability Act. It does not require a third party, and it is secure in that neither entity can learn the records in the other's database. Moreover, our novel Error-Tolerant Linking Algorithm implemented in this software can work well with error-prone numerical data. We theoretically proved the correctness and security of our Error-Tolerant Linking Algorithm. We have also fully implemented the software. The experimental results showed that it is reliable and efficient. The design of our software is open so that the existing textual matching methods can be easily integrated into the system. Conclusions Designing algorithms to enable medical records linkage for error-prone numerical data and protect data privacy at the same time is difficult. Our proposed solution does not need a trusted third party and is secure in that in the linking process, neither entity can learn the records in the other's database. PMID:25600786

  14. Effects of Shame and Guilt on Error Reporting Among Obstetric Clinicians.

    PubMed

    Zabari, Mara Lynne; Southern, Nancy L

    2018-04-17

    To understand how the experiences of shame and guilt, coupled with organizational factors, affect error reporting by obstetric clinicians. Descriptive cross-sectional. A sample of 84 obstetric clinicians from three maternity units in Washington State. In this quantitative inquiry, a variant of the Test of Self-Conscious Affect was used to measure proneness to guilt and shame. In addition, we developed questions to assess attitudes regarding concerns about damaging one's reputation if an error was reported and the choice to keep an error to oneself. Both assessments were analyzed separately and then correlated to identify relationships between constructs. Interviews were used to identify organizational factors that affect error reporting. As a group, mean scores indicated that obstetric clinicians would not choose to keep errors to themselves. However, bivariate correlations showed that proneness to shame was positively correlated to concerns about one's reputation if an error was reported, and proneness to guilt was negatively correlated with keeping errors to oneself. Interview data analysis showed that Past Experience with Responses to Errors, Management and Leadership Styles, Professional Hierarchy, and Relationships With Colleagues were influential factors in error reporting. Although obstetric clinicians want to report errors, their decisions to report are influenced by their proneness to guilt and shame and perceptions of the degree to which organizational factors facilitate or create barriers to restore their self-images. Findings underscore the influence of the organizational context on clinicians' decisions to report errors. Copyright © 2018 AWHONN, the Association of Women’s Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.

  15. EEG and chaos: Description of underlying dynamics and its relation to dissociative states

    NASA Technical Reports Server (NTRS)

    Ray, William J.

    1994-01-01

    The goal of this work is the identification of states, especially as related to the process of error production and lapses of awareness as might be experienced during aviation. Given the need for further articulation of the characteristics of an 'error-prone state' or 'hazardous state of awareness,' this NASA grant focused on basic groundwork for the study of the psychophysiology of these states. Specifically, the purpose of this grant was to establish the necessary methodology for addressing three broad questions. The first is how the error-prone state should be conceptualized, and whether it is similar to a dissociative state, a hypnotic state, or absent-mindedness. Over 1200 subjects completed a variety of psychometric measures reflecting internal states and proneness to mental lapses and absent-mindedness; the study suggests that there exists a consistency of patterns displayed by individuals who self-report dissociative experiences, such that those individuals who score high on measures of dissociation also score high on measures of absent-mindedness, errors, and absorption, but not on scales of hypnotizability. The second broad question is whether some individuals are more prone to enter these states than others. In a second study, 14 young adults who scored either high or low on the Dissociative Experiences Scale performed a series of six tasks. This study suggests that high and low dissociative individuals arrive at the experiment in similar electrocortical states and perform cognitive tasks (e.g., mental math) in a similar manner; it is in the processing of internal emotional states that differences begin to emerge. The third question is whether recent research in nonlinear dynamics, i.e., chaos, offers an addition and/or alternative to traditional signal processing methods, i.e., fast Fourier transforms, and whether chaos procedures can be modified to offer additional information useful in identifying brain states. A preliminary review suggests that current nonlinear dynamical techniques such as dimensional analysis can be successfully applied to electrocortical activity. Using the data set developed in the study of the young adults, chaos analyses using the Farmer algorithm were performed; it is concluded that dimensionality measures reflect information not contained in traditional EEG Fourier analysis.
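
    For orientation, dimensional analyses of this kind are often based on a correlation-sum estimate; the sketch below uses the Grassberger-Procaccia procedure on a synthetic signal, not the grant's actual Farmer-algorithm pipeline, and the embedding parameters and radii are illustrative assumptions.

    ```python
    import numpy as np

    def correlation_sum(x, dim=5, lag=2, r=0.5):
        """Fraction of delay-vector pairs closer than radius r (Grassberger-Procaccia)."""
        n = len(x) - (dim - 1) * lag
        emb = np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])
        dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        iu = np.triu_indices(n, k=1)
        return (dists[iu] < r).mean()

    rng = np.random.default_rng(1)
    t = np.arange(600) * 0.05
    signal = np.sin(7 * t) + 0.05 * rng.normal(size=t.size)  # stand-in for an EEG trace
    # The slope of log C(r) versus log r over small radii estimates the dimension.
    radii = np.array([0.4, 0.6, 0.9, 1.35, 2.0])
    cs = np.array([correlation_sum(signal, r=r) for r in radii])
    slope = np.polyfit(np.log(radii), np.log(cs), 1)[0]
    print(f"estimated correlation dimension ≈ {slope:.2f}")
    ```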

  16. Simultaneous treatment of unspecified heteroskedastic model error distribution and mismeasured covariates for restricted moment models.

    PubMed

    Garcia, Tanya P; Ma, Yanyuan

    2017-10-01

    We develop consistent and efficient estimation of parameters in general regression models with mismeasured covariates. We assume the model error and covariate distributions are unspecified, and the measurement error distribution is a general parametric distribution with unknown variance-covariance. We construct root-n consistent, asymptotically normal and locally efficient estimators using the semiparametric efficient score. We do not estimate any unknown distribution or model error heteroskedasticity. Instead, we form the estimator under possibly incorrect working distribution models for the model error, error-prone covariate, or both. Empirical results demonstrate robustness to different incorrect working models in homoscedastic and heteroskedastic models with error-prone covariates.
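
    For readers unfamiliar with the setting, the block below writes out the generic restricted moment model with an additively mismeasured covariate; this is the standard textbook formulation, given here as an assumption, not necessarily the authors' exact notation.

    ```latex
    % Generic restricted moment model with an additively mismeasured covariate:
    \[
      E(Y \mid X, Z) = m(X, Z; \beta), \qquad
      W = X + U, \quad U \sim \mathcal{N}(0, \Sigma_u),
    \]
    % X is unobserved and W is its error-prone surrogate; the distributions of X
    % and of the model error Y - m(X, Z; \beta) (possibly heteroskedastic) are
    % left unspecified, and \beta is estimated at the root-n rate.
    ```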

  17. Exploring the relationship between boredom and sustained attention.

    PubMed

    Malkovsky, Ela; Merrifield, Colleen; Goldberg, Yael; Danckert, James

    2012-08-01

    Boredom is a common experience, prevalent in neurological and psychiatric populations, yet its cognitive characteristics remain poorly understood. We explored the relationship between boredom proneness, sustained attention and adult symptoms of attention deficit hyperactivity disorder (ADHD). The results showed that high boredom-prone (HBP) individuals performed poorly on measures of sustained attention and showed increased symptoms of ADHD and depression. The results also showed that HBP individuals can be characterised as either apathetic, in which the individual is unconcerned with his/her environment, or agitated, in which the individual is motivated to engage in meaningful activities, although attempts to do so fail to satisfy. Apathetic boredom proneness was associated with attention lapses, whereas agitated boredom proneness was associated with decreased sensitivity to errors of sustained attention, and increased symptoms of adult ADHD. Our results suggest there is a complex relationship between attention and boredom proneness.

  18. Belief-bias reasoning in non-clinical delusion-prone individuals.

    PubMed

    Anandakumar, T; Connaughton, E; Coltheart, M; Langdon, R

    2017-03-01

    It has been proposed that people with delusions have difficulty inhibiting beliefs (i.e., "doxastic inhibition") so as to reason about them as if they might not be true. We used a continuity approach to test this proposal in non-clinical adults scoring high and low in psychometrically assessed delusion-proneness. High delusion-prone individuals were expected to show greater difficulty than low delusion-prone individuals on "conflict" items of a "belief-bias" reasoning task (i.e. when required to reason logically about statements that conflicted with reality), but not on "non-conflict" items. Twenty high delusion-prone and twenty low delusion-prone participants (according to the Peters et al. Delusions Inventory) completed a belief-bias reasoning task and tests of IQ, working memory and general inhibition (Excluded Letter Fluency, Stroop and Hayling Sentence Completion). High delusion-prone individuals showed greater difficulty than low delusion-prone individuals on the Stroop and Excluded Letter Fluency tests of inhibition, but no greater difficulty on the conflict versus non-conflict items of the belief-bias task. They did, however, make significantly more errors overall on the belief-bias task, despite controlling for IQ, working memory and general inhibitory control. The study had a relatively small sample size and used non-clinical participants to test a theory of cognitive processing in individuals with clinically diagnosed delusions. Results failed to support a role for doxastic inhibitory failure in non-clinical delusion-prone individuals. These individuals did, however, show difficulty with conditional reasoning about statements that may or may not conflict with reality, independent of any general cognitive or inhibitory deficits. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Isolation and characterization of high affinity aptamers against DNA polymerase iota.

    PubMed

    Lakhin, Andrei V; Kazakov, Andrei A; Makarova, Alena V; Pavlov, Yuri I; Efremova, Anna S; Shram, Stanislav I; Tarantul, Viacheslav Z; Gening, Leonid V

    2012-02-01

    Human DNA polymerase iota (Pol ι) is an extremely error-prone enzyme whose fidelity depends on the sequence context of the template. Using the in vitro systematic evolution of ligands by exponential enrichment (SELEX) procedure, we obtained an oligoribonucleotide with high affinity for human Pol ι, named aptamer IKL5. We determined its dissociation constant with a homogeneous preparation of Pol ι and predicted its putative secondary structure. The aptamer IKL5 specifically inhibited the DNA-polymerase activity of the purified enzyme Pol ι but did not inhibit the DNA-polymerase activities of human DNA polymerases beta and kappa. IKL5 also suppressed the error-prone DNA-polymerase activity of Pol ι in cellular extracts of the tumor cell line SKOV-3. The aptamer IKL5 is useful for studies of the biological role of Pol ι and is a potential drug for suppressing the increased activity of this enzyme in malignant cells.

  20. Validation, Edits, and Application Processing Phase II and Error-Prone Model Report.

    ERIC Educational Resources Information Center

    Gray, Susan; And Others

    The impact of quality assurance procedures on the correct award of Basic Educational Opportunity Grants (BEOGs) for 1979-1980 was assessed, and a model for detecting error-prone applications early in processing was developed. The Bureau of Student Financial Aid introduced new comments into the edit system in 1979 and expanded the pre-established…

  21. Meiotic Divisions: No Place for Gender Equality.

    PubMed

    El Yakoubi, Warif; Wassmann, Katja

    2017-01-01

    In multicellular organisms the fusion of two gametes with a haploid set of chromosomes leads to the formation of the zygote, the first cell of the embryo. Accurate execution of the meiotic cell division to generate a female and a male gamete is required for the generation of healthy offspring harboring the correct number of chromosomes. Unfortunately, meiosis is error prone. This has severe consequences for fertility and, under certain circumstances, for the health of the offspring. In humans, female meiosis is extremely error prone. In this chapter we will compare male and female meiosis in humans to illustrate why and at which frequency errors occur, and describe how this affects pregnancy outcome and the health of the individual. We will first introduce key notions of cell division in meiosis and how they differ from mitosis, followed by a detailed description of the events that are prone to errors during the meiotic divisions.

  22. A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction

    NASA Astrophysics Data System (ADS)

    Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf

    2018-07-01

    Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real time. Unfortunately, low-cost depth sensors are prone to produce undesirable estimation noise in depth measurements, which results in depth outliers or introduces surface deformations into the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects; therefore, additional constraints such as steady sensor movement and high frame rates are required for high-quality 3D models. In this paper we propose a generic 3D fusion framework with a controlled regularization parameter that inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high-quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state-of-the-art techniques in terms of both absolute reconstruction error and processing time.
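
    Volumetric fusion frameworks of this kind typically maintain a per-voxel weighted running average of truncated signed distances. The sketch below adds an explicit regularization weight lam; the update rule is the standard TSDF average, and lam's exact role in the paper's framework is an assumption here.

    ```python
    def fuse_tsdf(value, weight, measurement, meas_weight, lam=0.5):
        """One voxel update: weighted running average of truncated signed distances.
        lam (the regularization parameter) scales down each noisy measurement's
        influence; lam = 1 recovers the plain running average."""
        w = lam * meas_weight
        fused = (weight * value + w * measurement) / (weight + w)
        return fused, weight + w

    # Toy single-voxel history: noisy signed distances to a surface at 0.
    value, weight = 0.0, 1.0
    for meas in (0.03, -0.05, 0.02, 0.01, -0.04):
        value, weight = fuse_tsdf(value, weight, meas, meas_weight=1.0)
    print(f"fused TSDF value: {value:+.4f} (accumulated weight {weight:.2f})")
    ```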

  23. Lung Basal Stem Cells Rapidly Repair DNA Damage Using the Error-Prone Nonhomologous End-Joining Pathway

    PubMed Central

    Weeden, Clare E.; Chen, Yunshun; Ma, Stephen B.; Hu, Yifang; Ramm, Georg; Sutherland, Kate D.; Smyth, Gordon K.

    2017-01-01

    Lung squamous cell carcinoma (SqCC), the second most common subtype of lung cancer, is strongly associated with tobacco smoking and exhibits genomic instability. The cellular origins and molecular processes that contribute to SqCC formation are largely unexplored. Here we show that human basal stem cells (BSCs) isolated from heavy smokers proliferate extensively, whereas their alveolar progenitor cell counterparts have limited colony-forming capacity. We demonstrate that this difference arises in part because of the ability of BSCs to repair their DNA more efficiently than alveolar cells following ionizing radiation or chemical-induced DNA damage. Analysis of mice harbouring a mutation in the DNA-dependent protein kinase catalytic subunit (DNA-PKcs), a key enzyme in DNA damage repair by nonhomologous end joining (NHEJ), indicated that BSCs preferentially repair their DNA by this error-prone process. Interestingly, polyploidy, a phenomenon associated with genetically unstable cells, was only observed in the human BSC subset. Expression signature analysis indicated that BSCs are the likely cells of origin of human SqCC and that high levels of NHEJ genes in SqCC are correlated with increasing genomic instability. Hence, our results favour a model in which heavy smoking promotes proliferation of BSCs, and their predilection for error-prone NHEJ could lead to the high mutagenic burden that culminates in SqCC. Targeting DNA repair processes may therefore have a role in the prevention and therapy of SqCC. PMID:28125611

  24. Two-Step Fair Scheduling of Continuous Media Streams over Error-Prone Wireless Channels

    NASA Astrophysics Data System (ADS)

    Oh, Soohyun; Lee, Jin Wook; Park, Taejoon; Jo, Tae-Chang

    In wireless cellular networks, streaming of continuous media (with strict QoS requirements) over wireless links is challenging due to their inherent unreliability characterized by location-dependent, bursty errors. To address this challenge, we present a two-step scheduling algorithm for a base station to provide streaming of continuous media to wireless clients over the error-prone wireless links. The proposed algorithm is capable of minimizing the packet loss rate of individual clients in the presence of error bursts, by transmitting packets in the round-robin manner and also adopting a mechanism for channel prediction and swapping.
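
    A toy sketch of the scheduling idea: serve clients round-robin, but when the channel predictor flags a client's slot as falling in an error burst, swap that slot to the next client predicted to be clean. The predictor and swap policy below are simplified assumptions, not the paper's exact algorithm.

    ```python
    def schedule_slots(clients, predict_bad, n_slots):
        """Round-robin slot assignment that swaps a slot away from a client
        whose channel is predicted to be in an error burst during that slot."""
        order = []
        for slot in range(n_slots):
            candidate = clients[slot % len(clients)]
            if predict_bad(candidate, slot):
                for j in range(1, len(clients)):
                    alt = clients[(slot + j) % len(clients)]
                    if not predict_bad(alt, slot):
                        candidate = alt   # swap: give the bad slot to a clean client
                        break
            order.append(candidate)
        return order

    def burst(client, slot):
        """Hypothetical predictor: client B suffers an error burst in slots 2-4."""
        return client == "B" and 2 <= slot <= 4

    print(schedule_slots(["A", "B", "C"], burst, 8))
    # ['A', 'B', 'C', 'A', 'C', 'C', 'A', 'B']; B's slot 4 is swapped to C
    ```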

  25. Comparing errors in Medicaid reporting across surveys: evidence to date.

    PubMed

    Call, Kathleen T; Davern, Michael E; Klerman, Jacob A; Lynch, Victoria

    2013-04-01

    Objective: To synthesize evidence on the accuracy of Medicaid reporting across state and federal surveys. Data sources: All available validation studies. Study design: Compare results from existing research to understand variation in reporting across surveys. Data collection methods: Synthesize all available studies validating survey reports of Medicaid coverage. Principal findings: Across all surveys, reporting some type of insurance coverage is better than reporting Medicaid specifically. Therefore, estimates of uninsurance are less biased than estimates of specific sources of coverage. The CPS stands out as being particularly inaccurate. Conclusions: Measuring health insurance coverage is prone to some level of error, yet survey overstatements of uninsurance are modest in most surveys. Accounting for all forms of bias is complex. Researchers should consider adjusting estimates of Medicaid and uninsurance in surveys prone to high levels of misreporting. © Health Research and Educational Trust.

  26. Update: Validation, Edits, and Application Processing. Phase II and Error-Prone Model Report.

    ERIC Educational Resources Information Center

    Gray, Susan; And Others

    An update to the Validation, Edits, and Application Processing and Error-Prone Model Report (Section 1, July 3, 1980) is presented. The objective is to present the most current data obtained from the June 1980 Basic Educational Opportunity Grant applicant and recipient files and to determine whether the findings reported in Section 1 of the July…

  27. Stem revenue losses with effective CDM management.

    PubMed

    Alwell, Michael

    2003-09-01

    Effective CDM management not only minimizes revenue losses due to denied claims, but also helps eliminate administrative costs associated with correcting coding errors. Accountability for CDM management should be assigned to a single individual, who ideally reports to the CFO or high-level finance director. If your organization is prone to making billing errors due to CDM deficiencies, you should consider purchasing CDM software to help you manage your CDM.

  28. Population size estimation in Yellowstone wolves with error-prone noninvasive microsatellite genotypes.

    PubMed

    Creel, Scott; Spong, Goran; Sands, Jennifer L; Rotella, Jay; Zeigle, Janet; Joe, Lawrence; Murphy, Kerry M; Smith, Douglas

    2003-07-01

    Determining population sizes can be difficult, but is essential for conservation. DNA from noninvasive samples (hair, faeces) allows estimation of population size by counting distinct microsatellite genotypes. Problems arise because genotypes from noninvasive samples are error-prone, but genotyping errors can be reduced by multiple polymerase chain reaction (PCR). For faecal genotypes from wolves in Yellowstone National Park, error rates varied substantially among samples, often above the 'worst-case threshold' suggested by simulation. Consequently, a substantial proportion of multilocus genotypes held one or more errors, despite multiple PCR. These genotyping errors created several genotypes per individual and caused overestimation (up to 5.5-fold) of population size. We propose a 'matching approach' to eliminate this overestimation bias.
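
    The abstract does not spell out the 'matching approach'; the sketch below shows the general idea of collapsing error-derived genotypes by merging multilocus genotypes that differ at no more than a small number of alleles. The mismatch threshold and toy data are illustrative assumptions.

    ```python
    def mismatches(g1, g2):
        """Number of differing alleles between two multilocus genotypes."""
        return sum(a != b for a, b in zip(g1, g2))

    def collapse_genotypes(genotypes, max_mismatch=1):
        """Greedily merge genotypes within max_mismatch of an accepted one,
        treating them as error-bearing replicates of the same individual."""
        individuals = []
        for g in genotypes:
            if not any(mismatches(g, ref) <= max_mismatch for ref in individuals):
                individuals.append(g)
        return individuals

    # Toy faecal genotypes at 4 loci; the second differs from the first at one
    # allele, as a genotyping error would.
    samples = [("120", "142", "98", "201"),
               ("120", "142", "98", "203"),   # likely the same wolf, one error
               ("118", "140", "96", "199")]
    print(len(collapse_genotypes(samples)), "individuals estimated")
    ```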

  29. Comparing errors in ED computer-assisted vs conventional pediatric drug dosing and administration.

    PubMed

    Yamamoto, Loren; Kanemori, Joan

    2010-06-01

    Compared to fixed-dose single-vial drug administration in adults, pediatric drug dosing and administration requires a series of calculations, all of which are potentially error prone. The purpose of this study is to compare error rates and task completion times for common pediatric medication scenarios using computer program assistance vs conventional methods. Two versions of a 4-part paper-based test were developed. Each part consisted of a set of medication administration and/or dosing tasks. Emergency department and pediatric intensive care unit nurse volunteers completed these tasks using both methods (with the sequence assigned to start with either the conventional or the computer-assisted approach). Completion times, errors, and the reason for each error were recorded. Thirty-eight nurses completed the study. Summing completion times across all 4 parts, the mean conventional total time was 1243 seconds vs the mean computer program total time of 879 seconds (P < .001). The conventional manual method produced a mean of 1.8 errors vs a mean of 0.7 errors for the computer program (P < .001). Of the 97 total errors, 36 were due to misreading the drug concentration on the label, 34 were due to calculation errors, and 8 were due to misplaced decimals. Of the 36 label interpretation errors, 18 (50%) occurred with digoxin or insulin. Computerized assistance reduced errors and the time required for drug administration calculations. A pattern of errors emerged: reading and interpreting certain drug labels was especially error prone. Optimizing the layout of drug labels could reduce the error rate for error-prone labels. Copyright (c) 2010 Elsevier Inc. All rights reserved.
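
    The kind of calculation such a program automates is sketched below: a weight-based dose capped at a maximum and converted to a volume from the vial concentration. Note that the concentration, normally the error-prone value read off the label, becomes an explicit parameter. All drug numbers are invented and are not clinical guidance.

    ```python
    def dose_and_volume(weight_kg, dose_mg_per_kg, max_dose_mg,
                        concentration_mg_per_ml):
        """Return (dose_mg, volume_ml): weight-based dose capped at the maximum,
        converted to the volume to draw up at the given vial concentration."""
        dose_mg = min(weight_kg * dose_mg_per_kg, max_dose_mg)
        return dose_mg, dose_mg / concentration_mg_per_ml

    # Hypothetical order: 15 mg/kg, max 1000 mg, vial concentration 100 mg/mL.
    dose, volume = dose_and_volume(weight_kg=18.0, dose_mg_per_kg=15.0,
                                   max_dose_mg=1000.0,
                                   concentration_mg_per_ml=100.0)
    print(f"give {dose:.0f} mg = {volume:.1f} mL")
    ```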

  30. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  31. Path-following in model predictive rollover prevention using front steering and braking

    NASA Astrophysics Data System (ADS)

    Ghazali, Mohammad; Durali, Mohammad; Salarieh, Hassan

    2017-01-01

    In this paper, vehicle path-following in the presence of rollover risk is investigated. Vehicles with a high centre of mass are prone to roll instability, and untripped rollover risk increases for such vehicles under high-friction road conditions. Previous studies have introduced strategies to handle short-duration rollover conditions; in these studies, however, trajectory tracking is degraded and not thoroughly investigated. This paper focuses on the tracking error that arises from rollover prevention. A lower-level model predictive front-steering controller is adopted to deal with rollover and tracking error as a priority sequence. A brake control is included in the lower-level controller, which directly obeys an upper-level controller (ULC) command. The ULC manages vehicle speed with regard primarily to tracking error. Simulation results show that the proposed control framework maintains roll stability while tracking error is confined to a predefined error limit.
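
    A highly simplified sketch of the supervisory logic implied above: the upper-level loop cuts the speed command when the predicted load-transfer ratio approaches the rollover limit and otherwise passes the tracking speed target through. The static LTR approximation is a standard textbook formula; the vehicle numbers and threshold are invented, and the real controller is a model predictive one, not this rule.

    ```python
    def load_transfer_ratio(lat_accel_ms2, cg_height_m, track_width_m, g=9.81):
        """Static load-transfer ratio approximation; |LTR| -> 1 means wheel lift-off."""
        return 2.0 * cg_height_m * lat_accel_ms2 / (track_width_m * g)

    def speed_command(v_target, lat_accel, cg_h=1.2, track=1.8, ltr_limit=0.8):
        """Upper-level rule: back off the speed target when rollover risk dominates."""
        ltr = abs(load_transfer_ratio(lat_accel, cg_h, track))
        if ltr > ltr_limit:
            return v_target * ltr_limit / ltr   # slow down proportionally
        return v_target

    for ay in (2.0, 5.0, 7.5):                  # lateral acceleration in m/s^2
        print(ay, "->", round(speed_command(20.0, ay), 1), "m/s")
    ```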

  32. Comparing Errors in Medicaid Reporting across Surveys: Evidence to Date

    PubMed Central

    Call, Kathleen T; Davern, Michael E; Klerman, Jacob A; Lynch, Victoria

    2013-01-01

    Objective To synthesize evidence on the accuracy of Medicaid reporting across state and federal surveys. Data Sources All available validation studies. Study Design Compare results from existing research to understand variation in reporting across surveys. Data Collection Methods Synthesize all available studies validating survey reports of Medicaid coverage. Principal Findings Across all surveys, reporting some type of insurance coverage is better than reporting Medicaid specifically. Therefore, estimates of uninsurance are less biased than estimates of specific sources of coverage. The CPS stands out as being particularly inaccurate. Conclusions Measuring health insurance coverage is prone to some level of error, yet survey overstatements of uninsurance are modest in most surveys. Accounting for all forms of bias is complex. Researchers should consider adjusting estimates of Medicaid and uninsurance in surveys prone to high levels of misreporting. PMID:22816493

  33. Development of a Dependency Theory Toolbox for Database Design.

    DTIC Science & Technology

    1987-12-01

    …Much of the theory needed to design and study relational databases exists in the form of published algorithms and theorems. However, hand simulating these algorithms can be a tedious and error-prone chore. Therefore, a toolbox of algorithms and…
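
    A representative example of the sort of published algorithm such a toolbox would mechanize is attribute-set closure under functional dependencies, used to test implied dependencies and find candidate keys. This is the textbook algorithm, not code from the report.

    ```python
    def closure(attrs, fds):
        """Closure of an attribute set under functional dependencies.
        fds: iterable of (lhs, rhs) pairs of attribute sets."""
        result = set(attrs)
        changed = True
        while changed:
            changed = False
            for lhs, rhs in fds:
                if set(lhs) <= result and not set(rhs) <= result:
                    result |= set(rhs)   # lhs -> rhs fires, add rhs
                    changed = True
        return result

    # R(A, B, C, D) with A -> B and B, C -> D: does {A, C} determine D?
    fds = [({"A"}, {"B"}), ({"B", "C"}, {"D"})]
    print(closure({"A", "C"}, fds))  # {'A', 'B', 'C', 'D'} -> yes
    ```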

  34. Absence of Mutagenic Activity of Hycanthone in Serratia marcescens.

    DTIC Science & Technology

    1986-05-29

    …repair system but is enhanced by the plasmid pKM101, which mediates the inducible error-prone repair system. Hycanthone, like proflavin, intercalates between the stacked bases… Roth (1974) have suggested that proflavin, which has a planar triple-ring structure similar to hycanthone, interacts with DNA, which upon replication…

  35. Errors Affect Hypothetical Intertemporal Food Choice in Women

    PubMed Central

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2014-01-01

    Growing evidence suggests that the ability to control behavior is enhanced in contexts in which errors are more frequent. Here we investigated whether pairing desirable food with errors could decrease impulsive choice during hypothetical temporal decisions about food. To this end, healthy women performed a Stop-signal task in which one food cue predicted high-error rate, and another food cue predicted low-error rate. Afterwards, we measured participants’ intertemporal preferences during decisions between smaller-immediate and larger-delayed amounts of food. We expected reduced sensitivity to smaller-immediate amounts of food associated with high-error rate. Moreover, taking into account that deprivational states affect sensitivity for food, we controlled for participants’ hunger. Results showed that pairing food with high-error likelihood decreased temporal discounting. This effect was modulated by hunger, indicating that, the lower the hunger level, the more participants showed reduced impulsive preference for the food previously associated with a high number of errors as compared with the other food. These findings reveal that errors, which are motivationally salient events that recruit cognitive control and drive avoidance learning against error-prone behavior, are effective in reducing impulsive choice for edible outcomes. PMID:25244534

  36. OPTIMA: sensitive and accurate whole-genome alignment of error-prone genomic maps by combinatorial indexing and technology-agnostic statistical analysis.

    PubMed

    Verzotto, Davide; M Teo, Audrey S; Hillmer, Axel M; Nagarajan, Niranjan

    2016-01-01

    Resolution of complex repeat structures and rearrangements in the assembly and analysis of large eukaryotic genomes is often aided by a combination of high-throughput sequencing and genome-mapping technologies (for example, optical restriction mapping). In particular, mapping technologies can generate sparse maps of large DNA fragments (150 kilobase pairs (kbp) to 2 Mbp) and thus provide a unique source of information for disambiguating complex rearrangements in cancer genomes. Despite their utility, combining high-throughput sequencing and mapping technologies has been challenging because of the lack of efficient and sensitive map-alignment algorithms for robustly aligning error-prone maps to sequences. We introduce a novel seed-and-extend glocal (short for global-local) alignment method, OPTIMA (and a sliding-window extension for overlap alignment, OPTIMA-Overlap), which is the first to create indexes for continuous-valued mapping data while accounting for mapping errors. We also present a novel statistical model, agnostic with respect to technology-dependent error rates, for conservatively evaluating the significance of alignments without relying on expensive permutation-based tests. We show that OPTIMA and OPTIMA-Overlap outperform other state-of-the-art approaches (1.6-2 times more sensitive) and are more efficient (170-200%) and precise in their alignments (nearly 99% precision). These advantages are independent of the quality of the data, suggesting that our indexing approach and statistical evaluation are robust, provide improved sensitivity and guarantee high precision.
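
    To make the seed-and-extend idea concrete for continuous-valued maps, the sketch below indexes quantized fragment-size k-mers as seeds and extends a candidate placement with an error-tolerant size comparison. This is a drastic simplification of OPTIMA; the quantization, tolerance, and maps are invented.

    ```python
    from collections import defaultdict

    def build_seed_index(ref_frags, k=3, q=2.0):
        """Index every k-mer of reference fragment sizes, quantized to q kbp."""
        index = defaultdict(list)
        for i in range(len(ref_frags) - k + 1):
            key = tuple(round(f / q) for f in ref_frags[i:i + k])
            index[key].append(i)
        return index

    def align(map_frags, ref_frags, k=3, q=2.0, tol=0.15):
        """Seed with the map's first k fragments, then extend while sizes agree
        within a relative tolerance. Returns (ref_offset, matched_length) or None."""
        index = build_seed_index(ref_frags, k, q)
        seed = tuple(round(f / q) for f in map_frags[:k])
        for start in index.get(seed, []):
            length = 0
            for m, r in zip(map_frags, ref_frags[start:]):
                if abs(m - r) > tol * r:   # sizing error beyond tolerance
                    break
                length += 1
            if length >= k:
                return start, length
        return None

    ref = [12.1, 33.0, 8.2, 25.4, 19.9, 41.3, 7.7]   # in-silico map (kbp)
    noisy = [32.1, 8.5, 26.9, 20.8]                  # error-prone experimental map
    print(align(noisy, ref))                         # (1, 4) under these tolerances
    ```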

  37. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Error-Prone Model Derived from 1978-1979 Quality Control Study. Data Report. [Task 3.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; Kuchak, JoAnn

    An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…
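
    The abstract leaves the model unspecified; as a generic illustration only, an error-prone model of this sort can be read as scoring applications on weighted risk flags and routing high scorers to validation. The flags, weights, and threshold below are entirely invented.

    ```python
    # Hypothetical risk flags with weights such a model might assign after calibration.
    WEIGHTS = {
        "income_rounded_to_thousand": 1.5,
        "household_size_changed": 1.0,
        "zero_assets_reported": 2.0,
    }

    def error_prone_score(application: dict) -> float:
        """Sum the weights of the risk flags present on one application."""
        return sum(w for flag, w in WEIGHTS.items() if application.get(flag))

    def select_for_validation(applications, threshold=2.0):
        return [a["id"] for a in applications
                if error_prone_score(a) >= threshold]

    apps = [
        {"id": 1, "income_rounded_to_thousand": True, "zero_assets_reported": True},
        {"id": 2, "household_size_changed": True},
    ]
    print(select_for_validation(apps))  # [1]
    ```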

  38. Random mutagenesis of BoNT/E Hc nanobody to construct a secondary phage-display library.

    PubMed

    Shahi, B; Mousavi Gargari, S L; Rasooli, I; Rajabi Bazl, M; Hoseinpoor, R

    2014-08-01

    To construct a secondary mutant phage-display library of recombinant single variable domains (VHH) against botulinum neurotoxin E by error-prone PCR. The gene coding for the specific VHH derived from the camel immunized with the binding domain of botulinum neurotoxin E (BoNT/E) was amplified by error-prone PCR. Several biopanning rounds were used to screen the phage-displayed BoNT/E Hc nanobodies. The final nanobody, SHMR4, recognized the BoNT/E toxin with increased affinity and showed no cross-reactivity with other antigens, in particular the related BoNT toxins. The constructed nanobody could be a suitable candidate for VHH-based biosensor production to detect Clostridium botulinum type E. Diagnosis and treatment of botulinum neurotoxin poisoning are important. Generation of high-affinity antibodies through the construction of secondary libraries with an affinity maturation step enables the development of reagents for precise diagnosis and therapy. © 2014 The Society for Applied Microbiology.

  39. Improving Advising Using Technology and Data Analytics

    ERIC Educational Resources Information Center

    Phillips, Elizabeth D.

    2013-01-01

    Traditionally, the collegiate advising system provides each student with a personal academic advisor who designs a pathway to the degree for that student in face-to-face meetings. Ideally, this is a supportive mentoring relationship. In truth, however, this system is highly inefficient, error prone, expensive, and a source of ubiquitous student…

  40. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  2. A prospective three-step intervention study to prevent medication errors in drug handling in paediatric care.

    PubMed

    Niemann, Dorothee; Bertsche, Astrid; Meyrath, David; Koepf, Ellen D; Traiser, Carolin; Seebald, Katja; Schmitt, Claus P; Hoffmann, Georg F; Haefeli, Walter E; Bertsche, Thilo

    2015-01-01

    To prevent medication errors in drug handling in a paediatric ward. One in five preventable adverse drug events in hospitalised children is caused by medication errors. Errors in drug prescription have been studied frequently, but data regarding drug handling, including drug preparation and administration, are scarce. A three-step intervention study including a monitoring procedure was used to detect and prevent medication errors in drug handling. After approval by the ethics committee, pharmacists monitored drug handling by nurses on an 18-bed paediatric ward in a university hospital prior to and following each intervention step. They also conducted a questionnaire survey aimed at identifying knowledge deficits. Each intervention step targeted different causes of errors. The handout mainly addressed knowledge deficits, the training course addressed errors caused by rule violations and slips, and the reference book addressed knowledge-, memory- and rule-based errors. The number of patients who were subjected to at least one medication error in drug handling decreased from 38/43 (88%) to 25/51 (49%) following the third intervention, and the overall frequency of errors decreased from 527 errors in 581 processes (91%) to 116/441 (26%). Issuing the handout reduced medication errors caused by knowledge deficits regarding, for instance, the correct 'volume of solvent for IV drugs' from 49% to 25%. Paediatric drug handling is prone to errors. A three-step intervention effectively decreased the high frequency of medication errors by addressing the diversity of their causes. Worldwide, nurses are in charge of drug handling, which constitutes an error-prone but often-neglected step in drug therapy. Detection and prevention of errors in daily routine is necessary for a safe and effective drug therapy. Our three-step intervention reduced errors and is suitable to be tested in other wards and settings. © 2014 John Wiley & Sons Ltd.

  3. Somatic immunoglobulin hypermutation

    PubMed Central

    Diaz, Marilyn; Casali, Paolo

    2015-01-01

    Immunoglobulin hypermutation provides the structural correlate for the affinity maturation of the antibody response. Characteristic modalities of this mechanism include a preponderance of point mutations, with a prevalence of transitions over transversions, and the RGYW mutational hotspot sequence. Recent evidence suggests a mechanism whereby DNA breaks induce error-prone DNA synthesis in immunoglobulin V(D)J regions by error-prone DNA polymerases. The nature of the targeting mechanism and the trans-factors effecting such breaks and their repair remain to be determined. PMID:11869898

  4. Topological quantum computing with a very noisy network and local error rates approaching one percent.

    PubMed

    Nickerson, Naomi H; Li, Ying; Benjamin, Simon C

    2013-01-01

    A scalable quantum computer could be built by networking together many simple processor cells, thus avoiding the need to create a single complex structure. The difficulty is that realistic quantum links are very error prone. A solution is for cells to repeatedly communicate with each other and so purify any imperfections; however, prior studies suggest that the cells themselves must then have prohibitively low internal error rates. Here we describe a method by which even error-prone cells can perform purification: groups of cells generate shared resource states, which then enable stabilization of topologically encoded data. Given a realistically noisy network (≥10% error rate), we find that our protocol can succeed provided that intra-cell error rates for initialisation, state manipulation and measurement are below 0.82%. This level of fidelity is already achievable in several laboratory systems.

  5. Associations of hallucination proneness with free-recall intrusions and response bias in a nonclinical sample.

    PubMed

    Brébion, Gildas; Larøi, Frank; Van der Linden, Martial

    2010-10-01

    Hallucinations in patients with schizophrenia have been associated with a liberal response bias in signal detection and recognition tasks and with various types of source-memory error. We investigated the associations of hallucination proneness with free-recall intrusions and false recognitions of words in a nonclinical sample. A total of 81 healthy individuals were administered a verbal memory task involving free recall and recognition of one nonorganizable and one semantically organizable list of words. Hallucination proneness was assessed by means of a self-rating scale. Global hallucination proneness was associated with free-recall intrusions in the nonorganizable list and with a response bias reflecting tendency to make false recognitions of nontarget words in both types of list. The verbal hallucination score was associated with more intrusions and with a reduced tendency to make false recognitions of words. The associations between global hallucination proneness and two types of verbal memory error in a nonclinical sample corroborate those observed in patients with schizophrenia and suggest that common cognitive mechanisms underlie hallucinations in psychiatric and nonclinical individuals.

  6. How Alterations in the Cdt1 Expression Lead to Gene Amplification in Breast Cancer

    DTIC Science & Technology

    2011-07-01

    absence of extrinsic DNA damage. We measured the TLS activity by measuring the mutation frequency in a supF gene (in a shuttle vector) subjected to UV...induced DNA damage before its introduction into the cells. Error-prone TLS activity will mutate the supF gene, which is scored by a blue-white colony...Figure 4A). Sequencing of the mutant supF genes revealed a mutation spectrum consistent with error-prone TLS (Supplemental Table 1). Significantly

  7. Error-prone meiotic division and subfertility in mice with oocyte-conditional knockdown of pericentrin.

    PubMed

    Baumann, Claudia; Wang, Xiaotian; Yang, Luhan; Viveiros, Maria M

    2017-04-01

    Mouse oocytes lack canonical centrosomes and instead contain unique acentriolar microtubule-organizing centers (aMTOCs). To test the function of these distinct aMTOCs in meiotic spindle formation, pericentrin (Pcnt), an essential centrosome/MTOC protein, was knocked down exclusively in oocytes by using a transgenic RNAi approach. Here, we provide evidence that disruption of aMTOC function in oocytes promotes spindle instability and severe meiotic errors that lead to pronounced female subfertility. Pcnt-depleted oocytes from transgenic (Tg) mice were ovulated at the metaphase-II stage, but showed significant chromosome misalignment, aneuploidy and premature sister chromatid separation. These defects were associated with loss of key Pcnt-interacting proteins (γ-tubulin, Nedd1 and Cep215) from meiotic spindle poles, altered spindle structure and chromosome-microtubule attachment errors. Live-cell imaging revealed disruptions in the dynamics of spindle assembly and organization, together with chromosome attachment and congression defects. Notably, spindle formation was dependent on Ran GTPase activity in Pcnt-deficient oocytes. Our findings establish that meiotic division is highly error-prone in the absence of Pcnt and disrupted aMTOCs, similar to what reportedly occurs in human oocytes. Moreover, these data underscore crucial differences between MTOC-dependent and -independent meiotic spindle assembly. © 2017. Published by The Company of Biologists Ltd.

  8. Proximal antecedents and correlates of adopted error approach: a self-regulatory perspective.

    PubMed

    Van Dyck, Cathy; Van Hooft, Edwin; De Gilder, Dick; Liesveld, Lillian

    2010-01-01

    The current study aims to further investigate earlier established advantages of an error mastery approach over an error aversion approach. The two main purposes of the study relate to (1) self-regulatory traits (i.e., goal orientation and action-state orientation) that may predict which error approach (mastery or aversion) is adopted, and (2) proximal, psychological processes (i.e., self-focused attention and failure attribution) that relate to the adopted error approach. In the current study participants' goal orientation and action-state orientation were assessed, after which they worked on an error-prone task. Results show that learning goal orientation related to error mastery, while state orientation related to error aversion. Under a mastery approach, error occurrence did not result in cognitive resources "wasted" on self-consciousness. Rather, attention went to internal-unstable, and thus controllable, improvement-oriented causes of error. Participants who had adopted an aversion approach, in contrast, experienced heightened self-consciousness and attributed failure to internal-stable or external causes. These results imply that when working on an error-prone task, people should be stimulated to take on a mastery rather than an aversion approach towards errors.

  9. Improved acid tolerance of Lactobacillus pentosus by error-prone whole genome amplification.

    PubMed

    Ye, Lidan; Zhao, Hua; Li, Zhi; Wu, Jin Chuan

    2013-05-01

    Acid tolerance of Lactobacillus pentosus ATCC 8041 was improved by error-prone amplification of its genomic DNA using random primers and Taq DNA polymerase. The resulting amplification products were transferred into wild-type L. pentosus by electroporation and the transformants were screened for growth on low-pH agar plates. After only one round of mutation, one mutant (MT3) was identified that was able to completely consume 20 g/L of glucose to produce lactic acid at a yield of 95% in 1 L of MRS medium at pH 3.8 within 36 h, whereas no growth or lactic acid production was observed for the wild-type strain under the same conditions. The acid tolerance of mutant MT3 remained genetically stable for at least 25 subcultures. Therefore, the error-prone whole genome amplification technique is a very powerful tool for improving phenotypes of this lactic acid bacterium and may also be applicable for other microorganisms. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. [Effect of Mn(II) on the error-prone DNA polymerase iota activity in extracts from human normal and tumor cells].

    PubMed

    Lakhin, A V; Efremova, A S; Makarova, I V; Grishina, E E; Shram, S I; Tarantul, V Z; Gening, L V

    2013-01-01

    The DNA polymerase iota (Pol iota), which has some peculiar features and is characterized by extremely error-prone DNA synthesis, belongs to the group of enzymes preferentially activated by Mn2+ instead of Mg2+. In this work, the effect of Mn2+ on DNA synthesis in cell extracts from a) normal human and murine tissues, b) human tumor (uveal melanoma), and c) cultured human tumor cell lines SKOV-3 and HL-60 was tested. Each group displayed characteristic features of Mn-dependent DNA synthesis. The changes in Mn-dependent DNA synthesis caused by malignant transformation of normal tissues are described. It was also shown that the error-prone DNA synthesis catalyzed by Pol iota in extracts of all cell types was efficiently suppressed by an RNA aptamer (IKL5) against Pol iota obtained earlier in our work. These results suggest that IKL5 might be used to suppress the enhanced activity of Pol iota in tumor cells.

  11. Random mutagenesis by error-prone Pol I plasmid replication in Escherichia coli.

    PubMed

    Alexander, David L; Lilly, Joshua; Hernandez, Jaime; Romsdahl, Jillian; Troll, Christopher J; Camps, Manel

    2014-01-01

    Directed evolution is an approach that mimics natural evolution in the laboratory with the goal of modifying existing enzymatic activities or of generating new ones. The identification of mutants with desired properties involves the generation of genetic diversity coupled with a functional selection or screen. Genetic diversity can be generated using PCR or using in vivo methods such as chemical mutagenesis or error-prone replication of the desired sequence in a mutator strain. In vivo mutagenesis methods facilitate iterative selection because they do not require cloning, but generally produce a low mutation density with mutations not restricted to specific genes or areas within a gene. For this reason, this approach is typically used to generate new biochemical properties when large numbers of mutants can be screened or selected. Here we describe protocols for an advanced in vivo mutagenesis method that is based on error-prone replication of a ColE1 plasmid bearing the gene of interest. Compared to other in vivo mutagenesis methods, this plasmid-targeted approach allows increased mutation loads and facilitates iterative selection approaches. We also describe the mutation spectrum for this mutagenesis methodology in detail, and, using cycle 3 GFP as a target for mutagenesis, we illustrate the phenotypic diversity that can be generated using our method. In sum, error-prone Pol I replication is a mutagenesis method that is ideally suited for the evolution of new biochemical activities when a functional selection is available.

  12. WISC-R Examiner Errors: Cause for Concern.

    ERIC Educational Resources Information Center

    Slate, John R.; Chick, David

    1989-01-01

    Clinical psychology graduate students (N=14) administered Wechsler Intelligence Scale for Children-Revised. Found numerous scoring and mechanical errors that influenced full-scale intelligence quotient scores on two-thirds of protocols. Particularly prone to error were Verbal subtests of Vocabulary, Comprehension, and Similarities. Noted specific…

  13. Adaptive Constructive Processes and the Future of Memory

    ERIC Educational Resources Information Center

    Schacter, Daniel L.

    2012-01-01

    Memory serves critical functions in everyday life but is also prone to error. This article examines adaptive constructive processes, which play a functional role in memory and cognition but can also produce distortions, errors, and illusions. The article describes several types of memory errors that are produced by adaptive constructive processes…

  14. Error-proneness as a handicap signal.

    PubMed

    De Jaegher, Kris

    2003-09-21

    This paper describes two discrete signalling models in which the error-proneness of signals can serve as a handicap signal. In the first model, the direct handicap of sending a high-quality signal is not large enough to assure that a low-quality signaller will not send it. However, if the receiver sometimes mistakes a high-quality signal for a low-quality one, then there is an indirect handicap to sending a high-quality signal. The total handicap of sending such a signal may then still be such that a low-quality signaller would not want to send it. In the second model, there is no direct handicap of sending signals, so that nothing would seem to stop a signaller from always sending a high-quality signal. However, the receiver sometimes fails to detect signals, and this causes an indirect handicap of sending a high-quality signal that still stops the low-quality signaller from sending such a signal. The conditions for honesty are that the probability of an error of detection is higher for a high-quality than for a low-quality signal, and that the receiver who does not detect a signal adopts a response that is bad for the signaller. In both our models, we thus obtain the result that signal accuracy should not lie above a certain level in order for honest signalling to be possible. Moreover, we show that the maximal accuracy that can be achieved is higher the lower the degree of conflict between signaller and receiver. Finally, we show that it may be the conditions for honest signalling that constrain signal accuracy, rather than the signaller making honest signals as effective as possible given receiver psychology, or the signaller adapting the accuracy of honest signals to his interests.

  15. The Significance of the Record Length in Flood Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Senarath, S. U.

    2013-12-01

    Of all potential natural hazards, flooding is the most costly in many regions of the world. For example, floods cause over a third of Europe's average annual catastrophe losses and affect about two thirds of the people impacted by natural catastrophes. Increased attention is being paid to determining flow estimates associated with pre-specified return periods so that flood-prone areas can be adequately protected against floods of particular magnitudes or return periods. Flood frequency analysis, conducted by fitting an appropriate probability density function to the observed annual maximum flow data, is frequently used to obtain these flow estimates, and it therefore plays an integral role in determining flood risk in flood-prone watersheds. A long annual maximum flow record is vital for obtaining accurate estimates of discharges associated with high-return-period flows. However, in many areas of the world, flood frequency analysis is conducted with limited flow data or short annual maximum flow records. These inevitably lead to flow estimates that are subject to error, especially high-return-period flow estimates. In this study, several statistical techniques are used to identify errors caused by short annual maximum flow records. The flow estimates used in the error analysis are obtained by fitting a log-Pearson III distribution to the flood time series. These errors can then be used to better evaluate return-period flows in data-limited streams. The study findings therefore have important implications for hydrologists, water resources engineers, and floodplain managers.
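
    To make the record-length effect concrete, the following Python sketch fits a log-Pearson III distribution (scipy's pearson3 applied to log10 flows) to synthetic annual maxima and shows how the spread of the estimated 100-year flow widens as the record shortens. The distribution parameters and record lengths are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical "true" log10-flow distribution (Pearson III); the
# skew/loc/scale values are illustrative assumptions only.
SKEW, LOC, SCALE = 0.2, 2.5, 0.25

def sample_annual_maxima(n):
    """Draw n synthetic annual maximum flows (m^3/s)."""
    return 10 ** stats.pearson3.rvs(SKEW, loc=LOC, scale=SCALE,
                                    size=n, random_state=rng)

def q_T(flows, T=100.0):
    """Log-Pearson III estimate of the T-year flow from a flow record."""
    skew, loc, scale = stats.pearson3.fit(np.log10(flows))
    return 10 ** stats.pearson3.ppf(1.0 - 1.0 / T, skew, loc=loc, scale=scale)

for n in (20, 50, 100):  # short to long annual-maximum records
    est = [q_T(sample_annual_maxima(n)) for _ in range(200)]
    lo, hi = np.percentile(est, [5, 95])
    print(f"n={n:3d}  median Q100={np.median(est):7.0f}  90% band=({lo:.0f}, {hi:.0f})")
```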

  16. Registration of prone and supine CT colonography scans using correlation optimized warping and canonical correlation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Shijun; Yao Jianhua; Liu Jiamin

    Purpose: In computed tomographic colonography (CTC), a patient will be scanned twice--once supine and once prone--to improve the sensitivity for polyp detection. To assist radiologists in CTC reading, in this paper we propose an automated method for colon registration from supine and prone CTC scans. Methods: We propose a new colon centerline registration method for prone and supine CTC scans using correlation optimized warping (COW) and canonical correlation analysis (CCA) based on the anatomical structure of the colon. Four anatomical salient points on the colon are first automatically distinguished. Then correlation optimized warping is applied to the segments defined by the anatomical landmarks to improve the global registration based on local correlation of segments. The COW method was modified by embedding canonical correlation analysis to allow multiple features along the colon centerline to be used in our implementation. Results: We tested the COW algorithm on a CTC data set of 39 patients with 39 polyps (19 training and 20 test cases) to verify the effectiveness of the proposed COW registration method. Experimental results on the test set show that the COW method significantly reduces the average estimation error in a polyp location between supine and prone scans by 67.6%, from 46.27±52.97 mm to 14.98±11.41 mm, compared to the normalized distance along the colon centerline algorithm (p<0.01). Conclusions: The proposed COW algorithm is more accurate for the colon centerline registration compared to the normalized distance along the colon centerline method and the dynamic time warping method. Comparison results showed that the feature combination of z-coordinate and curvature achieved the lowest registration error compared to the other feature combinations used by COW. The proposed method is tolerant to centerline errors because anatomical landmarks help prevent the propagation of errors across the entire colon centerline.

  17. Externalizing psychopathology and gain-loss feedback in a simulated gambling task: dissociable components of brain response revealed by time-frequency analysis.

    PubMed

    Bernat, Edward M; Nelson, Lindsay D; Steele, Vaughn R; Gehring, William J; Patrick, Christopher J

    2011-05-01

    Externalizing is a broad construct that reflects propensity toward a variety of impulse control problems, including antisocial personality disorder and substance use disorders. Two event-related potential responses known to be reduced among individuals high in externalizing proneness are the P300, which reflects postperceptual processing of a stimulus, and the error-related negativity (ERN), which indexes performance monitoring based on endogenous representations. In the current study, the authors used a simulated gambling task to examine the relation between externalizing proneness and the feedback-related negativity (FRN), a brain response that indexes performance monitoring related to exogenous cues, which is thought to be highly related to the ERN. Time-frequency (TF) analysis was used to disentangle the FRN from the accompanying P300 response to feedback cues by parsing the overall feedback-locked potential into distinctive theta (4-7 Hz) and delta (<3 Hz) TF components. Whereas delta-P300 amplitude was reduced among individuals high in externalizing proneness, theta-FRN response was unrelated to externalizing. These findings suggest that in contrast with previously reported deficits in endogenously based performance monitoring (as indexed by the ERN), individuals prone to externalizing problems show intact monitoring of exogenous cues (as indexed by the FRN). The results also contribute to a growing body of evidence indicating that the P300 is attenuated across a broad range of task conditions in high-externalizing individuals.

  18. Dynamic power scheduling system for JPEG2000 delivery over wireless networks

    NASA Astrophysics Data System (ADS)

    Martina, Maurizio; Vacca, Fabrizio

    2003-06-01

    The diffusion of third-generation mobile terminals is encouraging the development of new multimedia-based applications. The reliable transmission of audiovisual content will gain major interest as one of the most valuable services. Nevertheless, the mobile scenario is severely power-constrained: high compression ratios and refined energy-management strategies are highly advisable. JPEG2000 as the source-encoding stage assures excellent performance with extremely good visual quality. However, the limited power budget makes it necessary to limit the computational effort in order to save as much power as possible. Since the wireless environment is error prone, strong error-resilience features need to be employed. This paper investigates the trade-off between quality and power in such a challenging environment.

  19. An Improved Unsupervised Image Segmentation Evaluation Approach Based on Under- and Over-Segmentation Aware

    NASA Astrophysics Data System (ADS)

    Su, Tengfei

    2018-04-01

    In this paper, an unsupervised evaluation scheme for remote sensing image segmentation is developed. Building on a method called under- and over-segmentation aware (UOA), the new approach overcomes a defect in the estimation of over-segmentation error. Two cases of this error-prone defect are listed, and edge strength is employed to devise a solution. Two subsets of high-resolution remote sensing images were used to test the proposed algorithm, and the experimental results indicate its superior performance, which is attributed to the improved over-segmentation-error detection model.

  20. The importance of robust error control in data compression applications

    NASA Technical Reports Server (NTRS)

    Woolley, S. I.

    1993-01-01

    Data compression has become an increasingly popular option as advances in information technology have placed further demands on data storage capabilities. With compression ratios as high as 100:1, the benefits are clear; however, the inherent intolerance of many compression formats to error events should be given careful consideration. If we consider that efficiently compressed data will ideally contain no redundancy, then the introduction of a channel error must result in a change of understanding from that of the original source. While the prefix property of codes such as Huffman enables resynchronisation, this is not sufficient to arrest propagating errors in an adaptive environment. Arithmetic, Lempel-Ziv, discrete cosine transform (DCT) and fractal methods are similarly prone to error-propagating behaviors. It is, therefore, essential that compression implementations provide sufficiently robust error control in order to maintain data integrity. Ideally, this control should be derived from a full understanding of the prevailing error mechanisms and their interaction with both the system configuration and the compression schemes in use.
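
    To make the fragility concrete, a minimal Python sketch: compress a redundant buffer with zlib (DEFLATE combines LZ77 dictionary coding with Huffman coding), flip one bit mid-stream, and try to decompress. The message text is invented for the example.

```python
import zlib

message = (b"Efficiently compressed data carries little redundancy, so a "
           b"single corrupted bit can invalidate the entire stream. " * 20)
corrupted = bytearray(zlib.compress(message, 9))

corrupted[len(corrupted) // 2] ^= 0x01  # flip one bit mid-stream

try:
    out = zlib.decompress(bytes(corrupted))
    print("decoded without exception; data intact:", out == message)
except zlib.error as exc:
    # Typical outcome: the corrupted token stream (or checksum) is rejected.
    print("decompression failed outright:", exc)
```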

  1. Precise and heritable genome editing in evolutionarily diverse nematodes using TALENs and CRISPR/Cas9 to engineer insertions and deletions.

    PubMed

    Lo, Te-Wen; Pickle, Catherine S; Lin, Steven; Ralston, Edward J; Gurling, Mark; Schartner, Caitlin M; Bian, Qian; Doudna, Jennifer A; Meyer, Barbara J

    2013-10-01

    Exploitation of custom-designed nucleases to induce DNA double-strand breaks (DSBs) at genomic locations of choice has transformed our ability to edit genomes, regardless of their complexity. DSBs can trigger either error-prone repair pathways that induce random mutations at the break sites or precise homology-directed repair pathways that generate specific insertions or deletions guided by exogenously supplied DNA. Prior editing strategies using site-specific nucleases to modify the Caenorhabditis elegans genome achieved only the heritable disruption of endogenous loci through random mutagenesis by error-prone repair. Here we report highly effective strategies using TALE nucleases and RNA-guided CRISPR/Cas9 nucleases to induce error-prone repair and homology-directed repair to create heritable, precise insertion, deletion, or substitution of specific DNA sequences at targeted endogenous loci. Our robust strategies are effective across nematode species diverged by 300 million years, including necromenic nematodes (Pristionchus pacificus), male/female species (Caenorhabditis species 9), and hermaphroditic species (C. elegans). Thus, genome-editing tools now exist to transform nonmodel nematode species into genetically tractable model organisms. We demonstrate the utility of our broadly applicable genome-editing strategies by creating reagents generally useful to the nematode community and reagents specifically designed to explore the mechanism and evolution of X chromosome dosage compensation. By developing an efficient pipeline involving germline injection of nuclease mRNAs and single-stranded DNA templates, we engineered precise, heritable nucleotide changes both close to and far from DSBs to gain or lose genetic function, to tag proteins made from endogenous genes, and to excise entire loci through targeted FLP-FRT recombination.

  2. Identification and correction of systematic error in high-throughput sequence data

    PubMed Central

    2011-01-01

    Background A feature common to all DNA sequencing technologies is the presence of base-call errors in the sequenced reads. The implications of such errors are application specific, ranging from minor informatics nuisances to major problems affecting biological inferences. Recently developed "next-gen" sequencing technologies have greatly reduced the cost of sequencing, but have been shown to be more error prone than previous technologies. Both position specific (depending on the location in the read) and sequence specific (depending on the sequence in the read) errors have been identified in Illumina and Life Technology sequencing platforms. We describe a new type of systematic error that manifests as statistically unlikely accumulations of errors at specific genome (or transcriptome) locations. Results We characterize and describe systematic errors using overlapping paired reads from high-coverage data. We show that such errors occur in approximately 1 in 1000 base pairs, and that they are highly replicable across experiments. We identify motifs that are frequent at systematic error sites, and describe a classifier that distinguishes heterozygous sites from systematic error. Our classifier is designed to accommodate data from experiments in which the allele frequencies at heterozygous sites are not necessarily 0.5 (such as in the case of RNA-Seq), and can be used with single-end datasets. Conclusions Systematic errors can easily be mistaken for heterozygous sites in individuals, or for SNPs in population analyses. Systematic errors are particularly problematic in low coverage experiments, or in estimates of allele-specific expression from RNA-Seq data. Our characterization of systematic error has allowed us to develop a program, called SysCall, for identifying and correcting such errors. We conclude that correction of systematic errors is important to consider in the design and interpretation of high-throughput sequencing experiments. PMID:22099972
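
    The core statistical idea -- mismatches piling up at one location beyond what independent base-call errors can explain -- can be sketched as a one-sided binomial tail test. This is only an illustration of the principle, not the SysCall classifier; the 1% per-base error rate and the counts are assumed values.

```python
from scipy.stats import binom

def excess_error_pvalue(n_mismatch, depth, per_base_error=0.01):
    """P(at least n_mismatch errors | independent per-read errors)."""
    return binom.sf(n_mismatch - 1, depth, per_base_error)

# 6 mismatches among 50 overlapping reads: implausible under random error,
# so the site is a candidate systematic error (a true heterozygous site
# would instead show roughly half the reads carrying the alternate base).
print(excess_error_pvalue(6, 50))   # ~1e-5 -> candidate systematic error
print(excess_error_pvalue(1, 50))   # ~0.4  -> consistent with random error
```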

  3. A comparative study of set up variations and bowel volumes in supine versus prone positions of patients treated with external beam radiation for carcinoma rectum.

    PubMed

    Rajeev, K R; Menon, Smrithy S; Beena, K; Holla, Raghavendra; Kumar, R Rajaneesh; Dinesh, M

    2014-01-01

    A prospective study was undertaken to evaluate the influence of patient positioning on setup variations, to determine the planning target volume (PTV) margins, and to assess the clinically relevant volume of small bowel (SB) within the irradiated volume. Between December 2011 and April 2012, a computed tomography (CT) scan was done either in the supine position or in the prone position using a belly board (BB) for 20 consecutive patients. All patients had histologically proven rectal cancer and received either post- or pre-operative pelvic irradiation. Using a three-dimensional planning system, the dose-volume histogram for SB was defined in each axial CT slice. The total dose was 46-50 Gy (2 Gy/fraction), delivered using the 4-field box technique. The setup variation of the study group was assessed from the data received from the electronic portal imaging device of the linear accelerator, and the shifts along the X, Y, and Z directions were noted. Both systematic and random errors were calculated, and from these values the PTV margin was calculated. The systematic errors of patients treated in the supine position were 0.87 mm (X), 0.66 mm (Y), and 1.6 mm (Z), and in the prone position 1.3 mm (X), 0.59 mm (Y), and 1.17 mm (Z). The random errors of patients treated in the supine position were 1.81 mm (X), 1.73 mm (Y), and 1.83 mm (Z), and in the prone position 2.02 mm (X), 1.21 mm (Y), and 3.05 mm (Z). The calculated PTV margins in the supine position were 3.45 mm (X), 2.87 mm (Y), and 5.31 mm (Z), and in the prone position 4.91 mm (X), 2.32 mm (Y), and 5.08 mm (Z). The mean volume of the peritoneal cavity was 648.65 cm3 in the prone position and 1197.37 cm3 in the supine position. The prone position using a BB device was more effective in reducing irradiated SB volume in rectal cancer patients. There were no significant variations in the daily setup for patients treated in either the supine or the prone position.
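
    The reported margins are consistent with the widely used van Herk margin recipe (PTV margin = 2.5Σ + 0.7σ); the record does not name the formula, so this is an assumption, but a quick Python check approximately reproduces the quoted margins from the quoted errors:

```python
def ptv_margin(Sigma, sigma):
    """van Herk recipe (assumed, not stated in the record): 2.5*Sigma + 0.7*sigma, in mm."""
    return 2.5 * Sigma + 0.7 * sigma

# (systematic, random) errors in mm, per axis, from the abstract
supine = {"X": (0.87, 1.81), "Y": (0.66, 1.73), "Z": (1.60, 1.83)}
prone = {"X": (1.30, 2.02), "Y": (0.59, 1.21), "Z": (1.17, 3.05)}

for label, errors in (("supine", supine), ("prone", prone)):
    margins = {ax: round(ptv_margin(S, s), 2) for ax, (S, s) in errors.items()}
    print(label, margins)
# Output is close to the reported margins (e.g. supine: 3.44, 2.86, 5.28 mm
# vs reported 3.45, 2.87, 5.31 mm), suggesting rounded inputs in the abstract.
```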

  4. Somatic stem cells and the kinetics of mutagenesis and carcinogenesis

    PubMed Central

    Cairns, John

    2002-01-01

    There is now strong experimental evidence that epithelial stem cells arrange their sister chromatids at mitosis such that the same template DNA strands stay together through successive divisions; DNA labeled with tritiated thymidine in infancy is still present in the stem cells of adult mice even though these cells are incorporating (and later losing) bromodeoxyuridine [Potten, C. S., Owen, G., Booth, D. & Booth, C. (2002) J. Cell Sci. 115, 2381–2388]. But a cell that preserves “immortal strands” will avoid the accumulation of replication errors only if it inhibits those pathways for DNA repair that involve potentially error-prone resynthesis of damaged strands, and this appears to be a property of intestinal stem cells because they are extremely sensitive to the lethal effects of agents that damage DNA. It seems that the combination, in the stem cell, of immortal strands and the choice of death rather than error-prone repair makes epithelial stem cell systems resistant to short exposures to DNA-damaging agents, because the stem cell accumulates few if any errors, and any errors made by the daughters are destined to be discarded. This paper discusses these issues and shows that they lead to a model that explains the strange kinetics of mutagenesis and carcinogenesis in adult mammalian tissues. Coincidentally, the model also can explain why cancers arise even though the spontaneous mutation rate of differentiated mammalian cells is not high enough to generate the multiple mutations needed to form a cancer and why loss of nucleotide-excision repair does not significantly increase the frequency of the common internal cancers. PMID:12149477

  5. An abstract specification language for Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1985-01-01

    Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high-level language is described in a nonformal manner and illustrated by example.

  6. An abstract language for specifying Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1986-01-01

    Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high-level language is described in a nonformal manner and illustrated by example.
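
    To illustrate the idea behind such a language, the Python sketch below uses a small dictionary as a stand-in for an abstract model specification, expands it into the generator matrix of a continuous-time Markov chain, and computes reliability with a matrix exponential. The duplex-with-repair system and its rates are invented for illustration and do not come from the reports.

```python
import numpy as np
from scipy.linalg import expm

# Compact "specification": states and transition rates (per hour) for a
# hypothetical duplex system with repair.
states = ["both_up", "one_up", "failed"]
rates = {
    ("both_up", "one_up"): 2e-4,   # either of the two units fails
    ("one_up", "failed"): 1e-4,    # the remaining unit fails
    ("one_up", "both_up"): 1e-2,   # repair completes
}

# Expand the specification into the CTMC generator matrix Q.
idx = {s: i for i, s in enumerate(states)}
Q = np.zeros((len(states), len(states)))
for (src, dst), r in rates.items():
    Q[idx[src], idx[dst]] = r
Q[np.diag_indices_from(Q)] = -Q.sum(axis=1)

p0 = np.zeros(len(states))
p0[idx["both_up"]] = 1.0
t = 10_000.0                      # mission time, hours
p_t = p0 @ expm(Q * t)            # state distribution at time t
print("unreliability:", p_t[idx["failed"]])
```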

  7. Trust and the Compliance-Reliance Paradigm: The Effects of Risk, Error Bias, and Reliability on Trust and Dependence.

    PubMed

    Chancey, Eric T; Bliss, James P; Yamani, Yusuke; Handley, Holly A H

    2017-05-01

    This study provides a theoretical link between trust and the compliance-reliance paradigm. We propose that for trust mediation to occur, the operator must be presented with a salient choice, and there must be an element of risk for dependence. Research suggests that false alarms and misses affect dependence via two independent processes, hypothesized as trust in signals and trust in nonsignals. These two trust types manifest in categorically different behaviors: compliance and reliance. Eighty-eight participants completed a primary flight task and a secondary signaling system task. Participants evaluated their trust according to the informational bases of trust: performance, process, and purpose. Participants were in a high- or low-risk group. Signaling systems varied by reliability (90%, 60%) within subjects and error bias (false alarm prone, miss prone) between subjects. False-alarm rate affected compliance but not reliance. Miss rate affected reliance but not compliance. Mediation analyses indicated that trust mediated the relationship between false-alarm rate and compliance. Bayesian mediation analyses favored evidence indicating trust did not mediate miss rate and reliance. Conditional indirect effects indicated that factors of trust mediated the relationship between false-alarm rate and compliance (i.e., purpose) and reliance (i.e., process) but only in the high-risk group. The compliance-reliance paradigm is not the reflection of two types of trust. This research could be used to update training and design recommendations that are based upon the assumption that trust causes operator responses regardless of error bias.

  8. One-step random mutagenesis by error-prone rolling circle amplification

    PubMed Central

    Fujii, Ryota; Kitaoka, Motomitsu; Hayashi, Kiyoshi

    2004-01-01

    In vitro random mutagenesis is a powerful tool for altering the properties of enzymes. We describe here a novel random mutagenesis method using rolling circle amplification, named error-prone RCA. This method consists of only one DNA amplification step followed by transformation of the host strain, without treatment with any restriction enzymes or DNA ligases, and results in a randomly mutated plasmid library with 3–4 mutations per kilobase. Specific primers or special equipment, such as a thermal cycler, are not required. This method permits rapid preparation of randomly mutated plasmid libraries, enabling random mutagenesis to become a more commonly used technique. PMID:15507684
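
    If mutations introduced at 3–4 per kilobase land independently, the number of mutations per clone is approximately Poisson distributed. A short Python sketch, assuming a hypothetical 0.9 kb target gene (the gene length and 3.5/kb density are assumptions for illustration):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of exactly k mutations when the mean is lam."""
    return lam ** k * exp(-lam) / factorial(k)

lam = 3.5 * 0.9  # ~3.5 mutations/kb x 0.9 kb target gene
print(f"expected mutations per clone: {lam:.2f}")
for k in range(6):
    print(f"P({k} mutations) = {poisson_pmf(k, lam):.3f}")
# P(0) ~ 0.04: only about 4% of library clones would remain wild type.
```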

  9. Producing good font attribute determination using error-prone information

    NASA Astrophysics Data System (ADS)

    Cooperman, Robert

    1997-04-01

    A method is presented for estimating font attributes in an OCR system using detectors of individual attributes that are error-prone. For an OCR system to preserve the appearance of a scanned document, it needs accurate detection of font attributes. However, OCR environments contain noise and other sources of error, which tend to make font attribute detection unreliable. Certain assumptions about font use can greatly enhance accuracy. Attributes such as boldness and italics are more likely to change between neighboring words, while attributes such as serifness are less likely to change within the same paragraph. Furthermore, the document as a whole tends to have a limited number of sets of font attributes. These assumptions allow a better use of context than the raw data, or than what would be achieved by simpler methods that would oversmooth the data.
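
    A toy Python sketch of the paragraph-level assumption (an illustration, not the paper's algorithm): noisy per-word serifness detections are overridden by a paragraph majority vote, while attributes like boldness would be kept per-word.

```python
from collections import Counter

def smooth_serifness(paragraph):
    """Override noisy per-word serif flags with the paragraph majority.

    paragraph: list of (word, noisy_serif_flag) pairs.
    """
    votes = Counter(flag for _, flag in paragraph)
    majority = votes.most_common(1)[0][0]
    return [(word, majority) for word, _ in paragraph]

para = [("Results", True), ("were", True), ("mixed", False), ("overall", True)]
print(smooth_serifness(para))  # the lone False is treated as detector noise
```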

  10. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. © 2015 John Wiley & Sons Ltd.
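
    A minimal sketch of one such approach -- a site-occupancy model with separate true-positive and false-positive detection probabilities, fitted by maximum likelihood. The detection histories and starting values below are fabricated, and a real analysis would need far more sites plus constraints (e.g. p11 > p10) for identifiability.

```python
import numpy as np
from scipy.optimize import minimize

# Fabricated detection histories: rows = sites, columns = replicate surveys.
Y = np.array([[1, 1, 0],
              [0, 0, 0],
              [1, 0, 0],
              [0, 1, 1],
              [0, 0, 1]])

def neg_log_lik(theta):
    psi, p11, p10 = theta              # occupancy, true+ and false+ rates
    d = Y.sum(axis=1)                  # detections per site
    J = Y.shape[1]                     # surveys per site
    occ = p11 ** d * (1 - p11) ** (J - d)      # likelihood if occupied
    unocc = p10 ** d * (1 - p10) ** (J - d)    # likelihood if unoccupied
    return -np.sum(np.log(psi * occ + (1 - psi) * unocc))

res = minimize(neg_log_lik, x0=[0.5, 0.7, 0.1], bounds=[(0.01, 0.99)] * 3)
print("psi, p11, p10 =", res.x.round(3))
```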

  11. Overconfidence across the psychosis continuum: a calibration approach.

    PubMed

    Balzan, Ryan P; Woodward, Todd S; Delfabbro, Paul; Moritz, Steffen

    2016-11-01

    An 'overconfidence in errors' bias has been consistently observed in people with schizophrenia relative to healthy controls, however, the bias is seldom found to be associated with delusional ideation. Using a more precise confidence-accuracy calibration measure of overconfidence, the present study aimed to explore whether the overconfidence bias is greater in people with higher delusional ideation. A sample of 25 participants with schizophrenia and 50 non-clinical controls (25 high- and 25 low-delusion-prone) completed 30 difficult trivia questions (accuracy <75%); 15 'half-scale' items required participants to indicate their level of confidence for accuracy, and the remaining 'confidence-range' items asked participants to provide lower/upper bounds in which they were 80% confident the true answer lay within. There was a trend towards higher overconfidence for half-scale items in the schizophrenia and high-delusion-prone groups, which reached statistical significance for confidence-range items. However, accuracy was particularly low in the two delusional groups and a significant negative correlation between clinical delusional scores and overconfidence was observed for half-scale items within the schizophrenia group. Evidence in support of an association between overconfidence and delusional ideation was therefore mixed. Inflated confidence-accuracy miscalibration for the two delusional groups may be better explained by their greater unawareness of their underperformance, rather than representing genuinely inflated overconfidence in errors.

  12. The use of modified and non-natural nucleotides provide unique insights into pro-mutagenic replication catalyzed by polymerase eta

    PubMed Central

    Choi, Jung-Suk; Dasari, Anvesh; Hu, Peter; Benkovic, Stephen J.; Berdis, Anthony J.

    2016-01-01

    This report evaluates the pro-mutagenic behavior of 8-oxo-guanine (8-oxo-G) by quantifying the ability of high-fidelity and specialized DNA polymerases to incorporate natural and modified nucleotides opposite this lesion. Although high-fidelity DNA polymerases such as pol δ and the bacteriophage T4 DNA polymerase replicate 8-oxo-G in an error-prone manner, they display remarkably low efficiencies for TLS compared to normal DNA synthesis. In contrast, pol η shows a combination of high efficiency and low fidelity when replicating 8-oxo-G. These combined properties are consistent with a pro-mutagenic role for pol η when replicating this DNA lesion. Studies using modified nucleotide analogs show that pol η relies heavily on hydrogen-bonding interactions during translesion DNA synthesis. However, nucleobase modifications such as alkylation at the N2 position of guanine significantly increase error-prone synthesis catalyzed by pol η when replicating 8-oxo-G. Molecular modeling studies demonstrate the existence of a hydrophobic pocket in pol η that participates in the increased utilization of certain hydrophobic nucleotides. A model is proposed for enhanced pro-mutagenic replication catalyzed by pol η that couples efficient incorporation of damaged nucleotides opposite oxidized DNA lesions created by reactive oxygen species. The biological implications of this model toward increasing mutagenic events in lung cancer are discussed. PMID:26717984

  13. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  14. Using Block-local Atomicity to Detect Stale-value Concurrency Errors

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus; Biere, Armin

    2004-01-01

    Data races do not cover all kinds of concurrency errors. This paper presents a data-flow-based technique to find stale-value errors, which are not found by low-level and high-level data race algorithms. Stale values denote copies of shared data where the copy is no longer synchronized. The algorithm to detect such values works as a consistency check that does not require any assumptions or annotations of the program. It has been implemented as a static analysis in JNuke. The analysis is sound and requires only a single execution trace if implemented as a run-time checking algorithm. Being based on an analysis of Java bytecode, it encompasses the full program semantics, including arbitrarily complex expressions. Related techniques are more complex and more prone to over-reporting.
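
    The pattern such an analysis targets can be sketched in a few lines of Python (an invented example, not from the paper): each individual access to the shared variable is lock-protected, so no low-level data race exists, yet the local copy goes stale between the read and the write-back, and updates are lost.

```python
import threading

balance = 0
lock = threading.Lock()

def deposit(amount, times):
    global balance
    for _ in range(times):
        with lock:
            local = balance    # synchronized read into a local copy
        local += amount        # the copy can go stale right here
        with lock:
            balance = local    # synchronized write of a stale value

threads = [threading.Thread(target=deposit, args=(1, 100_000))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(balance)  # typically well below 400000: lost updates
```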

  15. Situating Student Errors: Linguistic-to-Algebra Translation Errors

    ERIC Educational Resources Information Center

    Adu-Gyamfi, Kwaku; Bossé, Michael J.; Chandler, Kayla

    2015-01-01

    While it is well recognized that students are prone to difficulties when performing linguistic-to-algebra translations, the nature of students' difficulties remain an issue of contention. Moreover, the literature indicates that these difficulties are not easily remediated by domain-specific instruction. Some have opined that this is the case…

  16. Errors of Inference in Structural Equation Modeling

    ERIC Educational Resources Information Center

    McCoach, D. Betsy; Black, Anne C.; O'Connell, Ann A.

    2007-01-01

    Although structural equation modeling (SEM) is one of the most comprehensive and flexible approaches to data analysis currently available, it is nonetheless prone to researcher misuse and misconceptions. This article offers a brief overview of the unique capabilities of SEM and discusses common sources of user error in drawing conclusions from…

  17. The Concept of Accident Proneness: A Review

    PubMed Central

    Froggatt, Peter; Smiley, James A.

    1964-01-01

    The term accident proneness was coined by psychological research workers in 1926. Since then its concept—that certain individuals are always more likely than others to sustain accidents, even though exposed to equal risk—has been questioned but seldom seriously challenged. This article describes much of the work and theory on which this concept is based, details the difficulties encountered in obtaining valid information and the interpretative errors that can arise from the examination of imperfect data, and explains why accident proneness became so readily accepted as an explanation of the facts. A recent hypothesis of accident causation, namely that a person's accident liability may vary from time to time, is outlined, and the respective abilities of this and of accident proneness to accord with data from the more reliable literature are examined. The authors conclude that the hypothesis of individual variation in liability is more realistic and in better agreement with the data than is accident proneness. PMID:14106130

  18. The High Altitude Pollution Program (1976-1982).

    DTIC Science & Technology

    1984-01-01

    ground, where air pollution problems arise due to ground level emissions from, for example, automobiles and power plants) to about 25 km above the...downward and poleward. Near the ground, in areas such as cities prone to air pollution, ozone is produced by nitrogen dioxide photolysis and reaction...Spectrophotometer Total Ozone Measurement Errors Caused by Interfering Absorbing Species Such as SO2, NO2 and Photochemically Produced O3 in Polluted Air," NOAA

  19. Registration of prone and supine CT colonography scans using correlation optimized warping and canonical correlation analysis

    PubMed Central

    Wang, Shijun; Yao, Jianhua; Liu, Jiamin; Petrick, Nicholas; Van Uitert, Robert L.; Periaswamy, Senthil; Summers, Ronald M.

    2009-01-01

    Purpose: In computed tomographic colonography (CTC), a patient will be scanned twice—once supine and once prone—to improve the sensitivity for polyp detection. To assist radiologists in CTC reading, in this paper we propose an automated method for colon registration from supine and prone CTC scans. Methods: We propose a new colon centerline registration method for prone and supine CTC scans using correlation optimized warping (COW) and canonical correlation analysis (CCA) based on the anatomical structure of the colon. Four anatomical salient points on the colon are first automatically distinguished. Then correlation optimized warping is applied to the segments defined by the anatomical landmarks to improve the global registration based on local correlation of segments. The COW method was modified by embedding canonical correlation analysis to allow multiple features along the colon centerline to be used in our implementation. Results: We tested the COW algorithm on a CTC data set of 39 patients with 39 polyps (19 training and 20 test cases) to verify the effectiveness of the proposed COW registration method. Experimental results on the test set show that the COW method significantly reduces the average estimation error in a polyp location between supine and prone scans by 67.6%, from 46.27±52.97 mm to 14.98±11.41 mm, compared to the normalized distance along the colon centerline algorithm (p<0.01). Conclusions: The proposed COW algorithm is more accurate for the colon centerline registration compared to the normalized distance along the colon centerline method and the dynamic time warping method. Comparison results showed that the feature combination of z-coordinate and curvature achieved the lowest registration error compared to the other feature combinations used by COW. The proposed method is tolerant to centerline errors because anatomical landmarks help prevent the propagation of errors across the entire colon centerline. PMID:20095272

  20. Structure-Function Analysis of Chloroplast Proteins via Random Mutagenesis Using Error-Prone PCR.

    PubMed

    Dumas, Louis; Zito, Francesca; Auroy, Pascaline; Johnson, Xenie; Peltier, Gilles; Alric, Jean

    2018-06-01

    Site-directed mutagenesis of chloroplast genes was developed three decades ago and has greatly advanced the field of photosynthesis research. Here, we describe a new approach for generating random chloroplast gene mutants that combines error-prone polymerase chain reaction of a gene of interest with chloroplast complementation of the knockout Chlamydomonas reinhardtii mutant. As a proof of concept, we targeted a 300-bp sequence of the petD gene that encodes subunit IV of the thylakoid membrane-bound cytochrome b6f complex. By sequencing chloroplast transformants, we revealed 149 mutations in the 300-bp target petD sequence that resulted in 92 amino acid substitutions in the 100-residue target subunit IV sequence. Our results show that this method is suited to the study of highly hydrophobic, multisubunit, and chloroplast-encoded proteins containing cofactors such as hemes, iron-sulfur clusters, and chlorophyll pigments. Moreover, we show that mutant screening and sequencing can be used to study photosynthetic mechanisms or to probe the mutational robustness of chloroplast-encoded proteins, and we propose that this method is a valuable tool for the directed evolution of enzymes in the chloroplast. © 2018 American Society of Plant Biologists. All rights reserved.

  1. Error-prone PCR mutation of the Ls-EPSPS gene from Liriope spicata conferring enhanced glyphosate resistance.

    PubMed

    Mao, Chanjuan; Xie, Hongjie; Chen, Shiguo; Valverde, Bernal E; Qiang, Sheng

    2017-09-01

    Liriope spicata (Thunb.) Lour has a unique LsEPSPS structure contributing to the highest-ever-recognized natural glyphosate tolerance. The transformed LsEPSPS confers increased glyphosate resistance to E. coli and A. thaliana. However, the increased glyphosate-resistance level is not high enough to be of commercial value. Therefore, LsEPSPS was subjected to error-prone PCR to screen for mutant EPSPS genes capable of endowing higher resistance levels. A mutant designated ELs-EPSPS, having five mutated amino acids (37Val, 67Asn, 277Ser, 351Gly and 422Gly), was selected for its ability to confer improved resistance to glyphosate. Expression of ELs-EPSPS in recombinant E. coli BL21 (DE3) strains enhanced resistance to glyphosate in comparison to both the LsEPSPS-transformed and -untransformed controls. Furthermore, transgenic ELs-EPSPS A. thaliana was about 5.4-fold and 2-fold more resistant to glyphosate than the wild-type and the Ls-EPSPS-transgenic plants, respectively. Therefore, the mutated ELs-EPSPS gene has potential value for the development of glyphosate-resistant crops. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Portable and Error-Free DNA-Based Data Storage.

    PubMed

    Yazdi, S M Hossein Tabatabaei; Gabrys, Ryan; Milenkovic, Olgica

    2017-07-10

    DNA-based data storage is an emerging nonvolatile memory technology of potentially unprecedented density, durability, and replication efficiency. The basic system implementation steps include synthesizing DNA strings that contain user information and subsequently retrieving them via high-throughput sequencing technologies. Existing architectures enable reading and writing but do not offer random-access and error-free data recovery from low-cost, portable devices, which is crucial for making the storage technology competitive with classical recorders. Here we show for the first time that a portable, random-access platform may be implemented in practice using nanopore sequencers. The novelty of our approach is to design an integrated processing pipeline that encodes data to avoid costly synthesis and sequencing errors, enables random access through addressing, and leverages efficient portable sequencing via new iterative alignment and deletion error-correcting codes. Our work represents the only known random-access DNA-based data storage system that uses error-prone nanopore sequencers, while still producing error-free readouts with the highest reported information rate/density. As such, it represents a crucial step towards practical employment of DNA molecules as storage media.

  3. The feasibility of manual parameter tuning for deformable breast MR image registration from a multi-objective optimization perspective.

    PubMed

    Pirpinia, Kleopatra; Bosman, Peter A N; Loo, Claudette E; Winter-Warnars, Gonneke; Janssen, Natasja N Y; Scholten, Astrid N; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja

    2017-06-23

    Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. Here, patient-specific, multi-objective weight optimization is needed, obtaining a mean TRE of 13.6 mm without guidance information reduced to 7.3 mm with guidance information, but also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.

  4. The feasibility of manual parameter tuning for deformable breast MR image registration from a multi-objective optimization perspective

    NASA Astrophysics Data System (ADS)

    Pirpinia, Kleopatra; Bosman, Peter A. N.; Loo, Claudette E.; Winter-Warnars, Gonneke; Janssen, Natasja N. Y.; Scholten, Astrid N.; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja

    2017-07-01

    Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. Here, patient-specific, multi-objective weight optimization is needed, obtaining a mean TRE of 13.6 mm without guidance information reduced to 7.3 mm with guidance information, but also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.

  5. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography

    DTIC Science & Technology

    1980-03-01

    interpreting/smoothing data containing a significant percentage of gross errors, and thus is ideally suited for applications in automated image analysis where interpretation is based on the data provided by error-prone feature detectors. A major portion of the paper describes the application of
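
    The RANSAC loop itself is compact enough to sketch; below is a minimal line-fitting version (parameters are illustrative, not from the paper): repeatedly fit a model to a minimal random sample, count the points it explains within a tolerance, and keep the consensus-maximizing model, which is what makes the paradigm robust to gross errors from error-prone feature detectors.

    ```python
    import numpy as np

    def ransac_line(points, n_iters=500, tol=0.05, rng=None):
        """Fit y = a*x + b robustly: minimal 2-point samples, consensus count."""
        rng = rng or np.random.default_rng(0)
        best_model, best_inliers = None, 0
        for _ in range(n_iters):
            (x1, y1), (x2, y2) = points[rng.choice(len(points), 2, replace=False)]
            if x1 == x2:
                continue  # degenerate sample, cannot define a slope
            a = (y2 - y1) / (x2 - x1)
            b = y1 - a * x1
            # residuals of all points; inliers fall within the tolerance band
            inliers = np.abs(points[:, 1] - (a * points[:, 0] + b)) < tol
            if inliers.sum() > best_inliers:
                best_model, best_inliers = (a, b), int(inliers.sum())
        return best_model, best_inliers

    # 70% of points on a line, 30% gross errors from a "bad feature detector"
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 100)
    y = 2 * x + 0.5 + rng.normal(0, 0.01, 100)
    y[:30] = rng.uniform(0, 3, 30)                   # gross errors
    model, support = ransac_line(np.column_stack([x, y]))
    print(model, support)                            # ~(2.0, 0.5), ~70 inliers
    ```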

  6. Error-prone bypass of O6-methylguanine by DNA polymerase of Pseudomonas aeruginosa phage PaP1.

    PubMed

    Gu, Shiling; Xiong, Jingyuan; Shi, Ying; You, Jia; Zou, Zhenyu; Liu, Xiaoying; Zhang, Huidong

    2017-09-01

    O6-Methylguanine (O6-MeG) is highly mutagenic, is commonly found in DNA exposed to methylating agents, and generally leads to G:C to A:T mutagenesis. To study DNA replication encountering O6-MeG by the DNA polymerase (gp90) of P. aeruginosa phage PaP1, we analyzed steady-state and pre-steady-state kinetics of nucleotide incorporation opposite O6-MeG by gp90 exo-. O6-MeG partially inhibited full-length extension by gp90 exo-. O6-MeG greatly reduces dNTP incorporation efficiency, resulting in 67-fold preferential error-prone incorporation of dTTP over dCTP. Gp90 exo- extends beyond T:O6-MeG 2-fold more efficiently than C:O6-MeG. Incorporation of dCTP opposite G and incorporation of dCTP or dTTP opposite O6-MeG show fast burst phases. The pre-steady-state incorporation efficiency (kpol/Kd,dNTP) decreases in the order dCTP:G > dTTP:O6-MeG > dCTP:O6-MeG. The presence of O6-MeG in the template does not affect the binding affinity of the polymerase to DNA, but it weakens their binding in the presence of dCTP and Mg2+. Misincorporation of dTTP opposite O6-MeG further weakens the binding affinity of the polymerase to DNA. The preference for dTTP incorporation opposite O6-MeG originates from the fact that dTTP induces a faster conformational change step and a faster chemical step than dCTP. This study reveals that gp90 bypasses O6-MeG in an error-prone manner and provides further understanding of DNA replication encountering mutagenic alkylation DNA damage in P. aeruginosa phage PaP1. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. The assessment of science: the relative merits of post-publication review, the impact factor, and the number of citations.

    PubMed

    Eyre-Walker, Adam; Stoletzki, Nina

    2013-10-01

    The assessment of scientific publications is an integral part of the scientific process. Here we investigate three methods of assessing the merit of a scientific paper: subjective post-publication peer review, the number of citations gained by a paper, and the impact factor of the journal in which the article was published. We investigate these methods using two datasets in which subjective post-publication assessments of scientific publications have been made by experts. We find that there are moderate, but statistically significant, correlations between assessor scores, when two assessors have rated the same paper, and between assessor score and the number of citations a paper accrues. However, we show that assessor score depends strongly on the journal in which the paper is published, and that assessors tend to over-rate papers published in journals with high impact factors. If we control for this bias, we find that the correlation between assessor scores and between assessor score and the number of citations is weak, suggesting that scientists have little ability to judge either the intrinsic merit of a paper or its likely impact. We also show that the number of citations a paper receives is an extremely error-prone measure of scientific merit. Finally, we argue that the impact factor is likely to be a poor measure of merit, since it depends on subjective assessment. We conclude that the three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased, and expensive method by which to assess merit. We argue that the impact factor may be the most satisfactory of the methods we have considered, since it is a form of pre-publication review. However, we emphasise that it is likely to be a very error-prone measure of merit that is qualitative, not quantitative.

  8. The Assessment of Science: The Relative Merits of Post-Publication Review, the Impact Factor, and the Number of Citations

    PubMed Central

    Eyre-Walker, Adam; Stoletzki, Nina

    2013-01-01

    The assessment of scientific publications is an integral part of the scientific process. Here we investigate three methods of assessing the merit of a scientific paper: subjective post-publication peer review, the number of citations gained by a paper, and the impact factor of the journal in which the article was published. We investigate these methods using two datasets in which subjective post-publication assessments of scientific publications have been made by experts. We find that there are moderate, but statistically significant, correlations between assessor scores, when two assessors have rated the same paper, and between assessor score and the number of citations a paper accrues. However, we show that assessor score depends strongly on the journal in which the paper is published, and that assessors tend to over-rate papers published in journals with high impact factors. If we control for this bias, we find that the correlation between assessor scores and between assessor score and the number of citations is weak, suggesting that scientists have little ability to judge either the intrinsic merit of a paper or its likely impact. We also show that the number of citations a paper receives is an extremely error-prone measure of scientific merit. Finally, we argue that the impact factor is likely to be a poor measure of merit, since it depends on subjective assessment. We conclude that the three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased, and expensive method by which to assess merit. We argue that the impact factor may be the most satisfactory of the methods we have considered, since it is a form of pre-publication review. However, we emphasise that it is likely to be a very error-prone measure of merit that is qualitative, not quantitative. PMID:24115908

  9. The use of modified and non-natural nucleotides provide unique insights into pro-mutagenic replication catalyzed by polymerase eta.

    PubMed

    Choi, Jung-Suk; Dasari, Anvesh; Hu, Peter; Benkovic, Stephen J; Berdis, Anthony J

    2016-02-18

    This report evaluates the pro-mutagenic behavior of 8-oxo-guanine (8-oxo-G) by quantifying the ability of high-fidelity and specialized DNA polymerases to incorporate natural and modified nucleotides opposite this lesion. Although high-fidelity DNA polymerases such as pol δ and the bacteriophage T4 DNA polymerase replicate 8-oxo-G in an error-prone manner, they display remarkably low efficiencies for TLS compared to normal DNA synthesis. In contrast, pol η shows a combination of high efficiency and low fidelity when replicating 8-oxo-G. These combined properties are consistent with a pro-mutagenic role for pol η when replicating this DNA lesion. Studies using modified nucleotide analogs show that pol η relies heavily on hydrogen-bonding interactions during translesion DNA synthesis. However, nucleobase modifications such as alkylation at the N2 position of guanine significantly increase error-prone synthesis catalyzed by pol η when replicating 8-oxo-G. Molecular modeling studies demonstrate the existence of a hydrophobic pocket in pol η that participates in the increased utilization of certain hydrophobic nucleotides. A model is proposed for enhanced pro-mutagenic replication catalyzed by pol η that couples efficient incorporation of damaged nucleotides opposite oxidized DNA lesions created by reactive oxygen species. The biological implications of this model toward increasing mutagenic events in lung cancer are discussed. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. The preliminary SOL (Sizing and Optimization Language) reference manual

    NASA Technical Reports Server (NTRS)

    Lucas, Stephen H.; Scotti, Stephen J.

    1989-01-01

    The Sizing and Optimization Language (SOL), a high-level, special-purpose computer language, has been developed to expedite the application of numerical optimization to design problems and to make the process less error-prone. This document is a reference manual for those wishing to write SOL programs. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler and runtime library routines. An overview of SOL appears in NASA TM 100565.

  11. Furfural-tolerant Zymomonas mobilis derived from error-prone PCR-based whole genome shuffling and their tolerant mechanism.

    PubMed

    Huang, Suzhen; Xue, Tingli; Wang, Zhiquan; Ma, Yuanyuan; He, Xueting; Hong, Jiefang; Zou, Shaolan; Song, Hao; Zhang, Minhua

    2018-04-01

    A furfural-tolerant strain is essential for the fermentative production of biofuels or chemicals from lignocellulosic biomass. In this study, Zymomonas mobilis CP4 was for the first time subjected to error-prone PCR-based whole genome shuffling, and the resulting mutants F211 and F27, which could tolerate 3 g/L furfural, were obtained. The mutant F211 under various furfural stress conditions could grow rapidly once the furfural concentration fell to 1 g/L. Meanwhile, the two mutants also showed higher tolerance to high concentrations of glucose than the control strain CP4. Genome resequencing revealed that F211 and F27 carried 12 and 13 single-nucleotide polymorphisms, respectively. The activity assay demonstrated that NADH-dependent furfural reductase activity increased under furfural stress in both mutant F211 and CP4, and that the activity peaked earlier in the mutant than in the control. The furfural level in the culture of F211 also decreased more rapidly. These observations indicate that the increased furfural tolerance of the mutants may result from enhanced NADH-dependent furfural reductase activity during early log phase, which could accelerate furfural detoxification in the mutants. In summary, we obtained Z. mobilis mutants with enhanced tolerance to furfural and to high concentrations of glucose, and provided valuable clues to the mechanism of furfural tolerance and to strain development.

  12. SU-E-J-21: Setup Variability of Colorectal Cancer Patients Treated in the Prone Position and Dosimetric Comparison with the Supine Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, A; Foster, J; Chu, W

    2015-06-15

    Purpose: Many cancer centers treat colorectal patients in the prone position on a belly board to minimize dose to the small bowel. That may potentially result in patient setup instability, with a corresponding impact on dose delivery accuracy for highly conformal techniques such as IMRT/VMAT. The two aims of this work are 1) to investigate setup accuracy of rectum patients treated in the prone position on a belly board using CBCT and 2) to evaluate the dosimetric impact on bladder and small bowel of treating rectum patients in the supine vs. prone position. Methods: For the setup accuracy study, 10 patients were selected. Weekly CBCTs were acquired and matched to bone. The CBCT-determined shifts were recorded. For the dosimetric study, 7 prone-setup patients and 7 supine-setup patients were randomly selected from our clinical database. Various clinically relevant dose volume histogram values were recorded for the small bowel and bladder. Results: The CBCT-determined rotational shifts had a wide variation. For the dataset acquired at the time of this writing, the ranges of rotational setup errors for pitch, roll, and yaw were [−3.6°, 4.7°], [−4.3°, 3.2°], and [−1.4°, 1.4°]. For the dosimetric study: the small bowel V(45Gy) and mean dose for the prone position were 5.6±12.1% and 18.4±6.2 Gy (ranges indicate standard deviations); for the supine position the corresponding values were 12.9±15.8% and 24.7±8.8 Gy. For the bladder, the V(30Gy) and mean dose for the prone position were 68.7±12.7% and 38.4±3.3 Gy; for the supine position these values were 77.1±13.7% and 40.7±3.1 Gy. Conclusion: There is evidence of significant rotational instability in the prone position. The OAR dosimetry study indicates that some patients may still benefit from the prone position, though many patients can be safely treated supine.

  13. A false positive food chain error associated with a generic predator gut content ELISA

    USDA-ARS?s Scientific Manuscript database

    Conventional prey-specific gut content ELISA and PCR assays are useful for identifying predators of insect pests in nature. However, these assays are prone to yielding certain types of food chain errors. For instance, it is possible that prey remains can pass through the food chain as the result of ...

  14. De novo centriole formation in human cells is error-prone and does not require SAS-6 self-assembly.

    PubMed

    Wang, Won-Jing; Acehan, Devrim; Kao, Chien-Han; Jane, Wann-Neng; Uryu, Kunihiro; Tsou, Meng-Fu Bryan

    2015-11-26

    Vertebrate centrioles normally propagate through duplication, but in the absence of preexisting centrioles, de novo synthesis can occur. Consistently, centriole formation is thought to strictly rely on self-assembly, involving self-oligomerization of the centriolar protein SAS-6. Here, through reconstitution of de novo synthesis in human cells, we surprisingly found that normal looking centrioles capable of duplication and ciliation can arise in the absence of SAS-6 self-oligomerization. Moreover, whereas canonically duplicated centrioles always form correctly, de novo centrioles are prone to structural errors, even in the presence of SAS-6 self-oligomerization. These results indicate that centriole biogenesis does not strictly depend on SAS-6 self-assembly, and may require preexisting centrioles to ensure structural accuracy, fundamentally deviating from the current paradigm.

  15. The OPL Access Control Policy Language

    NASA Astrophysics Data System (ADS)

    Alm, Christopher; Wolf, Ruben; Posegga, Joachim

    Existing policy languages suffer from a limited ability to directly and elegantly express high-level access control principles such as history-based separation of duty [22], binding of duty [26], context constraints [24], Chinese wall properties [10], and obligations [20]. It is often difficult to extend a language in order to retrofit these features once they are required, or it is necessary to use complicated language constructs to express such concepts. The latter, however, is cumbersome and error-prone for humans dealing with policy administration.
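
    As a concrete illustration of one such high-level principle, here is a hedged Python sketch of history-based separation of duty: the policy consults an action history so that the user who submitted a request may not also approve it. This illustrates the concept the authors say is hard to express in existing languages; it is not OPL syntax, and the conflict pairs are hypothetical.

    ```python
    from collections import defaultdict

    class SeparationOfDutyPolicy:
        """History-based separation of duty: conflicting actions on the same
        object must be performed by different users."""

        CONFLICTS = {("submit", "approve")}    # illustrative conflict pairs

        def __init__(self):
            self.history = defaultdict(set)    # object -> {(user, action), ...}

        def is_permitted(self, user, action, obj):
            for earlier, later in self.CONFLICTS:
                if later == action and (user, earlier) in self.history[obj]:
                    return False               # same user did the earlier step
            return True

        def record(self, user, action, obj):
            self.history[obj].add((user, action))

    policy = SeparationOfDutyPolicy()
    policy.record("alice", "submit", "invoice-17")
    print(policy.is_permitted("alice", "approve", "invoice-17"))  # False
    print(policy.is_permitted("bob", "approve", "invoice-17"))    # True
    ```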

  16. Systematic feasibility analysis of a quantitative elasticity estimation for breast anatomy using supine/prone patient postures.

    PubMed

    Hasse, Katelyn; Neylon, John; Sheng, Ke; Santhanam, Anand P

    2016-03-01

    Breast elastography is a critical tool for improving the targeted radiotherapy treatment of breast tumors. Current breast radiotherapy imaging protocols only involve prone and supine CT scans. There is a lack of knowledge on the quantitative accuracy with which breast elasticity can be systematically measured using only prone and supine CT datasets. The purpose of this paper is to describe a quantitative elasticity estimation technique for breast anatomy using only these supine/prone patient postures. Using biomechanical, high-resolution breast geometry obtained from CT scans, a systematic assessment was performed in order to determine the feasibility of this methodology for clinically relevant elasticity distributions. A model-guided inverse analysis approach is presented in this paper. A graphics processing unit (GPU)-based linear elastic biomechanical model was employed as a forward model for the inverse analysis with the breast geometry in a prone position. The elasticity estimation was performed using a gradient-based iterative optimization scheme and a fast-simulated annealing (FSA) algorithm. Numerical studies were conducted to systematically analyze the feasibility of elasticity estimation. For simulating gravity-induced breast deformation, the breast geometry was anchored at its base, resembling the chest-wall/breast tissue interface. Ground-truth elasticity distributions were assigned to the model, representing tumor presence within breast tissue. Model geometry resolution was varied to estimate its influence on convergence of the system. A priori information was approximated and utilized to record the effect on time and accuracy of convergence. The role of the FSA process was also recorded. A novel error metric that combined elasticity and displacement error was used to quantify the systematic feasibility study. For the authors' purposes, convergence was defined as each voxel of tissue being within 1 mm of the ground-truth deformation. The authors' analyses showed that ∼97% model convergence was systematically observed with no a priori information. Varying the model geometry resolution showed no significant accuracy improvements. The GPU-based forward model enabled the inverse analysis to be completed within 10-70 min. Using a priori information about the underlying anatomy, the computation time decreased by as much as 50%, while accuracy improved from 96.81% to 98.26%. The use of FSA was observed to allow the iterative estimation methodology to converge more precisely. By utilizing a forward iterative approach to solve the inverse elasticity problem, this work indicates the feasibility and potential of the fast reconstruction of breast tissue elasticity using supine/prone patient postures.
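
    The model-guided inverse analysis can be caricatured in one dimension: assume a toy linear-elastic forward model, then iteratively adjust the elasticity estimate until the predicted gravity-induced displacement matches the observed one. This is a deliberately simplified stand-in for the paper's GPU biomechanical model and FSA scheme; the forward model and all values below are illustrative assumptions.

    ```python
    def forward_displacement(E, load=1.0):
        """Toy linear-elastic forward model: displacement under a fixed load
        is inversely proportional to the elasticity (Young's modulus) E."""
        return load / E

    def estimate_elasticity(observed, E0=1.0, lr=0.5, iters=500):
        """Gradient descent on the squared displacement mismatch."""
        E = E0
        for _ in range(iters):
            resid = forward_displacement(E) - observed
            grad = resid * (-1.0 / E**2)   # d/dE of 0.5 * resid**2
            E -= lr * grad
        return E

    true_E = 2.0
    observed = forward_displacement(true_E)   # "prone-posture" measurement
    print(estimate_elasticity(observed))      # converges toward 2.0
    ```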

  17. Evaluation of exome variants using the Ion Proton Platform to sequence error-prone regions.

    PubMed

    Seo, Heewon; Park, Yoomi; Min, Byung Joo; Seo, Myung Eui; Kim, Ju Han

    2017-01-01

    The Ion Proton sequencer from Thermo Fisher accurately determines sequence variants from target regions with a rapid turnaround time at a low cost. However, misleading variant-calling errors can occur. We performed a systematic evaluation and manual curation of read-level alignments for the 675 ultrarare variants reported by the Ion Proton sequencer from 27 whole-exome sequencing datasets but not present in either the 1000 Genomes Project or the Exome Aggregation Consortium. We classified positive variant calls into 393 highly likely false positives, 126 likely false positives, and 156 likely true positives, which comprised 58.2%, 18.7%, and 23.1% of the variants, respectively. We identified four distinct error patterns of variant calling that may be bioinformatically corrected using different strategies: simplicity region, SNV cluster, peripheral sequence read, and base inversion. Local de novo assembly successfully corrected 201 (38.7%) of the 519 highly likely or likely false positives. We also demonstrate that the two sequencing kits from Thermo Fisher (the Ion PI Sequencing 200 kit V3 and the Ion PI Hi-Q kit) exhibit different error profiles across different error types. A refined calling algorithm with a better polymerase may improve the performance of the Ion Proton sequencing platform.

  18. Last Year Your Answer Was… : The Impact of Dependent Interviewing Wording and Survey Factors on Reporting of Change

    ERIC Educational Resources Information Center

    Al Baghal, Tarek

    2017-01-01

    Prior studies suggest memories are potentially error prone. Proactive dependent interviewing (PDI) is a possible method to reduce errors in reports of change in longitudinal studies, reminding respondents of previous answers while asking if there has been any change since the last survey. However, little research has been conducted on the impact…

  19. The Error Prone Model and the Basic Grants Validation Selection System. Draft Final Report.

    ERIC Educational Resources Information Center

    System Development Corp., Falls Church, VA.

    An evaluation of existing and proposed mechanisms to ensure data accuracy for the Pell Grant program is reported, and recommendations for efficient detection of fraud and error in the program are offered. One study objective was to examine the existing system of pre-established criteria (PEC), which are validation criteria that select students on…

  20. Estimation of a cover-type change matrix from error-prone data

    Treesearch

    Steen Magnussen

    2009-01-01

    Coregistration and classification errors seriously compromise per-pixel estimates of land cover change. A more robust estimation of change is proposed in which adjacent pixels are grouped into 3x3 clusters and treated as a unit of observation. A complete change matrix is recovered in a two-step process. The diagonal elements of a change matrix are recovered from...

  1. Influence of Errors in Tactile Sensors on Some High Level Parameters Used for Manipulation with Robotic Hands.

    PubMed

    Sánchez-Durán, José A; Hidalgo-López, José A; Castellanos-Ramos, Julián; Oballe-Peinado, Óscar; Vidal-Verdú, Fernando

    2015-08-19

    Tactile sensors suffer from many types of interference and error, such as crosstalk, non-linearity, drift, and hysteresis; therefore, calibration should be carried out to compensate for these deviations. However, this procedure is difficult in sensors mounted on artificial hands for robots or prosthetics, for instance, where the sensor usually bends to cover a curved surface. Moreover, the calibration procedure should be repeated often because the correction parameters are easily altered by time and surrounding conditions. Furthermore, this intensive and complex calibration may be less critical, or could at least be simplified. This is because manipulation algorithms do not commonly use the whole data set from the tactile image, but only a few parameters such as the moments of the tactile image. These parameters may be less affected by common errors and interference, or at least their variations may be on the order of those caused by accepted limitations, such as reduced spatial resolution. This paper shows results from experiments to support this idea. The experiments are carried out with a high-performance commercial sensor as well as with a low-cost, error-prone sensor built with a procedure common in robotics.
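
    The low-order moments the authors refer to are easy to state; a minimal numpy sketch follows (array names are hypothetical): total contact force, centroid, and principal orientation of a tactile pressure image, which are the quantities many manipulation algorithms consume instead of the raw taxel values.

    ```python
    import numpy as np

    def tactile_moments(p):
        """Low-order image moments of a tactile pressure map p[row, col]."""
        ys, xs = np.indices(p.shape)
        m00 = p.sum()                                 # total force (0th moment)
        cx = (p * xs).sum() / m00                     # centroid column
        cy = (p * ys).sum() / m00                     # centroid row
        mu20 = (p * (xs - cx) ** 2).sum()             # central moments
        mu02 = (p * (ys - cy) ** 2).sum()
        mu11 = (p * (xs - cx) * (ys - cy)).sum()
        theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # contact orientation
        return m00, (cx, cy), theta

    # a blob of pressure, slightly elongated along the diagonal
    p = np.zeros((8, 8))
    p[2:5, 2:5] = 1.0
    p[4, 5] = p[5, 4] = 0.5
    print(tactile_moments(p))
    ```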

  2. High speed fault tolerant secure communication for muon chamber using FPGA based GBTx emulator

    NASA Astrophysics Data System (ADS)

    Sau, Suman; Mandal, Swagata; Saini, Jogender; Chakrabarti, Amlan; Chattopadhyay, Subhasis

    2015-12-01

    The Compressed Baryonic Matter (CBM) experiment is a part of the Facility for Antiproton and Ion Research (FAIR) in Darmstadt at GSI. The CBM experiment will investigate highly compressed nuclear matter using nucleus-nucleus collisions. This experiment will examine heavy-ion collisions in fixed-target geometry and will be able to measure hadrons, electrons and muons. CBM requires precise time synchronization, compact hardware, radiation tolerance, self-triggered front-end electronics, efficient data aggregation schemes and the capability to handle high data rates (up to several TB/s). As a part of the implementation of the readout chain of the Muon Chamber (MUCH) [1] in India, we have implemented an FPGA-based emulator of GBTx. GBTx is a radiation-tolerant ASIC, developed at CERN, that can be used to implement multipurpose high-speed bidirectional optical links for high-energy physics (HEP) experiments. GBTx will be used in a highly irradiated area and is therefore prone to multi-bit errors. To mitigate this effect, instead of a single-bit-error-correcting RS code we have used a two-bit-error-correcting (15, 7) BCH code. This increases the redundancy, which in turn increases the reliability of the coded data, making it less likely to be corrupted by radiation-induced noise. The data travel from the detector to the PC through multiple nodes over the communication channel. The computing resources are connected to a network that can be accessed only by authorized persons, to prevent unauthorized data access, which might happen by compromising the network security. Thus data encryption is essential. In order to make the data communication secure, the Advanced Encryption Standard [2] (AES, a symmetric-key cipher) and RSA [3], [4] (an asymmetric-key cipher) are used after the channel coding. We have implemented the GBTx emulator on two Xilinx Kintex-7 boards (KC705). One acts as the transmitter and the other as the receiver, connected by optical fiber through small form-factor pluggable (SFP) ports. We have tested the setup in the runtime environment using the Xilinx ChipScope Pro Analyzer. We also measured the resource utilization, throughput, and power of the implemented design.
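
    The (15, 7) BCH code mentioned above has a compact systematic encoder; the sketch below (our own illustration, not the authors' HDL) appends the 8 parity bits obtained as the remainder of polynomial division by the code's generator g(x) = x^8 + x^7 + x^6 + x^4 + 1, yielding a codeword whose minimum distance of 5 lets a full decoder correct any two bit errors.

    ```python
    G = 0b1_1101_0001   # generator g(x) = x^8 + x^7 + x^6 + x^4 + 1 of BCH(15,7)

    def bch15_7_encode(msg: int) -> int:
        """Systematic encoding: codeword = msg*x^8 + (msg*x^8 mod g)."""
        assert 0 <= msg < 2**7
        rem = msg << 8
        for bit in range(14, 7, -1):          # long division over GF(2)
            if rem & (1 << bit):
                rem ^= G << (bit - 8)
        return (msg << 8) | rem

    def bch15_7_syndrome(word: int) -> int:
        """Zero iff word is a valid codeword; actual error correction would
        need a full (e.g. Berlekamp-Massey style) decoder, omitted here."""
        rem = word
        for bit in range(14, 7, -1):
            if rem & (1 << bit):
                rem ^= G << (bit - 8)
        return rem

    cw = bch15_7_encode(0b1011001)
    print(bch15_7_syndrome(cw))               # 0: parity consistent
    print(bch15_7_syndrome(cw ^ 0b101))       # non-zero: 2 flipped bits detected
    ```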

  3. Relationships between trait impulsivity and cognitive control: the effect of attention switching on response inhibition and conflict resolution.

    PubMed

    Leshem, Rotem

    2016-02-01

    This study examined the relationship between trait impulsivity and cognitive control, as measured by the Barratt Impulsiveness Scale (BIS) and a focused attention dichotic listening to words task, respectively. In the task, attention was manipulated in two attention conditions differing in their cognitive control demands: one in which attention was directed to one ear at a time for a whole block of trials (blocked condition) and another in which attention was switched pseudo-randomly between the two ears from trial to trial (mixed condition). Results showed that high impulsivity participants exhibited more false alarm and intrusion errors as well as a lesser ability to distinguish between stimuli in the mixed condition, as compared to low impulsivity participants. In the blocked condition, the performance levels of the two groups were comparable with respect to these measures. In addition, total BIS scores were correlated with intrusions and laterality index in the mixed but not the blocked condition. The findings suggest that high impulsivity individuals may be less prone to attentional difficulties when cognitive load is relatively low. In contrast, when attention switching is involved, high impulsivity is associated with greater difficulty in inhibiting responses and resolving cognitive conflict than is low impulsivity, as reflected in error-prone information processing. The conclusion is that trait impulsivity in a non-clinical population is manifested more strongly when attention switching is required than during maintained attention. This may have important implications for the conceptualization and treatment of impulsivity in both non-clinical and clinical populations.

  4. Java Performance for Scientific Applications on LLNL Computer Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapfer, C; Wissink, A

    2002-05-10

    Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance, it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.

  5. Intravenous Chemotherapy Compounding Errors in a Follow-Up Pan-Canadian Observational Study.

    PubMed

    Gilbert, Rachel E; Kozak, Melissa C; Dobish, Roxanne B; Bourrier, Venetia C; Koke, Paul M; Kukreti, Vishal; Logan, Heather A; Easty, Anthony C; Trbovich, Patricia L

    2018-05-01

    Intravenous (IV) compounding safety has garnered recent attention as a result of high-profile incidents, awareness efforts from the safety community, and increasingly stringent practice standards. New research with more-sensitive error detection techniques continues to reinforce that error rates with manual IV compounding are unacceptably high. In 2014, our team published an observational study that described three types of previously unrecognized and potentially catastrophic latent chemotherapy preparation errors in Canadian oncology pharmacies that would otherwise be undetectable. We expand on this research and explore whether additional potential human failures are yet to be addressed by practice standards. Field observations were conducted in four cancer center pharmacies in four Canadian provinces from January 2013 to February 2015. Human factors specialists observed and interviewed pharmacy managers, oncology pharmacists, pharmacy technicians, and pharmacy assistants as they carried out their work. Emphasis was on latent errors (potential human failures) that could lead to outcomes such as wrong drug, dose, or diluent. Given the relatively short observational period, no active failures or actual errors were observed. However, 11 latent errors in chemotherapy compounding were identified. In terms of severity, all 11 errors create the potential for a patient to receive the wrong drug or dose, which in the context of cancer care, could lead to death or permanent loss of function. Three of the 11 practices were observed in our previous study, but eight were new. Applicable Canadian and international standards and guidelines do not explicitly address many of the potentially error-prone practices observed. We observed a significant degree of risk for error in manual mixing practice. These latent errors may exist in other regions where manual compounding of IV chemotherapy takes place. Continued efforts to advance standards, guidelines, technological innovation, and chemical quality testing are needed.

  6. STARS Proceedings (3-4 December 1991)

    DTIC Science & Technology

    1991-12-04

    PROJECT PROCESS OBJECTIVES & ASSOCIATED METRICS: Prioritize ECPs: complexity & error-history measures. Make vs. buy decisions: Effort & Quality (or... history measures); error-proneness and past histories of trouble with particular modules are very useful measures. Make vs. buy decisions: Does the... Effort offset the gain in Quality relative to buy... Effort and Quality (or defect rate) histories give helpful indications of how to make this decision

  7. Defense Mapping Agency (DMA) Raster-to-Vector Analysis

    DTIC Science & Technology

    1984-11-30

    model) to pinpoint critical deficiencies and understand trade-offs between alternative solutions. This may be exemplified by the allocation of human... process, prone to errors (i.e., human operator eye/motor control limitations), and its time-consuming nature (as a function of data density). It should... achieved through the facilities of computer interactive graphics. Each error or anomaly is individually identified by a human operator and corrected

  8. Inducible DNA-repair systems in yeast: competition for lesions.

    PubMed

    Mitchel, R E; Morrison, D P

    1987-03-01

    DNA lesions may be recognized and repaired by more than one DNA-repair process. If two repair systems with different error frequencies have overlapping lesion specificity and one or both is inducible, the resulting variable competition for the lesions can change the biological consequences of these lesions. This concept was demonstrated by observing mutation in yeast cells (Saccharomyces cerevisiae) exposed to combinations of mutagens under conditions which influenced the induction of error-free recombinational repair or error-prone repair. Total mutation frequency was reduced in a manner proportional to the dose of 60Co-gamma- or 254 nm UV radiation delivered prior to or subsequent to an MNNG exposure. Suppression was greater per unit radiation dose in cells gamma-irradiated in O2 as compared to N2. A rad3 (excision-repair) mutant gave results similar to wild-type but mutation in a rad52 (rec-) mutant exposed to MNNG was not suppressed by radiation. Protein-synthesis inhibition with heat shock or cycloheximide indicated that it was the mutation due to MNNG and not that due to radiation which had changed. These results indicate that MNNG lesions are recognized by both the recombinational repair system and the inducible error-prone system, but that gamma-radiation induction of error-free recombinational repair resulted in increased competition for the lesions, thereby reducing mutation. Similarly, gamma-radiation exposure resulted in a radiation dose-dependent reduction in mutation due to MNU, EMS, ENU and 8-MOP + UVA, but no reduction in mutation due to MMS. These results suggest that the number of mutational MMS lesions recognizable by the recombinational repair system must be very small relative to those produced by the other agents. MNNG induction of the inducible error-prone systems however, did not alter mutation frequencies due to ENU or MMS exposure but, in contrast to radiation, increased the mutagenic effectiveness of EMS. These experiments demonstrate that in this lower eukaryote, mutagen exposure does not necessarily result in a fixed risk of mutation, but that the risk can be markedly influenced by a variety of external stimuli including heat shock or exposure to other mutagens.

  9. Financial and clinical governance implications of clinical coding accuracy in neurosurgery: a multidisciplinary audit.

    PubMed

    Haliasos, N; Rezajooi, K; O'Neill, K S; Van Dellen, J; Hudovsky, Anita; Nouraei, Sar

    2010-04-01

    Clinical coding is the translation of documented clinical activities during an admission to a codified language. Healthcare Resource Groupings (HRGs) are derived from coding data and are used to calculate payment to hospitals in England, Wales and Scotland and to conduct national audit and benchmarking exercises. Coding is an error-prone process and an understanding of its accuracy within neurosurgery is critical for financial, organizational and clinical governance purposes. We undertook a multidisciplinary audit of neurosurgical clinical coding accuracy. Neurosurgeons trained in coding assessed the accuracy of 386 patient episodes. Where clinicians felt a coding error was present, the case was discussed with an experienced clinical coder. Concordance between the initial coder-only clinical coding and the final clinician-coder multidisciplinary coding was assessed. At least one coding error occurred in 71/386 patients (18.4%). There were 36 diagnosis and 93 procedure errors, and in 40 cases the initial HRG changed (10.4%). Financially, this translated to a £111 revenue loss per patient episode, projected to £171,452 of annual loss to the department. 85% of all coding errors were due to accumulation of coding changes that occurred only once in the whole data set. Neurosurgical clinical coding is error-prone. This is financially disadvantageous, and with the coding data being the source of comparisons within and between departments, coding inaccuracies paint a distorted picture of departmental activity and subspecialism in audit and benchmarking. Clinical engagement improves accuracy and is encouraged within a clinical governance framework.

  10. Clustered Mutation Signatures Reveal that Error-Prone DNA Repair Targets Mutations to Active Genes.

    PubMed

    Supek, Fran; Lehner, Ben

    2017-07-27

    Many processes can cause the same nucleotide change in a genome, making the identification of the mechanisms causing mutations a difficult challenge. Here, we show that clustered mutations provide a more precise fingerprint of mutagenic processes. Of nine clustered mutation signatures identified from >1,000 tumor genomes, three relate to variable APOBEC activity and three are associated with tobacco smoking. An additional signature matches the spectrum of translesion DNA polymerase eta (POLH). In lymphoid cells, these mutations target promoters, consistent with AID-initiated somatic hypermutation. In solid tumors, however, they are associated with UV exposure and alcohol consumption and target the H3K36me3 chromatin of active genes in a mismatch repair (MMR)-dependent manner. These regions normally have a low mutation rate because error-free MMR also targets H3K36me3 chromatin. Carcinogens and error-prone repair therefore redistribute mutations to the more important regions of the genome, contributing a substantial mutation load in many tumors, including driver mutations. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. A filtering method to generate high quality short reads using illumina paired-end technology.

    PubMed

    Eren, A Murat; Vineis, Joseph H; Morrison, Hilary G; Sogin, Mitchell L

    2013-01-01

    Consensus between independent reads improves the accuracy of genome and transcriptome analyses; however, a lack of consensus between very similar sequences in metagenomic studies can, and often does, represent natural variation of biological significance. The common use of machine-assigned quality scores on next-generation platforms does not necessarily correlate with accuracy. Here, we describe using the overlap of paired-end, short sequence reads to identify error-prone reads in marker gene analyses and their contribution to spurious OTUs following clustering analysis using QIIME. Our approach can also reduce error in shotgun sequencing data generated from libraries with small, tightly constrained insert sizes. The open-source implementation of this algorithm in the Python programming language, with user instructions, can be obtained from https://github.com/meren/illumina-utils.
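
    The core test can be sketched simply (a simplified stand-in, not the published implementation): for a library whose insert is short enough that both reads span the same template, reverse-complement read 2, compare the overlapping bases, and discard the pair when disagreements exceed a threshold, since disagreement between two independent observations of the same base flags error-prone reads.

    ```python
    def revcomp(seq: str) -> str:
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def pass_overlap_filter(read1: str, read2: str, max_mismatch=2) -> bool:
        """Keep a fully overlapping read pair only if the two independent
        observations of the template agree almost everywhere."""
        partner = revcomp(read2)
        mismatches = sum(a != b for a, b in zip(read1, partner))
        return mismatches <= max_mismatch

    # read2 is sequenced from the opposite strand of the same template
    template = "ACGTGGCTAGCTAACGGT"
    read1 = template
    read2 = revcomp(template)                               # perfect pair
    bad2 = revcomp(template[:8] + "TTTT" + template[12:])   # 3 disagreements
    print(pass_overlap_filter(read1, read2))                # True
    print(pass_overlap_filter(read1, bad2))                 # False
    ```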

  12. A hydrostatic weighing method using total lung capacity and a small tank.

    PubMed Central

    Warner, J G; Yeater, R; Sherwood, L; Weber, K

    1986-01-01

    The purpose of this study was to establish the validity and reliability of a hydrostatic weighing method using total lung capacity (measuring vital capacity with a respirometer at the time of weighing), the prone position, and a small oblong tank. The validity of the method was established by comparing the TLC prone (tank) method against three hydrostatic weighing methods administered in a pool. The three methods included residual volume seated, TLC seated and TLC prone. Eighty male and female subjects were underwater weighed using each of the four methods. Validity coefficients for per cent body fat between the TLC prone (tank) method and the RV seated (pool), TLC seated (pool) and TLC prone (pool) methods were .98, .99 and .99, respectively. A randomised complete block ANOVA found significant differences between the RV seated (pool) method and each of the three TLC methods with respect to both body density and per cent body fat. The differences were negligible with respect to HW error. Reliability of the TLC prone (tank) method was established by weighing twenty subjects three different times with ten-minute time intervals between testing. Multiple correlations yielded reliability coefficients for body density and per cent body fat values of .99 and .99, respectively. It was concluded that the TLC prone (tank) method is valid, reliable and a favourable method of hydrostatic weighing. PMID:3697596

  13. A hydrostatic weighing method using total lung capacity and a small tank.

    PubMed

    Warner, J G; Yeater, R; Sherwood, L; Weber, K

    1986-03-01

    The purpose of this study was to establish the validity and reliability of a hydrostatic weighing method using total lung capacity (measuring vital capacity with a respirometer at the time of weighing), the prone position, and a small oblong tank. The validity of the method was established by comparing the TLC prone (tank) method against three hydrostatic weighing methods administered in a pool. The three methods included residual volume seated, TLC seated and TLC prone. Eighty male and female subjects were underwater weighed using each of the four methods. Validity coefficients for per cent body fat between the TLC prone (tank) method and the RV seated (pool), TLC seated (pool) and TLC prone (pool) methods were .98, .99 and .99, respectively. A randomised complete block ANOVA found significant differences between the RV seated (pool) method and each of the three TLC methods with respect to both body density and per cent body fat. The differences were negligible with respect to HW error. Reliability of the TLC prone (tank) method was established by weighing twenty subjects three different times with ten-minute time intervals between testing. Multiple correlations yielded reliability coefficients for body density and per cent body fat values of .99 and .99, respectively. It was concluded that the TLC prone (tank) method is valid, reliable and a favourable method of hydrostatic weighing.

  14. ASSIST: User's manual

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1986-01-01

    Semi-Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. The ASSIST program allows the user to describe the semi-Markov model in a high-level language. Instead of specifying the individual states of the model, the user specifies the rules governing the behavior of the system, and these are used by ASSIST to automatically generate the model. The ASSIST program is described and illustrated by examples.
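
    The rule-based idea can be sketched outside ASSIST's own syntax: instead of enumerating states by hand, declare a start state and a transition rule, and let breadth-first expansion generate the explicit model. The rules below (processor failures in a triplex system) are a hypothetical example, not taken from the manual.

    ```python
    from collections import deque

    def build_model(n_procs=3, min_working=2, lam=1e-4):
        """Expand an explicit Markov model from one failure rule:
        any working processor may fail at rate lam; fewer than
        min_working processors is an absorbing system-failure state."""
        transitions, seen = [], {n_procs}
        queue = deque([n_procs])
        while queue:
            working = queue.popleft()
            if working < min_working:
                continue                      # absorbing: no rule fires here
            nxt = working - 1
            transitions.append((working, nxt, working * lam))
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
        return sorted(seen, reverse=True), transitions

    states, trans = build_model()
    print(states)   # [3, 2, 1]
    print(trans)    # [(3, 2, 0.0003), (2, 1, 0.0002)]
    ```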

  15. Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.

    PubMed

    Olson, Andrew P J; Graber, Mark L; Singh, Hardeep

    2018-01-29

    Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested but none widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.

  16. Joint estimation over multiple individuals improves behavioural state inference from animal movement data.

    PubMed

    Jonsen, Ian

    2016-02-08

    State-space models provide a powerful way to scale up inference of movement behaviours from individuals to populations when the inference is made across multiple individuals. Here, I show how a joint estimation approach that assumes individuals share identical movement parameters can lead to improved inference of behavioural states associated with different movement processes. I use simulated movement paths with known behavioural states to compare estimation error between nonhierarchical and joint estimation formulations of an otherwise identical state-space model. Behavioural state estimation error was strongly affected by the degree of similarity between movement patterns characterising the behavioural states, with less error when movements were strongly dissimilar between states. The joint estimation model improved behavioural state estimation relative to the nonhierarchical model for simulated data with heavy-tailed Argos location errors. When applied to Argos telemetry datasets from 10 Weddell seals, the nonhierarchical model estimated highly uncertain behavioural state switching probabilities for most individuals whereas the joint estimation model yielded substantially less uncertainty. The joint estimation model better resolved the behavioural state sequences across all seals. Hierarchical or joint estimation models should be the preferred choice for estimating behavioural states from animal movement data, especially when location data are error-prone.
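
    The benefit of sharing parameters across individuals can be seen in a deliberately tiny example (not the paper's state-space model): with short, noisy tracks, per-individual maximum-likelihood estimates of a shared movement parameter scatter widely, while the joint (pooled) estimate is far more stable. Track sizes and the exponential step-length assumption are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    true_rate = 0.5          # shared movement parameter (1 / mean step length)

    # three short tracks from individuals assumed to move identically
    tracks = [rng.exponential(1 / true_rate, size=n) for n in (15, 25, 10)]

    # nonhierarchical: each individual estimated alone (noisy for short tracks)
    per_individual = [1 / t.mean() for t in tracks]

    # joint estimation: one likelihood over all individuals with a shared
    # parameter; for exponential steps the MLE is simply 1 / pooled mean
    joint = 1 / np.concatenate(tracks).mean()

    print([round(r, 2) for r in per_individual])  # scattered around 0.5
    print(round(joint, 2))                        # closer to 0.5
    ```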

  17. De novo centriole formation in human cells is error-prone and does not require SAS-6 self-assembly

    PubMed Central

    Wang, Won-Jing; Acehan, Devrim; Kao, Chien-Han; Jane, Wann-Neng; Uryu, Kunihiro; Tsou, Meng-Fu Bryan

    2015-01-01

    Vertebrate centrioles normally propagate through duplication, but in the absence of preexisting centrioles, de novo synthesis can occur. Consistently, centriole formation is thought to strictly rely on self-assembly, involving self-oligomerization of the centriolar protein SAS-6. Here, through reconstitution of de novo synthesis in human cells, we surprisingly found that normal looking centrioles capable of duplication and ciliation can arise in the absence of SAS-6 self-oligomerization. Moreover, whereas canonically duplicated centrioles always form correctly, de novo centrioles are prone to structural errors, even in the presence of SAS-6 self-oligomerization. These results indicate that centriole biogenesis does not strictly depend on SAS-6 self-assembly, and may require preexisting centrioles to ensure structural accuracy, fundamentally deviating from the current paradigm. DOI: http://dx.doi.org/10.7554/eLife.10586.001 PMID:26609813

  18. Operationalizing Proneness to Externalizing Psychopathology as a Multivariate Psychophysiological Phenotype

    PubMed Central

    Nelson, Lindsay D.; Patrick, Christopher J.; Bernat, Edward M.

    2010-01-01

    The externalizing dimension is viewed as a broad dispositional factor underlying risk for numerous disinhibitory disorders. Prior work has documented deficits in event-related brain potential (ERP) responses in individuals prone to externalizing problems. Here, we constructed a direct physiological index of externalizing vulnerability from three ERP indicators and evaluated its validity in relation to criterion measures in two distinct domains: psychometric and physiological. The index was derived from three ERP measures that covaried in their relations with externalizing proneness: the error-related negativity and two variants of the P3. Scores on this ERP composite predicted psychometric criterion variables and accounted for externalizing-related variance in P3 response from a separate task. These findings illustrate how a diagnostic construct can be operationalized as a composite (multivariate) psychophysiological variable (phenotype). PMID:20573054

  19. The application of Aronson's taxonomy to medication errors in nursing.

    PubMed

    Johnson, Maree; Young, Helen

    2011-01-01

    Medication administration is a frequent nursing activity that is prone to error. In this study of 318 self-reported medication incidents (including near misses), very few resulted in patient harm: 7% required intervention or prolonged hospitalization or caused temporary harm. Aronson's classification system provided an excellent framework for analysis of the incidents, with a close connection between the type of error and the change strategy to minimize medication incidents. Taking a behavioral approach to medication error classification has yielded helpful strategies for nurses, such as nurse-call cards on patient lockers when patients are absent and checking of medication sign-off by outgoing and incoming staff at handover.

  20. The Sizing and Optimization Language, (SOL): Computer language for design problems

    NASA Technical Reports Server (NTRS)

    Lucas, Stephen H.; Scotti, Stephen J.

    1988-01-01

    The Sizing and Optimization Language (SOL), a new high-level, special-purpose computer language, was developed to expedite the application of numerical optimization to design problems and to make the process less error-prone. SOL utilizes the ADS optimization software and provides a clear, concise syntax for describing an optimization problem, the OPTIMIZE description, which closely parallels the mathematical description of the problem. SOL offers language statements which can be used to model a design mathematically, with subroutines or code logic, and with existing FORTRAN routines. In addition, SOL provides error checking and clear output of the optimization results. Because of these language features, SOL is best suited to model and optimize a design concept when the model consists of mathematical expressions written in SOL. For such cases, SOL's unique syntax and error checking can be fully utilized. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler, runtime library routines, and a SOL reference manual.

  1. ACTS 118x Final Report High-Speed TCP Interoperability Testing

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Zernic, Mike; Hoder, Douglas J.; Brooks, David E.; Beering, Dave R.; Welch, Arun

    1999-01-01

    With the recent explosion of the Internet and the enormous business opportunities available to communication system providers, great interest has developed in improving the efficiency of data transfer using the Transmission Control Protocol (TCP) of the Internet Protocol (IP) suite. Satellite system providers are interested in solving TCP efficiency problems associated with long delays and error-prone links; the terrestrial community is interested in solving TCP problems over high-bandwidth links; and the wireless community is interested in improving TCP performance over bandwidth-constrained, error-prone links. NASA realized that solutions had already been proposed for most of the problems associated with efficient data transfer over large bandwidth-delay links (which include satellite links). The solutions are detailed in various Internet Engineering Task Force (IETF) Requests for Comments (RFCs). Unfortunately, most of these solutions had not been tested at high speed (155+ Mbps). Therefore, NASA's ACTS experiments program initiated a series of TCP experiments to demonstrate the scalability of TCP/IP and determine how far the protocol can be optimized over a 622 Mbps satellite link. These experiments were known as the 118i and 118j experiments. During the 118i and 118j experiments, NASA worked closely with Sun Microsystems and FORE Systems to improve the operating system, TCP stacks, and network interface cards and drivers. We were able to obtain instantaneous data throughput rates of greater than 520 Mbps and average throughput rates of 470 Mbps using TCP over Asynchronous Transfer Mode (ATM) over a 622 Mbps Synchronous Optical Network (SONET) OC-12 link. Following the success of these experiments and the successful government/industry collaboration, a new series of experiments, the 118x experiments, was developed.
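
    The core obstacle those RFC solutions address is easy to quantify: a geostationary hop has roughly a 540 ms round-trip time, so at OC-12 rates the bandwidth-delay product dwarfs TCP's default 64 KB window, forcing RFC 1323 window scaling. The arithmetic below is a back-of-the-envelope sketch with an assumed RTT.

    ```python
    def bdp_bytes(bandwidth_bps: float, rtt_s: float) -> float:
        """Bytes that must be 'in flight' to keep the pipe full."""
        return bandwidth_bps / 8 * rtt_s

    link_rate = 622e6          # OC-12 SONET line rate, approximately
    geo_rtt = 0.54             # assumed GEO round-trip time in seconds

    bdp = bdp_bytes(link_rate, geo_rtt)
    print(f"BDP = {bdp / 1e6:.0f} MB vs 64 KB default window")   # ~42 MB

    # RFC 1323 window scaling: find the shift that covers the BDP
    scale = 0
    while (65535 << scale) < bdp:
        scale += 1
    print(f"window-scale option needed: {scale}")                # 10
    ```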

  2. Improving travel information products via robust estimation techniques : final report, March 2009.

    DOT National Transportation Integrated Search

    2009-03-01

    Traffic-monitoring systems, such as those using loop detectors, are prone to coverage gaps, arising from sensor noise, processing errors, and transmission problems. Such gaps adversely affect the accuracy of Advanced Traveler Information Systems. Th...

  3. Keeping mammalian mutation load in check: regulation of the activity of error-prone DNA polymerases by p53 and p21.

    PubMed

    Livneh, Zvi

    2006-09-01

    To overcome DNA lesions that block replication the cell employs translesion DNA synthesis (TLS) polymerases, a group of low fidelity DNA polymerases that have the capacity to bypass a wide range of DNA lesions. This TLS process is also termed error-prone repair, due to its inherent mutagenic nature. We have recently shown that the tumor suppressor p53 and the cell cycle inhibitor p21 are global regulators of TLS. When these proteins are missing or nonfunctional, TLS gets out of control: its extent increases to very high levels, and its fidelity decreases, causing an overall increase in mutation load. This may be explained by the loss of selectivity in the bypass of specific DNA lesions by their cognate specialized polymerases, such that lesion bypass continues to a maximum, regardless of the price paid in increased mutations. The p53 and p21 proteins are also required for efficient UV light-induced monoubiquitination of PCNA, which is consistent with a model in which this modification of PCNA is necessary but not sufficient for the normal activity of TLS. This regulation suggests that TLS evolved in mammals as a system that balances gain in survival with a tolerable mutational cost, and that disturbing this balance causes a potentially harmful increase in mutations, which might play a role in carcinogenesis.

  4. Exploring the impact of forcing error characteristics on physically based snow simulations within a global sensitivity analysis framework

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Lundquist, J. D.; Clark, M. P.

    2015-07-01

    Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology and which error characteristics are most important. Here we employ global sensitivity analysis to explore how (1) different error types (i.e., bias, random errors), (2) different error probability distributions, and (3) different error magnitudes influence physically based simulations of four snow variables (snow water equivalent, ablation rates, snow disappearance, and sublimation). We use the Sobol' global sensitivity analysis, which is typically used for model parameters but adapted here for testing model sensitivity to coexisting errors in all forcings. We quantify the Utah Energy Balance model's sensitivity to forcing errors with 1 840 000 Monte Carlo simulations across four sites and five different scenarios. Model outputs were (1) consistently more sensitive to forcing biases than random errors, (2) generally less sensitive to forcing error distributions, and (3) critically sensitive to different forcings depending on the relative magnitude of errors. For typical error magnitudes found in areas with drifting snow, precipitation bias was the most important factor for snow water equivalent, ablation rates, and snow disappearance timing, but other forcings had a more dominant impact when precipitation uncertainty was due solely to gauge undercatch. Additionally, the relative importance of forcing errors depended on the model output of interest. Sensitivity analysis can reveal which forcing error characteristics matter most for hydrologic modeling.
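
    A minimal sketch of the variance-based (Sobol') approach the study adapts, using a toy linear stand-in for the snow model rather than the Utah Energy Balance model; the Saltelli "pick-and-freeze" estimators below are standard, but the forcing-error model and its coefficients are invented for illustration:

      import numpy as np

      rng = np.random.default_rng(0)

      def snow_ablation(x):
          # Toy stand-in for a physically based snow model (NOT the Utah
          # Energy Balance model): ablation responds to air-temperature
          # bias, shortwave bias, and random precipitation error.
          t_bias, sw_bias, p_err = x[:, 0], x[:, 1], x[:, 2]
          return 2.0 * t_bias + 0.5 * sw_bias + 0.1 * p_err**2

      k, n = 3, 100_000
      A = rng.uniform(-1, 1, (n, k))
      B = rng.uniform(-1, 1, (n, k))
      fA, fB = snow_ablation(A), snow_ablation(B)
      var = np.var(np.concatenate([fA, fB]))

      for i, name in enumerate(["T bias", "SW bias", "P random"]):
          ABi = A.copy()
          ABi[:, i] = B[:, i]                 # Saltelli pick-and-freeze
          fABi = snow_ablation(ABi)
          S1 = np.mean(fB * (fABi - fA)) / var        # first-order index
          ST = 0.5 * np.mean((fA - fABi) ** 2) / var  # total-order index
          print(f"{name}: S1={S1:.2f}, ST={ST:.2f}")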

  5. Perspective-taking abilities in the balance between autism tendencies and psychosis proneness.

    PubMed

    Abu-Akel, Ahmad M; Wood, Stephen J; Hansen, Peter C; Apperly, Ian A

    2015-06-07

    Difficulties with the ability to appreciate the perspective of others (mentalizing) are central to both autism and schizophrenia spectrum disorders. While the disorders are diagnostically independent, they can co-occur in the same individual. Such co-morbidity is hypothesized to worsen mentalizing abilities. The recent influential 'diametric brain theory', however, suggests that the disorders are etiologically and phenotypically diametrical, predicting opposing effects on one's mentalizing abilities. To test these contrasting hypotheses, we evaluated the effect of psychosis and autism tendencies on the perspective-taking (PT) abilities of 201 neurotypical adults, on the assumption that autism tendencies and psychosis proneness are heritable dimensions of normal variation. We show that while both autism tendencies and psychosis proneness induce PT errors, their interaction reduced these errors. Our study is, to our knowledge, the first to observe that co-occurring autistic and psychotic traits can exert opposing influences on performance, producing a normalizing effect possibly by way of their diametrical effects on socio-cognitive abilities. This advances the notion that some individuals may, to some extent, be buffered against developing either illness or present fewer symptoms owing to a balanced expression of autistic and psychosis liability. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  6. Multistrip western blotting to increase quantitative data output.

    PubMed

    Kiyatkin, Anatoly; Aksamitiene, Edita

    2009-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per single blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same loading of the sample, and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and therefore is beneficial to apply in biomedical diagnostics, systems biology, and cell signaling research.

  7. Identifying chronic errors at freeway loop detectors- splashover, pulse breakup, and sensitivity settings.

    DOT National Transportation Integrated Search

    2011-03-01

    Traffic Management applications such as ramp metering, incident detection, travel time prediction, and vehicle classification greatly depend on the accuracy of data collected from inductive loop detectors, but these data are prone to various erro...

  8. First order error corrections in common introductory physics experiments

    NASA Astrophysics Data System (ADS)

    Beckey, Jacob; Baker, Andrew; Aravind, Vasudeva; Clarion Team

    As a part of introductory physics courses, students perform different standard lab experiments. Almost all of these experiments are prone to errors owing to factors like friction, misalignment of equipment, air drag, etc. Usually these types of errors are ignored by students, and not much thought is paid to their source. However, paying attention to the factors that give rise to errors helps students build better physics models and understand the physical phenomena behind experiments in more detail. In this work, we explore common causes of errors in introductory physics experiments and suggest changes that will mitigate the errors, or suggest models that take the sources of these errors into consideration. This work helps students build better and more refined physical models and understand physics concepts in greater detail. We thank the Clarion University undergraduate student grant for financial support of this project.
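
    For a concrete example of a first-order correction of the kind described: the small-angle pendulum formula can be corrected for finite amplitude, T ≈ 2π√(L/g)(1 + θ₀²/16). A short sketch with made-up measurements showing how ignoring the correction biases the estimate of g:

      import numpy as np

      # First-order amplitude correction to the simple pendulum: the
      # familiar T = 2*pi*sqrt(L/g) assumes small angles; to first order,
      # T = 2*pi*sqrt(L/g) * (1 + theta0**2 / 16). Ignoring the correction
      # biases g low at the amplitudes students actually use.
      L = 1.000          # pendulum length (m), assumed measured
      T_meas = 2.021     # measured period (s), illustrative value
      theta0 = 0.35      # release amplitude (rad)

      g_naive = (2 * np.pi / T_meas) ** 2 * L
      g_corrected = g_naive * (1 + theta0**2 / 16) ** 2
      print(f"naive g = {g_naive:.3f}, corrected g = {g_corrected:.3f} m/s^2")
      # -> naive ~9.67, corrected ~9.81 m/s^2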

  9. Efficient error correction for next-generation sequencing of viral amplicons

    PubMed Central

    2012-01-01

    Background Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. Results In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. Conclusions Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm PMID:22759430
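
    A toy illustration of the k-mer idea behind KEC (not the published algorithm, which calibrates its thresholds to homopolymers, read position, and amplicon length): k-mers seen fewer times than a frequency threshold are presumed to contain errors, and a base is corrected when the change lifts every overlapping k-mer above the threshold.

      from collections import Counter

      def kmer_spectrum(reads, k):
          counts = Counter()
          for r in reads:
              for i in range(len(r) - k + 1):
                  counts[r[i:i + k]] += 1
          return counts

      def correct_read(read, counts, k, threshold=2):
          # Replace a base only if the change lifts every overlapping
          # k-mer above the frequency threshold (weak k-mers are presumed
          # to be sequencing errors).
          read = list(read)
          for i, base in enumerate(read):
              lo, hi = max(0, i - k + 1), min(len(read) - k, i)
              weak = any(counts[''.join(read[j:j + k])] < threshold
                         for j in range(lo, hi + 1))
              if not weak:
                  continue
              for alt in 'ACGT':
                  if alt == base:
                      continue
                  trial = read[:i] + [alt] + read[i + 1:]
                  if all(counts[''.join(trial[j:j + k])] >= threshold
                         for j in range(lo, hi + 1)):
                      read[i] = alt
                      break
          return ''.join(read)

      reads = ["ACGTACGT"] * 50 + ["ACGTACCT"]   # one read with a likely error
      counts = kmer_spectrum(reads, k=4)
      print(correct_read("ACGTACCT", counts, k=4))  # -> ACGTACGT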

  10. Efficient error correction for next-generation sequencing of viral amplicons.

    PubMed

    Skums, Pavel; Dimitrova, Zoya; Campo, David S; Vaughan, Gilberto; Rossi, Livia; Forbi, Joseph C; Yokosawa, Jonny; Zelikovsky, Alex; Khudyakov, Yury

    2012-06-25

    Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm.

  11. Coordinating DNA polymerase traffic during high and low fidelity synthesis.

    PubMed

    Sutton, Mark D

    2010-05-01

    With the discovery that organisms possess multiple DNA polymerases (Pols) displaying different fidelities, processivities, and activities came the realization that mechanisms must exist to manage the actions of these diverse enzymes to prevent gratuitous mutations. Although many of the Pols encoded by most organisms are largely accurate, and participate in DNA replication and DNA repair, a sizeable fraction display a reduced fidelity, and act to catalyze potentially error-prone translesion DNA synthesis (TLS) past lesions that persist in the DNA. Striking the proper balance between use of these different enzymes during DNA replication, DNA repair, and TLS is essential for ensuring accurate duplication of the cell's genome. This review highlights mechanisms that organisms utilize to manage the actions of their different Pols. A particular emphasis is placed on discussion of current models for how different Pols switch places with each other at the replication fork during high fidelity replication and potentially error-prone TLS. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  12. An automated calibration method for non-see-through head mounted displays.

    PubMed

    Gilson, Stuart J; Fitzgibbon, Andrew W; Glennerster, Andrew

    2011-08-15

    Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet, existing calibration methods are time consuming and depend on human judgements, making them error prone, and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside a HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated a HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements. Copyright © 2011 Elsevier B.V. All rights reserved.
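
    The final step described above reduces to standard camera calibration from 3-D/2-D correspondences. A hedged sketch using OpenCV's calibrateCamera on synthetic data; a planar grid stands in for the tracked calibration object, and the poses and intrinsics are invented:

      import numpy as np
      import cv2

      # Synthetic stand-in for the tracked calibration object: a planar
      # grid of marker centroids (z = 0), viewed from several poses.
      grid = np.zeros((6 * 9, 3), np.float32)
      grid[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2) * 0.03  # 3 cm pitch

      K_true = np.array([[800, 0, 640], [0, 800, 512], [0, 0, 1]], float)
      image_size = (1280, 1024)
      objpoints, imgpoints = [], []
      rng = np.random.default_rng(1)
      for _ in range(10):
          rvec = rng.normal(0, 0.3, 3)                    # random view pose
          tvec = np.array([0.0, 0.0, 1.0]) + rng.normal(0, 0.05, 3)
          img, _ = cv2.projectPoints(grid, rvec, tvec, K_true, None)
          objpoints.append(grid)
          imgpoints.append(img.astype(np.float32))

      # K holds the intrinsics (focal lengths, principal point); each
      # (rvec, tvec) pair gives the extrinsics of one view.
      rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
          objpoints, imgpoints, image_size, None, None)
      print("RMS reprojection error (px):", rms)  # ~0 for noise-free points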

  13. Diastereoselective synthesis of L-threo-3,4-dihydroxyphenylserine by low-specific L-threonine aldolase mutants.

    PubMed

    Gwon, Hui-Jeong; Baik, Sang-Ho

    2010-01-01

    Diastereoselectivity-enhanced mutants of L-threonine aldolase (L-TA) for L-threo-3,4-dihydroxyphenylserine (L-threo-DOPS) synthesis were isolated by error-prone PCR followed by high-throughput screening. The greatest improvement was achieved with the mutant T3-3mm2, which showed a 4-fold increase over the wild-type L-TA. When the aldol condensation activity of whole T3-3mm2 cells was examined, the diastereomeric excess (de) was constantly maintained at 55% during batch reactions for 80 h, yielding 3.8 mg L-threo-DOPS/ml.

  14. Evolution: the dialogue between life and death

    NASA Astrophysics Data System (ADS)

    Holliday, Robin

    1997-12-01

    Organisms have the ability to harness energy from the environment to create order and to reproduce. From early error-prone systems, natural selection acted to produce present-day organisms with high accuracy in the synthesis of macromolecules. The environment imposes strict limits on reproduction, so evolution is always accompanied by the discarding of a large proportion of the less fit cells, or organisms. Sexual reproduction depends on an immortal germline and a soma which may be immortal or mortal. Higher animals living in hazardous environments have evolved aging and death of the soma for the benefit of the ongoing germline.

  15. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis.

    PubMed

    Prein, Andreas F; Gobiet, Andreas

    2017-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio-temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan-European data sets and a set that combines eight very high-resolution station-based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post-processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small-scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate-mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments.
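
    A minimal sketch of guideline (1), with three hypothetical gridded data sets standing in for station-, satellite-, and reanalysis-based products: the spread across data sets gives a floor on observational uncertainty against which a model bias can be judged.

      import numpy as np

      # Three hypothetical daily-mean precipitation fields (mm/day) on a
      # common 90 x 120 grid; in practice these would be regridded
      # observational products, not random draws.
      station, satellite, reanalysis = (
          np.random.default_rng(i).gamma(2.0, 1.5, (90, 120))
          for i in range(3))
      obs = np.stack([station, satellite, reanalysis])

      obs_spread = obs.max(axis=0) - obs.min(axis=0)   # per-cell range
      # Hypothetical model with a uniform 10% wet bias against the mean:
      model_bias = np.abs(1.1 * obs.mean(axis=0) - obs.mean(axis=0))
      significant = model_bias > obs_spread            # bias beyond obs spread
      print(f"cells with bias beyond observational spread: "
            f"{significant.mean():.0%}")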

  16. Improving specialist drug prescribing in primary care using task and error analysis: an observational study.

    PubMed

    Chana, Narinder; Porat, Talya; Whittlesea, Cate; Delaney, Brendan

    2017-03-01

    Electronic prescribing has benefited from computerised clinical decision support systems (CDSSs); however, no published studies have evaluated the potential for a CDSS to support GPs in prescribing specialist drugs. To identify potential weaknesses and errors in the existing process of prescribing specialist drugs that could be addressed in the development of a CDSS. Semi-structured interviews with key informants followed by an observational study involving GPs in the UK. Twelve key informants were interviewed to investigate the use of CDSSs in the UK. Nine GPs were observed while performing case scenarios depicting requests from hospitals or patients to prescribe a specialist drug. Activity diagrams, hierarchical task analysis, and systematic human error reduction and prediction approach analyses were performed. The current process of prescribing specialist drugs by GPs is prone to error. Errors of omission due to lack of information were the most common errors, which could potentially result in a GP prescribing a specialist drug that should only be prescribed in hospitals, or prescribing a specialist drug without reference to a shared care protocol. Half of all possible errors in the prescribing process had a high probability of occurrence. A CDSS supporting GPs during the process of prescribing specialist drugs is needed. This could, first, support the decision making of whether or not to undertake prescribing, and, second, provide drug-specific parameters linked to shared care protocols, which could reduce the errors identified and increase patient safety. © British Journal of General Practice 2017.

  17. Suggestibility and signal detection performance in hallucination-prone students.

    PubMed

    Alganami, Fatimah; Varese, Filippo; Wagstaff, Graham F; Bentall, Richard P

    2017-03-01

    Auditory hallucinations are associated with signal detection biases. We examine the extent to which suggestions influence performance on a signal detection task (SDT) in highly hallucination-prone and low hallucination-prone students. We also explore the relationship between trait suggestibility, dissociation and hallucination proneness. In two experiments, students completed on-line measures of hallucination proneness (the revised Launay-Slade Hallucination Scale; LSHS-R), trait suggestibility (Inventory of Suggestibility) and dissociation (Dissociative Experiences Scale-II). Students in the upper and lower tertiles of the LSHS-R performed an auditory SDT. Prior to the task, suggestions were made pertaining to the number of expected targets (Experiment 1, N = 60: high vs. low suggestions; Experiment 2, N = 62, no suggestion vs. high suggestion vs. no voice suggestion). Correlational and regression analyses indicated that trait suggestibility and dissociation predicted hallucination proneness. Highly hallucination-prone students showed a higher SDT bias in both studies. In Experiment 1, both bias scores were significantly affected by suggestions to the same degree. In Experiment 2, highly hallucination-prone students were more reactive to the high suggestion condition than the controls. Suggestions may affect source-monitoring judgments, and this effect may be greater in those who have a predisposition towards hallucinatory experiences.
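
    For reference, the sensitivity and bias measures in such signal detection analyses are typically computed from hit and false-alarm rates. A short sketch; the log-linear correction and the counts are illustrative choices, not the paper's data:

      from scipy.stats import norm

      def sdt_measures(hits, misses, false_alarms, correct_rejections):
          # Log-linear correction keeps rates away from 0 and 1.
          h = (hits + 0.5) / (hits + misses + 1)
          f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
          d_prime = norm.ppf(h) - norm.ppf(f)             # sensitivity
          criterion = -0.5 * (norm.ppf(h) + norm.ppf(f))  # response bias
          return d_prime, criterion

      # Hypothetical counts for one participant in a voice-detection task;
      # a negative criterion indicates a liberal ("yes"-prone) bias.
      print(sdt_measures(hits=30, misses=10,
                         false_alarms=12, correct_rejections=28))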

  18. Grunting's competitive advantage: Considerations of force and distraction

    PubMed Central

    Maglinti, Cj; Kingstone, Alan

    2018-01-01

    Background Grunting is pervasive in many athletic contests, and empirical evidence suggests that it may result in one exerting more physical force. It may also distract one's opponent. That grunts can distract was supported by a study showing that it led to an opponent being slower and more error prone when viewing tennis shots. An alternative explanation was that grunting masks the sound of a ball being hit. The present study provides evidence against this alternative explanation by testing the effect of grunting in a sport—mixed martial arts—where distraction, rather than masking, is the most likely mechanism. Methodology/Principal findings We first confirmed that kicking force is increased when a grunt is performed (Experiment 1), and then adapted methodology used in the tennis study to mixed martial arts (Experiment 2). Lifting the foot to kick is a silent act, and therefore there is nothing for a grunt to mask, i.e., its effect on an opponent’s response time and/or accuracy can likely be attributed to attentional distraction. Participants viewed videos of a trained mixed martial artist kicking that included, or did not include, a simulated grunt. The task was to determine as quickly as possible whether the kick was traveling upward or downward. Overall, and replicating the tennis finding, the present results indicate that a participant's response to a kick was delayed and more error prone when a simulated grunt was present. Conclusions/Significance The present findings indicate that simulated grunting may distract an opponent, leading to slower and more error prone responses. The implications for martial arts in particular, and the broader question of whether grunting should be perceived as 'cheating' in sports, are examined. PMID:29470505

  19. Grunting's competitive advantage: Considerations of force and distraction.

    PubMed

    Sinnett, Scott; Maglinti, Cj; Kingstone, Alan

    2018-01-01

    Grunting is pervasive in many athletic contests, and empirical evidence suggests that it may result in one exerting more physical force. It may also distract one's opponent. That grunts can distract was supported by a study showing that it led to an opponent being slower and more error prone when viewing tennis shots. An alternative explanation was that grunting masks the sound of a ball being hit. The present study provides evidence against this alternative explanation by testing the effect of grunting in a sport, mixed martial arts, where distraction, rather than masking, is the most likely mechanism. We first confirmed that kicking force is increased when a grunt is performed (Experiment 1), and then adapted methodology used in the tennis study to mixed martial arts (Experiment 2). Lifting the foot to kick is a silent act, and therefore there is nothing for a grunt to mask, i.e., its effect on an opponent's response time and/or accuracy can likely be attributed to attentional distraction. Participants viewed videos of a trained mixed martial artist kicking that included, or did not include, a simulated grunt. The task was to determine as quickly as possible whether the kick was traveling upward or downward. Overall, and replicating the tennis finding, the present results indicate that a participant's response to a kick was delayed and more error prone when a simulated grunt was present. The present findings indicate that simulated grunting may distract an opponent, leading to slower and more error prone responses. The implications for martial arts in particular, and the broader question of whether grunting should be perceived as 'cheating' in sports, are examined.

  20. A nucleotide-analogue-induced gain of function corrects the error-prone nature of human DNA polymerase iota.

    PubMed

    Ketkar, Amit; Zafar, Maroof K; Banerjee, Surajit; Marquez, Victor E; Egli, Martin; Eoff, Robert L

    2012-06-27

    Y-family DNA polymerases participate in replication stress and DNA damage tolerance mechanisms. The properties that allow these enzymes to copy past bulky adducts or distorted template DNA can result in a greater propensity for them to make mistakes. Of the four human Y-family members, human DNA polymerase iota (hpol ι) is the most error-prone. In the current study, we elucidate the molecular basis for improving the fidelity of hpol ι through use of the fixed-conformation nucleotide North-methanocarba-2'-deoxyadenosine triphosphate (N-MC-dATP). Three crystal structures were solved of hpol ι in complex with DNA containing a template 2'-deoxythymidine (dT) paired with an incoming dNTP or modified nucleotide triphosphate. The ternary complex of hpol ι inserting N-MC-dATP opposite dT reveals that the adenine ring is stabilized in the anti orientation about the pseudo-glycosyl torsion angle, which mimics precisely the mutagenic arrangement of dGTP:dT normally preferred by hpol ι. The stabilized anti conformation occurs without notable contacts from the protein but likely results from constraints imposed by the bicyclo[3.1.0]hexane scaffold of the modified nucleotide. Unmodified dATP and South-MC-dATP each adopt syn glycosyl orientations to form Hoogsteen base pairs with dT. The Hoogsteen orientation exhibits weaker base-stacking interactions and is less catalytically favorable than anti N-MC-dATP. Thus, N-MC-dATP corrects the error-prone nature of hpol ι by preventing the Hoogsteen base-pairing mode normally observed for hpol ι-catalyzed insertion of dATP opposite dT. These results provide a previously unrecognized means of altering the efficiency and the fidelity of a human translesion DNA polymerase.

  1. A nucleotide analogue induced gain of function corrects the error-prone nature of human DNA polymerase iota

    PubMed Central

    Ketkar, Amit; Zafar, Maroof K.; Banerjee, Surajit; Marquez, Victor E.; Egli, Martin; Eoff, Robert L

    2012-01-01

    Y-family DNA polymerases participate in replication stress and DNA damage tolerance mechanisms. The properties that allow these enzymes to copy past bulky adducts or distorted template DNA can result in a greater propensity for them to make mistakes. Of the four human Y-family members, human DNA polymerase iota (hpol ι) is the most error-prone. In the current study, we elucidate the molecular basis for improving the fidelity of hpol ι through use of the fixed-conformation nucleotide North-methanocarba-2′-deoxyadenosine triphosphate (N-MC-dATP). Three crystal structures were solved of hpol ι in complex with DNA containing a template 2′-deoxythymidine (dT) paired with an incoming dNTP or modified nucleotide triphosphate. The ternary complex of hpol ι inserting N-MC-dATP opposite dT reveals that the adenine ring is stabilized in the anti orientation about the pseudo-glycosyl torsion angle (χ), which mimics precisely the mutagenic arrangement of dGTP:dT normally preferred by hpol ι. The stabilized anti conformation occurs without notable contacts from the protein but likely results from constraints imposed by the bicyclo[3.1.0]hexane scaffold of the modified nucleotide. Unmodified dATP and South-MC-dATP each adopt syn glycosyl orientations to form Hoogsteen base pairs with dT. The Hoogsteen orientation exhibits weaker base stacking interactions and is less catalytically favorable than anti N-MC-dATP. Thus, N-MC-dATP corrects the error-prone nature of hpol ι by preventing the Hoogsteen base-pairing mode normally observed for hpol ι-catalyzed insertion of dATP opposite dT. These results provide a previously unrecognized means of altering the efficiency and the fidelity of a human translesion DNA polymerase. PMID:22632140

  2. Demonstration of qubit operations below a rigorous fault tolerance threshold with gate set tomography

    DOE PAGES

    Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik; ...

    2017-02-15

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4).

  3. Software for Quantifying and Simulating Microsatellite Genotyping Error

    PubMed Central

    Johnson, Paul C.D.; Haydon, Daniel T.

    2007-01-01

    Microsatellite genetic marker data are exploited in a variety of fields, including forensics, gene mapping, kinship inference and population genetics. In all of these fields, inference can be thwarted by failure to quantify and account for data errors, and kinship inference in particular can benefit from separating errors into two distinct classes: allelic dropout and false alleles. Pedant is MS Windows software for estimating locus-specific maximum likelihood rates of these two classes of error. Estimation is based on comparison of duplicate error-prone genotypes: neither reference genotypes nor pedigree data are required. Other functions include: plotting of error rate estimates and confidence intervals; simulations for performing power analysis and for testing the robustness of error rate estimates to violation of the underlying assumptions; and estimation of expected heterozygosity, which is a required input. The program, documentation and source code are available from http://www.stats.gla.ac.uk/~paulj/pedant.html. PMID:20066126
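
    For intuition about what estimation from duplicate genotypes involves, here is a crude moment estimator of allelic dropout, not pedant's maximum-likelihood method: dropout turns a true heterozygote into an apparent homozygote, so het/hom disagreements between duplicates are informative. The duplicate genotypes are hypothetical.

      def dropout_rate(pairs):
          # Crude moment estimator (NOT pedant's ML method): count
          # het/hom disagreements among duplicate pairs where the
          # homozygote matches one of the heterozygote's alleles.
          opportunities = dropouts = 0
          for g1, g2 in pairs:
              for het, other in ((g1, g2), (g2, g1)):
                  if het[0] != het[1]:              # heterozygous call
                      opportunities += 1
                      if other[0] == other[1] and other[0] in het:
                          dropouts += 1
          return dropouts / opportunities if opportunities else float('nan')

      # Hypothetical duplicate genotypes at one locus (allele lengths):
      dups = [((150, 154), (150, 154)), ((150, 154), (150, 150)),
              ((146, 154), (146, 154)), ((150, 150), (150, 150))]
      print(f"estimated dropout rate: {dropout_rate(dups):.2f}")  # 0.20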

  4. Demonstration of qubit operations below a rigorous fault tolerance threshold with gate set tomography

    PubMed Central

    Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik; Rudinger, Kenneth; Mizrahi, Jonathan; Fortier, Kevin; Maunz, Peter

    2017-01-01

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4). PMID:28198466

  5. Demonstration of qubit operations below a rigorous fault tolerance threshold with gate set tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4).

  6. Improvement in Patient Transfer Process From the Operating Room to the PICU Using a Lean and Six Sigma-Based Quality Improvement Project.

    PubMed

    Gleich, Stephen J; Nemergut, Michael E; Stans, Anthony A; Haile, Dawit T; Feigal, Scott A; Heinrich, Angela L; Bosley, Christopher L; Tripathi, Sandeep

    2016-08-01

    Ineffective and inefficient patient transfer processes can increase the chance of medical errors. Improvements in such processes are high-priority local institutional and national patient safety goals. At our institution, nonintubated postoperative pediatric patients are first admitted to the postanesthesia care unit before transfer to the PICU. This quality improvement project was designed to improve the patient transfer process from the operating room (OR) to the PICU. After direct observation of the baseline process, we introduced a structured, direct OR-PICU transfer process for orthopedic spinal fusion patients. We performed value stream mapping of the process to determine error-prone and inefficient areas. We evaluated primary outcome measures of handoff error reduction and the overall efficiency of patient transfer process time. Staff satisfaction was evaluated as a counterbalance measure. With the introduction of the new direct OR-PICU patient transfer process, the handoff communication error rate improved from 1.9 to 0.3 errors per patient handoff (P = .002). Inefficiency (patient wait time and non-value-creating activity) was reduced from 90 to 32 minutes. Handoff content was improved with fewer information omissions (P < .001). Staff satisfaction significantly improved among nearly all PICU providers. By using quality improvement methodology to design and implement a new direct OR-PICU transfer process with a structured multidisciplinary verbal handoff, we achieved sustained improvements in patient safety and efficiency. Handoff communication was enhanced, with fewer errors and content omissions. The new process improved efficiency, with high staff satisfaction. Copyright © 2016 by the American Academy of Pediatrics.

  7. Model validation and error estimation of tsunami runup using high resolution data in Sadeng Port, Gunungkidul, Yogyakarta

    NASA Astrophysics Data System (ADS)

    Basith, Abdul; Prakoso, Yudhono; Kongko, Widjo

    2017-07-01

    A tsunami model using high-resolution geometric data is indispensable for tsunami mitigation efforts, especially in tsunami-prone areas; such data are one of the factors that affect the accuracy of numerical tsunami modeling. Sadeng Port is a new piece of infrastructure on the southern coast of Java which could potentially be hit by a massive tsunami originating from the seismic gap. This paper discusses validation and error estimation of a tsunami model created using high-resolution geometric data for Sadeng Port. The model is validated against the wave height of the 2006 Pangandaran tsunami recorded by the Sadeng tide gauge, and it accommodates earthquake-tsunami parameters derived from the seismic gap. Validation using Student's t-test shows that the modeled and observed tsunami heights at the Sadeng tide gauge are statistically equal at the 95% confidence level, with RMSE and NRMSE values of 0.428 m and 22.12%, while the difference in tsunami wave travel time is 12 minutes.
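
    The reported error measures are straightforward to reproduce. A sketch with illustrative numbers (not the Sadeng records), using a paired t-test as one reasonable reading of the comparison and range-normalised RMSE as one common NRMSE convention:

      import numpy as np
      from scipy import stats

      observed = np.array([1.10, 0.85, 0.60, 0.42, 0.30])  # gauge heights (m)
      modelled = np.array([0.95, 0.80, 0.72, 0.38, 0.35])  # model output (m)

      rmse = np.sqrt(np.mean((modelled - observed) ** 2))
      nrmse = rmse / (observed.max() - observed.min())  # range normalisation
      t, p = stats.ttest_rel(modelled, observed)        # paired t-test
      print(f"RMSE={rmse:.3f} m, NRMSE={nrmse:.1%}, t={t:.2f}, p={p:.2f}")
      # p > 0.05 -> modelled and observed heights statistically
      # indistinguishable at the 95% confidence level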

  8. Impact of Standardized Communication Techniques on Errors during Simulated Neonatal Resuscitation.

    PubMed

    Yamada, Nicole K; Fuerch, Janene H; Halamek, Louis P

    2016-03-01

    Current patterns of communication in high-risk clinical situations, such as resuscitation, are imprecise and prone to error. We hypothesized that the use of standardized communication techniques would decrease the errors committed by resuscitation teams during neonatal resuscitation. In a prospective, single-blinded, matched pairs design with block randomization, 13 subjects performed as a lead resuscitator in two simulated complex neonatal resuscitations. Two nurses assisted each subject during the simulated resuscitation scenarios. In one scenario, the nurses used nonstandard communication; in the other, they used standardized communication techniques. The performance of the subjects was scored to determine errors committed (defined relative to the Neonatal Resuscitation Program algorithm), time to initiation of positive pressure ventilation (PPV), and time to initiation of chest compressions (CC). In scenarios in which subjects were exposed to standardized communication techniques, there was a trend toward decreased error rate, time to initiation of PPV, and time to initiation of CC. While not statistically significant, there was a 1.7-second improvement in time to initiation of PPV and a 7.9-second improvement in time to initiation of CC. Should these improvements in human performance be replicated in the care of real newborn infants, they could improve patient outcomes and enhance patient safety.

  9. Effect of lethality on the extinction and on the error threshold of quasispecies.

    PubMed

    Tejero, Hector; Marín, Arturo; Montero, Francisco

    2010-02-21

    In this paper the effect of lethality on error threshold and extinction has been studied in a population of error-prone self-replicating molecules. For given lethality and a simple fitness landscape, three dynamic regimes can be obtained: quasispecies, error catastrophe, and extinction. Using a simple model in which molecules are classified as master, lethal and non-lethal mutants, it is possible to obtain the mutation rates of the transitions between the three regimes analytically. The numerical resolution of the extended model, in which molecules are classified depending on their Hamming distance to the master sequence, confirms the results obtained in the simple model and shows how an error catastrophe regime changes when lethality is taken into account. (c) 2009 Elsevier Ltd. All rights reserved.
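
    A toy numerical version of the simple three-class model (all per-generation rates are assumed values; the paper's analytical transition conditions are more general): classify the regime by comparing the dominant growth factor with a decay threshold, and the master's growth with that of the non-lethal mutants.

      import numpy as np

      def classify(mu, L=20, sigma=4.0, lam=0.6, decay=0.45):
          # Three classes: master sequences (fitness sigma), viable
          # mutants (fitness 1) and lethal mutants (fitness 0). Q is the
          # probability of copying an L-base genome without error; a
          # fraction lam of erroneous copies is lethal. Back mutation to
          # the master is neglected. All parameters are illustrative.
          Q = (1 - mu) ** L
          # Per-capita growth matrix of [master, viable mutant] numbers:
          M = np.array([[sigma * Q,                   0.0],
                        [sigma * (1 - Q) * (1 - lam), Q + (1 - Q) * (1 - lam)]])
          growth = np.max(np.abs(np.linalg.eigvals(M)))  # dominant growth
          if growth < decay:
              return "extinction"
          master_wins = sigma * Q > Q + (1 - Q) * (1 - lam)
          return "quasispecies" if master_wins else "error catastrophe"

      for mu in (0.05, 0.11, 0.20):
          print(f"mu={mu}: {classify(mu)}")
      # -> quasispecies, error catastrophe, extinction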

  10. Ability/Motivation Interactions in Complex Skill Acquisition

    DTIC Science & Technology

    1988-04-28

    ...attentional resources. Finally, in the declarative knowledge phase, performance is slow and error prone. Once the learner has come to an adequate cognitive... mediation by the learner. After a substantial amount of consistent task practice, skilled performance becomes fast, accurate, and the task can often be...

  11. DNA polymerase η mutational signatures are found in a variety of different types of cancer.

    PubMed

    Rogozin, Igor B; Goncearenco, Alexander; Lada, Artem G; De, Subhajyoti; Yurchenko, Vyacheslav; Nudelman, German; Panchenko, Anna R; Cooper, David N; Pavlov, Youri I

    2018-01-01

    DNA polymerase (pol) η is a specialized error-prone polymerase with at least two quite different and contrasting cellular roles: to mitigate the genetic consequences of solar UV irradiation, and promote somatic hypermutation in the variable regions of immunoglobulin genes. Misregulation and mistargeting of pol η can compromise genome integrity. We explored whether the mutational signature of pol η could be found in datasets of human somatic mutations derived from normal and cancer cells. A substantial excess of single and tandem somatic mutations within known pol η mutable motifs was noted in skin cancer as well as in many other types of human cancer, suggesting that somatic mutations in A:T bases generated by DNA polymerase η are a common feature of tumorigenesis. Another peculiarity of pol η mutational signatures, mutations in YCG motifs, led us to speculate that error-prone DNA synthesis opposite methylated CpG dinucleotides by misregulated pol η in tumors might constitute an additional mechanism of cytosine demethylation in this hypermutable dinucleotide.

  12. Inducible error-prone repair in B. subtilis. Final report, September 1, 1979-June 30, 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasbin, R. E.

    1981-06-01

    The research performed under this contract has been concentrated on the relationship between inducible DNA repair systems, mutagenesis and the competent state in the gram positive bacterium Bacillus subtilis. The following results have been obtained from this research: (1) competent Bacillus subtilis cells have been developed into a sensitive tester system for carcinogens; (2) competent B. subtilis cells have an efficient excision-repair system, however, this system will not function on bacteriophage DNA taken into the cell via the process of transfection; (3) DNA polymerase III is essential in the mechanism of the process of W-reactivation; (4) B. subtilis strains cured of their defective prophages have been isolated and are now being developed for gene cloning systems; (5) protoplasts of B. subtilis have been shown capable of acquiring DNA repair enzymes (i.e., enzyme therapy); and (6) a plasmid was characterized which enhanced inducible error-prone repair in a gram positive organism.

  13. Delusion proneness and emotion appraisal in individuals with high psychosis vulnerability.

    PubMed

    Szily, Erika; Kéri, Szabolcs

    2013-01-01

    Evidence suggests that emotional processes play an important role in the development of delusions. The aim of the present study was to investigate emotion appraisal in individuals with high and low psychosis proneness. We compared 30 individuals who experienced a transient psychotic episode followed by a complete remission with 30 healthy control volunteers. The participants received the Peters et al. Delusion Inventory (PDI) and the Scherer's Emotion Appraisal Questionnaire. We also assessed the IQ and the severity of depressive and anxiety symptoms. Results revealed that individuals with high psychosis proneness displayed increased PDI scores and more pronounced anxiety compared with individuals with low psychosis proneness. There was a specific pattern of emotion appraisal in individuals with high psychosis proneness. In the case of fear, they achieved higher scores for external causality and immorality, and lower scores for coping ability and self-esteem compared with individuals with low proneness. The PDI scores were weakly related to external causality (r = 0.41) and self-esteem (r = -0.37). In the case of sadness and joy, no emotion appraisal differences were found between participants with low and high proneness. These results suggest that individuals who had a history of psychotic breakdown and therefore exhibit high psychosis proneness display an altered appraisal of fear, emphasizing external circumstances, feeling less power to cope and experience low self-esteem. Patients remitted from a transient psychotic episode still exhibit milder forms of delusion proneness. Emotion appraisal for fear is related to delusion proneness. Clinicians should pay a special attention to self-esteem and attribution biases in psychosis-prone individuals. Copyright © 2011 John Wiley & Sons, Ltd.

  14. Hybrid learning in signalling games

    NASA Astrophysics Data System (ADS)

    Barrett, Jeffrey A.; Cochran, Calvin T.; Huttegger, Simon; Fujiwara, Naoki

    2017-09-01

    Lewis-Skyrms signalling games have been studied under a variety of low-rationality learning dynamics. Reinforcement dynamics are stable but slow and prone to evolving suboptimal signalling conventions. A low-inertia trial-and-error dynamic like win-stay/lose-randomise is fast and reliable at finding perfect signalling conventions but unstable in the context of noise or agent error. Here we consider a low-rationality hybrid of reinforcement and win-stay/lose-randomise learning that exhibits the virtues of both. This hybrid dynamics is reliable, stable and exceptionally fast.
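
    A minimal sketch of one such hybrid in a 2-state/2-signal/2-act Lewis game, assuming Roth-Erev urn reinforcement on success plus an occasional lose-randomise reset of the urns just used; the exact hybrid rule in the paper may differ.

      import random

      def play(rounds=10_000, lose_reset_p=0.1, seed=0):
          rng = random.Random(seed)
          sender = [[1.0, 1.0] for _ in range(2)]    # state -> signal weights
          receiver = [[1.0, 1.0] for _ in range(2)]  # signal -> act weights
          draw = lambda w: rng.choices([0, 1], weights=w)[0]
          wins = 0
          for _ in range(rounds):
              state = rng.randrange(2)
              sig = draw(sender[state])
              act = draw(receiver[sig])
              if act == state:
                  wins += 1
                  sender[state][sig] += 1.0       # Roth-Erev reinforcement
                  receiver[sig][act] += 1.0
              elif rng.random() < lose_reset_p:
                  sender[state] = [1.0, 1.0]      # lose-randomise: forget
                  receiver[sig] = [1.0, 1.0]      # the urns just used
          return wins / rounds

      # Success rate typically climbs toward 1 as a signalling system forms:
      print(f"success rate: {play():.2f}")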

  15. Reduced vision selectively impairs spatial updating in fall-prone older adults.

    PubMed

    Barrett, Maeve M; Doheny, Emer P; Setti, Annalisa; Maguinness, Corrina; Foran, Timothy G; Kenny, Rose Anne; Newell, Fiona N

    2013-01-01

    The current study examined the role of vision in spatial updating and its potential contribution to an increased risk of falls in older adults. Spatial updating was assessed using a path integration task in fall-prone and healthy older adults. Specifically, participants conducted a triangle completion task in which they were guided along two sides of a triangular route and were then required to return, unguided, to the starting point. During the task, participants could either clearly view their surroundings (full vision) or visuo-spatial information was reduced by means of translucent goggles (reduced vision). Path integration performance was measured by calculating the distance and angular deviation from the participant's return point relative to the starting point. Gait parameters for the unguided walk were also recorded. We found equivalent performance across groups on all measures in the full vision condition. In contrast, in the reduced vision condition, where participants had to rely on interoceptive cues to spatially update their position, fall-prone older adults made significantly larger distance errors relative to healthy older adults. However, there were no other performance differences between fall-prone and healthy older adults. These findings suggest that fall-prone older adults, compared to healthy older adults, have greater difficulty in reweighting other sensory cues for spatial updating when visual information is unreliable.

  16. Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.

    PubMed

    Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly

    2015-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per single blotting cycle up to tenfold; allows concurrent measurement of up to nine different total and/or posttranslationally modified proteins from the same loading of the sample; and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data and therefore is advantageous to apply in biomedical diagnostics, systems biology, and cell signaling research.

  17. Adopting Extensible Business Reporting Language (XBRL): A Grounded Theory

    ERIC Educational Resources Information Center

    Cruz, Marivic

    2010-01-01

    In 2007 and 2008, government challenges consisted of error-prone, manually intensive, and inefficient environments for financial reporting. Banking regulators worldwide faced issues with respect to transparency, timeliness, quality, and managing risks associated with accounting opacity. The general problem was the existing reporting standards and…

  18. Efficient Dependency Computation for Dynamic Hybrid Bayesian Network in On-line System Health Management Applications

    DTIC Science & Technology

    2014-10-02

    ...intervals (Neil, Tailor, Marquez, Fenton, & Hear, 2007). This is cumbersome, error prone and usually inaccurate. Even though a universal framework... Science. Neil, M., Tailor, M., Marquez, D., Fenton, N., & Hear. (2007). Inference in Bayesian networks using dynamic discretisation. Statistics...

  19. Graph-based active learning of agglomeration (GALA): a Python library to segment 2D and 3D neuroimages

    PubMed Central

    Nunez-Iglesias, Juan; Kennedy, Ryan; Plaza, Stephen M.; Chakraborty, Anirban; Katz, William T.

    2014-01-01

    The aim in high-resolution connectomics is to reconstruct complete neuronal connectivity in a tissue. Currently, the only technology capable of resolving the smallest neuronal processes is electron microscopy (EM). Thus, a common approach to network reconstruction is to perform (error-prone) automatic segmentation of EM images, followed by manual proofreading by experts to fix errors. We have developed an algorithm and software library to not only improve the accuracy of the initial automatic segmentation, but also point out the image coordinates where it is likely to have made errors. Our software, called gala (graph-based active learning of agglomeration), improves the state of the art in agglomerative image segmentation. It is implemented in Python and makes extensive use of the scientific Python stack (numpy, scipy, networkx, scikit-learn, scikit-image, and others). We present here the software architecture of the gala library, and discuss several designs that we consider would be generally useful for other segmentation packages. We also discuss the current limitations of the gala library and how we intend to address them. PMID:24772079

  20. Decreasing scoring errors on Wechsler Scale Vocabulary, Comprehension, and Similarities subtests: a preliminary study.

    PubMed

    Linger, Michele L; Ray, Glen E; Zachar, Peter; Underhill, Andrea T; LoBello, Steven G

    2007-10-01

    Studies of graduate students learning to administer the Wechsler scales have generally shown that training is not associated with the development of scoring proficiency. Many studies report on the reduction of aggregated administration and scoring errors, a strategy that does not highlight the reduction of errors on subtests identified as most prone to error. This study evaluated the development of scoring proficiency specifically on the Wechsler (WISC-IV and WAIS-III) Vocabulary, Comprehension, and Similarities subtests during training by comparing a set of 'early test administrations' to 'later test administrations.' Twelve graduate students enrolled in an intelligence-testing course participated in the study. Scoring errors (e.g., incorrect point assignment) were evaluated on the students' actual practice administration test protocols. Errors on all three subtests declined significantly when scoring errors on 'early' sets of Wechsler scales were compared to those made on 'later' sets. However, correcting these subtest scoring errors did not cause significant changes in subtest scaled scores. Implications for clinical instruction and future research are discussed.

  1. Simplified stereo-optical ultrasound plane calibration

    NASA Astrophysics Data System (ADS)

    Hoßbach, Martin; Noll, Matthias; Wesarg, Stefan

    2013-03-01

    Image guided therapy is a natural concept and commonly used in medicine. In anesthesia, a common task is the injection of an anesthetic close to a nerve under freehand ultrasound guidance. Several guidance systems exist using electromagnetic tracking of the ultrasound probe as well as the needle, providing the physician with a precise projection of the needle into the ultrasound image. This, however, requires additional expensive devices. We suggest using optical tracking with miniature cameras attached to a 2D ultrasound probe to achieve a higher acceptance among physicians. The purpose of this paper is to present an intuitive method to calibrate freehand ultrasound needle guidance systems employing a rigid stereo camera system. State-of-the-art methods are based on a complex series of error-prone coordinate system transformations, which makes them susceptible to error accumulation. By reducing the number of calibration steps to a single calibration procedure we provide a calibration method that is equivalent, yet not prone to error accumulation. It requires a linear calibration object and is validated on three datasets utilizing different calibration objects: a 6 mm metal bar and a 1.25 mm biopsy needle were used for experiments. Compared to existing calibration methods for freehand ultrasound needle guidance systems, we are able to achieve higher accuracy results while additionally reducing the overall calibration complexity.

  2. Implementing High-Performance Geometric Multigrid Solver with Naturally Grained Messages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shan, Hongzhang; Williams, Samuel; Zheng, Yili

    2015-10-26

    Structured-grid linear solvers often require manual packing and unpacking of communication data to achieve high performance. Orchestrating this process efficiently is challenging, labor-intensive, and potentially error-prone. In this paper, we explore an alternative approach that communicates the data with naturally grained message sizes without manual packing and unpacking. This approach is the distributed analogue of shared-memory programming, taking advantage of the global address space in PGAS languages to provide substantial programming ease. However, its performance may suffer from the large number of small messages. We investigate the runtime support required in the UPC++ library for this naturally grained version to close the performance gap between the two approaches and attain comparable performance at scale using the High-Performance Geometric Multigrid (HPGMG-FV) benchmark as a driver.

  3. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis

    PubMed Central

    Gobiet, Andreas

    2016-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio‐temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan‐European data sets and a set that combines eight very high‐resolution station‐based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post‐processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small‐scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate‐mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments. PMID:28111497

  4. "Jumping to conclusions" in delusion-prone participants: an experimental economics approach.

    PubMed

    van der Leer, Leslie; McKay, Ryan

    2014-01-01

    That delusional and delusion-prone individuals "jump to conclusions" on probabilistic reasoning tasks is a key finding in cognitive neuropsychiatry. Here we focused on a less frequently investigated aspect of "jumping to conclusions" (JTC): certainty judgments. We incorporated rigorous procedures from experimental economics to eliminate potential confounds of miscomprehension and motivation and systematically investigated the effect of incentives on task performance. Low- and high-delusion-prone participants (n = 109) completed a series of computerised trials; on each trial, they were shown a black or a white fish, caught from one of the two lakes containing fish of both colours in complementary ratios. In the betting condition, participants were given £4 to distribute over the two lakes as they wished; in the control condition, participants simply provided an estimate of how probable each lake was. Deviations from Bayesian probabilities were investigated. Whereas high-delusion-prone participants in both the control and betting conditions underestimated the Bayesian probabilities (i.e. were conservative), low-delusion-prone participants in the control condition underestimated but those in the betting condition provided accurate estimates. In the control condition, there was a trend for high-delusion-prone participants to give higher estimates than low-delusion-prone participants, which is consistent with previous reports of "jumping to conclusions" in delusion-prone participants. However, our findings in the betting condition, where high-delusion-prone participants provided lower estimates than low-delusion-prone participants (who were accurate), are inconsistent with the jumping-to-conclusions effect in both a relative and an absolute sense. Our findings highlight the key role of task incentives and underscore the importance of comparing the responses of delusion-prone participants to an objective rational standard as well as to the responses of non-delusion-prone participants.
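
    The Bayesian standard against which the estimates are compared is elementary. A sketch assuming complementary 60:40 colour ratios (an illustrative choice; the study's ratios may differ) and a flat prior over the two lakes:

      from math import prod

      def posterior_lake_A(fish, ratio_A=0.6):
          # Two lakes with complementary colour ratios: lake A holds a
          # ratio_A proportion of black fish, lake B holds 1 - ratio_A.
          # fish is the observed sequence, e.g. "BBW"; flat prior.
          p = {'B': ratio_A, 'W': 1 - ratio_A}
          like_A = prod(p[f] for f in fish)
          like_B = prod(1 - p[f] for f in fish)
          return like_A / (like_A + like_B)

      for seq in ("B", "BB", "BBW"):
          print(seq, f"P(lake A) = {posterior_lake_A(seq):.2f}")
      # -> 0.60, 0.69, 0.60: underestimating these values is the
      #    "conservative" pattern described above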

  5. Efficient Variational Quantum Simulator Incorporating Active Error Minimization

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2017-04-01

    One of the key applications for quantum computers will be the simulation of other quantum systems that arise in chemistry, materials science, etc., in order to accelerate the process of discovery. It is important to ask the following question: Can this simulation be achieved using near-future quantum processors, of modest size and under imperfect control, or must it await the more distant era of large-scale fault-tolerant quantum computing? Here, we propose a variational method involving closely integrated classical and quantum coprocessors. We presume that all operations in the quantum coprocessor are prone to error. The impact of such errors is minimized by boosting them artificially and then extrapolating to the zero-error case. In comparison to a more conventional optimized Trotterization technique, we find that our protocol is efficient and appears to be fundamentally more robust against error accumulation.
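
    The error-boosting-and-extrapolation step can be illustrated in a few lines of numpy: measure an expectation value at several artificially amplified error rates, fit a low-order polynomial, and read off the value at zero error. The boost factors and measured values below are invented for illustration.

        import numpy as np

        # Hypothetical expectation values of some observable, measured with the
        # physical error rate boosted by known factors c (c = 1 is the bare device).
        boost_factors = np.array([1.0, 1.5, 2.0, 2.5])
        measured = np.array([0.82, 0.74, 0.67, 0.61])   # decays as noise grows

        # Fit a low-order polynomial in the boost factor and evaluate it at c = 0,
        # i.e. Richardson-style extrapolation to the zero-error limit.
        coeffs = np.polyfit(boost_factors, measured, deg=2)
        zero_noise_estimate = np.polyval(coeffs, 0.0)
        print(f"extrapolated zero-error value: {zero_noise_estimate:.3f}")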

  6. Noise-induced errors in geophysical parameter estimation from retarding potential analyzers in low Earth orbit

    NASA Astrophysics Data System (ADS)

    Debchoudhury, Shantanab; Earle, Gregory

    2017-04-01

    Retarding Potential Analyzers (RPA) have a rich flight heritage. Standard curve-fitting analysis techniques exist that can infer state variables in the ionospheric plasma environment from RPA data, but the estimation process is prone to errors arising from a number of sources. Previous work has focused on the effects of grid geometry on uncertainties in estimation; however, no prior study has quantified the estimation errors due to additive noise. In this study, we characterize the errors in estimation of thermal plasma parameters by adding noise to the simulated data derived from the existing ionospheric models. We concentrate on low-altitude, mid-inclination orbits since a number of nano-satellite missions are focused on this region of the ionosphere. The errors are quantified and cross-correlated for varying geomagnetic conditions.
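
    A common way to quantify such noise-induced estimation errors is a Monte Carlo experiment: generate synthetic curves, add noise, refit, and inspect the spread of the recovered parameters. The sketch below does this for a toy logistic current-voltage model, not the actual RPA response function, so the model form, parameters, and noise level are all assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        # Toy stand-in for an RPA current-voltage curve: a logistic roll-off
        # parameterised by a density-like amplitude n and temperature-like width t.
        def iv_model(v, n, t):
            return n / (1.0 + np.exp((v - 1.0) / t))

        rng = np.random.default_rng(1)
        v = np.linspace(0.0, 3.0, 60)
        true_n, true_t = 1.0, 0.25

        estimates = []
        for _ in range(500):                    # repeated noisy "measurements"
            i_noisy = iv_model(v, true_n, true_t) + rng.normal(0, 0.02, v.size)
            popt, _ = curve_fit(iv_model, v, i_noisy, p0=(0.8, 0.3))
            estimates.append(popt)

        err = np.std(estimates, axis=0) / np.array([true_n, true_t])
        print(f"relative 1-sigma estimation errors: n {err[0]:.1%}, t {err[1]:.1%}")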

  7. List of Error-Prone Abbreviations, Symbols, and Dose Designations

    MedlinePlus

    From the table of error-prone abbreviations (Intended Meaning / Misinterpretation / Correction): "UD" for unit dose (e.g., diltiazem 125 mg IV infusion "UD" misinterpreted as meaning to give the entire infusion as a unit [bolus] dose); use "as directed". Drug names: "Nitro" drip for nitroglycerin infusion, mistaken as sodium nitroprusside infusion; use the complete drug name.

  8. Finite element modeling of light propagation in fruit under illumination of continuous-wave beam

    USDA-ARS?s Scientific Manuscript database

    Spatially-resolved spectroscopy provides a means for measuring the optical properties of biological tissues, based on analytical solutions to diffusion approximation for semi-infinite media under the normal illumination of infinitely small size light beam. The method is, however, prone to error in m...

  9. Finite element simulation of light transfer in turbid media under structured illumination

    USDA-ARS?s Scientific Manuscript database

    Spatial-frequency domain (SFD) imaging technique allows to estimate the optical properties of biological tissues in a wide field of view. The technique is, however, prone to error in measurement because the two crucial assumptions used for deriving the analytical solution to diffusion approximation ...

  10. Propensity Score Weighting with Error-Prone Covariates

    ERIC Educational Resources Information Center

    McCaffrey, Daniel F.; Lockwood, J. R.; Setodji, Claude M.

    2011-01-01

    Inverse probability weighting (IPW) estimates are widely used in applications where data are missing due to nonresponse or censoring or in observational studies of causal effects where the counterfactuals cannot be observed. This extensive literature has shown the estimators to be consistent and asymptotically normal under very general conditions,…

  11. How Emotions Affect Learning.

    ERIC Educational Resources Information Center

    Sylwester, Robert

    1994-01-01

    Studies show our emotional system is a complex, widely distributed, and error-prone system that defines our basic personality early in life and is quite resistant to change. This article describes our emotional system's major parts (the peptides that carry emotional information and the body and brain structures that activate and regulate emotions)…

  12. Online Hand Holding in Fixing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2005-01-01

    According to most surveys, computer manufacturers such as HP put out reliable products, and computers in general are less troublesome than in the past. But personal computers are still prone to bugs, conflicts, viruses, spyware infestations, hacker and phishing attacks, and--most of all--user error. Unfortunately, technical support from computer…

  13. Antisaccade performance of 1,273 men: effects of schizotypy, anxiety, and depression.

    PubMed

    Smyrnis, Nikolaos; Evdokimidis, Ioannis; Stefanis, Nicholas C; Avramopoulos, Dimitrios; Constantinidis, Theodoros S; Stavropoulos, Alexios; Stefanis, Costas N

    2003-08-01

    A total of 1,273 conscripts of the Greek Air Force performed antisaccades and completed self-report questionnaires measuring schizotypy and current state-dependent psychopathology. Only 1.0% of the variability in antisaccade performance indices was related to psychometric scores in the population, and this was attributable more to current state-dependent symptoms such as anxiety than to schizotypy. In contrast, a specific increase in error rate and response latency variability, and a high correlation between these 2 variables, were observed in a group with very high schizotypy scores. This effect was independent of anxiety and depression, suggesting that a specific group of psychosis-prone individuals has a characteristic deviance in antisaccade performance that is not present in the general population.

  14. A probabilistic approach to remote compositional analysis of planetary surfaces

    USGS Publications Warehouse

    Lapotre, Mathieu G.A.; Ehlmann, Bethany L.; Minson, Sarah E.

    2017-01-01

    Reflected light from planetary surfaces provides information, including mineral/ice compositions and grain sizes, by study of albedo and absorption features as a function of wavelength. However, deconvolving the compositional signal in spectra is complicated by the nonuniqueness of the inverse problem. Trade-offs between mineral abundances and grain sizes in setting reflectance, instrument noise, and systematic errors in the forward model are potential sources of uncertainty, which are often unquantified. Here we adopt a Bayesian implementation of the Hapke model to determine sets of acceptable-fit mineral assemblages, as opposed to single best fit solutions. We quantify errors and uncertainties in mineral abundances and grain sizes that arise from instrument noise, compositional end members, optical constants, and systematic forward model errors for two suites of ternary mixtures (olivine-enstatite-anorthite and olivine-nontronite-basaltic glass) in a series of six experiments in the visible-shortwave infrared (VSWIR) wavelength range. We show that grain sizes are generally poorly constrained from VSWIR spectroscopy. Abundance and grain size trade-offs lead to typical abundance errors of ≤1 wt % (occasionally up to ~5 wt %), while ~3% noise in the data increases errors by up to ~2 wt %. Systematic errors further increase inaccuracies by a factor of 4. Finally, phases with low spectral contrast or inaccurate optical constants can further increase errors. Overall, typical errors in abundance are <10%, but sometimes significantly increase for specific mixtures, prone to abundance/grain-size trade-offs that lead to high unmixing uncertainties. These results highlight the need for probabilistic approaches to remote determination of planetary surface composition.
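
    The shift from single best-fit solutions to sets of acceptable fits is essentially posterior sampling. The sketch below illustrates the idea with a Metropolis sampler on a toy linear mixing model; the actual study inverts the nonlinear Hapke model, so the end members, noise level, and proposal scale here are purely illustrative.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy end-member "spectra" and a noisy observed mixture. Linear mixing
        # is used here only to illustrate sampling sets of acceptable fits.
        E = rng.uniform(0.2, 0.8, size=(3, 50))        # 3 end members, 50 bands
        true_w = np.array([0.5, 0.3, 0.2])
        data = true_w @ E + rng.normal(0, 0.01, 50)

        def loglike(w):
            resid = data - w @ E
            return -0.5 * np.sum(resid ** 2) / 0.01 ** 2

        # Random-walk Metropolis over abundances constrained to the simplex:
        # the output is a *set* of acceptable assemblages, not one best fit.
        w = np.ones(3) / 3
        samples = []
        for _ in range(20000):
            prop = np.abs(w + rng.normal(0, 0.02, 3))
            prop /= prop.sum()
            if np.log(rng.uniform()) < loglike(prop) - loglike(w):
                w = prop
            samples.append(w)
        samples = np.array(samples[5000:])             # drop burn-in
        print("posterior mean abundances:", samples.mean(axis=0).round(3))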

  15. Multi-temporal change image inference towards false alarms reduction for an operational photogrammetric rockfall detection system

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Kallimani, Christina; Tripolitsiotis, Achilleas

    2015-06-01

    Rockfall incidents affect civil security and hamper the sustainable growth of hard-to-access mountainous areas due to casualties, injuries and infrastructure loss. Rockfall occurrences cannot be easily prevented, and previous studies on multi-sensor early detection systems for rockfall have focused on large-scale incidents. However, even a single rock may cause the loss of a human life along transportation routes; it is therefore highly important to establish methods for the early detection of small-scale rockfall incidents. Terrestrial photogrammetric techniques are prone to a series of errors leading to false alarms, including vegetation, wind, and irrelevant change in the scene under consideration. In this study, photogrammetric monitoring of rockfall-prone slopes is established and the resulting multi-temporal change imagery is processed in order to minimize false alarms. Remote sensing imagery analysis techniques are integrated to enhance early detection of a rockfall. Experimental data demonstrated that an operational system able to identify a 10-cm rock movement within a 10% false alarm rate is technically feasible.
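
    A minimal version of the change-image processing stage, for illustration only: difference two co-registered frames, threshold, and discard small connected components that typically correspond to vegetation movement or sensor noise rather than rockfall. The thresholds are assumed values, not those of the operational system.

        import numpy as np
        from scipy import ndimage

        def detect_change(img_prev, img_curr, diff_thresh=25, min_pixels=40):
            """Flag image regions that changed between two co-registered frames,
            discarding small blobs (vegetation flicker, noise) that would
            otherwise trigger false alarms. Thresholds are illustrative."""
            diff = np.abs(img_curr.astype(float) - img_prev.astype(float))
            mask = diff > diff_thresh                   # per-pixel change mask
            labels, n = ndimage.label(mask)             # connected components
            sizes = ndimage.sum(mask, labels, range(1, n + 1))
            keep = np.isin(labels, np.flatnonzero(sizes >= min_pixels) + 1)
            return keep                                 # candidate rockfall regions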

  16. A scalable method to improve gray matter segmentation at ultra high field MRI.

    PubMed

    Gulban, Omer Faruk; Schneider, Marian; Marquardt, Ingo; Haast, Roy A M; De Martino, Federico

    2018-01-01

    High-resolution (functional) magnetic resonance imaging (MRI) at ultra high magnetic fields (7 Tesla and above) enables researchers to study how anatomical and functional properties change within the cortical ribbon, along surfaces and across cortical depths. These studies require an accurate delineation of the gray matter ribbon, which often suffers from inclusion of blood vessels, dura mater and other non-brain tissue. Residual segmentation errors are commonly corrected by browsing the data slice-by-slice and manually changing labels. This task becomes increasingly laborious and prone to error at higher resolutions since both work and error scale with the number of voxels. Here we show that many mislabeled, non-brain voxels can be corrected more efficiently and semi-automatically by representing three-dimensional anatomical images using two-dimensional histograms. We propose both a uni-modal (based on first spatial derivative) and multi-modal (based on compositional data analysis) approach to this representation and quantify the benefits in 7 Tesla MRI data of nine volunteers. We present an openly accessible Python implementation of these approaches and demonstrate that editing cortical segmentations using two-dimensional histogram representations as an additional post-processing step aids existing algorithms and yields improved gray matter borders. By making our data and corresponding expert (ground truth) segmentations openly available, we facilitate future efforts to develop and test segmentation algorithms on this challenging type of data.
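
    The uni-modal representation described above pairs each voxel's intensity with its gradient magnitude (the first spatial derivative), so that selecting bins in the resulting plane selects all voxels mapping to them at once. A short numpy sketch of how such a two-dimensional histogram might be built; the random volume stands in for a real anatomical image.

        import numpy as np

        def intensity_gradient_histogram(volume, bins=200):
            """2D histogram of voxel intensity vs. gradient magnitude. Selecting
            bins in this plane selects all voxels that map to them, so many
            mislabeled non-brain voxels can be corrected in bulk instead of
            slice-by-slice."""
            gx, gy, gz = np.gradient(volume.astype(float))
            gmag = np.sqrt(gx**2 + gy**2 + gz**2)
            hist, i_edges, g_edges = np.histogram2d(
                volume.ravel(), gmag.ravel(), bins=bins)
            return hist, i_edges, g_edges

        # Example on a random volume standing in for a 7 T anatomical image.
        vol = np.random.default_rng(3).normal(size=(64, 64, 64))
        hist, _, _ = intensity_gradient_histogram(vol)
        print(hist.shape)  # (200, 200)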

  17. A scalable method to improve gray matter segmentation at ultra high field MRI

    PubMed Central

    De Martino, Federico

    2018-01-01

    High-resolution (functional) magnetic resonance imaging (MRI) at ultra high magnetic fields (7 Tesla and above) enables researchers to study how anatomical and functional properties change within the cortical ribbon, along surfaces and across cortical depths. These studies require an accurate delineation of the gray matter ribbon, which often suffers from inclusion of blood vessels, dura mater and other non-brain tissue. Residual segmentation errors are commonly corrected by browsing the data slice-by-slice and manually changing labels. This task becomes increasingly laborious and prone to error at higher resolutions since both work and error scale with the number of voxels. Here we show that many mislabeled, non-brain voxels can be corrected more efficiently and semi-automatically by representing three-dimensional anatomical images using two-dimensional histograms. We propose both a uni-modal (based on first spatial derivative) and multi-modal (based on compositional data analysis) approach to this representation and quantify the benefits in 7 Tesla MRI data of nine volunteers. We present an openly accessible Python implementation of these approaches and demonstrate that editing cortical segmentations using two-dimensional histogram representations as an additional post-processing step aids existing algorithms and yields improved gray matter borders. By making our data and corresponding expert (ground truth) segmentations openly available, we facilitate future efforts to develop and test segmentation algorithms on this challenging type of data. PMID:29874295

  18. Interior Reconstruction Using the 3d Hough Transform

    NASA Astrophysics Data System (ADS)

    Dumitru, R.-C.; Borrmann, D.; Nüchter, A.

    2013-02-01

    Laser scanners are often used to create accurate 3D models of buildings for civil engineering purposes, but the process of manually vectorizing a 3D point cloud is time consuming and error-prone (Adan and Huber, 2011). Therefore, the need arises to characterize and quantify complex environments in an automatic fashion, posing challenges for data analysis. This paper presents a system for 3D modeling that detects planes in 3D point clouds, based on which the scene is reconstructed at a high architectural level by automatically removing clutter and foreground data. The implemented software detects openings, such as windows and doors, and completes the 3D model by inpainting.
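
    The plane-detection core of a 3D Hough transform is compact: every point votes for all discretised plane parameters (theta, phi, rho) consistent with it, and accumulator peaks are candidate walls, floors, and ceilings. The sketch below uses a deliberately coarse discretisation for illustration and is not the paper's implementation.

        import numpy as np

        def hough_planes(points, n_theta=30, n_phi=60, rho_step=0.1):
            """Accumulate votes for plane parameters (theta, phi, rho), where a
            plane is n . p = rho with unit normal n given by spherical angles.
            Peaks in the accumulator correspond to dominant planes."""
            thetas = np.linspace(0, np.pi, n_theta)
            phis = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
            t, p = np.meshgrid(thetas, phis, indexing="ij")
            normals = np.stack([np.sin(t) * np.cos(p),
                                np.sin(t) * np.sin(p),
                                np.cos(t)], axis=-1)      # (n_theta, n_phi, 3)
            rho = points @ normals.reshape(-1, 3).T       # (n_points, n_cells)
            rho_idx = np.round(rho / rho_step).astype(int)
            rho_idx -= rho_idx.min()
            acc = np.zeros((normals.shape[0] * normals.shape[1],
                            rho_idx.max() + 1))
            np.add.at(acc,
                      (np.tile(np.arange(acc.shape[0]), (points.shape[0], 1)),
                       rho_idx), 1)
            return acc  # argmax over acc gives the strongest plane hypothesis

        # Noisy synthetic plane z = 0.5: its accumulator cell collects most votes.
        rng = np.random.default_rng(6)
        pts = np.column_stack([rng.uniform(-1, 1, (400, 2)), np.full(400, 0.5)])
        acc = hough_planes(pts + rng.normal(0, 0.005, pts.shape))
        print(acc.max())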

  19. Toward computer-aided emphysema quantification on ultralow-dose CT: reproducibility of ventrodorsal gravity effect measurement and correction

    NASA Astrophysics Data System (ADS)

    Wiemker, Rafael; Opfer, Roland; Bülow, Thomas; Rogalla, Patrik; Steinberg, Amnon; Dharaiya, Ekta; Subramanyan, Krishna

    2007-03-01

    Computer aided quantification of emphysema in high resolution CT data is based on identifying low attenuation areas below clinically determined Hounsfield thresholds. However, the emphysema quantification is prone to error since a gravity effect can influence the mean attenuation of healthy lung parenchyma up to +/- 50 HU between ventral and dorsal lung areas. Comparing ultra-low-dose (7 mAs) and standard-dose (70 mAs) CT scans of each patient we show that measurement of the ventrodorsal gravity effect is patient specific but reproducible. It can be measured and corrected in an unsupervised way using robust fitting of a linear function.
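
    Measuring and removing the ventrodorsal trend reduces, in essence, to a robust linear fit of parenchymal attenuation against the ventrodorsal coordinate. A sketch under assumed units follows; the reweighting scheme is a generic robust-fitting device, not the authors' exact procedure.

        import numpy as np

        def ventrodorsal_correction(hu, y, n_iter=5):
            """Robustly fit mean lung attenuation (HU) as a linear function of
            the ventrodorsal coordinate y and return detrended values. Iterative
            reweighting down-weights outliers (vessels, emphysema clusters)."""
            w = np.ones_like(hu, dtype=float)
            for _ in range(n_iter):
                A = np.stack([y, np.ones_like(y)], axis=1)
                coef, *_ = np.linalg.lstsq(A * w[:, None], hu * w, rcond=None)
                resid = hu - A @ coef
                scale = np.median(np.abs(resid)) + 1e-9
                w = 1.0 / (1.0 + (resid / (3 * scale)) ** 2)  # soft outlier rejection
            return hu - y * coef[0], coef[0]   # corrected HU, HU-per-unit-y slope

        # Synthetic parenchyma samples with a ~50 HU ventral-to-dorsal trend.
        rng = np.random.default_rng(7)
        y = rng.uniform(0, 1, 2000)                  # normalised ventrodorsal position
        hu = -870 + 50 * y + rng.normal(0, 15, 2000)
        corrected, slope = ventrodorsal_correction(hu, y)
        print(round(slope, 1))                       # recovers roughly 50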

  20. Optical coherence refractometry.

    PubMed

    Tomlins, Peter H; Woolliams, Peter; Hart, Christian; Beaumont, Andrew; Tedaldi, Matthew

    2008-10-01

    We introduce a novel approach to refractometry using a low coherence interferometer at multiple angles of incidence. We show that for plane parallel samples it is possible to measure their phase refractive index rather than the group index that is usually measured by interferometric methods. This is a significant development because it enables bulk refractive index measurement of scattering and soft samples, not relying on surface measurements that can be prone to error. Our technique is also noncontact and compatible with in situ refractive index measurements. Here, we demonstrate this new technique on a pure silica test piece and a highly scattering resin slab, comparing the results with standard critical angle refractometry.

  1. Integration of multi-sensor data to measure soil surface changes

    NASA Astrophysics Data System (ADS)

    Eltner, Anette; Schneider, Danilo

    2016-04-01

    Digital elevation models (DEM) of high resolution and accuracy covering a suitably sized area of interest can be a promising approach to help understand the processes of soil erosion. Ideally, the plot under investigation should remain undisturbed. The fragile marl landscape in Andalusia (Spain) is especially prone to soil detachment and transport, with unique sediment connectivity characteristics due to the soil properties and climatic conditions. A 600 m² field plot was established and monitored during three field campaigns (Sep. 2013, Nov. 2013 and Feb. 2014). Unmanned aerial vehicle (UAV) photogrammetry and terrestrial laser scanning (TLS) are suitable tools to generate high-resolution topographic data that describe soil surface changes at large field plots, and the advantages of both methods are utilised in a synergetic manner. On the one hand, TLS data are assumed to show more consistent error behaviour than DEMs derived from overlapping UAV images. Therefore, global errors (e.g. dome effect) and local errors (e.g. DEM blunders due to erroneous image matching) within the UAV data are assessed with the DEMs produced by TLS. Furthermore, TLS point clouds allow for fast and reliable filtering of vegetation spots, which is not as straightforward within the UAV data due to known image matching problems in areas displaying plant cover. On the other hand, systematic DEM errors linked to TLS are detected and possibly corrected utilising the DEMs reconstructed from overlapping UAV images. Furthermore, TLS point clouds are filtered according to point quality, which is estimated from parameters of the scan geometry (i.e. incidence angle and footprint size). This is especially relevant for this study because the area of interest is located on gentle hillslopes that are prone to soil erosion. Thus, the view of the scanning device onto the surface results in an adverse angle, which is only slightly improved by the use of a 4 m high tripod. Surface roughness is considered as a further parameter to evaluate TLS point quality. The filtering tool allows each data point to be chosen either from the TLS or the UAV data according to the data acquisition geometry and surface properties. The filtered points are merged into one point cloud, which is finally processed to reduce remaining data noise. DEM analysis reveals a continuous decrease of soil surface roughness after tillage, the reappearance of former wheel tracks, and local patterns of erosion as well as accumulation.

  2. Jumping to conclusions and the continuum of delusional beliefs.

    PubMed

    Warman, Debbie M; Lysaker, Paul H; Martin, Joel M; Davis, Louanne; Haudenschield, Samantha L

    2007-06-01

    The present study examined the jumping to conclusions reasoning bias across the continuum of delusional ideation by investigating individuals with active delusions, delusion-prone individuals, and non-delusion-prone individuals. Neutral and highly self-referent probabilistic reasoning tasks were employed. Results indicated that individuals with delusions gathered significantly less information than delusion-prone and non-delusion-prone participants on both the neutral and self-referent tasks (p<.001). Individuals with delusions made less accurate decisions than the delusion-prone and non-delusion-prone participants on both tasks (p<.001), yet were more confident about their decisions than were delusion-prone and non-delusion-prone participants on the self-referent task (p=.002). Those with delusions and those who were delusion prone reported higher confidence in their performance on the self-referent task than on the neutral task (p=.02), indicating that high self-reference impacted information processing for individuals in both of these groups. The results are discussed in relation to previous research in the area of probabilistic reasoning and delusions.

  3. Spatial and temporal variability of the overall error of National Atmospheric Deposition Program measurements determined by the USGS collocated-sampler program, water years 1989-2001

    USGS Publications Warehouse

    Wetherbee, G.A.; Latysh, N.E.; Gordon, J.D.

    2005-01-01

    Data from the U.S. Geological Survey (USGS) collocated-sampler program for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) are used to estimate the overall error of NADP/NTN measurements. Absolute errors are estimated by comparison of paired measurements from collocated instruments. Spatial and temporal differences in absolute error were identified and are consistent with longitudinal distributions of NADP/NTN measurements and spatial differences in precipitation characteristics. The magnitude of error for calcium, magnesium, ammonium, nitrate, and sulfate concentrations, specific conductance, and sample volume is of minor environmental significance to data users. Data collected after a 1994 sample-handling protocol change are prone to less absolute error than data collected prior to 1994. Absolute errors are smaller during non-winter months than during winter months for selected constituents at sites where frozen precipitation is common. Minimum resolvable differences are estimated for different regions of the USA to aid spatial and temporal watershed analyses.

  4. Self-Interaction Error in Density Functional Theory: An Appraisal.

    PubMed

    Bao, Junwei Lucas; Gagliardi, Laura; Truhlar, Donald G

    2018-05-03

    Self-interaction error (SIE) is considered to be one of the major sources of error in most approximate exchange-correlation functionals for Kohn-Sham density-functional theory (KS-DFT), and it is large with all local exchange-correlation functionals and with some hybrid functionals. In this work, we consider systems conventionally considered to be dominated by SIE. For these systems, we demonstrate that by using multiconfiguration pair-density functional theory (MC-PDFT), the error of a translated local density-functional approximation is significantly reduced (by a factor of 3) when using an MCSCF density and on-top density, as compared to using KS-DFT with the parent functional; the error in MC-PDFT with local on-top functionals is even lower than the error in some popular KS-DFT hybrid functionals. Density-functional theory, either in MC-PDFT form with local on-top functionals or in KS-DFT form with some functionals having 50% or more nonlocal exchange, has smaller errors for SIE-prone systems than does CASSCF, which has no SIE.

  5. The Diagnosis of Error in Histories of Science

    NASA Astrophysics Data System (ADS)

    Thomas, William

    Whether and how to diagnose error in the history of science is a contentious issue. For many scientists, diagnosis is appealing because it allows them to discuss how knowledge can progress most effectively. Many historians disagree. They consider diagnosis inappropriate because it may discard features of past actors' thought that are important to understanding it, and may have even been intellectually productive. Ironically, these historians are apt to diagnose flaws in scientists' histories as proceeding from a misguided desire to idealize scientific method, and from their attendant identification of deviations from the ideal as, ipso facto, a paramount source of error in historical science. While both views have some merit, they should be reconciled if a more harmonious and productive relationship between the disciplines is to prevail. In To Explain the World, Steven Weinberg narrates the slow but definite emergence of what we call science from long traditions of philosophical and mathematical thought. This narrative follows in a historiographical tradition charted by historians such as Alexandre Koyre and Rupert Hall about sixty years ago. It is essentially a history of the emergence of reliable (if fallible) scientific method from more error-prone thought. While some historians such as Steven Shapin view narratives of this type as fundamentally error-prone, I do not view such projects as a priori illegitimate. They are, however, perhaps more difficult than Weinberg supposes. In this presentation, I will focus on two of Weinberg's strong historical claims: that physics became detached from religion as early as the beginning of the eighteenth century, and that physics proved an effective model for placing other fields on scientific grounds. While I disagree with these claims, they represent at most an overestimation of vintage science's interest in discarding theological questions, and an overestimation of that science's ability to function at all reliably.

  6. Surface driven biomechanical breast image registration

    NASA Astrophysics Data System (ADS)

    Eiben, Björn; Vavourakis, Vasileios; Hipwell, John H.; Kabus, Sven; Lorenz, Cristian; Buelow, Thomas; Williams, Norman R.; Keshtgar, M.; Hawkes, David J.

    2016-03-01

    Biomechanical modelling enables large deformation simulations of breast tissues under different loading conditions to be performed. Such simulations can be utilised to transform prone Magnetic Resonance (MR) images into a different patient position, such as upright or supine. We present a novel integration of biomechanical modelling with a surface registration algorithm which optimises the unknown material parameters of a biomechanical model and performs a subsequent regularised surface alignment. This allows deformations induced by effects other than gravity, such as those due to contact of the breast and MR coil, to be reversed. Correction displacements are applied to the biomechanical model enabling transformation of the original pre-surgical images to the corresponding target position. The algorithm is evaluated for the prone-to-supine case using prone MR images and the skin outline of supine Computed Tomography (CT) scans for three patients. A mean target registration error (TRE) of 10.9 mm for internal structures is achieved. For the prone-to-upright scenario, an optical 3D surface scan of one patient is used as a registration target; the nipple distances after alignment between the transformed MRI and the surface are 10.1 mm and 6.3 mm, respectively.

  7. Comparison of exercises inducing maximum voluntary isometric contraction for the latissimus dorsi using surface electromyography.

    PubMed

    Park, Se-yeon; Yoo, Won-gyu

    2013-10-01

    The aim of this study was to compare muscular activation during five different normalization techniques that induced maximal isometric contraction of the latissimus dorsi. Sixteen healthy men participated in the study. Each participant performed three repetitions each of five types of isometric exertion: (1) conventional shoulder extension in the prone position, (2) caudal shoulder depression in the prone position, (3) body lifting with shoulder depression in the seated position, (4) trunk bending to the right in the lateral decubitus position, and (5) downward bar pulling in the seated position. In most participants, maximal activation of the latissimus dorsi was observed during conventional shoulder extension in the prone position; the percentage of maximal voluntary contraction was significantly greater for this exercise than for all other normalization techniques except downward bar pulling in the seated position. Although differences in electrode placement among various electromyographic studies represent a limitation, normalization techniques for the latissimus dorsi are recommended to minimize error in assessing maximal muscular activation of the latissimus dorsi through the combined use of shoulder extension in the prone position and downward pulling. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Impairment in flexible emotion-based learning in hallucination- and delusion-prone individuals.

    PubMed

    Cella, Matteo; Dymond, Simon; Cooper, Andrew

    2009-11-30

    Deficits in emotion-based learning are implicated in many psychiatric disorders. Research conducted with patients with schizophrenia using one of the most popular tasks for the investigation of emotion-based learning, the Iowa Gambling Task (IGT), has largely been inconclusive. The present study employed a novel, contingency-shifting variant IGT with hallucination- and delusion-prone university students to determine whether previous findings were due merely to the presence of psychosis. Following initial screening of a sample of 253 students (mean age = 20.13 years, S.D. = 3.27), 28 high (10 male, 18 female) and 27 low (12 male, 15 female) hallucination-prone and 27 high (7 male, 20 female) and 26 low (11 male, 15 female) delusion-prone individuals completed the contingency-shifting variant IGT. Results showed no significant differences between the performances of high and low hallucination- and delusion-prone individuals during the original phase of the task. Differences only emerged following the onset of the contingency-shift phases, with individuals high in hallucination- and delusion-proneness having impaired performance compared with low hallucination- and delusion-prone individuals. Overall, the present findings demonstrate that impairments associated with hallucination- and delusion-proneness are specific to the shift phase of the contingency-shifting variant IGT, which supports previous findings with patients with schizophrenia.

  9. Cognitive fallacies and criminal investigations.

    PubMed

    Ditrich, Hans

    2015-03-01

    The human mind is susceptible to inherent fallacies that often hamper fully rational action. Many such misconceptions have an evolutionary background and are thus difficult to avert. Deficits in the reliability of eye-witnesses are well known to legal professionals; however, less attention has been paid to such effects in crime investigators. In order to obtain an "inside view" on the role of cognitive misconceptions in criminalistic work, a list of fallacies from the literature was adapted to criminalistic settings. The statements on this list were rated by highly experienced crime scene investigators according to the assumed likelihood of these errors to appear and their severity of effect. Among others, selective perception, expectation and confirmation bias, anchoring/"pars per toto" errors and "onus probandi"--shifting the burden of proof from the investigator to the suspect--were frequently considered to negatively affect criminal investigations. As a consequence, the following measures are proposed: alerting investigating officers in their training to cognitive fallacies and promoting the exchange of experiences in peer circles of investigators on a regular basis. Furthermore, the improvement of the organizational error culture and the establishment of a failure analysis system in order to identify and alleviate error prone processes are suggested. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  10. Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error

    PubMed Central

    Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee

    2017-01-01

    Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to the errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146

  11. Dual processing and diagnostic errors.

    PubMed

    Norman, Geoff

    2009-09-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these examples remain available and retrievable individually. I then review studies of clinical reasoning based on these theories, and show that the two processes are equally effective; System 1, despite its reliance on idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions directed at encouraging the clinician to explicitly use both strategies can lead to consistent reductions in error rates.

  12. Classification-Based Spatial Error Concealment for Visual Communications

    NASA Astrophysics Data System (ADS)

    Chen, Meng; Zheng, Yefeng; Wu, Min

    2006-12-01

    In an error-prone transmission environment, error concealment is an effective technique to reconstruct damaged visual content. Due to large variations in image characteristics, different concealment approaches are necessary to accommodate the different nature of the lost image content. In this paper, we address this issue and propose using classification to integrate the state-of-the-art error concealment techniques. The proposed approach takes advantage of multiple concealment algorithms and adaptively selects the suitable algorithm for each damaged image area. With growing awareness that the design of sender and receiver systems should be jointly considered for efficient and reliable multimedia communications, we propose a set of classification-based block concealment schemes, including receiver-side classification, sender-side attachment, and sender-side embedding. Our experimental results provide extensive performance comparisons and demonstrate that the proposed classification-based error concealment approaches outperform the conventional approaches.

  13. A water-vapor radiometer error model. [for ionosphere in geodetic microwave techniques

    NASA Technical Reports Server (NTRS)

    Beckman, B.

    1985-01-01

    The water-vapor radiometer (WVR) is used to calibrate unpredictable delays in the wet component of the troposphere in geodetic microwave techniques such as very-long-baseline interferometry (VLBI) and Global Positioning System (GPS) tracking. Based on experience with Jet Propulsion Laboratory (JPL) instruments, the current level of accuracy in wet-troposphere calibration limits the accuracy of local vertical measurements to 5-10 cm. The goal for the near future is 1-3 cm. Although the WVR is currently the best calibration method, many instruments are prone to systematic error. In this paper, a treatment of WVR data is proposed and evaluated. This treatment reduces the effect of WVR systematic errors by estimating parameters that specify an assumed functional form for the error. The assumed form of the treatment is evaluated by comparing the results of two similar WVR's operating near each other. Finally, the observability of the error parameters is estimated by covariance analysis.

  14. Utility of eButton images for identifying food preparation behaviors and meal-related tasks in adolescents

    USDA-ARS?s Scientific Manuscript database

    Food preparation skills may encourage healthy eating. Traditional assessment of child food preparation employs self- or parent proxy-reporting methods, which are prone to error. The eButton is a wearable all-day camera that has promise as an objective, passive method for measuring child food prepara...

  15. Understanding Clinician Information Demands and Synthesis of Clinical Documents in Electronic Health Record Systems

    ERIC Educational Resources Information Center

    Farri, Oladimeji Feyisetan

    2012-01-01

    Large quantities of redundant clinical data are usually transferred from one clinical document to another, making the review of such documents cognitively burdensome and potentially error-prone. Inadequate designs of electronic health record (EHR) clinical document user interfaces probably contribute to the difficulties clinicians experience while…

  16. Finite element modeling of light propagation in turbid media under illumination of a continuous-wave beam

    USDA-ARS?s Scientific Manuscript database

    Spatially-resolved spectroscopy provides a means for measuring the optical properties of biological tissues, based on analytical solutions to diffusion approximation for semi-infinite media under the normal illumination of infinitely small size light beam. The method is, however, prone to error in m...

  17. ATS-PD: An Adaptive Testing System for Psychological Disorders

    ERIC Educational Resources Information Center

    Donadello, Ivan; Spoto, Andrea; Sambo, Francesco; Badaloni, Silvana; Granziol, Umberto; Vidotto, Giulio

    2017-01-01

    The clinical assessment of mental disorders can be a time-consuming and error-prone procedure, consisting of a sequence of diagnostic hypothesis formulation and testing aimed at restricting the set of plausible diagnoses for the patient. In this article, we propose a novel computerized system for the adaptive testing of psychological disorders.…

  18. Towards New Multiplatform Hybrid Online Laboratory Models

    ERIC Educational Resources Information Center

    Rodriguez-Gil, Luis; García-Zubia, Javier; Orduña, Pablo; López-de-Ipiña, Diego

    2017-01-01

    Online laboratories have traditionally been split between virtual labs, with simulated components; and remote labs, with real components. The former tend to provide less realism but to be easily scalable and less expensive to maintain, while the latter are fully real but tend to require a higher maintenance effort and be more error-prone. This…

  19. A Practical Teaching Course in Directed Protein Evolution Using the Green Fluorescent Protein as a Model

    ERIC Educational Resources Information Center

    Ruller, Roberto; Silva-Rocha, Rafael; Silva, Artur; Schneider, Maria Paula Cruz; Ward, Richard John

    2011-01-01

    Protein engineering is a powerful tool, which correlates protein structure with specific functions, both in applied biotechnology and in basic research. Here, we present a practical teaching course for engineering the green fluorescent protein (GFP) from "Aequorea victoria" by a random mutagenesis strategy using error-prone polymerase…

  20. Accuracy of an IFSAR-derived digital terrain model under a conifer forest canopy.

    Treesearch

    Hans-Erik Andersen; Stephen E. Reutebuch; Robert J. McGaughey

    2005-01-01

    Accurate digital terrain models (DTMs) are necessary for a variety of forest resource management applications, including watershed management, timber harvest planning, and fire management. Traditional methods for acquiring topographic data typically rely on aerial photogrammetry, where measurement of the terrain surface below forest canopy is difficult and error prone...

  1. Antigenic Variation in the Lyme Spirochete: Insights into Recombinational Switching with a Suggested Role for Error-Prone Repair.

    PubMed

    Verhey, Theodore B; Castellanos, Mildred; Chaconas, George

    2018-05-29

    The Lyme disease spirochete, Borrelia burgdorferi, uses antigenic variation as a strategy to evade the host's acquired immune response. New variants of surface-localized VlsE are generated efficiently by unidirectional recombination from 15 unexpressed vls cassettes into the vlsE locus. Using algorithms to analyze switching from vlsE sequencing data, we characterize a population of over 45,000 inferred recombination events generated during mouse infection. We present evidence for clustering of these recombination events within the population and along the vlsE gene, a role for the direct repeats flanking the variable region in vlsE, and the importance of sequence homology in determining the location of recombination, despite RecA's dispensability. Finally, we report that non-templated sequence variation is strongly associated with recombinational switching and occurs predominantly at the 5' end of conversion tracts. This likely results from an error-prone repair mechanism operational during recombinational switching that elevates the mutation rate > 5,000-fold in switched regions. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  2. Computationally mapping sequence space to understand evolutionary protein engineering.

    PubMed

    Armstrong, Kathryn A; Tidor, Bruce

    2008-01-01

    Evolutionary protein engineering has been dramatically successful, producing a wide variety of new proteins with altered stability, binding affinity, and enzymatic activity. However, the success of such procedures is often unreliable, and the impact of the choice of protein, engineering goal, and evolutionary procedure is not well understood. We have created a framework for understanding aspects of the protein engineering process by computationally mapping regions of feasible sequence space for three small proteins using structure-based design protocols. We then tested the ability of different evolutionary search strategies to explore these sequence spaces. The results point to a non-intuitive relationship between the error-prone PCR mutation rate and the number of rounds of replication. The evolutionary relationships among feasible sequences reveal hub-like sequences that serve as particularly fruitful starting sequences for evolutionary search. Moreover, genetic recombination procedures were examined, and tradeoffs relating sequence diversity and search efficiency were identified. This framework allows us to consider the impact of protein structure on the allowed sequence space and therefore on the challenges that each protein presents to error-prone PCR and genetic recombination procedures.
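
    The mutation-rate/rounds interplay can be reproduced qualitatively with a toy simulation: error-prone replication of bit-string "sequences" followed by selection, swept over mutation rates. Everything below (fitness as similarity to an arbitrary target, population size, rates) is an invented simplification, far smoother than the structure-based sequence spaces the paper maps.

        import numpy as np

        rng = np.random.default_rng(4)
        TARGET = rng.integers(0, 2, 100)   # toy "feasible" sequence to be found

        def evolve(mu, rounds, pop=200):
            """Toy directed-evolution run: error-prone replication at per-site
            mutation rate mu, then truncation selection of the best variant."""
            parent = rng.integers(0, 2, TARGET.size)
            for _ in range(rounds):
                offspring = np.tile(parent, (pop, 1))
                flips = rng.random(offspring.shape) < mu
                offspring[flips] ^= 1
                fitness = (offspring == TARGET).sum(axis=1)
                parent = offspring[fitness.argmax()]
            return (parent == TARGET).mean()

        # Sweep the mutation rate for a fixed number of rounds: too low explores
        # slowly, too high destroys good variants, mirroring the trade-off above.
        for mu in (0.001, 0.01, 0.05):
            print(mu, round(float(np.mean([evolve(mu, 10) for _ in range(5)])), 3))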

  3. Minimal Contribution of APOBEC3-Induced G-to-A Hypermutation to HIV-1 Recombination and Genetic Variation

    PubMed Central

    Nikolaitchik, Olga A.; Burdick, Ryan C.; Gorelick, Robert J.; Keele, Brandon F.; Hu, Wei-Shau; Pathak, Vinay K.

    2016-01-01

    Although the predominant effect of host restriction APOBEC3 proteins on HIV-1 infection is to block viral replication, they might inadvertently increase retroviral genetic variation by inducing G-to-A hypermutation. Numerous studies have disagreed on the contribution of hypermutation to viral genetic diversity and evolution. Confounding factors contributing to the debate include the extent of lethal (stop codon) and sublethal hypermutation induced by different APOBEC3 proteins, the inability to distinguish between G-to-A mutations induced by APOBEC3 proteins and error-prone viral replication, the potential impact of hypermutation on the frequency of retroviral recombination, and the extent to which viral recombination occurs in vivo, which can reassort mutations in hypermutated genomes. Here, we determined the effects of hypermutation on the HIV-1 recombination rate and its contribution to genetic variation through recombination to generate progeny genomes containing portions of hypermutated genomes without lethal mutations. We found that hypermutation did not significantly affect the rate of recombination, and recombination between hypermutated and wild-type genomes only increased the viral mutation rate by 3.9 × 10−5 mutations/bp/replication cycle in heterozygous virions, which is similar to the HIV-1 mutation rate. Since copackaging of hypermutated and wild-type genomes occurs very rarely in vivo, recombination between hypermutated and wild-type genomes does not significantly contribute to the genetic variation of replicating HIV-1. We also analyzed previously reported hypermutated sequences from infected patients and determined that the frequency of sublethal mutagenesis for A3G and A3F is negligible (4 × 10−21 and 1 × 10−11, respectively) and its contribution to viral mutations is far below mutations generated during error-prone reverse transcription. Taken together, we conclude that the contribution of APOBEC3-induced hypermutation to HIV-1 genetic variation is substantially lower than that from mutations during error-prone replication. PMID:27186986

  4. A design of experiments approach to validation sampling for logistic regression modeling with error-prone medical records.

    PubMed

    Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay

    2016-04-01

    Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23 041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
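
    The D-optimality criterion selects validation cases whose predictor vectors maximise the determinant of the logistic-regression Fisher information. A greedy sketch of that selection rule is shown below; it is not the authors' exact DSCVR algorithm, and the pilot coefficients and ridge term are assumptions added to make the sketch self-contained.

        import numpy as np

        def greedy_d_optimal(X, beta, n_validate):
            """Greedily pick records to chart-review so that the determinant of
            the logistic-regression Fisher information X'WX over the chosen set
            is maximised. beta is a pilot coefficient estimate; W holds the
            logistic variances p(1-p)."""
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            w = p * (1 - p)
            chosen = []
            M = 1e-6 * np.eye(X.shape[1])        # small ridge keeps det > 0
            for _ in range(n_validate):
                idx = [i for i in range(len(X)) if i not in chosen]
                gains = [np.linalg.slogdet(M + w[i] * np.outer(X[i], X[i]))[1]
                         for i in idx]
                best = idx[int(np.argmax(gains))]
                chosen.append(best)
                M += w[best] * np.outer(X[best], X[best])
            return chosen

        # Pick 5 records out of 500 hypothetical patients with 4 predictors.
        X = np.random.default_rng(5).normal(size=(500, 4))
        print(greedy_d_optimal(X, beta=np.zeros(4), n_validate=5))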

  5. Minimal Contribution of APOBEC3-Induced G-to-A Hypermutation to HIV-1 Recombination and Genetic Variation.

    PubMed

    Delviks-Frankenberry, Krista A; Nikolaitchik, Olga A; Burdick, Ryan C; Gorelick, Robert J; Keele, Brandon F; Hu, Wei-Shau; Pathak, Vinay K

    2016-05-01

    Although the predominant effect of host restriction APOBEC3 proteins on HIV-1 infection is to block viral replication, they might inadvertently increase retroviral genetic variation by inducing G-to-A hypermutation. Numerous studies have disagreed on the contribution of hypermutation to viral genetic diversity and evolution. Confounding factors contributing to the debate include the extent of lethal (stop codon) and sublethal hypermutation induced by different APOBEC3 proteins, the inability to distinguish between G-to-A mutations induced by APOBEC3 proteins and error-prone viral replication, the potential impact of hypermutation on the frequency of retroviral recombination, and the extent to which viral recombination occurs in vivo, which can reassort mutations in hypermutated genomes. Here, we determined the effects of hypermutation on the HIV-1 recombination rate and its contribution to genetic variation through recombination to generate progeny genomes containing portions of hypermutated genomes without lethal mutations. We found that hypermutation did not significantly affect the rate of recombination, and recombination between hypermutated and wild-type genomes only increased the viral mutation rate by 3.9 × 10−5 mutations/bp/replication cycle in heterozygous virions, which is similar to the HIV-1 mutation rate. Since copackaging of hypermutated and wild-type genomes occurs very rarely in vivo, recombination between hypermutated and wild-type genomes does not significantly contribute to the genetic variation of replicating HIV-1. We also analyzed previously reported hypermutated sequences from infected patients and determined that the frequency of sublethal mutagenesis for A3G and A3F is negligible (4 × 10−21 and 1 × 10−11, respectively) and its contribution to viral mutations is far below mutations generated during error-prone reverse transcription. Taken together, we conclude that the contribution of APOBEC3-induced hypermutation to HIV-1 genetic variation is substantially lower than that from mutations during error-prone replication.

  6. Implementing Access to Data Distributed on Many Processors

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A reference architecture is defined for an object-oriented implementation of domains, arrays, and distributions written in the programming language Chapel. This technology primarily addresses domains that contain arrays with regular index sets; the low-level implementation details are beyond the scope of this discussion. What is defined is a complete set of object-oriented operators that allows one to perform data distributions for domain arrays involving regular arithmetic index sets. What is unique is that these operators allow arbitrary regions of the arrays to be fragmented and distributed across multiple processors, with a single point of access giving the programmer the illusion that all the elements are collocated on a single processor. Today's massively parallel High Productivity Computing Systems (HPCS) are characterized by a modular structure, with a large number of processing and memory units connected by a high-speed network. Locality of access as well as load balancing are primary concerns in these systems, which are typically used for high-performance scientific computation. Data distributions address these issues by providing a range of methods for spreading large data sets across the components of a system. Over the past two decades, many languages, systems, tools, and libraries have been developed for the support of distributions. Since the performance of data parallel applications is directly influenced by the distribution strategy, users often resort to low-level programming models that allow fine-tuning of the distribution aspects affecting performance but, at the same time, are tedious and error-prone. This technology presents a reusable design of a data-distribution framework for data parallel high-performance applications. Distributions are a means to express locality in systems composed of large numbers of processor and memory components connected by a network. Since distributions have a great effect on the performance of applications, it is important that the distribution strategy is flexible, so its behavior can change depending on the needs of the application. At the same time, high productivity concerns require that the user be shielded from error-prone, tedious details such as communication and synchronization.
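
    The core of a block distribution is a mapping from a global index to an owning processor and a local offset, so that fragmented storage stays behind a single point of access. A toy Python analogue of that mapping follows (the technology itself targets Chapel; the class and its interface are invented for illustration).

        class BlockDistribution:
            """Map a global index range onto num_procs processors in contiguous
            blocks, giving callers a single logical index space. A toy analogue
            of the Chapel-style distributions described above."""
            def __init__(self, n, num_procs):
                self.n = n
                self.block = -(-n // num_procs)       # ceiling division

            def locate(self, i):
                """Return (owning processor, local offset) for global index i."""
                if not 0 <= i < self.n:
                    raise IndexError(i)
                return i // self.block, i % self.block

        dist = BlockDistribution(n=1000, num_procs=8)
        print(dist.locate(0), dist.locate(999))       # (0, 0) (7, 124)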

  7. Commission errors of active intentions: the roles of aging, cognitive load, and practice.

    PubMed

    Boywitt, C Dennis; Rummel, Jan; Meiser, Thorsten

    2015-01-01

    Performing an intended action when it needs to be withheld, for example, when a temporarily prescribed medication is incompatible with the patient's other medication, is referred to as a commission error of prospective memory (PM). While recent research indicates that older adults are especially prone to commission errors for finished intentions, there is a lack of research on the effects of aging on commission errors for still-active intentions. The present research investigates conditions which might contribute to older adults' propensity to perform planned intentions under inappropriate conditions. Specifically, disproportionately higher rates of commission errors for still-active intentions were observed in older than in younger adults with both salient (Experiment 1) and non-salient (Experiment 2) target cues. Practicing the PM task in Experiment 2, however, helped execution of the intended action in terms of higher PM performance at faster ongoing-task response times, but did not increase the rate of commission errors. The results have important implications for the understanding of older adults' PM commission errors and the processes involved in these errors.

  8. The Relationship of Stress Arousal and Stress Prone Personality Traits to Menstrual Distress.

    ERIC Educational Resources Information Center

    Marini, David C.

    The various relationships of stress arousal and stress-prone personality traits to menstrual distress were investigated in order to quantify psychophysiological arousal differences between high and low menstrual distress symptom reporters and examine differences in stress-prone personality traits between high and low menstrual distress symptom…

  9. Medication administration errors in nursing homes using an automated medication dispensing system.

    PubMed

    van den Bemt, Patricia M L A; Idzinga, Jetske C; Robertz, Hans; Kormelink, Dennis Groot; Pels, Neske

    2009-01-01

    OBJECTIVE To identify the frequency of medication administration errors as well as their potential risk factors in nursing homes using a distribution robot. DESIGN The study was a prospective, observational study conducted within three nursing homes in the Netherlands caring for 180 individuals. MEASUREMENTS Medication errors were measured using the disguised observation technique. Types of medication errors were described. The correlation between several potential risk factors and the occurrence of medication errors was studied to identify potential causes for the errors. RESULTS In total 2,025 medication administrations to 127 clients were observed. In these administrations 428 errors were observed (21.2%). The most frequently occurring types of errors were use of wrong administration techniques (especially incorrect crushing of medication and not supervising the intake of medication) and wrong time errors (administering the medication at least 1 h early or late).The potential risk factors female gender (odds ratio (OR) 1.39; 95% confidence interval (CI) 1.05-1.83), ATC medication class antibiotics (OR 11.11; 95% CI 2.66-46.50), medication crushed (OR 7.83; 95% CI 5.40-11.36), number of dosages/day/client (OR 1.03; 95% CI 1.01-1.05), nursing home 2 (OR 3.97; 95% CI 2.86-5.50), medication not supplied by distribution robot (OR 2.92; 95% CI 2.04-4.18), time classes "7-10 am" (OR 2.28; 95% CI 1.50-3.47) and "10 am-2 pm" (OR 1.96; 1.18-3.27) and day of the week "Wednesday" (OR 1.46; 95% CI 1.03-2.07) are associated with a higher risk of administration errors. CONCLUSIONS Medication administration in nursing homes is prone to many errors. This study indicates that the handling of the medication after removing it from the robot packaging may contribute to this high error frequency, which may be reduced by training of nurse attendants, by automated clinical decision support and by measures to reduce workload.

  10. Skills, rules and knowledge in aircraft maintenance: errors in context

    NASA Technical Reports Server (NTRS)

    Hobbs, Alan; Williamson, Ann

    2002-01-01

    Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.

  11. How important is an apology to you? Forecasting errors in evaluating the value of apologies.

    PubMed

    De Cremer, David; Pillutla, Madan M; Folmer, Chris Reinders

    2011-01-01

    Apologies are commonly used to deal with transgressions in relationships. Results to date, however, indicate that the positive effects of apologies vary widely, and the match between people's judgments of apologies and the true value of apologies has not been studied. Building on the affective and behavioral forecasting literature, we predicted that people would overestimate how much they value apologies in reality. Across three experimental studies, our results showed that after having been betrayed by another party (or after imagining this to be the case), people (a) rated the value of an apology much more highly when they imagined receiving an apology than when they actually received an apology and (b) displayed greater trusting behavior when they imagined receiving an apology than when they actually received an apology. These results suggest that people are prone to forecasting errors regarding the effectiveness of an apology and that they tend to overvalue the impact of receiving one.

  12. Radiation damage and repair in cells and cell components. Progress report, 1980-1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-01-01

    One aim has been to see whether, in E. coli, the various phenomena which were ascribed to the induction of the recA gene product (p-recA) are really manifestations of one process. It was concluded that this is true for septum inhibition, Weigle-reactivation, induced inhibition of post-irradiation DNA degradation, and, with the additional concept of a premutational lesion, for UV mutagenesis. Lambda prophage induction may perhaps be brought into line with p-recA induction by considering the additional secondary aspects of (a) activation of p-recA to make it enzymatically active and (b) the need to have the concentration of activated p-recA high enough to keep up with the rate of production of lambda repressors. Revertants seem to fall into more than one class, and two of these cannot easily be explained by the idea that p-recA contains an error-prone repair enzyme that makes errors at mutagenic lesions.

  13. "I'd only let you down": Guilt proneness and the avoidance of harmful interdependence.

    PubMed

    Wiltermuth, Scott S; Cohen, Taya R

    2014-11-01

    Five studies demonstrated that highly guilt-prone people may avoid forming interdependent partnerships with others whom they perceive to be more competent than themselves, as benefitting a partner less than the partner benefits one's self could trigger feelings of guilt. Highly guilt-prone people who lacked expertise in a domain were less willing than were those low in guilt proneness who lacked expertise in that domain to create outcome-interdependent relationships with people who possessed domain-specific expertise. These highly guilt-prone people were more likely than others both to opt to be paid on their performance alone (Studies 1, 3, 4, and 5) and to opt to be paid on the basis of the average of their performance and that of others whose competence was more similar to their own (Studies 2 and 5). Guilt proneness did not predict people's willingness to form outcome-interdependent relationships with potential partners who lacked domain-specific expertise (Studies 4 and 5). It also did not predict people's willingness to form relationships when poor individual performance would not negatively affect partner outcomes (Study 4). Guilt proneness therefore predicts whether, and with whom, people develop interdependent relationships. The findings also demonstrate that highly guilt-prone people sacrifice financial gain out of concern about how their actions would influence others' welfare. As such, the findings demonstrate a novel way in which guilt proneness limits free-riding and therefore reduces the incidence of potentially unethical behavior. Lastly, the findings demonstrate that people who lack competence may not always seek out competence in others when choosing partners.

  14. A Semantic Analysis of XML Schema Matching for B2B Systems Integration

    ERIC Educational Resources Information Center

    Kim, Jaewook

    2011-01-01

    One of the most critical steps to integrating heterogeneous e-Business applications using different XML schemas is schema matching, which is known to be costly and error-prone. Many automatic schema matching approaches have been proposed, but the challenge is still daunting because of the complexity of schemas and immaturity of technologies in…

  15. A Logically Centralized Approach for Control and Management of Large Computer Networks

    ERIC Educational Resources Information Center

    Iqbal, Hammad A.

    2012-01-01

    Management of large enterprise and Internet service provider networks is a complex, error-prone, and costly challenge. It is widely accepted that the key contributors to this complexity are the bundling of control and data forwarding in traditional routers and the use of fully distributed protocols for network control. To address these…

  16. The Influence of Improper Sets of Information on Judgment: How Irrelevant Information Can Bias Judged Probability

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Sprenger, Amber

    2006-01-01

    This article introduces 2 new sources of bias in probability judgment, discrimination failure and inhibition failure, which are conceptualized as arising from an interaction between error-prone memory processes and a support-theory-like comparison process. Both sources of bias stem from the influence of irrelevant information on participants'…

  17. Pre-Modeling Ensures Accurate Solid Models

    ERIC Educational Resources Information Center

    Gow, George

    2010-01-01

    Successful solid modeling requires a well-organized design tree. The design tree is a list of all the object's features and the sequential order in which they are modeled. The solid-modeling process is faster and less prone to modeling errors when the design tree is a simple and geometrically logical definition of the modeled object. Few high…

  18. An Evaluation of a New Printing Instrument to Aid in Identifying the Failure-prone Preschool Child.

    ERIC Educational Resources Information Center

    Simner, Marvin L.

    Involving 619 preschool children, a longitudinal investigation evaluated a new test for identifying preschool children who produce an excessive number of form errors in printing. All children participating were fluent in English and were in the appropriate grades for their ages, either pre-kindergarten or kindergarten, when they were given the…

  19. Computer programs for optical dendrometer measurements of standing tree profiles

    Treesearch

    Jacob R. Beard; Thomas G. Matney; Emily B. Schultz

    2015-01-01

    Tree profile equations are effective volume predictors. Diameter data for building these equations are collected from felled trees using diameter tapes and calipers or from standing trees using optical dendrometers. Developing and implementing a profile function from the collected data is a tedious and error-prone task. This study created a computer program, Profile...

  20. Conducting Web-Based Surveys. ERIC Digest.

    ERIC Educational Resources Information Center

    Solomon, David J.

    Web-based surveying is very attractive for many reasons, including reducing the time and cost of conducting a survey and avoiding the often error-prone and tedious task of data entry. At this time, Web-based surveys should still be used with caution. The biggest concern at present is coverage bias or bias resulting from sampled people either not…

  1. Creating an automated tool for measuring software cohesion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tutton, J.M.; Zucconi, L.

    1994-05-06

    Program modules with high complexity tend to be more error-prone and more difficult to understand. These factors increase maintenance and enhancement costs. Hence, a tool that can help programmers determine a key factor in module complexity should be very useful. Our goal is to create a software tool that will automatically give a quantitative measure of the cohesiveness of a given module, and hence give us an estimate of the "maintainability" of that module. The tool will use a metric developed by Professors Linda M. Ott and James M. Bieman. The Ott/Bieman metric gives quantitative measures that indicate the degree of functional cohesion using abstract data slices.
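
    For context, the Ott/Bieman approach measures functional cohesion from "data slices": the set of data tokens that can affect each module output. The toy sketch below computes a strong-functional-cohesion ratio in that spirit; the token sets are hypothetical and the code simplifies the published metric:

        def strong_functional_cohesion(slices, all_tokens):
            """Fraction of a module's data tokens that lie in every
            output's data slice ("super-glue" tokens)."""
            if not slices or not all_tokens:
                return 0.0
            super_glue = set.intersection(*slices)
            return len(super_glue) / len(all_tokens)

        slices = [{"n", "i", "total"}, {"n", "i", "count"}]  # one slice per output
        all_tokens = {"n", "i", "total", "count", "tmp"}
        print(strong_functional_cohesion(slices, all_tokens))  # 0.4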

  2. An Analysis of Misconceptions in Science Textbooks: Earth science in England and Wales

    NASA Astrophysics Data System (ADS)

    King, Chris John Henry

    2010-03-01

    Surveys of the earth science content of all secondary (high school) science textbooks and related publications used in England and Wales have revealed high levels of error/misconception. The 29 science textbooks or textbook series surveyed (51 texts in all) showed poor coverage of National Curriculum earth science and contained a mean level of one earth science error/misconception per page. Science syllabuses and examinations surveyed also showed errors/misconceptions. More than 500 instances of misconception were identified through the surveys. These were analysed for frequency, indicating that those areas of the earth science curriculum most prone to misconception are sedimentary processes/rocks, earthquakes/Earth's structure, and plate tectonics. For the 15 most frequent misconceptions, examples of quotes from the textbooks are given, together with the scientific consensus view, a discussion, and an example of a misconception of similar significance in another area of science. The misconceptions identified in the surveys are compared with those described in the literature. This indicates that the misconceptions found in college students and pre-service/practising science teachers are often also found in published materials, and therefore are likely to reinforce the misconceptions in teachers and their students. The analysis may also reflect the prevalence of earth science misconceptions in the UK secondary (high school) science-teaching population. The analysis and discussion provide the opportunity for writers of secondary science materials to improve their work on earth science and to provide a platform for improved teaching and learning of earth science in the future.

  3. Fact or factitious? A psychobiological study of authentic and simulated dissociative identity states.

    PubMed

    Reinders, A A T Simone; Willemsen, Antoon T M; Vos, Herry P J; den Boer, Johan A; Nijenhuis, Ellert R S

    2012-01-01

    Dissociative identity disorder (DID) is a disputed psychiatric disorder. Research findings and clinical observations suggest that DID involves an authentic mental disorder related to factors such as traumatization and disrupted attachment. A competing view indicates that DID is due to fantasy proneness, suggestibility, suggestion, and role-playing. Here we examine whether dissociative identity state-dependent psychobiological features in DID can be induced in high or low fantasy prone individuals by instructed and motivated role-playing, and suggestion. DID patients, high fantasy prone and low fantasy prone controls were studied in two different types of identity states (neutral and trauma-related) in an autobiographical memory script-driven (neutral or trauma-related) imagery paradigm. The controls were instructed to enact the two DID identity states. Twenty-nine subjects participated in the study: 11 patients with DID, 10 high fantasy prone DID simulating controls, and 8 low fantasy prone DID simulating controls. Autonomic and subjective reactions were obtained. Differences in psychophysiological and neural activation patterns were found between the DID patients and both high and low fantasy prone controls. That is, the identity states in DID were not convincingly enacted by DID simulating controls. Thus, important differences regarding regional cerebral bloodflow and psychophysiological responses for different types of identity states in patients with DID were upheld after controlling for DID simulation. The findings are at odds with the idea that differences among different types of dissociative identity states in DID can be explained by high fantasy proneness, motivated role-enactment, and suggestion. They indicate that DID does not have a sociocultural (e.g., iatrogenic) origin.

  4. [Error prevention through management of complications in urology: standard operating procedures from commercial aviation as a model].

    PubMed

    Kranz, J; Sommer, K-J; Steffens, J

    2014-05-01

    Patient safety and risk/complication management rank among the current megatrends in modern medicine, which has undoubtedly become more complex. In time-critical, error-prone and difficult situations, which often occur repeatedly in everyday clinical practice, guidelines are inappropriate for acting rapidly and intelligently. The establishment and consistent use of standard operating procedures, as in commercial aviation, offers a possible strategic approach. These medical aids to decision-making - quick reference cards - are short, optimized instructions that enable a standardized procedure in case of medical claims.

  5. Predicting Psychotic-Like Experiences during Sensory Deprivation

    PubMed Central

    Daniel, Christina; Mason, Oliver J.

    2015-01-01

    Aims. This study aimed to establish the contribution of hallucination proneness, anxiety, suggestibility, and fantasy proneness to psychotic-like experiences (PLEs) reported during brief sensory deprivation. Method. Twenty-four high and 22 low hallucination-prone participants reported on PLEs occurring during brief sensory deprivation and at baseline. State/trait anxiety, suggestibility, and fantasy proneness were also measured. Results. Both groups experienced a significant increase in PLEs in sensory deprivation. The high hallucination-prone group reported more PLEs both at baseline and in sensory deprivation. They also scored significantly higher on measures of state/trait anxiety, suggestibility, and fantasy proneness, though these did not explain the effects of group or condition. Regression analysis found hallucination proneness to be the best predictor of the increase in PLEs, with state anxiety also being a significant predictor. Fantasy proneness and suggestibility were not significant predictors. Conclusion. This study suggests the increase in PLEs reported during sensory deprivation reflects a genuine aberration in perceptual experience, as opposed to an increased tendency to make false reports due to suggestibility or fantasy proneness. The study provides further support for the use of sensory deprivation as a safe and effective nonpharmacological model of psychosis. PMID:25811027

  6. Skull registration for prone patient position using tracked ultrasound

    NASA Astrophysics Data System (ADS)

    Underwood, Grace; Ungi, Tamas; Baum, Zachary; Lasso, Andras; Kronreif, Gernot; Fichtinger, Gabor

    2017-03-01

    PURPOSE: Tracked navigation has become prevalent in neurosurgery. Problems with registration of a patient and a preoperative image arise when the patient is in a prone position. Surfaces accessible to optical tracking on the back of the head are unreliable for registration. We investigated the accuracy of surface-based registration using points accessible through tracked ultrasound. Using ultrasound allows access to bone surfaces that are not available through optical tracking. Tracked ultrasound could eliminate the need to (i) work under the table during registration and (ii) adjust the tracker between registration and surgery. In addition, tracked ultrasound could provide a non-invasive alternative to a registration method involving screw implantation. METHODS: A phantom study was performed to test the feasibility of tracked ultrasound for registration. An initial registration was performed to partially align the pre-operative computed tomography data and the skull phantom. The initial registration was performed by an anatomical landmark registration. Surface points accessible by tracked ultrasound were collected and used to run an iterative closest point algorithm. RESULTS: When the surface registration was compared to a ground truth landmark registration, the average TRE was 1.6 +/- 0.1 mm and the average distance of points off the skull surface was 0.6 +/- 0.1 mm. CONCLUSION: The use of tracked ultrasound is feasible for registration of patients in the prone position and eliminates the need to perform registration under the table. The translational component of error was minimal; the TRE is therefore due mainly to a rotational component of error.
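
    The pipeline described here (landmark initialization followed by iterative-closest-point refinement) can be sketched generically. Below is a minimal ICP with an SVD-based rigid fit, assuming points and surface are Nx3 NumPy arrays; it illustrates the algorithm class, not the authors' implementation:

        import numpy as np
        from scipy.spatial import cKDTree

        def best_rigid_transform(src, dst):
            """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst."""
            cs, cd = src.mean(axis=0), dst.mean(axis=0)
            H = (src - cs).T @ (dst - cd)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:      # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, cd - R @ cs

        def icp(points, surface, iters=50):
            """Align tracked-ultrasound skull points to CT surface points."""
            tree = cKDTree(surface)
            src = points.copy()
            for _ in range(iters):
                _, idx = tree.query(src)          # closest CT surface point
                R, t = best_rigid_transform(src, surface[idx])
                src = src @ R.T + t
            return src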

  7. Safe prescribing: a titanic challenge

    PubMed Central

    Routledge, Philip A

    2012-01-01

    The challenge to achieve safe prescribing merits the adjective ‘titanic’. The organisational and human errors leading to poor prescribing (e.g. underprescribing, overprescribing, misprescribing or medication errors) have parallels in the organisational and human errors that led to the loss of the Titanic 100 years ago this year. Prescribing can be adversely affected by communication failures, critical conditions, complacency, corner cutting, callowness and a lack of courage of conviction, all of which were also factors leading to the Titanic tragedy. These issues need to be addressed by a commitment to excellence, the final component of the ‘Seven C's’. Optimal prescribing is dependent upon close communication and collaborative working between highly trained health professionals, whose role is to ensure maximum clinical effectiveness, whilst also protecting their patients from avoidable harm. Since humans are prone to error, and the environments in which they work are imperfect, it is not surprising that medication errors are common, occurring more often during the prescribing stage than during dispensing or administration. A commitment to excellence in prescribing includes a continued focus on lifelong learning (including interprofessional learning) in pharmacology and therapeutics. This should be accompanied by improvements in the clinical working environment of prescribers, and the encouragement of a strong safety culture (including reporting of adverse incidents as well as suspected adverse drug reactions whenever appropriate). Finally, members of the clinical team must be prepared to challenge each other, when necessary, to ensure that prescribing combines the highest likelihood of benefit with the lowest potential for harm. PMID:22738396

  8. The fitness cost of mis-splicing is the main determinant of alternative splicing patterns.

    PubMed

    Saudemont, Baptiste; Popa, Alexandra; Parmley, Joanna L; Rocher, Vincent; Blugeon, Corinne; Necsulea, Anamaria; Meyer, Eric; Duret, Laurent

    2017-10-30

    Most eukaryotic genes are subject to alternative splicing (AS), which may contribute to the production of protein variants or to the regulation of gene expression via nonsense-mediated messenger RNA (mRNA) decay (NMD). However, a fraction of splice variants might correspond to spurious transcripts and the question of the relative proportion of splicing errors to functional splice variants remains highly debated. We propose a test to quantify the fraction of AS events corresponding to errors. This test is based on the fact that the fitness cost of splicing errors increases with the number of introns in a gene and with expression level. We analyzed the transcriptome of the intron-rich eukaryote Paramecium tetraurelia. We show that in both normal and NMD-deficient cells, AS rates strongly decrease with increasing expression level and with increasing number of introns. This relationship is observed for AS events that are detectable by NMD as well as for those that are not, which invalidates the hypothesis of a link with the regulation of gene expression. Our results show that in genes with a median expression level, 92-98% of observed splice variants correspond to errors. We observed the same patterns in human transcriptomes and we further show that AS rates correlate with the fitness cost of splicing errors. These observations indicate that genes under weaker selective pressure accumulate more maladaptive substitutions and are more prone to splicing errors. Thus, to a large extent, patterns of gene expression variants simply reflect the balance between selection, mutation, and drift.

  9. Mutation at a distance caused by homopolymeric guanine repeats in Saccharomyces cerevisiae

    PubMed Central

    McDonald, Michael J.; Yu, Yen-Hsin; Guo, Jheng-Fen; Chong, Shin Yen; Kao, Cheng-Fu; Leu, Jun-Yi

    2016-01-01

    Mutation provides the raw material from which natural selection shapes adaptations. The rate at which new mutations arise is therefore a key factor that determines the tempo and mode of evolution. However, an accurate assessment of the mutation rate of a given organism is difficult because mutation rate varies on a fine scale within a genome. A central challenge of evolutionary genetics is to determine the underlying causes of this variation. In earlier work, we had shown that repeat sequences not only are prone to a high rate of expansion and contraction but also can cause an increase in mutation rate (on the order of kilobases) of the sequence surrounding the repeat. We perform experiments that show that simple guanine repeats 13 bp (base pairs) in length or longer (G13+) increase the substitution rate 4- to 18-fold in the downstream DNA sequence, and this correlates with DNA replication timing (R = 0.89). We show that G13+ mutagenicity results from the interplay of both error-prone translesion synthesis and homologous recombination repair pathways. The mutagenic repeats that we study have the potential to be exploited for the artificial elevation of mutation rate in systems biology and synthetic biology applications. PMID:27386516

  10. Probability of misclassifying biological elements in surface waters.

    PubMed

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. Fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status dependent corrective actions.
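
    The simulation idea is straightforward: perturb the "true" index value with random measurement error many times and count how often the perturbed value crosses a class boundary. A minimal sketch with assumed boundaries and error magnitude (the paper's actual class limits and error models differ per index):

        import numpy as np

        rng = np.random.default_rng(0)
        bounds = [0.2, 0.4, 0.6, 0.8]     # hypothetical class limits on [0, 1]

        def status_class(x):
            return sum(x > b for b in bounds)

        def p_misclassification(true_value, sigma, n=100_000):
            """Fraction of error-prone simulated measurements whose class
            differs from the class of the error-free value."""
            noisy = rng.normal(true_value, sigma, size=n).clip(0.0, 1.0)
            true_cls = status_class(true_value)
            return np.mean([status_class(v) != true_cls for v in noisy])

        print(p_misclassification(0.58, sigma=0.05))  # near a boundary: high
        print(p_misclassification(0.50, sigma=0.05))  # mid-class: low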

  11. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    NASA Astrophysics Data System (ADS)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition, however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. We show that a careful selection of the camera-to-object and baseline distance reduces errors in occluded areas and that realistic ground truths help to quantify those errors.

  12. User-Defined Data Distributions in High-Level Programming Languages

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Zima, Hans P.

    2006-01-01

    One of the characteristic features of today s high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
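
    As a concrete, much-simplified picture of what a high-level data distribution specifies, the sketch below maps a 1-D global index space onto locales with a block distribution; languages like Chapel let programmers declare such mappings instead of hand-writing the equivalent message-passing code. The function name and layout are illustrative, not Chapel's API:

        import math

        def block_owner(i, n, p):
            """Locale owning global index i of an n-element array over p locales."""
            block = math.ceil(n / p)
            return i // block

        n, p = 10, 3
        print([block_owner(i, n, p) for i in range(n)])
        # [0, 0, 0, 0, 1, 1, 1, 1, 2, 2]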

  13. 13Check_RNA: A tool to evaluate 13C chemical shifts assignments of RNA.

    PubMed

    Icazatti, A A; Martin, O A; Villegas, M; Szleifer, I; Vila, J A

    2018-06-19

    Chemical shifts (CS) are an important source of structural information for macromolecules such as RNA. In addition to the scarce availability of CS for RNA, the observed values are prone to errors due to wrong re-calibration or mis-assignments. Different groups have dedicated their efforts to correcting systematic CS errors in RNA. Despite this, there are no automated, freely available algorithms for correcting assignments of RNA 13C CS before their deposition to the BMRB, or for re-referencing already deposited CS with systematic errors. Based on an existing method, we have implemented an open-source Python module to correct systematic errors in RNA 13C CS (from here on, 13Cexp) and then return the results in 3 formats, including the nmrstar one. This software is available on GitHub at https://github.com/BIOS-IMASL/13Check_RNA under a MIT license. Supplementary data are available at Bioinformatics online.
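
    As an illustration of what re-referencing involves (the actual 13Check_RNA tool works from BMRB entries and the published correction method), a systematic 13C offset can be detected and removed by comparing observed shifts against expected reference values; all numbers below are hypothetical:

        import numpy as np

        observed = np.array([92.1, 74.3, 65.0, 141.8, 103.2])   # ppm
        expected = np.array([89.6, 71.9, 62.6, 139.3, 100.7])   # ppm

        offset = np.median(observed - expected)
        if abs(offset) > 1.0:            # assumed tolerance in ppm
            corrected = observed - offset
            print(f"re-referenced by {offset:.2f} ppm:", corrected)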

  14. Persistent damaged bases in DNA allow mutagenic break repair in Escherichia coli.

    PubMed

    Moore, Jessica M; Correa, Raul; Rosenberg, Susan M; Hastings, P J

    2017-07-01

    Bacteria, yeast and human cancer cells possess mechanisms of mutagenesis upregulated by stress responses. Stress-inducible mutagenesis potentially accelerates adaptation, and may provide important models for mutagenesis that drives cancers, host pathogen interactions, antibiotic resistance and possibly much of evolution generally. In Escherichia coli repair of double-strand breaks (DSBs) becomes mutagenic, using low-fidelity DNA polymerases under the control of the SOS DNA-damage response and RpoS general stress response, which upregulate and allow the action of error-prone DNA polymerases IV (DinB), II and V to make mutations during repair. Pol IV is implied to compete with and replace high-fidelity DNA polymerases at the DSB-repair replisome, causing mutagenesis. We report that up-regulated Pol IV is not sufficient for mutagenic break repair (MBR); damaged bases in the DNA are also required, and that in starvation-stressed cells, these are caused by reactive-oxygen species (ROS). First, MBR is reduced by either ROS-scavenging agents or constitutive activation of oxidative-damage responses, both of which reduce cellular ROS levels. The ROS promote MBR other than by causing DSBs, saturating mismatch repair, oxidizing proteins, or inducing the SOS response or the general stress response. We find that ROS drive MBR through oxidized guanines (8-oxo-dG) in DNA, in that overproduction of a glycosylase that removes 8-oxo-dG from DNA prevents MBR. Further, other damaged DNA bases can substitute for 8-oxo-dG because ROS-scavenged cells resume MBR if either DNA pyrimidine dimers or alkylated bases are induced. We hypothesize that damaged bases in DNA pause the replisome and allow the critical switch from high fidelity to error-prone DNA polymerases in the DSB-repair replisome, thus allowing MBR. The data imply that in addition to the indirect stress-response controlled switch to MBR, a direct cis-acting switch to MBR occurs independently of DNA breakage, caused by ROS oxidation of DNA potentially regulated by ROS regulators.

  15. Persistent damaged bases in DNA allow mutagenic break repair in Escherichia coli

    PubMed Central

    Moore, Jessica M.; Correa, Raul; Rosenberg, Susan M.

    2017-01-01

    Bacteria, yeast and human cancer cells possess mechanisms of mutagenesis upregulated by stress responses. Stress-inducible mutagenesis potentially accelerates adaptation, and may provide important models for mutagenesis that drives cancers, host pathogen interactions, antibiotic resistance and possibly much of evolution generally. In Escherichia coli repair of double-strand breaks (DSBs) becomes mutagenic, using low-fidelity DNA polymerases under the control of the SOS DNA-damage response and RpoS general stress response, which upregulate and allow the action of error-prone DNA polymerases IV (DinB), II and V to make mutations during repair. Pol IV is implied to compete with and replace high-fidelity DNA polymerases at the DSB-repair replisome, causing mutagenesis. We report that up-regulated Pol IV is not sufficient for mutagenic break repair (MBR); damaged bases in the DNA are also required, and that in starvation-stressed cells, these are caused by reactive-oxygen species (ROS). First, MBR is reduced by either ROS-scavenging agents or constitutive activation of oxidative-damage responses, both of which reduce cellular ROS levels. The ROS promote MBR other than by causing DSBs, saturating mismatch repair, oxidizing proteins, or inducing the SOS response or the general stress response. We find that ROS drive MBR through oxidized guanines (8-oxo-dG) in DNA, in that overproduction of a glycosylase that removes 8-oxo-dG from DNA prevents MBR. Further, other damaged DNA bases can substitute for 8-oxo-dG because ROS-scavenged cells resume MBR if either DNA pyrimidine dimers or alkylated bases are induced. We hypothesize that damaged bases in DNA pause the replisome and allow the critical switch from high fidelity to error-prone DNA polymerases in the DSB-repair replisome, thus allowing MBR. The data imply that in addition to the indirect stress-response controlled switch to MBR, a direct cis-acting switch to MBR occurs independently of DNA breakage, caused by ROS oxidation of DNA potentially regulated by ROS regulators. PMID:28727736

  16. Fact or Factitious? A Psychobiological Study of Authentic and Simulated Dissociative Identity States

    PubMed Central

    Simone Reinders, A. A. T.; Willemsen, Antoon T. M.; Vos, Herry P. J.; den Boer, Johan A.; Nijenhuis, Ellert R. S.

    2012-01-01

    Background Dissociative identity disorder (DID) is a disputed psychiatric disorder. Research findings and clinical observations suggest that DID involves an authentic mental disorder related to factors such as traumatization and disrupted attachment. A competing view indicates that DID is due to fantasy proneness, suggestibility, suggestion, and role-playing. Here we examine whether dissociative identity state-dependent psychobiological features in DID can be induced in high or low fantasy prone individuals by instructed and motivated role-playing, and suggestion. Methodology/Principal Findings DID patients, high fantasy prone and low fantasy prone controls were studied in two different types of identity states (neutral and trauma-related) in an autobiographical memory script-driven (neutral or trauma-related) imagery paradigm. The controls were instructed to enact the two DID identity states. Twenty-nine subjects participated in the study: 11 patients with DID, 10 high fantasy prone DID simulating controls, and 8 low fantasy prone DID simulating controls. Autonomic and subjective reactions were obtained. Differences in psychophysiological and neural activation patterns were found between the DID patients and both high and low fantasy prone controls. That is, the identity states in DID were not convincingly enacted by DID simulating controls. Thus, important differences regarding regional cerebral bloodflow and psychophysiological responses for different types of identity states in patients with DID were upheld after controlling for DID simulation. Conclusions/Significance The findings are at odds with the idea that differences among different types of dissociative identity states in DID can be explained by high fantasy proneness, motivated role-enactment, and suggestion. They indicate that DID does not have a sociocultural (e.g., iatrogenic) origin. PMID:22768068

  17. Corrected score estimation in the proportional hazards model with misclassified discrete covariates

    PubMed Central

    Zucker, David M.; Spiegelman, Donna

    2013-01-01

    SUMMARY We consider Cox proportional hazards regression when the covariate vector includes error-prone discrete covariates along with error-free covariates, which may be discrete or continuous. The misclassification in the discrete error-prone covariates is allowed to be of any specified form. Building on the work of Nakamura and his colleagues, we present a corrected score method for this setting. The method can handle all three major study designs (internal validation design, external validation design, and replicate measures design), both functional and structural error models, and time-dependent covariates satisfying a certain ‘localized error’ condition. We derive the asymptotic properties of the method and indicate how to adjust the covariance matrix of the regression coefficient estimates to account for estimation of the misclassification matrix. We present the results of a finite-sample simulation study under Weibull survival with a single binary covariate having known misclassification rates. The performance of the method described here was similar to that of related methods we have examined in previous works. Specifically, our new estimator performed as well as or, in a few cases, better than the full Weibull maximum likelihood estimator. We also present simulation results for our method for the case where the misclassification probabilities are estimated from an external replicate measures study. Our method generally performed well in these simulations. The new estimator has a broader range of applicability than many other estimators proposed in the literature, including those described in our own earlier work, in that it can handle time-dependent covariates with an arbitrary misclassification structure. We illustrate the method on data from a study of the relationship between dietary calcium intake and distal colon cancer. PMID:18219700
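
    The corrected-score machinery itself operates on the Cox partial likelihood, but the basic ingredient in any misclassification correction is the misclassification matrix. A minimal sketch of the underlying identity, with a hypothetical matrix (e.g., estimated from a validation substudy): if M[i, j] = P(observe category i | true category j), then p_obs = M p_true, so the true category distribution can be recovered by solving the linear system:

        import numpy as np

        M = np.array([[0.90, 0.15],      # columns sum to 1
                      [0.10, 0.85]])
        p_obs = np.array([0.40, 0.60])   # observed exposure proportions

        p_true = np.linalg.solve(M, p_obs)
        print(p_true)                     # [0.333..., 0.666...]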

  18. Intransparent German number words complicate transcoding - a translingual comparison with Japanese.

    PubMed

    Moeller, Korbinian; Zuber, Julia; Olsen, Naoko; Nuerk, Hans-Christoph; Willmes, Klaus

    2015-01-01

    Superior early numerical competencies of children in several Asian countries have (amongst others) been attributed to the higher transparency of their number word systems. Here, we directly investigated this claim by evaluating whether Japanese children's transcoding performance when writing numbers to dictation (e.g., "twenty five" → 25) was less error prone than that of German-speaking children - both in general as well as when considering language-specific attributes of the German number word system such as the inversion property, in particular. In line with this hypothesis we observed that German-speaking children committed more transcoding errors in general than their Japanese peers. Moreover, their error pattern reflected the specific inversion intransparency of the German number-word system. Inversion errors in transcoding represented the most prominent error category in German-speaking children, but were almost absent in Japanese-speaking children. We conclude that the less transparent German number-word system complicates the acquisition of the correspondence between symbolic Arabic numbers and their respective verbal number words.
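
    The inversion property at issue is easy to see in code. This toy transcoder handles two-digit German number words, where the unit is named before the ten ("fünfundzwanzig", five-and-twenty, is 25); the vocabulary is deliberately minimal and illustrative:

        UNITS = {"ein": 1, "zwei": 2, "drei": 3, "vier": 4, "fünf": 5,
                 "sechs": 6, "sieben": 7, "acht": 8, "neun": 9}
        TENS = {"zwanzig": 20, "dreißig": 30, "vierzig": 40, "fünfzig": 50}

        def transcode(word):
            if "und" in word:                 # unit named before the ten
                unit, ten = word.split("und", 1)
                return TENS[ten] + UNITS[unit]
            return TENS[word]

        print(transcode("fünfundzwanzig"))    # 25
        # Writing the digits in spoken order instead would give 52 -
        # the classic inversion error described above.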

  19. Migration and risk: net migration in marginal ecosystems and hazardous areas

    NASA Astrophysics Data System (ADS)

    de Sherbinin, Alex; Levy, Marc; Adamo, Susana; MacManus, Kytt; Yetman, Greg; Mara, Valentina; Razafindrazay, Liana; Goodrich, Benjamin; Srebotnjak, Tanja; Aichele, Cody; Pistolesi, Linda

    2012-12-01

    The potential for altered ecosystems and extreme weather events in the context of climate change has raised questions concerning the role that migration plays in either increasing or reducing risks to society. Using modeled data on net migration over three decades from 1970 to 2000, we identify sensitive ecosystems and regions at high risk of climate hazards that have seen high levels of net in-migration and out-migration over the time period. This paper provides a literature review on migration related to ecosystems, briefly describes the methodology used to develop the estimates of net migration, then uses those data to describe the patterns of net migration for various ecosystems and high risk regions. The study finds that negative net migration generally occurs over large areas, reflecting its largely rural character, whereas areas of positive net migration are typically smaller, reflecting its largely urban character. The countries with largest population such as China and India tend to drive global results for all the ecosystems found in those countries. Results suggest that from 1970 to 2000, migrants in developing countries have tended to move out of marginal dryland and mountain ecosystems and out of drought-prone areas, and have moved towards coastal ecosystems and areas that are prone to floods and cyclones. For North America results are reversed for dryland and mountain ecosystems, which saw large net influxes of population in the period of record. Uncertainties and potential sources of error in these estimates are addressed.

  20. Factor Structure and Measurement Invariance of the Cognitive Failures Questionnaire across the Adult Life Span

    ERIC Educational Resources Information Center

    Rast, Philippe; Zimprich, Daniel; Van Boxtel, Martin; Jolles, Jellemer

    2009-01-01

    The Cognitive Failures Questionnaire (CFQ) is designed to assess a person's proneness to committing cognitive slips and errors in the completion of everyday tasks. Although the CFQ is a widely used instrument, its factor structure remains an issue of scientific debate. The present study used data of a representative sample (N = 1,303, 24-83 years…

  1. Ground-based digital imagery for tree stem analysis

    Treesearch

    Neil Clark; Daniel L. Schmoldt; Randolph H. Wynne; Matthew F. Winn; Philip A. Araman

    2000-01-01

    In the USA, a subset of permanent forest sample plots within each geographic region are intensively measured to obtain estimates of tree volume and products. The detailed field measurements required for this type of sampling are both time consuming and error prone. We are attempting to reduce both of these factors with the aid of a commercially-available solid-state...

  2. Applying recovery biomarkers to calibrate self-report measures of energy and protein in the Hispanic Community Health Study/Study of Latinos

    USDA-ARS?s Scientific Manuscript database

    We investigated measurement error in the self-reported diets of US Hispanics/Latinos, who are prone to obesity and related comorbidities, by background (Central American, Cuban, Dominican, Mexican, Puerto Rican, and South American) in 2010–2012. In 477 participants aged 18–74 years, doubly labeled w...

  3. Developing a Machine-Supported Coding System for Constructed-Response Items in PISA. Research Report. ETS RR-17-47

    ERIC Educational Resources Information Center

    Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Mattias

    2017-01-01

    Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…

  4. RADH, a gene of Saccharomyces cerevisiae encoding a putative DNA helicase involved in DNA repair. Characteristics of radH mutants and sequence of the gene.

    PubMed

    Aboussekhra, A; Chanet, R; Zgaga, Z; Cassier-Chauvat, C; Heude, M; Fabre, F

    1989-09-25

    A new type of radiation-sensitive mutant of S. cerevisiae is described. The recessive radH mutation sensitizes haploids in the G1, but not the G2, mitotic phase to the lethal effect of UV radiation. Homozygous diploids are as sensitive as G1 haploids. UV-induced mutagenesis is depressed, while the induction of gene conversion is increased. The mutation is believed to channel the repair of lesions engaged in the mutagenic pathway into a recombination process, successful if the events involve sister chromatids but lethal if they involve homologous chromosomes. The sequence of the RADH gene reveals that it may code for a DNA helicase with a Mr of 134 kDa. All the consensus domains of known DNA helicases are present. Beyond these consensus regions, strong homologies with the Rep and UvrD helicases of E. coli were found. The putative RadH helicase appears to belong to the set of proteins involved in the error-prone repair mechanism, at least for UV-induced lesions, and could act in coordination with the Rev3 error-prone DNA polymerase.

  5. Acquisition of Real-Time Operation Analytics for an Automated Serial Sectioning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madison, Jonathan D.; Underwood, O. D.; Poulter, Gregory A.

    Mechanical serial sectioning is a highly repetitive technique employed in metallography for the rendering of 3D reconstructions of microstructure. While alternate techniques such as ultrasonic detection, micro-computed tomography, and focused ion beam milling have progressed much in recent years, few alternatives provide equivalent opportunities for comparatively high resolutions over significantly sized cross-sectional areas and volumes. To that end, the introduction of automated serial sectioning systems has greatly heightened repeatability and increased data collection rates while diminishing opportunity for mishandling and other user-introduced errors. Unfortunately, even among current, state-of-the-art automated serial sectioning systems, challenges in data collection have not been fully eradicated. Therefore, this paper highlights two specific advances to assist in this area: a non-contact laser triangulation method for assessment of material removal rates and a newly developed graphical user interface providing real-time monitoring of experimental progress. Furthermore, both are shown to be helpful in the rapid identification of anomalies and interruptions, while also providing comparable and less error-prone measures of removal rate over the course of these long-term, challenging, and innately destructive characterization experiments.
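
    The laser-triangulation check on removal rate reduces to a simple regression: measured surface height against section index, with the slope giving average material removed per section and large residuals flagging anomalous sections. A sketch with simulated readings (assumed numbers, not instrument data):

        import numpy as np

        rng = np.random.default_rng(1)
        section = np.arange(10)
        height_um = 1000.0 - 3.1 * section + rng.normal(0, 0.4, 10)

        slope, intercept = np.polyfit(section, height_um, 1)
        print(f"estimated removal per section: {-slope:.2f} um")

        # Sections whose measured step deviates strongly from the fitted
        # rate can be surfaced in real time as anomalies.
        residuals = height_um - (slope * section + intercept)
        print("anomalous sections:", np.where(np.abs(residuals) > 3 * residuals.std())[0])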

  6. Acquisition of Real-Time Operation Analytics for an Automated Serial Sectioning System

    DOE PAGES

    Madison, Jonathan D.; Underwood, O. D.; Poulter, Gregory A.; ...

    2017-03-22

    Mechanical serial sectioning is a highly repetitive technique employed in metallography for the rendering of 3D reconstructions of microstructure. While alternate techniques such as ultrasonic detection, micro-computed tomography, and focused ion beam milling have progressed much in recent years, few alternatives provide equivalent opportunities for comparatively high resolutions over significantly sized cross-sectional areas and volumes. To that end, the introduction of automated serial sectioning systems has greatly heightened repeatability and increased data collection rates while diminishing opportunity for mishandling and other user-introduced errors. Unfortunately, even among current, state-of-the-art automated serial sectioning systems, challenges in data collection have not been fully eradicated. Therefore, this paper highlights two specific advances to assist in this area: a non-contact laser triangulation method for assessment of material removal rates and a newly developed graphical user interface providing real-time monitoring of experimental progress. Furthermore, both are shown to be helpful in the rapid identification of anomalies and interruptions, while also providing comparable and less error-prone measures of removal rate over the course of these long-term, challenging, and innately destructive characterization experiments.

  7. Boredom proneness: its relationship to psychological- and physical-health symptoms.

    PubMed

    Sommers, J; Vodanovich, S J

    2000-01-01

    The relationship between boredom proneness and health-symptom reporting was examined. Undergraduate students (N = 200) completed the Boredom Proneness Scale and the Hopkins Symptom Checklist. A multiple analysis of covariance indicated that individuals with high boredom-proneness total scores reported significantly higher ratings on all five subscales of the Hopkins Symptom Checklist (Obsessive-Compulsive, Somatization, Anxiety, Interpersonal Sensitivity, and Depression). The results suggest that boredom proneness may be an important element to consider when assessing symptom reporting. Implications for determining the effects of boredom proneness on psychological- and physical-health symptoms, as well as applications in clinical settings, are discussed.

  8. Inhibiting HER3-mediated tumor cell growth with affibody molecules engineered to low picomolar affinity by position-directed error-prone PCR-like diversification.

    PubMed

    Malm, Magdalena; Kronqvist, Nina; Lindberg, Hanna; Gudmundsdotter, Lindvi; Bass, Tarek; Frejd, Fredrik Y; Höidén-Guthenberg, Ingmarie; Varasteh, Zohreh; Orlova, Anna; Tolmachev, Vladimir; Ståhl, Stefan; Löfblom, John

    2013-01-01

    The HER3 receptor is implicated in the progression of various cancers as well as in resistance to several currently used drugs, and is hence a potential target for development of new therapies. We have previously generated Affibody molecules that inhibit heregulin-induced signaling of the HER3 pathways. The aim of this study was to improve the affinity of the binders to hopefully increase receptor inhibition efficacy and enable a high receptor-mediated uptake in tumors. We explored a novel strategy for affinity maturation of Affibody molecules that is based on alanine scanning followed by design of library diversification to mimic the result from an error-prone PCR reaction, but with full control over mutated positions and thus fewer biases. Using bacterial surface display and flow-cytometric sorting of the maturation library, the affinity for HER3 was improved more than 30-fold, down to 21 pM. The affinity is among the highest reported for Affibody molecules, and we believe that the maturation strategy should be generally applicable for improvement of affinity proteins. The new binders also demonstrated improved thermal stability as well as complete refolding after denaturation. Moreover, inhibition of ligand-induced proliferation of HER3-positive breast cancer cells was improved by more than two orders of magnitude compared to the previously best-performing clone. Radiolabeled Affibody molecules showed specific targeting of a number of HER3-positive cell lines in vitro as well as targeting of HER3 in in vivo mouse models, and represent promising candidates for future development of targeted therapies and diagnostics.

  9. Magellan spacecraft and memory state tracking: Lessons learned, future thoughts

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.

    1993-01-01

    Numerous studies have been dedicated to improving the two main elements of Spacecraft Mission Operations: Command and Telemetry. As a result, not much attention has been given to other tasks that can become tedious, repetitive, and error prone. One such task is Spacecraft and Memory State Tracking, the process by which the status of critical spacecraft components, parameters, and the contents of on-board memory are managed on the ground to maintain knowledge of spacecraft and memory states for future testing, anomaly investigation, and on-board memory reconstruction. The task of Spacecraft and Memory State Tracking has traditionally been a manual task allocated to Mission Operations Procedures. During nominal Mission Operations this job is tedious and error prone. Because the task is not complex and can be accomplished manually, the worth of a sophisticated software tool is often questioned. However, in the event of an anomaly which alters spacecraft components autonomously or a memory anomaly such as a corrupt memory or flight software error, an accurate ground image that can be reconstructed quickly is a priceless commodity. This study explores the process of Spacecraft and Memory State Tracking used by the Magellan Spacecraft Team highlighting its strengths as well as identifying lessons learned during the primary and extended missions, two memory anomalies, and other hardships encountered due to incomplete knowledge of spacecraft states. Ideas for future state tracking tools that require minimal user interaction and are integrated into the Ground Data System will also be discussed.

  10. RGB-to-RGBG conversion algorithm with adaptive weighting factors based on edge detection and minimal square error.

    PubMed

    Huang, Chengqiang; Yang, Youchang; Wu, Bo; Yu, Weize

    2018-06-01

    The sub-pixel arrangement of the RGBG panel and the image with RGB format are different, so an algorithm that converts RGB to RGBG is needed to display an RGB image on the RGBG panel. However, in published studies of this conversion, information loss remains large even though color fringing artifacts are weakened. In this paper, an RGB-to-RGBG conversion algorithm with adaptive weighting factors based on edge detection and minimal square error (EDMSE) is proposed. The main points of innovation include the following: (1) edge detection is first proposed to distinguish image details with serious color fringing artifacts from image details that are prone to be lost in the process of RGB-RGBG conversion; (2) for image details with serious color fringing artifacts, the weighting factor 0.5 is applied to weaken the color fringing artifacts; and (3) for image details that are prone to be lost in the process of RGB-RGBG conversion, a special mechanism to minimize square error is proposed. The experiment shows that color fringing artifacts are slightly improved by EDMSE, and the MSE values of the processed image are 19.6% and 7% smaller than those of images processed by the direct assignment and weighting factor algorithms, respectively. The proposed algorithm is implemented on a field-programmable gate array to enable image display on the RGBG panel.
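
    The general shape of such a conversion can be sketched as follows: pairs of horizontally adjacent RGB pixels are merged into RGBG sub-pixel groups, with a weighting factor chosen per pixel from an edge map. This is an illustration of the idea only, assuming a simple thresholded difference as the edge detector; it is not the paper's exact EDMSE algorithm:

        import numpy as np

        def rgb_to_rgbg(img, edge_thresh=32):
            """img: H x W x 3 array, W even; returns H x W/2 x 3 values
            feeding the RG/BG sub-pixel pairs."""
            left = img[:, 0::2, :].astype(float)
            right = img[:, 1::2, :].astype(float)
            edge = np.abs(left - right).mean(axis=2) > edge_thresh
            # Weight 0.5 across edges to suppress color fringing;
            # elsewhere keep the left sample (zero squared error there).
            w = np.where(edge, 0.5, 1.0)[..., None]
            return w * left + (1.0 - w) * right

        out = rgb_to_rgbg(np.zeros((4, 8, 3), dtype=np.uint8))
        print(out.shape)  # (4, 4, 3)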

  11. Magellan spacecraft and memory state tracking: Lessons learned, future thoughts

    NASA Astrophysics Data System (ADS)

    Bucher, Allen W.

    1993-03-01

    Numerous studies have been dedicated to improving the two main elements of Spacecraft Mission Operations: Command and Telemetry. As a result, not much attention has been given to other tasks that can become tedious, repetitive, and error prone. One such task is Spacecraft and Memory State Tracking, the process by which the status of critical spacecraft components, parameters, and the contents of on-board memory are managed on the ground to maintain knowledge of spacecraft and memory states for future testing, anomaly investigation, and on-board memory reconstruction. The task of Spacecraft and Memory State Tracking has traditionally been a manual task allocated to Mission Operations Procedures. During nominal Mission Operations this job is tedious and error prone. Because the task is not complex and can be accomplished manually, the worth of a sophisticated software tool is often questioned. However, in the event of an anomaly which alters spacecraft components autonomously or a memory anomaly such as a corrupt memory or flight software error, an accurate ground image that can be reconstructed quickly is a priceless commodity. This study explores the process of Spacecraft and Memory State Tracking used by the Magellan Spacecraft Team highlighting its strengths as well as identifying lessons learned during the primary and extended missions, two memory anomalies, and other hardships encountered due to incomplete knowledge of spacecraft states. Ideas for future state tracking tools that require minimal user interaction and are integrated into the Ground Data System will also be discussed.

  12. Using Audit Information to Adjust Parameter Estimates for Data Errors in Clinical Trials

    PubMed Central

    Shepherd, Bryan E.; Shaw, Pamela A.; Dodd, Lori E.

    2013-01-01

    Background Audits are often performed to assess the quality of clinical trial data, but beyond detecting fraud or sloppiness, the audit data is generally ignored. In earlier work using data from a non-randomized study, Shepherd and Yu (2011) developed statistical methods to incorporate audit results into study estimates, and demonstrated that audit data could be used to eliminate bias. Purpose In this manuscript we examine the usefulness of audit-based error-correction methods in clinical trial settings where a continuous outcome is of primary interest. Methods We demonstrate the bias of multiple linear regression estimates in general settings with an outcome that may have errors and a set of covariates for which some may have errors and others, including treatment assignment, are recorded correctly for all subjects. We study this bias under different assumptions including independence between treatment assignment, covariates, and data errors (conceivable in a double-blinded randomized trial) and independence between treatment assignment and covariates but not data errors (possible in an unblinded randomized trial). We review moment-based estimators to incorporate the audit data and propose new multiple imputation estimators. The performance of estimators is studied in simulations. Results When treatment is randomized and unrelated to data errors, estimates of the treatment effect using the original error-prone data (i.e., ignoring the audit results) are unbiased. In this setting, both moment and multiple imputation estimators incorporating audit data are more variable than standard analyses using the original data. In contrast, in settings where treatment is randomized but correlated with data errors and in settings where treatment is not randomized, standard treatment effect estimates will be biased. And in all settings, parameter estimates for the original, error-prone covariates will be biased. Treatment and covariate effect estimates can be corrected by incorporating audit data using either the multiple imputation or moment-based approaches. Bias, precision, and coverage of confidence intervals improve as the audit size increases. Limitations The extent of bias and the performance of methods depend on the extent and nature of the error as well as the size of the audit. This work only considers methods for the linear model. Settings much different than those considered here need further study. Conclusions In randomized trials with continuous outcomes and treatment assignment independent of data errors, standard analyses of treatment effects will be unbiased and are recommended. However, if treatment assignment is correlated with data errors or other covariates, naive analyses may be biased. In these settings, and when covariate effects are of interest, approaches for incorporating audit results should be considered. PMID:22848072
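
    The paper's moment-based and multiple-imputation estimators are more general, but the core use of an audit can be illustrated with a simpler regression-calibration sketch: fit a model for the true covariate from the error-prone one on the audited subsample, impute calibrated values for all records, and refit the outcome model. All data below are simulated:

        import numpy as np

        rng = np.random.default_rng(7)
        n, n_audit = 2000, 200

        x = rng.normal(0, 1, n)                   # true covariate
        w = x + rng.normal(0, 0.8, n)             # error-prone recorded value
        y = 1.0 + 2.0 * x + rng.normal(0, 1, n)   # outcome; true slope = 2

        naive = np.polyfit(w, y, 1)[0]            # attenuated toward 0

        audit = rng.choice(n, n_audit, replace=False)
        calib = np.polyfit(w[audit], x[audit], 1) # E[x | w] from audited records
        x_hat = np.polyval(calib, w)
        corrected = np.polyfit(x_hat, y, 1)[0]

        print(f"naive slope: {naive:.2f}, audit-calibrated slope: {corrected:.2f}")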

  13. A new discrete dipole kernel for quantitative susceptibility mapping.

    PubMed

    Milovic, Carlos; Acosta-Cabronero, Julio; Pinto, José Miguel; Mattern, Hendrik; Andia, Marcelo; Uribe, Sergio; Tejos, Cristian

    2018-09-01

    Most approaches for quantitative susceptibility mapping (QSM) are based on a forward model approximation that employs a continuous Fourier transform operator to solve a differential equation system. Such a formulation, however, is prone to high-frequency aliasing. The aim of this study was to reduce such errors using an alternative dipole kernel formulation based on the discrete Fourier transform and discrete operators. The impact of such an approach on forward model calculation and susceptibility inversion was evaluated in contrast to the continuous formulation both with synthetic phantoms and in vivo MRI data. The discrete kernel demonstrated systematically better fits to analytic field solutions, and showed fewer over-oscillations and aliasing artifacts while preserving low- and medium-frequency responses relative to those obtained with the continuous kernel. In the context of QSM estimation, the use of the proposed discrete kernel resulted in error reduction and increased sharpness. This proof-of-concept study demonstrated that discretizing the dipole kernel is advantageous for QSM. The impact on small or narrow structures such as the venous vasculature might be particularly relevant to high-resolution QSM applications with ultra-high field MRI - a topic for future investigations. The proposed dipole kernel can be incorporated into existing QSM routines in a straightforward manner. Copyright © 2018 Elsevier Inc. All rights reserved.
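
    For context, the continuous forward model referred to above uses the standard Fourier-domain unit dipole kernel; one common way of discretizing it (shown here as an assumption about the general approach, not necessarily the paper's exact formulation) replaces each squared frequency by the Fourier symbol of a discrete second difference:

    ```latex
    % Standard continuous unit dipole kernel in the Fourier domain:
    D(\mathbf{k}) = \frac{1}{3} - \frac{k_z^2}{k_x^2 + k_y^2 + k_z^2}
    % One common discrete analogue replaces each k_i^2 by the Fourier symbol of
    % the discrete second-difference operator (grid size N_i, spacing \Delta_i):
    k_i^2 \;\longrightarrow\; \frac{2 - 2\cos\!\left(2\pi n_i / N_i\right)}{\Delta_i^2},
    \qquad n_i = 0, \dots, N_i - 1
    ```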

  14. Child anger proneness moderates associations between child-mother attachment security and child behavior with mothers at 33 months.

    PubMed

    McElwain, Nancy L; Holland, Ashley S; Engle, Jennifer M; Wong, Maria S

    2012-02-01

    Child-mother attachment security, assessed via a modified Strange Situation procedure (Cassidy & Marvin, with the MacArthur Attachment Working Group, 1992), and parent-reported child proneness to anger were examined as correlates of observed child behavior toward mothers during a series of interactive tasks (N = 120, 60 girls). Controlling for maternal sensitivity, child gender, and expressive language ability, greater attachment security and lower levels of anger proneness were related to more child responsiveness to maternal requests and suggestions during play and snack sessions. As hypothesized, anger proneness also moderated several security-behavior associations. Greater attachment security was related to (a) more committed compliance during clean-up and snack-delay tasks for children high on anger proneness, (b) more self-assertiveness during play and snack for children moderate or high on anger proneness, and (c) more help-seeking during play and snack for children moderate or low on anger proneness. Findings further our understanding of the behavioral correlates of child-mother attachment security assessed during late toddlerhood via the Cassidy-Marvin system and underscore child anger proneness as a moderator of attachment-related differences in child behavior during this developmental period.

  15. A Case-Series Test of the Interactive Two-step Model of Lexical Access: Predicting Word Repetition from Picture Naming

    PubMed Central

    Dell, Gary S.; Martin, Nadine; Schwartz, Myrna F.

    2010-01-01

    Lexical access in language production, and particularly pathologies of lexical access, are often investigated by examining errors in picture naming and word repetition. In this article, we test a computational approach to lexical access, the two-step interactive model, by examining whether the model can quantitatively predict the repetition-error patterns of 65 aphasic subjects from their naming errors. The model’s characterizations of the subjects’ naming errors were taken from the companion paper to this one (Schwartz, Dell, N. Martin, Gahl & Sobel, 2006), and their repetition was predicted from the model on the assumption that naming involves two error-prone steps, word and phonological retrieval, whereas repetition only creates errors in the second of these steps. A version of the model in which lexical-semantic and lexical-phonological connections could be independently lesioned was generally successful in predicting repetition for the aphasics. An analysis of the few cases in which model predictions were inaccurate revealed the role of input phonology in the repetition task. PMID:21085621

  16. Is Disgust Proneness Associated With Anxiety and Related Disorders? A Qualitative Review and Meta-Analysis of Group Comparison and Correlational Studies.

    PubMed

    Olatunji, Bunmi O; Armstrong, Thomas; Elwood, Lisa

    2017-07-01

    Research suggests that disgust may be linked to the etiology of some anxiety-related disorders. The present investigation reviews this literature and employs separate meta-analyses of clinical group comparison and correlational studies to examine the association between disgust proneness and anxiety-related disorder symptoms. Meta-analysis of 43 group comparison studies revealed those high in anxiety disorder symptoms reported significantly more disgust proneness than those low in anxiety symptoms. Although this effect was not moderated by clinical versus analogue studies or type of disorder, larger group differences were observed for those high in anxiety symptoms associated with contagion concerns compared to those high in anxiety symptoms not associated with contagion concerns. Similarly, meta-analysis of correlational data across 83 samples revealed moderate associations between disgust proneness and anxiety-related disorder symptoms. Moderator analysis revealed that the association between disgust proneness and anxiety-related disorder symptoms was especially robust for anxiety symptoms associated with contagion concerns. After controlling for measures of negative affect, disgust proneness continued to be moderately correlated with anxiety-related disorder symptoms. However, negative affect was no longer significantly associated with symptoms of anxiety-related disorders when controlling for disgust proneness. The implications of these findings are discussed in the context of a novel transdiagnostic model.

  17. The association of family functioning and psychosis proneness in five countries that differ in cultural values and family structures.

    PubMed

    Wüsten, Caroline; Lincoln, Tania M

    2017-07-01

    For decades, researchers have attributed the better prognosis of psychosis in developing countries to a host of socio-cultural factors, including family functioning. Nevertheless, it is unknown whether family functioning and its association with symptoms differ across countries. This study assessed family functioning (support, satisfaction with family relations, and criticism) and psychosis proneness in community samples from Chile (n =399), Colombia (n=486), Indonesia (n=115), Germany (n=174) and the USA (n=143). Family functioning was compared between prototypical developing countries (Chile, Columbia, Indonesia) and highly industrialized countries (Germany, USA). Hierarchical regression analysis was used to test for the moderating effect of country on the associations between family functioning and psychosis proneness. Participants from developing countries perceived more support and felt more satisfied. However, they also perceived more criticism than participants from highly industrialized countries. Criticism and family satisfaction were significantly associated with psychosis proneness. Moreover, the relationship between criticism and psychosis proneness was significantly stronger in developing countries compared to highly industrialized countries. Generally, family satisfaction and criticism appear to be more relevant to psychosis proneness than the quantity of family support. Moreover, criticism seems to be more closely related to psychosis proneness in developing countries. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  18. [Risk and risk management in aviation].

    PubMed

    Müller, Manfred

    2004-10-01

    RISK MANAGEMENT: The large proportion of human error in aviation accidents suggested a solution that at first sight seemed brilliant: replace the fallible human being with an "infallible" digitally operating computer. However, even after the introduction of the so-called HITEC airplanes, human error still accounts for 75% of all accidents. Thus, if the computer is ruled out as the ultimate safety system, how else can complex operations involving quick and difficult decisions be controlled? OPTIMIZED TEAM INTERACTION/PARALLEL CONNECTION OF THOUGHT MACHINES: Since a single person is always "highly error-prone", support and control have to be provided by a second person. The independent thinking of two minds creates a safety net that cushions human errors more effectively. NON-PUNITIVE ERROR MANAGEMENT: To be able to tackle the actual problems, the open discussion of errors that have occurred must not be jeopardised by the threat of punishment. It has been shown in the past that progress is primarily achieved by investigating and following up mistakes, failures and catastrophes shortly after they happen. HUMAN FACTOR RESEARCH PROJECT: A comprehensive survey showed the following result: by far the most frequent safety-critical situation (37.8% of all events) consists of the following combination of risk factors: 1. A complication develops. 2. In this situation of increased stress a human error occurs. 3. The negative effects of the error cannot be corrected or eased because there are deficiencies in team interaction on the flight deck. This means, for example, that a negative social climate acts as a "turbocharger" when a human error occurs. It needs to be pointed out that a negative social climate is not the same thing as an open dispute. In many cases the working climate is burdened without the responsible person even noticing it: a first negative impression, too much or too little respect, contempt, misunderstandings, failing to voice vague concerns, etc. can considerably reduce the efficiency of a team.

  19. Recent advances in quantitative analysis of fluid interfaces in multiphase fluid flow measured by synchrotron-based x-ray microtomography

    NASA Astrophysics Data System (ADS)

    Schlueter, S.; Sheppard, A.; Wildenschild, D.

    2013-12-01

    Imaging of fluid interfaces in three-dimensional porous media via x-ray microtomography is an efficient means to test thermodynamically derived predictions on the relationship between capillary pressure, fluid saturation and specific interfacial area (Pc-Sw-Anw) in partially saturated porous media. Various experimental studies exist to date that validate the uniqueness of the Pc-Sw-Anw relationship under static conditions, and with current technological progress direct imaging of moving interfaces under dynamic conditions is also becoming available. Image acquisition and subsequent image processing currently involve many steps, each prone to operator bias, such as merging different scans of the same sample obtained at different beam energies into a single image, or the generation of isosurfaces from the segmented multiphase image on which the interface properties are usually calculated. We demonstrate that with recent advancements in (i) image enhancement methods, (ii) multiphase segmentation methods and (iii) methods of structural analysis we can considerably decrease the time and cost of image acquisition and the uncertainty associated with the measurement of interfacial properties. In particular, we highlight three notorious problems in multiphase image processing and provide efficient solutions for each: (i) Due to noise, partial volume effects, and imbalanced volume fractions, automated histogram-based threshold detection methods frequently fail. However, these impairments can be mitigated with modern denoising methods, special treatment of gray value edges and adaptive histogram equalization, such that most of the standard methods for threshold detection (Otsu, fuzzy c-means, minimum error, maximum entropy) coincide at the same set of values. (ii) Partial volume effects due to blur may produce apparent water films around solid surfaces that alter the specific fluid-fluid interfacial area (Anw) considerably. In a synthetic test image some local segmentation methods like Bayesian Markov random field, converging active contours and watershed segmentation reduced the error in Anw associated with apparent water films from 21% to 6-11%. (iii) The generation of isosurfaces from the segmented data usually requires a lot of postprocessing in order to smooth the surface and check for consistency errors. This can be avoided by calculating specific interfacial areas directly on the segmented voxel image by means of Minkowski functionals, which is highly efficient and less error prone.
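
    Point (iii) can be made concrete with a small sketch: the specific interfacial area is estimated directly on the labeled voxel image by counting faces shared by the two fluid phases, with no isosurface generation. Phase labels and voxel size below are illustrative assumptions; note also that a raw face count overestimates the area of smooth interfaces, which Minkowski-functional estimators correct with configuration weights:

    ```python
    # Estimate specific interfacial area directly on a segmented voxel image by
    # counting voxel faces shared by the two fluid phases (labels assumed here).
    import numpy as np

    def interfacial_area(labels, phase_a=1, phase_b=2, voxel=1.0):
        faces = 0
        for axis in range(3):
            a = np.take(labels, range(labels.shape[axis] - 1), axis=axis)
            b = np.take(labels, range(1, labels.shape[axis]), axis=axis)
            faces += np.sum((a == phase_a) & (b == phase_b))
            faces += np.sum((a == phase_b) & (b == phase_a))
        # Raw face counts overestimate smooth-surface area; Minkowski-functional
        # estimators apply correction weights to the counted configurations.
        return faces * voxel**2

    labels = np.random.default_rng(1).integers(0, 3, size=(50, 50, 50))
    print(interfacial_area(labels, voxel=2.6e-6))  # e.g. 2.6 um voxels
    ```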

  20. Preparing a neuropediatric upper limb exergame rehabilitation system for home-use: a feasibility study.

    PubMed

    Gerber, Corinna N; Kunz, Bettina; van Hedel, Hubertus J A

    2016-03-23

    Home-based, computer-enhanced therapy of hand and arm function can complement conventional interventions and increase the amount and intensity of training, without interfering too much with family routines. The objective of the present study was to investigate the feasibility and usability of the new portable version of the YouGrabber® system (YouRehab AG, Zurich, Switzerland) in the home setting. Fifteen families of children (7 girls, mean age: 11.3y) with neuromotor disorders and affected upper limbs participated. They received instructions and took the system home to train for 2 weeks. After returning it, they answered questions about usability, motivation, and their general opinion of the system (Visual Analogue Scale; 0 indicating worst score, 100 indicating best score; ≤30 not satisfied, 31-69 average, ≥70 satisfied). Furthermore, total pure playtime and number of training sessions were quantified. To document the usability of the system, the number and type of support requests were logged. The usability of the system was rated average to satisfactory (mean 60.1-93.1). The lowest score was given for the occurrence of technical errors. Parents had to motivate their children to start (mean 66.5) and continue (mean 68.5) with the training. In general, however, parents estimated the therapeutic benefit as high (mean 73.1) and the whole system as very good (mean 87.4). Children played on average 7 times during the 2 weeks; total pure playtime was 185 ± 45 min. Especially at the beginning of the trial, systems were very error-prone. Fortunately, we or the company solved most problems before the patients took the systems home. Nevertheless, 10 of 15 families contacted us at least once because of technical problems. Although the YouGrabber® is a promising and highly accepted training tool for home use, it is currently still error-prone, and the requested support exceeds what clinical therapists can provide. A technically more robust system, combined with additional attractive games, would likely result in higher patient motivation and better compliance. This would reduce the need for parents to motivate their children extrinsically and allow for clinical trials to investigate the effectiveness of the system. ClinicalTrials.gov NCT02368223.

  1. Spatial calibration of an optical see-through head mounted display

    PubMed Central

    Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew

    2010-01-01

    We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125
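
    A minimal sketch of the underlying photogrammetric step, assuming correspondences between a planar calibration target and its detections have already been gathered (the .npy file names below are hypothetical stand-ins for the HMD-specific step of collecting them with the in-HMD camera); only the OpenCV calls are real API:

    ```python
    # Standard camera calibration on pre-gathered correspondences.
    import numpy as np
    import cv2

    # Planar calibration-target model points (9x6 grid, unit spacing, z = 0).
    objp = np.zeros((9 * 6, 3), np.float32)
    objp[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2)

    image_points = [np.load(f"view_{i}.npy").astype(np.float32).reshape(-1, 1, 2)
                    for i in range(10)]           # hypothetical detections
    object_points = [objp] * len(image_points)
    image_size = (1280, 1024)                     # assumed display resolution

    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None)
    print("re-projection RMS error:", rms)        # low values = good calibration
    print("intrinsics (focal lengths, optic centre):\n", K)
    ```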

  2. Medication errors reported to the National Medication Error Reporting System in Malaysia: a 4-year retrospective review (2009 to 2012).

    PubMed

    Samsiah, A; Othman, Noordin; Jamshed, Shazia; Hassali, Mohamed Azmi; Wan-Mohaina, W M

    2016-12-01

    Reporting and analysing the data on medication errors (MEs) is important and contributes to a better understanding of the error-prone environment. This study aims to examine the characteristics of errors submitted to the National Medication Error Reporting System (MERS) in Malaysia. A retrospective review of reports received from 1 January 2009 to 31 December 2012 was undertaken. Descriptive statistics method was applied. A total of 17,357 MEs reported were reviewed. The majority of errors were from public-funded hospitals. Near misses were classified in 86.3 % of the errors. The majority of errors (98.1 %) had no harmful effects on the patients. Prescribing contributed to more than three-quarters of the overall errors (76.1 %). Pharmacists detected and reported the majority of errors (92.1 %). Cases of erroneous dosage or strength of medicine (30.75 %) were the leading type of error, whilst cardiovascular (25.4 %) was the most common category of drug found. MERS provides rich information on the characteristics of reported MEs. Low contribution to reporting from healthcare facilities other than government hospitals and non-pharmacists requires further investigation. Thus, a feasible approach to promote MERS among healthcare providers in both public and private sectors needs to be formulated and strengthened. Preventive measures to minimise MEs should be directed to improve prescribing competency among the fallible prescribers identified.

  3. Autobalanced Ramsey Spectroscopy

    NASA Astrophysics Data System (ADS)

    Sanner, Christian; Huntemann, Nils; Lange, Richard; Tamm, Christian; Peik, Ekkehard

    2018-01-01

    We devise a perturbation-immune version of Ramsey's method of separated oscillatory fields. Spectroscopy of an atomic clock transition without compromising the clock's accuracy is accomplished by actively balancing the spectroscopic responses from phase-congruent Ramsey probe cycles of unequal durations. Our simple and universal approach eliminates a wide variety of interrogation-induced line shifts often encountered in high precision spectroscopy, among them, in particular, light shifts, phase chirps, and transient Zeeman shifts. We experimentally demonstrate autobalanced Ramsey spectroscopy on the light-shift-prone 171Yb+ electric octupole optical clock transition and show that interrogation defects are not turned into clock errors. This opens up frequency accuracy perspectives below the 10⁻¹⁸ level for the 171Yb+ system and for other types of optical clocks.

  4. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.

  5. Toward a more sophisticated response representation in theories of medial frontal performance monitoring: The effects of motor similarity and motor asymmetries.

    PubMed

    Hochman, Eldad Yitzhak; Orr, Joseph M; Gehring, William J

    2014-02-01

    Cognitive control in the posterior medial frontal cortex (pMFC) is formulated in models that emphasize adaptive behavior driven by a computation evaluating the degree of difference between 2 conflicting responses. These functions are manifested by an event-related brain potential component coined the error-related negativity (ERN). We hypothesized that the ERN represents a regulative rather than evaluative pMFC process, exerted over the error motor representation, expediting the execution of a corrective response. We manipulated the motor representations of the error and the correct response to varying degrees. The ERN was greater when 1) the error response was more potent than when the correct response was more potent, 2) more errors were committed, 3) fewer and slower corrections were observed, and 4) the error response shared fewer motor features with the correct response. In their current forms, several prominent models of the pMFC cannot be reconciled with these findings. We suggest that a prepotent, unintended error is prone to reach the manual motor processor responsible for response execution before a nonpotent, intended correct response. In this case, the correct response is a correction and its execution must wait until the error is aborted. The ERN may reflect pMFC activity that aimed to suppress the error.

  6. Comprehensive analysis of a medication dosing error related to CPOE.

    PubMed

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  7. Neuro-Oscillatory Mechanisms of Intersensory Selective Attention and Task Switching in School-Aged Children, Adolescents and Young Adults

    ERIC Educational Resources Information Center

    Murphy, Jeremy W.; Foxe, John J.; Molholm, Sophie

    2016-01-01

    The ability to attend to one among multiple sources of information is central to everyday functioning. Just as central is the ability to switch attention among competing inputs as the task at hand changes. Such processes develop surprisingly slowly, such that even into adolescence, we remain slower and more error prone at switching among tasks…

  8. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.
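
    A toy sketch of the monitoring idea: derived clinical events are replayed against a transition system, and any event not permitted from the current state flags an erroneous process instance. States, events, and the example traces are invented for illustration:

    ```python
    # Toy transition-system monitor: flag events invalid in the current state.
    ALLOWED = {
        ("admitted", "order_placed"): "ordered",
        ("ordered", "medication_given"): "medicated",
        ("medicated", "discharged"): "done",
    }

    def monitor(events, state="admitted"):
        for ev in events:  # events as derived from IT-system behaviour
            nxt = ALLOWED.get((state, ev))
            if nxt is None:
                return f"ERROR: event '{ev}' invalid in state '{state}'"
            state = nxt
        return f"ok, final state '{state}'"

    print(monitor(["order_placed", "medication_given", "discharged"]))
    print(monitor(["order_placed", "discharged"]))  # skipped step is flagged
    ```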

  9. "Truth be told" - Semantic memory as the scaffold for veridical communication.

    PubMed

    Hayes, Brett K; Ramanan, Siddharth; Irish, Muireann

    2018-01-01

    Theoretical accounts placing episodic memory as central to constructive and communicative functions neglect the role of semantic memory. We argue that the decontextualized nature of semantic schemas largely supersedes the computational bottleneck and error-prone nature of episodic memory. Rather, neuroimaging and neuropsychological evidence of episodic-semantic interactions suggest that an integrative framework more accurately captures the mechanisms underpinning social communication.

  10. Automated lattice data generation

    NASA Astrophysics Data System (ADS)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
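
    The kind of automation described can be sketched with a generic dependency-ordered task runner; this is an illustration of the concept, not Taxi's actual API:

    ```python
    # Generic dependency-ordered task runner; tasks run once their
    # prerequisites are done (assumes an acyclic dependency graph).
    def run(tasks, deps):
        done = set()
        while len(done) < len(tasks):
            ready = [n for n in tasks
                     if n not in done and deps.get(n, set()) <= done]
            if not ready:
                raise RuntimeError("cyclic or unsatisfiable dependencies")
            for name in ready:
                tasks[name]()     # e.g. submit a batch job and wait
                done.add(name)

    tasks = {
        "generate_ensemble": lambda: print("generating gauge configurations"),
        "measure_plaquette": lambda: print("measuring plaquette"),
        "measure_spectrum":  lambda: print("measuring spectrum"),
    }
    deps = {"measure_plaquette": {"generate_ensemble"},
            "measure_spectrum": {"generate_ensemble"}}
    run(tasks, deps)
    ```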

  11. Increased Perceptual and Conceptual Processing Difficulty Makes the Immeasurable Measurable: Negative Priming in the Absence of Probe Distractors

    ERIC Educational Resources Information Center

    Frings, Christian; Spence, Charles

    2011-01-01

    Negative priming (NP) refers to the finding that people's responses to probe targets previously presented as prime distractors are usually slower and more error prone than to unrepeated stimuli. In a typical NP experiment, each probe target is accompanied by a distractor. It is an accepted, albeit puzzling, finding that the NP effect depends on…

  12. Exploring the Clinical Utility of the Development and Well-Being Assessment (DAWBA) in the Detection of Hyperkinetic Disorders and Associated Diagnoses in Clinical Practice

    ERIC Educational Resources Information Center

    Foreman, David; Morton, Stephanie; Ford, Tamsin

    2009-01-01

    Background: The clinical diagnosis of ADHD is time-consuming and error-prone. Secondary care referral results in long waiting times, but primary care staff may not provide reliable diagnoses. The Development And Well-Being Assessment (DAWBA) is a standardised assessment for common child mental health problems, including attention…

  13. Measuring Diameters Of Large Vessels

    NASA Technical Reports Server (NTRS)

    Currie, James R.; Kissel, Ralph R.; Oliver, Charles E.; Smith, Earnest C.; Redmon, John W., Sr.; Wallace, Charles C.; Swanson, Charles P.

    1990-01-01

    Computerized apparatus produces accurate results quickly. Apparatus measures diameter of tank or other large cylindrical vessel, without prior knowledge of exact location of cylindrical axis. Produces plot of inner circumference, estimate of true center of vessel, data on radius, diameter of best-fit circle, and negative and positive deviations of radius from circle at closely spaced points on circumference. Eliminates need for time-consuming and error-prone manual measurements.

  14. Quantitative Analysis of the Mutagenic Potential of 1-Aminopyrene-DNA Adduct Bypass Catalyzed by Y-Family DNA Polymerases

    PubMed Central

    Sherrer, Shanen M.; Taggart, David J.; Pack, Lindsey R.; Malik, Chanchal K.; Basu, Ashis K.; Suo, Zucai

    2012-01-01

    N-(deoxyguanosin-8-yl)-1-aminopyrene (dGAP) is the predominant nitro polyaromatic hydrocarbon product generated from the air pollutant 1-nitropyrene reacting with DNA. Previous studies have shown that dGAP induces genetic mutations in bacterial and mammalian cells. One potential source of these mutations is the error-prone bypass of dGAP lesions catalyzed by the low-fidelity Y-family DNA polymerases. To provide a comparative analysis of the mutagenic potential of the translesion DNA synthesis (TLS) of dGAP, we employed short oligonucleotide sequencing assays (SOSAs) with the model Y-family DNA polymerase from Sulfolobus solfataricus, DNA Polymerase IV (Dpo4), and the human Y-family DNA polymerases eta (hPolη), kappa (hPolκ), and iota (hPolι). Relative to undamaged DNA, all four enzymes generated far more mutations (base deletions, insertions, and substitutions) with a DNA template containing a site-specifically placed dGAP. Opposite dGAP and at an immediate downstream template position, the most frequent mutations made by the three human enzymes were base deletions and the most frequent base substitutions were dAs for all enzymes. Based on the SOSA data, Dpo4 was the least error-prone Y-family DNA polymerase among the four enzymes during the TLS of dGAP. Among the three human Y-family enzymes, hPolκ made the fewest mutations at all template positions except opposite the lesion site. hPolκ was significantly less error-prone than hPolι and hPolη during the extension of dGAP bypass products. Interestingly, the most frequent mutations created by hPolι at all template positions were base deletions. Although hRev1, the fourth human Y-family enzyme, could not extend dGAP bypass products in our standing start assays, it preferentially incorporated dCTP opposite the bulky lesion. Collectively, these mutagenic profiles suggest that hPolκ and hRev1 are the most suitable human Y-family DNA polymerases to perform TLS of dGAP in humans. PMID:22917544

  15. Experimental investigation of observation error in anuran call surveys

    USGS Publications Warehouse

    McClintock, B.T.; Bailey, L.L.; Pollock, K.H.; Simons, T.R.

    2010-01-01

    Occupancy models that account for imperfect detection are often used to monitor anuran and songbird species occurrence. However, presence-absence data arising from auditory detections may be more prone to observation error (e.g., false-positive detections) than are sampling approaches utilizing physical captures or sightings of individuals. We conducted realistic, replicated field experiments using a remote broadcasting system to simulate simple anuran call surveys and to investigate potential factors affecting observation error in these studies. Distance, time, ambient noise, and observer abilities were the most important factors explaining false-negative detections. Distance and observer ability were the best overall predictors of false-positive errors, but ambient noise and competing species also affected error rates for some species. False-positive errors made up 5% of all positive detections, with individual observers exhibiting false-positive rates between 0.5% and 14%. Previous research suggests false-positive errors of these magnitudes would induce substantial positive biases in standard estimators of species occurrence, and we recommend practices to mitigate for false positives when developing occupancy monitoring protocols that rely on auditory detections. These recommendations include additional observer training, limiting the number of target species, and establishing distance and ambient noise thresholds during surveys. © 2010 The Wildlife Society.
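
    A short simulation makes the bias mechanism explicit: if any detection is taken at face value, per-visit false positives inflate the naive occupancy estimate. Parameter values are illustrative, and false positives at occupied sites are ignored for simplicity:

    ```python
    # Illustrative simulation: false positives bias naive occupancy upward.
    import numpy as np

    rng = np.random.default_rng(42)
    sites, visits = 1000, 3
    psi, p, fp = 0.4, 0.5, 0.05   # occupancy, detection, false-positive rates

    occupied = rng.random(sites) < psi
    # P(at least one detection per site) over the visits.
    det_prob = np.where(occupied, 1 - (1 - p) ** visits, 1 - (1 - fp) ** visits)
    detected = rng.random(sites) < det_prob

    print("true occupancy:", psi)
    print("naive estimate:", detected.mean())  # inflated by false positives
    ```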

  16. Medication Administration Errors in Nursing Homes Using an Automated Medication Dispensing System

    PubMed Central

    van den Bemt, Patricia M.L.A.; Idzinga, Jetske C.; Robertz, Hans; Kormelink, Dennis Groot; Pels, Neske

    2009-01-01

    Objective: To identify the frequency of medication administration errors as well as their potential risk factors in nursing homes using a distribution robot. Design: The study was a prospective, observational study conducted within three nursing homes in the Netherlands caring for 180 individuals. Measurements: Medication errors were measured using the disguised observation technique. Types of medication errors were described. The correlation between several potential risk factors and the occurrence of medication errors was studied to identify potential causes for the errors. Results: In total 2,025 medication administrations to 127 clients were observed. In these administrations 428 errors were observed (21.2%). The most frequently occurring types of errors were use of wrong administration techniques (especially incorrect crushing of medication and not supervising the intake of medication) and wrong time errors (administering the medication at least 1 h early or late). The potential risk factors female gender (odds ratio (OR) 1.39; 95% confidence interval (CI) 1.05–1.83), ATC medication class antibiotics (OR 11.11; 95% CI 2.66–46.50), medication crushed (OR 7.83; 95% CI 5.40–11.36), number of dosages/day/client (OR 1.03; 95% CI 1.01–1.05), nursing home 2 (OR 3.97; 95% CI 2.86–5.50), medication not supplied by distribution robot (OR 2.92; 95% CI 2.04–4.18), time classes “7–10 am” (OR 2.28; 95% CI 1.50–3.47) and “10 am-2 pm” (OR 1.96; 95% CI 1.18–3.27) and day of the week “Wednesday” (OR 1.46; 95% CI 1.03–2.07) are associated with a higher risk of administration errors. Conclusions: Medication administration in nursing homes is prone to many errors. This study indicates that the handling of the medication after removing it from the robot packaging may contribute to this high error frequency, which may be reduced by training of nurse attendants, by automated clinical decision support and by measures to reduce workload. PMID:19390109
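
    For readers unfamiliar with the reported statistics, odds ratios of this kind come from 2x2 tables of error counts by exposure, with a Wald interval on the log scale; the counts below are invented for illustration:

    ```python
    # Odds ratio with a Wald 95% CI from a 2x2 table; counts are invented.
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a,b = errors/no-errors (exposed); c,d = errors/no-errors (unexposed)."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    print(odds_ratio_ci(60, 240, 45, 380))  # e.g. crushed vs. uncrushed medication
    ```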

  17. Mutagenic cost of ribonucleotides in bacterial DNA

    PubMed Central

    Schroeder, Jeremy W.; Randall, Justin R.; Hirst, William G.; O’Donnell, Michael E.; Simmons, Lyle A.

    2017-01-01

    Replicative DNA polymerases misincorporate ribonucleoside triphosphates (rNTPs) into DNA approximately once every 2,000 base pairs synthesized. Ribonucleotide excision repair (RER) removes ribonucleoside monophosphates (rNMPs) from genomic DNA, replacing the error with the appropriate deoxyribonucleoside triphosphate (dNTP). Ribonucleotides represent a major threat to genome integrity with the potential to cause strand breaks. Furthermore, it has been shown in the bacterium Bacillus subtilis that loss of RER increases spontaneous mutagenesis. Despite the high rNTP error rate and the effect on genome integrity, the mechanism underlying mutagenesis in RER-deficient bacterial cells remains unknown. We performed mutation accumulation lines and genome-wide mutational profiling of B. subtilis lacking RNase HII, the enzyme that incises at single rNMP residues initiating RER. We show that loss of RER in B. subtilis causes strand- and sequence-context–dependent GC → AT transitions. Using purified proteins, we show that the replicative polymerase DnaE is mutagenic within the sequence context identified in RER-deficient cells. We also found that DnaE does not perform strand displacement synthesis. Given the use of nucleotide excision repair (NER) as a backup pathway for RER in RNase HII-deficient cells and the known mutagenic profile of DnaE, we propose that misincorporated ribonucleotides are removed by NER followed by error-prone resynthesis with DnaE. PMID:29078353

  18. A Two-Phase Space Resection Model for Accurate Topographic Reconstruction from Lunar Imagery with Pushbroom Scanners

    PubMed Central

    Xu, Xuemiao; Zhang, Huaidong; Han, Guoqiang; Kwan, Kin Chung; Pang, Wai-Man; Fang, Jiaming; Zhao, Gansen

    2016-01-01

    Exterior orientation parameters’ (EOP) estimation using space resection plays an important role in topographic reconstruction for push broom scanners. However, existing models of space resection are highly sensitive to errors in data. Unfortunately, for lunar imagery, the altitude data at the ground control points (GCPs) for space resection are error-prone. Thus, existing models fail to produce reliable EOPs. Motivated by the finding that, for push broom scanners, the angular rotations of the EOPs can be estimated independently of the altitude data, using only the geographic data already provided at the GCPs, we divide the modeling of space resection into two phases. Firstly, we estimate the angular rotations based on the reliable geographic data using our proposed mathematical model. Then, with the accurate angular rotations, the collinear equations for space resection are simplified into a linear problem, and the global optimal solution for the spatial position of EOPs can always be achieved. Moreover, a certainty term is integrated to penalize the unreliable altitude data for increasing the error tolerance. Experimental results show that our model can obtain more accurate EOPs and topographic maps not only for the simulated data, but also for the real data from Chang’E-1, compared to the existing space resection model. PMID:27077855
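
    For reference, the collinear equations mentioned above take the following standard textbook form (notation assumed here, not necessarily the paper's); once the rotation entries are fixed in phase one, cross-multiplication makes them linear in the projection centre:

    ```latex
    % Collinearity equations: image point (x, y), focal length f, ground point
    % (X, Y, Z), projection centre (X_s, Y_s, Z_s), rotation entries r_{ij}.
    x = -f\,\frac{r_{11}(X - X_s) + r_{12}(Y - Y_s) + r_{13}(Z - Z_s)}
                 {r_{31}(X - X_s) + r_{32}(Y - Y_s) + r_{33}(Z - Z_s)}, \qquad
    y = -f\,\frac{r_{21}(X - X_s) + r_{22}(Y - Y_s) + r_{23}(Z - Z_s)}
                 {r_{31}(X - X_s) + r_{32}(Y - Y_s) + r_{33}(Z - Z_s)}
    % With the r_{ij} known, cross-multiplying yields equations linear in
    % (X_s, Y_s, Z_s), so a global optimum can be found directly.
    ```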

  19. A Two-Phase Space Resection Model for Accurate Topographic Reconstruction from Lunar Imagery with Pushbroom Scanners.

    PubMed

    Xu, Xuemiao; Zhang, Huaidong; Han, Guoqiang; Kwan, Kin Chung; Pang, Wai-Man; Fang, Jiaming; Zhao, Gansen

    2016-04-11

    Exterior orientation parameters' (EOP) estimation using space resection plays an important role in topographic reconstruction for push broom scanners. However, existing models of space resection are highly sensitive to errors in data. Unfortunately, for lunar imagery, the altitude data at the ground control points (GCPs) for space resection are error-prone. Thus, existing models fail to produce reliable EOPs. Motivated by the finding that, for push broom scanners, the angular rotations of the EOPs can be estimated independently of the altitude data, using only the geographic data already provided at the GCPs, we divide the modeling of space resection into two phases. Firstly, we estimate the angular rotations based on the reliable geographic data using our proposed mathematical model. Then, with the accurate angular rotations, the collinear equations for space resection are simplified into a linear problem, and the global optimal solution for the spatial position of EOPs can always be achieved. Moreover, a certainty term is integrated to penalize the unreliable altitude data for increasing the error tolerance. Experimental results show that our model can obtain more accurate EOPs and topographic maps not only for the simulated data, but also for the real data from Chang'E-1, compared to the existing space resection model.

  20. Safe prescribing: a titanic challenge.

    PubMed

    Routledge, Philip A

    2012-10-01

    The challenge to achieve safe prescribing merits the adjective 'titanic'. The organisational and human errors leading to poor prescribing (e.g. underprescribing, overprescribing, misprescribing or medication errors) have parallels in the organisational and human errors that led to the loss of the Titanic 100 years ago this year. Prescribing can be adversely affected by communication failures, critical conditions, complacency, corner cutting, callowness and a lack of courage of conviction, all of which were also factors leading to the Titanic tragedy. These issues need to be addressed by a commitment to excellence, the final component of the 'Seven C's'. Optimal prescribing is dependent upon close communication and collaborative working between highly trained health professionals, whose role is to ensure maximum clinical effectiveness, whilst also protecting their patients from avoidable harm. Since humans are prone to error, and the environments in which they work are imperfect, it is not surprising that medication errors are common, occurring more often during the prescribing stage than during dispensing or administration. A commitment to excellence in prescribing includes a continued focus on lifelong learning (including interprofessional learning) in pharmacology and therapeutics. This should be accompanied by improvements in the clinical working environment of prescribers, and the encouragement of a strong safety culture (including reporting of adverse incidents as well as suspected adverse drug reactions whenever appropriate). Finally, members of the clinical team must be prepared to challenge each other, when necessary, to ensure that prescribing combines the highest likelihood of benefit with the lowest potential for harm. © 2012 The Author. British Journal of Clinical Pharmacology © 2012 The British Pharmacological Society.

  1. Compromised encoding of proprioceptively determined joint angles in older adults: the role of working memory and attentional load.

    PubMed

    Goble, Daniel J; Mousigian, Marianne A; Brown, Susan H

    2012-01-01

    Perceiving the positions and movements of one's body segments (i.e., proprioception) is critical for movement control. However, this ability declines with older age, as has been demonstrated by joint angle matching paradigms in the absence of vision. The aim of the present study was to explore the extent to which reduced working memory and attentional load influence older adult proprioceptive matching performance. Older adults with relatively HIGH versus LOW working memory ability, as determined by backward digit span, and healthy younger adults performed memory-based elbow position matching with and without attentional load (i.e., counting by 3s) during target position encoding. Even without attentional load, older adults with LOW digit spans (i.e., 4 digits or less) had larger matching errors than younger adults. Further, LOW older adults made significantly greater errors when attentional loads were present during proprioceptive target encoding as compared to both younger and older adults with HIGH digit span scores (i.e., 5 digits or greater). These results extend previous position matching results that suggested greater errors in older adults were due to degraded input signals from peripheral mechanoreceptors. Specifically, the present work highlights the role cognitive factors play in the assessment of older adult proprioceptive acuity using memory-based matching paradigms. Older adults with LOW working memory appear prone to compromised proprioceptive encoding, especially when secondary cognitive tasks must be concurrently executed. This may ultimately result in poorer performance on various activities of daily living.

  2. DNA Repair Mechanisms and the Bypass of DNA Damage in Saccharomyces cerevisiae

    PubMed Central

    Boiteux, Serge; Jinks-Robertson, Sue

    2013-01-01

    DNA repair mechanisms are critical for maintaining the integrity of genomic DNA, and their loss is associated with cancer predisposition syndromes. Studies in Saccharomyces cerevisiae have played a central role in elucidating the highly conserved mechanisms that promote eukaryotic genome stability. This review will focus on repair mechanisms that involve excision of a single strand from duplex DNA with the intact, complementary strand serving as a template to fill the resulting gap. These mechanisms are of two general types: those that remove damage from DNA and those that repair errors made during DNA synthesis. The major DNA-damage repair pathways are base excision repair and nucleotide excision repair, which, in the most simple terms, are distinguished by the extent of single-strand DNA removed together with the lesion. Mistakes made by DNA polymerases are corrected by the mismatch repair pathway, which also corrects mismatches generated when single strands of non-identical duplexes are exchanged during homologous recombination. In addition to the true repair pathways, the postreplication repair pathway allows lesions or structural aberrations that block replicative DNA polymerases to be tolerated. There are two bypass mechanisms: an error-free mechanism that involves a switch to an undamaged template for synthesis past the lesion and an error-prone mechanism that utilizes specialized translesion synthesis DNA polymerases to directly synthesize DNA across the lesion. A high level of functional redundancy exists among the pathways that deal with lesions, which minimizes the detrimental effects of endogenous and exogenous DNA damage. PMID:23547164

  3. Environmental Health Risk Assesement in Flood-prone Area in Tamangapa Sub-District Makassar

    NASA Astrophysics Data System (ADS)

    Haris, Ibrahim Abdul; Basir, Basir

    2018-05-01

    Environmental health in Indonesia is still a cause for concern; poor sanitation in Indonesia is characterized by the high incidence of infectious diseases in society. People in flood-prone areas face a high risk of exposure to environmentally mediated disease because they live in disaster-prone areas. This research aimed to describe the condition of sanitary facilities and risky health behaviour in flood-prone areas in Manggala District, particularly in Tamangapa sub-district of Makassar. The research uses an observational method with a descriptive approach. The data are processed using SPSS and Arc View GIS applications. The environmental risk category is determined by the approach of Environmental Health Risk Assessment (EHRA). The results showed that the flood-prone area in RT 04 RW 06 fell into the very high-risk category, with a score of 229 on an environmental health risk index ranging from 212-229. Meanwhile, RT 04 RW 05 was in the low-risk category, with a score of 155 on an index of 155-173. Environmental health hazards identified in the flood-prone areas of Tamangapa sub-district include domestic sources of clean water, domestic wastewater, and household garbage.

  4. Evolution of gossip-based indirect reciprocity on a bipartite network

    PubMed Central

    Giardini, Francesca; Vilone, Daniele

    2016-01-01

    Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission. PMID:27885256

  5. Evolution of gossip-based indirect reciprocity on a bipartite network.

    PubMed

    Giardini, Francesca; Vilone, Daniele

    2016-11-25

    Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission.

  6. Evolution of gossip-based indirect reciprocity on a bipartite network

    NASA Astrophysics Data System (ADS)

    Giardini, Francesca; Vilone, Daniele

    2016-11-01

    Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission.

  7. Mutagenesis during plant responses to UVB radiation.

    PubMed

    Holá, M; Vágnerová, R; Angelis, K J

    2015-08-01

    We tested the idea that induced mutagenesis due to unrepaired DNA lesions, here the UV photoproducts, underlies the impact of UVB irradiation on plant phenotype. For this purpose we used protonemal culture of the moss Physcomitrella patens with 50% of apical cells, which mimics actively growing tissue, the most vulnerable stage for the induction of mutations. We measured the UVB mutation rate of various moss lines with defects in DNA repair (pplig4, ppku70, pprad50, ppmre11), and in selected clones resistant to 2-Fluoroadenine, which were mutated in the adenosine phosphotransferase gene (APT), we analysed induced mutations by sequencing. In parallel we followed DNA break repair and removal of cyclobutane pyrimidine dimers, with a half-life τ = 4 h 14 min, determined by comet assay combined with UV dimer specific T4 endonuclease V. We show that UVB induces massive, sequence specific, error-prone bypass repair that is responsible for a high mutation rate owing to relatively slow, though error-free, removal of photoproducts by nucleotide excision repair (NER). Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  8. ADEPT, a dynamic next generation sequencing data error-detection program with trimming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Shihai; Lo, Chien-Chi; Li, Po-E

    Illumina is the most widely used next generation sequencing technology and produces millions of short reads that contain errors. These sequencing errors constitute a major problem in applications such as de novo genome assembly, metagenomics analysis and single nucleotide polymorphism discovery. In this study, we present ADEPT, a dynamic error detection method that uses the quality scores of each nucleotide and its neighboring nucleotides, together with their positions within the read, and compares these to the position-specific quality score distribution of all bases within the sequencing run. This method greatly improves upon other available methods in terms of the true positive rate of error discovery without affecting the false positive rate, particularly within the middle of reads. We conclude that ADEPT is the only tool to date that dynamically assesses errors within reads by comparing position-specific and neighboring base quality scores with the distribution of quality scores for the dataset being analyzed. The result is a method that is less prone to position-dependent under-prediction, which is one of the most prominent issues in error prediction. The outcome is that ADEPT improves upon prior efforts in identifying true errors, primarily within the middle of reads, while reducing the false positive rate.
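
    A generic sketch in the spirit of the described method: build position-specific quality distributions over the whole run and flag bases whose quality is unusually low for their cycle. The threshold and data layout are assumptions, not ADEPT's implementation:

    ```python
    # Position-aware quality flagging; threshold and layout are assumptions.
    import numpy as np

    def flag_suspect_bases(quals, n_sigma=2.0):
        """quals: (reads x cycles) Phred scores; True = unusually low quality."""
        mu = quals.mean(axis=0)          # position-specific mean quality
        sd = quals.std(axis=0) + 1e-9    # position-specific spread
        return (quals - mu) / sd < -n_sigma

    quals = np.random.default_rng(3).normal(34, 4, size=(10000, 100))
    print("fraction flagged:", flag_suspect_bases(quals).mean())
    ```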

  9. ADEPT, a dynamic next generation sequencing data error-detection program with trimming

    DOE PAGES

    Feng, Shihai; Lo, Chien-Chi; Li, Po-E; ...

    2016-02-29

    Illumina is the most widely used next generation sequencing technology and produces millions of short reads that contain errors. These sequencing errors constitute a major problem in applications such as de novo genome assembly, metagenomics analysis and single nucleotide polymorphism discovery. In this study, we present ADEPT, a dynamic error detection method that uses the quality scores of each nucleotide and its neighboring nucleotides, together with their positions within the read, and compares these to the position-specific quality score distribution of all bases within the sequencing run. This method greatly improves upon other available methods in terms of the true positive rate of error discovery without affecting the false positive rate, particularly within the middle of reads. We conclude that ADEPT is the only tool to date that dynamically assesses errors within reads by comparing position-specific and neighboring base quality scores with the distribution of quality scores for the dataset being analyzed. The result is a method that is less prone to position-dependent under-prediction, which is one of the most prominent issues in error prediction. The outcome is that ADEPT improves upon prior efforts in identifying true errors, primarily within the middle of reads, while reducing the false positive rate.

  10. From MIMO-OFDM Algorithms to a Real-Time Wireless Prototype: A Systematic Matlab-to-Hardware Design Flow

    NASA Astrophysics Data System (ADS)

    Weijers, Jan-Willem; Derudder, Veerle; Janssens, Sven; Petré, Frederik; Bourdoux, André

    2006-12-01

    To assess the performance of forthcoming 4th generation wireless local area networks, the algorithmic functionality is usually modelled using a high-level mathematical software package, for instance, Matlab. In order to validate the modelling assumptions against the real physical world, the high-level functional model needs to be translated into a prototype. A systematic system design methodology proves very valuable, since it avoids, or at least reduces, numerous design iterations. In this paper, we propose a novel Matlab-to-hardware design flow, which makes it possible to map the algorithmic functionality onto the target prototyping platform in a systematic and reproducible way. The proposed design flow is partly manual and partly tool-assisted. It is shown that the proposed design flow allows the same testbench to be used throughout the whole design flow and avoids time-consuming and error-prone intermediate translation steps.

  11. Spatial Distortion in MRI-Guided Stereotactic Procedures: Evaluation in 1.5-, 3- and 7-Tesla MRI Scanners.

    PubMed

    Neumann, Jan-Oliver; Giese, Henrik; Biller, Armin; Nagel, Armin M; Kiening, Karl

    2015-01-01

    Magnetic resonance imaging (MRI) is replacing computed tomography (CT) as the main imaging modality for stereotactic transformations. MRI is prone to spatial distortion artifacts, which can lead to inaccuracy in stereotactic procedures. Modern MRI systems provide distortion correction algorithms that may ameliorate this problem. This study investigates the different options of distortion correction using standard 1.5-, 3- and 7-tesla MRI scanners. A phantom was mounted on a stereotactic frame. One CT scan and three MRI scans were performed. At all three field strengths, two 3-dimensional sequences, volumetric interpolated breath-hold examination (VIBE) and magnetization-prepared rapid acquisition with gradient echo, were acquired, and automatic distortion correction was performed. Global stereotactic transformation of all 13 datasets was performed and two stereotactic planning workflows (MRI only vs. CT/MR image fusion) were subsequently analysed. Distortion correction on the 1.5- and 3-tesla scanners caused a considerable reduction in positional error. The effect was more pronounced when using the VIBE sequences. By using co-registration (CT/MR image fusion), even a lower positional error could be obtained. In ultra-high-field (7 T) MR imaging, distortion correction introduced even higher errors. However, the accuracy of non-corrected 7-tesla sequences was comparable to CT/MR image fusion 3-tesla imaging. MRI distortion correction algorithms can reduce positional errors by up to 60%. For stereotactic applications of utmost precision, we recommend a co-registration to an additional CT dataset. © 2015 S. Karger AG, Basel.

  12. Feedback-tuned, noise resilient gates for encoded spin qubits

    NASA Astrophysics Data System (ADS)

    Bluhm, Hendrik

    Spin 1/2 particles form native two level systems and thus lend themselves as a natural qubit implementation. However, encoding a single qubit in several spins entails benefits, such as reducing the resources necessary for qubit control and protection from certain decoherence channels. While several varieties of such encoded spin qubits have been implemented, accurate control remains challenging, and leakage out of the subspace of valid qubit states is a potential issue. Optimal performance typically requires large pulse amplitudes for fast control, which is prone to systematic errors and prohibits standard control approaches based on Rabi flopping. Furthermore, the exchange interaction typically used to electrically manipulate encoded spin qubits is inherently sensitive to charge noise. I will discuss all-electrical, high-fidelity single qubit operations for a spin qubit encoded in two electrons in a GaAs double quantum dot. Starting from a set of numerically optimized control pulses, we employ an iterative tuning procedure based on measured error syndromes to remove systematic errors. Randomized benchmarking yields an average gate fidelity exceeding 98 % and a leakage rate into invalid states of 0.2 %. These gates exhibit a certain degree of resilience to both slow charge and nuclear spin fluctuations due to dynamical correction analogous to a spin echo. Furthermore, the numerical optimization minimizes the impact of fast charge noise. Both types of noise make relevant contributions to gate errors. The general approach is also adaptable to other qubit encodings and exchange based two-qubit gates.
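
    As an aside on how such numbers are extracted, randomized benchmarking fits the sequence fidelity to an exponential decay and converts the decay constant into an average error per gate; the sketch below uses synthetic data and the standard single-qubit relation:

    ```python
    # Fit F(m) = A p^m + B to randomized-benchmarking data (synthetic here) and
    # convert the decay p to average error per gate, r = (1-p)(d-1)/d with d = 2.
    import numpy as np
    from scipy.optimize import curve_fit

    def rb_decay(m, A, p, B):
        return A * p**m + B

    m = np.arange(1, 200, 10)
    F = rb_decay(m, 0.5, 0.98, 0.5) \
        + np.random.default_rng(7).normal(0, 0.005, m.size)

    (A, p, B), _ = curve_fit(rb_decay, m, F, p0=[0.5, 0.95, 0.5])
    r = (1 - p) * (2 - 1) / 2
    print(f"decay p = {p:.4f}, average gate fidelity ~ {1 - r:.4f}")
    ```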

  13. Dietary Assessment in Food Environment Research

    PubMed Central

    Kirkpatrick, Sharon I.; Reedy, Jill; Butler, Eboneé N.; Dodd, Kevin W.; Subar, Amy F.; Thompson, Frances E.; McKinnon, Robin A.

    2015-01-01

    Context: The existing evidence on food environments and diet is inconsistent, potentially due in part to heterogeneity in measures used to assess diet. The objective of this review, conducted in 2012–2013, was to examine measures of dietary intake utilized in food environment research. Evidence acquisition: Included studies were published from January 2007 through June 2012 and assessed relationships between at least one food environment exposure and at least one dietary outcome. Fifty-one articles were identified using PubMed, Scopus, Web of Knowledge, and PsycINFO; references listed in the papers reviewed and relevant review articles; and the National Cancer Institute's Measures of the Food Environment website. The frequency of the use of dietary intake measures and assessment of specific dietary outcomes was examined, as were patterns of results among studies using different dietary measures. Evidence synthesis: The majority of studies used brief instruments, such as screeners or one or two questions, to assess intake. Food frequency questionnaires were used in about a third of studies, one in ten used 24-hour recalls, and fewer than one in twenty used diaries. Little consideration of dietary measurement error was evident. Associations between the food environment and diet were more consistently in the expected direction in studies using less error-prone measures. Conclusions: There is a tendency toward the use of brief dietary assessment instruments with low cost and burden rather than more detailed instruments that capture intake with less bias. Use of error-prone dietary measures may lead to spurious findings and reduced power to detect associations. PMID:24355678

  14. Clinical errors that can occur in the treatment decision-making process in psychotherapy.

    PubMed

    Park, Jake; Goode, Jonathan; Tompkins, Kelley A; Swift, Joshua K

    2016-09-01

    Clinical errors occur in the psychotherapy decision-making process whenever a less-than-optimal treatment or approach is chosen when working with clients. A less-than-optimal approach may be one that a client is unwilling to try or fully invest in based on his/her expectations and preferences, or one that may have little chance of success based on contraindications and/or limited research support. The "doctor knows best" and the "independent choice" models are two decision-making models that are frequently used within psychology, but both are associated with an increased likelihood of errors in the treatment decision-making process. In particular, these models fail to integrate all three components of the definition of evidence-based practice in psychology (American Psychological Association, 2006). In this article we describe both models and provide examples of clinical errors that can occur in each. We then introduce the shared decision-making model as an alternative that is less prone to clinical errors. PsycINFO Database Record (c) 2016 APA, all rights reserved

  15. Cocaine Dependence Treatment Data: Methods for Measurement Error Problems With Predictors Derived From Stationary Stochastic Processes

    PubMed Central

    Guan, Yongtao; Li, Yehua; Sinha, Rajita

    2011-01-01

    In a cocaine dependence treatment study, we use linear and nonlinear regression models to model posttreatment cocaine craving scores and first cocaine relapse time. A subset of the covariates are summary statistics derived from baseline daily cocaine use trajectories, such as baseline cocaine use frequency and average daily use amount. These summary statistics are subject to estimation error and can therefore cause biased estimators for the regression coefficients. Unlike classical measurement error problems, the error we encounter here is heteroscedastic with an unknown distribution, and there are no replicates for the error-prone variables or instrumental variables. We propose two robust methods to correct for the bias: a computationally efficient method-of-moments-based method for linear regression models and a subsampling extrapolation method that is generally applicable to both linear and nonlinear regression models. Simulations and an application to the cocaine dependence treatment data are used to illustrate the efficacy of the proposed methods. Asymptotic theory and variance estimation for the proposed subsampling extrapolation method and some additional simulation results are described in the online supplementary material. PMID:21984854
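
    For readers unfamiliar with the bias in question, the classical method-of-moments correction for a single error-prone covariate illustrates the core idea. This is a simplified sketch with homoscedastic noise of known variance, not the paper's estimators, which handle heteroscedastic errors of unknown distribution without replicates:

        # Illustrative sketch (not the paper's method): classical method-of-moments
        # correction for attenuation bias in simple linear regression when the
        # covariate is observed with additive noise of known variance sigma_u**2.
        import numpy as np

        rng = np.random.default_rng(0)
        n, beta, sigma_u = 5000, 2.0, 0.8

        x = rng.normal(size=n)                     # true covariate (unobserved)
        w = x + rng.normal(scale=sigma_u, size=n)  # error-prone observed covariate
        y = beta * x + rng.normal(size=n)

        beta_naive = np.cov(w, y)[0, 1] / np.var(w, ddof=1)
        # Correct the denominator: Var(x) = Var(w) - sigma_u**2.
        beta_corrected = np.cov(w, y)[0, 1] / (np.var(w, ddof=1) - sigma_u**2)

        print(f"naive: {beta_naive:.3f}, corrected: {beta_corrected:.3f}, true: {beta}")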

  16. Variations of Human DNA Polymerase Genes as Biomarkers of Prostate Cancer Progression

    DTIC Science & Technology

    2013-07-01

    ...discovery, cancer genetics ... variations identified (including all single and double mutant combinations of the Triple mutant), and some POLK mutants • Discovery of a novel ... Athens, Greece, 07/10. Makridakis N. Error-prone polymerase mutations and prostate cancer progression, COBRE/Cancer Genetics group seminar, Tulane

  17. The expanding polymerase universe.

    PubMed

    Goodman, M F; Tippin, B

    2000-11-01

    Over the past year, the number of known prokaryotic and eukaryotic DNA polymerases has exploded. Many of these newly discovered enzymes copy aberrant bases in the DNA template over which 'respectable' polymerases fear to tread. The next step is to unravel their functions, which are thought to range from error-prone copying of DNA lesions, somatic hypermutation and avoidance of skin cancer, to restarting stalled replication forks and repairing double-stranded DNA breaks.

  18. Error rates and resource overheads of encoded three-qubit gates

    NASA Astrophysics Data System (ADS)

    Takagi, Ryuji; Yoder, Theodore J.; Chuang, Isaac L.

    2017-10-01

    A non-Clifford gate is required for universal quantum computation, and, typically, this is the most error-prone and resource-intensive logical operation on an error-correcting code. Small, single-qubit rotations are popular choices for this non-Clifford gate, but certain three-qubit gates, such as Toffoli or controlled-controlled-Z (ccz), are equivalent options that are also more suited for implementing some quantum algorithms, for instance, those with coherent classical subroutines. Here, we calculate error rates and resource overheads for implementing logical ccz with pieceable fault tolerance, a nontransversal method for implementing logical gates. We provide a comparison with a nonlocal magic-state scheme on a concatenated code and a local magic-state scheme on the surface code. We find the pieceable fault-tolerance scheme particularly advantaged over magic states on concatenated codes and in certain regimes over magic states on the surface code. Our results suggest that pieceable fault tolerance is a promising candidate for fault tolerance in a near-future quantum computer.
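
    As a small aside on the equivalence noted above, Toffoli and ccz differ only by Hadamard conjugation of the target qubit, which the following self-contained numerical check confirms (the qubit-ordering convention is a choice of this sketch, not taken from the paper):

        # Quick numerical check: Toffoli = (I x I x H) . CCZ . (I x I x H),
        # so either gate can serve as the non-Clifford resource.
        # Convention here: the third tensor factor is the target qubit.
        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        I2 = np.eye(2)
        CCZ = np.diag([1., 1, 1, 1, 1, 1, 1, -1])   # flips the sign of |111>

        IIH = np.kron(np.kron(I2, I2), H)
        toffoli = IIH @ CCZ @ IIH                   # conjugate target by Hadamard

        expected = np.eye(8)
        expected[[6, 7]] = expected[[7, 6]]         # Toffoli swaps |110> and |111>
        print(np.allclose(toffoli, expected))       # True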

  19. The Sizing and Optimization Language (SOL): A computer language to improve the user/optimizer interface

    NASA Technical Reports Server (NTRS)

    Lucas, S. H.; Scotti, S. J.

    1989-01-01

    The nonlinear mathematical programming method (formal optimization) has had many applications in engineering design. A figure illustrates the use of optimization techniques in the design process. The design process begins with the design problem, such as the classic example of the two-bar truss designed for minimum weight as seen in the leftmost part of the figure. If formal optimization is to be applied, the design problem must be recast in the form of an optimization problem consisting of an objective function, design variables, and constraint function relations. The middle part of the figure shows the two-bar truss design posed as an optimization problem. The total truss weight is the objective function, the tube diameter and truss height are design variables, with stress and Euler buckling considered as constraint function relations. Lastly, the designer develops or obtains analysis software containing a mathematical model of the object being optimized, and then interfaces the analysis routine with existing optimization software such as CONMIN, ADS, or NPSOL. This final stage of software development can be both tedious and error-prone. The Sizing and Optimization Language (SOL), a special-purpose computer language whose goal is to make the software implementation phase of optimum design easier and less error-prone, is presented.
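
    To make the recast problem concrete, here is a hedged sketch of the two-bar truss example in modern terms (SciPy rather than CONMIN/ADS/NPSOL; the load, material constants, and bounds are placeholder values, not figures from the paper):

        # Sketch of the two-bar truss sizing problem: minimize weight over tube
        # diameter d and truss height h, subject to stress and Euler buckling
        # constraints. All constants are illustrative assumptions.
        import numpy as np
        from scipy.optimize import minimize

        P, B, t = 33e3, 30.0, 0.1           # load, half-span, tube wall thickness
        rho, E, s_allow = 0.3, 3.0e7, 1e5   # density, Young's modulus, allowable stress

        def weight(v):
            d, h = v
            L = np.sqrt(B**2 + h**2)        # member length
            return 2 * rho * np.pi * d * t * L

        def constraints(v):
            d, h = v
            L = np.sqrt(B**2 + h**2)
            stress = P * L / (2 * h * np.pi * d * t)            # axial stress per tube
            buckle = np.pi**2 * E * (d**2 + t**2) / (8 * L**2)  # Euler buckling limit
            return [s_allow - stress, buckle - stress]          # both must be >= 0

        res = minimize(weight, x0=[2.0, 30.0],
                       constraints={"type": "ineq", "fun": constraints},
                       bounds=[(0.1, 10.0), (5.0, 100.0)])
        print("optimal (diameter, height):", res.x, "weight:", res.fun)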

  20. CTF Preprocessor User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avramova, Maria; Salko, Robert K.

    2016-05-26

    This document describes how a user should go about using the CTF pre-processor tool to create an input deck for modeling rod-bundle geometry in CTF. The tool was designed to generate input decks in a quick and less error-prone manner for CTF. The pre-processor is a completely independent utility, written in Fortran, that takes a reduced amount of input from the user. The information that the user must supply is basic information on bundle geometry, such as rod pitch, clad thickness, and axial location of spacer grids; the pre-processor takes this basic information and determines channel placement and connection information to be written to the input deck, which is the most time-consuming and error-prone segment of creating a deck. Creation of the model is also more intuitive, as the user can specify assembly and water-tube placement using visual maps instead of having to place them by determining channel/channel and rod/channel connections. As an example of the benefit of the pre-processor, a quarter-core model that contains 500,000 scalar-mesh cells was read into CTF from an input deck containing 200,000 lines of data. This 200,000 line input deck was produced automatically from a set of pre-processor decks that contained only 300 lines of data.
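
    The channel-placement bookkeeping the pre-processor automates can be illustrated with a toy example. The sketch below enumerates coolant subchannels and rod/channel connections for a small square lattice; it is purely hypothetical and does not reproduce CTF's actual numbering scheme or input format:

        # Hypothetical illustration of subchannel bookkeeping for an N x N square
        # rod lattice: each subchannel sits in a gap of the rod grid and touches
        # up to four rods. Not CTF's real data model.
        n_rods, pitch = 3, 1.26  # 3x3 bundle, pitch in cm (placeholder values)

        channels = {}
        for i in range(n_rods + 1):            # channel grid is (N+1) x (N+1)
            for j in range(n_rods + 1):
                cid = i * (n_rods + 1) + j + 1
                # Rods at the four corners of this subchannel, if they exist.
                rods = [(r, c) for r in (i - 1, i) for c in (j - 1, j)
                        if 0 <= r < n_rods and 0 <= c < n_rods]
                channels[cid] = {"center": (j * pitch, i * pitch),
                                 "rods": [r * n_rods + c + 1 for r, c in rods]}

        for cid, ch in list(channels.items())[:4]:
            print(cid, ch)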

  1. A Novel Way to Relate Ontology Classes

    PubMed Central

    Choksi, Ami T.; Jinwala, Devesh C.

    2015-01-01

    The existing ontologies in the semantic web typically have anonymous union and intersection classes. The anonymous classes are limited in scope and may not be part of the whole inference process. The tools, namely Pellet, Jena, and Protégé, interpret collection classes as (a) equivalent/subclasses of a union class and (b) superclasses of an intersection class. As a result, there is a possibility that the tools will produce error-prone inference results for relations, namely, sub-, union, intersection, and equivalent relations, and those dependent on these relations, namely, complement. Verifying whether one class is the complement of another involves the use of sub- and equivalent relations. Motivated by the same, we (i) refine the test data set of the conference ontology by adding named, union, and intersection classes and (ii) propose a match algorithm to (a) calculate corrected subclasses lists, (b) correctly relate intersection and union classes with their collection classes, and (c) match union, intersection, sub-, complement, and equivalent classes in a proper sequence, to avoid error-prone match results. We compare the results of our algorithms with those of a candidate reasoner, namely, the Pellet reasoner. To the best of our knowledge, ours is a unique attempt in establishing a novel way to relate ontology classes. PMID:25984560

  2. RD Optimized, Adaptive, Error-Resilient Transmission of MJPEG2000-Coded Video over Multiple Time-Varying Channels

    NASA Astrophysics Data System (ADS)

    Bezan, Scott; Shirani, Shahram

    2006-12-01

    To reliably transmit video over error-prone channels, the data should be both source and channel coded. When multiple channels are available for transmission, the problem extends to that of partitioning the data across these channels. The condition of transmission channels, however, varies with time. Therefore, the error protection added to the data at one instant of time may not be optimal at the next. In this paper, we propose a method for adaptively adding error correction code in a rate-distortion (RD) optimized manner using rate-compatible punctured convolutional codes to an MJPEG2000 constant rate-coded frame of video. We perform an analysis on the rate-distortion tradeoff of each of the coding units (tiles and packets) in each frame and adapt the error correction code assigned to the unit taking into account the bandwidth and error characteristics of the channels. This method is applied to both single and multiple time-varying channel environments. We compare our method with a basic protection method in which data is either not transmitted, transmitted with no protection, or transmitted with a fixed amount of protection. Simulation results show promising performance for our proposed method.

  3. Development of a Stereovision-Based Technique to Measure the Spread Patterns of Granular Fertilizer Spreaders

    PubMed Central

    Cool, Simon R.; Pieters, Jan G.; Seatovic, Dejan; Mertens, Koen C.; Nuyttens, David; Van De Gucht, Tim C.; Vangeyte, Jürgen

    2017-01-01

    Centrifugal fertilizer spreaders are by far the most commonly used granular fertilizer spreader type in Europe. Their spread pattern, however, is error-prone, potentially leading to an undesired distribution of particles in the field and losses out of the field, which is often caused by poor calibration of the spreader for the specific fertilizer used. Due to the large environmental impact of fertilizer use, it is important to optimize the spreading process and minimize these errors. Spreader calibrations can be performed by using collection trays to determine the (field) spread pattern, but this is very time-consuming and expensive for the farmer and hence not common practice. Therefore, we developed an innovative multi-camera system to predict the spread pattern in a fast and accurate way, independent of the spreader configuration. Using high-speed stereovision, ejection parameters of particles leaving the spreader vanes were determined relative to a coordinate system associated with the spreader. The landing positions and subsequent spread patterns were determined using a ballistic model incorporating the effect of tractor motion and wind. Experiments were conducted with a commercial spreader and showed a high repeatability. The results were transformed to one spatial dimension to enable comparison with transverse spread patterns determined in the field and showed similar results. PMID:28617339
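
    A minimal version of the ballistic step described above: integrate projectile motion with quadratic drag from the measured ejection state to a landing position. The drag constant, time step, and release state are invented for illustration, and tractor motion and wind are omitted:

        # Toy ballistic landing model: given ejection position p0 (m) and
        # velocity v0 (m/s), step projectile motion with quadratic air drag
        # until the particle reaches the ground. Constants are placeholders.
        import numpy as np

        def landing_position(p0, v0, k=0.08, g=9.81, dt=1e-3):
            # k lumps drag per unit mass: acceleration = -k * |v| * v, plus gravity.
            p, v = np.array(p0, float), np.array(v0, float)
            while p[2] > 0:                    # integrate until the particle lands
                a = np.array([0.0, 0.0, -g]) - k * np.linalg.norm(v) * v
                v += a * dt
                p += v * dt
            return p[:2]                       # (x, y) landing point on the ground

        print(landing_position(p0=[0, 0, 0.9], v0=[20.0, 5.0, 2.0]))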

  4. Digital Droplet PCR: CNV Analysis and Other Applications.

    PubMed

    Mazaika, Erica; Homsy, Jason

    2014-07-14

    Digital droplet PCR (ddPCR) is an assay that combines state-of-the-art microfluidics technology with TaqMan-based PCR to achieve precise target DNA quantification at high levels of sensitivity and specificity. Because quantification is achieved without the need for standard curves, in an easy-to-interpret, unambiguous digital readout, ddPCR is far simpler, faster, and less error-prone than real-time qPCR. The basic protocol can be modified with minor adjustments to suit a wide range of applications, such as CNV analysis, rare variant detection, SNP genotyping, and transcript quantification. This unit describes the ddPCR workflow in detail for the Bio-Rad QX100 system, but the theory and data interpretation are generalizable to any ddPCR system. Copyright © 2014 John Wiley & Sons, Inc.
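
    The "digital readout" rests on Poisson statistics: the target concentration follows from the fraction of negative droplets. A minimal sketch of that calculation (not Bio-Rad's software; the droplet volume is an assumed nominal value):

        # Sketch of the Poisson correction underlying ddPCR quantification:
        # lambda = -ln(fraction of negative droplets) gives mean copies per
        # droplet; dividing by droplet volume gives concentration.
        import math

        def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
            # droplet_volume_ul is an assumed nominal droplet volume (~0.85 nL).
            lam = -math.log((n_total - n_positive) / n_total)
            return lam / droplet_volume_ul  # copies per microliter

        print(ddpcr_concentration(n_positive=4500, n_total=15000))  # ~420 copies/uL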

  5. Modelling Hepatitis B Virus Antiviral Therapy and Drug Resistant Mutant Strains

    NASA Astrophysics Data System (ADS)

    Bernal, Julie; Dix, Trevor; Allison, Lloyd; Bartholomeusz, Angeline; Yuen, Lilly

    Despite the existence of vaccines, the Hepatitis B virus (HBV) is still a serious global health concern. HBV targets liver cells. It has an unusual replication process involving an RNA pre-genome that the reverse transcriptase domain of the viral polymerase protein converts into viral DNA. The reverse transcription process is error prone and, together with the high replication rates of the virus, allows the virus to exist as a heterogeneous population of mutants, known as a quasispecies, that can adapt and become resistant to antiviral therapy. This study presents an individual-based model of HBV inside an artificial liver, and associated blood serum, undergoing antiviral therapy. This model aims to provide insights into the evolution of the HBV quasispecies and the individual contribution of HBV mutations in the outcome of therapy.

  6. Natural Language Processing Methods and Systems for Biomedical Ontology Learning

    PubMed Central

    Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.

    2010-01-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054

  7. Standardized Competencies for Parenteral Nutrition Prescribing: The American Society for Parenteral and Enteral Nutrition Model.

    PubMed

    Guenter, Peggi; Boullata, Joseph I; Ayers, Phil; Gervasio, Jane; Malone, Ainsley; Raymond, Erica; Holcombe, Beverly; Kraft, Michael; Sacks, Gordon; Seres, David

    2015-08-01

    Parenteral nutrition (PN) provision is complex, as it is a high-alert medication and prone to a variety of potential errors. With changes in clinical practice models and recent federal rulings, the number of PN prescribers may be increasing. Safe prescribing of this therapy requires that competency for prescribers from all disciplines be demonstrated using a standardized process. A standardized model for PN prescribing competency is proposed based on a competency framework, the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.)-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines. This framework will guide institutions and agencies in developing and maintaining competency for safe PN prescription by their staff. © 2015 American Society for Parenteral and Enteral Nutrition.

  8. Vocational interests and career indecision among psychosis-prone college students.

    PubMed

    Poreh, A M; Schullen, C

    1998-10-01

    This study investigated the relationship between scores on scales that purport to measure psychosis-proneness and scores on vocational interests, identity, and differentiation scales in a sample of 233 college students who completed the Perceptual Aberration and Magical Ideation scales, the Strong Campbell Interest Inventory, and the Career Decision Scale. The present findings are consistent with prior work indicating a sex-related association of scores on measures of psychosis-proneness and vocational interests. A positive correlation between scores on vocational indecision and measures of psychosis-proneness was also found, suggesting that both men and women who score high on psychosis-proneness find it difficult to formulate long-term career goals. Finally, there was no significant correlation between scores on measures of psychosis-proneness and Holland's Vocational Differentiation Index. Present results are discussed in light of previously reported sex differences among psychosis-prone adults and diagnosed schizophrenics. The implications of the findings for vocational counselors are also addressed.

  9. High-throughput technology for novel SO2 oxidation catalysts

    PubMed Central

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of some elements used for prescreening and of those remaining after the screening of the first catalyst generations. PMID:27877427

  10. A high-throughput and quantitative method to assess the mutagenic potential of translesion DNA synthesis

    PubMed Central

    Taggart, David J.; Camerlengo, Terry L.; Harrison, Jason K.; Sherrer, Shanen M.; Kshetry, Ajay K.; Taylor, John-Stephen; Huang, Kun; Suo, Zucai

    2013-01-01

    Cellular genomes are constantly damaged by endogenous and exogenous agents that covalently and structurally modify DNA to produce DNA lesions. Although most lesions are mended by various DNA repair pathways in vivo, a significant number of damage sites persist during genomic replication. Our understanding of the mutagenic outcomes derived from these unrepaired DNA lesions has been hindered by the low throughput of existing sequencing methods. Therefore, we have developed a cost-effective high-throughput short oligonucleotide sequencing assay that uses next-generation DNA sequencing technology for the assessment of the mutagenic profiles of translesion DNA synthesis catalyzed by any error-prone DNA polymerase. The vast amount of sequencing data produced were aligned and quantified by using our novel software. As an example, the high-throughput short oligonucleotide sequencing assay was used to analyze the types and frequencies of mutations upstream, downstream and at a site-specifically placed cis–syn thymidine–thymidine dimer generated individually by three lesion-bypass human Y-family DNA polymerases. PMID:23470999

  11. R classes and methods for SNP array data.

    PubMed

    Scharpf, Robert B; Ruczinski, Ingo

    2010-01-01

    The Bioconductor project is an "open source and open development software project for the analysis and comprehension of genomic data" (1), primarily based on the R programming language. Infrastructure packages, such as Biobase, are maintained by Bioconductor core developers and serve several key roles to the broader community of Bioconductor software developers and users. In particular, Biobase introduces an S4 class, the eSet, for high-dimensional assay data. Encapsulating the assay data as well as meta-data on the samples, features, and experiment in the eSet class definition ensures propagation of the relevant sample and feature meta-data throughout an analysis. Extending the eSet class promotes code reuse through inheritance as well as interoperability with other R packages and is less error-prone. Recently proposed class definitions for high-throughput SNP arrays extend the eSet class. This chapter highlights the advantages of adopting and extending Biobase class definitions through a working example of one implementation of classes for the analysis of high-throughput SNP arrays.

  12. Adaptive constructive processes and the future of memory.

    PubMed

    Schacter, Daniel L

    2012-11-01

    Memory serves critical functions in everyday life but is also prone to error. This article examines adaptive constructive processes, which play a functional role in memory and cognition but can also produce distortions, errors, and illusions. The article describes several types of memory errors that are produced by adaptive constructive processes and focuses in particular on the process of imagining or simulating events that might occur in one's personal future. Simulating future events relies on many of the same cognitive and neural processes as remembering past events, which may help to explain why imagination and memory can be easily confused. The article considers both pitfalls and adaptive aspects of future event simulation in the context of research on planning, prediction, problem solving, mind-wandering, prospective and retrospective memory, coping and positivity bias, and the interconnected set of brain regions known as the default network. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  13. Identification of Which High Risk Youth Smoke Cigarettes Regularly.

    ERIC Educational Resources Information Center

    Sussman, Steve; And Others

    This study investigated whether or not high or low risk youths differed on previous items discriminative of problem-prone youth, particularly problem-prone attitudes and preferences, and social and environmental smoking. In addition, the study examined whether high or low use youths differed on items related to a health orientation including…

  14. Error-free versus mutagenic processing of genomic uracil--relevance to cancer.

    PubMed

    Krokan, Hans E; Sætrom, Pål; Aas, Per Arne; Pettersen, Henrik Sahlin; Kavli, Bodil; Slupphaug, Geir

    2014-07-01

    Genomic uracil is normally processed essentially error-free by base excision repair (BER), with mismatch repair (MMR) as an apparent backup for U:G mismatches. Nuclear uracil-DNA glycosylase UNG2 is the major enzyme initiating BER of uracil of U:A pairs as well as U:G mismatches. Deficiency in UNG2 results in several-fold increases in genomic uracil in mammalian cells. Thus, the alternative uracil-removing glycosylases SMUG1, TDG and MBD4 cannot efficiently complement UNG2-deficiency. A major function of SMUG1 is probably to remove 5-hydroxymethyluracil from DNA, with general back-up for UNG2 as a minor function. TDG and MBD4 remove deamination products U or T mismatched to G in CpG/mCpG contexts, but may have equally or more important functions in development, epigenetics and gene regulation. Genomic uracil was previously thought to arise only from spontaneous cytosine deamination and incorporation of dUMP, generating U:G mismatches and U:A pairs, respectively. However, the identification of activation-induced cytidine deaminase (AID) and other APOBEC family members as DNA-cytosine deaminases has spurred renewed interest in the processing of genomic uracil. Importantly, AID triggers the adaptive immune response involving error-prone processing of U:G mismatches, but also contributes to B-cell lymphomagenesis. Furthermore, mutational signatures in a substantial fraction of other human cancers are consistent with APOBEC-induced mutagenesis, with U:G mismatches as prime suspects. Mutations can be caused by replicative polymerases copying uracil in U:G mismatches, or by translesion polymerases that insert incorrect bases opposite abasic sites after uracil-removal. In addition, kataegis, localized hypermutations in one strand in the vicinity of genomic rearrangements, requires APOBEC protein, UNG2 and translesion polymerase REV1. What mechanisms govern error-free versus error-prone processing of uracil in DNA remains unclear. In conclusion, genomic uracil is an essential intermediate in adaptive immunity and innate antiviral responses, but may also be a fundamental cause of a wide range of malignancies. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Hematology of camelids.

    PubMed

    Vap, Linda; Bohn, Andrea A

    2015-01-01

    Interpretation of camelid hematology results is similar to that of other mammals. Obtaining accurate results and using appropriate reference intervals can be a bit problematic, particularly when evaluating the erythron. Camelid erythrocytes vary from other mammals in that they are small, flat, and elliptical. This variation makes data obtained from samples collected from these species prone to error when using some automated instruments. Normal and abnormal findings in camelid blood are reviewed as well as how to ensure accurate results.

  16. Coordinating Robot Teams for Disaster Relief

    DTIC Science & Technology

    2015-05-01

    eventually guide vehicles in cooperation with its Operator(s), but in this paper we assume static mission goals, a fixed number of vehicles, and a...is tedious and error prone. Kress-Gazit et al. (2009) instead synthesize an FSA from an LTL specification using a game theory approach (Bloem et al...helping an Operator coordinate a team of vehicles in Disaster Relief. Acknowledgements Thanks to OSD ASD (R&E) for sponsoring this research. The

  17. Vienna Fortran - A Language Specification. Version 1.1

    DTIC Science & Technology

    1992-03-01

    other computer architectures is the fact that the memory is physically distributed among the processors; the time required to access a non-local...datum may be an order of magnitude higher than the time taken to access locally stored data. This has important consequences for program efficiency. In...machine in many aspects. It is tedious, time-consuming and error-prone. It has led to particularly slow software development cycles and, in consequence

  18. Toward an Operational Definition of Workload: A Workload Assessment of Aviation Maneuvers

    DTIC Science & Technology

    2010-08-01

    and evaluated by the learner. With practice, the learner moves into the second phase, where optimal strategies are strengthened. The final stage of...The first phase demands a great amount of resources as performance is slow and prone to errors. During this phase, strategies are being formulated...asked to assess mental, physical, visual, aural, and verbal demands of each task. The new assessment is a cost-effective method of assessing workload

  19. Tension-Induced Error Correction and Not Kinetochore Attachment Status Activates the SAC in an Aurora-B/C-Dependent Manner in Oocytes.

    PubMed

    Vallot, Antoine; Leontiou, Ioanna; Cladière, Damien; El Yakoubi, Warif; Bolte, Susanne; Buffin, Eulalie; Wassmann, Katja

    2018-01-08

    Cell division with partitioning of the genetic material should take place only when paired chromosomes named bivalents (meiosis I) or sister chromatids (mitosis and meiosis II) are correctly attached to the bipolar spindle in a tension-generating manner. For this to happen, the spindle assembly checkpoint (SAC) checks whether unattached kinetochores are present, in which case anaphase onset is delayed to permit further establishment of attachments. Additionally, microtubules are stabilized when they are attached and under tension. In mitosis, attachments not under tension activate the so-named error correction pathway depending on Aurora B kinase substrate phosphorylation. This leads to microtubule detachments, which in turn activates the SAC [1-3]. Meiotic divisions in mammalian oocytes are highly error prone, with severe consequences for fertility and health of the offspring [4, 5]. Correct attachment of chromosomes in meiosis I leads to the generation of stretched bivalents, but-unlike mitosis-not to tension between sister kinetochores, which co-orient. Here, we set out to address whether reduction of tension applied by the spindle on bioriented bivalents activates error correction and, as a consequence, the SAC. Treatment of oocytes in late prometaphase I with Eg5 kinesin inhibitor affects spindle tension, but not attachments, as we show here using an optimized protocol for confocal imaging. After Eg5 inhibition, bivalents are correctly aligned but less stretched, and as a result, Aurora-B/C-dependent error correction with microtubule detachment takes place. This loss of attachments leads to SAC activation. Crucially, SAC activation itself does not require Aurora B/C kinase activity in oocytes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Insight From the Statistics of Nothing: Estimating Limits of Change Detection Using Inferred No-Change Areas in DEM Difference Maps and Application to Landslide Hazard Studies

    NASA Astrophysics Data System (ADS)

    Haneberg, W. C.

    2017-12-01

    Remote characterization of new landslides or areas of ongoing movement using differences in high resolution digital elevation models (DEMs) created through time, for example before and after major rains or earthquakes, is an attractive proposition. In the case of large catastrophic landslides, changes may be apparent enough that simple subtraction suffices. In other cases, statistical noise can obscure landslide signatures and place practical limits on detection. In ideal cases on land, GPS surveys of representative areas at the time of DEM creation can quantify the inherent errors. In less-than-ideal terrestrial cases and virtually all submarine cases, it may be impractical or impossible to independently estimate the DEM errors. Examining DEM difference statistics for areas reasonably inferred to have no change, however, can provide insight into the limits of detectability. Data from inferred no-change areas of airborne LiDAR DEM difference maps of the 2014 Oso, Washington landslide and landslide-prone colluvium slopes along the Ohio River valley in northern Kentucky, show that DEM difference maps can have non-zero mean and slope dependent error components consistent with published studies of DEM errors. Statistical thresholds derived from DEM difference error and slope data can help to distinguish between DEM differences that are likely real—and which may indicate landsliding—from those that are likely spurious or irrelevant. This presentation describes and compares two different approaches, one based upon a heuristic assumption about the proportion of the study area likely covered by new landslides and another based upon the amount of change necessary to ensure difference at a specified level of probability.
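
    One simple way to turn no-change statistics into detection limits is sketched below on synthetic data: bin the DEM differences by slope and flag only differences outside a mean ± 1.96σ envelope. The normality assumption and bin widths are illustrative choices, not the presentation's exact procedure:

        # Hedged sketch: estimate slope-dependent change-detection thresholds
        # from DEM-difference values in inferred no-change areas. All data are
        # synthetic; 1.96 assumes approximate normality (~95% coverage).
        import numpy as np

        rng = np.random.default_rng(1)
        slope = rng.uniform(0, 40, 10000)          # slope (degrees), no-change cells
        dz = rng.normal(0.02 + 0.004 * slope,      # slope-dependent bias...
                        0.05 + 0.01 * slope)       # ...and slope-dependent noise

        bins = np.arange(0, 45, 5)
        idx = np.digitize(slope, bins)
        for b in range(1, len(bins)):
            vals = dz[idx == b]
            mu, sd = vals.mean(), vals.std(ddof=1)
            lo, hi = mu - 1.96 * sd, mu + 1.96 * sd
            print(f"slope {bins[b-1]:2.0f}-{bins[b]:2.0f} deg: "
                  f"differences outside [{lo:+.2f}, {hi:+.2f}] m flagged as real")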

  1. [Analysis of judicial sentences issued against traumatologists between 1995 and 2011 as regards medical negligence].

    PubMed

    Cardoso-Cita, Z; Perea-Pérez, B; Albarrán-Juan, M E; Labajo-González, M E; López-Durán, L; Marco-Martínez, F; Santiago-Saéz, A

    2016-01-01

    Traumatology and Orthopaedic Surgery is one of the specialities with the most complaints, owing to its scope and complexity. The aim of this study is to determine the characteristics of the complaints made against medical specialists in Traumatology, taking into account those variables that might influence both the presenting of the complaint and the resolution of the process. An analysis was performed on 303 legal judgements (1995-2011) collected in the health legal judgements archive of the Madrid School of Medicine, which is linked to the Westlaw Aranzadi database. Civil jurisdiction was the most used. The specific processes attracting the most complaints were bone-joint disorders, followed by vascular-nerve problems and infections. The injury most frequently complained about was in the lower limb, particularly the knee. The most frequent general cause of complaint was surgical treatment error, followed by diagnostic error. There was a lack of information in 14.9% of cases. There was sentencing in 49.8% of the cases, with compensation mainly being less than 50,000 euros. Traumatology and Orthopaedic Surgery is a speciality prone to complaints of malpractice. The number of sentences against traumatologists is high, but compensations are usually less than 50,000 euros. The main reason for sentencing is surgical treatment error; surgery is thus the procedure in which precautions should be maximised. The number of judgements due to lack of information is high, making adequate doctor-patient communication essential, as well as the correct completion of the informed consent. Copyright © 2014 SECOT. Published by Elsevier Espana. All rights reserved.

  2. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    PubMed

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework study is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
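
    The study-monitoring idea generalizes beyond CDM: a learning curve shows how cross-validated model performance grows with patient inclusions, signalling when further recruitment adds little. A generic sketch using scikit-learn on synthetic data (CDM itself uses its own integrated libraries):

        # Generic learning-curve sketch, purely illustrative of the monitoring
        # idea described above; not CDM code.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import learning_curve

        X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
        sizes, train_sc, val_sc = learning_curve(
            LogisticRegression(max_iter=1000), X, y,
            train_sizes=np.linspace(0.1, 1.0, 5), cv=5, scoring="roc_auc")

        for n, v in zip(sizes, val_sc.mean(axis=1)):
            print(f"{n:4d} patients: cross-validated AUC = {v:.3f}")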

  3. Landslide Hazard Analysis and Damage Assessment for Tourism Destination at Candikuning Village, Tabanan Regency, Bali, Indonesia

    NASA Astrophysics Data System (ADS)

    Sunarta, I. N.; Susila, K. D.; Kariasa, I. N.

    2018-02-01

    A landslide is a downslope movement of a soil mass or slope-constituting rock, resulting from a disturbance of the stability of the soil or rocks that make up the slope. Bali, as one of the best tourism destinations in the world, also has landslide-prone areas. Tourism attractions in Bali that are prone to landslides are Lake Beratan and Pura Ulun Danu Beratan in Candikuning Village, Tabanan Regency, Bali Province, Indonesia. The Candikuning village area contains tourism destinations, settlements and agricultural land. This study aims to analyze landslide-prone areas and the losses caused by landslides, including a damage analysis for the attractions of Beratan Lake and Ulun Danu Beratan Temple and for settlements. The method used is matching and scoring with parameters of rainfall, soil type, slope and land use. The result is that the Beratan Lake area has moderate to high landslide-prone areas in the eastern and southern parts, where most of the settlements in Candikuning Village are located in areas prone to moderate and high landslide hazard.

  4. Identification of User Facility Related Publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Robert M; Stahl, Christopher G; Wells, Jack C

    2012-01-01

    Scientific user facilities provide physical resources and technical support that enable scientists to conduct experiments or simulations pertinent to their respective research. One metric for evaluating the scientific value or impact of a facility is the number of publications by users as a direct result of using that facility. Unfortunately, for a variety of reasons, capturing accurate values for this metric proves time-consuming and error-prone. This work describes a new approach that leverages automated browser technology combined with text analytics to reduce the time and error involved in identifying publications related to user facilities. With this approach, scientific user facilities gain more accurate measures of their impact as well as insight into policy revisions for user access.

  5. Interactions and Localization of Escherichia coli Error-Prone DNA Polymerase IV after DNA Damage.

    PubMed

    Mallik, Sarita; Popodi, Ellen M; Hanson, Andrew J; Foster, Patricia L

    2015-09-01

    Escherichia coli's DNA polymerase IV (Pol IV/DinB), a member of the Y family of error-prone polymerases, is induced during the SOS response to DNA damage and is responsible for translesion bypass and adaptive (stress-induced) mutation. In this study, the localization of Pol IV after DNA damage was followed using fluorescent fusions. After exposure of E. coli to DNA-damaging agents, fluorescently tagged Pol IV localized to the nucleoid as foci. Stepwise photobleaching indicated ∼60% of the foci consisted of three Pol IV molecules, while ∼40% consisted of six Pol IV molecules. Fluorescently tagged Rep, a replication accessory DNA helicase, was recruited to the Pol IV foci after DNA damage, suggesting that the in vitro interaction between Rep and Pol IV reported previously also occurs in vivo. Fluorescently tagged RecA also formed foci after DNA damage, and Pol IV localized to them. To investigate if Pol IV localizes to double-strand breaks (DSBs), an I-SceI endonuclease-mediated DSB was introduced close to a fluorescently labeled LacO array on the chromosome. After DSB induction, Pol IV localized to the DSB site in ∼70% of SOS-induced cells. RecA also formed foci at the DSB sites, and Pol IV localized to the RecA foci. These results suggest that Pol IV interacts with RecA in vivo and is recruited to sites of DSBs to aid in the restoration of DNA replication. DNA polymerase IV (Pol IV/DinB) is an error-prone DNA polymerase capable of bypassing DNA lesions and aiding in the restart of stalled replication forks. In this work, we demonstrate in vivo localization of fluorescently tagged Pol IV to the nucleoid after DNA damage and to DNA double-strand breaks. We show colocalization of Pol IV with two proteins: Rep DNA helicase, which participates in replication, and RecA, which catalyzes recombinational repair of stalled replication forks. Time course experiments suggest that Pol IV recruits Rep and that RecA recruits Pol IV. These findings provide in vivo evidence that Pol IV aids in maintaining genomic stability not only by bypassing DNA lesions but also by participating in the restoration of stalled replication forks. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  6. Association of cytokine gene polymorphisms and risk factors with otitis media proneness in children.

    PubMed

    Miljanović, Olivera; Cikota-Aleksić, Bojana; Likić, Dragan; Vojvodić, Danilo; Jovićević, Ognjen; Magić, Zvonko

    2016-06-01

    In order to assess the association between gene polymorphisms and otitis media (OM) proneness, tumor necrosis factor alpha (TNFA) -308, interleukin (IL) 10-1082 and -3575, IL6 -597, IL2 -330, and CD14 -159 genotyping was performed in 58 OM-prone children and 85 controls who were exposed to a similar number and frequency of environmental and host risk factors. The frequencies of genotypes (wild type vs. genotypes containing at least one polymorphic allele) were not significantly different between groups, except for IL10 -1082. Polymorphic genotypes IL10 -1082 GA and GG were more frequent in OM-prone children than in the control group (RR 1.145, 95 % CI 1.011-1.298; p = 0.047). However, logistic regression did not confirm IL10 -1082 polymorphic genotypes as an independent risk factor for OM proneness. The present study indicates that high-producing IL10 -1082 GA/GG genotypes may increase the risk for OM proneness in its carriers when exposed to other environmental/host risk factors (day care attendance, passive smoking, male sex, respiratory infections, and atopic manifestations). This study revealed no significant independent genetic association, but the lack of breastfeeding in infancy was found to be the only independent risk factor for development of the OM-prone phenotype, implying that breastfeeding had a protective role in development of susceptibility to OM. What is known: • The pathogenesis of OM is of multifactorial nature, dependent on infection, environmental factors, and immune response of the child. • Cytokines and CD14 play an important role in the presentation and clinical course of otitis media, but a clear link with otitis media proneness was not established. What is new: • This is the first clinical and genetic study on Montenegrin children with the otitis media-prone phenotype. • The study revealed that high-producing IL10 -1082 genotypes may influence otitis media proneness in children exposed to other environmental/host risk factors.

  7. Comparison between conventional protective mechanical ventilation and high-frequency oscillatory ventilation associated with the prone position

    PubMed Central

    Fioretto, José Roberto; Klefens, Susiane Oliveira; Pires, Rafaelle Fernandes; Kurokawa, Cilmery Suemi; Carpi, Mario Ferreira; Bonatto, Rossano César; Moraes, Marcos Aurélio; Ronchi, Carlos Fernando

    2017-01-01

    Objective: To compare the effects of high-frequency oscillatory ventilation and conventional protective mechanical ventilation associated with the prone position on oxygenation, histology and pulmonary oxidative damage in an experimental model of acute lung injury. Methods: Forty-five rabbits with tracheostomy and vascular access underwent mechanical ventilation. Acute lung injury was induced by tracheal infusion of warm saline. Three experimental groups were formed: healthy animals + conventional protective mechanical ventilation, supine position (Control Group; n = 15); animals with acute lung injury + conventional protective mechanical ventilation, prone position (CMVG; n = 15); and animals with acute lung injury + high-frequency oscillatory ventilation, prone position (HFOG; n = 15). Ten minutes after the beginning of the specific ventilation of each group, arterial gasometry was collected, with this timepoint being called time zero, after which the animal was placed in prone position and remained in this position for 4 hours. Oxidative stress was evaluated by the total antioxidant performance assay. Pulmonary tissue injury was determined by histopathological score. The level of significance was 5%. Results: Both groups with acute lung injury showed worsening of oxygenation after induction of injury compared with the Control Group. After 4 hours, there was a significant improvement in oxygenation in the HFOG group compared with CMVG. Analysis of total antioxidant performance in plasma showed greater protection in HFOG. HFOG had a lower histopathological lesion score in lung tissue than CMVG. Conclusion: High-frequency oscillatory ventilation, associated with prone position, improves oxygenation and attenuates oxidative damage and histopathological lung injury compared with conventional protective mechanical ventilation. PMID:29236845

  8. Causal inference with measurement error in outcomes: Bias analysis and estimation methods.

    PubMed

    Shu, Di; Yi, Grace Y

    2017-01-01

    Inverse probability weighting estimation has been popularly used to consistently estimate the average treatment effect. Its validity, however, is challenged by the presence of error-prone variables. In this paper, we explore the inverse probability weighting estimation with mismeasured outcome variables. We study the impact of measurement error for both continuous and discrete outcome variables and reveal interesting consequences of the naive analysis which ignores measurement error. When a continuous outcome variable is mismeasured under an additive measurement error model, the naive analysis may still yield a consistent estimator; when the outcome is binary, we derive the asymptotic bias in a closed-form. Furthermore, we develop consistent estimation procedures for practical scenarios where either validation data or replicates are available. With validation data, we propose an efficient method for estimation of average treatment effect; the efficiency gain is substantial relative to usual methods of using validation data. To provide protection against model misspecification, we further propose a doubly robust estimator which is consistent even when either the treatment model or the outcome model is misspecified. Simulation studies are reported to assess the performance of the proposed methods. An application to a smoking cessation dataset is presented.
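
    A compact illustration of the setting: inverse probability weighting recovers the average treatment effect, a misclassified binary outcome attenuates it, and a correction with known sensitivity and specificity restores it. This sketch uses the classical Rogan-Gladen adjustment on synthetic data, not the authors' doubly robust estimators:

        # Illustrative sketch (not the paper's method): IPW estimation of the
        # ATE with a misclassified binary outcome, plus a Rogan-Gladen style
        # correction assuming known sensitivity (se) and specificity (sp).
        import numpy as np

        rng = np.random.default_rng(2)
        n = 20000
        x = rng.normal(size=n)
        e = 1 / (1 + np.exp(-x))                # true propensity score
        t = rng.binomial(1, e)
        y = rng.binomial(1, 0.2 + 0.3 * t)      # true binary outcome, ATE = 0.3

        se, sp = 0.9, 0.85                      # assumed known error rates
        flip = np.where(y == 1, rng.binomial(1, 1 - se, n), rng.binomial(1, 1 - sp, n))
        y_star = np.where(flip == 1, 1 - y, y)  # mismeasured outcome

        def ipw_mean(y, t, e, arm):
            w = t / e if arm == 1 else (1 - t) / (1 - e)
            return np.sum(w * y) / np.sum(w)

        naive = ipw_mean(y_star, t, e, 1) - ipw_mean(y_star, t, e, 0)
        corrected = naive / (se + sp - 1)       # Rogan-Gladen correction
        print(f"naive ATE: {naive:.3f}, corrected ATE: {corrected:.3f}, truth: 0.300")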

  9. IPTV multicast with peer-assisted lossy error control

    NASA Astrophysics Data System (ADS)

    Li, Zhi; Zhu, Xiaoqing; Begen, Ali C.; Girod, Bernd

    2010-07-01

    Emerging IPTV technology uses source-specific IP multicast to deliver television programs to end-users. To provide reliable IPTV services over the error-prone DSL access networks, a combination of multicast forward error correction (FEC) and unicast retransmissions is employed to mitigate the impulse noises in DSL links. In existing systems, the retransmission function is provided by the Retransmission Servers sitting at the edge of the core network. In this work, we propose an alternative distributed solution where the burden of packet loss repair is partially shifted to the peer IP set-top boxes. Through the Peer-Assisted Repair (PAR) protocol, we demonstrate how the packet repairs can be delivered in a timely, reliable and decentralized manner using the combination of server-peer coordination and redundancy of repairs. We also show that this distributed protocol can be seamlessly integrated with an application-layer source-aware error protection mechanism called forward and retransmitted Systematic Lossy Error Protection (SLEP/SLEPr). Simulations show that this joint PAR-SLEP/SLEPr framework not only effectively mitigates the bottleneck experienced by the Retransmission Servers, thus greatly enhancing the scalability of the system, but also efficiently improves the resistance to the impulse noise.
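
    The FEC-plus-retransmission combination can be illustrated with a toy block code: one XOR parity packet repairs any single loss per block, and multi-loss blocks fall back to retransmission requests. Packet contents and sizes below are invented for illustration:

        # Toy sketch of combined FEC and retransmission repair (not the PAR
        # protocol itself): a single XOR parity packet recovers any one lost
        # packet per block; anything else triggers a retransmission request.
        from functools import reduce

        def xor(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        block = [bytes([i]) * 8 for i in range(4)]       # four 8-byte media packets
        parity = reduce(xor, block)                      # FEC packet sent alongside

        received = {0: block[0], 2: block[2], 3: block[3]}  # packet 1 lost in transit
        lost = [i for i in range(len(block)) if i not in received]

        if len(lost) == 1:                               # FEC can repair one loss
            received[lost[0]] = reduce(xor, received.values(), parity)
            print("repaired by FEC:", received[lost[0]] == block[lost[0]])
        else:                                            # otherwise ask for a resend
            print("request retransmission of packets", lost)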

  10. Automated and unsupervised detection of malarial parasites in microscopic images.

    PubMed

    Purwar, Yashasvi; Shah, Sirish L; Clarke, Gwen; Almugairi, Areej; Muehlenbachs, Atis

    2011-12-13

    Malaria is a serious infectious disease. According to the World Health Organization, it is responsible for nearly one million deaths each year. There are various techniques to diagnose malaria, of which manual microscopy is considered to be the gold standard. However, due to the number of steps required in manual assessment, this diagnostic method is time-consuming (leading to late diagnosis) and prone to human error (leading to erroneous diagnosis), even in experienced hands. The focus of this study is to develop a robust, unsupervised and sensitive malaria screening technique with low material cost and one that has an advantage over other techniques in that it minimizes human reliance and is, therefore, more consistent in applying diagnostic criteria. A method based on digital image processing of Giemsa-stained thin smear images is developed to facilitate the diagnostic process. The diagnosis procedure is divided into two parts: enumeration and identification. The image-based method presented here is designed to automate the process of enumeration and identification, with the main advantage being its ability to carry out the diagnosis in an unsupervised manner and yet have high sensitivity, thus reducing cases of false negatives. The image-based method was tested on more than 500 images from two independent laboratories. The aim is to distinguish between positive and negative cases of malaria using thin smear blood slide images. Due to the unsupervised nature of the method, it requires minimal human intervention, thus speeding up the whole process of diagnosis. Overall sensitivity to capture cases of malaria is 100%, and specificity ranges from 50% to 88% for all species of malaria parasites. This image-based screening method will speed up the whole process of diagnosis and is advantageous over laboratory procedures that are prone to errors and where pathological expertise is minimal. Further, this method provides a consistent and robust way of generating the parasite clearance curves.

  11. Demonstration of Qubit Operations Below a Rigorous Fault Tolerance Threshold With Gate Set Tomography (Open Access, Publisher’s Version)

    DTIC Science & Technology

    2017-02-15

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone...information processors have been demonstrated experimentally using superconducting circuits [1-3], electrons in semiconductors [4-6], trapped atoms and...qubit quantum information processor has been realized [14], and single-qubit gates have demonstrated randomized benchmarking (RB) infidelities as low as 10

  12. MRO DKF Post-Processing Tool

    NASA Technical Reports Server (NTRS)

    Ayap, Shanti; Fisher, Forest; Gladden, Roy; Khanampompan, Teerapat

    2008-01-01

    This software tool saves time and reduces risk by automating two labor-intensive and error-prone post-processing steps required for every DKF [DSN (Deep Space Network) Keyword File] that MRO (Mars Reconnaissance Orbiter) produces, and is being extended to post-process the corresponding TSOE (Text Sequence Of Events) as well. The need for this post-processing step stems from limitations in the seq-gen modeling resulting in incorrect DKF generation that is then cleaned up in post-processing.

  14. The Implications of Self-Reporting Systems for Maritime Domain Awareness

    DTIC Science & Technology

    2006-12-01

    AIS), offer significant advantages compared with tracking vessels by conventional sensors, and that the availability of the information... reporting system for sea-going vessels that originated in Sweden in the early 1990s. It was designed primarily for safety of life at sea (SOLAS) and... reported information is prone to human error and potential malicious alteration, and the system itself was not designed with these vulnerabilities in mind

  15. Multiple point mutations in a shuttle vector propagated in human cells: evidence for an error-prone DNA polymerase activity.

    PubMed

    Seidman, M M; Bredberg, A; Seetharam, S; Kraemer, K H

    1987-07-01

    Mutagenesis was studied at the DNA-sequence level in human fibroblast and lymphoid cells by use of a shuttle vector plasmid, pZ189, containing a suppressor tRNA marker gene. In a series of experiments, 62 plasmids were recovered that had two to six base substitutions in the 160-base-pair marker gene. Approximately 20-30% of the mutant plasmids that were recovered after passing ultraviolet-treated pZ189 through a repair-proficient human fibroblast line contained these multiple mutations. In contrast, passage of ultraviolet-treated pZ189 through an excision-repair-deficient (xeroderma pigmentosum) line yielded only 2% multiple base substitution mutants. Introducing a single-strand nick in otherwise unmodified pZ189 adjacent to the marker, followed by passage through the xeroderma pigmentosum cells, resulted in about 66% multiple base substitution mutants. The multiple mutations were found in a 160-base-pair region containing the marker gene but were rarely found in an adjacent 170-base-pair region. Passing ultraviolet-treated or nicked pZ189 through a repair-proficient human B-cell line also yielded multiple base substitution mutations in 20-33% of the mutant plasmids. An explanation for these multiple mutations is that they were generated by an error-prone polymerase while filling gaps. These mutations share many of the properties displayed by mutations in the immunoglobulin hypervariable regions.

  16. The PSO4 gene is responsible for an error-prone recombinational DNA repair pathway in Saccharomyces cerevisiae.

    PubMed

    de Andrade, H H; Marques, E K; Schenberg, A C; Henriques, J A

    1989-06-01

    The induction of mitotic gene conversion and crossing-over in Saccharomyces cerevisiae diploid cells homozygous for the pso4-1 mutation was examined in comparison to the corresponding wild-type strain. The pso4-1 mutant strain was found to be completely blocked in mitotic recombination induced by photoaddition of mono- and bifunctional psoralen derivatives as well as by mono- (HN1) and bifunctional (HN2) nitrogen mustards or 254 nm UV radiation in both stationary and exponential phases of growth. With respect to lethality, diploids homozygous for the pso4-1 mutation are more sensitive to all agents tested in any growth phase. However, this effect is more pronounced in the G2 phase of the cell cycle. These results imply that the ploidy effect and the resistance of budding cells are under the control of the PSO4 gene. On the other hand, the pso4-1 mutant is mutationally defective for all agents used. Therefore, the pso4-1 mutant has a generalized block in both recombination and mutation ability. This indicates that the PSO4 gene is involved in an error-prone repair pathway which relies on a recombinational mechanism, strongly suggesting an analogy between the pso4-1 mutation and the recA or lexA mutations of Escherichia coli.

  17. SU-F-T-404: Dosimetric Advantages of Flattening Free Beams to Prone Accelerated Partial Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galavis, P; Barbee, D; Jozsef, G

    2016-06-15

    Purpose: Prone accelerated partial breast irradiation (APBI) results in dose reduction to the heart and lung. Flattening-filter-free (FFF) beams reduce out-of-field dose, owing to the reduced scatter after removal of the flattening filter, and reduce the buildup region. The aim of this work is to evaluate the dosimetric advantages of FFF beams for prone APBI in terms of target coverage and reduction of dose to organs at risk. Methods: Fifteen clinical prone APBI cases using flattened photon beams were retrospectively re-planned in the Eclipse TPS using FFF beams. FFF plans were designed to provide equivalent target coverage with similar hotspots using the same field arrangements, resulting in comparable target DVHs. Both plans were transferred to a prone breast phantom and delivered on a Varian Edge linac. GafChromic film was placed in the coronal plane of the phantom, partially overlapping the treatment field and extending into OARs, to compare dose profiles from both plans. Results: FFF plans were comparable to the clinical plans, with maximum doses of (108.3±2.3)% and (109.2±2.4)% and mean doses of (104.5±1.0)% and (104.6±1.2)%, respectively. Similar mean doses to the heart and contralateral lung were observed for both plans, whereas the mean dose to the contralateral breast was (2.79±1.18) cGy and (2.86±1.40) cGy for the FFF and clinical plans, respectively. However, for both plans the error between calculated and measured doses at 4 cm from the field edge was 10%. Conclusion: The results showed that FFF beams in prone APBI provide dosimetrically equivalent target coverage and improved coverage of the superficial target due to the softer energy spectrum. Film analysis showed that the TPS underestimates dose outside the field edges in both cases. The measured FFF plans showed less dose outside the beam, which might reduce the probability of secondary cancers in the contralateral breast.

  18. Moral heuristics.

    PubMed

    Sunstein, Cass R

    2005-08-01

    With respect to questions of fact, people use heuristics--mental short-cuts, or rules of thumb, that generally work well, but that also lead to systematic errors. People use moral heuristics too--moral short-cuts, or rules of thumb, that lead to mistaken and even absurd moral judgments. These judgments are highly relevant not only to morality, but to law and politics as well. Examples are given from a number of domains, including risk regulation, punishment, reproduction and sexuality, and the act/omission distinction. In all of these contexts, rapid, intuitive judgments make a great deal of sense, but sometimes produce moral mistakes that are replicated in law and policy. One implication is that moral assessments ought not to be made by appealing to intuitions about exotic cases and problems; those intuitions are particularly unlikely to be reliable. Another implication is that some deeply held moral judgments are unsound if they are products of moral heuristics. The idea of error-prone heuristics is especially controversial in the moral domain, where agreement on the correct answer may be hard to elicit; but in many contexts, heuristics are at work and they do real damage. Moral framing effects, including those in the context of obligations to future generations, are also discussed.

  19. Semi-supervised anomaly detection - towards model-independent searches of new physics

    NASA Astrophysics Data System (ADS)

    Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Raiko, Tapani; Aaltonen, Timo; Nagai, Yoshikazu

    2012-06-01

    Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate, for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques, which does not require a MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by a comparison to neural network classifiers that such an approach is considerably more robust against misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to correctly identify it, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
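
    As a rough illustration of the background-modelling step (not the full method, which fits additional Gaussian components on top of a frozen background model via a custom EM step), one can fit a Gaussian mixture to background-only events and flag observations to which the background assigns low likelihood; the component count, event dimensionality, and flagging percentile below are assumptions of this sketch.

        # Sketch: Gaussian-mixture background model with a simple
        # low-likelihood anomaly flag. Data here are synthetic stand-ins.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        X_bkg = rng.normal(size=(5000, 4))   # stand-in background events
        X_obs = rng.normal(size=(1000, 4))   # stand-in observed events

        bkg_model = GaussianMixture(n_components=10, covariance_type='full')
        bkg_model.fit(X_bkg)                       # model the background only

        log_like = bkg_model.score_samples(X_obs)  # per-event log-likelihood
        threshold = np.percentile(log_like, 1.0)   # flag least background-like 1%
        anomalies = X_obs[log_like < threshold]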

  20. Who Believes in the Storybook Image of the Scientist?

    PubMed

    Veldkamp, Coosje L S; Hartgerink, Chris H J; van Assen, Marcel A L M; Wicherts, Jelte M

    2017-01-01

    Do lay people and scientists themselves recognize that scientists are human and therefore prone to human fallibilities such as error, bias, and even dishonesty? In a series of three experimental studies and one correlational study (total N = 3,278) we found that the "storybook image of the scientist" is pervasive: American lay people and scientists from over 60 countries attributed considerably more objectivity, rationality, open-mindedness, intelligence, integrity, and communality to scientists than to other highly-educated people. Moreover, scientists perceived even larger differences than lay people did. Some groups of scientists also differentiated between different categories of scientists: established scientists attributed higher levels of the scientific traits to established scientists than to early-career scientists and Ph.D. students, and higher levels to Ph.D. students than to early-career scientists. Female scientists attributed considerably higher levels of the scientific traits to female scientists than to male scientists. A strong belief in the storybook image and the (human) tendency to attribute higher levels of desirable traits to people in one's own group than to people in other groups may decrease scientists' willingness to adopt recently proposed practices to reduce error, bias and dishonesty in science.

  2. Infrequent identity mismatches are frequently undetected

    PubMed Central

    Goldinger, Stephen D.

    2014-01-01

    The ability to quickly and accurately match faces to photographs bears critically on many domains, from controlling purchase of age-restricted goods to law enforcement and airport security. Despite its pervasiveness and importance, research has shown that face matching is surprisingly error prone. The majority of face-matching research is conducted under idealized conditions (e.g., using photographs of individuals taken on the same day) and with equal proportions of match and mismatch trials, a rate that is likely not observed in everyday face matching. In four experiments, we presented observers with photographs of faces taken an average of 1.5 years apart and tested whether face-matching performance is affected by the prevalence of identity mismatches, comparing conditions of low (10 %) and high (50 %) mismatch prevalence. Like the low-prevalence effect in visual search, we observed inflated miss rates under low-prevalence conditions. This effect persisted when participants were allowed to correct their initial responses (Experiment 2), when they had to verify every decision with a certainty judgment (Experiment 3) and when they were permitted “second looks” at face pairs (Experiment 4). These results suggest that, under realistic viewing conditions, the low-prevalence effect in face matching is a large, persistent source of errors. PMID:24500751

  3. Medical malpractice, defensive medicine and role of the "media" in Italy.

    PubMed

    Toraldo, Domenico M; Vergari, Ughetta; Toraldo, Marta

    2015-01-01

    For many years, Italy has been subjected to an inconsistent and contradictory media campaign. On the one hand, the "media" present bold and reassuring messages about the progress of medical science; on the other, they are prone to knee-jerk criticism every time medical treatment does not have the desired effect, routinely describing such cases as glaring examples of "malasanità", an Italian word of recent coinage used to denote medical malpractice. Newspaper reports of legal proceedings involving health treatment are frequently full of errors and lack any scientific basis. The published data confirm the unsustainably high number of lawsuits against doctors and medical structures, accompanied by demands for compensation arising from true or alleged medical errors or mistakes blamed on the work of health structures. Currently, Italian citizens have a greater awareness of their right to health than in the past, and patients' expectations have risen. A discrepancy is emerging between the current state of medical science and the capacities of individual doctors and health structures. Lastly, there is a need for greater monitoring of the quality of health care services and a greater emphasis on health risk prevention.

  4. ASSIST user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1995-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H; Gao, Y; Liu, T

    Purpose: To develop quantitative clinical guidelines for choosing between supine Deep Inspiratory Breath Hold (DIBH) and prone free-breathing treatments for breast patients, we applied 3D deformable phantoms in Monte Carlo simulations to predict the corresponding dose to the organs at risk (OARs). Methods: The RPI adult female phantom (two selected cup sizes: A and D) was used to represent the female patient, and it was simulated using the MCNP6 Monte Carlo code. Doses to OARs were investigated for supine DIBH and prone treatments, considering the two breast sizes. The fluence maps of the 6-MV opposed tangential fields were exported. In the Monte Carlo simulation, the fluence maps allow each simulated photon particle to be weighted in the final dose calculation. The relative error of all dose calculations was kept below 5% by simulating 3×10^7 photons for each projection. Results: In terms of dosimetric accuracy, the RPI adult female phantom with cup size D in DIBH positioning matched the DIBH treatment plan of the patient. Based on the simulation results, for the cup size D phantom, prone positioning reduced the cardiac dose and the dose to other OARs, while the cup size A phantom benefited more from DIBH positioning. Comparing simulation results for the cup size A and D phantoms, dose to OARs was generally higher for the large breast size due to increased scattering arising from a larger portion of the body in the primary beam. The lower dose registered for the heart in the large breast phantom in prone positioning was due to the increased distance between the heart and the primary beam when the breast is pendulous. Conclusion: Our 3D deformable phantom appears to be an excellent tool to predict dose to the OARs for the supine DIBH and prone positions, which may support quantitative clinical decisions. Further investigation will be conducted. National Institutes of Health R01EB015478.

  6. Medical errors and uncertainty in primary healthcare: A comparative study of coping strategies among young and experienced GPs

    PubMed Central

    Kuikka, Liisa; Pitkälä, Kaisu

    2014-01-01

    Objective. To study coping differences between young and experienced GPs in primary care who experience medical errors and uncertainty. Design. Questionnaire-based survey (self-assessment) conducted in 2011. Setting. Finnish primary practice offices in Southern Finland. Subjects. Finnish GPs engaged in primary health care from two different respondent groups: young (working experience ≤ 5 years, n = 85) and experienced (working experience > 5 years, n = 80). Main outcome measures. Experiences of and attitudes towards medical errors and tolerance of uncertainty, coping strategies, and factors that may influence (positively or negatively) sources of errors. Results. In total, 165/244 GPs responded (response rate: 68%). Young GPs expressed significantly more often fear of committing a medical error (70.2% vs. 48.1%, p = 0.004) and admitted more often than experienced GPs that they had committed a medical error during the past year (83.5% vs. 68.8%, p = 0.026). Young GPs were less prone to apologize to a patient for an error (44.7% vs. 65.0%, p = 0.009) and found, more often than their more experienced colleagues, on-site consultations and electronic databases useful for avoiding mistakes. Conclusion. Experienced GPs seem to better tolerate uncertainty and also seem to fear medical errors less than their young colleagues. Young and more experienced GPs use different coping strategies for dealing with medical errors. Implications. When GPs become more experienced, they seem to get better at coping with medical errors. Means to support these skills should be studied in future research. PMID:24914458

  7. Considerations for analysis of time-to-event outcomes measured with error: Bias and correction with SIMEX.

    PubMed

    Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A

    2018-04-15

    For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
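
    A schematic SIMEX loop for this setting might look as follows, assuming multiplicative log-normal error on the observed event times with known variance; the single covariate, the grid of lambda values, and the quadratic extrapolant are illustrative choices of this sketch, not the paper's exact specification.

        # SIMEX sketch: inflate outcome error at levels lambda, refit the Cox
        # model, then extrapolate the log hazard ratio back to lambda = -1.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        n, sigma2 = 500, 0.1                        # sigma2: assumed known error variance
        x = rng.normal(size=n)
        t_true = rng.exponential(np.exp(-0.5 * x))  # synthetic true event times
        df = pd.DataFrame({'x': x, 'event': 1,
                           'time': t_true * np.exp(rng.normal(0, np.sqrt(sigma2), n))})

        lams = [0.0]
        means = [CoxPHFitter().fit(df, 'time', 'event').params_['x']]  # naive fit
        for lam in (0.5, 1.0, 1.5, 2.0):
            fits = []
            for _ in range(20):                     # B = 20 noise replicates
                noisy = df.copy()
                noisy['time'] *= np.exp(rng.normal(0, np.sqrt(lam * sigma2), n))
                fits.append(CoxPHFitter().fit(noisy, 'time', 'event').params_['x'])
            lams.append(lam)
            means.append(np.mean(fits))
        # Quadratic extrapolation to lambda = -1, i.e. zero measurement error.
        simex_loghr = np.polyval(np.polyfit(lams, means, 2), -1.0)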

  8. Prone positioning reduces mortality from acute respiratory distress syndrome in the low tidal volume era: a meta-analysis

    PubMed Central

    Shaefi, Shahzad; Montesi, Sydney B.; Devlin, Amy; Loring, Stephen H.; Talmor, Daniel; Malhotra, Atul

    2014-01-01

    Purpose Prone positioning for ARDS has been performed for decades without definitive evidence of clinical benefit. A recent multicenter trial demonstrated for the first time significantly reduced mortality with prone positioning. This meta-analysis was performed to integrate these findings with existing literature and test whether differences in tidal volume explain conflicting results among randomized trials. Methods Studies were identified using MEDLINE, EMBASE, Cochrane Register of Controlled Trials, LILACS, and citation review. Included were randomized trials evaluating the effect on mortality of prone versus supine positioning during conventional ventilation for ARDS. The primary outcome was risk ratio of death at 60 days meta-analyzed using random effects models. Analysis stratified by high (>8 ml/kg predicted body weight) or low (≤8 ml/kg PBW) mean baseline tidal volume was planned a priori. Results Seven trials were identified including 2,119 patients, of whom 1,088 received prone positioning. Overall, prone positioning was not significantly associated with the risk ratio of death (RR 0.83; 95 % CI 0.68–1.02; p = 0.073; I2 = 64 %). When stratified by high or low tidal volume, prone positioning was associated with a significant decrease in RR of death only among studies with low baseline tidal volume (RR 0.66; 95 % CI 0.50–0.86; p = 0.002; I2 = 25 %). Stratification by tidal volume explained over half the between-study heterogeneity observed in the unstratified analysis. Conclusions Prone positioning is associated with significantly reduced mortality from ARDS in the low tidal volume era. Substantial heterogeneity across studies can be explained by differences in tidal volume. PMID:24435203

  9. Utilizing Multiple Datasets for Snow Cover Mapping

    NASA Technical Reports Server (NTRS)

    Tait, Andrew B.; Hall, Dorothy K.; Foster, James L.; Armstrong, Richard L.

    1999-01-01

    Snow-cover maps generated from surface data are based on direct measurements; however, they are prone to interpolation errors where climate stations are sparsely distributed. Snow cover is clearly discernible in satellite optical data because of the high albedo of snow, yet the surface is often obscured by cloud cover. Passive microwave (PM) data are unaffected by clouds; however, the snow-cover signature is significantly affected by melting snow, and the microwaves may be transparent to thin snow (less than 3 cm). Both optical and microwave sensors have problems discerning snow beneath forest canopies. This paper describes a method that combines ground and satellite data to produce a Multiple-Dataset Snow-Cover Product (MDSCP). Comparisons with current snow-cover products show that the MDSCP draws together the advantages of each of its component products while minimizing their potential errors. Improved estimates of the snow-covered area are derived through the addition of two snow-cover classes ("thin or patchy" and "high elevation" snow cover) and from the analysis of the climate station data within each class. The compatibility of this method with Moderate Resolution Imaging Spectroradiometer (MODIS) data, which will become available in 2000, is also discussed. With the assimilation of these data, the resolution of the MDSCP would be improved both spatially and temporally and the analysis would become completely automated.

  10. Body-object interaction ratings for 1,618 monosyllabic nouns.

    PubMed

    Tillotson, Sherri M; Siakaluk, Paul D; Pexman, Penny M

    2008-11-01

    Body-object interaction (BOI) assesses the ease with which a human body can physically interact with a word's referent. Recent research has shown that BOI influences visual word recognition processes in such a way that responses to high-BOI words (e.g., couch) are faster and less error prone than responses to low-BOI words (e.g., cliff). Importantly, the high-BOI words and the low-BOI words that were used in those studies were matched on imageability. In the present study, we collected BOI ratings for a large set of words. BOI ratings, on a 1-7 scale, were obtained for 1,618 monosyllabic nouns. These ratings allowed us to test the generalizability of BOI effects to a large set of items, and they should be useful to researchers who are interested in manipulating or controlling for the effects of BOI. The body-object interaction ratings for this study may be downloaded from the Psychonomic Society's Archive of Norms, Stimuli, and Data, www.psychonomic.org/archive.

  11. Digitally synthesized beat frequency-multiplexed fluorescence lifetime spectroscopy

    PubMed Central

    Chan, Jacky C. K.; Diebold, Eric D.; Buckley, Brandon W.; Mao, Sien; Akbari, Najva; Jalali, Bahram

    2014-01-01

    Frequency domain fluorescence lifetime imaging is a powerful technique that enables the observation of subtle changes in the molecular environment of a fluorescent probe. This technique works by measuring the phase delay between the optical emission and excitation of fluorophores as a function of modulation frequency. However, high-resolution measurements are time consuming, as the excitation modulation frequency must be swept, and faster low-resolution measurements at a single frequency are prone to large errors. Here, we present a low cost optical system for applications in real-time confocal lifetime imaging, which measures the phase vs. frequency spectrum without sweeping. Dubbed Lifetime Imaging using Frequency-multiplexed Excitation (LIFE), this technique uses a digitally-synthesized radio frequency comb to drive an acousto-optic deflector, operated in a cat's-eye configuration, to produce a single laser excitation beam modulated at multiple beat frequencies. We demonstrate simultaneous fluorescence lifetime measurements at 10 frequencies over a bandwidth of 48 MHz, enabling high speed frequency domain lifetime analysis of single- and multi-component sample mixtures. PMID:25574449
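
    For a single-exponential decay, the measured phase at angular modulation frequency omega obeys tan(phi) = omega * tau, so phases acquired simultaneously at the multiplexed beat frequencies yield the lifetime via a least-squares fit through the origin. The sketch below uses synthetic values; the frequency comb and the lifetime are placeholders, not the paper's measured data.

        # Frequency-domain lifetime estimation from multi-frequency phase data.
        import numpy as np

        omega = 2 * np.pi * np.array([5, 10, 15, 20, 25]) * 1e6  # rad/s
        tau_true = 4e-9                                          # 4 ns lifetime
        phi = np.arctan(omega * tau_true)                        # simulated phases

        # tan(phi) = omega * tau  =>  least-squares slope through the origin
        tau_hat = omega @ np.tan(phi) / (omega @ omega)
        print(f"estimated lifetime: {tau_hat * 1e9:.2f} ns")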

  12. FANCJ suppresses microsatellite instability and lymphomagenesis independent of the Fanconi anemia pathway.

    PubMed

    Matsuzaki, Kenichiro; Borel, Valerie; Adelman, Carrie A; Schindler, Detlev; Boulton, Simon J

    2015-12-15

    Microsatellites are short tandem repeat sequences that are highly prone to expansion/contraction due to their propensity to form non-B-form DNA structures, which hinder DNA polymerases and provoke template slippage. Although error correction by mismatch repair plays a key role in preventing microsatellite instability (MSI), which is a hallmark of Lynch syndrome, activities must also exist that unwind secondary structures to facilitate replication fidelity. Here, we report that Fancj helicase-deficient mice, while phenotypically resembling Fanconi anemia (FA), are also hypersensitive to replication inhibitors and predisposed to lymphoma. Whereas metabolism of G4-DNA structures is largely unaffected in Fancj(-/-) mice, high levels of spontaneous MSI occur, which is exacerbated by replication inhibition. In contrast, MSI is not observed in Fancd2(-/-) mice but is prevalent in human FA-J patients. Together, these data implicate FANCJ as a key factor required to counteract MSI, which is functionally distinct from its role in the FA pathway. © 2015 Matsuzaki et al.; Published by Cold Spring Harbor Laboratory Press.

  13. Computer-aided target tracking in motion analysis studies

    NASA Astrophysics Data System (ADS)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.
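
    A generic frame-by-frame tracking loop of this kind can be sketched with normalized cross-correlation template matching; this is an illustration under assumed parameters (the 0.8 acceptance score), not the system's actual algorithm.

        # Sketch: locate a fiducial template in each frame by normalized
        # cross-correlation and reject weak matches.
        import cv2

        def track(frames, template):
            """frames: iterable of grayscale images; template: target patch."""
            positions = []
            for frame in frames:
                score = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
                _, max_val, _, max_loc = cv2.minMaxLoc(score)
                positions.append(max_loc if max_val > 0.8 else None)
            return positions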

  14. A hybrid computational-experimental approach for automated crystal structure solution

    NASA Astrophysics Data System (ADS)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  15. ImmuneDB: a system for the analysis and exploration of high-throughput adaptive immune receptor sequencing data.

    PubMed

    Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri

    2017-01-15

    As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast numbers of heavy-chain variable region sequences and exploring the resulting data. It can take raw FASTA/FASTQ data as input, identify genes, determine clones, and construct lineages, as well as provide information such as selection pressure and mutation analysis. It uses an industry-leading database, MySQL, to provide fast analysis and avoid the complexities of error-prone flat files. ImmuneDB is freely available at http://immunedb.com. A demo of the ImmuneDB web interface is available at http://immunedb.com/demo. Contact: Uh25@drexel.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. TRIP13 promotes error-prone nonhomologous end joining and induces chemoresistance in head and neck cancer

    PubMed Central

    Banerjee, Rajat; Russo, Nickole; Liu, Min; Basrur, Venkatesha; Bellile, Emily; Palanisamy, Nallasivam; Scanlon, Christina S.; van Tubergen, Elizabeth; Inglehart, Ronald C.; Metwally, Tarek; Mani, Ram-Shankar; Yocum, Anastasia; Nyati, Mukesh K.; Castilho, Rogerio M.; Varambally, Sooryanarayana; Chinnaiyan, Arul M.

    2014-01-01

    Squamous cell carcinoma of the head and neck (SCCHN) is a common, aggressive, treatment-resistant cancer with a high recurrence rate and mortality, but the mechanism of treatment resistance remains unclear. Here we describe a mechanism by which the AAA-ATPase TRIP13 promotes treatment resistance. Overexpression of TRIP13 in non-malignant cells results in malignant transformation. High expression of TRIP13 in SCCHN leads to aggressive, treatment-resistant tumors and enhanced repair of DNA damage. Using mass spectrometry, we identify DNA-PKcs complex proteins that mediate nonhomologous end joining (NHEJ) as TRIP13-binding partners. Using repair-deficient reporter systems, we show that TRIP13 promotes NHEJ, even when homologous recombination is intact. Importantly, overexpression of TRIP13 sensitizes SCCHN to an inhibitor of DNA-PKcs. Thus, this study defines a new mechanism of treatment resistance in SCCHN and underscores the importance of targeting NHEJ to overcome treatment failure in SCCHN and potentially in other cancers that overexpress TRIP13. PMID:25078033

  17. Prone positioning in the patient who has acute respiratory distress syndrome: the art and science.

    PubMed

    Vollman, Kathleen M

    2004-09-01

    Acute respiratory distress syndrome (ARDS) remains a significant contributor to the morbidity and mortality of patients in the ICU. A variety of treatments are used to support the lung of the patient who has ARDS and improve gas exchange during the acute injury phase. It seems, however, that the simple, safe, and noninvasive act of prone positioning of the critically ill patient who has ARDS may improve gas exchange while preventing potential complications of high positive end-expiratory pressure, volutrauma, and oxygen toxicity. This article provides the critical care nurse with the physiologic rationale for use of the prone position, indications and contraindications for use, safe strategies for prone positioning, and care techniques and monitoring methods of the patient who is in the prone position.

  18. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background: Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods: We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results: We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions: Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
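
    The comparison rule described above can be sketched in a few lines, assuming the outputs of two parallel model versions are available as numeric arrays; the +/-5% materiality threshold is the one defined in the abstract, while the function and array names are illustrative.

        # Sketch: flag cells whose relative difference between parallel model
        # versions exceeds the material-error threshold.
        import numpy as np

        def material_errors(version_a, version_b, tol=0.05):
            a = np.asarray(version_a, float)
            b = np.asarray(version_b, float)
            rel_diff = (a - b) / np.where(b != 0, b, np.nan)
            return np.abs(rel_diff) > tol   # boolean mask of material errors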

  20. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  2. Modeling the Error of the Medtronic Paradigm Veo Enlite Glucose Sensor.

    PubMed

    Biagi, Lyvia; Ramkissoon, Charrise M; Facchinetti, Andrea; Leal, Yenny; Vehi, Josep

    2017-06-12

    Continuous glucose monitors (CGMs) are prone to inaccuracy due to time lags, sensor drift, calibration errors, and measurement noise. The aim of this study is to derive the model of the error of the second-generation Medtronic Paradigm Veo Enlite (ENL) sensor and compare it with the Dexcom SEVEN PLUS (7P), G4 PLATINUM (G4P), and advanced G4 for Artificial Pancreas studies (G4AP) systems. An enhanced version of a previously employed technique was utilized to dissect the sensor error into several components. The dataset used included 37 inpatient sessions in 10 subjects with type 1 diabetes (T1D), in which CGMs were worn in parallel and blood glucose (BG) samples were analyzed every 15 ± 5 min. Calibration error and sensor drift of the ENL sensor were best described by a linear relationship related to the gain and offset. The mean time lag estimated by the model is 9.4 ± 6.5 min. The overall average mean absolute relative difference (MARD) of the ENL sensor was 11.68 ± 5.07%. Calibration error had the highest contribution to total error in the ENL sensor, as was also reported for the 7P, G4P, and G4AP. The model of the ENL sensor error will be useful for testing the in silico performance of CGM-based applications, e.g., the artificial pancreas, employing this kind of sensor.
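
    As a hedged illustration of the error structure the model dissects (a linear gain/offset calibration error, a time lag, and additive noise), a synthetic ENL-like trace can be simulated as below; only the 9.4-min mean lag comes from the abstract, and all other parameter values are placeholders rather than the paper's fitted values.

        # Simulate a CGM trace as gain/offset-distorted, lagged BG plus noise.
        import numpy as np

        def simulate_cgm(bg, dt_min=1.0, lag_min=9.4, gain=1.02,
                         offset=-3.0, noise_sd=2.0, seed=0):
            """bg: blood-glucose series sampled every dt_min minutes."""
            bg = np.asarray(bg, float)
            rng = np.random.default_rng(seed)
            lag = int(round(lag_min / dt_min))            # lag in samples
            lagged = np.concatenate([np.full(lag, bg[0]), bg[:-lag or None]])
            return gain * lagged + offset + rng.normal(0.0, noise_sd, bg.size)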

  3. Fidelity of DNA Replication in Normal and Malignant Human Breast Cells

    DTIC Science & Technology

    1998-07-01

    synthesome has been extensively demonstrated to carry out full-length DNA replication in vitro, and to accurately depict the DNA replication process as it... occurs in the intact cell. By examining the fidelity of the DNA replication process carried out by the DNA synthesome from a number of breast cell types... we have demonstrated for the first time that the cellular DNA replication machinery of malignant human breast cells is significantly more error-prone than that of non-malignant human breast cells.

  4. Addition to the Lewis Chemical Equilibrium Program to allow computation from coal composition data

    NASA Technical Reports Server (NTRS)

    Sevigny, R.

    1980-01-01

    Changes made to the Coal Gasification Project are reported. The program was originally developed for equilibrium combustion calculations in rocket engines; it can be applied directly to the entrained-flow coal gasification process. The particular problem addressed is the reduction of coal composition data into a form suitable for the program, since the manual process is involved and error-prone. A similar problem, relating the normal output of the program to parameters meaningful to the coal gasification process, is also addressed.

  5. Effects of Non-Normal Outlier-Prone Error Distribution on Kalman Filter Track

    DTIC Science & Technology

    1991-09-01

    other possibilities exist. For example, the GST (Generic Statistical Tracker) uses four motion models [Ref. 4]. The GST keeps track of both the target... Although this procedure is not easily statistically interpretable, it was used for the sake of comparison with the other... [The remainder of the excerpt is extraction residue: a fragment of a numeric results table and a Fortran listing prompting for a choice of target motion model, including second-order Gauss-Markov and random-tour targets.]

  6. Computer Aided Software Engineering (CASE) Environment Issues.

    DTIC Science & Technology

    1987-06-01

    tasks tend to be error prone and slow when done by humans. They are excellent candidates for automation using a computer. (MacLennan, 1981, p. 51)... CASE resources; human resources, consisting of the people who use and, in the case of manual resources, facilitate utilization of the environment... engineering process in a given environment... the nature of manual and human resources. CASE resources should provide the software engineering team

  7. Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual

    DTIC Science & Technology

    1988-12-01

    The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment System (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIVD will reduce the amount of time spent on tedious and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.

  8. Error Estimation of Pathfinder Version 5.3 SST Level 3C Using Three-way Error Analysis

    NASA Astrophysics Data System (ADS)

    Saha, K.; Dash, P.; Zhao, X.; Zhang, H. M.

    2017-12-01

    One of the essential climate variables for monitoring, detecting, and attributing climate change is Sea Surface Temperature (SST). A long-term record of global SSTs is available, spanning early ship-based observations through modern measurements from in-situ and space-based (satellite/aircraft) sensors. Satellite-derived SSTs carry inaccuracies attributable to errors in spacecraft navigation, sensor calibration, sensor noise, retrieval algorithms, and leakage from residual clouds. It is therefore important to estimate the errors in satellite-derived SST products accurately to obtain reliable results in their applications. For validation purposes, satellite-derived SST products are generally compared against in-situ SSTs, which themselves carry inaccuracies as well as spatio-temporal mismatches relative to the satellite measurements. The standard deviation of the difference field therefore usually contains contributions from both the satellite and the in-situ measurements. A true validation of any geophysical variable requires knowledge of the "true" value of that variable, so a one-to-one comparison of satellite-based SST with in-situ data does not by itself provide the real error in the satellite SST; ambiguity remains due to errors in the in-situ measurements and their collocation differences. Triple collocation (TC), or three-way error analysis using three mutually independent error-prone measurements, can be used to estimate the root-mean-square error (RMSE) associated with each of the measurements with a high level of accuracy, without treating any one system as a perfectly observed "truth". In this study we estimate the absolute random errors associated with the Pathfinder Version 5.3 Level-3C SST Climate Data Record. Along with the in-situ SST data, the third dataset used for this analysis is the AATSR Reprocessing for Climate (ARC) dataset for the corresponding period. All three SST observations are collocated, and statistics of the differences between each pair are estimated. Rather than a traditional TC analysis, we implement the Extended Triple Collocation (ETC) approach to estimate, for each measurement system, the correlation coefficient with respect to the unknown target variable along with its RMSE.
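
    The classical triple-collocation estimate underlying this analysis can be written down directly: with three collocated, bias-adjusted measurement series whose errors are mutually independent, the RMSE of each system follows from products of pairwise differences. Below is a minimal sketch with synthetic data; the noise levels are arbitrary placeholders.

        # Triple collocation: RMSE of each system from pairwise differences.
        import numpy as np

        def triple_collocation_rmse(a, b, c):
            a, b, c = (np.asarray(x, float) - np.mean(x) for x in (a, b, c))
            # E[(a-b)(a-c)] isolates system a's error variance, and cyclically.
            return (np.sqrt(np.mean((a - b) * (a - c))),
                    np.sqrt(np.mean((b - a) * (b - c))),
                    np.sqrt(np.mean((c - a) * (c - b))))

        rng = np.random.default_rng(2)
        truth = rng.normal(20.0, 2.0, 100000)               # synthetic "true" SST
        sat    = truth + rng.normal(0.0, 0.4, truth.size)   # three error-prone
        insitu = truth + rng.normal(0.0, 0.2, truth.size)   # measurement systems
        arc    = truth + rng.normal(0.0, 0.1, truth.size)
        print(triple_collocation_rmse(sat, insitu, arc))    # ~ (0.4, 0.2, 0.1)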

  9. Gamification of Clinical Routine: The Dr. Fill Approach.

    PubMed

    Bukowski, Mark; Kühn, Martin; Zhao, Xiaoqing; Bettermann, Ralf; Jonas, Stephan

    2016-01-01

    Gamification is used in clinical contexts in health care education. Furthermore, it has shown great promise for improving the performance of health care staff in their daily routine. In this work we focus on the medication sorting task, which is performed manually in hospitals. This task is very error-prone and must be performed daily, yet medication errors are critical and can lead to serious complications. We present a real-world gamification approach to the medication sorting task in a patient's daily pill organizer. The player of the game must sort the correct medication into the correct dispenser slots and is rewarded or penalized in real time. At the end of the game, a score is given and the user can register in a leaderboard.

  10. Predicted Errors In Children's Early Sentence Comprehension

    PubMed Central

    Gertner, Yael; Fisher, Cynthia

    2012-01-01

    Children use syntax to interpret sentences and learn verbs; this is syntactic bootstrapping. The structure-mapping account of early syntactic bootstrapping proposes that a partial representation of sentence structure, the set of nouns occurring with the verb, guides initial interpretation and provides an abstract format for new learning. This account predicts early successes, but also telltale errors: Toddlers should be unable to tell transitive sentences from other sentences containing two nouns. In testing this prediction, we capitalized on evidence that 21-month-olds use what they have learned about noun order in English sentences to understand new transitive verbs. In two experiments, 21-month-olds applied this noun-order knowledge to two-noun intransitive sentences, mistakenly assigning different interpretations to “The boy and the girl are gorping!” and “The girl and the boy are gorping!”. This suggests that toddlers exploit partial representations of sentence structure to guide sentence interpretation; these sparse representations are useful, but error-prone. PMID:22525312

  11. Metrics to quantify the importance of mixing state for CCN activity

    DOE PAGES

    Ching, Joseph; Fast, Jerome; West, Matthew; ...

    2017-06-21

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75%. For more externally mixed populations (χ below 20%) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40% to about 150%, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.
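
    For reference, the mixing state index of Riemer and West (2013) can be computed from a per-particle species-mass matrix; the array layout below is an assumption of this sketch, not the particle-resolved model's actual data format.

        # chi = (D_alpha - 1) / (D_gamma - 1), with D_alpha the average
        # per-particle species diversity and D_gamma the bulk diversity.
        import numpy as np

        def mixing_state_index(m):
            m = np.asarray(m, float)          # shape: (n_particles, n_species)
            mu = m.sum(axis=1)                # per-particle total mass
            p_ia = m / mu[:, None]            # per-particle species fractions
            p_a = m.sum(axis=0) / m.sum()     # bulk species fractions
            with np.errstate(divide='ignore', invalid='ignore'):
                H_i = -np.nansum(p_ia * np.log(p_ia), axis=1)
                H_gamma = -np.nansum(p_a * np.log(p_a))
            H_alpha = (mu / mu.sum() * H_i).sum()   # mass-weighted average
            return (np.exp(H_alpha) - 1.0) / (np.exp(H_gamma) - 1.0)

        # Two identical particles form a fully internal mixture: chi = 1.
        print(mixing_state_index([[1.0, 1.0], [2.0, 2.0]]))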

  12. Landmark-based elastic registration using approximating thin-plate splines.

    PubMed

    Rohr, K; Stiehl, H S; Sprengel, R; Buzug, T M; Weese, J; Kuhn, M H

    2001-06-01

    We consider elastic image registration based on a set of corresponding anatomical point landmarks and approximating thin-plate splines. This approach extends the original interpolating thin-plate spline approach and makes it possible to take landmark localization errors into account. The extension is important for clinical applications since landmark extraction is always prone to error. Our approach is based on a minimizing functional and can cope with isotropic as well as anisotropic landmark errors. In particular, in the anisotropic case it is possible to include different types of landmarks, e.g., unique point landmarks as well as arbitrary edge points. The scheme is also general with respect to the image dimension and the order of smoothness of the underlying functional. Optimal affine transformations as well as interpolating thin-plate splines are special cases of this scheme. To localize landmarks we use a semi-automatic approach based on three-dimensional (3-D) differential operators. Experimental results are presented for two-dimensional as well as 3-D tomographic images of the human brain.
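
    A rough sketch of the approximating (rather than interpolating) behavior, using SciPy's thin-plate-spline RBF with a smoothing term standing in for an isotropic landmark-error weight; the anisotropic scheme in the paper requires a custom solver, and all values below are synthetic.

        # Smoothed thin-plate-spline mapping between corresponding landmarks.
        import numpy as np
        from scipy.interpolate import RBFInterpolator

        rng = np.random.default_rng(3)
        src = rng.random((20, 2))                        # source-image landmarks
        dst = src + 0.01 * rng.standard_normal((20, 2))  # noisy correspondences

        warp = RBFInterpolator(src, dst, kernel='thin_plate_spline',
                               smoothing=1e-3)           # smoothing > 0: approximate
        mapped = warp(src)   # close to, but not exactly through, dst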

  13. Diagnosing Crime and Diagnosing Disease: Bias Reduction Strategies in the Forensic and Clinical Sciences.

    PubMed

    Lockhart, Joseph J; Satya-Murti, Saty

    2017-11-01

    Cognitive effort is an essential part of both forensic and clinical decision-making. Errors occur in both fields because the cognitive process is complex and prone to bias. We performed a selective review of full-text English language literature on cognitive bias leading to diagnostic and forensic errors. Earlier work (1970-2000) concentrated on classifying and raising bias awareness. Recently (2000-2016), the emphasis has shifted toward strategies for "debiasing." While the forensic sciences have focused on the control of misleading contextual cues, clinical debiasing efforts have relied on checklists and hypothetical scenarios. No single generally applicable and effective bias reduction strategy has emerged so far. Generalized attempts at bias elimination have not been particularly successful. It is time to shift focus to the study of errors within specific domains, and how to best communicate uncertainty in order to improve decision making on the part of both the expert and the trier-of-fact. © 2017 American Academy of Forensic Sciences.

  14. Metrics to quantify the importance of mixing state for CCN activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ching, Joseph; Fast, Jerome; West, Matthew

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify the error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75%. For more externally mixed populations (χ below 20%), the relationship between χ and the error in CCN predictions is not unique and ranges from below -40% to about 150%, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.
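
    As a concrete reading of the metric: χ is built from mass-fraction entropies, χ = (D_alpha - 1)/(D_gamma - 1), where D_alpha is the mass-weighted average per-particle species diversity and D_gamma the diversity of the bulk population. The following sketch computes χ from a per-particle species mass matrix under that published definition; the two-particle populations at the end are toy inputs chosen to hit the two extremes.

```python
# Mixing state index chi of Riemer and West (2013) from a
# (particles x species) mass matrix.
import numpy as np

def entropy(p):
    p = p[p > 0.0]
    return -np.sum(p * np.log(p))

def mixing_state_index(mass):
    mass = np.asarray(mass, dtype=float)
    mu_i = mass.sum(axis=1)                        # per-particle total mass
    p_i = mu_i / mu_i.sum()                        # particle mass fractions
    p_ia = mass / mu_i[:, None]                    # species fractions per particle
    H_i = np.array([entropy(row) for row in p_ia])
    D_alpha = np.exp(np.sum(p_i * H_i))            # avg per-particle diversity
    D_gamma = np.exp(entropy(mass.sum(axis=0) / mass.sum()))  # bulk diversity
    return (D_alpha - 1.0) / (D_gamma - 1.0)

print(mixing_state_index([[1.0, 0.0], [0.0, 1.0]]))    # 0.0: fully external
print(mixing_state_index([[0.5, 0.5], [0.5, 0.5]]))    # 1.0: fully internal
```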

  15. Strongest Earthquake-Prone Areas in Kamchatka

    NASA Astrophysics Data System (ADS)

    Dzeboev, B. A.; Agayan, S. M.; Zharkikh, Yu. I.; Krasnoperov, R. I.; Barykina, Yu. V.

    2018-03-01

    The paper continues the series of our works on recognizing the areas prone to the strongest, strong, and significant earthquakes with the use of the Formalized Clustering And Zoning (FCAZ) intellectual clustering system. We recognized the zones prone to the probable emergence of epicenters of the strongest (M ≥ 7.75) earthquakes on the Pacific Coast of Kamchatka. The FCAZ zones are compared to the zones that were recognized in 1984 by the classical Earthquake-Prone Areas (EPA) recognition method, by transferring the criteria of high seismicity from the Andes mountain belt to the territory of Kamchatka. The FCAZ recognition was carried out with both two-dimensional and three-dimensional objects of recognition.

  16. Mapping radon-prone areas using γ-radiation dose rate and geological information.

    PubMed

    García-Talavera, M; García-Pérez, A; Rey, C; Ramos, L

    2013-09-01

    Identifying radon-prone areas is key to policies on the control of this environmental carcinogen. In the current paper, we present the methodology followed to delineate radon-prone areas in Spain. It combines information from indoor radon measurements with γ-radiation and geological maps. The advantage of the proposed approach is that it lessens the requirement for a high density of measurements by making use of commonly available information. It can be applied for an initial definition of radon-prone areas in countries committed to introducing a national radon policy or to improving existing radon maps in low population regions.

  17. ChromatoGate: A Tool for Detecting Base Mis-Calls in Multiple Sequence Alignments by Semi-Automatic Chromatogram Inspection

    PubMed Central

    Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros

    2013-01-01

    Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors. PMID:24688709
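
    To make the pre-filtering step concrete, here is an illustrative scan in the spirit of the description above (not the tool's actual code, and the threshold semantics are an assumption): alignment columns whose minor-character count falls at or below a user-defined threshold are flagged as candidate mis-calls whose chromatogram peaks deserve inspection.

```python
# Illustrative MSA pre-filter: flag columns where a rare variant could be a
# sequencing mis-call, so only those chromatogram positions need inspection.
from collections import Counter

def flag_candidate_miscalls(msa, max_minor_count=1):
    """msa: list of equal-length sequences. Returns [(column, counts)]."""
    flagged = []
    for col in range(len(msa[0])):
        counts = Counter(seq[col] for seq in msa)
        counts.pop('N', None)                 # ignore undetermined calls
        if len(counts) > 1:
            minor = sum(counts.values()) - max(counts.values())
            if minor <= max_minor_count:      # rare variant: inspect chromatogram
                flagged.append((col, dict(counts)))
    return flagged

msa = ["ACGTACGT",
       "ACGTACGT",
       "ACGAACGT",                            # position 3: singleton 'A'
       "ACGTACGT"]
print(flag_candidate_miscalls(msa))           # -> [(3, {'T': 3, 'A': 1})]
```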

  18. ChromatoGate: A Tool for Detecting Base Mis-Calls in Multiple Sequence Alignments by Semi-Automatic Chromatogram Inspection.

    PubMed

    Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros

    2013-01-01

    Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors.

  19. A mutation in EXO1 defines separable roles in DNA mismatch repair and post-replication repair

    PubMed Central

    Tran, Phuoc T.; Fey, Julien P.; Erdeniz, Naz; Gellon, Lionel; Boiteux, Serge; Liskay, R. Michael

    2007-01-01

    Replication forks stall at DNA lesions or as a result of an unfavorable replicative environment. These fork stalling events have been associated with recombination and gross chromosomal rearrangements. Recombination and fork bypass pathways are the mechanisms responsible for the restart of stalled forks. An important lesion bypass mechanism is the highly conserved post-replication repair (PRR) pathway, which is composed of error-prone translesion and error-free bypass branches. EXO1 codes for a Rad2p family member nuclease that has been implicated in a multitude of eukaryotic DNA metabolic pathways, including DNA repair, recombination, replication, and telomere integrity. In this report, we show that EXO1 functions in the MMS2 error-free branch of the PRR pathway independent of the role of EXO1 in DNA mismatch repair (MMR). Consistent with the idea that EXO1 functions independently in two separate pathways, we defined a domain of Exo1p required for PRR distinct from those required for interaction with MMR proteins. We then generated a point mutant exo1 allele that was defective for the function of Exo1p in MMR due to disrupted interaction with Mlh1p, but still functional for PRR. Lastly, by using a compound exo1 mutant that was defective for interaction with Mlh1p and deficient for nuclease activity, we provide further evidence that Exo1p plays both structural and catalytic roles during MMR. PMID:17602897

  20. Higher mental workload is associated with poorer laparoscopic performance as measured by the NASA-TLX tool.

    PubMed

    Yurko, Yuliya Y; Scerbo, Mark W; Prabhu, Ajita S; Acker, Christina E; Stefanidis, Dimitrios

    2010-10-01

    Increased workload during task performance may increase fatigue and facilitate errors. The National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is a previously validated tool for workload self-assessment. We assessed the relationship of workload and performance during simulator training on a complex laparoscopic task. NASA-TLX workload data from three separate trials were analyzed. All participants were novices (n = 28), followed the same curriculum on the fundamentals of laparoscopic surgery suturing model, and were tested in the animal operating room (OR) on a Nissen fundoplication model after training. Performance and workload scores were recorded at baseline, after proficiency achievement, and during the test. Performance, NASA-TLX scores, and inadvertent injuries during the test were analyzed and compared. Workload scores declined during training and mirrored performance changes. NASA-TLX scores correlated significantly with performance scores (r = -0.5, P < 0.001). Participants with higher workload scores caused more inadvertent injuries to adjacent structures in the OR (r = 0.38, P < 0.05). Increased mental and physical workload scores at baseline correlated with higher workload scores in the OR (r = 0.52-0.82; P < 0.05) and more inadvertent injuries (r = 0.52, P < 0.01). Increased workload is associated with inferior task performance and higher likelihood of errors. The NASA-TLX questionnaire accurately reflects workload changes during simulator training and may identify individuals more likely to experience high workload and more prone to errors during skill transfer to the clinical environment.

  1. On-field mounting position estimation of a lidar sensor

    NASA Astrophysics Data System (ADS)

    Khan, Owes; Bergelt, René; Hardt, Wolfram

    2017-10-01

    In order to retrieve a highly accurate view of their environment, autonomous cars are often equipped with LiDAR sensors. These sensors deliver a three-dimensional point cloud in their own coordinate frame, where the origin is the sensor itself. However, the common coordinate system required by HAD (Highly Autonomous Driving) software systems has its origin at the center of the vehicle's rear axle. Thus, a transformation of the acquired point clouds to car coordinates is necessary, and thereby the determination of the exact mounting position of the LiDAR system in car coordinates is required. Unfortunately, directly measuring this position is a time-consuming and error-prone task. Therefore, different approaches have been suggested for its estimation, most of which require an exhaustive test setup and are again time-consuming to prepare. When preparing a high number of LiDAR-mounted test vehicles for data acquisition, most approaches fall short due to time or money constraints. In this paper we propose an approach for mounting position estimation that features easy execution and setup, thus making it feasible for on-field calibration.

  2. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content

    PubMed Central

    Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

    One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods based on the color of the endosperm and embryo seeds are slow, manual and prone to error. On the other hand, there exists a significant difference in oil content between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system is comprised of a sampler unit that selects a single kernel to feed for measurement of NMR and weight, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed. PMID:27454427

  3. DEM-based Approaches for the Identification of Flood Prone Areas

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Manfreda, Salvatore; Nardi, Fernando; Grimaldi, Salvatore; Roth, Giorgio; Sole, Aurelia

    2013-04-01

    The remarkable number of inundations that have caused, in recent decades, thousands of deaths and huge economic losses testifies to the extreme vulnerability of many countries to flood hazard. As a matter of fact, human activities are often developed in floodplains, creating conditions of extremely high risk. Terrain morphology plays an important role in understanding, modelling and analyzing the hydraulic behaviour of flood waves. Research during the last 10 years has shown that the delineation of flood-prone areas can be carried out using fast methods that rely on basin geomorphologic features. In fact, the availability of new technologies to measure surface elevation (e.g., GPS, SAR, SAR interferometry, RADAR and LASER altimetry) has given a strong impulse to the development of Digital Elevation Model (DEM) based approaches. The identification of the dominant topographic controls on the flood inundation process is a critical research question that we tackle with a comparative analysis of several techniques. We reviewed four different approaches for the morphological characterization of a river basin with the aim of describing their performance and identifying their range of applicability. In particular, we explored the potential of the following tools. 1) The hydrogeomorphic method proposed by Nardi et al. (2006), which defines the flood-prone areas according to the water level in the river network through the hydrogeomorphic theory. 2) The linear binary classifier proposed by Degiorgis et al. (2012), which distinguishes flood-prone areas using two features related to the location of the site under exam with respect to the nearest hazard source: the length of the path that hydrologically connects the location under exam to the nearest element of the drainage network, and the difference in elevation between the cell under exam and the final point of that path. 3) The method by Manfreda et al. (2011), which suggested a modified Topographic Index (TIm) for the identification of flood-prone areas. 4) The downslope index proposed by Hjerdt et al. (2004), which quantifies the topographic controls on hydrology by evaluating head differences along the (surface) flow path in the steepest direction; see the sketch after this paragraph. This method does not use the exit point at the stream as reference; instead, the algorithm looks at how far a parcel of water has to travel along its flow path to lose a given head potential, d [m]. This last index was not defined with the aim of describing flood-prone areas; it nonetheless represents an interesting alternative descriptor of morphological features that deserves to be tested. Analyses have been carried out for several Italian catchments. The outcomes of the four methods are presented using, for calibration and validation purposes, flood inundation maps made available by River Basin Authorities. The aim is, therefore, to evaluate the reliability and the relative errors in the detection of the areas subject to flooding hazard. These techniques should not be considered as alternatives to traditional procedures, but as additional tools for the identification of flood-prone areas and hazard graduation over large regions or when a preliminary identification is needed. References: Degiorgis, M., G. Gnecco, S. Gorni, G. Roth, M. Sanguineti, A.C. Taramasso, Classifiers for the detection of flood-prone areas using remote sensed elevation data, J. Hydrol., 470-471, 302-315, 2012. Hjerdt, K.N., J.J. McDonnell, J. Seibert, A. Rodhe, A new topographic index to quantify downslope controls on local drainage, Water Resour. Res., 40, W05602, 2004. Manfreda, S., M. Di Leo, A. Sole, Detection of flood prone areas using Digital Elevation Models, Journal of Hydrologic Engineering, 16(10), 781-790, 2011. Nardi, F., E.R. Vivoni, S. Grimaldi, Investigating a floodplain scaling relation using a hydrogeomorphic delineation method, Water Resour. Res., 42, W09409, 2006.
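
    Of the four tools, the downslope index is the simplest to state compactly: tan(beta_d) = d / L_d, where L_d is the distance travelled along the steepest-descent flow path before the elevation drop reaches d. A minimal sketch under stated assumptions (a pre-computed 1-D elevation profile along that path and a uniform cell spacing; the numbers are illustrative):

```python
# Downslope index tan(beta_d) of Hjerdt et al. (2004): walk down the flow
# path from a cell until the elevation drop reaches d metres, then divide
# d by the distance travelled.
import math

def downslope_index(path_z, spacing, d=5.0):
    z0 = path_z[0]
    dist = 0.0
    for z_next in path_z[1:]:
        dist += spacing
        if z0 - z_next >= d:
            return math.atan2(d, dist)        # angle whose tangent is d / dist
    return None                               # head drop d never reached

profile = [100.0, 98.5, 96.0, 94.2, 93.0]     # elevations along the flow path (m)
beta_d = downslope_index(profile, spacing=30.0, d=5.0)
print(math.tan(beta_d))                       # the downslope index tan(beta_d)
```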

  4. Delusion proneness and 'jumping to conclusions': relative and absolute effects.

    PubMed

    van der Leer, L; Hartig, B; Goldmanis, M; McKay, R

    2015-04-01

    That delusional and delusion-prone individuals 'jump to conclusions' is one of the most robust and important findings in the literature on delusions. However, although the notion of 'jumping to conclusions' (JTC) implies gathering insufficient evidence and reaching premature decisions, previous studies have not investigated whether the evidence gathering of delusion-prone individuals is, in fact, suboptimal. The standard JTC effect is a relative effect but using relative comparisons to substantiate absolute claims is problematic. In this study we investigated whether delusion-prone participants jump to conclusions in both a relative and an absolute sense. Healthy participants (n = 112) completed an incentivized probabilistic reasoning task in which correct decisions were rewarded and additional information could be requested for a small price. This combination of rewards and costs generated optimal decision points. Participants also completed measures of delusion proneness, intelligence and risk aversion. Replicating the standard relative finding, we found that delusion proneness significantly predicted task decisions, such that the more delusion prone the participants were, the earlier they decided. This finding was robust when accounting for the effects of risk aversion and intelligence. Importantly, high-delusion-prone participants also decided in advance of an objective rational optimum, gathering fewer data than would have maximized their expected payoff. Surprisingly, we found that even low-delusion-prone participants jumped to conclusions in this absolute sense. Our findings support and clarify the claim that delusion formation is associated with a tendency to 'jump to conclusions'. In short, most people jump to conclusions, but more delusion-prone individuals 'jump further'.
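
    The notion of an "objective rational optimum" can be made concrete with a toy calculation (hypothetical payoffs, not the study's actual parameters): with an 85/15 bead ratio, a fixed prize for a correct urn judgment, and a fixed price per bead, the expected payoff first rises and then falls with the number of draws, so an optimal stopping point exists.

```python
# Toy beads-task economics: each extra draw costs something but raises the
# chance of naming the majority urn correctly, giving an interior optimum.
from math import comb

P, REWARD, COST = 0.85, 20.0, 0.5   # bead ratio, prize, price per bead

def p_correct(n):
    # probability the majority colour among n draws matches the true urn;
    # ties (even n) are resolved by a fair guess.
    win = sum(comb(n, k) * P**k * (1 - P)**(n - k) for k in range(n // 2 + 1, n + 1))
    if n % 2 == 0:
        win += 0.5 * comb(n, n // 2) * P**(n // 2) * (1 - P)**(n // 2)
    return win

payoffs = {n: REWARD * p_correct(n) - COST * n for n in range(1, 21)}
best = max(payoffs, key=payoffs.get)
print(best, round(payoffs[best], 2))   # deciding earlier or later loses money
```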

  5. A Modified Obesity Proneness Model Predicts Adolescent Weight Concerns and Inability to Self-Regulate Eating

    ERIC Educational Resources Information Center

    Nickelson, Jen; Bryant, Carol A.; McDermott, Robert J.; Buhi, Eric R.; DeBate, Rita D.

    2012-01-01

    Background: The prevalence of obesity among high school students has risen in recent decades. Many high school students report trying to lose weight and some engage in disordered eating to do so. The obesity proneness model suggests that parents may influence their offspring's development of disordered eating. This study examined the viability of…

  6. Time-symmetric integration in astrophysics

    NASA Astrophysics Data System (ADS)

    Hernandez, David M.; Bertschinger, Edmund

    2018-04-01

    Calculating the long-term solution of ordinary differential equations, such as those of the N-body problem, is central to understanding a wide range of dynamics in astrophysics, from galaxy formation to planetary chaos. Because generally no analytic solution exists to these equations, researchers rely on numerical methods that are prone to various errors. In an effort to mitigate these errors, powerful symplectic integrators have been employed. But symplectic integrators can be severely limited because they are not compatible with adaptive stepping and thus they have difficulty in accommodating changing time and length scales. A promising alternative is time-reversible integration, which can handle adaptive time-stepping, but the errors due to time-reversible integration in astrophysics are less understood. The goal of this work is to study analytically and numerically the errors caused by time-reversible integration, with and without adaptive stepping. We derive the modified differential equations of these integrators to perform the error analysis. As an example, we consider the trapezoidal rule, a reversible non-symplectic integrator, and show that it gives secular energy error increase for a pendulum problem and for a Hénon-Heiles orbit. We conclude that using reversible integration does not guarantee good energy conservation and that, when possible, use of symplectic integrators is favoured. We also show that time-symmetry and time-reversibility are properties that are distinct for an integrator.
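
    The pendulum experiment described above is easy to reproduce. The sketch below applies the implicit trapezoidal rule, y_{n+1} = y_n + (h/2)[f(y_n) + f(y_{n+1})], to theta'' = -sin(theta), solving the implicit step by fixed-point iteration, and prints the energy error over a long run so its secular trend can be inspected (the step size and initial conditions are illustrative choices).

```python
# Trapezoidal rule (reversible, non-symplectic) on a pendulum; monitor
# E = omega^2/2 - cos(theta) for long-term drift.
import numpy as np

def f(y):
    theta, omega = y
    return np.array([omega, -np.sin(theta)])

def trapezoidal_step(y, h, iters=8):
    y_new = y + h * f(y)                       # explicit Euler predictor
    for _ in range(iters):                     # fixed-point corrector
        y_new = y + 0.5 * h * (f(y) + f(y_new))
    return y_new

def energy(y):
    theta, omega = y
    return 0.5 * omega**2 - np.cos(theta)

y, h = np.array([2.5, 0.0]), 0.1               # large-amplitude swing
E0 = energy(y)
for step in range(1, 50001):
    y = trapezoidal_step(y, h)
    if step % 10000 == 0:
        print(step, energy(y) - E0)            # watch the error's long-term trend
```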

  7. DNA double-strand-break complexity levels and their possible contributions to the probability for error-prone processing and repair pathway choice.

    PubMed

    Schipler, Agnes; Iliakis, George

    2013-09-01

    Although the DNA double-strand break (DSB) is defined as a rupture in the double-stranded DNA molecule that can occur without chemical modification in any of the constituent building blocks, it is recognized that this form is restricted to enzyme-induced DSBs. DSBs generated by physical or chemical agents can include at the break site a spectrum of base alterations (lesions). The nature and number of such chemical alterations define the complexity of the DSB and are considered putative determinants for repair pathway choice and the probability that errors will occur during this processing. As the pathways engaged in DSB processing show distinct and frequently inherent propensities for errors, pathway choice also defines the error levels cells opt to accept. Here, we present a classification of DSBs on the basis of increasing complexity and discuss how complexity may affect processing, as well as how it may cause lethal or carcinogenic processing errors. By critically analyzing the characteristics of DSB repair pathways, we suggest that all repair pathways can in principle remove lesions clustering at the DSB but are likely to fail when they encounter clusters of DSBs that cause a local form of chromothripsis. In the same framework, we also analyze the rationale behind DSB repair pathway choice.

  8. The 3 faces of clinical reasoning: Epistemological explorations of disparate error reduction strategies.

    PubMed

    Monteiro, Sandra; Norman, Geoff; Sherbino, Jonathan

    2018-06-01

    There is general consensus that clinical reasoning involves 2 stages: a rapid stage where 1 or more diagnostic hypotheses are advanced and a slower stage where these hypotheses are tested or confirmed. The rapid hypothesis generation stage is considered inaccessible for analysis or observation. Consequently, recent research on clinical reasoning has focused specifically on improving the accuracy of the slower, hypothesis confirmation stage. Three perspectives have developed in this line of research, and each proposes different error reduction strategies for clinical reasoning. This paper considers these 3 perspectives and examines the underlying assumptions. Additionally, this paper reviews the evidence, or lack thereof, behind each class of error reduction strategies. The first perspective takes an epidemiological stance, appealing to the benefits of incorporating population data and evidence-based medicine into everyday clinical reasoning. The second builds on the heuristic and bias research programme, appealing to a special class of dual-process reasoning models that theorizes a rapid, error-prone cognitive process for problem solving alongside a slower, more logical cognitive process capable of correcting those errors. Finally, the third perspective borrows from an exemplar model of categorization that explicitly relates clinical knowledge and experience to diagnostic accuracy. © 2018 John Wiley & Sons, Ltd.

  9. Blocking by the carcinogen, L-ethionine, of SOS functions in a tif-1 mutant of Escherichia coli B/r.

    PubMed

    Wiesner, R; Troll, W

    1981-11-01

    In Escherichia coli, DNA damage by carcinogenic agents results in the coordinate expression of a diversity of functions (SOS functions), many of which are thermally inducible without any damage to DNA in a tif-1 mutant. These include prophage induction, filamentous growth, and an error-prone DNA repair activity, which is responsible for ultraviolet-induced mutagenesis. Ethionine causes hepatic carcinoma in rats after prolonged feeding but is not a mutagen in the Ames test. The present study shows that 10 mM ethionine prevents the thermal induction of lambda-prophage in a tif-1 derivative of E. coli. The enhancement of mutation, which normally occurs at high temperature after a low dose of ultraviolet light, is also blocked by ethionine. Ethionine does not block, to any appreciable extent, the incorporation of radioactive precursors into RNA, DNA, or protein.

  10. Promises and pitfalls of Illumina sequencing for HIV resistance genotyping.

    PubMed

    Brumme, Chanson J; Poon, Art F Y

    2017-07-15

    Genetic sequencing ("genotyping") plays a critical role in the modern clinical management of HIV infection. This virus evolves rapidly within patients because of its error-prone reverse transcriptase and short generation time. Consequently, HIV variants with mutations that confer resistance to one or more antiretroviral drugs can emerge during sub-optimal treatment. There are now multiple HIV drug resistance interpretation algorithms that take the region of the HIV genome encoding the major drug targets as inputs; expert use of these algorithms can significantly improve clinical outcomes in HIV treatment. Next-generation sequencing has the potential to revolutionize HIV resistance genotyping by lowering the threshold at which rare but clinically significant HIV variants can be detected reproducibly, and by conferring improved cost-effectiveness in high-throughput scenarios. In this review, we discuss the relative merits and challenges of deploying the Illumina MiSeq instrument for clinical HIV genotyping. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Expanding the Nucleotide and Sugar 1-Phosphate Promiscuity of Nucleotidyltransferase RmlA via Directed Evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moretti, Rocco; Chang, Aram; Peltier-Pain, Pauline

    2012-03-15

    Directed evolution is a valuable technique to improve enzyme activity in the absence of a priori structural knowledge, and it can typically be enhanced via structure-guided strategies. In this study, a combination of both whole-gene error-prone polymerase chain reaction and site-saturation mutagenesis enabled the rapid identification of mutations that improved RmlA activity toward non-native substrates. These mutations have been shown to improve activities over 10-fold for several targeted substrates, including non-native pyrimidine- and purine-based NTPs as well as non-native D- and L-sugars (both α- and β-isomers). This study highlights the first broadly applicable high-throughput sugar-1-phosphate nucleotidyltransferase screen and the first proof of concept for the directed evolution of this enzyme class toward the identification of uniquely permissive RmlA variants.

  12. Developmental history and application of CRISPR in human disease.

    PubMed

    Liang, Puping; Zhang, Xiya; Chen, Yuxi; Huang, Junjiu

    2017-06-01

    Genome-editing tools are programmable artificial nucleases, mainly including zinc-finger nucleases, transcription activator-like effector nucleases and clustered regularly interspaced short palindromic repeats (CRISPR). By recognizing and cleaving specific DNA sequences, genome-editing tools make it possible to generate site-specific DNA double-strand breaks (DSBs) in the genome. DSBs are then repaired by either error-prone nonhomologous end joining or high-fidelity homologous recombination mechanisms. Through these two different mechanisms, endogenous genes can be knocked out or precisely repaired/modified. Rapid developments in genome-editing tools, especially CRISPR, have revolutionized the generation of human disease models; for example, various zebrafish, mouse, rat, pig, monkey and human cell line models have been constructed. Here, we review the developmental history of CRISPR and its application in studies of human diseases. In addition, we briefly discuss the therapeutic application of CRISPR in the near future. Copyright © 2017 John Wiley & Sons, Ltd.

  13. TOOKUIL: A case study in user interface development for safety code application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present-day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  14. Complex monitoring performance and the coronary-prone Type A behavior pattern.

    DOT National Transportation Integrated Search

    1986-03-01

    The present study examined the possible relationship of the coronary-prone Type A behavior pattern to performance of a complex monitoring task. The task was designed to functionally simulate the general task characteristics of future, highly automate...

  15. Implementation of an audit with feedback knowledge translation intervention to promote medication error reporting in health care: a protocol.

    PubMed

    Hutchinson, Alison M; Sales, Anne E; Brotto, Vanessa; Bucknall, Tracey K

    2015-05-19

    Health professionals strive to deliver high-quality care in an inherently complex and error-prone environment. Underreporting of medical errors challenges attempts to understand causative factors and impedes efforts to implement preventive strategies. Audit with feedback is a knowledge translation strategy that has potential to modify health professionals' medical error reporting behaviour. However, evidence regarding which aspects of this complex, multi-dimensional intervention work best is lacking. The aims of the Safe Medication Audit Reporting Translation (SMART) study are to: (1) implement and refine a reporting mechanism to feed audit data on medication errors back to nurses; (2) test the feedback reporting mechanism to determine its utility and effect; and (3) identify characteristics of organisational context associated with error reporting in response to feedback. A quasi-experimental design, incorporating two pairs of matched wards at an acute care hospital, is used. Randomisation occurs at the ward level; one ward from each pair is randomised to receive the intervention. A key stakeholder reference group informs the design and delivery of the feedback intervention. Nurses on the intervention wards receive the feedback intervention (feedback of analysed audit data) on a quarterly basis for 12 months. Data for the feedback intervention come from medication documentation point-prevalence audits and weekly reports on routinely collected medication error data. Weekly reports on these data are obtained for the control wards. A controlled interrupted time series analysis is used to evaluate the effect of the feedback intervention. Self-report data are also collected from nurses on all four wards at baseline and at completion of the intervention to elicit their perceptions of the work context. Additionally, following each feedback cycle, nurses on the intervention wards are invited to complete a survey to evaluate the feedback and to establish their intentions to change their reporting behaviour. To assess sustainability of the intervention, a point-prevalence chart audit is undertaken 6 months after completion of the intervention, and a report of routinely collected medication errors for the previous 6 months is obtained. This intervention will have wider application for delivery of feedback to promote behaviour change in other areas of preventable error and adverse events.
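
    For readers unfamiliar with the analysis, a controlled interrupted time series is commonly estimated as a segmented regression with level-change and slope-change terms interacted with a group indicator. A minimal sketch with synthetic data (all variable names, effect sizes, and the weekly layout are hypothetical, not the study's):

```python
# Segmented-regression sketch of a controlled interrupted time series:
# model a level and slope change at the intervention start, with a group
# indicator separating intervention and control wards.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
weeks = np.arange(52)
df = pd.concat([
    pd.DataFrame({'week': weeks, 'group': g,
                  'post': (weeks >= 26).astype(int)})
    for g in (0, 1)                          # 0 = control, 1 = intervention wards
])
df['weeks_post'] = np.maximum(0, df['week'] - 26)
# synthetic error-report counts: intervention wards report more after feedback
df['reports'] = rng.poisson(10 + 3 * df['post'] * df['group']
                            + 0.2 * df['weeks_post'] * df['group'])

model = smf.ols('reports ~ week + group + post * group + weeks_post * group',
                data=df).fit()
print(model.params[['post:group', 'weeks_post:group']])  # level & slope changes
```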

  16. Group elicitations yield more consistent, yet more uncertain experts in understanding risks to ecosystem services in New Zealand bays

    PubMed Central

    Sinner, Jim; Ellis, Joanne; Kandlikar, Milind; Halpern, Benjamin S.; Satterfield, Terre; Chan, Kai

    2017-01-01

    The elicitation of expert judgment is an important tool for assessment of risks and impacts in environmental management contexts, and especially important as decision-makers face novel challenges where prior empirical research is lacking or insufficient. Evidence-driven elicitation approaches typically involve techniques to derive more accurate probability distributions under fairly specific contexts. Experts are, however, prone to overconfidence in their judgements. Group elicitations with diverse experts can reduce expert overconfidence by allowing cross-examination and reassessment of prior judgements, but groups are also prone to uncritical “groupthink” errors. When the problem context is underspecified the probability that experts commit groupthink errors may increase. This study addresses how structured workshops affect expert variability among and certainty within responses in a New Zealand case study. We find that experts’ risk estimates before and after a workshop differ, and that group elicitations provided greater consistency of estimates, yet also greater uncertainty among experts, when addressing prominent impacts to four different ecosystem services in coastal New Zealand. After group workshops, experts provided more consistent ranking of risks and more consistent best estimates of impact through increased clarity in terminology and dampening of extreme positions, yet probability distributions for impacts widened. The results from this case study suggest that group elicitations have favorable consequences for the quality and uncertainty of risk judgments within and across experts, making group elicitation techniques invaluable tools in contexts of limited data. PMID:28767694

  17. The p21 and PCNA partnership: a new twist for an old plot.

    PubMed

    Prives, Carol; Gottifredi, Vanesa

    2008-12-15

    The contribution of error-prone DNA polymerases to the DNA damage response has been a subject of great interest in the last decade. Error-prone polymerases are required for translesion DNA synthesis (TLS), a process that involves synthesis past a DNA lesion. Under certain circumstances, TLS polymerases can achieve bypass with good efficiency and fidelity. However, they can also in some cases be mutagenic, and so negative regulators of TLS polymerases would have the important function of inhibiting their recruitment to undamaged DNA templates. Recent work from Livneh's group and ours has provided evidence regarding the role of the cyclin kinase inhibitor p21 as a negative regulator of TLS. Interestingly, both the cyclin-dependent kinase (CDK) and proliferating cell nuclear antigen (PCNA) binding domains of p21 are involved in different aspects of the modulation of TLS, affecting both the interaction between PCNA and the TLS-specific pol eta as well as PCNA ubiquitination status. In line with this, p21 was shown to reduce the efficiency but increase the accuracy of TLS. Hence, in the absence of DNA damage, p21 may work to impede accidental loading of pol eta onto undamaged DNA and avoid consequential mutagenesis. After UV irradiation, when TLS plays a decisive role, p21 is progressively degraded. This might allow gradual release of replication fork blockage by TLS polymerases. For these reasons, in higher eukaryotes p21 might represent a key regulator of the equilibrium between mutagenesis and cell survival.

  18. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data

    PubMed Central

    Larralde, Martin; Lawson, Thomas N.; Weber, Ralf J. M.; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R.; Steinbeck, Christoph; Salek, Reza M.

    2017-01-01

    Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact: reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary data are available at Bioinformatics online. PMID:28402395

  19. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data.

    PubMed

    Larralde, Martin; Lawson, Thomas N; Weber, Ralf J M; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R; Steinbeck, Christoph; Salek, Reza M

    2017-08-15

    Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  20. Effects of response bias and judgment framing on operator use of an automated aid in a target detection task.

    PubMed

    Rice, Stephen; McCarley, Jason S

    2011-12-01

    Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in operators' cognitive responses to different forms of automation error. The present experiments therefore examined the effects of automation false alarms and misses on human performance under conditions in which the different forms of error were matched in their perceptual characteristics. Young adult participants performed a simulated baggage x-ray screening task while assisted by an automated diagnostic aid. Judgments from the aid were rendered as text messages presented at the onset of each trial, and every trial was followed by a second text message providing response feedback. Thus, misses and false alarms from the aid were matched for their perceptual salience. Experiment 1 found that even under these conditions, false alarms from the aid produced poorer human performance and engendered lower automation use than misses from the aid. Experiment 2, however, found that the asymmetry between misses and false alarms was reduced when the aid's false alarms were framed as neutral messages rather than explicit misjudgments. Results suggest that automation false alarms and misses differ in their inherent cognitive salience and imply that changes in diagnosis framing may allow designers to encourage better use of imperfectly reliable automated aids.

  1. PSO4: a novel gene involved in error-prone repair in Saccharomyces cerevisiae.

    PubMed

    Henriques, J A; Vicente, E J; Leandro da Silva, K V; Schenberg, A C

    1989-09-01

    The haploid xs9 mutant, originally selected for on the basis of a slight sensitivity to the lethal effect of X-rays, was found to be extremely sensitive to inactivation by 8-methoxypsoralen (8MOP) photoaddition, especially when cells are treated in the G2 phase of the cell cycle. As the xs9 mutation showed no allelism with any of the 3 known pso mutations, it was now given the name of pso4-1. Regarding inactivation, the pso4-1 mutant is also sensitive to mono- (HN1) or bi-functional (HN2) nitrogen mustards, it is slightly sensitive to 254 nm UV radiation (UV), and shows nearly normal sensitivity to 3-carbethoxypsoralen (3-CPs) photoaddition or methyl methanesulfonate (MMS). Regarding mutagenesis, the pso4-1 mutation completely blocks reverse and forward mutations induced by either 8MOP or 3CPs photoaddition, or by gamma-rays. In the cases of UV, HN1, HN2 or MMS treatments, while reversion induction is still completely abolished, forward mutagenesis is only partially inhibited for UV, HN1, or MMS, and it is unaffected for HN2. Besides severely inhibiting induced mutagenesis, the pso4-1 mutation was found to be semi-dominant, to block sporulation, to abolish the diploid resistance effect, and to block induced mitotic recombination, which indicates that the PSO4 gene is involved in a recombinational pathway of error-prone repair, comparable to the E. coli SOS repair pathway.

  2. Error Recovery in the Time-Triggered Paradigm with FTT-CAN.

    PubMed

    Marques, Luis; Vasconcelos, Verónica; Pedreiras, Paulo; Almeida, Luís

    2018-01-11

    Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots.
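
    The core idea, regulating retransmissions through a bandwidth-limited server so that error recovery cannot starve the scheduled traffic, can be sketched in a few lines. This toy model (a deliberate simplification, not the paper's FTT-CAN implementation) grants at most a fixed budget of retransmission slots per elementary cycle:

```python
# Toy retransmission server: corrupted messages queue up, and at most
# `budget` retransmissions are granted per cycle, bounding the interference
# that error recovery imposes on the scheduled traffic.
from collections import deque

def schedule_retransmissions(error_events, cycles, budget=1):
    """error_events: {cycle: [msg, ...]} -> list of (cycle, msg) grants."""
    pending, grants = deque(), []
    for cycle in range(cycles):
        pending.extend(error_events.get(cycle, []))
        for _ in range(budget):
            if not pending:
                break
            grants.append((cycle + 1, pending.popleft()))  # resent next cycle
    return grants

errors = {2: ['m1', 'm2'], 3: ['m3']}          # two corrupted messages in cycle 2
print(schedule_retransmissions(errors, cycles=8, budget=1))
# -> [(3, 'm1'), (4, 'm2'), (5, 'm3')]; recovery latency grows with the backlog
```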

  3. The accuracy of self-reported pregnancy-related weight: a systematic review.

    PubMed

    Headen, I; Cohen, A K; Mujahid, M; Abrams, B

    2017-03-01

    Self-reported maternal weight is error-prone, and the context of pregnancy may impact error distributions. This systematic review summarizes error in self-reported weight across pregnancy and assesses implications for bias in associations between pregnancy-related weight and birth outcomes. We searched PubMed and Google Scholar through November 2015 for peer-reviewed articles reporting accuracy of self-reported, pregnancy-related weight at four time points: prepregnancy, delivery, over gestation and postpartum. Included studies compared maternal self-report to anthropometric measurement or medical report of weights. Sixty-two studies met inclusion criteria. We extracted data on magnitude of error and misclassification. We assessed impact of reporting error on bias in associations between pregnancy-related weight and birth outcomes. Women underreported prepregnancy (PPW: -2.94 to -0.29 kg) and delivery weight (DW: -1.28 to 0.07 kg), and over-reported gestational weight gain (GWG: 0.33 to 3 kg). Magnitude of error was small, ranged widely, and varied by prepregnancy weight class and race/ethnicity. Misclassification was moderate (PPW: 0-48.3%; DW: 39.0-49.0%; GWG: 16.7-59.1%), and overestimated some estimates of population prevalence. However, reporting error did not largely bias associations between pregnancy-related weight and birth outcomes. Although measured weight is preferable, self-report is a cost-effective and practical measurement approach. Future researchers should develop bias correction techniques for self-reported pregnancy-related weight. © 2017 World Obesity Federation.

  4. Error Recovery in the Time-Triggered Paradigm with FTT-CAN

    PubMed Central

    Pedreiras, Paulo; Almeida, Luís

    2018-01-01

    Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots. PMID:29324723

  5. Correcting for Measurement Error in Time-Varying Covariates in Marginal Structural Models.

    PubMed

    Kyle, Ryan P; Moodie, Erica E M; Klein, Marina B; Abrahamowicz, Michał

    2016-08-01

    Unbiased estimation of causal parameters from marginal structural models (MSMs) requires a fundamental assumption of no unmeasured confounding. Unfortunately, the time-varying covariates used to obtain inverse probability weights are often error-prone. Although substantial measurement error in important confounders is known to undermine control of confounders in conventional unweighted regression models, this issue has received comparatively limited attention in the MSM literature. Here we propose a novel application of the simulation-extrapolation (SIMEX) procedure to address measurement error in time-varying covariates, and we compare 2 approaches. The direct approach to SIMEX-based correction targets outcome model parameters, while the indirect approach corrects the weights estimated using the exposure model. We assess the performance of the proposed methods in simulations under different clinically plausible assumptions. The simulations demonstrate that measurement errors in time-dependent covariates may induce substantial bias in MSM estimators of causal effects of time-varying exposures, and that both proposed SIMEX approaches yield practically unbiased estimates in scenarios featuring low-to-moderate degrees of error. We illustrate the proposed approach in a simple analysis of the relationship between sustained virological response and liver fibrosis progression among persons infected with hepatitis C virus, while accounting for measurement error in γ-glutamyltransferase, using data collected in the Canadian Co-infection Cohort Study from 2003 to 2014. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
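
    The SIMEX procedure itself is compact enough to sketch: remeasure the error-prone covariate with extra noise at several inflation levels lambda, track how the coefficient degrades, and extrapolate the fitted trend back to lambda = -1, the no-error limit. A minimal single-covariate illustration with synthetic data (the quadratic extrapolant and all parameter values are conventional textbook choices, not those of the paper):

```python
# SIMEX for one error-prone covariate: simulate extra measurement error at
# levels lambda, then extrapolate the slope estimate back to lambda = -1.
import numpy as np

rng = np.random.default_rng(2)
n, beta, sd_u = 5000, 1.0, 0.8
x = rng.normal(size=n)                         # true covariate (unobserved)
y = beta * x + rng.normal(size=n)              # outcome
w = x + rng.normal(scale=sd_u, size=n)         # observed, with measurement error

def slope(x_, y_):
    return np.polyfit(x_, y_, 1)[0]

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = [np.mean([slope(w + rng.normal(scale=np.sqrt(lam) * sd_u, size=n), y)
                for _ in range(25)])           # B pseudo-datasets per lambda
       for lam in lambdas]

quad = np.polyfit(lambdas, est, 2)             # quadratic extrapolant
print(np.polyval(quad, -1.0))                  # SIMEX estimate, close to beta
print(slope(w, y))                             # naive estimate, attenuated
```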

  6. Multiple description distributed image coding with side information for mobile wireless transmission

    NASA Astrophysics Data System (ADS)

    Wu, Min; Song, Daewon; Chen, Chang Wen

    2005-03-01

    Multiple description coding (MDC) is a source coding technique that codes the source information into multiple descriptions and then transmits them over different channels in a packet network or an error-prone wireless environment, achieving graceful degradation if parts of the descriptions are lost at the receiver. In this paper, we propose a multiple description distributed wavelet zero tree image coding system for mobile wireless transmission. We provide two innovations to achieve excellent error resilience. First, when MDC is applied to wavelet subband based image coding, it is possible to introduce correlation between the descriptions in each subband. We use such correlation, together with a potentially error-corrupted description, as side information in the decoding, formulating the MDC decoding as a Wyner-Ziv decoding problem. If only part of a description is lost, its correlation information is still available, so the proposed Wyner-Ziv decoder can recover the description by using the correlation information and the error-corrupted description as side information. Secondly, within each description, single-bitstream wavelet zero tree coding is very vulnerable to channel errors: the first bit error may cause the decoder to discard all subsequent bits, whether or not they are correctly received. Therefore, we integrate multiple description scalar quantization (MDSQ) with a multiple wavelet tree image coding method to reduce error propagation. We first group wavelet coefficients into multiple trees according to parent-child relationships and then code them separately with the SPIHT algorithm to form multiple bitstreams. Such decomposition reduces error propagation and therefore improves the error-correcting capability of the Wyner-Ziv decoder. Experimental results show that the proposed scheme not only exhibits excellent error resilience but also demonstrates graceful degradation as the packet loss rate increases.

  7. Self-esteem and delusion proneness.

    PubMed

    Warman, Debbie M; Lysaker, Paul H; Luedtke, Brandi; Martin, Joel M

    2010-06-01

    The present study was an examination of global self-esteem and various types of unusual beliefs in a nonclinical population. Individuals with no history of psychotic disorder (N = 121) completed a measure of delusion proneness and also a measure of self-esteem. Results indicated that highly delusion-prone individuals had lower self-esteem than individuals low in delusion proneness (p = 0.044). In addition, higher levels of paranoid ideation and suspiciousness were associated with lower self-esteem (p < 0.001). Significant, yet smaller, relationships also emerged between low self-esteem and higher levels of beliefs related to thought disturbances, catastrophic ideation/thought broadcasting, and ideation of reference/influence. The significance of these findings as they relate to theories of delusion formation is discussed.

  8. Cellular Strategies for Regulating Functional and Nonfunctional Protein Aggregation

    PubMed Central

    Gsponer, Jörg; Babu, M. Madan

    2012-01-01

    Summary Growing evidence suggests that aggregation-prone proteins are both harmful and functional for a cell. How do cellular systems balance the detrimental and beneficial effect of protein aggregation? We reveal that aggregation-prone proteins are subject to differential transcriptional, translational, and degradation control compared to nonaggregation-prone proteins, which leads to their decreased synthesis, low abundance, and high turnover. Genetic modulators that enhance the aggregation phenotype are enriched in genes that influence expression homeostasis. Moreover, genes encoding aggregation-prone proteins are more likely to be harmful when overexpressed. The trends are evolutionarily conserved and suggest a strategy whereby cellular mechanisms specifically modulate the availability of aggregation-prone proteins to (1) keep concentrations below the critical ones required for aggregation and (2) shift the equilibrium between the monomeric and oligomeric/aggregate form, as explained by Le Chatelier’s principle. This strategy may prevent formation of undesirable aggregates and keep functional assemblies/aggregates under control. PMID:23168257

  9. Assessing dangerous driving behavior during driving inattention: Psychometric adaptation and validation of the Attention-Related Driving Errors Scale in China.

    PubMed

    Qu, Weina; Ge, Yan; Zhang, Qian; Zhao, Wenguo; Zhang, Kan

    2015-07-01

    Driver inattention is a significant cause of motor vehicle collisions and incidents. The purpose of this study was to translate the Attention-Related Driving Error Scale (ARDES) into Chinese and to verify its reliability and validity. A total of 317 drivers completed the Chinese version of the ARDES, the Dula Dangerous Driving Index (DDDI), the Attention-Related Cognitive Errors Scale (ARCES) and the Mindful Attention Awareness Scale (MAAS) questionnaires. Specific sociodemographic variables and traffic violations were also measured. Psychometric results confirm that the ARDES-China has adequate psychometric properties (Cronbach's alpha=0.88) to be a useful tool for evaluating proneness to attentional errors in the Chinese driving population. First, ARDES-China scores were positively correlated with both DDDI scores and number of accidents in the prior year; in addition, ARDES-China scores were a significant predictor of dangerous driving behavior as measured by DDDI. Second, we found that ARDES-China scores were strongly correlated with ARCES scores and negatively correlated with MAAS scores. Finally, different demographic groups exhibited significant differences in ARDES scores; in particular, ARDES scores varied with years of driving experience. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Study of Current Measurement Method Based on Circular Magnetic Field Sensing Array

    PubMed Central

    Li, Zhenhua; Zhang, Siqiu; Wu, Zhengtian; Tao, Yuan

    2018-01-01

    Classic core-based instrument transformers are prone to magnetic saturation. This affects their measurement accuracy and limits their application to measuring large direct current (DC). Moreover, protection and control systems may malfunction because of such measurement errors. This paper presents a more accurate method for current measurement based on a circular magnetic field sensing array. The proposed approach utilizes multiple Hall sensors evenly distributed on a circle, and the average of all Hall sensor readings is taken as the final measurement. The calculation model is established for the case of magnetic field interference from a parallel wire, and the simulation results show that the error decreases significantly when the number of Hall sensors n is greater than 8. The measurement error is less than 0.06% when the wire spacing is greater than 2.5 times the radius of the sensor array. A simulation study of an off-center primary conductor is also conducted, and a Hall sensor compensation method is adopted to improve the accuracy. The simulation and test results indicate that the measurement error of the system is less than 0.1%. PMID:29734742

  11. Study of Current Measurement Method Based on Circular Magnetic Field Sensing Array.

    PubMed

    Li, Zhenhua; Zhang, Siqiu; Wu, Zhengtian; Abu-Siada, Ahmed; Tao, Yuan

    2018-05-05

    Classic core-based instrument transformers are prone to magnetic saturation. This affects their measurement accuracy and limits their application to measuring large direct current (DC). Moreover, protection and control systems may malfunction because of such measurement errors. This paper presents a more accurate method for current measurement based on a circular magnetic field sensing array. The proposed approach utilizes multiple Hall sensors evenly distributed on a circle, and the average of all Hall sensor readings is taken as the final measurement. The calculation model is established for the case of magnetic field interference from a parallel wire, and the simulation results show that the error decreases significantly when the number of Hall sensors n is greater than 8. The measurement error is less than 0.06% when the wire spacing is greater than 2.5 times the radius of the sensor array. A simulation study of an off-center primary conductor is also conducted, and a Hall sensor compensation method is adopted to improve the accuracy. The simulation and test results indicate that the measurement error of the system is less than 0.1%.
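
    The averaging principle shared by both records above can be checked with a short simulation, assuming idealized infinitely long straight conductors (field magnitude μ0·I/(2πd)); the geometry and currents below are illustrative, not the paper's values:

        import numpy as np

        MU0 = 4e-7 * np.pi   # vacuum permeability (T*m/A)
        R = 0.05             # sensor-array radius (m), illustrative
        I_PRIMARY = 100.0    # current under measurement (A)
        I_INTERF = 100.0     # parallel interfering wire (A)
        D_INTERF = 2.5 * R   # interfering-wire spacing, the paper's criterion

        def rot90(v):
            return np.stack([-v[..., 1], v[..., 0]], axis=-1)

        def tangential_field(sensors, wire_pos, current):
            """Tangential component at each sensor of a long straight wire's field."""
            r = sensors - wire_pos
            dist = np.linalg.norm(r, axis=1, keepdims=True)
            b_vec = MU0 * current / (2 * np.pi * dist) * rot90(r) / dist
            tangent = rot90(sensors) / np.linalg.norm(sensors, axis=1, keepdims=True)
            return np.sum(b_vec * tangent, axis=1)

        for n in (4, 8, 16, 32):
            angles = 2 * np.pi * np.arange(n) / n
            sensors = R * np.stack([np.cos(angles), np.sin(angles)], axis=1)
            b = (tangential_field(sensors, np.zeros(2), I_PRIMARY)
                 + tangential_field(sensors, np.array([D_INTERF, 0.0]), I_INTERF))
            i_est = b.mean() * 2 * np.pi * R / MU0   # discrete Ampere's-law integral
            print(f"n={n:2d}  relative error = {abs(i_est - I_PRIMARY) / I_PRIMARY:.2e}")

    With R/D = 0.4 as here, the interference error falls off roughly as (R/D)^n, consistent with the sharp drop beyond n = 8 reported in the abstract.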

  12. A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks

    PubMed Central

    Costa, Daniel G.; Guedes, Luiz Affonso

    2011-01-01

    Visual sensor networks (VSNs) comprised of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. Those types of sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay and packet error rates. In such context, multimedia coding is required for data compression and error-resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSNs applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908

  13. [Factors Related to Presenteeism in Young and Middle-aged Nurses].

    PubMed

    Yoshida, Mami; Miki, Akiko

    2018-04-03

    Presenteeism is considered to be not only a work-related stressor but also a factor in the development of workaholism and error proneness, which is often described as carelessness. Additionally, increasing health issues arising from aging suggest that presenteeism in middle-aged nurses may differ from that in young ones. Therefore, the present study aimed to identify and tease apart factors involved in presenteeism among young and middle-aged nurses. An anonymous self-administered questionnaire survey was conducted among 2,006 nurses working at 10 hospitals. In total, 761 nurses aged <40 years and 536 nurses aged ≥40 years were enrolled in this study. Work Impairment Scores (WIS) on the Japanese version of the Stanford Presenteeism Scale were measured for presenteeism. Job stressors, workaholism, and error proneness were measured as related factors. Multiple regression analysis was conducted with the WIS as the dependent variable and the related factors as independent variables. Overall, 70.8% of the young nurses reported health problems compared to 82.5% of the middle-aged nurses. However, WIS in young nurses was significantly higher than that in middle-aged ones (p < 0.001). WIS in young nurses showed a significant relationship with the job stressor "difficulty of work" (β = 0.28, p < 0.001), the tendency to "work excessively" (β = 0.18, p < 0.001), which is a subscale of workaholism, and the error-proneness subscales "action slips" (β = 0.14, p < 0.01) and "cognitive narrowing" (β = 0.11, p < 0.05). Conversely, WIS in middle-aged nurses showed a significant relationship with "cognitive narrowing" (β = 0.29, p < 0.001), the tendency to "work excessively" (β = 0.17, p < 0.001), and the job stressors "difficulty of work" (β = 0.12, p < 0.05) and "lack of communication" (β = 0.13, p < 0.01). These results clarify that the increased health problems of middle-aged nurses do not necessarily lower their working capacity. Also, compared to young nurses, error proneness, rather than the degree of job stressors, was more strongly related to presenteeism in middle-aged nurses. Middle-aged nurses may simply realize that their working ability is hindered because of incidents resulting from attention narrowing. As fatigue and tension tend to cause narrowing of attention, it may be necessary to reduce such risks and adjust work environments so that mistakes can be avoided.

  14. Design and Development of Virtual Reality Simulation for Teaching High-Risk Low-Volume Problem-Prone Office-Based Medical Emergencies

    ERIC Educational Resources Information Center

    Lemheney, Alexander J.

    2014-01-01

    Physicians' offices are not the usual place where emergencies occur; thus how staff remains prepared and current regarding medical emergencies presents an ongoing challenge for private practitioners. The very nature of low-volume, high-risk, and problem-prone medical emergencies is that they occur with such infrequency it is difficult for staff to…

  15. Risk management: correct patient and specimen identification in a surgical pathology laboratory. The experience of Infermi Hospital, Rimini, Italy.

    PubMed

    Fabbretti, G

    2010-06-01

    Because of its complex nature, surgical pathology practice is prone to error. In this report, we describe our methods for reducing error as much as possible during the pre-analytical and analytical phases. This was achieved by revising procedures and by using computer technology and automation. Most mistakes are the result of human error in the identification and matching of patients and samples. To avoid faulty data interpretation, we employed a new comprehensive computer system that acquires all patient ID information directly from the hospital's database with a remote order entry; it also provides label and request forms via the Web, where clinical information is required before sending the sample. Both patient and sample are identified directly and immediately at the site where the surgical procedures are performed. Barcode technology is used to input information at every step, and automation is used for sample blocks and slides to avoid errors that occur when information is recorded or transferred by hand. Quality control checks occur at every step of the process to ensure that no step is left to chance and that no phase depends on a single operator. The system also provides statistical analysis of errors so that new strategies can be implemented to avoid repetition. In addition, the staff receives frequent training on avoiding errors and on new developments. The results have been promising, with a very low error rate (0.27%); none of the errors compromised patient health, and all were detected before release of the diagnostic report.

  16. The mismeasure of morals: antisocial personality traits predict utilitarian responses to moral dilemmas.

    PubMed

    Bartels, Daniel M; Pizarro, David A

    2011-10-01

    Researchers have recently argued that utilitarianism is the appropriate framework by which to evaluate moral judgment, and that individuals who endorse non-utilitarian solutions to moral dilemmas (involving active vs. passive harm) are committing an error. We report a study in which participants responded to a battery of personality assessments and a set of dilemmas that pit utilitarian and non-utilitarian options against each other. Participants who indicated greater endorsement of utilitarian solutions had higher scores on measures of psychopathy, Machiavellianism, and life meaninglessness. These results call into question the widely used methods by which lay moral judgments are evaluated, as these approaches lead to the counterintuitive conclusion that the individuals who are least prone to moral errors also possess a set of psychological characteristics that many would consider prototypically immoral. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Clarifying the link between job satisfaction and absenteeism: The role of guilt proneness.

    PubMed

    Schaumberg, Rebecca L; Flynn, Francis J

    2017-06-01

    We propose that the relationship between job satisfaction and absenteeism depends partly on guilt proneness. Drawing on withdrawal and process models of absenteeism, we argue that job satisfaction predicts absences for employees who are low (but not high) in guilt proneness because low guilt-prone people's behaviors are governed more by fulfilling their own egoistic desires than by fulfilling others' normative expectations. We find support for this prediction in a sample of customer service agents working for a major telecommunications company and a sample of working adults employed in a range of industries. In each study, we use measures of employees' guilt proneness and job satisfaction to predict their subsequent workplace absences. In Study 2, we extend our hypothesis tests to 2 traits that are conceptually comparable to guilt proneness (i.e., moral identity and agreeableness), showing that these traits similarly moderate the relationship between job satisfaction and absenteeism. We discuss the implications of these findings for extant models of absenteeism and research on moral affectivity in the workplace. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. The effect of bridge exercise method on the strength of rectus abdominis muscle and the muscle activity of paraspinal muscles while doing treadmill walking with high heels.

    PubMed

    Kang, Taewook; Lee, Jaeseok; Seo, Junghoon; Han, Dongwook

    2017-04-01

    [Purpose] The purpose of this research was to investigate the effect of bridge exercise method on the strength of the rectus abdominis muscle and the activity of the paraspinal muscles during treadmill walking with high heels. [Subjects and Methods] The subjects were healthy female students at S university in Busan: 10 performing bridge exercises in a supine position, 10 performing bridge exercises in a prone position, and 10 in a control group. The bridge exercise in the supine position was performed in a hook-lying position; the bridge exercise in the prone position was a plank exercise in a prostrate position. The strength of the rectus abdominis muscle was measured as the time the posture could be maintained, and the activity of the paraspinal muscles was measured by EMG (4D-MT & EMD-11, Relive, Korea). [Results] The strength of the rectus abdominis muscle increased significantly after exercise in both the supine and prone bridge exercise groups. In the prone group, the activity of the thoracic and lumbar paraspinal muscles during treadmill walking with high heels decreased significantly; in the supine group, the activity of the thoracic paraspinal muscles decreased significantly. [Conclusion] These results indicate that bridge exercise in a prone position is a desirable back pain prevention exercise for women who prefer wearing high heels.

  19. A Novel Real-Time Reference Key Frame Scan Matching Method.

    PubMed

    Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu

    2017-05-07

    Unmanned aerial vehicles (UAVs) represent an effective technology for indoor search and rescue operations. Typically, most indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by a simultaneous localization and mapping (SLAM) approach using either local or global methods. Both suffer from accumulated errors and high processing time due to the iterative nature of the scan matching method. Moreover, point-to-point scan matching is prone to outliers in the association process. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF). RKF is a hybrid scan matching technique comprising feature-to-feature and point-to-point approaches. The algorithm aims at mitigating error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It relies on the iterative closest point (ICP) algorithm when linear features are lacking, as is typical of unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results and very short computational times, indicating the potential use of the new algorithm in real-time systems.
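
    The point-to-point association step that the RKF method avoids when features are available can be illustrated with a generic nearest-neighbour ICP iteration in 2D; this is a textbook sketch, not the authors' implementation:

        import numpy as np

        def icp_step(src, dst):
            """One point-to-point ICP iteration: nearest-neighbour association
            followed by a closed-form rigid alignment (Kabsch/SVD)."""
            # Nearest-neighbour association (the stage prone to outlier matches).
            d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
            matched = dst[d2.argmin(axis=1)]
            # Closed-form rigid transform between matched point sets.
            ps, pm = src - src.mean(0), matched - matched.mean(0)
            u, _, vt = np.linalg.svd(ps.T @ pm)
            rot = (u @ vt).T
            if np.linalg.det(rot) < 0:       # guard against reflections
                vt[-1] *= -1
                rot = (u @ vt).T
            t = matched.mean(0) - rot @ src.mean(0)
            return src @ rot.T + t

        # Toy scan: dst is src rotated by 10 degrees and shifted.
        rng = np.random.default_rng(0)
        src = rng.uniform(-1, 1, (100, 2))
        a = np.deg2rad(10.0)
        R = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
        dst = src @ R.T + np.array([0.3, -0.1])
        aligned = src
        for _ in range(20):                  # iterate association + alignment
            aligned = icp_step(aligned, dst)
        print("residual:", np.abs(aligned - dst).max())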

  20. Tool Support for Software Lookup Table Optimization

    DOE PAGES

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
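
    The transformation Mesa automates can be sketched by hand: replace a costly elementary function with a precomputed table plus linear interpolation, and report the resulting error over the profiled domain. The function, domain, and table size below are assumed for illustration, not taken from Mesa:

        import numpy as np

        # Build a lookup table for exp(-x) on an assumed profiled domain [0, 8).
        LO, HI, SIZE = 0.0, 8.0, 4096
        STEP = (HI - LO) / SIZE
        XS = LO + STEP * np.arange(SIZE + 1)
        TABLE = np.exp(-XS)                      # precomputed once

        def exp_neg_lut(x):
            """Approximate exp(-x) by linear interpolation in the table."""
            i = np.clip(((x - LO) / STEP).astype(int), 0, SIZE - 1)
            frac = (x - XS[i]) / STEP
            return TABLE[i] * (1 - frac) + TABLE[i + 1] * frac

        # Error analysis over the domain, as a LUT tool would report.
        x = np.linspace(LO, HI - 1e-9, 1_000_000)
        err = np.abs(exp_neg_lut(x) - np.exp(-x))
        print(f"max abs error = {err.max():.2e} for a {SIZE}-entry table")

    Doubling SIZE reduces the interpolation error roughly fourfold, which is the performance/accuracy tradeoff the abstract says such tools must expose.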

  1. Self-Supervised Chinese Ontology Learning from Online Encyclopedias

    PubMed Central

    Shao, Zhiqing; Ruan, Tong

    2014-01-01

    Constructing an ontology manually is a time-consuming, error-prone, and tedious task. We present SSCO, a self-supervised-learning-based Chinese ontology, which contains about 255 thousand concepts, 5 million entities, and 40 million facts. We explore the three largest online Chinese encyclopedias for ontology learning and describe how to transfer the structured knowledge in encyclopedias, including article titles, category labels, redirection pages, taxonomy systems, and InfoBox modules, into ontological form. In order to avoid the errors in encyclopedias and enrich the learnt ontology, we also apply some machine-learning-based methods. First, we prove statistically and experimentally that the self-supervised machine learning method is practicable for Chinese relation extraction (at least for synonymy and hyponymy) and train self-supervised models (SVMs and CRFs) for synonymy extraction, concept-subconcept relation extraction, and concept-instance relation extraction; the advantage of our methods is that all training examples are automatically generated from the structural information of the encyclopedias and a few general heuristic rules. Finally, we evaluate SSCO in two aspects, scale and precision; manual evaluation results show that the ontology has excellent precision, and high coverage is concluded by comparing SSCO with other well-known ontologies and knowledge bases; the experiment results also indicate that the self-supervised models obviously enrich SSCO. PMID:24715819

  2. Self-supervised Chinese ontology learning from online encyclopedias.

    PubMed

    Hu, Fanghuai; Shao, Zhiqing; Ruan, Tong

    2014-01-01

    Constructing an ontology manually is a time-consuming, error-prone, and tedious task. We present SSCO, a self-supervised-learning-based Chinese ontology, which contains about 255 thousand concepts, 5 million entities, and 40 million facts. We explore the three largest online Chinese encyclopedias for ontology learning and describe how to transfer the structured knowledge in encyclopedias, including article titles, category labels, redirection pages, taxonomy systems, and InfoBox modules, into ontological form. In order to avoid the errors in encyclopedias and enrich the learnt ontology, we also apply some machine-learning-based methods. First, we prove statistically and experimentally that the self-supervised machine learning method is practicable for Chinese relation extraction (at least for synonymy and hyponymy) and train self-supervised models (SVMs and CRFs) for synonymy extraction, concept-subconcept relation extraction, and concept-instance relation extraction; the advantage of our methods is that all training examples are automatically generated from the structural information of the encyclopedias and a few general heuristic rules. Finally, we evaluate SSCO in two aspects, scale and precision; manual evaluation results show that the ontology has excellent precision, and high coverage is concluded by comparing SSCO with other well-known ontologies and knowledge bases; the experiment results also indicate that the self-supervised models obviously enrich SSCO.

  3. A New Paradigm for Tissue Diagnostics: Tools and Techniques to Standardize Tissue Collection, Transport, and Fixation.

    PubMed

    Bauer, Daniel R; Otter, Michael; Chafin, David R

    2018-01-01

    Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.

  4. Use of Existing CAD Models for Radiation Shielding Analysis

    NASA Technical Reports Server (NTRS)

    Lee, K. T.; Barzilla, J. E.; Wilson, P.; Davis, A.; Zachman, J.

    2015-01-01

    The utility of a radiation exposure analysis depends not only on the accuracy of the underlying particle transport code, but also on the accuracy of the geometric representations of both the vehicle used as radiation shielding mass and the phantom representation of the human form. The current NASA/Space Radiation Analysis Group (SRAG) process to determine crew radiation exposure in a vehicle design incorporates both output from the analytic High Z and Energy Particle Transport (HZETRN) code and the properties (i.e., material thicknesses) of a previously processed drawing. This geometry pre-process can be time-consuming, and the results are less accurate than those determined using a Monte Carlo-based particle transport code. The current work aims to improve this process. Although several Monte Carlo programs (FLUKA, Geant4) are readily available, most use an internal geometry engine. The lack of an interface with the standard CAD formats used by the vehicle designers limits the ability of the user to communicate complex geometries. Translation of native CAD drawings into a format readable by these transport programs is time-consuming and prone to error. The Direct Accelerated Geometry-United (DAGU) project is intended to provide an interface between the native vehicle or phantom CAD geometry and multiple particle transport codes to minimize problem setup, computing time and analysis error.

  5. Residents' numeric inputting error in computerized physician order entry prescription.

    PubMed

    Wu, Xue; Wu, Changxu; Zhang, Kan; Wei, Dong

    2016-04-01

    Computerized physician order entry (CPOE) systems with embedded clinical decision support (CDS) can significantly reduce certain types of prescription error. However, prescription errors still occur. Various factors, such as the numeric inputting methods used in human-computer interaction (HCI), produce different error rates and types, but this has received relatively little attention. This study aimed to examine the effects of numeric inputting methods and urgency levels on numeric inputting errors in prescriptions, and to categorize the types of errors. Thirty residents participated in four prescribing tasks in which two factors were manipulated: numeric inputting method (numeric row in the main keyboard vs. numeric keypad) and urgency level (urgent vs. non-urgent situation). Multiple aspects of participants' prescribing behavior were also measured in sober prescribing situations. The results revealed that in urgent situations, participants were prone to make mistakes when using the numeric row in the main keyboard. With control of performance in the sober prescribing situation, the effects of the input methods disappeared, and urgency was found to play a significant role in the generalized linear model. Most errors were of the omission or substitution types, but the proportions of transposition and intrusion error types were significantly higher than in previous research. For the numbers 3, 8, and 9, the less common digits in prescriptions, the error rate was higher, posing a considerable risk to patient safety. Urgency played a more important role in CPOE numeric typing errors than typing skill and typing habits. Inputting with the numeric keypad was recommended because it had lower error rates in urgent situations. An alternative design could consider increasing the sensitivity of the keys with lower frequency of occurrence and decimals. To improve the usability of CPOE, numeric keyboard design and error detection could benefit from the spatial incidence of errors found in this study. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Reusable and Extensible High Level Data Distributions

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Chamberlain, Bradford; James, Mark L.; Zima, Hans P.

    2005-01-01

    This paper presents a reusable design of a data distribution framework for data-parallel high performance applications. We are implementing the design in the context of the Chapel high productivity programming language. Distributions in Chapel are a means to express locality in systems composed of large numbers of processor and memory components connected by a network. Since distributions have a great effect on the performance of applications, it is important that the distribution strategy can be chosen by the user. At the same time, high productivity concerns require that the user is shielded from error-prone, tedious details such as communication and synchronization. We propose an approach to distributions that enables the user to refine a language-provided distribution type and adjust it to optimize the performance of the application. Additionally, we conceal low-level communication and synchronization details from the user to increase productivity. To emphasize the generality of our distribution machinery, we present its abstract design in the form of a design pattern, which is independent of a concrete implementation. To illustrate the applicability of our distribution framework design, we outline the implementation of data distributions in terms of the Chapel language.
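
    As a language-agnostic sketch of the user-refinable distribution idea described above: a base block distribution maps global indices to locales, and a user subclass overrides the mapping without touching communication details. The class names are illustrative, not Chapel's actual API:

        class BlockDistribution:
            """Map a 1-D global index space onto locales in contiguous blocks."""
            def __init__(self, n_indices: int, n_locales: int):
                self.n = n_indices
                self.p = n_locales

            def locale_of(self, i: int) -> int:
                # Contiguous blocks of roughly n/p indices per locale.
                return min(i * self.p // self.n, self.p - 1)

        class CyclicDistribution(BlockDistribution):
            """User refinement: round-robin placement for better load balance."""
            def locale_of(self, i: int) -> int:
                return i % self.p

        dist = CyclicDistribution(n_indices=16, n_locales=4)
        print([dist.locale_of(i) for i in range(16)])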

  7. Program For Engineering Electrical Connections

    NASA Technical Reports Server (NTRS)

    Billitti, Joseph W.

    1990-01-01

    DFACS is interactive multiuser computer-aided-engineering software tool for system-level electrical integration and cabling engineering. Purpose of program to provide engineering community with centralized data base for putting in and gaining access to data on functional definition of system, details of end-circuit pinouts in systems and subsystems, and data on wiring harnesses. Objective, to provide instantaneous single point of interchange of information, thus avoiding error-prone, time-consuming, and costly shuttling of data along multiple paths. Designed to operate on DEC VAX mini or micro computer using Version 5.0/03 of INGRES.

  8. Learning class descriptions from a data base of spectral reflectance with multiple view angles

    NASA Technical Reports Server (NTRS)

    Kimes, Daniel S.; Harrison, Patrick R.; Harrison, P. A.

    1992-01-01

    A learning program has been developed which combines 'learning by example' with the generate-and-test paradigm to furnish a robust learning environment capable of handling error-prone data. The program is shown to be capable of learning class descriptions from positive and negative training examples of spectral and directional reflectance data taken from soil and vegetation. The program, which used AI techniques to automate very tedious processes, found the sequence of relationships that contained the most important information for distinguishing the classes.

  9. System-on-Chip Data Processing and Data Handling Spaceflight Electronics

    NASA Technical Reports Server (NTRS)

    Kleyner, I.; Katz, R.; Tiggeler, H.

    1999-01-01

    This paper presents a methodology and a tool set which implements automated generation of moderate-size blocks of customized intellectual property (IP), thus effectively reusing prior work and minimizing the labor intensive, error-prone parts of the design process. Customization of components allows for optimization for smaller area and lower power consumption, which is an important factor given the limitations of resources available in radiation-hardened devices. The effects of variations in HDL coding style on the efficiency of synthesized code for various commercial synthesis tools are also discussed.

  10. A mix-and-measure assay for determining the activation status of endogenous Cdc42 in cytokine-stimulated macrophage cell lysates.

    PubMed

    Miskolci, Veronika; Spiering, Désirée; Cox, Dianne; Hodgson, Louis

    2014-01-01

    Cytokine stimulation of leukocytes often results in transient activation of the p21 Rho family of small GTPases. The role of these molecules during cell migration and chemotaxis is well established. The traditional approach to studying the activation dynamics of these proteins involves affinity pull-downs, which are often cumbersome and prone to errors. Here, we describe a reagent and a simple "mix-and-measure" method for determining the activation status of endogenous Cdc42 GTPase in cell lysates.

  11. Critical evaluation of sample pretreatment techniques.

    PubMed

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  12. Temperature-dependent spectral mismatch corrections

    DOE PAGES

    Osterwald, Carl R.; Campanelli, Mark; Moriarty, Tom; ...

    2015-11-01

    This study develops the mathematical foundation for a translation of solar cell short-circuit current from one thermal and spectral irradiance operating condition to another without the use of ill-defined and error-prone temperature coefficients typically employed in solar cell metrology. Using the partial derivative of quantum efficiency with respect to temperature, the conventional isothermal expression for spectral mismatch corrections is modified to account for changes of current due to temperature; this modification completely eliminates the need for short-circuit-current temperature coefficients. An example calculation is provided to demonstrate use of the new translation.
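
    For context, the conventional isothermal spectral mismatch factor that the abstract modifies has the standard form below; the temperature dependence enters through a first-order expansion of quantum efficiency about a reference temperature (a sketch of the idea, not the authors' exact derivation):

        % Conventional isothermal spectral mismatch factor
        M = \frac{\int E_{\mathrm{ref}}(\lambda)\,QE_{\mathrm{test}}(\lambda)\,d\lambda
                  \cdot \int E_{\mathrm{meas}}(\lambda)\,QE_{\mathrm{ref}}(\lambda)\,d\lambda}
                 {\int E_{\mathrm{meas}}(\lambda)\,QE_{\mathrm{test}}(\lambda)\,d\lambda
                  \cdot \int E_{\mathrm{ref}}(\lambda)\,QE_{\mathrm{ref}}(\lambda)\,d\lambda}

        % Temperature enters through a first-order expansion of quantum efficiency,
        % replacing a separate short-circuit-current temperature coefficient:
        QE(\lambda, T) \approx QE(\lambda, T_0)
            + \left.\frac{\partial QE(\lambda)}{\partial T}\right|_{T_0}\,(T - T_0)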

  13. Computer-aided programming for message-passing system; Problems and a solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, M.Y.; Gajski, D.D.

    1989-12-01

    As the number of processors and the complexity of problems to be solved increase, programming multiprocessing systems becomes more difficult and error-prone. Program development tools are necessary since programmers are not able to develop complex parallel programs efficiently. Parallel models of computation, parallelization problems, and tools for computer-aided programming (CAP) are discussed. As an example, a CAP tool that performs scheduling and inserts communication primitives automatically is described. It also generates the performance estimates and other program quality measures to help programmers in improving their algorithms and programs.

  14. From Serpent to CEO: Improving First-Term Security Forces Airman Performance Through Neuroscience Education

    DTIC Science & Technology

    2017-06-09

    full ability to inhibit ANS and limbic response are prone to be impulsive, unintentional, or hesitant when faced with high-threat decisions...graduate degrees in Criminal Justice, a Graduate Certificate in Organizational Leadership, and a current American Society for Industrial Security...

  15. Wildland fire risk and social vulnerability in the Southeastern United States: An exploratory spatial data analysis approach

    Treesearch

    Cassandra Johnson Gaither; N.C. Poudyal; S. Goodrick; J.M. Bowker; S. Malone; J. Gan

    2011-01-01

    The southeastern U.S. is one of the more wildland fire prone areas of the country and also contains some of the poorest or most socially vulnerable rural communities. Our project addresses wildland fire risk in this part of the U.S. and its intersection with social vulnerability. We examine spatial association between high wildland fire prone areas which also rank high...

  16. Data entry errors and design for model-based tight glycemic control in critical care.

    PubMed

    Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey

    2012-01-01

    Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which can be significantly prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized and model-based TGC method prior to pilot clinical trials. To minimize data entry error, two tests were carried out to optimize a method with errors less than the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with the sum comprising a total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of the magnitude of errors. These research tests were approved by the University of Canterbury Ethics Committee. The final data entry method tested reduced errors to less than 1-2%, a 60-80% reduction from reported values. The magnitude of errors was clinically significant, typically off by 10.0 mmol/liter or an order of magnitude, but only for extreme values of BG < 2.0 mmol/liter or BG > 15.0-20.0 mmol/liter, both of which could be easily corrected with automated checking of extreme values for safety. The data entry method selected significantly reduced data entry errors in the limited design tests presented, and is in use in a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. © 2012 Diabetes Technology Society.
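
    The automated extreme-value check proposed above is simple to sketch; the thresholds follow the extremes quoted in the abstract (BG < 2.0 or > 15.0-20.0 mmol/liter), while the function itself is hypothetical:

        def check_bg_entry(bg_mmol_per_l: float,
                           low: float = 2.0, high: float = 15.0) -> str:
            """Flag blood-glucose entries outside the plausible clinical range,
            where the study found order-of-magnitude entry errors concentrate."""
            if bg_mmol_per_l < low:
                return "CONFIRM: hypoglycemic extreme -- re-enter to verify"
            if bg_mmol_per_l > high:
                return "CONFIRM: hyperglycemic extreme -- re-enter to verify"
            return "accepted"

        print(check_bg_entry(5.6))    # typical value: accepted
        print(check_bg_entry(56.0))   # likely decimal-point slip: flagged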

  17. Recognition of strong earthquake-prone areas with a single learning class

    NASA Astrophysics Data System (ADS)

    Gvishiani, A. D.; Agayan, S. M.; Dzeboev, B. A.; Belov, I. O.

    2017-05-01

    This article presents a new Barrier recognition algorithm with learning, designed for recognition of earthquake-prone areas. In comparison to the Crust (Kora) algorithm, used by the classical EPA approach, the Barrier algorithm proceeds with learning just on one "pure" high-seismic class. The new algorithm operates in the space of absolute values of the geological-geophysical parameters of the objects. The algorithm is used for recognition of earthquake-prone areas with M ≥ 6.0 in the Caucasus region. Comparative analysis of the Crust and Barrier algorithms justifies their productive coherence.

  18. Error baseline rates of five sample preparation methods used to characterize RNA virus populations.

    PubMed

    Kugelman, Jeffrey R; Wiley, Michael R; Nagle, Elyse R; Reyes, Daniel; Pfeffer, Brad P; Kuhn, Jens H; Sanchez-Lockhart, Mariano; Palacios, Gustavo F

    2017-01-01

    Individual RNA viruses typically occur as populations of genomes that differ slightly from each other due to mutations introduced by the error-prone viral polymerase. Understanding the variability of RNA virus genome populations is critical for understanding virus evolution because individual mutant genomes may gain evolutionary selective advantages and give rise to dominant subpopulations, possibly even leading to the emergence of viruses resistant to medical countermeasures. Reverse transcription of virus genome populations followed by next-generation sequencing is the only available method to characterize variation for RNA viruses. However, both steps may lead to the introduction of artificial mutations, thereby skewing the data. To better understand how such errors are introduced during sample preparation, we determined and compared error baseline rates of five different sample preparation methods by analyzing in vitro transcribed Ebola virus RNA from an artificial plasmid-based system. These methods included: shotgun sequencing from plasmid DNA or in vitro transcribed RNA as a basic "no amplification" method, amplicon sequencing from the plasmid DNA or in vitro transcribed RNA as a "targeted" amplification method, sequence-independent single-primer amplification (SISPA) as a "random" amplification method, rolling circle reverse transcription sequencing (CirSeq) as an advanced "no amplification" method, and Illumina TruSeq RNA Access as a "targeted" enrichment method. The measured error frequencies indicate that RNA Access offers the best tradeoff between sensitivity and sample preparation error (1.4 × 10⁻⁵) of all compared methods.

  19. Error baseline rates of five sample preparation methods used to characterize RNA virus populations

    PubMed Central

    Kugelman, Jeffrey R.; Wiley, Michael R.; Nagle, Elyse R.; Reyes, Daniel; Pfeffer, Brad P.; Kuhn, Jens H.; Sanchez-Lockhart, Mariano; Palacios, Gustavo F.

    2017-01-01

    Individual RNA viruses typically occur as populations of genomes that differ slightly from each other due to mutations introduced by the error-prone viral polymerase. Understanding the variability of RNA virus genome populations is critical for understanding virus evolution because individual mutant genomes may gain evolutionary selective advantages and give rise to dominant subpopulations, possibly even leading to the emergence of viruses resistant to medical countermeasures. Reverse transcription of virus genome populations followed by next-generation sequencing is the only available method to characterize variation for RNA viruses. However, both steps may lead to the introduction of artificial mutations, thereby skewing the data. To better understand how such errors are introduced during sample preparation, we determined and compared error baseline rates of five different sample preparation methods by analyzing in vitro transcribed Ebola virus RNA from an artificial plasmid-based system. These methods included: shotgun sequencing from plasmid DNA or in vitro transcribed RNA as a basic “no amplification” method, amplicon sequencing from the plasmid DNA or in vitro transcribed RNA as a “targeted” amplification method, sequence-independent single-primer amplification (SISPA) as a “random” amplification method, rolling circle reverse transcription sequencing (CirSeq) as an advanced “no amplification” method, and Illumina TruSeq RNA Access as a “targeted” enrichment method. The measured error frequencies indicate that RNA Access offers the best tradeoff between sensitivity and sample preparation error (1.4 × 10⁻⁵) of all compared methods. PMID:28182717

  20. Intraoperative visualization and assessment of electromagnetic tracking error

    NASA Astrophysics Data System (ADS)

    Harish, Vinyas; Ungi, Tamas; Lasso, Andras; MacDonald, Andrew; Nanji, Sulaiman; Fichtinger, Gabor

    2015-03-01

    Electromagnetic tracking allows for increased flexibility in designing image-guided interventions, however it is well understood that electromagnetic tracking is prone to error. Visualization and assessment of the tracking error should take place in the operating room with minimal interference with the clinical procedure. The goal was to achieve this ideal in an open-source software implementation in a plug and play manner, without requiring programming from the user. We use optical tracking as a ground truth. An electromagnetic sensor and optical markers are mounted onto a stylus device, pivot calibrated for both trackers. Electromagnetic tracking error is defined as difference of tool tip position between electromagnetic and optical readings. Multiple measurements are interpolated into the thin-plate B-spline transform visualized in real time using 3D Slicer. All tracked devices are used in a plug and play manner through the open-source SlicerIGT and PLUS extensions of the 3D Slicer platform. Tracking error was measured multiple times to assess reproducibility of the method, both with and without placing ferromagnetic objects in the workspace. Results from exhaustive grid sampling and freehand sampling were similar, indicating that a quick freehand sampling is sufficient to detect unexpected or excessive field distortion in the operating room. The software is available as a plug-in for the 3D Slicer platforms. Results demonstrate potential for visualizing electromagnetic tracking error in real time for intraoperative environments in feasibility clinical trials in image-guided interventions.
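
    A minimal sketch of the error-field estimation described above, substituting SciPy's thin-plate-spline RBFInterpolator for the thin-plate B-spline transform used in 3D Slicer; the sample positions and distortion below are synthetic:

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        rng = np.random.default_rng(1)

        # Freehand stylus samples in the tracking volume (mm), synthetic.
        optical = rng.uniform(-100, 100, (40, 3))        # ground-truth tip positions
        distortion = 0.002 * optical**2 * [1, -1, 0.5]   # synthetic field distortion
        electromagnetic = optical + distortion           # EM readings

        # Error vector at each sample, interpolated over the volume.
        error_vec = electromagnetic - optical
        error_field = RBFInterpolator(optical, error_vec,
                                      kernel="thin_plate_spline")

        query = np.array([[10.0, -25.0, 40.0]])
        e = error_field(query)[0]
        print(f"predicted EM error at {query[0]} mm: {np.linalg.norm(e):.2f} mm")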

  1. Monitoring of infrastructural sites by means of advanced multi-temporal DInSAR methods

    NASA Astrophysics Data System (ADS)

    Vollrath, Andreas; Zucca, Francesco; Stramondo, Salvatore

    2013-10-01

    With the launch of Sentinel-1, advanced interferometric measurements will become more applicable than ever. The foreseen standard Wide Area Product (WAP), with its higher spatial and temporal resolution than comparable SAR missions, will provide the basis for new wide-scale and multitemporal analyses. To date, SAR interferometry methods for risk assessment have mainly been applied to active tectonic zones, plate boundaries, volcanoes, and urban areas, where local surface movement rates exceed the expected error and enough pixels per area contain a relatively stable phase. This study, in contrast, focuses on infrastructural sites that are located outside cities and are therefore surrounded by rural landscapes. The starting point was the communication letter by the European Commission regarding the stress tests of nuclear power plants in Europe in 2012, which noted that continuously re-evaluated risk and safety assessments are necessary to guarantee the highest possible security for European citizens and the environment. This is also true for other infrastructural sites that are prone to diverse geophysical hazards. In combination with GPS and broadband seismology, multitemporal differential interferometric SAR approaches have demonstrated great potential in contributing valuable information on surface movement phenomena. At this stage of the project, first results of the StaMPS-MTI approach (combined PSInSAR and SBAS) are presented for the industrial area around Priolo Gargallo in southeast Sicily, using ENVISAT ASAR IM mode data from 2003-2010. This area is located between the Malta Escarpment fault system and the Hyblean plateau and is prone to earthquake and tsunami risk; it features a high density of oil refineries located directly on the coast. The general potential of these techniques with respect to the Sentinel-1 mission is shown for this area, and a road map for further improvements is given in order to overcome limitations related to the influence of the atmosphere, orbit, or DEM errors. Further steps will also include validation and tectonic modeling for risk assessment.

  2. Disruption of N terminus long range non covalent interactions shifted temp.opt 25°C to cold: Evolution of point mutant Bacillus lipase by error prone PCR.

    PubMed

    Goomber, Shelly; Kumar, Arbind; Kaur, Jagdeep

    2016-01-15

    Cold-adapted enzymes have applications in detergent, textile, food, bioremediation, and biotechnology processes. Bacillus lipases are 'generally recognized as safe' (GRAS) and hence are industrially attractive. Bacillus lipases of subfamily 1.4 have the lowest molecular weight and are reversibly unfolded due to the absence of disulphide bonds. They are therefore widely used to study the energetics of protein stability, which represents the unfolding of the native protein to a fully unfolded state. In the present study, the metagenomically isolated Bacillus lipase LipJ was laboratory-evolved for cold adaptation by error-prone PCR. A library of variants was screened for high relative activity at a low temperature of 10°C compared to the native protein LipJ. A point mutant, sequenced as Phe19→Leu, was determined to be cold-active and was selected for extensive biochemical and biophysical characterization. Variant F19L showed its maximum activity at 10°C, where the parent protein LipJ retained only 20% relative activity. The psychrophilic nature of F19L was established by its roughly 50% relative activity at 5°C, a temperature at which the native protein was essentially inactive. Variant F19L showed no activity at temperatures of 40°C and above, establishing its thermolabile nature. Thermostability studies determined the mutant to be unstable above 20°C, with a three-fold decrease in its half-life at 30°C compared to the native protein. Far-UV CD and intrinsic fluorescence studies demonstrated an unstable tertiary structure of point variant F19L, leading to its unfolding at a temperature as low as 20°C. The cold adaptation of mutant F19L is accompanied by increased specific activity; the mutant was catalytically more efficient, with a 1.3-fold increase in kcat. Homology structure modelling predicted disruption of the inter-secondary-structure hydrophobic core formed by the aromatic ring of Phe19 with nonpolar residues on β3, β4, β5, β6, and αF. The increased local flexibility of variant F19L explains the molecular basis of its psychrophilic nature. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Enhancement of cellulosome-mediated deconstruction of cellulose by improving enzyme thermostability.

    PubMed

    Moraïs, Sarah; Stern, Johanna; Kahn, Amaranta; Galanopoulou, Anastasia P; Yoav, Shahar; Shamshoum, Melina; Smith, Matthew A; Hatzinikolaou, Dimitris G; Arnold, Frances H; Bayer, Edward A

    2016-01-01

    The concerted action of three complementary cellulases from Clostridium thermocellum, engineered to be stable at elevated temperatures, was examined on a cellulosic substrate and compared to that of the wild-type enzymes. Exoglucanase Cel48S and endoglucanase Cel8A, both key elements of the natural cellulosome from this bacterium, were engineered previously for increased thermostability, either by SCHEMA, a structure-guided, site-directed protein recombination method, or by consensus-guided mutagenesis combined with random mutagenesis using error-prone PCR, respectively. A thermostable β-glucosidase BglA mutant was also selected from a library generated by error-prone PCR that will assist the two cellulases in their methodic deconstruction of crystalline cellulose. The effects of a thermostable scaffoldin versus those of a largely mesophilic scaffoldin were also examined. By improving the stability of the enzyme subunits and the structural component, we aimed to improve cellulosome-mediated deconstruction of cellulosic substrates. The results demonstrate that the combination of thermostable enzymes as free enzymes and a thermostable scaffoldin was more active on the cellulosic substrate than the wild-type enzymes. Significantly, "thermostable" designer cellulosomes exhibited a 1.7-fold enhancement in cellulose degradation compared to the action of conventional designer cellulosomes that contain the respective wild-type enzymes. For designer cellulosome formats, the use of the thermostabilized scaffoldin proved critical for enhanced enzymatic performance under conditions of high temperatures. Simple improvement in the activity of a given enzyme does not guarantee its suitability for use in an enzyme cocktail or as a designer cellulosome component. The true merit of improvement resides in its ultimate contribution to synergistic action, which can only be determined experimentally. The relevance of the mutated thermostable enzymes employed in this study as components in multienzyme systems has thus been confirmed using designer cellulosome technology. Enzyme integration via a thermostable scaffoldin is critical to the ultimate stability of the complex at higher temperatures. Engineering of thermostable cellulases and additional lignocellulosic enzymes may prove a determinant parameter for development of state-of-the-art designer cellulosomes for their employment in the conversion of cellulosic biomass to soluble sugars. Graphical abstract: Conversion of conventional designer cellulosomes into thermophilic designer cellulosomes.

  4. Safeguarding the process of drug administration with an emphasis on electronic support tools

    PubMed Central

    Seidling, Hanna M; Lampert, Anette; Lohmann, Kristina; Schiele, Julia T; Send, Alexander J F; Witticke, Diana; Haefeli, Walter E

    2013-01-01

    Aims: The aim of this work is to understand the process of drug administration and identify points in the workflow that resulted in interventions by clinical information systems in order to improve patient safety. Methods: To identify a generic way to structure the drug administration process, we performed peer-group discussions and supplemented these discussions with a literature search for studies reporting errors in drug administration and strategies for their prevention. Results: We concluded that the drug administration process may consist of up to 11 sub-steps, which can be grouped into the four sub-processes of preparation, personalization, application, and follow-up. Errors in drug handling and administration are diverse and frequent and in many cases not caused by the patient him/herself, but by family members or nurses. Accordingly, different prevention strategies have been put in place, with relatively few approaches involving e-health technology. Conclusions: A generic structuring of the administration process and of particularly error-prone sub-steps may facilitate the allocation of prevention strategies and help to identify research gaps. PMID:24007450

  5. Analyzing crack development pattern of masonry structure in seismic oscillation by digital photography

    NASA Astrophysics Data System (ADS)

    Zhang, Guojian; Yu, Chengxin; Ding, Xinhua

    2018-01-01

    In this study, digital photography is used to monitor the instantaneous deformation of a masonry wall in seismic oscillation. To obtain higher measurement accuracy, the image matching-time baseline parallax method (IM-TBPM) is used to correct errors caused by changes in the intrinsic and extrinsic parameters of the digital cameras. Results show that the average errors of control point C5 are 0.79 mm, 0.44 mm, and 0.96 mm in the X, Z, and resultant directions, respectively, and the average errors of control point C6 are 0.49 mm, 0.44 mm, and 0.71 mm. These results suggest that IM-TBPM can meet the accuracy requirements of instantaneous deformation monitoring. In seismic oscillation, cracks first develop in the middle-to-lower part of the masonry wall; shear failure then occurs in the middle of the wall. This study provides a technical basis for analyzing the crack development pattern of masonry structures in seismic oscillation and has significant implications for improved construction of masonry structures in earthquake-prone areas.

  6. Combination with anti-tit-for-tat remedies problems of tit-for-tat.

    PubMed

    Yi, Su Do; Baek, Seung Ki; Choi, Jung-Kyoo

    2017-01-07

    One of the most important questions in game theory concerns how mutual cooperation can be achieved and maintained in a social dilemma. In Axelrod's tournaments of the iterated prisoner's dilemma, Tit-for-Tat (TFT) demonstrated the role of reciprocity in the emergence of cooperation. However, the stability of TFT does not hold in the presence of implementation error, and a TFT population is prone to neutral drift to unconditional cooperation, which eventually invites defectors. We argue that a combination of TFT and anti-TFT (ATFT) overcomes these difficulties in a noisy environment, provided that ATFT is defined as choosing the opposite to the opponent's last move. According to this TFT-ATFT strategy, a player normally uses TFT; turns to ATFT upon recognizing his or her own error; returns to TFT either when mutual cooperation is recovered or when the opponent unilaterally defects twice in a row. The proposed strategy provides simple and deterministic behavioral rules for correcting implementation error in a way that cannot be exploited by the opponent, and suppresses the neutral drift to unconditional cooperation. Copyright © 2016 Elsevier Ltd. All rights reserved.
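
    The behavioral rules quoted above translate directly into a small state machine; here is a minimal sketch (C = cooperate, D = defect), with error recognition supplied by the caller, paraphrased from the abstract rather than taken from the authors' code:

        C, D = "C", "D"

        class TftAtft:
            """Sketch of the TFT-ATFT rules paraphrased from the abstract."""
            def __init__(self):
                self.mode = "TFT"
                self.history = []  # (my_move, opponent_move) per round

            def next_move(self):
                if not self.history:
                    return C                          # open cooperatively
                opp_last = self.history[-1][1]
                if self.mode == "TFT":
                    return opp_last                   # reciprocate last move
                return C if opp_last == D else D      # ATFT: opposite move

            def record(self, intended, played, opp_move):
                self.history.append((played, opp_move))
                if played != intended:                # recognized own error
                    self.mode = "ATFT"
                elif self.mode == "ATFT":
                    mutual_coop = (played, opp_move) == (C, C)
                    unilateral = [m == C and o == D
                                  for m, o in self.history[-2:]]
                    if mutual_coop or (len(unilateral) == 2 and all(unilateral)):
                        self.mode = "TFT"             # recovery conditions met

        # Two error-free TFT-ATFT players stay in mutual cooperation.
        a, b = TftAtft(), TftAtft()
        for _ in range(5):
            ma, mb = a.next_move(), b.next_move()
            a.record(ma, ma, mb)
            b.record(mb, mb, ma)
            print(ma, mb)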

  7. IgRepertoireConstructor: a novel algorithm for antibody repertoire construction and immunoproteogenomics analysis

    PubMed Central

    Safonova, Yana; Bonissone, Stefano; Kurpilyansky, Eugene; Starostina, Ekaterina; Lapidus, Alla; Stinson, Jeremy; DePalatis, Laura; Sandoval, Wendy; Lill, Jennie; Pevzner, Pavel A.

    2015-01-01

    The analysis of concentrations of circulating antibodies in serum (antibody repertoire) is a fundamental, yet poorly studied, problem in immunoinformatics. The two current approaches to the analysis of antibody repertoires [next generation sequencing (NGS) and mass spectrometry (MS)] present difficult computational challenges since antibodies are not directly encoded in the germline but are extensively diversified by somatic recombination and hypermutations. Therefore, the protein database required for the interpretation of spectra from circulating antibodies is custom for each individual. Although such a database can be constructed via NGS, the reads generated by NGS are error-prone and even a single nucleotide error precludes identification of a peptide by the standard proteomics tools. Here, we present the IgRepertoireConstructor algorithm that performs error-correction of immunosequencing reads and uses mass spectra to validate the constructed antibody repertoires. Availability and implementation: IgRepertoireConstructor is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from http://bioinf.spbau.ru/igtools. Contact: ppevzner@ucsd.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072509

  8. Mid-infrared laser-absorption diagnostic for vapor-phase measurements in an evaporating n-decane aerosol

    NASA Astrophysics Data System (ADS)

    Porter, J. M.; Jeffries, J. B.; Hanson, R. K.

    2009-09-01

    A novel three-wavelength mid-infrared laser-based absorption/extinction diagnostic has been developed for simultaneous measurement of temperature and vapor-phase mole fraction in an evaporating hydrocarbon fuel aerosol (vapor and liquid droplets). The measurement technique was demonstrated for an n-decane aerosol with D50 ≈ 3 μm in steady and shock-heated flows with a measurement bandwidth of 125 kHz. Laser wavelengths were selected from FTIR measurements of the C-H stretching band of vapor and liquid n-decane near 3.4 μm (3000 cm⁻¹), and from modeled light scattering from droplets. Measurements were made for vapor mole fractions below 2.3 percent with errors less than 10 percent, and simultaneous temperature measurements over the range 300 K < T < 900 K were made with errors less than 3 percent. The measurement technique is designed to provide accurate values of temperature and vapor mole fraction in evaporating polydispersed aerosols with small mean diameters (D50 < 10 μm), where near-infrared laser-based scattering corrections are prone to error.
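
    The underlying measurement principle is two-line absorption thermometry with a non-resonant extinction correction; in Beer-Lambert form (a generic statement of the technique, with τ_d denoting droplet extinction, not the authors' exact working equations):

        % Extinction at each wavelength: vapor absorption plus droplet extinction
        -\ln\!\left(\frac{I_t}{I_0}\right)_{\lambda_i}
            = k_v(\lambda_i, T)\,x_v\,P\,L + \tau_d(\lambda_i), \qquad i = 1, 2, 3

        % A non-resonant wavelength isolates \tau_d; the ratio of the two resonant
        % absorbances depends only on temperature, after which either line gives x_v:
        R(T) = \frac{k_v(\lambda_1, T)}{k_v(\lambda_2, T)}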

  9. Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.

    PubMed

    Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H

    2014-01-10

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A good level of prediction accuracy was achieved and confirmed by cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that residents of Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels.
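
    The study used the Geostatistical Analyst extension of ArcGIS; for readers without that toolchain, the same ordinary-Kriging prediction step can be sketched with the open-source PyKrige package. The monitor coordinates and ozone values below are made-up placeholders, not the study's data.

        import numpy as np
        from pykrige.ok import OrdinaryKriging   # pip install pykrige

        # Monitor coordinates (km) and hourly O3 (ppb): illustrative values only.
        x = np.array([0.0, 12.0, 25.0, 40.0, 55.0])
        y = np.array([0.0, 18.0, 5.0, 30.0, 12.0])
        o3 = np.array([41.0, 55.0, 48.0, 63.0, 52.0])

        ok = OrdinaryKriging(x, y, o3, variogram_model="spherical")
        gridx = np.arange(0.0, 60.0, 1.0)        # 1 km spatial resolution
        gridy = np.arange(0.0, 35.0, 1.0)
        zhat, kvar = ok.execute("grid", gridx, gridy)  # predictions and kriging variance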

  10. The ADRA2B gene in the production of false memories for affective information in healthy female volunteers.

    PubMed

    Fairfield, Beth; Mammarella, Nicola; Di Domenico, Alberto; D'Aurora, Marco; Stuppia, Liborio; Gatta, Valentina

    2017-08-30

    False memories are common memory distortions in everyday life and seem to increase with affectively connoted complex information. In line with recent studies showing a significant interaction between the noradrenergic system and emotional memory, we investigated whether healthy volunteer carriers of the deletion variant of the ADRA2B gene that codes for the α2b-adrenergic receptor are more prone to false memories than non-carriers. In this study, we collected genotype data from 212 healthy female volunteers; 91 ADRA2B carriers and 121 non-carriers. To assess gene effects on false memories for affective information, factorial mixed model analysis of variances (ANOVAs) were conducted with genotype as the between-subjects factor and type of memory error as the within-subjects factor. We found that although carriers and non-carriers made comparable numbers of false memory errors, they showed differences in the direction of valence biases, especially for inferential causal errors. Specifically, carriers produced fewer causal false memory errors for scripts with a negative outcome, whereas non-carriers showed a more general emotional effect and made fewer causal errors with both positive and negative outcomes. These findings suggest that putatively higher levels of noradrenaline in deletion carriers may enhance short-term consolidation of negative information and lead to fewer memory distortions when facing negative events. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Survival analysis with error-prone time-varying covariates: a risk set calibration approach

    PubMed Central

    Liao, Xiaomei; Zucker, David M.; Li, Yi; Spiegelman, Donna

    2010-01-01

    Summary Occupational, environmental, and nutritional epidemiologists are often interested in estimating the prospective effect of time-varying exposure variables such as cumulative exposure or cumulative updated average exposure, in relation to chronic disease endpoints such as cancer incidence and mortality. From exposure validation studies, it is apparent that many of the variables of interest are measured with moderate to substantial error. Although the ordinary regression calibration approach is approximately valid and efficient for measurement error correction of relative risk estimates from the Cox model with time-independent point exposures when the disease is rare, it is not adaptable for use with time-varying exposures. By re-calibrating the measurement error model within each risk set, a risk set regression calibration method is proposed for this setting. An algorithm for a bias-corrected point estimate of the relative risk using the risk set regression calibration (RRC) approach is presented, followed by the derivation of an estimate of its variance, resulting in a sandwich estimator. Emphasis is on methods applicable to the main study/external validation study design, which arises in important applications. Simulation studies under several assumptions about the error model were carried out, which demonstrated the validity and efficiency of the method in finite samples. The method was applied to a study of diet and cancer from Harvard's Health Professionals Follow-up Study (HPFS). PMID:20486928
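
    In generic regression-calibration notation (ours, not the paper's), the risk-set idea can be sketched as follows: within the risk set R(t) at each event time t, the unobserved true exposure X_i(t) is replaced by a calibrated estimate fitted from the validation data,

        \hat{X}_i(t) = \hat{\alpha}_0(t) + \hat{\alpha}_1(t)\, W_i(t)

    where W_i(t) is the error-prone measured exposure and the coefficients are re-estimated within each risk set; the Cox partial likelihood is then evaluated at \hat{X}_i(t). This is only a minimal sketch of the re-calibration step, not the paper's full estimator, which additionally derives a sandwich variance.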

  12. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    PubMed

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
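
    The kind of automated consistency check described here can be illustrated in a few lines of Python: recompute the two-sided p-value implied by a reported test statistic and its degrees of freedom, and compare it with the reported p. This is a minimal sketch of the idea for t tests only (the function name and tolerance are ours), not the procedure used in the article.

        from scipy import stats

        def p_inconsistent(t_reported, df, p_reported, tol=0.005):
            """Flag a reported two-sided p that disagrees with t and df."""
            p_recomputed = 2 * stats.t.sf(abs(t_reported), df)
            return abs(p_recomputed - p_reported) > tol

        # "t(28) = 2.20, p = .04": recomputed p is about .036, so the report
        # is consistent within rounding and the check returns False.
        print(p_inconsistent(2.20, 28, 0.04))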

  13. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science

    PubMed Central

    Veldkamp, Coosje L. S.; Nuijten, Michèle B.; Dominguez-Alvarez, Linda; van Assen, Marcel A. L. M.; Wicherts, Jelte M.

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this ‘co-piloting’ currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors. PMID:25493918

  14. DNA double-strand–break complexity levels and their possible contributions to the probability for error-prone processing and repair pathway choice

    PubMed Central

    Schipler, Agnes; Iliakis, George

    2013-01-01

    Although the DNA double-strand break (DSB) is defined as a rupture in the double-stranded DNA molecule that can occur without chemical modification in any of the constituent building blocks, it is recognized that this form is restricted to enzyme-induced DSBs. DSBs generated by physical or chemical agents can include at the break site a spectrum of base alterations (lesions). The nature and number of such chemical alterations define the complexity of the DSB and are considered putative determinants for repair pathway choice and the probability that errors will occur during this processing. As the pathways engaged in DSB processing show distinct and frequently inherent propensities for errors, pathway choice also defines the error levels cells opt to accept. Here, we present a classification of DSBs on the basis of increasing complexity and discuss how complexity may affect processing, as well as how it may cause lethal or carcinogenic processing errors. By critically analyzing the characteristics of DSB repair pathways, we suggest that all repair pathways can in principle remove lesions clustering at the DSB but are likely to fail when they encounter clusters of DSBs that cause a local form of chromothripsis. In the same framework, we also analyze the rationale of DSB repair pathway choice. PMID:23804754

  15. Assessing primary care data quality.

    PubMed

    Lim, Yvonne Mei Fong; Yusof, Maryati; Sivasampu, Sheamini

    2018-04-16

    Purpose The purpose of this paper is to assess National Medical Care Survey data quality. Design/methodology/approach Data completeness and representativeness were computed for all observations while other data quality measures were assessed using a 10 per cent sample from the National Medical Care Survey database; i.e., 12,569 primary care records from 189 public and private practices were included in the analysis. Findings Data field completion ranged from 69 to 100 per cent. Error rates for data transfer from paper to web-based application varied between 0.5 and 6.1 per cent. Error rates arising from diagnosis and clinical process coding were higher than medication coding. Data fields that involved free text entry were more prone to errors than those involving selection from menus. The authors found that completeness, accuracy, coding reliability and representativeness were generally good, while data timeliness needs to be improved. Research limitations/implications Only data entered into a web-based application were examined. Data omissions and errors in the original questionnaires were not covered. Practical implications Results from this study provided informative and practicable approaches to improve primary health care data completeness and accuracy especially in developing nations where resources are limited. Originality/value Primary care data quality studies in developing nations are limited. Understanding errors and missing data enables researchers and health service administrators to prevent quality-related problems in primary care data.

  16. The effects of angry and happy expressions on recognition memory for unfamiliar faces in delusion-prone individuals.

    PubMed

    Larøi, Frank; D'Argembeau, Arnaud; Van der Linden, Martial

    2006-12-01

    Numerous studies suggest a cognitive bias for threat-related material in delusional ideation. However, few studies have examined this bias using a memory task. We investigated the influence of delusion-proneness on identity and expression memory for angry and happy faces. Participants high and low in delusion-proneness were presented with happy and angry faces and were later asked to recognise the same faces displaying a neutral expression. They also had to remember what the initial expressions of the faces had been. Remember/know/guess judgments were asked for both identity and expression memory. Results showed that delusion-prone participants better recognised the identity of angry faces compared to non-delusional participants. Also, this difference between the two groups was mainly due to a greater number of remember responses in delusion-prone participants. These findings extend previous studies by showing that delusions are associated with a memory bias for threat-related stimuli.

  17. Distress after a single violent crime: how shame-proneness and event-related shame work together as risk factors for post-victimization symptoms.

    PubMed

    Semb, Olof; Strömsten, Lotta M J; Sundbom, Elisabet; Fransson, Per; Henningsson, Mikael

    2011-08-01

    To increase understanding of post-victimization symptom development, the present study investigated the role of shame- and guilt-proneness and event-related shame and guilt as potential risk factors. 35 individuals (M age = 31.7 yr.; 48.5% women), recently victimized by a single event of severe violent crime, were assessed regarding shame- and guilt-proneness, event-related shame and guilt, and post-victimization symptoms. The mediating role of event-related shame was investigated with structural equation modeling (SEM), using bootstrapping. The guilt measures were unrelated to each other and to post-victimization symptoms. The shame measures were highly intercorrelated and were both positively correlated to more severe post-victimization symptom levels. Event-related shame as mediator between shame-proneness and post-victimization symptoms was demonstrated by prevalent significant indirect effects. Both shame measures are potent risk factors for distress after victimization, whereby part of the effect of shame-proneness on post-victimization symptoms is explained by event-related shame.

  18. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, and experimental protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  19. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether expected end-to-end performances are provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because a long processing duration causes performance degradation. This requires testers (persons who do tests) to know precisely how long a packet is held by various network nodes. Without any tool's help, this task is time-consuming and error prone. Thus we propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. Such recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any drop.
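
    Once headers are captured at two observation points with synchronized clocks, the holding duration at the node in between reduces to a per-packet timestamp difference. A minimal sketch, assuming header-derived packet identifiers (e.g., an IP-ID/flow tuple) as dictionary keys; the names and values are ours:

        def holding_durations(log_a, log_b):
            """Per-packet time spent between observation points A and B.
            log_a/log_b map a packet identifier to its capture time (seconds)."""
            return {pid: log_b[pid] - log_a[pid] for pid in log_a if pid in log_b}

        at_ingress = {"pkt1": 0.000120, "pkt2": 0.000450}
        at_egress = {"pkt1": 0.000980, "pkt2": 0.001720}
        print(holding_durations(at_ingress, at_egress))  # per-packet durations in seconds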

  20. Intellicount: High-Throughput Quantification of Fluorescent Synaptic Protein Puncta by Machine Learning

    PubMed Central

    Fantuzzo, J. A.; Mirabella, V. R.; Zahn, J. D.

    2017-01-01

    Abstract Synapse formation analyses can be performed by imaging and quantifying fluorescent signals of synaptic markers. Traditionally, these analyses are done using simple or multiple thresholding and segmentation approaches or by labor-intensive manual analysis by a human observer. Here, we describe Intellicount, a high-throughput, fully-automated synapse quantification program which applies a novel machine learning (ML)-based image processing algorithm to systematically improve region of interest (ROI) identification over simple thresholding techniques. Through processing large datasets from both human and mouse neurons, we demonstrate that this approach allows image processing to proceed independently of carefully set thresholds, thus reducing the need for human intervention. As a result, this method can efficiently and accurately process large image datasets with minimal interaction by the experimenter, making it less prone to bias and less liable to human error. Furthermore, Intellicount is integrated into an intuitive graphical user interface (GUI) that provides a set of valuable features, including automated and multifunctional figure generation, routine statistical analyses, and the ability to run full datasets through nested folders, greatly expediting the data analysis process. PMID:29218324

  1. A high-speed linear algebra library with automatic parallelism

    NASA Technical Reports Server (NTRS)

    Boucher, Michael L.

    1994-01-01

    Parallel or distributed processing is key to getting the highest performance from workstations. However, designing and implementing efficient parallel algorithms is difficult and error-prone. It is even more difficult to write code that is both portable to and efficient on many different computers. Finally, it is harder still to satisfy the above requirements and include the reliability and ease of use required of commercial software intended for use in a production environment. As a result, the application of parallel processing technology to commercial software has been extremely small even though there are numerous computationally demanding programs that would significantly benefit from application of parallel processing. This paper describes DSSLIB, which is a library of subroutines that perform many of the time-consuming computations in engineering and scientific software. DSSLIB combines the high efficiency and speed of parallel computation with a serial programming model that eliminates many undesirable side-effects of typical parallel code. The result is a simple way to incorporate the power of parallel processing into commercial software without compromising maintainability, reliability, or ease of use. This gives significant advantages over less powerful non-parallel entries in the market.

  2. Effect of prone positioning during mechanical ventilation on mortality among patients with acute respiratory distress syndrome: a systematic review and meta-analysis.

    PubMed

    Sud, Sachin; Friedrich, Jan O; Adhikari, Neill K J; Taccone, Paolo; Mancebo, Jordi; Polli, Federico; Latini, Roberto; Pesenti, Antonio; Curley, Martha A Q; Fernandez, Rafael; Chan, Ming-Cheng; Beuret, Pascal; Voggenreiter, Gregor; Sud, Maneesh; Tognoni, Gianni; Gattinoni, Luciano; Guérin, Claude

    2014-07-08

    Mechanical ventilation in the prone position is used to improve oxygenation and to mitigate the harmful effects of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS). We sought to determine the effect of prone positioning on mortality among patients with ARDS receiving protective lung ventilation. We searched electronic databases and conference proceedings to identify relevant randomized controlled trials (RCTs) published through August 2013. We included RCTs that compared prone and supine positioning during mechanical ventilation in patients with ARDS. We assessed risk of bias and obtained data on all-cause mortality (determined at hospital discharge or, if unavailable, after longest follow-up period). We used random-effects models for the pooled analyses. We identified 11 RCTs (n=2341) that met our inclusion criteria. In the 6 trials (n=1016) that used a protective ventilation strategy with reduced tidal volumes, prone positioning significantly reduced mortality (risk ratio 0.74, 95% confidence interval 0.59-0.95; I2=29%) compared with supine positioning. The mortality benefit remained in several sensitivity analyses. The overall quality of evidence was high. The risk of bias was low in all of the trials except one, which was small. Statistical heterogeneity was low (I2<50%) for most of the clinical and physiologic outcomes. Our analysis of high-quality evidence showed that use of the prone position during mechanical ventilation improved survival among patients with ARDS who received protective lung ventilation. © 2014 Canadian Medical Association or its licensors.

  3. Effect of prone positioning during mechanical ventilation on mortality among patients with acute respiratory distress syndrome: a systematic review and meta-analysis

    PubMed Central

    Sud, Sachin; Friedrich, Jan O.; Adhikari, Neill K. J.; Taccone, Paolo; Mancebo, Jordi; Polli, Federico; Latini, Roberto; Pesenti, Antonio; Curley, Martha A.Q.; Fernandez, Rafael; Chan, Ming-Cheng; Beuret, Pascal; Voggenreiter, Gregor; Sud, Maneesh; Tognoni, Gianni; Gattinoni, Luciano; Guérin, Claude

    2014-01-01

    Background: Mechanical ventilation in the prone position is used to improve oxygenation and to mitigate the harmful effects of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS). We sought to determine the effect of prone positioning on mortality among patients with ARDS receiving protective lung ventilation. Methods: We searched electronic databases and conference proceedings to identify relevant randomized controlled trials (RCTs) published through August 2013. We included RCTs that compared prone and supine positioning during mechanical ventilation in patients with ARDS. We assessed risk of bias and obtained data on all-cause mortality (determined at hospital discharge or, if unavailable, after longest follow-up period). We used random-effects models for the pooled analyses. Results: We identified 11 RCTs (n = 2341) that met our inclusion criteria. In the 6 trials (n = 1016) that used a protective ventilation strategy with reduced tidal volumes, prone positioning significantly reduced mortality (risk ratio 0.74, 95% confidence interval 0.59–0.95; I2 = 29%) compared with supine positioning. The mortality benefit remained in several sensitivity analyses. The overall quality of evidence was high. The risk of bias was low in all of the trials except one, which was small. Statistical heterogeneity was low (I2 < 50%) for most of the clinical and physiologic outcomes. Interpretation: Our analysis of high-quality evidence showed that use of the prone position during mechanical ventilation improved survival among patients with ARDS who received protective lung ventilation. PMID:24863923

  4. Automated algorithm for mapping regions of cold-air pooling in complex terrain

    NASA Astrophysics Data System (ADS)

    Lundquist, Jessica D.; Pepin, Nicholas; Rochford, Caitlin

    2008-11-01

    In complex terrain, air in contact with the ground becomes cooled from radiative energy loss on a calm clear night and, being denser than the free atmosphere at the same elevation, sinks to valley bottoms. Cold-air pooling (CAP) occurs where this cooled air collects on the landscape. This article focuses on identifying locations on a landscape subject to considerably lower minimum temperatures than the regional average during conditions of clear skies and weak synoptic-scale winds, providing a simple automated method to map locations where cold air is likely to pool. Digital elevation models of regions of complex terrain were used to derive surfaces of local slope, curvature, and percentile elevation relative to surrounding terrain. Each pixel was classified as prone to CAP, not prone to CAP, or exhibiting no signal, based on the criterion that CAP occurs in regions with flat slopes in local depressions or valleys (negative curvature and low percentile). Along-valley changes in the topographic amplification factor (TAF) were then calculated to determine whether the cold air in the valley was likely to drain or pool. Results were checked against distributed temperature measurements in Loch Vale, Rocky Mountain National Park, Colorado; in the Eastern Pyrenees, France; and in Yosemite National Park, Sierra Nevada, California. Using CAP classification to interpolate temperatures across complex terrain resulted in improvements in root-mean-square errors compared to more basic interpolation techniques at most sites within the three areas examined, with average error reductions of up to 3°C at individual sites and about 1°C averaged over all sites in the study areas.
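
    The pixel classification described above can be sketched directly from the three derived surfaces. The thresholds below are placeholders for illustration, not the values used in the article.

        import numpy as np

        def classify_cap(slope_deg, curvature, pct_elev,
                         flat_thresh=5.0, pct_thresh=0.3):
            """Label each DEM pixel: 1 = prone to CAP, -1 = not prone, 0 = no signal.
            Inputs are 2-D arrays of local slope (degrees), curvature, and
            percentile elevation relative to surrounding terrain (0-1)."""
            out = np.zeros(slope_deg.shape, dtype=int)
            prone = ((slope_deg < flat_thresh) & (curvature < 0)
                     & (pct_elev < pct_thresh))           # flat local depressions
            not_prone = (slope_deg >= flat_thresh) | (pct_elev > 1.0 - pct_thresh)
            out[prone] = 1
            out[not_prone & ~prone] = -1                  # everything else stays 0
            return out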

  5. Evidence that a burst of DNA depurination in SENCAR mouse skin induces error-prone repair and forms mutations in the H-ras gene.

    PubMed

    Chakravarti, D; Mailander, P C; Li, K M; Higginbotham, S; Zhang, H L; Gross, M L; Meza, J L; Cavalieri, E L; Rogan, E G

    2001-11-29

    Treatment of SENCAR mouse skin with dibenzo[a,l]pyrene results in abundant formation of abasic sites that undergo error-prone excision repair, forming oncogenic H-ras mutations in the early preneoplastic period. To examine whether the abundance of abasic sites causes repair infidelity, we treated SENCAR mouse skin with estradiol-3,4-quinone (E(2)-3,4-Q) and determined adduct levels 1 h after treatment, as well as mutation spectra in the H-ras gene between 6 h and 3 days after treatment. E(2)-3,4-Q formed predominantly (> or =99%) the rapidly-depurinating 4-hydroxy estradiol (4-OHE(2))-1-N3Ade adduct and the slower-depurinating 4-OHE(2)-1-N7Gua adduct. Between 6 h and 3 days, E(2)-3,4-Q induced abundant A to G mutations in H-ras DNA, frequently in the context of a 3'-G residue. Using a T.G-DNA glycosylase (TDG)-PCR assay, we determined that the early A to G mutations (6 and 12 h) were in the form of G.T heteroduplexes, suggesting misrepair at A-specific depurination sites. Since G-specific mutations were infrequent in the spectra, it appears that the slow rate of depurination of the N7Gua adducts during active repair may not generate a threshold level of G-specific abasic sites to affect repair fidelity. These results also suggest that E(2)-3,4-Q, a suspected endogenous carcinogen, is a genotoxic compound and could cause mutations.

  6. The Werner syndrome protein limits the error-prone 8-oxo-dG lesion bypass activity of human DNA polymerase kappa

    PubMed Central

    Maddukuri, Leena; Ketkar, Amit; Eddy, Sarah; Zafar, Maroof K.; Eoff, Robert L.

    2014-01-01

    Human DNA polymerase kappa (hpol κ) is the only Y-family member to preferentially insert dAMP opposite 7,8-dihydro-8-oxo-2′-deoxyguanosine (8-oxo-dG) during translesion DNA synthesis. We have studied the mechanism of action by which hpol κ activity is modulated by the Werner syndrome protein (WRN), a RecQ helicase known to influence repair of 8-oxo-dG. Here we show that WRN stimulates the 8-oxo-dG bypass activity of hpol κ in vitro by enhancing the correct base insertion opposite the lesion, as well as extension from dC:8-oxo-dG base pairs. Steady-state kinetic analysis reveals that WRN improves hpol κ-catalyzed dCMP insertion opposite 8-oxo-dG ∼10-fold and extension from dC:8-oxo-dG by 2.4-fold. Stimulation is primarily due to an increase in the rate constant for polymerization (kpol), as assessed by pre-steady-state kinetics, and it requires the RecQ C-terminal (RQC) domain. In support of the functional data, recombinant WRN and hpol κ were found to physically interact through the exo and RQC domains of WRN, and co-localization of WRN and hpol κ was observed in human cells treated with hydrogen peroxide. Thus, WRN limits the error-prone bypass of 8-oxo-dG by hpol κ, which could influence the sensitivity to oxidative damage that has previously been observed for Werner's syndrome cells. PMID:25294835

  7. (Quickly) Testing the Tester via Path Coverage

    NASA Technical Reports Server (NTRS)

    Groce, Alex

    2009-01-01

    The configuration complexity and code size of an automated testing framework may grow to a point that the tester itself becomes a significant software artifact, prone to poor configuration and implementation errors. Unfortunately, testing the tester by using old versions of the software under test (SUT) may be impractical or impossible: test framework changes may have been motivated by interface changes in the tested system, or fault detection may become too expensive in terms of computing time to justify running until errors are detected on older versions of the software. We propose the use of path coverage measures as a "quick and dirty" method for detecting many faults in complex test frameworks. We also note the possibility of using techniques developed to diversify state-space searches in model checking to diversify test focus, and an associated classification of tester changes into focus-changing and non-focus-changing modifications.

  8. Robust pupil center detection using a curvature algorithm

    NASA Technical Reports Server (NTRS)

    Zhu, D.; Moore, S. T.; Raphan, T.; Wall, C. C. (Principal Investigator)

    1999-01-01

    Determining the pupil center is fundamental for calculating eye orientation in video-based systems. Existing techniques are error prone and not robust because eyelids, eyelashes, corneal reflections or shadows in many instances occlude the pupil. We have developed a new algorithm which utilizes curvature characteristics of the pupil boundary to eliminate these artifacts. Pupil center is computed based solely on points related to the pupil boundary. For each boundary point, a curvature value is computed. Occlusion of the boundary induces characteristic peaks in the curvature function. Curvature values for normal pupil sizes were determined and a threshold was found which together with heuristics discriminated normal from abnormal curvature. Remaining boundary points were fit with an ellipse using a least squares error criterion. The center of the ellipse is an estimate of the pupil center. This technique is robust and accurately estimates pupil center with less than 40% of the pupil boundary points visible.
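
    The two key steps, curvature-based filtering of the boundary and least-squares ellipse fitting, can be sketched as follows. This is our own minimal rendering of the approach (discrete curvature via finite differences, conic fit via SVD), with a placeholder curvature threshold rather than the empirically determined one.

        import numpy as np

        def curvature(points):
            """Discrete curvature along a closed boundary (N x 2 array)."""
            d1 = np.gradient(points, axis=0)
            d2 = np.gradient(d1, axis=0)
            num = d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]
            den = (d1[:, 0] ** 2 + d1[:, 1] ** 2) ** 1.5 + 1e-12
            return num / den

        def pupil_center(points, curv_thresh=0.5):
            """Drop boundary points with abnormal curvature (occlusion peaks),
            least-squares fit a conic, and return the ellipse center."""
            keep = points[np.abs(curvature(points)) < curv_thresh]
            x, y = keep[:, 0], keep[:, 1]
            A = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
            a, b, c, d, e, f = np.linalg.svd(A)[2][-1]   # smallest singular vector
            den = b**2 - 4 * a * c
            return (2 * c * d - b * e) / den, (2 * a * e - b * d) / den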

  9. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.

  10. Spatiotemporal dynamics of random stimuli account for trial-to-trial variability in perceptual decision making

    PubMed Central

    Park, Hame; Lueckmann, Jan-Matthis; von Kriegstein, Katharina; Bitzer, Sebastian; Kiebel, Stefan J.

    2016-01-01

    Decisions in everyday life are prone to error. Standard models typically assume that errors during perceptual decisions are due to noise. However, it is unclear how noise in the sensory input affects the decision. Here we show that there are experimental tasks for which one can analyse the exact spatio-temporal details of a dynamic sensory noise and better understand variability in human perceptual decisions. Using a new experimental visual tracking task and a novel Bayesian decision making model, we found that the spatio-temporal noise fluctuations in the input of single trials explain a significant part of the observed responses. Our results show that modelling the precise internal representations of human participants helps predict when perceptual decisions go wrong. Furthermore, by modelling precisely the stimuli at the single-trial level, we were able to identify the underlying mechanism of perceptual decision making in more detail than standard models. PMID:26752272

  11. Identification of Patient Safety Risks Associated with Electronic Health Records: A Software Quality Perspective.

    PubMed

    Virginio, Luiz A; Ricarte, Ivan Luiz Marques

    2015-01-01

    Although Electronic Health Records (EHR) can offer benefits to the health care process, there is a growing body of evidence that these systems can also incur risks to patient safety when developed or used improperly. This work is a literature review to identify these risks from a software quality perspective. Therefore, the risks were classified based on the ISO/IEC 25010 software quality model. The risks identified were related mainly to the characteristics of "functional suitability" (i.e., software bugs) and "usability" (i.e., interface prone to user error). This work elucidates the fact that EHR quality problems can adversely affect patient safety, resulting in errors such as incorrect patient identification, incorrect calculation of medication dosages, and lack of access to patient data. Therefore, the risks presented here provide the basis for developers and EHR regulating bodies to pay attention to the quality aspects of these systems that can result in patient harm.

  12. When social anxiety disorder co-exists with risk-prone, approach behavior: Investigating a neglected, meaningful subset of people in the National Comorbidity Survey-Replication

    PubMed Central

    Kashdan, Todd B.; McKnight, Patrick E.; Richey, J. Anthony; Hofmann, Stefan G.

    2009-01-01

    Little is known about people with social anxiety disorder (SAD) who are not behaviorally inhibited. To advance knowledge on phenomenology, functional impairment, and treatment seeking, we investigated whether engaging in risk-prone behaviors accounts for heterogeneous outcomes in people with SAD. Using the National Comorbidity Survey-Replication (NCS-R) dataset, our analyses focused on people with current (N = 679) or lifetime (N = 1143) SAD diagnoses. Using latent class analysis on NCS-R risk-prone behavior items, results supported two SAD classes: (1) a pattern of behavioral inhibition and risk aversion and (2) an atypical pattern of high anger and aggression, and moderate/high sexual impulsivity and substance use problems. An atypical pattern of risk-prone behaviors was associated with greater functional impairment, less education and income, younger age, and particular psychiatric comorbidities. Results could not be subsumed by the severity, type, or number of social fears, or comorbid anxiety or mood disorders. Conclusions about the nature, course, and treatment of SAD may be compromised by not attending to heterogeneity in behavior patterns. PMID:19345933

  13. Smart Collision Avoidance and Hazard Routing Mechanism for Intelligent Transport Network

    NASA Astrophysics Data System (ADS)

    Singh, Gurpreet; Gupta, Pooja; Wahab, Mohd Helmy Abd

    2017-08-01

    The smart vehicular ad-hoc network (VANET) consists of vehicles cooperating for smooth movement and better management of vehicular connectivity across the given network. This research paper proposes a set of solutions for VANETs consisting of automatically driven vehicles, also called autonomous cars. Such vehicular networks are always prone to collisions due to natural or un-natural causes, which must be addressed before the large-scale deployment of autonomous transport systems. The newly designed intelligent transport movement control mechanism is based upon intelligent data propagation along with the vehicle collision and traffic jam prevention schema [8], which may help future designs of smart cities become more robust and less error-prone. The proposed model focuses on designing a new dynamic and robust hazard routing protocol for intelligent vehicular networks that improves overall performance in several respects. It is expected to reduce the overall transmission delay as well as the number of collisions or adversaries across the vehicular network zone.

  14. APOBEC3B upregulation and genomic mutation patterns in serous ovarian carcinoma

    PubMed Central

    Leonard, Brandon; Hart, Steven N.; Burns, Michael B.; Carpenter, Michael A.; Temiz, Nuri A.; Rathore, Anurag; Vogel, Rachel Isaksson; Nikas, Jason B.; Law, Emily K.; Brown, William L.; Li, Ying; Zhang, Yuji; Maurer, Matthew J.; Oberg, Ann L.; Cunningham, Julie M.; Shridhar, Viji; Bell, Debra A.; April, Craig; Bentley, David; Bibikova, Marina; Cheetham, R. Keira; Fan, Jian-Bing; Grocock, Russell; Humphray, Sean; Kingsbury, Zoya; Peden, John; Chien, Jeremy; Swisher, Elizabeth M.; Hartmann, Lynn C.; Kalli, Kimberly R.; Goode, Ellen L.; Sicotte, Hugues; Kaufmann, Scott H.; Harris, Reuben S.

    2013-01-01

    Ovarian cancer is a clinically and molecularly heterogeneous disease. The driving forces behind this variability are unknown. Here we report wide variation in expression of the DNA cytosine deaminase APOBEC3B, with elevated expression in a majority of ovarian cancer cell lines (3 standard deviations above the mean of normal ovarian surface epithelial cells) and high grade primary ovarian cancers. APOBEC3B is active in the nucleus of several ovarian cancer cell lines and elicits a biochemical preference for deamination of cytosines in 5′TC dinucleotides. Importantly, examination of whole-genome sequence from 16 ovarian cancers reveals that APOBEC3B expression correlates with total mutation load as well as elevated levels of transversion mutations. In particular, high APOBEC3B expression correlates with C-to-A and C-to-G transversion mutations within 5′TC dinucleotide motifs in early-stage high grade serous ovarian cancer genomes, suggesting that APOBEC3B-catalyzed genomic uracil lesions are further processed by downstream DNA ‘repair’ enzymes including error-prone translesion polymerases. These data identify a potential role for APOBEC3B in serous ovarian cancer genomic instability. PMID:24154874

  15. On the use of LiF:Mg,Ti thermoluminescence dosemeters in space--a critical review.

    PubMed

    Horowitz, Y S; Satinger, D; Fuks, E; Oster, L; Podpalov, L

    2003-01-01

    The use of LiF:Mg,Ti thermoluminescence dosemeters (TLDs) in space radiation fields is reviewed. It is demonstrated in the context of modified track structure theory and microdosimetric track structure theory that there is no unique correlation between the relative thermoluminescence (TL) efficiency of heavy charged particles, neutrons of all energies and linear energy transfer (LET). Many experimental measurements dating back more than two decades also demonstrate the multivalued, non-universal relationship between relative TL efficiency and LET. It is further demonstrated that the relative intensities of the dosimetric peaks, and especially the high-temperature structure, are dependent on a large number of variables, some controllable, some not. It is concluded that TL techniques employing the concept of LET (e.g. measurement of total dose, the high-temperature ratio (HTR) methods and other combinations of the relative TL efficiency of the various peaks used to estimate average Q or simulate Q-LET relationships) should be regarded as lacking a sound theoretical basis, highly prone to error, and lacking reproducibility/universality owing to the absence of a standardised experimental protocol essential to reliable experimental methodology.

  16. Standard operating procedure for calculating genome-to-genome distances based on high-scoring segment pairs.

    PubMed

    Auch, Alexander F; Klenk, Hans-Peter; Göker, Markus

    2010-01-28

    DNA-DNA hybridization (DDH) is a widely applied wet-lab technique to obtain an estimate of the overall similarity between the genomes of two organisms. Basing the species concept for prokaryotes ultimately on DDH was chosen by microbiologists as a pragmatic approach for deciding about the recognition of novel species, but it also allowed a relatively high degree of standardization compared to other areas of taxonomy. However, DDH is tedious and error-prone and, first and foremost, cannot be used to incrementally establish a comparative database. Recent studies have shown that in-silico methods for the comparison of genome sequences can be used to replace DDH. Considering the ongoing rapid technological progress of sequencing methods, genome-based prokaryote taxonomy is coming into reach. However, calculating distances between genomes depends on multiple choices of software and program settings. We here provide an overview of the modifications that can be applied to distance methods based on high-scoring segment pairs (HSPs) or maximally unique matches (MUMs) and that need to be documented. General recommendations on determining HSPs using BLAST or other algorithms are also provided. As a reference implementation, we introduce the GGDC web server (http://ggdc.gbdp.org).
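
    As a hedged illustration of what an HSP-based distance can look like, the coverage-style formula below divides the total aligned (HSP) length by the combined genome lengths; the GGDC server implements several such formulas with different properties, and the numbers here are invented.

        def hsp_distance(hsp_lengths, genome_len_1, genome_len_2):
            """One simple HSP-based genome distance: 1 minus the fraction of
            the two genomes covered by high-scoring segment pairs."""
            covered = sum(hsp_lengths)
            return 1.0 - 2.0 * covered / (genome_len_1 + genome_len_2)

        print(hsp_distance([120_000, 80_000, 45_000], 4_600_000, 4_400_000))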

  17. Towards an evaluation framework for Laboratory Information Systems.

    PubMed

    Yusof, Maryati M; Arifin, Azila

    Laboratory testing and reporting are error-prone and redundant due to repeated, unnecessary requests and delayed or missed reactions to laboratory reports. Errors that occur may negatively affect the patient treatment process and clinical decision making. Evaluation of laboratory testing and the Laboratory Information System (LIS) may explain the root causes, helping to improve the testing process and enhance LIS support for it. This paper discusses a new evaluation framework for LIS that encompasses the laboratory testing cycle and the socio-technical part of LIS. We reviewed the literature on the discourses, dimensions and evaluation methods of laboratory testing and LIS. A critical appraisal of the Total Testing Process (TTP) and the human, organization, technology-fit (HOT-fit) evaluation frameworks was undertaken in order to identify error incidents, their contributing factors and preventive actions pertinent to the laboratory testing process and LIS. A new evaluation framework for LIS using a comprehensive and socio-technical approach is outlined. A positive relationship between laboratory and clinical staff resulted in a smooth laboratory testing process, reduced errors and increased process efficiency, whilst effective use of LIS streamlined the testing processes. The TTP-LIS framework could serve as an assessment as well as a problem-solving tool for the laboratory testing process and system. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  18. Specificity control for read alignments using an artificial reference genome-guided false discovery rate.

    PubMed

    Giese, Sven H; Zickmann, Franziska; Renard, Bernhard Y

    2014-01-01

    Accurate estimation, comparison and evaluation of read mapping error rates is a crucial step in the processing of next-generation sequencing data, as further analysis steps and interpretation assume the correctness of the mapping results. Current approaches are either focused on sensitivity estimation and thereby disregard specificity or are based on read simulations. Although continuously improving, read simulations are still prone to introduce a bias into the mapping error quantitation and cannot capture all characteristics of an individual dataset. We introduce ARDEN (artificial reference driven estimation of false positives in next-generation sequencing data), a novel benchmark method that estimates error rates of read mappers based on real experimental reads, using an additionally generated artificial reference genome. It allows a dataset-specific computation of error rates and the construction of a receiver operating characteristic curve. Thereby, it can be used for optimization of parameters for read mappers, selection of read mappers for a specific problem or for filtering alignments based on quality estimation. The use of ARDEN is demonstrated in a general read mapper comparison, a parameter optimization for one read mapper and an application example in single-nucleotide polymorphism discovery with a significant reduction in the number of false positive identifications. The ARDEN source code is freely available at http://sourceforge.net/projects/arden/.
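
    The underlying estimation idea resembles target-decoy searching in proteomics: reads that align to the artificial reference are treated as false positives. A deliberately simplified sketch of that idea (ARDEN's actual estimator is more elaborate; the names and counts are ours):

        def estimated_fdr(hits_real, hits_artificial):
            """Naive decoy-style FDR estimate for read alignments."""
            total = hits_real + hits_artificial
            return hits_artificial / total if total else 0.0

        print(estimated_fdr(hits_real=98_000, hits_artificial=2_000))  # 0.02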

  19. [Prospective assessment of medication errors in critically ill patients in a university hospital].

    PubMed

    Salazar L, Nicole; Jirón A, Marcela; Escobar O, Leslie; Tobar, Eduardo; Romero, Carlos

    2011-11-01

    Critically ill patients are especially vulnerable to medication errors (ME) due to their severe clinical situation and the complexities of their management. To determine the frequency and characteristics of ME and identify shortcomings in the processes of medication management in an Intensive Care Unit. During a 3 months period, an observational prospective and randomized study was carried out in the ICU of a university hospital. Every step of patient's medication management (prescription, transcription, dispensation, preparation and administration) was evaluated by an external trained professional. Steps with higher frequency of ME and their therapeutic groups involved were identified. Medications errors were classified according to the National Coordinating Council for Medication Error Reporting and Prevention. In 52 of 124 patients evaluated, 66 ME were found in 194 drugs prescribed. In 34% of prescribed drugs, there was at least 1 ME during its use. Half of ME occurred during medication administration, mainly due to problems in infusion rates and schedule times. Antibacterial drugs had the highest rate of ME. We found a 34% rate of ME per drug prescribed, which is in concordance with international reports. The identification of those steps more prone to ME in the ICU, will allow the implementation of an intervention program to improve the quality and security of medication management.

  20. Proneness to guilt, shame, and pride in children with Autism Spectrum Disorders and neurotypical children.

    PubMed

    Davidson, Denise; Hilvert, Elizabeth; Misiunaite, Ieva; Giordano, Michael

    2018-06-01

    Self-conscious emotions (e.g., guilt, shame, and pride) are complex emotions that require self-reflection and self-evaluation, and are thought to facilitate the maintenance of societal norms and personal standards. Despite the importance of self-conscious emotions, most research has focused on basic emotion processing in children with Autism Spectrum Disorders (ASD). Therefore, in the present study, we used the Test of Self-Conscious Affect for Children (TOSCA-C) to assess proneness to, or propensity to experience, the self-conscious emotions guilt, shame, and pride in children with ASD and neurotypical children. The TOSCA-C is designed to capture a child's natural tendency to experience a given emotion across a range of everyday situations [Tangney, Stuewig, & Mashek, 2007]. We also assessed how individual characteristics contribute to the development of proneness to self-conscious emotions, including theory of mind (ToM) and ASD symptomatology. In comparison to neurotypical children, children with ASD showed less proneness to guilt, although all children showed relatively high levels of proneness to guilt. Greater ToM ability was related to more proneness to guilt and authentic pride in children with ASD. Additionally, we found that children with ASD with more severe symptomatology were more prone to hubristic pride. Our results provide evidence of differences in proneness to self-conscious emotions in children with ASD, as well as highlight important mechanisms contributing to how children with ASD may experience self-conscious emotions. Autism Res 2018,11:883-892. ©2017 International Society for Autism Research, Wiley Periodicals, Inc. This research examined proneness to guilt, shame, and pride in children with Autism Spectrum Disorders (ASD) and neurotypical children. We found that children with ASD showed less proneness to guilt than neurotypical children. Better understanding of theory of mind was related to greater proneness to guilt and pride, but only for children with ASD. These findings are important because these complex emotions are linked with both positive and negative social behaviors towards others and oneself. © 2018 International Society for Autism Research, Wiley Periodicals, Inc.

  1. Mitogen-activated protein kinase phosphatase-1 modulates regional effects of injurious mechanical ventilation in rodent lungs.

    PubMed

    Park, Moo Suk; He, Qianbin; Edwards, Michael G; Sergew, Amen; Riches, David W H; Albert, Richard K; Douglas, Ivor S

    2012-07-01

    Mechanical ventilation induces heterogeneous lung injury by mitogen-activated protein kinase (MAPK) and nuclear factor-κB. Mechanisms regulating regional injury and protective effects of prone positioning are unclear. To determine the key regulators of the lung regional protective effects of prone positioning in rodent lungs exposed to injurious ventilation. Adult rats were ventilated with high (18 ml/kg, positive end-expiratory pressure [PEEP] 0) or low Vt (6 ml/kg; PEEP 3 cm H(2)O; 3 h) in supine or prone position. Dorsal-caudal lung mRNA was analyzed by microarray and MAPK phosphatases (MKP)-1 quantitative polymerase chain reaction. MKP-1(-/-) or wild-type mice were ventilated with very high (24 ml/kg; PEEP 0) or low Vt (6-7 ml/kg; PEEP 3 cm H(2)O). The MKP-1 regulator PG490-88 (MRx-108; 0.75 mg/kg) or phosphate-buffered saline was administered preventilation. Injury was assessed by lung mechanics, bronchioalveolar lavage cell counts, protein content, and lung injury scoring. Immunoblotting for MKP-1, and IκBα and cytokine ELISAs were performed on lung lysates. Prone positioning was protective against injurious ventilation in rats. Expression profiling demonstrated MKP-1 20-fold higher in rats ventilated prone rather than supine and regional reduction in p38 and c-jun N-terminal kinase activation. MKP-1(-/-) mice experienced amplified injury. PG490-88 improved static lung compliance and injury scores, reduced bronchioalveolar lavage cell counts and cytokine levels, and induced MKP-1 and IκBα. Injurious ventilation induces MAPK in an MKP-1-dependent fashion. Prone positioning is protective and induces MKP-1. PG490-88 induced MKP-1 and was protective against high Vt in a nuclear factor-κB-dependent manner. MKP-1 is a potential target for modulating regional effects of injurious ventilation.

  2. Medication errors as malpractice-a qualitative content analysis of 585 medication errors by nurses in Sweden.

    PubMed

    Björkstén, Karin Sparring; Bergqvist, Monica; Andersén-Karlsson, Eva; Benson, Lina; Ulfvarson, Johanna

    2016-08-24

    Many studies address the prevalence of medication errors but few address medication errors serious enough to be regarded as malpractice. Other studies have analyzed the individual and system contributory factors leading to a medication error. Nurses have a key role in medication administration, and there are contradictory reports on nurses' work experience in relation to the risk and type of medication errors. All medication errors where a nurse was held responsible for malpractice (n = 585) during 11 years in Sweden were included. A qualitative content analysis and classification according to the type and the individual and system contributory factors was made. In order to test for possible differences between nurses' work experience and associations within and between the errors and contributory factors, Fisher's exact test was used, and Cohen's kappa (k) was performed to estimate the magnitude and direction of the associations. There were a total of 613 medication errors in the 585 cases, the most common being "Wrong dose" (41 %), "Wrong patient" (13 %) and "Omission of drug" (12 %). In 95 % of the cases, an average of 1.4 individual contributory factors was found; the most common being "Negligence, forgetfulness or lack of attentiveness" (68 %), "Proper protocol not followed" (25 %), "Lack of knowledge" (13 %) and "Practice beyond scope" (12 %). In 78 % of the cases, an average of 1.7 system contributory factors was found; the most common being "Role overload" (36 %), "Unclear communication or orders" (30 %) and "Lack of adequate access to guidelines or unclear organisational routines" (30 %). The errors "Wrong patient due to mix-up of patients" and "Wrong route" and the contributory factors "Lack of knowledge" and "Negligence, forgetfulness or lack of attentiveness" were more common in less experienced nurses. The experienced nurses were more prone to "Practice beyond scope of practice" and to make errors in spite of "Lack of adequate access to guidelines or unclear organisational routines". Medication errors regarded as malpractice in Sweden were of the same character as medication errors worldwide. A complex interplay between individual and system factors often contributed to the errors.

  3. Variability of ischiofemoral space dimensions with changes in hip flexion: an MRI study.

    PubMed

    Johnson, Adam C; Hollman, John H; Howe, Benjamin M; Finnoff, Jonathan T

    2017-01-01

    The primary aim of this study was to determine if ischiofemoral space (IFS) dimensions vary with changes in hip flexion as a result of placing a bolster behind the knees during magnetic resonance imaging (MRI). A secondary aim was to determine if IFS dimensions vary between supine and prone hip neutral positions. The study employed a prospective design. Sports medicine center within a tertiary care institution. Five male and five female adult subjects (age mean = 29.2, range = 23-35; body mass index [BMI] mean = 23.5, range = 19.5-26.6) were recruited to participate in the study. An axial, T1-weighted MRI sequence of the pelvis was obtained of each subject in a supine position with their hips in neutral and flexed positions, and in a prone position with their hips in neutral position. Supine hip flexion was induced by placing a standard, 9-cm-diameter MRI knee bolster under the subject's knees. The order of image acquisition (supine hip neutral, supine hip flexed, prone hip neutral) was randomized. The IFS dimensions were then measured on a separate workstation. The investigator performing the IFS measurements was blinded to the subject position for each image. The main outcome measurements were the IFS dimensions acquired with MRI. The mean IFS dimensions in the prone position were 28.25 mm (SD 5.91 mm, standard error of the mean 1.32 mm). In the supine hip neutral position, the IFS dimensions were 25.1 (SD 5.6) mm. The mean difference between the two positions of 3.15 (3.6) mm was statistically significant (95% CI of the difference = 1.4 to 4.8 mm, t(19) = 3.911, p = .001). The mean IFS dimensions in the hip flexed position were 36.9 (SD 5.7) mm. The mean difference between the two supine positions of 11.8 (4.1) mm was statistically significant (95% CI of the difference = 9.9 to 13.7 mm, t(19) = 12.716, p < .001). Our findings demonstrate that IFS measurements obtained with MRI are dependent upon patient positioning with respect to hip flexion and supine versus prone positions. This finding has implications when evaluating for ischiofemoral impingement, an entity resulting in hip and/or buttock pain secondary to impingement of the quadratus femoris muscle within a pathologically narrowed IFS. One will need to account for patient hip flexion and supine versus prone positioning when evaluating individuals with suspected ischiofemoral impingement.

  4. A Robust and Affordable Table Indexing Approach for Multi-isocenter Dosimetrically Matched Fields.

    PubMed

    Yu, Amy; Fahimian, Benjamin; Million, Lynn; Hsu, Annie

    2017-05-23

    Purpose  Radiotherapy treatment planning of extended volumes typically necessitates the use of multiple field isocenters and abutting dosimetrically matched fields to enable coverage beyond the field size limits. A common example is total lymphoid irradiation (TLI), which is conventionally planned using dosimetric matching of the mantle, para-aortic/spleen, and pelvic fields. Because of the large irradiated volume and system limitations, such as field size and couch extension, a combination of couch shifts and patient sliding must be executed correctly for accurate delivery of the plan. However, shifting patients presents a substantial safety issue and has been shown to be prone to errors ranging from minor deviations to geometrical misses warranting a medical event. To address this complex setup and mitigate the safety issues relating to delivery, a practical technique for couch indexing of TLI treatments has been developed and evaluated through a retrospective analysis of couch positions. Methods The indexing technique is based on modifying a commonly available slide board to enable indexing of the patient position. Modifications include notching to enable coupling with indexing bars, and the addition of a headrest used to fixate the patient's head relative to the slide board. For the clinical setup, a Varian Exact Couch™ (Varian Medical Systems, Inc., Palo Alto, CA) was utilized. Two groups of patients were treated: 20 patients with table indexing and 10 patients without. The standard deviations (SDs) of the couch positions in the longitudinal, lateral, and vertical directions over the entire treatment course were calculated for each patient, and differences between the groups were analyzed with Student's t-test. Results The longitudinal direction showed the largest improvement. In the non-indexed group, the positioning SD ranged from 2.0 to 7.9 cm. With the indexing device, the positioning SD was reduced to a range of 0.4 to 1.3 cm (p < 0.05 at the 95% confidence level). The lateral positioning was slightly improved (p < 0.05 at the 95% confidence level), while no improvement was observed in the vertical direction. Conclusions The conventional matched-field TLI treatment is prone to geometrical setup errors. The feasibility of fully indexed TLI treatments was validated and shown to result in a significant reduction of positioning and shifting errors.
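
    The group comparison in the Results can be illustrated as follows; the per-patient positioning SDs are synthetic values drawn from the reported ranges, not the study's measurements.

```python
# Hedged sketch: Student's t-test on per-patient longitudinal positioning SDs.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
sd_non_indexed = rng.uniform(2.0, 7.9, size=10)  # cm, 10 non-indexed patients
sd_indexed = rng.uniform(0.4, 1.3, size=20)      # cm, 20 indexed patients

t_stat, p_value = ttest_ind(sd_non_indexed, sd_indexed)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```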

  5. Borderline Personality Features and Implicit Shame-Prone Self-Concept in Middle Childhood and Early Adolescence

    ERIC Educational Resources Information Center

    Hawes, David J.; Helyer, Rebekah; Herlianto, Eugene C.; Willing, Jonah

    2013-01-01

    This study tested whether children and adolescents with high levels of borderline personality features (BPF) exhibit the same shame-prone self-concept previously found to characterize adults with borderline personality disorder (Rusch et al., 2007). Self-concept was indexed using the Implicit Association Test, in a community sample of…

  6. In Vivo Myeloperoxidase Imaging and Flow Cytometry Analysis of Intestinal Myeloid Cells.

    PubMed

    Hülsdünker, Jan; Zeiser, Robert

    2016-01-01

    Myeloperoxidase (MPO) imaging is a non-invasive method to detect cells that produce the enzyme MPO, which is most abundant in neutrophils, macrophages, and inflammatory monocytes. While lacking specificity for any of these three cell types, MPO imaging can provide guidance for further flow cytometry-based analysis of tissues where these cell types reside. Isolation of leukocytes from the intestinal tract is an error-prone procedure. Here, we describe a protocol for intestinal leukocyte isolation that works reliably in our hands and allows for flow cytometry-based analysis, in particular of neutrophils.

  7. DAB user's guide

    NASA Technical Reports Server (NTRS)

    Trosin, J.

    1985-01-01

    Use of the Display AButments (DAB) program, which plots PAN AIR geometries, is presented. The DAB program creates hidden-line displays of PAN AIR geometries and labels specified geometry components, such as abutments, networks, and network edges. It is used to alleviate the very time-consuming and error-prone abutment-list checking phase of developing a valid PAN AIR geometry, and therefore represents a valuable tool for debugging complex PAN AIR geometry definitions. DAB is written in FORTRAN 77 and runs on a Digital Equipment Corporation VAX 11/780 under VMS. It utilizes a special color version of the SKETCH hidden-line analysis routine.

  8. [SOS-repair--60 years].

    PubMed

    Zavil'gel'skiĭ, G B

    2013-01-01

    This review integrates 60 years of research on SOS repair and SOS mutagenesis in prokaryotes and eukaryotes, from Jean Weigle's 1953 experiment (mutagenesis of lambda bacteriophage in UV-irradiated bacteria) to the latest achievements in the study of SOS mutagenesis in all living organisms--Eukarya, Archaea and Bacteria. A key role in establishing the biochemical basis of SOS mutagenesis belongs to the finding in 1998-1999 that specific error-prone DNA polymerases (PolV and others) catalyse translesion synthesis on damaged DNA. This review focuses on recent studies addressing new models for SOS-induced mutagenesis in Escherichia coli and Homo sapiens cells.

  9. An empirical comparison of a dynamic software testability metric to static cyclomatic complexity

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.

    1993-01-01

    This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
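
    For reference, the static metric used in this comparison has a simple closed form: for a control-flow graph with E edges, N nodes, and P connected components, the cyclomatic complexity is M = E - N + 2P. The sketch below computes it for a hypothetical one-branch function, not for the B-737 code analyzed in the paper.

```python
# Hedged sketch: McCabe's cyclomatic complexity from a control-flow graph.
def cyclomatic_complexity(edges, num_nodes, num_components=1):
    """M = E - N + 2P (McCabe, 1976)."""
    return len(edges) - num_nodes + 2 * num_components

# Hypothetical control-flow graph of a function with one if/else branch:
# 0: entry -> 1: condition -> {2: then-branch, 3: else-branch} -> 4: exit
cfg_edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4)]
print(cyclomatic_complexity(cfg_edges, num_nodes=5))  # -> 2 (two independent paths)
```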

  10. Automation of the Beyer & Schweiger (1969) method for determining hydraulic conductivity and porosity from grain-size distribution curves

    NASA Astrophysics Data System (ADS)

    Houben, Georg J.; Blümel, Martin

    2017-11-01

    Porosity is a fundamental parameter in hydrogeology. The empirical method of Beyer and Schweiger (1969) allows the calculation of hydraulic conductivity and both the total and effective porosity from granulometric data. However, due to its graphical nature based on type curves, it is tedious to apply and prone to reading errors. In this work, the type curves were digitized and emulated by mathematical functions, which were implemented in a spreadsheet and a Visual Basic program, allowing fast, automated application of the method to any number of samples.
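
    The digitization idea can be sketched briefly: support points read off a published type curve are emulated by an interpolating function that can then be evaluated for any sample. The support points below are invented for illustration and do not reproduce the Beyer and Schweiger curves; Python stands in for the spreadsheet/Visual Basic implementation.

```python
# Hedged sketch: emulate a digitized graphical type curve by interpolation.
import numpy as np
from scipy.interpolate import interp1d

# Hypothetical digitized support points of one type curve:
# x = uniformity coefficient U = d60/d10, y = value read from the chart.
u_points = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
y_points = np.array([1.20, 1.05, 0.85, 0.70, 0.55])

type_curve = interp1d(u_points, y_points, kind="cubic")
print(float(type_curve(7.5)))  # evaluate the emulated curve at U = 7.5
```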

  11. Hybrid artificial intelligence approach based on neural fuzzy inference model and metaheuristic optimization for flood susceptibility modeling in a high-frequency tropical cyclone area using GIS

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Pradhan, Biswajeet; Nampak, Haleh; Bui, Quang-Thanh; Tran, Quynh-An; Nguyen, Quoc-Phi

    2016-09-01

    This paper proposes a new artificial intelligence approach based on a neural fuzzy inference system and metaheuristic optimization for flood susceptibility modeling, named MONF. In the new approach, the neural fuzzy inference system is used to create an initial flood susceptibility model, which is then optimized using two metaheuristic algorithms, the Evolutionary Genetic Algorithm and Particle Swarm Optimization. A high-frequency tropical cyclone area, the Tuong Duong district in Central Vietnam, was used as a case study. First, a GIS database for the study area was constructed. The database, which includes 76 historical flood-inundated areas and ten flood-influencing factors, was used to develop and validate the proposed model. Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the Receiver Operating Characteristic (ROC) curve, and the area under the ROC curve (AUC) were used to assess the model's performance and prediction capability. Experimental results showed that the proposed model performs well on both the training (RMSE = 0.306, MAE = 0.094, AUC = 0.962) and validation datasets (RMSE = 0.362, MAE = 0.130, AUC = 0.911). The usability of the proposed model was evaluated by comparison with state-of-the-art benchmark soft computing techniques such as the J48 Decision Tree, Random Forest, Multi-layer Perceptron Neural Network, Support Vector Machine, and Adaptive Neuro-Fuzzy Inference System. The results show that the proposed MONF model outperforms these benchmark models; we conclude that the MONF model is a new alternative tool for flood susceptibility mapping. The results of this study are useful for planners and decision makers for the sustainable management of flood-prone areas.
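
    The three performance metrics reported above are standard and straightforward to reproduce; the sketch below computes them with scikit-learn on synthetic placeholder predictions, not the study's data.

```python
# Hedged sketch: RMSE, MAE, and ROC AUC for a binary susceptibility model.
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, roc_auc_score

y_true = np.array([0, 0, 1, 1, 0, 1, 1, 0])                  # flood / non-flood labels
y_prob = np.array([0.1, 0.3, 0.8, 0.7, 0.4, 0.9, 0.6, 0.2])  # model outputs

rmse = np.sqrt(mean_squared_error(y_true, y_prob))
mae = mean_absolute_error(y_true, y_prob)
auc = roc_auc_score(y_true, y_prob)
print(f"RMSE = {rmse:.3f}, MAE = {mae:.3f}, AUC = {auc:.3f}")
```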

  12. Risk factors of diarrhoea among flood victims: a controlled epidemiological study.

    PubMed

    Mondal, N C; Biswas, R; Manna, A

    2001-01-01

    The concept and practice of 'disaster preparedness and response', instead of traditional casualty relief, is relatively new. Vulnerability analysis and health risk assessment of disaster-prone communities are important prerequisites of meaningful preparedness and effective response to any calamity. In this community-based study, the risk of diarrhoeal disease and its related epidemiological factors were analysed by collecting data from two selected flood-prone blocks of Midnapur district of West Bengal. The information was compared with that of another population living in two non-flood-prone blocks of the same district. The study showed that diarrhoeal disease was the commonest morbidity in the flood-prone population. Some behaviours, such as use of pond water for washing utensils and kitchen purposes, hand washing after defecation without soap, improper hand washing before eating, open-field defecation, and storage of drinking water in wide-mouthed vessels, were found to be associated with a high attack rate of diarrhoea in both the study and control populations during the flood season compared with the pre-flood season. Attack rates were also significantly higher in the flood-prone population than in the population of the non-flood-prone area during the same season. The necessity of community education on proper water-use behaviour and personal hygiene, along with ensuring safe water and sanitation facilities for flood-affected communities, was emphasized.

  13. Efficacy of prone position in acute respiratory distress syndrome patients: A pathophysiology-based review

    PubMed Central

    Koulouras, Vasilios; Papathanakos, Georgios; Papathanasiou, Athanasios; Nakos, Georgios

    2016-01-01

    Acute respiratory distress syndrome (ARDS) is a syndrome with heterogeneous underlying pathological processes. It represents a common clinical problem in intensive care unit patients and is characterized by high mortality. The mainstay of treatment for ARDS is lung-protective ventilation with low tidal volumes and positive end-expiratory pressure sufficient for alveolar recruitment. Prone positioning is a supplementary strategy available for managing patients with ARDS. It was first described 40 years ago and aligns with two major ARDS pathophysiological lung models: the “sponge lung” model and the “shape matching” model. Current evidence strongly supports that prone positioning has beneficial effects on gas exchange, respiratory mechanics, lung protection and hemodynamics, as it redistributes transpulmonary pressure, stress and strain throughout the lung and unloads the right ventricle. The factors that individually influence the time course of alveolar recruitment and the improvement in oxygenation during prone positioning have not been well characterized. Although patients’ response to prone positioning is quite variable and hard to predict, large randomized trials and recent meta-analyses show that prone positioning in conjunction with a lung-protective strategy, when performed early and for a sufficient duration, may improve survival in patients with ARDS. This pathophysiology-based review and recent clinical evidence strongly support the systematic use of prone positioning in the early management of severe ARDS, not as a rescue maneuver or a last-ditch effort. PMID:27152255

  14. Face emotion recognition is related to individual differences in psychosis-proneness.

    PubMed

    Germine, L T; Hooker, C I

    2011-05-01

    Deficits in face emotion recognition (FER) in schizophrenia are well documented, and have been proposed as a potential intermediate phenotype for schizophrenia liability. However, research on the relationship between psychosis vulnerability and FER has mixed findings and methodological limitations. Moreover, no study has yet characterized the relationship between FER ability and level of psychosis-proneness. If FER ability varies continuously with psychosis-proneness, this suggests a relationship between FER and polygenic risk factors. We tested two large internet samples to see whether psychometric psychosis-proneness, as measured by the Schizotypal Personality Questionnaire-Brief (SPQ-B), is related to differences in face emotion identification and discrimination or other face processing abilities. Experiment 1 (n=2332) showed that psychosis-proneness predicts face emotion identification ability but not face gender identification ability. Experiment 2 (n=1514) demonstrated that psychosis-proneness also predicts performance on face emotion but not face identity discrimination. The tasks in Experiment 2 used identical stimuli and task parameters, differing only in emotion/identity judgment. Notably, the relationships demonstrated in Experiments 1 and 2 persisted even when individuals with the highest psychosis-proneness levels (the putative high-risk group) were excluded from analysis. Our data suggest that FER ability is related to individual differences in psychosis-like characteristics in the normal population, and that these differences cannot be accounted for by differences in face processing and/or visual perception. Our results suggest that FER may provide a useful candidate intermediate phenotype.

  15. Fusion of 4D echocardiography and cine cardiac magnetic resonance volumes using a salient spatio-temporal analysis

    NASA Astrophysics Data System (ADS)

    Atehortúa, Angélica; Garreau, Mireille; Romero, Eduardo

    2017-11-01

    Accurate left ventricular (LV) and right ventricular (RV) function quantification is important to support the evaluation, diagnosis and prognosis of cardiac pathologies such as the cardiomyopathies. Currently, diagnosis by ultrasound is the most cost-effective examination. However, this modality is highly noisy and operator-dependent, and hence prone to errors. Fusion with other cardiac modalities may therefore provide complementary information and improve the analysis of specific pathologies such as the cardiomyopathies. This paper proposes an automatic registration between two complementary modalities, 4D echocardiography and magnetic resonance imaging (MRI) volumes, by mapping both modalities to a common saliency space in which an optimal registration between them is estimated. The obtained transformation matrix is then applied to the MRI volume, which is superimposed onto the 4D echocardiography. Manually selected landmarks in both modalities are used to evaluate the precision of the superimposition. Preliminary results in three evaluation cases show that the distance between these marked points and those estimated with the transformation is about 2 mm.
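
    The evaluation step lends itself to a short sketch: the estimated transformation is applied to landmarks in one modality, and the residual distance to the matching landmarks in the other is reported. The 4x4 matrix and coordinates below are hypothetical, not the registration output of the paper.

```python
# Hedged sketch: apply a homogeneous transform and measure landmark error.
import numpy as np

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (T @ homogeneous.T).T[:, :3]

T = np.eye(4)
T[:3, 3] = [1.5, -0.8, 0.3]  # hypothetical translation component (mm)

mri_marks = np.array([[10.0, 20.0, 30.0], [12.0, 18.0, 33.0]])
echo_marks = np.array([[11.6, 19.1, 30.4], [13.4, 17.3, 33.2]])

errors = np.linalg.norm(apply_transform(T, mri_marks) - echo_marks, axis=1)
print(f"mean landmark distance = {errors.mean():.2f} mm")
```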

  16. Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations.

    PubMed

    Martínez-Romero, Marcos; O'Connor, Martin J; Shankar, Ravi D; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L; Gevaert, Olivier; Graybeal, John; Musen, Mark A

    2017-01-01

    In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository.
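
    The core of a value recommendation framework of this kind can be sketched as conditional frequency ranking: suggest the values most often entered for a field among previously entered records that match the user's current context. The field names and records below are hypothetical illustrations, not the authors' actual tool or API.

```python
# Hedged sketch: rank candidate metadata values by conditional frequency.
from collections import Counter

previous_records = [
    {"organism": "Homo sapiens", "tissue": "liver"},
    {"organism": "Homo sapiens", "tissue": "liver"},
    {"organism": "Homo sapiens", "tissue": "brain"},
    {"organism": "Mus musculus", "tissue": "liver"},
]

def recommend(field, context, records, top_n=3):
    """Rank values of `field` among records matching every (key, value) in `context`."""
    matching = [r[field] for r in records
                if field in r and all(r.get(k) == v for k, v in context.items())]
    return Counter(matching).most_common(top_n)

# Suggest tissue values given the organism the user has already entered:
print(recommend("tissue", {"organism": "Homo sapiens"}, previous_records))
# -> [('liver', 2), ('brain', 1)]
```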

  17. Learning from Accident and Error: Avoiding the Hazards of Workload, Stress, and Routine Interruptions in the Emergency Department

    PubMed Central

    Morrison, J. Bradley; Rudolph, Jenny W.

    2012-01-01

    This article presents a model of how a build-up of interruptions can shift the dynamics of the emergency department (ED) from an adaptive, self-regulating system into a fragile, crisis-prone one. Drawing on case studies of organizational disasters and insights from the theory of high-reliability organizations, the authors use computer simulations to show how the accumulation of small interruptions could have disproportionately large effects in the ED. In the face of a mounting workload created by interruptions, EDs, like other organizational systems, have tipping points, thresholds beyond which a vicious cycle can lead rather quickly to the collapse of normal operating routines and in the extreme to a crisis of organizational paralysis. The authors discuss some possible implications for emergency medicine, emphasizing the potential threat from routine, non-novel demands on EDs and raising the concern that EDs are operating closer to the precipitous edge of crisis as ED crowding exacerbates the problem. PMID:22168187
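
    The tipping-point dynamic the authors describe can be caricatured in a few lines of simulation: backlog grows with interruptions, and once it crosses a threshold, effective capacity erodes, so a small increase in the interruption rate flips the system from self-regulating to runaway. All parameters below are illustrative, not taken from the authors' simulation model.

```python
# Hedged toy model: workload backlog with a capacity-eroding threshold.
def simulate(interruption_rate, steps=200, base_capacity=10.5, threshold=15.0):
    backlog = 0.0
    for _ in range(steps):
        arrivals = 10.0 + interruption_rate  # routine demand plus interruptions
        # Past the tipping point, coping costs reduce effective capacity.
        capacity = base_capacity if backlog <= threshold else base_capacity * 0.8
        backlog = max(0.0, backlog + arrivals - capacity)
    return backlog

for rate in (0.0, 0.4, 0.6):
    # Rates 0.0 and 0.4 stay absorbed; 0.6 crosses the threshold and runs away.
    print(f"interruption rate {rate}: final backlog = {simulate(rate):.1f}")
```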

  18. Evolving artificial metalloenzymes via random mutagenesis

    NASA Astrophysics Data System (ADS)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.
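
    The random mutagenesis step named above can be caricatured computationally: error-prone PCR is approximated by substituting each base of a template with a small per-base probability to generate a variant library. The sequence and error rate below are illustrative only, not the authors' protocol.

```python
# Hedged toy sketch: generate an error-prone PCR-style variant library.
import random

BASES = "ACGT"

def error_prone_copy(template, error_rate, rng):
    """Copy a DNA sequence, substituting each base with probability error_rate."""
    return "".join(
        rng.choice([b for b in BASES if b != base]) if rng.random() < error_rate else base
        for base in template
    )

rng = random.Random(42)
template = "ATGGCTAAGGTTCCAGACTTA"
library = [error_prone_copy(template, error_rate=0.05, rng=rng) for _ in range(5)]
for variant in library:
    print(variant)
```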

  19. Breaks in the 45S rDNA Lead to Recombination-Mediated Loss of Repeats.

    PubMed

    Warmerdam, Daniël O; van den Berg, Jeroen; Medema, René H

    2016-03-22

    rDNA repeats constitute the most heavily transcribed region in the human genome. Tumors frequently display elevated levels of recombination in rDNA, indicating that the repeats are a liability to the genomic integrity of a cell. However, little is known about how cells deal with DNA double-stranded breaks in rDNA. Using selective endonucleases, we show that human cells are highly sensitive to breaks in 45S but not the 5S rDNA repeats. We find that homologous recombination inhibits repair of breaks in 45S rDNA, and this results in repeat loss. We identify the structural maintenance of chromosomes protein 5 (SMC5) as contributing to recombination-mediated repair of rDNA breaks. Together, our data demonstrate that SMC5-mediated recombination can lead to error-prone repair of 45S rDNA repeats, resulting in their loss and thereby reducing cellular viability. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
