Sample records for error-prone process

  1. Mapping intended spinal site of care from the upright to prone position: an interexaminer reliability study.

    PubMed

    Cooperstein, Robert; Young, Morgan

    2014-01-01

    Upright examination procedures like radiology, thermography, manual muscle testing, and spinal motion palpation may lead to spinal interventions with the patient prone. The reliability and accuracy of mapping upright examination findings to the prone position is unknown. This study had 2 primary goals: (1) investigate how erroneous spine-scapular landmark associations may lead to errors in treating and charting spine levels; and (2) study the interexaminer reliability of a novel method for mapping upright spinal sites to the prone position. Experiment 1 was a thought experiment exploring the consequences of depending on the erroneous landmark association of the inferior scapular tip with the T7 spinous process upright and T6 spinous process prone (relatively recent studies suggest these levels are T8 and T9, respectively). This allowed deduction of targeting and charting errors. In experiment 2, 10 examiners (2 experienced, 8 novice) used an index finger to maintain contact with a mid-thoracic spinous process as each of 2 participants slowly moved from the upright to the prone position. Interexaminer reliability was assessed by computing the intraclass correlation coefficient, standard error of the mean, root mean squared error, and the absolute value of each examiner's mean difference from the 10-examiner mean for each of the 2 participants. The thought experiment suggested that using the (inaccurate) scapular tip landmark rule would result in a 3-level targeting and charting error when radiological findings are mapped to the prone position. Physical upright exam procedures like motion palpation would result in a 2-level targeting error for intervention and a 3-level error for charting. The reliability experiment showed examiners accurately maintained contact with the same thoracic spinous process as the participant went from upright to prone, ICC(2,1) = 0.83. As manual therapists, the authors have emphasized how targeting errors may impact manual care of the spine. Practitioners in other fields that need to accurately locate spinal levels, such as acupuncture and anesthesiology, would also be expected to draw important conclusions from these findings.
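
    A minimal sketch of how ICC(2,1) can be computed from an examiners-by-participants rating matrix, using the Shrout and Fleiss two-way random-effects, absolute-agreement, single-rater form; the function name and data layout are illustrative, not the study's code:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater
    (Shrout & Fleiss, 1979). `ratings` has shape (n_subjects, k_raters)."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()    # between raters
    ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```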

  2. Mapping intended spinal site of care from the upright to prone position: an interexaminer reliability study

    PubMed Central

    2014-01-01

    Background Upright examination procedures like radiology, thermography, manual muscle testing, and spinal motion palpation may lead to spinal interventions with the patient prone. The reliability and accuracy of mapping upright examination findings to the prone position is unknown. This study had 2 primary goals: (1) investigate how erroneous spine-scapular landmark associations may lead to errors in treating and charting spine levels; and (2) study the interexaminer reliability of a novel method for mapping upright spinal sites to the prone position. Methods Experiment 1 was a thought experiment exploring the consequences of depending on the erroneous landmark association of the inferior scapular tip with the T7 spinous process upright and T6 spinous process prone (relatively recent studies suggest these levels are T8 and T9, respectively). This allowed deduction of targeting and charting errors. In experiment 2, 10 examiners (2 experienced, 8 novice) used an index finger to maintain contact with a mid-thoracic spinous process as each of 2 participants slowly moved from the upright to the prone position. Interexaminer reliability was assessed by computing the intraclass correlation coefficient, standard error of the mean, root mean squared error, and the absolute value of each examiner's mean difference from the 10-examiner mean for each of the 2 participants. Results The thought experiment suggested that using the (inaccurate) scapular tip landmark rule would result in a 3-level targeting and charting error when radiological findings are mapped to the prone position. Physical upright exam procedures like motion palpation would result in a 2-level targeting error for intervention and a 3-level error for charting. The reliability experiment showed examiners accurately maintained contact with the same thoracic spinous process as the participant went from upright to prone, ICC(2,1) = 0.83. Conclusions As manual therapists, the authors have emphasized how targeting errors may impact manual care of the spine. Practitioners in other fields that need to accurately locate spinal levels, such as acupuncture and anesthesiology, would also be expected to draw important conclusions from these findings. PMID:24904747

  3. Validation, Edits, and Application Processing Phase II and Error-Prone Model Report.

    ERIC Educational Resources Information Center

    Gray, Susan; And Others

    The impact of quality assurance procedures on the correct award of Basic Educational Opportunity Grants (BEOGs) for 1979-1980 was assessed, and a model for detecting error-prone applications early in processing was developed. The Bureau of Student Financial Aid introduced new comments into the edit system in 1979 and expanded the pre-established…

  4. Update: Validation, Edits, and Application Processing. Phase II and Error-Prone Model Report.

    ERIC Educational Resources Information Center

    Gray, Susan; And Others

    An update to the Validation, Edits, and Application Processing and Error-Prone Model Report (Section 1, July 3, 1980) is presented. The objective is to present the most current data obtained from the June 1980 Basic Educational Opportunity Grant applicant and recipient files and to determine whether the findings reported in Section 1 of the July…

  5. Regulation of error-prone translesion synthesis by Spartan/C1orf124

    PubMed Central

    Kim, Myoung Shin; Machida, Yuka; Vashisht, Ajay A.; Wohlschlegel, James A.; Pang, Yuan-Ping; Machida, Yuichi J.

    2013-01-01

    Translesion synthesis (TLS) employs low fidelity polymerases to replicate past damaged DNA in a potentially error-prone process. Regulatory mechanisms that prevent TLS-associated mutagenesis are unknown; however, our recent studies suggest that the PCNA-binding protein Spartan plays a role in suppression of damage-induced mutagenesis. Here, we show that Spartan negatively regulates error-prone TLS that is dependent on POLD3, the accessory subunit of the replicative DNA polymerase Pol δ. We demonstrate that the putative zinc metalloprotease domain SprT in Spartan directly interacts with POLD3 and contributes to suppression of damage-induced mutagenesis. Depletion of Spartan induces complex formation of POLD3 with Rev1 and the error-prone TLS polymerase Pol ζ, and elevates mutagenesis that relies on POLD3, Rev1 and Pol ζ. These results suggest that Spartan negatively regulates POLD3 function in Rev1/Pol ζ-dependent TLS, revealing a previously unrecognized regulatory step in error-prone TLS. PMID:23254330

  6. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  7. Adaptive Constructive Processes and the Future of Memory

    ERIC Educational Resources Information Center

    Schacter, Daniel L.

    2012-01-01

    Memory serves critical functions in everyday life but is also prone to error. This article examines adaptive constructive processes, which play a functional role in memory and cognition but can also produce distortions, errors, and illusions. The article describes several types of memory errors that are produced by adaptive constructive processes…

  8. Development of a Dependency Theory Toolbox for Database Design.

    DTIC Science & Technology

    1987-12-01

    …to design and study relational databases exists in the form of published algorithms and theorems. However, hand simulating these algorithms can be a tedious and error-prone chore. Therefore, a toolbox of algorithms and…

  9. Designing an algorithm to preserve privacy for medical record linkage with error-prone data.

    PubMed

    Pal, Doyel; Chen, Tingting; Zhong, Sheng; Khethavath, Praveen

    2014-01-20

    Linking medical records across different medical service providers is important to the enhancement of health care quality and public health surveillance. In record linkage, protecting the patients' privacy is a primary requirement. In real-world health care databases, records may well contain errors due to various reasons such as typos. Linking error-prone data while preserving data privacy at the same time is very difficult. Existing privacy-preserving solutions for this problem are restricted to textual data. To enable different medical service providers to link their error-prone data in a private way, our aim was to provide a holistic solution by designing and developing a medical record linkage system for medical service providers. To initiate a record linkage, one provider selects one of its collaborators in the Connection Management Module, chooses some attributes of the database to be matched, and establishes the connection with the collaborator after the negotiation. In the Data Matching Module, for error-free data, our solution offered two different choices of cryptographic schemes. For error-prone numerical data, we proposed a newly designed privacy-preserving linking algorithm, the Error-Tolerant Linking Algorithm, which allows error-prone data to be correctly matched if the distance between two records is below a threshold. We designed and developed a comprehensive and user-friendly software system that provides privacy-preserving record linkage functions for medical service providers and meets the regulations of the Health Insurance Portability and Accountability Act. It does not require a third party, and it is secure in that neither entity can learn the records in the other's database. Moreover, our novel Error-Tolerant Linking Algorithm implemented in this software works well with error-prone numerical data. We theoretically proved the correctness and security of our Error-Tolerant Linking Algorithm. We have also fully implemented the software. The experimental results showed that it is reliable and efficient. The design of our software is open so that existing textual matching methods can be easily integrated into the system. Designing algorithms that enable medical record linkage for error-prone numerical data while protecting data privacy is difficult. Our proposed solution does not need a trusted third party and is secure in that, in the linking process, neither entity can learn the records in the other's database.
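
    The core matching rule above (declare a match when the distance between two numeric records falls below a threshold) can be sketched as follows. This shows only the error-tolerant comparison; the cryptographic layer that keeps the comparison private is omitted, and all names are illustrative:

```python
import numpy as np

def error_tolerant_match(rec_a, rec_b, threshold):
    """Match two numeric attribute vectors when their Euclidean distance is
    below a threshold -- the error-tolerant linkage idea, without the
    privacy-preserving protocol described in the paper."""
    a, b = np.asarray(rec_a, float), np.asarray(rec_b, float)
    return float(np.linalg.norm(a - b)) < threshold

# e.g. two records of (systolic BP, weight in kg) differing by a small typo
print(error_tolerant_match([120.0, 70.5], [120.0, 70.8], threshold=1.0))  # True
```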

  10. Designing an Algorithm to Preserve Privacy for Medical Record Linkage With Error-Prone Data

    PubMed Central

    Pal, Doyel; Chen, Tingting; Khethavath, Praveen

    2014-01-01

    Background Linking medical records across different medical service providers is important to the enhancement of health care quality and public health surveillance. In record linkage, protecting the patients' privacy is a primary requirement. In real-world health care databases, records may well contain errors due to various reasons such as typos. Linking error-prone data while preserving data privacy at the same time is very difficult. Existing privacy-preserving solutions for this problem are restricted to textual data. Objective To enable different medical service providers to link their error-prone data in a private way, our aim was to provide a holistic solution by designing and developing a medical record linkage system for medical service providers. Methods To initiate a record linkage, one provider selects one of its collaborators in the Connection Management Module, chooses some attributes of the database to be matched, and establishes the connection with the collaborator after the negotiation. In the Data Matching Module, for error-free data, our solution offered two different choices of cryptographic schemes. For error-prone numerical data, we proposed a newly designed privacy-preserving linking algorithm, the Error-Tolerant Linking Algorithm, which allows error-prone data to be correctly matched if the distance between two records is below a threshold. Results We designed and developed a comprehensive and user-friendly software system that provides privacy-preserving record linkage functions for medical service providers and meets the regulations of the Health Insurance Portability and Accountability Act. It does not require a third party, and it is secure in that neither entity can learn the records in the other's database. Moreover, our novel Error-Tolerant Linking Algorithm implemented in this software works well with error-prone numerical data. We theoretically proved the correctness and security of our Error-Tolerant Linking Algorithm. We have also fully implemented the software. The experimental results showed that it is reliable and efficient. The design of our software is open so that existing textual matching methods can be easily integrated into the system. Conclusions Designing algorithms that enable medical record linkage for error-prone numerical data while protecting data privacy is difficult. Our proposed solution does not need a trusted third party and is secure in that, in the linking process, neither entity can learn the records in the other's database. PMID:25600786

  11. Proximal antecedents and correlates of adopted error approach: a self-regulatory perspective.

    PubMed

    Van Dyck, Cathy; Van Hooft, Edwin; De Gilder, Dick; Liesveld, Lillian

    2010-01-01

    The current study aims to further investigate earlier-established advantages of an error mastery approach over an error aversion approach. The two main purposes of the study relate to (1) self-regulatory traits (i.e., goal orientation and action-state orientation) that may predict which error approach (mastery or aversion) is adopted, and (2) proximal psychological processes (i.e., self-focused attention and failure attribution) that relate to the adopted error approach. In the current study, participants' goal orientation and action-state orientation were assessed, after which they worked on an error-prone task. Results show that learning goal orientation related to error mastery, while state orientation related to error aversion. Under a mastery approach, error occurrence did not result in cognitive resources "wasted" on self-consciousness. Rather, attention went to internal-unstable, thus controllable, improvement-oriented causes of error. Participants who had adopted an aversion approach, in contrast, experienced heightened self-consciousness and attributed failure to internal-stable or external causes. These results imply that when working on an error-prone task, people should be encouraged to adopt a mastery rather than an aversion approach towards errors.

  12. Lung Basal Stem Cells Rapidly Repair DNA Damage Using the Error-Prone Nonhomologous End-Joining Pathway

    PubMed Central

    Weeden, Clare E.; Chen, Yunshun; Ma, Stephen B.; Hu, Yifang; Ramm, Georg; Sutherland, Kate D.; Smyth, Gordon K.

    2017-01-01

    Lung squamous cell carcinoma (SqCC), the second most common subtype of lung cancer, is strongly associated with tobacco smoking and exhibits genomic instability. The cellular origins and molecular processes that contribute to SqCC formation are largely unexplored. Here we show that human basal stem cells (BSCs) isolated from heavy smokers proliferate extensively, whereas their alveolar progenitor cell counterparts have limited colony-forming capacity. We demonstrate that this difference arises in part because of the ability of BSCs to repair their DNA more efficiently than alveolar cells following ionizing radiation or chemical-induced DNA damage. Analysis of mice harbouring a mutation in the DNA-dependent protein kinase catalytic subunit (DNA-PKcs), a key enzyme in DNA damage repair by nonhomologous end joining (NHEJ), indicated that BSCs preferentially repair their DNA by this error-prone process. Interestingly, polyploidy, a phenomenon associated with genetically unstable cells, was only observed in the human BSC subset. Expression signature analysis indicated that BSCs are the likely cells of origin of human SqCC and that high levels of NHEJ genes in SqCC are correlated with increasing genomic instability. Hence, our results favour a model in which heavy smoking promotes proliferation of BSCs, and their predilection for error-prone NHEJ could lead to the high mutagenic burden that culminates in SqCC. Targeting DNA repair processes may therefore have a role in the prevention and therapy of SqCC. PMID:28125611

  13. DNA double strand break repair in human bladder cancer is error prone and involves microhomology-associated end-joining

    PubMed Central

    Bentley, Johanne; Diggle, Christine P.; Harnden, Patricia; Knowles, Margaret A.; Kiltie, Anne E.

    2004-01-01

    In human cells DNA double strand breaks (DSBs) can be repaired by the non-homologous end-joining (NHEJ) pathway. In a background of NHEJ deficiency, DSBs with mismatched ends can be joined by an error-prone mechanism involving joining between regions of nucleotide microhomology. The majority of joins formed from a DSB with partially incompatible 3′ overhangs by cell-free extracts from human glioblastoma (MO59K) and urothelial (NHU) cell lines were accurate and produced by the overlap/fill-in of mismatched termini by NHEJ. However, repair of DSBs by extracts using tissue from four high-grade bladder carcinomas resulted in no accurate join formation. Junctions were formed by the non-random deletion of terminal nucleotides and showed a preference for annealing at a microhomology of 8 nt buried within the DNA substrate; this process was not dependent on functional Ku70, DNA-PK or XRCC4. Junctions were repaired in the same manner in MO59K extracts in which accurate NHEJ was inactivated by inhibition of Ku70 or DNA-PKcs. These data indicate that bladder tumour extracts are unable to perform accurate NHEJ such that error-prone joining predominates. Therefore, in high-grade tumours mismatched DSBs are repaired by a highly mutagenic, microhomology-mediated, alternative end-joining pathway, a process that may contribute to genomic instability observed in bladder cancer. PMID:15466592

  14. Estimation of a cover-type change matrix from error-prone data

    Treesearch

    Steen Magnussen

    2009-01-01

    Coregistration and classification errors seriously compromise per-pixel estimates of land cover change. A more robust estimation of change is proposed in which adjacent pixels are grouped into 3x3 clusters and treated as a unit of observation. A complete change matrix is recovered in a two-step process. The diagonal elements of a change matrix are recovered from...
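
    The clustering step described above, treating 3x3 blocks of pixels as the unit of observation and cross-tabulating their modal classes at two dates, might look as follows; the record's description of the full two-step recovery is truncated, so only this first step is sketched and all names are assumptions:

```python
import numpy as np

def block_modes(raster, s=3):
    """Modal class label of each s-by-s block of a classified raster."""
    h, w = raster.shape
    h, w = h - h % s, w - w % s                     # trim partial blocks
    blocks = raster[:h, :w].reshape(h // s, s, w // s, s).swapaxes(1, 2)
    return np.array([np.bincount(b).argmax()
                     for b in blocks.reshape(-1, s * s)])

def change_matrix(date1, date2, n_classes, s=3):
    """Cross-tabulate cluster-level classes at two dates."""
    m1, m2 = block_modes(date1, s), block_modes(date2, s)
    cm = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(cm, (m1, m2), 1)                      # accumulate transitions
    return cm
```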

  15. Dual processing and diagnostic errors.

    PubMed

    Norman, Geoff

    2009-09-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these examples remain available and retrievable individually. I then review studies of clinical reasoning based on these theories and show that the two processes are equally effective; System 1, despite its reliance on idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions encouraging the clinician to explicitly use both strategies can lead to a consistent reduction in error rates.

  16. EEG and chaos: Description of underlying dynamics and its relation to dissociative states

    NASA Technical Reports Server (NTRS)

    Ray, William J.

    1994-01-01

    The goal of this work is the identification of states especially as related to the process of error production and lapses of awareness as might be experienced during aviation. Given the need for further articulation of the characteristics of an 'error-prone state' or 'hazardous state of awareness,' this NASA grant focused on basic groundwork for the study of the psychophysiology of these states. Specifically, the purpose of this grant was to establish the necessary methodology for addressing three broad questions. The first is how the error-prone state should be conceptualized, and whether it is similar to a dissociative state, a hypnotic state, or absent-mindedness. Over 1200 subjects completed a variety of psychometric measures reflecting internal states and proneness to mental lapses and absent-mindedness; the study suggests that there exists a consistency of patterns displayed by individuals who self-report dissociative experiences, such that those individuals who score high on measures of dissociation also score high on measures of absent-mindedness, errors, and absorption, but not on scales of hypnotizability. The second broad question is whether some individuals are more prone to enter these states than others. A study of 14 young adults who scored either high or low on the Dissociative Experiences Scale performed a series of six tasks. This study suggests that high and low dissociative individuals arrive at the experiment in similar electrocortical states and perform cognitive tasks (e.g., mental math) in a similar manner; it is in the processing of internal emotional states that differences begin to emerge. The third question is whether recent research in nonlinear dynamics, i.e., chaos, offers an addition and/or alternative to traditional signal processing methods, i.e., fast Fourier transforms, and whether chaos procedures can be modified to offer additional information useful in identifying brain states. A preliminary review suggests that current nonlinear dynamical techniques such as dimensional analysis can be successfully applied to electrocortical activity. Using the data set developed in the study of the young adults, chaos analyses using the Farmer algorithm were performed; it is concluded that dimensionality measures reflect information not contained in traditional EEG Fourier analysis.
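
    As a concrete illustration of the "dimensional analysis" contrasted with Fourier methods above, here is a minimal Grassberger-Procaccia correlation-sum sketch (a related standard estimator; the Farmer algorithm named in the record is not detailed there, so this is a stand-in):

```python
import numpy as np

def correlation_sum(x, m=5, tau=2, r=0.5):
    """Fraction of embedded point pairs closer than r. The slope of
    log C(r) versus log r over small r estimates the correlation dimension.
    Pairwise distances are O(N^2), so keep the signal short."""
    x = np.asarray(x, dtype=float)
    N = len(x) - (m - 1) * tau
    emb = np.array([x[i:i + N] for i in range(0, m * tau, tau)]).T  # (N, m)
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(N, k=1)                    # distinct pairs only
    return (d[iu] < r).mean()
```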

  17. A continuous quality improvement project to reduce medication error in the emergency department.

    PubMed

    Lee, Sara Bc; Lee, Larry Ly; Yeung, Richard Sd; Chan, Jimmy Ts

    2013-01-01

    Medication errors are a common source of adverse healthcare incidents, particularly in the emergency department (ED), which has a number of factors that make it prone to medication errors. This project aims to reduce medication errors and improve the health and economic outcomes of clinical care in a Hong Kong ED. In 2009, a task group was formed to identify problems that potentially endanger medication safety and to develop strategies to eliminate them. Responsible officers were assigned to look after seven error-prone areas. Strategies were proposed, discussed, endorsed and promulgated to eliminate the problems identified. Medication incidents fell from 16 before the improvement work to 6 after it. This project successfully established a concrete organizational structure to safeguard error-prone areas of medication safety in a sustainable manner.

  18. Clustered Mutation Signatures Reveal that Error-Prone DNA Repair Targets Mutations to Active Genes.

    PubMed

    Supek, Fran; Lehner, Ben

    2017-07-27

    Many processes can cause the same nucleotide change in a genome, making the identification of the mechanisms causing mutations a difficult challenge. Here, we show that clustered mutations provide a more precise fingerprint of mutagenic processes. Of nine clustered mutation signatures identified from >1,000 tumor genomes, three relate to variable APOBEC activity and three are associated with tobacco smoking. An additional signature matches the spectrum of translesion DNA polymerase eta (POLH). In lymphoid cells, these mutations target promoters, consistent with AID-initiated somatic hypermutation. In solid tumors, however, they are associated with UV exposure and alcohol consumption and target the H3K36me3 chromatin of active genes in a mismatch repair (MMR)-dependent manner. These regions normally have a low mutation rate because error-free MMR also targets H3K36me3 chromatin. Carcinogens and error-prone repair therefore redistribute mutations to the more important regions of the genome, contributing a substantial mutation load in many tumors, including driver mutations. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. STARS Proceedings (3-4 December 1991)

    DTIC Science & Technology

    1991-12-04

    PROJECT PROCESS OBJECTIVES & ASSOCIATED METRICS: Prioritize ECPs: complexity and error-history measures; error-proneness and past histories of trouble with particular modules are very useful measures. Make vs. buy decisions: does the effort offset the gain in quality relative to buy? Effort and quality (or defect rate) histories give helpful indications of how to make this decision.

  20. Defense Mapping Agency (DMA) Raster-to-Vector Analysis

    DTIC Science & Technology

    1984-11-30

    model) to pinpoint critical deficiencies and understand trade-offs between alternative solutions. This may be exemplified by the allocation of human …process, prone to errors (i.e., human operator eye/motor control limitations), and its time-consuming nature (as a function of data density). It should …achieved through the facilities of computer interactive graphics. Each error or anomaly is individually identified by a human operator and corrected

  21. Improving travel information products via robust estimation techniques: final report, March 2009.

    DOT National Transportation Integrated Search

    2009-03-01

    Traffic-monitoring systems, such as those using loop detectors, are prone to coverage gaps arising from sensor noise, processing errors and transmission problems. Such gaps adversely affect the accuracy of Advanced Traveler Information Systems. Th…

  22. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events, identified via the behaviour of IT systems, using Complex Event Processing. Furthermore, we map these events onto transition systems to monitor crucial clinical processes in real time, preventing and detecting erroneous situations.
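
    A toy version of the transition-system idea: replay an event stream against a set of allowed task transitions and flag violations as they arrive. The states and transitions are invented for illustration; the paper's CEP pipeline for deriving the events is not shown:

```python
# Hypothetical clinical process: order -> prepare -> administer -> document.
ALLOWED = {("ordered", "prepared"), ("prepared", "administered"),
           ("administered", "documented")}

def monitor(events, start="ordered"):
    """Yield a warning for every event that is not a legal transition."""
    state = start
    for ev in events:
        if (state, ev) not in ALLOWED:
            yield f"violation: {state} -> {ev}"
        state = ev

print(list(monitor(["prepared", "documented"])))  # flags skipped administration
```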

  23. Effects of Shame and Guilt on Error Reporting Among Obstetric Clinicians.

    PubMed

    Zabari, Mara Lynne; Southern, Nancy L

    2018-04-17

    To understand how the experiences of shame and guilt, coupled with organizational factors, affect error reporting by obstetric clinicians. Descriptive cross-sectional. A sample of 84 obstetric clinicians from three maternity units in Washington State. In this quantitative inquiry, a variant of the Test of Self-Conscious Affect was used to measure proneness to guilt and shame. In addition, we developed questions to assess attitudes regarding concerns about damaging one's reputation if an error was reported and the choice to keep an error to oneself. Both assessments were analyzed separately and then correlated to identify relationships between constructs. Interviews were used to identify organizational factors that affect error reporting. As a group, mean scores indicated that obstetric clinicians would not choose to keep errors to themselves. However, bivariate correlations showed that proneness to shame was positively correlated to concerns about one's reputation if an error was reported, and proneness to guilt was negatively correlated with keeping errors to oneself. Interview data analysis showed that Past Experience with Responses to Errors, Management and Leadership Styles, Professional Hierarchy, and Relationships With Colleagues were influential factors in error reporting. Although obstetric clinicians want to report errors, their decisions to report are influenced by their proneness to guilt and shame and perceptions of the degree to which organizational factors facilitate or create barriers to restore their self-images. Findings underscore the influence of the organizational context on clinicians' decisions to report errors. Copyright © 2018 AWHONN, the Association of Women’s Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.

  24. Simultaneous treatment of unspecified heteroskedastic model error distribution and mismeasured covariates for restricted moment models.

    PubMed

    Garcia, Tanya P; Ma, Yanyuan

    2017-10-01

    We develop consistent and efficient estimation of parameters in general regression models with mismeasured covariates. We assume the model error and covariate distributions are unspecified, and the measurement error distribution is a general parametric distribution with unknown variance-covariance. We construct root-n consistent, asymptotically normal and locally efficient estimators using the semiparametric efficient score. We do not estimate any unknown distribution or model error heteroskedasticity. Instead, we form the estimator under possibly incorrect working distribution models for the model error, error-prone covariate, or both. Empirical results demonstrate robustness to different incorrect working models in homoscedastic and heteroskedastic models with error-prone covariates.
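
    For intuition about why error-prone covariates need special treatment, a small simulation (not the paper's semiparametric estimator) shows the attenuation bias that classical measurement error induces in a naive regression slope, and the textbook method-of-moments correction when the error variance is known:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma_u = 100_000, 2.0, 0.8
x = rng.normal(size=n)                      # true covariate
w = x + rng.normal(scale=sigma_u, size=n)   # error-prone measurement of x
y = beta * x + rng.normal(size=n)           # outcome model

cov = np.cov(w, y)
naive = cov[0, 1] / cov[0, 0]               # OLS slope of y on w
corrected = cov[0, 1] / (cov[0, 0] - sigma_u ** 2)
print(naive)      # ~1.22: attenuated toward zero by the measurement error
print(corrected)  # ~2.00: recovered when the error variance is known
```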

  25. The Influence of Improper Sets of Information on Judgment: How Irrelevant Information Can Bias Judged Probability

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Sprenger, Amber

    2006-01-01

    This article introduces 2 new sources of bias in probability judgment, discrimination failure and inhibition failure, which are conceptualized as arising from an interaction between error-prone memory processes and a support-theory-like comparison process. Both sources of bias stem from the influence of irrelevant information on participants'…

  26. Inducible error-prone repair in B. subtilis. Final report, September 1, 1979-June 30, 1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasbin, R. E.

    1981-06-01

    The research performed under this contract has been concentrated on the relationship between inducible DNA repair systems, mutagenesis and the competent state in the gram-positive bacterium Bacillus subtilis. The following results have been obtained from this research: (1) competent Bacillus subtilis cells have been developed into a sensitive tester system for carcinogens; (2) competent B. subtilis cells have an efficient excision-repair system; however, this system will not function on bacteriophage DNA taken into the cell via the process of transfection; (3) DNA polymerase III is essential in the mechanism of the process of W-reactivation; (4) B. subtilis strains cured of their defective prophages have been isolated and are now being developed for gene cloning systems; (5) protoplasts of B. subtilis have been shown capable of acquiring DNA repair enzymes (i.e., enzyme therapy); and (6) a plasmid was characterized which enhanced inducible error-prone repair in a gram-positive organism.

  27. Gene-targeted Random Mutagenesis to Select Heterochromatin-destabilizing Proteasome Mutants in Fission Yeast.

    PubMed

    Seo, Hogyu David; Lee, Daeyoup

    2018-05-15

    Random mutagenesis of a target gene is commonly used to identify mutations that yield the desired phenotype. Of the methods that may be used to achieve random mutagenesis, error-prone PCR is a convenient and efficient strategy for generating a diverse pool of mutants (i.e., a mutant library). Error-prone PCR is the method of choice when a researcher seeks to mutate a pre-defined region, such as the coding region of a gene, while leaving other genomic regions unaffected. After the mutant library is amplified by error-prone PCR, it must be cloned into a suitable plasmid. The size of the library generated by error-prone PCR is constrained by the efficiency of the cloning step. However, in the fission yeast Schizosaccharomyces pombe, the cloning step can be replaced by the use of a highly efficient one-step fusion PCR to generate constructs for transformation. Mutants of desired phenotypes may then be selected using appropriate reporters. Here, we describe this strategy in detail, taking as an example a reporter inserted at centromeric heterochromatin.
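
    A small simulation of the mutagenesis step may help fix ideas: point substitutions accumulate over error-prone PCR cycles. The per-base rate here is invented (real rates depend on polymerase, Mn2+, and dNTP bias), and this models only substitutions, not indels:

```python
import random

def error_prone_pcr(template, rate=0.005, cycles=20, seed=1):
    """Return a copy of `template` with random substitutions applied at
    `rate` per base per cycle (a hypothetical rate, for illustration)."""
    random.seed(seed)
    seq = list(template)
    for _ in range(cycles):
        for i, base in enumerate(seq):
            if random.random() < rate:
                seq[i] = random.choice([b for b in "ACGT" if b != base])
    return "".join(seq)

mutant = error_prone_pcr("ATGGCTAAGGTCCTG" * 20)   # a toy coding region
```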

  28. MRO DKF Post-Processing Tool

    NASA Technical Reports Server (NTRS)

    Ayap, Shanti; Fisher, Forest; Gladden, Roy; Khanampompan, Teerapat

    2008-01-01

    This software tool saves time and reduces risk by automating two labor-intensive and error-prone post-processing steps required for every DKF [DSN (Deep Space Network) Keyword File] that MRO (Mars Reconnaissance Orbiter) produces, and it is being extended to post-process the corresponding TSOE (Text Sequence Of Events) as well. The need for this post-processing step stems from limitations in the seq-gen modeling, which result in incorrect DKF generation that must then be cleaned up in post-processing.

  29. Adaptive constructive processes and the future of memory.

    PubMed

    Schacter, Daniel L

    2012-11-01

    Memory serves critical functions in everyday life but is also prone to error. This article examines adaptive constructive processes, which play a functional role in memory and cognition but can also produce distortions, errors, and illusions. The article describes several types of memory errors that are produced by adaptive constructive processes and focuses in particular on the process of imagining or simulating events that might occur in one's personal future. Simulating future events relies on many of the same cognitive and neural processes as remembering past events, which may help to explain why imagination and memory can be easily confused. The article considers both pitfalls and adaptive aspects of future event simulation in the context of research on planning, prediction, problem solving, mind-wandering, prospective and retrospective memory, coping and positivity bias, and the interconnected set of brain regions known as the default network. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  30. Meiotic Divisions: No Place for Gender Equality.

    PubMed

    El Yakoubi, Warif; Wassmann, Katja

    2017-01-01

    In multicellular organisms, the fusion of two gametes with a haploid set of chromosomes leads to the formation of the zygote, the first cell of the embryo. Accurate execution of the meiotic cell division to generate a female and a male gamete is required for the generation of healthy offspring harboring the correct number of chromosomes. Unfortunately, meiosis is error prone. This has severe consequences for fertility and, under certain circumstances, the health of the offspring. In humans, female meiosis is extremely error prone. In this chapter we will compare male and female meiosis in humans to illustrate why and at which frequency errors occur, and describe how this affects pregnancy outcome and the health of the individual. We will first introduce key notions of cell division in meiosis and how they differ from mitosis, followed by a detailed description of the events that are prone to errors during the meiotic divisions.

  31. RGB-to-RGBG conversion algorithm with adaptive weighting factors based on edge detection and minimal square error.

    PubMed

    Huang, Chengqiang; Yang, Youchang; Wu, Bo; Yu, Weize

    2018-06-01

    The sub-pixel arrangement of the RGBG panel differs from that of an image in RGB format, so an algorithm that converts RGB to RGBG is needed to display an RGB image on an RGBG panel. However, in published studies of this conversion, information loss remains large even though color fringing artifacts are weakened. In this paper, an RGB-to-RGBG conversion algorithm with adaptive weighting factors based on edge detection and minimal square error (EDMSE) is proposed. The main points of innovation include the following: (1) edge detection is first proposed to distinguish image details with serious color fringing artifacts from image details that are prone to be lost in the process of RGB-RGBG conversion; (2) for image details with serious color fringing artifacts, a weighting factor of 0.5 is applied to weaken the color fringing artifacts; and (3) for image details that are prone to be lost in the process of RGB-RGBG conversion, a special mechanism to minimize square error is proposed. The experiment shows that color fringing artifacts are slightly reduced by EDMSE, and the MSE values of the processed image are 19.6% and 7% smaller than those of images processed by the direct assignment and weighting factor algorithms, respectively. The proposed algorithm is implemented on a field-programmable gate array to enable image display on the RGBG panel.
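
    A toy one-row sketch of the weighting idea: each RGBG pixel keeps two subpixels, and a neighbor's dropped subpixel is folded in with weight 1 - w, with w = 0.5 near detected edges to suppress fringing (following point 2 above). The minimal-square-error branch is omitted, and every name and threshold here is an assumption, not the authors' code:

```python
import numpy as np

def rgb_to_rgbg_row(row, edge_thresh=30):
    """Convert one RGB row (shape (width, 3)) to RGBG subpixel pairs.
    Even pixels keep (R, G), odd pixels keep (B, G); the retained channel
    mixes in the neighbour's dropped channel with weight 1 - w."""
    out = []
    for i in range(0, len(row) - 1, 2):
        p, q = row[i].astype(float), row[i + 1].astype(float)
        w = 0.5 if np.abs(p - q).max() > edge_thresh else 1.0  # edge test
        out.append((w * p[0] + (1 - w) * q[0], p[1]))  # R and G subpixels
        out.append((w * q[2] + (1 - w) * p[2], q[1]))  # B and G subpixels
    return out
```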

  32. Two-Step Fair Scheduling of Continuous Media Streams over Error-Prone Wireless Channels

    NASA Astrophysics Data System (ADS)

    Oh, Soohyun; Lee, Jin Wook; Park, Taejoon; Jo, Tae-Chang

    In wireless cellular networks, streaming of continuous media (with strict QoS requirements) over wireless links is challenging due to their inherent unreliability, characterized by location-dependent, bursty errors. To address this challenge, we present a two-step scheduling algorithm for a base station to provide streaming of continuous media to wireless clients over error-prone wireless links. The proposed algorithm is capable of minimizing the packet loss rate of individual clients in the presence of error bursts by transmitting packets in a round-robin manner and adopting a mechanism for channel prediction and swapping.
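
    The swapping step might be sketched as follows: serve clients in round-robin order, but when the head-of-queue client's channel is predicted to be in an error burst, swap in the next client with a good predicted channel. `predict_good` stands in for the paper's channel predictor; the interface is an assumption:

```python
from collections import deque

def two_step_schedule(clients, predict_good, slots):
    """Round-robin scheduling with channel-aware swapping. Each slot goes
    to the first client in RR order whose channel is predicted good; if
    none is, the head is served anyway (strict RR fallback)."""
    q, sent = deque(clients), []
    for _ in range(slots):
        chosen = next((c for c in q if predict_good(c)), q[0])
        q.remove(chosen)
        q.append(chosen)          # rotate the served client to the back
        sent.append(chosen)
    return sent
```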

  33. Thermoadaptation-Directed Enzyme Evolution in an Error-Prone Thermophile Derived from Geobacillus kaustophilus HTA426

    PubMed Central

    Kobayashi, Jyumpei; Wada, Keisuke; Furukawa, Megumi; Doi, Katsumi

    2014-01-01

    Thermostability is an important property of enzymes utilized for practical applications because it allows long-term storage and use as catalysts. In this study, we constructed an error-prone strain of the thermophile Geobacillus kaustophilus HTA426 and investigated thermoadaptation-directed enzyme evolution using the strain. A mutation frequency assay using the antibiotics rifampin and streptomycin revealed that G. kaustophilus had substantially higher mutability than Escherichia coli and Bacillus subtilis. The predominant mutations in G. kaustophilus were A · T→G · C and C · G→T · A transitions, implying that the high mutability of G. kaustophilus was attributable in part to high-temperature-associated DNA damage during growth. Among the genes that may be involved in DNA repair in G. kaustophilus, deletions of the mutSL, mutY, ung, and mfd genes markedly enhanced mutability. These genes were subsequently deleted to construct an error-prone thermophile that showed much higher (700- to 9,000-fold) mutability than the parent strain. The error-prone strain was auxotrophic for uracil owing to the fact that the strain was deficient in the intrinsic pyrF gene. Although the strain harboring Bacillus subtilis pyrF was also essentially auxotrophic, cells became prototrophic after 2 days of culture under uracil starvation, generating B. subtilis PyrF variants with an enhanced half-denaturation temperature of >10°C. These data suggest that this error-prone strain is a promising host for thermoadaptation-directed evolution to generate thermostable variants from thermolabile enzymes. PMID:25326311

  34. Thermoadaptation-directed enzyme evolution in an error-prone thermophile derived from Geobacillus kaustophilus HTA426.

    PubMed

    Suzuki, Hirokazu; Kobayashi, Jyumpei; Wada, Keisuke; Furukawa, Megumi; Doi, Katsumi

    2015-01-01

    Thermostability is an important property of enzymes utilized for practical applications because it allows long-term storage and use as catalysts. In this study, we constructed an error-prone strain of the thermophile Geobacillus kaustophilus HTA426 and investigated thermoadaptation-directed enzyme evolution using the strain. A mutation frequency assay using the antibiotics rifampin and streptomycin revealed that G. kaustophilus had substantially higher mutability than Escherichia coli and Bacillus subtilis. The predominant mutations in G. kaustophilus were A · T→G · C and C · G→T · A transitions, implying that the high mutability of G. kaustophilus was attributable in part to high-temperature-associated DNA damage during growth. Among the genes that may be involved in DNA repair in G. kaustophilus, deletions of the mutSL, mutY, ung, and mfd genes markedly enhanced mutability. These genes were subsequently deleted to construct an error-prone thermophile that showed much higher (700- to 9,000-fold) mutability than the parent strain. The error-prone strain was auxotrophic for uracil owing to the fact that the strain was deficient in the intrinsic pyrF gene. Although the strain harboring Bacillus subtilis pyrF was also essentially auxotrophic, cells became prototrophic after 2 days of culture under uracil starvation, generating B. subtilis PyrF variants with an enhanced half-denaturation temperature of >10°C. These data suggest that this error-prone strain is a promising host for thermoadaptation-directed evolution to generate thermostable variants from thermolabile enzymes. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  35. Threat engagement, disengagement, and sensitivity bias in worry-prone individuals as measured by an emotional go/no-go task.

    PubMed

    Gole, Markus; Köchel, Angelika; Schäfer, Axel; Schienle, Anne

    2012-03-01

    The goal of the present study was to investigate threat engagement, disengagement, and sensitivity biases in individuals suffering from pathological worry. Twenty participants high in worry proneness and 16 control participants low in worry proneness completed an emotional go/no-go task with worry-related threat words and neutral words. Shorter reaction times (i.e., a threat engagement bias), smaller omission error rates (i.e., a threat sensitivity bias), and larger commission error rates (i.e., a threat disengagement bias) emerged only in the high worry group when worry-related words constituted the go stimuli and neutral words the no-go stimuli. Also, smaller omission error rates as well as larger commission error rates were observed in the high worry group relative to the low worry group when worry-related go stimuli and neutral no-go stimuli were used. The obtained results await replication within a generalized anxiety disorder sample, and future samples should also include men. Our data suggest that worry-prone individuals are threat-sensitive, engage with threat more rapidly, and disengage from it with greater difficulty. Copyright © 2011 Elsevier Ltd. All rights reserved.
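
    For reference, the two error rates such a task yields can be tallied as below; the trial format is made up, not the study's data layout:

```python
def error_rates(trials):
    """Omission and commission error rates from go/no-go trials, where each
    trial is a pair (is_go, responded). Omission = no response on a go
    trial; commission = a response on a no-go trial."""
    go = [resp for is_go, resp in trials if is_go]
    nogo = [resp for is_go, resp in trials if not is_go]
    omission = sum(not r for r in go) / len(go)
    commission = sum(r for r in nogo) / len(nogo)
    return omission, commission

print(error_rates([(True, True), (True, False), (False, False), (False, True)]))
# -> (0.5, 0.5)
```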

  36. Associations between intrusive thoughts, reality discrimination and hallucination-proneness in healthy young adults.

    PubMed

    Smailes, David; Meins, Elizabeth; Fernyhough, Charles

    2015-01-01

    People who experience intrusive thoughts are at increased risk of developing hallucinatory experiences, as are people who have weak reality discrimination skills. No study has yet examined whether these two factors interact to make a person especially prone to hallucinatory experiences. The present study examined this question in a non-clinical sample. Participants were 160 students, who completed a reality discrimination task, as well as self-report measures of cannabis use, negative affect, intrusive thoughts and auditory hallucination-proneness. The possibility of an interaction between reality discrimination performance and level of intrusive thoughts was assessed using multiple regression. The number of reality discrimination errors and level of intrusive thoughts were independent predictors of hallucination-proneness. The reality discrimination errors × intrusive thoughts interaction term was significant, with participants who made many reality discrimination errors and reported high levels of intrusive thoughts being especially prone to hallucinatory experiences. Hallucinatory experiences are more likely to occur in people who report high levels of intrusive thoughts and have weak reality discrimination skills. If applicable to clinical samples, these findings suggest that improving patients' reality discrimination skills and reducing the number of intrusive thoughts they experience may reduce the frequency of hallucinatory experiences.
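
    The interaction test described is a moderated multiple regression; a simulated example (invented data and coefficients, with n = 160 only to mirror the sample size) shows the form of the model:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 160
rd_errors = rng.poisson(3, n).astype(float)   # reality discrimination errors
intrusive = rng.normal(0, 1, n)               # intrusive-thoughts score
hallu = (0.4 * rd_errors + 0.3 * intrusive
         + 0.25 * rd_errors * intrusive + rng.normal(0, 1, n))

# hallucination-proneness ~ errors + intrusions + errors:intrusions
X = np.column_stack([np.ones(n), rd_errors, intrusive, rd_errors * intrusive])
coef, *_ = np.linalg.lstsq(X, hallu, rcond=None)
print(coef)   # the last coefficient recovers the interaction (~0.25)
```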

  37. Population size estimation in Yellowstone wolves with error-prone noninvasive microsatellite genotypes.

    PubMed

    Creel, Scott; Spong, Goran; Sands, Jennifer L; Rotella, Jay; Zeigle, Janet; Joe, Lawrence; Murphy, Kerry M; Smith, Douglas

    2003-07-01

    Determining population sizes can be difficult, but it is essential for conservation. By allowing distinct microsatellite genotypes to be counted, DNA from noninvasive samples (hair, faeces) permits estimation of population size. Problems arise because genotypes from noninvasive samples are error-prone, but genotyping errors can be reduced by multiple polymerase chain reactions (PCR). For faecal genotypes from wolves in Yellowstone National Park, error rates varied substantially among samples, often above the 'worst-case threshold' suggested by simulation. Consequently, a substantial proportion of multilocus genotypes held one or more errors despite multiple PCR. These genotyping errors created several genotypes per individual and caused overestimation (up to 5.5-fold) of population size. We propose a 'matching approach' to eliminate this overestimation bias.
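
    A simplified reading of such a matching approach: count a new noninvasive sample as a new individual only when its multilocus genotype differs from every known individual at more than a small number of loci, so error-bearing copies of one genotype do not inflate the count. The threshold and greedy strategy are assumptions:

```python
def count_individuals(genotypes, max_mismatch=1):
    """Greedy count of distinct multilocus genotypes that tolerates up to
    `max_mismatch` per-locus errors between copies of the same genotype."""
    individuals = []
    for g in genotypes:
        for ref in individuals:
            if sum(a != b for a, b in zip(g, ref)) <= max_mismatch:
                break                  # close enough: same individual
        else:
            individuals.append(g)      # no match: a new individual
    return len(individuals)

# two clean copies plus one copy with a single-locus error -> 1 individual
print(count_individuals([(1, 2, 3, 4), (1, 2, 3, 4), (1, 2, 9, 4)]))
```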

  38. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  39. Comparing errors in ED computer-assisted vs conventional pediatric drug dosing and administration.

    PubMed

    Yamamoto, Loren; Kanemori, Joan

    2010-06-01

    Compared to fixed-dose single-vial drug administration in adults, pediatric drug dosing and administration requires a series of calculations, all of which are potentially error prone. The purpose of this study is to compare error rates and task completion times for common pediatric medication scenarios using computer program assistance vs conventional methods. Two versions of a 4-part paper-based test were developed. Each part consisted of a set of medication administration and/or dosing tasks. Emergency department and pediatric intensive care unit nurse volunteers completed these tasks using both methods (sequence assigned to start with a conventional or a computer-assisted approach). Completion times, errors, and the reason for each error were recorded. Thirty-eight nurses completed the study. Summing the completion of all 4 parts, the mean conventional total time was 1243 seconds vs the mean computer program total time of 879 seconds (P < .001). The conventional manual method had a mean of 1.8 errors vs a mean of 0.7 errors for the computer program (P < .001). Of the 97 total errors, 36 were due to misreading the drug concentration on the label, 34 were due to calculation errors, and 8 were due to misplaced decimals. Of the 36 label interpretation errors, 18 (50%) occurred with digoxin or insulin. Computerized assistance reduced errors and the time required for drug administration calculations. A pattern of errors emerged: reading/interpreting certain drug labels was more error-prone. Optimizing the layout of drug labels could reduce the error rate for error-prone labels. Copyright (c) 2010 Elsevier Inc. All rights reserved.
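
    The chained arithmetic that made these scenarios error-prone is easy to state in code; the drug numbers below are made-up examples for illustration, not clinical guidance:

```python
def weight_based_dose(weight_kg, dose_mg_per_kg, label_mg, label_ml):
    """Return (dose in mg, volume to draw in mL) for a weight-based order,
    given the concentration printed on the vial label."""
    dose_mg = weight_kg * dose_mg_per_kg
    mg_per_ml = label_mg / label_ml          # the step most often misread
    return dose_mg, dose_mg / mg_per_ml

# e.g. a 12 kg child at 0.25 mg/kg from a vial labelled 50 mg / 2 mL
dose, volume = weight_based_dose(12, 0.25, 50, 2)   # -> 3.0 mg, 0.12 mL
```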

  40. A prospective three-step intervention study to prevent medication errors in drug handling in paediatric care.

    PubMed

    Niemann, Dorothee; Bertsche, Astrid; Meyrath, David; Koepf, Ellen D; Traiser, Carolin; Seebald, Katja; Schmitt, Claus P; Hoffmann, Georg F; Haefeli, Walter E; Bertsche, Thilo

    2015-01-01

    To prevent medication errors in drug handling in a paediatric ward. One in five preventable adverse drug events in hospitalised children is caused by medication errors. Errors in drug prescription have been studied frequently, but data regarding drug handling, including drug preparation and administration, are scarce. A three-step intervention study including a monitoring procedure was used to detect and prevent medication errors in drug handling. After approval by the ethics committee, pharmacists monitored drug handling by nurses on an 18-bed paediatric ward in a university hospital prior to and following each intervention step. They also conducted a questionnaire survey aimed at identifying knowledge deficits. Each intervention step targeted different causes of errors. The handout mainly addressed knowledge deficits, the training course addressed errors caused by rule violations and slips, and the reference book addressed knowledge-, memory- and rule-based errors. The number of patients who were subjected to at least one medication error in drug handling decreased from 38/43 (88%) to 25/51 (49%) following the third intervention, and the overall frequency of errors decreased from 527 errors in 581 processes (91%) to 116/441 (26%). The issue of the handout reduced medication errors caused by knowledge deficits regarding, for instance, the correct 'volume of solvent for IV drugs' from 49% to 25%. Paediatric drug handling is prone to errors. A three-step intervention effectively decreased the high frequency of medication errors by addressing the diversity of their causes. Worldwide, nurses are in charge of drug handling, which constitutes an error-prone but often-neglected step in drug therapy. Detection and prevention of errors in daily routine is necessary for safe and effective drug therapy. Our three-step intervention reduced errors and is suitable to be tested in other wards and settings. © 2014 John Wiley & Sons Ltd.

  41. Trust and the Compliance-Reliance Paradigm: The Effects of Risk, Error Bias, and Reliability on Trust and Dependence.

    PubMed

    Chancey, Eric T; Bliss, James P; Yamani, Yusuke; Handley, Holly A H

    2017-05-01

    This study provides a theoretical link between trust and the compliance-reliance paradigm. We propose that for trust mediation to occur, the operator must be presented with a salient choice, and there must be an element of risk for dependence. Research suggests that false alarms and misses affect dependence via two independent processes, hypothesized as trust in signals and trust in nonsignals. These two trust types manifest in categorically different behaviors: compliance and reliance. Eighty-eight participants completed a primary flight task and a secondary signaling system task. Participants evaluated their trust according to the informational bases of trust: performance, process, and purpose. Participants were in a high- or low-risk group. Signaling systems varied by reliability (90%, 60%) within subjects and error bias (false alarm prone, miss prone) between subjects. False-alarm rate affected compliance but not reliance. Miss rate affected reliance but not compliance. Mediation analyses indicated that trust mediated the relationship between false-alarm rate and compliance. Bayesian mediation analyses favored evidence indicating trust did not mediate miss rate and reliance. Conditional indirect effects indicated that factors of trust mediated the relationship between false-alarm rate and compliance (i.e., purpose) and reliance (i.e., process) but only in the high-risk group. The compliance-reliance paradigm is not the reflection of two types of trust. This research could be used to update training and design recommendations that are based upon the assumption that trust causes operator responses regardless of error bias.

  42. Absence of Mutagenic Activity of Hycanthone in Serratia marcescens.

    DTIC Science & Technology

    1986-05-29

    …repair system but is enhanced by the plasmid pKM101, which mediates the inducible error-prone repair system. Hycanthone, like proflavin, intercalates between the stacked bases… Roth (1974) have suggested that proflavin, which has a planar triple-ring structure similar to hycanthone, interacts with DNA, which upon replication…

  43. Pre-Modeling Ensures Accurate Solid Models

    ERIC Educational Resources Information Center

    Gow, George

    2010-01-01

    Successful solid modeling requires a well-organized design tree. The design tree is a list of all the object's features and the sequential order in which they are modeled. The solid-modeling process is faster and less prone to modeling errors when the design tree is a simple and geometrically logical definition of the modeled object. Few high…

  44. Financial and clinical governance implications of clinical coding accuracy in neurosurgery: a multidisciplinary audit.

    PubMed

    Haliasos, N; Rezajooi, K; O'Neill, K S; Van Dellen, J; Hudovsky, Anita; Nouraei, Sar

    2010-04-01

    Clinical coding is the translation of documented clinical activities during an admission into a codified language. Healthcare Resource Groupings (HRGs) are derived from coding data and are used to calculate payment to hospitals in England, Wales and Scotland and to conduct national audit and benchmarking exercises. Coding is an error-prone process, and an understanding of its accuracy within neurosurgery is critical for financial, organizational and clinical governance purposes. We undertook a multidisciplinary audit of neurosurgical clinical coding accuracy. Neurosurgeons trained in coding assessed the accuracy of 386 patient episodes. Where clinicians felt a coding error was present, the case was discussed with an experienced clinical coder. Concordance between the initial coder-only clinical coding and the final clinician-coder multidisciplinary coding was assessed. At least one coding error occurred in 71/386 patients (18.4%). There were 36 diagnosis errors and 93 procedure errors, and in 40 cases the initial HRG changed (10.4%). Financially, this translated to £111 revenue loss per patient episode, projected to £171,452 of annual loss for the department. 85% of all coding errors were due to an accumulation of coding changes that occurred only once in the whole data set. Neurosurgical clinical coding is error-prone. This is financially disadvantageous, and with coding data being the source of comparisons within and between departments, coding inaccuracies paint a distorted picture of departmental activity and subspecialism in audit and benchmarking. Clinical engagement improves accuracy and is encouraged within a clinical governance framework.

  5. Clinical errors that can occur in the treatment decision-making process in psychotherapy.

    PubMed

    Park, Jake; Goode, Jonathan; Tompkins, Kelley A; Swift, Joshua K

    2016-09-01

    Clinical errors occur in the psychotherapy decision-making process whenever a less-than-optimal treatment or approach is chosen when working with clients. A less-than-optimal approach may be one that a client is unwilling to try or fully invest in based on his/her expectations and preferences, or one that may have little chance of success based on contraindications and/or limited research support. The "doctor knows best" and the "independent choice" models are two decision-making models that are frequently used within psychology, but both are associated with an increased likelihood of errors in the treatment decision-making process. In particular, these models fail to integrate all three components of the definition of evidence-based practice in psychology (American Psychological Association, 2006). In this article we describe both models and provide examples of clinical errors that can occur in each. We then introduce the shared decision-making model as an alternative that is less prone to clinical errors. PsycINFO Database Record (c) 2016 APA, all rights reserved

  6. Addition to the Lewis Chemical Equilibrium Program to allow computation from coal composition data

    NASA Technical Reports Server (NTRS)

    Sevigny, R.

    1980-01-01

    Changes made to the Coal Gasification Project are reported. The program was originally developed for computing equilibrium combustion in rocket engines; it can be applied directly to the entrained-flow coal gasification process. The particular problem addressed is the reduction of the coal data into a form suitable for the program, since the manual process is laborious and error-prone. A similar problem in relating the normal output of the program to parameters meaningful to the coal gasification process is also addressed.

  7. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Error-Prone Model Derived from 1978-1979 Quality Control Study. Data Report. [Task 3.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; Kuchak, JoAnn

    An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…

  8. A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction

    NASA Astrophysics Data System (ADS)

    Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf

    2018-07-01

    Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real time. Unfortunately, low-cost depth sensors are prone to produce undesirable estimation noise in depth measurements, which either produces depth outliers or introduces surface deformations in the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects; therefore, additional constraints such as steady sensor movement and high frame rates are required for high-quality 3D models. In this paper we propose a generic 3D fusion framework with a controlled regularization parameter which inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high-quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state-of-the-art techniques in terms of both absolute reconstruction error and processing time.
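
    The core idea, damping noise at fusion time rather than relying on many frames, can be shown in a toy one-dimensional analogue: each cell keeps a weighted running average of noisy signed-distance samples, and a smoothing step regularizes the field as it is fused. The neighbor-mean regularizer and all parameters below are illustrative assumptions, not the paper's actual framework.

    ```python
    # Toy 1-D "volumetric" fusion with a regularization step at fusion time.
    import numpy as np

    rng = np.random.default_rng(1)
    truth = np.sin(np.linspace(0, np.pi, 64))   # ground-truth signed distances
    sdf = np.zeros_like(truth)
    weight = np.zeros_like(truth)
    lam = 0.3                                   # regularization strength (assumed)

    for _ in range(5):                          # five noisy "depth frames"
        sample = truth + rng.normal(scale=0.2, size=truth.size)
        # conventional fusion: weighted running average per cell
        sdf = (weight * sdf + sample) / (weight + 1)
        weight += 1
        # regularization: pull each cell toward its neighbors' mean
        # (periodic ends via np.roll, for brevity)
        neighbor_mean = 0.5 * (np.roll(sdf, 1) + np.roll(sdf, -1))
        sdf = (1 - lam) * sdf + lam * neighbor_mean

    print("RMSE after regularized fusion:", np.sqrt(np.mean((sdf - truth) ** 2)))
    ```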

  9. Error-free versus mutagenic processing of genomic uracil--relevance to cancer.

    PubMed

    Krokan, Hans E; Sætrom, Pål; Aas, Per Arne; Pettersen, Henrik Sahlin; Kavli, Bodil; Slupphaug, Geir

    2014-07-01

    Genomic uracil is normally processed essentially error-free by base excision repair (BER), with mismatch repair (MMR) as an apparent backup for U:G mismatches. Nuclear uracil-DNA glycosylase UNG2 is the major enzyme initiating BER of uracil in U:A pairs as well as U:G mismatches. Deficiency in UNG2 results in several-fold increases in genomic uracil in mammalian cells. Thus, the alternative uracil-removing glycosylases, SMUG1, TDG and MBD4, cannot efficiently complement UNG2 deficiency. A major function of SMUG1 is probably to remove 5-hydroxymethyluracil from DNA, with general backup for UNG2 as a minor function. TDG and MBD4 remove deamination products U or T mismatched to G in CpG/mCpG contexts, but may have equally or more important functions in development, epigenetics and gene regulation. Genomic uracil was previously thought to arise only from spontaneous cytosine deamination and incorporation of dUMP, generating U:G mismatches and U:A pairs, respectively. However, the identification of activation-induced cytidine deaminase (AID) and other APOBEC family members as DNA-cytosine deaminases has spurred renewed interest in the processing of genomic uracil. Importantly, AID triggers the adaptive immune response involving error-prone processing of U:G mismatches, but also contributes to B-cell lymphomagenesis. Furthermore, mutational signatures in a substantial fraction of other human cancers are consistent with APOBEC-induced mutagenesis, with U:G mismatches as prime suspects. Mutations can be caused by replicative polymerases copying uracil in U:G mismatches, or by translesion polymerases that insert incorrect bases opposite abasic sites after uracil-removal. In addition, kataegis, localized hypermutations in one strand in the vicinity of genomic rearrangements, requires APOBEC protein, UNG2 and translesion polymerase REV1. What mechanisms govern error-free versus error-prone processing of uracil in DNA remains unclear. In conclusion, genomic uracil is an essential intermediate in adaptive immunity and innate antiviral responses, but may also be a fundamental cause of a wide range of malignancies. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Developing a Machine-Supported Coding System for Constructed-Response Items in PISA. Research Report. ETS RR-17-47

    ERIC Educational Resources Information Center

    Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Mattias

    2017-01-01

    Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…

  11. Efficient Variational Quantum Simulator Incorporating Active Error Minimization

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2017-04-01

    One of the key applications for quantum computers will be the simulation of other quantum systems that arise in chemistry, materials science, etc., in order to accelerate the process of discovery. It is important to ask the following question: Can this simulation be achieved using near-future quantum processors, of modest size and under imperfect control, or must it await the more distant era of large-scale fault-tolerant quantum computing? Here, we propose a variational method involving closely integrated classical and quantum coprocessors. We presume that all operations in the quantum coprocessor are prone to error. The impact of such errors is minimized by boosting them artificially and then extrapolating to the zero-error case. In comparison to a more conventional optimized Trotterization technique, we find that our protocol is efficient and appears to be fundamentally more robust against error accumulation.
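
    The error-boosting strategy described here is, in essence, extrapolation to the zero-noise limit: the same observable is measured at deliberately amplified error rates, and a fit is extrapolated back to zero. A toy numerical sketch, with an assumed exponential decay of the signal under noise:

    ```python
    # Sketch of zero-noise extrapolation: measure at boosted noise levels,
    # then extrapolate a fit to the zero-noise point. The exponential decay
    # model and all parameters are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(2)

    def noisy_expectation(boost, exact=1.0, rate=0.08, shots=4096):
        """Toy model: signal decays with the noise-boost factor, plus shot noise."""
        mean = exact * np.exp(-rate * boost)
        return mean + rng.normal(scale=1.0 / np.sqrt(shots))

    boosts = np.array([1.0, 2.0, 3.0])          # artificially amplified noise
    values = np.array([noisy_expectation(b) for b in boosts])

    # Richardson-style extrapolation: low-order fit evaluated at zero noise
    coeffs = np.polyfit(boosts, values, deg=1)
    print("zero-noise estimate:", np.polyval(coeffs, 0.0))
    ```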

  12. Noise-induced errors in geophysical parameter estimation from retarding potential analyzers in low Earth orbit

    NASA Astrophysics Data System (ADS)

    Debchoudhury, Shantanab; Earle, Gregory

    2017-04-01

    Retarding Potential Analyzers (RPAs) have a rich flight heritage. Standard curve-fitting analysis techniques exist that can infer state variables in the ionospheric plasma environment from RPA data, but the estimation process is prone to errors arising from a number of sources. Previous work has focused on the effects of grid geometry on uncertainties in estimation; however, no prior study has quantified the estimation errors due to additive noise. In this study, we characterize the errors in the estimation of thermal plasma parameters by adding noise to simulated data derived from existing ionospheric models. We concentrate on low-altitude, mid-inclination orbits, since a number of nano-satellite missions are focused on this region of the ionosphere. The errors are quantified and cross-correlated for varying geomagnetic conditions.
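
    The approach described, adding noise to simulated data and quantifying the resulting parameter errors, can be sketched as a small Monte Carlo experiment. The single-exponential retarding curve below is a stand-in chosen for illustration, not the actual RPA current model used in the study.

    ```python
    # Monte Carlo sketch of noise-induced estimation error: fit a toy
    # retarding-potential curve under additive Gaussian noise and inspect
    # the spread of the recovered temperature-like parameter.
    import numpy as np
    from scipy.optimize import curve_fit

    def rpa_current(v, i0, kT):
        # Toy single-species retarding characteristic (assumed model)
        return i0 * np.exp(-np.clip(v, 0, None) / kT)

    v = np.linspace(0, 5, 60)          # retarding potential sweep (V)
    true_i0, true_kT = 1.0, 0.8
    clean = rpa_current(v, true_i0, true_kT)

    rng = np.random.default_rng(3)
    estimates = []
    for _ in range(500):
        noisy = clean + rng.normal(scale=0.02, size=v.size)
        popt, _ = curve_fit(rpa_current, v, noisy, p0=[0.9, 1.0])
        estimates.append(popt[1])

    print("kT bias:", np.mean(estimates) - true_kT, " std:", np.std(estimates))
    ```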

  13. Somatic immunoglobulin hypermutation

    PubMed Central

    Diaz, Marilyn; Casali, Paolo

    2015-01-01

    Immunoglobulin hypermutation provides the structural correlate for the affinity maturation of the antibody response. Characteristic modalities of this mechanism include a preponderance of point mutations with a prevalence of transitions over transversions, and the mutational hotspot RGYW sequence. Recent evidence suggests a mechanism whereby DNA breaks induce error-prone DNA synthesis in immunoglobulin V(D)J regions by error-prone DNA polymerases. The nature of the targeting mechanism and the trans-factors effecting such breaks and their repair remain to be determined. PMID:11869898

  14. Topological quantum computing with a very noisy network and local error rates approaching one percent.

    PubMed

    Nickerson, Naomi H; Li, Ying; Benjamin, Simon C

    2013-01-01

    A scalable quantum computer could be built by networking together many simple processor cells, thus avoiding the need to create a single complex structure. The difficulty is that realistic quantum links are very error prone. A solution is for cells to repeatedly communicate with each other and so purify any imperfections; however, prior studies suggest that the cells themselves must then have prohibitively low internal error rates. Here we describe a method by which even error-prone cells can perform purification: groups of cells generate shared resource states, which then enable stabilization of topologically encoded data. Given a realistically noisy network (≥10% error rate) we find that our protocol can succeed provided that intra-cell error rates for initialisation, state manipulation and measurement are below 0.82%. This level of fidelity is already achievable in several laboratory systems.

  15. Towards an evaluation framework for Laboratory Information Systems.

    PubMed

    Yusof, Maryati M; Arifin, Azila

    Laboratory testing and reporting are error-prone and redundant owing to repeated, unnecessary requests and delayed or missed reactions to laboratory reports. Such errors may negatively affect the patient treatment process and clinical decision making. Evaluating laboratory testing and the Laboratory Information System (LIS) may reveal the root causes, helping to improve the testing process and to enhance the LIS that supports it. This paper discusses a new evaluation framework for LIS that encompasses the laboratory testing cycle and the socio-technical aspects of LIS. A literature review was conducted on the discourses, dimensions and evaluation methods of laboratory testing and LIS. A critical appraisal of the Total Testing Process (TTP) and the human, organization, technology-fit (HOT-fit) evaluation frameworks was undertaken in order to identify error incidents, their contributing factors and preventive actions pertinent to the laboratory testing process and LIS. A new evaluation framework for LIS using a comprehensive, socio-technical approach is outlined. A positive relationship between laboratory and clinical staff resulted in a smooth laboratory testing process, reduced errors and increased process efficiency, whilst effective use of the LIS streamlined the testing process. The TTP-LIS framework could serve as an assessment as well as a problem-solving tool for the laboratory testing process and system. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  16. Inducible DNA-repair systems in yeast: competition for lesions.

    PubMed

    Mitchel, R E; Morrison, D P

    1987-03-01

    DNA lesions may be recognized and repaired by more than one DNA-repair process. If two repair systems with different error frequencies have overlapping lesion specificity and one or both is inducible, the resulting variable competition for the lesions can change the biological consequences of these lesions. This concept was demonstrated by observing mutation in yeast cells (Saccharomyces cerevisiae) exposed to combinations of mutagens under conditions which influenced the induction of error-free recombinational repair or error-prone repair. Total mutation frequency was reduced in a manner proportional to the dose of 60Co-gamma- or 254 nm UV radiation delivered prior to or subsequent to an MNNG exposure. Suppression was greater per unit radiation dose in cells gamma-irradiated in O2 as compared to N2. A rad3 (excision-repair) mutant gave results similar to wild-type, but mutation in a rad52 (rec-) mutant exposed to MNNG was not suppressed by radiation. Protein-synthesis inhibition with heat shock or cycloheximide indicated that it was the mutation due to MNNG and not that due to radiation which had changed. These results indicate that MNNG lesions are recognized by both the recombinational repair system and the inducible error-prone system, but that gamma-radiation induction of error-free recombinational repair resulted in increased competition for the lesions, thereby reducing mutation. Similarly, gamma-radiation exposure resulted in a radiation dose-dependent reduction in mutation due to MNU, EMS, ENU and 8-MOP + UVA, but no reduction in mutation due to MMS. These results suggest that the number of mutational MMS lesions recognizable by the recombinational repair system must be very small relative to those produced by the other agents. MNNG induction of the inducible error-prone systems, however, did not alter mutation frequencies due to ENU or MMS exposure but, in contrast to radiation, increased the mutagenic effectiveness of EMS. These experiments demonstrate that in this lower eukaryote, mutagen exposure does not necessarily result in a fixed risk of mutation, but that the risk can be markedly influenced by a variety of external stimuli including heat shock or exposure to other mutagens.

  17. DNA double-strand-break complexity levels and their possible contributions to the probability for error-prone processing and repair pathway choice.

    PubMed

    Schipler, Agnes; Iliakis, George

    2013-09-01

    Although the DNA double-strand break (DSB) is defined as a rupture in the double-stranded DNA molecule that can occur without chemical modification in any of the constituent building blocks, it is recognized that this form is restricted to enzyme-induced DSBs. DSBs generated by physical or chemical agents can include at the break site a spectrum of base alterations (lesions). The nature and number of such chemical alterations define the complexity of the DSB and are considered putative determinants for repair pathway choice and the probability that errors will occur during this processing. As the pathways engaged in DSB processing show distinct and frequently inherent propensities for errors, pathway choice also defines the error levels cells opt to accept. Here, we present a classification of DSBs on the basis of increasing complexity and discuss how complexity may affect processing, as well as how it may cause lethal or carcinogenic processing errors. By critically analyzing the characteristics of DSB repair pathways, we suggest that all repair pathways can in principle remove lesions clustering at the DSB but are likely to fail when they encounter clusters of DSBs that cause a local form of chromothripsis. In the same framework, we also analyze the rationale of DSB repair pathway choice.

  18. Associations of hallucination proneness with free-recall intrusions and response bias in a nonclinical sample.

    PubMed

    Brébion, Gildas; Larøi, Frank; Van der Linden, Martial

    2010-10-01

    Hallucinations in patients with schizophrenia have been associated with a liberal response bias in signal detection and recognition tasks and with various types of source-memory error. We investigated the associations of hallucination proneness with free-recall intrusions and false recognitions of words in a nonclinical sample. A total of 81 healthy individuals were administered a verbal memory task involving free recall and recognition of one nonorganizable and one semantically organizable list of words. Hallucination proneness was assessed by means of a self-rating scale. Global hallucination proneness was associated with free-recall intrusions in the nonorganizable list and with a response bias reflecting a tendency to make false recognitions of nontarget words in both types of list. The verbal hallucination score was associated with more intrusions and with a reduced tendency to make false recognitions of words. The associations between global hallucination proneness and two types of verbal memory error in a nonclinical sample corroborate those observed in patients with schizophrenia and suggest that common cognitive mechanisms underlie hallucinations in psychiatric and nonclinical individuals.

  19. Safeguarding the process of drug administration with an emphasis on electronic support tools

    PubMed Central

    Seidling, Hanna M; Lampert, Anette; Lohmann, Kristina; Schiele, Julia T; Send, Alexander J F; Witticke, Diana; Haefeli, Walter E

    2013-01-01

    Aims: The aim of this work is to understand the process of drug administration and identify points in the workflow that resulted in interventions by clinical information systems in order to improve patient safety. Methods: To identify a generic way to structure the drug administration process we performed peer-group discussions and supplemented these discussions with a literature search for studies reporting errors in drug administration and strategies for their prevention. Results: We concluded that the drug administration process might consist of up to 11 sub-steps, which can be grouped into the four sub-processes of preparation, personalization, application and follow-up. Errors in drug handling and administration are diverse and frequent and in many cases not caused by the patient him/herself, but by family members or nurses. Accordingly, different prevention strategies have been set in place with relatively few approaches involving e-health technology. Conclusions: A generic structuring of the administration process and particular error-prone sub-steps may facilitate the allocation of prevention strategies and help to identify research gaps. PMID:24007450

  20. How Alterations in the Cdt1 Expression Lead to Gene Amplification in Breast Cancer

    DTIC Science & Technology

    2011-07-01

    absence of extrinsic DNA damage. We measured the TLS activity by measuring the mutation frequency in a supF gene (in a shuttle vector) subjected to UV-induced DNA damage before its introduction into the cells. Error-prone TLS activity will mutate the supF gene, which is scored by a blue-white colony... (Figure 4A). Sequencing of the mutant supF genes revealed a mutation spectrum consistent with error-prone TLS (Supplemental Table 1). Significantly

  1. Belief-bias reasoning in non-clinical delusion-prone individuals.

    PubMed

    Anandakumar, T; Connaughton, E; Coltheart, M; Langdon, R

    2017-03-01

    It has been proposed that people with delusions have difficulty inhibiting beliefs (i.e., "doxastic inhibition") so as to reason about them as if they might not be true. We used a continuity approach to test this proposal in non-clinical adults scoring high and low in psychometrically assessed delusion-proneness. High delusion-prone individuals were expected to show greater difficulty than low delusion-prone individuals on "conflict" items of a "belief-bias" reasoning task (i.e. when required to reason logically about statements that conflicted with reality), but not on "non-conflict" items. Twenty high delusion-prone and twenty low delusion-prone participants (according to the Peters et al. Delusions Inventory) completed a belief-bias reasoning task and tests of IQ, working memory and general inhibition (Excluded Letter Fluency, Stroop and Hayling Sentence Completion). High delusion-prone individuals showed greater difficulty than low delusion-prone individuals on the Stroop and Excluded Letter Fluency tests of inhibition, but no greater difficulty on the conflict versus non-conflict items of the belief-bias task. They did, however, make significantly more errors overall on the belief-bias task, despite controlling for IQ, working memory and general inhibitory control. The study had a relatively small sample size and used non-clinical participants to test a theory of cognitive processing in individuals with clinically diagnosed delusions. Results failed to support a role for doxastic inhibitory failure in non-clinical delusion-prone individuals. These individuals did, however, show difficulty with conditional reasoning about statements that may or may not conflict with reality, independent of any general cognitive or inhibitory deficits. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Belief-bias reasoning in non-clinical delusion-prone individuals.

    PubMed

    Anandakumar, T; Connaughton, E; Coltheart, M; Langdon, R

    2017-09-01

    It has been proposed that people with delusions have difficulty inhibiting beliefs (i.e., "doxastic inhibition") so as to reason about them as if they might not be true. We used a continuity approach to test this proposal in non-clinical adults scoring high and low in psychometrically assessed delusion-proneness. High delusion-prone individuals were expected to show greater difficulty than low delusion-prone individuals on "conflict" items of a "belief-bias" reasoning task (i.e. when required to reason logically about statements that conflicted with reality), but not on "non-conflict" items. Twenty high delusion-prone and twenty low delusion-prone participants (according to the Peters et al. Delusions Inventory) completed a belief-bias reasoning task and tests of IQ, working memory and general inhibition (Excluded Letter Fluency, Stroop and Hayling Sentence Completion). High delusion-prone individuals showed greater difficulty than low delusion-prone individuals on the Stroop and Excluded Letter Fluency tests of inhibition, but no greater difficulty on the conflict versus non-conflict items of the belief-bias task. They did, however, make significantly more errors overall on the belief-bias task, despite controlling for IQ, working memory and general inhibitory control. The study had a relatively small sample size and used non-clinical participants to test a theory of cognitive processing in individuals with clinically diagnosed delusions. Results failed to support a role for doxastic inhibitory failure in non-clinical delusion-prone individuals. These individuals did, however, show difficulty with conditional reasoning about statements that may or may not conflict with reality, independent of any general cognitive or inhibitory deficits. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Improvement in Patient Transfer Process From the Operating Room to the PICU Using a Lean and Six Sigma-Based Quality Improvement Project.

    PubMed

    Gleich, Stephen J; Nemergut, Michael E; Stans, Anthony A; Haile, Dawit T; Feigal, Scott A; Heinrich, Angela L; Bosley, Christopher L; Tripathi, Sandeep

    2016-08-01

    Ineffective and inefficient patient transfer processes can increase the chance of medical errors. Improvements in such processes are high-priority local institutional and national patient safety goals. At our institution, nonintubated postoperative pediatric patients are first admitted to the postanesthesia care unit before transfer to the PICU. This quality improvement project was designed to improve the patient transfer process from the operating room (OR) to the PICU. After direct observation of the baseline process, we introduced a structured, direct OR-PICU transfer process for orthopedic spinal fusion patients. We performed value stream mapping of the process to determine error-prone and inefficient areas. We evaluated primary outcome measures of handoff error reduction and the overall efficiency of patient transfer process time. Staff satisfaction was evaluated as a counterbalance measure. With the introduction of the new direct OR-PICU patient transfer process, the handoff communication error rate improved from 1.9 to 0.3 errors per patient handoff (P = .002). Inefficiency (patient wait time and non-value-creating activity) was reduced from 90 to 32 minutes. Handoff content was improved with fewer information omissions (P < .001). Staff satisfaction significantly improved among nearly all PICU providers. By using quality improvement methodology to design and implement a new direct OR-PICU transfer process with a structured multidisciplinary verbal handoff, we achieved sustained improvements in patient safety and efficiency. Handoff communication was enhanced, with fewer errors and content omissions. The new process improved efficiency, with high staff satisfaction. Copyright © 2016 by the American Academy of Pediatrics.

  4. Neuro-Oscillatory Mechanisms of Intersensory Selective Attention and Task Switching in School-Aged Children, Adolescents and Young Adults

    ERIC Educational Resources Information Center

    Murphy, Jeremy W.; Foxe, John J.; Molholm, Sophie

    2016-01-01

    The ability to attend to one among multiple sources of information is central to everyday functioning. Just as central is the ability to switch attention among competing inputs as the task at hand changes. Such processes develop surprisingly slowly, such that even into adolescence, we remain slower and more error prone at switching among tasks…

  5. Automated lattice data generation

    NASA Astrophysics Data System (ADS)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
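
    Taxi's own API is not reproduced here; the sketch below only illustrates the core workflow-manager idea the abstract describes, running dependent lattice tasks in topological order. The task names and dependency graph are hypothetical.

    ```python
    # Minimal workflow-manager sketch: execute lattice-style tasks in
    # dependency order. Task names and the graph are illustrative assumptions.
    from graphlib import TopologicalSorter

    def generate_config(stream, n):
        print(f"generate {stream} configuration {n}")

    def measure(stream, n):
        print(f"measure observables on {stream} configuration {n}")

    tasks, deps = {}, {}
    prev_gen = None
    for n in range(3):
        gen, meas = f"gen_{n}", f"meas_{n}"
        tasks[gen] = (generate_config, ("stream_a", n))
        tasks[meas] = (measure, ("stream_a", n))
        deps[gen] = {prev_gen} if prev_gen else set()   # configs form a Markov chain
        deps[meas] = {gen}                              # measurement needs its config
        prev_gen = gen

    for name in TopologicalSorter(deps).static_order():
        func, args = tasks[name]
        func(*args)
    ```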

  6. Increased Perceptual and Conceptual Processing Difficulty Makes the Immeasurable Measurable: Negative Priming in the Absence of Probe Distractors

    ERIC Educational Resources Information Center

    Frings, Christian; Spence, Charles

    2011-01-01

    Negative priming (NP) refers to the finding that people's responses to probe targets previously presented as prime distractors are usually slower and more error prone than to unrepeated stimuli. In a typical NP experiment, each probe target is accompanied by a distractor. It is an accepted, albeit puzzling, finding that the NP effect depends on…

  7. An abstract specification language for Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1985-01-01

    Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high-level language is described in a nonformal manner and illustrated by example.

  8. An abstract language for specifying Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1986-01-01

    Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high-level language is described in a nonformal manner and illustrated by example.
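
    The abstract language itself is not shown in either report; the sketch below illustrates what such a high-level specification ultimately reduces to: an enumeration of states, transition rates, a generator matrix, and a transient solve. The two-unit system and its rates are assumptions for illustration.

    ```python
    # Small Markov reliability model: assemble the generator matrix and
    # compute the probability of still being operational at time t.
    import numpy as np
    from scipy.linalg import expm

    states = ["2_working", "1_working", "failed"]
    lam, mu = 1e-3, 1e-1            # failure and repair rates per hour (assumed)

    # Row i holds the outgoing rates from state i; rows sum to zero.
    Q = np.array([
        [-2 * lam,     2 * lam,  0.0],
        [mu,      -(mu + lam),   lam],
        [0.0,          0.0,      0.0],   # 'failed' is absorbing
    ])

    p0 = np.array([1.0, 0.0, 0.0])       # start with both units working
    t = 1000.0                           # mission time (hours)
    pt = p0 @ expm(Q * t)
    print("reliability (not failed):", pt[0] + pt[1])
    ```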

  9. Fidelity of DNA Replication in Normal and Malignant Human Breast Cells

    DTIC Science & Technology

    1998-07-01

    synthesome has been extensively demonstrated to carry out full-length DNA replication in vitro, and to accurately depict the DNA replication process as it occurs in the intact cell. By examining the fidelity of the DNA replication process carried out by the DNA synthesome from a number of breast cell types, we have demonstrated for the first time that the cellular DNA replication machinery of malignant human breast cells is significantly more error-prone than that of non-malignant human breast cells.

  10. Comprehensive analysis of a medication dosing error related to CPOE.

    PubMed

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  11. Improved acid tolerance of Lactobacillus pentosus by error-prone whole genome amplification.

    PubMed

    Ye, Lidan; Zhao, Hua; Li, Zhi; Wu, Jin Chuan

    2013-05-01

    Acid tolerance of Lactobacillus pentosus ATCC 8041 was improved by error-prone amplification of its genomic DNA using random primers and Taq DNA polymerase. The resulting amplification products were transferred into wild-type L. pentosus by electroporation and the transformants were screened for growth on low-pH agar plates. After only one round of mutation, one mutant (MT3) was identified that was able to completely consume 20 g/L of glucose to produce lactic acid at a yield of 95% in 1L MRS medium at pH 3.8 within 36 h, whereas no growth or lactic acid production was observed for the wild-type strain under the same conditions. The acid tolerance of mutant MT3 remained genetically stable for at least 25 subcultures. Therefore, the error-prone whole genome amplification technique is a very powerful tool for improving phenotypes of this lactic acid bacterium and may also be applicable for other microorganisms. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. [Effect of Mn(II) on the error-prone DNA polymerase iota activity in extracts from human normal and tumor cells].

    PubMed

    Lakhin, A V; Efremova, A S; Makarova, I V; Grishina, E E; Shram, S I; Tarantul, V Z; Gening, L V

    2013-01-01

    The DNA polymerase iota (Pol iota), which has some peculiar features and is characterized by extremely error-prone DNA synthesis, belongs to the group of enzymes preferentially activated by Mn2+ instead of Mg2+. In this work, the effect of Mn2+ on DNA synthesis was tested in cell extracts from (a) normal human and murine tissues, (b) a human tumor (uveal melanoma), and (c) the cultured human tumor cell lines SKOV-3 and HL-60. Each group displayed characteristic features of Mn-dependent DNA synthesis. The changes in Mn-dependent DNA synthesis caused by malignant transformation of normal tissues are described. It was also shown that the error-prone DNA synthesis catalyzed by Pol iota in extracts of all cell types was efficiently suppressed by an RNA aptamer (IKL5) against Pol iota obtained in our earlier work. The results suggest that IKL5 might be used to suppress the enhanced activity of Pol iota in tumor cells.

  13. Improving specialist drug prescribing in primary care using task and error analysis: an observational study.

    PubMed

    Chana, Narinder; Porat, Talya; Whittlesea, Cate; Delaney, Brendan

    2017-03-01

    Electronic prescribing has benefited from computerised clinical decision support systems (CDSSs); however, no published studies have evaluated the potential for a CDSS to support GPs in prescribing specialist drugs. To identify potential weaknesses and errors in the existing process of prescribing specialist drugs that could be addressed in the development of a CDSS. Semi-structured interviews with key informants followed by an observational study involving GPs in the UK. Twelve key informants were interviewed to investigate the use of CDSSs in the UK. Nine GPs were observed while performing case scenarios depicting requests from hospitals or patients to prescribe a specialist drug. Activity diagrams, hierarchical task analysis, and systematic human error reduction and prediction approach analyses were performed. The current process of prescribing specialist drugs by GPs is prone to error. Errors of omission due to lack of information were the most common errors, which could potentially result in a GP prescribing a specialist drug that should only be prescribed in hospitals, or prescribing a specialist drug without reference to a shared care protocol. Half of all possible errors in the prescribing process had a high probability of occurrence. A CDSS supporting GPs during the process of prescribing specialist drugs is needed. This could, first, support the decision making of whether or not to undertake prescribing, and, second, provide drug-specific parameters linked to shared care protocols, which could reduce the errors identified and increase patient safety. © British Journal of General Practice 2017.

  14. Magellan spacecraft and memory state tracking: Lessons learned, future thoughts

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.

    1993-01-01

    Numerous studies have been dedicated to improving the two main elements of Spacecraft Mission Operations: Command and Telemetry. As a result, not much attention has been given to other tasks that can become tedious, repetitive, and error prone. One such task is Spacecraft and Memory State Tracking, the process by which the status of critical spacecraft components, parameters, and the contents of on-board memory are managed on the ground to maintain knowledge of spacecraft and memory states for future testing, anomaly investigation, and on-board memory reconstruction. The task of Spacecraft and Memory State Tracking has traditionally been a manual task allocated to Mission Operations Procedures. During nominal Mission Operations this job is tedious and error prone. Because the task is not complex and can be accomplished manually, the worth of a sophisticated software tool is often questioned. However, in the event of an anomaly which alters spacecraft components autonomously or a memory anomaly such as a corrupt memory or flight software error, an accurate ground image that can be reconstructed quickly is a priceless commodity. This study explores the process of Spacecraft and Memory State Tracking used by the Magellan Spacecraft Team highlighting its strengths as well as identifying lessons learned during the primary and extended missions, two memory anomalies, and other hardships encountered due to incomplete knowledge of spacecraft states. Ideas for future state tracking tools that require minimal user interaction and are integrated into the Ground Data System will also be discussed.

  15. Magellan spacecraft and memory state tracking: Lessons learned, future thoughts

    NASA Astrophysics Data System (ADS)

    Bucher, Allen W.

    1993-03-01

    Numerous studies have been dedicated to improving the two main elements of Spacecraft Mission Operations: Command and Telemetry. As a result, not much attention has been given to other tasks that can become tedious, repetitive, and error prone. One such task is Spacecraft and Memory State Tracking, the process by which the status of critical spacecraft components, parameters, and the contents of on-board memory are managed on the ground to maintain knowledge of spacecraft and memory states for future testing, anomaly investigation, and on-board memory reconstruction. The task of Spacecraft and Memory State Tracking has traditionally been a manual task allocated to Mission Operations Procedures. During nominal Mission Operations this job is tedious and error prone. Because the task is not complex and can be accomplished manually, the worth of a sophisticated software tool is often questioned. However, in the event of an anomaly which alters spacecraft components autonomously or a memory anomaly such as a corrupt memory or flight software error, an accurate ground image that can be reconstructed quickly is a priceless commodity. This study explores the process of Spacecraft and Memory State Tracking used by the Magellan Spacecraft Team highlighting its strengths as well as identifying lessons learned during the primary and extended missions, two memory anomalies, and other hardships encountered due to incomplete knowledge of spacecraft states. Ideas for future state tracking tools that require minimal user interaction and are integrated into the Ground Data System will also be discussed.

  16. Random mutagenesis by error-prone pol plasmid replication in Escherichia coli.

    PubMed

    Alexander, David L; Lilly, Joshua; Hernandez, Jaime; Romsdahl, Jillian; Troll, Christopher J; Camps, Manel

    2014-01-01

    Directed evolution is an approach that mimics natural evolution in the laboratory with the goal of modifying existing enzymatic activities or of generating new ones. The identification of mutants with desired properties involves the generation of genetic diversity coupled with a functional selection or screen. Genetic diversity can be generated using PCR or using in vivo methods such as chemical mutagenesis or error-prone replication of the desired sequence in a mutator strain. In vivo mutagenesis methods facilitate iterative selection because they do not require cloning, but generally produce a low mutation density with mutations not restricted to specific genes or areas within a gene. For this reason, this approach is typically used to generate new biochemical properties when large numbers of mutants can be screened or selected. Here we describe protocols for an advanced in vivo mutagenesis method that is based on error-prone replication of a ColE1 plasmid bearing the gene of interest. Compared to other in vivo mutagenesis methods, this plasmid-targeted approach allows increased mutation loads and facilitates iterative selection approaches. We also describe the mutation spectrum for this mutagenesis methodology in detail, and, using cycle 3 GFP as a target for mutagenesis, we illustrate the phenotypic diversity that can be generated using our method. In sum, error-prone Pol I replication is a mutagenesis method that is ideally suited for the evolution of new biochemical activities when a functional selection is available.

  17. WISC-R Examiner Errors: Cause for Concern.

    ERIC Educational Resources Information Center

    Slate, John R.; Chick, David

    1989-01-01

    Clinical psychology graduate students (N=14) administered the Wechsler Intelligence Scale for Children-Revised. Found numerous scoring and mechanical errors that influenced full-scale intelligence quotient scores on two-thirds of protocols. Particularly prone to error were the Verbal subtests of Vocabulary, Comprehension, and Similarities. Noted specific…

  18. The Sizing and Optimization Language (SOL): A computer language to improve the user/optimizer interface

    NASA Technical Reports Server (NTRS)

    Lucas, S. H.; Scotti, S. J.

    1989-01-01

    The nonlinear mathematical programming method (formal optimization) has had many applications in engineering design. A figure illustrates the use of optimization techniques in the design process. The design process begins with the design problem, such as the classic example of the two-bar truss designed for minimum weight as seen in the leftmost part of the figure. If formal optimization is to be applied, the design problem must be recast in the form of an optimization problem consisting of an objective function, design variables, and constraint function relations. The middle part of the figure shows the two-bar truss design posed as an optimization problem. The total truss weight is the objective function, the tube diameter and truss height are design variables, with stress and Euler buckling considered as constraint function relations. Lastly, the designer develops or obtains analysis software containing a mathematical model of the object being optimized, and then interfaces the analysis routine with existing optimization software such as CONMIN, ADS, or NPSOL. This final stage of software development can be both tedious and error-prone. The Sizing and Optimization Language (SOL), a special-purpose computer language whose goal is to make the software implementation phase of optimum design easier and less error-prone, is presented.
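
    The two-bar truss example translates directly into code. Below is a sketch of the recast optimization problem solved with an off-the-shelf optimizer (scipy rather than CONMIN, ADS, or NPSOL); the load, material constants, and thin-walled-tube formulas are illustrative assumptions, not the paper's values.

    ```python
    # Two-bar truss as a nonlinear program: minimize weight over tube
    # diameter d and truss height h, subject to stress and Euler buckling.
    import numpy as np
    from scipy.optimize import minimize

    P = 33e3        # applied load (N), assumed
    B = 1.5         # half-span between supports (m), assumed
    t = 2.5e-3      # fixed tube wall thickness (m), assumed
    rho = 7850.0    # steel density (kg/m^3)
    E = 200e9       # Young's modulus (Pa)
    s_max = 250e6   # allowable stress (Pa)

    def member_length(h):
        return np.sqrt(B**2 + h**2)

    def weight(x):                       # objective: total truss mass
        d, h = x
        return 2 * rho * (np.pi * d * t) * member_length(h)

    def stress(x):                       # axial stress in each member
        d, h = x
        return P * member_length(h) / (2 * np.pi * d * t * h)

    def buckling_limit(x):               # Euler critical stress, thin tube
        d, h = x
        return np.pi**2 * E * (d**2 + t**2) / (8 * member_length(h)**2)

    cons = [
        {"type": "ineq", "fun": lambda x: s_max - stress(x)},
        {"type": "ineq", "fun": lambda x: buckling_limit(x) - stress(x)},
    ]
    res = minimize(weight, x0=[0.05, 1.0], bounds=[(0.01, 0.2), (0.2, 3.0)],
                   constraints=cons, method="SLSQP")
    print("d, h =", res.x, " weight (kg) =", res.fun)
    ```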

  19. Computationally mapping sequence space to understand evolutionary protein engineering.

    PubMed

    Armstrong, Kathryn A; Tidor, Bruce

    2008-01-01

    Evolutionary protein engineering has been dramatically successful, producing a wide variety of new proteins with altered stability, binding affinity, and enzymatic activity. However, the success of such procedures is often unreliable, and the impact of the choice of protein, engineering goal, and evolutionary procedure is not well understood. We have created a framework for understanding aspects of the protein engineering process by computationally mapping regions of feasible sequence space for three small proteins using structure-based design protocols. We then tested the ability of different evolutionary search strategies to explore these sequence spaces. The results point to a non-intuitive relationship between the error-prone PCR mutation rate and the number of rounds of replication. The evolutionary relationships among feasible sequences reveal hub-like sequences that serve as particularly fruitful starting sequences for evolutionary search. Moreover, genetic recombination procedures were examined, and tradeoffs relating sequence diversity and search efficiency were identified. This framework allows us to consider the impact of protein structure on the allowed sequence space and therefore on the challenges that each protein presents to error-prone PCR and genetic recombination procedures.
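
    The interplay between error-prone PCR mutation rate and the number of rounds can be illustrated with a simple Poisson model of mutation load; in this simplest picture only the product of rate and rounds matters, which is exactly why the non-intuitive relationship reported above, driven by selection over a structured sequence space, is notable. Parameters below are illustrative assumptions.

    ```python
    # Toy error-prone PCR model: mutations per variant as Poisson in
    # rate * gene length * rounds. Parameters are illustrative only.
    import numpy as np

    rng = np.random.default_rng(4)
    gene_length = 750        # nucleotides in the target gene (assumed)
    n_variants = 10_000

    def mutation_load(rate_per_nt, rounds):
        return rng.poisson(rate_per_nt * gene_length * rounds, size=n_variants)

    for rate in (1e-3, 3e-3):
        for rounds in (1, 3, 9):
            load = mutation_load(rate, rounds)
            print(f"rate={rate:.0e}/nt, rounds={rounds}: mean={load.mean():.2f}, "
                  f"unmutated={np.mean(load == 0):.1%}")
    ```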

  20. DNA double-strand–break complexity levels and their possible contributions to the probability for error-prone processing and repair pathway choice

    PubMed Central

    Schipler, Agnes; Iliakis, George

    2013-01-01

    Although the DNA double-strand break (DSB) is defined as a rupture in the double-stranded DNA molecule that can occur without chemical modification in any of the constituent building blocks, it is recognized that this form is restricted to enzyme-induced DSBs. DSBs generated by physical or chemical agents can include at the break site a spectrum of base alterations (lesions). The nature and number of such chemical alterations define the complexity of the DSB and are considered putative determinants for repair pathway choice and the probability that errors will occur during this processing. As the pathways engaged in DSB processing show distinct and frequently inherent propensities for errors, pathway choice also defines the error levels cells opt to accept. Here, we present a classification of DSBs on the basis of increasing complexity and discuss how complexity may affect processing, as well as how it may cause lethal or carcinogenic processing errors. By critically analyzing the characteristics of DSB repair pathways, we suggest that all repair pathways can in principle remove lesions clustering at the DSB but are likely to fail when they encounter clusters of DSBs that cause a local form of chromothripsis. In the same framework, we also analyze the rationale of DSB repair pathway choice. PMID:23804754

  1. Systematic feasibility analysis of a quantitative elasticity estimation for breast anatomy using supine/prone patient postures.

    PubMed

    Hasse, Katelyn; Neylon, John; Sheng, Ke; Santhanam, Anand P

    2016-03-01

    Breast elastography is a critical tool for improving the targeted radiotherapy treatment of breast tumors. Current breast radiotherapy imaging protocols only involve prone and supine CT scans. There is a lack of knowledge on the quantitative accuracy with which breast elasticity can be systematically measured using only prone and supine CT datasets. The purpose of this paper is to describe a quantitative elasticity estimation technique for breast anatomy using only these supine/prone patient postures. Using biomechanical, high-resolution breast geometry obtained from CT scans, a systematic assessment was performed in order to determine the feasibility of this methodology for clinically relevant elasticity distributions. A model-guided inverse analysis approach is presented in this paper. A graphics processing unit (GPU)-based linear elastic biomechanical model was employed as a forward model for the inverse analysis with the breast geometry in a prone position. The elasticity estimation was performed using a gradient-based iterative optimization scheme and a fast simulated annealing (FSA) algorithm. Numerical studies were conducted to systematically analyze the feasibility of elasticity estimation. For simulating gravity-induced breast deformation, the breast geometry was anchored at its base, resembling the chest-wall/breast tissue interface. Ground-truth elasticity distributions were assigned to the model, representing tumor presence within breast tissue. Model geometry resolution was varied to estimate its influence on convergence of the system. A priori information was approximated and utilized to record the effect on time and accuracy of convergence. The role of the FSA process was also recorded. A novel error metric that combined elasticity and displacement error was used to quantify the systematic feasibility study. For the authors' purposes, convergence was set to be obtained when each voxel of tissue was within 1 mm of ground-truth deformation. The authors' analyses showed that ∼97% model convergence was systematically observed with no a priori information. Varying the model geometry resolution showed no significant accuracy improvements. The GPU-based forward model enabled the inverse analysis to be completed within 10-70 min. Using a priori information about the underlying anatomy, the computation time decreased by as much as 50%, while accuracy improved from 96.81% to 98.26%. The use of FSA was observed to allow the iterative estimation methodology to converge more precisely. By utilizing a forward iterative approach to solve the inverse elasticity problem, this work indicates the feasibility and potential of the fast reconstruction of breast tissue elasticity using supine/prone patient postures.
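
    A schematic of the iterative inverse-elasticity loop described above, using a fast simulated annealing acceptance rule; the one-dimensional stand-in forward model is an assumption for illustration and is not the authors' GPU-based biomechanical solver.

    ```python
    # Inverse-elasticity sketch: perturb an elasticity map, run a forward
    # model, and accept/reject via fast simulated annealing (T_k = T0 / k).
    import numpy as np

    rng = np.random.default_rng(5)
    truth = np.full(50, 5.0)
    truth[20:30] = 25.0                 # stiff "tumor" region (kPa), assumed

    def forward(elasticity):
        """Stand-in forward model: softer tissue deforms more under gravity."""
        return 1.0 / elasticity

    observed = forward(truth)           # "prone-position" displacements

    estimate = np.full(50, 10.0)
    err = np.sum((forward(estimate) - observed) ** 2)
    T0 = 1.0
    for k in range(1, 20001):
        cand = estimate.copy()
        i = rng.integers(cand.size)
        cand[i] = np.clip(cand[i] + rng.normal(scale=1.0), 1.0, 50.0)
        cand_err = np.sum((forward(cand) - observed) ** 2)
        T = T0 / k                      # fast-annealing temperature schedule
        if cand_err < err or rng.random() < np.exp(-(cand_err - err) / T):
            estimate, err = cand, cand_err

    print("max elasticity error (kPa):", np.abs(estimate - truth).max())
    ```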

  2. Commission errors of active intentions: the roles of aging, cognitive load, and practice.

    PubMed

    Boywitt, C Dennis; Rummel, Jan; Meiser, Thorsten

    2015-01-01

    Performing an intended action when it needs to be withheld, for example, when temporarily prescribed medication is incompatible with other medication, is referred to as a commission error of prospective memory (PM). While recent research indicates that older adults are especially prone to commission errors for finished intentions, there is a lack of research on the effects of aging on commission errors for still-active intentions. The present research investigates conditions which might contribute to older adults' propensity to perform planned intentions under inappropriate conditions. Specifically, disproportionately higher rates of commission errors for still-active intentions were observed in older than in younger adults with both salient (Experiment 1) and non-salient (Experiment 2) target cues. Practicing the PM task in Experiment 2, however, helped execution of the intended action in terms of higher PM performance at faster ongoing-task response times but did not increase the rate of commission errors. The results have important implications for the understanding of older adults' PM commission errors and the processes involved in these errors.

  3. Registration of prone and supine CT colonography scans using correlation optimized warping and canonical correlation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Shijun; Yao Jianhua; Liu Jiamin

    Purpose: In computed tomographic colonography (CTC), a patient will be scanned twice--once supine and once prone--to improve the sensitivity for polyp detection. To assist radiologists in CTC reading, in this paper we propose an automated method for colon registration from supine and prone CTC scans. Methods: We propose a new colon centerline registration method for prone and supine CTC scans using correlation optimized warping (COW) and canonical correlation analysis (CCA) based on the anatomical structure of the colon. Four anatomical salient points on the colon are first automatically distinguished. Then correlation optimized warping is applied to the segments defined by the anatomical landmarks to improve the global registration based on local correlation of segments. The COW method was modified by embedding canonical correlation analysis to allow multiple features along the colon centerline to be used in our implementation. Results: We tested the COW algorithm on a CTC data set of 39 patients with 39 polyps (19 training and 20 test cases) to verify the effectiveness of the proposed COW registration method. Experimental results on the test set show that the COW method significantly reduces the average estimation error in a polyp location between supine and prone scans by 67.6%, from 46.27±52.97 mm to 14.98±11.41 mm, compared to the normalized distance along the colon centerline algorithm (p<0.01). Conclusions: The proposed COW algorithm is more accurate for the colon centerline registration compared to the normalized distance along the colon centerline method and the dynamic time warping method. Comparison results showed that the feature combination of z-coordinate and curvature achieved lowest registration error compared to the other feature combinations used by COW. The proposed method is tolerant to centerline errors because anatomical landmarks help prevent the propagation of errors across the entire colon centerline.
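
    For contrast with the dynamic time warping baseline mentioned above, the classic dynamic-programming alignment is easy to state in code; COW additionally anchors the warp at the anatomical landmarks and optimizes correlation per segment. The centerline feature profiles below are toy assumptions.

    ```python
    # Minimal dynamic time warping between two centerline feature sequences
    # (e.g., z-coordinate sampled along each colon centerline).
    import numpy as np

    def dtw_distance(a, b):
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    supine = np.sin(np.linspace(0, 3, 100))        # toy feature profiles
    prone = np.sin(np.linspace(0, 3, 120) + 0.1)
    print("DTW alignment cost:", dtw_distance(supine, prone))
    ```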

  4. The 3 faces of clinical reasoning: Epistemological explorations of disparate error reduction strategies.

    PubMed

    Monteiro, Sandra; Norman, Geoff; Sherbino, Jonathan

    2018-06-01

    There is general consensus that clinical reasoning involves 2 stages: a rapid stage where 1 or more diagnostic hypotheses are advanced and a slower stage where these hypotheses are tested or confirmed. The rapid hypothesis generation stage is considered inaccessible for analysis or observation. Consequently, recent research on clinical reasoning has focused specifically on improving the accuracy of the slower, hypothesis confirmation stage. Three perspectives have developed in this line of research, and each proposes different error reduction strategies for clinical reasoning. This paper considers these 3 perspectives and examines the underlying assumptions. Additionally, this paper reviews the evidence, or lack thereof, behind each class of error reduction strategies. The first perspective takes an epidemiological stance, appealing to the benefits of incorporating population data and evidence-based medicine in everyday clinical reasoning. The second builds on the heuristic and bias research programme, appealing to a special class of dual-process reasoning models that theorizes a rapid, error-prone cognitive process for problem solving alongside a slower, more logical cognitive process capable of correcting those errors. Finally, the third perspective borrows from an exemplar model of categorization that explicitly relates clinical knowledge and experience to diagnostic accuracy. © 2018 John Wiley & Sons, Ltd.

  5. System-on-Chip Data Processing and Data Handling Spaceflight Electronics

    NASA Technical Reports Server (NTRS)

    Kleyner, I.; Katz, R.; Tiggeler, H.

    1999-01-01

    This paper presents a methodology and a tool set that implement automated generation of moderate-size blocks of customized intellectual property (IP), thus effectively reusing prior work and minimizing the labor-intensive, error-prone parts of the design process. Customization of components allows for optimization for smaller area and lower power consumption, which is an important factor given the limitations of resources available in radiation-hardened devices. The effects of variations in HDL coding style on the efficiency of synthesized code for various commercial synthesis tools are also discussed.

  6. A comparative study of set up variations and bowel volumes in supine versus prone positions of patients treated with external beam radiation for carcinoma rectum.

    PubMed

    Rajeev, K R; Menon, Smrithy S; Beena, K; Holla, Raghavendra; Kumar, R Rajaneesh; Dinesh, M

    2014-01-01

    A prospective study was undertaken to evaluate the influence of patient positioning on set up variations, to determine the planning target volume (PTV) margins, and to evaluate the clinically relevant volume of small bowel (SB) within the irradiated volume. Between December 2011 and April 2012, a computed tomography (CT) scan was done either in the supine position or in the prone position using a belly board (BB) for 20 consecutive patients. All the patients had histologically proven rectal cancer and received either post- or pre-operative pelvic irradiation. Using a three-dimensional planning system, the dose-volume histogram for SB was defined in each axial CT slice. The total dose was 46-50 Gy (2 Gy/fraction), delivered using the 4-field box technique. The set up variation of the study group was assessed from the data received from the electronic portal imaging device in the linear accelerator. The shifts along the X, Y, and Z directions were noted. Both systematic and random errors were calculated, and using both these values the PTV margin was calculated. The systematic errors of patients treated in the supine position were 0.87 mm (X), 0.66 mm (Y), and 1.6 mm (Z), and in the prone position were 1.3 mm (X), 0.59 mm (Y), and 1.17 mm (Z). The random errors of patients treated in the supine position were 1.81 mm (X), 1.73 mm (Y), and 1.83 mm (Z), and in the prone position were 2.02 mm (X), 1.21 mm (Y), and 3.05 mm (Z). The calculated PTV margins in the supine position were 3.45 mm (X), 2.87 mm (Y), and 5.31 mm (Z), and in the prone position were 4.91 mm (X), 2.32 mm (Y), and 5.08 mm (Z). The mean volume of the peritoneal cavity was 648.65 cm³ in the prone position and 1197.37 cm³ in the supine position. The prone position using the BB device was more effective in reducing irradiated SB volume in rectal cancer patients. There were no significant variations in the daily set up for patients treated in either the supine or the prone position.
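
    The reported margins are consistent with the widely used van Herk recipe, margin ≈ 2.5Σ + 0.7σ, where Σ and σ are the systematic and random errors; for example, 2.5 × 0.87 + 0.7 × 1.81 ≈ 3.44 mm against the reported 3.45 mm supine X margin. Attributing this formula is an inference from the numbers, not something stated in the abstract, and the reported prone X margin (4.91 mm) departs slightly, suggesting rounding in the inputs.

    ```python
    # Reproduce the PTV margins with the van Herk recipe 2.5*Sigma + 0.7*sigma.
    # The formula attribution is an inference from the reported numbers.
    systematic = {"supine": (0.87, 0.66, 1.60), "prone": (1.30, 0.59, 1.17)}
    random_sd  = {"supine": (1.81, 1.73, 1.83), "prone": (2.02, 1.21, 3.05)}

    for position in ("supine", "prone"):
        margins = [2.5 * S + 0.7 * s
                   for S, s in zip(systematic[position], random_sd[position])]
        # supine: ~3.44, 2.86, 5.28 mm vs reported 3.45, 2.87, 5.31 mm
        print(position, [f"{m:.2f} mm" for m in margins])
    ```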

  7. The preliminary SOL (Sizing and Optimization Language) reference manual

    NASA Technical Reports Server (NTRS)

    Lucas, Stephen H.; Scotti, Stephen J.

    1989-01-01

    The Sizing and Optimization Language, SOL, a high-level special-purpose computer language, has been developed to expedite application of numerical optimization to design problems and to make the process less error-prone. This document is a reference manual for those wishing to write SOL programs. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler and runtime library routines. An overview of SOL appears in NASA TM 100565.

  8. Assessment of Automatically Exported Clinical Data from a Hospital Information System for Clinical Research in Multiple Myeloma.

    PubMed

    Torres, Viviana; Cerda, Mauricio; Knaup, Petra; Löpprich, Martin

    2016-01-01

    An important part of the electronic information available in a Hospital Information System (HIS) has the potential to be automatically exported to Electronic Data Capture (EDC) platforms to improve clinical research. This automation has the advantage of reducing manual data transcription, a time-consuming and error-prone process. However, quantitative evaluations of the process of exporting data from a HIS to an EDC system have not been reported extensively, in particular in comparison with manual transcription. In this work, an assessment of the quality of an automatic export process, focused on laboratory data from a HIS, is presented. The quality of the laboratory data was assessed for two types of processes: (1) a manual process of data transcription, and (2) an automatic process of data transference. The automatic transference was implemented as an Extract, Transform and Load (ETL) process. A comparison was then carried out between the manual and automatic data collection methods. The criteria used to measure data quality were correctness and completeness. The manual process had a general error rate of 2.6% to 7.1%, with the lowest error rate obtained when data fields without a clear definition were removed from the analysis (p < 10^-3). For the automatic process, the general error rate was 1.9% to 12.1%, with the lowest error rate obtained when excluding information missing in the HIS but transcribed to the EDC from other physical sources. The automatic ETL process can be used to collect laboratory data for clinical research if data in the HIS, as well as physical documentation not included in the HIS, are identified beforehand and follow a standardized data collection protocol.
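
    As a rough illustration of the two quality criteria, correctness and completeness, here is a sketch comparing transcribed fields against a reference export; the field names and the exact matching rule are invented for illustration, not taken from the study's protocol:

        def quality(reference: dict, transcribed: dict):
            """Correctness: share of transcribed fields matching the reference.
            Completeness: share of reference fields present in the transcription."""
            present = [k for k in reference if k in transcribed]
            correct = sum(1 for k in present if transcribed[k] == reference[k])
            correctness = correct / len(present) if present else 1.0
            completeness = len(present) / len(reference) if reference else 1.0
            return correctness, completeness

        ref = {"hemoglobin": 11.2, "creatinine": 0.9, "calcium": 2.4}   # hypothetical fields
        entered = {"hemoglobin": 11.2, "creatinine": 0.8}
        print(quality(ref, entered))  # (0.5, 0.666...): one mismatch, one field missing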

  9. Portable and Error-Free DNA-Based Data Storage.

    PubMed

    Yazdi, S M Hossein Tabatabaei; Gabrys, Ryan; Milenkovic, Olgica

    2017-07-10

    DNA-based data storage is an emerging nonvolatile memory technology of potentially unprecedented density, durability, and replication efficiency. The basic system implementation steps include synthesizing DNA strings that contain user information and subsequently retrieving them via high-throughput sequencing technologies. Existing architectures enable reading and writing but do not offer random-access and error-free data recovery from low-cost, portable devices, which is crucial for making the storage technology competitive with classical recorders. Here we show for the first time that a portable, random-access platform may be implemented in practice using nanopore sequencers. The novelty of our approach is to design an integrated processing pipeline that encodes data to avoid costly synthesis and sequencing errors, enables random access through addressing, and leverages efficient portable sequencing via new iterative alignment and deletion error-correcting codes. Our work represents the only known random access DNA-based data storage system that uses error-prone nanopore sequencers, while still producing error-free readouts with the highest reported information rate/density. As such, it represents a crucial step towards practical employment of DNA molecules as storage media.
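
    The paper's constrained and deletion-correcting codes are considerably more involved; the following toy sketch illustrates only the addressing idea, mapping two bits per base and prepending an invented fixed-length address prefix so that individual records can be located after sequencing:

        BASES = "ACGT"  # 2 bits per base

        def bits_to_dna(bits: str) -> str:
            assert len(bits) % 2 == 0
            return "".join(BASES[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2))

        def make_strand(address: int, payload_bits: str, addr_bases: int = 8) -> str:
            """Prepend a fixed-length address block so a record can be located
            (random access) among many synthesized strings."""
            addr_bits = format(address, "0{}b".format(2 * addr_bases))
            return bits_to_dna(addr_bits) + bits_to_dna(payload_bits)

        print(make_strand(5, "1100"))  # AAAAAACC (address 5) + TA (payload)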

  10. RADH, a gene of Saccharomyces cerevisiae encoding a putative DNA helicase involved in DNA repair. Characteristics of radH mutants and sequence of the gene.

    PubMed

    Aboussekhra, A; Chanet, R; Zgaga, Z; Cassier-Chauvat, C; Heude, M; Fabre, F

    1989-09-25

    A new type of radiation-sensitive mutant of S. cerevisiae is described. The recessive radH mutation sensitizes haploids to the lethal effect of UV radiation in the G1 but not in the G2 mitotic phase. Homozygous diploids are as sensitive as G1 haploids. UV-induced mutagenesis is depressed, while the induction of gene conversion is increased. The mutation is believed to channel the repair of lesions engaged in the mutagenic pathway into a recombination process, successful if the events involve sister chromatids but lethal if they involve homologous chromosomes. The sequence of the RADH gene reveals that it may code for a DNA helicase with an Mr of 134 kDa. All the consensus domains of known DNA helicases are present. Besides these consensus regions, strong homologies with the Rep and UvrD helicases of E. coli were found. The RadH putative helicase appears to belong to the set of proteins involved in the error-prone repair mechanism, at least for UV-induced lesions, and could act in coordination with the Rev3 error-prone DNA polymerase.

  11. One-step random mutagenesis by error-prone rolling circle amplification

    PubMed Central

    Fujii, Ryota; Kitaoka, Motomitsu; Hayashi, Kiyoshi

    2004-01-01

    In vitro random mutagenesis is a powerful tool for altering properties of enzymes. We describe here a novel random mutagenesis method using rolling circle amplification, named error-prone RCA. This method consists of only one DNA amplification step followed by transformation of the host strain, without treatment with any restriction enzymes or DNA ligases, and results in a randomly mutated plasmid library with 3–4 mutations per kilobase. Specific primers or special equipment, such as a thermal-cycler, are not required. This method permits rapid preparation of randomly mutated plasmid libraries, enabling random mutagenesis to become a more commonly used technique. PMID:15507684

  12. Producing good font attribute determination using error-prone information

    NASA Astrophysics Data System (ADS)

    Cooperman, Robert

    1997-04-01

    A method is presented to provide estimates of font attributes in an OCR system, using detectors of individual attributes that are error-prone. For an OCR system to preserve the appearance of a scanned document, it needs accurate detection of font attributes. However, OCR environments have noise and other sources of errors, tending to make font attribute detection unreliable. Certain assumptions about font use can greatly enhance accuracy. Attributes such as boldness and italics are more likely to change between neighboring words, while attributes such as serifness are less likely to change within the same paragraph. Furthermore, the document as a whole tends to have a limited number of sets of font attributes. These assumptions allow better use of context than the raw data alone, or than simpler methods that would oversmooth the data.
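
    A toy sketch of the contextual assumption for serifness (majority vote within a paragraph); the data layout and voting rule here are illustrative, not the paper's algorithm:

        from collections import Counter

        def smooth_serifness(word_flags, paragraph_ids):
            """Per-word serif detections are noisy; serifness rarely changes
            within a paragraph, so replace each flag by its paragraph's
            majority vote (a toy version of the contextual assumption)."""
            votes = {}
            for flag, pid in zip(word_flags, paragraph_ids):
                votes.setdefault(pid, Counter())[flag] += 1
            return [votes[pid].most_common(1)[0][0] for pid in paragraph_ids]

        flags = [True, True, False, True, False, False]   # raw detector output
        paras = [0, 0, 0, 0, 1, 1]
        print(smooth_serifness(flags, paras))  # [True]*4 + [False]*2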

  13. The assessment of science: the relative merits of post-publication review, the impact factor, and the number of citations.

    PubMed

    Eyre-Walker, Adam; Stoletzki, Nina

    2013-10-01

    The assessment of scientific publications is an integral part of the scientific process. Here we investigate three methods of assessing the merit of a scientific paper: subjective post-publication peer review, the number of citations gained by a paper, and the impact factor of the journal in which the article was published. We investigate these methods using two datasets in which subjective post-publication assessments of scientific publications have been made by experts. We find that there are moderate, but statistically significant, correlations between assessor scores, when two assessors have rated the same paper, and between assessor score and the number of citations a paper accrues. However, we show that assessor score depends strongly on the journal in which the paper is published, and that assessors tend to over-rate papers published in journals with high impact factors. If we control for this bias, we find that the correlation between assessor scores and between assessor score and the number of citations is weak, suggesting that scientists have little ability to judge either the intrinsic merit of a paper or its likely impact. We also show that the number of citations a paper receives is an extremely error-prone measure of scientific merit. Finally, we argue that the impact factor is likely to be a poor measure of merit, since it depends on subjective assessment. We conclude that the three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased, and expensive method by which to assess merit. We argue that the impact factor may be the most satisfactory of the methods we have considered, since it is a form of pre-publication review. However, we emphasise that it is likely to be a very error-prone measure of merit that is qualitative, not quantitative.

  14. The Assessment of Science: The Relative Merits of Post-Publication Review, the Impact Factor, and the Number of Citations

    PubMed Central

    Eyre-Walker, Adam; Stoletzki, Nina

    2013-01-01

    The assessment of scientific publications is an integral part of the scientific process. Here we investigate three methods of assessing the merit of a scientific paper: subjective post-publication peer review, the number of citations gained by a paper, and the impact factor of the journal in which the article was published. We investigate these methods using two datasets in which subjective post-publication assessments of scientific publications have been made by experts. We find that there are moderate, but statistically significant, correlations between assessor scores, when two assessors have rated the same paper, and between assessor score and the number of citations a paper accrues. However, we show that assessor score depends strongly on the journal in which the paper is published, and that assessors tend to over-rate papers published in journals with high impact factors. If we control for this bias, we find that the correlation between assessor scores and between assessor score and the number of citations is weak, suggesting that scientists have little ability to judge either the intrinsic merit of a paper or its likely impact. We also show that the number of citations a paper receives is an extremely error-prone measure of scientific merit. Finally, we argue that the impact factor is likely to be a poor measure of merit, since it depends on subjective assessment. We conclude that the three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased, and expensive method by which to assess merit. We argue that the impact factor may be the most satisfactory of the methods we have considered, since it is a form of pre-publication review. However, we emphasise that it is likely to be a very error-prone measure of merit that is qualitative, not quantitative. PMID:24115908

  15. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. © 2015 John Wiley & Sons Ltd.
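
    A minimal site-occupancy likelihood with false positives, in the spirit of the models the authors advocate: ψ is occupancy, p11 the true-positive and p10 the false-positive detection probability per PCR replicate. The data below are simulated placeholders, and this sketch is not the code supplied with the paper:

        from math import comb
        import numpy as np
        from scipy.optimize import minimize

        def binom_pmf(y, K, p):
            return comb(K, y) * p**y * (1 - p)**(K - y)

        def neg_log_lik(params, detections, K):
            """detections[i] = number of positive PCR replicates (out of K) at site i."""
            psi, p11, p10 = params
            ll = 0.0
            for y in detections:
                lik = psi * binom_pmf(y, K, p11) + (1 - psi) * binom_pmf(y, K, p10)
                ll += np.log(max(lik, 1e-300))
            return -ll

        y = [0, 0, 1, 3, 5, 6, 0, 1, 6, 4]          # toy data, K = 6 replicates per site
        fit = minimize(neg_log_lik, x0=[0.5, 0.7, 0.05], args=(y, 6),
                       bounds=[(0.01, 0.99)] * 3)
        print(fit.x)  # estimated (psi, p11, p10)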

  16. Computer Aided Software Engineering (CASE) Environment Issues.

    DTIC Science & Technology

    1987-06-01

    tasks tend to be error prone and slow when done by humans. These are excellent candidates for automation using a computer. (MacLennan, 1981, p. 512) ... CASE resources; * human resources, consisting of the people who use and facilitate utilization, in the case of manual resources, of the environment ... engineering process in a given environment ... the nature of manual and human resources. CASE resources should provide the software engineering team

  17. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  18. Target Uncertainty Mediates Sensorimotor Error Correction.

    PubMed

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.

  19. Virtual design and construction of plumbing systems

    NASA Astrophysics Data System (ADS)

    Filho, João Bosco P. Dantas; Angelim, Bruno Maciel; Guedes, Joana Pimentel; de Castro, Marcelo Augusto Farias; Neto, José de Paula Barros

    2016-12-01

    Traditionally, the design coordination process is carried out by overlaying and comparing 2D drawings made by different project participants. Detecting information errors from a composite drawing is especially challenging and error prone. This procedure usually leaves many design errors undetected until construction begins, which typically leads to rework. Correcting conflict issues that were not identified during the design and coordination phase reduces the overall productivity of everyone involved in the construction process. The identification of construction issues in the field generates Requests for Information (RFIs), which are one of the causes of delays. The application of Virtual Design and Construction (VDC) tools to the coordination process can bring significant value to architecture, structure, and mechanical, electrical, and plumbing (MEP) designs in terms of a reduced number of undetected errors and requests for information. This paper focuses on evaluating requests for information (RFIs) associated with the water/sanitary facilities of a BIM model. It is thus expected to improve water/sanitary facility designs, as well as to assist the virtual construction team in noticing and identifying design problems. This is an exploratory and descriptive research study using a qualitative methodology. The study classifies RFIs into six categories: correction, omission, validation of information, modification, divergence of information, and verification. The results demonstrate VDC's contribution to improving plumbing system designs. Recommendations are suggested to identify and avoid these RFI types in the plumbing system design process or during virtual construction.

  20. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  1. Toward a more sophisticated response representation in theories of medial frontal performance monitoring: The effects of motor similarity and motor asymmetries.

    PubMed

    Hochman, Eldad Yitzhak; Orr, Joseph M; Gehring, William J

    2014-02-01

    Cognitive control in the posterior medial frontal cortex (pMFC) is formulated in models that emphasize adaptive behavior driven by a computation evaluating the degree of difference between 2 conflicting responses. These functions are manifested by an event-related brain potential component coined the error-related negativity (ERN). We hypothesized that the ERN represents a regulative rather than evaluative pMFC process, exerted over the error motor representation, expediting the execution of a corrective response. We manipulated the motor representations of the error and the correct response to varying degrees. The ERN was greater when 1) the error response was more potent than when the correct response was more potent, 2) more errors were committed, 3) fewer and slower corrections were observed, and 4) the error response shared fewer motor features with the correct response. In their current forms, several prominent models of the pMFC cannot be reconciled with these findings. We suggest that a prepotent, unintended error is prone to reach the manual motor processor responsible for response execution before a nonpotent, intended correct response. In this case, the correct response is a correction and its execution must wait until the error is aborted. The ERN may reflect pMFC activity that aimed to suppress the error.

  2. Fast and fragile: A new look at the automaticity of negation processing.

    PubMed

    Deutsch, Roland; Kordts-Freudinger, Robert; Gawronski, Bertram; Strack, Fritz

    2009-01-01

    Numerous studies suggest that processing verbal materials containing negations slows down cognition and makes it more error-prone. This suggests that processing negations affords relatively nonautomatic processes. The present research studied the role of two automaticity features (processing speed and resource dependency) for negation processing. In three experiments, we tested the impact of verbal negations on affective priming effects in the Affect Misattribution Paradigm. Going beyond previous work, the results indicate that negations can be processed unintentionally and quickly (Experiments 1 and 2). In Experiment 3, negations failed to qualify affective priming effects when participants' working memory was taxed by memorizing an eight-digit number. In sum, the experiments suggest that negations can be processed unintentionally, very quickly, but that they rely on working-memory resources.

  3. Cocaine Dependence Treatment Data: Methods for Measurement Error Problems With Predictors Derived From Stationary Stochastic Processes

    PubMed Central

    Guan, Yongtao; Li, Yehua; Sinha, Rajita

    2011-01-01

    In a cocaine dependence treatment study, we use linear and nonlinear regression models to model posttreatment cocaine craving scores and first cocaine relapse time. A subset of the covariates are summary statistics derived from baseline daily cocaine use trajectories, such as baseline cocaine use frequency and average daily use amount. These summary statistics are subject to estimation error and can therefore cause biased estimators for the regression coefficients. Unlike classical measurement error problems, the error we encounter here is heteroscedastic with an unknown distribution, and there are no replicates for the error-prone variables or instrumental variables. We propose two robust methods to correct for the bias: a computationally efficient method-of-moments-based method for linear regression models and a subsampling extrapolation method that is generally applicable to both linear and nonlinear regression models. Simulations and an application to the cocaine dependence treatment data are used to illustrate the efficacy of the proposed methods. Asymptotic theory and variance estimation for the proposed subsampling extrapolation method and some additional simulation results are described in the online supplementary material. PMID:21984854

  4. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    PubMed

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

    Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement, but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement, but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall similar pattern of judgments to Experiments 1 through 3. However, both judgments of error-likelihood and time demand similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  5. Situating Student Errors: Linguistic-to-Algebra Translation Errors

    ERIC Educational Resources Information Center

    Adu-Gyamfi, Kwaku; Bossé, Michael J.; Chandler, Kayla

    2015-01-01

    While it is well recognized that students are prone to difficulties when performing linguistic-to-algebra translations, the nature of students' difficulties remain an issue of contention. Moreover, the literature indicates that these difficulties are not easily remediated by domain-specific instruction. Some have opined that this is the case…

  6. Errors of Inference in Structural Equation Modeling

    ERIC Educational Resources Information Center

    McCoach, D. Betsy; Black, Anne C.; O'Connell, Ann A.

    2007-01-01

    Although structural equation modeling (SEM) is one of the most comprehensive and flexible approaches to data analysis currently available, it is nonetheless prone to researcher misuse and misconceptions. This article offers a brief overview of the unique capabilities of SEM and discusses common sources of user error in drawing conclusions from…

  7. Exploring the relationship between boredom and sustained attention.

    PubMed

    Malkovsky, Ela; Merrifield, Colleen; Goldberg, Yael; Danckert, James

    2012-08-01

    Boredom is a common experience, prevalent in neurological and psychiatric populations, yet its cognitive characteristics remain poorly understood. We explored the relationship between boredom proneness, sustained attention and adult symptoms of attention deficit hyperactivity disorder (ADHD). The results showed that high boredom-prone individuals (HBP) performed poorly on measures of sustained attention and showed increased symptoms of ADHD and depression. The results also showed that HBP individuals can be characterised as either apathetic, in which the individual is unconcerned with his/her environment, or agitated, in which the individual is motivated to engage in meaningful activities, although attempts to do so fail to satisfy. Apathetic boredom proneness was associated with attention lapses, whereas agitated boredom proneness was associated with decreased sensitivity to errors of sustained attention, and increased symptoms of adult ADHD. Our results suggest there is a complex relationship between attention and boredom proneness.

  8. The Concept of Accident Proneness: A Review

    PubMed Central

    Froggatt, Peter; Smiley, James A.

    1964-01-01

    The term accident proneness was coined by psychological research workers in 1926. Since then its concept—that certain individuals are always more likely than others to sustain accidents, even though exposed to equal risk—has been questioned but seldom seriously challenged. This article describes much of the work and theory on which this concept is based, details the difficulties encountered in obtaining valid information and the interpretative errors that can arise from the examination of imperfect data, and explains why accident proneness became so readily accepted as an explanation of the facts. A recent hypothesis of accident causation, namely that a person's accident liability may vary from time to time, is outlined, and the respective abilities of this and of accident proneness to accord with data from the more reliable literature are examined. The authors conclude that the hypothesis of individual variation in liability is more realistic and in better agreement with the data than is accident proneness. PMID:14106130

  9. ChromatoGate: A Tool for Detecting Base Mis-Calls in Multiple Sequence Alignments by Semi-Automatic Chromatogram Inspection

    PubMed Central

    Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros

    2013-01-01

    Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors. PMID:24688709
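
    A toy scan for polymorphic alignment columns against a user-defined frequency threshold; this is an illustrative reimplementation of the pre-filtering idea, not ChromatoGate's code:

        def polymorphic_sites(msa, threshold=0.1):
            """Return alignment columns where a minority character exceeds
            `threshold` frequency: candidate mis-calls to inspect."""
            flagged = []
            n = len(msa)
            for col in range(len(msa[0])):
                chars = [seq[col] for seq in msa]
                counts = {c: chars.count(c) for c in set(chars)}
                major = max(counts, key=counts.get)
                minority = (n - counts[major]) / n
                if minority >= threshold and minority > 0:
                    flagged.append(col)
            return flagged

        msa = ["ACGTAC", "ACGTAC", "ACGAAC", "ACGTAC"]
        print(polymorphic_sites(msa, threshold=0.2))  # [3]: one sequence disagrees at column 3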

  10. ChromatoGate: A Tool for Detecting Base Mis-Calls in Multiple Sequence Alignments by Semi-Automatic Chromatogram Inspection.

    PubMed

    Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros

    2013-01-01

    Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors.

  11. Registration of prone and supine CT colonography scans using correlation optimized warping and canonical correlation analysis

    PubMed Central

    Wang, Shijun; Yao, Jianhua; Liu, Jiamin; Petrick, Nicholas; Van Uitert, Robert L.; Periaswamy, Senthil; Summers, Ronald M.

    2009-01-01

    Purpose: In computed tomographic colonography (CTC), a patient will be scanned twice, once supine and once prone, to improve the sensitivity for polyp detection. To assist radiologists in CTC reading, in this paper we propose an automated method for colon registration from supine and prone CTC scans. Methods: We propose a new colon centerline registration method for prone and supine CTC scans using correlation optimized warping (COW) and canonical correlation analysis (CCA) based on the anatomical structure of the colon. Four anatomical salient points on the colon are first automatically distinguished. Then correlation optimized warping is applied to the segments defined by the anatomical landmarks to improve the global registration based on local correlation of segments. The COW method was modified by embedding canonical correlation analysis to allow multiple features along the colon centerline to be used in our implementation. Results: We tested the COW algorithm on a CTC data set of 39 patients with 39 polyps (19 training and 20 test cases) to verify the effectiveness of the proposed COW registration method. Experimental results on the test set show that the COW method significantly reduces the average estimation error in polyp location between supine and prone scans by 67.6%, from 46.27±52.97 mm to 14.98±11.41 mm, compared to the normalized distance along the colon centerline algorithm (p<0.01). Conclusions: The proposed COW algorithm is more accurate for the colon centerline registration compared to the normalized distance along the colon centerline method and the dynamic time warping method. Comparison results showed that the feature combination of z-coordinate and curvature achieved the lowest registration error compared to the other feature combinations used by COW. The proposed method is tolerant to centerline errors because anatomical landmarks help prevent the propagation of errors across the entire colon centerline. PMID:20095272
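
    The dynamic time warping baseline the authors compare against is straightforward to sketch for two 1-D centerline feature sequences; this is an illustrative implementation, not the study's code:

        import numpy as np

        def dtw(a, b):
            """Classic dynamic time warping cost between two 1-D feature
            sequences (e.g. curvature sampled along two colon centerlines)."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        supine = np.sin(np.linspace(0, 3, 100))
        prone = np.sin(np.linspace(0.2, 3.2, 120))   # shifted/stretched version
        print(dtw(supine, prone))                     # small alignment cost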

  12. A Novel Way to Relate Ontology Classes

    PubMed Central

    Choksi, Ami T.; Jinwala, Devesh C.

    2015-01-01

    The existing ontologies in the semantic web typically have anonymous union and intersection classes. The anonymous classes are limited in scope and may not be part of the whole inference process. The tools, namely, the pellet, the jena, and the protégé, interpret collection classes as (a) equivalent/subclasses of a union class and (b) superclasses of an intersection class. As a result, there is a possibility that the tools will produce error-prone inference results for relations, namely, sub-, union, intersection, and equivalent relations, and those dependent on these relations, namely, complement. Verifying whether a class is the complement of another involves the use of sub- and equivalent relations. Motivated by the same, we (i) refine the test data set of the conference ontology by adding named, union, and intersection classes and (ii) propose a match algorithm to (a) calculate corrected subclass lists, (b) correctly relate intersection and union classes with their collection classes, and (c) match union, intersection, sub-, complement, and equivalent classes in a proper sequence, to avoid error-prone match results. We compare the results of our algorithms with those of a candidate reasoner, namely, the pellet reasoner. To the best of our knowledge, ours is a unique attempt at establishing a novel way to relate ontology classes. PMID:25984560

  13. The feasibility of manual parameter tuning for deformable breast MR image registration from a multi-objective optimization perspective.

    PubMed

    Pirpinia, Kleopatra; Bosman, Peter A N; Loo, Claudette E; Winter-Warnars, Gonneke; Janssen, Natasja N Y; Scholten, Astrid N; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja

    2017-06-23

    Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. Here, patient-specific, multi-objective weight optimization is needed, obtaining a mean TRE of 13.6 mm without guidance information reduced to 7.3 mm with guidance information, but also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.
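
    The weight-sampling experiment described above amounts to evaluating many linear scalarizations and keeping the nondominated outcomes. A generic sketch of that filtering step follows; the two-objective setup and random outcome values are placeholders, not the study's registration objectives:

        import numpy as np

        def pareto_front(points):
            """Keep points not dominated by any other (minimization in all objectives)."""
            pts = np.asarray(points)
            keep = []
            for i, p in enumerate(pts):
                dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
                if not dominated:
                    keep.append(i)
            return pts[keep]

        rng = np.random.default_rng(0)
        # Pretend each weight sample produced (similarity error, deformation magnitude):
        outcomes = rng.random((200, 2))
        print(len(pareto_front(outcomes)))  # only the nondominated trade-offs remain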

  14. The feasibility of manual parameter tuning for deformable breast MR image registration from a multi-objective optimization perspective

    NASA Astrophysics Data System (ADS)

    Pirpinia, Kleopatra; Bosman, Peter A. N.; Loo, Claudette E.; Winter-Warnars, Gonneke; Janssen, Natasja N. Y.; Scholten, Astrid N.; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja

    2017-07-01

    Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. Here, patient-specific, multi-objective weight optimization is needed, obtaining a mean TRE of 13.6 mm without guidance information reduced to 7.3 mm with guidance information, but also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.

  15. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography

    DTIC Science & Technology

    1980-03-01

    interpreting/smoothing data containing a significant percentage of gross errors, and thus is ideally suited for applications in automated image ... analysis where interpretation is based on the data provided by error-prone feature detectors. A major portion of the paper describes the application of
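
    RANSAC itself is simple to state; here is a minimal robust line-fitting sketch, with illustrative parameter choices:

        import numpy as np

        def ransac_line(x, y, n_iter=200, tol=0.1, seed=0):
            """Fit y = a*x + b robustly: sample 2 points, count inliers,
            keep the model with the largest consensus set."""
            rng = np.random.default_rng(seed)
            best, best_inliers = None, 0
            for _ in range(n_iter):
                i, j = rng.choice(len(x), size=2, replace=False)
                if x[i] == x[j]:
                    continue
                a = (y[j] - y[i]) / (x[j] - x[i])
                b = y[i] - a * x[i]
                inliers = np.sum(np.abs(y - (a * x + b)) < tol)
                if inliers > best_inliers:
                    best, best_inliers = (a, b), inliers
            return best

        x = np.linspace(0, 1, 50)
        y = 2 * x + 1 + 0.01 * np.random.default_rng(1).standard_normal(50)
        y[::10] += 5                      # gross errors ("outliers")
        print(ransac_line(x, y))          # close to (2, 1) despite the outliers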

  16. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    NASA Astrophysics Data System (ADS)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition; however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. We show that a careful selection of the camera-to-object and baseline distances reduces errors in occluded areas and that realistic ground truths help to quantify those errors.

  17. Learning class descriptions from a data base of spectral reflectance with multiple view angles

    NASA Technical Reports Server (NTRS)

    Kimes, Daniel S.; Harrison, Patrick R.; Harrison, P. A.

    1992-01-01

    A learning program has been developed which combines 'learning by example' with the generate-and-test paradigm to furnish a robust learning environment capable of handling error-prone data. The program is shown to be capable of learning class descriptions from positive and negative training examples of spectral and directional reflectance data taken from soil and vegetation. The program, which used AI techniques to automate very tedious processes, found the sequence of relationships that contained the most important information for distinguishing the classes.

  18. ASSIST: User's manual

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1986-01-01

    Semi-Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. The ASSIST program allows the user to describe the semi-Markov model in a high-level language. Instead of specifying the individual states of the model, the user specifies the rules governing the behavior of the system and these are used by ASSIST to automatically generate the model. The ASSIST program is described and illustrated by examples.
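
    The idea of generating a state space from rules rather than enumerating it by hand can be sketched generically; this is illustrative Python, not ASSIST's actual input language, and the toy failure/repair rules are invented:

        from collections import deque

        def generate_states(initial, rules):
            """Breadth-first expansion: each rule maps a state to successor
            states, so the full model is built without the user ever
            listing states explicitly."""
            states, transitions = {initial}, []
            queue = deque([initial])
            while queue:
                s = queue.popleft()
                for rule in rules:
                    for t in rule(s):
                        transitions.append((s, t))
                        if t not in states:
                            states.add(t)
                            queue.append(t)
            return states, transitions

        # Toy model: (working units, failed units) with 3 units total.
        fail = lambda s: [(s[0] - 1, s[1] + 1)] if s[0] > 0 else []
        repair = lambda s: [(s[0] + 1, s[1] - 1)] if s[1] > 0 else []
        states, trans = generate_states((3, 0), [fail, repair])
        print(len(states), len(trans))  # 4 states, 6 transitions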

  19. A false positive food chain error associated with a generic predator gut content ELISA

    USDA-ARS?s Scientific Manuscript database

    Conventional prey-specific gut content ELISA and PCR assays are useful for identifying predators of insect pests in nature. However, these assays are prone to yielding certain types of food chain errors. For instance, it is possible that prey remains can pass through the food chain as the result of ...

  20. Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.

    PubMed

    Olson, Andrew P J; Graber, Mark L; Singh, Hardeep

    2018-01-29

    Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested but none widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.

  1. Externalizing psychopathology and gain-loss feedback in a simulated gambling task: dissociable components of brain response revealed by time-frequency analysis.

    PubMed

    Bernat, Edward M; Nelson, Lindsay D; Steele, Vaughn R; Gehring, William J; Patrick, Christopher J

    2011-05-01

    Externalizing is a broad construct that reflects propensity toward a variety of impulse control problems, including antisocial personality disorder and substance use disorders. Two event-related potential responses known to be reduced among individuals high in externalizing proneness are the P300, which reflects postperceptual processing of a stimulus, and the error-related negativity (ERN), which indexes performance monitoring based on endogenous representations. In the current study, the authors used a simulated gambling task to examine the relation between externalizing proneness and the feedback-related negativity (FRN), a brain response that indexes performance monitoring related to exogenous cues, which is thought to be highly related to the ERN. Time-frequency (TF) analysis was used to disentangle the FRN from the accompanying P300 response to feedback cues by parsing the overall feedback-locked potential into distinctive theta (4-7 Hz) and delta (<3 Hz) TF components. Whereas delta-P300 amplitude was reduced among individuals high in externalizing proneness, theta-FRN response was unrelated to externalizing. These findings suggest that in contrast with previously reported deficits in endogenously based performance monitoring (as indexed by the ERN), individuals prone to externalizing problems show intact monitoring of exogenous cues (as indexed by the FRN). The results also contribute to a growing body of evidence indicating that the P300 is attenuated across a broad range of task conditions in high-externalizing individuals.

  2. Automated and unsupervised detection of malarial parasites in microscopic images.

    PubMed

    Purwar, Yashasvi; Shah, Sirish L; Clarke, Gwen; Almugairi, Areej; Muehlenbachs, Atis

    2011-12-13

    Malaria is a serious infectious disease. According to the World Health Organization, it is responsible for nearly one million deaths each year. There are various techniques to diagnose malaria, of which manual microscopy is considered to be the gold standard. However, due to the number of steps required in manual assessment, this diagnostic method is time consuming (leading to late diagnosis) and prone to human error (leading to erroneous diagnosis), even in experienced hands. The focus of this study is to develop a robust, unsupervised and sensitive malaria screening technique with low material cost and one that has an advantage over other techniques in that it minimizes human reliance and is, therefore, more consistent in applying diagnostic criteria. A method based on digital image processing of Giemsa-stained thin smear images is developed to facilitate the diagnostic process. The diagnosis procedure is divided into two parts: enumeration and identification. The image-based method presented here is designed to automate the process of enumeration and identification, with the main advantage being its ability to carry out the diagnosis in an unsupervised manner while retaining high sensitivity, thus reducing cases of false negatives. The image-based method was tested on more than 500 images from two independent laboratories. The aim is to distinguish between positive and negative cases of malaria using thin smear blood slide images. Due to the unsupervised nature of the method, it requires minimal human intervention, thus speeding up the whole process of diagnosis. Overall sensitivity to capture cases of malaria is 100%, and specificity ranges from 50% to 88% for all species of malaria parasites. An image-based screening method will speed up the whole process of diagnosis and is advantageous over laboratory procedures that are prone to errors and where pathological expertise is minimal. Further, this method provides a consistent and robust way of generating parasite clearance curves.
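
    A toy version of the enumeration step only (threshold, then count connected stained regions); the threshold and size values are placeholders, and the published pipeline adds segmentation, color and identification stages beyond this:

        import numpy as np
        from scipy import ndimage

        def count_stained_objects(gray, threshold=0.5, min_pixels=5):
            """Threshold a normalized grayscale smear image and count
            connected dark regions (candidate cells/parasites)."""
            mask = gray < threshold                   # stained regions are dark
            labels, n = ndimage.label(mask)
            sizes = ndimage.sum(mask, labels, range(1, n + 1))
            return int(np.sum(sizes >= min_pixels))

        img = np.ones((64, 64))
        img[10:16, 10:16] = 0.2                       # two synthetic "cells"
        img[40:45, 30:35] = 0.1
        print(count_stained_objects(img))             # 2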

  3. Joint estimation over multiple individuals improves behavioural state inference from animal movement data.

    PubMed

    Jonsen, Ian

    2016-02-08

    State-space models provide a powerful way to scale up inference of movement behaviours from individuals to populations when the inference is made across multiple individuals. Here, I show how a joint estimation approach that assumes individuals share identical movement parameters can lead to improved inference of behavioural states associated with different movement processes. I use simulated movement paths with known behavioural states to compare estimation error between nonhierarchical and joint estimation formulations of an otherwise identical state-space model. Behavioural state estimation error was strongly affected by the degree of similarity between movement patterns characterising the behavioural states, with less error when movements were strongly dissimilar between states. The joint estimation model improved behavioural state estimation relative to the nonhierarchical model for simulated data with heavy-tailed Argos location errors. When applied to Argos telemetry datasets from 10 Weddell seals, the nonhierarchical model estimated highly uncertain behavioural state switching probabilities for most individuals whereas the joint estimation model yielded substantially less uncertainty. The joint estimation model better resolved the behavioural state sequences across all seals. Hierarchical or joint estimation models should be the preferred choice for estimating behavioural states from animal movement data, especially when location data are error-prone.

  4. De novo centriole formation in human cells is error-prone and does not require SAS-6 self-assembly.

    PubMed

    Wang, Won-Jing; Acehan, Devrim; Kao, Chien-Han; Jane, Wann-Neng; Uryu, Kunihiro; Tsou, Meng-Fu Bryan

    2015-11-26

    Vertebrate centrioles normally propagate through duplication, but in the absence of preexisting centrioles, de novo synthesis can occur. Consistently, centriole formation is thought to strictly rely on self-assembly, involving self-oligomerization of the centriolar protein SAS-6. Here, through reconstitution of de novo synthesis in human cells, we surprisingly found that normal looking centrioles capable of duplication and ciliation can arise in the absence of SAS-6 self-oligomerization. Moreover, whereas canonically duplicated centrioles always form correctly, de novo centrioles are prone to structural errors, even in the presence of SAS-6 self-oligomerization. These results indicate that centriole biogenesis does not strictly depend on SAS-6 self-assembly, and may require preexisting centrioles to ensure structural accuracy, fundamentally deviating from the current paradigm.

  5. Comparing errors in Medicaid reporting across surveys: evidence to date.

    PubMed

    Call, Kathleen T; Davern, Michael E; Klerman, Jacob A; Lynch, Victoria

    2013-04-01

    Objective: To synthesize evidence on the accuracy of Medicaid reporting across state and federal surveys. Data sources: All available validation studies. Study design: Compare results from existing research to understand variation in reporting across surveys. Data collection: Synthesize all available studies validating survey reports of Medicaid coverage. Principal findings: Across all surveys, reporting some type of insurance coverage is better than reporting Medicaid specifically. Therefore, estimates of uninsurance are less biased than estimates of specific sources of coverage. The CPS stands out as being particularly inaccurate. Conclusions: Measuring health insurance coverage is prone to some level of error, yet survey overstatements of uninsurance are modest in most surveys. Accounting for all forms of bias is complex. Researchers should consider adjusting estimates of Medicaid and uninsurance in surveys prone to high levels of misreporting. © Health Research and Educational Trust.

  6. Metrics to quantify the importance of mixing state for CCN activity

    DOE PAGES

    Ching, Joseph; Fast, Jerome; West, Matthew; ...

    2017-06-21

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify the error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75%. For more externally mixed populations (χ below 20%), the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40% to about 150%, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.
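
    The mixing state index can be computed directly from per-particle species masses following the definitions in Riemer and West (2013): χ = (D_α − 1)/(D_γ − 1), where D_α is the mass-weighted average per-particle diversity and D_γ the bulk population diversity. A compact sketch (the two-particle examples are illustrative):

        import numpy as np

        def mixing_state_index(masses):
            """masses[i, a]: mass of species a in particle i."""
            masses = np.asarray(masses, dtype=float)
            mu_i = masses.sum(axis=1)                    # per-particle mass
            p_ia = masses / mu_i[:, None]                # within-particle fractions
            with np.errstate(divide="ignore", invalid="ignore"):
                H_i = -np.sum(np.where(p_ia > 0, p_ia * np.log(p_ia), 0.0), axis=1)
                p_a = masses.sum(axis=0) / masses.sum()  # bulk fractions
                H_gamma = -np.sum(np.where(p_a > 0, p_a * np.log(p_a), 0.0))
            H_alpha = np.sum((mu_i / mu_i.sum()) * H_i)  # mass-weighted average entropy
            return (np.exp(H_alpha) - 1) / (np.exp(H_gamma) - 1)

        external = [[1.0, 0.0], [0.0, 1.0]]   # fully external mixture: chi = 0
        internal = [[0.5, 0.5], [0.5, 0.5]]   # fully internal mixture: chi = 1
        print(mixing_state_index(external), mixing_state_index(internal))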

  7. The Sizing and Optimization Language, (SOL): Computer language for design problems

    NASA Technical Reports Server (NTRS)

    Lucas, Stephen H.; Scotti, Stephen J.

    1988-01-01

    The Sizing and Optimization Language (SOL), a new high-level, special-purpose computer language, was developed to expedite the application of numerical optimization to design problems and to make the process less error prone. SOL utilizes the ADS optimization software and provides a clear, concise syntax for describing an optimization problem, the OPTIMIZE description, which closely parallels the mathematical description of the problem. SOL offers language statements which can be used to model a design mathematically, with subroutines or code logic, and with existing FORTRAN routines. In addition, SOL provides error checking and clear output of the optimization results. Because of these language features, SOL is best suited to model and optimize a design concept when the model consists of mathematical expressions written in SOL. For such cases, SOL's unique syntax and error checking can be fully utilized. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler, runtime library routines, and a SOL reference manual.

  8. Diagnosing Crime and Diagnosing Disease: Bias Reduction Strategies in the Forensic and Clinical Sciences.

    PubMed

    Lockhart, Joseph J; Satya-Murti, Saty

    2017-11-01

    Cognitive effort is an essential part of both forensic and clinical decision-making. Errors occur in both fields because the cognitive process is complex and prone to bias. We performed a selective review of full-text English language literature on cognitive bias leading to diagnostic and forensic errors. Earlier work (1970-2000) concentrated on classifying and raising bias awareness. Recently (2000-2016), the emphasis has shifted toward strategies for "debiasing." While the forensic sciences have focused on the control of misleading contextual cues, clinical debiasing efforts have relied on checklists and hypothetical scenarios. No single generally applicable and effective bias reduction strategy has emerged so far. Generalized attempts at bias elimination have not been particularly successful. It is time to shift focus to the study of errors within specific domains, and how to best communicate uncertainty in order to improve decision making on the part of both the expert and the trier-of-fact. © 2017 American Academy of Forensic Sciences.

  9. Metrics to quantify the importance of mixing state for CCN activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ching, Joseph; Fast, Jerome; West, Matthew

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75 %. For more externally mixed populations (χ below 20 %) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40 % to about 150 %, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.

  10. QAIT: a quality assurance issue tracking tool to facilitate the improvement of clinical data quality.

    PubMed

    Zhang, Yonghong; Sun, Weihong; Gutchell, Emily M; Kvecher, Leonid; Kohr, Joni; Bekhash, Anthony; Shriver, Craig D; Liebman, Michael N; Mural, Richard J; Hu, Hai

    2013-01-01

    In clinical and translational research as well as clinical trial projects, clinical data collection is prone to errors such as missing data, and misinterpretation or inconsistency of the data. A good quality assurance (QA) program can resolve many such errors though this requires efficient communications between the QA staff and data collectors. Managing such communications is critical to resolving QA problems but imposes a major challenge for a project involving multiple clinical and data processing sites. We have developed a QA issue tracking (QAIT) system to support clinical data QA in the Clinical Breast Care Project (CBCP). This web-based application provides centralized management of QA issues with role-based access privileges. It has greatly facilitated the QA process and enhanced the overall quality of the CBCP clinical data. As a stand-alone system, QAIT can supplement any other clinical data management systems and can be adapted to support other projects. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. VoxelMages: a general-purpose graphical interface for designing geometries and processing DICOM images for PENELOPE.

    PubMed

    Giménez-Alventosa, V; Ballester, F; Vijande, J

    2016-12-01

    The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows importing DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Last Year Your Answer Was… : The Impact of Dependent Interviewing Wording and Survey Factors on Reporting of Change

    ERIC Educational Resources Information Center

    Al Baghal, Tarek

    2017-01-01

    Prior studies suggest memories are potentially error prone. Proactive dependent interviewing (PDI) is a possible method to reduce errors in reports of change in longitudinal studies, reminding respondents of previous answers while asking if there has been any change since the last survey. However, little research has been conducted on the impact…

  13. The Error Prone Model and the Basic Grants Validation Selection System. Draft Final Report.

    ERIC Educational Resources Information Center

    System Development Corp., Falls Church, VA.

    An evaluation of existing and proposed mechanisms to ensure data accuracy for the Pell Grant program is reported, and recommendations for efficient detection of fraud and error in the program are offered. One study objective was to examine the existing system of pre-established criteria (PEC), which are validation criteria that select students on…

  14. Multistrip western blotting to increase quantitative data output.

    PubMed

    Kiyatkin, Anatoly; Aksamitiene, Edita

    2009-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per single blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same loading of the sample, and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and therefore is beneficial to apply in biomedical diagnostics, systems biology, and cell signaling research.

  15. Keeping mammalian mutation load in check: regulation of the activity of error-prone DNA polymerases by p53 and p21.

    PubMed

    Livneh, Zvi

    2006-09-01

    To overcome DNA lesions that block replication the cell employs translesion DNA synthesis (TLS) polymerases, a group of low fidelity DNA polymerases that have the capacity to bypass a wide range of DNA lesions. This TLS process is also termed error-prone repair, due to its inherent mutagenic nature. We have recently shown that the tumor suppressor p53 and the cell cycle inhibitor p21 are global regulators of TLS. When these proteins are missing or nonfunctional, TLS gets out of control: its extent increases to very high levels, and its fidelity decreases, causing an overall increase in mutation load. This may be explained by the loss of selectivity in the bypass of specific DNA lesions by their cognate specialized polymerases, such that lesion bypass continues to a maximum, regardless of the price paid in increased mutations. The p53 and p21 proteins are also required for efficient UV light-induced monoubiquitination of PCNA, which is consistent with a model in which this modification of PCNA is necessary but not sufficient for the normal activity of TLS. This regulation suggests that TLS evolved in mammals as a system that balances gain in survival with a tolerable mutational cost, and that disturbing this balance causes a potentially harmful increase in mutations, which might play a role in carcinogenesis.

  16. Furfural-tolerant Zymomonas mobilis derived from error-prone PCR-based whole genome shuffling and their tolerant mechanism.

    PubMed

    Huang, Suzhen; Xue, Tingli; Wang, Zhiquan; Ma, Yuanyuan; He, Xueting; Hong, Jiefang; Zou, Shaolan; Song, Hao; Zhang, Minhua

    2018-04-01

    A furfural-tolerant strain is essential for the fermentative production of biofuels or chemicals from lignocellulosic biomass. In this study, Zymomonas mobilis CP4 was for the first time subjected to error-prone PCR-based whole genome shuffling, yielding the mutants F211 and F27, which could tolerate 3 g/L furfural. Under various furfural stress conditions, the mutant F211 grew rapidly once the furfural concentration fell to 1 g/L. The two mutants also showed higher tolerance to high glucose concentrations than the control strain CP4. Genome resequencing revealed that F211 and F27 carried 12 and 13 single-nucleotide polymorphisms, respectively. Activity assays demonstrated that NADH-dependent furfural reductase activity increased under furfural stress in both F211 and CP4, peaking earlier in the mutant than in the control, and furfural levels in F211 cultures also declined more rapidly. These results indicate that the increased furfural tolerance of the mutants may result from enhanced NADH-dependent furfural reductase activity during early log phase, which could accelerate furfural detoxification. In all, we obtained Z. mobilis mutants with enhanced tolerance to furfural and to high glucose concentrations, and provided valuable clues to the mechanism of furfural tolerance and to strain development.

  17. Exploring the impact of forcing error characteristics on physically based snow simulations within a global sensitivity analysis framework

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Lundquist, J. D.; Clark, M. P.

    2015-07-01

    Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology and which error characteristics are most important. Here we employ global sensitivity analysis to explore how (1) different error types (i.e., bias, random errors), (2) different error probability distributions, and (3) different error magnitudes influence physically based simulations of four snow variables (snow water equivalent, ablation rates, snow disappearance, and sublimation). We use the Sobol' global sensitivity analysis, which is typically used for model parameters but adapted here for testing model sensitivity to coexisting errors in all forcings. We quantify the Utah Energy Balance model's sensitivity to forcing errors with 1 840 000 Monte Carlo simulations across four sites and five different scenarios. Model outputs were (1) consistently more sensitive to forcing biases than random errors, (2) generally less sensitive to forcing error distributions, and (3) critically sensitive to different forcings depending on the relative magnitude of errors. For typical error magnitudes found in areas with drifting snow, precipitation bias was the most important factor for snow water equivalent, ablation rates, and snow disappearance timing, but other forcings had a more dominant impact when precipitation uncertainty was due solely to gauge undercatch. Additionally, the relative importance of forcing errors depended on the model output of interest. Sensitivity analysis can reveal which forcing error characteristics matter most for hydrologic modeling.
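
    The study's Sobol' machinery can be illustrated with the SALib package (the sample/analyze API shown follows SALib's documentation; the module layout has shifted across versions). The toy model below is invented purely for illustration and stands in for the Utah Energy Balance model.

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        # Hypothetical forcing-error factors and ranges, for illustration only.
        problem = {
            "num_vars": 3,
            "names": ["precip_bias", "temp_bias", "sw_random_error"],
            "bounds": [[-0.5, 0.5],    # fractional precipitation bias
                       [-2.0, 2.0],    # air-temperature bias (K)
                       [0.0, 50.0]],   # shortwave random-error magnitude (W m^-2)
        }

        def toy_swe_error(x):
            """Invented stand-in for a snow model's SWE error response."""
            pb, tb, sw = x
            rng = np.random.default_rng(42)
            return 100.0 * pb - 15.0 * tb + 0.01 * sw * rng.standard_normal()

        X = saltelli.sample(problem, 1024)            # Sobol' design matrix
        Y = np.array([toy_swe_error(row) for row in X])
        Si = sobol.analyze(problem, Y)
        print(dict(zip(problem["names"], Si["S1"])))  # first-order indices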

  18. Comparing Errors in Medicaid Reporting across Surveys: Evidence to Date

    PubMed Central

    Call, Kathleen T; Davern, Michael E; Klerman, Jacob A; Lynch, Victoria

    2013-01-01

    Objective To synthesize evidence on the accuracy of Medicaid reporting across state and federal surveys. Data Sources All available validation studies. Study Design Compare results from existing research to understand variation in reporting across surveys. Data Collection Methods Synthesize all available studies validating survey reports of Medicaid coverage. Principal Findings Across all surveys, reporting some type of insurance coverage is better than reporting Medicaid specifically. Therefore, estimates of uninsurance are less biased than estimates of specific sources of coverage. The CPS stands out as being particularly inaccurate. Conclusions Measuring health insurance coverage is prone to some level of error, yet survey overstatements of uninsurance are modest in most surveys. Accounting for all forms of bias is complex. Researchers should consider adjusting estimates of Medicaid and uninsurance in surveys prone to high levels of misreporting. PMID:22816493

  19. Natural Language Processing Methods and Systems for Biomedical Ontology Learning

    PubMed Central

    Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.

    2010-01-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054

  20. Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.

    PubMed

    Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly

    2015-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per single blotting cycle up to tenfold; allows concurrent measurement of up to nine different total and/or posttranslationally modified proteins from the same sample loading; and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data and therefore is advantageous to apply in biomedical diagnostics, systems biology, and cell signaling research.

  1. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
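
    The flavor of such constraint checking is easy to sketch. The following minimal Python fragment, with hypothetical commands and rules rather than CCC's actual rule base, checks a command sequence against minimum-separation timing constraints.

        # Hypothetical commands: (name, execution time in seconds).
        commands = [("HEATER_ON", 0.0), ("CAMERA_ON", 1.0), ("CAMERA_EXPOSE", 1.5)]

        # Hypothetical rules: (earlier command, later command, minimum gap in s).
        rules = [("HEATER_ON", "CAMERA_ON", 5.0),
                 ("CAMERA_ON", "CAMERA_EXPOSE", 2.0)]

        times = dict(commands)
        for first, second, min_gap in rules:
            if first in times and second in times:
                gap = times[second] - times[first]
                if gap < min_gap:
                    print(f"violation: {second} follows {first} by {gap}s, "
                          f"needs {min_gap}s")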

  2. Identification of Patient Safety Risks Associated with Electronic Health Records: A Software Quality Perspective.

    PubMed

    Virginio, Luiz A; Ricarte, Ivan Luiz Marques

    2015-01-01

    Although Electronic Health Records (EHR) can offer benefits to the health care process, there is a growing body of evidence that these systems can also incur risks to patient safety when developed or used improperly. This work is a literature review to identify these risks from a software quality perspective. Therefore, the risks were classified based on the ISO/IEC 25010 software quality model. The risks identified were related mainly to the characteristics of "functional suitability" (i.e., software bugs) and "usability" (i.e., interface prone to user error). This work elucidates the fact that EHR quality problems can adversely affect patient safety, resulting in errors such as incorrect patient identification, incorrect calculation of medication dosages, and lack of access to patient data. Therefore, the risks presented here provide the basis for developers and EHR regulating bodies to pay attention to the quality aspects of these systems that can result in patient harm.

  3. A hydrostatic weighing method using total lung capacity and a small tank.

    PubMed Central

    Warner, J G; Yeater, R; Sherwood, L; Weber, K

    1986-01-01

    The purpose of this study was to establish the validity and reliability of a hydrostatic weighing method using total lung capacity (measuring vital capacity with a respirometer at the time of weighing), the prone position, and a small oblong tank. The validity of the method was established by comparing the TLC prone (tank) method against three hydrostatic weighing methods administered in a pool. The three methods included residual volume seated, TLC seated and TLC prone. Eighty male and female subjects were underwater weighed using each of the four methods. Validity coefficients for per cent body fat between the TLC prone (tank) method and the RV seated (pool), TLC seated (pool) and TLC prone (pool) methods were .98, .99 and .99, respectively. A randomised complete block ANOVA found significant differences between the RV seated (pool) method and each of the three TLC methods with respect to both body density and per cent body fat. The differences were negligible with respect to HW error. Reliability of the TLC prone (tank) method was established by weighing twenty subjects three different times with ten-minute time intervals between testing. Multiple correlations yielded reliability coefficients for body density and per cent body fat values of .99 and .99, respectively. It was concluded that the TLC prone (tank) method is valid, reliable and a favourable method of hydrostatic weighing. PMID:3697596

  4. A hydrostatic weighing method using total lung capacity and a small tank.

    PubMed

    Warner, J G; Yeater, R; Sherwood, L; Weber, K

    1986-03-01

    The purpose of this study was to establish the validity and reliability of a hydrostatic weighing method using total lung capacity (measuring vital capacity with a respirometer at the time of weighing), the prone position, and a small oblong tank. The validity of the method was established by comparing the TLC prone (tank) method against three hydrostatic weighing methods administered in a pool. The three methods included residual volume seated, TLC seated and TLC prone. Eighty male and female subjects were underwater weighed using each of the four methods. Validity coefficients for per cent body fat between the TLC prone (tank) method and the RV seated (pool), TLC seated (pool) and TLC prone (pool) methods were .98, .99 and .99, respectively. A randomised complete block ANOVA found significant differences between the RV seated (pool) method and each of the three TLC methods with respect to both body density and per cent body fat. The differences were negligible with respect to HW error. Reliability of the TLC prone (tank) method was established by weighing twenty subjects three different times with ten-minute time intervals between testing. Multiple correlations yielded reliability coefficients for body density and per cent body fat values of .99 and .99, respectively. It was concluded that the TLC prone (tank) method is valid, reliable and a favourable method of hydrostatic weighing.

  5. Stem revenue losses with effective CDM management.

    PubMed

    Alwell, Michael

    2003-09-01

    Effective CDM management not only minimizes revenue losses due to denied claims, but also helps eliminate administrative costs associated with correcting coding errors. Accountability for CDM management should be assigned to a single individual, who ideally reports to the CFO or high-level finance director. If your organization is prone to making billing errors due to CDM deficiencies, you should consider purchasing CDM software to help you manage your CDM.

  6. Error-prone meiotic division and subfertility in mice with oocyte-conditional knockdown of pericentrin.

    PubMed

    Baumann, Claudia; Wang, Xiaotian; Yang, Luhan; Viveiros, Maria M

    2017-04-01

    Mouse oocytes lack canonical centrosomes and instead contain unique acentriolar microtubule-organizing centers (aMTOCs). To test the function of these distinct aMTOCs in meiotic spindle formation, pericentrin (Pcnt), an essential centrosome/MTOC protein, was knocked down exclusively in oocytes by using a transgenic RNAi approach. Here, we provide evidence that disruption of aMTOC function in oocytes promotes spindle instability and severe meiotic errors that lead to pronounced female subfertility. Pcnt-depleted oocytes from transgenic (Tg) mice were ovulated at the metaphase-II stage, but show significant chromosome misalignment, aneuploidy and premature sister chromatid separation. These defects were associated with loss of key Pcnt-interacting proteins (γ-tubulin, Nedd1 and Cep215) from meiotic spindle poles, altered spindle structure and chromosome-microtubule attachment errors. Live-cell imaging revealed disruptions in the dynamics of spindle assembly and organization, together with chromosome attachment and congression defects. Notably, spindle formation was dependent on Ran GTPase activity in Pcnt-deficient oocytes. Our findings establish that meiotic division is highly error-prone in the absence of Pcnt and disrupted aMTOCs, similar to what reportedly occurs in human oocytes. Moreover, these data underscore crucial differences between MTOC-dependent and -independent meiotic spindle assembly. © 2017. Published by The Company of Biologists Ltd.

  7. De novo centriole formation in human cells is error-prone and does not require SAS-6 self-assembly

    PubMed Central

    Wang, Won-Jing; Acehan, Devrim; Kao, Chien-Han; Jane, Wann-Neng; Uryu, Kunihiro; Tsou, Meng-Fu Bryan

    2015-01-01

    Vertebrate centrioles normally propagate through duplication, but in the absence of preexisting centrioles, de novo synthesis can occur. Consistently, centriole formation is thought to strictly rely on self-assembly, involving self-oligomerization of the centriolar protein SAS-6. Here, through reconstitution of de novo synthesis in human cells, we surprisingly found that normal looking centrioles capable of duplication and ciliation can arise in the absence of SAS-6 self-oligomerization. Moreover, whereas canonically duplicated centrioles always form correctly, de novo centrioles are prone to structural errors, even in the presence of SAS-6 self-oligomerization. These results indicate that centriole biogenesis does not strictly depend on SAS-6 self-assembly, and may require preexisting centrioles to ensure structural accuracy, fundamentally deviating from the current paradigm. DOI: http://dx.doi.org/10.7554/eLife.10586.001 PMID:26609813

  8. Random mutagenesis of BoNT/E Hc nanobody to construct a secondary phage-display library.

    PubMed

    Shahi, B; Mousavi Gargari, S L; Rasooli, I; Rajabi Bazl, M; Hoseinpoor, R

    2014-08-01

    To construct a secondary mutant phage-display library of a recombinant single variable domain (VHH) against botulinum neurotoxin E by error-prone PCR. The gene coding for the specific VHH derived from a camel immunized with the binding domain of botulinum neurotoxin E (BoNT/E) was amplified by error-prone PCR. Several biopanning rounds were used to screen the phage-displaying BoNT/E Hc nanobodies. The final nanobody, SHMR4, recognized BoNT/E toxin with increased affinity and showed no cross-reactivity with other antigens, especially the related BoNT toxins. The constructed nanobody could be a suitable candidate for VHH-based biosensor production to detect Clostridium botulinum type E. Diagnosis and treatment of botulinum neurotoxins are important. Generation of high-affinity antibodies based on the construction of secondary libraries using an affinity maturation step leads to the development of reagents for precise diagnosis and therapy. © 2014 The Society for Applied Microbiology.

  9. Isolation and characterization of high affinity aptamers against DNA polymerase iota.

    PubMed

    Lakhin, Andrei V; Kazakov, Andrei A; Makarova, Alena V; Pavlov, Yuri I; Efremova, Anna S; Shram, Stanislav I; Tarantul, Viacheslav Z; Gening, Leonid V

    2012-02-01

    Human DNA polymerase iota (Pol ι) is an extremely error-prone enzyme whose fidelity depends on the sequence context of the template. Using the in vitro systematic evolution of ligands by exponential enrichment (SELEX) procedure, we obtained an oligoribonucleotide with a high affinity to human Pol ι, named aptamer IKL5. We determined its dissociation constant with a homogeneous preparation of Pol ι and predicted its putative secondary structure. The aptamer IKL5 specifically inhibited the DNA-polymerase activity of the purified enzyme Pol ι, but did not inhibit the DNA-polymerase activities of human DNA polymerases beta and kappa. IKL5 also suppressed the error-prone DNA-polymerase activity of Pol ι in cellular extracts of the tumor cell line SKOV-3. The aptamer IKL5 is useful for studies of the biological role of Pol ι and as a potential drug to suppress the increased activity of this enzyme in malignant cells.

  10. Post processing for offline Chinese handwritten character string recognition

    NASA Astrophysics Data System (ADS)

    Wang, YanWei; Ding, XiaoQing; Liu, ChangSong

    2012-01-01

    Offline Chinese handwritten character string recognition is one of the most important research fields in pattern recognition. Due to the free writing style, large variability in character shapes, and differing geometric characteristics, Chinese handwritten character string recognition is a challenging problem. Among current methods, the over-segmentation and merging approach, which integrates geometric, character recognition, and contextual information, shows promising results. It is found experimentally that a large share of errors are segmentation errors, and that they mainly occur around non-Chinese characters. In a Chinese character string there are not only wide characters, namely Chinese characters, but also narrow characters such as digits and letters of the alphabet. The segmentation errors are mainly caused by a uniform geometric model imposed on all segmented candidate characters. To solve this problem, post processing is employed to improve the recognition accuracy of narrow characters. On one hand, multi-geometric models are established for wide characters and narrow characters respectively; under these models, narrow characters are less prone to being merged. On the other hand, top-ranked recognition results of candidate paths are integrated to boost the final recognition of narrow characters. The post processing method is investigated on two datasets totalling 1405 handwritten address strings. Wide character recognition accuracy improved slightly, and narrow character recognition accuracy increased by 10.41% and 10.03%, respectively. This indicates that the post processing method is effective in improving the recognition accuracy of narrow characters.

  11. Operationalizing Proneness to Externalizing Psychopathology as a Multivariate Psychophysiological Phenotype

    PubMed Central

    Nelson, Lindsay D.; Patrick, Christopher J.; Bernat, Edward M.

    2010-01-01

    The externalizing dimension is viewed as a broad dispositional factor underlying risk for numerous disinhibitory disorders. Prior work has documented deficits in event-related brain potential (ERP) responses in individuals prone to externalizing problems. Here, we constructed a direct physiological index of externalizing vulnerability from three ERP indicators and evaluated its validity in relation to criterion measures in two distinct domains: psychometric and physiological. The index was derived from three ERP measures that covaried in their relations with externalizing proneness: the error-related negativity and two variants of the P3. Scores on this ERP composite predicted psychometric criterion variables and accounted for externalizing-related variance in P3 response from a separate task. These findings illustrate how a diagnostic construct can be operationalized as a composite (multivariate) psychophysiological variable (phenotype). PMID:20573054

  12. The application of Aronson's taxonomy to medication errors in nursing.

    PubMed

    Johnson, Maree; Young, Helen

    2011-01-01

    Medication administration is a frequent nursing activity that is prone to error. In this study of 318 self-reported medication incidents (including near misses), very few resulted in patient harm: only 7% required intervention or prolonged hospitalization or caused temporary harm. Aronson's classification system provided an excellent framework for analysis of the incidents, with a close connection between the type of error and the change strategy to minimize medication incidents. Taking a behavioral approach to medication error classification has yielded helpful strategies for nurses, such as nurse-call cards on patient lockers when patients are absent and checking of medication sign-off by outgoing and incoming staff at handover.

  13. Graph-based active learning of agglomeration (GALA): a Python library to segment 2D and 3D neuroimages

    PubMed Central

    Nunez-Iglesias, Juan; Kennedy, Ryan; Plaza, Stephen M.; Chakraborty, Anirban; Katz, William T.

    2014-01-01

    The aim in high-resolution connectomics is to reconstruct complete neuronal connectivity in a tissue. Currently, the only technology capable of resolving the smallest neuronal processes is electron microscopy (EM). Thus, a common approach to network reconstruction is to perform (error-prone) automatic segmentation of EM images, followed by manual proofreading by experts to fix errors. We have developed an algorithm and software library to not only improve the accuracy of the initial automatic segmentation, but also point out the image coordinates where it is likely to have made errors. Our software, called gala (graph-based active learning of agglomeration), improves the state of the art in agglomerative image segmentation. It is implemented in Python and makes extensive use of the scientific Python stack (numpy, scipy, networkx, scikit-learn, scikit-image, and others). We present here the software architecture of the gala library, and discuss several designs that we consider would be generally useful for other segmentation packages. We also discuss the current limitations of the gala library and how we intend to address them. PMID:24772079
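
    The oversegment-then-agglomerate pattern that gala improves upon can be sketched with scikit-image. Note this is a generic threshold-based agglomeration, not gala's learned policy or its API, and the graph module lives at skimage.graph in recent releases (skimage.future.graph in older ones).

        # Generic sketch: deliberately oversegment, then merge adjacent
        # fragments; gala instead learns the merge policy interactively.
        from skimage import data, segmentation, graph

        image = data.coffee()                          # bundled sample RGB image
        # Step 1: oversegment into superpixel fragments.
        labels = segmentation.slic(image, n_segments=400, compactness=20)
        # Step 2: region adjacency graph weighted by mean-color difference.
        rag = graph.rag_mean_color(image, labels, mode="distance")
        # Step 3: agglomerate fragments whose mean colors are close.
        merged = graph.cut_threshold(labels, rag, 30)
        print(labels.max(), "fragments ->", merged.max(), "regions")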

  14. The effects of increasing semantic-associate list length on the Deese-Roediger-McDermott false recognition memory: Dual false-memory process in retrieval from sub- and supraspan lists.

    PubMed

    Jou, Jerwen; Arredondo, Mario L; Li, Cheng; Escamilla, Eric E; Zuniga, Richard

    2017-10-01

    In this study, the number of semantic associates in Deese-Roediger-McDermott (DRM) lists was varied from 4 to 14 in a modified Sternberg paradigm. The false alarm (FA) and correct rejection (CR) reaction time (RT)/memory-set size (MSS) functions of critical lures showed a cross-over interaction at approximately MSS 7, suggesting a reversal of the relative dominance between these two responses to the critical lure at this point and also indicating the location of the boundary between the sub- and supraspan MSS. For the subspan lists, FA to critical lures was slower than CR, suggesting a slow, strategic mechanism driving the false memory. Conversely, for the supraspan lists, critical lure FA was faster than its CR, suggesting a spontaneous mechanism driving the false memory. Results of two experiments showed that an automatic, fast, and a slow, controlled process could be error-prone or error-corrective, depending on the length of the DRM memory list. Thus there is a dual retrieval process in false memory as in true memory. The findings can be explained by both the activation/monitoring and the fuzzy-trace theories.

  15. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  16. Echo™ User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Dustin Yewell

    Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
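
    The self-describing-data principle is straightforward to illustrate outside MATLAB. The Python class below is purely illustrative of the idea (data carrying metadata plus an automatically accumulated processing history) and is not Echo's actual interface.

        import numpy as np

        class SelfDescribing:
            """Toy self-describing measurement: values + metadata + history."""
            def __init__(self, values, meta, history=None):
                self.values = np.asarray(values, dtype=float)
                self.meta = dict(meta)               # units, channel names, ...
                self.history = list(history or [])   # ordered processing record

            def apply(self, func, **params):
                """Apply func and record the operation and its parameters."""
                return SelfDescribing(func(self.values, **params), self.meta,
                                      self.history + [(func.__name__, params)])

        def detrend(x):
            return x - x.mean()

        def scale(x, gain=1.0):
            return gain * x

        sig = SelfDescribing([1.0, 2.0, 4.0], {"units": "g", "channel": "accel_1"})
        result = sig.apply(detrend).apply(scale, gain=9.81)
        print(result.history)   # auditable record of how the result was made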

  17. Recent advances in quantitative analysis of fluid interfaces in multiphase fluid flow measured by synchrotron-based x-ray microtomography

    NASA Astrophysics Data System (ADS)

    Schlueter, S.; Sheppard, A.; Wildenschild, D.

    2013-12-01

    Imaging of fluid interfaces in three-dimensional porous media via x-ray microtomography is an efficient means to test thermodynamically derived predictions on the relationship between capillary pressure, fluid saturation and specific interfacial area (Pc-Sw-Anw) in partially saturated porous media. Various experimental studies exist to date that validate the uniqueness of the Pc-Sw-Anw relationship under static conditions and with current technological progress direct imaging of moving interfaces under dynamic conditions is also becoming available. Image acquisition and subsequent image processing currently involve many steps, each prone to operator bias, like merging different scans of the same sample obtained at different beam energies into a single image or the generation of isosurfaces from the segmented multiphase image on which the interface properties are usually calculated. We demonstrate that with recent advancements in (i) image enhancement methods, (ii) multiphase segmentation methods and (iii) methods of structural analysis we can considerably decrease the time and cost of image acquisition and the uncertainty associated with the measurement of interfacial properties. In particular, we highlight three notorious problems in multiphase image processing and provide efficient solutions for each: (i) Due to noise, partial volume effects, and imbalanced volume fractions, automated histogram-based threshold detection methods frequently fail. However, these impairments can be mitigated with modern denoising methods, special treatment of gray value edges and adaptive histogram equalization, such that most of the standard methods for threshold detection (Otsu, fuzzy c-means, minimum error, maximum entropy) coincide at the same set of values. (ii) Partial volume effects due to blur may produce apparent water films around solid surfaces that alter the specific fluid-fluid interfacial area (Anw) considerably. In a synthetic test image some local segmentation methods like Bayesian Markov random field, converging active contours and watershed segmentation reduced the error in Anw associated with apparent water films from 21% to 6-11%. (iii) The generation of isosurfaces from the segmented data usually requires a lot of postprocessing in order to smooth the surface and check for consistency errors. This can be avoided by calculating specific interfacial areas directly on the segmented voxel image by means of Minkowski functionals, which is highly efficient and less error prone.
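
    Computing interfacial area directly on the segmented voxel image can be approximated very simply by counting exposed voxel faces. The sketch below is a crude stand-in for the Minkowski-functional machinery the authors use; note that plain face counting systematically overestimates the area of oblique smooth interfaces.

        import numpy as np

        def voxel_surface_area(mask, voxel_size=1.0):
            """Count faces where a foreground voxel touches background."""
            mask = np.asarray(mask, dtype=bool)
            padded = np.pad(mask, 1, constant_values=False).astype(np.int8)
            faces = 0
            for axis in range(3):
                # Each 0<->1 transition along an axis is one exposed face.
                faces += np.abs(np.diff(padded, axis=axis)).sum()
            return faces * voxel_size ** 2

        # A 10-voxel cube: analytic surface area is 6 * 10^2 = 600.
        cube = np.zeros((12, 12, 12), dtype=bool)
        cube[1:11, 1:11, 1:11] = True
        print(voxel_surface_area(cube))   # -> 600.0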

  18. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
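
    The TMR concept itself is compact enough to show inline. Below is a toy Python illustration, not the authors' test methodology: a bitwise 2-of-3 majority voter together with a simple fault-injection loop confirming that any single upset is masked.

        import random

        def tmr_vote(a: int, b: int, c: int) -> int:
            """Bitwise 2-of-3 majority of three redundant copies."""
            return (a & b) | (a & c) | (b & c)

        random.seed(1)
        word = 0b1011_0101
        for _ in range(10_000):
            copies = [word, word, word]
            victim = random.randrange(3)        # inject one upset per trial
            copies[victim] ^= 1 << random.randrange(8)
            assert tmr_vote(*copies) == word    # single upsets always masked
        print("all single-bit upsets masked")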

  19. The p21 and PCNA partnership: a new twist for an old plot.

    PubMed

    Prives, Carol; Gottifredi, Vanesa

    2008-12-15

    The contribution of error-prone DNA polymerases to the DNA damage response has been a subject of great interest in the last decade. Error-prone polymerases are required for translesion DNA synthesis (TLS), a process that involves synthesis past a DNA lesion. Under certain circumstances, TLS polymerases can achieve bypass with good efficiency and fidelity. However, they can also in some cases be mutagenic, and so negative regulators of TLS polymerases would have the important function of inhibiting their recruitment to undamaged DNA templates. Recent work from Livneh's and our groups have provided evidence regarding the role of the cyclin kinase inhibitor p21 as a negative regulator of TLS. Interestingly, both the cyclin dependent kinase (CDK) and proliferating cell nuclear antigen (PCNA) binding domains of p21 are involved in different aspects of the modulation of TLS, affecting both the interaction between PCNA and the TLS-specific pol eta as well as PCNA ubiquitination status. In line with this, p21 was shown to reduce the efficiency but increase the accuracy of TLS. Hence, in absence of DNA damage p21 may work to impede accidental loading of pol eta to undamaged DNA and avoid consequential mutagenesis. After UV irradiation, when TLS plays a decisive role, p21 is progressively degraded. This might allow gradual release of replication fork blockage by TLS polymerases. For these reasons, in higher eukaryotes p21 might represent a key regulator of the equilibrium between mutagenesis and cell survival.

  20. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data

    PubMed Central

    Larralde, Martin; Lawson, Thomas N.; Weber, Ralf J. M.; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R.; Steinbeck, Christoph; Salek, Reza M.

    2017-01-01

    Summary Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. Availability and Implementation mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact reza.salek@ebi.ac.uk or isatools@googlegroups.com Supplementary information Supplementary data are available at Bioinformatics online. PMID:28402395

  1. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data.

    PubMed

    Larralde, Martin; Lawson, Thomas N; Weber, Ralf J M; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R; Steinbeck, Christoph; Salek, Reza M

    2017-08-15

    Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  2. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error-prone, owing to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  3. Is There Evidence for a Mixture of Processes in Speed-Accuracy Trade-Off Behavior?

    PubMed

    van Maanen, Leendert

    2016-01-01

    The speed-accuracy trade-off (SAT) effect refers to the behavioral trade-off between fast yet error-prone responses and accurate but slow responses. Multiple theories on the cognitive mechanisms behind SAT exist. One theory assumes that SAT is a consequence of strategically adjusting the amount of evidence required for overt behaviors, such as perceptual choices. Another theory hypothesizes that SAT is the consequence of a mixture of multiple categorically different cognitive processes. In this paper, these theories are disambiguated by assessing whether the fixed-point property of mixture distributions holds, in both simulations and data. I conclude that, at least for perceptual decision making, there is no evidence for a mixture of different cognitive processes to trade off accuracy of responding for speed. Copyright © 2016 Cognitive Science Society, Inc.
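
    The fixed-point test is easy to demonstrate numerically: if observed response-time distributions are binary mixtures of the same two component densities with condition-dependent mixing proportions, every mixture passes through the point where the two components are equal. A short illustration with arbitrary Gaussian components:

        import numpy as np
        from scipy.stats import norm

        x = np.linspace(0.0, 2.0, 4001)
        f_fast = norm.pdf(x, loc=0.5, scale=0.12)   # fast, error-prone process
        f_slow = norm.pdf(x, loc=1.0, scale=0.25)   # slow, accurate process
        mixtures = [p * f_fast + (1 - p) * f_slow for p in (0.2, 0.5, 0.8)]

        # Locate the crossing of the two component densities between the modes.
        band = (x > 0.6) & (x < 0.95)
        i_star = np.where(band)[0][np.argmin(np.abs(f_fast - f_slow)[band])]

        # All mixtures pass (numerically) through the same density value here.
        print("fixed point near x =", round(x[i_star], 3))
        print([round(m[i_star], 4) for m in mixtures])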

  4. Error Recovery in the Time-Triggered Paradigm with FTT-CAN.

    PubMed

    Marques, Luis; Vasconcelos, Verónica; Pedreiras, Paulo; Almeida, Luís

    2018-01-11

    Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots.
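
    The style of reliability sizing described can be sketched as follows. This is an illustrative calculation under the stated Poisson fault model, not the paper's exact schedulability analysis: it finds the smallest per-window retransmission budget whose overflow probability meets a target.

        from scipy.stats import poisson

        def server_capacity(fault_rate_hz, window_s, target_prob):
            """Smallest per-window retransmission budget meeting the target."""
            lam = fault_rate_hz * window_s       # expected faults per window
            c = 0
            while poisson.sf(c, lam) > target_prob:   # P(N > c) too large?
                c += 1
            return c

        # E.g., 30 faults/s, a 10 ms recovery window, and a 1e-9 per-window
        # violation target (all numbers illustrative, not from the paper).
        print(server_capacity(30.0, 0.010, 1e-9))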

  5. Error Recovery in the Time-Triggered Paradigm with FTT-CAN

    PubMed Central

    Pedreiras, Paulo; Almeida, Luís

    2018-01-01

    Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots. PMID:29324723

  6. Photorealistic ray tracing to visualize automobile side mirror reflective scenes.

    PubMed

    Lee, Hocheol; Kim, Kyuman; Lee, Gang; Lee, Sungkoo; Kim, Jingu

    2014-10-20

    We describe an interactive visualization procedure for determining the optimal surface of a special automobile side mirror, thereby removing the blind spot, without the need for feedback from the error-prone manufacturing process. If the horizontally progressive curvature distributions are set to the semi-mathematical expression for a free-form surface, the surface point set can then be derived through numerical integration. This is then converted to a NURBS surface while retaining the surface curvature. Then, reflective scenes from the driving environment can be virtually realized using photorealistic ray tracing, in order to evaluate how these reflected images would appear to drivers.
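
    A one-dimensional analogue conveys how a prescribed, horizontally progressive curvature distribution yields a surface profile by numerical integration (via dθ/ds = κ, dx/ds = cos θ, dy/ds = sin θ). The curvature function and dimensions below are made up for illustration.

        import numpy as np

        def profile_from_curvature(kappa, length, n=2000):
            """Euler-integrate a plane curve from its curvature profile."""
            s = np.linspace(0.0, length, n)
            ds = s[1] - s[0]
            theta = np.concatenate([[0.0], np.cumsum(kappa(s[:-1])) * ds])
            x = np.concatenate([[0.0], np.cumsum(np.cos(theta[:-1])) * ds])
            y = np.concatenate([[0.0], np.cumsum(np.sin(theta[:-1])) * ds])
            return x, y

        # Curvature increasing smoothly toward the outboard edge, in the
        # spirit of an aspheric blind-spot mirror (invented numbers).
        kappa = lambda s: 0.5 + 1.5 * (s / 0.25) ** 2   # 1/m over a 25 cm strip
        x, y = profile_from_curvature(kappa, 0.25)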

  7. Interior Reconstruction Using the 3d Hough Transform

    NASA Astrophysics Data System (ADS)

    Dumitru, R.-C.; Borrmann, D.; Nüchter, A.

    2013-02-01

    Laser scanners are often used to create accurate 3D models of buildings for civil engineering purposes, but the process of manually vectorizing a 3D point cloud is time consuming and error-prone (Adan and Huber, 2011). Therefore, the need to characterize and quantify complex environments in an automatic fashion arises, posing challenges for data analysis. This paper presents a system for 3D modeling by detecting planes in 3D point clouds, based on which the scene is reconstructed at a high architectural level by automatically removing clutter and foreground data. The implemented software detects openings, such as windows and doors, and completes the 3D model by inpainting.
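
    Plane detection with a 3D Hough transform amounts to voting over a discretized normal-direction-and-offset parameter space. The following minimal numpy sketch (coarse grids and synthetic points, not the paper's implementation) recovers a dominant plane from a noisy cloud.

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic cloud: a horizontal plane z = 2 plus scattered outliers.
        plane = np.column_stack([rng.uniform(0, 5, 500),
                                 rng.uniform(0, 5, 500),
                                 np.full(500, 2.0)])
        points = np.vstack([plane, rng.uniform(0, 5, (100, 3))])

        thetas = np.linspace(0, np.pi / 2, 10)   # polar angle of the normal
        phis = np.linspace(0, np.pi, 10)         # azimuth of the normal
        rho_bins = np.linspace(-6, 6, 61)        # plane offset bins
        acc = np.zeros((len(thetas), len(phis), len(rho_bins) - 1), dtype=int)

        for i, th in enumerate(thetas):
            for j, ph in enumerate(phis):
                n = np.array([np.sin(th) * np.cos(ph),
                              np.sin(th) * np.sin(ph),
                              np.cos(th)])
                rho = points @ n                 # signed distance of each point
                acc[i, j], _ = np.histogram(rho, bins=rho_bins)

        i, j, k = np.unravel_index(acc.argmax(), acc.shape)
        print("dominant plane: theta=%.2f phi=%.2f rho~%.1f votes=%d"
              % (thetas[i], phis[j], rho_bins[k], acc[i, j, k]))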

  8. [Medical image elastic registration smoothed by unconstrained optimized thin-plate spline].

    PubMed

    Zhang, Yu; Li, Shuxiang; Chen, Wufan; Liu, Zhexing

    2003-12-01

    Elastic registration of medical images is an important subject in medical image processing. Previous work has concentrated on selecting the corresponding landmarks manually and then using thin-plate spline interpolation to obtain the elastic transformation. However, landmark extraction is prone to error, which influences the registration results, and localizing the landmarks manually is also difficult and time-consuming. We used optimization theory to improve thin-plate spline interpolation and, based on it, an automatic method to extract the landmarks. Combining these two steps, we propose an automatic, accurate and robust registration method that yields satisfactory registration results.
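
    SciPy's RBF interpolator offers a convenient stand-in for experimenting with smoothed thin-plate splines (the paper's unconstrained-optimization formulation differs in detail). With smoothing greater than zero the warp no longer interpolates landmarks exactly, so small localization errors bend the transformation less; the landmark coordinates below are invented.

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        src = np.array([[10, 10], [10, 90], [90, 10], [90, 90], [50, 50]], float)
        dst = src + np.array([[2, 1], [1, -2], [-1, 2], [2, 2], [0, 3]], float)

        # smoothing > 0 relaxes exact landmark matching, absorbing noise from
        # (possibly automatic) landmark extraction into the smoothing term.
        warp = RBFInterpolator(src, dst, kernel="thin_plate_spline", smoothing=5.0)

        grid = np.stack(np.meshgrid(np.arange(0, 100, 20),
                                    np.arange(0, 100, 20)),
                        axis=-1).reshape(-1, 2).astype(float)
        print(warp(grid)[:3])   # mapped coordinates of the first grid points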

  9. The development of causal reasoning.

    PubMed

    Kuhn, Deanna

    2012-05-01

    How do inference rules for causal learning themselves change developmentally? A model of the development of causal reasoning must address this question, as well as specify the inference rules. Here, the evidence for developmental changes in processes of causal reasoning is reviewed, with the distinction made between diagnostic causal inference and causal prediction. Also addressed is the paradox of a causal reasoning literature that highlights the competencies of young children and the proneness to error among adults. WIREs Cogn Sci 2012, 3:327-335. doi: 10.1002/wcs.1160. Copyright © 2012 John Wiley & Sons, Ltd.

  10. Automatic identification of bacterial types using statistical imaging methods

    NASA Astrophysics Data System (ADS)

    Trattner, Sigal; Greenspan, Hayit; Tepper, Gapi; Abboud, Shimon

    2003-05-01

    The objective of the current study is to develop an automatic tool to identify bacterial types using computer-vision and statistical modeling techniques. Bacteriophage (phage)-typing methods are used to identify and extract representative profiles of bacterial types, such as Staphylococcus aureus. Current systems rely on the subjective reading of plaque profiles by a human expert. This process is time-consuming and prone to errors, especially as technology enables an increase in the number of phages used for typing. The statistical methodology presented in this work provides an automated, objective and robust analysis of visual data, along with the ability to cope with increasing data volumes.

  11. Perspective-taking abilities in the balance between autism tendencies and psychosis proneness.

    PubMed

    Abu-Akel, Ahmad M; Wood, Stephen J; Hansen, Peter C; Apperly, Ian A

    2015-06-07

    Difficulty appreciating the perspective of others (mentalizing) is central to both autism and schizophrenia spectrum disorders. While the disorders are diagnostically independent, they can co-occur in the same individual. The effect of such co-morbidity is hypothesized to worsen mentalizing abilities. The recent influential 'diametric brain theory', however, suggests that the disorders are etiologically and phenotypically diametrical, predicting opposing effects on one's mentalizing abilities. To test these contrasting hypotheses, we evaluated the effect of psychosis and autism tendencies on the perspective-taking (PT) abilities of 201 neurotypical adults, on the assumption that autism tendencies and psychosis proneness are heritable dimensions of normal variation. We show that while both autism tendencies and psychosis proneness induce PT errors, their interaction reduced these errors. Our study is, to our knowledge, the first to observe that co-occurring autistic and psychotic traits can exert opposing influences on performance, producing a normalizing effect possibly by way of their diametrical effects on socio-cognitive abilities. This advances the notion that some individuals may, to some extent, be buffered against developing either illness or present fewer symptoms owing to a balanced expression of autistic and psychosis liability. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  12. Multi-temporal change image inference towards false alarms reduction for an operational photogrammetric rockfall detection system

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Kallimani, Christina; Tripolitsiotis, Achilleas

    2015-06-01

    Rockfall incidents affect civil security and hamper the sustainable growth of hard-to-access mountainous areas due to casualties, injuries and infrastructure loss. Rockfall occurrences cannot be easily prevented, and previous studies of multi-sensor rockfall early-detection systems have focused on large-scale incidents. However, even a single rock may cause the loss of a human life along transportation routes; thus, it is highly important to establish methods for the early detection of small-scale rockfall incidents. Terrestrial photogrammetric techniques are prone to a series of errors leading to false alarms, including vegetation, wind, and irrelevant change in the scene under consideration. In this study, photogrammetric monitoring of rockfall-prone slopes is established and the resulting multi-temporal change imagery is processed in order to minimize false alarms. Remote sensing imagery analysis techniques are integrated to enhance early detection of a rockfall. Experimental data demonstrated that an operational system able to identify a 10-cm rock movement within a 10% false alarm rate is technically feasible.
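
    A generic multi-temporal change-detection baseline of the kind such a pipeline builds on (our sketch, not the authors' system; all thresholds are invented) rejects small connected components, a typical source of false alarms:

```python
import numpy as np
from scipy import ndimage

def change_mask(img_t0, img_t1, diff_thresh=25, min_area=50):
    """Absolute difference -> threshold -> drop small connected
    components, which are typical false-alarm sources (vegetation
    flicker, sensor noise) rather than genuine rock movement."""
    diff = np.abs(img_t1.astype(int) - img_t0.astype(int))
    mask = diff > diff_thresh
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep_ids = np.nonzero(sizes >= min_area)[0] + 1
    return np.isin(labels, keep_ids)

# Example with synthetic 8-bit frames
t0 = np.random.randint(0, 40, (200, 200))
t1 = t0.copy()
t1[80:120, 80:120] += 100            # a "rock" appeared or moved
print(change_mask(t0, t1).sum())     # ~1600 changed pixels retained
```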

  13. Identifying chronic errors at freeway loop detectors- splashover, pulse breakup, and sensitivity settings.

    DOT National Transportation Integrated Search

    2011-03-01

    Traffic Management applications such as ramp metering, incident detection, travel time prediction, and vehicle : classification greatly depend on the accuracy of data collected from inductive loop detectors, but these data are : prone to various erro...

  14. Automation of Cassini Support Imaging Uplink Command Development

    NASA Technical Reports Server (NTRS)

    Ly-Hollins, Lisa; Breneman, Herbert H.; Brooks, Robert

    2010-01-01

    "Support imaging" is imagery requested by other Cassini science teams to aid in the interpretation of their data. The generation of the spacecraft command sequences for these images is performed by the Cassini Instrument Operations Team. The process initially established for doing this was very labor-intensive, tedious and prone to human error. Team management recognized this process as one that could easily benefit from automation. Team members were tasked to document the existing manual process, develop a plan and strategy to automate the process, implement the plan and strategy, test and validate the new automated process, and deliver the new software tools and documentation to Flight Operations for use during the Cassini extended mission. In addition to the goals of higher efficiency and lower risk in the processing of support imaging requests, an effort was made to maximize adaptability of the process to accommodate uplink procedure changes and the potential addition of new capabilities outside the scope of the initial effort.

  15. Use of Existing CAD Models for Radiation Shielding Analysis

    NASA Technical Reports Server (NTRS)

    Lee, K. T.; Barzilla, J. E.; Wilson, P.; Davis, A.; Zachman, J.

    2015-01-01

    The utility of a radiation exposure analysis depends not only on the accuracy of the underlying particle transport code, but also on the accuracy of the geometric representations of both the vehicle used as radiation shielding mass and the phantom representation of the human form. The current NASA/Space Radiation Analysis Group (SRAG) process to determine crew radiation exposure in a vehicle design incorporates both output from an analytic High Z and Energy Particle Transport (HZETRN) code and the properties (i.e., material thicknesses) of a previously processed drawing. This geometry pre-process can be time-consuming, and the results are less accurate than those determined using a Monte Carlo-based particle transport code. The current work aims to improve this process. Although several Monte Carlo programs (FLUKA, Geant4) are readily available, most use an internal geometry engine. The lack of an interface with the standard CAD formats used by the vehicle designers limits the ability of the user to communicate complex geometries. Translation of native CAD drawings into a format readable by these transport programs is time-consuming and prone to error. The Direct Accelerated Geometry-United (DAGU) project is intended to provide an interface between the native vehicle or phantom CAD geometry and multiple particle transport codes to minimize problem setup, computing time and analysis error.

  16. First order error corrections in common introductory physics experiments

    NASA Astrophysics Data System (ADS)

    Beckey, Jacob; Baker, Andrew; Aravind, Vasudeva; Clarion Team

    As a part of introductory physics courses, students perform different standard lab experiments. Almost all of these experiments are prone to errors owing to factors like friction, misalignment of equipment, air drag, etc. Usually these errors are ignored by students, and not much thought is given to their sources. However, paying attention to the factors that give rise to errors helps students build better physics models and understand the physical phenomena behind experiments in more detail. In this work, we explore common causes of errors in introductory physics experiments and suggest changes that will mitigate the errors, or suggest models that take the sources of these errors into consideration. This work helps students build better and more refined physical models and understand physics concepts in greater detail. We thank the Clarion University undergraduate student grant for financial support of this project.

  17. Modelling Hepatitis B Virus Antiviral Therapy and Drug Resistant Mutant Strains

    NASA Astrophysics Data System (ADS)

    Bernal, Julie; Dix, Trevor; Allison, Lloyd; Bartholomeusz, Angeline; Yuen, Lilly

    Despite the existence of vaccines, the Hepatitis B virus (HBV) is still a serious global health concern. HBV targets liver cells. It has an unusual replication process involving an RNA pre-genome that the reverse transcriptase domain of the viral polymerase protein translates into viral DNA. The reverse transcription process is error prone and together with the high replication rates of the virus, allows the virus to exist as a heterogeneous population of mutants, known as a quasispecies, that can adapt and become resistant to antiviral therapy. This study presents an individual-based model of HBV inside an artificial liver, and associated blood serum, undergoing antiviral therapy. This model aims to provide insights into the evolution of the HBV quasispecies and the individual contribution of HBV mutations in the outcome of therapy.
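
    A toy individual-based sketch in the spirit of the described model (all rates and the two-class wild-type/resistant split are our inventions for illustration, not the paper's parameterization):

```python
import random

def simulate(generations=50, pop_cap=10000, mu=0.01, drug_pressure=0.7, seed=1):
    """Toy quasispecies under therapy: wild-type ('wt') replication is
    suppressed by the drug, error-prone copying converts wt offspring
    into resistant mutants ('res') with probability mu, and a carrying
    capacity caps the total population (the 'liver')."""
    random.seed(seed)
    pop = {'wt': 1000, 'res': 0}
    for _ in range(generations):
        offspring = {'wt': 0, 'res': 0}
        for kind, n in pop.items():
            # resistant mutants escape the drug's replication block
            fitness = 2.0 * ((1 - drug_pressure) if kind == 'wt' else 1.0)
            for _ in range(n):
                n_children = int(fitness) + (random.random() < fitness % 1.0)
                for _ in range(n_children):
                    mutated = kind == 'wt' and random.random() < mu
                    offspring['res' if mutated else kind] += 1
        total = sum(offspring.values()) or 1
        scale = min(1.0, pop_cap / total)
        pop = {k: int(v * scale) for k, v in offspring.items()}
    return pop

print(simulate())   # inspect the final quasispecies composition
```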

  18. Risk management: correct patient and specimen identification in a surgical pathology laboratory. The experience of Infermi Hospital, Rimini, Italy.

    PubMed

    Fabbretti, G

    2010-06-01

    Because of its complex nature, surgical pathology practice is prone to error. In this report, we describe our methods for reducing error as much as possible during the pre-analytical and analytical phases. This was achieved by revising procedures, and by using computer technology and automation. Most mistakes are the result of human error in the identification and matching of patients and samples. To avoid faulty data interpretation, we employed a new comprehensive computer system that acquires all patient ID information directly from the hospital's database with remote order entry; it also provides label and request forms via the Web, where clinical information is required before sending the sample. Both patient and sample are identified directly and immediately at the site where the surgical procedures are performed. Barcode technology is used to input information at every step, and automation is used for sample blocks and slides to avoid errors that occur when information is recorded or transferred by hand. Quality control checks occur at every step of the process to ensure that none of the steps are left to chance and that no phase is dependent on a single operator. The system also provides statistical analysis of errors so that new strategies can be implemented to avoid repetition. In addition, the staff receives frequent training on avoiding errors and on new developments. The results are promising, with a very low error rate (0.27%); none of the errors compromised patient health, and all were detected before release of the diagnostic report.

  19. An automated calibration method for non-see-through head mounted displays.

    PubMed

    Gilson, Stuart J; Fitzgibbon, Andrew W; Glennerster, Andrew

    2011-08-15

    Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet, existing calibration methods are time consuming and depend on human judgements, making them error prone, and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements. Copyright © 2011 Elsevier B.V. All rights reserved.
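
    The hand-off to established camera-calibration techniques can be sketched with OpenCV (our illustration, not the paper's code; a synthetic planar target stands in for the tracked calibration object so the example is self-contained):

```python
import numpy as np
import cv2

# Synthetic planar calibration target (z = 0), as cv2.calibrateCamera
# expects when no intrinsic guess is supplied.
grid = np.zeros((9 * 6, 3), np.float32)
grid[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2) * 0.025   # 2.5 cm pitch

K_true = np.array([[500, 0, 320], [0, 500, 240], [0, 0, 1]], float)
obj_pts, img_pts = [], []
rng = np.random.default_rng(0)
for i in range(6):                       # six poses of the target
    rvec = rng.normal(0, 0.2, 3)
    tvec = np.array([0.0, 0.0, 0.5 + 0.1 * i])
    proj, _ = cv2.projectPoints(grid, rvec, tvec, K_true, None)
    obj_pts.append(grid)
    img_pts.append(proj.astype(np.float32))

# Recover intrinsics and per-view extrinsics by minimizing
# reprojection error over all views.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, (640, 480), None, None)
print("RMS reprojection error:", rms)    # ~0 on noise-free data
print("Recovered focal lengths:", K[0, 0], K[1, 1])
```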

  20. Assessing primary care data quality.

    PubMed

    Lim, Yvonne Mei Fong; Yusof, Maryati; Sivasampu, Sheamini

    2018-04-16

    Purpose The purpose of this paper is to assess National Medical Care Survey data quality. Design/methodology/approach Data completeness and representativeness were computed for all observations while other data quality measures were assessed using a 10 per cent sample from the National Medical Care Survey database; i.e., 12,569 primary care records from 189 public and private practices were included in the analysis. Findings Data field completion ranged from 69 to 100 per cent. Error rates for data transfer from paper to web-based application varied between 0.5 and 6.1 per cent. Error rates arising from diagnosis and clinical process coding were higher than medication coding. Data fields that involved free text entry were more prone to errors than those involving selection from menus. The authors found that completeness, accuracy, coding reliability and representativeness were generally good, while data timeliness needs to be improved. Research limitations/implications Only data entered into a web-based application were examined. Data omissions and errors in the original questionnaires were not covered. Practical implications Results from this study provided informative and practicable approaches to improve primary health care data completeness and accuracy especially in developing nations where resources are limited. Originality/value Primary care data quality studies in developing nations are limited. Understanding errors and missing data enables researchers and health service administrators to prevent quality-related problems in primary care data.

  1. Cognitive fallacies and criminal investigations.

    PubMed

    Ditrich, Hans

    2015-03-01

    The human mind is susceptible to inherent fallacies that often hamper fully rational action. Many such misconceptions have an evolutionary background and are thus difficult to avert. Deficits in the reliability of eye-witnesses are well known to legal professionals; however, less attention has been paid to such effects in crime investigators. In order to obtain an "inside view" on the role of cognitive misconceptions in criminalistic work, a list of fallacies from the literature was adapted to criminalistic settings. The statements on this list were rated by highly experienced crime scene investigators according to the assumed likelihood of these errors to appear and their severity of effect. Among others, selective perception, expectation and confirmation bias, anchoring/"pars per toto" errors and "onus probandi"--shifting the burden of proof from the investigator to the suspect--were frequently considered to negatively affect criminal investigations. As a consequence, the following measures are proposed: alerting investigating officers in their training to cognitive fallacies and promoting the exchange of experiences in peer circles of investigators on a regular basis. Furthermore, the improvement of the organizational error culture and the establishment of a failure analysis system in order to identify and alleviate error prone processes are suggested. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  2. Specificity control for read alignments using an artificial reference genome-guided false discovery rate.

    PubMed

    Giese, Sven H; Zickmann, Franziska; Renard, Bernhard Y

    2014-01-01

    Accurate estimation, comparison and evaluation of read mapping error rates is a crucial step in the processing of next-generation sequencing data, as further analysis steps and interpretation assume the correctness of the mapping results. Current approaches are either focused on sensitivity estimation and thereby disregard specificity or are based on read simulations. Although continuously improving, read simulations are still prone to introduce a bias into the mapping error quantitation and cannot capture all characteristics of an individual dataset. We introduce ARDEN (artificial reference driven estimation of false positives in next-generation sequencing data), a novel benchmark method that estimates error rates of read mappers based on real experimental reads, using an additionally generated artificial reference genome. It allows a dataset-specific computation of error rates and the construction of a receiver operating characteristic curve. Thereby, it can be used for optimization of parameters for read mappers, selection of read mappers for a specific problem or for filtering alignments based on quality estimation. The use of ARDEN is demonstrated in a general read mapper comparison, a parameter optimization for one read mapper and an application example in single-nucleotide polymorphism discovery with a significant reduction in the number of false positive identifications. The ARDEN source code is freely available at http://sourceforge.net/projects/arden/.

  3. [Prospective assessment of medication errors in critically ill patients in a university hospital].

    PubMed

    Salazar L, Nicole; Jirón A, Marcela; Escobar O, Leslie; Tobar, Eduardo; Romero, Carlos

    2011-11-01

    Critically ill patients are especially vulnerable to medication errors (ME) due to their severe clinical situation and the complexities of their management. To determine the frequency and characteristics of ME and identify shortcomings in the processes of medication management in an Intensive Care Unit. During a 3-month period, an observational, prospective and randomized study was carried out in the ICU of a university hospital. Every step of the patients' medication management (prescription, transcription, dispensation, preparation and administration) was evaluated by an external trained professional. Steps with a higher frequency of ME and the therapeutic groups involved were identified. Medication errors were classified according to the National Coordinating Council for Medication Error Reporting and Prevention. In 52 of 124 patients evaluated, 66 ME were found in 194 drugs prescribed. In 34% of prescribed drugs, there was at least 1 ME during its use. Half of the ME occurred during medication administration, mainly due to problems in infusion rates and schedule times. Antibacterial drugs had the highest rate of ME. We found a 34% rate of ME per drug prescribed, which is in concordance with international reports. The identification of the steps most prone to ME in the ICU will allow the implementation of an intervention program to improve the quality and safety of medication management.

  4. Grunting's competitive advantage: Considerations of force and distraction

    PubMed Central

    Sinnett, Scott; Maglinti, Cj; Kingstone, Alan

    2018-01-01

    Background Grunting is pervasive in many athletic contests, and empirical evidence suggests that it may result in one exerting more physical force. It may also distract one's opponent. That grunts can distract was supported by a study showing that it led to an opponent being slower and more error prone when viewing tennis shots. An alternative explanation was that grunting masks the sound of a ball being hit. The present study provides evidence against this alternative explanation by testing the effect of grunting in a sport—mixed martial arts—where distraction, rather than masking, is the most likely mechanism. Methodology/Principal findings We first confirmed that kicking force is increased when a grunt is performed (Experiment 1), and then adapted methodology used in the tennis study to mixed martial arts (Experiment 2). Lifting the foot to kick is a silent act, and therefore there is nothing for a grunt to mask, i.e., its effect on an opponent’s response time and/or accuracy can likely be attributed to attentional distraction. Participants viewed videos of a trained mixed martial artist kicking that included, or did not include, a simulated grunt. The task was to determine as quickly as possible whether the kick was traveling upward or downward. Overall, and replicating the tennis finding, the present results indicate that a participant's response to a kick was delayed and more error prone when a simulated grunt was present. Conclusions/Significance The present findings indicate that simulated grunting may distract an opponent, leading to slower and more error prone responses. The implications for martial arts in particular, and the broader question of whether grunting should be perceived as 'cheating' in sports, are examined. PMID:29470505

  5. Grunting's competitive advantage: Considerations of force and distraction.

    PubMed

    Sinnett, Scott; Maglinti, Cj; Kingstone, Alan

    2018-01-01

    Grunting is pervasive in many athletic contests, and empirical evidence suggests that it may result in one exerting more physical force. It may also distract one's opponent. That grunts can distract was supported by a study showing that it led to an opponent being slower and more error prone when viewing tennis shots. An alternative explanation was that grunting masks the sound of a ball being hit. The present study provides evidence against this alternative explanation by testing the effect of grunting in a sport (mixed martial arts) where distraction, rather than masking, is the most likely mechanism. We first confirmed that kicking force is increased when a grunt is performed (Experiment 1), and then adapted methodology used in the tennis study to mixed martial arts (Experiment 2). Lifting the foot to kick is a silent act, and therefore there is nothing for a grunt to mask, i.e., its effect on an opponent's response time and/or accuracy can likely be attributed to attentional distraction. Participants viewed videos of a trained mixed martial artist kicking that included, or did not include, a simulated grunt. The task was to determine as quickly as possible whether the kick was traveling upward or downward. Overall, and replicating the tennis finding, the present results indicate that a participant's response to a kick was delayed and more error prone when a simulated grunt was present. The present findings indicate that simulated grunting may distract an opponent, leading to slower and more error prone responses. The implications for martial arts in particular, and the broader question of whether grunting should be perceived as 'cheating' in sports, are examined.

  6. Precise and heritable genome editing in evolutionarily diverse nematodes using TALENs and CRISPR/Cas9 to engineer insertions and deletions.

    PubMed

    Lo, Te-Wen; Pickle, Catherine S; Lin, Steven; Ralston, Edward J; Gurling, Mark; Schartner, Caitlin M; Bian, Qian; Doudna, Jennifer A; Meyer, Barbara J

    2013-10-01

    Exploitation of custom-designed nucleases to induce DNA double-strand breaks (DSBs) at genomic locations of choice has transformed our ability to edit genomes, regardless of their complexity. DSBs can trigger either error-prone repair pathways that induce random mutations at the break sites or precise homology-directed repair pathways that generate specific insertions or deletions guided by exogenously supplied DNA. Prior editing strategies using site-specific nucleases to modify the Caenorhabditis elegans genome achieved only the heritable disruption of endogenous loci through random mutagenesis by error-prone repair. Here we report highly effective strategies using TALE nucleases and RNA-guided CRISPR/Cas9 nucleases to induce error-prone repair and homology-directed repair to create heritable, precise insertion, deletion, or substitution of specific DNA sequences at targeted endogenous loci. Our robust strategies are effective across nematode species diverged by 300 million years, including necromenic nematodes (Pristionchus pacificus), male/female species (Caenorhabditis species 9), and hermaphroditic species (C. elegans). Thus, genome-editing tools now exist to transform nonmodel nematode species into genetically tractable model organisms. We demonstrate the utility of our broadly applicable genome-editing strategies by creating reagents generally useful to the nematode community and reagents specifically designed to explore the mechanism and evolution of X chromosome dosage compensation. By developing an efficient pipeline involving germline injection of nuclease mRNAs and single-stranded DNA templates, we engineered precise, heritable nucleotide changes both close to and far from DSBs to gain or lose genetic function, to tag proteins made from endogenous genes, and to excise entire loci through targeted FLP-FRT recombination.

  7. A nucleotide-analogue-induced gain of function corrects the error-prone nature of human DNA polymerase iota.

    PubMed

    Ketkar, Amit; Zafar, Maroof K; Banerjee, Surajit; Marquez, Victor E; Egli, Martin; Eoff, Robert L

    2012-06-27

    Y-family DNA polymerases participate in replication stress and DNA damage tolerance mechanisms. The properties that allow these enzymes to copy past bulky adducts or distorted template DNA can result in a greater propensity for them to make mistakes. Of the four human Y-family members, human DNA polymerase iota (hpol ι) is the most error-prone. In the current study, we elucidate the molecular basis for improving the fidelity of hpol ι through use of the fixed-conformation nucleotide North-methanocarba-2'-deoxyadenosine triphosphate (N-MC-dATP). Three crystal structures were solved of hpol ι in complex with DNA containing a template 2'-deoxythymidine (dT) paired with an incoming dNTP or modified nucleotide triphosphate. The ternary complex of hpol ι inserting N-MC-dATP opposite dT reveals that the adenine ring is stabilized in the anti orientation about the pseudo-glycosyl torsion angle, which mimics precisely the mutagenic arrangement of dGTP:dT normally preferred by hpol ι. The stabilized anti conformation occurs without notable contacts from the protein but likely results from constraints imposed by the bicyclo[3.1.0]hexane scaffold of the modified nucleotide. Unmodified dATP and South-MC-dATP each adopt syn glycosyl orientations to form Hoogsteen base pairs with dT. The Hoogsteen orientation exhibits weaker base-stacking interactions and is less catalytically favorable than anti N-MC-dATP. Thus, N-MC-dATP corrects the error-prone nature of hpol ι by preventing the Hoogsteen base-pairing mode normally observed for hpol ι-catalyzed insertion of dATP opposite dT. These results provide a previously unrecognized means of altering the efficiency and the fidelity of a human translesion DNA polymerase.

  8. A nucleotide analogue induced gain of function corrects the error-prone nature of human DNA polymerase iota

    PubMed Central

    Ketkar, Amit; Zafar, Maroof K.; Banerjee, Surajit; Marquez, Victor E.; Egli, Martin; Eoff, Robert L

    2012-01-01

    Y-family DNA polymerases participate in replication stress and DNA damage tolerance mechanisms. The properties that allow these enzymes to copy past bulky adducts or distorted template DNA can result in a greater propensity for them to make mistakes. Of the four human Y-family members, human DNA polymerase iota (hpol ι) is the most error-prone. In the current study, we elucidate the molecular basis for improving the fidelity of hpol ι through use of the fixed-conformation nucleotide North-methanocarba-2′-deoxyadenosine triphosphate (N-MC-dATP). Three crystal structures were solved of hpol ι in complex with DNA containing a template 2′-deoxythymidine (dT) paired with an incoming dNTP or modified nucleotide triphosphate. The ternary complex of hpol ι inserting N-MC-dATP opposite dT reveals that the adenine ring is stabilized in the anti orientation about the pseudo-glycosyl torsion angle (χ), which mimics precisely the mutagenic arrangement of dGTP:dT normally preferred by hpol ι. The stabilized anti conformation occurs without notable contacts from the protein but likely results from constraints imposed by the bicyclo[3.1.0]hexane scaffold of the modified nucleotide. Unmodified dATP and South-MC-dATP each adopt syn glycosyl orientations to form Hoogsteen base pairs with dT. The Hoogsteen orientation exhibits weaker base stacking interactions and is less catalytically favorable than anti N-MC-dATP. Thus, N-MC-dATP corrects the error-prone nature of hpol ι by preventing the Hoogsteen base-pairing mode normally observed for hpol ι-catalyzed insertion of dATP opposite dT. These results provide a previously unrecognized means of altering the efficiency and the fidelity of a human translesion DNA polymerase. PMID:22632140

  9. Demonstration of qubit operations below a rigorous fault tolerance threshold with gate set tomography

    DOE PAGES

    Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik; ...

    2017-02-15

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4).

  10. Software for Quantifying and Simulating Microsatellite Genotyping Error

    PubMed Central

    Johnson, Paul C.D.; Haydon, Daniel T.

    2007-01-01

    Microsatellite genetic marker data are exploited in a variety of fields, including forensics, gene mapping, kinship inference and population genetics. In all of these fields, inference can be thwarted by failure to quantify and account for data errors, and kinship inference in particular can benefit from separating errors into two distinct classes: allelic dropout and false alleles. Pedant is MS Windows software for estimating locus-specific maximum likelihood rates of these two classes of error. Estimation is based on comparison of duplicate error-prone genotypes: neither reference genotypes nor pedigree data are required. Other functions include: plotting of error rate estimates and confidence intervals; simulations for performing power analysis and for testing the robustness of error rate estimates to violation of the underlying assumptions; and estimation of expected heterozygosity, which is a required input. The program, documentation and source code are available from http://www.stats.gla.ac.uk/~paulj/pedant.html. PMID:20066126

  11. Demonstration of qubit operations below a rigorous fault tolerance threshold with gate set tomography

    PubMed Central

    Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik; Rudinger, Kenneth; Mizrahi, Jonathan; Fortier, Kevin; Maunz, Peter

    2017-01-01

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4). PMID:28198466

  12. Demonstration of qubit operations below a rigorous fault tolerance threshold with gate set tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blume-Kohout, Robin; Gamble, John King; Nielsen, Erik

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if—and only if—the error in each physical qubit operation is smaller than a certain threshold. The threshold for general errors is quantified by their diamond norm. Until now, qubits have been assessed primarily by randomized benchmarking, which reports a different error rate that is not sensitive to all errors, and cannot be compared directly to diamond norm thresholds. Here we use gate set tomography to completely characterize operations on a trapped-Yb+-ion qubit and demonstrate with greater than 95% confidence that they satisfy a rigorous threshold for FTQEC (diamond norm ≤6.7 × 10−4).

  13. Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning

    NASA Astrophysics Data System (ADS)

    Cui, J.; Dong, B.; Li, J.; Li, L.

    2017-09-01

    As a fundamental work of urban planning, the intensity analysis of construction land involves many repetitive data processing tasks that are prone to errors or loss of data precision, while current urban planning lacks efficient methods and tools for visualizing the analysis results. In this research, a portable tool was developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for these tasks. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values and zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be dispatched quickly between planning teams.

  14. ADASS Web Database XML Project

    NASA Astrophysics Data System (ADS)

    Barg, M. I.; Stobie, E. B.; Ferro, A. J.; O'Neil, E. J.

    In the spring of 2000, at the request of the ADASS Program Organizing Committee (POC), we began organizing information from previous ADASS conferences in an effort to create a centralized database. The beginnings of this database originated from data (invited speakers, participants, papers, etc.) extracted from HyperText Markup Language (HTML) documents from past ADASS host sites. Unfortunately, not all HTML documents are well formed and parsing them proved to be an iterative process. It was evident at the beginning that if these Web documents were organized in a standardized way, such as XML (Extensible Markup Language), the processing of this information across the Web could be automated, more efficient, and less error prone. This paper will briefly review the many programming tools available for processing XML, including Java, Perl and Python, and will explore the mapping of relational data from our MySQL database to XML.
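
    As a flavor of the relational-to-XML mapping the paper explores, a minimal sketch using Python's standard library (the conference-record schema below is invented for illustration):

```python
import xml.etree.ElementTree as ET

# A row as it might come back from the MySQL conference database
row = {"year": "1999", "site": "Tucson", "title": "ADASS Web Database",
       "authors": "Barg, M. I.; Stobie, E. B."}

# Map the flat relational row onto a nested XML structure
conf = ET.Element("conference", year=row["year"], site=row["site"])
paper = ET.SubElement(conf, "paper")
ET.SubElement(paper, "title").text = row["title"]
for name in row["authors"].split("; "):
    ET.SubElement(paper, "author").text = name

print(ET.tostring(conf, encoding="unicode"))
```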

  15. TOOKUIL: A case study in user interface development for safety code application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  16. Effect of lethality on the extinction and on the error threshold of quasispecies.

    PubMed

    Tejero, Hector; Marín, Arturo; Montero, Francisco

    2010-02-21

    In this paper the effect of lethality on the error threshold and on extinction has been studied in a population of error-prone self-replicating molecules. For given lethality and a simple fitness landscape, three dynamic regimes can be obtained: quasispecies, error catastrophe, and extinction. Using a simple model in which molecules are classified as master, lethal and non-lethal mutants, it is possible to obtain the mutation rates of the transitions between the three regimes analytically. The numerical resolution of the extended model, in which molecules are classified depending on their Hamming distance to the master sequence, confirms the results obtained in the simple model and shows how the error catastrophe regime changes when lethality is taken into account. (c) 2009 Elsevier Ltd. All rights reserved.
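
    For context, the classic quasispecies error threshold without lethality (standard theory, not derived in this abstract): for a master sequence of length L, per-site copying fidelity q and selective advantage sigma, the quasispecies regime persists only while

```latex
% Eigen error threshold (no lethality): Q = q^L is the probability of
% copying the entire master sequence without error.
\[
  Q = q^{L} > \frac{1}{\sigma}
  \quad\Longleftrightarrow\quad
  \mu = 1 - q \;<\; 1 - \sigma^{-1/L} \;\approx\; \frac{\ln \sigma}{L},
\]
% above this per-site error rate \mu the population delocalizes (error
% catastrophe); lethal mutants shift these transition boundaries.
```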

  17. Ability/Motivation Interactions in Complex Skill Acquisition

    DTIC Science & Technology

    1988-04-28

    attentional resources. Finally, in the declarative knowledge phase, performance is slow and error prone. Once the learner has come to an adequate cognitive...mediation by the learner. After a substantial amount of consistent task practice, skilled performance becomes fast, accurate, and the task can often be

  18. DNA polymerase η mutational signatures are found in a variety of different types of cancer.

    PubMed

    Rogozin, Igor B; Goncearenco, Alexander; Lada, Artem G; De, Subhajyoti; Yurchenko, Vyacheslav; Nudelman, German; Panchenko, Anna R; Cooper, David N; Pavlov, Youri I

    2018-01-01

    DNA polymerase (pol) η is a specialized error-prone polymerase with at least two quite different and contrasting cellular roles: to mitigate the genetic consequences of solar UV irradiation, and promote somatic hypermutation in the variable regions of immunoglobulin genes. Misregulation and mistargeting of pol η can compromise genome integrity. We explored whether the mutational signature of pol η could be found in datasets of human somatic mutations derived from normal and cancer cells. A substantial excess of single and tandem somatic mutations within known pol η mutable motifs was noted in skin cancer as well as in many other types of human cancer, suggesting that somatic mutations in A:T bases generated by DNA polymerase η are a common feature of tumorigenesis. Another peculiarity of pol η mutational signatures, mutations in YCG motifs, led us to speculate that error-prone DNA synthesis opposite methylated CpG dinucleotides by misregulated pol η in tumors might constitute an additional mechanism of cytosine demethylation in this hypermutable dinucleotide.

  19. Chiron: translating nanopore raw signal directly into nucleotide sequence using deep learning.

    PubMed

    Teng, Haotian; Cao, Minh Duc; Hall, Michael B; Duarte, Tania; Wang, Sheng; Coin, Lachlan J M

    2018-05-01

    Sequencing by translocating DNA fragments through an array of nanopores is a rapidly maturing technology that offers faster and cheaper sequencing than other approaches. However, accurately deciphering the DNA sequence from the noisy and complex electrical signal is challenging. Here, we report Chiron, the first deep learning model to achieve end-to-end basecalling and directly translate the raw signal to DNA sequence without the error-prone segmentation step. Trained with only a small set of 4,000 reads, we show that our model provides state-of-the-art basecalling accuracy, even on previously unseen species. Chiron achieves basecalling speeds of more than 2,000 bases per second using desktop computer graphics processing units.

  20. Somatic stem cells and the kinetics of mutagenesis and carcinogenesis

    PubMed Central

    Cairns, John

    2002-01-01

    There is now strong experimental evidence that epithelial stem cells arrange their sister chromatids at mitosis such that the same template DNA strands stay together through successive divisions; DNA labeled with tritiated thymidine in infancy is still present in the stem cells of adult mice even though these cells are incorporating (and later losing) bromodeoxyuridine [Potten, C. S., Owen, G., Booth, D. & Booth, C. (2002) J. Cell Sci. 115, 2381–2388]. But a cell that preserves "immortal strands" will avoid the accumulation of replication errors only if it inhibits those pathways for DNA repair that involve potentially error-prone resynthesis of damaged strands, and this appears to be a property of intestinal stem cells because they are extremely sensitive to the lethal effects of agents that damage DNA. It seems that the combination, in the stem cell, of immortal strands and the choice of death rather than error-prone repair makes epithelial stem cell systems resistant to short exposures to DNA-damaging agents, because the stem cell accumulates few if any errors, and any errors made by the daughters are destined to be discarded. This paper discusses these issues and shows that they lead to a model that explains the strange kinetics of mutagenesis and carcinogenesis in adult mammalian tissues. Coincidentally, the model also can explain why cancers arise even though the spontaneous mutation rate of differentiated mammalian cells is not high enough to generate the multiple mutations needed to form a cancer and why loss of nucleotide-excision repair does not significantly increase the frequency of the common internal cancers. PMID:12149477

  1. Hybrid learning in signalling games

    NASA Astrophysics Data System (ADS)

    Barrett, Jeffrey A.; Cochran, Calvin T.; Huttegger, Simon; Fujiwara, Naoki

    2017-09-01

    Lewis-Skyrms signalling games have been studied under a variety of low-rationality learning dynamics. Reinforcement dynamics are stable but slow and prone to evolving suboptimal signalling conventions. A low-inertia trial-and-error dynamic like win-stay/lose-randomise is fast and reliable at finding perfect signalling conventions but unstable in the context of noise or agent error. Here we consider a low-rationality hybrid of reinforcement and win-stay/lose-randomise learning that exhibits the virtues of both. This hybrid dynamics is reliable, stable and exceptionally fast.
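
    A minimal sketch of win-stay/lose-randomise in a two-state Lewis signalling game (our toy implementation; the paper's hybrid dynamics additionally blends in reinforcement weights):

```python
import random

def wslr_signalling(n_rounds=2000, n=2, seed=0):
    """Win-stay/lose-randomise in an n-state/n-signal/n-act Lewis game:
    after a failed round, sender and receiver re-randomise the rule
    they just used; after a success, they keep it."""
    random.seed(seed)
    sender = {s: random.randrange(n) for s in range(n)}    # state -> signal
    receiver = {m: random.randrange(n) for m in range(n)}  # signal -> act
    successes = 0
    for _ in range(n_rounds):
        state = random.randrange(n)
        signal = sender[state]
        act = receiver[signal]
        if act == state:
            successes += 1                          # win: stay
        else:
            sender[state] = random.randrange(n)     # lose: randomise
            receiver[signal] = random.randrange(n)
    return successes / n_rounds

# Long-run success rate approaches 1 as a perfect convention locks in
print(wslr_signalling())
```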

  2. Reduced vision selectively impairs spatial updating in fall-prone older adults.

    PubMed

    Barrett, Maeve M; Doheny, Emer P; Setti, Annalisa; Maguinness, Corrina; Foran, Timothy G; Kenny, Rose Anne; Newell, Fiona N

    2013-01-01

    The current study examined the role of vision in spatial updating and its potential contribution to an increased risk of falls in older adults. Spatial updating was assessed using a path integration task in fall-prone and healthy older adults. Specifically, participants conducted a triangle completion task in which they were guided along two sides of a triangular route and were then required to return, unguided, to the starting point. During the task, participants could either clearly view their surroundings (full vision) or visuo-spatial information was reduced by means of translucent goggles (reduced vision). Path integration performance was measured by calculating the distance and angular deviation from the participant's return point relative to the starting point. Gait parameters for the unguided walk were also recorded. We found equivalent performance across groups on all measures in the full vision condition. In contrast, in the reduced vision condition, where participants had to rely on interoceptive cues to spatially update their position, fall-prone older adults made significantly larger distance errors relative to healthy older adults. However, there were no other performance differences between fall-prone and healthy older adults. These findings suggest that fall-prone older adults, compared to healthy older adults, have greater difficulty in reweighting other sensory cues for spatial updating when visual information is unreliable.

  3. Relationships between trait impulsivity and cognitive control: the effect of attention switching on response inhibition and conflict resolution.

    PubMed

    Leshem, Rotem

    2016-02-01

    This study examined the relationship between trait impulsivity and cognitive control, as measured by the Barratt Impulsiveness Scale (BIS) and a focused attention dichotic listening to words task, respectively. In the task, attention was manipulated in two attention conditions differing in their cognitive control demands: one in which attention was directed to one ear at a time for a whole block of trials (blocked condition) and another in which attention was switched pseudo-randomly between the two ears from trial to trial (mixed condition). Results showed that high impulsivity participants exhibited more false alarm and intrusion errors as well as a lesser ability to distinguish between stimuli in the mixed condition, as compared to low impulsivity participants. In the blocked condition, the performance levels of the two groups were comparable with respect to these measures. In addition, total BIS scores were correlated with intrusions and laterality index in the mixed but not the blocked condition. The findings suggest that high impulsivity individuals may be less prone to attentional difficulties when cognitive load is relatively low. In contrast, when attention switching is involved, high impulsivity is associated with greater difficulty in inhibiting responses and resolving cognitive conflict than is low impulsivity, as reflected in error-prone information processing. The conclusion is that trait impulsivity in a non-clinical population is manifested more strongly when attention switching is required than during maintained attention. This may have important implications for the conceptualization and treatment of impulsivity in both non-clinical and clinical populations.

  4. Coordinating DNA polymerase traffic during high and low fidelity synthesis.

    PubMed

    Sutton, Mark D

    2010-05-01

    With the discovery that organisms possess multiple DNA polymerases (Pols) displaying different fidelities, processivities, and activities came the realization that mechanisms must exist to manage the actions of these diverse enzymes to prevent gratuitous mutations. Although many of the Pols encoded by most organisms are largely accurate, and participate in DNA replication and DNA repair, a sizeable fraction display a reduced fidelity, and act to catalyze potentially error-prone translesion DNA synthesis (TLS) past lesions that persist in the DNA. Striking the proper balance between use of these different enzymes during DNA replication, DNA repair, and TLS is essential for ensuring accurate duplication of the cell's genome. This review highlights mechanisms that organisms utilize to manage the actions of their different Pols. A particular emphasis is placed on discussion of current models for how different Pols switch places with each other at the replication fork during high fidelity replication and potentially error-prone TLS. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  5. Radiation damage and repair in cells and cell components. Progress report, 1980-1981

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-01-01

    One aim has been to see whether, in E. coli, the various phenomena which were ascribed to the induction of the recA gene product (p-recA) are really manifestations of one process. It was concluded that this is true for septum inhibition, Weigle-reactivation, induced inhibition of post-radiation DNA degradation, and, with the additional concept of a premutational lesion, for UV mutagenesis. Lambda prophage induction may perhaps be brought into line with p-recA induction with the consideration of the additional secondary aspects of (a) activation of p-recA to make it enzymatically active and (b) the need to have the concentration of activated p-recA high enough to keep up with the rate of production of lambda repressors. Revertants seem to fall into more than one class, and two of these cannot easily be explained by the idea that p-recA contains an error-prone repair enzyme that makes errors at mutagenic lesions.

  6. Automated structure refinement of macromolecular assemblies from cryo-EM maps using Rosetta.

    PubMed

    Wang, Ray Yu-Ruei; Song, Yifan; Barad, Benjamin A; Cheng, Yifan; Fraser, James S; DiMaio, Frank

    2016-09-26

    Cryo-EM has revealed the structures of many challenging yet exciting macromolecular assemblies at near-atomic resolution (3–4.5 Å), providing biological phenomena with molecular descriptions. However, at these resolutions, accurately positioning individual atoms remains challenging and error-prone. Manually refining thousands of amino acids - typical in a macromolecular assembly - is tedious and time-consuming. We present an automated method that can improve the atomic details in models that are manually built in near-atomic-resolution cryo-EM maps. Applying the method to three systems recently solved by cryo-EM, we are able to improve model geometry while maintaining the fit-to-density. Backbone placement errors are automatically detected and corrected, and the refinement shows a large radius of convergence. The results demonstrate that the method is amenable to structures with symmetry, of very large size, and containing RNA as well as covalently bound ligands. The method should streamline the cryo-EM structure determination process, providing accurate and unbiased atomic structure interpretation of such maps.

  7. The "subjective" pupil old/new effect: is the truth plain to see?

    PubMed

    Montefinese, Maria; Ambrosini, Ettore; Fairfield, Beth; Mammarella, Nicola

    2013-07-01

    Human memory is an imperfect process, prone to distortion and errors that range from minor disturbances to major errors that can have serious consequences on everyday life. In this study, we investigated false remembering of manipulatory verbs using an explicit recognition task and pupillometry. Our results replicated the "classical" pupil old/new effect as well as data in the false remembering literature showing that items must be recognized as old in order for the pupil size to increase (e.g., "subjective" pupil old/new effect), even though these items do not necessarily have to be truly old. These findings support the strength-of-memory trace account, which holds that pupil dilation is related to experience rather than to the accuracy of recognition. Moreover, behavioral results showed higher rates of true and false recognitions for manipulatory verbs and a consequently larger pupil diameter, supporting the embodied view of language. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. A Novel Real-Time Reference Key Frame Scan Matching Method.

    PubMed

    Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu

    2017-05-07

    Unmanned aerial vehicles represent an effective technology for indoor search and rescue operations. Typically, most indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by a simultaneous localization and mapping approach using either local or global methods. Both suffer from accumulated errors and high processing time due to the iterative nature of the scan matching method. Moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF). RKF is a hybrid scan matching technique comprising feature-to-feature and point-to-point approaches. The algorithm aims at mitigating error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It falls back on the iterative closest point algorithm when linear features are lacking, as is typical of unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results with very short computational times, indicating its potential for use in real-time systems.
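
    The point-to-point fallback can be illustrated with a bare-bones ICP iteration (nearest-neighbour matching plus an SVD-solved rigid transform; the RKF feature layer is not reproduced here, and all values are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(src, dst):
    """One point-to-point ICP iteration: match each source point to its
    nearest destination point, then solve the best rigid transform
    (Kabsch/SVD) aligning the matched pairs."""
    tree = cKDTree(dst)
    _, idx = tree.query(src)
    matched = dst[idx]
    mu_s, mu_d = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t, R, t

# Example: recover a small rotation + translation of a 2D scan
scan = np.random.rand(300, 2)
theta = 0.05
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
target = scan @ R_true.T + np.array([0.1, -0.05])
aligned = scan
for _ in range(20):
    aligned, R, t = icp_step(aligned, target)
print(np.abs(aligned - target).max())   # should approach 0
```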

  9. Adopting Extensible Business Reporting Language (XBRL): A Grounded Theory

    ERIC Educational Resources Information Center

    Cruz, Marivic

    2010-01-01

    In 2007 and 2008, government challenges consisted of error prone, manually intensive, and inefficient environments for financial reporting. Banking regulators worldwide faced issues with respect to transparency, timeliness, quality, and managing risks associated with accounting opacity. The general problem was the existing reporting standards and…

  10. Efficient Dependency Computation for Dynamic Hybrid Bayesian Network in On-line System Health Management Applications

    DTIC Science & Technology

    2014-10-02

    intervals (Neil, Tailor, Marquez, Fenton, & Hear, 2007). This is cumbersome, error prone and usually inaccurate. Even though a universal framework...Science. Neil, M., Tailor, M., Marquez, D., Fenton, N., & Hear. (2007). Inference in Bayesian networks using dynamic discretisation. Statistics

  11. Automatic latency equalization in VHDL-implemented complex pipelined systems

    NASA Astrophysics Data System (ADS)

    Zabołotny, Wojciech M.

    2016-09-01

    In pipelined data processing systems it is very important to ensure that parallel paths delay data by the same number of clock cycles. If that condition is not met, the processing blocks receive data that are not properly aligned in time and produce incorrect results. Manual equalization of latencies is tedious and error-prone work. This paper presents an automatic method of latency equalization in systems described in VHDL. The proposed method uses simulation to measure latencies and verify the introduced correction. The solution is portable between different simulation and synthesis tools, and the method does not increase the complexity of the synthesized design compared to a solution based on manual latency adjustment. An example implementation of the proposed methodology, together with a simple design demonstrating its use, is available as an open source project under the BSD license.
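
    The bookkeeping the tool automates reduces, in essence, to padding each parallel path to the latency of the slowest one. A toy sketch (in Python rather than VHDL, with invented path names; the real tool measures the latencies in simulation rather than taking them as inputs):

```python
def equalize(paths):
    """Given measured latencies (in clock cycles) of parallel pipeline
    paths that reconverge at one processing block, return how many
    delay registers to append to each path so all match the slowest."""
    worst = max(paths.values())
    return {name: worst - lat for name, lat in paths.items()}

# Example: three parallel paths feeding one data-merge block
paths = {"raw": 3, "filtered": 7, "trigger": 5}
print(equalize(paths))   # {'raw': 4, 'filtered': 0, 'trigger': 2}
```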

  12. Covariance NMR Processing and Analysis for Protein Assignment.

    PubMed

    Harden, Bradley J; Frueh, Dominique P

    2018-01-01

    During NMR resonance assignment it is often necessary to relate nuclei to one another indirectly, through their common correlations to other nuclei. Covariance NMR has emerged as a powerful technique to correlate such nuclei without relying on error-prone peak picking. However, false-positive artifacts in covariance spectra have impeded a general application to proteins. We recently introduced pre- and postprocessing steps to reduce the prevalence of artifacts in covariance spectra, allowing for the calculation of a variety of 4D covariance maps obtained from diverse combinations of pairs of 3D spectra, and we have employed them to assign backbone and sidechain resonances in two large and challenging proteins. In this chapter, we present a detailed protocol describing how to (1) properly prepare existing 3D spectra for covariance, (2) understand and apply our processing script, and (3) navigate and interpret the resulting 4D spectra. We also provide solutions to a number of errors that may occur when using our script, and we offer practical advice for assigning difficult signals. We believe such 4D spectra, and covariance NMR in general, can play an integral role in the assignment of NMR signals.
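
    For reference, the textbook covariance relations underlying such maps (standard covariance NMR, not the chapter's exact 4D construction): for a data matrix F of a single spectrum, and for two spectra F_a and F_b sharing a common dimension,

```latex
% Standard covariance NMR relations (not taken from the chapter):
% direct covariance of one spectrum, and generalized covariance of
% two spectra F_a, F_b that share a common dimension.
\[
  C = \left( F^{\mathsf{T}} F \right)^{1/2},
  \qquad
  C_{ab} = F_a^{\mathsf{T}} F_b ,
\]
% the matrix square root restores spectrum-like line shapes and
% intensities; artifacts in C_ab are what the chapter's pre- and
% postprocessing steps are designed to suppress.
```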

  13. Towards automatic Markov reliability modeling of computer architectures

    NASA Technical Reports Server (NTRS)

    Liceaga, C. A.; Siewiorek, D. P.

    1986-01-01

    The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes such as standby redundancy and repair, or renewal processes such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Therefore model formulation is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
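
    To make the target concrete, the sketch below evaluates the kind of small continuous-time Markov model ARM would emit, for a hypothetical two-unit standby system with repair; the states, rates, and mission time are illustrative assumptions, not ARM output.

```python
# Evaluate a tiny standby-redundancy Markov model by matrix exponentiation.
# States: 0 = both units up, 1 = one unit failed (spare active),
# 2 = system failed (absorbing). Rates are illustrative.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1                          # failure / repair rates per hour
Q = np.array([[-lam,        lam,  0.0],
              [  mu, -(mu + lam), lam],
              [ 0.0,        0.0,  0.0]])      # generator; rows sum to zero

p0 = np.array([1.0, 0.0, 0.0])                # start with both units up
t = 1000.0                                    # mission time in hours
p_t = p0 @ expm(Q * t)                        # state probabilities at time t
print(f"Reliability R({t:.0f} h) = {1.0 - p_t[2]:.6f}")
```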

  14. Decreasing scoring errors on Wechsler Scale Vocabulary, Comprehension, and Similarities subtests: a preliminary study.

    PubMed

    Linger, Michele L; Ray, Glen E; Zachar, Peter; Underhill, Andrea T; LoBello, Steven G

    2007-10-01

    Studies of graduate students learning to administer the Wechsler scales have generally shown that training is not associated with the development of scoring proficiency. Many studies report on the reduction of aggregated administration and scoring errors, a strategy that does not highlight the reduction of errors on subtests identified as most prone to error. This study evaluated the development of scoring proficiency specifically on the Wechsler (WISC-IV and WAIS-III) Vocabulary, Comprehension, and Similarities subtests during training by comparing a set of 'early test administrations' to 'later test administrations.' Twelve graduate students enrolled in an intelligence-testing course participated in the study. Scoring errors (e.g., incorrect point assignment) were evaluated on the students' actual practice administration test protocols. Errors on all three subtests declined significantly when scoring errors on 'early' sets of Wechsler scales were compared to those made on 'later' sets. However, correcting these subtest scoring errors did not cause significant changes in subtest scaled scores. Implications for clinical instruction and future research are discussed.

  15. Simplified stereo-optical ultrasound plane calibration

    NASA Astrophysics Data System (ADS)

    Hoßbach, Martin; Noll, Matthias; Wesarg, Stefan

    2013-03-01

    Image guided therapy is a natural concept and commonly used in medicine. In anesthesia, a common task is the injection of an anesthetic close to a nerve under freehand ultrasound guidance. Several guidance systems exist using electromagnetic tracking of the ultrasound probe as well as the needle, providing the physician with a precise projection of the needle into the ultrasound image. This, however, requires additional expensive devices. We suggest using optical tracking with miniature cameras attached to a 2D ultrasound probe to achieve a higher acceptance among physicians. The purpose of this paper is to present an intuitive method to calibrate freehand ultrasound needle guidance systems employing a rigid stereo camera system. State-of-the-art methods are based on a complex series of error-prone coordinate system transformations, which makes them susceptible to error accumulation. By reducing the number of calibration steps to a single calibration procedure we provide a calibration method that is equivalent, yet not prone to error accumulation. It requires a linear calibration object and is validated on three datasets utilizing different calibration objects: a 6 mm metal bar and a 1.25 mm biopsy needle were used for experiments. Compared to existing calibration methods for freehand ultrasound needle guidance systems, we are able to achieve higher accuracy results while additionally reducing the overall calibration complexity.

  16. OPTIMA: sensitive and accurate whole-genome alignment of error-prone genomic maps by combinatorial indexing and technology-agnostic statistical analysis.

    PubMed

    Verzotto, Davide; M Teo, Audrey S; Hillmer, Axel M; Nagarajan, Niranjan

    2016-01-01

    Resolution of complex repeat structures and rearrangements in the assembly and analysis of large eukaryotic genomes is often aided by a combination of high-throughput sequencing and genome-mapping technologies (for example, optical restriction mapping). In particular, mapping technologies can generate sparse maps of large DNA fragments (150 kilobase pairs (kbp) to 2 Mbp) and thus provide a unique source of information for disambiguating complex rearrangements in cancer genomes. Despite their utility, combining high-throughput sequencing and mapping technologies has been challenging because of the lack of efficient and sensitive map-alignment algorithms for robustly aligning error-prone maps to sequences. We introduce a novel seed-and-extend glocal (short for global-local) alignment method, OPTIMA (and a sliding-window extension for overlap alignment, OPTIMA-Overlap), which is the first to create indexes for continuous-valued mapping data while accounting for mapping errors. We also present a novel statistical model, agnostic with respect to technology-dependent error rates, for conservatively evaluating the significance of alignments without relying on expensive permutation-based tests. We show that OPTIMA and OPTIMA-Overlap outperform other state-of-the-art approaches (1.6-2 times more sensitive) and are more efficient (170-200%) and precise in their alignments (nearly 99% precision). These advantages are independent of the quality of the data, suggesting that our indexing approach and statistical evaluation are robust, provide improved sensitivity and guarantee high precision.
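
    A minimal sketch of the indexing idea: quantize consecutive fragment sizes into bins so that k-mers of a continuous-valued, error-prone map can be looked up in a reference index. The bin width, k, and maps below are assumptions for illustration, not OPTIMA's parameters, and a real method needs error-tolerant lookups around bin boundaries.

```python
# Toy seed lookup for restriction maps (fragment sizes in bp).
from collections import defaultdict

def build_index(reference, k=3, bin_size=2000):
    key = lambda frags: tuple(f // bin_size for f in frags)   # quantize sizes
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[key(reference[i:i + k])].append(i)
    return index, key

def seed_positions(query, reference, k=3, bin_size=2000):
    index, key = build_index(reference, k, bin_size)
    hits = set()
    for j in range(len(query) - k + 1):
        for i in index.get(key(query[j:j + k]), []):
            hits.add(i - j)                    # implied query start on reference
    return sorted(hits)

reference = [5200, 11800, 3100, 7600, 9400, 2500, 6300]
query = [11650, 3200, 7500]                    # noisy copy of fragments 1..3
print(seed_positions(query, reference))        # -> [1]
```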

  17. Integration of multi-sensor data to measure soil surface changes

    NASA Astrophysics Data System (ADS)

    Eltner, Anette; Schneider, Danilo

    2016-04-01

    Digital elevation models (DEM) of high resolution and accuracy covering a suitably sized area of interest can be a promising approach to help understand the processes of soil erosion. Ideally, the plot under investigation remains undisturbed during monitoring. The fragile marl landscape in Andalusia (Spain) is especially prone to soil detachment and transport with unique sediment connectivity characteristics due to the soil properties and climatic conditions. A 600 m² field plot is established and monitored during three field campaigns (Sep. 2013, Nov. 2013 and Feb. 2014). Unmanned aerial vehicle (UAV) photogrammetry and terrestrial laser scanning (TLS) are suitable tools to generate high resolution topography data that describe soil surface changes at large field plots, and the advantages of both methods are utilised in a synergetic manner. On the one hand, TLS data is assumed to comprise a higher reliability regarding consistent error behaviour than DEMs derived from overlapping UAV images. Therefore, global errors (e.g. dome effect) and local errors (e.g. DEM blunders due to erroneous image matching) within the UAV data are assessed with the DEMs produced by TLS. Furthermore, TLS point clouds allow for fast and reliable filtering of vegetation spots, which is not as straightforward within the UAV data due to known image matching problems in areas displaying plant cover. On the other hand, systematic DEM errors linked to TLS are detected and possibly corrected utilising the DEMs reconstructed from overlapping UAV images. Furthermore, TLS point clouds are filtered corresponding to the degree of point quality, which is estimated from parameters of the scan geometry (i.e. incidence angle and footprint size). This is especially relevant for this study because the area of interest is located at gentle hillslopes that are prone to soil erosion. Thus, the view of the scanning device onto the surface results in an adverse angle, which is only slightly improved by the use of a 4 m high tripod. Surface roughness is considered as a further parameter to evaluate the TLS point quality. The filtering tool allows for choosing each data point either from the TLS or UAV data corresponding to the data acquisition geometry and surface properties. The filtered points are merged into one point cloud, which is finally processed to reduce remaining data noise. DEM analysis reveals a continuous decrease of soil surface roughness after tillage, the reappearance of former wheel tracks and local patterns of erosion as well as accumulation.

  18. List of Error-Prone Abbreviations, Symbols, and Dose Designations

    MedlinePlus

    ... unit dose (e.g., diltiazem 125 mg IV infusion “UD” misinterpreted as meaning to give the entire infusion as a unit [bolus] dose) Use “as directed” ... Names Intended Meaning Misinterpretation Correction “Nitro” drip nitroglycerin infusion Mistaken as sodium nitroprusside infusion Use complete drug ...

  19. Improving Advising Using Technology and Data Analytics

    ERIC Educational Resources Information Center

    Phillips, Elizabeth D.

    2013-01-01

    Traditionally, the collegiate advising system provides each student with a personal academic advisor who designs a pathway to the degree for that student in face-to-face meetings. Ideally, this is a supportive mentoring relationship. In truth, however, this system is highly inefficient, error-prone, expensive, and a source of ubiquitous student…

  20. Finite element modeling of light propagation in fruit under illumination of continuous-wave beam

    USDA-ARS?s Scientific Manuscript database

    Spatially-resolved spectroscopy provides a means for measuring the optical properties of biological tissues, based on analytical solutions to diffusion approximation for semi-infinite media under the normal illumination of infinitely small size light beam. The method is, however, prone to error in m...

  1. Finite element simulation of light transfer in turbid media under structured illumination

    USDA-ARS?s Scientific Manuscript database

    The spatial-frequency domain (SFD) imaging technique allows estimation of the optical properties of biological tissues over a wide field of view. The technique is, however, prone to error in measurement because the two crucial assumptions used for deriving the analytical solution to diffusion approximation ...

  2. Propensity Score Weighting with Error-Prone Covariates

    ERIC Educational Resources Information Center

    McCaffrey, Daniel F.; Lockwood, J. R.; Setodji, Claude M.

    2011-01-01

    Inverse probability weighting (IPW) estimates are widely used in applications where data are missing due to nonresponse or censoring or in observational studies of causal effects where the counterfactuals cannot be observed. This extensive literature has shown the estimators to be consistent and asymptotically normal under very general conditions,…
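
    As background for the abstract, the sketch below is a minimal IPW estimate of a treatment effect on synthetic data: fit a propensity model, weight each unit by the inverse probability of the treatment it actually received, and compare weighted outcome means. The data-generating model is an assumption for illustration; the article's error-prone-covariate setting is not modeled here.

```python
# Minimal inverse probability weighting (IPW) sketch on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
x = rng.standard_normal((n, 2))                          # observed covariates
z = rng.binomial(1, 1 / (1 + np.exp(-0.8 * x[:, 0])))    # treatment assignment
y = 2.0 * z + x[:, 0] + rng.standard_normal(n)           # outcome; true effect = 2

e_hat = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]  # propensities
w = z / e_hat + (1 - z) / (1 - e_hat)                    # IPW weights
ate = (np.sum(w * z * y) / np.sum(w * z)
       - np.sum(w * (1 - z) * y) / np.sum(w * (1 - z)))
print(f"IPW estimate of treatment effect: {ate:.2f}")    # close to 2.0
```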

  3. How Emotions Affect Learning.

    ERIC Educational Resources Information Center

    Sylwester, Robert

    1994-01-01

    Studies show our emotional system is a complex, widely distributed, and error-prone system that defines our basic personality early in life and is quite resistant to change. This article describes our emotional system's major parts (the peptides that carry emotional information and the body and brain structures that activate and regulate emotions)…

  4. Online Hand Holding in Fixing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2005-01-01

    According to most surveys, computer manufacturers such as HP put out reliable products, and computers in general are less troublesome than in the past. But personal computers are still prone to bugs, conflicts, viruses, spyware infestations, hacker and phishing attacks, and--most of all--user error. Unfortunately, technical support from computer…

  5. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
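
    As a hedged illustration of the concept (in ordinary code, not the authors' formal process language), the sketch below writes a fragment of a transfusion-like process as an explicit transition table in which every step declares its exception path, so failure handling is part of the definition rather than an afterthought. All step names are hypothetical.

```python
# A formal-style process definition: each step names both its normal ("ok")
# successor and its exception ("fail") successor. Steps are hypothetical.
PROCESS = {
    "verify_order":     {"ok": "check_patient_id", "fail": "abort"},
    "check_patient_id": {"ok": "crossmatch_blood", "fail": "resolve_identity"},
    "resolve_identity": {"ok": "crossmatch_blood", "fail": "abort"},
    "crossmatch_blood": {"ok": "administer",       "fail": "return_unit"},
    "return_unit":      {"ok": "abort",            "fail": "abort"},
    "administer":       {"ok": "monitor_patient",  "fail": "stop_and_report"},
}

def run(process, start, outcomes):
    """Walk the process, taking the declared path for each step's outcome."""
    step = start
    while step in process:
        result = outcomes.get(step, "ok")
        print(f"{step} -> {result}")
        step = process[step][result]
    print(f"terminal state: {step}")

# An identity-check failure is handled explicitly instead of being undefined.
run(PROCESS, "verify_order", {"check_patient_id": "fail"})
```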

  6. Spatial and temporal variability of the overall error of National Atmospheric Deposition Program measurements determined by the USGS collocated-sampler program, water years 1989-2001

    USGS Publications Warehouse

    Wetherbee, G.A.; Latysh, N.E.; Gordon, J.D.

    2005-01-01

    Data from the U.S. Geological Survey (USGS) collocated-sampler program for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) are used to estimate the overall error of NADP/NTN measurements. Absolute errors are estimated by comparison of paired measurements from collocated instruments. Spatial and temporal differences in absolute error were identified and are consistent with longitudinal distributions of NADP/NTN measurements and spatial differences in precipitation characteristics. The magnitude of error for calcium, magnesium, ammonium, nitrate, and sulfate concentrations, specific conductance, and sample volume is of minor environmental significance to data users. Data collected after a 1994 sample-handling protocol change are prone to less absolute error than data collected prior to 1994. Absolute errors are smaller during non-winter months than during winter months for selected constituents at sites where frozen precipitation is common. Minimum resolvable differences are estimated for different regions of the USA to aid spatial and temporal watershed analyses.

  7. The importance of robust error control in data compression applications

    NASA Technical Reports Server (NTRS)

    Woolley, S. I.

    1993-01-01

    Data compression has become an increasingly popular option as advances in information technology have placed further demands on data storage capabilities. With compression ratios as high as 100:1 the benefits are clear; however, the inherent intolerance of many compression formats to error events should be given careful consideration. If we consider that efficiently compressed data will ideally contain no redundancy, then the introduction of a channel error must result in a change of understanding from that of the original source. While the prefix property of codes such as Huffman enables resynchronisation, this is not sufficient to arrest propagating errors in an adaptive environment. Arithmetic, Lempel-Ziv, discrete cosine transform (DCT) and fractal methods are similarly prone to error-propagating behaviors. It is, therefore, essential that compression implementations provide sufficiently robust error control in order to maintain data integrity. Ideally, this control should be derived from a full understanding of the prevailing error mechanisms and their interaction with both the system configuration and the compression schemes in use.
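
    The fragility described here is easy to demonstrate: corrupting a single byte of a DEFLATE (zlib) stream typically renders the whole stream undecodable, whereas the same damage to the raw text would cost one character. A minimal, self-contained demonstration:

```python
# One flipped byte in a compressed stream versus in the raw text.
import zlib

text = b"the quick brown fox jumps over the lazy dog " * 100
packed = zlib.compress(text)

corrupted = bytearray(packed)
corrupted[len(corrupted) // 2] ^= 0xFF       # flip one byte mid-stream

try:
    zlib.decompress(bytes(corrupted))
except zlib.error as exc:                    # error propagates; data is lost
    print(f"decompression failed: {exc}")
```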

  8. Self-Interaction Error in Density Functional Theory: An Appraisal.

    PubMed

    Bao, Junwei Lucas; Gagliardi, Laura; Truhlar, Donald G

    2018-05-03

    Self-interaction error (SIE) is considered to be one of the major sources of error in most approximate exchange-correlation functionals for Kohn-Sham density-functional theory (KS-DFT), and it is large with all local exchange-correlation functionals and with some hybrid functionals. In this work, we consider systems conventionally considered to be dominated by SIE. For these systems, we demonstrate that by using multiconfiguration pair-density functional theory (MC-PDFT), the error of a translated local density-functional approximation is significantly reduced (by a factor of 3) when using an MCSCF density and on-top density, as compared to using KS-DFT with the parent functional; the error in MC-PDFT with local on-top functionals is even lower than the error in some popular KS-DFT hybrid functionals. Density-functional theory, either in MC-PDFT form with local on-top functionals or in KS-DFT form with some functionals having 50% or more nonlocal exchange, has smaller errors for SIE-prone systems than does CASSCF, which has no SIE.

  9. The Diagnosis of Error in Histories of Science

    NASA Astrophysics Data System (ADS)

    Thomas, William

    Whether and how to diagnose error in the history of science is a contentious issue. For many scientists, diagnosis is appealing because it allows them to discuss how knowledge can progress most effectively. Many historians disagree. They consider diagnosis inappropriate because it may discard features of past actors' thought that are important to understanding it, and may have even been intellectually productive. Ironically, these historians are apt to diagnose flaws in scientists' histories as proceeding from a misguided desire to idealize scientific method, and from their attendant identification of deviations from the ideal as, ipso facto, a paramount source of error in historical science. While both views have some merit, they should be reconciled if a more harmonious and productive relationship between the disciplines is to prevail. In To Explain the World, Steven Weinberg narrates the slow but definite emergence of what we call science from long traditions of philosophical and mathematical thought. This narrative follows in a historiographical tradition charted by historians such as Alexandre Koyre and Rupert Hall about sixty years ago. It is essentially a history of the emergence of reliable (if fallible) scientific method from more error-prone thought. While some historians such as Steven Shapin view narratives of this type as fundamentally error-prone, I do not view such projects as a priori illegitimate. They are, however, perhaps more difficult than Weinberg supposes. In this presentation, I will focus on two of Weinberg's strong historical claims: that physics became detached from religion as early as the beginning of the eighteenth century, and that physics proved an effective model for placing other fields on scientific grounds. While I disagree with these claims, they represent at most an overestimation of vintage science's interest in discarding theological questions, and an overestimation of that science's ability to function at all reliably.

  10. Detecting medication errors in the New Zealand pharmacovigilance database: a retrospective analysis.

    PubMed

    Kunac, Desireé L; Tatley, Michael V

    2011-01-01

    Despite the traditional focus being adverse drug reactions (ADRs), pharmacovigilance centres have recently been identified as a potentially rich and important source of medication error data. To identify medication errors in the New Zealand Pharmacovigilance database (Centre for Adverse Reactions Monitoring [CARM]), and to describe the frequency and characteristics of these events. A retrospective analysis of the CARM pharmacovigilance database operated by the New Zealand Pharmacovigilance Centre was undertaken for the year 1 January-31 December 2007. All reports, excluding those relating to vaccines, clinical trials and pharmaceutical company reports, underwent a preventability assessment using predetermined criteria. Those events deemed preventable were subsequently classified to identify the degree of patient harm, type of error, stage of medication use process where the error occurred and origin of the error. A total of 1412 reports met the inclusion criteria and were reviewed, of which 4.3% (61/1412) were deemed preventable. Not all errors resulted in patient harm: 29.5% (18/61) were 'no harm' errors but 65.5% (40/61) of errors were deemed to have been associated with some degree of patient harm (preventable adverse drug events [ADEs]). For 5.0% (3/61) of events, the degree of patient harm was unable to be determined as the patient outcome was unknown. The majority of preventable ADEs (62.5% [25/40]) occurred in adults aged 65 years and older. The medication classes most involved in preventable ADEs were antibacterials for systemic use and anti-inflammatory agents, with gastrointestinal and respiratory system disorders the most common adverse events reported. For both preventable ADEs and 'no harm' events, most errors were incorrect dose and drug therapy monitoring problems consisting of failures in detection of significant drug interactions, past allergies or lack of necessary clinical monitoring. Preventable events were mostly related to the prescribing and administration stages of the medication use process, with the majority of errors 82.0% (50/61) deemed to have originated in the community setting. The CARM pharmacovigilance database includes medication errors, many of which were found to originate in the community setting and reported as ADRs. Error-prone situations were able to be identified, providing greater opportunity to improve patient safety. However, to enhance detection of medication errors by pharmacovigilance centres, reports should be prospectively reviewed for preventability and the reporting form revised to facilitate capture of important information that will provide meaningful insight into the nature of the underlying systems defects that caused the error.

  11. Human Factors Risk Analyses of a Doffing Protocol for Ebola-Level Personal Protective Equipment: Mapping Errors to Contamination.

    PubMed

    Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T

    2018-03-05

    Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video recorded, and a failure modes and effects analysis and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination data. Fifty-one types of errors were identified, many having the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and number of types of errors (9 and 13, respectively). ɸ6 was detected on 10% of scrubs and the fault tree predicted a 10.4% contamination rate, likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves and the predicted rates were 7.3%, 19.4%, 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect and put HCWs at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.

  12. Surface driven biomechanical breast image registration

    NASA Astrophysics Data System (ADS)

    Eiben, Björn; Vavourakis, Vasileios; Hipwell, John H.; Kabus, Sven; Lorenz, Cristian; Buelow, Thomas; Williams, Norman R.; Keshtgar, M.; Hawkes, David J.

    2016-03-01

    Biomechanical modelling enables large deformation simulations of breast tissues under different loading conditions to be performed. Such simulations can be utilised to transform prone Magnetic Resonance (MR) images into a different patient position, such as upright or supine. We present a novel integration of biomechanical modelling with a surface registration algorithm which optimises the unknown material parameters of a biomechanical model and performs a subsequent regularised surface alignment. This allows deformations induced by effects other than gravity, such as those due to contact of the breast and MR coil, to be reversed. Correction displacements are applied to the biomechanical model enabling transformation of the original pre-surgical images to the corresponding target position. The algorithm is evaluated for the prone-to-supine case using prone MR images and the skin outline of supine Computed Tomography (CT) scans for three patients. A mean target registration error (TRE) of 10.9 mm for internal structures is achieved. For the prone-to-upright scenario, an optical 3D surface scan of one patient is used as a registration target and the nipple distances after alignment between the transformed MRI and the surface are 10.1 mm and 6.3 mm, respectively.

  13. Comparison of exercises inducing maximum voluntary isometric contraction for the latissimus dorsi using surface electromyography.

    PubMed

    Park, Se-yeon; Yoo, Won-gyu

    2013-10-01

    The aim of this study was to compare muscular activation during five different normalization techniques that induced maximal isometric contraction of the latissimus dorsi. Sixteen healthy men participated in the study. Each participant performed three repetitions each of five types of isometric exertion: (1) conventional shoulder extension in the prone position, (2) caudal shoulder depression in the prone position, (3) body lifting with shoulder depression in the seated position, (4) trunk bending to the right in the lateral decubitus position, and (5) downward bar pulling in the seated position. In most participants, maximal activation of the latissimus dorsi was observed during conventional shoulder extension in the prone position; the percentage of maximal voluntary contraction was significantly greater for this exercise than for all other normalization techniques except downward bar pulling in the seated position. Although differences in electrode placement among various electromyographic studies represent a limitation, normalization techniques for the latissimus dorsi are recommended to minimize error in assessing maximal muscular activation of the latissimus dorsi through the combined use of shoulder extension in the prone position and downward pulling. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Protection of HEVC Video Delivery in Vehicular Networks with RaptorQ Codes

    PubMed Central

    Martínez-Rach, Miguel; López, Otoniel; Malumbres, Manuel Pérez

    2014-01-01

    With future vehicles equipped with processing capability, storage, and communications, vehicular networks will become a reality. A vast number of applications will arise that will make use of this connectivity. Some of them will be based on video streaming. In this paper we focus on HEVC video coding standard streaming in vehicular networks and how it deals with packet losses with the aid of RaptorQ, a Forward Error Correction scheme. As vehicular networks are packet loss prone networks, protection mechanisms are necessary if we want to guarantee a minimum level of quality of experience to the final user. We have run simulations to evaluate which configurations fit better in this type of scenarios. PMID:25136675
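
    RaptorQ itself is an intricate fountain code, but the underlying FEC idea can be sketched with plain XOR parity (a deliberately simplified stand-in, not RaptorQ): send repair packets alongside the source packets so the receiver can rebuild a lost packet without retransmission.

```python
# XOR-parity erasure recovery: k source packets plus one repair packet lets the
# receiver rebuild any single lost packet. Packet contents are hypothetical.
def xor_bytes(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

source = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]        # k = 4 source packets
repair = xor_bytes(source)                           # one repair packet

received = [source[0], None, source[2], source[3]]   # packet 1 lost in transit
missing = received.index(None)
received[missing] = xor_bytes([p for p in received if p is not None] + [repair])
print("recovered:", received[missing])               # b'pkt1'
```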

  15. Topological Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Jajodia, Sushil; Noel, Steven

    Traditionally, network administrators rely on labor-intensive processes for tracking network configurations and vulnerabilities. This requires a great deal of expertise, and is error-prone because of the complexity of networks and associated security data. The interdependencies of network vulnerabilities make traditional point-wise vulnerability analysis inadequate. We describe a Topological Vulnerability Analysis (TVA) approach that analyzes vulnerability dependencies and shows all possible attack paths into a network. From models of the network vulnerabilities and potential attacker exploits, we compute attack graphs that convey the impact of individual and combined vulnerabilities on overall security. TVA finds potential paths of vulnerability through a network, showing exactly how attackers may penetrate a network. From this, we identify key vulnerabilities and provide strategies for protection of critical network assets.
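
    A toy sketch of the approach (with hypothetical hosts and exploits, not TVA's actual model format): represent each exploit as an edge of a directed graph and enumerate every attack path from the attacker's foothold to a critical asset.

```python
# Enumerate attack paths through a small exploit graph with networkx.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("attacker", "web_srv",  {"exploit": "sshd buffer overflow"}),
    ("attacker", "mail_srv", {"exploit": "imap overflow"}),
    ("web_srv",  "app_srv",  {"exploit": "weak admin password"}),
    ("mail_srv", "app_srv",  {"exploit": "NFS share misconfiguration"}),
    ("app_srv",  "database", {"exploit": "local privilege escalation"}),
])

for path in nx.all_simple_paths(g, "attacker", "database"):
    steps = [g.edges[u, v]["exploit"] for u, v in zip(path, path[1:])]
    print(" -> ".join(path), "|", "; ".join(steps))
```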

  16. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.

  17. Critical Thinking in Critical Care: Five Strategies to Improve Teaching and Learning in the Intensive Care Unit.

    PubMed

    Hayes, Margaret M; Chatterjee, Souvik; Schwartzstein, Richard M

    2017-04-01

    Critical thinking, the capacity to be deliberate about thinking, is increasingly the focus of undergraduate medical education, but is not commonly addressed in graduate medical education. Without critical thinking, physicians, and particularly residents, are prone to cognitive errors, which can lead to diagnostic errors, especially in a high-stakes environment such as the intensive care unit. Although challenging, critical thinking skills can be taught. At this time, there is a paucity of data to support an educational gold standard for teaching critical thinking, but we believe that five strategies, rooted in cognitive theory and our personal teaching experiences, provide an effective framework to teach critical thinking in the intensive care unit. The five strategies are: make the thinking process explicit by helping learners understand that the brain uses two cognitive processes: type 1, an intuitive pattern-recognizing process, and type 2, an analytic process; discuss cognitive biases, such as premature closure, and teach residents to minimize biases by expressing uncertainty and keeping differentials broad; model and teach inductive reasoning by utilizing concept and mechanism maps and explicitly teach how this reasoning differs from the more commonly used hypothetico-deductive reasoning; use questions to stimulate critical thinking: "how" or "why" questions can be used to coach trainees and to uncover their thought processes; and assess and provide feedback on learner's critical thinking. We believe these five strategies provide practical approaches for teaching critical thinking in the intensive care unit.

  18. Critical Thinking in Critical Care: Five Strategies to Improve Teaching and Learning in the Intensive Care Unit

    PubMed Central

    Chatterjee, Souvik; Schwartzstein, Richard M.

    2017-01-01

    Critical thinking, the capacity to be deliberate about thinking, is increasingly the focus of undergraduate medical education, but is not commonly addressed in graduate medical education. Without critical thinking, physicians, and particularly residents, are prone to cognitive errors, which can lead to diagnostic errors, especially in a high-stakes environment such as the intensive care unit. Although challenging, critical thinking skills can be taught. At this time, there is a paucity of data to support an educational gold standard for teaching critical thinking, but we believe that five strategies, rooted in cognitive theory and our personal teaching experiences, provide an effective framework to teach critical thinking in the intensive care unit. The five strategies are: make the thinking process explicit by helping learners understand that the brain uses two cognitive processes: type 1, an intuitive pattern-recognizing process, and type 2, an analytic process; discuss cognitive biases, such as premature closure, and teach residents to minimize biases by expressing uncertainty and keeping differentials broad; model and teach inductive reasoning by utilizing concept and mechanism maps and explicitly teach how this reasoning differs from the more commonly used hypothetico-deductive reasoning; use questions to stimulate critical thinking: “how” or “why” questions can be used to coach trainees and to uncover their thought processes; and assess and provide feedback on learner’s critical thinking. We believe these five strategies provide practical approaches for teaching critical thinking in the intensive care unit. PMID:28157389

  19. Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error

    PubMed Central

    Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee

    2017-01-01

    Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to the errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146

  20. Path-following in model predictive rollover prevention using front steering and braking

    NASA Astrophysics Data System (ADS)

    Ghazali, Mohammad; Durali, Mohammad; Salarieh, Hassan

    2017-01-01

    In this paper vehicle path-following in the presence of rollover risk is investigated. Vehicles with a high centre of mass are prone to roll instability, and untripped rollover risk increases with a high centre of gravity and high-friction road conditions. Previous studies have introduced strategies to handle the short-duration rollover condition; in those studies, however, trajectory tracking is affected and not thoroughly investigated. This paper focuses on the tracking error that results from rollover prevention. A lower-level model predictive front-steering controller is adopted to deal with rollover and tracking error in priority order. A brake control is included in the lower-level controller, which directly obeys commands from an upper-level controller (ULC). The ULC manages vehicle speed primarily with regard to tracking error. Simulation results show that the proposed control framework maintains roll stability while tracking error is confined to a predefined error limit.

  1. Classification-Based Spatial Error Concealment for Visual Communications

    NASA Astrophysics Data System (ADS)

    Chen, Meng; Zheng, Yefeng; Wu, Min

    2006-12-01

    In an error-prone transmission environment, error concealment is an effective technique to reconstruct the damaged visual content. Due to large variations of image characteristics, different concealment approaches are necessary to accommodate the different nature of the lost image content. In this paper, we address this issue and propose using classification to integrate the state-of-the-art error concealment techniques. The proposed approach takes advantage of multiple concealment algorithms and adaptively selects the suitable algorithm for each damaged image area. With growing awareness that the design of sender and receiver systems should be jointly considered for efficient and reliable multimedia communications, we propose a set of classification-based block concealment schemes, including receiver-side classification, sender-side attachment, and sender-side embedding. Our experimental results provide extensive performance comparisons and demonstrate that the proposed classification-based error concealment approaches outperform the conventional approaches.
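
    One of the simple spatial methods such a classifier might choose can be sketched in a few lines: fill a lost block by interpolating between the rows above and below it. The block geometry and test image below are assumptions; the paper's actual concealment algorithms are more sophisticated.

```python
# Conceal a lost block by vertical linear interpolation from its neighbours.
import numpy as np

def conceal_block(img, r0, c0, size=8):
    """Fill img[r0:r0+size, c0:c0+size] from the rows just above and below."""
    top = img[r0 - 1, c0:c0 + size].astype(float)
    bottom = img[r0 + size, c0:c0 + size].astype(float)
    for i in range(size):
        a = (i + 1) / (size + 1)                     # vertical blend weight
        img[r0 + i, c0:c0 + size] = (1 - a) * top + a * bottom
    return img

img = np.tile(np.linspace(0, 255, 64), (64, 1))      # smooth test image
img[24:32, 24:32] = 0                                # simulate a lost block
conceal_block(img, 24, 24)                           # block re-estimated
```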

  2. A water-vapor radiometer error model. [for ionosphere in geodetic microwave techniques

    NASA Technical Reports Server (NTRS)

    Beckman, B.

    1985-01-01

    The water-vapor radiometer (WVR) is used to calibrate unpredictable delays in the wet component of the troposphere in geodetic microwave techniques such as very-long-baseline interferometry (VLBI) and Global Positioning System (GPS) tracking. Based on experience with Jet Propulsion Laboratory (JPL) instruments, the current level of accuracy in wet-troposphere calibration limits the accuracy of local vertical measurements to 5-10 cm. The goal for the near future is 1-3 cm. Although the WVR is currently the best calibration method, many instruments are prone to systematic error. In this paper, a treatment of WVR data is proposed and evaluated. This treatment reduces the effect of WVR systematic errors by estimating parameters that specify an assumed functional form for the error. The assumed form of the treatment is evaluated by comparing the results of two similar WVR's operating near each other. Finally, the observability of the error parameters is estimated by covariance analysis.
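
    The proposed treatment amounts to positing a functional form for the systematic error and estimating its parameters. The sketch below does this for an assumed offset-plus-scale error model using ordinary least squares on synthetic delays; the model and numbers are illustrative, not JPL instrument values.

```python
# Estimate parameters of an assumed WVR error model: measured = a + b * true.
import numpy as np

rng = np.random.default_rng(2)
true_delay = rng.uniform(5.0, 25.0, 200)                       # cm of wet delay
measured = 1.5 + 1.04 * true_delay + rng.normal(0, 0.3, 200)   # biased readings

A = np.column_stack([np.ones_like(true_delay), true_delay])
(offset, scale), *_ = np.linalg.lstsq(A, measured, rcond=None)
print(f"estimated offset = {offset:.2f} cm, scale = {scale:.3f}")
calibrated = (measured - offset) / scale                       # corrected delays
```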

  3. Utility of eButton images for identifying food preparation behaviors and meal-related tasks in adolescents

    USDA-ARS?s Scientific Manuscript database

    Food preparation skills may encourage healthy eating. Traditional assessment of child food preparation employs self- or parent proxy-reporting methods, which are prone to error. The eButton is a wearable all-day camera that has promise as an objective, passive method for measuring child food prepara...

  4. Understanding Clinician Information Demands and Synthesis of Clinical Documents in Electronic Health Record Systems

    ERIC Educational Resources Information Center

    Farri, Oladimeji Feyisetan

    2012-01-01

    Large quantities of redundant clinical data are usually transferred from one clinical document to another, making the review of such documents cognitively burdensome and potentially error-prone. Inadequate designs of electronic health record (EHR) clinical document user interfaces probably contribute to the difficulties clinicians experience while…

  5. Finite element modeling of light propagation in turbid media under illumination of a continuous-wave beam

    USDA-ARS?s Scientific Manuscript database

    Spatially-resolved spectroscopy provides a means for measuring the optical properties of biological tissues, based on analytical solutions to diffusion approximation for semi-infinite media under the normal illumination of infinitely small size light beam. The method is, however, prone to error in m...

  6. ATS-PD: An Adaptive Testing System for Psychological Disorders

    ERIC Educational Resources Information Center

    Donadello, Ivan; Spoto, Andrea; Sambo, Francesco; Badaloni, Silvana; Granziol, Umberto; Vidotto, Giulio

    2017-01-01

    The clinical assessment of mental disorders can be a time-consuming and error-prone procedure, consisting of a sequence of diagnostic hypothesis formulation and testing aimed at restricting the set of plausible diagnoses for the patient. In this article, we propose a novel computerized system for the adaptive testing of psychological disorders.…

  7. Towards New Multiplatform Hybrid Online Laboratory Models

    ERIC Educational Resources Information Center

    Rodriguez-Gil, Luis; García-Zubia, Javier; Orduña, Pablo; López-de-Ipiña, Diego

    2017-01-01

    Online laboratories have traditionally been split between virtual labs, with simulated components; and remote labs, with real components. The former tend to provide less realism but to be easily scalable and less expensive to maintain, while the latter are fully real but tend to require a higher maintenance effort and be more error-prone. This…

  8. A Practical Teaching Course in Directed Protein Evolution Using the Green Fluorescent Protein as a Model

    ERIC Educational Resources Information Center

    Ruller, Roberto; Silva-Rocha, Rafael; Silva, Artur; Schneider, Maria Paula Cruz; Ward, Richard John

    2011-01-01

    Protein engineering is a powerful tool, which correlates protein structure with specific functions, both in applied biotechnology and in basic research. Here, we present a practical teaching course for engineering the green fluorescent protein (GFP) from "Aequorea victoria" by a random mutagenesis strategy using error-prone polymerase…

  9. Accuracy of an IFSAR-derived digital terrain model under a conifer forest canopy.

    Treesearch

    Hans-Erik Andersen; Stephen E. Reutebuch; Robert J. McGaughey

    2005-01-01

    Accurate digital terrain models (DTMs) are necessary for a variety of forest resource management applications, including watershed management, timber harvest planning, and fire management. Traditional methods for acquiring topographic data typically rely on aerial photogrammetry, where measurement of the terrain surface below forest canopy is difficult and error prone...

  10. Shared cognitive processes underlying past and future thinking: the impact of imagery and concurrent task demands on event specificity.

    PubMed

    Anderson, Rachel J; Dewhurst, Stephen A; Nash, Robert A

    2012-03-01

    Recent literature has argued that whereas remembering the past and imagining the future make use of shared cognitive substrates, simulating future events places heavier demands on executive resources. These propositions were explored in 3 experiments comparing the impact of imagery and concurrent task demands on speed and accuracy of past event retrieval and future event simulation. Results provide support for the suggestion that both past and future episodes can be constructed through 2 mechanisms: a noneffortful "direct" pathway and a controlled, effortful "generative" pathway. However, limited evidence emerged for the suggestion that simulating future episodes, compared with retrieving past episodes, places heavier demands on executive resources; only under certain conditions did it emerge as a more error-prone and lengthier process. The findings are discussed in terms of how retrieval and simulation make use of the same cognitive substrates in subtly different ways. 2012 APA, all rights reserved

  11. unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance

    USGS Publications Warehouse

    Fiske, Ian J.; Chandler, Richard B.

    2011-01-01

    Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
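
    unmarked itself is an R package; as a language-neutral sketch of the hierarchical idea behind its simplest model, the code below writes out the single-season occupancy likelihood, which separates a latent occurrence probability (psi) from a conditional detection probability (p). The simulated data and starting values are assumptions.

```python
# Single-season occupancy model: latent occurrence plus imperfect detection.
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, y):
    """y: (sites, visits) binary detection histories."""
    psi = 1 / (1 + np.exp(-params[0]))               # occurrence probability
    p = 1 / (1 + np.exp(-params[1]))                 # detection probability
    occupied = psi * np.prod(p**y * (1 - p)**(1 - y), axis=1)
    absent = (1 - psi) * np.all(y == 0, axis=1)      # all-zero histories only
    return -np.sum(np.log(occupied + absent))

rng = np.random.default_rng(3)
z = rng.binomial(1, 0.6, 300)                        # true occupancy, psi = 0.6
y = rng.binomial(1, 0.4, (300, 4)) * z[:, None]      # detections, p = 0.4
fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y,))
print(1 / (1 + np.exp(-fit.x)))                      # approx. [0.6, 0.4]
```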

  12. Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D

    NASA Astrophysics Data System (ADS)

    Bales, Ben; Pollock, Tresa; Petzold, Linda

    2017-06-01

    Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and subsequent analysis process. The downside is that computing micrograph segmentations with data from morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
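
    The descriptor in question is a standard image-analysis tool, so the idea can be sketched directly with scikit-image; the stock test image stands in for a micrograph, and the cell/block parameters are illustrative rather than the paper's.

```python
# Compute a histogram-of-oriented-gradients descriptor with no segmentation.
from skimage import data
from skimage.feature import hog

image = data.camera()                    # stand-in for a micrograph
features = hog(image,
               orientations=9,           # gradient-direction bins
               pixels_per_cell=(16, 16),
               cells_per_block=(2, 2),
               block_norm="L2-Hys")
print(features.shape)                    # one feature vector per image
```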

  13. Antigenic Variation in the Lyme Spirochete: Insights into Recombinational Switching with a Suggested Role for Error-Prone Repair.

    PubMed

    Verhey, Theodore B; Castellanos, Mildred; Chaconas, George

    2018-05-29

    The Lyme disease spirochete, Borrelia burgdorferi, uses antigenic variation as a strategy to evade the host's acquired immune response. New variants of surface-localized VlsE are generated efficiently by unidirectional recombination from 15 unexpressed vls cassettes into the vlsE locus. Using algorithms to analyze switching from vlsE sequencing data, we characterize a population of over 45,000 inferred recombination events generated during mouse infection. We present evidence for clustering of these recombination events within the population and along the vlsE gene, a role for the direct repeats flanking the variable region in vlsE, and the importance of sequence homology in determining the location of recombination, despite RecA's dispensability. Finally, we report that non-templated sequence variation is strongly associated with recombinational switching and occurs predominantly at the 5' end of conversion tracts. This likely results from an error-prone repair mechanism operational during recombinational switching that elevates the mutation rate > 5,000-fold in switched regions. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  14. Error-prone bypass of O6-methylguanine by DNA polymerase of Pseudomonas aeruginosa phage PaP1.

    PubMed

    Gu, Shiling; Xiong, Jingyuan; Shi, Ying; You, Jia; Zou, Zhenyu; Liu, Xiaoying; Zhang, Huidong

    2017-09-01

    O6-Methylguanine (O6-MeG) is highly mutagenic and is commonly found in DNA exposed to methylating agents; it generally leads to G:C to A:T mutagenesis. To study DNA replication encountering O6-MeG by the DNA polymerase (gp90) of P. aeruginosa phage PaP1, we analyzed steady-state and pre-steady-state kinetics of nucleotide incorporation opposite O6-MeG by gp90 exo−. O6-MeG partially inhibited full-length extension by gp90 exo−. O6-MeG greatly reduces dNTP incorporation efficiency, resulting in 67-fold preferential error-prone incorporation of dTTP over dCTP. Gp90 exo− extends beyond T:O6-MeG 2-fold more efficiently than C:O6-MeG. Incorporation of dCTP opposite G and incorporation of dCTP or dTTP opposite O6-MeG show fast burst phases. The pre-steady-state incorporation efficiency (kpol/Kd,dNTP) decreases in the order dCTP:G > dTTP:O6-MeG > dCTP:O6-MeG. The presence of O6-MeG in the template does not affect the binding affinity of the polymerase to DNA, but it weakens their binding in the presence of dCTP and Mg2+. Misincorporation of dTTP opposite O6-MeG further weakens the binding affinity of the polymerase to DNA. The priority of dTTP incorporation opposite O6-MeG originates from the fact that dTTP induces a faster conformational change step and a faster chemical step than dCTP. This study reveals that gp90 bypasses O6-MeG in an error-prone manner and provides further understanding of DNA replication encountering mutagenic alkylation DNA damage for P. aeruginosa phage PaP1. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Minimal Contribution of APOBEC3-Induced G-to-A Hypermutation to HIV-1 Recombination and Genetic Variation

    PubMed Central

    Nikolaitchik, Olga A.; Burdick, Ryan C.; Gorelick, Robert J.; Keele, Brandon F.; Hu, Wei-Shau; Pathak, Vinay K.

    2016-01-01

    Although the predominant effect of host restriction APOBEC3 proteins on HIV-1 infection is to block viral replication, they might inadvertently increase retroviral genetic variation by inducing G-to-A hypermutation. Numerous studies have disagreed on the contribution of hypermutation to viral genetic diversity and evolution. Confounding factors contributing to the debate include the extent of lethal (stop codon) and sublethal hypermutation induced by different APOBEC3 proteins, the inability to distinguish between G-to-A mutations induced by APOBEC3 proteins and error-prone viral replication, the potential impact of hypermutation on the frequency of retroviral recombination, and the extent to which viral recombination occurs in vivo, which can reassort mutations in hypermutated genomes. Here, we determined the effects of hypermutation on the HIV-1 recombination rate and its contribution to genetic variation through recombination to generate progeny genomes containing portions of hypermutated genomes without lethal mutations. We found that hypermutation did not significantly affect the rate of recombination, and recombination between hypermutated and wild-type genomes only increased the viral mutation rate by 3.9 × 10−5 mutations/bp/replication cycle in heterozygous virions, which is similar to the HIV-1 mutation rate. Since copackaging of hypermutated and wild-type genomes occurs very rarely in vivo, recombination between hypermutated and wild-type genomes does not significantly contribute to the genetic variation of replicating HIV-1. We also analyzed previously reported hypermutated sequences from infected patients and determined that the frequency of sublethal mutagenesis for A3G and A3F is negligible (4 × 10−21 and 1 × 10−11, respectively) and its contribution to viral mutations is far below mutations generated during error-prone reverse transcription. Taken together, we conclude that the contribution of APOBEC3-induced hypermutation to HIV-1 genetic variation is substantially lower than that from mutations during error-prone replication. PMID:27186986

  16. A design of experiments approach to validation sampling for logistic regression modeling with error-prone medical records.

    PubMed

    Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay

    2016-04-01

    Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23 041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
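
    A greedy sketch of the selection idea (not the authors' exact algorithm): repeatedly pick the candidate record whose predictor vector most increases det(X'X), scoring candidates cheaply via the matrix determinant lemma. The data and ridge seed are assumptions.

```python
# Greedy D-optimal selection of records to validate, from predictors alone.
import numpy as np

def greedy_d_optimal(X, n_select):
    n, d = X.shape
    chosen, M = [], 1e-6 * np.eye(d)         # small ridge keeps M invertible
    for _ in range(n_select):
        Minv = np.linalg.inv(M)
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in chosen:
                continue
            x = X[i]
            gain = 1.0 + x @ Minv @ x        # det(M + xx') = det(M) * gain
            if gain > best_gain:
                best, best_gain = i, gain
        chosen.append(best)
        M += np.outer(X[best], X[best])
    return chosen

rng = np.random.default_rng(4)
X = rng.standard_normal((500, 5))            # candidate records' predictors
print(greedy_d_optimal(X, 10))               # indices to chart-review first
```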

  17. Minimal Contribution of APOBEC3-Induced G-to-A Hypermutation to HIV-1 Recombination and Genetic Variation.

    PubMed

    Delviks-Frankenberry, Krista A; Nikolaitchik, Olga A; Burdick, Ryan C; Gorelick, Robert J; Keele, Brandon F; Hu, Wei-Shau; Pathak, Vinay K

    2016-05-01

    Although the predominant effect of host restriction APOBEC3 proteins on HIV-1 infection is to block viral replication, they might inadvertently increase retroviral genetic variation by inducing G-to-A hypermutation. Numerous studies have disagreed on the contribution of hypermutation to viral genetic diversity and evolution. Confounding factors contributing to the debate include the extent of lethal (stop codon) and sublethal hypermutation induced by different APOBEC3 proteins, the inability to distinguish between G-to-A mutations induced by APOBEC3 proteins and error-prone viral replication, the potential impact of hypermutation on the frequency of retroviral recombination, and the extent to which viral recombination occurs in vivo, which can reassort mutations in hypermutated genomes. Here, we determined the effects of hypermutation on the HIV-1 recombination rate and its contribution to genetic variation through recombination to generate progeny genomes containing portions of hypermutated genomes without lethal mutations. We found that hypermutation did not significantly affect the rate of recombination, and recombination between hypermutated and wild-type genomes only increased the viral mutation rate by 3.9 × 10−5 mutations/bp/replication cycle in heterozygous virions, which is similar to the HIV-1 mutation rate. Since copackaging of hypermutated and wild-type genomes occurs very rarely in vivo, recombination between hypermutated and wild-type genomes does not significantly contribute to the genetic variation of replicating HIV-1. We also analyzed previously reported hypermutated sequences from infected patients and determined that the frequency of sublethal mutagenesis for A3G and A3F is negligible (4 × 10−21 and 1 × 10−11, respectively) and its contribution to viral mutations is far below mutations generated during error-prone reverse transcription. Taken together, we conclude that the contribution of APOBEC3-induced hypermutation to HIV-1 genetic variation is substantially lower than that from mutations during error-prone replication.

  18. SU-E-J-21: Setup Variability of Colorectal Cancer Patients Treated in the Prone Position and Dosimetric Comparison with the Supine Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, A; Foster, J; Chu, W

    2015-06-15

    Purpose: Many cancer centers treat colorectal patients in the prone position on a belly board to minimize dose to the small bowel. This may potentially result in patient setup instability, with a corresponding impact on dose delivery accuracy for highly conformal techniques such as IMRT/VMAT. The two aims of this work are 1) to investigate the setup accuracy of rectum patients treated in the prone position on a belly board using CBCT and 2) to evaluate the dosimetric impact on bladder and small bowel of treating rectum patients in the supine vs. prone position. Methods: For the setup accuracy study, 10 patients were selected. Weekly CBCTs were acquired and matched to bone, and the CBCT-determined shifts were recorded. For the dosimetric study, 7 prone-setup patients and 7 supine-setup patients were randomly selected from our clinical database. Various clinically relevant dose volume histogram values were recorded for the small bowel and bladder. Results: The CBCT-determined rotational shifts varied widely. For the dataset acquired at the time of this writing, the ranges of rotational setup errors for pitch, roll, and yaw were [−3.6°, 4.7°], [−4.3°, 3.2°], and [−1.4°, 1.4°]. For the dosimetric study, the small bowel V(45Gy) and mean dose for the prone position were 5.6±12.1% and 18.4±6.2Gy (± values indicate standard deviations); for the supine position the corresponding values were 12.9±15.8% and 24.7±8.8Gy. For the bladder, the V(30Gy) and mean dose for the prone position were 68.7±12.7% and 38.4±3.3Gy; for the supine position these values were 77.1±13.7% and 40.7±3.1Gy. Conclusion: There is evidence of significant rotational instability in the prone position. The OAR dosimetry study indicates that some patients may still benefit from the prone position, though many patients can be safely treated supine.

  19. The Significance of the Record Length in Flood Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Senarath, S. U.

    2013-12-01

    Of all the potential natural hazards, flooding is the most costly in many regions of the world. For example, floods cause over a third of Europe's average annual catastrophe losses and affect about two thirds of the people impacted by natural catastrophes. Increased attention is being paid to determining flow estimates associated with pre-specified return periods so that flood-prone areas can be adequately protected against floods of particular magnitudes. Flood frequency analysis, conducted by fitting an appropriate probability density function to the observed annual maximum flow data, is frequently used for obtaining these flow estimates. Consequently, flood frequency analysis plays an integral role in determining the flood risk in flood-prone watersheds. A long annual maximum flow record is vital for obtaining accurate estimates of discharges associated with high return period flows. However, in many areas of the world, flood frequency analysis is conducted with limited flow data or short annual maximum flow records. These inevitably lead to flow estimates that are subject to error, especially for high return period flows. In this study, several statistical techniques are used to identify errors caused by short annual maximum flow records. The flow estimates used in the error analysis are obtained by fitting a log-Pearson III distribution to the flood time-series. These errors can then be used to better evaluate the return period flows in data-limited streams. The study findings, therefore, have important implications for hydrologists, water resources engineers, and floodplain managers.
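
    To make the record-length effect concrete, the fragment below fits a log-Pearson III distribution with SciPy and bootstraps the 100-year flow from progressively shorter records; the flow record and all parameter values are synthetic, so the numbers only illustrate the qualitative point.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # Hypothetical 80-year annual-maximum flow record (m^3/s)
      annual_max = 10 ** stats.pearson3.rvs(0.3, loc=2.5, scale=0.25,
                                            size=80, random_state=rng)

      def q_t(record, T=100):
          """T-year flow from a log-Pearson III fit to an annual-maximum record."""
          skew, loc, scale = stats.pearson3.fit(np.log10(record))
          return 10 ** stats.pearson3.ppf(1 - 1 / T, skew, loc=loc, scale=scale)

      for n in (10, 20, 40, 80):   # shorter records -> wider spread in Q100
          est = [q_t(rng.choice(annual_max, size=n)) for _ in range(200)]
          print(f"n={n:2d}  median Q100={np.median(est):8.1f}  "
                f"CV={np.std(est) / np.mean(est):.2f}")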

  20. MRMAide: a mixed resolution modeling aide

    NASA Astrophysics Data System (ADS)

    Treshansky, Allyn; McGraw, Robert M.

    2002-07-01

    The Mixed Resolution Modeling Aide (MRMAide) technology is an effort to semi-automate the implementation of Mixed Resolution Modeling (MRM). MRMAide suggests ways of resolving differences in fidelity and resolution across diverse modeling paradigms. The goal of MRMAide is to provide a technology that will allow developers to incorporate model components into scenarios other than those for which they were designed. Currently, MRM is implemented by hand. This is a tedious, error-prone, and non-portable process. MRMAide, in contrast, will automatically suggest to a developer where and how to connect different components and/or simulations. MRMAide has three phases of operation: pre-processing, data abstraction, and validation. During pre-processing the components to be linked together are evaluated in order to identify appropriate mapping points. During data abstraction those mapping points are linked via data abstraction algorithms. During validation developers receive feedback regarding their newly created models relative to existing baselined models. The current work presents an overview of the various problems encountered during MRM and the various technologies utilized by MRMAide to overcome those problems.

  1. Intellicount: High-Throughput Quantification of Fluorescent Synaptic Protein Puncta by Machine Learning

    PubMed Central

    Fantuzzo, J. A.; Mirabella, V. R.; Zahn, J. D.

    2017-01-01

    Abstract Synapse formation analyses can be performed by imaging and quantifying fluorescent signals of synaptic markers. Traditionally, these analyses are done using simple or multiple thresholding and segmentation approaches or by labor-intensive manual analysis by a human observer. Here, we describe Intellicount, a high-throughput, fully-automated synapse quantification program which applies a novel machine learning (ML)-based image processing algorithm to systematically improve region of interest (ROI) identification over simple thresholding techniques. Through processing large datasets from both human and mouse neurons, we demonstrate that this approach allows image processing to proceed independently of carefully set thresholds, thus reducing the need for human intervention. As a result, this method can efficiently and accurately process large image datasets with minimal interaction by the experimenter, making it less prone to bias and less liable to human error. Furthermore, Intellicount is integrated into an intuitive graphical user interface (GUI) that provides a set of valuable features, including automated and multifunctional figure generation, routine statistical analyses, and the ability to run full datasets through nested folders, greatly expediting the data analysis process. PMID:29218324

  2. A Novel Real-Time Reference Key Frame Scan Matching Method

    PubMed Central

    Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu

    2017-01-01

    Unmanned aerial vehicles represent an effective technology for indoor search and rescue operations. Typically, most indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by a simultaneous localization and mapping (SLAM) approach using either local or global methods. Both suffer from accumulated errors and high processing time due to the iterative nature of the scan-matching method. Moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF). RKF is a hybrid scan-matching technique combining feature-to-feature and point-to-point approaches. The algorithm aims to mitigate error accumulation using the key-frame technique, which is inspired by the video-streaming broadcast process. It falls back on the iterative closest point (ICP) algorithm when linear features are lacking, as is typical in unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results and very short computation times, indicating the potential use of the new algorithm in real-time systems.
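
    For reference, the point-to-point component is essentially the classic ICP loop sketched below in NumPy (a generic textbook formulation, not the authors' RKF code); the nearest-neighbour association at the top of the loop is exactly where outlier correspondences enter.

      import numpy as np
      from scipy.spatial import cKDTree

      def icp_2d(src, dst, iters=30):
          """Point-to-point ICP aligning 2D scan `src` to `dst`;
          returns R, t such that dst ~ R @ src + t."""
          R, t = np.eye(2), np.zeros(2)
          cur = src.copy()
          tree = cKDTree(dst)
          for _ in range(iters):
              _, idx = tree.query(cur)                  # nearest-neighbour association
              mu_s, mu_d = cur.mean(0), dst[idx].mean(0)
              H = (cur - mu_s).T @ (dst[idx] - mu_d)    # cross-covariance
              U, _, Vt = np.linalg.svd(H)
              Ri = Vt.T @ U.T                           # optimal rotation (Kabsch)
              if np.linalg.det(Ri) < 0:                 # guard against reflections
                  Vt[-1] *= -1
                  Ri = Vt.T @ U.T
              ti = mu_d - Ri @ mu_s
              cur = cur @ Ri.T + ti                     # apply increment
              R, t = Ri @ R, Ri @ t + ti                # accumulate total transform
          return R, t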

  3. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark.

    PubMed

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-05-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today's data centers is time-consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next-generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time-consuming for an end user. First, BIGDEBUG's simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.
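
    BigDebug's primitives live inside Spark's runtime, but the record-level crash isolation idea can be approximated at user level; the PySpark fragment below is only such an approximation, not BigDebug's API.

      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("culprit-demo").getOrCreate()
      sc = spark.sparkContext

      def guarded(fn):
          """Wrap a record-level function so crashes are captured, not fatal."""
          def run(rec):
              try:
                  return [("ok", fn(rec))]
              except Exception as exc:
                  return [("culprit", (rec, repr(exc)))]
          return run

      tagged = sc.parallelize(["1", "2", "oops", "4"]).flatMap(guarded(int)).cache()
      print(tagged.filter(lambda kv: kv[0] == "culprit").collect())
      clean = tagged.filter(lambda kv: kv[0] == "ok").map(lambda kv: kv[1])
      # ...continue the pipeline on `clean` after fixing the culprit record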

  4. Jumping to conclusions and the continuum of delusional beliefs.

    PubMed

    Warman, Debbie M; Lysaker, Paul H; Martin, Joel M; Davis, Louanne; Haudenschield, Samantha L

    2007-06-01

    The present study examined the jumping-to-conclusions reasoning bias across the continuum of delusional ideation by investigating individuals with active delusions, delusion prone individuals, and non-delusion prone individuals. Neutral and highly self-referent probabilistic reasoning tasks were employed. Results indicated that individuals with delusions gathered significantly less information than delusion prone and non-delusion prone participants on both the neutral and self-referent tasks (p < .001). Individuals with delusions made less accurate decisions than the delusion prone and non-delusion prone participants on both tasks (p < .001), yet were more confident about their decisions than were delusion prone and non-delusion prone participants on the self-referent task (p = .002). Those with delusions and those who were delusion prone reported higher confidence in their performance on the self-referent task than they did on the neutral task (p = .02), indicating that high self-reference impacted information processing for individuals in both of these groups. The results are discussed in relation to previous research in the area of probabilistic reasoning and delusions.

  5. Neglected children, shame-proneness, and depressive symptoms.

    PubMed

    Bennett, David S; Sullivan, Margaret Wolan; Lewis, Michael

    2010-11-01

    Neglected children may be at increased risk for depressive symptoms. This study examines shame-proneness as an outcome of child neglect and as a potential explanatory variable in the relation between neglect and depressive symptoms. Participants were 111 children (52 with a Child Protective Services [CPS] allegation of neglect) seen at age 7. Neglected children reported more shame-proneness and more depressive symptoms than comparison children. Guilt-proneness, in contrast, was unrelated to neglect and depressive symptoms, indicating specificity for shame-proneness. The potential role of shame as a process variable that can help explain how some neglected children exhibit depressive symptoms is discussed.

  6. Review of Significant Incidents and Close Calls in Human Spaceflight from a Human Factors Perspective

    NASA Technical Reports Server (NTRS)

    Silva-Martinez, Jackelynne; Ellenberger, Richard; Dory, Jonathan

    2017-01-01

    This project aims to identify poor human factors design decisions that led to error-prone systems or did not facilitate the flight crew making the right choices, and to verify that NASA is effectively preventing similar incidents from occurring again. This analysis was performed by reviewing significant incidents and close calls in human spaceflight identified by the NASA Johnson Space Center Safety and Mission Assurance Flight Safety Office. The review of incidents shows whether the identified human errors arose in the operational phase (flight crew and ground control) or originated in the design phase (including manufacturing and test). This classification was performed with the aid of the NASA Human Systems Integration domains. This in-depth analysis resulted in a tool that helps with the human factors classification of significant incidents and close calls in human spaceflight, which can be used to identify human errors at the operational level and how they were or should be minimized. Current governing documents on human systems integration for both government and commercial crew were reviewed to determine whether current requirements, processes, training, and standard operating procedures protect the crew and ground control against these issues occurring in the future. Based on the findings, recommendations to target those areas are provided.

  7. Body-related state shame and guilt in women: do causal attributions mediate the influence of physical self-concept and shame and guilt proneness.

    PubMed

    Crocker, Peter R E; Brune, Sara M; Kowalski, Kent C; Mack, Diane E; Wilson, Philip M; Sabiston, Catherine M

    2014-01-01

    Guided by the process model of self-conscious emotions, this study examined whether physical self-concept (PSC) and shame and guilt proneness were associated with the body-related self-conscious emotions of state shame and guilt, and whether these relationships were mediated by attributions of stability, globality, and controllability. Female participants (N=284; mean age=20.6±1.9 years) completed measures of PSC and shame and guilt proneness before reading a hypothetical scenario, then completed measures of attributions and state shame and guilt in response to the scenario. Significant relationships were noted between state shame and attributions of globality and controllability, and shame proneness, guilt proneness, and PSC. Similar relationships, with the additional predictor of stability, were found for state guilt. Mediation analysis partially supported the process model hypotheses for shame. Results indicate that PSC and shame proneness are important in predicting body-related emotions, but the role of specific attributions is still unclear.

  8. Errors Affect Hypothetical Intertemporal Food Choice in Women

    PubMed Central

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2014-01-01

    Growing evidence suggests that the ability to control behavior is enhanced in contexts in which errors are more frequent. Here we investigated whether pairing desirable food with errors could decrease impulsive choice during hypothetical temporal decisions about food. To this end, healthy women performed a Stop-signal task in which one food cue predicted high-error rate, and another food cue predicted low-error rate. Afterwards, we measured participants’ intertemporal preferences during decisions between smaller-immediate and larger-delayed amounts of food. We expected reduced sensitivity to smaller-immediate amounts of food associated with high-error rate. Moreover, taking into account that deprivational states affect sensitivity for food, we controlled for participants’ hunger. Results showed that pairing food with high-error likelihood decreased temporal discounting. This effect was modulated by hunger, indicating that, the lower the hunger level, the more participants showed reduced impulsive preference for the food previously associated with a high number of errors as compared with the other food. These findings reveal that errors, which are motivationally salient events that recruit cognitive control and drive avoidance learning against error-prone behavior, are effective in reducing impulsive choice for edible outcomes. PMID:25244534

  9. Skills, rules and knowledge in aircraft maintenance: errors in context

    NASA Technical Reports Server (NTRS)

    Hobbs, Alan; Williamson, Ann

    2002-01-01

    Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.

  10. A semi-automatic annotation tool for cooking video

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application that guides users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious, and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error-free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  11. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
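
    Once each pixel has been assigned a phase from its EDS spectrum, the modal computation reduces to label counting; the NumPy sketch below uses invented phase labels, and the area-to-volume step leans on the usual stereological equivalence of area and volume fractions.

      import numpy as np

      # Hypothetical classified phase map: one EDS-derived phase ID per pixel
      phase_map = np.random.default_rng(1).integers(0, 3, size=(512, 512))
      phase_names = {0: "orthopyroxene", 1: "plagioclase", 2: "spinel"}

      labels, counts = np.unique(phase_map, return_counts=True)
      for lab, cnt in zip(labels, counts):
          # area fraction ~ volume fraction for a random section (Delesse)
          print(f"{phase_names[lab]:>14s}: {100 * cnt / phase_map.size:5.1f} vol%")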

  12. Word learning and the cerebral hemispheres: from serial to parallel processing of written words

    PubMed Central

    Ellis, Andrew W.; Ferreira, Roberto; Cathles-Hagan, Polly; Holt, Kathryn; Jarvis, Lisa; Barca, Laura

    2009-01-01

    Reading familiar words differs from reading unfamiliar non-words in two ways. First, word reading is faster and more accurate than reading of unfamiliar non-words. Second, effects of letter length are reduced for words, particularly when they are presented in the right visual field in familiar formats. Two experiments are reported in which right-handed participants read aloud non-words presented briefly in their left and right visual fields before and after training on those items. The non-words were interleaved with familiar words in the naming tests. Before training, naming was slow and error prone, with marked effects of length in both visual fields. After training, fewer errors were made, naming was faster, and the effect of length was much reduced in the right visual field compared with the left. We propose that word learning creates orthographic word forms in the mid-fusiform gyrus of the left cerebral hemisphere. Those word forms allow words to access their phonological and semantic representations on a lexical basis. But orthographic word forms also interact with more posterior letter recognition systems in the middle/inferior occipital gyri, inducing more parallel processing of right visual field words than is possible for any left visual field stimulus, or for unfamiliar non-words presented in the right visual field. PMID:19933140

  13. A New Paradigm for Tissue Diagnostics: Tools and Techniques to Standardize Tissue Collection, Transport, and Fixation.

    PubMed

    Bauer, Daniel R; Otter, Michael; Chafin, David R

    2018-01-01

    Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.

  14. Laboratory testing in primary care: A systematic review of health IT impacts.

    PubMed

    Maillet, Éric; Paré, Guy; Currie, Leanne M; Raymond, Louis; Ortiz de Guinea, Ana; Trudel, Marie-Claude; Marsan, Josianne

    2018-08-01

    Laboratory testing in primary care is a fundamental process that supports patient management and care. Any breakdown in the process may alter clinical information gathering and decision-making activities and can lead to medical errors and potential adverse outcomes for patients. Various information technologies are being used in primary care with the goal of supporting the process, maximizing patient benefits and reducing medical errors. However, the overall impact of health information technologies on laboratory testing processes has not been evaluated. To synthesize the positive and negative impacts resulting from the use of health information technology in each phase of the laboratory 'total testing process' in primary care, we conducted a systematic review. Databases including Medline, PubMed, CINAHL, Web of Science and Google Scholar were searched. Studies eligible for inclusion reported empirical data on 1) the use of a specific IT system and 2) the impacts of the system in supporting the laboratory testing process, and 3) were conducted in primary care settings (including ambulatory care and primary care offices). Our final sample consisted of 22 empirical studies which were mapped to a framework that outlines the phases of the laboratory total testing process, focusing on phases where medical errors may occur. Health information technology systems support several phases of the laboratory testing process, from ordering the test to following up with patients. This is a growing field of research, with most studies focusing on the use of information technology during the final phases of the laboratory total testing process. The findings were largely positive. Positive impacts included easier access to test results by primary care providers, reduced turnaround times, and increased prescribing of tests based on best practice guidelines. Negative impacts were reported in several studies: paper-based processes employed in parallel to the electronic process increased the potential for medical errors due to clinicians' cognitive overload; systems deemed not reliable or user-friendly hampered clinicians' performance; and organizational issues arose when results tracking relied on the prescribers' memory. The potential of health information technology lies not only in the exchange of health information, but also in knowledge sharing among clinicians. This review has underscored the important role played by cognitive factors, which are critical in the clinician's decision-making, the selection of the most appropriate tests, correct interpretation of the results and efficient interventions. By providing the right information, at the right time, to the right clinician, many IT solutions adequately support the laboratory testing process and help primary care clinicians make better decisions. However, several technological and organizational barriers require more attention to fully support the highly fragmented and error-prone process of laboratory testing.

  15. A Semantic Analysis of XML Schema Matching for B2B Systems Integration

    ERIC Educational Resources Information Center

    Kim, Jaewook

    2011-01-01

    One of the most critical steps to integrating heterogeneous e-Business applications using different XML schemas is schema matching, which is known to be costly and error-prone. Many automatic schema matching approaches have been proposed, but the challenge is still daunting because of the complexity of schemas and immaturity of technologies in…

  16. A Logically Centralized Approach for Control and Management of Large Computer Networks

    ERIC Educational Resources Information Center

    Iqbal, Hammad A.

    2012-01-01

    Management of large enterprise and Internet service provider networks is a complex, error-prone, and costly challenge. It is widely accepted that the key contributors to this complexity are the bundling of control and data forwarding in traditional routers and the use of fully distributed protocols for network control. To address these…

  17. An Evaluation of a New Printing Instrument to Aid in Identifying the Failure-prone Preschool Child.

    ERIC Educational Resources Information Center

    Simner, Marvin L.

    Involving 619 preschool children, a longitudinal investigation evaluated a new test for identifying preschool children who produce an excessive number of form errors in printing. All children participating were fluent in English and were in the appropriate grades for their ages, either pre-kindergarten or kindergarten, when they were given the…

  18. Computer programs for optical dendrometer measurements of standing tree profiles

    Treesearch

    Jacob R. Beard; Thomas G. Matney; Emily B. Schultz

    2015-01-01

    Tree profile equations are effective volume predictors. Diameter data for building these equations are collected from felled trees using diameter tapes and calipers or from standing trees using optical dendrometers. Developing and implementing a profile function from the collected data is a tedious and error prone task. This study created a computer program, Profile...

  19. Conducting Web-Based Surveys. ERIC Digest.

    ERIC Educational Resources Information Center

    Solomon, David J.

    Web-based surveying is very attractive for many reasons, including reducing the time and cost of conducting a survey and avoiding the often error prone and tedious task of data entry. At this time, Web-based surveys should still be used with caution. The biggest concern at present is coverage bias or bias resulting from sampled people either not…

  20. An Improved Unsupervised Image Segmentation Evaluation Approach Based on Under- and Over-Segmentation Aware

    NASA Astrophysics Data System (ADS)

    Su, Tengfei

    2018-04-01

    In this paper, an unsupervised evaluation scheme for remote sensing image segmentation is developed. The new approach builds on a method called under- and over-segmentation aware (UOA) and improves it by overcoming a defect in its estimation of over-segmentation error (OSE). Two cases of this error-prone defect are identified, and edge strength is employed to devise a solution. Two subsets of high resolution remote sensing images were used to test the proposed algorithm, and the experimental results indicate its superior performance, which is attributed to the improved OSE detection model.
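
    A generic version of the edge-strength test (illustrative only, not necessarily the paper's exact model) is easy to state: a boundary between two adjacent segments that sits on weak image gradients is a likely over-segmentation error.

      import numpy as np
      from scipy import ndimage

      def boundary_strength(image, labels, a, b):
          """Mean gradient magnitude along the shared boundary of segments
          a and b; a weak boundary suggests an over-segmentation error."""
          img = image.astype(float)
          grad = np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))
          shared = (labels == a) & ndimage.binary_dilation(labels == b)
          return grad[shared].mean() if shared.any() else np.nan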

  1. [Error prevention through management of complications in urology: standard operating procedures from commercial aviation as a model].

    PubMed

    Kranz, J; Sommer, K-J; Steffens, J

    2014-05-01

    Patient safety and risk/complication management rank among the current megatrends in modern medicine, which has undoubtedly become more complex. In time-critical, error-prone, and difficult situations, which occur repeatedly in everyday clinical practice, guidelines are ill-suited for acting rapidly and intelligently. The establishment and consistent use of standard operating procedures, as in commercial aviation, offers a possible strategic approach. These medical decision-making aids - quick reference cards - are short, optimized instructions that enable a standardized procedure in case of medical claims.

  2. Skull registration for prone patient position using tracked ultrasound

    NASA Astrophysics Data System (ADS)

    Underwood, Grace; Ungi, Tamas; Baum, Zachary; Lasso, Andras; Kronreif, Gernot; Fichtinger, Gabor

    2017-03-01

    PURPOSE: Tracked navigation has become prevalent in neurosurgery. Problems with registration of a patient and a preoperative image arise when the patient is in a prone position, because surfaces accessible to optical tracking on the back of the head are unreliable for registration. We investigated the accuracy of surface-based registration using points accessible through tracked ultrasound, which allows access to bone surfaces that are not available to optical tracking. Tracked ultrasound could eliminate the need to (i) work under the table for registration and (ii) adjust the tracker between registration and surgery. In addition, tracked ultrasound could provide a non-invasive method in comparison to an alternative registration method involving screw implantation. METHODS: A phantom study was performed to test the feasibility of tracked ultrasound for registration. An initial registration was performed to partially align the preoperative computed tomography data and the skull phantom, using anatomical landmarks. Surface points accessible by tracked ultrasound were then collected and used in an iterative closest point (ICP) registration. RESULTS: When the surface registration was compared to a ground-truth landmark registration, the average target registration error (TRE) was 1.6 ± 0.1 mm and the average distance of points off the skull surface was 0.6 ± 0.1 mm. CONCLUSION: The use of tracked ultrasound is feasible for registration of patients in the prone position and eliminates the need to perform registration under the table. The translational component of the error was minimal; the TRE is therefore due mainly to a rotational error component.

  3. Image based automatic water meter reader

    NASA Astrophysics Data System (ADS)

    Jawas, N.; Indrianto

    2018-01-01

    A water meter is a tool for measuring water consumption. It works by utilizing water flow and shows the resulting count on a mechanical digit counter. In everyday use, an operator manually checks the digit counter periodically and logs the number shown by the water meter to track water consumption. This manual operation is time consuming and prone to human error. Therefore, in this paper we propose an automatic water meter digit reader based on digital images. The digit sequence is detected using contour information from the water meter's front panel, and an OCR method is then used to recognize each digit character. Digit sequence detection is an important part of the pipeline and determines the success of the overall system. Experiments show promising results, especially in sequence detection.
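
    A minimal version of such a pipeline can be sketched with OpenCV and Tesseract; the file name, size filters, and OCR settings below are assumptions for illustration, not the paper's parameters.

      import cv2
      import pytesseract

      img = cv2.imread("meter.jpg")                       # hypothetical input
      gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
      _, bw = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

      contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL,
                                     cv2.CHAIN_APPROX_SIMPLE)
      boxes = [cv2.boundingRect(c) for c in contours]
      # keep roughly digit-shaped boxes, then read them left to right
      digits = sorted((b for b in boxes if 0.3 < b[2] / b[3] < 0.9 and b[3] > 20),
                      key=lambda b: b[0])
      reading = ""
      for x, y, w, h in digits:
          roi = bw[y:y + h, x:x + w]
          reading += pytesseract.image_to_string(
              roi, config="--psm 10 -c tessedit_char_whitelist=0123456789").strip()
      print(reading)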

  4. Implementing High-Performance Geometric Multigrid Solver with Naturally Grained Messages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shan, Hongzhang; Williams, Samuel; Zheng, Yili

    2015-10-26

    Structured-grid linear solvers often require manual packing and unpacking of communication data to achieve high performance. Orchestrating this process efficiently is challenging, labor-intensive, and potentially error-prone. In this paper, we explore an alternative approach that communicates the data with naturally grained message sizes without manual packing and unpacking. This approach is the distributed analogue of shared-memory programming, taking advantage of the global address space in PGAS languages to provide substantial programming ease. However, its performance may suffer from the large number of small messages. We investigate the runtime support required in the UPC++ library for this naturally grained version to close the performance gap between the two approaches and attain comparable performance at scale, using the High-Performance Geometric Multigrid (HPGMG-FV) benchmark as a driver.
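
    The paper's setting is UPC++ and PGAS one-sided communication, but the underlying trade-off can be sketched with mpi4py: one packed halo message per neighbour versus many naturally grained per-element messages. The fragment below is only an MPI analogue for illustration.

      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      peer = (comm.Get_rank() + 1) % comm.Get_size()
      n = 256
      grid = np.full((n, n), float(comm.Get_rank()))

      # (a) manual packing: copy the strided halo column into one buffer
      send = np.ascontiguousarray(grid[:, -1])            # pack
      recv = np.empty(n)
      comm.Sendrecv(send, dest=peer, recvbuf=recv, source=peer)

      # (b) naturally grained: one tiny message per element (simpler to
      # write, but pays per-message overhead n times)
      cell = np.empty(1)
      for i in range(n):
          comm.Sendrecv(grid[i, -1:].copy(), dest=peer,
                        recvbuf=cell, source=peer)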

  5. Standardized Competencies for Parenteral Nutrition Prescribing: The American Society for Parenteral and Enteral Nutrition Model.

    PubMed

    Guenter, Peggi; Boullata, Joseph I; Ayers, Phil; Gervasio, Jane; Malone, Ainsley; Raymond, Erica; Holcombe, Beverly; Kraft, Michael; Sacks, Gordon; Seres, David

    2015-08-01

    Parenteral nutrition (PN) provision is complex, as it is a high-alert medication and prone to a variety of potential errors. With changes in clinical practice models and recent federal rulings, the number of PN prescribers may be increasing. Safe prescribing of this therapy requires that competency for prescribers from all disciplines be demonstrated using a standardized process. A standardized model for PN prescribing competency is proposed based on a competency framework, the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.)-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines. This framework will guide institutions and agencies in developing and maintaining competency for safe PN prescription by their staff.

  6. Quantity and unit extraction for scientific and technical intelligence analysis

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy

    2017-05-01

    Scientific and Technical (S&T) intelligence analysts consume huge amounts of data to understand how scientific progress and engineering efforts affect current and future military capabilities. One of the most important types of information S&T analysts exploit is the quantities discussed in their source material. Frequencies, ranges, size, weight, power, and numerous other properties and measurements describing the performance characteristics of systems and the engineering constraints that define them must be culled from source documents before quantified analysis can begin. Automating the process of finding and extracting the relevant quantities from a wide range of S&T documents is difficult because information about quantities and their units is often contained in unstructured text, with ad hoc conventions used to convey their meaning. Currently, even a simple task, such as searching for documents discussing RF frequencies in a band of interest, is a labor-intensive and error-prone process. This research addresses the challenges facing development of a document processing capability that extracts quantities and units from S&T data, and shows how Natural Language Processing algorithms can be used to overcome these challenges.
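
    For a single property class such as frequency, the extraction step can be illustrated with a small regex-based sketch; the pattern and unit table below are deliberate simplifications, and real S&T text needs far broader unit coverage and disambiguation.

      import re

      UNIT_SCALE = {"hz": 1.0, "khz": 1e3, "mhz": 1e6, "ghz": 1e9}  # assumed table
      QTY = re.compile(
          r"(\d+(?:\.\d+)?)\s*(?:-|to)?\s*(\d+(?:\.\d+)?)?\s*(GHz|MHz|kHz|Hz)\b",
          re.IGNORECASE)

      def extract_frequencies(text):
          """Yield (low_hz, high_hz) tuples for frequency mentions and ranges."""
          for m in QTY.finditer(text):
              scale = UNIT_SCALE[m.group(3).lower()]
              lo = float(m.group(1)) * scale
              hi = float(m.group(2)) * scale if m.group(2) else lo
              yield lo, hi

      text = "The radar operates at 8.5-10.7 GHz with a 455 kHz IF."
      print(list(extract_frequencies(text)))  # [(8.5e9, 10.7e9), (455e3, 455e3)]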

  7. A high-speed linear algebra library with automatic parallelism

    NASA Technical Reports Server (NTRS)

    Boucher, Michael L.

    1994-01-01

    Parallel or distributed processing is key to getting the highest performance from workstations. However, designing and implementing efficient parallel algorithms is difficult and error-prone. It is even more difficult to write code that is both portable to and efficient on many different computers. Finally, it is harder still to satisfy the above requirements and include the reliability and ease of use required of commercial software intended for use in a production environment. As a result, the application of parallel processing technology to commercial software has been extremely limited, even though there are numerous computationally demanding programs that would significantly benefit from it. This paper describes DSSLIB, a library of subroutines that perform many of the time-consuming computations in engineering and scientific software. DSSLIB combines the high efficiency and speed of parallel computation with a serial programming model that eliminates many undesirable side-effects of typical parallel code. The result is a simple way to incorporate the power of parallel processing into commercial software without compromising maintainability, reliability, or ease of use. This gives significant advantages over less powerful non-parallel entries in the market.

  8. Proneness to Self-Conscious Emotions in Adults With and Without Autism Traits.

    PubMed

    Davidson, Denise; Vanegas, Sandra B; Hilvert, Elizabeth

    2017-11-01

    Self-conscious emotions, such as shame, guilt and pride, facilitate our social interactions by motivating us to adhere to social norms and external standards. In this study, we examined proneness to shame, guilt, hubristic pride and authentic pride in adults with Autism Spectrum Disorder traits (ASD-T) and in neurotypical (NT) adults. Relations between proneness to self-conscious emotions and theory of mind (ToM), fear of negative evaluation, and social functioning were also assessed. Adults with ASD-T showed greater proneness to shame, and less proneness to guilt and pride than NT adults. Both ToM and fear of negative evaluation predicted proneness to self-conscious emotions in ASD-T. These findings are discussed in terms of understanding complex emotion processing in adults with ASD-T.

  9. 13Check_RNA: A tool to evaluate 13C chemical shift assignments of RNA.

    PubMed

    Icazatti, A A; Martin, O A; Villegas, M; Szleifer, I; Vila, J A

    2018-06-19

    Chemical shifts (CS) are an important source of structural information for macromolecules such as RNA. In addition to the scarce availability of CS data for RNA, the observed values are prone to errors due to wrong re-calibration or mis-assignments. Different groups have dedicated their efforts to correcting systematic CS errors in RNA. Despite this, there are no automated, freely available algorithms to check RNA 13C CS assignments before their deposition to the BMRB or to re-reference already-deposited CS with systematic errors. Based on an existing method, we have implemented an open-source Python module to correct systematic errors in experimental 13C CS (hereafter 13Cexp) of RNAs and return the results in 3 formats, including the nmrstar one. This software is available on GitHub at https://github.com/BIOS-IMASL/13Check_RNA under a MIT license. Supplementary data are available at Bioinformatics online.
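
    The re-referencing idea itself is simple to sketch; the fragment below is a stripped-down illustration with invented shift values, not the module's actual algorithm.

      import numpy as np

      def referencing_offset(observed, reference):
          """Robust estimate of a systematic re-calibration error as the
          median offset between observed and reference 13C shifts."""
          return float(np.median(np.asarray(observed) - np.asarray(reference)))

      ref = np.array([95.1, 92.8, 93.9, 95.4])   # reference C1' shifts (ppm)
      obs = ref + 2.7                            # observed, mis-referenced data
      corrected = obs - referencing_offset(obs, ref)   # back to the ref scale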

  10. Corrected score estimation in the proportional hazards model with misclassified discrete covariates

    PubMed Central

    Zucker, David M.; Spiegelman, Donna

    2013-01-01

    SUMMARY We consider Cox proportional hazards regression when the covariate vector includes error-prone discrete covariates along with error-free covariates, which may be discrete or continuous. The misclassification in the discrete error-prone covariates is allowed to be of any specified form. Building on the work of Nakamura and his colleagues, we present a corrected score method for this setting. The method can handle all three major study designs (internal validation design, external validation design, and replicate measures design), both functional and structural error models, and time-dependent covariates satisfying a certain ‘localized error’ condition. We derive the asymptotic properties of the method and indicate how to adjust the covariance matrix of the regression coefficient estimates to account for estimation of the misclassification matrix. We present the results of a finite-sample simulation study under Weibull survival with a single binary covariate having known misclassification rates. The performance of the method described here was similar to that of related methods we have examined in previous works. Specifically, our new estimator performed as well as or, in a few cases, better than the full Weibull maximum likelihood estimator. We also present simulation results for our method for the case where the misclassification probabilities are estimated from an external replicate measures study. Our method generally performed well in these simulations. The new estimator has a broader range of applicability than many other estimators proposed in the literature, including those described in our own earlier work, in that it can handle time-dependent covariates with an arbitrary misclassification structure. We illustrate the method on data from a study of the relationship between dietary calcium intake and distal colon cancer. PMID:18219700

  11. Intransparent German number words complicate transcoding - a translingual comparison with Japanese.

    PubMed

    Moeller, Korbinian; Zuber, Julia; Olsen, Naoko; Nuerk, Hans-Christoph; Willmes, Klaus

    2015-01-01

    Superior early numerical competencies of children in several Asian countries have (amongst others) been attributed to the higher transparency of their number word systems. Here, we directly investigated this claim by evaluating whether Japanese children's transcoding performance when writing numbers to dictation (e.g., "twenty five" → 25) was less error prone than that of German-speaking children - both in general as well as when considering language-specific attributes of the German number word system such as the inversion property, in particular. In line with this hypothesis we observed that German-speaking children committed more transcoding errors in general than their Japanese peers. Moreover, their error pattern reflected the specific inversion intransparency of the German number-word system. Inversion errors in transcoding represented the most prominent error category in German-speaking children, but were almost absent in Japanese-speaking children. We conclude that the less transparent German number-word system complicates the acquisition of the correspondence between symbolic Arabic numbers and their respective verbal number words.
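
    The inversion property is easy to demonstrate in code. This toy transcoder (vocabulary deliberately tiny) writes two-digit German number words either correctly or with the digit-swapping inversion error children typically commit.

      # "fünfundzwanzig" = five-and-twenty = 25; the inversion error writes 52
      UNITS = {"ein": 1, "zwei": 2, "drei": 3, "vier": 4, "fünf": 5,
               "sechs": 6, "sieben": 7, "acht": 8, "neun": 9}
      TENS = {"zwanzig": 20, "dreißig": 30, "vierzig": 40, "fünfzig": 50}

      def transcode(word, inversion_error=False):
          unit, tens = word.split("und", 1)   # e.g. "fünf" + "zwanzig"
          u, t = UNITS[unit], TENS[tens]
          return u * 10 + t // 10 if inversion_error else t + u

      print(transcode("fünfundzwanzig"))                        # 25 (correct)
      print(transcode("fünfundzwanzig", inversion_error=True))  # 52 (inversion)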

  12. [Analysis of judicial sentences issued against traumatologists between 1995 and 2011 as regards medical negligence].

    PubMed

    Cardoso-Cita, Z; Perea-Pérez, B; Albarrán-Juan, M E; Labajo-González, M E; López-Durán, L; Marco-Martínez, F; Santiago-Saéz, A

    2016-01-01

    Traumatology and Orthopaedic Surgery is one of the specialities with the most complaints due to its scope and complexity. The aim of this study is to determine the characteristics of the complaints made against medical specialists in Traumatology, taking into account those variables that might influence both the presenting of the complaint and the resolution of the process. An analysis was performed on 303 legal judgments (1995-2011) collected in the health legal judgments archive of the Madrid School of Medicine, which is linked to the Westlaw Aranzadi database. Civil jurisdiction was the most used. The specific processes with the most complaints were bone-joint disorders, followed by vascular-nerve problems and infections. The injury most often complained about was in the lower limb, particularly the knee. The most frequent general cause of complaint was surgical treatment error, followed by diagnostic error. There was a lack of information in 14.9% of cases. A judgment against the physician was issued in 49.8% of the cases, with compensation mainly being less than 50,000 euros. Traumatology and Orthopaedic Surgery is a speciality prone to complaints of malpractice. The number of judgments against traumatologists is high, but compensations are usually less than 50,000 euros. The main ground for a judgment against the physician is surgical treatment error; it is in the surgical procedure itself that precautions should be maximised. Judgments due to lack of information are frequent, making adequate doctor-patient communication essential, as well as the correct completion of informed consent.

  13. Factor Structure and Measurement Invariance of the Cognitive Failures Questionnaire across the Adult Life Span

    ERIC Educational Resources Information Center

    Rast, Philippe; Zimprich, Daniel; Van Boxtel, Martin; Jolles, Jellemer

    2009-01-01

    The Cognitive Failures Questionnaire (CFQ) is designed to assess a person's proneness to committing cognitive slips and errors in the completion of everyday tasks. Although the CFQ is a widely used instrument, its factor structure remains an issue of scientific debate. The present study used data of a representative sample (N = 1,303, 24-83 years…

  14. Ground-based digital imagery for tree stem analysis

    Treesearch

    Neil Clark; Daniel L. Schmoldt; Randolph H. Wynne; Matthew F. Winn; Philip A. Araman

    2000-01-01

    In the USA, a subset of permanent forest sample plots within each geographic region are intensively measured to obtain estimates of tree volume and products. The detailed field measurements required for this type of sampling are both time consuming and error prone. We are attempting to reduce both of these factors with the aid of a commercially-available solid-state...

  15. Applying recovery biomarkers to calibrate self-report measures of energy and protein in the Hispanic Community Health Study/Study of Latinos

    USDA-ARS?s Scientific Manuscript database

    We investigated measurement error in the self-reported diets of US Hispanics/Latinos, who are prone to obesity and related comorbidities, by background (Central American, Cuban, Dominican, Mexican, Puerto Rican, and South American) in 2010–2012. In 477 participants aged 18–74 years, doubly labeled w...

  16. Structure-Function Analysis of Chloroplast Proteins via Random Mutagenesis Using Error-Prone PCR.

    PubMed

    Dumas, Louis; Zito, Francesca; Auroy, Pascaline; Johnson, Xenie; Peltier, Gilles; Alric, Jean

    2018-06-01

    Site-directed mutagenesis of chloroplast genes was developed three decades ago and has greatly advanced the field of photosynthesis research. Here, we describe a new approach for generating random chloroplast gene mutants that combines error-prone polymerase chain reaction of a gene of interest with chloroplast complementation of the knockout Chlamydomonas reinhardtii mutant. As a proof of concept, we targeted a 300-bp sequence of the petD gene that encodes subunit IV of the thylakoid membrane-bound cytochrome b6f complex. By sequencing chloroplast transformants, we revealed 149 mutations in the 300-bp target petD sequence that resulted in 92 amino acid substitutions in the 100-residue target subunit IV sequence. Our results show that this method is suited to the study of highly hydrophobic, multisubunit, and chloroplast-encoded proteins containing cofactors such as hemes, iron-sulfur clusters, and chlorophyll pigments. Moreover, we show that mutant screening and sequencing can be used to study photosynthetic mechanisms or to probe the mutational robustness of chloroplast-encoded proteins, and we propose that this method is a valuable tool for the directed evolution of enzymes in the chloroplast.
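
    The scale of such a screen is easy to reproduce in a toy simulation; the fragment below mutates a random 300-bp stand-in sequence at an assumed per-base error rate (neither the real petD sequence nor the study's library statistics) and counts the resulting amino acid substitutions with Biopython.

      import random
      from Bio.Seq import Seq

      random.seed(7)
      BASES = "ACGT"
      target = "".join(random.choice(BASES) for _ in range(300))  # petD stand-in

      def ep_pcr(seq, rate=0.005):
          """Introduce random substitutions at `rate` errors per base."""
          return "".join(random.choice([b for b in BASES if b != c])
                         if random.random() < rate else c
                         for c in seq)

      wt_protein = Seq(target).translate()
      for clone in range(5):
          mut = ep_pcr(target)
          nt = sum(a != b for a, b in zip(target, mut))
          aa = sum(a != b for a, b in zip(wt_protein, Seq(mut).translate()))
          print(f"clone {clone}: {nt} nt changes, {aa} aa substitutions")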

  17. Error-prone PCR mutation of Ls-EPSPS gene from Liriope spicata conferring to its enhanced glyphosate-resistance.

    PubMed

    Mao, Chanjuan; Xie, Hongjie; Chen, Shiguo; Valverde, Bernal E; Qiang, Sheng

    2017-09-01

    Liriope spicata (Thunb.) Lour has a unique LsEPSPS structure contributing to the highest-ever-recognized natural glyphosate tolerance. The transformed LsEPSPS confers increased glyphosate resistance to E. coli and A. thaliana. However, the increased glyphosate-resistance level is not high enough to be of commercial value. Therefore, LsEPSPS was subjected to error-prone PCR to screen for mutant EPSPS genes capable of endowing higher resistance levels. A mutant designated as ELs-EPSPS, having five mutated amino acids (37Val, 67Asn, 277Ser, 351Gly and 422Gly), was selected for its ability to confer improved resistance to glyphosate. Expression of ELs-EPSPS in recombinant E. coli BL21 (DE3) strains enhanced resistance to glyphosate in comparison to both the LsEPSPS-transformed and -untransformed controls. Furthermore, transgenic ELs-EPSPS A. thaliana showed about 5.4-fold and 2-fold resistance to glyphosate compared with the wild-type and the Ls-EPSPS-transgenic plants, respectively. Therefore, the mutated ELs-EPSPS gene has potential value for the development of glyphosate-resistant crops.

  18. Aggression proneness: Transdiagnostic processes involving negative valence and cognitive systems.

    PubMed

    Verona, Edelyn; Bresin, Konrad

    2015-11-01

    Aggressive behavior is observed in persons with various mental health problems and has been studied from the perspectives of neuroscience and psychophysiology. The present research reviews some of the extant experimental literature to help clarify the interplay between domains of functioning implicated in aggression proneness. We then convey a process-oriented model that elucidates how the interplay of the Negative Valence and Cognitive System domains of NIMH's Research Domain Criteria (RDoC) helps explain aggression proneness, particularly reactive aggression. Finally, we report on a study involving event-related potential (ERP) indices of emotional and inhibitory control processing during an emotional-linguistic go/no-go task among 67 individuals with histories of violence and criminal offending (30% female, 44% African-American) who reported on their aggressive tendencies using the Buss-Perry Aggression Questionnaire. Results provide evidence that tendencies toward angry and aggressive behavior relate to reduced inhibitory control processing (no-go P3) specifically during relevant threat-word blocks, suggesting deterioration of cognitive control by acute or sustained threat sensitivity. These findings highlight the value of ERP methodologies for clarifying the interplay of Negative Valence and Cognitive System processes in aggression proneness.

  19. Using Block-local Atomicity to Detect Stale-value Concurrency Errors

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus; Biere, Armin

    2004-01-01

    Data races do not cover all kinds of concurrency errors. This paper presents a data-flow-based technique to find stale-value errors, which are not found by low-level and high-level data race algorithms. Stale values denote copies of shared data where the copy is no longer synchronized. The algorithm to detect such values works as a consistency check that does not require any assumptions or annotations of the program. It has been implemented as a static analysis in JNuke. The analysis is sound and requires only a single execution trace if implemented as a run-time checking algorithm. Being based on an analysis of Java bytecode, it encompasses the full program semantics, including arbitrarily complex expressions. Related techniques are more complex and more prone to over-reporting.
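
    The bug class is easiest to see in a small example. In the Python sketch below (the paper itself analyzes Java bytecode), every access is individually lock-protected, so no data race exists, yet the local copy goes stale between the two synchronized blocks.

      import threading

      counter = 0
      lock = threading.Lock()

      def unsafe_increment():
          global counter
          with lock:
              v = counter      # copy of shared data, made under the lock
          # lock released here: `v` may now be stale
          with lock:
              counter = v + 1  # lost update if another thread ran in between

      threads = [threading.Thread(target=unsafe_increment) for _ in range(1000)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()
      print(counter)  # typically < 1000: updates were lost without a data race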

  20. Using Audit Information to Adjust Parameter Estimates for Data Errors in Clinical Trials

    PubMed Central

    Shepherd, Bryan E.; Shaw, Pamela A.; Dodd, Lori E.

    2013-01-01

    Background: Audits are often performed to assess the quality of clinical trial data, but beyond detecting fraud or sloppiness, the audit data are generally ignored. In earlier work using data from a non-randomized study, Shepherd and Yu (2011) developed statistical methods to incorporate audit results into study estimates, and demonstrated that audit data could be used to eliminate bias. Purpose: In this manuscript we examine the usefulness of audit-based error-correction methods in clinical trial settings where a continuous outcome is of primary interest. Methods: We demonstrate the bias of multiple linear regression estimates in general settings with an outcome that may have errors and a set of covariates for which some may have errors and others, including treatment assignment, are recorded correctly for all subjects. We study this bias under different assumptions, including independence between treatment assignment, covariates, and data errors (conceivable in a double-blinded randomized trial) and independence between treatment assignment and covariates but not data errors (possible in an unblinded randomized trial). We review moment-based estimators to incorporate the audit data and propose new multiple imputation estimators. The performance of estimators is studied in simulations. Results: When treatment is randomized and unrelated to data errors, estimates of the treatment effect using the original error-prone data (i.e., ignoring the audit results) are unbiased. In this setting, both moment and multiple imputation estimators incorporating audit data are more variable than standard analyses using the original data. In contrast, in settings where treatment is randomized but correlated with data errors, and in settings where treatment is not randomized, standard treatment effect estimates will be biased. And in all settings, parameter estimates for the original, error-prone covariates will be biased. Treatment and covariate effect estimates can be corrected by incorporating audit data using either the multiple imputation or moment-based approaches. Bias, precision, and coverage of confidence intervals improve as the audit size increases. Limitations: The extent of bias and the performance of methods depend on the extent and nature of the error as well as the size of the audit. This work only considers methods for the linear model. Settings much different than those considered here need further study. Conclusions: In randomized trials with continuous outcomes and treatment assignment independent of data errors, standard analyses of treatment effects will be unbiased and are recommended. However, if treatment assignment is correlated with data errors or other covariates, naive analyses may be biased. In these settings, and when covariate effects are of interest, approaches for incorporating audit results should be considered.
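
    A stylized simulation conveys the covariate case; the regression-calibration-style fix below is an illustration in the same spirit as the moment-based estimators, not the authors' exact method.

      import numpy as np

      rng = np.random.default_rng(3)
      n, n_audit = 2000, 200
      x_true = rng.normal(size=n)
      # 15% of records carry a data error in the covariate
      x_err = np.where(rng.random(n) < 0.15,
                       x_true + rng.normal(0, 2, n), x_true)
      y = 1.0 + 2.0 * x_true + rng.normal(size=n)

      def slope(x, yy):
          return np.polyfit(x, yy, 1)[0]

      naive = slope(x_err, y)                        # attenuated by data errors
      audit = rng.choice(n, n_audit, replace=False)  # records verified by review
      lam = slope(x_err[audit], x_true[audit])       # audit-estimated attenuation
      print(f"true 2.00  naive {naive:.2f}  corrected {naive / lam:.2f}")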

  1. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    PubMed

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extensibility. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
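
    CDM itself is not shown here, but the learning-curve idea it exposes to study coordinators can be sketched with generic scikit-learn tools (data set and model are placeholders):

      import numpy as np
      from sklearn.datasets import load_breast_cancer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import learning_curve

      X, y = load_breast_cancer(return_X_y=True)
      sizes, _, val_scores = learning_curve(
          LogisticRegression(max_iter=5000), X, y,
          train_sizes=np.linspace(0.1, 1.0, 5), cv=5, scoring="roc_auc")

      # If the curve has flattened, further patient inclusion is unlikely
      # to improve the diagnostic model.
      for n, s in zip(sizes, val_scores.mean(axis=1)):
          print(f"{n:4d} patients -> cross-validated AUC {s:.3f}")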

  2. A Case-Series Test of the Interactive Two-step Model of Lexical Access: Predicting Word Repetition from Picture Naming

    PubMed Central

    Dell, Gary S.; Martin, Nadine; Schwartz, Myrna F.

    2010-01-01

    Lexical access in language production, and particularly pathologies of lexical access, are often investigated by examining errors in picture naming and word repetition. In this article, we test a computational approach to lexical access, the two-step interactive model, by examining whether the model can quantitatively predict the repetition-error patterns of 65 aphasic subjects from their naming errors. The model’s characterizations of the subjects’ naming errors were taken from the companion paper to this one (Schwartz, Dell, N. Martin, Gahl & Sobel, 2006), and their repetition was predicted from the model on the assumption that naming involves two error-prone steps, word and phonological retrieval, whereas repetition only creates errors in the second of these steps. A version of the model in which lexical-semantic and lexical-phonological connections could be independently lesioned was generally successful in predicting repetition for the aphasics. An analysis of the few cases in which model predictions were inaccurate revealed the role of input phonology in the repetition task. PMID:21085621
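
    The prediction logic can be reduced to a toy calculation (hypothetical per-step rates, not the interactive model's fitted parameters): naming succeeds only if both steps succeed, whereas repetition exercises only the phonological step.

      # Hypothetical per-step retrieval success rates for one patient:
      p_word, p_phon = 0.80, 0.90

      p_naming = p_word * p_phon    # 0.72: both steps must succeed
      p_repetition = p_phon         # 0.90: predicted from the naming fit
      print(p_naming, p_repetition)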

  3. Spatial calibration of an optical see-through head mounted display

    PubMed Central

    Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew

    2010-01-01

    We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125
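
    The photogrammetric machinery involved is standard; a minimal OpenCV sketch on synthetic data (stand-ins for the tracked object and the features imaged through the HMD) recovers the same intrinsic quantities:

      import cv2
      import numpy as np

      rng = np.random.default_rng(1)
      obj = rng.uniform(-1, 1, (60, 3)).astype(np.float32)   # non-planar target

      K_true = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], np.float32)
      obj_pts, img_pts = [], []
      for i in range(5):                                     # five camera poses
          rvec = rng.normal(0, 0.2, 3).astype(np.float32)
          tvec = np.array([0, 0, 6 + i], np.float32)
          proj, _ = cv2.projectPoints(obj, rvec, tvec, K_true, None)
          obj_pts.append(obj)
          img_pts.append(proj.astype(np.float32))

      rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
          obj_pts, img_pts, (640, 480), None, None)
      print("re-projection RMS:", rms)   # K holds focal length, optic centre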

  4. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark

    PubMed Central

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-01-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today’s data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG’s simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact. PMID:27390389
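
    BIGDEBUG's primitives are not part of stock Spark, but their flavor can be approximated in plain PySpark (a sketch, not the BIGDEBUG API):

      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("watchpoint-sketch").getOrCreate()
      records = spark.sparkContext.parallelize(["12", "7", "oops", "42"])

      # "On-demand watchpoint": retrieve only records matching a guard,
      # instead of pausing every worker as a gdb-style breakpoint would.
      trapped = records.filter(lambda r: not r.strip().isdigit()).take(10)
      print("trapped records:", trapped)

      # "Crash-culprit isolation": tag bad inputs rather than letting one
      # record kill the job, so the rest of the computation can proceed.
      tagged = records.map(lambda r: ("ok", int(r)) if r.strip().isdigit()
                           else ("culprit", r))
      print(tagged.collect())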

  5. Medication errors reported to the National Medication Error Reporting System in Malaysia: a 4-year retrospective review (2009 to 2012).

    PubMed

    Samsiah, A; Othman, Noordin; Jamshed, Shazia; Hassali, Mohamed Azmi; Wan-Mohaina, W M

    2016-12-01

    Reporting and analysing the data on medication errors (MEs) is important and contributes to a better understanding of the error-prone environment. This study aims to examine the characteristics of errors submitted to the National Medication Error Reporting System (MERS) in Malaysia. A retrospective review of reports received from 1 January 2009 to 31 December 2012 was undertaken. Descriptive statistics method was applied. A total of 17,357 MEs reported were reviewed. The majority of errors were from public-funded hospitals. Near misses were classified in 86.3 % of the errors. The majority of errors (98.1 %) had no harmful effects on the patients. Prescribing contributed to more than three-quarters of the overall errors (76.1 %). Pharmacists detected and reported the majority of errors (92.1 %). Cases of erroneous dosage or strength of medicine (30.75 %) were the leading type of error, whilst cardiovascular (25.4 %) was the most common category of drug found. MERS provides rich information on the characteristics of reported MEs. Low contribution to reporting from healthcare facilities other than government hospitals and non-pharmacists requires further investigation. Thus, a feasible approach to promote MERS among healthcare providers in both public and private sectors needs to be formulated and strengthened. Preventive measures to minimise MEs should be directed to improve prescribing competency among the fallible prescribers identified.

  6. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis

    PubMed Central

    Gobiet, Andreas

    2016-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio‐temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan‐European data sets and a set that combines eight very high‐resolution station‐based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post‐processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small‐scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate‐mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments. PMID:28111497

  7. Impacts of uncertainties in European gridded precipitation observations on regional climate analysis.

    PubMed

    Prein, Andreas F; Gobiet, Andreas

    2017-01-01

    Gridded precipitation data sets are frequently used to evaluate climate models or to remove model output biases. Although precipitation data are error prone due to the high spatio-temporal variability of precipitation and due to considerable measurement errors, relatively few attempts have been made to account for observational uncertainty in model evaluation or in bias correction studies. In this study, we compare three types of European daily data sets featuring two Pan-European data sets and a set that combines eight very high-resolution station-based regional data sets. Furthermore, we investigate seven widely used, larger scale global data sets. Our results demonstrate that the differences between these data sets have the same magnitude as precipitation errors found in regional climate models. Therefore, including observational uncertainties is essential for climate studies, climate model evaluation, and statistical post-processing. Following our results, we suggest the following guidelines for regional precipitation assessments. (1) Include multiple observational data sets from different sources (e.g. station, satellite, reanalysis based) to estimate observational uncertainties. (2) Use data sets with high station densities to minimize the effect of precipitation undersampling (may induce about 60% error in data sparse regions). The information content of a gridded data set is mainly related to its underlying station density and not to its grid spacing. (3) Consider undercatch errors of up to 80% in high latitudes and mountainous regions. (4) Analyses of small-scale features and extremes are especially uncertain in gridded data sets. For higher confidence, use climate-mean and larger scale statistics. In conclusion, neglecting observational uncertainties potentially misguides climate model development and can severely affect the results of climate change impact assessments.
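
    Guideline (1) amounts to treating the spread across independent products as a first-order estimate of observational uncertainty; a schematic numpy version (synthetic stand-ins for real gridded products):

      import numpy as np

      rng = np.random.default_rng(0)
      truth = rng.gamma(shape=0.5, scale=4.0, size=(365, 20, 20))  # day,lat,lon
      datasets = np.stack([truth * rng.normal(1.0, 0.15, truth.shape)
                           for _ in range(3)])    # three "products"

      clim = datasets.mean(axis=1)        # per-product climatology
      obs_spread = clim.std(axis=0)       # inter-product standard deviation
      print("mean observational spread:", float(obs_spread.mean()))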

  8. A scalable method to improve gray matter segmentation at ultra high field MRI.

    PubMed

    Gulban, Omer Faruk; Schneider, Marian; Marquardt, Ingo; Haast, Roy A M; De Martino, Federico

    2018-01-01

    High-resolution (functional) magnetic resonance imaging (MRI) at ultra high magnetic fields (7 Tesla and above) enables researchers to study how anatomical and functional properties change within the cortical ribbon, along surfaces and across cortical depths. These studies require an accurate delineation of the gray matter ribbon, which often suffers from inclusion of blood vessels, dura mater and other non-brain tissue. Residual segmentation errors are commonly corrected by browsing the data slice-by-slice and manually changing labels. This task becomes increasingly laborious and prone to error at higher resolutions since both work and error scale with the number of voxels. Here we show that many mislabeled, non-brain voxels can be corrected more efficiently and semi-automatically by representing three-dimensional anatomical images using two-dimensional histograms. We propose both a uni-modal (based on first spatial derivative) and multi-modal (based on compositional data analysis) approach to this representation and quantify the benefits in 7 Tesla MRI data of nine volunteers. We present an openly accessible Python implementation of these approaches and demonstrate that editing cortical segmentations using two-dimensional histogram representations as an additional post-processing step aids existing algorithms and yields improved gray matter borders. By making our data and corresponding expert (ground truth) segmentations openly available, we facilitate future efforts to develop and test segmentation algorithms on this challenging type of data.
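
    The uni-modal representation is easy to reproduce in outline (random stand-in volume; a real anatomical image would be loaded instead):

      import numpy as np

      vol = np.random.default_rng(0).normal(100.0, 20.0, (64, 64, 64))

      # Intensity vs. first spatial derivative (gradient magnitude): voxel
      # clusters selected in this 2-D histogram can be mapped back to the
      # volume to relabel vessels or dura semi-automatically.
      gx, gy, gz = np.gradient(vol)
      gmag = np.sqrt(gx**2 + gy**2 + gz**2)
      hist, xedges, yedges = np.histogram2d(vol.ravel(), gmag.ravel(), bins=128)
      print(hist.shape)   # 128 x 128 counts: intensity x gradient magnitude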

  9. A scalable method to improve gray matter segmentation at ultra high field MRI

    PubMed Central

    De Martino, Federico

    2018-01-01

    High-resolution (functional) magnetic resonance imaging (MRI) at ultra high magnetic fields (7 Tesla and above) enables researchers to study how anatomical and functional properties change within the cortical ribbon, along surfaces and across cortical depths. These studies require an accurate delineation of the gray matter ribbon, which often suffers from inclusion of blood vessels, dura mater and other non-brain tissue. Residual segmentation errors are commonly corrected by browsing the data slice-by-slice and manually changing labels. This task becomes increasingly laborious and prone to error at higher resolutions since both work and error scale with the number of voxels. Here we show that many mislabeled, non-brain voxels can be corrected more efficiently and semi-automatically by representing three-dimensional anatomical images using two-dimensional histograms. We propose both a uni-modal (based on first spatial derivative) and multi-modal (based on compositional data analysis) approach to this representation and quantify the benefits in 7 Tesla MRI data of nine volunteers. We present an openly accessible Python implementation of these approaches and demonstrate that editing cortical segmentations using two-dimensional histogram representations as an additional post-processing step aids existing algorithms and yields improved gray matter borders. By making our data and corresponding expert (ground truth) segmentations openly available, we facilitate future efforts to develop and test segmentation algorithms on this challenging type of data. PMID:29874295

  10. Structural and Functional Basis of the Fidelity of Nucleotide Selection by Flavivirus RNA-Dependent RNA Polymerases

    PubMed Central

    Canard, Bruno

    2018-01-01

    Viral RNA-dependent RNA polymerases (RdRps) play a central role not only in viral replication, but also in the genetic evolution of viral RNAs. After binding to an RNA template and selecting 5′-triphosphate ribonucleosides, viral RdRps synthesize an RNA copy according to Watson-Crick base-pairing rules. The copy process sometimes deviates from both the base-pairing rules specified by the template and the natural ribose selectivity and, thus, the process is error-prone due to the intrinsic (in)fidelity of viral RdRps. These enzymes share a number of conserved amino-acid sequence strings, called motifs A–G, which can be defined from a structural and functional point-of-view. A correlation is gradually emerging between mutations in these motifs and viral genome evolution or observed mutation rates. Here, we review our current knowledge on these motifs and their role on the structural and mechanistic basis of the fidelity of nucleotide selection and RNA synthesis by Flavivirus RdRps. PMID:29385764

  11. Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations.

    PubMed

    Martínez-Romero, Marcos; O'Connor, Martin J; Shankar, Ravi D; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L; Gevaert, Olivier; Graybeal, John; Musen, Mark A

    2017-01-01

    In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository.
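
    Stripped of its ontology component, the core of a value recommender of this kind is a ranking over previously entered metadata; a toy frequency-based version:

      from collections import Counter, defaultdict

      history = defaultdict(Counter)      # field -> counts of entered values

      def record(field, value):
          history[field][value] += 1

      def recommend(field, top_n=3):
          # The described framework also exploits ontology-based field
          # specifications; this sketch ranks by past frequency alone.
          return [v for v, _ in history[field].most_common(top_n)]

      for v in ["Homo sapiens", "Mus musculus", "Homo sapiens"]:
          record("organism", v)
      print(recommend("organism"))        # ['Homo sapiens', 'Mus musculus']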

  12. Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations

    PubMed Central

    Martínez-Romero, Marcos; O’Connor, Martin J.; Shankar, Ravi D.; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L.; Gevaert, Olivier; Graybeal, John; Musen, Mark A.

    2017-01-01

    In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository. PMID:29854196

  13. Unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance

    USGS Publications Warehouse

    Fiske, I.J.; Chandler, R.B.

    2011-01-01

    Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
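
    In the single-season site-occupancy case, the common hierarchical structure can be written (our notation) as

      \[ z_i \sim \mathrm{Bernoulli}(\psi_i), \qquad y_{ij} \mid z_i \sim \mathrm{Bernoulli}(z_i \, p_{ij}), \]
      \[ \operatorname{logit}(\psi_i) = \mathbf{x}_i^{\top}\boldsymbol{\beta}, \qquad \operatorname{logit}(p_{ij}) = \mathbf{w}_{ij}^{\top}\boldsymbol{\alpha}, \]

    where z_i is the latent occurrence state of site i, y_ij the detection record on visit j, and the two logit links carry the separate covariates of the occurrence and detection processes.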

  14. Automatic Identification of Critical Follow-Up Recommendation Sentences in Radiology Reports

    PubMed Central

    Yetisgen-Yildiz, Meliha; Gunn, Martin L.; Xia, Fei; Payne, Thomas H.

    2011-01-01

    Communication of follow-up recommendations when abnormalities are identified on imaging studies is prone to error. When recommendations are not systematically identified and promptly communicated to referrers, poor patient outcomes can result. Using information technology can improve communication and improve patient safety. In this paper, we describe a text processing approach that uses natural language processing (NLP) and supervised text classification methods to automatically identify critical recommendation sentences in radiology reports. To increase the classification performance we enhanced the simple unigram token representation approach with lexical, semantic, knowledge-base, and structural features. We tested different combinations of those features with the Maximum Entropy (MaxEnt) classification algorithm. Classifiers were trained and tested with a gold standard corpus annotated by a domain expert. We applied 5-fold cross validation and our best performing classifier achieved 95.60% precision, 79.82% recall, 87.0% F-score, and 99.59% classification accuracy in identifying the critical recommendation sentences in radiology reports. PMID:22195225
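
    Since multinomial logistic regression is the maximum-entropy classifier, the pipeline can be sketched with scikit-learn (toy sentences; the study used a domain-expert gold standard and richer lexical, semantic, knowledge-base, and structural features):

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      sentences = [
          "Recommend follow-up chest CT in 3 months.",
          "The lungs are clear.",
          "Suggest clinical correlation and repeat ultrasound.",
          "No acute abnormality.",
      ]
      labels = [1, 0, 1, 0]     # 1 = critical recommendation sentence

      clf = make_pipeline(CountVectorizer(), LogisticRegression())
      clf.fit(sentences, labels)
      print(clf.predict(["Follow-up MRI is recommended."]))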

  15. Cognitive determinants of affective forecasting errors

    PubMed Central

    Hoerger, Michael; Quirk, Stuart W.; Lucas, Richard E.; Carr, Thomas H.

    2011-01-01

    Often to the detriment of human decision making, people are prone to an impact bias when making affective forecasts, overestimating the emotional consequences of future events. The cognitive processes underlying the impact bias, and methods for correcting it, have been debated and warrant further exploration. In the present investigation, we examined both individual differences and contextual variables associated with cognitive processing in affective forecasting for an election. Results showed that the perceived importance of the event and working memory capacity were both associated with an increased impact bias for some participants, whereas retrieval interference had no relationship with bias. Additionally, an experimental manipulation effectively reduced biased forecasts, particularly among participants who were most distracted thinking about peripheral life events. These findings have direct theoretical implications for understanding the impact bias, highlight the importance of individual differences in affective forecasting, and have ramifications for future decision making research. The possible functional role of the impact bias is discussed within the context of evolutionary psychology. PMID:21912580

  16. Automatic identification of critical follow-up recommendation sentences in radiology reports.

    PubMed

    Yetisgen-Yildiz, Meliha; Gunn, Martin L; Xia, Fei; Payne, Thomas H

    2011-01-01

    Communication of follow-up recommendations when abnormalities are identified on imaging studies is prone to error. When recommendations are not systematically identified and promptly communicated to referrers, poor patient outcomes can result. Using information technology can improve communication and improve patient safety. In this paper, we describe a text processing approach that uses natural language processing (NLP) and supervised text classification methods to automatically identify critical recommendation sentences in radiology reports. To increase the classification performance we enhanced the simple unigram token representation approach with lexical, semantic, knowledge-base, and structural features. We tested different combinations of those features with the Maximum Entropy (MaxEnt) classification algorithm. Classifiers were trained and tested with a gold standard corpus annotated by a domain expert. We applied 5-fold cross validation and our best performing classifier achieved 95.60% precision, 79.82% recall, 87.0% F-score, and 99.59% classification accuracy in identifying the critical recommendation sentences in radiology reports.

  17. What is Developmental Dyslexia?

    PubMed Central

    Stein, John

    2018-01-01

    Until the 1950s, developmental dyslexia was defined as a hereditary visual disability, selectively affecting reading without compromising oral or non-verbal reasoning skills. This changed radically after the development of the phonological theory of dyslexia; this not only ruled out any role for visual processing in its aetiology, but it also cast doubt on the use of discrepancy between reading and reasoning skills as a criterion for diagnosing it. Here I argue that this theory is set at too high a cognitive level to be explanatory; we need to understand the pathophysiological visual and auditory mechanisms that cause children’s phonological problems. I discuss how the ‘magnocellular theory’ attempts to do this in terms of slowed and error-prone temporal processing, which leads to dyslexics’ defective visual and auditory sequencing when attempting to read. I attempt to deal with the criticisms of this theory and show how it leads to a number of successful ways of helping dyslexic children to overcome their reading difficulties. PMID:29401712

  18. Computer vision and soft computing for automatic skull-face overlay in craniofacial superimposition.

    PubMed

    Campomanes-Álvarez, B Rosario; Ibáñez, O; Navarro, F; Alemán, I; Botella, M; Damas, S; Cordón, O

    2014-12-01

    Craniofacial superimposition can provide evidence for whether some human skeletal remains belong to a missing person. It involves the process of overlaying a skull with a number of ante mortem images of an individual and the analysis of their morphological correspondence. Within the craniofacial superimposition process, the skull-face overlay stage focuses on achieving the best possible overlay of the skull and a single ante mortem image of the suspect. Although craniofacial superimposition has been in use for over a century, skull-face overlay is still applied by means of a trial-and-error approach without an automatic method. Practitioners finish the process once they consider that a good enough overlay has been attained. Hence, skull-face overlay is a very challenging, subjective, error-prone, and time-consuming part of the whole process. Though a numerical assessment of the method's quality has not yet been achieved, computer vision and soft computing arise as powerful tools to automate it, dramatically reducing the time taken by the expert and obtaining an unbiased overlay result. In this manuscript, we justify and analyze the use of these techniques to properly model the skull-face overlay problem. We also present the automatic technical procedure we have developed using these computational methods and show the four overlays obtained in two craniofacial superimposition cases. This automatic procedure can thus be considered a tool to aid forensic anthropologists in developing the skull-face overlay, automating the most tedious task within craniofacial superimposition and removing its subjectivity. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
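
    The overlay-as-optimization idea can be illustrated with a 2-D similarity fit between corresponding landmarks (a deliberate simplification: the automatic method optimizes a full 3-D camera model):

      import numpy as np

      rng = np.random.default_rng(0)
      face = rng.uniform(0, 100, (8, 2))           # photo landmarks
      th, s, t = 0.2, 1.3, np.array([10.0, -5.0])  # hidden true transform
      R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
      skull = (face - t) @ R / s                   # synthetic skull landmarks

      # Least squares for [s*cos, s*sin, tx, ty] mapping skull -> face:
      A = np.zeros((2 * len(face), 4))
      A[0::2, 0], A[0::2, 1], A[0::2, 2] = skull[:, 0], -skull[:, 1], 1
      A[1::2, 0], A[1::2, 1], A[1::2, 3] = skull[:, 1],  skull[:, 0], 1
      p, *_ = np.linalg.lstsq(A, face.reshape(-1), rcond=None)
      fit = skull @ np.array([[p[0], p[1]], [-p[1], p[0]]]) + p[2:]
      print("mean overlay error:", np.abs(fit - face).mean())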

  19. "Truth be told" - Semantic memory as the scaffold for veridical communication.

    PubMed

    Hayes, Brett K; Ramanan, Siddharth; Irish, Muireann

    2018-01-01

    Theoretical accounts placing episodic memory as central to constructive and communicative functions neglect the role of semantic memory. We argue that the decontextualized nature of semantic schemas largely circumvents the computational bottleneck and error-prone nature of episodic memory. Rather, neuroimaging and neuropsychological evidence of episodic-semantic interactions suggests that an integrative framework more accurately captures the mechanisms underpinning social communication.

  20. Exploring the Clinical Utility of the Development and Well-Being Assessment (DAWBA) in the Detection of Hyperkinetic Disorders and Associated Diagnoses in Clinical Practice

    ERIC Educational Resources Information Center

    Foreman, David; Morton, Stephanie; Ford, Tamsin

    2009-01-01

    Background: The clinical diagnosis of ADHD is time-consuming and error-prone. Secondary care referral results in long waiting times, but primary care staff may not provide reliable diagnoses. The Development And Well-Being Assessment (DAWBA) is a standardised assessment for common child mental health problems, including attention…

  1. Measuring Diameters Of Large Vessels

    NASA Technical Reports Server (NTRS)

    Currie, James R.; Kissel, Ralph R.; Oliver, Charles E.; Smith, Earnest C.; Redmon, John W., Sr.; Wallace, Charles C.; Swanson, Charles P.

    1990-01-01

    Computerized apparatus produces accurate results quickly. Apparatus measures diameter of tank or other large cylindrical vessel, without prior knowledge of exact location of cylindrical axis. Produces plot of inner circumference, estimate of true center of vessel, data on radius, diameter of best-fit circle, and negative and positive deviations of radius from circle at closely spaced points on circumference. Eliminates need for time-consuming and error-prone manual measurements.
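
    The best-fit circle itself is a small least-squares problem (algebraic Kasa fit; synthetic stand-ins for the measured circumference points):

      import numpy as np

      rng = np.random.default_rng(0)
      t = rng.uniform(0, 2 * np.pi, 200)
      x = 5000 + 2500 * np.cos(t) + rng.normal(0, 1.0, t.size)   # mm
      y = -300 + 2500 * np.sin(t) + rng.normal(0, 1.0, t.size)

      # Fit x^2 + y^2 + D x + E y + F = 0, which is linear in (D, E, F).
      A = np.column_stack([x, y, np.ones_like(x)])
      (D, E, F), *_ = np.linalg.lstsq(A, -(x**2 + y**2), rcond=None)
      cx, cy = -D / 2, -E / 2
      r = np.sqrt(cx**2 + cy**2 - F)
      dev = np.hypot(x - cx, y - cy) - r    # +/- deviations from the circle
      print(f"centre ({cx:.1f}, {cy:.1f}), diameter {2 * r:.1f} mm, "
            f"deviations {dev.min():.2f} to {dev.max():.2f} mm")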

  2. Quantitative Analysis of the Mutagenic Potential of 1-Aminopyrene-DNA Adduct Bypass Catalyzed by Y-Family DNA Polymerases

    PubMed Central

    Sherrer, Shanen M.; Taggart, David J.; Pack, Lindsey R.; Malik, Chanchal K.; Basu, Ashis K.; Suo, Zucai

    2012-01-01

    N-(deoxyguanosin-8-yl)-1-aminopyrene (dGAP) is the predominant nitro polyaromatic hydrocarbon product generated from the air pollutant 1-nitropyrene reacting with DNA. Previous studies have shown that dGAP induces genetic mutations in bacterial and mammalian cells. One potential source of these mutations is the error-prone bypass of dGAP lesions catalyzed by the low-fidelity Y-family DNA polymerases. To provide a comparative analysis of the mutagenic potential of the translesion DNA synthesis (TLS) of dGAP, we employed short oligonucleotide sequencing assays (SOSAs) with the model Y-family DNA polymerase from Sulfolobus solfataricus, DNA Polymerase IV (Dpo4), and the human Y-family DNA polymerases eta (hPolη), kappa (hPolκ), and iota (hPolι). Relative to undamaged DNA, all four enzymes generated far more mutations (base deletions, insertions, and substitutions) with a DNA template containing a site-specifically placed dGAP. Opposite dGAP and at an immediate downstream template position, the most frequent mutations made by the three human enzymes were base deletions, and the most frequent base substitutions were dAs for all enzymes. Based on the SOSA data, Dpo4 was the least error-prone Y-family DNA polymerase among the four enzymes during the TLS of dGAP. Among the three human Y-family enzymes, hPolκ made the fewest mutations at all template positions except opposite the lesion site. hPolκ was significantly less error-prone than hPolι and hPolη during the extension of dGAP bypass products. Interestingly, the most frequent mutations created by hPolι at all template positions were base deletions. Although hRev1, the fourth human Y-family enzyme, could not extend dGAP bypass products in our standing start assays, it preferentially incorporated dCTP opposite the bulky lesion. Collectively, these mutagenic profiles suggest that hPolκ and hRev1 are the most suitable human Y-family DNA polymerases to perform TLS of dGAP in humans. PMID:22917544

  3. Experimental investigation of observation error in anuran call surveys

    USGS Publications Warehouse

    McClintock, B.T.; Bailey, L.L.; Pollock, K.H.; Simons, T.R.

    2010-01-01

    Occupancy models that account for imperfect detection are often used to monitor anuran and songbird species occurrence. However, presence-absence data arising from auditory detections may be more prone to observation error (e.g., false-positive detections) than are sampling approaches utilizing physical captures or sightings of individuals. We conducted realistic, replicated field experiments using a remote broadcasting system to simulate simple anuran call surveys and to investigate potential factors affecting observation error in these studies. Distance, time, ambient noise, and observer abilities were the most important factors explaining false-negative detections. Distance and observer ability were the best overall predictors of false-positive errors, but ambient noise and competing species also affected error rates for some species. False-positive errors made up 5% of all positive detections, with individual observers exhibiting false-positive rates between 0.5% and 14%. Previous research suggests false-positive errors of these magnitudes would induce substantial positive biases in standard estimators of species occurrence, and we recommend practices to mitigate false positives when developing occupancy monitoring protocols that rely on auditory detections. These recommendations include additional observer training, limiting the number of target species, and establishing distance and ambient noise thresholds during surveys. © 2010 The Wildlife Society.

  4. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there are also considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  5. A Survey of Flow Cytometry Data Analysis Methods

    PubMed Central

    Bashashati, Ali; Brinkman, Ryan R.

    2009-01-01

    Flow cytometry (FCM) is widely used in health research and in treatment for a variety of tasks, such as in the diagnosis and monitoring of leukemia and lymphoma patients, providing the counts of helper-T lymphocytes needed to monitor the course and treatment of HIV infection, the evaluation of peripheral blood hematopoietic stem cell grafts, and many other diseases. In practice, FCM data analysis is performed manually, a process that requires an inordinate amount of time and is error-prone, nonreproducible, nonstandardized, and not open for re-evaluation, making it the most limiting aspect of this technology. This paper reviews state-of-the-art FCM data analysis approaches using a framework introduced to report each of the components in a data analysis pipeline. Current challenges and possible future directions in developing fully automated FCM data analysis tools are also outlined. PMID:20049163

  6. ASSIST user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1995-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.
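
    The rule-to-model idea (not ASSIST's actual input language) can be sketched generically: enumerate reachable states by applying failure-rule transitions breadth-first, so no state is ever listed by hand.

      from collections import deque

      # State: number of working processors in a hypothetical 3-processor
      # system. Rule: any working processor fails at rate lam; the system
      # is dead below 2 working units. Tools like ASSIST expand such rules
      # into the full Markov chain automatically.
      lam = 1e-4

      def transitions(n_working):
          if n_working >= 2:                       # system still operational
              yield (n_working - 1, n_working * lam)

      states, queue, edges = {3}, deque([3]), []
      while queue:
          s = queue.popleft()
          for nxt, rate in transitions(s):
              edges.append((s, nxt, rate))
              if nxt not in states:
                  states.add(nxt)
                  queue.append(nxt)
      print(sorted(states), edges)    # generated states and transition rates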

  7. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

  8. Face emotion recognition is related to individual differences in psychosis-proneness.

    PubMed

    Germine, L T; Hooker, C I

    2011-05-01

    Deficits in face emotion recognition (FER) in schizophrenia are well documented, and have been proposed as a potential intermediate phenotype for schizophrenia liability. However, research on the relationship between psychosis vulnerability and FER has mixed findings and methodological limitations. Moreover, no study has yet characterized the relationship between FER ability and level of psychosis-proneness. If FER ability varies continuously with psychosis-proneness, this suggests a relationship between FER and polygenic risk factors. We tested two large internet samples to see whether psychometric psychosis-proneness, as measured by the Schizotypal Personality Questionnaire-Brief (SPQ-B), is related to differences in face emotion identification and discrimination or other face processing abilities. Experiment 1 (n=2332) showed that psychosis-proneness predicts face emotion identification ability but not face gender identification ability. Experiment 2 (n=1514) demonstrated that psychosis-proneness also predicts performance on face emotion but not face identity discrimination. The tasks in Experiment 2 used identical stimuli and task parameters, differing only in emotion/identity judgment. Notably, the relationships demonstrated in Experiments 1 and 2 persisted even when individuals with the highest psychosis-proneness levels (the putative high-risk group) were excluded from analysis. Our data suggest that FER ability is related to individual differences in psychosis-like characteristics in the normal population, and that these differences cannot be accounted for by differences in face processing and/or visual perception. Our results suggest that FER may provide a useful candidate intermediate phenotype.

  9. Temporal consistent depth map upscaling for 3DTV

    NASA Astrophysics Data System (ADS)

    Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger

    2014-03-01

    The ongoing success of three-dimensional (3D) cinema fuels increasing efforts to spread the commercial success of 3D to new markets. The possibilities of a convincing 3D experience at home, such as three-dimensional television (3DTV), have generated a great deal of interest within the research and standardization community. A central issue for 3DTV is the creation and representation of 3D content. Acquiring scene depth information is a fundamental task in computer vision, yet complex and error-prone. Dedicated range sensors, such as the Time-of-Flight camera (ToF), can simplify the scene depth capture process and overcome shortcomings of traditional solutions, such as active or passive stereo analysis. Admittedly, currently available ToF sensors deliver only a limited spatial resolution. However, sophisticated depth upscaling approaches use texture information to match depth and video resolution. At Electronic Imaging 2012 we proposed an upscaling routine based on error energy minimization, weighted with edge information from an accompanying video source. In this article we develop our algorithm further. By adding temporal consistency constraints to the upscaling process, we reduce disturbing depth jumps and flickering artifacts in the final 3DTV content. Temporal consistency in depth maps enhances the 3D experience, leading to a wider acceptance of 3D media content. More content in better quality can boost the commercial success of 3DTV.
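
    A crude single-frame sketch of the edge-weighted idea (not the published error-energy routine): smooth the upsampled ToF depth where the video is flat, and damp smoothing across video edges; a temporal term would additionally blend in the previous frame's depth.

      import numpy as np

      rng = np.random.default_rng(0)
      video = np.kron(rng.uniform(0, 1, (8, 8)), np.ones((8, 8)))  # 64x64 texture
      depth = np.kron(rng.uniform(1, 5, (8, 8)), np.ones((8, 8)))  # upsampled ToF

      gy, gx = np.gradient(video)
      w = np.exp(-50.0 * (gx**2 + gy**2))     # near 0 across strong video edges

      d = depth.copy()
      for _ in range(200):                    # Jacobi-style relaxation
          nb = (np.roll(d, 1, 0) + np.roll(d, -1, 0) +
                np.roll(d, 1, 1) + np.roll(d, -1, 1)) / 4.0
          d = (1 - w) * d + w * nb            # relax only where video is flat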

  10. Human machine interface by using stereo-based depth extraction

    NASA Astrophysics Data System (ADS)

    Liao, Chao-Kang; Wu, Chi-Hao; Lin, Hsueh-Yi; Chang, Ting-Ting; Lin, Tung-Yang; Huang, Po-Kuan

    2014-03-01

    The ongoing success of three-dimensional (3D) cinema fuels increasing efforts to spread the commercial success of 3D to new markets. The possibilities of a convincing 3D experience at home, such as three-dimensional television (3DTV), have generated a great deal of interest within the research and standardization community. A central issue for 3DTV is the creation and representation of 3D content. Acquiring scene depth information is a fundamental task in computer vision, yet complex and error-prone. Dedicated range sensors, such as the Time-of-Flight camera (ToF), can simplify the scene depth capture process and overcome shortcomings of traditional solutions, such as active or passive stereo analysis. Admittedly, currently available ToF sensors deliver only a limited spatial resolution. However, sophisticated depth upscaling approaches use texture information to match depth and video resolution. At Electronic Imaging 2012 we proposed an upscaling routine based on error energy minimization, weighted with edge information from an accompanying video source. In this article we develop our algorithm further. By adding temporal consistency constraints to the upscaling process, we reduce disturbing depth jumps and flickering artifacts in the final 3DTV content. Temporal consistency in depth maps enhances the 3D experience, leading to a wider acceptance of 3D media content. More content in better quality can boost the commercial success of 3DTV.

  11. The GHEP–EMPOP collaboration on mtDNA population data—A new resource for forensic casework

    PubMed Central

    Prieto, L.; Zimmermann, B.; Goios, A.; Rodriguez-Monge, A.; Paneto, G.G.; Alves, C.; Alonso, A.; Fridman, C.; Cardoso, S.; Lima, G.; Anjos, M.J.; Whittle, M.R.; Montesino, M.; Cicarelli, R.M.B.; Rocha, A.M.; Albarrán, C.; de Pancorbo, M.M.; Pinheiro, M.F.; Carvalho, M.; Sumita, D.R.; Parson, W.

    2011-01-01

    Mitochondrial DNA (mtDNA) population data for forensic purposes are still scarce for some populations, which may limit the evaluation of forensic evidence especially when the rarity of a haplotype needs to be determined in a database search. In order to improve the collection of mtDNA lineages from the Iberian and South American subcontinents, we here report the results of a collaborative study involving nine laboratories from the Spanish and Portuguese Speaking Working Group of the International Society for Forensic Genetics (GHEP-ISFG) and EMPOP. The individual laboratories contributed population data that were generated throughout the past 10 years, but in the majority of cases have not been made available to the scientific community. A total of 1019 haplotypes from Iberia (Basque Country, 2 general Spanish populations, 2 North and 1 Central Portugal populations), and Latin America (3 populations from São Paulo) were collected, reviewed and harmonized according to defined EMPOP criteria. The majority of data ambiguities that were found during the reviewing process (41 in total) were transcription errors confirming that the documentation process is still the most error-prone stage in reporting mtDNA population data, especially when performed manually. This GHEP–EMPOP collaboration has significantly improved the quality of the individual mtDNA datasets and adds mtDNA population data as valuable resource to the EMPOP database (www.empop.org). PMID:21075696

  12. Learning from malpractice claims about negligent, adverse events in primary care in the United States

    PubMed Central

    Phillips, R; Bartholomew, L; Dovey, S; Fryer, G; Miyoshi, T; Green, L

    2004-01-01

    Background: The epidemiology, risks, and outcomes of errors in primary care are poorly understood. Malpractice claims brought for negligent adverse events offer a useful insight into errors in primary care. Methods: Physician Insurers Association of America malpractice claims data (1985–2000) were analyzed for proportions of negligent claims by primary care specialty, setting, severity, health condition, and attributed cause. We also calculated risks of a claim for condition-specific negligent events relative to the prevalence of those conditions in primary care. Results: Of 49 345 primary care claims, 26 126 (53%) were peer reviewed and 5921 (23%) were assessed as negligent; 68% of claims were for negligent events in outpatient settings. No single condition accounted for more than 5% of all negligent claims, but the underlying causes were more clustered, with "diagnosis error" making up one third of claims. The ratios of condition-specific negligent event claims relative to the frequency of those conditions in primary care revealed a significantly disproportionate risk for a number of conditions (for example, appendicitis was 25 times more likely to generate a claim for negligence than breast cancer). Conclusions: Claims data identify conditions and processes where primary health care in the United States is prone to go awry. The burden of severe outcomes and death from malpractice claims made against primary care physicians was greater in primary care outpatient settings than in hospitals. Although these data enhance information about error-related negligent events in primary care, particularly when combined with other primary care data, there are many operating limitations. PMID:15069219

  13. The Argos-CLS Kalman Filter: Error Structures and State-Space Modelling Relative to Fastloc GPS Data.

    PubMed

    Lowther, Andrew D; Lydersen, Christian; Fedak, Mike A; Lovell, Phil; Kovacs, Kit M

    2015-01-01

    Understanding how an animal utilises its surroundings requires its movements through space to be described accurately. Satellite telemetry is the only means of acquiring movement data for many species; however, the data are prone to varying amounts of spatial error. The recent application of state-space models (SSMs) to the location estimation problem has provided a means to incorporate spatial errors when characterising animal movements. The predominant platform for collecting satellite telemetry data on free-ranging animals, Service Argos, recently provided an alternative Doppler location estimation algorithm that is purported to be more accurate and to generate a greater number of locations than its predecessor. We provide a comprehensive assessment of the performance of this new estimation process on data from free-ranging animals, relative to concurrently collected Fastloc GPS data. Additionally, we test the efficacy of three readily available SSMs in predicting the movement of two focal animals. Raw Argos location estimates generated by the new algorithm were greatly improved compared to the old system. Approximately twice as many Argos locations as GPS locations were derived on the devices used. Root Mean Square Errors (RMSE) for each optimal SSM were less than 4.25 km, with some producing RMSE of less than 2.50 km. Differences in the biological plausibility of the tracks between the two focal animals used to investigate the utility of SSMs highlight the importance of considering animal behaviour in movement studies. The ability to reprocess Argos data collected since 2008 with the new algorithm should permit questions of animal movement to be revisited at a finer resolution.
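
    The state-space machinery underneath such models can be shown in miniature with a 1-D constant-velocity Kalman filter over synthetic, error-prone fixes (illustrative only; the compared SSMs are more elaborate):

      import numpy as np

      rng = np.random.default_rng(0)
      truth = np.cumsum(rng.normal(0.5, 0.1, 100))     # true track (km)
      obs = truth + rng.normal(0, 3.0, truth.size)     # noisy location fixes

      x = np.array([obs[0], 0.0])                      # position, velocity
      P = np.eye(2) * 10.0
      F = np.array([[1.0, 1.0], [0.0, 1.0]])           # unit time step
      Q = np.diag([0.05, 0.01])
      H = np.array([[1.0, 0.0]])
      R = np.array([[9.0]])                            # observation variance

      est = []
      for z in obs:
          x, P = F @ x, F @ P @ F.T + Q                 # predict
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
          x = x + (K @ (np.atleast_1d(z) - H @ x)).ravel()
          P = (np.eye(2) - K @ H) @ P                   # update
          est.append(x[0])

      rmse = np.sqrt(np.mean((np.array(est) - truth) ** 2))
      print(f"raw error {np.std(obs - truth):.2f} km -> filtered {rmse:.2f} km")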

  14. Neutrophil elastase and proteinase 3 trafficking routes in myelomonocytic cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaellquist, Linda; Rosen, Hanna; Nordenfelt, Pontus

    2010-11-15

    Neutrophil elastase (NE) and proteinase 3 (PR3) differ in intracellular localization, which may reflect different trafficking mechanisms of the precursor forms when synthesized at immature stages of neutrophils. To shed further light on these mechanisms, we compared the trafficking of precursor NE (proNE) and precursor PR3 (proPR3). Like proNE [1], proPR3 interacted with CD63 upon heterologous co-expression in COS cells, but endogenous interaction was not detected, although cell surface proNE/proPR3/CD63 were co-endocytosed in myelomonocytic cells. Cell surface proNE/proPR3 turned over more rapidly than cell surface CD63, consistent with processing/degradation of the pro-proteases but recycling of CD63. Colocalization of proNE/proPR3/CD63 with clathrin and Rab 7 suggested trafficking through coated vesicles and late endosomes. Partial caveolar trafficking of proNE/CD63 but not proPR3 was suggested by colocalization with caveolin-1. Blocking the C-terminus of proNE/proPR3 by creating a fusion with FK506 binding protein inhibited endosomal re-uptake of proNE but not proPR3, indicating 'proC'-peptide-dependent structural/conformational requirements for proNE but not for proPR3 endocytosis. The NE amino-acid residue Y199 of a proposed NE sorting motif that interacts with AP-3 [2] was not required for proNE processing, sorting or endocytosis in rat basophilic leukemia (RBL) cells expressing heterologous Y199-deleted proNE; this suggests operation of another AP-3 link for proNE targeting. Our results show intracellular multi-step trafficking to be different between proNE and proPR3, consistent with their differential subcellular NE/PR3 localization in neutrophils.

  15. Arousal from sleep pathways are affected by the prone sleeping position and preterm birth: preterm birth, prone sleeping and arousal from sleep.

    PubMed

    Richardson, Heidi L; Horne, Rosemary S C

    2013-09-01

    Preterm infants exhibit depressed arousability from sleep when compared with term infants. As the final cortical element of the arousal process may be the most critical for survival, we hypothesized that the increased vulnerability of preterm infants to the Sudden Infant Death Syndrome (SIDS) could be explained by depressed cortical arousal (CA) responses. We evaluated the effects of preterm birth on stimulus-induced arousal processes in both the prone and supine sleeping positions. Ten healthy preterm infants were studied with daytime polysomnography, in both supine and prone sleeping positions, at 36 weeks gestational age, 2-4 weeks, 2-3 months and 5-6 months post-term corrected age. Sub-cortical activations and cortical arousals (CAs) were expressed as proportions of total arousal responses. Preterm data were compared with data from 13 healthy term infants studied at the same corrected ages. In preterm infants increased CAs were observed in the prone position at all ages studied. Compared to term infants, preterm infants had significantly fewer CAs in quiet sleep (QS) when prone at 2-3 months of age and more CAs when prone at 2-4 weeks in active sleep (AS). There were no differences in either sleep state when infants slept supine. Prone sleeping promoted CA responses in healthy preterm infants throughout the first six months of post-term age. We have previously suggested that in term infants enhanced CA represents a critical protection against a potentially harmful situation; we speculate that for preterm-born infants the need for this protection is greater than in term infants. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.

  16. A novel body frame based approach to aerospacecraft attitude tracking.

    PubMed

    Ma, Carlos; Chen, Michael Z Q; Lam, James; Cheung, Kie Chung

    2017-09-01

    In the common practice of designing an attitude tracker for an aerospacecraft, one transforms the Newton-Euler rotation equations to obtain the dynamic equations of some chosen inertial-frame-based attitude metrics, such as Euler angles and unit quaternions. A Lyapunov approach is then used to design a controller which ensures asymptotic convergence of the attitude to the desired orientation. Although this design methodology is fairly standard, it usually involves singularity-prone coordinate transformations which complicate the analysis and controller design. A new, singularity-free error feedback method is proposed in this paper to provide simple and intuitive stability analysis and controller synthesis. This new body-frame-based method utilizes the concept of the Euler axis and angle to generate the smallest error angles from a body frame perspective, without coordinate transformations. Global tracking convergence is illustrated with the use of a feedback-linearizing PD tracker, a sliding mode controller, and a model reference adaptive controller. Experimental results are also obtained on a quadrotor platform with unknown system parameters and disturbances, using a boundary-layer-approximated sliding mode controller, a PIDD controller, and a unit sliding mode controller. Significant tracking quality is attained. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
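
    The body-frame error construction reduces to extracting the Euler axis and angle of the rotation taking the current attitude to the desired one; a minimal sketch (our code, not the paper's controllers; the angle-near-pi case would need a separate branch):

      import numpy as np

      def axis_angle_error(R, R_d):
          E = R.T @ R_d                      # error rotation in the body frame
          c = np.clip((np.trace(E) - 1) / 2, -1.0, 1.0)
          angle = np.arccos(c)
          if np.isclose(angle, 0.0):
              return np.zeros(3), 0.0        # already aligned
          axis = np.array([E[2, 1] - E[1, 2],
                           E[0, 2] - E[2, 0],
                           E[1, 0] - E[0, 1]]) / (2 * np.sin(angle))
          return axis, angle                 # smallest rotation to the goal

      def rot_z(a):
          return np.array([[np.cos(a), -np.sin(a), 0],
                           [np.sin(a), np.cos(a), 0],
                           [0, 0, 1]])

      axis, angle = axis_angle_error(rot_z(0.1), rot_z(0.5))
      print(axis, np.degrees(angle))   # z axis, 22.9 deg; no Euler singularity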

  17. Thoughtflow: Standards and Tools for Provenance Capture and Workflow Definition to Support Model-Informed Drug Discovery and Development.

    PubMed

    Wilkins, J J; Chan, Pls; Chard, J; Smith, G; Smith, M K; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, M L; Wang, E; Watson, E; Wolstencroft, K; Cheung, Sya

    2017-05-01

    Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error-prone, and time-consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model-informed drug discovery and development (MID3), as well as to support reproducibility: "Thoughtflow." A prototype software implementation is provided. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  18. Evolution of gossip-based indirect reciprocity on a bipartite network

    PubMed Central

    Giardini, Francesca; Vilone, Daniele

    2016-01-01

    Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission. PMID:27885256

  19. Evolution of gossip-based indirect reciprocity on a bipartite network.

    PubMed

    Giardini, Francesca; Vilone, Daniele

    2016-11-25

    Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission.

  20. Evolution of gossip-based indirect reciprocity on a bipartite network

    NASA Astrophysics Data System (ADS)

    Giardini, Francesca; Vilone, Daniele

    2016-11-01

    Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission.
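
    A toy simulation (our illustration, not the authors' model) makes the role of transmission error concrete: reputations are updated through gossip that is flipped with some probability, and the fraction of agents whose reputation still matches their true strategy is a crude measure of how reliably gossip identifies defectors.

        import random

        def gossip_accuracy(n_agents=50, n_rounds=5000, error_rate=0.1, seed=1):
            """Fraction of agents whose reputation matches their strategy."""
            rng = random.Random(seed)
            cooperator = [rng.random() < 0.5 for _ in range(n_agents)]
            reputation = [True] * n_agents           # everyone starts 'good'
            for _ in range(n_rounds):
                actor = rng.randrange(n_agents)
                observed = cooperator[actor]         # gossiper sees true action
                if rng.random() < error_rate:        # noisy transmission
                    observed = not observed
                reputation[actor] = observed         # shared reputation updated
            return sum(r == c for r, c in zip(reputation, cooperator)) / n_agents

        for err in (0.0, 0.1, 0.3):
            print(err, gossip_accuracy(error_rate=err))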

  1. Correlating behavioral responses to FMRI signals from human prefrontal cortex: examining cognitive processes using task analysis.

    PubMed

    DeSouza, Joseph F X; Ovaysikia, Shima; Pynn, Laura

    2012-06-20

    The aim of this methods paper is to describe how to implement a neuroimaging technique to examine complementary brain processes engaged by two similar tasks. Participants' behavior during task performance in an fMRI scanner can then be correlated with brain activity using the blood-oxygen-level-dependent signal. We measure behavior so that trials on which the subject performed the task correctly can be sorted out, and the brain signals related to correct performance examined. Conversely, if trials on which subjects did not perform the task correctly were included in the same analysis as the correct trials, the analysis would be contaminated by trials that do not reflect correct performance. In many cases these error trials can themselves be correlated with brain activity. We describe two complementary tasks that are used in our lab to examine the brain during suppression of an automatic response: the Stroop(1) and anti-saccade tasks. The emotional Stroop paradigm instructs participants to report either the superimposed emotional 'word' across the affective faces or the facial 'expression' of the face stimuli(1,2). When the word and the facial expression refer to different emotions, a conflict arises between what must be said and what is automatically read. The participant has to resolve the conflict between the two simultaneously competing processes of word reading and facial expression recognition. Our urge to read out a word leads to strong stimulus-response (SR) associations; hence inhibiting these strong SR associations is difficult, and participants are prone to making errors. Overcoming this conflict and directing attention away from the face or the word requires the subject to inhibit bottom-up processes, which typically direct attention to the more salient stimulus. Similarly, in the anti-saccade task(3,4,5,6), an instruction cue directs attention to a peripheral stimulus location, but the eye movement must then be made to the mirror-opposite position. Again we measure behavior, by recording participants' eye movements, which allows the behavioral responses to be sorted into correct and error trials(7) that can then be correlated with brain activity. Neuroimaging thus allows researchers to measure the different behaviors of correct and error trials that are indicative of different cognitive processes, and to pinpoint the different neural networks involved.
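
    The trial-sorting step described above reduces to a few lines of bookkeeping. The sketch below is illustrative only; the file layout and column names (onset_s, response, expected) are hypothetical stand-ins for whatever the behavioral logging produces.

        import csv

        def sort_trials(behavior_file):
            """Split trial onsets into correct and error lists."""
            correct, errors = [], []
            with open(behavior_file, newline="") as f:
                for trial in csv.DictReader(f):
                    onset = float(trial["onset_s"])   # onset in scanner time
                    if trial["response"] == trial["expected"]:
                        correct.append(onset)
                    else:
                        errors.append(onset)
            return correct, errors

        # correct_onsets, error_onsets = sort_trials("antisaccade_run1.csv")
        # Each onset list is convolved with a hemodynamic response function
        # to form separate 'correct' and 'error' regressors in the GLM.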

  2. ADEPT, a dynamic next generation sequencing data error-detection program with trimming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Shihai; Lo, Chien-Chi; Li, Po-E

    Illumina is the most widely used next generation sequencing technology and produces millions of short reads that contain errors. These sequencing errors constitute a major problem in applications such as de novo genome assembly, metagenomics analysis and single nucleotide polymorphism discovery. In this study, we present ADEPT, a dynamic error detection method based on the quality scores of each nucleotide and its neighboring nucleotides, together with their positions within the read, which are compared to the position-specific quality score distribution of all bases within the sequencing run. This method greatly improves upon other available methods in terms of the true positive rate of error discovery without affecting the false positive rate, particularly within the middle of reads. We conclude that ADEPT is the only tool to date that dynamically assesses errors within reads by comparing position-specific and neighboring base quality scores with the distribution of quality scores for the dataset being analyzed. The result is a method that is less prone to position-dependent under-prediction, which is one of the most prominent issues in error prediction. The outcome is that ADEPT improves upon prior efforts in identifying true errors, primarily within the middle of reads, while reducing the false positive rate.

  3. ADEPT, a dynamic next generation sequencing data error-detection program with trimming

    DOE PAGES

    Feng, Shihai; Lo, Chien-Chi; Li, Po-E; ...

    2016-02-29

    Illumina is the most widely used next generation sequencing technology and produces millions of short reads that contain errors. These sequencing errors constitute a major problem in applications such as de novo genome assembly, metagenomics analysis and single nucleotide polymorphism discovery. In this study, we present ADEPT, a dynamic error detection method based on the quality scores of each nucleotide and its neighboring nucleotides, together with their positions within the read, which are compared to the position-specific quality score distribution of all bases within the sequencing run. This method greatly improves upon other available methods in terms of the true positive rate of error discovery without affecting the false positive rate, particularly within the middle of reads. We conclude that ADEPT is the only tool to date that dynamically assesses errors within reads by comparing position-specific and neighboring base quality scores with the distribution of quality scores for the dataset being analyzed. The result is a method that is less prone to position-dependent under-prediction, which is one of the most prominent issues in error prediction. The outcome is that ADEPT improves upon prior efforts in identifying true errors, primarily within the middle of reads, while reducing the false positive rate.
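
    The core idea, comparing a base's quality (together with its neighbors') against the run-wide, position-specific quality distribution, can be sketched as below. This is an illustration of the principle, not ADEPT's actual algorithm or thresholds.

        import statistics

        def flag_errors(reads_quals, z_cut=-2.0):
            """Flag bases whose quality is unusually low for their position.

            reads_quals: equal-length lists of Phred scores, one per read.
            Returns (read_index, position) pairs flagged as likely errors.
            Illustrative cutoff; ADEPT's weighting scheme is more involved.
            """
            n_pos = len(reads_quals[0])
            # Position-specific quality distribution over the whole run
            mean = [statistics.mean(r[p] for r in reads_quals) for p in range(n_pos)]
            sd = [statistics.pstdev(r[p] for r in reads_quals) or 1.0
                  for p in range(n_pos)]
            flagged = set()
            for i, quals in enumerate(reads_quals):
                for p in range(n_pos):
                    window = quals[max(0, p - 1):p + 2]   # base plus neighbors
                    z = (sum(window) / len(window) - mean[p]) / sd[p]
                    if z < z_cut:
                        flagged.add((i, p))
            return flagged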

  4. Medication Safety Systems and the Important Role of Pharmacists.

    PubMed

    Mansur, Jeannell M

    2016-03-01

    Preventable medication-related adverse events continue to occur in the healthcare setting. While the Institute of Medicine's To Err is Human, published in 2000, highlighted the contribution of medical and medication-related errors to patient morbidity and mortality, there has not been significant documented progress in addressing system contributors to medication errors. The lack of progress may be related to the myriad of pharmaceutical options now available and the nuances of optimizing drug therapy to achieve desired outcomes and prevent undesirable outcomes. However, on a broader scale, there may be opportunities to focus on the design and performance of the many processes that are part of the medication system. Errors may occur in the storage, prescribing, transcription, preparation and dispensing, or administration and monitoring of medications. Each of these nodes of the medication system, with its many components, is prone to failure, resulting in harm to patients. The pharmacist is uniquely trained to be able to impact medication safety at the individual patient level through the medication management skills that are part of the clinical pharmacist's role, but also to analyze the performance of medication processes and to lead redesign efforts to mitigate drug-related outcomes that may cause harm. One population that can benefit from a focus on medication safety through clinical pharmacy services and medication safety programs is the elderly, who are at risk for adverse drug events due to their many co-morbidities and the number of medications often used. This article describes medication safety systems and provides a blueprint for creating a foundation for medication safety programs within healthcare organizations. The specific role of pharmacists and clinical pharmacy services in medication safety is also discussed here and in other articles in this Theme Issue.

  5. A template-based approach for responsibility management in executable business processes

    NASA Astrophysics Data System (ADS)

    Cabanillas, Cristina; Resinas, Manuel; Ruiz-Cortés, Antonio

    2018-05-01

    Process-oriented organisations need to manage the different types of responsibilities their employees may have w.r.t. the activities involved in their business processes. Although several approaches provide support for responsibility modelling, in current Business Process Management Systems (BPMS) the only responsibility considered at runtime is the one related to performing the work required for activity completion. Others, like accountability or consultation, must be implemented by manually adding activities to the executable process model, which is time-consuming and error-prone. In this paper, we address this limitation by enabling current BPMS to execute processes in which people with different responsibilities interact to complete the activities. We introduce a metamodel based on Responsibility Assignment Matrices (RAM) to model the responsibility assignment for each activity, and a flexible template-based mechanism that automatically transforms such information into BPMN elements, which can be interpreted and executed by a BPMS. Thus, our approach does not enforce any specific behaviour for the different responsibilities, but new templates can be modelled to specify the interaction that best suits the activity requirements. Furthermore, libraries of templates can be created and reused in different processes. We provide a reference implementation and build a library of templates for a well-known set of responsibilities.
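
    As an illustration of the template mechanism (the names and expansions below are ours, not the paper's BPMN fragments), a RAM row for one activity can be expanded into concrete steps per responsibility:

        TEMPLATES = {
            # Illustrative expansions; in the paper these are BPMN fragments
            # that a BPMS can interpret and execute.
            "responsible": ["Perform {activity}"],
            "accountable": ["Review and approve {activity}"],
            "consulted":   ["Provide input on {activity}"],
        }

        def expand(activity, assignments):
            """assignments: {role: responsibility} from one RAM row."""
            return [(role, t.format(activity=activity))
                    for role, resp in assignments.items()
                    for t in TEMPLATES[resp]]

        print(expand("Risk assessment",
                     {"Analyst": "responsible",
                      "Manager": "accountable",
                      "Legal": "consulted"}))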

  6. Dietary Assessment in Food Environment Research

    PubMed Central

    Kirkpatrick, Sharon I.; Reedy, Jill; Butler, Eboneé N.; Dodd, Kevin W.; Subar, Amy F.; Thompson, Frances E.; McKinnon, Robin A.

    2015-01-01

    Context The existing evidence on food environments and diet is inconsistent, potentially due in part to heterogeneity in measures used to assess diet. The objective of this review, conducted in 2012–2013, was to examine measures of dietary intake utilized in food environment research. Evidence acquisition Included studies were published from January 2007 through June 2012 and assessed relationships between at least one food environment exposure and at least one dietary outcome. Fifty-one articles were identified using PubMed, Scopus, Web of Knowledge, and PsycINFO; references listed in the papers reviewed and relevant review articles; and the National Cancer Institute's Measures of the Food Environment website. The frequency of the use of dietary intake measures and assessment of specific dietary outcomes was examined, as were patterns of results among studies using different dietary measures. Evidence synthesis The majority of studies used brief instruments, such as screeners or one or two questions, to assess intake. Food frequency questionnaires were used in about a third of studies, one in ten used 24-hour recalls, and fewer than one in twenty used diaries. Little consideration of dietary measurement error was evident. Associations between the food environment and diet were more consistently in the expected direction in studies using less error-prone measures. Conclusions There is a tendency toward the use of brief dietary assessment instruments with low cost and burden rather than more detailed instruments that capture intake with less bias. Use of error-prone dietary measures may lead to spurious findings and reduced power to detect associations. PMID:24355678

  7. Variations of Human DNA Polymerase Genes as Biomarkers of Prostate Cancer Progression

    DTIC Science & Technology

    2013-07-01

    [Fragment of a DTIC report documentation page; only partial text is recoverable.] Subject terms: ...discovery, cancer genetics. Recoverable findings: characterization of the polymerase variations identified (including all single and double mutant combinations of the Triple mutant), and some POLK mutants; discovery of a novel... Presentations: Athens, Greece, 07/10; Makridakis N., "Error-prone polymerase mutations and prostate cancer progression," COBRE/Cancer Genetics group seminar, Tulane.

  8. The expanding polymerase universe.

    PubMed

    Goodman, M F; Tippin, B

    2000-11-01

    Over the past year, the number of known prokaryotic and eukaryotic DNA polymerases has exploded. Many of these newly discovered enzymes copy aberrant bases in the DNA template over which 'respectable' polymerases fear to tread. The next step is to unravel their functions, which are thought to range from error-prone copying of DNA lesions, somatic hypermutation and avoidance of skin cancer, to restarting stalled replication forks and repairing double-stranded DNA breaks.

  9. Error rates and resource overheads of encoded three-qubit gates

    NASA Astrophysics Data System (ADS)

    Takagi, Ryuji; Yoder, Theodore J.; Chuang, Isaac L.

    2017-10-01

    A non-Clifford gate is required for universal quantum computation, and, typically, this is the most error-prone and resource-intensive logical operation on an error-correcting code. Small, single-qubit rotations are popular choices for this non-Clifford gate, but certain three-qubit gates, such as Toffoli or controlled-controlled-Z (ccz), are equivalent options that are also more suited for implementing some quantum algorithms, for instance, those with coherent classical subroutines. Here, we calculate error rates and resource overheads for implementing logical ccz with pieceable fault tolerance, a nontransversal method for implementing logical gates. We provide a comparison with a nonlocal magic-state scheme on a concatenated code and a local magic-state scheme on the surface code. We find the pieceable fault-tolerance scheme particularly advantaged over magic states on concatenated codes and in certain regimes over magic states on the surface code. Our results suggest that pieceable fault tolerance is a promising candidate for fault tolerance in a near-future quantum computer.

  10. CTF Preprocessor User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avramova, Maria; Salko, Robert K.

    2016-05-26

    This document describes how a user should go about using the CTF pre-processor tool to create an input deck for modeling rod-bundle geometry in CTF. The tool was designed to generate input decks in a quick and less error-prone manner for CTF. The pre-processor is a completely independent utility, written in Fortran, that takes a reduced amount of input from the user. The information that the user must supply is basic information on bundle geometry, such as rod pitch, clad thickness, and axial location of spacer grids--the pre-processor takes this basic information and determines channel placement and connection information to be written to the input deck, which is the most time-consuming and error-prone segment of creating a deck. Creation of the model is also more intuitive, as the user can specify assembly and water-tube placement using visual maps instead of having to place them by determining channel/channel and rod/channel connections. As an example of the benefit of the pre-processor, a quarter-core model that contains 500,000 scalar-mesh cells was read into CTF from an input deck containing 200,000 lines of data. This 200,000 line input deck was produced automatically from a set of pre-processor decks that contained only 300 lines of data.
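
    The channel-placement bookkeeping the pre-processor automates can be pictured with a toy example (ours, not the tool's actual algorithm): in a square n x n rod array, coolant subchannels sit on the (n+1) x (n+1) lattice of gaps between rods, and each pair of laterally adjacent subchannels is connected.

        def subchannel_connections(n_rods):
            """Toy channel-connection tabulation for a square bundle.

            Illustrative only -- the real pre-processor also handles guide
            tubes, water tubes, and partial edge/corner channels.
            """
            n = n_rods + 1
            chan = lambda i, j: i * n + j + 1          # 1-based channel IDs
            connections = []
            for i in range(n):
                for j in range(n):
                    if j + 1 < n:
                        connections.append((chan(i, j), chan(i, j + 1)))
                    if i + 1 < n:
                        connections.append((chan(i, j), chan(i + 1, j)))
            return connections

        # A 17x17 assembly already yields 18*18 = 324 channels and
        # 2*18*17 = 612 lateral connections -- exactly the tabulation that
        # is tedious and error-prone to produce by hand.
        print(len(subchannel_connections(17)))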

  11. Overconfidence across the psychosis continuum: a calibration approach.

    PubMed

    Balzan, Ryan P; Woodward, Todd S; Delfabbro, Paul; Moritz, Steffen

    2016-11-01

    An 'overconfidence in errors' bias has been consistently observed in people with schizophrenia relative to healthy controls; however, the bias is seldom found to be associated with delusional ideation. Using a more precise confidence-accuracy calibration measure of overconfidence, the present study aimed to explore whether the overconfidence bias is greater in people with higher delusional ideation. A sample of 25 participants with schizophrenia and 50 non-clinical controls (25 high- and 25 low-delusion-prone) completed 30 difficult trivia questions (accuracy <75%); 15 'half-scale' items required participants to indicate their level of confidence for accuracy, and the remaining 'confidence-range' items asked participants to provide lower/upper bounds within which they were 80% confident the true answer lay. There was a trend towards higher overconfidence for half-scale items in the schizophrenia and high-delusion-prone groups, which reached statistical significance for confidence-range items. However, accuracy was particularly low in the two delusional groups, and a significant negative correlation between clinical delusional scores and overconfidence was observed for half-scale items within the schizophrenia group. Evidence in support of an association between overconfidence and delusional ideation was therefore mixed. Inflated confidence-accuracy miscalibration in the two delusional groups may be better explained by their greater unawareness of their underperformance than by genuinely inflated overconfidence in errors.

  12. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing along with the increased availability of cheap storage have led to the necessity of elaborating and transforming large volumes of open-source data, all in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out down the chain. Second, the amount of reprocessing often needs to be minimized in order to optimize the usage of limited resources. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
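
    The two requirements can be made concrete with a deliberately simple sketch (not the paper's system): a linear pipeline in which a human correction invalidates only the downstream stages, so reprocessing is both complete and minimal.

        PIPELINE = ["ingest", "extract", "translate", "aggregate", "report"]

        def stages_to_rerun(corrected_stage):
            """Stages that must be reprocessed after a human correction.

            Everything downstream is invalidated (completeness); nothing
            upstream is touched (minimal reprocessing).
            """
            idx = PIPELINE.index(corrected_stage)
            return PIPELINE[idx + 1:]

        print(stages_to_rerun("extract"))   # ['translate', 'aggregate', 'report']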

  13. RD Optimized, Adaptive, Error-Resilient Transmission of MJPEG2000-Coded Video over Multiple Time-Varying Channels

    NASA Astrophysics Data System (ADS)

    Bezan, Scott; Shirani, Shahram

    2006-12-01

    To reliably transmit video over error-prone channels, the data should be both source and channel coded. When multiple channels are available for transmission, the problem extends to that of partitioning the data across these channels. The condition of transmission channels, however, varies with time. Therefore, the error protection added to the data at one instant of time may not be optimal at the next. In this paper, we propose a method for adaptively adding error correction coding to an MJPEG2000 constant-rate-coded frame of video in a rate-distortion (RD) optimized manner, using rate-compatible punctured convolutional codes. We perform an analysis of the rate-distortion tradeoff of each of the coding units (tiles and packets) in each frame and adapt the error correction code assigned to each unit, taking into account the bandwidth and error characteristics of the channels. This method is applied to both single and multiple time-varying channel environments. We compare our method with a basic protection method in which data is either not transmitted, transmitted with no protection, or transmitted with a fixed amount of protection. Simulation results show promising performance for our proposed method.
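
    One simple way to picture RD-optimized protection assignment (a greedy sketch under our own assumptions, not the paper's optimization) is to start every coding unit at the weakest protection and repeatedly buy the upgrade with the best distortion reduction per extra bit until the budget is spent:

        def allocate_protection(units, budget):
            """Greedy RD allocation sketch.

            units: per coding unit, a list of (bits, expected_distortion)
            options ordered from weakest to strongest protection under the
            current channel estimate. Returns the chosen option index per
            unit within the total bit budget.
            """
            choice = [0] * len(units)                     # start weakest
            spent = sum(opts[0][0] for opts in units)
            while True:
                best, best_gain, best_extra = None, 0.0, 0
                for i, opts in enumerate(units):
                    k = choice[i]
                    if k + 1 < len(opts):
                        extra = opts[k + 1][0] - opts[k][0]
                        gain = (opts[k][1] - opts[k + 1][1]) / max(extra, 1)
                        if spent + extra <= budget and gain > best_gain:
                            best, best_gain, best_extra = i, gain, extra
                if best is None:
                    return choice
                choice[best] += 1
                spent += best_extra

        # units = [[(100, 9.0), (150, 4.0)], [(100, 5.0), (180, 4.5)]]
        # allocate_protection(units, budget=260)  ->  [1, 0]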

  14. Dysmorphic concern is related to delusional proneness and negative affect in a community sample.

    PubMed

    Keating, Charlotte; Thomas, Neil; Stephens, Jessie; Castle, David J; Rossell, Susan L

    2016-06-30

    Body image concerns are common in the general population and in some mental illnesses reach pathological levels. We investigated whether dysmorphic concern with appearance (a preoccupation with minor or imagined defects in appearance) is explained by psychotic processes in a community sample. In a cross-sectional design, two hundred and twenty-six participants completed an online survey battery including: the Dysmorphic Concern Questionnaire; the Peters et al. Delusion Inventory; the Aberrant Salience Inventory; and the Depression, Anxiety, Stress Scale. Participants were native English speakers residing in Australia. Dysmorphic concern was positively correlated with delusional proneness, aberrant salience and negative emotion. Regression established that negative emotion and delusional proneness predicted dysmorphic concern, whereas aberrant salience did not. Although delusional proneness was related to body dysmorphia, there was no evidence that it was related to aberrant salience. Understanding the contribution of other psychosis processes, and other health-related variables, to the severity of dysmorphic concern will be a focus of future research. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. Local Minima Free Parameterized Appearance Models

    PubMed Central

    Nguyen, Minh Hoai; De la Torre, Fernando

    2010-01-01

    Parameterized Appearance Models (PAMs) (e.g. Eigentracking, Active Appearance Models, Morphable Models) are commonly used to model the appearance and shape variation of objects in images. While PAMs have numerous advantages relative to alternate approaches, they have at least two drawbacks. First, they are especially prone to local minima in the fitting process. Second, often few if any of the local minima of the cost function correspond to acceptable solutions. To solve these problems, this paper proposes a method to learn a cost function by explicitly optimizing that the local minima occur at and only at the places corresponding to the correct fitting parameters. To the best of our knowledge, this is the first paper to address the problem of learning a cost function to explicitly model local properties of the error surface to fit PAMs. Synthetic and real examples show improvement in alignment performance in comparison with traditional approaches. PMID:21804750

  16. Splash: a software tool for stereotactic planning of recording chamber placement and electrode trajectories.

    PubMed

    Sperka, Daniel J; Ditterich, Jochen

    2011-01-01

    While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories.

  17. Splash: A Software Tool for Stereotactic Planning of Recording Chamber Placement and Electrode Trajectories

    PubMed Central

    Sperka, Daniel J.; Ditterich, Jochen

    2011-01-01

    While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories. PMID:21472085

  18. Mining Genotype-Phenotype Associations from Public Knowledge Sources via Semantic Web Querying.

    PubMed

    Kiefer, Richard C; Freimuth, Robert R; Chute, Christopher G; Pathak, Jyotishman

    2013-01-01

    Gene Wiki Plus (GeneWiki+; GWP) and the Online Mendelian Inheritance in Man (OMIM) are publicly available resources for sharing information about disease-gene and gene-SNP associations in humans. While immensely useful to the scientific community, both resources are manually curated, thereby making the data entry and publication process time-consuming and, to some degree, error-prone. To this end, this study investigates Semantic Web technologies to validate existing and potentially discover new genotype-phenotype associations in GWP and OMIM. In particular, we demonstrate the applicability of SPARQL queries for identifying associations not explicitly stated for commonly occurring chronic diseases in GWP and OMIM, and report our preliminary findings for coverage, completeness, and validity of the associations. Our results highlight the benefits of Semantic Web querying technology for validating existing disease-gene associations as well as identifying novel associations, although further evaluation and analysis are required before such information can be applied and used effectively.
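
    The flavor of such querying can be shown with a tiny self-contained example using the rdflib Python library; the vocabulary and triples below are invented for illustration and are not the GWP or OMIM schemas. The query surfaces SNP-disease pairs that are never stated directly, via the shared gene:

        from rdflib import Graph, Namespace

        EX = Namespace("http://example.org/")        # hypothetical vocabulary
        g = Graph()
        g.add((EX.rs699,     EX.associatedWithGene,    EX.AGT))
        g.add((EX.AGT,       EX.associatedWithDisease, EX.Hypertension))
        g.add((EX.rs1801133, EX.associatedWithGene,    EX.MTHFR))

        # Infer SNP-disease links not explicitly stated, via the shared gene
        query = """
        PREFIX ex: <http://example.org/>
        SELECT ?snp ?disease WHERE {
            ?snp  ex:associatedWithGene    ?gene .
            ?gene ex:associatedWithDisease ?disease .
        }"""
        for row in g.query(query):
            print(row.snp, "->", row.disease)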

  19. Improving patient safety via automated laboratory-based adverse event grading.

    PubMed

    Niland, Joyce C; Stiller, Tracey; Neat, Jennifer; Londrc, Adina; Johnson, Dina; Pannoni, Susan

    2012-01-01

    The identification and grading of adverse events (AEs) during the conduct of clinical trials is a labor-intensive and error-prone process. This paper describes and evaluates a software tool developed by City of Hope to automate complex algorithms to assess laboratory results and identify and grade AEs. We compared AEs identified by the automated system with those previously assessed manually, to evaluate missed/misgraded AEs. We also conducted a prospective paired time assessment of automated versus manual AE assessment. We found a substantial improvement in accuracy/completeness with the automated grading tool, which identified an additional 17% of severe grade 3-4 AEs that had been missed/misgraded manually. The automated system also provided an average time saving of 5.5 min per treatment course. With 400 ongoing treatment trials at City of Hope and an average of 1800 laboratory results requiring assessment per study, the implications of these findings for patient safety are enormous.
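
    The grading logic at the heart of such a tool is a threshold lookup per laboratory test. The sketch below is purely illustrative: the cutoffs are simplified stand-ins, whereas a production system like the one described encodes the full NCI CTCAE criteria and the patient's baseline values.

        # test: list of (lower_bound, grade); values below a bound earn
        # that grade, checked from most severe upward. Example cutoffs only.
        GRADE_RULES = {
            "ANC_10e9_per_L":       [(0.5, 4), (1.0, 3), (1.5, 2)],
            "platelets_10e9_per_L": [(25.0, 4), (50.0, 3), (75.0, 2)],
        }

        def grade_result(test, value):
            """Return the AE grade for one lab result, or 0 if in range."""
            for cutoff, grade in sorted(GRADE_RULES[test]):
                if value < cutoff:
                    return grade
            return 0

        print(grade_result("ANC_10e9_per_L", 0.4))   # -> 4 (severe)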

  20. Interpretation of HCMM images: A regional study

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Potential users of HCMM data, especially those with only a cursory background in thermal remote sensing, are familiarized with the kinds of information contained in the images that can be extracted with some reliability solely from inspection of such standard products as those generated at NASA/GSFC and now archived in the National Space Science Data Center. Visual analysis of photoimagery is prone to various misimpressions and outright errors brought on by unawareness of the influence of physical factors as well as by sometimes misleading tonal patterns introduced during photoprocessing. The quantitative approach, which relies on computer processing of digital HCMM data, field measurements, and integration of rigorous mathematical models, can usually be used to identify, compensate for, or correct the contributions from at least some of the natural factors and those associated with photoprocessing. Color composite, day-IR, night-IR and visible images of California and Nevada are examined.

  1. Automated grain mapping using wide angle convergent beam electron diffraction in transmission electron microscope for nanomaterials.

    PubMed

    Kumar, Vineet

    2011-12-01

    The grain size statistics, commonly derived from the grain map of a material sample, are important microstructure characteristics that greatly influence its properties. The grain map for nanomaterials is usually obtained manually by visual inspection of transmission electron microscope (TEM) micrographs because automated methods do not perform satisfactorily. While the visual inspection method provides reliable results, it is a labor-intensive process and is often prone to human error. In this article, an automated grain mapping method is developed using TEM diffraction patterns. The presented method uses wide angle convergent beam diffraction in the TEM. The automated technique was applied to a platinum thin film sample to obtain the grain map and subsequently derive grain size statistics from it. The grain size statistics obtained with the automated method were found to be in good agreement with the visual inspection method.

  2. Effect of tif expression, irradiation of recipient and presence of plasmid pKM101 on recovery of a marker from a donor exposed to ultraviolet light prior to conjugation.

    PubMed

    von Wright, A; Bridges, B A

    1980-08-01

    To detect the effect of the postulated inducible error-prone repair system ('SOS repair') on the bacterial chromosome, an Hfr Escherichia coli strain JC5088 recA was u.v.-irradiated immediately before mating it with recipients in which SOS repair was supposed to be functioning through tif expression, u.v. irradiation or the presence of plasmid pKM101. The recombinant yields of these crosses were compared with those obtained in corresponding crosses with recipients in which SOS repair either was not induced or was totally eliminated by the lexA mutation. No difference in marker recovery efficiency could be detected between these two sets of recipients and thus no induced repair process acting on donor DNA could be demonstrated. The possible reasons for this finding are discussed.

  3. Reasoning in believers in the paranormal.

    PubMed

    Lawrence, Emma; Peters, Emmanuelle

    2004-11-01

    Reasoning biases have been identified in deluded patients, delusion-prone individuals, and believers in the paranormal. This study examined content-specific reasoning and delusional ideation in believers in the paranormal. A total of 174 members of the Society for Psychical Research completed a delusional ideation questionnaire and a deductive reasoning task. The reasoning statements were manipulated for congruency with paranormal beliefs. As predicted, individuals who reported a strong belief in the paranormal made more errors and displayed more delusional ideation than skeptical individuals. However, no differences were found with statements that were congruent with their belief system, confirming the domain-specificity of reasoning. This reasoning bias was limited to people who reported a belief in, rather than experience of, paranormal phenomena. These results suggest that reasoning abnormalities may have a causal role in the formation of unusual beliefs. The dissociation between experiences and beliefs implies that such abnormalities operate at the evaluative, rather than the perceptual, stage of processing.

  4. Systems modeling and simulation applications for critical care medicine

    PubMed Central

    2012-01-01

    Critical care delivery is a complex, expensive, error-prone medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) a pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study the interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718

  5. Generic method for automatic bladder segmentation on cone beam CT using a patient-specific bladder shape model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoot, A. J. A. J. van de, E-mail: a.j.schootvande@amc.uva.nl; Schooneveldt, G.; Wognum, S.

    Purpose: The aim of this study is to develop and validate a generic method for automatic bladder segmentation on cone beam computed tomography (CBCT), independent of gender and treatment position (prone or supine), using only pretreatment imaging data. Methods: Data of 20 patients, treated for tumors in the pelvic region with the entire bladder visible on CT and CBCT, were divided into four equally sized groups based on gender and treatment position. The full and empty bladder contours, which can be acquired with pretreatment CT imaging, were used to generate a patient-specific bladder shape model. This model was used to guide the segmentation process on CBCT. To obtain the bladder segmentation, the reference bladder contour was deformed iteratively by maximizing the cross-correlation between directional grey value gradients over the reference and CBCT bladder edge. To overcome incorrect segmentations caused by CBCT image artifacts, automatic adaptations were implemented. Moreover, locally incorrect segmentations could be adapted manually. After each adapted segmentation, the bladder shape model was expanded and new shape patterns were calculated for following segmentations. All available CBCTs were used to validate the segmentation algorithm. The bladder segmentations were validated by comparison with the manual delineations and the segmentation performance was quantified using the Dice similarity coefficient (DSC), surface distance error (SDE) and SD of contour-to-contour distances. Also, bladder volumes obtained by manual delineations and segmentations were compared using a Bland-Altman error analysis. Results: The mean DSC, mean SDE, and mean SD of contour-to-contour distances between segmentations and manual delineations were 0.87, 0.27 cm and 0.22 cm (female, prone), 0.85, 0.28 cm and 0.22 cm (female, supine), 0.89, 0.21 cm and 0.17 cm (male, supine) and 0.88, 0.23 cm and 0.17 cm (male, prone), respectively. Manual local adaptations improved the segmentation results significantly (p < 0.01) based on DSC (6.72%) and SD of contour-to-contour distances (0.08 cm) and decreased the 95% confidence intervals of the bladder volume differences. Moreover, expanding the shape model improved the segmentation results significantly (p < 0.01) based on DSC and SD of contour-to-contour distances. Conclusions: This patient-specific shape model based automatic bladder segmentation method on CBCT is accurate and generic. Our segmentation method only needs two pretreatment imaging data sets as prior knowledge, is independent of patient gender and patient treatment position and has the possibility to manually adapt the segmentation locally.
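
    For reference, the Dice similarity coefficient reported above is DSC = 2|A∩B|/(|A|+|B|) for binary masks A and B; a minimal NumPy version (our illustration) is:

        import numpy as np

        def dice(a, b):
            """Dice similarity coefficient between two binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # auto = segmentation > 0; manual = delineation > 0
        # dice(auto, manual) -> e.g. 0.87, the mean reported for female/prone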

  6. Evaluating team decision-making as an emergent phenomenon.

    PubMed

    Kinnear, John; Wilson, Nick; O'Dwyer, Anthony

    2018-04-01

    The complexity of modern clinical practice has highlighted the fallibility of individual clinicians' decision-making, with effective teamwork emerging as a key to patient safety. Dual process theory is widely accepted as a framework for individual decision-making, with type 1 processes responsible for fast, intuitive and automatic decisions and type 2 processes for slow, analytical decisions. However, dual process theory does not explain cognition at the group level, when individuals act in teams. Team cognition resulting from the dynamic interaction of individuals is said to be more resilient to decision-making error and greater than simply aggregated cognition. Clinicians were paired as teams and asked to solve a cognitive puzzle constructed as a drug calculation. The frequency at which the teams made incorrect decisions was compared with that of individual clinicians answering the same question. When clinicians acted in pairs, 63% answered the cognitive puzzle correctly, compared with 33% of clinicians as individuals, a statistically significant difference in performance (χ²(1, n=116) = 24.329, P < 0.001). Based on the predicted performance of teams made up of the random pairing of individuals who had the same propensity to answer as previously, there was no statistical difference between the actual and predicted team performance. Teams are less prone to decision-making errors than individuals. However, the improved performance is likely to be owing to the effect of aggregated cognition rather than any improved decision-making as a result of the interaction. There is no evidence of team cognition as an emergent and distinct entity. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
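
    The 'aggregated cognition' benchmark can be made concrete under one simple assumption (ours, not necessarily the authors' exact model): if each member answers correctly with independent probability p = 0.33 and a pair succeeds whenever at least one member does, the predicted team accuracy is 1 - (1 - p)^2.

        p_individual = 0.33                    # observed individual accuracy
        p_team = 1 - (1 - p_individual) ** 2   # at least one member correct
        print(round(p_team, 3))                # 0.551, of the same order as
                                               # the observed 63% for pairs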

  7. The High Altitude Pollution Program (1976-1982).

    DTIC Science & Technology

    1984-01-01

    [Fragment of a DTIC report; only partial text is recoverable.] ...ground, where air pollution problems arise due to ground-level emissions from, for example, automobiles and power plants) to about 25 km above the... downward and poleward. Near the ground, in areas such as cities prone to air pollution, ozone is produced by nitrogen dioxide photolysis and reaction... "Spectrophotometer Total Ozone Measurement Errors Caused by Interfering Absorbing Species Such as SO2, NO2 and Photochemically Produced O3 in Polluted Air," NOAA.

  8. Hematology of camelids.

    PubMed

    Vap, Linda; Bohn, Andrea A

    2015-01-01

    Interpretation of camelid hematology results is similar to that of other mammals. Obtaining accurate results and using appropriate reference intervals can be a bit problematic, particularly when evaluating the erythron. Camelid erythrocytes vary from other mammals in that they are small, flat, and elliptical. This variation makes data obtained from samples collected from these species prone to error when using some automated instruments. Normal and abnormal findings in camelid blood are reviewed as well as how to ensure accurate results.

  9. Coordinating Robot Teams for Disaster Relief

    DTIC Science & Technology

    2015-05-01

    [Fragment of a DTIC report; only partial text is recoverable.] ...eventually guide vehicles in cooperation with its Operator(s), but in this paper we assume static mission goals, a fixed number of vehicles, and a... is tedious and error-prone. Kress-Gazit et al. (2009) instead synthesize an FSA from an LTL specification using a game-theory approach (Bloem et al. ... helping an Operator coordinate a team of vehicles in disaster relief. Acknowledgements: thanks to OSD ASD (R&E) for sponsoring this research. The...

  10. Vienna Fortran - A Language Specification. Version 1.1

    DTIC Science & Technology

    1992-03-01

    [Fragment of a DTIC report; only partial text is recoverable.] ...other computer architectures is the fact that the memory is physically distributed among the processors; the time required to access a non-local... datum may be an order of magnitude higher than the time taken to access locally stored data. This has important consequences for program efficiency. In... machine in many aspects. It is tedious, time-consuming and error-prone. It has led to particularly slow software development cycles and, in consequence...

  11. Toward an Operational Definition of Workload: A Workload Assessment of Aviation Maneuvers

    DTIC Science & Technology

    2010-08-01

    [Fragment of a DTIC report; only partial text is recoverable.] ...and evaluated by the learner. With practice, the learner moves into the second phase, where optimal strategies are strengthened. The final stage of... The first phase demands a great amount of resources, as performance is slow and prone to errors. During this phase, strategies are being formulated... asked to assess the mental, physical, visual, aural, and verbal demands of each task. The new assessment is a cost-effective method of assessing workload...

  12. An Analysis of Misconceptions in Science Textbooks: Earth science in England and Wales

    NASA Astrophysics Data System (ADS)

    King, Chris John Henry

    2010-03-01

    Surveys of the earth science content of all secondary (high school) science textbooks and related publications used in England and Wales have revealed high levels of error/misconception. The 29 science textbooks or textbook series surveyed (51 texts in all) showed poor coverage of National Curriculum earth science and contained a mean level of one earth science error/misconception per page. Science syllabuses and examinations surveyed also showed errors/misconceptions. More than 500 instances of misconception were identified through the surveys. These were analysed for frequency, indicating that the areas of the earth science curriculum most prone to misconception are sedimentary processes/rocks, earthquakes/Earth's structure, and plate tectonics. For the 15 most frequent misconceptions, examples of quotes from the textbooks are given, together with the scientific consensus view, a discussion, and an example of a misconception of similar significance in another area of science. The misconceptions identified in the surveys are compared with those described in the literature. This indicates that the misconceptions found in college students and pre-service/practising science teachers are often also found in published materials, and therefore are likely to reinforce the misconceptions in teachers and their students. The analysis may also reflect the prevalence of earth science misconceptions in the UK secondary (high school) science-teaching population. The analysis and discussion provide the opportunity for writers of secondary science materials to improve their work on earth science and to provide a platform for improved teaching and learning of earth science in the future.

  13. Delusion proneness and emotion appraisal in individuals with high psychosis vulnerability.

    PubMed

    Szily, Erika; Kéri, Szabolcs

    2013-01-01

    Evidence suggests that emotional processes play an important role in the development of delusions. The aim of the present study was to investigate emotion appraisal in individuals with high and low psychosis proneness. We compared 30 individuals who experienced a transient psychotic episode followed by a complete remission with 30 healthy control volunteers. The participants received the Peters et al. Delusion Inventory (PDI) and Scherer's Emotion Appraisal Questionnaire. We also assessed IQ and the severity of depressive and anxiety symptoms. Results revealed that individuals with high psychosis proneness displayed increased PDI scores and more pronounced anxiety compared with individuals with low psychosis proneness. There was a specific pattern of emotion appraisal in individuals with high psychosis proneness. In the case of fear, they achieved higher scores for external causality and immorality, and lower scores for coping ability and self-esteem compared with individuals with low proneness. The PDI scores were weakly related to external causality (r = 0.41) and self-esteem (r = -0.37). In the case of sadness and joy, no emotion appraisal differences were found between participants with low and high proneness. These results suggest that individuals who have a history of psychotic breakdown and therefore exhibit high psychosis proneness display an altered appraisal of fear, emphasizing external circumstances, feeling less power to cope and experiencing low self-esteem. Patients remitted from a transient psychotic episode still exhibit milder forms of delusion proneness. Emotion appraisal for fear is related to delusion proneness. Clinicians should pay special attention to self-esteem and attribution biases in psychosis-prone individuals. Copyright © 2011 John Wiley & Sons, Ltd.

  14. Identification and correction of systematic error in high-throughput sequence data

    PubMed Central

    2011-01-01

    Background A feature common to all DNA sequencing technologies is the presence of base-call errors in the sequenced reads. The implications of such errors are application specific, ranging from minor informatics nuisances to major problems affecting biological inferences. Recently developed "next-gen" sequencing technologies have greatly reduced the cost of sequencing, but have been shown to be more error prone than previous technologies. Both position specific (depending on the location in the read) and sequence specific (depending on the sequence in the read) errors have been identified in Illumina and Life Technology sequencing platforms. We describe a new type of systematic error that manifests as statistically unlikely accumulations of errors at specific genome (or transcriptome) locations. Results We characterize and describe systematic errors using overlapping paired reads from high-coverage data. We show that such errors occur in approximately 1 in 1000 base pairs, and that they are highly replicable across experiments. We identify motifs that are frequent at systematic error sites, and describe a classifier that distinguishes heterozygous sites from systematic error. Our classifier is designed to accommodate data from experiments in which the allele frequencies at heterozygous sites are not necessarily 0.5 (such as in the case of RNA-Seq), and can be used with single-end datasets. Conclusions Systematic errors can easily be mistaken for heterozygous sites in individuals, or for SNPs in population analyses. Systematic errors are particularly problematic in low coverage experiments, or in estimates of allele-specific expression from RNA-Seq data. Our characterization of systematic error has allowed us to develop a program, called SysCall, for identifying and correcting such errors. We conclude that correction of systematic errors is important to consider in the design and interpretation of high-throughput sequencing experiments. PMID:22099972
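
    The statistical intuition behind flagging systematic error sites can be sketched with a binomial tail test (illustrative only; the SysCall classifier described here also uses sequence context and read-level features): a pile-up of mismatches far beyond what independent per-base errors would produce marks a suspect site.

        from math import comb

        def binom_sf(k, n, p):
            """P(X >= k) for X ~ Binomial(n, p)."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                       for i in range(k, n + 1))

        def is_suspect_site(mismatches, depth, per_base_error=0.01, alpha=1e-6):
            """Flag a site whose mismatch count is implausible under
            independent sequencing errors (hypothetical cutoff values)."""
            return binom_sf(mismatches, depth, per_base_error) < alpha

        print(is_suspect_site(mismatches=9, depth=60))   # True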

  15. Endoluminal surface registration for CT colonography using haustral fold matching

    PubMed Central

    Hampshire, Thomas; Roth, Holger R.; Helbren, Emma; Plumb, Andrew; Boone, Darren; Slabaugh, Greg; Halligan, Steve; Hawkes, David J.

    2013-01-01

    Computed Tomographic (CT) colonography is a technique used for the detection of bowel cancer or potentially precancerous polyps. The procedure is performed routinely with the patient both prone and supine to differentiate fixed colonic pathology from mobile faecal residue. Matching corresponding locations is difficult and time consuming for radiologists due to colonic deformations that occur during patient repositioning. We propose a novel method to establish correspondence between the two acquisitions automatically. The problem is first simplified by detecting haustral folds using a graph cut method applied to a curvature-based metric applied to a surface mesh generated from segmentation of the colonic lumen. A virtual camera is used to create a set of images that provide a metric for matching pairs of folds between the prone and supine acquisitions. Image patches are generated at the fold positions using depth map renderings of the endoluminal surface and optimised by performing a virtual camera registration over a restricted set of degrees of freedom. The intensity difference between image pairs, along with additional neighbourhood information to enforce geometric constraints over a 2D parameterisation of the 3D space, are used as unary and pair-wise costs respectively, and included in a Markov Random Field (MRF) model to estimate the maximum a posteriori fold labelling assignment. The method achieved fold matching accuracy of 96.0% and 96.1% in patient cases with and without local colonic collapse. Moreover, it improved upon an existing surface-based registration algorithm by providing an initialisation. The set of landmark correspondences is used to non-rigidly transform a 2D source image derived from a conformal mapping process on the 3D endoluminal surface mesh. This achieves full surface correspondence between prone and supine views and can be further refined with an intensity based registration showing a statistically significant improvement (p < 0.001), and decreasing mean error from 11.9 mm to 6.0 mm measured at 1743 reference points from 17 CTC datasets. PMID:23845949

  16. Hydrography for the non-Hydrographer: A Paradigm shift in Data Processing

    NASA Astrophysics Data System (ADS)

    Malzone, C.; Bruce, S.

    2017-12-01

    Advancements in technology have led to overall systematic improvements, including hardware design, software architecture, and data transmission/telepresence. Historically, utilization of this technology has required a high level of knowledge obtained through many years of experience, training and/or education. High training costs are incurred to achieve and maintain an acceptable level of proficiency within an organization. Recently, engineers have developed off-the-shelf software technology called Qimera that has simplified the processing of hydrographic data. The core technology is centered around the isolation of tasks within the workflow, capitalizing on advances in computing to automate the mundane, error-prone tasks and reserve for the human the stages where human judgment brings value. Key design features include: guided workflow, transcription automation, processing state management, real-time QA, dynamic workflow for validation, collaborative cleaning and production line processing. Since Qimera is designed to guide the user, it allows expedition leaders to focus on science while providing an educational opportunity for students to quickly learn the hydrographic processing workflow, including ancillary data analysis, trouble-shooting, calibration and cleaning. This paper provides case studies on how Qimera is currently implemented in scientific expeditions, the benefits of implementation, and how it is directing the future of on-board research for the non-hydrographer.

  17. Protocols for Image Processing based Underwater Inspection of Infrastructure Elements

    NASA Astrophysics Data System (ADS)

    O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; Pakrashi, Vikram

    2015-07-01

    Image processing can be an important tool for inspecting underwater infrastructure elements like bridge piers and pile wharves. Underwater inspection often relies on the visual descriptions of divers, who are not necessarily trained in the specifics of structural degradation, and the information may often be vague, prone to error, or open to significant variation of interpretation. Underwater vehicles, on the other hand, can be quite expensive to deal with for such inspections. Additionally, there is now significant encouragement globally towards the deployment of more offshore renewable wind turbines and wave devices, and the requirement for underwater inspection can be expected to increase significantly in the coming years. While the merit of image-processing-based assessment of the condition of underwater structures is understood to a certain degree, there is no existing protocol for such image-based methods. This paper discusses and describes an image processing protocol for underwater inspection of structures. A stereo imaging image processing method is considered in this regard and protocols are suggested for image storage, imaging, diving, and inspection. A combined underwater imaging protocol is finally presented which can be used for a variety of situations within a range of image scenes and environmental conditions affecting the imaging conditions. An example of detecting marine growth is presented of a structure in Cork Harbour, Ireland.

  18. Autism and psychosis expressions diametrically modulate the right temporoparietal junction.

    PubMed

    Abu-Akel, Ahmad M; Apperly, Ian A; Wood, Stephen J; Hansen, Peter C

    2017-10-01

    The mentalizing network is atypically activated in autism and schizophrenia spectrum disorders. While these disorders are considered diagnostically independent, expressions of both can co-occur in the same individual. We examined the concurrent effect of autism traits and psychosis proneness on the activity of the mentalizing network in 24 neurotypical adults while performing a social competitive game. Activations were observed in the paracingulate cortex and the right temporoparietal junction (rTPJ). Autism traits and psychosis proneness did not modulate activity within the paracingulate or the dorsal component of the rTPJ. However, diametric modulations of autism traits and psychosis proneness were observed in the posterior (rvpTPJ) and anterior (rvaTPJ) subdivisions of the ventral rTPJ, which respectively constitute core regions within the mentalizing and attention-reorienting networks. Within the rvpTPJ, increasing autism tendencies decreased activity, and increasing psychosis proneness increased activity. This effect was reversed within the rvaTPJ. We suggest that this results from an interaction between regions responsible for higher level social cognitive processing (rvpTPJ) and regions responsible for domain-general attentional processes (rvaTPJ). The observed diametric modulation of autism tendencies and psychosis proneness of neuronal activity within the mentalizing network highlights the importance of assessing both autism and psychosis expressions within the individual.

  19. Semiautomated model building for RNA crystallography using a directed rotameric approach.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2010-05-04

    Structured RNA molecules play essential roles in a variety of cellular processes; however, crystallographic studies of such RNA molecules present a large number of challenges. One notable complication arises from the low resolutions typical of RNA crystallography, which results in electron density maps that are imprecise and difficult to interpret. This problem is exacerbated by the lack of computational tools for RNA modeling, as many of the techniques commonly used in protein crystallography have no equivalents for RNA structure. This leads to difficulty and errors in the model building process, particularly in modeling of the RNA backbone, which is highly error prone due to the large number of variable torsion angles per nucleotide. To address this, we have developed a method for accurately building the RNA backbone into maps of intermediate or low resolution. This method is semiautomated, as it requires a crystallographer to first locate phosphates and bases in the electron density map. After this initial trace of the molecule, however, an accurate backbone structure can be built without further user intervention. To accomplish this, backbone conformers are first predicted using RNA pseudotorsions and the base-phosphate perpendicular distance. Detailed backbone coordinates are then calculated to conform both to the predicted conformer and to the previously located phosphates and bases. This technique is shown to produce accurate backbone structure even when starting from imprecise phosphate and base coordinates. A program implementing this methodology is currently available, and a plugin for the Coot model building program is under development.
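    The conformer-prediction step rests on RNA pseudotorsions, which reduce each nucleotide to its P and C4′ atoms. A minimal sketch of computing those angles, using the standard definitions η = C4′(i−1)–P(i)–C4′(i)–P(i+1) and θ = P(i)–C4′(i)–P(i+1)–C4′(i+1); this is illustrative only and is not the published tool or its Coot plugin.

    ```python
    import numpy as np

    def dihedral(p0, p1, p2, p3):
        """Torsion angle (degrees) defined by four 3D points."""
        b0, b1, b2 = p1 - p0, p2 - p1, p3 - p2
        b1 = b1 / np.linalg.norm(b1)
        # Project b0 and b2 onto the plane perpendicular to b1
        v = b0 - np.dot(b0, b1) * b1
        w = b2 - np.dot(b2, b1) * b1
        return np.degrees(np.arctan2(np.dot(np.cross(b1, v), w), np.dot(v, w)))

    def pseudotorsions(P, C4):
        """eta/theta pseudotorsions from per-nucleotide P and C4' coordinates
        (lists of numpy arrays, one per nucleotide, in chain order)."""
        angles = []
        for i in range(1, len(P) - 1):
            eta = dihedral(C4[i - 1], P[i], C4[i], P[i + 1])
            theta = dihedral(P[i], C4[i], P[i + 1], C4[i + 1])
            angles.append((eta, theta))
        return angles
    ```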

  20. Flexible Retrieval: When True Inferences Produce False Memories

    PubMed Central

    Carpenter, Alexis C.; Schacter, Daniel L.

    2016-01-01

    Episodic memory involves flexible retrieval processes that allow us to link together distinct episodes, make novel inferences across overlapping events, and recombine elements of past experiences when imagining future events. However, the same flexible retrieval and recombination processes that underpin these adaptive functions may also leave memory prone to error or distortion, such as source misattributions in which details of one event are mistakenly attributed to another related event. To determine whether the same recombination-related retrieval mechanism supports both successful inference and source memory errors, we developed a modified version of an associative inference paradigm in which participants encoded everyday scenes composed of people, objects, and other contextual details. These scenes contained overlapping elements (AB, BC) that could later be linked to support novel inferential retrieval regarding elements that had not appeared together previously (AC). Our critical experimental manipulation concerned whether contextual details were probed before or after the associative inference test, thereby allowing us to assess whether (a) false memories increased for successful versus unsuccessful inferences, and (b) any such effects emerged only after, rather than before, participants received the inference test. In each of four experiments that used variants of this paradigm, participants were more susceptible to false memories for contextual details after successful than unsuccessful inferential retrieval, but only when contextual details were probed after the associative inference test. These results suggest that the retrieval-mediated recombination mechanism that underlies associative inference also contributes to source misattributions that result from combining elements of distinct episodes. PMID:27918169

  1. Error-proneness as a handicap signal.

    PubMed

    De Jaegher, Kris

    2003-09-21

    This paper describes two discrete signalling models in which the error-proneness of signals can serve as a handicap signal. In the first model, the direct handicap of sending a high-quality signal is not large enough to assure that a low-quality signaller will not send it. However, if the receiver sometimes mistakes a high-quality signal for a low-quality one, then there is an indirect handicap to sending a high-quality signal. The total handicap of sending such a signal may then still be such that a low-quality signaller would not want to send it. In the second model, there is no direct handicap of sending signals, so that nothing would seem to stop a signaller from always sending a high-quality signal. However, the receiver sometimes fails to detect signals, and this causes an indirect handicap of sending a high-quality signal that still stops the low-quality signaller from sending such a signal. The conditions for honesty are that the probability of an error of detection is higher for a high-quality than for a low-quality signal, and that the receiver who does not detect a signal adopts a response that is bad for the signaller. In both our models, we thus obtain the result that signal accuracy should not lie above a certain level in order for honest signalling to be possible. Moreover, we show that the maximal accuracy that can be achieved is higher the lower the degree of conflict between signaller and receiver. We also show that it is the conditions for honest signalling that may be constraining signal accuracy, rather than the signaller trying to make honest signals as effective as possible given receiver psychology, or the signaller adapting the accuracy of honest signals depending on his interests.

  2. Dynamic power scheduling system for JPEG2000 delivery over wireless networks

    NASA Astrophysics Data System (ADS)

    Martina, Maurizio; Vacca, Fabrizio

    2003-06-01

    The diffusion of third-generation mobile terminals is encouraging the development of new multimedia-based applications. The reliable transmission of audiovisual content will gain major interest, being one of the most valuable services. Nevertheless, the mobile scenario is severely power-constrained: high compression ratios and refined energy management strategies are highly advisable. JPEG2000 as the source encoding stage assures excellent performance with extremely good visual quality. However, the limited power budget imposes a cap on computational effort in order to save as much power as possible. In an error-prone environment such as the wireless one, strong error-resilience features need to be employed. This paper investigates the trade-off between quality and power in such a challenging environment.

  3. Identification of User Facility Related Publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Robert M; Stahl, Christopher G; Wells, Jack C

    2012-01-01

    Scientific user facilities provide physical resources and technical support that enable scientists to conduct experiments or simulations pertinent to their respective research. One metric for evaluating the scientific value or impact of a facility is the number of publications by users as a direct result of using that facility. Unfortunately, for a variety of reasons, capturing accurate values for this metric proves time consuming and error-prone. This work describes a new approach that leverages automated browser technology combined with text analytics to reduce the time and error involved in identifying publications related to user facilities. With this approach, scientific user facilities gain more accurate measures of their impact as well as insight into policy revisions for user access.

  4. Interactions and Localization of Escherichia coli Error-Prone DNA Polymerase IV after DNA Damage.

    PubMed

    Mallik, Sarita; Popodi, Ellen M; Hanson, Andrew J; Foster, Patricia L

    2015-09-01

    Escherichia coli's DNA polymerase IV (Pol IV/DinB), a member of the Y family of error-prone polymerases, is induced during the SOS response to DNA damage and is responsible for translesion bypass and adaptive (stress-induced) mutation. In this study, the localization of Pol IV after DNA damage was followed using fluorescent fusions. After exposure of E. coli to DNA-damaging agents, fluorescently tagged Pol IV localized to the nucleoid as foci. Stepwise photobleaching indicated ∼60% of the foci consisted of three Pol IV molecules, while ∼40% consisted of six Pol IV molecules. Fluorescently tagged Rep, a replication accessory DNA helicase, was recruited to the Pol IV foci after DNA damage, suggesting that the in vitro interaction between Rep and Pol IV reported previously also occurs in vivo. Fluorescently tagged RecA also formed foci after DNA damage, and Pol IV localized to them. To investigate if Pol IV localizes to double-strand breaks (DSBs), an I-SceI endonuclease-mediated DSB was introduced close to a fluorescently labeled LacO array on the chromosome. After DSB induction, Pol IV localized to the DSB site in ∼70% of SOS-induced cells. RecA also formed foci at the DSB sites, and Pol IV localized to the RecA foci. These results suggest that Pol IV interacts with RecA in vivo and is recruited to sites of DSBs to aid in the restoration of DNA replication. DNA polymerase IV (Pol IV/DinB) is an error-prone DNA polymerase capable of bypassing DNA lesions and aiding in the restart of stalled replication forks. In this work, we demonstrate in vivo localization of fluorescently tagged Pol IV to the nucleoid after DNA damage and to DNA double-strand breaks. We show colocalization of Pol IV with two proteins: Rep DNA helicase, which participates in replication, and RecA, which catalyzes recombinational repair of stalled replication forks. Time course experiments suggest that Pol IV recruits Rep and that RecA recruits Pol IV. These findings provide in vivo evidence that Pol IV aids in maintaining genomic stability not only by bypassing DNA lesions but also by participating in the restoration of stalled replication forks. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  5. Clarifying the link between job satisfaction and absenteeism: The role of guilt proneness.

    PubMed

    Schaumberg, Rebecca L; Flynn, Francis J

    2017-06-01

    We propose that the relationship between job satisfaction and absenteeism depends partly on guilt proneness. Drawing on withdrawal and process models of absenteeism, we argue that job satisfaction predicts absences for employees who are low (but not high) in guilt proneness because low guilt-prone people's behaviors are governed more by fulfilling their own egoistic desires than by fulfilling others' normative expectations. We find support for this prediction in a sample of customer service agents working for a major telecommunications company and a sample of working adults employed in a range of industries. In each study, we use measures of employees' guilt proneness and job satisfaction to predict their subsequent workplace absences. In Study 2, we extend our hypothesis tests to 2 traits that are conceptually comparable to guilt proneness (i.e., moral identity and agreeableness), showing that these traits similarly moderate the relationship between job satisfaction and absenteeism. We discuss the implications of these findings for extant models of absenteeism and research on moral affectivity in the workplace. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Neurodevelopmental correlates of proneness to guilt and shame in adolescence and early adulthood.

    PubMed

    Whittle, Sarah; Liu, Kirra; Bastin, Coralie; Harrison, Ben J; Davey, Christopher G

    2016-06-01

    Investigating how brain development during adolescence and early adulthood underlies guilt- and shame-proneness may be important for understanding risk processes for mental disorders. The aim of this study was to investigate the neurodevelopmental correlates of interpersonal guilt- and shame-proneness in healthy adolescents and young adults using structural magnetic resonance imaging (sMRI). Sixty participants (age range: 15-25) completed sMRI and self-report measures of interpersonal guilt- and shame-proneness. Independent of interpersonal guilt, higher levels of shame-proneness were associated with reduced posterior cingulate cortex (PCC) thickness and smaller amygdala volume. Higher levels of shame-proneness were also associated with attenuated age-related reductions in thickness of the lateral orbitofrontal cortex (lOFC). Our findings highlight the complexities in understanding brain-behavior relationships during the adolescent/young adult period. Results were consistent with growing evidence that accelerated cortical thinning during adolescence may be associated with superior socioemotional functioning. Further research is required to understand the implications of these findings for mental disorders characterized by higher levels of guilt and shame. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculation of spiking-solution and matrix-solution preparation schemes, the actual preparation of spiking and matrix solutions, and the flexible sample extraction procedures after incubation. In addition, the platform automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process the whole class of such assays under varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
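    The curve-fitting step that such platforms automate is conventionally a four-parameter logistic (Hill) regression. A minimal sketch of that step, assuming scipy is available and using made-up dose-response data; the actual ICECAP implementation is not described at this level of detail in the abstract.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(conc, bottom, top, ic50, hill):
        """Four-parameter logistic model of response vs. concentration."""
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

    # Hypothetical dose-response data (concentration in nM, response in %)
    conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300])
    resp = np.array([98, 95, 88, 70, 45, 22, 10, 5])

    # Initial guesses: bottom, top, IC50 (mid-range), Hill slope
    popt, _ = curve_fit(four_pl, conc, resp, p0=[5, 100, 10, 1])
    print(f"IC50 ≈ {popt[2]:.2f} nM, Hill slope ≈ {popt[3]:.2f}")
    ```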

  8. Joint three-dimensional inversion of coupled groundwater flow and heat transfer based on automatic differentiation: sensitivity calculation, verification, and synthetic examples

    NASA Astrophysics Data System (ADS)

    Rath, V.; Wolf, A.; Bücker, H. M.

    2006-10-01

    Inverse methods are useful tools not only for deriving estimates of unknown parameters of the subsurface, but also for appraising the models thus obtained. While neither the most general nor the most efficient approach, Bayesian inversion based on the calculation of the Jacobian of a given forward model can be used to evaluate many quantities useful in this process. The calculation of the Jacobian, however, is computationally expensive and, if done by divided differences, prone to truncation error. Here, automatic differentiation can be used to produce derivative code by source transformation of an existing forward model. We describe this process for a coupled fluid flow and heat transport finite difference code, which is used in a Bayesian inverse scheme to estimate thermal and hydraulic properties and boundary conditions from measured hydraulic potentials and temperatures. The resulting derivative code was validated by comparison to simple analytical solutions and divided differences. Synthetic examples from different flow regimes demonstrate the use of the inverse scheme and its behaviour in different configurations.
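    The truncation-error contrast between divided differences and automatic differentiation is easy to reproduce on a toy problem. A minimal forward-mode AD sketch using dual numbers on an invented scalar model; this illustrates the principle only and has no connection to the source-transformed simulation code described in the paper.

    ```python
    import math

    class Dual:
        """Dual number a + b*eps for forward-mode automatic differentiation."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __mul__(self, other):
            o = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
        __rmul__ = __mul__
        def exp(self):
            e = math.exp(self.val)
            return Dual(e, e * self.der)

    def model(k):
        """Toy forward model, e.g. a temperature response to conductivity k."""
        return (3.0 * k * k).exp() if isinstance(k, Dual) else math.exp(3.0 * k * k)

    k0 = 0.7
    exact = 6.0 * k0 * math.exp(3.0 * k0 * k0)      # analytic derivative
    ad = model(Dual(k0, 1.0)).der                   # forward-mode AD
    h = 1e-5
    fd = (model(k0 + h) - model(k0 - h)) / (2 * h)  # divided difference
    print(f"AD error: {abs(ad - exact):.2e}, FD error: {abs(fd - exact):.2e}")
    ```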

  9. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because a long processing duration causes performance degradation. This requires testers (the persons who perform the tests) to know precisely how long a packet is held by various network nodes. Without any tool's help, this task is time-consuming and error-prone. Thus we propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. Such recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any packet drops.
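    The core computation such a tool enables is simple once headers are matched across observation points. A minimal sketch, assuming captures have already been reduced to packet-id/timestamp records with synchronized clocks; the tool's actual header matching and extraction algorithms are more sophisticated, and the data below are invented.

    ```python
    # Hypothetical records: packet_id -> timestamp (seconds) at each observation point
    ingress = {"pkt1": 10.000120, "pkt2": 10.000480, "pkt3": 10.000910}
    egress  = {"pkt1": 10.003320, "pkt2": 10.002940, "pkt3": 10.004450}

    def holding_durations(before, after):
        """Per-packet holding duration of the node between two capture points."""
        return {pid: after[pid] - before[pid] for pid in before if pid in after}

    for pid, d in sorted(holding_durations(ingress, egress).items()):
        print(f"{pid}: held {d * 1e3:.3f} ms")
    ```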

  10. TOTAL user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1994-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in the model of a complex system can be devastatingly tedious and error-prone. Even with tools such as the Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST), the user must describe a system by specifying the rules governing the behavior of the system in order to generate the model. With the Table Oriented Translator to the ASSIST Language (TOTAL), the user can specify the components of a typical system and their attributes in the form of a table. The conditions that lead to system failure are also listed in a tabular form. The user can also abstractly specify dependencies with causes and effects. The level of information required is appropriate for system designers with little or no background in the details of reliability calculations. A menu-driven interface guides the user through the system description process, and the program updates the tables as new information is entered. The TOTAL program automatically generates an ASSIST input description to match the system description.

  11. fMRI evidence for a dual process account of the speed-accuracy tradeoff in decision-making.

    PubMed

    Ivanoff, Jason; Branning, Philip; Marois, René

    2008-07-09

    The speed and accuracy of decision-making have a well-known trading relationship: hasty decisions are more prone to errors while careful, accurate judgments take more time. Despite the pervasiveness of this speed-accuracy trade-off (SAT) in decision-making, its neural basis is still unknown. Using functional magnetic resonance imaging (fMRI) we show that emphasizing the speed of a perceptual decision at the expense of its accuracy lowers the amount of evidence-related activity in lateral prefrontal cortex. Moreover, this speed-accuracy difference in lateral prefrontal cortex activity correlates with the speed-accuracy difference in the decision criterion metric of signal detection theory. We also show that the same instructions increase baseline activity in a dorso-medial cortical area involved in the internal generation of actions. These findings suggest that the SAT is neurally implemented by modulating not only the amount of externally-derived sensory evidence used to make a decision, but also the internal urge to make a response. We propose that these processes combine to control the temporal dynamics of the speed-accuracy trade-off in decision-making.
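    For reference, the decision criterion metric mentioned above has a standard signal detection theory form. A minimal sketch computing the criterion c from hit and false-alarm rates on made-up numbers; this is the textbook formula, not anything specific to the study's analysis.

    ```python
    from statistics import NormalDist

    def criterion(hit_rate, fa_rate):
        """SDT decision criterion c = -(z(H) + z(FA)) / 2."""
        z = NormalDist().inv_cdf
        return -(z(hit_rate) + z(fa_rate)) / 2

    # Hypothetical rates under speed vs. accuracy instructions
    print(f"speed:    c = {criterion(0.90, 0.30):+.3f}")  # liberal (negative)
    print(f"accuracy: c = {criterion(0.85, 0.10):+.3f}")  # conservative (positive)
    ```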

  12. Alternative end-joining pathway(s): bricolage at DNA breaks.

    PubMed

    Frit, Philippe; Barboule, Nadia; Yuan, Ying; Gomez, Dennis; Calsou, Patrick

    2014-05-01

    To cope with DNA double strand break (DSB) genotoxicity, cells have evolved two main repair pathways: homologous recombination, which uses homologous DNA sequences as repair templates, and non-homologous Ku-dependent end-joining, involving direct sealing of DSB ends by DNA ligase IV (Lig4). During the last two decades, a third player, most commonly named alternative end-joining (A-EJ), has emerged; it is defined as any Ku- or Lig4-independent end-joining process. A-EJ increasingly appears as a highly error-prone bricolage on DSBs, and despite expanding exploration, it still escapes full characterization. In the present review, we discuss the mechanism and regulation of A-EJ as well as its biological relevance under physiological and pathological situations, with a particular emphasis on chromosomal instability and cancer. Whether or not it is a genuine DSB repair pathway, A-EJ is emerging as an important cellular process, and understanding A-EJ will certainly be a major challenge for the coming years. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  13. [When shape-invariant recognition ('A' = 'a') fails. A case study of pure alexia and kinesthetic facilitation].

    PubMed

    Diesfeldt, H F A

    2011-06-01

    A right-handed patient, aged 72, manifested alexia without agraphia, a right homonymous hemianopia and an impaired ability to identify visually presented objects. He was completely unable to read words aloud and severely deficient in naming visually presented letters. He responded to orthographic familiarity in the lexical decision tasks of the Psycholinguistic Assessments of Language Processing in Aphasia (PALPA) rather than to the lexicality of the letter strings. He was impaired at deciding whether two letters of different case (e.g., A, a) are the same, though he could distinguish real letters from made-up ones or from their mirror images. Consequently, his core deficit in reading was posited at the level of the abstract letter identifiers. When asked to trace a letter with his right index finger, kinesthetic facilitation enabled him to read letters and words aloud. Though he could use intact motor representations of letters to facilitate recognition and reading, the slow, sequential and error-prone process of reading letter by letter made him abandon further training.

  14. DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology

    PubMed Central

    Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng

    2015-01-01

    Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437

  15. Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa

    2013-01-01

    This software simplifies the process of setting up an Eclipse IDE programming environment for the members of the cross-NASA-center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software comes with the source code repositories and other vital information and settings already included. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error-prone. This software is built once by a single user and tested, allowing other developers to simply download and use it.

  16. Reducing uncertainty on satellite image classification through spatiotemporal reasoning

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Nikolakaki, Natassa; Psillakis, Periklis; Miliaresis, George; Xanthakis, Michail

    2014-05-01

    The natural habitat constantly endures both inherent natural and human-induced influences. Remote sensing has been providing monitoring-oriented solutions for the natural Earth surface by offering a series of tools and methodologies that contribute to prudent environmental management. Processing and analysis of multi-temporal satellite images for the observation of land changes often include classification and change-detection techniques. These error-prone procedures are influenced mainly by the distinctive characteristics of the study areas, the limitations of the remote sensing systems and the image analysis processes. The present study takes advantage of the temporal continuity of multi-temporal classified images in order to reduce classification uncertainty, based on reasoning rules. More specifically, pixel groups that temporally oscillate between classes are liable to misclassification or indicate problematic areas, whereas constant pixel group growth indicates a pressure-prone area. Computational tools are developed in order to disclose the alterations in land use dynamics and offer a spatial reference for the pressures that land use classes endure and impose on one another. Moreover, by revealing areas that are susceptible to misclassification, we propose specific target site selection for training during supervised classification. The underlying objective is to contribute to the understanding and analysis of the anthropogenic and environmental factors that influence land use changes. The developed algorithms have been tested on Landsat satellite image time series depicting the National Park of Ainos in Kefallinia, Greece, where Abies cephalonica, found nowhere else in the world, grows. Along with the minor changes and pressures indicated in the test area due to harvesting and other human interventions, the developed algorithms successfully captured fire incidents that have been historically confirmed. Overall, the results have shown that the use of the suggested procedures can contribute to the reduction of classification uncertainty and support the existing knowledge regarding the pressures among land-use changes.
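    A minimal sketch of the oscillation rule described above, assuming a per-pixel time series of class labels from already-classified images; the threshold, class codes, and data are invented, and the paper's full rule set and pressure analysis are not reproduced here.

    ```python
    import numpy as np

    def flag_uncertain_pixels(stack, max_transitions=2):
        """stack: (T, H, W) array of class labels over T dates.
        Pixels that oscillate between classes more than max_transitions
        times are flagged as liable to misclassification."""
        changes = np.count_nonzero(stack[1:] != stack[:-1], axis=0)
        return changes > max_transitions

    # Hypothetical 5-date, 2x2 classified stack (classes 0=forest, 1=shrub)
    stack = np.array([
        [[0, 0], [1, 0]],
        [[0, 1], [1, 0]],
        [[0, 0], [1, 0]],
        [[0, 1], [1, 0]],
        [[0, 0], [1, 0]],
    ])
    print(flag_uncertain_pixels(stack))  # True where the label keeps flipping
    ```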

  17. Causal inference with measurement error in outcomes: Bias analysis and estimation methods.

    PubMed

    Shu, Di; Yi, Grace Y

    2017-01-01

    Inverse probability weighting estimation is widely used to consistently estimate the average treatment effect. Its validity, however, is challenged by the presence of error-prone variables. In this paper, we explore inverse probability weighting estimation with mismeasured outcome variables. We study the impact of measurement error for both continuous and discrete outcome variables and reveal interesting consequences of the naive analysis which ignores measurement error. When a continuous outcome variable is mismeasured under an additive measurement error model, the naive analysis may still yield a consistent estimator; when the outcome is binary, we derive the asymptotic bias in closed form. Furthermore, we develop consistent estimation procedures for practical scenarios where either validation data or replicates are available. With validation data, we propose an efficient method for estimating the average treatment effect; the efficiency gain is substantial relative to usual methods of using validation data. To provide protection against model misspecification, we further propose a doubly robust estimator which is consistent even when either the treatment model or the outcome model is misspecified. Simulation studies are reported to assess the performance of the proposed methods. An application to a smoking cessation dataset is presented.
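    For reference, the baseline estimator the paper starts from is ordinary inverse probability weighting. A minimal sketch with a logistic propensity model on simulated data, assuming scikit-learn is available; the paper's measurement-error corrections and doubly robust estimator are not shown.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000
    x = rng.normal(size=n)                       # confounder
    p = 1 / (1 + np.exp(-x))                     # true propensity
    t = rng.binomial(1, p)                       # treatment assignment
    y = 2.0 * t + x + rng.normal(size=n)         # outcome, true ATE = 2

    # Estimate propensity scores, then form the IPW (Horvitz-Thompson) estimate
    e = LogisticRegression().fit(x.reshape(-1, 1), t).predict_proba(x.reshape(-1, 1))[:, 1]
    ate = np.mean(t * y / e) - np.mean((1 - t) * y / (1 - e))
    print(f"IPW ATE estimate: {ate:.2f} (true 2.00)")
    ```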

  18. IPTV multicast with peer-assisted lossy error control

    NASA Astrophysics Data System (ADS)

    Li, Zhi; Zhu, Xiaoqing; Begen, Ali C.; Girod, Bernd

    2010-07-01

    Emerging IPTV technology uses source-specific IP multicast to deliver television programs to end-users. To provide reliable IPTV services over the error-prone DSL access networks, a combination of multicast forward error correction (FEC) and unicast retransmissions is employed to mitigate the impulse noises in DSL links. In existing systems, the retransmission function is provided by Retransmission Servers sitting at the edge of the core network. In this work, we propose an alternative distributed solution where the burden of packet loss repair is partially shifted to the peer IP set-top boxes. Through the Peer-Assisted Repair (PAR) protocol, we demonstrate how packet repairs can be delivered in a timely, reliable and decentralized manner using a combination of server-peer coordination and redundancy of repairs. We also show that this distributed protocol can be seamlessly integrated with an application-layer source-aware error protection mechanism called forward and retransmitted Systematic Lossy Error Protection (SLEP/SLEPr). Simulations show that this joint PAR-SLEP/SLEPr framework not only effectively mitigates the bottleneck experienced by the Retransmission Servers, thus greatly enhancing the scalability of the system, but also efficiently improves resistance to impulse noise.
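    The multicast FEC component can be illustrated with the simplest possible parity code. A minimal sketch of XOR parity repair over a block of equal-length packets, assuming at most one loss per block; real deployments use stronger erasure codes and combine them with the retransmission path described above.

    ```python
    from functools import reduce

    def xor_parity(packets):
        """Parity packet: byte-wise XOR of all packets in the block."""
        return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

    def repair(received, parity):
        """Recover a single missing packet (None) from the parity packet."""
        present = [p for p in received if p is not None]
        missing = xor_parity(present + [parity])
        return [p if p is not None else missing for p in received]

    block = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
    parity = xor_parity(block)
    damaged = [b"pkt0", None, b"pkt2", b"pkt3"]   # one packet lost in transit
    print(repair(damaged, parity))                 # the lost packet is rebuilt
    ```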

  19. Demonstration of Qubit Operations Below a Rigorous Fault Tolerance Threshold With Gate Set Tomography (Open Access, Publisher’s Version)

    DTIC Science & Technology

    2017-02-15

    Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone... information processors have been demonstrated experimentally using superconducting circuits1–3, electrons in semiconductors4–6, trapped atoms and... qubit quantum information processor has been realized14, and single-qubit gates have demonstrated randomized benchmarking (RB) infidelities as low as 10

  20. The Implications of Self-Reporting Systems for Maritime Domain Awareness

    DTIC Science & Technology

    2006-12-01

    AIS), offer significant advantages compared with tracking vessels by conventional sensors, and that the availability of the information... reporting system for sea-going vessels that originated in Sweden in the early 1990s. It was designed primarily for safety of life at sea (SOLAS) and... report information is prone to human error and potential malicious alteration, and the system itself was not designed with these vulnerabilities in mind

  1. The use of modified and non-natural nucleotides provide unique insights into pro-mutagenic replication catalyzed by polymerase eta

    PubMed Central

    Choi, Jung-Suk; Dasari, Anvesh; Hu, Peter; Benkovic, Stephen J.; Berdis, Anthony J.

    2016-01-01

    This report evaluates the pro-mutagenic behavior of 8-oxo-guanine (8-oxo-G) by quantifying the ability of high-fidelity and specialized DNA polymerases to incorporate natural and modified nucleotides opposite this lesion. Although high-fidelity DNA polymerases such as pol δ and the bacteriophage T4 DNA polymerase replicate 8-oxo-G in an error-prone manner, they display remarkably low efficiencies for translesion DNA synthesis (TLS) compared to normal DNA synthesis. In contrast, pol η shows a combination of high efficiency and low fidelity when replicating 8-oxo-G. These combined properties are consistent with a pro-mutagenic role for pol η when replicating this DNA lesion. Studies using modified nucleotide analogs show that pol η relies heavily on hydrogen-bonding interactions during translesion DNA synthesis. However, nucleobase modifications such as alkylation at the N2 position of guanine significantly increase error-prone synthesis catalyzed by pol η when replicating 8-oxo-G. Molecular modeling studies demonstrate the existence of a hydrophobic pocket in pol η that participates in the increased utilization of certain hydrophobic nucleotides. A model is proposed for enhanced pro-mutagenic replication catalyzed by pol η that couples efficient incorporation of damaged nucleotides opposite oxidized DNA lesions created by reactive oxygen species. The biological implications of this model for increased mutagenic events in lung cancer are discussed. PMID:26717984

  2. Multiple point mutations in a shuttle vector propagated in human cells: evidence for an error-prone DNA polymerase activity.

    PubMed

    Seidman, M M; Bredberg, A; Seetharam, S; Kraemer, K H

    1987-07-01

    Mutagenesis was studied at the DNA-sequence level in human fibroblast and lymphoid cells by use of a shuttle vector plasmid, pZ189, containing a suppressor tRNA marker gene. In a series of experiments, 62 plasmids were recovered that had two to six base substitutions in the 160-base-pair marker gene. Approximately 20-30% of the mutant plasmids that were recovered after passing ultraviolet-treated pZ189 through a repair-proficient human fibroblast line contained these multiple mutations. In contrast, passage of ultraviolet-treated pZ189 through an excision-repair-deficient (xeroderma pigmentosum) line yielded only 2% multiple base substitution mutants. Introducing a single-strand nick in otherwise unmodified pZ189 adjacent to the marker, followed by passage through the xeroderma pigmentosum cells, resulted in about 66% multiple base substitution mutants. The multiple mutations were found in a 160-base-pair region containing the marker gene but were rarely found in an adjacent 170-base-pair region. Passing ultraviolet-treated or nicked pZ189 through a repair-proficient human B-cell line also yielded multiple base substitution mutations in 20-33% of the mutant plasmids. An explanation for these multiple mutations is that they were generated by an error-prone polymerase while filling gaps. These mutations share many of the properties displayed by mutations in the immunoglobulin hypervariable regions.

  3. The PSO4 gene is responsible for an error-prone recombinational DNA repair pathway in Saccharomyces cerevisiae.

    PubMed

    de Andrade, H H; Marques, E K; Schenberg, A C; Henriques, J A

    1989-06-01

    The induction of mitotic gene conversion and crossing-over in Saccharomyces cerevisiae diploid cells homozygous for the pso4-1 mutation was examined in comparison to the corresponding wild-type strain. The pso4-1 mutant strain was found to be completely blocked in mitotic recombination induced by photoaddition of mono- and bifunctional psoralen derivatives as well as by mono- (HN1) and bifunctional (HN2) nitrogen mustards or 254 nm UV radiation in both stationary and exponential phases of growth. Concerning the lethal effect, diploids homozygous for the pso4-1 mutation are more sensitive to all agents tested in any growth phase. However, this effect is more pronounced in the G2 phase of the cell cycle. These results imply that the ploidy effect and the resistance of budding cells are under the control of the PSO4 gene. On the other hand, the pso4-1 mutant is mutationally defective for all agents used. Therefore, the pso4-1 mutant has a generalized block in both recombination and mutation ability. This indicates that the PSO4 gene is involved in an error-prone repair pathway which relies on a recombinational mechanism, strongly suggesting an analogy between the pso4-1 mutation and the RecA or LexA mutation of Escherichia coli.

  4. Efficient error correction for next-generation sequencing of viral amplicons

    PubMed Central

    2012-01-01

    Background Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. Results In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. Conclusions Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm PMID:22759430
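    K-mer-based correction of the kind KEC implements rests on the observation that k-mers containing sequencing errors are rare in the read set. A minimal sketch of that core idea, assuming reads are plain strings and using a simple count threshold; the published KEC algorithm (homopolymer handling, threshold calibration, haplotype reconstruction) is considerably more involved.

    ```python
    from collections import Counter

    def kmer_counts(reads, k):
        """Count all k-mers across the read set."""
        counts = Counter()
        for r in reads:
            for i in range(len(r) - k + 1):
                counts[r[i:i + k]] += 1
        return counts

    def flag_error_reads(reads, k=5, min_count=2):
        """Reads containing any rare (likely erroneous) k-mer are flagged."""
        counts = kmer_counts(reads, k)
        return [r for r in reads
                if any(counts[r[i:i + k]] < min_count
                       for i in range(len(r) - k + 1))]

    # Hypothetical amplicon reads; the last one has a single substitution error
    reads = ["ACGTACGTAC"] * 15 + ["ACGTACCTAC"]
    print(flag_error_reads(reads))  # only the erroneous read is flagged
    ```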

  5. Efficient error correction for next-generation sequencing of viral amplicons.

    PubMed

    Skums, Pavel; Dimitrova, Zoya; Campo, David S; Vaughan, Gilberto; Rossi, Livia; Forbi, Joseph C; Yokosawa, Jonny; Zelikovsky, Alex; Khudyakov, Yury

    2012-06-25

    Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm.

  6. An RFID solution for enhancing inpatient medication safety with real-time verifiable grouping-proof.

    PubMed

    Chen, Yu-Yi; Tsai, Meng-Lin

    2014-01-01

    The occurrence of a medication error can threaten patient safety. The medication administration process is complex and cumbersome, and nursing staff are prone to error when they are tired. Proper information technology (IT) can assist the nurse in correct medication administration. We review a recent proposal for a leading-edge solution that enhances inpatient medication safety by using RFID technology. The grouping-proof mechanism is the kernel concept of that design and is worth studying in order to develop a well-designed grouping-proof scheme. Other RFID grouping-proof protocols could be similarly applied in administering physician orders. In this paper, we improve on the weaknesses of previous works and develop a reading-order-independent RFID grouping-proof scheme. In our scheme, tags are queried and verified under the direct control of the authorized reader without connecting to the back-end database server. Immediate verification in our design makes this application more portable and efficient, and the critical security issues have been analyzed against a threat model. Our scheme is suitable for the safe drug administration scenario and the drug package scenario in a hospital environment to enhance inpatient medication safety. It automatically checks for correct drug unit-doses and appropriate inpatient treatments. Copyright © 2013. Published by Elsevier Ireland Ltd.
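    At its core, a grouping proof chains keyed responses so a verifier can later confirm that a set of tags was scanned together. A minimal sketch of such a chained-MAC structure using HMAC; this is a generic yoking-style illustration with invented keys, not the scheme proposed in the paper, which additionally provides reading-order independence and real-time verification without the back-end server.

    ```python
    import hashlib
    import hmac
    import os

    def tag_response(tag_key, message):
        """Each tag MACs what it receives with its secret key."""
        return hmac.new(tag_key, message, hashlib.sha256).digest()

    def grouping_proof(tag_keys, challenge):
        """Chain responses: each tag signs the previous tag's output."""
        msg, transcript = challenge, []
        for key in tag_keys:
            msg = tag_response(key, msg)
            transcript.append(msg)
        return transcript

    keys = [os.urandom(16) for _ in range(3)]   # tags on drug units of one order
    challenge = os.urandom(16)                  # reader's per-session challenge
    proof = grouping_proof(keys, challenge)
    # A verifier holding the same keys recomputes the chain to check the proof
    assert proof == grouping_proof(keys, challenge)
    ```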

  7. Sexual coercion and the misperception of sexual intent

    PubMed Central

    Farris, Coreen; Treat, Teresa A.; Viken, Richard J.; McFall, Richard M.

    2010-01-01

    Misperceiving a woman’s platonic interest as sexual interest has been implicated in a sexual bargaining process that leads to sexual coercion. This paper provides a comprehensive review of sexual misperception, including gender differences in perception of women’s sexual intent, the relationship between sexual coercion and misperception, and situational factors that increase the risk that sexual misperception will occur. Compared to women, men consistently perceive a greater degree of sexual intent in women’s behavior. However, there is evidence to suggest that this gender effect may be driven largely by a sub-group of men who are particularly prone to perceive sexual intent in women’s behavior, such as sexually coercive men and men who endorse sex-role stereotypes. Situational factors, such as alcohol use by the man or woman, provocative clothing, and dating behaviors (e.g., initiating the date or making eye contact), are all associated with increased estimates of women’s sexual interest. We also critique the current measurement strategies and introduce a model of perception that more closely maps on to important theoretical questions in this area. A clearer understanding of sexual perception errors and the etiology of these errors may serve to guide sexual-assault prevention programs toward more effective strategies. PMID:17462798

  8. Development of a Stereovision-Based Technique to Measure the Spread Patterns of Granular Fertilizer Spreaders

    PubMed Central

    Cool, Simon R.; Pieters, Jan G.; Seatovic, Dejan; Mertens, Koen C.; Nuyttens, David; Van De Gucht, Tim C.; Vangeyte, Jürgen

    2017-01-01

    Centrifugal fertilizer spreaders are by far the most commonly used type of granular fertilizer spreader in Europe. Their spread pattern, however, is error-prone, potentially leading to an undesired distribution of particles in the field and losses out of the field, often caused by poor calibration of the spreader for the specific fertilizer used. Due to the large environmental impact of fertilizer use, it is important to optimize the spreading process and minimize these errors. Spreader calibrations can be performed by using collection trays to determine the (field) spread pattern, but this is very time-consuming and expensive for the farmer and hence not common practice. Therefore, we developed an innovative multi-camera system to predict the spread pattern in a fast and accurate way, independent of the spreader configuration. Using high-speed stereovision, ejection parameters of particles leaving the spreader vanes were determined relative to a coordinate system associated with the spreader. The landing positions and subsequent spread patterns were determined using a ballistic model incorporating the effect of tractor motion and wind. Experiments were conducted with a commercial spreader and showed high repeatability. The results were transformed to one spatial dimension to enable comparison with transverse spread patterns determined in the field and showed similar results. PMID:28617339
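    The ballistic step maps measured ejection states to landing positions. A minimal sketch for a spherical particle under gravity and quadratic air drag, integrated with a simple Euler scheme; the particle diameter, density, and ejection state are invented, and the paper's model additionally accounts for tractor motion and wind.

    ```python
    import numpy as np

    def landing_position(pos, vel, diameter=3e-3, density=1300.0, dt=1e-4):
        """Integrate a particle's flight under gravity and quadratic drag
        until it reaches the ground (z = 0). Returns the (x, y) landing point."""
        rho_air, cd, g = 1.2, 0.44, 9.81
        area = np.pi * (diameter / 2) ** 2
        mass = density * np.pi * diameter ** 3 / 6
        pos, vel = np.array(pos, float), np.array(vel, float)
        while pos[2] > 0:
            speed = np.linalg.norm(vel)
            drag = -0.5 * rho_air * cd * area * speed * vel / mass
            vel += (drag + np.array([0.0, 0.0, -g])) * dt
            pos += vel * dt
        return pos[0], pos[1]

    # Particle leaving the vane 0.8 m above ground at 30 m/s, 5 degrees upward
    v0 = 30.0 * np.array([np.cos(np.radians(5)), 0.0, np.sin(np.radians(5))])
    print(landing_position([0.0, 0.0, 0.8], v0))
    ```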

  9. Long-term dynamic modeling of tethered spacecraft using nodal position finite element method and symplectic integration

    NASA Astrophysics Data System (ADS)

    Li, G. Q.; Zhu, Z. H.

    2015-12-01

    Dynamic modeling of tethered spacecraft that accounts for the elasticity of the tether is prone to numerical instability and error accumulation over long-term numerical integration. This paper addresses these challenges by proposing a globally stable numerical approach combining the nodal position finite element method (NPFEM) with implicit, symplectic, 2-stage, 4th-order Gauss-Legendre Runge-Kutta time integration. The NPFEM eliminates numerical error accumulation by using the position instead of the displacement of the tether as the state variable, while the symplectic integration enforces the energy and momentum conservation of the discretized finite element model to ensure the global stability of the numerical solution. The effectiveness and robustness of the proposed approach are assessed using an elastic pendulum problem, whose dynamic response resembles that of tethered spacecraft, in comparison with commonly used time integrators such as the classical 4th-order Runge-Kutta schemes and other families of non-symplectic Runge-Kutta schemes. Numerical results show that the proposed approach is accurate and that the energy of the corresponding numerical model is conserved over long-term numerical integration. Finally, the proposed approach is applied to the dynamic modeling of the deorbiting of tethered spacecraft over a long period.
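    The energy-conservation property can be seen on any small Hamiltonian test problem. A minimal sketch comparing explicit Euler against semi-implicit (symplectic) Euler on a harmonic oscillator; the paper uses a 2-stage, 4th-order Gauss-Legendre Runge-Kutta scheme, which is higher order but shares the same energy-preserving character, so this is only a toy illustration of the contrast.

    ```python
    def energy(q, p):
        """Hamiltonian of a unit-mass, unit-stiffness harmonic oscillator."""
        return 0.5 * (p * p + q * q)

    def integrate(symplectic, steps=100000, dt=1e-3):
        q, p = 1.0, 0.0
        for _ in range(steps):
            if symplectic:          # semi-implicit (symplectic) Euler
                p -= q * dt
                q += p * dt
            else:                   # explicit Euler (non-symplectic)
                q_new = q + p * dt
                p -= q * dt
                q = q_new
        return energy(q, p)

    e0 = energy(1.0, 0.0)
    print(f"energy drift, explicit Euler:   {integrate(False) - e0:+.4f}")
    print(f"energy drift, symplectic Euler: {integrate(True) - e0:+.4f}")
    ```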

  10. Derivation of a closed form analytical expression for fluorescence recovery after photo bleaching in the case of continuous bleaching during read out

    NASA Astrophysics Data System (ADS)

    Endress, E.; Weigelt, S.; Reents, G.; Bayerl, T. M.

    2005-01-01

    Measurements of very slow diffusive processes in membranes, such as the diffusion of integral membrane proteins, by fluorescence recovery after photobleaching (FRAP) are hampered by bleaching of the probe during the readout of the fluorescence recovery. In the limit of long observation times (very slow diffusion, as in the case of large membrane proteins), this bleaching may introduce errors into the recovery function and thus yield error-prone diffusion coefficients. In this work we present a new approach to a two-dimensional closed-form analytical solution of the reaction-diffusion equation, based on the addition of a dissipative term to the conventional diffusion equation. The calculation assumes (i) a Gaussian laser beam profile for bleaching the spot and (ii) that the fluorescence intensity profile emerging from the spot can be approximated by a two-dimensional Gaussian. The detection scheme derived from the analytical solution allows diffusion measurements without the constraint of read-out bleaching. Recovery curves of experimental FRAP data obtained under non-negligible read-out bleaching for native membranes (rabbit endoplasmic reticulum) on a planar solid support showed excellent agreement with the analytical solution and allowed the calculation of the lipid diffusion coefficient.

  11. Application of Terrestrial Microwave Remote Sensing to Agricultural Drought Monitoring

    NASA Astrophysics Data System (ADS)

    Crow, W. T.; Bolten, J. D.

    2014-12-01

    Root-zone soil moisture information is a valuable diagnostic for detecting the onset and severity of agricultural drought. Current attempts to globally monitor root-zone soil moisture are generally based on the application of soil water balance models driven by observed meteorological variables. Such systems, however, are prone to random error associated with incorrect process model physics, poor parameter choices, and noisy meteorological inputs. The presentation will describe attempts to remediate these sources of error via the assimilation of remotely sensed surface soil moisture retrievals from satellite-based passive microwave sensors into a global soil water balance model. Results demonstrate the ability of satellite-based soil moisture retrieval products to significantly improve the global characterization of root-zone soil moisture, particularly in data-poor regions lacking adequate ground-based rain gage instrumentation. This success has led to an ongoing effort to implement an operational land data assimilation system at the United States Department of Agriculture's Foreign Agricultural Service (USDA FAS) to globally monitor variations in root-zone soil moisture availability via the integration of satellite-based precipitation and soil moisture information. Prospects for improving the performance of the USDA FAS system via the simultaneous assimilation of both passive- and active-based soil moisture retrievals derived from the upcoming NASA Soil Moisture Active/Passive mission will also be discussed.
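    Sequential assimilation of this kind can be caricatured with a scalar Kalman filter on a toy bucket model. Every number and the model below are invented for illustration; the operational system described above uses a full soil water balance model and a far more elaborate assimilation scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def bucket_step(sm, precip, loss=0.1):
        """Toy water balance: soil moisture gains rain, loses a fixed fraction."""
        return np.clip(sm + precip - loss * sm, 0.0, 1.0)

    sm_model, P_var = 0.3, 0.02     # model state and its error variance
    R_var, Q_var = 0.01, 0.005      # retrieval and model error variances

    for day in range(10):
        precip = rng.exponential(0.02)
        sm_model = bucket_step(sm_model, precip)
        P_var += Q_var                               # forecast: variance grows
        retrieval = rng.normal(0.4, np.sqrt(R_var))  # satellite surface retrieval
        K = P_var / (P_var + R_var)                  # Kalman gain
        sm_model += K * (retrieval - sm_model)       # update toward observation
        P_var *= (1 - K)                             # analysis variance shrinks

    print(f"analysis soil moisture: {sm_model:.3f}")
    ```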

  12. Reading Ground Water Levels with a Smartphone

    NASA Astrophysics Data System (ADS)

    van Overloop, Peter-Jules

    2015-04-01

    Most ground water levels in the world are measured manually. This requires employees of water management organizations to visit sites in the field and execute a measurement procedure that calls for special tools and training. Once the measurement is done, the value is jotted down in a notebook and later, at the office, entered into a computer system. This procedure is slow and prone to human errors. A new development is the introduction of modern information and communication technology to support this task and make it more efficient. Two innovations are introduced to measure and immediately store ground water levels. The first method combines a measuring tape that gives a sound and light signal when it just touches the water with a smartphone app that takes a picture of the tape. Using dedicated pattern recognition algorithms, the depth is read from the tape and the presence of the light is verified. The second method estimates the depth by sending a sound from the smartphone into the borehole and recording the reflected waves in the pipe. Both methods use GPS localization of the smartphone to store the depths at the right location in the central database, making the monitoring of ground water levels a real-time process that eliminates human errors.
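    The acoustic method reduces to a time-of-flight calculation. A minimal sketch, assuming the echo delay has already been extracted from the recording (e.g., by cross-correlation) and using a temperature-corrected nominal speed of sound; the app's actual signal processing is not described in the abstract.

    ```python
    def depth_from_echo(delay_s, air_temp_c=15.0):
        """Water depth in a borehole from the echo round-trip delay.

        Sound travels down the pipe and back, so the one-way distance is
        half the delay times the speed of sound in air."""
        c = 331.3 + 0.606 * air_temp_c   # approximate speed of sound, m/s
        return 0.5 * c * delay_s

    print(f"{depth_from_echo(0.05):.2f} m")  # a 50 ms echo -> about 8.5 m
    ```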

  13. SU-F-T-404: Dosimetric Advantages of Flattening Free Beams to Prone Accelerated Partial Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galavis, P; Barbee, D; Jozsef, G

    2016-06-15

    Purpose: Prone accelerated partial breast irradiation (APBI) results in dose reduction to the heart and lung. Flattening filter free (FFF) beams reduce out-of-field dose, due to the reduced scatter from the removal of the flattening filter, and reduce the buildup region. The aim of this work is to evaluate the dosimetric advantages of FFF beams for prone APBI target coverage and reduction of dose to organs at risk. Methods: Fifteen clinical prone APBI cases using flattened photon beams were retrospectively re-planned in the Eclipse TPS using FFF beams. FFF plans were designed to provide equivalent target coverage with similar hotspots using the same field arrangements, resulting in comparable target DVHs. Both plans were transferred to a prone breast phantom and delivered on a Varian Edge linac. GafChromic film was placed in the coronal plane of the phantom, partially overlapping the treatment field and extending into OARs, to compare dose profiles from both plans. Results: FFF plans were comparable to the clinical plans, with maximum doses of (108.3±2.3)% and (109.2±2.4)% and mean doses of (104.5±1.0)% and (104.6±1.2)%, respectively. Similar mean doses to the heart and contralateral lungs were observed for both plans, whereas the mean dose to the contralateral breast was (2.79±1.18) cGy and (2.86±1.40) cGy for the FFF and clinical plans, respectively. However, for both plans the error between calculated and measured doses at 4 cm from the field edge was 10%. Conclusion: The results showed that FFF beams in prone APBI provide dosimetrically equivalent target coverage and improved coverage of superficial targets due to the softer energy spectrum. Film analysis showed that the TPS underestimates dose outside field edges in both cases. The measured FFF plans showed less dose outside the beam, which might reduce the probability of secondary cancers in the contralateral breast.

  14. DEM-based Approaches for the Identification of Flood Prone Areas

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Manfreda, Salvatore; Nardi, Fernando; Grimaldi, Salvatore; Roth, Giorgio; Sole, Aurelia

    2013-04-01

    The remarkable number of inundations that have caused, in recent decades, thousands of deaths and huge economic losses testifies to the extreme vulnerability of many countries to flood hazard. Human activities are often concentrated in floodplains, creating conditions of extremely high risk. Terrain morphology plays an important role in understanding, modelling and analyzing the hydraulic behaviour of flood waves. Research over the last 10 years has shown that the delineation of flood-prone areas can be carried out using fast methods that rely on basin geomorphologic features. In fact, the availability of new technologies to measure surface elevation (e.g., GPS, SAR, SAR interferometry, RADAR and LASER altimetry) has given a strong impulse to the development of approaches based on Digital Elevation Models (DEMs). The identification of the dominant topographic controls on the flood inundation process is a critical research question that we tackle with a comparative analysis of several techniques. We reviewed four different approaches for the morphological characterization of a river basin, with the aim of describing their performance and identifying their ranges of applicability. In particular, we explored the potential of the following tools. 1) The hydrogeomorphic method proposed by Nardi et al. (2006), which defines flood-prone areas according to the water level in the river network through the hydrogeomorphic theory. 2) The linear binary classifier proposed by Degiorgis et al. (2012), which distinguishes flood-prone areas using two features related to the position of the site under examination with respect to the nearest hazard source: the length of the path that hydrologically connects the location to the nearest element of the drainage network, and the difference in elevation between the cell under examination and the end point of that path. 3) The method by Manfreda et al. (2011), which proposed a modified Topographic Index (TIm) for the identification of flood-prone areas. 4) The downslope index proposed by Hjerdt et al. (2004), which quantifies topographic controls on hydrology by evaluating head differences along the (surface) flow path in the steepest direction. This method does not use the exit point at the stream as a reference; instead, the algorithm measures how far a parcel of water has to travel along its flow path to lose a given head potential, d [m]. This last index was not designed to describe flood-prone areas, but it represents an interesting alternative descriptor of morphological features that deserves to be tested. Analyses have been carried out for several Italian catchments. The outcomes of the four methods are presented using, for calibration and validation purposes, flood inundation maps made available by River Basin Authorities. The aim is, therefore, to evaluate the reliability and the relative errors of each method in detecting areas subject to flooding hazard. These techniques should not be considered as alternatives to traditional procedures, but as additional tools for the identification of flood-prone areas and hazard graduation over large regions, or when a preliminary identification is needed. References: Degiorgis, M., G. Gnecco, S. Gorni, G. Roth, M. Sanguineti, and A. C. Taramasso, Classifiers for the detection of flood-prone areas using remote sensed elevation data, J. Hydrol., 470-471, 302-315, 2012. Hjerdt, K. N., J. J. McDonnell, J. Seibert, and A. Rodhe, A new topographic index to quantify downslope controls on local drainage, Water Resour. Res., 40, W05602, 2004. Manfreda, S., M. Di Leo, and A. Sole, Detection of flood prone areas using digital elevation models, J. Hydrol. Eng., 16(10), 781-790, 2011. Nardi, F., E. R. Vivoni, and S. Grimaldi, Investigating a floodplain scaling relation using a hydrogeomorphic delineation method, Water Resour. Res., 42, W09409, 2006.
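
    The downslope index lends itself to a compact implementation. Below is a minimal sketch, assuming a DEM held as a 2D NumPy array of elevations on square cells; it follows the steepest-descent path from each cell until the head loss reaches d and returns tan(alpha_d) = d / L_d in the sense of Hjerdt et al. (2004). Function and variable names are illustrative, not taken from any of the cited codes.

        import numpy as np

        def downslope_index(dem, cellsize, d=5.0):
            """tan(alpha_d) = d / L_d: follow the steepest-descent path from
            each cell until the elevation has dropped by at least d metres,
            then divide d by the horizontal distance L_d travelled."""
            nrows, ncols = dem.shape
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                       (0, 1), (1, -1), (1, 0), (1, 1)]
            dists = [np.hypot(di, dj) * cellsize for di, dj in offsets]
            index = np.full(dem.shape, np.nan)
            for i in range(nrows):
                for j in range(ncols):
                    ci, cj, travelled = i, j, 0.0
                    while dem[i, j] - dem[ci, cj] < d:
                        best, best_slope = None, 0.0
                        for (di, dj), dist in zip(offsets, dists):
                            ni, nj = ci + di, cj + dj
                            if 0 <= ni < nrows and 0 <= nj < ncols:
                                slope = (dem[ci, cj] - dem[ni, nj]) / dist
                                if slope > best_slope:
                                    best, best_slope = (ni, nj, dist), slope
                        if best is None:        # pit or DEM edge: stop early
                            break
                        ci, cj, step = best
                        travelled += step
                    if travelled > 0 and dem[i, j] - dem[ci, cj] >= d:
                        index[i, j] = d / travelled
            return index

    A small d emphasizes local gradients, while a larger d integrates topographic controls further downslope.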

  15. The Relationship between Student Anti-Intellectualism and Proneness to Boredom in a Sample of College Students

    ERIC Educational Resources Information Center

    Laverghetta, Antonio

    2015-01-01

    College student anti-intellectualism is defined as a general disdain of intellectual and academic endeavors. Eigenberger and Sealander (2001), using the student anti-intellectualism scale (SAIS), reported that SAIS scores were negatively correlated with openness to experience and elaborative/deep cognitive processing. Proneness to boredom,…

  16. Automated contouring error detection based on supervised geometric attribute distribution models for radiation therapy: A general strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsin-Chen; Tan, Jun; Dolly, Steven

    2015-02-15

    Purpose: One of the most critical steps in radiation therapy treatment is accurate tumor and critical organ-at-risk (OAR) contouring. Both manual and automated contouring processes are prone to errors and to a large degree of inter- and intraobserver variability. These are often due to the limitations of imaging techniques in visualizing human anatomy as well as to inherent anatomical variability among individuals. Physicians/physicists have to reverify all the radiation therapy contours of every patient before using them for treatment planning, which is tedious, laborious, and still not an error-free process. In this study, the authors developed a general strategy based on novel geometric attribute distribution (GAD) models to automatically detect radiation therapy OAR contouring errors and facilitate the current clinical workflow. Methods: Considering the radiation therapy structures’ geometric attributes (centroid, volume, and shape), the spatial relationships of neighboring structures, as well as the anatomical similarity of individual contours among patients, the authors established GAD models to characterize the interstructural centroid and volume variations, and the intrastructural shape variations, of each individual structure. The GAD models are scalable and deformable, and constrained by their respective principal attribute variations calculated from training sets with verified OAR contours. A new iterative weighted GAD model-fitting algorithm was developed for contouring error detection. Receiver operating characteristic (ROC) analysis was employed in a unique way to optimize the model parameters to satisfy clinical requirements. A total of forty-four head-and-neck patient cases, each of which includes nine critical OAR contours, were utilized to demonstrate the proposed strategy. Twenty-nine of these forty-four patient cases were utilized to train the inter- and intrastructural GAD models. These training data and the remaining fifteen testing data sets were separately employed to test the effectiveness of the proposed contouring error detection strategy. Results: An evaluation tool was implemented to illustrate how the proposed strategy automatically detects the radiation therapy contouring errors for a given patient and provides 3D graphical visualization of the error detection results as well. The contouring error detection results were achieved with an average sensitivity of 0.954/0.906 and an average specificity of 0.901/0.909 for the centroid/volume related contouring errors of all the tested samples. As for the detection results on structural shape related contouring errors, an average sensitivity of 0.816 and an average specificity of 0.94 on all the tested samples were obtained. The promising results indicated the feasibility of the proposed strategy for the detection of contouring errors with a low false detection rate. Conclusions: The proposed strategy can reliably identify contouring errors based upon inter- and intrastructural constraints derived from clinically approved contours. It holds great potential for improving the radiation therapy workflow. ROC and box plot analyses allow for analytical tuning of the system parameters to satisfy clinical requirements. Future work will focus on improving the strategy's reliability by utilizing more training sets and additional geometric attribute constraints.
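
    As a rough illustration of the geometric-attribute idea (not the authors' iterative weighted model-fitting algorithm), the sketch below builds a simple attribute distribution (centroid and volume) from verified contours and flags a new contour whose Mahalanobis distance from that distribution is large. Binary masks, voxel spacing, and the threshold are assumptions for illustration.

        import numpy as np

        def attribute_vector(mask, spacing):
            """Simple geometric attributes of a binary contour mask:
            centroid (mm) and volume (cc)."""
            idx = np.argwhere(mask)
            centroid = idx.mean(axis=0) * spacing
            volume = idx.shape[0] * np.prod(spacing) / 1000.0
            return np.append(centroid, volume)

        def fit_gad(training_masks, spacing):
            """Mean/covariance of the attribute distribution over a set of
            clinically verified contours."""
            X = np.array([attribute_vector(m, spacing) for m in training_masks])
            return X.mean(axis=0), np.cov(X, rowvar=False)

        def flag_contour(mask, spacing, mean, cov, threshold=3.0):
            """Flag a contour whose Mahalanobis distance from the training
            distribution exceeds the threshold."""
            x = attribute_vector(mask, spacing)
            d2 = (x - mean) @ np.linalg.inv(cov) @ (x - mean)
            return np.sqrt(d2) > threshold

    The published strategy additionally models interstructural relations and intrastructural shape, and tunes its parameters by ROC analysis rather than a fixed cutoff.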

  17. Evolution of dissolved inorganic carbon in groundwater recharged by cyclones and groundwater age estimations using the 14C statistical approach

    NASA Astrophysics Data System (ADS)

    Meredith, K. T.; Han, L. F.; Cendón, D. I.; Crawford, J.; Hankin, S.; Peterson, M.; Hollins, S. E.

    2018-01-01

    The Canning Basin is the largest sedimentary basin in Western Australia and is located in one of the most cyclone-prone regions of Australia. Despite its importance as a future resource, limited groundwater data are available for the Basin. The main aims of this paper are to provide a detailed understanding of the source of groundwater recharge and the chemical evolution of dissolved inorganic carbon (DIC), and to provide groundwater age estimations using radiocarbon (14CDIC). To do this we combine hydrochemical and isotopic techniques to investigate the type of precipitation that recharges the aquifer and identify the carbon processes influencing 14CDIC, δ13CDIC, and [DIC]. This enables us to select an appropriate model for calculating radiocarbon ages in groundwater. The aquifer was found to be recharged by precipitation originating from tropical cyclones, imparting lower average δ2H and δ18O values in groundwater (-56.9‰ and -7.87‰, respectively). Water recharges the soil zone rapidly after these events, and the groundwater undergoes silicate mineral weathering and clay mineral transformation processes. Partial carbonate dissolution was also found to occur within the saturated zone under closed-system conditions. Additionally, these processes could be lumped into a pseudo-first-order process, and the age could be estimated using the 14C statistical approach. In single-sample-based 14C models, 14C0 is the initial 14CDIC value used in a decay equation that considers only the 14C decay rate. A major advantage of the statistical approach is that both 14C decay and the geochemical processes that cause the decrease in 14CDIC are accounted for in the calculation. The 14CDIC values of groundwater were found to decrease from 89 pmc in the south east to around 16 pmc along the groundwater flow path towards the coast, indicating ages ranging from modern to 5.3 ka. A test of the sensitivity of this method showed an error of ∼15% for the oldest water. This error was low compared with single-sample-based models. This study not only provides the first groundwater age estimations for the Canning Basin but is also the first groundwater dating study to test the sensitivity of the statistical approach and provide meaningful error calculations for groundwater dating.
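
    For context, the single-sample decay equation mentioned in the abstract reduces to t = -τ ln(a/a0), with τ the 14C mean life. The sketch below applies it with illustrative activities taken from the abstract's range; the decay-only result is far older than the ~5.3 ka obtained with the statistical approach, precisely because the latter also accounts for the geochemical lowering of 14CDIC.

        import math

        LIBBY_MEAN_LIFE = 5730 / math.log(2)  # ~8267 years (5,730-year half-life)

        def c14_age(a_measured, a_initial):
            """Single-sample radiocarbon age from t = -tau * ln(a / a0),
            activities in pmc."""
            return -LIBBY_MEAN_LIFE * math.log(a_measured / a_initial)

        # Illustrative only: a coastal sample at 16 pmc with an assumed
        # initial activity of 89 pmc (the highest value reported upstream)
        print(round(c14_age(16, 89)))  # ~14,000 years, versus ~5.3 ka corrected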

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H; Gao, Y; Liu, T

    Purpose: To develop quantitative clinical guidelines between supine Deep Inspiratory Breath Hold (DIBH) and prone free breathing treatments for breast patients, we applied 3D deformable phantoms to perform Monte Carlo simulation to predict corresponding Dose to the Organs at Risk (OARs). Methods: The RPI-adult female phantom (two selected cup sizes: A and D) was used to represent the female patient, and it was simulated using the MCNP6 Monte Carlo code. Doses to OARs were investigated for supine DIBH and prone treatments, considering two breast sizes. The fluence maps of the 6-MV opposed tangential fields were exported. In the Monte Carlomore » simulation, the fluence maps allow each simulated photon particle to be weighed in the final dose calculation. The relative error of all dose calculations was kept below 5% by simulating 3*10{sup 7} photons for each projection. Results: In terms of dosimetric accuracy, the RPI Adult Female phantom with cup size D in DIBH positioning matched with a DIBH treatment plan of the patient. Based on the simulation results, for cup size D phantom, prone positioning reduced the cardiac dose and the dose to other OARs, while cup size A phantom benefits more from DIBH positioning. Comparing simulation results for cup size A and D phantom, dose to OARs was generally higher for the large breast size due to increased scattering arising from a larger portion of the body in the primary beam. The lower dose that was registered for the heart in the large breast phantom in prone positioning was due to the increase of the distance between the heart and the primary beam when the breast was pendulous. Conclusion: Our 3D deformable phantom appears an excellent tool to predict dose to the OARs for the supine DIBH and prone positions, which might help quantitative clinical decisions. Further investigation will be conducted. National Institutes of Health R01EB015478.« less

  19. Process control and recovery in the Link Monitor and Control Operator Assistant

    NASA Technical Reports Server (NTRS)

    Lee, Lorrine; Hill, Randall W., Jr.

    1993-01-01

    This paper describes our approach to providing process control and recovery functions in the Link Monitor and Control Operator Assistant (LMCOA). The focus of the LMCOA is to provide semi-automated monitoring and control to support station operations in the Deep Space Network. The LMCOA will be demonstrated with precalibration operations for Very Long Baseline Interferometry on a 70-meter antenna. Precalibration, the task of setting up the equipment to support a communications link with a spacecraft, is a manual, time-consuming and error-prone process. One problem with the current system is that it does not provide explicit feedback about the effects of control actions. The LMCOA uses a Temporal Dependency Network (TDN) to represent an end-to-end sequence of operational procedures and a Situation Manager (SM) module to provide process control, diagnosis, and recovery functions. The TDN is a directed network representing precedence, parallelism, precondition, and postcondition constraints. The SM maintains an internal model of the expected and actual states of the subsystems in order to determine whether each control action executed successfully and to provide feedback to the user. The LMCOA is implemented on a NeXT workstation using Objective C, Interface Builder and the C Language Integrated Production System.
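
    A Temporal Dependency Network can be pictured as a directed acyclic graph of guarded actions. The toy sketch below (in Python, purely illustrative; the LMCOA itself was written in Objective C) executes steps once their predecessors complete and checks pre- and postconditions the way the Situation Manager verifies each control action.

        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Step:
            """One node of a simplified TDN: an action guarded by
            pre- and postconditions, with precedence dependencies."""
            name: str
            action: Callable[[dict], None]
            preconds: List[Callable[[dict], bool]] = field(default_factory=list)
            postconds: List[Callable[[dict], bool]] = field(default_factory=list)
            depends_on: List[str] = field(default_factory=list)

        def run_tdn(steps, state):
            """Execute steps whose dependencies have completed, verifying
            each action's conditions against the shared subsystem state."""
            done = set()
            pending = {s.name: s for s in steps}
            while pending:
                ready = [s for s in pending.values() if set(s.depends_on) <= done]
                if not ready:
                    raise RuntimeError("cycle or unmet dependency")
                for s in ready:
                    if not all(p(state) for p in s.preconds):
                        raise RuntimeError(f"{s.name}: precondition failed")
                    s.action(state)
                    if not all(p(state) for p in s.postconds):
                        raise RuntimeError(f"{s.name}: postcondition failed")
                    done.add(s.name)
                    del pending[s.name]

    Failed postconditions are where a real system would branch into diagnosis and recovery rather than raising an error.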

  20. Medical errors and uncertainty in primary healthcare: A comparative study of coping strategies among young and experienced GPs

    PubMed Central

    Kuikka, Liisa; Pitkälä, Kaisu

    2014-01-01

    Abstract Objective. To study coping differences between young and experienced GPs in primary care who experience medical errors and uncertainty. Design. Questionnaire-based survey (self-assessment) conducted in 2011. Setting. Finnish primary practice offices in Southern Finland. Subjects. Finnish GPs engaged in primary health care from two different respondent groups: young (working experience ≤ 5 years, n = 85) and experienced (working experience > 5 years, n = 80). Main outcome measures. Outcome measures included experiences and attitudes expressed by the included participants towards medical errors and tolerance of uncertainty, their coping strategies, and factors that may influence (positively or negatively) sources of errors. Results. In total, 165/244 GPs responded (response rate: 68%). Young GPs expressed significantly more often fear of committing a medical error (70.2% vs. 48.1%, p = 0.004) and admitted more often than experienced GPs that they had committed a medical error during the past year (83.5% vs. 68.8%, p = 0.026). Young GPs were less prone to apologize to a patient for an error (44.7% vs. 65.0%, p = 0.009) and found, more often than their more experienced colleagues, on-site consultations and electronic databases useful for avoiding mistakes. Conclusion. Experienced GPs seem to better tolerate uncertainty and also seem to fear medical errors less than their young colleagues. Young and more experienced GPs use different coping strategies for dealing with medical errors. Implications. When GPs become more experienced, they seem to get better at coping with medical errors. Means to support these skills should be studied in future research. PMID:24914458

  1. Considerations for analysis of time-to-event outcomes measured with error: Bias and correction with SIMEX.

    PubMed

    Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A

    2018-04-15

    For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
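
    To make the SIMEX idea concrete, here is a generic sketch for the classical covariate-error case (the paper's contribution, extending SIMEX to error in the failure time, is not reproduced here): error of increasing variance ζσ² is added by simulation, the naive estimator is re-fit, and the trend is extrapolated back to ζ = -1, the error-free point.

        import numpy as np

        def simex_slope(x_obs, y, sigma_u, zetas=(0.5, 1.0, 1.5, 2.0), B=200, seed=0):
            """Generic SIMEX: inflate the measurement error, re-fit the naive
            estimator, then extrapolate back to zeta = -1 with a quadratic."""
            rng = np.random.default_rng(seed)
            def ols_slope(x):
                return np.polyfit(x, y, 1)[0]
            zs, ests = [0.0], [ols_slope(x_obs)]
            for z in zetas:
                sims = [ols_slope(x_obs + rng.normal(0, np.sqrt(z) * sigma_u, len(x_obs)))
                        for _ in range(B)]
                zs.append(z)
                ests.append(np.mean(sims))
            coef = np.polyfit(zs, ests, 2)       # quadratic extrapolant
            return np.polyval(coef, -1.0)

        # Toy check: true slope 2; covariate error attenuates the naive fit,
        # and SIMEX removes much (not all) of the attenuation.
        rng = np.random.default_rng(1)
        x = rng.normal(size=2000)
        y = 2 * x + rng.normal(scale=0.5, size=2000)
        x_obs = x + rng.normal(scale=0.8, size=2000)
        print(np.polyfit(x_obs, y, 1)[0], simex_slope(x_obs, y, 0.8))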

  2. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737

  3. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    PubMed

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.
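
    The comparison step at the heart of this design can be mechanized. A minimal sketch follows, with hypothetical output names and the study's ±5% materiality threshold; it simply flags projections on which two independently built model versions disagree.

        def compare_versions(outputs_a, outputs_b, tol=0.05):
            """Compare projections from two parallel model versions and flag
            outputs whose relative difference exceeds the material-error
            threshold (+/-5% in the study)."""
            flags = {}
            for key in outputs_a.keys() & outputs_b.keys():
                a, b = outputs_a[key], outputs_b[key]
                rel = abs(a - b) / abs(a) if a else float('inf')
                if rel > tol:
                    flags[key] = (a, b, rel)
            return flags

        # Hypothetical projections from two independently built versions
        v_named = {"in_care_yr5": 1320.0, "on_art_yr5": 980.0}
        v_rowcol = {"in_care_yr5": 1320.0, "on_art_yr5": 1210.0}
        print(compare_versions(v_named, v_rowcol))  # flags on_art_yr5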

  4. Modeling the Error of the Medtronic Paradigm Veo Enlite Glucose Sensor.

    PubMed

    Biagi, Lyvia; Ramkissoon, Charrise M; Facchinetti, Andrea; Leal, Yenny; Vehi, Josep

    2017-06-12

    Continuous glucose monitors (CGMs) are prone to inaccuracy due to time lags, sensor drift, calibration errors, and measurement noise. The aim of this study is to derive a model of the error of the second-generation Medtronic Paradigm Veo Enlite (ENL) sensor and compare it with the Dexcom SEVEN PLUS (7P), G4 PLATINUM (G4P), and advanced G4 for Artificial Pancreas studies (G4AP) systems. An enhanced version of a previously employed methodology was utilized to dissect the sensor error into several components. The dataset included 37 inpatient sessions in 10 subjects with type 1 diabetes (T1D), in which CGMs were worn in parallel and blood glucose (BG) samples were analyzed every 15 ± 5 min. Calibration error and sensor drift of the ENL sensor were best described by a linear relationship related to the gain and offset. The mean time lag estimated by the model is 9.4 ± 6.5 min. The overall average mean absolute relative difference (MARD) of the ENL sensor was 11.68 ± 5.07%. Calibration error had the highest contribution to total error in the ENL sensor. This was also reported for the 7P, G4P, and G4AP. The model of the ENL sensor error will be useful for testing the in silico performance of CGM-based applications, i.e., the artificial pancreas, employing this kind of sensor.
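
    A dissected sensor-error model of the kind described can be sketched as a first-order plasma-to-interstitium lag plus a linear calibration error and additive noise. The parameter values below are illustrative placeholders (only the 9.4 min mean lag comes from the abstract), not the fitted ENL estimates.

        import numpy as np

        def simulate_cgm(bg, dt=5.0, tau=9.4, gain=1.05, offset=-8.0,
                         noise_sd=2.0, seed=0):
            """Sketch of a dissected CGM error model: first-order lag with
            time constant tau (minutes), linear calibration error (gain and
            offset), and additive measurement noise. bg is an array of
            reference blood glucose values sampled every dt minutes."""
            rng = np.random.default_rng(seed)
            ig = np.empty(len(bg))
            ig[0] = bg[0]
            alpha = dt / (tau + dt)          # discretised first-order lag
            for k in range(1, len(bg)):
                ig[k] = ig[k - 1] + alpha * (bg[k] - ig[k - 1])
            return gain * ig + offset + rng.normal(0, noise_sd, len(bg))

    Such a simulator is the sense in which the fitted model supports "in silico" testing: candidate control algorithms can be run against synthetic sensor traces with realistic error structure.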

  5. Effects of Non-Normal Outlier-Prone Error Distribution on Kalman Filter Track

    DTIC Science & Technology

    1991-09-01

    other possibilities exist. For example, the GST (Generic Statistical Tracker) uses four motion models [Ref. 4]. The GST keeps track of both the target... Although this procedure is not easily statistically interpretable, it was used for the sake of comparison with the other...

  6. Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual

    DTIC Science & Technology

    1988-12-01

    The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIVD will reduce the amount of time spent on tedious, negative, and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.

  7. The OPL Access Control Policy Language

    NASA Astrophysics Data System (ADS)

    Alm, Christopher; Wolf, Ruben; Posegga, Joachim

    Existing policy languages suffer from a limited ability to directly and elegantly express high-level access control principles such as history-based separation of duty [22], binding of duty [26], context constraints [24], Chinese wall properties [10], and obligations [20]. It is often difficult to extend a language in order to retrofit these features once they are required, or it is necessary to use complicated and complex language constructs to express such concepts. The latter, however, is cumbersome and error-prone for humans dealing with policy administration.

  8. Environmental Health Risk Assessment in Flood-prone Area in Tamangapa Sub-District Makassar

    NASA Astrophysics Data System (ADS)

    Haris, Ibrahim Abdul; Basir, Basir

    2018-05-01

    Environmental health in Indonesia is still a cause for concern; poor sanitation is reflected in the high incidence of infectious diseases in society. Communities in flood-prone areas face a high risk of exposure to environmentally mediated disease because they live in disaster-prone areas. This research aimed to describe the condition of sanitary facilities and risky health behaviour in the flood-prone areas of Manggala district, particularly the Tamangapa sub-district of Makassar. The research uses an observational method with a descriptive approach. The data were processed using SPSS and ArcView GIS. The environmental risk category was determined using the Environmental Health Risk Assessment (EHRA) approach. The results showed that the flood-prone area in RT 04 RW 06 fell into the very high-risk category, with a score of 229 on an environmental health risk index of 212-229, whereas RT 04 RW 05 fell into the low-risk category, with a score of 155 on an index of 155-173. The environmental health hazards identified in the flood-prone areas of Tamangapa sub-district include domestic clean-water sources, domestic wastewater, and household garbage.

  9. Assessing individual differences in proneness to shame and guilt: development of the Self-Conscious Affect and Attribution Inventory.

    PubMed

    Tangney, J P

    1990-07-01

    Individual differences in proneness to shame and proneness to guilt are thought to play an important role in the development of both adaptive and maladaptive interpersonal and intrapersonal processes. But little empirical research has addressed these issues, largely because no reliable, valid measure has been available to researchers interested in differentiating proneness to shame from proneness to guilt. The Self-Conscious Affect and Attribution Inventory (SCAAI) was developed to assess characteristic affective, cognitive, and behavioral responses associated with shame and guilt among a young adult population. The SCAAI also includes indices of externalization of cause or blame, detachment/unconcern, pride in self, and pride in behavior. Data from 3 independent studies of college students and 1 study of noncollege adults provide support for the reliability of the main SCAAI subscales. Moreover, the pattern of relations among the SCAAI subscales and the relation of SCAAI subscales to 2 extant measures of shame and guilt support the validity of this new measure. The SCAAI appears to provide related but functionally distinct indices of proneness to shame and guilt in a way that these previous measures have not.

  10. A filtering method to generate high quality short reads using illumina paired-end technology.

    PubMed

    Eren, A Murat; Vineis, Joseph H; Morrison, Hilary G; Sogin, Mitchell L

    2013-01-01

    Consensus between independent reads improves the accuracy of genome and transcriptome analyses; however, a lack of consensus between very similar sequences in metagenomic studies can, and often does, represent natural variation of biological significance. Machine-assigned quality scores on next-generation platforms do not necessarily correlate with accuracy. Here, we describe using the overlap of paired-end, short sequence reads to identify error-prone reads in marker gene analyses and their contribution to spurious OTUs following clustering analysis using QIIME. Our approach can also reduce error in shotgun sequencing data generated from libraries with small, tightly constrained insert sizes. The open-source implementation of this algorithm in the Python programming language, with user instructions, can be obtained from https://github.com/meren/illumina-utils.
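
    The core of the overlap idea, greatly simplified relative to the published implementation (https://github.com/meren/illumina-utils), is to require the two reads of a fully overlapping pair to agree; disagreement marks an error-prone read. A sketch under that full-overlap assumption:

        def revcomp(seq):
            comp = str.maketrans("ACGT", "TGCA")
            return seq.translate(comp)[::-1]

        def passes_overlap_filter(read1, read2, max_mismatch=0):
            """For fully overlapping paired-end amplicon reads, require the
            forward read and the reverse-complemented reverse read to agree
            at (nearly) every position; disagreements flag sequencing error."""
            r2 = revcomp(read2)
            n = min(len(read1), len(r2))
            mismatches = sum(a != b for a, b in zip(read1[:n], r2[:n]))
            return mismatches <= max_mismatch

        print(passes_overlap_filter("ACGTTGCA", "TGCAACGT"))  # True: reads agree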

  11. Gamification of Clinical Routine: The Dr. Fill Approach.

    PubMed

    Bukowski, Mark; Kühn, Martin; Zhao, Xiaoqing; Bettermann, Ralf; Jonas, Stephan

    2016-01-01

    Gamification is used in clinical contexts in health care education. Furthermore, it has shown great promise for improving the performance of health care staff in their daily routine. In this work we focus on the medication sorting task, which is performed manually in hospitals. This task is very error-prone and must be performed daily; errors in medication sorting are critical and can lead to serious complications. In this work we present a real-world gamification approach to the medication sorting task for a patient's daily pill organizer. The player of the game needs to sort the correct medication into the correct dispenser slots and is rewarded or punished in real time. At the end of the game, a score is given and the user can register on a leaderboard.

  12. SOPRA: Scaffolding algorithm for paired reads via statistical optimization.

    PubMed

    Dayarian, Adel; Michael, Todd P; Sengupta, Anirvan M

    2010-06-24

    High throughput sequencing (HTS) platforms produce gigabases of short read (<100 bp) data per run. While these short reads are adequate for resequencing applications, de novo assembly of moderate size genomes from such reads remains a significant challenge. These limitations could be partially overcome by utilizing mate pair technology, which provides pairs of short reads separated by a known distance along the genome. We have developed SOPRA, a tool designed to exploit the mate pair/paired-end information for assembly of short reads. The main focus of the algorithm is selecting a sufficiently large subset of simultaneously satisfiable mate pair constraints to achieve a balance between the size and the quality of the output scaffolds. Scaffold assembly is presented as an optimization problem for variables associated with vertices and with edges of the contig connectivity graph. Vertices of this graph are individual contigs with edges drawn between contigs connected by mate pairs. Similar graph problems have been invoked in the context of shotgun sequencing and scaffold building for previous generation of sequencing projects. However, given the error-prone nature of HTS data and the fundamental limitations from the shortness of the reads, the ad hoc greedy algorithms used in the earlier studies are likely to lead to poor quality results in the current context. SOPRA circumvents this problem by treating all the constraints on equal footing for solving the optimization problem, the solution itself indicating the problematic constraints (chimeric/repetitive contigs, etc.) to be removed. The process of solving and removing of constraints is iterated till one reaches a core set of consistent constraints. For SOLiD sequencer data, SOPRA uses a dynamic programming approach to robustly translate the color-space assembly to base-space. For assessing the quality of an assembly, we report the no-match/mismatch error rate as well as the rates of various rearrangement errors. Applying SOPRA to real data from bacterial genomes, we were able to assemble contigs into scaffolds of significant length (N50 up to 200 Kb) with very few errors introduced in the process. In general, the methodology presented here will allow better scaffold assemblies of any type of mate pair sequencing data.
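
    The iterate-solve-and-remove strategy can be illustrated with a much-simplified stand-in for the paper's optimization: treat contig coordinates as unknowns, mate-pair separations as soft constraints, solve by least squares, and repeatedly discard the worst-violated constraint (a likely chimeric or repetitive link) until a consistent core remains. All thresholds and names below are assumptions for illustration.

        import numpy as np

        def scaffold_positions(n_contigs, constraints, resid_cut=3.0, max_iter=50):
            """Solve least squares for contig coordinates x under mate-pair
            constraints x[j] - x[i] ~ sep, then iteratively drop the
            constraint with the largest standardized residual and re-solve,
            until the remaining set is mutually consistent."""
            cons = list(constraints)  # each item: (i, j, expected_separation)
            x = np.zeros(n_contigs)
            for _ in range(max_iter):
                A = np.zeros((len(cons) + 1, n_contigs))
                b = np.zeros(len(cons) + 1)
                for r, (i, j, sep) in enumerate(cons):
                    A[r, i], A[r, j], b[r] = -1.0, 1.0, sep
                A[-1, 0] = 1.0                      # pin contig 0 at the origin
                x, *_ = np.linalg.lstsq(A, b, rcond=None)
                resid = A[:-1] @ x - b[:-1]
                if len(resid) == 0 or resid.std() == 0:
                    break
                worst = np.argmax(np.abs(resid))
                if abs(resid[worst]) < resid_cut * resid.std():
                    break                           # consistent core reached
                del cons[worst]
            return x, cons

    The published method works on a contig connectivity graph with variables on vertices and edges and handles orientation and color-space issues; this sketch keeps only the solve-then-prune skeleton.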

  13. Predicted Errors In Children's Early Sentence Comprehension

    PubMed Central

    Gertner, Yael; Fisher, Cynthia

    2012-01-01

    Children use syntax to interpret sentences and learn verbs; this is syntactic bootstrapping. The structure-mapping account of early syntactic bootstrapping proposes that a partial representation of sentence structure, the set of nouns occurring with the verb, guides initial interpretation and provides an abstract format for new learning. This account predicts early successes, but also telltale errors: Toddlers should be unable to tell transitive sentences from other sentences containing two nouns. In testing this prediction, we capitalized on evidence that 21-month-olds use what they have learned about noun order in English sentences to understand new transitive verbs. In two experiments, 21-month-olds applied this noun-order knowledge to two-noun intransitive sentences, mistakenly assigning different interpretations to “The boy and the girl are gorping!” and “The girl and the boy are gorping!”. This suggests that toddlers exploit partial representations of sentence structure to guide sentence interpretation; these sparse representations are useful, but error-prone. PMID:22525312

  14. Landmark-based elastic registration using approximating thin-plate splines.

    PubMed

    Rohr, K; Stiehl, H S; Sprengel, R; Buzug, T M; Weese, J; Kuhn, M H

    2001-06-01

    We consider elastic image registration based on a set of corresponding anatomical point landmarks and approximating thin-plate splines. This approach is an extension of the original interpolating thin-plate spline approach and allows landmark localization errors to be taken into account. The extension is important for clinical applications since landmark extraction is always prone to error. Our approach is based on a minimizing functional and can cope with isotropic as well as anisotropic landmark errors. In particular, in the latter case it is possible to include different types of landmarks, e.g., unique point landmarks as well as arbitrary edge points. Also, the scheme is general with respect to the image dimension and the order of smoothness of the underlying functional. Optimal affine transformations as well as interpolating thin-plate splines are special cases of this scheme. To localize landmarks we use a semi-automatic approach based on three-dimensional (3-D) differential operators. Experimental results are presented for two-dimensional as well as 3-D tomographic images of the human brain.
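
    For the isotropic-error case, approximating thin-plate splines amount to adding a regularization term λI to the interpolation matrix, so the spline smooths rather than interpolates (λ = 0 recovers the interpolating TPS). A minimal 2D sketch with NumPy, under that isotropic assumption:

        import numpy as np

        def tps_approx(src, dst, lam=1.0):
            """Approximating thin-plate spline in 2D: src and dst are (n, 2)
            arrays of corresponding landmarks; lam * I regularizes the
            kernel matrix to absorb landmark localization error."""
            n = src.shape[0]
            d2 = ((src[:, None, :] - src[None, :, :]) ** 2).sum(-1)
            with np.errstate(divide="ignore", invalid="ignore"):
                K = np.where(d2 > 0, 0.5 * d2 * np.log(d2), 0.0)  # U(r) = r^2 log r
            P = np.hstack([np.ones((n, 1)), src])
            A = np.zeros((n + 3, n + 3))
            A[:n, :n] = K + lam * np.eye(n)
            A[:n, n:] = P
            A[n:, :n] = P.T
            rhs = np.vstack([dst, np.zeros((3, 2))])
            sol = np.linalg.solve(A, rhs)
            return sol[:n], sol[n:]          # radial weights, affine part

    λ trades fidelity to the landmarks against smoothness; handling anisotropic errors, as in the paper, would replace λI with a term built from the landmark error covariances.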

  15. Intravenous Chemotherapy Compounding Errors in a Follow-Up Pan-Canadian Observational Study.

    PubMed

    Gilbert, Rachel E; Kozak, Melissa C; Dobish, Roxanne B; Bourrier, Venetia C; Koke, Paul M; Kukreti, Vishal; Logan, Heather A; Easty, Anthony C; Trbovich, Patricia L

    2018-05-01

    Intravenous (IV) compounding safety has garnered recent attention as a result of high-profile incidents, awareness efforts from the safety community, and increasingly stringent practice standards. New research with more-sensitive error detection techniques continues to reinforce that error rates with manual IV compounding are unacceptably high. In 2014, our team published an observational study that described three types of previously unrecognized and potentially catastrophic latent chemotherapy preparation errors in Canadian oncology pharmacies that would otherwise be undetectable. We expand on this research and explore whether additional potential human failures are yet to be addressed by practice standards. Field observations were conducted in four cancer center pharmacies in four Canadian provinces from January 2013 to February 2015. Human factors specialists observed and interviewed pharmacy managers, oncology pharmacists, pharmacy technicians, and pharmacy assistants as they carried out their work. Emphasis was on latent errors (potential human failures) that could lead to outcomes such as wrong drug, dose, or diluent. Given the relatively short observational period, no active failures or actual errors were observed. However, 11 latent errors in chemotherapy compounding were identified. In terms of severity, all 11 errors create the potential for a patient to receive the wrong drug or dose, which in the context of cancer care, could lead to death or permanent loss of function. Three of the 11 practices were observed in our previous study, but eight were new. Applicable Canadian and international standards and guidelines do not explicitly address many of the potentially error-prone practices observed. We observed a significant degree of risk for error in manual mixing practice. These latent errors may exist in other regions where manual compounding of IV chemotherapy takes place. Continued efforts to advance standards, guidelines, technological innovation, and chemical quality testing are needed.

  16. Gender Differences in Self-Conscious Emotions and Motivation to Quit Gambling.

    PubMed

    Kushnir, Vladyslav; Godinho, Alexandra; Hodgins, David C; Hendershot, Christian S; Cunningham, John A

    2016-09-01

    Considerable gender differences have previously been noted in the prevalence, etiology, and clinical features of problem gambling. While differences in affective states between men and women, in particular, may explain differential experiences in the process of gambling, the role of affect in motivations for quitting gambling and recovery has not been thoroughly explored. The aim of this study was to examine gender differences within a sample of problem gamblers motivated to quit with or without formal treatment, and further, to explore the interactions between gender, shame- and guilt-proneness, and autonomous versus controlled reasons for change. Motivation for change and self-conscious emotional traits were analyzed for 207 adult problem gamblers with an interest in quitting or reducing their gambling (96.6% not receiving treatment). Overall, gender differences were not observed in clinical and demographic characteristics. However, women exhibited greater shame-proneness [F(1,204) = 12.11, p = 0.001] and guilt-proneness [F(1,204) = 14.16, p < 0.001] compared to men, whereas men scored higher on trait detachment [F(1,204) = 7.08, p = 0.008]. Controlling for demographic and clinical characteristics, general linear models revealed that autonomous motivation for change was associated with higher guilt-proneness, greater problem gambling severity, and the preparation stage of change, whereas controlled forms of motivation were significantly associated with higher shame-proneness and greater problem gambling severity. No gender effects were observed for either motivation for change. These findings suggest that the process of change can be different for shame-prone and guilt-prone problem gamblers, which may impact behavioral outcomes.

  17. Disruption of N-terminal long-range non-covalent interactions shifts the temperature optimum from 25°C to cold: evolution of a point-mutant Bacillus lipase by error-prone PCR.

    PubMed

    Goomber, Shelly; Kumar, Arbind; Kaur, Jagdeep

    2016-01-15

    Cold-adapted enzymes have applications in detergent, textile, food, bioremediation and biotechnology processes. Bacillus lipases are 'generally recognized as safe' (GRAS) and hence are industrially attractive. Bacillus lipases of subfamily 1.4 have the lowest molecular weights and are reversibly unfolded due to the absence of disulphide bonds. They are therefore widely used to study the energetics of protein stability, which represents the unfolding of the native protein to the fully unfolded state. In the present study, the metagenomically isolated Bacillus lipase LipJ was laboratory-evolved for cold adaptation by error-prone PCR. A library of variants was screened for high relative activity at a low temperature of 10°C compared to the native protein LipJ. A point mutant, sequenced as Phe19→Leu, was found to be active in the cold and was selected for extensive biochemical and biophysical characterization. Variant F19L showed maximum activity at 10°C, where the parent protein LipJ retained only 20% relative activity. The psychrophilic nature of F19L was established, with about 50% relative activity at 5°C, where the native protein was essentially inactive. Variant F19L showed no activity at temperatures of 40°C and above, establishing its thermolabile nature. Thermostability studies determined the mutant to be unstable above 20°C, with a three-fold decrease in its half-life at 30°C compared to the native protein. Far-UV CD and intrinsic fluorescence studies demonstrated an unstable tertiary structure of point variant F19L, leading to its unfolding at a low temperature of 20°C. Cold adaptation of mutant F19L is accompanied by increased specific activity; the mutant was catalytically more efficient, with a 1.3-fold increase in kcat. Homologue structure modelling predicted disruption of the inter-secondary hydrophobic core formed by the aromatic ring of Phe19 with non-polar residues at β3, β4, β5, β6, and αF. Increased local flexibility of variant F19L explains the molecular basis of its psychrophilic nature. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Implementing Access to Data Distributed on Many Processors

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A reference architecture is defined for an object-oriented implementation of domains, arrays, and distributions written in the programming language Chapel. This technology primarily addresses domains that contain arrays that have regular index sets with the low-level implementation details being beyond the scope of this discussion. What is defined is a complete set of object-oriented operators that allows one to perform data distributions for domain arrays involving regular arithmetic index sets. What is unique is that these operators allow for the arbitrary regions of the arrays to be fragmented and distributed across multiple processors with a single point of access giving the programmer the illusion that all the elements are collocated on a single processor. Today's massively parallel High Productivity Computing Systems (HPCS) are characterized by a modular structure, with a large number of processing and memory units connected by a high-speed network. Locality of access as well as load balancing are primary concerns in these systems that are typically used for high-performance scientific computation. Data distributions address these issues by providing a range of methods for spreading large data sets across the components of a system. Over the past two decades, many languages, systems, tools, and libraries have been developed for the support of distributions. Since the performance of data parallel applications is directly influenced by the distribution strategy, users often resort to low-level programming models that allow fine-tuning of the distribution aspects affecting performance, but, at the same time, are tedious and error-prone. This technology presents a reusable design of a data-distribution framework for data parallel high-performance applications. Distributions are a means to express locality in systems composed of large numbers of processor and memory components connected by a network. Since distributions have a great effect on the performance of applications, it is important that the distribution strategy is flexible, so its behavior can change depending on the needs of the application. At the same time, high productivity concerns require that the user be shielded from error-prone, tedious details such as communication and synchronization.
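
    The flavor of such a distribution operator can be conveyed with a toy 1-D block distribution (in Python rather than Chapel, and far simpler than the framework described): a global index space is split into contiguous blocks, and a single accessor hides which processor owns which element.

        class BlockDist:
            """Sketch of a 1-D block distribution: n global indices spread
            over p processors, with a single point of access that hides
            ownership from the caller."""
            def __init__(self, n, p):
                self.n, self.p = n, p
                self.block = -(-n // p)          # ceil(n / p)

            def owner(self, i):
                """Processor that owns global index i."""
                return i // self.block

            def local(self, i):
                """(processor, local index) pair for global index i."""
                return i // self.block, i % self.block

        dist = BlockDist(n=10, p=3)
        print([dist.local(i) for i in range(10)])
        # blocks of four: indices 0-3 on proc 0, 4-7 on proc 1, 8-9 on proc 2

    A real framework layers communication, load balancing, and redistribution behind the same accessor, which is exactly what shields the programmer from the tedious, error-prone details.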

  19. Flexible retrieval: When true inferences produce false memories.

    PubMed

    Carpenter, Alexis C; Schacter, Daniel L

    2017-03-01

    Episodic memory involves flexible retrieval processes that allow us to link together distinct episodes, make novel inferences across overlapping events, and recombine elements of past experiences when imagining future events. However, the same flexible retrieval and recombination processes that underpin these adaptive functions may also leave memory prone to error or distortion, such as source misattributions in which details of one event are mistakenly attributed to another related event. To determine whether the same recombination-related retrieval mechanism supports both successful inference and source memory errors, we developed a modified version of an associative inference paradigm in which participants encoded everyday scenes comprised of people, objects, and other contextual details. These scenes contained overlapping elements (AB, BC) that could later be linked to support novel inferential retrieval regarding elements that had not appeared together previously (AC). Our critical experimental manipulation concerned whether contextual details were probed before or after the associative inference test, thereby allowing us to assess whether (a) false memories increased for successful versus unsuccessful inferences, and (b) any such effects were specific to after compared with before participants received the inference test. In each of 4 experiments that used variants of this paradigm, participants were more susceptible to false memories for contextual details after successful than unsuccessful inferential retrieval, but only when contextual details were probed after the associative inference test. These results suggest that the retrieval-mediated recombination mechanism that underlies associative inference also contributes to source misattributions that result from combining elements of distinct episodes. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Time-symmetric integration in astrophysics

    NASA Astrophysics Data System (ADS)

    Hernandez, David M.; Bertschinger, Edmund

    2018-04-01

    Calculating the long-term solution of ordinary differential equations, such as those of the N-body problem, is central to understanding a wide range of dynamics in astrophysics, from galaxy formation to planetary chaos. Because generally no analytic solution exists to these equations, researchers rely on numerical methods that are prone to various errors. In an effort to mitigate these errors, powerful symplectic integrators have been employed. But symplectic integrators can be severely limited because they are not compatible with adaptive stepping and thus they have difficulty in accommodating changing time and length scales. A promising alternative is time-reversible integration, which can handle adaptive time-stepping, but the errors due to time-reversible integration in astrophysics are less understood. The goal of this work is to study analytically and numerically the errors caused by time-reversible integration, with and without adaptive stepping. We derive the modified differential equations of these integrators to perform the error analysis. As an example, we consider the trapezoidal rule, a reversible non-symplectic integrator, and show that it gives secular energy error increase for a pendulum problem and for a Hénon-Heiles orbit. We conclude that using reversible integration does not guarantee good energy conservation and that, when possible, use of symplectic integrators is favoured. We also show that time-symmetry and time-reversibility are properties that are distinct for an integrator.
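
    The trapezoidal-rule experiment is easy to reproduce in miniature. The sketch below integrates the pendulum (H = p²/2 - cos q) with the time-reversible but non-symplectic trapezoidal rule, solving the implicit step by fixed-point iteration; the abstract reports secular energy-error growth for this kind of setup, which a symplectic leapfrog run at the same step size would not show. Step size and iteration count are illustrative choices.

        import numpy as np

        def pendulum_rhs(y):
            q, p = y
            return np.array([p, -np.sin(q)])     # H = p^2/2 - cos q

        def trapezoidal_step(y, h, iters=8):
            """One step of the reversible, non-symplectic trapezoidal rule,
            y1 = y0 + h/2 (f(y0) + f(y1)), solved by fixed-point iteration."""
            f0, y1 = pendulum_rhs(y), y.copy()
            for _ in range(iters):
                y1 = y + 0.5 * h * (f0 + pendulum_rhs(y1))
            return y1

        def energy(y):
            return 0.5 * y[1] ** 2 - np.cos(y[0])

        y, h = np.array([2.0, 0.0]), 0.1
        e0 = energy(y)
        for _ in range(20000):
            y = trapezoidal_step(y, h)
        print(energy(y) - e0)   # energy drift to compare against a leapfrog run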

  1. Evaluation of exome variants using the Ion Proton Platform to sequence error-prone regions.

    PubMed

    Seo, Heewon; Park, Yoomi; Min, Byung Joo; Seo, Myung Eui; Kim, Ju Han

    2017-01-01

    The Ion Proton sequencer from Thermo Fisher accurately determines sequence variants from target regions with a rapid turnaround time at a low cost. However, misleading variant-calling errors can occur. We performed a systematic evaluation and manual curation of read-level alignments for the 675 ultrarare variants reported by the Ion Proton sequencer from 27 whole-exome sequencing data sets but not present in either the 1000 Genomes Project or the Exome Aggregation Consortium. We classified positive variant calls into 393 highly likely false positives, 126 likely false positives, and 156 likely true positives, which comprised 58.2%, 18.7%, and 23.1% of the variants, respectively. We identified four distinct error patterns of variant calling that may be bioinformatically corrected using different strategies: simplicity region, SNV cluster, peripheral sequence read, and base inversion. Local de novo assembly successfully corrected 201 (38.7%) of the 519 highly likely or likely false positives. We also demonstrate that the two sequencing kits from Thermo Fisher (the Ion PI Sequencing 200 kit V3 and the Ion PI Hi-Q kit) exhibit different error profiles across different error types. A refined calling algorithm with a better polymerase may improve the performance of the Ion Proton sequencing platform.
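
    One of the four error patterns, the SNV cluster, suggests a simple bioinformatic screen: flag variant calls packed so densely that alignment artifacts are more likely than real variation. The window and count thresholds below are illustrative assumptions, not values from the paper.

        def flag_snv_clusters(positions, window=10, min_snvs=3):
            """Flag SNV calls that fall in dense clusters: any run of at
            least min_snvs calls spanning no more than window bases is
            reported as a likely alignment artifact."""
            positions = sorted(positions)
            flagged = set()
            for i in range(len(positions)):
                j = i
                while j + 1 < len(positions) and positions[j + 1] - positions[i] <= window:
                    j += 1
                if j - i + 1 >= min_snvs:
                    flagged.update(positions[i:j + 1])
            return flagged

        print(flag_snv_clusters([100, 104, 107, 500, 900, 905]))  # {100, 104, 107}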

  2. Impact of Standardized Communication Techniques on Errors during Simulated Neonatal Resuscitation.

    PubMed

    Yamada, Nicole K; Fuerch, Janene H; Halamek, Louis P

    2016-03-01

    Current patterns of communication in high-risk clinical situations, such as resuscitation, are imprecise and prone to error. We hypothesized that the use of standardized communication techniques would decrease the errors committed by resuscitation teams during neonatal resuscitation. In a prospective, single-blinded, matched pairs design with block randomization, 13 subjects performed as a lead resuscitator in two simulated complex neonatal resuscitations. Two nurses assisted each subject during the simulated resuscitation scenarios. In one scenario, the nurses used nonstandard communication; in the other, they used standardized communication techniques. The performance of the subjects was scored to determine errors committed (defined relative to the Neonatal Resuscitation Program algorithm), time to initiation of positive pressure ventilation (PPV), and time to initiation of chest compressions (CC). In scenarios in which subjects were exposed to standardized communication techniques, there was a trend toward decreased error rate, time to initiation of PPV, and time to initiation of CC. While not statistically significant, there was a 1.7-second improvement in time to initiation of PPV and a 7.9-second improvement in time to initiation of CC. Should these improvements in human performance be replicated in the care of real newborn infants, they could improve patient outcomes and enhance patient safety.

  3. Suboptimal decision making by children with ADHD in the face of risk: Poor risk adjustment and delay aversion rather than general proneness to taking risks.

    PubMed

    Sørensen, Lin; Sonuga-Barke, Edmund; Eichele, Heike; van Wageningen, Heidi; Wollschlaeger, Daniel; Plessen, Kerstin Jessica

    2017-02-01

    Suboptimal decision making in the face of risk (DMR) in children with attention-deficit hyperactivity disorder (ADHD) may be mediated by deficits in a number of different neuropsychological processes. We investigated DMR in children with ADHD using the Cambridge Gambling Task (CGT) to distinguish difficulties in adjusting to changing probabilities of choice outcomes (so-called risk adjustment) from general risk proneness, and to distinguish these 2 processes from delay aversion (the tendency to choose the least delayed option) and impairments in the ability to reflect on choice options. Based on previous research, we predicted that suboptimal performance on this task in children with ADHD would primarily relate to problems with risk adjustment and delay aversion rather than general risk proneness. Drug-naïve children with ADHD (n = 36), 8 to 12 years, and an age-matched group of typically developing children (n = 34) performed the CGT. As predicted, children with ADHD were not more prone to making risky choices (i.e., risk proneness). However, they had difficulty adjusting to changing risk levels and were more delay aversive, with these 2 effects being correlated. Our findings add to the growing body of evidence that children with ADHD do not favor risk taking per se when performing gambling tasks, but rather may lack the cognitive skills or motivational style to appraise changing patterns of risk effectively. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. How smart is your BEOL? productivity improvement through intelligent automation

    NASA Astrophysics Data System (ADS)

    Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Garetto, Anthony

    2017-07-01

    The back end of line (BEOL) workflow in the mask shop still has crucial issues throughout all standard steps, which are inspection, disposition, photomask repair and verification of repair success. All involved tools are typically run by highly trained operators or engineers who set up jobs and recipes, execute tasks, analyze data and make decisions based on the results. No matter how experienced operators are and how well the systems perform, there is one aspect that always limits the productivity and effectiveness of the operation: the human aspect. Human errors can range from seemingly harmless slip-ups to mistakes with serious and direct economic impact, including mask rejects, customer returns and line stops in the wafer fab. Even with the introduction of quality control mechanisms that help to reduce these critical but unavoidable faults, they can never be completely eliminated. Therefore the mask shop BEOL cannot run in the most efficient manner, as unnecessary time and money are spent on processes that remain labor intensive. The best way to address this issue is to automate critical segments of the workflow that are prone to human error. In fact, manufacturing errors can occur at each BEOL step where operators intervene. These processes comprise image evaluation, setting up tool recipes, data handling and all other tedious but required steps. With the help of smart solutions, operators can work more efficiently and dedicate their time to less mundane tasks. Smart solutions connect tools, taking over the data handling and analysis typically performed by operators and engineers. These solutions not only eliminate the human error factor in the manufacturing process but can also provide benefits in terms of shorter cycle times, reduced bottlenecks and prediction of an optimized workflow. In addition, such software solutions consist of building blocks that seamlessly integrate applications and allow customers to use tailored solutions. To accommodate the variability and complexity in mask shops today, individual workflows can be supported according to the needs of any particular manufacturing line with respect to the necessary measurement and production steps. At the same time, the efficiency of assets is increased by avoiding unneeded cycle time and waste of resources on process steps that are not crucial for a given technology. In this paper we present details of which areas of the BEOL can benefit most from intelligent automation, what solutions exist, and a quantification of the benefits of full automation to a mask shop by the use of a back end of line model.

  5. Group elicitations yield more consistent, yet more uncertain experts in understanding risks to ecosystem services in New Zealand bays

    PubMed Central

    Sinner, Jim; Ellis, Joanne; Kandlikar, Milind; Halpern, Benjamin S.; Satterfield, Terre; Chan, Kai

    2017-01-01

    The elicitation of expert judgment is an important tool for the assessment of risks and impacts in environmental management contexts, and especially important as decision-makers face novel challenges where prior empirical research is lacking or insufficient. Evidence-driven elicitation approaches typically involve techniques to derive more accurate probability distributions under fairly specific contexts. Experts are, however, prone to overconfidence in their judgments. Group elicitations with diverse experts can reduce expert overconfidence by allowing cross-examination and reassessment of prior judgments, but groups are also prone to uncritical "groupthink" errors. When the problem context is underspecified, the probability that experts commit groupthink errors may increase. This study addresses how structured workshops affect variability among experts' responses and certainty within them, in a New Zealand case study. We find that experts' risk estimates before and after a workshop differ, and that group elicitations provided greater consistency of estimates, yet also greater uncertainty among experts, when addressing prominent impacts to four different ecosystem services in coastal New Zealand. After group workshops, experts provided more consistent ranking of risks and more consistent best estimates of impact, through increased clarity in terminology and dampening of extreme positions, yet probability distributions for impacts widened. The results from this case study suggest that group elicitations have favorable consequences for the quality and uncertainty of risk judgments within and across experts, making group elicitation techniques invaluable tools in contexts of limited data. PMID:28767694

  6. Java Performance for Scientific Applications on LLNL Computer Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapfer, C; Wissink, A

    2002-05-10

    Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance, it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.

  7. The use of modified and non-natural nucleotides provide unique insights into pro-mutagenic replication catalyzed by polymerase eta.

    PubMed

    Choi, Jung-Suk; Dasari, Anvesh; Hu, Peter; Benkovic, Stephen J; Berdis, Anthony J

    2016-02-18

This report evaluates the pro-mutagenic behavior of 8-oxo-guanine (8-oxo-G) by quantifying the ability of high-fidelity and specialized DNA polymerases to incorporate natural and modified nucleotides opposite this lesion. Although high-fidelity DNA polymerases such as pol δ and the bacteriophage T4 DNA polymerase replicate 8-oxo-G in an error-prone manner, they display remarkably low efficiencies for translesion DNA synthesis (TLS) compared to normal DNA synthesis. In contrast, pol η shows a combination of high efficiency and low fidelity when replicating 8-oxo-G. These combined properties are consistent with a pro-mutagenic role for pol η when replicating this DNA lesion. Studies using modified nucleotide analogs show that pol η relies heavily on hydrogen-bonding interactions during TLS. However, nucleobase modifications such as alkylation of the N2 position of guanine significantly increase error-prone synthesis catalyzed by pol η when replicating 8-oxo-G. Molecular modeling studies demonstrate the existence of a hydrophobic pocket in pol η that participates in the increased utilization of certain hydrophobic nucleotides. A model is proposed for enhanced pro-mutagenic replication catalyzed by pol η that couples efficient incorporation of damaged nucleotides opposite oxidized DNA lesions created by reactive oxygen species. The biological implications of this model toward increasing mutagenic events in lung cancer are discussed. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Effects of response bias and judgment framing on operator use of an automated aid in a target detection task.

    PubMed

    Rice, Stephen; McCarley, Jason S

    2011-12-01

    Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in operators' cognitive responses to different forms of automation error. The present experiments therefore examined the effects of automation false alarms and misses on human performance under conditions in which the different forms of error were matched in their perceptual characteristics. Young adult participants performed a simulated baggage x-ray screening task while assisted by an automated diagnostic aid. Judgments from the aid were rendered as text messages presented at the onset of each trial, and every trial was followed by a second text message providing response feedback. Thus, misses and false alarms from the aid were matched for their perceptual salience. Experiment 1 found that even under these conditions, false alarms from the aid produced poorer human performance and engendered lower automation use than misses from the aid. Experiment 2, however, found that the asymmetry between misses and false alarms was reduced when the aid's false alarms were framed as neutral messages rather than explicit misjudgments. Results suggest that automation false alarms and misses differ in their inherent cognitive salience and imply that changes in diagnosis framing may allow designers to encourage better use of imperfectly reliable automated aids.

  9. PSO4: a novel gene involved in error-prone repair in Saccharomyces cerevisiae.

    PubMed

    Henriques, J A; Vicente, E J; Leandro da Silva, K V; Schenberg, A C

    1989-09-01

The haploid xs9 mutant, originally selected on the basis of a slight sensitivity to the lethal effect of X-rays, was found to be extremely sensitive to inactivation by 8-methoxypsoralen (8MOP) photoaddition, especially when cells are treated in the G2 phase of the cell cycle. As the xs9 mutation showed no allelism with any of the 3 known pso mutations, it has now been given the name pso4-1. Regarding inactivation, the pso4-1 mutant is also sensitive to mono- (HN1) or bi-functional (HN2) nitrogen mustards, it is slightly sensitive to 254 nm UV radiation (UV), and shows nearly normal sensitivity to 3-carbethoxypsoralen (3-CPs) photoaddition or methyl methanesulfonate (MMS). Regarding mutagenesis, the pso4-1 mutation completely blocks reverse and forward mutations induced by either 8MOP or 3-CPs photoaddition, or by gamma-rays. In the cases of UV, HN1, HN2 or MMS treatments, while reversion induction is still completely abolished, forward mutagenesis is only partially inhibited for UV, HN1, or MMS, and it is unaffected for HN2. Besides severely inhibiting induced mutagenesis, the pso4-1 mutation was found to be semi-dominant, to block sporulation, to abolish the diploid resistance effect, and to block induced mitotic recombination, which indicates that the PSO4 gene is involved in a recombinational pathway of error-prone repair, comparable to the E. coli SOS repair pathway.

  10. The accuracy of self-reported pregnancy-related weight: a systematic review.

    PubMed

    Headen, I; Cohen, A K; Mujahid, M; Abrams, B

    2017-03-01

    Self-reported maternal weight is error-prone, and the context of pregnancy may impact error distributions. This systematic review summarizes error in self-reported weight across pregnancy and assesses implications for bias in associations between pregnancy-related weight and birth outcomes. We searched PubMed and Google Scholar through November 2015 for peer-reviewed articles reporting accuracy of self-reported, pregnancy-related weight at four time points: prepregnancy, delivery, over gestation and postpartum. Included studies compared maternal self-report to anthropometric measurement or medical report of weights. Sixty-two studies met inclusion criteria. We extracted data on magnitude of error and misclassification. We assessed impact of reporting error on bias in associations between pregnancy-related weight and birth outcomes. Women underreported prepregnancy (PPW: -2.94 to -0.29 kg) and delivery weight (DW: -1.28 to 0.07 kg), and over-reported gestational weight gain (GWG: 0.33 to 3 kg). Magnitude of error was small, ranged widely, and varied by prepregnancy weight class and race/ethnicity. Misclassification was moderate (PPW: 0-48.3%; DW: 39.0-49.0%; GWG: 16.7-59.1%), and overestimated some estimates of population prevalence. However, reporting error did not largely bias associations between pregnancy-related weight and birth outcomes. Although measured weight is preferable, self-report is a cost-effective and practical measurement approach. Future researchers should develop bias correction techniques for self-reported pregnancy-related weight. © 2017 World Obesity Federation.

  11. Correcting for Measurement Error in Time-Varying Covariates in Marginal Structural Models.

    PubMed

    Kyle, Ryan P; Moodie, Erica E M; Klein, Marina B; Abrahamowicz, Michał

    2016-08-01

    Unbiased estimation of causal parameters from marginal structural models (MSMs) requires a fundamental assumption of no unmeasured confounding. Unfortunately, the time-varying covariates used to obtain inverse probability weights are often error-prone. Although substantial measurement error in important confounders is known to undermine control of confounders in conventional unweighted regression models, this issue has received comparatively limited attention in the MSM literature. Here we propose a novel application of the simulation-extrapolation (SIMEX) procedure to address measurement error in time-varying covariates, and we compare 2 approaches. The direct approach to SIMEX-based correction targets outcome model parameters, while the indirect approach corrects the weights estimated using the exposure model. We assess the performance of the proposed methods in simulations under different clinically plausible assumptions. The simulations demonstrate that measurement errors in time-dependent covariates may induce substantial bias in MSM estimators of causal effects of time-varying exposures, and that both proposed SIMEX approaches yield practically unbiased estimates in scenarios featuring low-to-moderate degrees of error. We illustrate the proposed approach in a simple analysis of the relationship between sustained virological response and liver fibrosis progression among persons infected with hepatitis C virus, while accounting for measurement error in γ-glutamyltransferase, using data collected in the Canadian Co-infection Cohort Study from 2003 to 2014. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
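
    The core SIMEX recipe (refit the model after adding successively larger doses of simulated measurement error, then extrapolate the parameter back to the zero-error case) can be sketched for the simplest setting of one error-prone covariate in a linear regression. This is a toy illustration only, not the authors' MSM implementation; the known error SD sigma_u, the lambda grid, and the quadratic extrapolant are assumptions made for brevity.

        import numpy as np

        def simex_slope(x_obs, y, sigma_u, lambdas=(0.5, 1.0, 1.5, 2.0), n_sim=200, seed=0):
            """Toy SIMEX for the slope of a simple linear regression with one
            error-prone covariate; sigma_u is the known measurement-error SD."""
            rng = np.random.default_rng(seed)
            lam_grid = [0.0] + list(lambdas)
            mean_slopes = []
            for lam in lam_grid:
                slopes = []
                for _ in range(n_sim if lam > 0 else 1):
                    # simulation step: add extra error with variance lam * sigma_u**2
                    x_sim = x_obs + rng.normal(0.0, np.sqrt(lam) * sigma_u, size=x_obs.shape)
                    slopes.append(np.polyfit(x_sim, y, 1)[0])
                mean_slopes.append(np.mean(slopes))
            # extrapolation step: fit a quadratic in lambda, evaluate at lambda = -1
            return np.polyval(np.polyfit(lam_grid, mean_slopes, 2), -1.0)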

  12. Multiple description distributed image coding with side information for mobile wireless transmission

    NASA Astrophysics Data System (ADS)

    Wu, Min; Song, Daewon; Chen, Chang Wen

    2005-03-01

Multiple description coding (MDC) is a source coding technique that codes the source information into multiple descriptions and then transmits them over different channels in packet networks or error-prone wireless environments, achieving graceful degradation if parts of the descriptions are lost at the receiver. In this paper, we propose a multiple description distributed wavelet zero-tree image coding system for mobile wireless transmission. We provide two innovations to achieve an excellent error-resilient capability. First, when MDC is applied to wavelet subband-based image coding, it is possible to introduce correlation between the descriptions in each subband. We use such correlation, together with any potentially error-corrupted descriptions, as side information in the decoding, formulating MDC decoding as a Wyner-Ziv decoding problem. If only some of the descriptions are lost, their correlation information is still available, and the proposed Wyner-Ziv decoder can recover a lost description by using the correlation information and the error-corrupted descriptions as side information. Second, within each description, single-bitstream wavelet zero-tree coding is very vulnerable to channel errors: the first bit error may cause the decoder to discard all subsequent bits, whether or not they are correctly received. Therefore, we integrate multiple description scalar quantization (MDSQ) with a multiple wavelet-tree image coding method to reduce error propagation. We first group wavelet coefficients into multiple trees according to the parent-child relationship and then code them separately with the SPIHT algorithm to form multiple bitstreams. Such decomposition reduces error propagation and therefore improves the error-correcting capability of the Wyner-Ziv decoder. Experimental results show that the proposed scheme not only exhibits an excellent error-resilient performance but also degrades gracefully as the packet loss rate increases.
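
    The tree-splitting idea behind the second innovation can be illustrated with a minimal sketch: coefficients are organized into quadtrees via the usual parent-child relation, and whole trees are dealt out to descriptions so that a bit error in one bitstream is confined to the trees that bitstream carries. The round-robin assignment below is a simplification of the paper's MDSQ-based scheme, chosen only to show the structure.

        def children(i, j):
            """Quadtree parent-child relation used to build each
            spatial-orientation tree of wavelet coefficients."""
            return [(2 * i, 2 * j), (2 * i, 2 * j + 1),
                    (2 * i + 1, 2 * j), (2 * i + 1, 2 * j + 1)]

        def split_trees(n_roots, n_descriptions=2):
            """Deal tree roots out to descriptions round-robin; each description
            then codes only its own trees (e.g., with SPIHT), so one corrupted
            bitstream cannot propagate errors into the other descriptions."""
            return [list(range(k, n_roots, n_descriptions))
                    for k in range(n_descriptions)]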

  13. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    PubMed

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  14. Body-object interaction ratings for 1,618 monosyllabic nouns.

    PubMed

    Tillotson, Sherri M; Siakaluk, Paul D; Pexman, Penny M

    2008-11-01

    Body-object interaction (BOI) assesses the ease with which a human body can physically interact with a word's referent. Recent research has shown that BOI influences visual word recognition processes in such a way that responses to high-BOI words (e.g., couch) are faster and less error prone than responses to low-BOI words (e.g., cliff). Importantly, the high-BOI words and the low-BOI words that were used in those studies were matched on imageability. In the present study, we collected BOI ratings for a large set of words. BOI ratings, on a 1-7 scale, were obtained for 1,618 monosyllabic nouns. These ratings allowed us to test the generalizability of BOI effects to a large set of items, and they should be useful to researchers who are interested in manipulating or controlling for the effects of BOI. The body-object interaction ratings for this study may be downloaded from the Psychonomic Society's Archive of Norms, Stimuli, and Data, www.psychonomic.org/archive.

  15. Automatic mediastinal lymph node detection in chest CT

    NASA Astrophysics Data System (ADS)

    Feuerstein, Marco; Deguchi, Daisuke; Kitasaka, Takayuki; Iwano, Shingo; Imaizumi, Kazuyoshi; Hasegawa, Yoshinori; Suenaga, Yasuhito; Mori, Kensaku

    2009-02-01

Computed tomography (CT) of the chest is a very common staging investigation for the assessment of mediastinal, hilar, and intrapulmonary lymph nodes in the context of lung cancer. In the current clinical workflow, the detection and assessment of lymph nodes is usually performed manually, which can be error-prone and time-consuming. We therefore propose a method for the automatic detection of mediastinal, hilar, and intrapulmonary lymph node candidates in contrast-enhanced chest CT. Based on the segmentation of important mediastinal anatomy (bronchial tree, aortic arch) and making use of anatomical knowledge, we utilize Hessian eigenvalues to detect lymph node candidates. As lymph nodes can be characterized as blob-like structures of varying size and shape within a specific intensity interval, we can utilize these characteristics to reduce the number of false positive candidates significantly. We applied our method to 5 cases suspected to have lung cancer. The processing time of our algorithm did not exceed 6 minutes, and we achieved an average sensitivity of 82.1% and an average precision of 13.3%.
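
    As a rough 2-D illustration of the Hessian-eigenvalue step (the paper itself works on 3-D CT volumes and adds anatomical constraints), bright blob-like structures are points where both eigenvalues of the Gaussian-smoothed Hessian are negative; the scale sigma and the threshold below are arbitrary illustration values.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def blob_candidates(img, sigma=2.0, thresh=0.5):
            """Mark bright blob-like pixels via the eigenvalues of the Hessian."""
            Hyy = gaussian_filter(img, sigma, order=(2, 0))  # d2/dy2
            Hxx = gaussian_filter(img, sigma, order=(0, 2))  # d2/dx2
            Hxy = gaussian_filter(img, sigma, order=(1, 1))  # d2/dxdy
            tr = Hxx + Hyy
            disc = np.sqrt(((Hxx - Hyy) / 2.0) ** 2 + Hxy ** 2)
            l1, l2 = tr / 2.0 + disc, tr / 2.0 - disc
            # bright blobs curve downward in every direction: both eigenvalues < 0
            blobness = np.where((l1 < 0) & (l2 < 0), np.sqrt(l1 * l2), 0.0)
            return blobness > thresh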

  16. Plexiform neurofibroma tissue classification

    NASA Astrophysics Data System (ADS)

    Weizman, L.; Hoch, L.; Ben Sira, L.; Joskowicz, L.; Pratt, L.; Constantini, S.; Ben Bashat, D.

    2011-03-01

Plexiform Neurofibroma (PN) is a major complication of NeuroFibromatosis-1 (NF1), a common genetic disease involving the nervous system. PNs are peripheral nerve sheath tumors extending along the length of the nerve in various parts of the body. Treatment decisions are based on tumor volume assessment using MRI, which is currently time-consuming and error-prone, with limited semi-automatic segmentation support. We present in this paper a new method for the segmentation and tumor mass quantification of PN from STIR MRI scans. The method starts with a user-based delineation of the tumor area in a single slice and automatically detects the PN lesions in the entire image based on the tumor connectivity. Experimental results on seven datasets yield a mean volume overlap difference of 25% as compared to manual segmentation by an expert radiologist, with a mean computation and interaction time of 12 minutes vs. over an hour for manual annotation. Since the user interaction in the segmentation process is minimal, our method has the potential to successfully become part of the clinical workflow.

  17. LOSITAN: a workbench to detect molecular adaptation based on a Fst-outlier method.

    PubMed

    Antao, Tiago; Lopes, Ana; Lopes, Ricardo J; Beja-Pereira, Albano; Luikart, Gordon

    2008-07-28

Testing for selection is becoming one of the most important steps in the analysis of multilocus population genetics data sets. Existing applications are difficult to use, leaving many non-trivial, error-prone tasks to the user. Here we present LOSITAN, a selection detection workbench based on a well-evaluated Fst-outlier detection method. LOSITAN greatly facilitates correct approximation of model parameters (e.g., genome-wide average, neutral Fst) and provides data import and export functions, iterative contour smoothing, and generation of graphics in an easy-to-use graphical user interface. LOSITAN is able to use modern multi-core processor architectures by locally parallelizing fdist, reducing computation time by half on current dual-core machines, with almost linear performance gains on machines with more cores. LOSITAN makes selection detection feasible for a much wider range of users, even for large population genomic datasets, by providing both an easy-to-use interface and the essential functionality to complete the whole selection detection process.

  18. Mining Genotype-Phenotype Associations from Public Knowledge Sources via Semantic Web Querying

    PubMed Central

    Kiefer, Richard C.; Freimuth, Robert R.; Chute, Christopher G; Pathak, Jyotishman

Gene Wiki Plus (GWP) and the Online Mendelian Inheritance in Man (OMIM) are publicly available resources for sharing information about disease-gene and gene-SNP associations in humans. While immensely useful to the scientific community, both resources are manually curated, thereby making the data entry and publication process time-consuming and, to some degree, error-prone. To this end, this study investigates Semantic Web technologies to validate existing and potentially discover new genotype-phenotype associations in GWP and OMIM. In particular, we demonstrate the applicability of SPARQL queries for identifying associations not explicitly stated for commonly occurring chronic diseases in GWP and OMIM, and report our preliminary findings for the coverage, completeness, and validity of the associations. Our results highlight the benefits of Semantic Web querying technology for validating existing disease-gene associations as well as identifying novel associations, although further evaluation and analysis are required before such information can be applied and used effectively. PMID:24303249
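
    A SPARQL query of the kind the study describes might look as follows. This sketch uses the SPARQLWrapper Python package; the endpoint URL and the vocabulary prefix are purely hypothetical stand-ins, since the study's actual RDF representation of GWP and OMIM is not reproduced here.

        from SPARQLWrapper import SPARQLWrapper, JSON

        sparql = SPARQLWrapper("http://example.org/sparql")  # hypothetical endpoint
        sparql.setQuery("""
            PREFIX ex: <http://example.org/vocab#>           # hypothetical vocabulary
            SELECT ?gene ?phenotype WHERE {
                ?assoc ex:gene ?gene ;
                       ex:phenotype ?phenotype .
                FILTER regex(str(?phenotype), "diabetes", "i")
            }
        """)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["gene"]["value"], "->", row["phenotype"]["value"])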

  19. Recent Advances of Malaria Parasites Detection Systems Based on Mathematical Morphology

    PubMed Central

    Di Ruberto, Cecilia; Kocher, Michel

    2018-01-01

    Malaria is an epidemic health disease and a rapid, accurate diagnosis is necessary for proper intervention. Generally, pathologists visually examine blood stained slides for malaria diagnosis. Nevertheless, this kind of visual inspection is subjective, error-prone and time-consuming. In order to overcome the issues, numerous methods of automatic malaria diagnosis have been proposed so far. In particular, many researchers have used mathematical morphology as a powerful tool for computer aided malaria detection and classification. Mathematical morphology is not only a theory for the analysis of spatial structures, but also a very powerful technique widely used for image processing purposes and employed successfully in biomedical image analysis, especially in preprocessing and segmentation tasks. Microscopic image analysis and particularly malaria detection and classification can greatly benefit from the use of morphological operators. The aim of this paper is to present a review of recent mathematical morphology based methods for malaria parasite detection and identification in stained blood smears images. PMID:29419781

  20. A hybrid computational-experimental approach for automated crystal structure solution

    NASA Astrophysics Data System (ADS)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  1. Adaptive and automatic red blood cell counting method based on microscopic hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Liu, Xi; Zhou, Mei; Qiu, Song; Sun, Li; Liu, Hongying; Li, Qingli; Wang, Yiting

    2017-12-01

Red blood cell counting, as a routine examination, plays an important role in medical diagnoses. Although automated hematology analyzers are widely used, manual microscopic examination by a hematologist or pathologist is still unavoidable, and it is time-consuming and error-prone. This paper proposes a fully automatic red blood cell counting method which is based on microscopic hyperspectral imaging of blood smears and combines spatial and spectral information to achieve high precision. The acquired hyperspectral image data of the blood smear in the visible and near-infrared spectral range are first preprocessed, and then a quadratic blind linear unmixing algorithm is used to obtain endmember abundance images. Based on mathematical morphological operations and an adaptive Otsu's method, a binarization process is performed on the abundance images. Finally, the connected component labeling algorithm with magnification-based parameter setting is applied to automatically select the binary images of red blood cell cytoplasm. Experimental results show that the proposed method performs well and has potential for clinical applications.
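
    The final counting stage (threshold, clean up, label connected components) can be sketched as below for a single abundance image. This is a toy stand-in for the paper's pipeline; the structuring-element radius and the minimum cell area are illustration values, not the magnification-based parameters the authors derive.

        import numpy as np
        from scipy.ndimage import label
        from skimage.filters import threshold_otsu
        from skimage.morphology import binary_opening, disk

        def count_cells(abundance, min_px=50):
            """Otsu threshold -> morphological opening -> count components."""
            mask = binary_opening(abundance > threshold_otsu(abundance), disk(2))
            labels, _ = label(mask)
            sizes = np.bincount(labels.ravel())[1:]   # drop background label 0
            return int((sizes >= min_px).sum())       # ignore sub-cellular specks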

  2. Towards Automatic Image Segmentation Using Optimised Region Growing Technique

    NASA Astrophysics Data System (ADS)

    Alazab, Mamoun; Islam, Mofakharul; Venkatraman, Sitalakshmi

Image analysis is being adopted extensively in many applications such as digital forensics, medical treatment, and industrial inspection, primarily for diagnostic purposes. Hence, there is a growing interest among researchers in developing new segmentation techniques to aid the diagnosis process. Manual segmentation of images is labour intensive, extremely time-consuming, and prone to human error, and hence an automated real-time technique is warranted in such applications. There is no universally applicable automated segmentation technique that will work for all images, as image segmentation is complex and unique to each application domain. Hence, to fill the gap, this paper presents an efficient segmentation algorithm that can segment a digital image of interest into a more meaningful arrangement of regions and objects. Our algorithm combines a region growing approach with optimised elimination of false boundaries to arrive at more meaningful segments automatically. We demonstrate this using X-ray teeth images that were taken for real-life dental diagnosis.
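
    A minimal version of the region-growing core (without the paper's optimised false-boundary elimination) grows a 4-connected region from a seed while candidate pixels stay within a tolerance of the running region mean; the tolerance value is an assumption for illustration.

        import numpy as np
        from collections import deque

        def region_grow(img, seed, tol=10.0):
            """Grow a 4-connected region from `seed` (row, col) by admitting
            neighbours within `tol` of the current region mean."""
            mask = np.zeros(img.shape, dtype=bool)
            mask[seed] = True
            total, count = float(img[seed]), 1
            queue = deque([seed])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                            and not mask[ny, nx]
                            and abs(float(img[ny, nx]) - total / count) <= tol):
                        mask[ny, nx] = True
                        total += float(img[ny, nx])
                        count += 1
                        queue.append((ny, nx))
            return mask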

  3. Recent Advances of Malaria Parasites Detection Systems Based on Mathematical Morphology.

    PubMed

    Loddo, Andrea; Di Ruberto, Cecilia; Kocher, Michel

    2018-02-08

    Malaria is an epidemic health disease and a rapid, accurate diagnosis is necessary for proper intervention. Generally, pathologists visually examine blood stained slides for malaria diagnosis. Nevertheless, this kind of visual inspection is subjective, error-prone and time-consuming. In order to overcome the issues, numerous methods of automatic malaria diagnosis have been proposed so far. In particular, many researchers have used mathematical morphology as a powerful tool for computer aided malaria detection and classification. Mathematical morphology is not only a theory for the analysis of spatial structures, but also a very powerful technique widely used for image processing purposes and employed successfully in biomedical image analysis, especially in preprocessing and segmentation tasks. Microscopic image analysis and particularly malaria detection and classification can greatly benefit from the use of morphological operators. The aim of this paper is to present a review of recent mathematical morphology based methods for malaria parasite detection and identification in stained blood smears images.
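
    As a flavor of the morphological preprocessing this review covers, a typical cleanup of a thresholded smear image combines an opening (to break thin debris) with a small-object filter. The structuring-element radius and area cut-off below are arbitrary illustration values, not taken from any surveyed method.

        from skimage.filters import threshold_otsu
        from skimage.morphology import binary_opening, disk, remove_small_objects

        def candidate_objects(gray):
            """Segment dark stained objects, then clean the binary mask
            with standard morphological operators."""
            mask = gray < threshold_otsu(gray)        # stained objects are dark
            mask = binary_opening(mask, disk(3))      # remove thin artifacts
            return remove_small_objects(mask, min_size=500)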

  4. [Control levels of Sin3 histone deacetylase for spontaneous and UV-induced mutagenesis in yeasts Saccharomyces cerevisiae].

    PubMed

    Lebovka, I Iu; Kozhina, T N; Fedorova, I V; Peshekhonov, V T; Evstiukhina, T A; Chernenkov, A Iu; Korolev, V G

    2014-01-01

The SIN3 gene product operates as a repressor for a large number of genes in Saccharomyces cerevisiae. The Sin3 protein, with a mass of about 175 kDa, is a member of the RPD3 protein complex, whose mass is assessed at greater than 2 million Da. It was previously shown that RPD3 gene mutations influence recombination and repair processes in S. cerevisiae yeasts. We studied the impact of the sin3 mutation on UV-light sensitivity and UV-induced mutagenesis in budding yeast cells. Deletion of the SIN3 gene causes weak UV sensitivity of mutant budding cells as compared to the wild-type strain. These results show that the sin3 mutation decreases both spontaneous and UV-induced levels of mutagenesis. This fact is hypothetically related to the malfunction of ribonucleotide reductase activity regulation, which leads to a decrease in the dNTP pool and impairs the inaccurate, error-prone damage-bypass postreplication repair pathway, which in turn provokes a reduction in the incidence of mutations.

  5. Adaptive Strategies for the Elderly in Inhibiting Irrelevant and Conflict No-Go Trials while Performing the Go/No-Go Task

    PubMed Central

    Hsieh, Shulan; Wu, Mengyao; Tang, Chien-Hui

    2016-01-01

    This study aimed to differentiate whether or not older adults are more prone to distraction or conflict, as induced by irrelevant and conflict no-go stimuli (irNOGO and cfNOGO), respectively. This study also aimed to determine whether or not older adults would devote more effort to withholding a no-go trial in the higher-control demand condition (20% no-go trials’ probability) as compared to the lower-control demand condition (50 and 80% no-go trials’ probability). A total of 96 individuals were recruited, and each of the three no-go trials’ probability conditions included 32 participants (16 younger adults and 16 older adults). Both behavioral and event-related potential (ERP) data were measured. The behavioral results showed that the older adults performed more poorly than the younger adults for the go trials, as reflected by slower reaction times (RTs) and higher numbers of omission errors in the go trials. In contrast, in the no-go trials, the older adults counter-intuitively exhibited similar behavioral performance (i.e., equivalent commission errors) as compared to the younger adults. The ERP data further showed that the older adults (but not the younger adults) exhibited larger P3 peak amplitudes for the irNOGO than cfNOGO trials. Yet, on the other hand, the older adults performed more poorly (i.e., had more commission errors) in the cfNOGO than irNOGO trials. These results seem to suggest that the older adults recruited more control processes in order to conquer the commitment of responses in the no-go trials, especially in the irNOGO trials. This age-related compensatory response of recruiting more control processes was specifically seen in the 20% no-go trial probability condition. This study therefore provides a deeper understanding into how older adults adopt strategies for performing the go/no-go task such as devoting more control processes to inhibiting the irNOGO trials compared to the cfNOGO trials in order to cope with their deficient inhibition ability. PMID:26779012

  6. Informed baseline subtraction of proteomic mass spectrometry data aided by a novel sliding window algorithm.

    PubMed

    Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J

    2016-01-01

Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship, resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of magnitude on many simulated datasets. The advantages of the proposed pipeline include informed and data-specific input arguments for baseline subtraction methods, the avoidance of time-intensive and subjective piecewise baseline subtraction, and the ability to automate baseline subtraction completely. Moreover, individual steps can be adopted as stand-alone routines.
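
    For orientation, the naive sliding-window baseline that the paper's 'continuous' line segment algorithm outperforms can be written in a few lines: at each point, take the minimum intensity in a fixed window and subtract it. The fixed half-window below is an illustration value; the paper instead derives window widths from estimated peak widths on a transformed m/z axis.

        import numpy as np

        def sliding_min_baseline(intensity, half_window=250):
            """Naive O(n * w) baseline: local minimum in a +/- half_window
            slice, subtracted from the signal."""
            n = len(intensity)
            baseline = np.empty(n)
            for i in range(n):
                lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
                baseline[i] = intensity[lo:hi].min()
            return intensity - baseline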

  7. A dose error evaluation study for 4D dose calculations

    NASA Astrophysics Data System (ADS)

    Milz, Stefan; Wilkens, Jan J.; Ullrich, Wolfgang

    2014-10-01

Previous studies have shown that respiration induced motion is not negligible for Stereotactic Body Radiation Therapy. The intrafractional breathing induced motion influences the delivered dose distribution on the underlying patient geometry such as the lung or the abdomen. If a static geometry is used, a planning process for these indications does not represent the entire dynamic process. The quality of a full 4D dose calculation approach depends on the dose coordinate transformation process between deformable geometries. This article provides an evaluation study that introduces an advanced method to verify the quality of numerical dose transformation generated by four different algorithms. The used transformation metric value is based on the deviation of the dose mass histogram (DMH) and the mean dose throughout dose transformation. The study compares the results of four algorithms. In general, two elementary approaches are used: dose mapping and energy transformation. Dose interpolation (DIM) and an advanced concept, so called divergent dose mapping model (dDMM), are used for dose mapping. The algorithms are compared to the basic energy transformation model (bETM) and the energy mass congruent mapping (EMCM). For evaluation 900 small sample regions of interest (ROI) are generated inside an exemplary lung geometry (4DCT). A homogeneous fluence distribution is assumed for dose calculation inside the ROIs. The dose transformations are performed with the four different algorithms. The study investigates the DMH-metric and the mean dose metric for different scenarios (voxel sizes: 8 mm, 4 mm, 2 mm, 1 mm; 9 different breathing phases). dDMM achieves the best transformation accuracy in all measured test cases with 3-5% lower errors than the other models. The results of dDMM are reasonable and most efficient in this study, although the model is simple and easy to implement. The EMCM model also achieved suitable results, but the approach requires a more complex programming structure. The study discloses disadvantages for the bETM and for the DIM. DIM yielded insufficient results for large voxel sizes, while bETM is prone to errors for small voxel sizes.

  8. A dose error evaluation study for 4D dose calculations.

    PubMed

    Milz, Stefan; Wilkens, Jan J; Ullrich, Wolfgang

    2014-11-07

Previous studies have shown that respiration induced motion is not negligible for Stereotactic Body Radiation Therapy. The intrafractional breathing induced motion influences the delivered dose distribution on the underlying patient geometry such as the lung or the abdomen. If a static geometry is used, a planning process for these indications does not represent the entire dynamic process. The quality of a full 4D dose calculation approach depends on the dose coordinate transformation process between deformable geometries. This article provides an evaluation study that introduces an advanced method to verify the quality of numerical dose transformation generated by four different algorithms. The used transformation metric value is based on the deviation of the dose mass histogram (DMH) and the mean dose throughout dose transformation. The study compares the results of four algorithms. In general, two elementary approaches are used: dose mapping and energy transformation. Dose interpolation (DIM) and an advanced concept, so called divergent dose mapping model (dDMM), are used for dose mapping. The algorithms are compared to the basic energy transformation model (bETM) and the energy mass congruent mapping (EMCM). For evaluation 900 small sample regions of interest (ROI) are generated inside an exemplary lung geometry (4DCT). A homogeneous fluence distribution is assumed for dose calculation inside the ROIs. The dose transformations are performed with the four different algorithms. The study investigates the DMH-metric and the mean dose metric for different scenarios (voxel sizes: 8 mm, 4 mm, 2 mm, 1 mm; 9 different breathing phases). dDMM achieves the best transformation accuracy in all measured test cases with 3-5% lower errors than the other models. The results of dDMM are reasonable and most efficient in this study, although the model is simple and easy to implement. The EMCM model also achieved suitable results, but the approach requires a more complex programming structure. The study discloses disadvantages for the bETM and for the DIM. DIM yielded insufficient results for large voxel sizes, while bETM is prone to errors for small voxel sizes.
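
    The mean-dose half of the evaluation metric reduces to a mass-weighted average computed before and after the transformation; a minimal sketch (with placeholder arrays, not study data) is:

        import numpy as np

        def mean_dose(dose, mass):
            """Mass-weighted mean dose over a set of voxels or ROIs."""
            return float(np.sum(dose * mass) / np.sum(mass))

        # a good transformation should leave the mass-weighted mean dose
        # (and the dose mass histogram) nearly unchanged
        d_src, m_src = np.array([2.0, 2.1, 1.9]), np.array([1.0, 1.2, 0.8])
        d_dst, m_dst = np.array([2.05, 1.95, 2.0]), np.array([1.1, 1.0, 0.9])
        rel_dev = abs(mean_dose(d_dst, m_dst) - mean_dose(d_src, m_src)) / mean_dose(d_src, m_src)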

  9. Proneness to guilt, shame, and pride in children with Autism Spectrum Disorders and neurotypical children.

    PubMed

    Davidson, Denise; Hilvert, Elizabeth; Misiunaite, Ieva; Giordano, Michael

    2018-06-01

    Self-conscious emotions (e.g., guilt, shame, and pride) are complex emotions that require self-reflection and self-evaluation, and are thought to facilitate the maintenance of societal norms and personal standards. Despite the importance of self-conscious emotions, most research has focused on basic emotion processing in children with Autism Spectrum Disorders (ASD). Therefore, in the present study, we used the Test of Self-Conscious Affect for Children (TOSCA-C) to assess proneness to, or propensity to experience, the self-conscious emotions guilt, shame, and pride in children with ASD and neurotypical children. The TOSCA-C is designed to capture a child's natural tendency to experience a given emotion across a range of everyday situations [Tangney, Stuewig, & Mashek, 2007]. We also assessed how individual characteristics contribute to the development of proneness to self-conscious emotions, including theory of mind (ToM) and ASD symptomatology. In comparison to neurotypical children, children with ASD showed less proneness to guilt, although all children showed relatively high levels of proneness to guilt. Greater ToM ability was related to more proneness to guilt and authentic pride in children with ASD. Additionally, we found that children with ASD with more severe symptomatology were more prone to hubristic pride. Our results provide evidence of differences in proneness to self-conscious emotions in children with ASD, as well as highlight important mechanisms contributing to how children with ASD may experience self-conscious emotions. Autism Res 2018,11:883-892. ©2017 International Society for Autism Research, Wiley Periodicals, Inc. This research examined proneness to guilt, shame, and pride in children with Autism Spectrum Disorders (ASD) and neurotypical children. We found that children with ASD showed less proneness to guilt than neurotypical children. Better understanding of theory of mind was related to greater proneness to guilt and pride, but only for children with ASD. These findings are important because these complex emotions are linked with both positive and negative social behaviors towards others and oneself. © 2018 International Society for Autism Research, Wiley Periodicals, Inc.

  10. Assessing dangerous driving behavior during driving inattention: Psychometric adaptation and validation of the Attention-Related Driving Errors Scale in China.

    PubMed

    Qu, Weina; Ge, Yan; Zhang, Qian; Zhao, Wenguo; Zhang, Kan

    2015-07-01

    Driver inattention is a significant cause of motor vehicle collisions and incidents. The purpose of this study was to translate the Attention-Related Driving Error Scale (ARDES) into Chinese and to verify its reliability and validity. A total of 317 drivers completed the Chinese version of the ARDES, the Dula Dangerous Driving Index (DDDI), the Attention-Related Cognitive Errors Scale (ARCES) and the Mindful Attention Awareness Scale (MAAS) questionnaires. Specific sociodemographic variables and traffic violations were also measured. Psychometric results confirm that the ARDES-China has adequate psychometric properties (Cronbach's alpha=0.88) to be a useful tool for evaluating proneness to attentional errors in the Chinese driving population. First, ARDES-China scores were positively correlated with both DDDI scores and number of accidents in the prior year; in addition, ARDES-China scores were a significant predictor of dangerous driving behavior as measured by DDDI. Second, we found that ARDES-China scores were strongly correlated with ARCES scores and negatively correlated with MAAS scores. Finally, different demographic groups exhibited significant differences in ARDES scores; in particular, ARDES scores varied with years of driving experience. Copyright © 2015 Elsevier Ltd. All rights reserved.
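
    The reported internal consistency (Cronbach's alpha = 0.88) is the standard statistic alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A small sketch of its computation, assuming a respondents-by-items score matrix:

        import numpy as np

        def cronbach_alpha(scores):
            """Cronbach's alpha for an (n_respondents, k_items) matrix."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_vars / total_var)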

  11. Study of Current Measurement Method Based on Circular Magnetic Field Sensing Array

    PubMed Central

    Li, Zhenhua; Zhang, Siqiu; Wu, Zhengtian; Tao, Yuan

    2018-01-01

Classic core-based instrument transformers are more prone to magnetic saturation. This affects the measurement accuracy of such transformers and limits their applications in measuring large direct current (DC). Moreover, protection and control systems may exhibit malfunctions due to such measurement errors. This paper presents a more accurate method for current measurement based on a circular magnetic field sensing array. The proposed measurement approach utilizes multiple Hall sensors that are evenly distributed on a circle. The average value of all Hall sensors is regarded as the final measurement. The calculation model is established in the case of magnetic field interference from a parallel wire, and the simulation results show that the error decreases significantly when the number of Hall sensors n is greater than 8. The measurement error is less than 0.06% when the wire spacing is greater than 2.5 times the radius of the sensor array. A simulation study on the off-center primary conductor is conducted, and a Hall sensor compensation method is adopted to improve the accuracy. The simulation and test results indicate that the measurement error of the system is less than 0.1%. PMID:29734742

  12. Study of Current Measurement Method Based on Circular Magnetic Field Sensing Array.

    PubMed

    Li, Zhenhua; Zhang, Siqiu; Wu, Zhengtian; Abu-Siada, Ahmed; Tao, Yuan

    2018-05-05

Classic core-based instrument transformers are more prone to magnetic saturation. This affects the measurement accuracy of such transformers and limits their applications in measuring large direct current (DC). Moreover, protection and control systems may exhibit malfunctions due to such measurement errors. This paper presents a more accurate method for current measurement based on a circular magnetic field sensing array. The proposed measurement approach utilizes multiple Hall sensors that are evenly distributed on a circle. The average value of all Hall sensors is regarded as the final measurement. The calculation model is established in the case of magnetic field interference from a parallel wire, and the simulation results show that the error decreases significantly when the number of Hall sensors n is greater than 8. The measurement error is less than 0.06% when the wire spacing is greater than 2.5 times the radius of the sensor array. A simulation study on the off-center primary conductor is conducted, and a Hall sensor compensation method is adopted to improve the accuracy. The simulation and test results indicate that the measurement error of the system is less than 0.1%.
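
    The averaging principle follows from a discretized Ampere's law: with n equally spaced tangential field samples on a circle of radius r, the enclosed current is I = (2*pi*r / mu0) * mean(B_t), and the contribution of external fields such as a parallel wire largely cancels in the mean. A minimal sketch, with an idealized centered conductor as the usage example:

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

        def current_from_ring(b_tangential, radius):
            """Estimate the enclosed current from tangential flux-density samples
            taken at equally spaced points on a circle (Ampere's law)."""
            return 2.0 * np.pi * radius / MU0 * float(np.mean(b_tangential))

        # 8 ideal sensor readings on a 30 mm ring around a centered 100 A wire
        r = 0.03
        b = MU0 * 100.0 / (2.0 * np.pi * r) * np.ones(8)
        print(current_from_ring(b, r))  # ~100.0 A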

  13. A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks

    PubMed Central

    Costa, Daniel G.; Guedes, Luiz Affonso

    2011-01-01

    Visual sensor networks (VSNs) comprised of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. Those types of sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay and packet error rates. In such context, multimedia coding is required for data compression and error-resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSNs applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908

  14. Influence of Errors in Tactile Sensors on Some High Level Parameters Used for Manipulation with Robotic Hands.

    PubMed

    Sánchez-Durán, José A; Hidalgo-López, José A; Castellanos-Ramos, Julián; Oballe-Peinado, Óscar; Vidal-Verdú, Fernando

    2015-08-19

    Tactile sensors suffer from many types of interference and errors like crosstalk, non-linearity, drift or hysteresis, therefore calibration should be carried out to compensate for these deviations. However, this procedure is difficult in sensors mounted on artificial hands for robots or prosthetics for instance, where the sensor usually bends to cover a curved surface. Moreover, the calibration procedure should be repeated often because the correction parameters are easily altered by time and surrounding conditions. Furthermore, this intensive and complex calibration could be less determinant, or at least simpler. This is because manipulation algorithms do not commonly use the whole data set from the tactile image, but only a few parameters such as the moments of the tactile image. These parameters could be changed less by common errors and interferences, or at least their variations could be in the order of those caused by accepted limitations, like reduced spatial resolution. This paper shows results from experiments to support this idea. The experiments are carried out with a high performance commercial sensor as well as with a low-cost error-prone sensor built with a common procedure in robotics.
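
    The "moments of the tactile image" the paper refers to are ordinary image moments of the pressure map. A minimal sketch of the zeroth and first raw moments and the resulting contact centroid, which is the kind of summary a manipulation algorithm would consume instead of the full image:

        import numpy as np

        def tactile_moments(pressure):
            """Raw moments m00, m10, m01 and centroid of a 2-D pressure map."""
            y, x = np.mgrid[:pressure.shape[0], :pressure.shape[1]]
            m00 = pressure.sum()
            m10, m01 = (x * pressure).sum(), (y * pressure).sum()
            return m00, (m10 / m00, m01 / m00)  # total load, centroid (cx, cy)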

  15. [Factors Related to Presenteeism in Young and Middle-aged Nurses].

    PubMed

    Yoshida, Mami; Miki, Akiko

    2018-04-03

Presenteeism is considered to be not only a work-related stressor but also a factor involved in the development of workaholism and error proneness, which is often described as carelessness. Additionally, increasing health issues arising from aging suggest the possibility that presenteeism in middle-aged nurses is different from that in young ones. Therefore, the present study aimed to identify and tease apart the factors involved in presenteeism among young and middle-aged nurses. An anonymous self-administered questionnaire survey was conducted among 2,006 nurses working at 10 hospitals. In total, 761 nurses aged <40 years and 536 nurses aged ≥40 years were enrolled in this study. Work Impairment Scores (WIS) on the Japanese version of the Stanford Presenteeism Scale were measured for presenteeism. Job stressors, workaholism, and error proneness were measured as related factors. Multiple regression analysis was conducted with the WIS as the dependent variable and the related factors as independent variables. Overall, 70.8% of the young nurses reported health problems compared to 82.5% of the middle-aged nurses. However, WIS in young nurses was significantly higher than that in middle-aged ones (p < 0.001). WIS in young nurses showed a significant relationship with the degree of the stressor "difficulty of work" (β = 0.28, p < 0.001), the tendency to "work excessively" (β = 0.18, p < 0.001), which is a subscale of workaholism, and the error proneness subscales "action slips" (β = 0.14, p < 0.01) and "cognitive narrowing" (β = 0.11, p < 0.05). Conversely, WIS in middle-aged nurses showed a significant relationship with "cognitive narrowing" (β = 0.29, p < 0.001), the tendency to "work excessively" (β = 0.17, p < 0.001), the degree of the stressor "difficulty of work" (β = 0.12, p < 0.05), and "lack of communication" (β = 0.13, p < 0.01). It was clarified that the increased health problems of middle-aged nurses do not necessarily lower their working capacity. Also, compared to young nurses, the degree of error proneness, rather than the degree of job stressors, was more strongly related to presenteeism for middle-aged nurses. It can be considered that middle-aged nurses simply realize that their working ability is hindered because of incidents resulting from attention narrowing. As fatigue and states of tension tend to cause narrowing of attention, it may be necessary to reduce such risks and adjust work environments so mistakes can be avoided.

  16. Challenges and a checklist for biodiversity conservation in fire-prone forests: perspecitves from the Pacific Northwest of USA and Southeastern Australia

    Treesearch

    Thomas A. Spies; David B. Lindenmayer; A. Malcolm Gill; Scott L. Stephens; James K. Agee

    2012-01-01

    Conserving biodiversity in fire-prone forest ecosystems is challenging for several reasons including differing and incomplete conceptual models of fire-related ecological processes, major gaps in ecological and management knowledge, high variability in fire behavior and ecological responses to fires, altered fire regimes as a result of land-use history and climate...

  17. A probabilistic approach to remote compositional analysis of planetary surfaces

    USGS Publications Warehouse

    Lapotre, Mathieu G.A.; Ehlmann, Bethany L.; Minson, Sarah E.

    2017-01-01

    Reflected light from planetary surfaces provides information, including mineral/ice compositions and grain sizes, by study of albedo and absorption features as a function of wavelength. However, deconvolving the compositional signal in spectra is complicated by the nonuniqueness of the inverse problem. Trade-offs between mineral abundances and grain sizes in setting reflectance, instrument noise, and systematic errors in the forward model are potential sources of uncertainty, which are often unquantified. Here we adopt a Bayesian implementation of the Hapke model to determine sets of acceptable-fit mineral assemblages, as opposed to single best fit solutions. We quantify errors and uncertainties in mineral abundances and grain sizes that arise from instrument noise, compositional end members, optical constants, and systematic forward model errors for two suites of ternary mixtures (olivine-enstatite-anorthite and olivine-nontronite-basaltic glass) in a series of six experiments in the visible-shortwave infrared (VSWIR) wavelength range. We show that grain sizes are generally poorly constrained from VSWIR spectroscopy. Abundance and grain size trade-offs lead to typical abundance errors of ≤1 wt % (occasionally up to ~5 wt %), while ~3% noise in the data increases errors by up to ~2 wt %. Systematic errors further increase inaccuracies by a factor of 4. Finally, phases with low spectral contrast or inaccurate optical constants can further increase errors. Overall, typical errors in abundance are <10%, but sometimes significantly increase for specific mixtures, prone to abundance/grain-size trade-offs that lead to high unmixing uncertainties. These results highlight the need for probabilistic approaches to remote determination of planetary surface composition.
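
    The "sets of acceptable-fit assemblages" idea can be illustrated with a toy Metropolis sampler for a linear two-endmember mixture. The real study samples a nonlinear Hapke model with grain sizes as additional parameters, so everything below (linear mixing, the noise level sigma, the proposal width) is a simplifying assumption made only to show the Bayesian approach.

        import numpy as np

        def posterior_abundance(e1, e2, observed, sigma=0.01, n_iter=20000, seed=1):
            """Metropolis sampling of the abundance a in a*e1 + (1-a)*e2,
            returning the whole set of acceptable fits, not one best fit."""
            rng = np.random.default_rng(seed)

            def log_post(a):  # Gaussian likelihood, uniform prior on [0, 1]
                resid = observed - (a * e1 + (1.0 - a) * e2)
                return -np.sum(resid ** 2) / (2.0 * sigma ** 2)

            a = 0.5
            lp = log_post(a)
            samples = []
            for _ in range(n_iter):
                prop = a + rng.normal(0.0, 0.05)
                if 0.0 <= prop <= 1.0:           # reject proposals outside the prior
                    lp_prop = log_post(prop)
                    if np.log(rng.random()) < lp_prop - lp:
                        a, lp = prop, lp_prop
                samples.append(a)
            return np.array(samples[n_iter // 2:])  # discard burn-in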

  18. Efficacy of prone position in acute respiratory distress syndrome patients: A pathophysiology-based review

    PubMed Central

    Koulouras, Vasilios; Papathanakos, Georgios; Papathanasiou, Athanasios; Nakos, Georgios

    2016-01-01

Acute respiratory distress syndrome (ARDS) is a syndrome with heterogeneous underlying pathological processes. It represents a common clinical problem in intensive care unit patients and it is characterized by high mortality. The mainstay of treatment for ARDS is lung protective ventilation with low tidal volumes and positive end-expiratory pressure sufficient for alveolar recruitment. Prone positioning is a supplementary strategy available in managing patients with ARDS. It was first described 40 years ago and it proves to be in alignment with two major ARDS pathophysiological lung models: the “sponge lung” model and the “shape matching” model. Current evidence strongly supports that prone positioning has beneficial effects on gas exchange, respiratory mechanics, lung protection and hemodynamics as it redistributes transpulmonary pressure, stress and strain throughout the lung and unloads the right ventricle. The factors that individually influence the time course of alveolar recruitment and the improvement in oxygenation during prone positioning have not been well characterized. Although patients’ response to prone positioning is quite variable and hard to predict, large randomized trials and recent meta-analyses show that prone position in conjunction with a lung-protective strategy, when performed early and in sufficient duration, may improve survival in patients with ARDS. This pathophysiology-based review and recent clinical evidence strongly support the use of prone positioning in the early management of severe ARDS systematically and not as a rescue maneuver or a last-ditch effort. PMID:27152255

  19. Adaptive constructive processes and memory accuracy: Consequences of counterfactual simulations in young and older adults

    PubMed Central

    Gerlach, Kathy D.; Dornblaser, David W.; Schacter, Daniel L.

    2013-01-01

    People frequently engage in counterfactual thinking: mental simulations of alternative outcomes to past events. Like simulations of future events, counterfactual simulations serve adaptive functions. However, future simulation can also result in various kinds of distortions and has thus been characterized as an adaptive constructive process. Here we approach counterfactual thinking as such and examine whether it can distort memory for actual events. In Experiments 1a/b, young and older adults imagined themselves experiencing different scenarios. Participants then imagined the same scenario again, engaged in no further simulation of a scenario, or imagined a counterfactual outcome. On a subsequent recognition test, participants were more likely to make false alarms to counterfactual lures than novel scenarios. Older adults were more prone to these memory errors than younger adults. In Experiment 2, younger and older participants selected and performed different actions, then recalled performing some of those actions, imagined performing alternative actions to some of the selected actions, and did not imagine others. Participants, especially older adults, were more likely to falsely remember counterfactual actions than novel actions as previously performed. The findings suggest that counterfactual thinking can cause source confusion based on internally generated misinformation, consistent with its characterization as an adaptive constructive process. PMID:23560477

  20. Freeze drying of orally disintegrating tablets containing taste masked naproxen sodium granules in blisters.

    PubMed

    Stange, Ulrike; Führling, Christian; Gieseler, Henning

    2014-09-15

Orally disintegrating tablets (ODTs) were freeze dried in blisters using the Lyostar® II SMART™ Freeze Dryer Technology. ODT formulations either without non-water-soluble particles (placebo) or containing large fractions (717 mg) of taste-masked naproxen sodium (NaS) granules were freeze dried. The process data revealed differences between ODTs with and without embedded granules in the pressure rise curves as well as in the shelf (inlet) temperature adjustments during freeze-drying. Pressure rise curves of the placebo ODTs from eight hours of process time showed no distinct temperature-dominated part, and the last optimization step of the shelf temperature to achieve -24.4 °C might be prone to errors. The final shelf temperature of ODTs containing granules was -23.3 °C. The detection of primary drying endpoints using SMART™ Technology or comparative pressure measurements was reliable for both ODT formulations, whereas the application of thermocouples resulted in premature endpoint indication. Product resistance of ODTs containing granules was generally elevated in comparison to ODTs without granules, but increased only slightly over the course of the drying process. In summary, the developed freeze-drying cycle was found applicable for the production of elegant ODTs with incorporated taste-masked NaS granules.

  1. A Web-Based Tool for Automatic Data Collection, Curation, and Visualization of Complex Healthcare Survey Studies including Social Network Analysis.

    PubMed

    Benítez, José Alberto; Labra, José Emilio; Quiroga, Enedina; Martín, Vicente; García, Isaías; Marqués-Sánchez, Pilar; Benavides, Carmen

    2017-01-01

    There is great concern nowadays regarding alcohol consumption and drug abuse, especially among young people. By analyzing the social environment in which these adolescents are immersed, together with a series of measures of alcohol abuse risk and of personal situation and perception obtained from questionnaires such as AUDIT, FAS, and KIDSCREEN, it is possible to gain insight into the current situation of a given individual regarding his/her consumption behavior. Achieving this analysis, however, requires tools that ease the processes of questionnaire creation; data gathering, curation, and representation; and later analysis and visualization for the user. This research presents the design and construction of a web-based platform that facilitates each of these processes by integrating the different phases into an intuitive system with a graphical user interface. The system hides the complexity underlying the questionnaires and techniques used and presents the results in a flexible and visual way, avoiding any manual handling of data during the process. Advantages of this approach are shown and compared to the previous situation, in which some of the tasks were accomplished by time-consuming and error-prone manipulations of data.

  2. Proximity to AGCT sequences dictates MMR-independent versus MMR-dependent mechanisms for AID-induced mutation via UNG2

    PubMed Central

    Thientosapol, Eddy Sanchai; Sharbeen, George; Lau, K.K. Edwin; Bosnjak, Daniel; Durack, Timothy; Stevanovski, Igor; Weninger, Wolfgang

    2017-01-01

    AID deaminates C to U in either strand of Ig genes, exclusively producing C:G/G:C to T:A/A:T transition mutations if U is left unrepaired. Error-prone processing by UNG2 or mismatch repair diversifies mutation, predominantly at C:G or A:T base pairs, respectively. Here, we show that transversions at C:G base pairs occur by two distinct processing pathways that are dictated by sequence context. Within and near AGCT mutation hotspots, transversion mutation at C:G was driven by UNG2 without requirement for mismatch repair. Deaminations in AGCT were refractive both to processing by UNG2 and to high-fidelity base excision repair (BER) downstream of UNG2, regardless of mismatch repair activity. We propose that AGCT sequences resist faithful BER because they bind BER-inhibitory protein(s) and/or because hemi-deaminated AGCT motifs innately form a BER-resistant DNA structure. Distal to AGCT sequences, transversions at G were largely co-dependent on UNG2 and mismatch repair. We propose that AGCT-distal transversions are produced when apyrimidinic sites are exposed in mismatch excision patches, because completion of mismatch repair would require bypass of these sites. PMID:28039326

  3. Reading and visual processing in Greek dyslexic children: an eye-movement study.

    PubMed

    Hatzidaki, Anna; Gianneli, Maria; Petrakis, Eftichis; Makaronas, Nikolaos; Aslanides, Ioannis M

    2011-02-01

    We examined the impact of dyslexia on various processing and cognitive components (e.g., reading speed and accuracy) in a language with high phonological and orthographic consistency. Greek dyslexic children were compared with a chronological age-matched group on tasks that tested participants' phonological and orthographic awareness during reading and spelling, as well as their efficiency in detecting a specific target letter during a sequential visual search task. Dyslexic children showed impaired reading and spelling that was reflected in slow reading speed and error-prone performance, especially for non-words. Eye movement measures of text reading also provided supporting evidence for a reading deficit, with dyslexic participants producing more fixations and longer fixation durations than non-dyslexic participants. The results of the visual search task showed similar performance between the two groups, but when they were compared with the results of text reading, dyslexic participants were found to process fewer stimuli (i.e., letters) at each fixation than non-dyslexics. Our findings further suggest that, although Greek dyslexics have the advantage of a consistent orthographic system which facilitates acquisition of reading and phonological awareness, they demonstrate more impaired access to orthographic forms than dyslexics of other transparent orthographies. Copyright © 2010 John Wiley & Sons, Ltd.

  4. Adaptive constructive processes and memory accuracy: consequences of counterfactual simulations in young and older adults.

    PubMed

    Gerlach, Kathy D; Dornblaser, David W; Schacter, Daniel L

    2014-01-01

    People frequently engage in counterfactual thinking: mental simulations of alternative outcomes to past events. Like simulations of future events, counterfactual simulations serve adaptive functions. However, future simulation can also result in various kinds of distortions and has thus been characterised as an adaptive constructive process. Here we approach counterfactual thinking as such and examine whether it can distort memory for actual events. In Experiments 1a/b young and older adults imagined themselves experiencing different scenarios. Participants then imagined the same scenario again, engaged in no further simulation of a scenario, or imagined a counterfactual outcome. On a subsequent recognition test participants were more likely to make false alarms to counterfactual lures than novel scenarios. Older adults were more prone to these memory errors than younger adults. In Experiment 2 younger and older participants selected and performed different actions, then recalled performing some of those actions, imagined performing alternative actions to some of the selected actions, and did not imagine others. Participants, especially older adults, were more likely to falsely remember counterfactual actions than novel actions as previously performed. The findings suggest that counterfactual thinking can cause source confusion based on internally generated misinformation, consistent with its characterisation as an adaptive constructive process.

  5. Determinants of spontaneous mutation in the bacterium Escherichia coli as revealed by whole-genome sequencing

    PubMed Central

    Foster, Patricia L.; Lee, Heewook; Popodi, Ellen; Townes, Jesse P.; Tang, Haixu

    2015-01-01

    A complete understanding of evolutionary processes requires that factors determining spontaneous mutation rates and spectra be identified and characterized. Using mutation accumulation followed by whole-genome sequencing, we found that the mutation rates of three widely diverged commensal Escherichia coli strains differ only by about 50%, suggesting that a rate of 1–2 × 10⁻³ mutations per generation per genome is common for this bacterium. Four major forces are postulated to contribute to spontaneous mutations: intrinsic DNA polymerase errors, endogenously induced DNA damage, DNA damage caused by exogenous agents, and the activities of error-prone polymerases. To determine the relative importance of these factors, we studied 11 strains, each defective for a major DNA repair pathway. The striking result was that only loss of the ability to prevent or repair oxidative DNA damage significantly impacted mutation rates or spectra. These results suggest that, with the exception of oxidative damage, endogenously induced DNA damage does not perturb the overall accuracy of DNA replication in normally growing cells and that repair pathways may exist primarily to defend against exogenously induced DNA damage. The thousands of mutations caused by oxidative damage recovered across the entire genome revealed strong local-sequence biases of these mutations. Specifically, we found that the identity of the 3′ base can affect the mutability of a purine by oxidative damage by as much as eightfold. PMID:26460006
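
    As a rough scale check (an added illustration, not a claim from the paper): dividing the reported genome-wide rate by the E. coli genome size of roughly 4.6 × 10⁶ bp gives the per-base-pair mutation rate,

      \mu_{\mathrm{bp}} \;\approx\; \frac{1\text{--}2 \times 10^{-3}\ \text{per genome per generation}}{4.6 \times 10^{6}\ \text{bp per genome}} \;\approx\; 2\text{--}4 \times 10^{-10}\ \text{per bp per generation}.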

  6. Standardization of the Food Composition Database Used in the Latin American Nutrition and Health Study (ELANS).

    PubMed

    Kovalskys, Irina; Fisberg, Mauro; Gómez, Georgina; Rigotti, Attilio; Cortés, Lilia Yadira; Yépez, Martha Cecilia; Pareja, Rossina G; Herrera-Cuenca, Marianella; Zimberg, Ioná Z; Tucker, Katherine L; Koletzko, Berthold; Pratt, Michael

    2015-09-16

    Between-country comparisons of estimated dietary intake are particularly prone to error when different food composition tables are used. The objective of this study was to describe our procedures and rationale for the selection and adaptation of available food composition data into a single database to enable cross-country comparisons of nutritional intake. The Latin American Study of Nutrition and Health (ELANS) is a multicenter cross-sectional study of representative samples from eight Latin American countries. A standard study protocol was designed to investigate the dietary intake of the 9000 enrolled participants. Two 24-h recalls using the Multiple Pass Method were administered to individuals in all countries. Data from the 24-h dietary recalls were entered into the Nutrition Data System for Research (NDS-R) program after a harmonization process between countries to include local foods and appropriately adapt the NDS-R database. A standardized food-matching procedure, establishing the nutritional equivalency of local foods reported by study participants with foods available in the NDS-R database, was strictly followed by each country. Standardization of food and nutrient assessments has the potential to minimize systematic and random errors in nutrient intake estimations in the ELANS project. This study is expected to result in a unique dataset for Latin America, enabling cross-country comparisons of energy, macro-, and micro-nutrient intake within this region.

  7. The Impact of Environmental and Endogenous Damage on Somatic Mutation Load in Human Skin Fibroblasts

    PubMed Central

    Saini, Natalie; Chan, Kin; Grimm, Sara A.; Dai, Shuangshuang; Fargo, David C.; Kaufmann, William K.; Taylor, Jack A.; Lee, Eunjung; Cortes-Ciriano, Isidro; Park, Peter J.; Schurman, Shepherd H.; Malc, Ewa P.; Mieczkowski, Piotr A.

    2016-01-01

    Accumulation of somatic changes, due to environmental and endogenous lesions, in the human genome is associated with aging and cancer. Understanding the impacts of these processes on mutagenesis is fundamental to understanding the etiology, and to improving the prognosis and prevention, of cancers and other genetic diseases. Previous methods relying on either the generation of induced pluripotent stem cells or the sequencing of single-cell genomes were inherently error-prone and did not allow independent validation of the mutations. In the current study we eliminated these potential sources of error by high-coverage genome sequencing of single-cell-derived clonal fibroblast lineages, obtained after minimal propagation in culture, prepared from skin biopsies of two healthy adult humans. We report here accurate measurement of the genome-wide magnitude and spectra of mutations accrued in skin fibroblasts of healthy adult humans. We found that every cell contains at least one chromosomal rearrangement and 600–13,000 base substitutions. The spectra and correlation of base substitutions with epigenomic features resemble many cancers. Moreover, because biopsies were taken from body parts differing by sun exposure, we can delineate the precise contributions of environmental and endogenous factors to the accrual of genetic changes within the same individual. We show here that UV-induced and endogenous DNA damage can have a comparable impact on the somatic mutation loads in skin fibroblasts. Trial Registration: ClinicalTrials.gov NCT01087307. PMID:27788131

  8. Standardization of the Food Composition Database Used in the Latin American Nutrition and Health Study (ELANS)

    PubMed Central

    Kovalskys, Irina; Fisberg, Mauro; Gómez, Georgina; Rigotti, Attilio; Cortés, Lilia Yadira; Yépez, Martha Cecilia; Pareja, Rossina G.; Herrera-Cuenca, Marianella; Zimberg, Ioná Z.; Tucker, Katherine L.; Koletzko, Berthold; Pratt, Michael

    2015-01-01

    Between-country comparisons of estimated dietary intake are particularly prone to error when different food composition tables are used. The objective of this study was to describe our procedures and rationale for the selection and adaptation of available food composition data into a single database to enable cross-country comparisons of nutritional intake. The Latin American Study of Nutrition and Health (ELANS) is a multicenter cross-sectional study of representative samples from eight Latin American countries. A standard study protocol was designed to investigate the dietary intake of the 9000 enrolled participants. Two 24-h recalls using the Multiple Pass Method were administered to individuals in all countries. Data from the 24-h dietary recalls were entered into the Nutrition Data System for Research (NDS-R) program after a harmonization process between countries to include local foods and appropriately adapt the NDS-R database. A standardized food-matching procedure, establishing the nutritional equivalency of local foods reported by study participants with foods available in the NDS-R database, was strictly followed by each country. Standardization of food and nutrient assessments has the potential to minimize systematic and random errors in nutrient intake estimations in the ELANS project. This study is expected to result in a unique dataset for Latin America, enabling cross-country comparisons of energy, macro-, and micro-nutrient intake within this region. PMID:26389952

  9. Medication-related clinical decision support in computerized provider order entry systems: a review.

    PubMed

    Kuperman, Gilad J; Bobb, Anne; Payne, Thomas H; Avery, Anthony J; Gandhi, Tejal K; Burns, Gerard; Classen, David C; Bates, David W

    2007-01-01

    While medications can improve patients' health, the process of prescribing them is complex and error prone, and medication errors cause many preventable injuries. Computerized provider order entry (CPOE) with clinical decision support (CDS) can improve patient safety and lower medication-related costs. To realize the medication-related benefits of CDS within CPOE, one must overcome significant challenges. Healthcare organizations implementing CPOE must understand what classes of CDS their CPOE systems can support, assure that the clinical knowledge underlying their CDS systems is reasonable, and appropriately represent electronic patient data. These issues often influence to what extent an institution will succeed with its CPOE implementation and achieve its desired goals. Medication-related decision support is probably best introduced into healthcare organizations in two stages, basic and advanced. Basic decision support includes drug-allergy checking, basic dosing guidance, formulary decision support, duplicate therapy checking, and drug-drug interaction checking. Advanced decision support includes dosing support for renal insufficiency and geriatric patients, guidance for medication-related laboratory testing, drug-pregnancy checking, and drug-disease contraindication checking. In this paper, the authors outline some of the challenges associated with both basic and advanced decision support and discuss how those challenges might be addressed. The authors conclude with summary recommendations for delivering effective medication-related clinical decision support addressed to healthcare organizations, application and knowledge base vendors, policy makers, and researchers.
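
    As an illustration of what "basic" decision support amounts to structurally, the sketch below shows drug-allergy and drug-drug interaction checks against toy lookup tables. This is not any vendor's CDS engine; the drug data are invented examples, and real systems rely on large, curated knowledge bases.

      # Hedged sketch of basic medication-related CDS checks (illustrative only).
      DRUG_INTERACTIONS = {frozenset({"warfarin", "aspirin"}): "increased bleeding risk"}
      ALLERGY_CLASSES = {"amoxicillin": "penicillins"}   # drug -> allergy class

      def check_order(new_drug, active_meds, allergies):
          """Return alerts for a new order, given active meds and allergy classes."""
          alerts = []
          if ALLERGY_CLASSES.get(new_drug) in allergies:
              alerts.append(f"ALLERGY: patient allergic to {ALLERGY_CLASSES[new_drug]}")
          for med in active_meds:
              note = DRUG_INTERACTIONS.get(frozenset({new_drug, med}))
              if note:
                  alerts.append(f"INTERACTION with {med}: {note}")
          return alerts

      print(check_order("aspirin", ["warfarin"], {"penicillins"}))
      # -> ['INTERACTION with warfarin: increased bleeding risk']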

  10. The mismeasure of morals: antisocial personality traits predict utilitarian responses to moral dilemmas.

    PubMed

    Bartels, Daniel M; Pizarro, David A

    2011-10-01

    Researchers have recently argued that utilitarianism is the appropriate framework by which to evaluate moral judgment, and that individuals who endorse non-utilitarian solutions to moral dilemmas (involving active vs. passive harm) are committing an error. We report a study in which participants responded to a battery of personality assessments and a set of dilemmas that pit utilitarian and non-utilitarian options against each other. Participants who indicated greater endorsement of utilitarian solutions had higher scores on measures of psychopathy, Machiavellianism, and life meaninglessness. These results question the widely used methods by which lay moral judgments are evaluated, as these approaches lead to the counterintuitive conclusion that those individuals who are least prone to moral errors also possess a set of psychological characteristics that many would consider prototypically immoral. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. webpic: A flexible web application for collecting distance and count measurements from images

    PubMed Central

    2018-01-01

    Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time-consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows more high-quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592
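
    webpic itself is a web application, but the arithmetic behind a linear-measurement feature of this kind can be sketched in a few lines: calibrate pixels against a scale bar of known length, then convert clicked pixel coordinates to real units. The coordinates and helper below are hypothetical, not taken from webpic's source.

      # Hedged sketch of the pixel-to-millimeter arithmetic behind a
      # linear-measurement tool (illustrative values throughout).
      import math

      def pixel_distance(p1, p2):
          """Euclidean distance between two (x, y) pixel coordinates."""
          return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

      # calibrate: the user marks both ends of a 10 mm scale bar in the image
      scale_bar_px = pixel_distance((120, 410), (520, 410))   # 400 px here
      mm_per_px = 10.0 / scale_bar_px

      # measure: e.g., two points along a fossil leaf's midvein
      leaf_px = pixel_distance((203, 88), (341, 305))
      print(f"leaf measurement: {leaf_px * mm_per_px:.2f} mm")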

  12. On the retrieval of crystallographic information from atom probe microscopy data via signal mapping from the detector coordinate space.

    PubMed

    Wallace, Nathan D; Ceguerra, Anna V; Breen, Andrew J; Ringer, Simon P

    2018-06-01

    Atom probe tomography is a powerful microscopy technique capable of reconstructing the 3D position and chemical identity of millions of atoms within engineering materials, at the atomic level. Crystallographic information contained within the data is particularly valuable for the purposes of reconstruction calibration and grain boundary analysis. Typically, analysing this data is a manual, time-consuming and error-prone process. In many cases, the crystallographic signal is so weak that it is difficult to detect at all. In this study, a new automated signal processing methodology is demonstrated. We use the affine properties of the detector coordinate space, or the 'detector stack', as the basis for our calculations. The methodological framework and the visualisation tools are shown to be superior to the standard method of crystallographic pole visualisation directly from field evaporation images and there is no requirement for iterations between a full real-space initial tomographic reconstruction and the detector stack. The mapping approaches are demonstrated for aluminium, tungsten, magnesium and molybdenum. Implications for reconstruction calibration, accuracy of crystallographic measurements, reliability and repeatability are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
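
    The simplest form of detector-space analysis is binning ion hit coordinates into a density map (a field evaporation image), in which crystallographic poles appear as intensity extrema. The sketch below illustrates only that baseline idea, not the authors' automated method; the synthetic hit data and the depleted pole region are assumptions for demonstration.

      # Hedged sketch: locating a pole as a density extremum in detector space.
      import numpy as np

      rng = np.random.default_rng(3)

      # synthetic detector hits: uniform background with one depleted "pole"
      hits = rng.uniform(-1.0, 1.0, size=(500_000, 2))
      near_pole = np.linalg.norm(hits - np.array([0.3, -0.2]), axis=1) < 0.1
      keep = ~near_pole | (rng.random(len(hits)) < 0.3)   # thin hits near pole
      hits = hits[keep]

      # bin the hits into a detector-space density map
      density, xedges, yedges = np.histogram2d(
          hits[:, 0], hits[:, 1], bins=64, range=[[-1, 1], [-1, 1]])

      # the pole stands out as the extremum of the hit-density map
      ix, iy = np.unravel_index(np.argmin(density), density.shape)
      print(f"lowest-density bin near x ≈ {xedges[ix]:.2f}, y ≈ {yedges[iy]:.2f}")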

  13. Why are small and large numbers enumerated differently? A limited-capacity preattentive stage in vision.

    PubMed

    Trick, L M; Pylyshyn, Z W

    1994-01-01

    "Subitizing," the process of enumeration when there are fewer than 4 items, is rapid (40-100 ms/item), effortless, and accurate. "Counting," the process of enumeration when there are more than 4 items, is slow (250-350 ms/item), effortful, and error-prone. Why is there a difference in the way the small and large numbers of items are enumerated? A theory of enumeration is proposed that emerges from a general theory of vision, yet explains the numeric abilities of preverbal infants, children, and adults. We argue that subitizing exploits a limited-capacity parallel mechanism for item individuation, the FINST mechanism, associated with the multiple target tracking task (Pylyshyn, 1989; Pylyshyn & Storm, 1988). Two kinds of evidence support the claim that subitizing relies on preattentive information, whereas counting requires spatial attention. First, whenever spatial attention is needed to compute a spatial relation (cf. Ullman, 1984) or to perform feature integration (cf. Treisman & Gelade, 1980), subitizing does not occur (Trick & Pylyshyn, 1993a). Second, the position of the attentional focus, as manipulated by cue validity, has a greater effect on counting than subitizing latencies (Trick & Pylyshyn, 1993b).

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farhi, David; Feige, Ilya; Freytsis, Marat

    Some of the most arduous and error-prone aspects of precision resummed calculations are related to the partonic hard process, having nothing to do with the resummation. In particular, interfacing to parton-distribution functions, combining various channels, and performing the phase space integration can be limiting factors in completing calculations. Conveniently, however, most of these tasks are already automated in many Monte Carlo programs, such as MadGraph [1], Alpgen [2] or Sherpa [3]. In this paper, we show how such programs can be used to produce distributions of partonic kinematics with associated color structures representing the hard factor in a resummed distribution. These distributions can then be used to weight convolutions of jet, soft and beam functions, producing a complete resummed calculation. In fact, only around 1000 unweighted events are necessary to produce precise distributions. A number of examples and checks are provided, including e+e− two- and four-jet event shapes, n-jettiness and jet-mass related observables at hadron colliders at next-to-leading-log (NLL) matched to leading order (LO). Furthermore, the attached code can be used to modify MadGraph to export the relevant LO hard functions and color structures for arbitrary processes.
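
    As a minimal illustration of the Monte Carlo ingredients involved (a sketch only, not the paper's attached code): events drawn from a simple proposal can carry importance weights from a stand-in hard-function weight, giving both an integral estimate and a weighted distribution. All functions and values here are illustrative assumptions.

      # Hedged sketch of Monte Carlo integration with weighted events.
      import numpy as np

      rng = np.random.default_rng(0)

      def hard_weight(x):
          # stand-in for a LO hard-function weight (assumption)
          return np.exp(-x)

      # draw ~1000 kinematic points from a flat proposal on [0, 5]
      x = rng.uniform(0.0, 5.0, size=1000)
      w = hard_weight(x) * 5.0          # importance weight f(x)/q(x), q = 1/5

      # MC estimate of the integral of f over [0, 5]
      integral = w.mean()
      error = w.std(ddof=1) / np.sqrt(len(w))
      print(f"integral ≈ {integral:.3f} ± {error:.3f} "
            f"(exact: {1 - np.exp(-5.0):.3f})")

      # a resummed observable would convolve each weighted point with
      # jet/soft/beam functions; here we simply fill a weighted histogram
      hist, edges = np.histogram(x, bins=20, range=(0.0, 5.0), weights=w)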

  15. Trial-by-trial fluctuations in CNV amplitude reflect anticipatory adjustment of response caution.

    PubMed

    Boehm, Udo; van Maanen, Leendert; Forstmann, Birte; van Rijn, Hedderik

    2014-08-01

    The contingent negative variation (CNV), a slow cortical potential, occurs when humans are warned by a stimulus about an upcoming task. The cognitive processes that give rise to this EEG potential are not yet well understood. To explain these processes, we adopt a recently developed theoretical framework from the area of perceptual decision-making. This framework assumes that the basal ganglia control the tradeoff between fast and accurate decision-making in the cortex. It suggests that an increase in cortical excitability serves to lower response caution, which results in faster but more error-prone responding. We propose that the CNV reflects this increased cortical excitability. To test this hypothesis, we conducted an EEG experiment in which participants performed the random dot motion task either under speed or under accuracy stress. Our results show that trial-by-trial fluctuations in participants' response speed as well as model-based estimates of response caution correlated with single-trial CNV amplitude under conditions of speed but not accuracy stress. We conclude that the CNV might reflect adjustments of response caution, which serve to enhance quick decision-making. Copyright © 2014 Elsevier Inc. All rights reserved.
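
    The framework's core claim, that lowering response caution trades accuracy for speed, can be illustrated with a toy sequential-sampling (random-walk) simulation. The sketch below is an illustration with arbitrary parameter values, not the authors' fitted model.

      # Hedged sketch: lowering the decision threshold ("response caution")
      # in a diffusion-style random walk speeds responses but raises errors.
      import numpy as np

      rng = np.random.default_rng(1)

      def simulate(threshold, drift=0.5, noise=1.0, dt=0.01, n_trials=5000):
          rts, errors = [], 0
          for _ in range(n_trials):
              x, t = 0.0, 0.0
              while abs(x) < threshold:
                  x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                  t += dt
              rts.append(t)
              errors += x < 0          # lower boundary = incorrect response
          return np.mean(rts), errors / n_trials

      for a in (0.5, 1.0, 2.0):        # lower threshold = lower caution
          rt, err = simulate(a)
          print(f"threshold {a}: mean RT {rt:.2f} s, error rate {err:.2%}")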

  16. Streamlining resummed QCD calculations using Monte Carlo integration

    DOE PAGES

    Farhi, David; Feige, Ilya; Freytsis, Marat; ...

    2016-08-18

    Some of the most arduous and error-prone aspects of precision resummed calculations are related to the partonic hard process, having nothing to do with the resummation. In particular, interfacing to parton-distribution functions, combining various channels, and performing the phase space integration can be limiting factors in completing calculations. Conveniently, however, most of these tasks are already automated in many Monte Carlo programs, such as MadGraph [1], Alpgen [2] or Sherpa [3]. In this paper, we show how such programs can be used to produce distributions of partonic kinematics with associated color structures representing the hard factor in a resummed distribution. These distributions can then be used to weight convolutions of jet, soft and beam functions, producing a complete resummed calculation. In fact, only around 1000 unweighted events are necessary to produce precise distributions. A number of examples and checks are provided, including e+e− two- and four-jet event shapes, n-jettiness and jet-mass related observables at hadron colliders at next-to-leading-log (NLL) matched to leading order (LO). Furthermore, the attached code can be used to modify MadGraph to export the relevant LO hard functions and color structures for arbitrary processes.

  17. Artificial neuron-glia networks learning approach based on cooperative coevolution.

    PubMed

    Mesejo, Pablo; Ibáñez, Oscar; Fernández-Blanco, Enrique; Cedrón, Francisco; Pazos, Alejandro; Porto-Pazos, Ana B

    2015-06-01

    Artificial Neuron-Glia Networks (ANGNs) are a novel bio-inspired machine learning approach. They extend classical Artificial Neural Networks (ANNs) by incorporating recent findings and suppositions about the way information is processed by neural and astrocytic networks in the most evolved living organisms. Although ANGNs are not a consolidated method, their performance advantage over the traditional approach, i.e. without artificial astrocytes, has already been demonstrated on classification problems. However, the corresponding learning algorithms developed so far depend strongly on a set of glial parameters which are manually tuned for each specific problem. As a consequence, preliminary experimental tests have to be done to determine an adequate set of values, making such manual parameter configuration time-consuming, error-prone, biased, and problem dependent. Thus, in this paper, we propose a novel learning approach for ANGNs that fully automates the learning process and gives the possibility of testing any kind of reasonable parameter configuration for each specific problem. This new learning algorithm, based on coevolutionary genetic algorithms, is able to learn all the ANGN parameters properly. Its performance is tested on five classification problems, achieving significantly better results than ANGN and competitive results with ANN approaches.
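
    A minimal sketch of the cooperative-coevolution idea (illustrative only; the paper's representations, genetic operators, and fitness function differ): two subpopulations, one holding network weights and one holding glia-like parameters, are evolved in alternation and evaluated jointly against each other's current best, so neither set is hand-tuned.

      # Hedged sketch of cooperative coevolution with two subpopulations.
      import numpy as np

      rng = np.random.default_rng(2)

      def fitness(weights, glia):
          # stand-in objective: how well the combined parameters fit a target
          return -np.sum((weights - 0.5) ** 2) - np.sum((glia - 0.2) ** 2)

      def evolve(pop, partner_best, evaluate, sigma=0.1):
          """One generation: rank jointly with partner, keep top half, mutate."""
          scores = np.array([evaluate(ind, partner_best) for ind in pop])
          order = np.argsort(scores)[::-1]
          parents = pop[order[: len(pop) // 2]]        # truncation selection
          children = parents + sigma * rng.standard_normal(parents.shape)
          return np.vstack([parents, children]), pop[order[0]]

      weights_pop = rng.uniform(-1, 1, size=(20, 8))   # candidate weight vectors
      glia_pop = rng.uniform(0, 1, size=(20, 4))       # candidate glial configs
      best_w, best_g = weights_pop[0], glia_pop[0]

      for gen in range(50):
          weights_pop, best_w = evolve(weights_pop, best_g,
                                       lambda w, g: fitness(w, g))
          glia_pop, best_g = evolve(glia_pop, best_w,
                                    lambda g, w: fitness(w, g))

      print("best joint fitness:", fitness(best_w, best_g))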

  18. The mechanism of untargeted mutagenesis in UV-irradiated yeast.

    PubMed

    Lawrence, C W; Christensen, R B

    1982-01-01

    The SOS error-prone repair hypothesis proposes that untargeted and targeted mutations in E. coli both result from the inhibition of polymerase functions that normally maintain fidelity, and that this is a necessary precondition for translesion synthesis. Using mating experiments with excision-deficient strains of baker's yeast, Saccharomyces cerevisiae, we find that up to 40% of cyc1-91 revertants induced by UV are untargeted, showing that a reduction in fidelity is also found in irradiated cells of this organism. We are, however, unable to detect the induction or activation of any diffusible factor capable of inhibiting fidelity, and therefore suggest that untargeted and targeted mutations are the consequence of largely different processes. We propose that these observations are best explained in terms of a limited-fidelity model. Untargeted mutations are thought to result from the limited capacity of the processes that normally maintain fidelity, which are active during replication on both irradiated and unirradiated templates. Even moderate UV fluences saturate this capacity, leading to competition for the limited resource. Targeted mutations are believed to result from the limited, though far from negligible, capacity of lesions like pyrimidine dimers to form Watson-Crick base pairs.

  19. Residents' numeric inputting error in computerized physician order entry prescription.

    PubMed

    Wu, Xue; Wu, Changxu; Zhang, Kan; Wei, Dong

    2016-04-01

    Computerized physician order entry (CPOE) systems with embedded clinical decision support (CDS) can significantly reduce certain types of prescription error. However, prescription errors still occur. Various factors, such as the numeric inputting methods used in human-computer interaction (HCI), produce different error rates and types, but this has received relatively little attention. This study aimed to examine the effects of numeric inputting methods and urgency levels on numeric inputting errors in prescriptions, and to categorize the types of errors. Thirty residents participated in four prescribing tasks in which two factors were manipulated: numeric inputting method (numeric row in the main keyboard vs. numeric keypad) and urgency level (urgent vs. non-urgent situation). Multiple aspects of participants' prescribing behavior were also measured in unhurried prescribing situations. The results revealed that in urgent situations, participants were prone to make mistakes when using the numeric row in the main keyboard. After controlling for performance in the unhurried prescribing situation, the effects of the input methods disappeared, and urgency was found to play a significant role in the generalized linear model. Most errors were either omission or substitution types, but the proportions of transposition and intrusion error types were significantly higher than in previous research. Among the numbers 3, 8, and 9, which were the less common digits used in prescriptions, the error rate was higher, posing a substantial risk to patient safety. Urgency played a more important role in CPOE numeric typing errors than typing skills and typing habits. Inputting with the numeric keypad is recommended, as it had lower error rates in urgent situations. An alternative design could consider increasing the sensitivity of the keys with lower frequency of occurrence and of decimals. To improve the usability of CPOE, numeric keyboard design and error detection could benefit from the spatial incidence of errors found in this study. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
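
    The four reported error types can be made concrete with a small classifier that compares a typed digit string against the intended one. This is an illustrative coding scheme consistent with the abstract's terminology, not necessarily the scheme the study used.

      # Hedged sketch: classify a numeric entry error as omission,
      # substitution, transposition, or intrusion (illustrative only).
      def classify_error(intended: str, typed: str) -> str:
          if typed == intended:
              return "correct"
          if len(typed) == len(intended) - 1:
              # one digit dropped anywhere -> omission
              for i in range(len(intended)):
                  if intended[:i] + intended[i + 1:] == typed:
                      return "omission"
          if len(typed) == len(intended) + 1:
              # one extra digit inserted -> intrusion
              for i in range(len(typed)):
                  if typed[:i] + typed[i + 1:] == intended:
                      return "intrusion"
          if len(typed) == len(intended):
              diffs = [i for i, (a, b) in enumerate(zip(intended, typed)) if a != b]
              if len(diffs) == 1:
                  return "substitution"
              if (len(diffs) == 2 and diffs[1] == diffs[0] + 1
                      and typed[diffs[0]] == intended[diffs[1]]
                      and typed[diffs[1]] == intended[diffs[0]]):
                  return "transposition"
          return "other"

      assert classify_error("380", "38") == "omission"
      assert classify_error("389", "398") == "transposition"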

  20. Automatic white blood cell classification using pre-trained deep learning models: ResNet and Inception

    NASA Astrophysics Data System (ADS)

    Habibzadeh, Mehdi; Jannesari, Mahboobeh; Rezaei, Zahra; Baharvand, Hossein; Totonchi, Mehdi

    2018-04-01

    This work gives an account of the evaluation of white blood cell differential counts via a computer-aided diagnosis (CAD) system and hematology rules. Leukocytes, also called white blood cells (WBCs), play a central role in the immune system: they are responsible for phagocytosis and immunity, and therefore for defense against the infections that drive disease incidence and mortality. Microscopic examination of blood samples, however, is a time-consuming, expensive, and error-prone task. A manual diagnosis searches the blood slides for specific leukocytes and count abnormalities while a complete blood count (CBC) examination is performed. Complications arise from the large number of varying samples, including different types of leukocytes, related sub-types, and concentrations in blood, which makes the analysis prone to human error. This process can be automated by computerized techniques, which are more reliable and economical. In essence, we seek a fast, accurate mechanism for classification that gathers information about the distribution of white blood cells and may help to gauge the degree of any abnormality during a CBC test. In this work, we consider the problem of pre-processing and supervised classification of white blood cells into their four primary types (neutrophils, eosinophils, lymphocytes, and monocytes) using a proposed deep learning framework. In the first step, the framework applies three consecutive pre-processing operations: color distortion, bounding-box distortion (cropping), and image flipping/mirroring. In the second phase, white blood cell recognition is performed with hierarchical feature extraction using Inception and ResNet architectures. Finally, the results of the preliminary classification analysis, with 11,200 training samples and a 1,244-cell evaluation set, are presented as confusion matrices and interpreted using accuracy and false-positive rates, with the framework validated in experiments on poor-quality blood images of 320 × 240 pixels. The differential outcomes in the challenging cell detection task, as shown in the results section, indicate a significant achievement in using the Inception and ResNet architectures with the proposed settings. Our framework detects on average 100% of the four main white blood cell types using ResNet V1 50, with further promising results of 99.84% and 99.46% accuracy obtained with ResNet V1 152 and ResNet 101, respectively, using 3000 epochs and fine-tuning of all layers. Further confusion-matrix statistics showed sensitivity values of 1, 0.9979, and 0.9989, with area under the curve (AUC) scores of 1, 0.9992, and 0.9833 for the three proposed techniques. In addition, the current work shows negligible false-negative counts (0, 2, and 1) and low false-positive counts (0, 0, and 5) in leukocyte detection.
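
    As a hedged sketch of the transfer-learning recipe the abstract describes (a pre-trained ResNet, flipping and color distortion as augmentation, fine-tuning all layers), the snippet below fine-tunes a torchvision ResNet-50 for the four WBC classes. The dataset directory is hypothetical, torchvision ≥ 0.13 is assumed, and this is not the authors' code.

      # Hedged sketch: fine-tuning a pre-trained ResNet-50 for 4-class
      # white blood cell classification (illustrative paths and settings).
      import torch
      import torch.nn as nn
      from torchvision import models, transforms, datasets

      preprocess = transforms.Compose([
          transforms.Resize((224, 224)),           # ResNet's expected input size
          transforms.RandomHorizontalFlip(),       # flipping/mirroring augmentation
          transforms.ColorJitter(0.2, 0.2, 0.2),   # stand-in for "color distortion"
          transforms.ToTensor(),
      ])

      # hypothetical folder of cell crops: wbc/train/{neutrophil,eosinophil,...}
      train_set = datasets.ImageFolder("wbc/train", transform=preprocess)
      loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

      model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
      model.fc = nn.Linear(model.fc.in_features, 4)   # 4 WBC classes

      optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
      criterion = nn.CrossEntropyLoss()

      model.train()                                   # all layers trainable
      for images, labels in loader:                   # one epoch shown for brevity
          optimizer.zero_grad()
          loss = criterion(model(images), labels)
          loss.backward()
          optimizer.step()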
