Sample records for avoiding common errors

  1. Machine Translation as a Model for Overcoming Some Common Errors in English-into-Arabic Translation among EFL University Freshmen

    ERIC Educational Resources Information Center

    El-Banna, Adel I.; Naeem, Marwa A.

    2016-01-01

    This research work aimed at making use of Machine Translation to help students avoid some syntactic, semantic and pragmatic common errors in translation from English into Arabic. Participants were a hundred and five freshmen who studied the "Translation Common Errors Remedial Program" prepared by the researchers. A testing kit that…

  2. Twenty Common Testing Mistakes for EFL Teachers to Avoid

    ERIC Educational Resources Information Center

    Henning, Grant

    2012-01-01

    To some extent, good testing procedure, like good language use, can be achieved through avoidance of errors. Almost any language-instruction program requires the preparation and administration of tests, and it is only to the extent that certain common testing mistakes have been avoided that such tests can be said to be worthwhile selection,…

  3. Ten common errors beginning substance abuse workers make in group treatment.

    PubMed

    Greif, G L

    1996-01-01

    Beginning therapists sometimes make mistakes when working with substance abusers in groups. This article discusses ten common errors that the author has observed. Five center on the therapist's approach and five center on the nuts and bolts of group leadership. Suggestions are offered for how to avoid them.

  4. The Language of Scholarship: How to Rapidly Locate and Avoid Common APA Errors.

    PubMed

    Freysteinson, Wyona M; Krepper, Rebecca; Mellott, Susan

    2015-10-01

    This article is relevant for nurses and nursing students who are writing scholarly documents for work, school, or publication and who have a basic understanding of American Psychological Association (APA) style. Common APA errors on the reference list and in citations within the text are reviewed. Methods to quickly find and reduce those errors are shared. Copyright 2015, SLACK Incorporated.

  5. Guidance for Avoiding Incomplete Premanufacture Notices or Bona Fides in the New Chemicals Program

    EPA Pesticide Factsheets

    This page contains documents to help you avoid submitting an incomplete Premanufacture Notice or Bona Fide. The documents cover the chemical identity requirements and the common errors that result in incomplete submissions.

  6. Most Common Formal Grammatical Errors Committed by Authors

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.

    2017-01-01

    Empirical evidence has been provided about the importance of avoiding American Psychological Association (APA) errors in the abstract, body, reference list, and table sections of empirical research articles. Specifically, authors are significantly more likely to have their manuscripts rejected for publication if they commit numerous APA…

  7. Modeling error distributions of growth curve models through Bayesian methods.

    PubMed

    Zhang, Zhiyong

    2016-06-01

    Growth curve models are widely used in social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems from blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the errors is correctly specified, one can avoid the loss in the efficiency of standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99, is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided.
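
    The core idea above, that residuals in a growth curve model may be heavy-tailed rather than normal, can be illustrated with a small simulation. The following is a minimal sketch with invented parameters (Student-t residuals with 3 degrees of freedom, per-subject OLS slopes), not the paper's Bayesian SAS MCMC implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_growth(n_subjects=200, n_times=5, intercept=2.0, slope=0.5, df=3):
    """Linear growth trajectories with heavy-tailed (Student-t) residuals."""
    t = np.arange(n_times, dtype=float)
    errors = rng.standard_t(df, size=(n_subjects, n_times))
    return t, intercept + slope * t + errors

def fit_slopes(t, Y):
    """Ordinary least-squares slope for each subject's trajectory."""
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, Y.T, rcond=None)
    return beta[1]  # one slope estimate per subject

t, Y = simulate_growth()
slopes = fit_slopes(t, Y)
print(round(float(slopes.mean()), 2))  # should sit near the true slope of 0.5
```

    The point of the sketch is only that the error distribution is a modeling choice; a Bayesian treatment would place the t (or another non-normal) likelihood directly in the model rather than fitting per-subject least squares.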

  8. Modeling Error Distributions of Growth Curve Models through Bayesian Methods

    ERIC Educational Resources Information Center

    Zhang, Zhiyong

    2016-01-01

    Growth curve models are widely used in social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems in blindly assuming normality, a general Bayesian framework is…

  9. What do reviewers look for in an original research article?

    PubMed

    Shankar, P R

    2012-01-01

    In this article, common errors committed by authors, especially those whose first language is not English, while writing an original research article are described. How to avoid common errors and improve the chances of publication is also covered. This article may resemble instructions to authors; however, the tips are given from a reviewer's perspective. The abstract is the section of the paper most commonly read, and care should be taken while writing it. Keywords are used to retrieve articles in searches, and the use of words from the MeSH database is recommended. The introduction describes work already conducted in the particular area and briefly mentions how the manuscript will add to existing knowledge. The methods section describes how the study was conducted, is written in the past tense, and is often the first part of the paper to be written. The results describe what was found in the study and are usually written after the methods section. The discussion compares the study with the literature and helps to put the study findings in context. The conclusions should be based on the results of the study. The references should be written strictly according to the journal format. Language should be simple, the active voice should be used, and jargon avoided. Avoid quoting directly from reference articles; paraphrase them in your own words to avoid plagiarism.

  10. Writing Research Reports.

    PubMed

    Sessler, Daniel I; Shafer, Steven

    2018-01-01

    Clear writing makes manuscripts easier to understand. Clear writing enhances research reports, increasing clinical adoption and scientific impact. We discuss styles and organization to help junior investigators present their findings and avoid common errors.

  11. Unreliable numbers: error and harm induced by bad design can be reduced by better design

    PubMed Central

    Thimbleby, Harold; Oladimeji, Patrick; Cairns, Paul

    2015-01-01

    Number entry is a ubiquitous activity and is often performed in safety- and mission-critical procedures, such as healthcare, science, finance, aviation and in many other areas. We show that Monte Carlo methods can quickly and easily compare the reliability of different number entry systems. A surprising finding is that many common, widely used systems are defective, and induce unnecessary human error. We show that Monte Carlo methods enable designers to explore the implications of normal and unexpected operator behaviour, and to design systems to be more resilient to use error. We demonstrate novel designs with improved resilience, implying that the common problems identified and the errors they induce are avoidable. PMID:26354830
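
    A toy version of the Monte Carlo comparison described above might look as follows. The two entry models and the 5% per-action slip rate are hypothetical assumptions for illustration, not the systems or error rates the authors evaluated:

```python
import random

random.seed(42)

TARGETS = [random.randint(1, 999) for _ in range(2000)]
P_SLIP = 0.05  # assumed probability of a slip per user action

def keyed_entry(target):
    """Serial digit typing: a slip substitutes a random digit."""
    digits = []
    for d in str(target):
        if random.random() < P_SLIP:
            d = str(random.randint(0, 9))
        digits.append(d)
    return int("".join(digits))

def incremental_entry(target):
    """Up/down stepping interface: a slip lands one step away."""
    value = target
    if random.random() < P_SLIP:
        value += random.choice([-1, 1])
    return value

def mean_abs_error(entry_fn):
    return sum(abs(entry_fn(t) - t) for t in TARGETS) / len(TARGETS)

keyed, stepped = mean_abs_error(keyed_entry), mean_abs_error(incremental_entry)
print(keyed > stepped)  # a digit slip can be off by hundreds; a step slip by one
```

    Even this crude simulation reproduces the paper's qualitative claim: the same slip rate induces very different error magnitudes depending on the entry design.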

  12. Medication errors: problems and recommendations from a consensus meeting

    PubMed Central

    Agrawal, Abha; Aronson, Jeffrey K; Britten, Nicky; Ferner, Robin E; de Smet, Peter A; Fialová, Daniela; Fitzgerald, Richard J; Likić, Robert; Maxwell, Simon R; Meyboom, Ronald H; Minuz, Pietro; Onder, Graziano; Schachter, Michael; Velo, Giampaolo

    2009-01-01

    Here we discuss 15 recommendations for reducing the risks of medication errors: (1) provision of sufficient undergraduate learning opportunities to make medical students safe prescribers; (2) provision of opportunities for students to practise skills that help to reduce errors; (3) education of students about common types of medication errors and how to avoid them; (4) education of prescribers in taking accurate drug histories; (5) assessment in medical schools of prescribing knowledge and skills, and demonstration that newly qualified doctors are safe prescribers; (6) European harmonization of prescribing and safety recommendations and regulatory measures, with regular feedback about rational drug use; (7) comprehensive assessment of elderly patients for declining function; (8) exploration of low-dose regimens for elderly patients and preparation of special formulations as required; (9) training for all health-care professionals in drug use, adverse effects, and medication errors in elderly people; (10) more involvement of pharmacists in clinical practice; (11) introduction of integrated prescription forms and national implementation in individual countries; (12) development of better monitoring systems for detecting medication errors, based on classification and analysis of spontaneous reports of previous reactions, and for investigating the possible role of medication errors when patients die; (13) use of IT systems, when available, to provide methods of avoiding medication errors, with standardization, proper evaluation, and certification of clinical information systems; (14) nonjudgmental communication with patients about their concerns and elicitation of symptoms that they perceive to be adverse drug reactions; and (15) avoidance of defensive reactions if patients mention symptoms resulting from medication errors. PMID:19594525

  13. Avoiding common pitfalls in qualitative data collection and transcription.

    PubMed

    Easton, K L; McComish, J F; Greenberg, R

    2000-09-01

    The subjective nature of qualitative research necessitates scrupulous scientific methods to ensure valid results. Although qualitative methods such as grounded theory, phenomenology, and ethnography yield rich data, consumers of research need to be able to trust the findings reported in such studies. Researchers are responsible for establishing the trustworthiness of qualitative research through a variety of ways. Specific challenges faced in the field can seriously threaten the dependability of the data. However, by minimizing potential errors that can occur when doing fieldwork, researchers can increase the trustworthiness of the study. The purpose of this article is to present three of the pitfalls that can occur in qualitative research during data collection and transcription: equipment failure, environmental hazards, and transcription errors. Specific strategies to minimize the risk for avoidable errors will be discussed.

  14. The Seven Deadly Sins of Online Microcomputing.

    ERIC Educational Resources Information Center

    King, Alan

    1989-01-01

    Offers suggestions for avoiding common errors in online microcomputer use. Areas discussed include learning the basics; hardware protection; backup options; hard disk organization; software selection; file security; and the use of dedicated communications lines. (CLB)

  15. Survey of childhood blindness and visual impairment in Botswana.

    PubMed

    Nallasamy, Sudha; Anninger, William V; Quinn, Graham E; Kroener, Brian; Zetola, Nicola M; Nkomazana, Oathokwa

    2011-10-01

    In terms of blind-person years, the worldwide burden of childhood blindness is second only to cataracts. In many developing countries, 30-72% of childhood blindness is avoidable. The authors conducted this study to determine the causes of childhood blindness and visual impairment (VI) in Botswana, a middle-income country with limited access to ophthalmic care. This study was conducted over 4 weeks in eight cities and villages in Botswana. Children were recruited through a radio advertisement and local outreach programmes. Those ≤ 15 years of age with visual acuity <6/18 in either eye were enrolled. The WHO/Prevention of Blindness Eye Examination Record for Children with Blindness and Low Vision was used to record data. The authors enrolled 241 children, 79 with unilateral and 162 with bilateral VI. Of unilateral cases, 89% were avoidable: 23% preventable (83% trauma-related) and 66% treatable (40% refractive error and 31% amblyopia). Of bilateral cases, 63% were avoidable: 5% preventable and 58% treatable (33% refractive error and 31% congenital cataracts). Refractive error, which is easily correctable with glasses, is the most common cause of bilateral VI, with cataracts a close second. A nationwide intervention is currently being planned to reduce the burden of avoidable childhood VI in Botswana.

  16. Putting Meaning Back Into the Mean: A Comment on the Misuse of Elementary Statistics in a Sample of Manuscripts Submitted to Clinical Therapeutics.

    PubMed

    Forrester, Janet E

    2015-12-01

    Errors in the statistical presentation and analyses of data in the medical literature remain common despite efforts to improve the review process, including the creation of guidelines for authors and the use of statistical reviewers. This article discusses common elementary statistical errors seen in manuscripts recently submitted to Clinical Therapeutics and describes some ways in which authors and reviewers can identify errors and thus correct them before publication. A nonsystematic sample of manuscripts submitted to Clinical Therapeutics over the past year was examined for elementary statistical errors. Clinical Therapeutics has many of the same errors that reportedly exist in other journals. Authors require additional guidance to avoid elementary statistical errors and incentives to use the guidance. Implementation of reporting guidelines for authors and reviewers by journals such as Clinical Therapeutics may be a good approach to reduce the rate of statistical errors. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.

  17. [Safety management in pathology laboratory: from specimen handling to confirmation of reports].

    PubMed

    Minato, Hiroshi; Nojima, Takayuki; Nakano, Mariko; Yamazaki, Michiko

    2011-03-01

    Medical errors in pathological diagnosis cause enormous physical and psychological harm to patients as well as to medical staff. Here we discuss, based on our experience, how to avoid medical errors in the surgical pathology laboratory. Handling surgical specimens and the diagnostic process are labor intensive and involve many steps. Each hospital reports many kinds of accidents or incidents; however, many laboratories share common problems, and each process carries its own specific risk of error. We analyzed the problems in each process and concentrated on avoiding misaccessioning, mislabeling, and misreporting. We have made several changes to our system, such as barcode labels, digital images of all specimens, placing endoscopic biopsy specimens directly into embedding cassettes, and using a multitissue control block as the control in immunohistochemistry. Some problems remain, but we have reduced errors by decreasing the number of manual operations as much as possible. A pathology system that indicates whether clinicians have read pathology reports is now under construction. We also discuss quality assurance of diagnosis, cooperation with clinicians and other co-medical staff, and organization and methods. To operate with minimal risk, it is important for all medical staff to share awareness of the problems, maintain careful observation, and share all information. Incorporating an organizational management tool such as ISO 15189 and using the PDCA cycle is also helpful for safety management and quality improvement of the laboratory.

  18. Common Scientific and Statistical Errors in Obesity Research

    PubMed Central

    George, Brandon J.; Beasley, T. Mark; Brown, Andrew W.; Dawson, John; Dimova, Rositsa; Divers, Jasmin; Goldsby, TaShauna U.; Heo, Moonseong; Kaiser, Kathryn A.; Keith, Scott; Kim, Mimi Y.; Li, Peng; Mehta, Tapan; Oakes, J. Michael; Skinner, Asheley; Stuart, Elizabeth; Allison, David B.

    2015-01-01

    We identify 10 common errors and problems in the statistical analysis, design, interpretation, and reporting of obesity research and discuss how they can be avoided. The 10 topics are: 1) misinterpretation of statistical significance, 2) inappropriate testing against baseline values, 3) excessive and undisclosed multiple testing and “p-value hacking,” 4) mishandling of clustering in cluster randomized trials, 5) misconceptions about nonparametric tests, 6) mishandling of missing data, 7) miscalculation of effect sizes, 8) ignoring regression to the mean, 9) ignoring confirmation bias, and 10) insufficient statistical reporting. We hope that discussion of these errors can improve the quality of obesity research by helping researchers to implement proper statistical practice and to know when to seek the help of a statistician. PMID:27028280
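
    Error 8 in the list above, ignoring regression to the mean, is easy to demonstrate with a short simulation: subjects selected for extreme baseline measurements appear to "improve" at follow-up with no intervention at all. The weight distribution and noise levels below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stable true weight plus independent measurement noise at two visits
true = rng.normal(80, 10, size=10000)
visit1 = true + rng.normal(0, 5, size=true.size)
visit2 = true + rng.normal(0, 5, size=true.size)

# Select ("treat") only subjects with extreme baseline values -- no intervention
extreme = visit1 > 95
drop = visit1[extreme].mean() - visit2[extreme].mean()
print(drop > 0)  # apparent improvement arises from regression to the mean alone
```

    A study design without an untreated comparison group would misattribute this drop to the intervention, which is exactly the mistake the authors warn against.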

  19. Self-regulation of driving and its relationship to driving ability among older adults.

    PubMed

    Baldock, M R J; Mathias, J L; McLean, A J; Berndt, A

    2006-09-01

    Although it is known that older drivers limit their driving, it is not known whether this self-regulation is related to actual driving ability. A sample of 104 older drivers, aged between 60 and 92, completed a questionnaire about driving habits and attitudes. Ninety of these drivers also completed a structured on-road driving test. A measure of self-regulation was derived from drivers' self-reported avoidance of difficult driving situations. The on-road driving test involved a standard assessment used to determine fitness to drive. Driving test scores for the study were based on the number of errors committed in the driving tests, with weightings given according to the seriousness of the errors. The most commonly avoided difficult driving situations, according to responses on the questionnaire, were parallel parking and driving at night in the rain, while the least avoided situation was driving alone. Poorer performance on the driving test was not related to overall avoidance of difficult driving situations. Stronger relationships were found between driving ability and avoidance of specific difficult driving situations. These specific driving situations were the ones in which the drivers had low confidence and that the drivers were most able to avoid if they wished to.

  20. The Assumption of a Reliable Instrument and Other Pitfalls to Avoid When Considering the Reliability of Data

    PubMed Central

    Nimon, Kim; Zientek, Linda Reichwein; Henson, Robin K.

    2012-01-01

    The purpose of this article is to help researchers avoid common pitfalls associated with reliability including incorrectly assuming that (a) measurement error always attenuates observed score correlations, (b) different sources of measurement error originate from the same source, and (c) reliability is a function of instrumentation. To accomplish our purpose, we first describe what reliability is and why researchers should care about it with focus on its impact on effect sizes. Second, we review how reliability is assessed with comment on the consequences of cumulative measurement error. Third, we consider how researchers can use reliability generalization as a prescriptive method when designing their research studies to form hypotheses about whether or not reliability estimates will be acceptable given their sample and testing conditions. Finally, we discuss options that researchers may consider when faced with analyzing unreliable data. PMID:22518107
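
    Pitfall (a) above, assuming that measurement error always attenuates observed correlations, can be checked with a quick simulation: error that is independent of the other measure does attenuate the correlation, but error shared across the two measures can inflate it instead. All distributions and effect sizes below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

n = 20000
x_true = rng.normal(size=n)
y_true = 0.6 * x_true + rng.normal(0, 0.8, size=n)  # true correlation ~0.6

# Independent measurement error in x attenuates the observed correlation
x_obs = x_true + rng.normal(0, 1.0, size=n)
r_true = np.corrcoef(x_true, y_true)[0, 1]
r_obs = np.corrcoef(x_obs, y_true)[0, 1]

# Error shared by both measures (e.g., a common method factor) inflates it
shared = rng.normal(0, 1.0, size=n)
r_shared = np.corrcoef(x_true + shared, y_true + shared)[0, 1]

print(r_obs < r_true < r_shared)
```

    The direction of the bias thus depends on where the measurement error originates, which is the article's point about not treating all error sources as interchangeable.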

  1. Survey of childhood blindness and visual impairment in Botswana

    PubMed Central

    Nallasamy, Sudha; Anninger, William V; Quinn, Graham E; Kroener, Brian; Zetola, Nicola M; Nkomazana, Oathokwa

    2014-01-01

    Background/aims In terms of blind-person years, the worldwide burden of childhood blindness is second only to cataracts. In many developing countries, 30–72% of childhood blindness is avoidable. The authors conducted this study to determine the causes of childhood blindness and visual impairment (VI) in Botswana, a middle-income country with limited access to ophthalmic care. Methods This study was conducted over 4 weeks in eight cities and villages in Botswana. Children were recruited through a radio advertisement and local outreach programmes. Those ≤15 years of age with visual acuity <6/18 in either eye were enrolled. The WHO/Prevention of Blindness Eye Examination Record for Children with Blindness and Low Vision was used to record data. Results The authors enrolled 241 children, 79 with unilateral and 162 with bilateral VI. Of unilateral cases, 89% were avoidable: 23% preventable (83% trauma-related) and 66% treatable (40% refractive error and 31% amblyopia). Of bilateral cases, 63% were avoidable: 5% preventable and 58% treatable (33% refractive error and 31% congenital cataracts). Conclusion Refractive error, which is easily correctable with glasses, is the most common cause of bilateral VI, with cataracts a close second. A nationwide intervention is currently being planned to reduce the burden of avoidable childhood VI in Botswana. PMID:21242581

  2. Communication errors in radiology - Pitfalls and how to avoid them.

    PubMed

    Waite, Stephen; Scott, Jinel Moore; Drexler, Ian; Martino, Jennifer; Legasto, Alan; Gale, Brian; Kolla, Srinivas

    2018-06-07

    Communication failures are a common cause of patient harm and malpractice claims against radiologists. In addition to overt communication breakdowns among providers, it is also important to address the quality of communication to optimize patient outcomes. In this review, we describe common communication failures and potential solutions providing a framework for radiologists to improve health care delivery. Copyright © 2018. Published by Elsevier Inc.

  3. Avoidable errors in dealing with anaphylactoid reactions to iodinated contrast media.

    PubMed

    Segal, Arthur J; Bush, William H

    2011-03-01

    Contrast reactions are much less common today than in the past. This is principally because of the current and predominant use of low and iso-osmolar contrast media compared with the prior use of high osmolality contrast media. As a result of the significantly diminished frequency, there are now fewer opportunities for physicians to recognize and appropriately treat such adverse reactions. In review of the literature combined with our own clinical and legal experience, 12 potential errors were identified and these are reviewed in detail so that they can be avoided by the physician-in-charge. Basic treatment considerations are presented along with a plan to systematize an approach to contrast reactions, simplify treatment options and plans, and schedule periodic drills.

  4. Monte Carlo Analysis as a Trajectory Design Driver for the TESS Mission

    NASA Technical Reports Server (NTRS)

    Nickel, Craig; Lebois, Ryan; Lutz, Stephen; Dichmann, Donald; Parker, Joel

    2016-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will be injected into a highly eccentric Earth orbit and fly 3.5 phasing loops followed by a lunar flyby to enter a mission orbit with lunar 2:1 resonance. Through the phasing loops and mission orbit, the trajectory is significantly affected by lunar and solar gravity. We have developed a trajectory design to achieve the mission orbit and meet mission constraints, including eclipse avoidance and a 30-year geostationary orbit avoidance requirement. A parallelized Monte Carlo simulation was performed to validate the trajectory after injecting common perturbations, including launch dispersions, orbit determination errors, and maneuver execution errors. The Monte Carlo analysis helped identify mission risks and is used in the trajectory selection process.
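
    The perturbation-injection step described above can be sketched in miniature: sample dispersions around a nominal quantity and examine the spread of outcomes. The maneuver magnitude and the 1% one-sigma execution-error dispersion below are hypothetical stand-ins, not TESS mission values:

```python
import random

random.seed(3)

N = 5000
dv_nominal = 100.0  # m/s, hypothetical maneuver magnitude

def perturbed_dv():
    """Inject an assumed 1% (1-sigma) proportional execution error."""
    return dv_nominal * (1 + random.gauss(0, 0.01))

samples = [perturbed_dv() for _ in range(N)]
mean = sum(samples) / N
sd = (sum((s - mean) ** 2 for s in samples) / N) ** 0.5

# Dispersion statistics feed the risk assessment and trajectory selection
print(abs(mean - dv_nominal) < 0.1 and 0.8 < sd < 1.2)
```

    A real mission analysis would propagate each sampled state through the full dynamics and check constraint violations (eclipse, geostationary-belt avoidance) per sample; the sketch only shows the sampling-and-statistics skeleton.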

  5. Observations of fallibility in applications of modern programming methodologies

    NASA Technical Reports Server (NTRS)

    Gerhart, S. L.; Yelowitz, L.

    1976-01-01

    Errors, inconsistencies, or confusing points are noted in a variety of published algorithms, many of which are being used as examples in formulating or teaching principles of such modern programming methodologies as formal specification, systematic construction, and correctness proving. Common properties of these points of contention are abstracted. These properties are then used to pinpoint possible causes of the errors and to formulate general guidelines which might help to avoid further errors. The common characteristic of mathematical rigor and reasoning in these examples is noted, leading to some discussion about fallibility in mathematics, and its relationship to fallibility in these programming methodologies. The overriding goal is to cast a more realistic perspective on the methodologies, particularly with respect to older methodologies, such as testing, and to provide constructive recommendations for their improvement.

  6. Evidence for aversive withdrawal response to own errors.

    PubMed

    Hochman, Eldad Yitzhak; Milman, Valery; Tal, Liron

    2017-10-01

    A recent model suggests that error detection gives rise to defensive motivation prompting protective behavior. Models of active avoidance behavior predict that it should grow larger with threat imminence and avoidance. We hypothesized that in a task requiring left or right key strikes, error detection would drive an avoidance reflex manifested by rapid withdrawal of the erring finger, growing larger with threat imminence and avoidance. In experiment 1, three groups differing in error-related threat imminence and avoidance performed a flanker task requiring left or right strikes on force-sensitive keys. As predicted, errors were followed by rapid force release, which grew faster with threat imminence and the opportunity to evade threat. In experiment 2, we established a link between error key release time (KRT) and the subjective sense of inner threat. In a simultaneous multiple regression analysis of three error-related compensatory mechanisms (error KRT, flanker effect, error correction RT), only error KRT was significantly associated with increased compulsive checking tendencies. We propose that error response withdrawal reflects an error-withdrawal reflex. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Errors in imaging of traumatic injuries.

    PubMed

    Scaglione, Mariano; Iaselli, Francesco; Sica, Giacomo; Feragalli, Beatrice; Nicola, Refky

    2015-10-01

    The advent of multi-detector computed tomography (MDCT) has drastically improved the outcomes of patients with multiple traumatic injuries. However, there are still diagnostic challenges to be considered. A missed or delayed diagnosis in trauma patients can sometimes be related to perception or other non-visual cues, while other errors are due to poor technique or poor image quality. In order to avoid serious complications, it is important for the practicing radiologist to be cognizant of the most common types of errors. The objective of this article is to review the various types of errors in the evaluation of patients with multiple trauma injuries or polytrauma with MDCT.

  8. Risk of Performance and Behavioral Health Decrements Due to Inadequate Cooperation, Coordination, Communication, and Psychosocial Adaptation within a Team

    NASA Technical Reports Server (NTRS)

    Landon, Lauren Blackwell; Vessey, William B.; Barrett, Jamie D.

    2015-01-01

    A team is defined as: "two or more individuals who interact socially and adaptively, have shared or common goals, and hold meaningful task interdependences; it is hierarchically structured and has a limited life span; in it expertise and roles are distributed; and it is embedded within an organization/environmental context that influences and is influenced by ongoing processes and performance outcomes" (Salas, Stagl, Burke, & Goodwin, 2007, p. 189). From the NASA perspective, a team is commonly understood to be a collection of individuals that is assigned to support and achieve a particular mission. Thus, depending on context, this definition can encompass both the spaceflight crew and the individuals and teams in the larger multi-team system who are assigned to support that crew during a mission. The Team Risk outcomes of interest are predominantly performance related, with a secondary emphasis on long-term health; this is somewhat unique in the NASA HRP in that most Risk areas are medically related and primarily focused on long-term health consequences. In many operational environments (e.g., aviation), performance is assessed as the avoidance of errors. However, the research on performance errors is ambiguous. It implies that actions may be dichotomized into "correct" or "incorrect" responses, where incorrect responses or errors are always undesirable. Researchers have argued that this dichotomy is a harmful oversimplification, and it would be more productive to focus on the variability of human performance and how organizations can manage that variability (Hollnagel, Woods, & Leveson, 2006) (Category III). Two problems occur when focusing on performance errors: 1) the errors are infrequent and, therefore, difficult to observe and record; and 2) the errors do not directly correspond to failure. Research reveals that humans are fairly adept at correcting or compensating for performance errors before such errors result in recognizable or recordable failures.
Astronauts are notably adept high performers. Most failures are recorded only when multiple, small errors occur and humans are unable to recognize and correct or compensate for these errors in time to prevent a failure (Dismukes, Berman, Loukopoulos, 2007) (Category III). More commonly, observers record variability in levels of performance. Some teams commit no observable errors but fail to achieve performance objectives or perform only adequately, while other teams commit some errors but perform spectacularly. Successful performance, therefore, cannot be viewed as simply the absence of errors or the avoidance of failure (Johnson Space Center (JSC) Joint Leadership Team, 2008). While failure is commonly attributed to making a major error, focusing solely on the elimination of error(s) does not significantly reduce the risk of failure. Failure may also occur when performance is simply insufficient or an effort is incapable of adjusting sufficiently to a contextual change (e.g., changing levels of autonomy).

  9. Monte Carlo Analysis as a Trajectory Design Driver for the Transiting Exoplanet Survey Satellite (TESS) Mission

    NASA Technical Reports Server (NTRS)

    Nickel, Craig; Parker, Joel; Dichmann, Don; Lebois, Ryan; Lutz, Stephen

    2016-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will be injected into a highly eccentric Earth orbit and fly 3.5 phasing loops followed by a lunar flyby to enter a mission orbit with lunar 2:1 resonance. Through the phasing loops and mission orbit, the trajectory is significantly affected by lunar and solar gravity. We have developed a trajectory design to achieve the mission orbit and meet mission constraints, including eclipse avoidance and a 30-year geostationary orbit avoidance requirement. A parallelized Monte Carlo simulation was performed to validate the trajectory after injecting common perturbations, including launch dispersions, orbit determination errors, and maneuver execution errors. The Monte Carlo analysis helped identify mission risks and is used in the trajectory selection process.

  10. Medication Errors in Pediatric Anesthesia: A Report From the Wake Up Safe Quality Improvement Initiative.

    PubMed

    Lobaugh, Lauren M Y; Martin, Lizabeth D; Schleelein, Laura E; Tyler, Donald C; Litman, Ronald S

    2017-09-01

    Wake Up Safe is a quality improvement initiative of the Society for Pediatric Anesthesia that contains a deidentified registry of serious adverse events occurring in pediatric anesthesia. The aim of this study was to describe and characterize reported medication errors to find common patterns amenable to preventative strategies. In September 2016, we analyzed approximately 6 years' worth of medication error events reported to Wake Up Safe. Medication errors were classified by: (1) medication category; (2) error type by phase of administration: prescribing, preparation, or administration; (3) bolus or infusion error; (4) provider type and level of training; (5) harm as defined by the National Coordinating Council for Medication Error Reporting and Prevention; and (6) perceived preventability. From 2010 to the time of our data analysis in September 2016, 32 institutions had joined and submitted data on 2087 adverse events during 2,316,635 anesthetics. These reports contained details of 276 medication errors, which comprised the third highest category of events behind cardiac and respiratory related events. Medication errors most commonly involved opioids and sedative/hypnotics. When categorized by phase of handling, 30 events occurred during preparation, 67 during prescribing, and 179 during administration. The most common error type was accidental administration of the wrong dose (N = 84), followed by syringe swap (accidental administration of the wrong syringe, N = 49). Fifty-seven (21%) reported medication errors involved medications prepared as infusions as opposed to one-time bolus administrations. Medication errors were committed by all types of anesthesia providers, most commonly by attendings. Over 80% of reported medication errors reached the patient, and more than half of these events caused patient harm. Fifteen events (5%) required a life-sustaining intervention. Nearly all cases (97%) were judged to be either likely or certainly preventable.
Our findings characterize the most common types of medication errors in pediatric anesthesia practice and provide guidance on future preventative strategies. Many of these errors would be largely preventable with the use of prefilled medication syringes to avoid accidental ampule swap, bar-coding at the point of medication administration to prevent syringe swap and to confirm the proper dose, and 2-person checking of medication infusions for accuracy.

  11. The science of medical decision making: neurosurgery, errors, and personal cognitive strategies for improving quality of care.

    PubMed

    Fargen, Kyle M; Friedman, William A

    2014-01-01

    During the last 2 decades, there has been a shift in the U.S. health care system towards improving the quality of health care provided by enhancing patient safety and reducing medical errors. Unfortunately, surgical complications, patient harm events, and malpractice claims remain common in the field of neurosurgery. Many of these events are potentially avoidable. There are an increasing number of publications in the medical literature in which authors address cognitive errors in diagnosis and treatment and strategies for reducing such errors, but these are for the most part absent in the neurosurgical literature. The purpose of this article is to highlight the complexities of medical decision making to a neurosurgical audience, with the hope of providing insight into the biases that lead us towards error and strategies to overcome our innate cognitive deficiencies. To accomplish this goal, we review the current literature on medical errors and just culture, explain the dual process theory of cognition, identify common cognitive errors affecting neurosurgeons in practice, review cognitive debiasing strategies, and finally provide simple methods that can be easily assimilated into neurosurgical practice to improve clinical decision making. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Guidelines for Teaching the Holocaust: Avoiding Common Pedagogical Errors

    ERIC Educational Resources Information Center

    Lindquist, David H.

    2006-01-01

    Teaching the Holocaust is a complex undertaking involving twists and turns that can frustrate and even intimidate educators who teach the Holocaust. This complexity involves both the event's history and its pedagogy. In this article, the author considers eight pedagogical approaches that often cause problems in teaching the event. He states each…

  13. Clinical Problem Analysis (CPA): A Systematic Approach To Teaching Complex Medical Problem Solving.

    ERIC Educational Resources Information Center

    Custers, Eugene J. F. M.; Robbe, Peter F. De Vries; Stuyt, Paul M. J.

    2000-01-01

    Discusses clinical problem analysis (CPA) in medical education, an approach to solving complex clinical problems. Outlines the five step CPA model and examines the value of CPA's content-independent (methodical) approach. Argues that teaching students to use CPA will enable them to avoid common diagnostic reasoning errors and pitfalls. Compares…

  14. Seven rules to avoid the tragedy of the commons.

    PubMed

    Murase, Yohsuke; Baek, Seung Ki

    2018-07-14

    Cooperation among self-interested players in a social dilemma is fragile and easily interrupted by mistakes. In this work, we study the repeated n-person public-goods game and search for a strategy that forms a cooperative Nash equilibrium in the presence of implementation error, with a guarantee that the resulting payoff will be no less than any of the co-players'. By enumerating strategic possibilities for n=3, we show that such a strategy indeed exists when its memory length m equals three. This means that a deterministic strategy can be publicly employed to stabilize cooperation against error while avoiding the risk of being exploited. We furthermore show that, for the general n-person public-goods game, m ≥ n is necessary to satisfy the above criteria. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  15. Avoiding Misdiagnosis in Patients with Neurological Emergencies

    PubMed Central

    Pope, Jennifer V.; Edlow, Jonathan A.

    2012-01-01

    Approximately 5% of patients presenting to emergency departments have neurological symptoms. The most common symptoms or diagnoses include headache, dizziness, back pain, weakness, and seizure disorder. Little is known about the actual misdiagnosis of these patients, which can have disastrous consequences for both patients and physicians. This paper reviews the existing literature on the misdiagnosis of neurological emergencies and analyzes the reasons behind misdiagnosis by specific presenting complaint. Our goal is to help emergency physicians and other providers reduce diagnostic error, understand how these errors are made, and improve patient care. PMID:22888439

  16. Common errors of drug administration in infants: causes and avoidance.

    PubMed

    Anderson, B J; Ellis, J F

    1999-01-01

    Drug administration errors are common in infants. Although the infant population has a high exposure to drugs, there are few data concerning pharmacokinetics or pharmacodynamics, or the influence of paediatric diseases on these processes. Children remain therapeutic orphans. Formulations are often suitable only for adults; in addition, the lack of maturation of drug elimination processes, alteration of body composition and influence of size render the calculation of drug doses complex in infants. The commonest drug administration error in infants is one of dose, and the commonest hospital site for this error is the intensive care unit. Drug errors are a consequence of system error, and preventive strategies are possible through system analysis. The goal of a zero drug error rate should be aggressively sought, with systems in place that aim to eliminate the effects of inevitable human error. This involves review of the entire system from drug manufacture to drug administration. The nuclear industry, telecommunications and air traffic control services all practise error reduction policies with zero error as a clear goal, not by finding fault in the individual, but by identifying faults in the system and building into that system mechanisms for picking up faults before they occur. Such policies could be adapted to medicine using interventions both specific (the production of formulations which are for children only and clearly labelled, regular audit by pharmacists, legible prescriptions, standardised dose tables) and general (paediatric drug trials, education programmes, nonpunitive error reporting) to reduce the number of errors made in giving medication to infants.

  17. A robust interpolation procedure for producing tidal current ellipse inputs for regional and coastal ocean numerical models

    NASA Astrophysics Data System (ADS)

    Byun, Do-Seong; Hart, Deirdre E.

    2017-04-01

    Regional and/or coastal ocean models can use tidal current harmonic forcing, together with tidal harmonic forcing along open boundaries, in order to successfully simulate tides and tidal currents. These inputs can be freely generated using online open-access data, but the data produced are not always at the resolution required for regional or coastal models. Subsequent interpolation procedures can produce tidal current forcing data errors for parts of the world's coastal ocean where tidal ellipse inclinations and phases move across the invisible mathematical "boundaries" between 359° and 0° (or 179° and 0°). In nature, such "boundaries" are in fact smooth transitions, but if these mathematical "boundaries" are not treated correctly during interpolation, they can produce inaccurate input data and hamper the accurate simulation of tidal currents in regional and coastal ocean models. These avoidable errors arise due to procedural shortcomings involving vector embodiment problems (i.e., how a vector is represented mathematically, for example as velocities or as coordinates). Automated solutions for producing correct tidal ellipse parameter input data are possible if a series of steps is followed correctly, including the use of Cartesian coordinates during interpolation. This note comprises the first published description of scenarios where tidal ellipse parameter interpolation errors can arise, and of a procedure to successfully avoid these errors when generating tidal inputs for regional and/or coastal ocean numerical models. We explain how a straightforward sequence of data production, format conversion, interpolation, and format reconversion steps may be used to check for the potential occurrence and avoidance of tidal ellipse interpolation and phase errors. This sequence is demonstrated via a case study of the M2 tidal constituent in the seas around Korea but is designed to be universally applicable.
We also recommend employing tidal ellipse parameter calculation methods that avoid the use of Foreman's (1978) "northern semi-major axis convention" since, as revealed in our analysis, this commonly used conversion can result in inclination interpolation errors even when Cartesian coordinate-based "vector embodiment" solutions are employed.
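The wraparound problem described above can be illustrated with a minimal sketch (not the authors' actual procedure, which involves full tidal ellipse parameter conversions): naively averaging an angular quantity across the 359°/0° boundary produces a spurious value near 180°, whereas interpolating the Cartesian components (cos, sin) and converting back recovers the smooth transition.

```python
import numpy as np

def interp_angle_naive(theta0, theta1, w):
    """Wrap-unaware linear interpolation of two angles (degrees)."""
    return (1 - w) * theta0 + w * theta1

def interp_angle_vector(theta0, theta1, w):
    """Wrap-safe interpolation: interpolate Cartesian components, then
    recover the angle with the quadrant-aware arctan2."""
    t0, t1 = np.radians(theta0), np.radians(theta1)
    x = (1 - w) * np.cos(t0) + w * np.cos(t1)
    y = (1 - w) * np.sin(t0) + w * np.sin(t1)
    return np.degrees(np.arctan2(y, x)) % 360.0

# Midpoint between phases of 359° and 1° should sit at ~0°, not 180°.
print(interp_angle_naive(359, 1, 0.5))   # → 180.0 (spurious jump)
print(interp_angle_vector(359, 1, 0.5))  # close to 0° (or, equivalently, 360°)
```

The same component-wise treatment applies to inclinations, with the caveat noted in the abstract that inclination conventions (e.g., a 0–180° range) need their own handling.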

  18. Applying Jlint to Space Exploration Software

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus

    2004-01-01

    Java is a very successful programming language which is also becoming widespread in embedded systems, where software correctness is critical. Jlint is a simple but highly efficient static analyzer that checks a Java program for several common errors, such as null pointer exceptions and overflow errors. It also includes checks for multi-threading problems, such as deadlocks and data races. The case study described here shows the effectiveness of Jlint in finding such errors; an analysis of the false positives among the multi-threading warnings gives an insight into design patterns commonly used in multi-threaded code. The results show that a few analysis techniques are sufficient to avoid almost all false positives. These techniques include investigating all possible callers and a few code idioms. Verifying the correct application of these patterns is still crucial, because their correct usage is not trivial.

  19. Systolic VLSI Reed-Solomon Decoder

    NASA Technical Reports Server (NTRS)

    Shao, H. M.; Truong, T. K.; Deutsch, L. J.; Yuen, J. H.

    1986-01-01

    Decoder for digital communications provides high-speed, pipelined Reed-Solomon (RS) error-correction decoding of data streams. Principal new feature of proposed decoder is modification of Euclid greatest-common-divisor algorithm to avoid need for time-consuming computations of inverse of certain Galois-field quantities. Decoder architecture suitable for implementation on very-large-scale integrated (VLSI) chips with negative-channel metal-oxide/silicon circuitry.

  20. Move In and Move Up.

    ERIC Educational Resources Information Center

    Butler, E. A.

    A man's work shapes him far more profoundly than any other single influence in his life. There are many ways in which a person can find himself in the wrong job, but time, thought, and action invested before accepting a position can help the job seeker avoid many of the common errors. The introductory letter and resume can make or break a career.…

  1. Talking about God with Trauma Survivors.

    PubMed

    Ross, Colin A

    2016-12-31

    Severe, chronic childhood trauma commonly results in a set of negative core self-beliefs. These include blaming the self for the abuse, feeling unworthy and unlovable, believing the world would be better off if one committed suicide, and believing that one does not deserve peace or happiness. Linked to these cognitive errors are beliefs that one is not worthy of God's love, that God wanted the person to be abused, and that the person can avoid God's judgment if she does not go to church. Strategies for dealing with these cognitive errors about God are presented within the context of a secular psychotherapy.

  2. Types of diagnostic errors in neurological emergencies in the emergency department.

    PubMed

    Dubosh, Nicole M; Edlow, Jonathan A; Lefton, Micah; Pope, Jennifer V

    2015-02-01

    Neurological emergencies often pose diagnostic challenges for emergency physicians because these patients often present with atypical symptoms and standard imaging tests are imperfect. Misdiagnosis occurs due to a variety of errors. These can be classified as knowledge gaps, cognitive errors, and systems-based errors. The goal of this study was to describe these errors through review of quality assurance (QA) records. This was a retrospective pilot study of patients with neurological emergency diagnoses that were missed or delayed at one urban, tertiary academic emergency department. Cases meeting inclusion criteria were identified through review of QA records. Three emergency physicians independently reviewed each case and determined the type of error that led to the misdiagnosis. Proportions, confidence intervals, and a reliability coefficient were calculated. During the study period, 1168 cases were reviewed. Forty-two cases were found to include a neurological misdiagnosis and twenty-nine were determined to be the result of an error. The distribution of error types was as follows: knowledge gap 45.2% (95% CI 29.2, 62.2), cognitive error 29.0% (95% CI 15.9, 46.8), and systems-based error 25.8% (95% CI 13.5, 43.5). Cerebellar strokes were the most common type of stroke misdiagnosed, accounting for 27.3% of missed strokes. All three error types contributed to the misdiagnosis of neurological emergencies. Misdiagnosis of cerebellar lesions and erroneous radiology resident interpretations of neuroimaging were the most common mistakes. Understanding the types of errors may enable emergency physicians to develop possible solutions and avoid them in the future.
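As an aside on the statistics above: the abstract reports proportions with 95% confidence intervals but not the raw counts. Assuming 14 of 31 knowledge-gap errors (a back-calculated, hypothetical count that reproduces the reported 45.2%), a Wilson score interval matches the published (29.2, 62.2) — a sketch of that calculation:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n at ~95% (z=1.96)."""
    p = k / n
    denom = 1 + z**2 / n
    center = p + z**2 / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - half) / denom, (center + half) / denom

# Assumed counts (14/31) reproduce the reported proportion and interval.
lo, hi = wilson_ci(14, 31)
print(f"{14/31:.1%} (95% CI {lo:.1%}, {hi:.1%})")  # → 45.2% (95% CI 29.2%, 62.2%)
```

The Wilson interval is preferred over the normal approximation at small n such as this, which is presumably why the published bounds are asymmetric around the point estimate.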

  3. Appendiceal goblet cell carcinoid: common errors in staging and clinical interpretation with a proposal for an improved terminology.

    PubMed

    Wen, Kwun Wah; Hale, Gillian; Shafizadeh, Nafis; Hosseini, Mojgan; Huang, Anne; Kakar, Sanjay

    2017-07-01

    Goblet cell carcinoid (GCC) is staged and treated as adenocarcinoma (AC) and not as neuroendocrine tumor (NET) or neuroendocrine carcinoma. The term carcinoid may lead to incorrect interpretation as NET. The aim of the study was to explore pitfalls in staging and clinical interpretation of GCC and mixed GCC-AC, and propose strategies to avoid common errors. Diagnostic terminology, staging, and clinical interpretation were evaluated in 58 cases (27 GCCs, 31 mixed GCC-ACs). Opinions were collected from 23 pathologists using a survey. Clinical notes were reviewed to assess the interpretation of pathology diagnoses by oncologists. NET staging was incorrectly used for 25% of GCCs and 5% of mixed GCC-ACs. In the survey, 43% of pathologists incorrectly indicated that NET staging is applicable to GCCs, and 43% incorrectly responded that Ki-67 proliferation index is necessary for GCC grading. Two cases each of GCC and mixed GCC-AC were incorrectly interpreted as neuroendocrine neoplasms by oncologists, and platinum-based therapy was considered for 2 GCC-AC cases because of the mistaken impression of neuroendocrine carcinoma created by use of the World Health Organization 2010 term mixed adenoneuroendocrine carcinoma. The term carcinoid in GCC and use of mixed adenoneuroendocrine carcinoma for mixed GCC-AC lead to errors in staging and treatment. We propose that goblet cell carcinoid should be changed to goblet cell carcinoma, whereas GCC with AC should be referred to as mixed GCC-AC with a comment about the proportion of each component and the histologic subtype of AC. This terminology will facilitate appropriate staging and clinical management, and avoid errors in interpretation. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Secondary data analysis of large data sets in urology: successes and errors to avoid.

    PubMed

    Schlomer, Bruce J; Copp, Hillary L

    2014-03-01

    Secondary data analysis is the use of data collected for research by someone other than the investigator. In the last several years there has been a dramatic increase in the number of these studies being published in urological journals and presented at urological meetings, especially involving secondary data analysis of large administrative data sets. Along with this expansion, skepticism toward secondary data analysis studies has increased among many urologists. In this narrative review we discuss the types of large data sets that are commonly used for secondary data analysis in urology, and discuss the advantages and disadvantages of secondary data analysis. A literature search was performed to identify urological secondary data analysis studies published since 2008 using commonly used large data sets, and examples of high quality studies published in high impact journals are given. We outline an approach for performing a successful hypothesis or goal driven secondary data analysis study and highlight common errors to avoid. More than 350 secondary data analysis studies using large data sets have been published on urological topics since 2008, with likely many more presented at meetings but never published. Studies that were neither hypothesis nor goal driven have likely constituted some of these and probably contributed to the increased skepticism of this type of research. However, many high quality, hypothesis driven studies addressing research questions that would have been difficult to conduct with other methods have been performed in the last few years. Secondary data analysis is a powerful tool that can address questions which could not be adequately studied by another method. Knowledge of the limitations of secondary data analysis and of the data sets used is critical for a successful study. There are also important errors to avoid when planning and performing a secondary data analysis study.
Investigators and the urological community need to strive to use secondary data analysis of large data sets appropriately to produce high quality studies that hopefully lead to improved patient outcomes. Copyright © 2014 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  5. Advice on Writing a Scientific Paper

    NASA Astrophysics Data System (ADS)

    Sterken, C.

    2006-04-01

    What makes one author a good communicator and another a poor one? What makes one manuscript a swift editorial task, and another an editorial nightmare? Based on direct experience with the manuscripts of the lectures and papers presented during this school, advice is given on what to do and what to avoid when writing a scientific paper. Recommendations are also provided on how to prepare manuscripts, handle copyright and permissions to reproduce, anticipate plagiarism, deal with editors and referees, and avoid common errors. A few illustrations of English grammar and style for the foreign author are given.

  6. Unfavourable results with distraction in craniofacial skeleton

    PubMed Central

    Agarwal, Rajiv

    2013-01-01

    Distraction osteogenesis has revolutionised the management of craniofacial abnormalities. The technique however requires precise planning, patient selection, execution and follow-up to achieve consistent and positive results and to avoid unfavourable results. The unfavourable results with craniofacial distraction stem from many factors ranging from improper patient selection, planning and use of inappropriate distraction device and vector. The present study analyses the current standards and techniques of distraction and details in depth the various errors and complications that may occur due to this technique. The commonly observed complications of distraction have been detailed along with measures and suggestions to avoid them in clinical practice. PMID:24501455

  7. Neural Mechanisms for Adaptive Learned Avoidance of Mental Effort.

    PubMed

    Mitsuto Nagase, Asako; Onoda, Keiichi; Clifford Foo, Jerome; Haji, Tomoki; Akaishi, Rei; Yamaguchi, Shuhei; Sakai, Katsuyuki; Morita, Kenji

    2018-02-05

    Humans tend to avoid mental effort. Previous studies have demonstrated this tendency using various demand-selection tasks; participants generally avoid options associated with higher cognitive demand. However, it remains unclear whether humans avoid mental effort adaptively in uncertain and non-stationary environments, and if so, what neural mechanisms underlie this learned avoidance and whether they remain the same irrespective of cognitive-demand types. We addressed these issues by developing novel demand-selection tasks where associations between choice options and cognitive-demand levels change over time, with two variations using mental arithmetic and spatial reasoning problems (29:4 and 18:2 males:females). Most participants showed avoidance, and their choices depended on the demand experienced on multiple preceding trials. We assumed that participants updated the expected cost of mental effort through experience, and fitted their choices by reinforcement learning models, comparing several possibilities. Model-based fMRI analyses revealed that activity in the dorsomedial and lateral frontal cortices was positively correlated with the trial-by-trial expected cost for the chosen option commonly across the different types of cognitive demand, and also revealed a trend of negative correlation in the ventromedial prefrontal cortex. We further identified correlates of cost-prediction-error at time of problem-presentation or answering the problem, the latter of which partially overlapped with or were proximal to the correlates of expected cost at time of choice-cue in the dorsomedial frontal cortex. These results suggest that humans adaptively learn to avoid mental effort, having neural mechanisms to represent expected cost and cost-prediction-error, and the same mechanisms operate for various types of cognitive demand. SIGNIFICANCE STATEMENT In daily life, humans encounter various cognitive demands, and tend to avoid high-demand options. 
However, it remains unclear whether humans avoid mental effort adaptively under dynamically changing environments, and if so, what the underlying neural mechanisms are and whether they operate irrespective of cognitive-demand types. To address these issues, we developed novel tasks in which participants could learn to avoid high-demand options under uncertain and non-stationary environments. Through model-based fMRI analyses, we found regions whose activity was correlated with the expected mental effort cost, or cost-prediction-error, regardless of demand type, with overlap or adjacency in the dorsomedial frontal cortex. This finding contributes to clarifying the mechanisms of cognitive-demand avoidance, and provides empirical building blocks for the emerging computational theory of mental effort. Copyright © 2018 the authors.

  8. Cost-Effectiveness Analysis of an Automated Medication System Implemented in a Danish Hospital Setting.

    PubMed

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    To evaluate the cost-effectiveness of an automated medication system (AMS) implemented in a Danish hospital setting. An economic evaluation was performed alongside a controlled before-and-after effectiveness study with one control ward and one intervention ward. The primary outcome measure was the number of errors in the medication administration process observed prospectively before and after implementation. To determine the difference in the proportion of errors after implementation of the AMS, logistic regression was applied with the presence of error(s) as the dependent variable. Time, group, and interaction between time and group were the independent variables. The cost analysis used the hospital perspective with a short-term incremental costing approach. The total 6-month costs with and without the AMS were calculated, as well as the incremental costs. The number of avoided administration errors was related to the incremental costs to obtain the cost-effectiveness ratio expressed as the cost per avoided administration error. The AMS resulted in a statistically significant reduction in the proportion of errors in the intervention ward compared with the control ward. The cost analysis showed that the AMS increased the ward's 6-month cost by €16,843. The cost-effectiveness ratio was estimated at €2.01 per avoided administration error, €2.91 per avoided procedural error, and €19.38 per avoided clinical error. The AMS was effective in reducing errors in the medication administration process at a higher overall cost. The cost-effectiveness analysis showed that the AMS was associated with affordable cost-effectiveness ratios. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
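The ratio reported here is a simple quotient: incremental cost divided by the number of errors avoided. A minimal sketch of the arithmetic, where the avoided-error count of 8,380 is back-calculated from the published figures and is therefore an assumption (the abstract reports only the cost and the resulting ratio):

```python
def icer(incremental_cost, errors_avoided):
    """Incremental cost-effectiveness ratio: extra cost per error avoided."""
    return incremental_cost / errors_avoided

# Reported 6-month incremental cost of €16,843; hypothetical 8,380 avoided
# administration errors back-calculated from the reported €2.01 ratio.
print(round(icer(16843, 8380), 2))  # → 2.01
```

The same quotient with the (smaller) counts of avoided procedural and clinical errors yields the higher €2.91 and €19.38 ratios in the abstract.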

  9. Avoidable interruptions during drug administration in an intensive rehabilitation ward: improvement project.

    PubMed

    Buchini, Sara; Quattrin, Rosanna

    2012-04-01

    To record the frequency of interruptions and their causes, to identify 'avoidable' interruptions and to build an improvement project to reduce 'avoidable' interruptions. In Italy, 30,000-35,000 deaths per year are attributed to health-care system errors, of which 19% are caused by medication errors. The factors that contribute to drug management error also include interruptions and carelessness during treatment administration. A descriptive study design was used to record the frequency of interruptions and their causes and to identify 'avoidable' interruptions in an intensive rehabilitation ward in Northern Italy. A data collection grid was used to record the data over a 6-month period. A total of 3000 work hours were observed. During the study period 1170 interruptions were observed. The study identified 14 causes of interruption, of which at least nine can be defined as 'avoidable'. An improvement project has been proposed to reduce unnecessary interruptions and distractions and so avoid errors. An additional useful step to reduce the incidence of treatment errors would be to implement a single patient medication sheet for recording drug prescription, preparation and administration, as well as incident reporting. © 2011 Blackwell Publishing Ltd.

  10. Medical Error Avoidance in Intraoperative Neurophysiological Monitoring: The Communication Imperative.

    PubMed

    Skinner, Stan; Holdefer, Robert; McAuliffe, John J; Sala, Francesco

    2017-11-01

    Error avoidance in medicine follows similar rules that apply within the design and operation of other complex systems. The error-reduction concepts that best fit the conduct of testing during intraoperative neuromonitoring are forgiving design (reversibility of signal loss to avoid/prevent injury) and system redundancy (reduction of false reports by the multiplication of the error rate of tests independently assessing the same structure). However, error reduction in intraoperative neuromonitoring is complicated by the dichotomous roles (and biases) of the neurophysiologist (test recording and interpretation) and surgeon (intervention). This "interventional cascade" can be given as follows: test → interpretation → communication → intervention → outcome. Observational and controlled trials within operating rooms demonstrate that optimized communication, collaboration, and situational awareness result in fewer errors. Well-functioning operating room collaboration depends on familiarity and trust among colleagues. Checklists represent one method to initially enhance communication and avoid obvious errors. All intraoperative neuromonitoring supervisors should strive to use sufficient means to secure situational awareness and trusted communication/collaboration. Face-to-face audiovisual teleconnections may help repair deficiencies when a particular practice model disallows personal operating room availability. All supervising intraoperative neurophysiologists need to reject an insular or deferential or distant mindset.

  11. tPA Prescription and Administration Errors within a Regional Stroke System

    PubMed Central

    Chung, Lee S; Tkach, Aleksander; Lingenfelter, Erin M; Dehoney, Sarah; Rollo, Jeannie; de Havenon, Adam; DeWitt, Lucy Dana; Grantz, Matthew Ryan; Wang, Haimei; Wold, Jana J; Hannon, Peter M; Weathered, Natalie R; Majersik, Jennifer J

    2015-01-01

    Background IV tPA utilization in acute ischemic stroke (AIS) requires weight-based dosing and a standardized infusion rate. In our regional network, we have tried to minimize tPA dosing errors. We describe the frequency and types of tPA administration errors made in our comprehensive stroke center (CSC) and at community hospitals (CHs) prior to transfer. Methods Using our stroke quality database, we extracted clinical and pharmacy information on all patients who received IV tPA from 2010–11 at the CSC or CH prior to transfer. All records were analyzed for the presence of inclusion/exclusion criteria deviations or tPA errors in prescription, reconstitution, dispensing, or administration, and analyzed for association with outcomes. Results We identified 131 AIS cases treated with IV tPA: 51% female; mean age 68; 32% treated at CSC, 68% at CH (including 26% by telestroke) from 22 CHs. tPA prescription and administration errors were present in 64% of all patients (41% CSC, 75% CH, p<0.001), the most common being incorrect dosage for body weight (19% CSC, 55% CH, p<0.001). Of the 27 overdoses, there were 3 deaths due to systemic hemorrhage or ICH. Nonetheless, outcomes (parenchymal hematoma, mortality, mRS) did not differ between CSC and CH patients nor between those with and without errors. Conclusion Despite focus on minimization of tPA administration errors in AIS patients, such errors were very common in our regional stroke system. Although an association between tPA errors and stroke outcomes was not demonstrated, quality assurance mechanisms are still necessary to reduce potentially dangerous, avoidable errors. PMID:26698642
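The weight-based dosing at issue follows the widely used alteplase regimen for acute ischemic stroke of 0.9 mg/kg (maximum 90 mg), with 10% given as a bolus and the remainder infused over an hour. A sketch of the arithmetic only, not clinical guidance and not any hospital's actual protocol:

```python
def alteplase_dose_mg(weight_kg, dose_per_kg=0.9, max_dose=90.0):
    """Weight-based IV tPA (alteplase) dose for acute ischemic stroke:
    0.9 mg/kg capped at 90 mg; 10% bolus, remainder infused over 60 min.
    Illustrative arithmetic only -- not clinical guidance."""
    total = min(weight_kg * dose_per_kg, max_dose)
    bolus = 0.1 * total          # 10% as an initial bolus
    infusion = total - bolus     # remainder infused over 60 minutes
    return round(total, 1), round(bolus, 1), round(infusion, 1)

print(alteplase_dose_mg(80))   # → (72.0, 7.2, 64.8)
print(alteplase_dose_mg(110))  # capped at 90 mg → (90.0, 9.0, 81.0)
```

The "incorrect dosage for body weight" errors described above correspond to deviations from this calculation, e.g., using an estimated rather than measured weight or omitting the 90 mg cap.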

  12. Detection and avoidance of errors in computer software

    NASA Technical Reports Server (NTRS)

    Kinsler, Les

    1989-01-01

    The acceptance test errors of a computer software project were analyzed to determine whether the errors could have been detected or avoided in earlier phases of development. GROAGSS (Gamma Ray Observatory Attitude Ground Support System) was selected as the software project to be examined. The development of the software followed the standard Flight Dynamics Software Development methods. GROAGSS was developed between August 1985 and April 1989. The project is approximately 250,000 lines of code, of which approximately 43,000 lines are reused from previous projects. GROAGSS had a total of 1715 Change Report Forms (CRFs) submitted during the entire development and testing. These changes contained 936 errors. Of these 936 errors, 374 were found during the acceptance testing. These acceptance test errors were first categorized into methods of avoidance, including: more clearly written requirements; detailed review; code reading; structural unit testing; and functional system integration testing. The errors were later broken down in terms of effort to detect and correct, class of error, and probability that the prescribed detection method would be successful. These determinations were based on Software Engineering Laboratory (SEL) documents and interviews with the project programmers. A summary of the results of the categorizations is presented. The number of programming errors at the beginning of acceptance testing can be significantly reduced. The results of the existing development methodology are examined for ways of improvement. A basis is provided for the definition of a new development/testing paradigm. Monitoring of the new scheme will objectively determine its effectiveness at avoiding and detecting errors.

  13. Can we improve patient safety?

    PubMed

    Corbally, Martin Thomas

    2014-01-01

    Despite greater awareness of patient safety issues, especially in the operating room, and the widespread implementation of the World Health Organization (WHO) surgical time out, errors, especially wrong site surgery, continue. Most such errors are due to lapses in communication in which decision makers fail to consult or confirm operative findings, but, more worryingly, in which parental concerns over the planned procedure are ignored or not followed through. The WHO Surgical Pause/Time Out aims to capture and prevent these errors, but the combination of human error and complex hospital environments can overwhelm even robust safety structures and simple common sense. Parents are the ultimate repository of information on their child's condition and planned surgery but are traditionally excluded from the process of Surgical Pause and Time Out, perhaps to avoid additional stress. In addition, surgeons, like pilots, are subject to the phenomenon of "plan-continue-fail" with potentially disastrous outcomes. If we wish to improve patient safety during surgery and avoid wrong site errors, then we must include parents in the Surgical Pause/Time Out. A recent pilot study has shown that including parents added to the stress of neither staff nor parents; moreover, 100% of parents considered that it should be a mandatory component of the Surgical Pause. Surgeons should be required to confirm that the planned procedure is in keeping with the operative findings, especially in extirpative surgery, and this "step back" should be incorporated into the standard Surgical Pause. It is clear that we must improve patient safety further, and these simple measures should add to that potential.

  14. Concomitant prescribing and dispensing errors at a Brazilian hospital: a descriptive study

    PubMed Central

    Silva, Maria das Dores Graciano; Rosa, Mário Borges; Franklin, Bryony Dean; Reis, Adriano Max Moreira; Anchieta, Lêni Márcia; Mota, Joaquim Antônio César

    2011-01-01

    OBJECTIVE: To analyze the prevalence and types of prescribing and dispensing errors occurring with high-alert medications and to propose preventive measures to avoid errors with these medications. INTRODUCTION: The prevalence of adverse events in health care has increased, and medication errors are probably the most common cause of these events. Pediatric patients are known to be a high-risk group and are an important target in medication error prevention. METHODS: Observers collected data on prescribing and dispensing errors occurring with high-alert medications for pediatric inpatients in a university hospital. In addition to classifying the types of error that occurred, we identified cases of concomitant prescribing and dispensing errors. RESULTS: One or more prescribing errors, totaling 1,632 errors, were found in 632 (89.6%) of the 705 high-alert medications that were prescribed and dispensed. We also identified at least one dispensing error in each high-alert medication dispensed, totaling 1,707 errors. Among these dispensing errors, 723 (42.4%) content errors occurred concomitantly with the prescribing errors. A subset of dispensing errors may have occurred because of poor prescription quality. The observed concomitancy should be examined carefully because improvements in the prescribing process could potentially prevent these problems. CONCLUSION: The system of drug prescribing and dispensing at the hospital investigated in this study should be improved by incorporating the best practices of medication safety and preventing medication errors. High-alert medications may be used as triggers for improving the safety of the drug-utilization system. PMID:22012039

  15. Optimizing Automatic Deployment Using Non-functional Requirement Annotations

    NASA Astrophysics Data System (ADS)

    Kugele, Stefan; Haberl, Wolfgang; Tautschnig, Michael; Wechs, Martin

    Model-driven development has become common practice in the design of safety-critical real-time systems. High-level modeling constructs help to reduce the overall system complexity apparent to developers, and this abstraction results in fewer implementation errors in the resulting systems. In order to retain correctness of the model down to the software executed on a concrete platform, human faults during implementation must be avoided. This calls for an automatic, unattended deployment process including allocation, scheduling, and platform configuration.

  16. PREVALENCE OF REFRACTIVE ERRORS IN MADRASSA STUDENTS OF HARIPUR DISTRICT.

    PubMed

    Atta, Zoia; Arif, Abdus Salam; Ahmed, Iftikhar; Farooq, Umer

    2015-01-01

    Visual impairment due to refractive errors is one of the most common problems among school-age children and is the second leading cause of treatable blindness. The Right to Sight, a global initiative launched by a coalition of non-government organizations and the World Health Organization (WHO), aims to eliminate avoidable visual impairment and blindness at a global level. In order to achieve this goal, it is important to know the prevalence of different refractive errors in a community. Children and teenagers are the groups most susceptible to refractive errors, so this population needs to be screened for the different types of refractive error. The objective of this study was to determine the frequency of different types of refractive errors in madrassa students aged 5-20 years in Haripur. This cross-sectional study included 300 students aged 5-20 years in madrassas of Haripur. The students were screened for refractive errors and the types of errors were noted. After screening, glasses were prescribed to the students. Myopia (52.6%) was the most frequent refractive error, followed by hyperopia (28.4%) and astigmatism (19%). This study showed that myopia is an important problem in the madrassa population, with females and males almost equally affected. Spectacle correction of refractive errors is the cheapest and easiest solution to this problem.

  17. Ten quick tips for machine learning in computational biology.

    PubMed

    Chicco, Davide

    2017-01-01

    Machine learning has become a pivotal tool for many projects in computational biology, bioinformatics, and health informatics. Nevertheless, beginners and biomedical researchers often do not have enough experience to run a data mining project effectively, and therefore may follow incorrect practices that lead to common mistakes or over-optimistic results. With this review, we present ten quick tips for taking advantage of machine learning in any computational biology context, by avoiding some common errors that we have observed hundreds of times in multiple bioinformatics projects. We believe our ten suggestions can strongly help any machine learning practitioner carry out a successful project in computational biology and related sciences.
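One of the commonest errors tips of this kind warn against is reporting accuracy measured on the training data itself. A minimal numpy-only sketch (synthetic data and a hypothetical least-squares "classifier", not from the paper) of evaluating on a held-out test set instead:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = (X[:, 0] + rng.normal(0, 0.3, size=200) > 0).astype(float)

# Hold out a test set BEFORE any fitting: evaluating on the training
# data itself is a classic source of over-optimistic results.
split = 150
X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]

# Minimal least-squares "classifier": threshold a linear fit at 0.5.
A_tr = np.c_[X_tr, np.ones(split)]
w, *_ = np.linalg.lstsq(A_tr, y_tr, rcond=None)
predict = lambda A: (np.c_[A, np.ones(len(A))] @ w > 0.5).astype(float)

train_acc = (predict(X_tr) == y_tr).mean()
test_acc = (predict(X_te) == y_te).mean()
# Report test_acc, not train_acc, as the performance estimate.
```

The same discipline extends to feature selection and hyperparameter tuning, which must also be confined to the training split.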

  18. Preanalytical Errors in Hematology Laboratory- an Avoidable Incompetence.

    PubMed

    Kaur, Harsimran; Narang, Vikram; Selhi, Pavneet Kaur; Sood, Neena; Singh, Aminder

    2016-01-01

    Quality assurance in the hematology laboratory is essential to assure laboratory users of reliable test results with a high degree of precision and accuracy. Even after many advances in hematology laboratory practice, preanalytical errors remain a challenge for practicing pathologists. This study was undertaken with the objective of evaluating the types and frequency of preanalytical errors in the hematology laboratory of our center. All samples received in the Hematology Laboratory of Dayanand Medical College and Hospital, Ludhiana, India over a period of one year (July 2013-July 2014) were included in the study, and preanalytical variables such as clotted samples, quantity not sufficient, wrong sample, without label, and wrong label were studied. Of the 471,006 samples received in the laboratory, preanalytical errors in the above categories were found in 1802 samples. The most common error was clotted samples (1332 samples, 0.28% of the total), followed by quantity not sufficient (328 samples, 0.06%), wrong sample (96 samples, 0.02%), without label (24 samples, 0.005%), and wrong label (22 samples, 0.005%). Preanalytical errors are frequent in laboratories and can be reduced by regular analysis of the variables involved, together with regular education of the staff.

  19. [The quality of medication orders--can it be improved?].

    PubMed

    Vaknin, Ofra; Wingart-Emerel, Efrat; Stern, Zvi

    2003-07-01

    Medication errors are a common cause of morbidity and mortality among patients. Medication administration in hospitals is a complicated procedure, with the possibility of error at each step. Errors are most commonly found at the prescription and transcription stages, although it is known that most errors can easily be avoided through strict adherence to standardized procedure guidelines. In an examination of medication errors reported in the hospital in the year 2000, we found that 38% were reported to have resulted from transcription errors. In 2001, the hospital initiated a program designed to identify faulty processing of orders in an effort to improve the quality and effectiveness of the medication administration process. As part of this program, it was decided to check and evaluate the quality of written doctors' orders and the transcription of those orders by the nursing staff, in various hospital units. The study was conducted using a questionnaire that checked compliance with hospital standards for the medication administration process, as applied to 6 units over the course of 8 weeks. Results of the survey showed poor compliance with guidelines on the part of doctors and nurses. Only 18% of doctors' orders in the study and 37% of the nurses' transcriptions were written according to standards. The Emergency Department showed even lower compliance, with only 3% of doctors' orders and 25% of nurses' transcriptions complying with standards. As a result of this study, it was decided to initiate an intensive in-service teaching course to refresh the staff's knowledge of medication administration guidelines. In the future, it is recommended that hand-written orders be replaced by computerized orders in an effort to limit the chance of error.

  20. Risk management: correct patient and specimen identification in a surgical pathology laboratory. The experience of Infermi Hospital, Rimini, Italy.

    PubMed

    Fabbretti, G

    2010-06-01

    Because of its complex nature, surgical pathology practice is prone to error. In this report, we describe our methods for reducing error as much as possible during the pre-analytical and analytical phases. This was achieved by revising procedures, and by using computer technology and automation. Most mistakes are the result of human error in the identification and matching of patients and samples. To avoid faulty data interpretation, we employed a new comprehensive computer system that acquires all patient ID information directly from the hospital's database with remote order entry; it also provides labels and request forms via the Web, where clinical information is required before the sample is sent. Both patient and sample are identified directly and immediately at the site where the surgical procedures are performed. Barcode technology is used to input information at every step, and automation is used for sample blocks and slides to avoid errors that occur when information is recorded or transferred by hand. Quality control checks occur at every step of the process to ensure that none of the steps is left to chance and that no phase depends on a single operator. The system also provides statistical analysis of errors so that new strategies can be implemented to avoid repetition. In addition, the staff receives frequent training on avoiding errors and on new developments. The results have been promising, with a very low error rate (0.27%); none of these errors compromised patient health, and all were detected before release of the diagnosis report.

  1. Feasibility, strategy, methodology, and analysis of probe measurements in plasma under high gas pressure

    NASA Astrophysics Data System (ADS)

    Demidov, V. I.; Koepke, M. E.; Kurlyandskaya, I. P.; Malkov, M. A.

    2018-02-01

    This paper reviews existing theories for interpreting probe measurements of electron distribution functions (EDF) at high gas pressure, when collisions of electrons with atoms and/or molecules near the probe are pervasive. An explanation of whether or not the measurements are realizable and reliable, an enumeration of the most common sources of measurement error, and an outline of proper probe-experiment design elements that inherently limit or avoid error are presented. Additionally, we describe recently expanded plasma-condition compatibility for EDF measurement, including applications of large wall probe plasma diagnostics. This summary of the authors’ experiences, gained over decades of practicing and developing probe diagnostics, is intended to inform and guide, and to detail the advantages and disadvantages of probe application in plasma research.

  2. Review of medication errors that are new or likely to occur more frequently with electronic medication management systems.

    PubMed

    Van de Vreede, Melita; McGrath, Anne; de Clifford, Jan

    2018-05-14

    Objective. The aim of the present study was to identify and quantify medication errors reportedly related to electronic medication management systems (eMMS) and those considered likely to occur more frequently with eMMS. This included developing a new classification system relevant to eMMS errors. Methods. Eight Victorian hospitals with eMMS participated in a retrospective audit of reported medication incidents from their incident reporting databases between May and July 2014. Site-appointed project officers submitted deidentified incidents they deemed new or likely to occur more frequently due to eMMS, together with the Incident Severity Rating (ISR). The authors reviewed and classified incidents. Results. There were 5826 medication-related incidents reported. In total, 93 (47 prescribing errors, 46 administration errors) were identified as new or potentially related to eMMS. Only one ISR2 (moderate) and no ISR1 (severe or death) errors were reported, so harm to patients in this 3-month period was minimal. The most commonly reported error types were 'human factors' and 'unfamiliarity or training' (70%) and 'cross-encounter or hybrid system errors' (22%). Conclusions. Although the results suggest that the errors reported were of low severity, organisations must remain vigilant to the risk of new errors and avoid the assumption that eMMS is the panacea to all medication error issues. What is known about the topic? eMMS have been shown to reduce some types of medication errors, but it has been reported that some new medication errors have been identified and some are likely to occur more frequently with eMMS. There are few published Australian studies that have reported on medication error types that are likely to occur more frequently with eMMS in more than one organisation and that include administration and prescribing errors. What does this paper add? 
This paper proposes a simple new classification system for eMMS medication errors, outlines the most commonly reported incident types, and can inform organisations and vendors about possible eMMS improvements. What are the implications for practitioners? The results of the present study highlight to organisations the need for ongoing review of system design, refinement of workflow issues, staff education and training, and reporting and monitoring of errors.

  3. Fast maximum likelihood estimation using continuous-time neural point process models.

    PubMed

    Lepage, Kyle Q; MacDonald, Christopher J

    2015-06-01

    A recent report estimates that the number of simultaneously recorded neurons is growing exponentially. A commonly employed statistical paradigm using discrete-time point process models of neural activity involves the computation of a maximum-likelihood estimate. The time to compute this estimate, per neuron, is proportional to the number of bins in a finely spaced discretization of time. By using continuous-time models of neural activity and optimally efficient Gaussian quadrature, memory requirements and computation times are dramatically decreased in the commonly encountered situation where the number of parameters p is much less than the number of time-bins n. In this regime, with q equal to the quadrature order, memory requirements are decreased from O(np) to O(qp), and the number of floating-point operations is decreased from O(np^2) to O(qp^2). Accuracy of the proposed estimates is assessed based upon physiological considerations, error bounds, and mathematical results describing the relation between numerical integration error and the numerical error affecting both parameter estimates and the observed Fisher information. A check is provided which is used to adapt the order of numerical integration. The procedure is verified in simulation and for hippocampal recordings. It is found that in 95% of hippocampal recordings a q of 60 yields numerical error negligible with respect to parameter estimate standard error. Statistical inference using the proposed methodology is a fast and convenient alternative to inference performed using a discrete-time point process model of neural activity. It enables the employment of the statistical methodology available with discrete-time inference, but is faster, uses less memory, and avoids any error due to discretization.
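The computational gain described above, replacing an O(n) sum over fine time bins with an O(q) quadrature rule, can be sketched for the integral term of a point-process log-likelihood, the integral of the intensity over the recording window. The intensity below is hypothetical, standing in for a fitted neural rate model:

```python
import numpy as np

# Hypothetical smooth intensity lambda(t) on [0, T], standing in for a
# fitted continuous-time neural firing-rate model.
T = 10.0
lam = lambda t: 5.0 + 3.0 * np.sin(t)

# Integral term of the point-process log-likelihood: integral of lam(t) dt.
# Fine discretization (midpoint rule): O(n) work for n time bins.
n = 100_000
dt = T / n
t_mid = (np.arange(n) + 0.5) * dt
riemann = np.sum(lam(t_mid)) * dt

# Gauss-Legendre quadrature: O(q) work with q << n nodes.
q = 60  # order the paper found adequate for hippocampal recordings
nodes, weights = np.polynomial.legendre.leggauss(q)
t_q = 0.5 * T * (nodes + 1.0)            # map [-1, 1] onto [0, T]
quad = 0.5 * T * np.sum(weights * lam(t_q))

# Closed form for this toy intensity: 5*T + 3*(1 - cos(T)).
exact = 5.0 * T + 3.0 * (1.0 - np.cos(T))
```

For a smooth intensity, 60 quadrature nodes match the accuracy of 100,000 bins, which is exactly the memory and compute saving the abstract describes.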

  4. Simulating the performance of a distance-3 surface code in a linear ion trap

    NASA Astrophysics Data System (ADS)

    Trout, Colin J.; Li, Muyuan; Gutiérrez, Mauricio; Wu, Yukai; Wang, Sheng-Tao; Duan, Luming; Brown, Kenneth R.

    2018-04-01

    We explore the feasibility of implementing a small surface code with 9 data qubits and 8 ancilla qubits, commonly referred to as surface-17, using a linear chain of 171Yb+ ions. Two-qubit gates can be performed between any two ions in the chain with gate time increasing linearly with ion distance. Measurement of the ion state by fluorescence requires that the ancilla qubits be physically separated from the data qubits to avoid errors on the data due to scattered photons. We minimize the time required to measure one round of stabilizers by optimizing the mapping of the two-dimensional surface code to the linear chain of ions. We develop a physically motivated Pauli error model that allows for fast simulation and captures the key sources of noise in an ion trap quantum computer including gate imperfections and ion heating. Our simulations showed a consistent requirement of a two-qubit gate fidelity of ≥99.9% for the logical memory to have a better fidelity than physical two-qubit operations. Finally, we perform an analysis of the error subsets from the importance sampling method used to bound the logical error rates to gain insight into which error sources are particularly detrimental to error correction.

  5. Vision-based mobile robot navigation through deep convolutional neural networks and end-to-end learning

    NASA Astrophysics Data System (ADS)

    Zhang, Yachu; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Kong, Lingqin; Liu, Lingling

    2017-09-01

    In contrast to humans, who use only visual information for navigation, many mobile robots use laser scanners and ultrasonic sensors along with vision cameras to navigate. This work proposes a vision-based robot control algorithm based on deep convolutional neural networks. We create a large 15-layer convolutional neural network learning system and achieve advanced recognition performance. Our system is trained end to end to map raw input images to steering directions in supervised mode. The images in the data sets are collected in a wide variety of weather and lighting conditions. In addition, the data sets are augmented by adding Gaussian noise and salt-and-pepper noise to avoid overfitting. The algorithm is verified by two experiments: line tracking and obstacle avoidance. The line tracking experiment tracks a desired path composed of straight and curved lines; the goal of the obstacle avoidance experiment is to avoid obstacles indoors. Finally, we obtain a 3.29% error rate on the training set and a 5.1% error rate on the test set in the line tracking experiment, and a 1.8% error rate on the training set and less than 5% error rate on the test set in the obstacle avoidance experiment. During the actual test, the robot can follow the runway centerline outdoors and accurately avoid the obstacle in the room. The results confirm the effectiveness of the algorithm and of our improvements to the network structure and training parameters.
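The noise-based augmentation mentioned above can be sketched with numpy. The noise levels here are hypothetical; the paper does not state its exact parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

def augment(image, gauss_sigma=10.0, sp_fraction=0.02):
    """Return two noisy copies of a uint8 image: one with additive
    Gaussian noise and one with salt-and-pepper noise. The noise
    levels are hypothetical; the paper does not state its parameters."""
    noisy = image.astype(np.float64) + rng.normal(0.0, gauss_sigma, image.shape)
    gauss = np.clip(noisy, 0, 255).astype(np.uint8)
    sp = image.copy()
    mask = rng.random(image.shape[:2])       # one draw per pixel
    sp[mask < sp_fraction / 2] = 0           # pepper
    sp[mask > 1.0 - sp_fraction / 2] = 255   # salt
    return gauss, sp

image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
gauss_img, sp_img = augment(image)
```

Each training image would be paired with its augmented copies (same steering label) to discourage the network from memorizing clean pixel patterns.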

  6. The pitfalls of premature closure: clinical decision-making in a case of aortic dissection

    PubMed Central

    Kumar, Bharat; Kanna, Balavenkatesh; Kumar, Suresh

    2011-01-01

    Premature closure is a type of cognitive error in which the physician fails to consider reasonable alternatives after an initial diagnosis is made. It is a common cause of delayed diagnosis and misdiagnosis borne out of a faulty clinical decision-making process. The authors present a case of aortic dissection in which premature closure was avoided by the aggressive pursuit of the appropriate differential diagnosis, and discuss the importance of disciplined clinical decision-making in the setting of chest pain. PMID:22679162

  7. HOW TO WRITE A SCIENTIFIC ARTICLE

    PubMed Central

    Manske, Robert C.

    2012-01-01

    Successful production of a written product for submission to a peer‐reviewed scientific journal requires substantial effort. Such an effort can be maximized by following a few simple suggestions when composing/creating the product for submission. By following some suggested guidelines and avoiding common errors, the process can be streamlined and success realized for even beginning/novice authors as they negotiate the publication process. The purpose of this invited commentary is to offer practical suggestions for achieving success when writing and submitting manuscripts to The International Journal of Sports Physical Therapy and other professional journals. PMID:23091783

  8. Optical Issues in Measuring Strabismus

    PubMed Central

    Irsch, Kristina

    2015-01-01

    Potential errors and complications during examination and treatment of strabismic patients can be reduced by recognition of certain optical issues. This article reviews basic as well as guiding principles of prism optics and optics of the eye to equip the reader with the necessary know-how to avoid pitfalls that are commonly encountered when using prisms to measure ocular deviations (e.g., during cover testing), and when observing the corneal light reflex to estimate ocular deviations (e.g., during Hirschberg or Krimsky testing in patients who do not allow for cover testing using prisms). PMID:26180462

  9. Optical Issues in Measuring Strabismus.

    PubMed

    Irsch, Kristina

    2015-01-01

    Potential errors and complications during examination and treatment of strabismic patients can be reduced by recognition of certain optical issues. This article reviews basic as well as guiding principles of prism optics and optics of the eye to equip the reader with the necessary know-how to avoid pitfalls that are commonly encountered when using prisms to measure ocular deviations (e.g., during cover testing), and when observing the corneal light reflex to estimate ocular deviations (e.g., during Hirschberg or Krimsky testing in patients who do not allow for cover testing using prisms).
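As background to the prism measurements discussed in these two records: ocular deviations are quantified in prism diopters (PD), where one prism diopter displaces an image by 1 cm at 1 m, giving the standard conversion PD = 100·tan(θ). A small sketch of that conversion (illustrative, not taken from the article):

```python
import math

def degrees_to_prism_diopters(theta_deg):
    """Convert an angular deviation to prism diopters (PD) via the
    standard relation PD = 100 * tan(theta): one prism diopter
    displaces an image by 1 cm at a distance of 1 m."""
    return 100.0 * math.tan(math.radians(theta_deg))

# A 10-degree deviation corresponds to roughly 17.6 PD.
pd_10 = degrees_to_prism_diopters(10.0)
```

Because the relation is a tangent rather than a linear function, the familiar small-angle shortcut of about 1.75 PD per degree breaks down for large deviations, one of the pitfalls the article addresses.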

  10. Prevalence and types of preanalytical error in hematology laboratory of a tertiary care hospital in South India.

    PubMed

    Arul, Pitchaikaran; Pushparaj, Magesh; Pandian, Kanmani; Chennimalai, Lingasamy; Rajendran, Karthika; Selvaraj, Eniya; Masilamani, Suresh

    2018-01-01

    An important component of laboratory medicine is the preanalytical phase. Since the laboratory report plays a major role in patient management, more importance should be given to the quality of laboratory tests. The present study was undertaken to find the prevalence and types of preanalytical errors at a tertiary care hospital in South India. In this cross-sectional study, a total of 118,732 samples (62,474 from the outpatient department [OPD] and 56,258 from the inpatient department [IPD]) were received in the hematology laboratory. These samples were analyzed for preanalytical errors such as misidentification, incorrect vials, inadequate samples, clotted samples, diluted samples, and hemolyzed samples. Preanalytical errors were found in 513 samples, 0.43% of the total number of samples received. The most common preanalytical error observed was inadequate samples, followed by clotted samples. Overall frequencies (both OPD and IPD) of misidentification, incorrect vials, inadequate samples, clotted samples, diluted samples, and hemolyzed samples were 0.02%, 0.05%, 0.2%, 0.12%, 0.02%, and 0.03%, respectively. The present study concluded that incorrect phlebotomy technique, due to lack of awareness, is the main reason for preanalytical errors. These errors can be avoided by proper communication and coordination between the laboratory and wards, proper training and continuing medical education programs for laboratory and paramedical staff, and knowledge of the intervening factors that can influence laboratory results.

  11. Reflection of medical error highlighted on media in Turkey: A retrospective study

    PubMed Central

    Isik, Oguz; Bayin, Gamze; Ugurluoglu, Ozgur

    2016-01-01

    Objective: This study was performed with the aim of identifying how news about medical errors is transmitted, and how the types, causes, and consequences of medical errors have been reflected by the media in Turkey. Methods: A content analysis method was used, and the data for the study were acquired by scanning the five highest-circulation national newspapers for news about medical errors between the years 2012 and 2015. Specific selection criteria were applied to the retrieved news, and 116 news items remained after all exclusions. Results: According to the results of the study, the largest share of medical errors transmitted by the news (40.5%) resulted from the negligence of medical staff. Medical errors were attributed to physicians in 74.1% of cases, and they most commonly occurred in state hospitals (31.9%). Another important result was that medical errors most often ended in patient death (51.7%) or in permanent damage and disability (25.0%). Conclusion: The news concerning medical errors provided information about the types, causes, and results of these errors, and reflected the media's point of view on the issue. Examining the content of medical errors reported by the media is important and calls for appropriate interventions to avoid and minimize the occurrence of medical errors by improving the healthcare delivery system. PMID:27882026

  12. Patient safety in the clinical laboratory: a longitudinal analysis of specimen identification errors.

    PubMed

    Wagar, Elizabeth A; Tamashiro, Lorraine; Yasin, Bushra; Hilborne, Lee; Bruckner, David A

    2006-11-01

    Patient safety is an increasingly visible and important mission for clinical laboratories. Attention to improving processes related to patient identification and specimen labeling is being paid by accreditation and regulatory organizations because errors in these areas that jeopardize patient safety are common and avoidable through improvement in the total testing process. To assess patient identification and specimen labeling improvement after multiple implementation projects using longitudinal statistical tools. Specimen errors were categorized by a multidisciplinary health care team. Patient identification errors were grouped into 3 categories: (1) specimen/requisition mismatch, (2) unlabeled specimens, and (3) mislabeled specimens. Specimens with these types of identification errors were compared preimplementation and postimplementation for 3 patient safety projects: (1) reorganization of phlebotomy (4 months); (2) introduction of an electronic event reporting system (10 months); and (3) activation of an automated processing system (14 months) for a 24-month period, using trend analysis and Student t test statistics. Of 16,632 total specimen errors, mislabeled specimens, requisition mismatches, and unlabeled specimens represented 1.0%, 6.3%, and 4.6% of errors, respectively. Student t test showed a significant decrease in the most serious error, mislabeled specimens (P < .001) when compared to before implementation of the 3 patient safety projects. Trend analysis demonstrated decreases in all 3 error types for 26 months. Applying performance-improvement strategies that focus longitudinally on specimen labeling errors can significantly reduce errors, therefore improving patient safety. This is an important area in which laboratory professionals, working in interdisciplinary teams, can improve safety and outcomes of care.
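Pre/post comparisons of error counts like the one above can be illustrated with a standard two-proportion z-test. The study itself used Student t statistics and trend analysis; the test below is a comparable standard check, and the counts are hypothetical:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test for comparing error rates before
    and after an intervention. A standard check, comparable in spirit to
    (but not identical with) the paper's analysis; counts are hypothetical."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p
    return z, p_value

# Hypothetical counts: 40 mislabeled of 8000 specimens before the
# safety projects vs 15 of 8600 after.
z, p = two_proportion_z(40, 8000, 15, 8600)
```

With these hypothetical counts the drop in the mislabeling rate is statistically significant, mirroring the P < .001 result reported for mislabeled specimens.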

  13. The heritability of avoidant and dependent personality disorder assessed by personal interview and questionnaire.

    PubMed

    Gjerde, L C; Czajkowski, N; Røysamb, E; Orstavik, R E; Knudsen, G P; Ostby, K; Torgersen, S; Myers, J; Kendler, K S; Reichborn-Kjennerud, T

    2012-12-01

    Personality disorders (PDs) have been shown to be modestly heritable. Accurate heritability estimates are, however, dependent on reliable measurement methods, as measurement error deflates heritability. The aim of this study was to estimate the heritability of DSM-IV avoidant and dependent personality disorder, by including two measures of the PDs at two time points. Data were obtained from a population-based cohort of young adult Norwegian twins, of whom 8045 had completed a self-report questionnaire assessing PD traits. 2794 of these twins subsequently underwent a structured diagnostic interview for DSM-IV PDs. Questionnaire items predicting interview results were selected by multiple regression, and measurement models of the PDs were fitted in Mx. The heritabilities of the PD factors were 0.64 for avoidant PD and 0.66 for dependent PD. No evidence of common environment, that is, environmental factors that are shared between twins and make them similar, was found. Genetic and environmental contributions to avoidant and dependent PD seemed to be the same across sexes. The combination of both a questionnaire- and an interview assessment of avoidant and dependent PD results in substantially higher heritabilities than previously found using single-occasion interviews only. © 2012 John Wiley & Sons A/S.

  14. Error analysis in a stereo vision-based pedestrian detection sensor for collision avoidance applications.

    PubMed

    Llorca, David F; Sotelo, Miguel A; Parra, Ignacio; Ocaña, Manuel; Bergasa, Luis M

    2010-01-01

    This paper presents an analytical study of the depth estimation error of a stereo vision-based pedestrian detection sensor for automotive applications such as pedestrian collision avoidance and/or mitigation. The sensor comprises two synchronized and calibrated low-cost cameras. Pedestrians are detected by combining a 3D clustering method with Support Vector Machine-based (SVM) classification. The influence of the sensor parameters on the stereo quantization errors is analyzed in detail, providing a point of reference for choosing the sensor setup according to the application requirements. The sensor is then validated in real experiments. Collision avoidance maneuvers by steering are carried out by manual driving. A real time kinematic differential global positioning system (RTK-DGPS) is used to provide ground truth data corresponding to both the pedestrian and the host vehicle locations. The field test provided encouraging results and proved the validity of the proposed sensor for use in the automotive sector in applications such as autonomous pedestrian collision avoidance.
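The stereo quantization error analyzed in this record follows from standard pinhole stereo geometry: depth is Z = f·B/d for focal length f, baseline B, and disparity d, so a fixed disparity error produces a depth error that grows quadratically with range. A sketch with hypothetical camera parameters (not those of the paper's sensor):

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Pinhole stereo depth: Z = f * B / d (f in pixels, B in metres,
    disparity d in pixels)."""
    return f_px * baseline_m / disparity_px

def depth_quantization_error(f_px, baseline_m, z_m, delta_d_px=1.0):
    """First-order depth error for a disparity error delta_d:
    dZ ~ Z**2 * delta_d / (f * B), so the error grows quadratically
    with range."""
    return z_m ** 2 * delta_d_px / (f_px * baseline_m)

# Hypothetical camera: 800 px focal length, 0.30 m baseline.
f, B = 800.0, 0.30
z = depth_from_disparity(f, B, 24.0)            # 10 m
err_10m = depth_quantization_error(f, B, 10.0)  # per-pixel error at 10 m
err_20m = depth_quantization_error(f, B, 20.0)  # 4x larger at 20 m
```

This quadratic growth is why the choice of baseline and focal length, the sensor parameters the paper analyzes, determines the usable detection range.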

  15. Error Analysis in a Stereo Vision-Based Pedestrian Detection Sensor for Collision Avoidance Applications

    PubMed Central

    Llorca, David F.; Sotelo, Miguel A.; Parra, Ignacio; Ocaña, Manuel; Bergasa, Luis M.

    2010-01-01

    This paper presents an analytical study of the depth estimation error of a stereo vision-based pedestrian detection sensor for automotive applications such as pedestrian collision avoidance and/or mitigation. The sensor comprises two synchronized and calibrated low-cost cameras. Pedestrians are detected by combining a 3D clustering method with Support Vector Machine-based (SVM) classification. The influence of the sensor parameters on the stereo quantization errors is analyzed in detail, providing a point of reference for choosing the sensor setup according to the application requirements. The sensor is then validated in real experiments. Collision avoidance maneuvers by steering are carried out by manual driving. A real-time kinematic differential global positioning system (RTK-DGPS) is used to provide ground truth data corresponding to both the pedestrian and the host vehicle locations. The field test provided encouraging results and proved the validity of the proposed sensor for use in automotive applications such as autonomous pedestrian collision avoidance. PMID:22319323
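    The stereo quantization error analysed in this record follows from the depth-disparity relation Z = f·B/d: a fixed disparity quantization step maps to a depth error that grows roughly quadratically with range. A minimal sketch of this relationship, using illustrative camera parameters rather than the paper's actual sensor setup:

```python
# Depth quantization error of an idealized stereo rig.
# Z = f * B / d  (f: focal length in pixels, B: baseline in metres,
# d: disparity in pixels). A +/-0.5 px disparity quantization step maps
# to a depth error that grows roughly quadratically with range.
# All parameter values are illustrative, not the paper's sensor.

def depth_from_disparity(d_px, f_px=800.0, baseline_m=0.30):
    """Triangulated depth (metres) for a given disparity (pixels)."""
    return f_px * baseline_m / d_px

def quantization_error(z_m, f_px=800.0, baseline_m=0.30, dd_px=0.5):
    """Worst-case depth error (metres) when disparity is off by dd_px."""
    d = f_px * baseline_m / z_m          # exact disparity at range z_m
    return depth_from_disparity(d - dd_px, f_px, baseline_m) - z_m

for z in (5.0, 10.0, 20.0):
    print(f"range {z:5.1f} m -> worst-case depth error {quantization_error(z):.3f} m")
```

Doubling the range from 10 m to 20 m roughly quadruples the worst-case error, which is why sensor setup (baseline, focal length) must be matched to the intended detection range.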

  16. Acute Care Management of the HIV-Infected Patient: A Report from the HIV Practice and Research Network of the American College of Clinical Pharmacy.

    PubMed

    Durham, Spencer H; Badowski, Melissa E; Liedtke, Michelle D; Rathbun, R Chris; Pecora Fulco, Patricia

    2017-05-01

    Patients infected with human immunodeficiency virus (HIV) admitted to the hospital have complex antiretroviral therapy (ART) regimens with an increased medication error rate upon admission. This report provides a resource for clinicians managing HIV-infected patients and ART in the inpatient setting. A survey of the authors was conducted to evaluate common issues that arise during an acute hospitalization for HIV-infected patients. After a group consensus, a review of the medical literature was performed to determine the supporting evidence for the following HIV-associated hospital queries: admission/discharge orders, antiretroviral hospital formularies, laboratory monitoring, altered hepatic/renal function, drug-drug interactions (DDIs), enteral administration, and therapeutic drug monitoring. With any hospital admission for an HIV-infected patient, a specific set of procedures should be followed, including a thorough admission medication history and communication with the ambulatory HIV provider to avoid omissions or substitutions in the ART regimen. DDIs are common and should be reviewed at all transitions of care during the hospital admission. ART may be continued if enteral nutrition with a feeding tube is deemed necessary, but the entire regimen should be discontinued if no oral access is available for a prolonged period. Therapeutic drug monitoring is not generally recommended but, if available, should be considered in unique clinical scenarios where antiretroviral pharmacokinetics are difficult to predict. ART may need adjustment if hepatic or renal insufficiency ensues. Treatment of hospitalized patients with HIV is highly complex. HIV-infected patients are at high risk for medication errors during various transitions of care. Baseline knowledge of the principles of antiretroviral pharmacotherapy is necessary for clinicians managing acutely ill HIV-infected patients to avoid medication errors, identify DDIs, and correctly dose medications if organ dysfunction arises. Timely ambulatory follow-up is essential to prevent readmissions and facilitate improved transitions of care. © 2017 Pharmacotherapy Publications, Inc.

  17. Imperfect practice makes perfect: error management training improves transfer of learning.

    PubMed

    Dyre, Liv; Tabor, Ann; Ringsted, Charlotte; Tolsgaard, Martin G

    2017-02-01

    Traditionally, trainees are instructed to practise with as few errors as possible during simulation-based training. However, transfer of learning may improve if trainees are encouraged to commit errors. The aim of this study was to assess the effects of error management instructions compared with error avoidance instructions during simulation-based ultrasound training. Medical students (n = 60) with no prior ultrasound experience were randomised to error management training (EMT) (n = 32) or error avoidance training (EAT) (n = 28). The EMT group was instructed to deliberately make errors during training. The EAT group was instructed to follow the simulator instructions and to commit as few errors as possible. Training consisted of 3 hours of simulation-based ultrasound training focusing on fetal weight estimation. Simulation-based tests were administered before and after training. Transfer tests were performed on real patients 7-10 days after the completion of training. Primary outcomes were transfer test performance scores and diagnostic accuracy. Secondary outcomes included performance scores and diagnostic accuracy during the simulation-based pre- and post-tests. A total of 56 participants completed the study. On the transfer test, EMT group participants attained higher performance scores (mean score: 67.7%, 95% confidence interval [CI]: 62.4-72.9%) than EAT group members (mean score: 51.7%, 95% CI: 45.8-57.6%) (p < 0.001; Cohen's d = 1.1, 95% CI: 0.5-1.7). There was a moderate improvement in diagnostic accuracy in the EMT group compared with the EAT group (16.7%, 95% CI: 10.2-23.3% weight deviation versus 26.6%, 95% CI: 16.5-36.7% weight deviation [p = 0.082; Cohen's d = 0.46, 95% CI: -0.06 to 1.0]). No significant interaction effects between group and performance improvements between the pre- and post-tests were found in either performance scores (p = 0.25) or diagnostic accuracy (p = 0.09). 
The provision of error management instructions during simulation-based training improves the transfer of learning to the clinical setting compared with error avoidance instructions. Rather than teaching to avoid errors, the use of errors for learning should be explored further in medical education theory and practice. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
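    The Cohen's d values reported in this record scale the between-group mean difference by the pooled standard deviation. A minimal sketch of the computation, using invented scores rather than the study's data:

```python
import math

def cohens_d(a, b):
    """Cohen's d for two independent samples, using the pooled SD."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sd_pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / sd_pooled

# Hypothetical transfer-test scores (%), NOT the study's data.
emt = [70, 68, 65, 72, 66]   # error management training group
eat = [52, 50, 55, 49, 53]   # error avoidance training group
print(f"d = {cohens_d(emt, eat):.2f}")
```

By convention d around 0.2 is read as small, 0.5 as moderate, and 0.8 or more as large, which is why the study's d = 1.1 on the transfer test is a substantial effect.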

  18. Avoiding and identifying errors in health technology assessment models: qualitative study and methodological review.

    PubMed

    Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A

    2010-05-01

    Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the HTA modelling community and generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. 
Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; strategies for avoiding errors; strategies for identifying errors; and barriers and facilitators. There was no common language in the discussion of modelling errors, and there was inconsistency in the perceived boundaries of what constitutes an error. When asked to define model error, interviewees tended to exclude matters of judgement and to focus on 'slips' and 'lapses', yet discussion of slips and lapses comprised less than 20% of the discussion on types of errors. Interviewees devoted 70% of the discussion to softer elements of the process of defining the decision question and conceptual modelling, mostly the realms of judgement, skills, experience and training. The original focus concerned model errors, but it may be more useful to refer to modelling risks. Several interviewees discussed the concepts of validation and verification, with notable consistency in interpretation: verification means the process of ensuring that the computer model correctly implements the intended model, whereas validation means the process of ensuring that a model is fit for purpose. The methodological literature on verification and validation of models makes reference to the hermeneutic philosophical position, highlighting that the concept of model validation should not be externalized from the decision-makers and the decision-making process. Interviewees gave examples of all major error types identified in the literature: errors in the description of the decision problem, in model structure, in use of evidence, in implementation of the model, in operation of the model, and in presentation and understanding of results. 
The HTA error classifications were compared against existing classifications of model errors in the literature. A range of techniques and processes are currently used to avoid errors in HTA models: engaging with clinical experts, clients and decision-makers to ensure mutual understanding, producing written documentation of the proposed model, explicit conceptual modelling, stepping through skeleton models with experts, ensuring transparency in reporting, adopting standard housekeeping techniques, and ensuring that those parties involved in the model development process have sufficient and relevant training. Clarity and mutual understanding were identified as key issues; however, their current implementation is not framed within an overall strategy for structuring complex problems. Some of the questioning may have biased interviewees' responses, but as all interviewees were represented in the analysis, no rebalancing of the report was deemed necessary. A potential weakness of the literature review was its focus on spreadsheet and program development rather than specifically on model development. It should also be noted that the identified literature concerning programming errors was very narrow despite broad searches being undertaken. Published definitions of overall model validity, comprising conceptual model validation, verification of the computer model, and operational validity of the use of the model in addressing the real-world problem, are consistent with the views expressed by the HTA community and are therefore recommended as the basis for further discussions of model credibility. Such discussions should focus on risks, including errors of implementation, errors in matters of judgement, and violations. Discussions of modelling risks should reflect the potentially complex network of cognitive breakdowns that lead to errors in models, and existing research on the cognitive basis of human error should be included in an examination of modelling errors. 
There is a need to develop a better understanding of the skills required for the development, operation and use of HTA models. Interaction between modeller and client in developing a mutual understanding of a model establishes that model's significance and its warranty. This highlights that model credibility is the central concern of decision-makers using models, so it is crucial that the concept of model validation not be externalized from the decision-makers and the decision-making process. Recommended directions for future research are studies of verification and validation; studies of the model development process; and the identification of modifications to the modelling process aimed at preventing the occurrence of errors and improving the identification of errors in models.

  19. Rapid and non-invasive analysis of deoxynivalenol in durum and common wheat by Fourier-Transform Near Infrared (FT-NIR) spectroscopy.

    PubMed

    De Girolamo, A; Lippolis, V; Nordkvist, E; Visconti, A

    2009-06-01

    Fourier transform near-infrared spectroscopy (FT-NIR) was used for rapid and non-invasive analysis of deoxynivalenol (DON) in durum and common wheat. The relevance of using ground wheat samples with a homogeneous particle size distribution to minimize measurement variations and avoid DON segregation among particles of different sizes was established. Calibration models for durum wheat, common wheat and durum + common wheat samples, with particle size <500 microm, were obtained by using partial least squares (PLS) regression with an external validation technique. Values of root mean square error of prediction (RMSEP, 306-379 microg kg(-1)) were comparable and not too far from values of root mean square error of cross-validation (RMSECV, 470-555 microg kg(-1)). Coefficients of determination (r(2)) indicated an "approximate to good" level of prediction of the DON content by FT-NIR spectroscopy in the PLS calibration models (r(2) = 0.71-0.83), and a "good" discrimination between low and high DON contents in the PLS validation models (r(2) = 0.58-0.63). A "limited to good" practical utility of the models was ascertained by range error ratio (RER) values higher than 6. A qualitative model, based on 197 calibration samples, was developed to discriminate between blank and naturally contaminated wheat samples by setting a cut-off at 300 microg kg(-1) DON to separate the two classes. The model correctly classified 69% of the 65 validation samples with most misclassified samples (16 of 20) showing DON contamination levels quite close to the cut-off level. These findings suggest that FT-NIR analysis is suitable for the determination of DON in unprocessed wheat at levels far below the maximum permitted limits set by the European Commission.
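    The figures of merit quoted in this record can be computed directly: RMSEP is the root mean square difference between predicted and reference values on the validation set, and the range error ratio (RER) divides the reference range by RMSEP, with values above about 6 read as limited-to-good practical utility. A minimal sketch; the DON values below are invented for illustration:

```python
import math

def rmsep(reference, predicted):
    """Root mean square error of prediction, in the units of the data."""
    n = len(reference)
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted)) / n)

def range_error_ratio(reference, predicted):
    """RER = reference range / RMSEP; > ~6 suggests practical utility."""
    return (max(reference) - min(reference)) / rmsep(reference, predicted)

# Hypothetical DON levels (microg/kg): wet-chemistry reference vs FT-NIR prediction.
ref  = [100, 400, 800, 1500, 2500, 4000]
pred = [320, 520, 650, 1700, 2300, 3800]

print(f"RMSEP = {rmsep(ref, pred):.0f} microg/kg, "
      f"RER = {range_error_ratio(ref, pred):.1f}")
```

Because RER normalizes the prediction error by the spread of the calibration range, it indicates whether a model can usefully discriminate low from high contamination even when absolute errors look large.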

  20. Preventing Errors in Clinical Practice: A Call for Self-Awareness

    PubMed Central

    Borrell-Carrió, Francesc; Epstein, Ronald M.

    2004-01-01

    While ascribing medical errors primarily to systems factors can free clinicians from individual blame, there are elements of medical errors that can and should be attributed to individual factors. These factors are related less commonly to lack of knowledge and skill than to the inability to apply the clinician’s abilities to situations under certain circumstances. In concert with efforts to improve health care systems, refining physicians’ emotional and cognitive capacities might also prevent many errors. In general, physicians have the sensation of making a mistake because of the interference of emotional elements. We propose a so-called rational-emotive model that emphasizes 2 factors in error causation: (1) difficulty in reframing the first hypothesis that goes to the physician’s mind in an automatic way, and (2) premature closure of the clinical act to avoid confronting inconsistencies, low-level decision rules, and emotions. We propose a teaching strategy based on developing the physician’s insight and self-awareness to detect the inappropriate use of low-level decision rules, as well as detecting the factors that limit a physician’s capacity to tolerate the tension of uncertainty and ambiguity. Emotional self-awareness and self-regulation of attention can be consciously cultivated as habits to help physicians function better in clinical situations. PMID:15335129

  1. Preventing errors in clinical practice: a call for self-awareness.

    PubMed

    Borrell-Carrió, Francesc; Epstein, Ronald M

    2004-01-01

    While ascribing medical errors primarily to systems factors can free clinicians from individual blame, there are elements of medical errors that can and should be attributed to individual factors. These factors are related less commonly to lack of knowledge and skill than to the inability to apply the clinician's abilities to situations under certain circumstances. In concert with efforts to improve health care systems, refining physicians' emotional and cognitive capacities might also prevent many errors. In general, physicians have the sensation of making a mistake because of the interference of emotional elements. We propose a so-called rational-emotive model that emphasizes 2 factors in error causation: (1) difficulty in reframing the first hypothesis that goes to the physician's mind in an automatic way, and (2) premature closure of the clinical act to avoid confronting inconsistencies, low-level decision rules, and emotions. We propose a teaching strategy based on developing the physician's insight and self-awareness to detect the inappropriate use of low-level decision rules, as well as detecting the factors that limit a physician's capacity to tolerate the tension of uncertainty and ambiguity. Emotional self-awareness and self-regulation of attention can be consciously cultivated as habits to help physicians function better in clinical situations.

  2. Radiographic and anatomic basis for prostate contouring errors and methods to improve prostate contouring accuracy.

    PubMed

    McLaughlin, Patrick W; Evans, Cheryl; Feng, Mary; Narayana, Vrinda

    2010-02-01

    Use of highly conformal radiation for prostate cancer can lead to both overtreatment of surrounding normal tissues and undertreatment of the prostate itself. In this retrospective study we analyzed the radiographic and anatomic basis of common errors in computed tomography (CT) contouring and suggested methods to correct them. Three hundred patients with prostate cancer underwent CT and magnetic resonance imaging (MRI). The prostate was delineated independently on the two data sets. CT and MRI contours were compared by use of deformable registration. Errors in target delineation were analyzed and methods to avoid such errors detailed. Contouring errors were identified at the prostatic apex, mid gland, and base on CT. At the apex, the genitourinary diaphragm, rectum, and anterior fascia contribute to overestimation. At the mid prostate, the anterior and lateral fasciae contribute to overestimation. At the base, the bladder and anterior fascia contribute to anterior overestimation. Transition zone hypertrophy and bladder neck variability contribute to errors of overestimation and underestimation at the superior base, whereas variable prostate-to-seminal vesicle relationships with prostate hypertrophy contribute to contouring errors at the posterior base. Most CT contouring errors can be detected by (1) inspection of a lateral view of prostate contours to detect deviation from the expected globular form and (2) recognition of anatomic structures (genitourinary diaphragm) on the CT scans that are clearly visible on MRI. This study shows that many CT prostate contouring errors can be corrected without direct incorporation of MRI data. Copyright 2010 Elsevier Inc. All rights reserved.

  3. Experimental investigation of strain errors in stereo-digital image correlation due to camera calibration

    NASA Astrophysics Data System (ADS)

    Shao, Xinxing; Zhu, Feipeng; Su, Zhilong; Dai, Xiangjun; Chen, Zhenning; He, Xiaoyuan

    2018-03-01

    The strain errors in stereo-digital image correlation (DIC) due to camera calibration were investigated using precisely controlled numerical experiments and real experiments. Three-dimensional rigid body motion tests were conducted to examine the effects of camera calibration on the measured results. For a fully accurate calibration, rigid body motion causes negligible strain errors. However, for inaccurately calibrated camera parameters and a short working distance, rigid body motion will lead to strain errors of more than 50 με, which significantly affect the measurement. In practical measurements, it is impossible to obtain a fully accurate calibration; therefore, considerable attention should be paid to avoiding these types of errors, especially in high-accuracy strain measurements. It is also necessary to avoid large rigid body motions in both two-dimensional DIC and stereo-DIC.

  4. A new approach based on Machine Learning for predicting corneal curvature (K1) and astigmatism in patients with keratoconus after intracorneal ring implantation.

    PubMed

    Valdés-Mas, M A; Martín-Guerrero, J D; Rupérez, M J; Pastor, F; Dualde, C; Monserrat, C; Peris-Martínez, C

    2014-08-01

    Keratoconus (KC) is the most common type of corneal ectasia. Corneal transplantation was the treatment of choice until the last decade; however, intracorneal ring implantation has become increasingly common, as it can treat KC while avoiding a corneal transplantation. This work proposes a new approach based on Machine Learning to predict the vision gain of KC patients after ring implantation. That vision gain is assessed by means of the corneal curvature and the astigmatism. Different models were proposed; the best results were achieved by an artificial neural network based on the Multilayer Perceptron, with errors of 0.97 D for corneal curvature and 0.93 D for astigmatism. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
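    As a rough illustration of the model family this record describes, here is a minimal one-hidden-layer perceptron regressor trained by stochastic gradient descent on a synthetic 1D problem; the architecture, data and hyperparameters are invented and far simpler than the paper's actual model:

```python
import math, random

# Minimal one-hidden-layer perceptron regressor trained by plain SGD.
# Everything here (size, data, learning rate) is illustrative only.
random.seed(0)

H = 8                                                  # hidden units
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]     # input -> hidden weights
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]     # hidden -> output weights
b2 = 0.0

def forward(x):
    """Return (prediction, hidden activations) for a scalar input."""
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

# Toy regression target standing in for "measurement -> outcome".
data = [(x / 10.0, 0.8 * math.sin(x / 10.0)) for x in range(-30, 31)]

lr = 0.05
for _ in range(2000):                                  # epochs
    random.shuffle(data)
    for x, y in data:
        y_hat, h = forward(x)
        err = y_hat - y                                # gradient of 0.5*err**2
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)   # backprop through tanh
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * grad_h * x
            b1[j] -= lr * grad_h
        b2 -= lr * err

mae = sum(abs(forward(x)[0] - y) for x, y in data) / len(data)
print(f"mean absolute error on the toy data: {mae:.3f}")
```

In practice one would use a library implementation with proper train/validation splits; the point here is only the structure shared with the paper's model: a nonlinear hidden layer mapping clinical inputs to a continuous outcome, with error reported in the outcome's own units (diopters in the paper).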

  5. Using Weather Data and Climate Model Output in Economic Analyses of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auffhammer, M.; Hsiang, S. M.; Schlenker, W.

    2013-06-28

    Economists are increasingly using weather data and climate model output in analyses of the economic impacts of climate change. This article introduces a set of weather data sets and climate models that are frequently used, discusses the most common mistakes economists make in using these products, and identifies ways to avoid these pitfalls. We first provide an introduction to weather data, including a summary of the types of datasets available, and then discuss five common pitfalls that empirical researchers should be aware of when using historical weather data as explanatory variables in econometric applications. We then provide a brief overview of climate models and discuss two common and significant errors often made by economists when climate model output is used to simulate the future impacts of climate change on an economic outcome of interest.

  6. Genotyping and inflated type I error rate in genome-wide association case/control studies

    PubMed Central

    Sampson, Joshua N; Zhao, Hongyu

    2009-01-01

    Background: One common goal of a case/control genome-wide association study (GWAS) is to find SNPs associated with a disease. Traditionally, the first step in such studies is to assign a genotype to each SNP in each subject, based on a statistic summarizing fluorescence measurements. When the distributions of the summary statistics are not well separated by genotype, the act of genotype assignment can lead to more potential problems than acknowledged by the literature. Results: Specifically, we show that the proportions of each called genotype need not equal the true proportions in the population, even as the number of subjects grows infinitely large. The called genotypes for two subjects need not be independent, even when their true genotypes are independent. Consequently, p-values from tests of association can be anti-conservative, even when the distributions of the summary statistic for the cases and controls are identical. To address these problems, we propose two new tests designed to reduce the inflation in the type I error rate caused by these problems. The first algorithm, logiCALL, measures call quality by fully exploring the likelihood profile of intensity measurements, and the second algorithm avoids genotyping by using a likelihood ratio statistic. Conclusion: Genotyping can introduce avoidable false positives in GWAS. PMID:19236714
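    The first result, that called-genotype proportions need not converge to the true population proportions, can be reproduced in a toy simulation: when the per-genotype intensity clusters overlap, nearest-centre hard calling misassigns a fixed fraction of subjects no matter how large the sample grows. All distributions and frequencies below are illustrative, not from the paper:

```python
import random

# Toy model of genotype calling: each true genotype produces a Gaussian
# summary-statistic cluster; hard calls pick the nearest cluster centre.
# Because the clusters overlap, a fixed fraction of calls is wrong and
# the called proportions are biased, no matter how large n grows.
random.seed(1)

means = {"AA": 0.0, "AB": 1.0, "BB": 2.0}        # cluster centres (illustrative)
true_freq = {"AA": 0.70, "AB": 0.25, "BB": 0.05}  # true population proportions
sigma = 0.4                                       # within-cluster spread (overlap)

def call(intensity):
    """Hard-call the genotype whose cluster centre is nearest."""
    return min(means, key=lambda g: abs(intensity - means[g]))

n = 100_000
counts = {"AA": 0, "AB": 0, "BB": 0}
genotypes = list(true_freq)
weights = list(true_freq.values())
for _ in range(n):
    g = random.choices(genotypes, weights=weights)[0]   # true genotype
    counts[call(random.gauss(means[g], sigma))] += 1    # noisy hard call

for g in genotypes:
    print(f"{g}: true {true_freq[g]:.3f}  called {counts[g] / n:.3f}")
```

The bias is systematic, not sampling noise: increasing n only makes the (wrong) called proportions more precise, which is the paper's motivation for working with the intensity likelihood instead of hard calls.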

  7. An adaptive Gaussian process-based method for efficient Bayesian experimental design in groundwater contaminant source identification problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiangjiang; Li, Weixuan; Zeng, Lingzao

    Surrogate models are commonly used in Bayesian approaches such as Markov Chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimations of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or implementing MCMC in a two-stage manner. Since the two-stage MCMC requires extra original model evaluations, the computational cost is still high. If measurement information is incorporated, a locally accurate approximation of the original model can be adaptively constructed with low computational cost. Based on this idea, we propose a Gaussian process (GP) surrogate-based Bayesian experimental design and parameter estimation approach for groundwater contaminant source identification problems. A major advantage of the GP surrogate is that it provides a convenient estimation of the approximation error, which can be incorporated in the Bayesian formula to avoid over-confident estimation of the posterior distribution. The proposed approach is tested with a numerical case study. Without sacrificing the estimation accuracy, the new approach achieves about 200 times speed-up compared to our previous work using two-stage MCMC.
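    The key idea, incorporating the surrogate's error estimate into the Bayesian formula, can be sketched by adding the GP's predictive variance to the observation variance in a Gaussian likelihood, so that misfits in regions where the surrogate is uncertain are penalized less sharply. The variances below are illustrative, not from the paper's case study:

```python
import math

def log_likelihood(y_obs, mu_surr, var_surr, var_obs):
    """Gaussian log-likelihood with the surrogate's predictive variance
    added to the observation variance. Where the surrogate is uncertain
    (large var_surr), the likelihood flattens, avoiding over-confident
    posteriors built on a possibly wrong surrogate prediction."""
    var = var_obs + var_surr
    return -0.5 * (math.log(2.0 * math.pi * var) + (y_obs - mu_surr) ** 2 / var)

# Same misfit of 0.5 between observation and surrogate prediction, but the
# second evaluation comes from a region where the surrogate is uncertain:
print(log_likelihood(1.0, 1.5, var_surr=0.0, var_obs=0.1))  # surrogate trusted
print(log_likelihood(1.0, 1.5, var_surr=0.4, var_obs=0.1))  # surrogate uncertain
```

Inside an MCMC loop, each proposed parameter set would query the GP for both a predicted output and its variance, then evaluate this inflated-variance likelihood instead of one that treats the surrogate prediction as exact.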

  8. Preventing medication errors in cancer chemotherapy.

    PubMed

    Cohen, M R; Anderson, R W; Attilio, R M; Green, L; Muller, R J; Pruemer, J M

    1996-04-01

    Recommendations for preventing medication errors in cancer chemotherapy are made. Before a health care provider is granted privileges to prescribe, dispense, or administer antineoplastic agents, he or she should undergo a tailored educational program and possibly testing or certification. Appropriate reference materials should be developed. Each institution should develop a dose-verification process with as many independent checks as possible. A detailed checklist covering prescribing, transcribing, dispensing, and administration should be used. Oral orders are not acceptable. All doses should be calculated independently by the physician, the pharmacist, and the nurse. Dosage limits should be established and a review process set up for doses that exceed the limits. These limits should be entered into pharmacy computer systems, listed on preprinted order forms, stated on the product packaging, placed in strategic locations in the institution, and communicated to employees. The prescribing vocabulary must be standardized. Acronyms, abbreviations, and brand names must be avoided and steps taken to avoid other sources of confusion in the written orders, such as trailing zeros. Preprinted antineoplastic drug order forms containing checklists can help avoid errors. Manufacturers should be encouraged to avoid or eliminate ambiguities in drug names and dosing information. Patients must be educated about all aspects of their cancer chemotherapy, as patients represent a last line of defense against errors. An interdisciplinary team at each practice site should review every medication error reported. Pharmacists should be involved at all sites where antineoplastic agents are dispensed. Although it may not be possible to eliminate all medication errors in cancer chemotherapy, the risk can be minimized through specific steps. Because of their training and experience, pharmacists should take the lead in this effort.
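    The dose-verification step described above, an independently calculated body-surface-area-based dose checked against a hard dosage limit that triggers review, can be sketched as a simple check. The drug names and limit values here are invented for illustration and are not clinical guidance:

```python
# Sketch of a dosage-limit check of the kind the recommendations describe:
# dose = dose_per_m2 * body surface area, compared against a per-dose ceiling.
# Drug names and limit values are hypothetical, NOT clinical guidance.

LIMITS_MG = {"exampledrugA": 200.0, "exampledrugB": 1500.0}

def verify_dose(drug, dose_per_m2_mg, bsa_m2):
    """Return (calculated_dose_mg, ok); ok=False routes the order to review."""
    dose = dose_per_m2_mg * bsa_m2
    limit = LIMITS_MG.get(drug)
    if limit is None:
        return dose, False            # unknown drug: always hold for review
    return dose, dose <= limit

dose, ok = verify_dose("exampledrugA", 120.0, 1.9)
print(f"{dose:.0f} mg -> {'release' if ok else 'hold for review'}")
```

In a real pharmacy system this check would be one of several independent verifications (physician, pharmacist, nurse), matching the article's point that exceeding a configured limit should trigger a review rather than silently block or release the order.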

  9. Using technology to prevent adverse drug events in the intensive care unit.

    PubMed

    Hassan, Erkan; Badawi, Omar; Weber, Robert J; Cohen, Henry

    2010-06-01

    Critically ill patients are particularly susceptible to adverse drug events (ADEs) due to their rapidly changing and unstable physiology, complex therapeutic regimens, and the large percentage of medications administered intravenously. A wide variety of technologies can help prevent the points of failure commonly associated with ADEs (i.e., the five "Rights": right patient; right drug; right route; right dose; right frequency). These technologies are often categorized by how complex they are to design and engineer and by the type of error they are designed to prevent. Focusing solely on the software and hardware design of a technology may over- or underestimate how difficult it is to avoid ADEs at the bedside. Alternatively, we propose categorizing technological solutions by identifying the factors essential for success. The two major critical success factors are: 1) the degree of clinical assessment required by the clinician to appropriately evaluate and disposition the issue identified by a technology; and 2) the complexity associated with effective implementation. This classification provides a way of determining how ADE-preventing technologies in the intensive care unit can be successfully integrated into clinical practice. Although there are limited data on the effectiveness of many technologies in reducing ADEs, we review the technologies currently available in the intensive care unit environment. We also discuss critical success factors for implementation, common errors made during implementation, and the potential errors that arise when using these systems.

  10. Learning clinical reasoning.

    PubMed

    Pinnock, Ralph; Welch, Paul

    2014-04-01

    Errors in clinical reasoning continue to account for significant morbidity and mortality, despite evidence-based guidelines and improved technology. Experts in clinical reasoning often use unconscious cognitive processes that they are not aware of unless they explain how they are thinking. Understanding the intuitive and analytical thinking processes provides a guide for instruction. How knowledge is stored is critical to expertise in clinical reasoning. Curricula should be designed so that trainees store knowledge in a way that is clinically relevant. Competence in clinical reasoning is acquired by supervised practice with effective feedback. Clinicians must recognise the common errors in clinical reasoning and how to avoid them. Trainees can learn clinical reasoning effectively in everyday practice if teachers provide guidance on the cognitive processes involved in making diagnostic decisions. © 2013 The Authors. Journal of Paediatrics and Child Health © 2013 Paediatrics and Child Health Division (Royal Australasian College of Physicians).

  11. A sneaky surgical emergency: Acute compartment syndrome. Retrospective analysis of 66 closed claims, medico-legal pitfalls and damages evaluation.

    PubMed

    Marchesi, M; Marchesi, A; Calori, G M; Cireni, L V; Sileo, G; Merzagora, I; Zoia, R; Vaienti, L; Morini, O

    2014-12-01

    Acute compartment syndrome (ACS) is a clinical condition with potentially dramatic consequences; it is therefore important to recognise and treat it early. Good management of ACS minimises or avoids the sequelae associated with a late diagnosis, and may also reduce the risk of malpractice claims. The aim of this article was to evaluate the different errors ascribed to the surgeon and to identify how the damage was evaluated. A total of 66 completed and closed ACS cases were selected. The following were analysed for each case: clinical management before and after diagnosis of ACS, imputed errors, professional fault, and damage evaluation and quantification. Particular attention was paid to distinguishing between impairment because of the primary injury and iatrogenic impairment. Statistical analyses were performed using Fisher's exact test and Pearson's correlation. The most common presenting symptom was pain. Delay in the diagnosis, and hence delay in decompression, was common in the study. A total of 48 out of 66 cases resolved with a verdict of iatrogenic damage, which varied from 12% to 75% of the person's global capability. On average, $394,780 of the $574,680 mean payment derived from a medical error. ACS is a clinical emergency that requires continuous clinical surveillance from both medical and nursing staff. The related damage should be evaluated in two parts: damage deriving from the trauma, which is considered inevitable and independent of the surgeon's conduct, and damage deriving from a surgeon's error, which is eligible for an indemnity payment. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Critical Neural Substrates for Correcting Unexpected Trajectory Errors and Learning from Them

    ERIC Educational Resources Information Center

    Mutha, Pratik K.; Sainburg, Robert L.; Haaland, Kathleen Y.

    2011-01-01

    Our proficiency at any skill is critically dependent on the ability to monitor our performance, correct errors and adapt subsequent movements so that errors are avoided in the future. In this study, we aimed to dissociate the neural substrates critical for correcting unexpected trajectory errors and learning to adapt future movements based on…

  13. Looking for trouble? Diagnostics expanding disease and producing patients.

    PubMed

    Hofmann, Bjørn

    2018-05-23

    Novel tests give great opportunities for earlier and more precise diagnostics. At the same time, new tests expand disease, produce patients, and cause unnecessary harm in overdiagnosis and overtreatment. How can we evaluate diagnostics to obtain the benefits and avoid harm? One way is to pay close attention to the diagnostic process and its core concepts. Doing so reveals 3 errors that expand disease and increase overdiagnosis. The first error is to decouple diagnostics from harm, eg, by diagnosing insignificant conditions. The second error is to bypass proper validation of the relationship between test indicator and disease, eg, by introducing biomarkers for Alzheimer's disease before the tests are properly validated. The third error is to couple the name of disease to insignificant or indecisive indicators, eg, by lending the cancer name to preconditions, such as ductal carcinoma in situ. We need to avoid these errors to promote beneficial testing, bar harmful diagnostics, and evade unwarranted expansion of disease. Accordingly, we must stop identifying and testing for conditions that are only remotely associated with harm. We need more stringent verification of tests, and we must avoid naming indicators and indicative conditions after diseases. If not, we will end like ancient tragic heroes, succumbing because of our very best abilities. © 2018 John Wiley & Sons, Ltd.

  14. Logical Fallacies and the Abuse of Climate Science: Fire, Water, and Ice

    NASA Astrophysics Data System (ADS)

    Gleick, P. H.

    2012-12-01

Good policy without good science and analysis is unlikely. Good policy with bad science is even more unlikely. Unfortunately, there is a long history of abuse or misuse of science in fields with ideological, religious, or economically controversial policy implications, such as planetary physics during the time of Galileo, the evolution debate, or climate change. Common to these controversies are what are known as "logical fallacies": patterns of reasoning that are always, or at least commonly, wrong due to a flaw in the structure of the argument that renders the argument invalid. All scientists should understand the nature of logical fallacies in order to (1) avoid making mistakes and reaching unsupported conclusions, (2) help them understand and refute the flaws in arguments made by others, and (3) aid in communicating science to the public. This talk will present a series of logical fallacies often made in the climate science debate, including "arguments from ignorance," "arguments from error," "arguments from misinterpretation," and "cherry picking." Specific examples will be presented in the areas of temperature analysis, water resources, and ice dynamics, with a focus on the selective use or misuse of data.

  15. Design and tolerance analysis of a transmission sphere by interferometer model

    NASA Astrophysics Data System (ADS)

    Peng, Wei-Jei; Ho, Cheng-Fong; Lin, Wen-Lung; Yu, Zong-Ru; Huang, Chien-Yao; Hsu, Wei-Yao

    2015-09-01

The design of a 6-in, f/2.2 transmission sphere for Fizeau interferometry is presented in this paper. To predict the actual performance during the design phase, we build an interferometer model combined with tolerance analysis in Zemax. Evaluating focus imaging alone is not sufficient for a double-pass optical system, so we study an interferometer model that includes the system error and the wavefronts reflected from the reference surface and the tested surface. Firstly, we generate a deformation map of the tested surface. Using multiple configurations in Zemax, we obtain the test wavefront reflected from the tested surface and the reference wavefront reflected from the reference surface of the transmission sphere. Following the theory of interferometry, we subtract the two wavefronts to acquire the phase of the tested surface. Zernike polynomials are applied to transfer the map from phase to sag and to remove piston, tilt and power. The restored map is identical to the original map because no system error exists at this stage. Secondly, perturbed tolerances, including lens fabrication and assembly, are considered. System error arises because the test and reference beams are no longer perfectly common-path, and the restored map becomes inaccurate once system error is added. Although the system error can be subtracted by calibration, it should still be controlled within a small range to avoid calibration error. Generally, the reference wavefront error, comprising the system error and the irregularity of the reference surface of the 6-in transmission sphere, must be within peak-to-valley (PV) 0.1 λ (λ = 0.6328 µm), which is not easy to achieve. Consequently, it is necessary to predict the value of the system error before manufacture. Finally, a prototype is developed and tested against a reference surface with PV 0.1 λ irregularity.
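    The piston, tilt and power removal described in this record can be sketched as an ordinary least-squares fit of the lowest-order Zernike-like terms on a unit-disc aperture. This is a generic sketch of the standard technique, not the paper's Zemax model; the function name and grid are illustrative.

    ```python
    import numpy as np

    def remove_piston_tilt_power(sag, x, y):
        """Fit and subtract piston, x-tilt, y-tilt and power (defocus)
        from a surface sag map by linear least squares."""
        mask = x**2 + y**2 <= 1.0              # unit-disc aperture
        # Low-order Zernike-like basis evaluated on the valid pixels
        basis = np.column_stack([
            np.ones(mask.sum()),               # piston
            x[mask],                           # tilt
            y[mask],                           # tilt
            2.0 * (x[mask]**2 + y[mask]**2) - 1.0,  # power / defocus
        ])
        coeffs, *_ = np.linalg.lstsq(basis, sag[mask], rcond=None)
        residual = sag.copy()
        residual[mask] -= basis @ coeffs       # keep only higher-order content
        return residual, coeffs

    # A map containing only piston + tilt + power reduces to ~zero residual.
    yy, xx = np.mgrid[-1:1:128j, -1:1:128j]
    sag = 0.3 + 0.1 * xx - 0.05 * yy + 0.2 * (2 * (xx**2 + yy**2) - 1)
    res, c = remove_piston_tilt_power(sag, xx, yy)
    ```

    In practice the same fit-and-subtract step is applied to the restored sag map before comparing it with the original deformation map.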

  16. Type I and Type II error concerns in fMRI research: re-balancing the scale

    PubMed Central

    Cunningham, William A.

    2009-01-01

    Statistical thresholding (i.e. P-values) in fMRI research has become increasingly conservative over the past decade in an attempt to diminish Type I errors (i.e. false alarms) to a level traditionally allowed in behavioral science research. In this article, we examine the unintended negative consequences of this single-minded devotion to Type I errors: increased Type II errors (i.e. missing true effects), a bias toward studying large rather than small effects, a bias toward observing sensory and motor processes rather than complex cognitive and affective processes and deficient meta-analyses. Power analyses indicate that the reductions in acceptable P-values over time are producing dramatic increases in the Type II error rate. Moreover, the push for a mapwide false discovery rate (FDR) of 0.05 is based on the assumption that this is the FDR in most behavioral research; however, this is an inaccurate assessment of the conventions in actual behavioral research. We report simulations demonstrating that combined intensity and cluster size thresholds such as P < 0.005 with a 10 voxel extent produce a desirable balance between Types I and II error rates. This joint threshold produces high but acceptable Type II error rates and produces a FDR that is comparable to the effective FDR in typical behavioral science articles (while a 20 voxel extent threshold produces an actual FDR of 0.05 with relatively common imaging parameters). We recommend a greater focus on replication and meta-analysis rather than emphasizing single studies as the unit of analysis for establishing scientific truth. From this perspective, Type I errors are self-erasing because they will not replicate, thus allowing for more lenient thresholding to avoid Type II errors. PMID:20035017
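    The joint intensity-plus-extent thresholding discussed above can be illustrated with a toy null simulation: threshold a random volume voxelwise, then discard clusters below a minimum extent. The thresholds mirror the article's P < 0.005 / 10-voxel example, but the code is an illustrative sketch, not the authors' simulation.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)

    # Null data: z-scores for a small 3-D volume with no true effects.
    z = rng.standard_normal((30, 30, 30))

    # Voxelwise intensity threshold of p < .005 (one-sided), i.e. z > 2.576.
    above = z > 2.576

    # Keep only clusters of at least 10 contiguous suprathreshold voxels.
    labels, n = ndimage.label(above)
    sizes = ndimage.sum(above, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= 10]
    surviving = np.isin(labels, keep)
    ```

    Under the null, isolated suprathreshold voxels are common but contiguous 10-voxel clusters are rare, which is why the joint threshold controls false positives far better than the intensity threshold alone.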

  17. Embarrassing Pronoun Case Errors [and] When Repeating It's Not Necessary To Use Past Tense.

    ERIC Educational Resources Information Center

    Arnold, George

    2002-01-01

    Discusses how to help journalism students avoid pronoun case errors. Notes that many students as well as broadcast journalism professionals make the error of using the past tense when referring to a previous expression or situation that remains current in meaning. (RS)

  18. A Transient Dopamine Signal Represents Avoidance Value and Causally Influences the Demand to Avoid

    PubMed Central

    Pultorak, Katherine J.; Schelp, Scott A.; Isaacs, Dominic P.; Krzystyniak, Gregory

    2018-01-01

Abstract While an extensive literature supports the notion that mesocorticolimbic dopamine plays a role in negative reinforcement, recent evidence suggests that dopamine exclusively encodes the value of positive reinforcement. In the present study, we employed a behavioral economics approach to investigate whether dopamine plays a role in the valuation of negative reinforcement. Using rats as subjects, we first applied fast-scan cyclic voltammetry (FSCV) to determine that dopamine concentration decreases with the number of lever presses required to avoid electrical footshock (i.e., the economic price of avoidance). Analysis of the rate of decay of avoidance demand curves, which depict an inverse relationship between avoidance and increasing price, allows for inference of the worth an animal places on avoidance outcomes. Rapidly decaying demand curves indicate increased price sensitivity, or low worth placed on avoidance outcomes, while slow rates of decay indicate reduced price sensitivity, or greater worth placed on avoidance outcomes. We therefore used optogenetics to assess how inducing dopamine release causally modifies the demand to avoid electrical footshock in an economic setting. Increasing release at an avoidance-predictive cue made animals more sensitive to price, consistent with a negative reward prediction error (i.e., the animal perceives it received a worse outcome than expected). Increasing release at avoidance made animals less sensitive to price, consistent with a positive reward prediction error (i.e., the animal perceives it received a better outcome than expected). These data demonstrate that transient dopamine release events represent the value of avoidance outcomes and can predictably modify the demand to avoid. PMID:29766047

  19. Experimental design, power and sample size for animal reproduction experiments.

    PubMed

    Chapman, Phillip L; Seidel, George E

    2008-01-01

    The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate or seriously impair the validity of conclusions from experiments. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and computing power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
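    The kind of power calculation the paper discusses can be sketched with the standard normal-approximation formula for a two-sided, two-sample comparison of means; this is a textbook sketch, not the authors' SAS programs, and the function name is illustrative.

    ```python
    import math
    from statistics import NormalDist

    def n_per_group(effect_size, alpha=0.05, power=0.80):
        """Approximate sample size per group for a two-sided two-sample
        test of means: n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2."""
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

    n = n_per_group(0.5)   # medium standardized effect size d = 0.5
    ```

    The exact t-based answer is slightly larger for small samples, which is one reason interactive power software is preferable for final designs.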

  20. Swimming and other activities: applied aspects of fish swimming performance

    USGS Publications Warehouse

    Castro-Santos, Theodore R.; Farrell, A.P.

    2011-01-01

    Human activities such as hydropower development, water withdrawals, and commercial fisheries often put fish species at risk. Engineered solutions designed to protect species or their life stages are frequently based on assumptions about swimming performance and behaviors. In many cases, however, the appropriate data to support these designs are either unavailable or misapplied. This article provides an overview of the state of knowledge of fish swimming performance – where the data come from and how they are applied – identifying both gaps in knowledge and common errors in application, with guidance on how to avoid repeating mistakes, as well as suggestions for further study.

  1. A new method of measuring gravitational acceleration in an undergraduate laboratory program

    NASA Astrophysics Data System (ADS)

    Wang, Qiaochu; Wang, Chang; Xiao, Yunhuan; Schulte, Jurgen; Shi, Qingfan

    2018-01-01

This paper presents a high-accuracy method for measuring gravitational acceleration in an undergraduate laboratory program. The experiment is based on water in a cylindrical vessel rotating about its vertical axis at a constant speed. The water surface forms a paraboloid whose focal length is related to the rotational period and the gravitational acceleration. This experimental setup avoids the classical sources of error in determining the local value of gravitational acceleration that are so prevalent in the common simple-pendulum and inclined-plane experiments. The presented method combines multiple physics concepts such as kinematics, classical mechanics and geometric optics, offering the opportunity for lateral as well as project-based learning.
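    The focal-length relation behind the experiment follows from the standard rotating-fluid result: the surface satisfies z = ω²r²/(2g), and comparing with z = r²/(4f) gives f = g/(2ω²) = gT²/(8π²), hence g = 8π²f/T². The sketch below encodes that textbook formula; it is not the paper's own analysis code.

    ```python
    import math

    def g_from_paraboloid(focal_length_m, period_s):
        """Local gravitational acceleration from the focal length f of the
        rotating-water paraboloid and the rotation period T:
        z = omega^2 r^2 / (2 g)  =>  f = g / (2 omega^2)  =>  g = 8 pi^2 f / T^2."""
        return 8 * math.pi ** 2 * focal_length_m / period_s ** 2

    # Round trip: a surface with f = g T^2 / (8 pi^2) returns the same g.
    g = g_from_paraboloid(9.81 * 1.0 ** 2 / (8 * math.pi ** 2), 1.0)
    ```

    In the laboratory, f is measured optically (the reflecting paraboloid focuses light) and T with a timer, so g follows from two directly measurable quantities.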

  2. FORMATOMATIC: a program for converting diploid allelic data between common formats for population genetic analysis.

    PubMed

    Manoukis, Nicholas C

    2007-07-01

    There has been a great increase in both the number of population genetic analysis programs and the size of data sets being studied with them. Since the file formats required by the most popular and useful programs are variable, automated reformatting or conversion between them is desirable. formatomatic is an easy to use program that can read allelic data files in genepop, raw (csv) or convert formats and create data files in nine formats: raw (csv), arlequin, genepop, immanc/bayesass +, migrate, newhybrids, msvar, baps and structure. Use of formatomatic should greatly reduce time spent reformatting data sets and avoid unnecessary errors.
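    formatomatic itself is not reproduced here, but the kind of reformatting it automates can be sketched with the csv module. The column layout (individual ID followed by two allele columns per locus) and the GENEPOP-style output are illustrative assumptions, not the program's actual parsers.

    ```python
    import csv
    import io

    def raw_to_genepop_like(raw_csv, locus_names, title="converted"):
        """Convert a raw CSV of diploid genotypes (id, a1, a2, a1, a2, ...)
        into a minimal GENEPOP-style text block. Illustrative only."""
        out = [title] + list(locus_names) + ["Pop"]
        for row in csv.reader(io.StringIO(raw_csv)):
            ident, alleles = row[0], row[1:]
            # Zero-pad each allele to three digits and fuse the diploid pair.
            genos = ["%03d%03d" % (int(alleles[i]), int(alleles[i + 1]))
                     for i in range(0, len(alleles), 2)]
            out.append("%s , %s" % (ident, " ".join(genos)))
        return "\n".join(out)

    raw = "ind1,1,2,3,3\nind2,2,2,1,3"
    text = raw_to_genepop_like(raw, ["loc1", "loc2"])
    ```

    Automating this step, rather than reshaping columns by hand, is precisely how such tools avoid the transcription errors the abstract mentions.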

  3. [Phenomenology of craving: from differentiation to adequate therapy].

    PubMed

    Mendelevich, V D

    2010-01-01

The author analyzes the phenomenon of addiction from a psychological/psychiatric position and differentiates it from the psychopathological disorders, including parabulia, hyperbulia and paraphilia, commonly used to define drive disorders. It is concluded that addiction is a specific complex of clinical symptoms that is not similar to other drive disorders. To avoid diagnostic and therapeutic errors, the author suggests revising the definitions by assigning biological sense to the conception of addiction within psychoactive substance dependence, sexual addiction and some forms of eating dependence, and using the definition of para-addictive drives in cases of over-valued drives (gambling, Internet dependence, fanaticism, etc.).

  4. Outcome Assessments and Cost Avoidance of an Oral Chemotherapy Management Clinic.

    PubMed

    Wong, Siu-Fun; Bounthavong, Mark; Nguyen, Cham P; Chen, Timothy

    2016-03-01

Increasing use of oral chemotherapy drugs increases the challenges of drug and patient management. An oral chemotherapy management clinic was developed to provide patients with oral chemotherapy management, concurrent medication (CM) education, and symptom management services. Given the scarcity of published data, this evaluation aims to measure the need for, and effectiveness of, this practice model. This is a case series report of all patients referred to the oral chemotherapy management clinic. Data collected included patient demographics, depression scores, CMs, and types of intervention, including detection and management outcomes collected at baseline and at 3-day, 7-day, and 3-month follow-ups. The persistence rate was monitored, and a secondary analysis assessed potential cost avoidance. The 86 evaluated patients (32 men and 54 women; mean age, 63.4 years) did not show a high risk for medication nonadherence. The 3 most common cancer diagnoses were rectal, pancreatic, and breast, with capecitabine the most frequently prescribed drug. Patients had an average of 13.7 CMs. A total of 125 interventions (detection and management of adverse drug events, compliance, drug interactions, medication errors, and symptom management) occurred in 201 visits, with more than 75% of interventions occurring within the first 14 days. A persistence rate of 78% was observed among the 41 evaluable patients. The total estimated annual cost avoidance per 1.0 full-time employee (FTE) was $125,761.93. This evaluation demonstrated the need for additional support for patients receiving oral chemotherapy within a standard-of-care medical service. A comprehensive oral chemotherapy management referral service can optimize patient care delivery via early interventions for adverse drug events, drug interactions, and medication errors up to 3 months after initiation of treatment. Copyright © 2016 by the National Comprehensive Cancer Network.

  5. Avoiding Substantive Errors in Individualized Education Program Development

    ERIC Educational Resources Information Center

    Yell, Mitchell L.; Katsiyannis, Antonis; Ennis, Robin Parks; Losinski, Mickey; Christle, Christine A.

    2016-01-01

    The purpose of this article is to discuss major substantive errors that school personnel may make when developing students' Individualized Education Programs (IEPs). School IEP team members need to understand the importance of the procedural and substantive requirements of the IEP, have an awareness of the five serious substantive errors that IEP…

  6. Fail Better: Toward a Taxonomy of E-Learning Error

    ERIC Educational Resources Information Center

    Priem, Jason

    2010-01-01

    The study of student error, important across many fields of educational research, has begun to attract interest in the field of e-learning, particularly in relation to usability. However, it remains unclear when errors should be avoided (as usability failures) or embraced (as learning opportunities). Many domains have benefited from taxonomies of…

  7. Multidisciplinary management of ornithine transcarbamylase (OTC) deficiency in pregnancy: essential to prevent hyperammonemic complications

    PubMed Central

    Lamb, Stephanie; Aye, Christina Yi Ling; Murphy, Elaine; Mackillop, Lucy

    2013-01-01

Ornithine transcarbamylase (OTC) deficiency is the most common inborn error of the urea cycle, with an incidence of 1 in 14 000 live births. Pregnancy can trigger potentially fatal hyperammonemic crises. We report a successful pregnancy in a 29-year-old primiparous patient with a known diagnosis of OTC deficiency since infancy. Hyperammonemic complications were avoided due to careful multidisciplinary management which included a detailed antenatal, intrapartum and postnatal plan. Management principles include avoidance of triggers, a low-protein diet and medications which promote the removal of nitrogen by alternative pathways. Triggers include metabolic stress such as febrile illness, particularly gastroenteritis, fasting and any protein loading. In our case the patient, in addition to a restricted protein intake, was prescribed sodium benzoate 4 g four times a day, sodium phenylbutyrate 2 g four times a day and arginine 500 mg four times a day to aid excretion of ammonia and reduce flux through the urea cycle. PMID:23283608

  8. How personal standards perfectionism and evaluative concerns perfectionism affect the error positivity and post-error behavior with varying stimulus visibility.

    PubMed

    Drizinsky, Jessica; Zülch, Joachim; Gibbons, Henning; Stahl, Jutta

    2016-10-01

    Error detection is required in order to correct or avoid imperfect behavior. Although error detection is beneficial for some people, for others it might be disturbing. We investigated Gaudreau and Thompson's (Personality and Individual Differences, 48, 532-537, 2010) model, which combines personal standards perfectionism (PSP) and evaluative concerns perfectionism (ECP). In our electrophysiological study, 43 participants performed a combination of a modified Simon task, an error awareness paradigm, and a masking task with a variation of stimulus onset asynchrony (SOA; 33, 67, and 100 ms). Interestingly, relative to low-ECP participants, high-ECP participants showed a better post-error accuracy (despite a worse classification accuracy) in the high-visibility SOA 100 condition than in the two low-visibility conditions (SOA 33 and SOA 67). Regarding the electrophysiological results, first, we found a positive correlation between ECP and the amplitude of the error positivity (Pe) under conditions of low stimulus visibility. Second, under the condition of high stimulus visibility, we observed a higher Pe amplitude for high-ECP-low-PSP participants than for high-ECP-high-PSP participants. These findings are discussed within the framework of the error-processing avoidance hypothesis of perfectionism (Stahl, Acharki, Kresimon, Völler, & Gibbons, International Journal of Psychophysiology, 97, 153-162, 2015).

  9. Six Common Mistakes in Conservation Priority Setting

    PubMed Central

    Game, Edward T; Kareiva, Peter; Possingham, Hugh P

    2013-01-01

Abstract A vast number of prioritization schemes have been developed to help conservation navigate tough decisions about the allocation of finite resources. However, the application of quantitative approaches to setting priorities in conservation frequently includes mistakes that can undermine their authors’ intention to be more rigorous and scientific in the way priorities are established and resources allocated. Drawing on well-established principles of decision science, we highlight 6 mistakes commonly associated with setting priorities for conservation: not acknowledging conservation plans are prioritizations; trying to solve an ill-defined problem; not prioritizing actions; arbitrariness; hidden value judgments; and not acknowledging risk of failure. We explain these mistakes and offer a path to help conservation planners avoid making the same mistakes in future prioritizations. PMID:23565990

  10. Feature Migration in Time: Reflection of Selective Attention on Speech Errors

    ERIC Educational Resources Information Center

    Nozari, Nazbanou; Dell, Gary S.

    2012-01-01

    This article describes an initial study of the effect of focused attention on phonological speech errors. In 3 experiments, participants recited 4-word tongue twisters and focused attention on 1 (or none) of the words. The attended word was singled out differently in each experiment; participants were under instructions to avoid errors on the…

  11. Error Patterns in Research Papers by Pacific Rim Students.

    ERIC Educational Resources Information Center

    Crowe, Chris

    By looking for patterns of errors in the research papers of Asian students, educators can uncover pedagogical strategies to help students avoid repeating such errors. While a good deal of research has identified a number of sentence-level problems which are typical of Asian students writing in English, little attempt has been made to consider the…

  12. Handling Errors as They Arise in Whole-Class Interactions

    ERIC Educational Resources Information Center

    Ingram, Jenni; Pitt, Andrea; Baldry, Fay

    2015-01-01

    There has been a long history of research into errors and their role in the teaching and learning of mathematics. This research has led to a change to pedagogical recommendations from avoiding errors to explicitly using them in lessons. In this study, 22 mathematics lessons were video-recorded and transcribed. A conversation analytic (CA) approach…

  13. Causal criteria and counterfactuals; nothing more (or less) than scientific common sense.

    PubMed

    Phillips, Carl V; Goodman, Karen J

    2006-05-26

    Two persistent myths in epidemiology are that we can use a list of "causal criteria" to provide an algorithmic approach to inferring causation and that a modern "counterfactual model" can assist in the same endeavor. We argue that these are neither criteria nor a model, but that lists of causal considerations and formalizations of the counterfactual definition of causation are nevertheless useful tools for promoting scientific thinking. They set us on the path to the common sense of scientific inquiry, including testing hypotheses (really putting them to a test, not just calculating simplistic statistics), responding to the Duhem-Quine problem, and avoiding many common errors. Austin Bradford Hill's famous considerations are thus both over-interpreted by those who would use them as criteria and under-appreciated by those who dismiss them as flawed. Similarly, formalizations of counterfactuals are under-appreciated as lessons in basic scientific thinking. The need for lessons in scientific common sense is great in epidemiology, which is taught largely as an engineering discipline and practiced largely as technical tasks, making attention to core principles of scientific inquiry woefully rare.

  14. The evolution of Crew Resource Management training in commercial aviation

    NASA Technical Reports Server (NTRS)

    Helmreich, R. L.; Merritt, A. C.; Wilhelm, J. A.

    1999-01-01

    In this study, we describe changes in the nature of Crew Resource Management (CRM) training in commercial aviation, including its shift from cockpit to crew resource management. Validation of the impact of CRM is discussed. Limitations of CRM, including lack of cross-cultural generality are considered. An overarching framework that stresses error management to increase acceptance of CRM concepts is presented. The error management approach defines behavioral strategies taught in CRM as error countermeasures that are employed to avoid error, to trap errors committed, and to mitigate the consequences of error.

  15. Interplanetary Trajectories, Encke Method (ITEM)

    NASA Technical Reports Server (NTRS)

    Whitlock, F. H.; Wolfe, H.; Lefton, L.; Levine, N.

    1972-01-01

A modified program has been developed using an improved variation of the Encke method which avoids the accumulation of round-off errors and the numerical ambiguities arising from near-circular orbits of low inclination. A variety of interplanetary trajectory problems can be computed with maximum accuracy and efficiency.

  16. Hand-Eye Calibration in Visually-Guided Robot Grinding.

    PubMed

    Li, Wen-Long; Xie, He; Zhang, Gang; Yan, Si-Jie; Yin, Zhou-Ping

    2016-11-01

Visually-guided robot grinding is a novel and promising automation technique for blade manufacturing. One common problem encountered in robot grinding is hand-eye calibration, which establishes the pose relationship between the end effector (hand) and the scanning sensor (eye). This paper proposes a new calibration approach for robot belt grinding. The main contribution of this paper is its consideration of both joint parameter errors and pose parameter errors in a hand-eye calibration equation. The objective function of the hand-eye calibration is built and solved, from which 30 compensated values (corresponding to 24 joint parameters and six pose parameters) are easily calculated in a closed solution. The proposed approach is economical and simple because only a criterion sphere is used to calculate the calibration parameters, avoiding the need for an expensive and complicated tracking process using a laser tracker. The effectiveness of this method is verified using a calibration experiment and a blade grinding experiment. The code used in this approach is attached in the Appendix.

  17. Hemispheric Asymmetries in Striatal Reward Responses Relate to Approach-Avoidance Learning and Encoding of Positive-Negative Prediction Errors in Dopaminergic Midbrain Regions.

    PubMed

    Aberg, Kristoffer Carl; Doell, Kimberly C; Schwartz, Sophie

    2015-10-28

    Some individuals are better at learning about rewarding situations, whereas others are inclined to avoid punishments (i.e., enhanced approach or avoidance learning, respectively). In reinforcement learning, action values are increased when outcomes are better than predicted (positive prediction errors [PEs]) and decreased for worse than predicted outcomes (negative PEs). Because actions with high and low values are approached and avoided, respectively, individual differences in the neural encoding of PEs may influence the balance between approach-avoidance learning. Recent correlational approaches also indicate that biases in approach-avoidance learning involve hemispheric asymmetries in dopamine function. However, the computational and neural mechanisms underpinning such learning biases remain unknown. Here we assessed hemispheric reward asymmetry in striatal activity in 34 human participants who performed a task involving rewards and punishments. We show that the relative difference in reward response between hemispheres relates to individual biases in approach-avoidance learning. Moreover, using a computational modeling approach, we demonstrate that better encoding of positive (vs negative) PEs in dopaminergic midbrain regions is associated with better approach (vs avoidance) learning, specifically in participants with larger reward responses in the left (vs right) ventral striatum. Thus, individual dispositions or traits may be determined by neural processes acting to constrain learning about specific aspects of the world. Copyright © 2015 the authors 0270-6474/15/3514491-10$15.00/0.
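    The asymmetry described above can be sketched with a standard reinforcement-learning value update in which positive and negative prediction errors receive separate learning rates. This is a textbook Rescorla-Wagner-style sketch of the general idea, not the authors' computational model; the parameter values are illustrative.

    ```python
    def update_value(v, reward, alpha_pos=0.3, alpha_neg=0.1):
        """Update an action value with asymmetric learning rates.
        Here better-than-expected outcomes (positive PE) are weighted
        more heavily than worse-than-expected ones, which biases the
        agent toward approach learning; reversing the rates would
        bias it toward avoidance learning."""
        pe = reward - v                    # prediction error
        alpha = alpha_pos if pe > 0 else alpha_neg
        return v + alpha * pe

    v = 0.0
    for r in [1, 1, 0, 1]:                 # a mostly rewarded action
        v = update_value(v, r)
    ```

    Because actions with high values are approached and low values avoided, the relative size of the two learning rates directly shapes the approach-avoidance balance the study measures.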

  18. Prescription errors in the National Health Services, time to change practice.

    PubMed

    Hamid, Tahir; Harper, Luke; Rose, Samman; Petkar, Sanjive; Fienman, Richard; Athar, Syed M; Cushley, Michael

    2016-02-01

Medication error is a major source of iatrogenic illness, and error in prescription is its most common avoidable form. We present our study, performed at two UK National Health Service hospitals. The prescription practice of junior doctors working on general medical and surgical wards in a National Health Service District General Hospital and a University Teaching Hospital in the UK was reviewed. Practice was assessed against standard hospital prescription charts, developed in accordance with local pharmacy guidance. A total of 407 prescription charts were reviewed in both the initial audit and the re-audit one year later. In the District General Hospital, documentation of allergy, weight and capital-letter prescription was achieved in 31, 5 and 40% of charts, respectively. Forty-nine per cent of discontinued prescriptions were properly deleted and signed for. In the re-audit, significant improvement was noted in documentation of the patient's name (100%), gender (54%), allergy status (51%) and use of generic drug names (71%). Similarly, in the University Teaching Hospital, 82, 63 and 65% compliance was achieved in documentation of age, generic drug name prescription and capital-letter prescription, respectively. Prescription practice was reassessed one year later, after recommendations and changes in prescription practice, leading to significant improvement in documentation of unit number, generic drug name prescription, insulin prescription and documentation of the patient's ward. Prescription error remains an important, modifiable form of medical error, which may be rectified by introducing multidisciplinary assessment of practice, nationwide standardised prescription charts and revision of current clinical training in prescribing. © The Author(s) 2016.

  19. Common misconceptions about 5-aminosalicylates and thiopurines in inflammatory bowel disease

    PubMed Central

    Gisbert, Javier P; Chaparro, María; Gomollón, Fernando

    2011-01-01

    Misconceptions are common in the care of patients with inflammatory bowel disease (IBD). In this paper, we state the misconceptions most commonly found in clinical practice concerning the use of 5-aminosalicylates and thiopurines, review the related scientific evidence, and make appropriate recommendations. Preventing errors requires knowledge, so that errors are not made through ignorance. However, the amount of knowledge is increasing so quickly that a new danger is an overabundance of information. IBD is a model of a very complex disease, and our goal with this review is to summarize the key evidence for the most common daily clinical problems. With regard to 5-aminosalicylates, best practice may be to consider abandoning the use of these drugs in patients with small bowel Crohn's disease. The combined approach with oral plus topical 5-aminosalicylates should be the first-line therapy in patients with active ulcerative colitis; once-daily treatment should be offered as a first-choice regimen due to its better compliance and higher efficacy. With regard to thiopurines, they seem to be as effective in ulcerative colitis as in Crohn's disease. Underdosing of thiopurines is a form of undertreatment. Thiopurines should probably be continued indefinitely because their withdrawal is associated with a high risk of relapse. Mercaptopurine is a safe alternative in patients with digestive intolerance or hepatotoxicity due to azathioprine. Finally, thiopurine methyltransferase (TPMT) screening cannot substitute for regular monitoring because the majority of cases of myelotoxicity are not TPMT-related. PMID:21941413

  20. Safe prescribing: a titanic challenge

    PubMed Central

    Routledge, Philip A

    2012-01-01

    The challenge to achieve safe prescribing merits the adjective ‘titanic’. The organisational and human errors leading to poor prescribing (e.g. underprescribing, overprescribing, misprescribing or medication errors) have parallels in the organisational and human errors that led to the loss of the Titanic 100 years ago this year. Prescribing can be adversely affected by communication failures, critical conditions, complacency, corner cutting, callowness and a lack of courage of conviction, all of which were also factors leading to the Titanic tragedy. These issues need to be addressed by a commitment to excellence, the final component of the ‘Seven C's’. Optimal prescribing is dependent upon close communication and collaborative working between highly trained health professionals, whose role is to ensure maximum clinical effectiveness, whilst also protecting their patients from avoidable harm. Since humans are prone to error, and the environments in which they work are imperfect, it is not surprising that medication errors are common, occurring more often during the prescribing stage than during dispensing or administration. A commitment to excellence in prescribing includes a continued focus on lifelong learning (including interprofessional learning) in pharmacology and therapeutics. This should be accompanied by improvements in the clinical working environment of prescribers, and the encouragement of a strong safety culture (including reporting of adverse incidents as well as suspected adverse drug reactions whenever appropriate). Finally, members of the clinical team must be prepared to challenge each other, when necessary, to ensure that prescribing combines the highest likelihood of benefit with the lowest potential for harm. PMID:22738396

  1. Vaccination errors in general practice: creation of a preventive checklist based on a multimodal analysis of declared errors.

    PubMed

    Charles, Rodolphe; Vallée, Josette; Tissot, Claire; Lucht, Frédéric; Botelho-Nevers, Elisabeth

    2016-08-01

    Vaccination is a common act in general practice in which, as in all medical procedures, errors may occur. To our best knowledge, few tools exist to prevent errors in this area. Our aim was to create a checklist that could be used in general practice to avoid the main errors. From April to July 2013, we systematically searched for vaccination errors using three sources: a review of the literature, individual interviews with 25 health care workers, and supervised peer review groups meeting at the Medicine school of Saint-Etienne (France). The most frequently retrieved errors were used to create the checklist, which was regularly submitted to the interviewed caregivers to improve its construction and content; its stabilization was taken as evidence of finalization. The checklist comprised three parts, allowing verification at each stage of the vaccination process: before, during and after vaccine administration. Before the vaccination, the main item to be checked was: "Does my patient need, and may he/she receive, this vaccine in accordance with the national French vaccination guidelines?" During the preparation and administration of the vaccine, the items to be checked were: "Are the patient and the practitioner comfortable? Is all the material needed correctly prepared? Is the appropriate route defined?" Finally, after the vaccination, most items to be checked concerned traceability. This checklist seemed useful and usable by the panel of practitioners questioned, and may help prevent errors. Its efficacy and feasibility in clinical practice will require further testing. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Heterodyne interferometer with subatomic periodic nonlinearity.

    PubMed

    Wu, C M; Lawall, J; Deslattes, R D

    1999-07-01

    A new, to our knowledge, heterodyne interferometer for differential displacement measurements is presented. It is, in principle, free of periodic nonlinearity. A pair of spatially separated light beams with different frequencies is produced by two acousto-optic modulators, avoiding the main source of periodic nonlinearity in traditional heterodyne interferometers that are based on a Zeeman split laser. In addition, laser beams of the same frequency are used in the measurement and the reference arms, giving the interferometer theoretically perfect immunity from common-mode displacement. We experimentally demonstrated a residual level of periodic nonlinearity of less than 20 pm in amplitude. The remaining periodic error is attributed to unbalanced ghost reflections that drift slowly with time.

  3. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    PubMed Central

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike that of triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantification in phantom and ex vivo acquisitions. PMID:24123362
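    The scale of the temperature effect can be shown with a back-of-the-envelope sketch (not the authors' code; the PRF temperature coefficient of roughly -0.01 ppm/°C is a commonly cited approximate value, assumed here):

```python
# Illustrative sketch: how a temperature change shifts the water resonance
# relative to fat at 1.5 T, using an assumed round value of -0.01 ppm/degC
# for the water proton-resonance-frequency (PRF) temperature coefficient.

GAMMA_MHZ_PER_T = 42.577    # proton gyromagnetic ratio / 2*pi, MHz per tesla
B0_T = 1.5                  # field strength used in the study
PRF_PPM_PER_DEGC = -0.01    # approximate water PRF temperature coefficient

def water_shift_hz(delta_temp_c: float) -> float:
    """Water frequency shift (Hz) for a temperature change, at 1.5 T."""
    larmor_mhz = GAMMA_MHZ_PER_T * B0_T                 # ~63.9 MHz
    return larmor_mhz * PRF_PPM_PER_DEGC * delta_temp_c  # ppm * MHz = Hz

# Cooling a phantom from body temperature (37 C) to 0 C shifts water by
# roughly 24 Hz, enough to perturb a fixed fat-water spectral model.
shift = water_shift_hz(0 - 37)
print(round(shift, 1))
```

    A fixed spectral model assumes the fat-water frequency offsets measured at one temperature; the sketch shows why scanning at 0-40°C moves the water peak by tens of hertz and thus biases magnitude fitting.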

  4. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
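    The most common error found above, uncorrected post hoc t-tests, inflates false positives in a way that is easy to quantify. A minimal sketch (my illustration, not the authors' analysis; it assumes independent tests, a simplification):

```python
# Why uncorrected post hoc t-tests inflate false positives: with k groups
# there are m = k*(k-1)/2 pairwise comparisons. If each is tested at alpha,
# the chance of at least one spurious "significant" result (assuming
# independent tests) is 1 - (1 - alpha)**m.

def pairwise_comparisons(k: int) -> int:
    """Number of pairwise comparisons among k groups."""
    return k * (k - 1) // 2

def familywise_error(alpha: float, m: int) -> float:
    """Probability of at least one false positive over m independent tests."""
    return 1 - (1 - alpha) ** m

k, alpha = 5, 0.05
m = pairwise_comparisons(k)            # 10 comparisons among 5 groups
fwer = familywise_error(alpha, m)      # ~0.40 chance of a false positive
bonferroni_alpha = alpha / m           # Bonferroni-adjusted per-test threshold

print(m, round(fwer, 2), bonferroni_alpha)
```

    This is the arithmetic behind the reported 14% average increase in "significant" effects when the reanalysis used uncorrected procedures.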

  5. Sources of medical error in refractive surgery.

    PubMed

    Moshirfar, Majid; Simpson, Rachel G; Dave, Sonal B; Christiansen, Steven M; Edmonds, Jason N; Culbertson, William W; Pascucci, Stephen E; Sher, Neal A; Cano, David B; Trattler, William B

    2013-05-01

    To evaluate the causes of laser programming errors in refractive surgery and outcomes in these cases. In this multicenter, retrospective chart review, 22 eyes of 18 patients who had incorrect data entered into the refractive laser computer system at the time of treatment were evaluated. Cases were analyzed to uncover the etiology of these errors, patient follow-up treatments, and final outcomes. The results were used to identify potential methods to avoid similar errors in the future. Every patient experienced compromised uncorrected visual acuity requiring additional intervention, and 7 of 22 eyes (32%) lost corrected distance visual acuity (CDVA) of at least one line. Sixteen patients were suitable candidates for additional surgical correction to address these residual visual symptoms and six were not. Thirteen of 22 eyes (59%) received surgical follow-up treatment; nine eyes were treated with contact lenses. After follow-up treatment, six patients (27%) still had a loss of one line or more of CDVA. Three significant sources of error were identified: errors of cylinder conversion, data entry, and patient identification error. Twenty-seven percent of eyes with laser programming errors ultimately lost one or more lines of CDVA. Patients who underwent surgical revision had better outcomes than those who did not. Many of the mistakes identified were likely avoidable had preventive measures been taken, such as strict adherence to patient verification protocol or rigorous rechecking of treatment parameters. Copyright 2013, SLACK Incorporated.
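    One of the three error sources identified, cylinder conversion, has a standard guard: transposing a prescription between plus- and minus-cylinder notation before data entry. A minimal sketch (hypothetical helper, not from the study; the transposition rule itself is standard optometric practice):

```python
# Standard sphero-cylindrical transposition: add the cylinder to the sphere,
# flip the cylinder sign, and rotate the axis by 90 degrees. Entering a
# prescription written in one convention into a laser expecting the other,
# without this conversion, is the "cylinder conversion" error class above.

def transpose(sphere: float, cylinder: float, axis: int):
    """Convert a prescription to the opposite cylinder-sign convention."""
    new_sphere = sphere + cylinder
    new_cylinder = -cylinder
    new_axis = (axis + 90) % 180
    if new_axis == 0:          # axis is conventionally reported as 1-180
        new_axis = 180
    return new_sphere, new_cylinder, new_axis

# -2.00 sphere with +1.00 cylinder at 90 degrees is the same correction as
# -1.00 sphere with -1.00 cylinder at 180 degrees.
print(transpose(-2.00, +1.00, 90))
```

    Rechecking that the transposed and entered forms describe the same correction is one concrete instance of the "rigorous rechecking of treatment parameters" the authors recommend.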

  6. Analysis of Relationships between the Level of Errors in Leg and Monofin Movement and Stroke Parameters in Monofin Swimming

    PubMed Central

    Rejman, Marek

    2013-01-01

    The aim of this study was to analyze the error structure in propulsive movements with regard to its influence on monofin swimming speed. The random cycles performed by six swimmers were filmed during a progressive test (900m). An objective method to estimate errors committed in the area of angular displacement of the feet and monofin segments was employed. The parameters were compared with a previously described model. Mutual dependences between the level of errors, stroke frequency, stroke length and amplitude in relation to swimming velocity were analyzed. The results showed that proper foot movements and the avoidance of errors arising at the distal part of the fin ensure the progression of swimming speed. A distribution of individual stroke parameters that optimally increases stroke frequency to the maximal level enabling the stabilization of stroke length leads to the minimization of errors. Identification of key elements in the stroke structure based on the analysis of errors committed should aid in improving monofin swimming technique. Key points The monofin swimming technique was evaluated through the prism of objectively defined errors committed by the swimmers. The dependences between the level of errors, stroke rate, stroke length and amplitude in relation to swimming velocity were analyzed. Optimally increasing stroke rate to the maximal possible level that enables the stabilization of stroke length leads to the minimization of errors. Proper foot movement and the avoidance of errors arising at the distal part of the fin provide for the progression of swimming speed. The key elements for improving monofin swimming technique, based on the analysis of errors committed, were designated. PMID:24149742

  7. Continuous Process Improvement Transformation Guidebook

    DTIC Science & Technology

    2006-05-01

    except full-scale implementation. Error Proofing (Poka Yoke) Finding and correcting defects caused by errors costs more and more as a system or...proofing. Shigeo Shingo introduced the concept of Poka-Yoke at Toyota Motor Corporation. Poka Yoke (pronounced “poh-kah yoh-kay”) translates to “avoid

  8. Aging and the intrusion superiority effect in visuo-spatial working memory.

    PubMed

    Cornoldi, Cesare; Bassani, Chiara; Berto, Rita; Mammarella, Nicola

    2007-01-01

    This study investigated the active component of visuo-spatial working memory (VSWM) in younger and older adults, testing the hypotheses that elderly individuals perform more poorly than younger ones and that errors in active VSWM tasks depend, at least partially, on difficulties in avoiding intrusions (i.e., avoiding already activated information). In two experiments, participants were presented with sequences of matrices on which three positions were pointed out sequentially: their task was to process all the positions but indicate only the final position of each sequence. Results showed poorer performance in the elderly compared to the younger group, and a higher number of intrusion errors (due to activated but irrelevant positions) than invention errors (pointing out a position never indicated by the experimenter). The number of errors increased when a concurrent task was introduced (Experiment 1) and was affected by different patterns of matrices (Experiment 2). In general, the results show that elderly people have impaired VSWM and produce a large number of errors due to inhibition failures. However, both the younger and the older adults' visuo-spatial working memory was affected by the presence of activated irrelevant information, the reduction of available resources, and task constraints.

  9. Safe prescribing: a titanic challenge.

    PubMed

    Routledge, Philip A

    2012-10-01

    The challenge to achieve safe prescribing merits the adjective 'titanic'. The organisational and human errors leading to poor prescribing (e.g. underprescribing, overprescribing, misprescribing or medication errors) have parallels in the organisational and human errors that led to the loss of the Titanic 100 years ago this year. Prescribing can be adversely affected by communication failures, critical conditions, complacency, corner cutting, callowness and a lack of courage of conviction, all of which were also factors leading to the Titanic tragedy. These issues need to be addressed by a commitment to excellence, the final component of the 'Seven C's'. Optimal prescribing is dependent upon close communication and collaborative working between highly trained health professionals, whose role is to ensure maximum clinical effectiveness, whilst also protecting their patients from avoidable harm. Since humans are prone to error, and the environments in which they work are imperfect, it is not surprising that medication errors are common, occurring more often during the prescribing stage than during dispensing or administration. A commitment to excellence in prescribing includes a continued focus on lifelong learning (including interprofessional learning) in pharmacology and therapeutics. This should be accompanied by improvements in the clinical working environment of prescribers, and the encouragement of a strong safety culture (including reporting of adverse incidents as well as suspected adverse drug reactions whenever appropriate). Finally, members of the clinical team must be prepared to challenge each other, when necessary, to ensure that prescribing combines the highest likelihood of benefit with the lowest potential for harm. © 2012 The Author. British Journal of Clinical Pharmacology © 2012 The British Pharmacological Society.

  10. Somatic stem cells and the kinetics of mutagenesis and carcinogenesis

    PubMed Central

    Cairns, John

    2002-01-01

    There is now strong experimental evidence that epithelial stem cells arrange their sister chromatids at mitosis such that the same template DNA strands stay together through successive divisions; DNA labeled with tritiated thymidine in infancy is still present in the stem cells of adult mice even though these cells are incorporating (and later losing) bromodeoxyuridine [Potten, C. S., Owen, G., Booth, D. & Booth, C. (2002) J. Cell Sci. 115, 2381–2388]. But a cell that preserves “immortal strands” will avoid the accumulation of replication errors only if it inhibits those pathways for DNA repair that involve potentially error-prone resynthesis of damaged strands, and this appears to be a property of intestinal stem cells because they are extremely sensitive to the lethal effects of agents that damage DNA. It seems that the combination, in the stem cell, of immortal strands and the choice of death rather than error-prone repair makes epithelial stem cell systems resistant to short exposures to DNA-damaging agents, because the stem cell accumulates few if any errors, and any errors made by the daughters are destined to be discarded. This paper discusses these issues and shows that they lead to a model that explains the strange kinetics of mutagenesis and carcinogenesis in adult mammalian tissues. Coincidentally, the model also can explain why cancers arise even though the spontaneous mutation rate of differentiated mammalian cells is not high enough to generate the multiple mutations needed to form a cancer and why loss of nucleotide-excision repair does not significantly increase the frequency of the common internal cancers. PMID:12149477

  11. Cost effectiveness of a pharmacist-led information technology intervention for reducing rates of clinically important errors in medicines management in general practices (PINCER).

    PubMed

    Elliott, Rachel A; Putman, Koen D; Franklin, Matthew; Annemans, Lieven; Verhaeghe, Nick; Eden, Martin; Hayre, Jasdeep; Rodgers, Sarah; Sheikh, Aziz; Avery, Anthony J

    2014-06-01

    We recently showed that a pharmacist-led information technology-based intervention (PINCER) was significantly more effective in reducing medication errors in general practices than providing simple feedback on errors, with cost per error avoided at £79 (US$131). We aimed to estimate cost effectiveness of the PINCER intervention by combining effectiveness in error reduction and intervention costs with the effect of the individual errors on patient outcomes and healthcare costs, to estimate the effect on costs and QALYs. We developed Markov models for each of six medication errors targeted by PINCER. Clinical event probability, treatment pathway, resource use and costs were extracted from literature and costing tariffs. A composite probabilistic model combined patient-level error models with practice-level error rates and intervention costs from the trial. Cost per extra QALY and cost-effectiveness acceptability curves were generated from the perspective of NHS England, with a 5-year time horizon. The PINCER intervention generated £2,679 less cost and 0.81 more QALYs per practice [incremental cost-effectiveness ratio (ICER): -£3,037 per QALY] in the deterministic analysis. In the probabilistic analysis, PINCER generated 0.001 extra QALYs per practice compared with simple feedback, at £4.20 less per practice. Despite this extremely small set of differences in costs and outcomes, PINCER dominated simple feedback with a mean ICER of -£3,936 (standard error £2,970). At a ceiling 'willingness-to-pay' of £20,000/QALY, PINCER reaches 59% probability of being cost effective. PINCER produced marginal health gain at slightly reduced overall cost. Results are uncertain due to the poor quality of data to inform the effect of avoiding errors.
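    The headline figures above rest on the standard incremental cost-effectiveness ratio. A minimal sketch with hypothetical inputs (not the trial's actual model):

```python
# The incremental cost-effectiveness ratio (ICER) is the difference in cost
# divided by the difference in QALYs between an intervention and its
# comparator. A negative ICER arising from lower cost AND more QALYs means
# the intervention "dominates" the comparator.

def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    """ICER = (C_new - C_old) / (Q_new - Q_old)."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical practice-level figures: the intervention costs 2679 less and
# yields 0.81 more QALYs than the comparator (baseline levels are arbitrary).
value = icer(cost_new=10000 - 2679, cost_old=10000,
             qaly_new=5.81, qaly_old=5.00)
print(round(value))   # negative here: cheaper and more effective
```

    Note that a negative ICER is only meaningful together with the signs of its numerator and denominator, which is why dominance is stated separately in the abstract.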

  12. A Review of Models and Procedures for Synthetic Validation for Entry-Level Army Jobs

    DTIC Science & Technology

    1988-12-01

    demonstrated that training judges to avoid various types of rating errors is successful. Ivancevich (1979), for example, found that extensive discussion of...judges. Personnel Psychology, 39, 337-344. Ivancevich, J. M. (1979). Longitudinal study of the effects of rater training on psychometric error in

  13. Marketing Across Cultures: Learning from U.S. Corporate Blunders.

    ERIC Educational Resources Information Center

    Raffield, Barney T., III

    Errors in judgment made in international marketing as a result of American corporate provincialism or cultural ignorance are chronicled, and ways to avoid similar problems in the future are discussed. The marketing blunders, which have been both expensive and embarrassing, include errors in language use and mistaken assumptions about cultural…

  14. Eddy-covariance data with low signal-to-noise ratio: time-lag determination, uncertainties and limit of detection

    NASA Astrophysics Data System (ADS)

    Langford, B.; Acton, W.; Ammann, C.; Valach, A.; Nemitz, E.

    2015-10-01

    All eddy-covariance flux measurements are associated with random uncertainties which are a combination of sampling error due to natural variability in turbulence and sensor noise. The former is the principal error for systems where the signal-to-noise ratio of the analyser is high, as is usually the case when measuring fluxes of heat, CO2 or H2O. Where signal is limited, which is often the case for measurements of other trace gases and aerosols, instrument uncertainties dominate. Here, we apply a consistent approach based on auto- and cross-covariance functions to quantify the total random flux error and the random error due to instrument noise separately. As with previous approaches, the random error quantification assumes that the time lag between wind and concentration measurement is known. However, if combined with commonly used automated methods that identify the individual time lag by looking for the maximum in the cross-covariance function of the two entities, analyser noise additionally leads to a systematic bias in the fluxes. Combining data sets from several analysers and using simulations, we show that the method of time-lag determination becomes increasingly important as the magnitude of the instrument error approaches that of the sampling error. The flux bias can be particularly significant for disjunct data, whereas using a prescribed time lag eliminates these effects (provided the time lag does not fluctuate unduly over time). We also demonstrate that when sampling at higher elevations, where low frequency turbulence dominates and covariance peaks are broader, both the probability and magnitude of bias are magnified. We show that the statistical significance of noisy flux data can be increased (limit of detection can be decreased) by appropriate averaging of individual fluxes, but only if systematic biases are avoided by using a prescribed time lag. Finally, we make recommendations for the analysis and reporting of data with low signal-to-noise and their associated errors.
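    The lag-selection bias described above can be reproduced in a toy simulation (my sketch, not the authors' analysis): with two uncorrelated noise series the true flux is zero, yet picking the lag that maximizes the cross-covariance selects noise peaks and pulls the estimate away from zero.

```python
import random
import statistics

# Toy simulation of time-lag selection bias. Both series are pure,
# uncorrelated noise, so the true covariance ("flux") is zero. Maximizing
# the cross-covariance over a lag window for every averaging period biases
# the mean flux high; a prescribed (fixed) lag does not.

random.seed(1)

def cross_cov(w, c, lag):
    """Sample cross-covariance of series w and c at a given integer lag."""
    n = len(w) - abs(lag)
    pairs = ([(w[i], c[i + lag]) for i in range(n)] if lag >= 0
             else [(w[i - lag], c[i]) for i in range(n)])
    mw = statistics.fmean(p[0] for p in pairs)
    mc = statistics.fmean(p[1] for p in pairs)
    return statistics.fmean((a - mw) * (b - mc) for a, b in pairs)

max_picked, fixed = [], []
for _ in range(200):                                  # 200 averaging periods
    w = [random.gauss(0, 1) for _ in range(300)]      # "vertical wind"
    c = [random.gauss(0, 1) for _ in range(300)]      # uncorrelated "scalar"
    max_picked.append(max(cross_cov(w, c, k) for k in range(-5, 6)))
    fixed.append(cross_cov(w, c, 0))                  # prescribed time lag

# The lag-search mean is systematically positive; the fixed-lag mean is ~0.
print(round(statistics.fmean(max_picked), 3),
      round(statistics.fmean(fixed), 3))
```

    Widening the lag window or broadening the covariance peaks (as happens at higher measurement heights) increases the number of noise maxima available for selection, which is the mechanism behind the magnified bias the authors report.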

  15. Eddy-covariance data with low signal-to-noise ratio: time-lag determination, uncertainties and limit of detection

    NASA Astrophysics Data System (ADS)

    Langford, B.; Acton, W.; Ammann, C.; Valach, A.; Nemitz, E.

    2015-03-01

    All eddy-covariance flux measurements are associated with random uncertainties which are a combination of sampling error due to natural variability in turbulence and sensor noise. The former is the principal error for systems where the signal-to-noise ratio of the analyser is high, as is usually the case when measuring fluxes of heat, CO2 or H2O. Where signal is limited, which is often the case for measurements of other trace gases and aerosols, instrument uncertainties dominate. Here we apply a consistent approach based on auto- and cross-covariance functions to quantify the total random flux error and the random error due to instrument noise separately. As with previous approaches, the random error quantification assumes that the time-lag between wind and concentration measurement is known. However, if combined with commonly used automated methods that identify the individual time-lag by looking for the maximum in the cross-covariance function of the two entities, analyser noise additionally leads to a systematic bias in the fluxes. Combining datasets from several analysers and using simulations, we show that the method of time-lag determination becomes increasingly important as the magnitude of the instrument error approaches that of the sampling error. The flux bias can be particularly significant for disjunct data, whereas using a prescribed time-lag eliminates these effects (provided the time-lag does not fluctuate unduly over time). We also demonstrate that when sampling at higher elevations, where low frequency turbulence dominates and covariance peaks are broader, both the probability and magnitude of bias are magnified. We show that the statistical significance of noisy flux data can be increased (limit of detection can be decreased) by appropriate averaging of individual fluxes, but only if systematic biases are avoided by using a prescribed time-lag. Finally, we make recommendations for the analysis and reporting of data with low signal-to-noise and their associated errors.

  16. Geographically correlated errors observed from a laser-based short-arc technique

    NASA Astrophysics Data System (ADS)

    Bonnefond, P.; Exertier, P.; Barlier, F.

    1999-07-01

    The laser-based short-arc technique has been developed in order to avoid local errors which affect the dynamical orbit computation, such as those due to mismodeling in the geopotential. It is based on a geometric method and consists in fitting short arcs (about 4000 km), issued from a global orbit, with satellite laser ranging tracking measurements from a ground station network. Ninety-two TOPEX/Poseidon (T/P) cycles of laser-based short-arc orbits have then been compared to JGM-2 and JGM-3 T/P orbits computed by the Precise Orbit Determination (POD) teams (Service d'Orbitographie Doris/Centre National d'Etudes Spatiales and Goddard Space Flight Center/NASA) over two areas: (1) the Mediterranean area and (2) a part of the Pacific (including California and Hawaii) called hereafter the U.S. area. Geographically correlated orbit errors in these areas are clearly evidenced: for example, -2.6 cm and +0.7 cm for the Mediterranean and U.S. areas, respectively, relative to JGM-3 orbits. However, geographically correlated errors (GCE), which are commonly linked to errors in the gravity model, can also be due to systematic errors in the reference frame and/or to biases in the tracking measurements. Although the short-arc technique is very sensitive to such error sources, our analysis demonstrates that the induced geographical systematic effects are at the level of 1-2 cm on the radial orbit component. Results are also compared with those obtained with the GPS-based reduced dynamic technique. The time-dependent part of GCE has also been studied. Over 6 years of T/P data, coherent signals in the radial component of T/P Precise Orbit Ephemeris (POE) are clearly evidenced with a time period of about 6 months. In addition, the impact of time-varying error sources from the reference frame and the tracking data accuracy has been analyzed, showing a possible linear trend of about 0.5-1 mm/yr in the radial component of T/P POE.

  17. Terrestrial Water Storage in African Hydrological Regimes Derived from GRACE Mission Data: Intercomparison of Spherical Harmonics, Mass Concentration, and Scalar Slepian Methods.

    PubMed

    Rateb, Ashraf; Kuo, Chung-Yen; Imani, Moslem; Tseng, Kuo-Hsin; Lan, Wen-Hau; Ching, Kuo-En; Tseng, Tzu-Pang

    2017-03-10

    Spherical harmonics (SH) and mascon solutions are the two most common types of solutions for Gravity Recovery and Climate Experiment (GRACE) mass flux observations. However, SH signals are degraded by measurement and leakage errors. Mascon solutions (the Jet Propulsion Laboratory (JPL) release, herein) exhibit weakened signals at submascon resolutions. Both solutions require a scale factor examined by the CLM4.0 model to obtain the actual water storage signal. The Slepian localization method can avoid the SH leakage errors when applied to the basin scale. In this study, we estimate SH errors and scale factors for African hydrological regimes. Then, terrestrial water storage (TWS) in Africa is determined based on Slepian localization and compared with JPL-mascon and SH solutions. The three TWS estimates show good agreement for the TWS of large-sized and humid regimes but present discrepancies for the TWS of medium and small-sized regimes. Slepian localization is an effective method for deriving the TWS of arid zones. The TWS behavior in African regimes and its spatiotemporal variations are then examined. The negative TWS trends in the lower Nile and Sahara at -1.08 and -6.92 Gt/year, respectively, are higher than those previously reported.

  18. Terrestrial Water Storage in African Hydrological Regimes Derived from GRACE Mission Data: Intercomparison of Spherical Harmonics, Mass Concentration, and Scalar Slepian Methods

    PubMed Central

    Rateb, Ashraf; Kuo, Chung-Yen; Imani, Moslem; Tseng, Kuo-Hsin; Lan, Wen-Hau; Ching, Kuo-En; Tseng, Tzu-Pang

    2017-01-01

    Spherical harmonics (SH) and mascon solutions are the two most common types of solutions for Gravity Recovery and Climate Experiment (GRACE) mass flux observations. However, SH signals are degraded by measurement and leakage errors. Mascon solutions (the Jet Propulsion Laboratory (JPL) release, herein) exhibit weakened signals at submascon resolutions. Both solutions require a scale factor examined by the CLM4.0 model to obtain the actual water storage signal. The Slepian localization method can avoid the SH leakage errors when applied to the basin scale. In this study, we estimate SH errors and scale factors for African hydrological regimes. Then, terrestrial water storage (TWS) in Africa is determined based on Slepian localization and compared with JPL-mascon and SH solutions. The three TWS estimates show good agreement for the TWS of large-sized and humid regimes but present discrepancies for the TWS of medium and small-sized regimes. Slepian localization is an effective method for deriving the TWS of arid zones. The TWS behavior in African regimes and its spatiotemporal variations are then examined. The negative TWS trends in the lower Nile and Sahara at −1.08 and −6.92 Gt/year, respectively, are higher than those previously reported. PMID:28287453

  19. A minimalist approach to bias estimation for passive sensor measurements with targets of opportunity

    NASA Astrophysics Data System (ADS)

    Belfadel, Djedjiga; Osborne, Richard W.; Bar-Shalom, Yaakov

    2013-09-01

In order to carry out data fusion, registration error correction is crucial in multisensor systems. This requires estimation of the sensor measurement biases. It is important to correct for these bias errors so that the multiple sensor measurements and/or tracks can be referenced as accurately as possible to a common tracking coordinate system. This paper provides a solution for bias estimation for the minimum number of passive sensors (two), when only targets of opportunity are available. The sensor measurements are assumed time-coincident (synchronous) and perfectly associated. Since these sensors provide only line of sight (LOS) measurements, a single composite Cartesian measurement is formed by fusing the LOS measurements from the different sensors, avoiding the need for nonlinear filtering. We evaluate the Cramer-Rao Lower Bound (CRLB) on the covariance of the bias estimate, i.e., the quantification of the available information about the biases. Statistical tests on the results of simulations show that this method is statistically efficient, even for small sample sizes (as few as two sensors and six points on the trajectory of a single target of opportunity). We also show that the RMS position error is significantly improved with bias estimation compared with the target position estimation using the original biased measurements.
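The composite Cartesian measurement described above can be illustrated with a minimal, hypothetical 2-D bearings-only triangulation: two sensors each report an azimuth (LOS) to the target, and the rays are intersected to form a single Cartesian fix. This is a simplified sketch, not the paper's 3-D implementation; the function name and geometry are illustrative.

```python
import math

def triangulate(s1, s2, az1, az2):
    """Composite 2-D Cartesian position from two bearing-only (LOS)
    measurements: intersect the rays from sensors s1 and s2 along
    azimuths az1 and az2 (radians, measured from the +x axis)."""
    x1, y1 = s1
    x2, y2 = s2
    u1 = (math.cos(az1), math.sin(az1))
    u2 = (math.cos(az2), math.sin(az2))
    # Solve s1 + r1*u1 = s2 + r2*u2 for r1 via Cramer's rule.
    det = u1[0] * (-u2[1]) - (-u2[0]) * u1[1]
    dx, dy = x2 - x1, y2 - y1
    r1 = (dx * (-u2[1]) - (-u2[0]) * dy) / det
    return (x1 + r1 * u1[0], y1 + r1 * u1[1])

# Hypothetical setup: target at (3, 4), sensors on the x-axis.
target = (3.0, 4.0)
s1, s2 = (0.0, 0.0), (10.0, 0.0)
az1 = math.atan2(target[1] - s1[1], target[0] - s1[0])
az2 = math.atan2(target[1] - s2[1], target[0] - s2[0])
print(triangulate(s1, s2, az1, az2))  # recovers the target position
```

With noisy azimuths the same intersection yields a Cartesian measurement whose error covariance can be propagated linearly, which is what makes the subsequent (bias) estimation linear rather than requiring a nonlinear filter.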

  20. Report on Automated Semantic Analysis of Scientific and Engineering Codes

    NASA Technical Reports Server (NTRS)

Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet, for scientific software development the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even when hardware costs and execution time continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. However, this work has a distinguishing emphasis; the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code. Further, its comprehension is measured by quantitatively evaluating overall recognition with practical codes. This emphasis is necessary if software errors-like the MCO error-are to be quickly and inexpensively avoided in the future. This report evaluates the progress made with this problem. It presents recommendations, describes the approach, the tool's status, the challenges, related research, and a development strategy.

  1. Design guidelines for avoiding thermo-acoustic oscillations in helium piping systems

    DOE PAGES

    Gupta, Prabhat Kumar; Rabehl, Roger

    2015-04-02

Thermo-acoustic oscillations are a commonly observed phenomenon in helium cryogenic systems, especially in tubes connecting hot and cold areas. The open ends of these tubes are at the low-temperature side (typically 4.5 K), and the closed ends are at the high-temperature side (300 K). Cryogenic instrumentation installations provide ideal conditions for these oscillations to occur due to the steep temperature gradient along the tubing. These oscillations create errors in measurements as well as an undesirable heat load to the system. The work presented here develops engineering guidelines to design oscillation-free helium piping. This work also studies the effect of different piping inserts and shows how the proper geometrical combinations have to be chosen to avoid thermo-acoustic oscillations. The effect of an 80 K intercept is also studied, showing that thermo-acoustic oscillations can be dampened by placing the intercept at an appropriate location. Finally, helium piping designs based on the present work are verified against experimental results available in the open literature.

  2. In Your Face: Risk of Punishment Enhances Cognitive Control and Error-Related Activity in the Corrugator Supercilii Muscle.

    PubMed

    Lindström, Björn R; Mattsson-Mårn, Isak Berglund; Golkar, Armita; Olsson, Andreas

    2013-01-01

    Cognitive control is needed when mistakes have consequences, especially when such consequences are potentially harmful. However, little is known about how the aversive consequences of deficient control affect behavior. To address this issue, participants performed a two-choice response time task where error commissions were expected to be punished by electric shocks during certain blocks. By manipulating (1) the perceived punishment risk (no, low, high) associated with error commissions, and (2) response conflict (low, high), we showed that motivation to avoid punishment enhanced performance during high response conflict. As a novel index of the processes enabling successful cognitive control under threat, we explored electromyographic activity in the corrugator supercilii (cEMG) muscle of the upper face. The corrugator supercilii is partially controlled by the anterior midcingulate cortex (aMCC) which is sensitive to negative affect, pain and cognitive control. As hypothesized, the cEMG exhibited several key similarities with the core temporal and functional characteristics of the Error-Related Negativity (ERN) ERP component, the hallmark index of cognitive control elicited by performance errors, and which has been linked to the aMCC. The cEMG was amplified within 100 ms of error commissions (the same time-window as the ERN), particularly during the high punishment risk condition where errors would be most aversive. Furthermore, similar to the ERN, the magnitude of error cEMG predicted post-error response time slowing. Our results suggest that cEMG activity can serve as an index of avoidance motivated control, which is instrumental to adaptive cognitive control when consequences are potentially harmful.

  3. In Your Face: Risk of Punishment Enhances Cognitive Control and Error-Related Activity in the Corrugator Supercilii Muscle

    PubMed Central

    Lindström, Björn R.; Mattsson-Mårn, Isak Berglund; Golkar, Armita; Olsson, Andreas

    2013-01-01

    Cognitive control is needed when mistakes have consequences, especially when such consequences are potentially harmful. However, little is known about how the aversive consequences of deficient control affect behavior. To address this issue, participants performed a two-choice response time task where error commissions were expected to be punished by electric shocks during certain blocks. By manipulating (1) the perceived punishment risk (no, low, high) associated with error commissions, and (2) response conflict (low, high), we showed that motivation to avoid punishment enhanced performance during high response conflict. As a novel index of the processes enabling successful cognitive control under threat, we explored electromyographic activity in the corrugator supercilii (cEMG) muscle of the upper face. The corrugator supercilii is partially controlled by the anterior midcingulate cortex (aMCC) which is sensitive to negative affect, pain and cognitive control. As hypothesized, the cEMG exhibited several key similarities with the core temporal and functional characteristics of the Error-Related Negativity (ERN) ERP component, the hallmark index of cognitive control elicited by performance errors, and which has been linked to the aMCC. The cEMG was amplified within 100 ms of error commissions (the same time-window as the ERN), particularly during the high punishment risk condition where errors would be most aversive. Furthermore, similar to the ERN, the magnitude of error cEMG predicted post-error response time slowing. Our results suggest that cEMG activity can serve as an index of avoidance motivated control, which is instrumental to adaptive cognitive control when consequences are potentially harmful. PMID:23840356

  4. A pharmacist-led information technology intervention for medication errors (PINCER): a multicentre, cluster randomised, controlled trial and cost-effectiveness analysis

    PubMed Central

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Cresswell, Kathrin; Eden, Martin; Elliott, Rachel A; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Prescott, Robin J; Swanwick, Glen; Franklin, Matthew; Putman, Koen; Boyd, Matthew; Sheikh, Aziz

    2012-01-01

    Summary Background Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods In this pragmatic, cluster randomised trial general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings 72 general practices with a combined list size of 480 942 patients were randomised. 
At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0·58, 95% CI 0·38–0·89); a β blocker if they had asthma (0·73, 0·58–0·91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0·51, 0·34–0·78). PINCER has a 95% probability of being cost effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding Patient Safety Research Portfolio, Department of Health, England. PMID:22357106

  5. A Flexible Latent Class Approach to Estimating Test-Score Reliability

    ERIC Educational Resources Information Center

    van der Palm, Daniël W.; van der Ark, L. Andries; Sijtsma, Klaas

    2014-01-01

    The latent class reliability coefficient (LCRC) is improved by using the divisive latent class model instead of the unrestricted latent class model. This results in the divisive latent class reliability coefficient (DLCRC), which unlike LCRC avoids making subjective decisions about the best solution and thus avoids judgment error. A computational…

  6. Communication Vagueness in the Literature Review Section of Journal Article Submissions

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.

    2018-01-01

    Evidence has been provided about the importance of avoiding American Psychological Association (APA) errors in the abstract, body, reference list, and table sections of empirical research articles. Specifically, authors are significantly more likely to have their manuscripts rejected for publication if they fail to avoid APA violations--and, thus,…

  7. Limitations of the paraxial Debye approximation.

    PubMed

    Sheppard, Colin J R

    2013-04-01

    In the paraxial form of the Debye integral for focusing, higher order defocus terms are ignored, which can result in errors in dealing with aberrations, even for low numerical aperture. These errors can be avoided by using a different integration variable. The aberrations of a glass slab, such as a coverslip, are expanded in terms of the new variable, and expressed in terms of Zernike polynomials to assist with aberration balancing. Tube length error is also discussed.

  8. Recursive Construction of Noiseless Subsystem for Qudits

    NASA Astrophysics Data System (ADS)

    Güngördü, Utkan; Li, Chi-Kwong; Nakahara, Mikio; Poon, Yiu-Tung; Sze, Nung-Sing

    2014-03-01

When the environmental noise acting on the system has certain symmetries, a subsystem of the total system can avoid errors. Encoding information into such a subsystem is advantageous since it does not require any error syndrome measurements, which may introduce further errors to the system. However, utilizing such a subsystem becomes impractical for large systems as the number of qudits increases. A recursive scheme offers a solution to this problem. Here, we review the recursive construction introduced in earlier work, which can asymptotically protect 1/d of the qudits in the system against collective errors.

  9. Locked-mode avoidance and recovery without external momentum input

    NASA Astrophysics Data System (ADS)

    Delgado-Aparicio, L.; Gates, D. A.; Wolfe, S.; Rice, J. E.; Gao, C.; Wukitch, S.; Greenwald, M.; Hughes, J.; Marmar, E.; Scott, S.

    2014-10-01

Error-field-induced locked-modes (LMs) have been studied in C-Mod at ITER toroidal fields without NBI fueling and momentum input. The use of ICRH heating in sync with the error-field ramp-up resulted in a successful delay of the mode onset when PICRH > 1 MW and a transition into H-mode when PICRH > 2 MW. The recovery experiments consisted of applying ICRH power during the LM non-rotating phase, successfully unlocking the core plasma. The "induced" toroidal rotation was in the counter-current direction, restoring the direction and magnitude of the toroidal flow before the LM formation, but contrary to the expected Rice-scaling in the co-current direction. However, the LM occurs near the LOC/SOC transition where rotation reversals are commonly observed. Once PICRH is turned off, the core plasma "locks" at later times depending on the evolution of ne and Vt. This work was performed under US DoE contracts including DE-FC02-99ER54512 and others at MIT and DE-AC02-09CH11466 at PPPL.

  10. Variations in the axis of motion during head repositioning--a comparison of subjects with whiplash-associated disorders or non-specific neck pain and healthy controls.

    PubMed

    Grip, Helena; Sundelin, Gunnevi; Gerdle, Björn; Karlsson, J Stefan

    2007-10-01

The ability to reproduce head position can be affected in patients after a neck injury. The repositioning error is commonly used as a measure of proprioception, but variations in the movement might provide additional information. The axis of motion and target performance were analyzed during a head repositioning task (flexion, extension and side rotations) for 24 control subjects, 22 subjects with whiplash-associated disorders and 21 with non-specific neck pain. Questionnaires regarding pain intensity and fear avoidance were collected. Head position and axis of motion parameters were calculated using a helical axis model with a moving window of 4°. During flexion the whiplash group had a larger constant repositioning error than the control group (-1.8(2.9)° vs. 0.1(2.4)°, P=0.04). The axis was more inferior in both neck pain groups (12.0(1.6) cm vs. 14.5(2.0) cm, P<0.05), indicating movement at a lower level in the spine. Including pain intensity from the shoulder and neck region as covariates showed an effect on the axis position (P=0.03 and 0.04). During axial rotation to the left there was more variation in axis direction for the neck pain groups as compared with controls (4.0(1.7)° and 3.7(2.4)° vs. 2.3(1.9)°, P=0.01 and 0.05). No significant difference in fear avoidance was found between the two neck pain groups. Measuring variation in the axis of motion together with target performance gives objective measures of proprioceptive ability that are difficult to quantify by visual inspection. Repositioning errors were in general small, suggesting the repositioning error is not sufficient as a single measurement variable in a clinical situation, but should be measured in combination with other tests, such as range of motion.

  11. Phantom feet on digital radionuclide images and other scary computer tales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freitas, J.E.; Dworkin, H.J.; Dees, S.M.

    1989-09-01

    Malfunction of a computer-assisted digital gamma camera is reported. Despite what appeared to be adequate acceptance testing, an error in the system gave rise to switching of images and identification text. A suggestion is made for using a hot marker, which would avoid the potential error of misinterpretation of patient images.

  12. Progressive Care Nurses Improving Patient Safety by Limiting Interruptions During Medication Administration.

    PubMed

    Flynn, Fran; Evanish, Julie Q; Fernald, Josephine M; Hutchinson, Dawn E; Lefaiver, Cheryl

    2016-08-01

    Because of the high frequency of interruptions during medication administration, the effectiveness of strategies to limit interruptions during medication administration has been evaluated in numerous quality improvement initiatives in an effort to reduce medication administration errors. To evaluate the effectiveness of evidence-based strategies to limit interruptions during scheduled, peak medication administration times in 3 progressive cardiac care units (PCCUs). A secondary aim of the project was to evaluate the impact of limiting interruptions on medication errors. The percentages of interruptions and medication errors before and after implementation of evidence-based strategies to limit interruptions were measured by using direct observations of nurses on 2 PCCUs. Nurses in a third PCCU served as a comparison group. Interruptions (P < .001) and medication errors (P = .02) decreased significantly in 1 PCCU after implementation of evidence-based strategies to limit interruptions. Avoidable interruptions decreased 83% in PCCU1 and 53% in PCCU2 after implementation of the evidence-based strategies. Implementation of evidence-based strategies to limit interruptions in PCCUs decreases avoidable interruptions and promotes patient safety. ©2016 American Association of Critical-Care Nurses.

  13. Diagnostic decision-making and strategies to improve diagnosis.

    PubMed

    Thammasitboon, Satid; Cutrer, William B

    2013-10-01

A significant portion of diagnostic errors arises through cognitive errors resulting from inadequate knowledge, faulty data gathering, and/or faulty verification. Experts estimate that 75% of diagnostic failures can be attributed to failures in clinician diagnostic thinking. The cognitive processes that underlie the diagnostic thinking of clinicians are complex and intriguing, and it is imperative that clinicians explicitly appreciate and apply different cognitive approaches to improve their decisions. A dual-process model that unifies many theories of decision-making has emerged as a promising template for understanding how clinicians think and judge efficiently in a diagnostic reasoning process. The identification and implementation of strategies for decreasing or preventing such diagnostic errors has become a growing area of interest and research. Suggested strategies to decrease diagnostic error incidence include increasing clinicians' expertise and avoiding inherent cognitive errors. Implementing interventions focused solely on avoiding errors may work effectively for patient safety issues such as medication errors. Addressing cognitive errors, however, requires equal effort on expanding the individual clinician's expertise. Providing cognitive support to clinicians for robust diagnostic decision-making serves as the final strategic target for decreasing diagnostic errors. Clinical guidelines and algorithms offer another method for streamlining decision-making and decreasing the likelihood of cognitive diagnostic errors. Addressing cognitive processing errors is undeniably the most challenging task in reducing diagnostic errors. While many suggested approaches exist, they are mostly based on theories and sciences in cognitive psychology, decision-making, and education. The proposed interventions are primarily suggestions and very few of them have been tested in actual practice settings. 
Collaborative research effort is required to effectively address cognitive processing errors. Researchers in various areas, including patient safety/quality improvement, decision-making, and problem solving, must work together to make medical diagnosis more reliable. © 2013 Mosby, Inc. All rights reserved.

  14. Analysis of the impact of error detection on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. C.; Lee, Y. H.

    1983-01-01

    Conventionally, reliability analyses either assume that a fault/error is detected immediately following its occurrence, or neglect damages caused by latent errors. Though unrealistic, this assumption was imposed in order to avoid the difficulty of determining the respective probabilities that a fault induces an error and the error is then detected in a random amount of time after its occurrence. As a remedy for this problem a model is proposed to analyze the impact of error detection on computer performance under moderate assumptions. Error latency, the time interval between occurrence and the moment of detection, is used to measure the effectiveness of a detection mechanism. This model is used to: (1) predict the probability of producing an unreliable result, and (2) estimate the loss of computation due to fault and/or error.
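The error-latency idea in the model above can be sketched with a toy Monte-Carlo estimate of the probability of producing an unreliable result: a fault occurs at a random time during a task, induces an error with some probability, and the error is detected only after a random latency. This is a hypothetical illustration with arbitrary parameter values, not the paper's analytical model.

```python
import random

def p_unreliable(p_error=0.3, mean_latency=5.0, task_time=10.0,
                 trials=100_000, seed=1):
    """Toy Monte-Carlo sketch: a fault occurs at a uniform random time
    during the task; with probability p_error it induces an error, which
    is detected after an exponentially distributed latency. The run is
    'unreliable' if the error is still latent when the task completes."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        if rng.random() >= p_error:
            continue                        # fault induced no error
        t_fault = rng.uniform(0.0, task_time)
        latency = rng.expovariate(1.0 / mean_latency)
        if t_fault + latency > task_time:   # detection misses the deadline
            bad += 1
    return bad / trials

print(p_unreliable())
```

Shrinking the mean latency (a better detection mechanism) drives the estimate toward zero, which is the qualitative relationship the model formalizes.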

  15. Spatiotemporal Filtering Using Principal Component Analysis and Karhunen-Loeve Expansion Approaches for Regional GPS Network Analysis

    NASA Technical Reports Server (NTRS)

    Dong, D.; Fang, P.; Bock, F.; Webb, F.; Prawirondirdjo, L.; Kedar, S.; Jamason, P.

    2006-01-01

Spatial filtering is an effective way to improve the precision of coordinate time series for regional GPS networks by reducing so-called common mode errors, thereby providing better resolution for detecting weak or transient deformation signals. The commonly used approach to regional filtering assumes that the common mode error is spatially uniform, which is a good approximation for networks of hundreds of kilometers extent, but breaks down as the spatial extent increases. A more rigorous approach should remove the assumption of spatially uniform distribution and let the data themselves reveal the spatial distribution of the common mode error. The principal component analysis (PCA) and the Karhunen-Loeve expansion (KLE) both decompose network time series into a set of temporally varying modes and their spatial responses. Therefore they provide a mathematical framework to perform spatiotemporal filtering. We apply the combination of PCA and KLE to daily station coordinate time series of the Southern California Integrated GPS Network (SCIGN) for the period 2000 to 2004. We demonstrate that spatially and temporally correlated common mode errors are the dominant error source in daily GPS solutions. The spatial characteristics of the common mode errors are close to uniform for all east, north, and vertical components, which implies a very long wavelength source for the common mode errors, compared to the spatial extent of the GPS network in southern California. Furthermore, the common mode errors exhibit temporally nonrandom patterns.
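The PCA step described above can be sketched on synthetic data: stack the network residual time series into a days-by-stations matrix, take the first principal component as the dominant common mode, and subtract its rank-1 reconstruction. This is a minimal illustration, not the SCIGN processing chain; the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_sta = 200, 12
# Simulated network: a shared random-walk signal (the "common mode")
# plus independent white noise at each station, in arbitrary units.
common = np.cumsum(rng.normal(0.0, 0.5, n_days))
X = common[:, None] + rng.normal(0.0, 1.0, (n_days, n_sta))

Xc = X - X.mean(axis=0)                  # remove per-station means
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
cme = np.outer(U[:, 0] * s[0], Vt[0])    # rank-1 reconstruction = first mode
filtered = Xc - cme                      # spatiotemporally filtered series

print(float(np.std(Xc)), float(np.std(filtered)))
```

The first right-singular vector `Vt[0]` is the spatial response of the mode; when it is close to uniform across stations, the decomposition reduces to the conventional "stacking" filter, which is exactly the regime the abstract reports for southern California.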

  16. Probabilistic segmentation and intensity estimation for microarray images.

    PubMed

    Gottardo, Raphael; Besag, Julian; Stephens, Matthew; Murua, Alejandro

    2006-01-01

    We describe a probabilistic approach to simultaneous image segmentation and intensity estimation for complementary DNA microarray experiments. The approach overcomes several limitations of existing methods. In particular, it (a) uses a flexible Markov random field approach to segmentation that allows for a wider range of spot shapes than existing methods, including relatively common 'doughnut-shaped' spots; (b) models the image directly as background plus hybridization intensity, and estimates the two quantities simultaneously, avoiding the common logical error that estimates of foreground may be less than those of the corresponding background if the two are estimated separately; and (c) uses a probabilistic modeling approach to simultaneously perform segmentation and intensity estimation, and to compute spot quality measures. We describe two approaches to parameter estimation: a fast algorithm, based on the expectation-maximization and the iterated conditional modes algorithms, and a fully Bayesian framework. These approaches produce comparable results, and both appear to offer some advantages over other methods. We use an HIV experiment to compare our approach to two commercial software products: Spot and Arrayvision.

  17. An evidence-based approach to the evaluation and treatment of low back pain in the emergency department.

    PubMed

    Borczuk, Pierre

    2013-07-01

Low back pain is the most common musculoskeletal complaint that results in a visit to the emergency department, and it is 1 of the top 5 most common complaints in emergency medicine. Estimates of healthcare expenditures for low back pain in the United States exceed $90 billion annually, not even taking lost productivity and business costs into account. This review explores an evidence-based rationale for the evaluation of the patient with low back pain, and it provides guidance on risk stratification pertaining to laboratory assessment and radiologic imaging in the emergency department. Published guidelines from the American College of Physicians and American Pain Society are reviewed, with emphasis on best evidence for pharmacologic treatments, self-care interventions, and more invasive procedures and surgery in management of low back pain. Utilizing effective and proven strategies will avoid medical errors, provide better care for patients, and help manage healthcare resources and costs.

  18. Analytical Assessment of Simultaneous Parallel Approach Feasibility from Total System Error

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2014-01-01

    In a simultaneous paired approach to closely-spaced parallel runways, a pair of aircraft flies in close proximity on parallel approach paths. The aircraft pair must maintain a longitudinal separation within a range that avoids wake encounters and, if one of the aircraft blunders, avoids collision. Wake avoidance defines the rear gate of the longitudinal separation. The lead aircraft generates a wake vortex that, with the aid of crosswinds, can travel laterally onto the path of the trail aircraft. As runway separation decreases, the wake has less distance to traverse to reach the path of the trail aircraft. The total system error of each aircraft further reduces this distance. The total system error is often modeled as a probability distribution function. Therefore, Monte-Carlo simulations are a favored tool for assessing a "safe" rear-gate. However, safety for paired approaches typically requires that a catastrophic wake encounter be a rare one-in-a-billion event during normal operation. Using a Monte-Carlo simulation to assert this event rarity with confidence requires a massive number of runs. Such large runs do not lend themselves to rapid turn-around during the early stages of investigation when the goal is to eliminate the infeasible regions of the solution space and to perform trades among the independent variables in the operational concept. One can employ statistical analysis using simplified models more efficiently to narrow the solution space and identify promising trades for more in-depth investigation using Monte-Carlo simulations. These simple, analytical models not only have to address the uncertainty of the total system error but also the uncertainty in navigation sources used to alert an abort of the procedure. 
This paper presents a method for integrating total system error, procedure abort rates, avionics failures, and surveillance errors into a statistical analysis that identifies the likely feasible runway separations for simultaneous paired approaches.

  19. On the unity of children’s phonological error patterns: Distinguishing symptoms from the problem

    PubMed Central

    Dinnsen, Daniel A.

    2012-01-01

    This article compares the claims of rule- and constraint-based accounts of three seemingly distinct error patterns, namely, Deaffrication, Consonant Harmony and Assibilation, in the sound system of a child with a phonological delay. It is argued that these error patterns are not separate problems, but rather are symptoms of a larger conspiracy to avoid word-initial coronal stops. The clinical implications of these findings are also considered. PMID:21787147

  20. A comparison of smartphone and paper data-collection tools in the Burden of Obstructive Lung Disease (BOLD) study in Gezira state, Sudan

    PubMed Central

    Ahmed, Rana; Robinson, Ryan; Elsony, Asma; Thomson, Rachael; Squire, S. Bertel; Malmborg, Rasmus; Burney, Peter

    2018-01-01

Introduction Data collection using paper-based questionnaires can be time consuming, and return errors affect data accuracy, completeness, and information quality in health surveys. We compared smartphone and paper-based data collection systems in the Burden of Obstructive Lung Disease (BOLD) study in rural Sudan. Methods This exploratory pilot study was designed to run in parallel with the cross-sectional household survey. The Open Data Kit was used to programme questionnaires in Arabic into smartphones. We included 100 study participants (83% women; median age = 41.5 ± 16.4 years) from the BOLD study from 3 rural villages in East-Gezira and Kamleen localities of Gezira state, Sudan. Questionnaire data were collected using smartphone and paper-based technologies simultaneously. We used Kappa statistics and the inter-rater class coefficient to test agreement between the two methods. Results Symptoms reported included cough (24%), phlegm (15%), wheezing (17%), and shortness of breath (18%). One in five were or had been cigarette smokers. The two data collection methods varied from perfect to slight agreement across the 204 variables evaluated (Kappa varied between 1.00 and 0.02, and the inter-rater coefficient between 1.00 and -0.12). Errors were more common on paper-administered questionnaires (83% of errors seen) than on smartphone-administered questionnaires (17%), with questions involving complex skip patterns being a major source of errors on paper. Automated checks and validations in smartphone-administered questionnaires avoided skip-pattern related errors. Incomplete and inconsistent records were more likely seen on paper questionnaires. Conclusion Compared to paper-based data collection, smartphone technology worked well for data collection in the study, which was conducted in a challenging rural environment in Sudan. This approach provided timely, quality data with fewer errors and inconsistencies compared to paper-based data collection. 
We recommend this method for future BOLD studies and other population-based studies in similar settings. PMID:29518132
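    Agreement between paired categorical responses of the kind compared in this record is commonly quantified with Cohen's kappa. A minimal pure-Python sketch (the yes/no responses below are made up for illustration, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two paired ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each method's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_expected = sum(freq_a[c] / n * freq_b[c] / n for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

# Same yes/no question captured on paper vs smartphone for 10 hypothetical subjects:
paper = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
phone = [1, 1, 0, 0, 0, 0, 1, 1, 0, 1]
print(cohens_kappa(paper, phone))  # ≈ 0.6, conventionally "moderate" agreement
```

    Kappa of 1.00 corresponds to perfect agreement; values near 0 (as seen for some of the 204 variables) indicate agreement no better than chance.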

  1. The Crucial Role of Error Correlation for Uncertainty Modeling of CFD-Based Aerodynamics Increments

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Walker, Eric L.

    2011-01-01

    The Ares I ascent aerodynamics database for Design Cycle 3 (DAC-3) was built from wind-tunnel test results and CFD solutions. The wind tunnel results were used to build the baseline response surfaces for wind-tunnel Reynolds numbers at power-off conditions. The CFD solutions were used to build increments to account for Reynolds number effects. We calculate the validation errors for the primary CFD code results at wind tunnel Reynolds number power-off conditions and would like to be able to use those errors to predict the validation errors for the CFD increments. However, the validation errors are large compared to the increments. We suggest a way forward consistent with common practice in wind-tunnel testing: assume that systematic errors in the measurement process and/or the environment will subtract out when increments are calculated, making increments more reliable, with smaller uncertainty, than absolute values of the aerodynamic coefficients. A similar practice has arisen for the use of CFD to generate aerodynamic database increments. The basis of this practice is the assumption of strong correlation of the systematic errors inherent in each of the results used to generate an increment. The assumption of strong correlation is the inferential link between the observed validation uncertainties at wind-tunnel Reynolds numbers and the uncertainties to be predicted for flight. In this paper, we suggest a way to estimate the correlation coefficient and demonstrate the approach using code-to-code differences that were obtained for quality control purposes during the Ares I CFD campaign. 
Finally, since we can expect the increments to be relatively small compared to the baseline response surface and to be typically of the order of the baseline uncertainty, we find that it is necessary to be able to show that the correlation coefficients are close to unity to avoid overinflating the overall database uncertainty with the addition of the increments.
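    The error-cancellation argument can be made concrete: if two results carry systematic errors with standard deviations σ1 and σ2 and correlation ρ, the increment's standard deviation is σΔ = sqrt(σ1² + σ2² − 2ρσ1σ2), which vanishes as ρ approaches unity. A small sketch (the 3% figures are illustrative, not Ares I values):

```python
import math

def increment_sigma(s1, s2, rho):
    """Std. dev. of a difference of two results whose errors have correlation rho."""
    # max(0, ...) guards against tiny negative values from floating-point rounding.
    return math.sqrt(max(0.0, s1 * s1 + s2 * s2 - 2.0 * rho * s1 * s2))

# With equal 3% errors, stronger correlation shrinks the increment uncertainty:
for rho in (0.0, 0.5, 0.9, 1.0):
    print(rho, increment_sigma(0.03, 0.03, rho))
```

    In the paper's setting, ρ would be estimated from quantities such as the code-to-code differences gathered during the CFD campaign; only with ρ close to unity does adding increments avoid inflating the overall database uncertainty.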

  2. [Refractive errors in patients with cerebral palsy].

    PubMed

    Mrugacz, Małgorzata; Bandzul, Krzysztof; Kułak, Wojciech; Poppe, Ewa; Jurowski, Piotr

    2013-04-01

    Ocular changes are common in patients with cerebral palsy (CP), occurring in about 50% of cases. The most common are refractive errors and strabismus. The aim of the paper was to assess the relationship between refractive errors and neurological pathologies in patients with selected types of CP. Material and methods. The analysis covered refractive errors in patients within two groups of CP, diplegia spastica and tetraparesis, with nervous system pathologies taken into account. Results. The study demonstrated correlations between refractive errors and both the type of CP and the severity of CP as classified on the GMFCS scale. Refractive errors were more common in patients with tetraparesis than with diplegia spastica. In the diplegia spastica group, myopia and astigmatism were more common, whereas in tetraparesis hyperopia predominated.

  3. Classification based upon gene expression data: bias and precision of error rates.

    PubMed

    Wood, Ian A; Visscher, Peter M; Mengersen, Kerrie L

    2007-06-01

    Gene expression data offer a large number of potentially useful predictors for the classification of tissue samples into classes, such as diseased and non-diseased. The predictive error rate of classifiers can be estimated using methods such as cross-validation. We have investigated issues of interpretation and potential bias in the reporting of error rate estimates. The issues considered here are optimization and selection biases, sampling effects, measures of misclassification rate, baseline error rates, two-level external cross-validation and a novel proposal for detection of bias using the permutation mean. Reporting an optimal estimated error rate incurs an optimization bias. Downward bias of 3-5% was found in an existing study of classification based on gene expression data and may be endemic in similar studies. Using a simulated non-informative dataset and two example datasets from existing studies, we show how bias can be detected through the use of label permutations and avoided using two-level external cross-validation. Some studies avoid optimization bias by using single-level cross-validation and a test set, but error rates can be more accurately estimated via two-level cross-validation. In addition to estimating the simple overall error rate, we recommend reporting class error rates plus where possible the conditional risk incorporating prior class probabilities and a misclassification cost matrix. We also describe baseline error rates derived from three trivial classifiers which ignore the predictors. R code which implements two-level external cross-validation with the PAMR package, experiment code, dataset details and additional figures are freely available for non-commercial use from http://www.maths.qut.edu.au/profiles/wood/permr.jsp
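    The optimization bias described above is easy to reproduce: on non-informative data, the best of many noise-level classifiers looks better than chance, while the average over candidates (akin to the permutation mean) stays near the 50% baseline. A self-contained toy sketch (not the paper's R code):

```python
import random

random.seed(0)
n_samples, n_candidates = 100, 50
labels = [random.randint(0, 1) for _ in range(n_samples)]

# Each "candidate classifier" is pure noise: its predictions ignore the data.
error_rates = []
for _ in range(n_candidates):
    preds = [random.randint(0, 1) for _ in range(n_samples)]
    error_rates.append(sum(p != y for p, y in zip(preds, labels)) / n_samples)

optimistic = min(error_rates)               # reporting the best incurs optimization bias
baseline = sum(error_rates) / n_candidates  # averages out near 0.5, like a permutation mean
print(optimistic, baseline)
```

    Two-level (nested) cross-validation removes this bias by keeping the data used to select the model separate from the data used to estimate its error.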

  4. WE-G-BRA-04: Common Errors and Deficiencies in Radiation Oncology Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kry, S; Dromgoole, L; Alvarez, P

    Purpose: Dosimetric errors in radiotherapy dose delivery lead to suboptimal treatments and outcomes. This work reviews the frequency and severity of dosimetric and programmatic errors identified by on-site audits performed by the IROC Houston QA center. Methods: IROC Houston on-site audits evaluate absolute beam calibration, relative dosimetry data compared to the treatment planning system data, and processes such as machine QA. Audits conducted from 2000-present were abstracted for recommendations, including type of recommendation and magnitude of error when applicable. Dosimetric recommendations corresponded to absolute dose errors >3% and relative dosimetry errors >2%. On-site audits of 1020 accelerators at 409 institutions were reviewed. Results: A total of 1280 recommendations were made (average 3.1/institution). The most common recommendation was for inadequate QA procedures per TG-40 and/or TG-142 (82% of institutions) with the most commonly noted deficiency being x-ray and electron off-axis constancy versus gantry angle. Dosimetrically, the most common errors in relative dosimetry were in small-field output factors (59% of institutions), wedge factors (33% of institutions), off-axis factors (21% of institutions), and photon PDD (18% of institutions). Errors in calibration were also problematic: 20% of institutions had an error in electron beam calibration, 8% had an error in photon beam calibration, and 7% had an error in brachytherapy source calibration. Almost all types of data reviewed included errors up to 7% although 20 institutions had errors in excess of 10%, and 5 had errors in excess of 20%. The frequency of electron calibration errors decreased significantly with time, but all other errors show non-significant changes. Conclusion: There are many common and often serious errors made during the establishment and maintenance of a radiotherapy program that can be identified through independent peer review. 
Physicists should be cautious, particularly in areas highlighted herein that show a tendency for errors.

  5. Learning processes underlying avoidance of negative outcomes.

    PubMed

    Andreatta, Marta; Michelmann, Sebastian; Pauli, Paul; Hewig, Johannes

    2017-04-01

    Successful avoidance of a threatening event may negatively reinforce the behavior due to activation of brain structures involved in reward processing. Here, we further investigated the learning-related properties of avoidance using feedback-related negativity (FRN). The FRN is modulated by violations of an intended outcome (prediction error, PE), that is, the bigger the difference between intended and actual outcome, the larger the FRN amplitude is. Twenty-eight participants underwent an operant conditioning paradigm, in which a behavior (button press) allowed them to avoid a painful electric shock. During two learning blocks, participants could avoid an electric shock in 80% of the trials by pressing one button (avoidance button), or by not pressing another button (punishment button). After learning, participants underwent two test blocks, which were identical to the learning ones except that no shocks were delivered. Participants pressed the avoidance button more often than the punishment button. Importantly, response frequency increased throughout the learning blocks but it did not decrease during the test blocks, indicating impaired extinction and/or habit formation. In line with a PE account, FRN amplitude to negative feedback after correct responses (i.e., unexpected punishment) was significantly larger than to positive feedback (i.e., expected omission of punishment), and it increased throughout the blocks. Highly anxious individuals showed equal FRN amplitudes to negative and positive feedback, suggesting impaired discrimination. These results confirm the role of negative reinforcement in motivating behavior and learning, and reveal important differences between high and low anxious individuals in the processing of prediction errors. © 2017 Society for Psychophysiological Research.

  6. Computations of Aerodynamic Performance Databases Using Output-Based Refinement

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2009-01-01

    Objectives: handle complex geometry problems; control discretization errors via solution-adaptive mesh refinement; and focus on aerodynamic databases for parametric and optimization studies, which demand: 1. Accuracy: satisfy prescribed error bounds; 2. Robustness and speed: may require over 10^5 mesh generations; 3. Automation: avoid user supervision, obtain "expert meshes" independent of user skill, and run every case adaptively in production settings.

  7. [Medication errors in a neonatal unit: One of the main adverse events].

    PubMed

    Esqué Ruiz, M T; Moretones Suñol, M G; Rodríguez Miguélez, J M; Sánchez Ortiz, E; Izco Urroz, M; de Lamo Camino, M; Figueras Aloy, J

    2016-04-01

    Neonatal units are among the hospital areas most exposed to treatment errors. A medication error (ME) is defined as an avoidable incident secondary to drug misuse that causes or may cause harm to the patient. The aim of this paper is to present the incidence of MEs (including feeding errors) reported in our neonatal unit, their characteristics and possible causal factors. A list of the strategies implemented for prevention is also presented. An analysis was performed on the MEs declared in a neonatal unit. A total of 511 MEs were reported over a period of seven years. The incidence in the critical care unit was 32.2 per 1000 hospital days, or 20 per 100 patients, of which 0.22 per 1000 days had serious repercussions. Of the MEs reported, 39.5% were prescribing errors, 68.1% administration errors, and 0.6% adverse drug reactions. Around two-thirds (65.4%) were produced by drugs, with 17% being intercepted. The large majority (89.4%) had no impact on the patient, but 0.6% caused permanent damage or death. Nurses reported 65.4% of MEs. The most commonly implicated causal factor was distraction (59%). Simple (alerts), intermediate (protocols, clinical sessions and courses) and complex corrective actions (causal analysis, monographs) were undertaken. It is essential to determine the current state of MEs in order to establish preventive measures and, together with teamwork and good practice, promote a climate of safety. Copyright © 2015 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.

  8. Refractive ocular conditions and reasons for spectacles renewal in a resource-limited economy

    PubMed Central

    2010-01-01

    Background Although refractive error is a leading cause of visual impairment and a treatable cause of blindness globally, the pattern of refractive errors in many populations is unknown. This study determined the pattern of refractive ocular conditions, the reasons for spectacles renewal, and the effect of correction of refractive errors in a resource-limited community. Methods A retrospective review of the case records of 1,413 consecutive patients seen in a private optometry practice in Nigeria between January 2006 and July 2007. Results A total of 1,216 (86.1%) patients, comprising 486 (40%) males and 730 (60%) females with a mean age of 41.02 (SD 14.19) years, were analyzed. The age distribution peaked in the peri-adolescent and middle-age years. The main ocular complaints were spectacles loss and discomfort (412, 33.9%), blurred near vision (399, 32.8%) and asthenopia (255, 20.9%). The mean duration of ocular symptoms before consultation was 2.05 (SD 1.92) years. The most common refractive errors included presbyopia (431, 35.3%), hyperopic astigmatism (240, 19.7%) and presbyopia with hyperopia (276, 22.7%). Only 59 (4.9%) had myopia. Following correction, the numbers of blind (VA<3/60) and visually impaired (VA<6/18-3/60) patients were reduced by 18 (58.1%) and 89 (81.7%), respectively. The main reasons for renewal of spectacles were broken or scratched lenses, broken frames, and lenses falling off (47, 63.4%). Conclusions Adequate correction of refractive errors reduces visual impairment and avoidable blindness; to achieve optimal control of refractive errors in the community, services should be targeted at individuals in the peri-adolescent and middle-age years. PMID:20459676

  9. Refractive ocular conditions and reasons for spectacles renewal in a resource-limited economy.

    PubMed

    Ayanniyi, Abdulkabir A; Folorunso, Francisca N; Adepoju, Feyisayo G

    2010-05-07

    Although refractive error is a leading cause of visual impairment and a treatable cause of blindness globally, the pattern of refractive errors in many populations is unknown. This study determined the pattern of refractive ocular conditions, the reasons for spectacles renewal, and the effect of correction of refractive errors in a resource-limited community. A retrospective review of the case records of 1,413 consecutive patients seen in a private optometry practice in Nigeria between January 2006 and July 2007. A total of 1,216 (86.1%) patients, comprising 486 (40%) males and 730 (60%) females with a mean age of 41.02 (SD 14.19) years, were analyzed. The age distribution peaked in the peri-adolescent and middle-age years. The main ocular complaints were spectacles loss and discomfort (412, 33.9%), blurred near vision (399, 32.8%) and asthenopia (255, 20.9%). The mean duration of ocular symptoms before consultation was 2.05 (SD 1.92) years. The most common refractive errors included presbyopia (431, 35.3%), hyperopic astigmatism (240, 19.7%) and presbyopia with hyperopia (276, 22.7%). Only 59 (4.9%) had myopia. Following correction, the numbers of blind (VA<3/60) and visually impaired (VA<6/18-3/60) patients were reduced by 18 (58.1%) and 89 (81.7%), respectively. The main reasons for renewal of spectacles were broken or scratched lenses, broken frames, and lenses falling off (47, 63.4%). Adequate correction of refractive errors reduces visual impairment and avoidable blindness; to achieve optimal control of refractive errors in the community, services should be targeted at individuals in the peri-adolescent and middle-age years.

  10. Mathematical Writing Errors in Expository Writings of College Mathematics Students

    ERIC Educational Resources Information Center

    Guce, Ivee K.

    2017-01-01

    Despite the efforts to confirm the effectiveness of writing in learning mathematics, analysis on common errors in mathematical writings has not received sufficient attention. This study aimed to provide an account of the students' procedural explanations in terms of their commonly committed errors in mathematical writing. Nine errors in…

  11. Prediction of human errors by maladaptive changes in event-related brain networks.

    PubMed

    Eichele, Tom; Debener, Stefan; Calhoun, Vince D; Specht, Karsten; Engel, Andreas K; Hugdahl, Kenneth; von Cramon, D Yves; Ullsperger, Markus

    2008-04-22

    Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we studied error preceding brain activity on a trial-by-trial basis. We found a set of brain regions in which the temporal evolution of activation predicted performance errors. These maladaptive brain activity changes started to evolve approximately 30 sec before the error. In particular, a coincident decrease of deactivation in default mode regions of the brain, together with a decline of activation in regions associated with maintaining task effort, raised the probability of future errors. Our findings provide insights into the brain network dynamics preceding human performance errors and suggest that monitoring of the identified precursor states may help in avoiding human errors in critical real-world situations.

  12. Prediction of human errors by maladaptive changes in event-related brain networks

    PubMed Central

    Eichele, Tom; Debener, Stefan; Calhoun, Vince D.; Specht, Karsten; Engel, Andreas K.; Hugdahl, Kenneth; von Cramon, D. Yves; Ullsperger, Markus

    2008-01-01

    Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we studied error preceding brain activity on a trial-by-trial basis. We found a set of brain regions in which the temporal evolution of activation predicted performance errors. These maladaptive brain activity changes started to evolve ≈30 sec before the error. In particular, a coincident decrease of deactivation in default mode regions of the brain, together with a decline of activation in regions associated with maintaining task effort, raised the probability of future errors. Our findings provide insights into the brain network dynamics preceding human performance errors and suggest that monitoring of the identified precursor states may help in avoiding human errors in critical real-world situations. PMID:18427123

  13. Characteristics of pediatric chemotherapy medication errors in a national error reporting database.

    PubMed

    Rinke, Michael L; Shore, Andrew D; Morlock, Laura; Hicks, Rodney W; Miller, Marlene R

    2007-07-01

    Little is known regarding chemotherapy medication errors in pediatrics despite studies suggesting high rates of overall pediatric medication errors. In this study, the authors examined patterns in pediatric chemotherapy errors. The authors queried the United States Pharmacopeia MEDMARX database, a national, voluntary, Internet-accessible error reporting system, for all error reports from 1999 through 2004 that involved chemotherapy medications and patients aged <18 years. Of the 310 pediatric chemotherapy error reports, 85% reached the patient, and 15.6% required additional patient monitoring or therapeutic intervention. Forty-eight percent of errors originated in the administering phase of medication delivery, and 30% originated in the drug-dispensing phase. Of the 387 medications cited, 39.5% were antimetabolites, 14.0% were alkylating agents, 9.3% were anthracyclines, and 9.3% were topoisomerase inhibitors. The most commonly involved chemotherapeutic agents were methotrexate (15.3%), cytarabine (12.1%), and etoposide (8.3%). The most common error types were improper dose/quantity (22.9% of 327 cited error types), wrong time (22.6%), omission error (14.1%), and wrong administration technique/wrong route (12.2%). The most common error causes were performance deficit (41.3% of 547 cited error causes), equipment and medication delivery devices (12.4%), communication (8.8%), knowledge deficit (6.8%), and written order errors (5.5%). Four of the 5 most serious errors occurred at community hospitals. Pediatric chemotherapy errors often reached the patient, potentially were harmful, and differed in quality between outpatient and inpatient areas. This study indicated which chemotherapeutic agents most often were involved in errors and that administering errors were common. Investigation is needed regarding targeted medication administration safeguards for these high-risk medications. Copyright (c) 2007 American Cancer Society.

  14. [Can the scattering of differences from the target refraction be avoided?].

    PubMed

    Janknecht, P

    2008-10-01

    We wanted to check how the stochastic error is affected by two lens formulae. The power of the intraocular lens was calculated using the SRK-II formula and the Haigis formula after eye length measurement with ultrasound and the IOL Master. Partial derivatives of both lens formulae were taken and Gauss error analysis was used to examine the propagated error. 61 patients with a mean age of 73.8 years were analysed. The postoperative refraction differed from the calculated refraction after ultrasound biometry using the SRK-II formula by 0.05 D (-1.56 to +1.31, SD: 0.59 D; 92% within +/- 1.0 D), after IOL Master biometry using the SRK-II formula by -0.15 D (-1.18 to +1.25, SD: 0.52 D; 97% within +/- 1.0 D), and after IOL Master biometry using the Haigis formula by -0.11 D (-1.14 to +1.14, SD: 0.48 D; 95% within +/- 1.0 D). The results did not differ from one another. The propagated error of the Haigis formula can be calculated as ΔP = √[(ΔL × (-4.206))² + (ΔVK × 0.9496)² + (ΔDC × (-1.4950))²] (ΔL: error in measuring axial length, ΔVK: error in measuring anterior chamber depth, ΔDC: error in measuring corneal power), and the propagated error of the SRK-II formula as ΔP = √[(ΔL × (-2.5))² + (ΔDC × (-0.9))²]. The propagated error of the Haigis formula is always larger than the propagated error of the SRK-II formula. Scattering of the postoperative difference from the expected refraction cannot be avoided completely. It is possible to limit the systematic error by developing more complex formulae like the Haigis formula. However, increasing the number of parameters that need to be measured increases the dispersion of the calculated postoperative refraction. A compromise has to be found, and therefore the SRK-II formula is not outdated.
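    These propagated errors follow the usual Gaussian error-propagation rule, ΔP = sqrt(Σ (∂P/∂xᵢ · Δxᵢ)²), with the partial derivatives reduced to the constants quoted in the abstract. A small sketch using those coefficients:

```python
import math

def haigis_propagated_error(dL, dVK, dDC):
    """Gauss-propagated IOL-power error for the Haigis formula (coefficients as
    quoted in the abstract; dL = axial length error, dVK = anterior chamber
    depth error, dDC = corneal power error, all in their measurement units)."""
    return math.sqrt((dL * -4.206) ** 2 + (dVK * 0.9496) ** 2 + (dDC * -1.4950) ** 2)

def srk2_propagated_error(dL, dDC):
    """Gauss-propagated IOL-power error for the SRK-II formula."""
    return math.sqrt((dL * -2.5) ** 2 + (dDC * -0.9) ** 2)

# For the same measurement errors, the Haigis error exceeds the SRK-II error:
print(haigis_propagated_error(0.1, 0.1, 0.1))  # ≈ 0.456 D
print(srk2_propagated_error(0.1, 0.1))         # ≈ 0.266 D
```

    The extra anterior-chamber term and the larger axial-length coefficient are why the Haigis propagated error always exceeds the SRK-II propagated error.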

  15. Sub-aperture switching based ptychographic iterative engine (sasPIE) method for quantitative imaging

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Kong, Yan; Jiang, Zhilong; Yu, Wei; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-03-01

    Though the ptychographic iterative engine (PIE) has been widely adopted for quantitative micro-imaging with various illuminations such as visible light, X-rays and electron beams, mechanical inaccuracy in the raster scanning of the sample relative to the illumination seriously degrades the reconstruction quality and limits the achievable resolution to well below that determined by the numerical aperture of the optical system. To overcome this disadvantage, the sub-aperture switching based PIE method is proposed: the mechanical scanning of common PIE is replaced by sub-aperture switching, so the reconstruction error related to positioning inaccuracy is avoided entirely. The proposed technique remarkably improves the reconstruction quality, reduces the complexity of the experimental setup, and fundamentally accelerates data acquisition and reconstruction.
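    For context, the object-update step shared by PIE-family algorithms looks like the following; sasPIE changes how the relative probe positions are realized (electronic sub-aperture switching instead of mechanical scanning), not this update rule. A hedged numpy sketch of the standard ePIE-style update (array shapes and values are illustrative):

```python
import numpy as np

def pie_object_update(obj, probe, psi_corrected, psi, alpha=1.0):
    """Standard (e)PIE-style object update: nudge the object estimate toward
    agreement between the modelled exit wave psi = probe * obj and the
    measurement-corrected exit wave psi_corrected."""
    weight = np.conj(probe) / (np.abs(probe) ** 2).max()
    return obj + alpha * weight * (psi_corrected - psi)

# Toy check: with a uniform probe and a zero object, one update absorbs the
# correction directly into the object estimate.
probe = np.ones((4, 4), dtype=complex)
obj = np.zeros((4, 4), dtype=complex)
psi = probe * obj
psi_corrected = np.full((4, 4), 0.5 + 0.0j)
updated = pie_object_update(obj, probe, psi_corrected, psi)
```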

  16. Measurement errors in voice-key naming latency for Hiragana.

    PubMed

    Yamada, Jun; Tamaoka, Katsuo

    2003-12-01

    This study makes explicit the limitations and possibilities of voice-key naming latency research on single hiragana symbols (a Japanese syllabic script) by examining three sets of voice-key naming data against Sakuma, Fushimi, and Tatsumi's 1997 speech-analyzer voice-waveform data. Analysis showed that voice-key measurement errors can be substantial in standard procedures as they may conceal the true effects of significant variables involved in hiragana-naming behavior. While one can avoid voice-key measurement errors to some extent by applying Sakuma, et al.'s deltas and by excluding initial phonemes which induce measurement errors, such errors may be ignored when test items are words and other higher-level linguistic materials.

  17. Dictionary learning-based spatiotemporal regularization for 3D dense speckle tracking

    NASA Astrophysics Data System (ADS)

    Lu, Allen; Zontak, Maria; Parajuli, Nripesh; Stendahl, John C.; Boutagy, Nabil; Eberle, Melissa; O'Donnell, Matthew; Sinusas, Albert J.; Duncan, James S.

    2017-03-01

    Speckle tracking is a common method for non-rigid tissue motion analysis in 3D echocardiography, where unique texture patterns are tracked through the cardiac cycle. However, poor tracking often occurs due to inherent ultrasound issues, such as image artifacts and speckle decorrelation; thus regularization is required. Various methods, such as optical flow, elastic registration, and block matching techniques have been proposed to track speckle motion. Such methods typically apply spatial and temporal regularization in a separate manner. In this paper, we propose a joint spatiotemporal regularization method based on an adaptive dictionary representation of the dense 3D+time Lagrangian motion field. Sparse dictionaries have good signal-adaptive and noise-reduction properties; however, they are prone to quantization errors. Our method takes advantage of the desirable noise suppression while avoiding the undesirable quantization error. The idea is to enforce regularization only on the poorly tracked trajectories. Specifically, our method 1) builds a data-driven 4-dimensional dictionary of Lagrangian displacements using sparse learning, 2) automatically identifies poorly tracked trajectories (outliers) based on sparse reconstruction errors, and 3) performs sparse reconstruction of the outliers only. Our approach can be applied on dense Lagrangian motion fields calculated by any method. We demonstrate the effectiveness of our approach on a baseline block-matching speckle tracking method and evaluate the performance of the proposed algorithm using tracking and strain accuracy analysis.
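    The three steps above can be sketched end to end. The snippet below is only a toy stand-in, not the paper's method: an SVD subspace plays the role of the learned sparse dictionary, and synthetic trajectories replace echo data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 200 trajectories, each a length-20 displacement signal
# lying near a 3-dimensional subspace; the first 10 are corrupted ("poorly tracked").
basis = rng.normal(size=(3, 20))
coeffs = rng.normal(size=(200, 3))
trajs = coeffs @ basis + 0.01 * rng.normal(size=(200, 20))
trajs[:10] += rng.normal(scale=2.0, size=(10, 20))

# Step 1: "learn" a dictionary from the data (here: top-3 right singular vectors).
_, _, Vt = np.linalg.svd(trajs, full_matrices=False)
D = Vt[:3]
# Step 2: flag trajectories with large reconstruction error as outliers.
recon = trajs @ D.T @ D
err = np.linalg.norm(trajs - recon, axis=1)
outliers = err > err.mean() + 2 * err.std()
# Step 3: regularize only the outliers by replacing them with their reconstruction.
trajs[outliers] = recon[outliers]
```

    Replacing only the flagged trajectories preserves the well-tracked motion exactly, which is the point of enforcing regularization selectively.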

  18. Algorithm for ion beam figuring of low-gradient mirrors.

    PubMed

    Jiao, Changjun; Li, Shengyi; Xie, Xuhui

    2009-07-20

    Ion beam figuring technology for low-gradient mirrors is discussed. Ion beam figuring is a noncontact machining technique in which a beam of high-energy ions is directed toward a target workpiece to remove material in a predetermined and controlled fashion. Owing to this noncontact mode of material removal, problems associated with tool wear and edge effects, which are common in conventional contact polishing processes, are avoided. Based on the Bayesian principle, an iterative dwell time algorithm for planar mirrors is deduced from the computer-controlled optical surfacing (CCOS) principle. Given the properties of the removal function, the shaping process for low-gradient mirrors can be approximated by the linear model for planar mirrors. On this basis, the surface-error figuring technology for low-gradient mirrors with a linear path is established. Owing to the near-Gaussian property of the removal function, the figuring process with a spiral path can be described by the conventional linear CCOS principle, and a Bayesian-based iterative algorithm can be used to deconvolve the dwell time. Moreover, the selection criterion for the spiral parameter is given. Ion beam figuring technology with a spiral scan path based on these methods can be used to figure mirrors with non-axis-symmetrical errors. Experiments were performed on SiC chemical vapor deposition planar and Zerodur paraboloid samples, and the final surface errors are all below λ/100.
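    The dwell-time problem is a deconvolution: the error map equals the dwell time convolved with the removal function. A hedged 1D sketch of a Richardson-Lucy-style multiplicative iteration, one plausible form of a Bayesian-based iterative deconvolution (the paper's exact algorithm and removal function are not given here):

```python
import numpy as np

def rl_dwell_time(error_map, removal_fn, iters=500):
    """Richardson-Lucy-style deconvolution of a 1D error map by a removal
    function; the absolute scaling by the removal rate is omitted."""
    r = removal_fn / removal_fn.sum()        # RL needs a normalized kernel
    d = np.full_like(error_map, max(float(error_map.mean()), 1e-6))
    for _ in range(iters):
        model = np.convolve(d, r, mode="same")
        ratio = error_map / np.maximum(model, 1e-12)
        d = d * np.convolve(ratio, r[::-1], mode="same")
    return d

x = np.linspace(-0.5, 0.5, 11)
removal = np.exp(-x**2 / 0.02)               # near-Gaussian removal function
true_dwell = np.zeros(61)
true_dwell[20:40] = 1.0                      # a simple block of surface error
target = np.convolve(true_dwell, removal, mode="same")
dwell = rl_dwell_time(target, removal)
refit = np.convolve(dwell, removal / removal.sum(), mode="same")
```

    The multiplicative update keeps the dwell time nonnegative, which is physically required since the ion beam can only remove material.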

  19. GIZMO: Multi-method magneto-hydrodynamics+gravity code

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.

    2014-10-01

    GIZMO is a flexible, multi-method magneto-hydrodynamics+gravity code that solves the hydrodynamic equations using a variety of different methods. It introduces new Lagrangian Godunov-type methods that allow solving the fluid equations with a moving particle distribution that is automatically adaptive in resolution and avoids the advection errors, angular momentum conservation errors, and excessive diffusion problems that seriously limit the applicability of adaptive mesh refinement (AMR) codes, while simultaneously avoiding the low-order errors inherent to simpler methods like smoothed-particle hydrodynamics (SPH). GIZMO also allows the use of SPH either in “traditional” form or “modern” (more accurate) forms, or use of a mesh. Self-gravity is solved quickly with a BH-Tree (optionally a hybrid PM-Tree for periodic boundaries) and on-the-fly adaptive gravitational softenings. The code is descended from P-GADGET, itself descended from GADGET-2 (ascl:0003.001), and many of the naming conventions remain (for the sake of compatibility with the large library of GADGET work and analysis software).

  20. Prevention of Unwanted Free-Declaration of Static Obstacles in Probability Occupancy Grids

    NASA Astrophysics Data System (ADS)

    Krause, Stefan; Scholz, M.; Hohmann, R.

    2017-10-01

    Obstacle detection and avoidance are major research fields in unmanned aviation. Map-based obstacle detection approaches often use discrete world representations, such as probabilistic grid maps, to fuse incremental environment data from different views or sensors into a comprehensive representation. The integration of continuous measurements into a discrete representation can introduce rounding errors, which in turn lead to differences between the artificial model and the real environment. The cause of these deviations is the low spatial resolution of the world representation compared with that of the sensor data used. Differences between the artificial representations used for path planning or obstacle avoidance and the real world can lead to unexpected behavior, up to collisions with unmapped obstacles. This paper presents three approaches to the treatment of errors that can occur during the integration of continuous laser measurements into a discrete probabilistic grid. Furthermore, the quality of the error prevention and the processing performance of the approaches are compared using real sensor data.
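    The rounding error in question is easy to bound: snapping a continuous laser endpoint to a grid cell displaces it by at most half the cell diagonal. An illustrative sketch (not one of the paper's three approaches; names and values are made up):

```python
import math

def world_to_cell(x, y, resolution):
    """Map a continuous 2D position (metres) to discrete grid indices."""
    return int(math.floor(x / resolution)), int(math.floor(y / resolution))

def cell_center(ix, iy, resolution):
    """Continuous position represented by a grid cell (its centre)."""
    return (ix + 0.5) * resolution, (iy + 0.5) * resolution

res = 0.5                                    # 0.5 m cells: coarse vs cm-level lidar
cx, cy = cell_center(*world_to_cell(3.74, 1.02, res), res)
err = math.hypot(3.74 - cx, 1.02 - cy)       # displacement introduced by rounding
max_err = res * math.sqrt(2) / 2             # worst case: half the cell diagonal
print(err, max_err)
```

    When `max_err` is comparable to an obstacle's clearance, a measured endpoint can land in the wrong cell and the occupied cell may later be "freed" by rays that graze it, which is the unwanted free-declaration the title refers to.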

  1. Analysis of Compression Algorithm in Ground Collision Avoidance Systems (Auto-GCAS)

    NASA Technical Reports Server (NTRS)

    Schmalz, Tyler; Ryan, Jack

    2011-01-01

    The Automatic Ground Collision Avoidance System (Auto-GCAS) utilizes Digital Terrain Elevation Data (DTED) stored onboard an aircraft to determine potential recovery maneuvers. Because of the current limitations of computer hardware on military airplanes such as the F-22 and F-35, the DTED must be compressed through a lossy technique called binary-tree tip-tilt. The purpose of this study is to determine the accuracy of the compressed data with respect to the original DTED. This study is mainly interested in the magnitude of the error between the two, as well as the overall distribution of the errors throughout the DTED. By understanding how the errors of the compression technique are affected by various factors (topography, density of sampling points, sub-sampling techniques, etc.), modifications can be made to the compression technique resulting in better accuracy. This, in turn, would minimize unnecessary activation of Auto-GCAS during flight as well as maximize its contribution to fighter safety.

  2. A Framework for Human Microbiome Research

    DTIC Science & Technology

    2012-06-14

    determined that many components of data production and processing can contribute errors and artefacts. We investigated methods that avoid these errors and...protocol that ensured consistency in the high-throughput production. To maximize accuracy and consistency, protocols were evaluated primarily using a...future benefits, this resource may promote the development of novel prophylactic strategies such as the application of prebiotics and probiotics to

  3. Thinly disguised contempt: a barrier to excellence.

    PubMed

    Brown-Stewart, P

    1987-04-01

    Many elements in contemporary leadership and management convey contempt for employees. "Thinly disguised contempt," a concept introduced by Peters and Austin in A Passion For Excellence, explains many barriers to the achievement of excellence in corporations across disciplines. Health care executives and managers can learn from the errors of corporate management and avoid replicating these errors in the health care industry.

  4. On P values and effect modification.

    PubMed

    Mayer, Martin

    2017-12-01

    A crucial element of evidence-based healthcare is the sound understanding and use of statistics. As part of instilling sound statistical knowledge and practice, it seems useful to highlight instances of unsound statistical reasoning or practice, not merely in captious or vitriolic spirit, but rather, to use such error as a springboard for edification by giving tangibility to the concepts at hand and highlighting the importance of avoiding such error. This article aims to provide an instructive overview of two key statistical concepts: effect modification and P values. A recent article published in the Journal of the American College of Cardiology on side effects related to statin therapy offers a notable example of errors in understanding effect modification and P values, and although not so critical as to entirely invalidate the article, the errors still demand considerable scrutiny and correction. In doing so, this article serves as an instructive overview of the statistical concepts of effect modification and P values. Judicious handling of statistics is imperative to avoid muddying their utility. This article contributes to the body of literature aiming to improve the use of statistics, which in turn will help facilitate evidence appraisal, synthesis, translation, and application.

  5. Common but unappreciated sources of error in one, two, and multiple-color pyrometry

    NASA Technical Reports Server (NTRS)

    Spjut, R. Erik

    1988-01-01

    The most common sources of error in optical pyrometry are examined. They can be classified as either noise and uncertainty errors, stray radiation errors, or speed-of-response errors. Through judicious choice of detectors and optical wavelengths the effect of noise errors can be minimized, but one should strive to determine as many of the system properties as possible. Careful consideration of the optical-collection system can minimize stray radiation errors. Careful consideration must also be given to the slowest elements in a pyrometer when measuring rapid phenomena.
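    To illustrate the noise sensitivity the abstract describes, here is a minimal sketch of two-color (ratio) pyrometry under the Wien approximation; the wavelengths, temperature, and gray-body assumption (equal emissivity at both wavelengths) are illustrative choices, not values from the paper.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_intensity(wavelength, temperature):
    """Spectral intensity under the Wien approximation (arbitrary scale)."""
    return wavelength ** -5 * math.exp(-C2 / (wavelength * temperature))

def ratio_temperature(i1, i2, l1, l2):
    """Recover temperature from the intensity ratio at two wavelengths,
    assuming equal emissivity at both (gray body)."""
    return C2 * (1.0 / l1 - 1.0 / l2) / (5.0 * math.log(l2 / l1) - math.log(i1 / i2))

l1, l2 = 0.65e-6, 0.90e-6  # example wavelengths, m
T = 2000.0                  # true temperature, K
i1, i2 = wien_intensity(l1, T), wien_intensity(l2, T)

# Noise-free signals recover the temperature exactly...
assert abs(ratio_temperature(i1, i2, l1, l2) - T) < 1e-6
# ...but a 1% error on one detector already shifts the recovered
# temperature by several kelvin, which is why noise and uncertainty
# errors dominate the error budget in ratio pyrometry.
T_noisy = ratio_temperature(i1 * 1.01, i2, l1, l2)
assert T_noisy > T
```

    Widening the wavelength separation increases the denominator and so reduces this sensitivity, one reason a judicious choice of optical wavelengths matters.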

  6. Antiretroviral medication prescribing errors are common with hospitalization of HIV-infected patients.

    PubMed

    Commers, Tessa; Swindells, Susan; Sayles, Harlan; Gross, Alan E; Devetten, Marcel; Sandkovsky, Uriel

    2014-01-01

    Errors in prescribing antiretroviral therapy (ART) often occur with the hospitalization of HIV-infected patients. The rapid identification and prevention of errors may reduce patient harm and healthcare-associated costs. A retrospective review of hospitalized HIV-infected patients was carried out between 1 January 2009 and 31 December 2011. Errors were documented as omission, underdose, overdose, duplicate therapy, incorrect scheduling and/or incorrect therapy. The time to error correction was recorded. Relative risks (RRs) were computed to evaluate patient characteristics and error rates. A total of 289 medication errors were identified in 146/416 admissions (35%). The most common was drug omission (69%). At an error rate of 31%, nucleoside reverse transcriptase inhibitors were associated with an increased risk of error when compared with protease inhibitors (RR 1.32; 95% CI 1.04-1.69) and co-formulated drugs (RR 1.59; 95% CI 1.19-2.09). Of the errors, 31% were corrected within the first 24 h, but over half (55%) were never remedied. Admissions with an omission error were 7.4 times more likely to have all errors corrected within 24 h than were admissions without an omission. Drug interactions with ART were detected on 51 occasions. For the study population (n = 177), an increased risk of admission error was observed for black (43%) compared with white (28%) individuals (RR 1.53; 95% CI 1.16-2.03) but no significant differences were observed between white patients and other minorities or between men and women. Errors in inpatient ART were common, and the majority were never detected. The most common errors involved omission of medication, and nucleoside reverse transcriptase inhibitors had the highest rate of prescribing error. Interventions to prevent and correct errors are urgently needed.

  7. Evaluation and mitigation of potential errors in radiochromic film dosimetry due to film curvature at scanning.

    PubMed

    Palmer, Antony L; Bradley, David A; Nisbet, Andrew

    2015-03-08

    This work considers a previously overlooked uncertainty present in film dosimetry which results from moderate curvature of films during the scanning process. Small film samples are particularly susceptible to film curling which may be undetected or deemed insignificant. In this study, we consider test cases with controlled induced curvature of film and with film raised horizontally above the scanner plate. We also evaluate the difference in scans of a film irradiated with a typical brachytherapy dose distribution with the film naturally curved and with the film held flat on the scanner. Typical naturally occurring curvature of film at scanning, giving rise to a maximum height 1 to 2 mm above the scan plane, may introduce dose errors of 1% to 4%, and considerably reduce gamma evaluation passing rates when comparing film-measured doses with treatment planning system-calculated dose distributions, a common application of film dosimetry in radiotherapy. The use of a triple-channel dosimetry algorithm appeared to mitigate the error due to film curvature compared to conventional single-channel film dosimetry. The change in pixel value and calibrated reported dose with film curling or height above the scanner plate may be due to variations in illumination characteristics, optical disturbances, or a Callier-type effect. There is a clear requirement for physically flat films at scanning to avoid the introduction of a substantial error source in film dosimetry. Particularly for small film samples, a compression glass plate above the film is recommended to ensure flat-film scanning. This effect has been overlooked to date in the literature.

  8. The Performance Analysis of a Real-Time Integrated INS/GPS Vehicle Navigation System with Abnormal GPS Measurement Elimination

    PubMed Central

    Chiang, Kai-Wei; Duong, Thanh Trung; Liao, Jhen-Kai

    2013-01-01

    The integration of an Inertial Navigation System (INS) and the Global Positioning System (GPS) is common in mobile mapping and navigation applications to seamlessly determine the position, velocity, and orientation of the mobile platform. In most INS/GPS integrated architectures, the GPS is considered to be an accurate reference with which to correct for the systematic errors of the inertial sensors, which are composed of biases, scale factors and drift. However, the GPS receiver may produce abnormal pseudo-range errors mainly caused by ionospheric delay, tropospheric delay and the multipath effect. These errors degrade the overall position accuracy of an integrated system that uses conventional INS/GPS integration strategies such as loosely coupled (LC) and tightly coupled (TC) schemes. Conventional tightly coupled INS/GPS integration schemes apply the Klobuchar model and the Hopfield model to reduce pseudo-range delays caused by ionospheric delay and tropospheric delay, respectively, but do not address the multipath problem. However, the multipath effect (from reflected GPS signals) affects the position error far more significantly in a consumer-grade GPS receiver than in an expensive, geodetic-grade GPS receiver. To avoid this problem, a new integrated INS/GPS architecture is proposed. The proposed method is described and applied in a real-time integrated system with two integration strategies, namely, loosely coupled and tightly coupled schemes, respectively. To verify the effectiveness of the proposed method, field tests with various scenarios are conducted and the results are compared with a reliable reference system. PMID:23955434

  9. [Epidemiology of refractive errors].

    PubMed

    Wolfram, C

    2017-07-01

    Refractive errors are very common and can lead to severe pathological changes in the eye. This article analyzes the epidemiology of refractive errors in the general population in Germany and worldwide, and describes common definitions for refractive errors and clinical characteristics of pathological changes. Refractive errors differ between age groups due to refractive changes over the lifetime and also due to generation-specific factors. Current research on the etiology of refractive errors has strengthened the influence of environmental factors, which has led to new strategies for the prevention of refractive pathologies.

  10. Hypothesis Testing Using Factor Score Regression

    PubMed Central

    Devlieger, Ines; Mayer, Axel; Rosseel, Yves

    2015-01-01

    In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and with structural equation modeling (SEM) using analytic calculations and two Monte Carlo simulation studies to examine their finite sample characteristics. Several performance criteria are used, such as bias under the unstandardized and standardized parameterizations, efficiency, mean square error, standard error bias, type I error rate, and power. The results show that the bias correcting method, with the newly developed standard error, is the only suitable alternative to SEM. While it has a higher standard error bias than SEM, it has comparable bias, efficiency, mean square error, power, and type I error rate. PMID:29795886

  11. Error identification and recovery by student nurses using human patient simulation: opportunity to improve patient safety.

    PubMed

    Henneman, Elizabeth A; Roche, Joan P; Fisher, Donald L; Cunningham, Helene; Reilly, Cheryl A; Nathanson, Brian H; Henneman, Philip L

    2010-02-01

    This study examined types of errors that occurred or were recovered in a simulated environment by student nurses. Errors occurred in all four rule-based error categories, and all students committed at least one error. The most frequent errors occurred in the verification category. Another common error was related to physician interactions. The least common errors were related to coordinating information with the patient and family. Our finding that 100% of student subjects committed rule-based errors is cause for concern. To decrease errors and improve safe clinical practice, nurse educators must identify effective strategies that students can use to improve patient surveillance. Copyright 2010 Elsevier Inc. All rights reserved.

  12. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    PubMed Central

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-01-01

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. Either targets or virtual points corresponding to some reconstructable feature in the scene are used as feature points. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis. PMID:28029121
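    The baseline idea can be sketched as follows (point labels and coordinates are hypothetical, not from the paper): distances between feature points within one scan are invariant to that scan's coordinate system, so two epochs can be compared without any registration step.

```python
import math

def baseline_lengths(points):
    """Pairwise distances between feature points within a single scan.
    Lengths are invariant to the scan's coordinate system, so two epochs
    can be compared without registration."""
    keys = sorted(points)
    return {
        (a, b): math.dist(points[a], points[b])
        for i, a in enumerate(keys)
        for b in keys[i + 1:]
    }

def baseline_changes(scan1, scan2):
    """Change in length of each baseline between two epochs."""
    b1, b2 = baseline_lengths(scan1), baseline_lengths(scan2)
    return {pair: b2[pair] - b1[pair] for pair in b1.keys() & b2.keys()}

# Epoch 2 is rigidly shifted (no registration performed); only point 'C'
# has additionally moved ~3 cm, e.g. a displaced brick centre.
epoch1 = {'A': (0.0, 0.0, 0.0), 'B': (2.0, 0.0, 0.0), 'C': (1.0, 1.5, 0.0)}
epoch2 = {k: (x + 5.0, y - 2.0, z) for k, (x, y, z) in epoch1.items()}
epoch2['C'] = (6.0, -0.47, 0.0)  # after the rigid shift 'C' would be (6.0, -0.5, 0.0)

changes = baseline_changes(epoch1, epoch2)
assert abs(changes[('A', 'B')]) < 1e-9  # stable baseline: length unchanged
assert abs(changes[('A', 'C')]) > 0.01  # baselines into the damaged region change
```

    The rigid shift between epochs leaves every baseline length untouched, so only genuine deformation shows up in the comparison.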

  13. Optimizing methods and dodging pitfalls in microbiome research.

    PubMed

    Kim, Dorothy; Hofstaedter, Casey E; Zhao, Chunyu; Mattei, Lisa; Tanes, Ceylan; Clarke, Erik; Lauder, Abigail; Sherrill-Mix, Scott; Chehoud, Christel; Kelsen, Judith; Conrad, Máire; Collman, Ronald G; Baldassano, Robert; Bushman, Frederic D; Bittinger, Kyle

    2017-05-05

    Research on the human microbiome has yielded numerous insights into health and disease, but also has resulted in a wealth of experimental artifacts. Here, we present suggestions for optimizing experimental design and avoiding known pitfalls, organized in the typical order in which studies are carried out. We first review best practices in experimental design and introduce common confounders such as age, diet, antibiotic use, pet ownership, longitudinal instability, and microbial sharing during cohousing in animal studies. Typically, samples will need to be stored, so we provide data on best practices for several sample types. We then discuss design and analysis of positive and negative controls, which should always be run with experimental samples. We introduce a convenient set of non-biological DNA sequences that can be useful as positive controls for high-volume analysis. Careful analysis of negative and positive controls is particularly important in studies of samples with low microbial biomass, where contamination can comprise most or all of a sample. Lastly, we summarize approaches to enhancing experimental robustness by careful control of multiple comparisons and to comparing discovery and validation cohorts. We hope the experimental tactics summarized here will help researchers in this exciting field advance their studies efficiently while avoiding errors.

  14. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    PubMed

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. Either targets or virtual points corresponding to some reconstructable feature in the scene are used as feature points. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.

  15. Frequency and Type of Situational Awareness Errors Contributing to Death and Brain Damage: A Closed Claims Analysis.

    PubMed

    Schulz, Christian M; Burden, Amanda; Posner, Karen L; Mincer, Shawn L; Steadman, Randolph; Wagner, Klaus J; Domino, Karen B

    2017-08-01

    Situational awareness errors may play an important role in the genesis of patient harm. The authors examined closed anesthesia malpractice claims for death or brain damage to determine the frequency and type of situational awareness errors. Surgical and procedural anesthesia death and brain damage claims in the Anesthesia Closed Claims Project database were analyzed. Situational awareness error was defined as failure to perceive relevant clinical information, failure to comprehend the meaning of available information, or failure to project, anticipate, or plan. Patient and case characteristics, primary damaging events, and anesthesia payments in claims with situational awareness errors were compared to other death and brain damage claims from 2002 to 2013. Anesthesiologist situational awareness errors contributed to death or brain damage in 198 of 266 claims (74%). Respiratory system damaging events were more common in claims with situational awareness errors (56%) than other claims (21%, P < 0.001). The most common specific respiratory events in error claims were inadequate oxygenation or ventilation (24%), difficult intubation (11%), and aspiration (10%). Payments were made in 85% of situational awareness error claims compared to 46% in other claims (P = 0.001), with no significant difference in payment size. Among 198 claims with anesthesia situational awareness error, perception errors were most common (42%), whereas comprehension errors (29%) and projection errors (29%) were relatively less common. Situational awareness error definitions were operationalized for reliable application to real-world anesthesia cases. Situational awareness errors may have contributed to catastrophic outcomes in three quarters of recent anesthesia malpractice claims. Situational awareness errors resulting in death or brain damage remain prevalent causes of malpractice claims in the 21st century.

  16. Online machining error estimation method of numerical control gear grinding machine tool based on data analysis of internal sensors

    NASA Astrophysics Data System (ADS)

    Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin

    2016-12-01

    This paper presents an online estimation method for cutting error based on the analysis of internal sensor readings. The internal sensors of a numerical control (NC) machine tool are selected to avoid installation problems. A mathematical model was proposed to estimate the cutting error by computing the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on the cutting theory of gears. To verify the effectiveness of the proposed model, simulations and experiments were conducted in the gear generating grinding process. The cutting error of the gear was estimated, and the factors that induce cutting error were analyzed. The simulations and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the work-piece during the machining process.

  17. A pharmacist-led information technology intervention for medication errors (PINCER): a multicentre, cluster randomised, controlled trial and cost-effectiveness analysis.

    PubMed

    Avery, Anthony J; Rodgers, Sarah; Cantrill, Judith A; Armstrong, Sarah; Cresswell, Kathrin; Eden, Martin; Elliott, Rachel A; Howard, Rachel; Kendrick, Denise; Morris, Caroline J; Prescott, Robin J; Swanwick, Glen; Franklin, Matthew; Putman, Koen; Boyd, Matthew; Sheikh, Aziz

    2012-04-07

    Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. In this pragmatic, cluster randomised trial general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to researchers and statisticians involved in processing and analysing the data. The allocation was not masked to general practices, pharmacists, patients, or researchers who visited practices to extract data. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. 72 general practices with a combined list size of 480,942 patients were randomised. 
At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0·58, 95% CI 0·38-0·89); a β blocker if they had asthma (0·73, 0·58-0·91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0·51, 0·34-0·78). PINCER has a 95% probability of being cost effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Patient Safety Research Portfolio, Department of Health, England. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Medication errors in anesthesia: unacceptable or unavoidable?

    PubMed

    Dhawan, Ira; Tewari, Anurag; Sehgal, Sankalp; Sinha, Ashish Chandra

    Medication errors are common causes of patient morbidity and mortality, and they add a financial burden to the institution as well. Though the impact varies from no harm to serious adverse effects, including death, medication errors need attention on a priority basis since they are preventable. In today's world, where patients are aware and medical claims are on the rise, it is of utmost priority that we curb this issue. Individual effort to decrease medication errors alone might not be successful unless a change in the existing protocols and systems is incorporated. Often, drug errors that occur cannot be reversed; the best way to 'treat' drug errors is to prevent them. Wrong medication (due to syringe swap), overdose (due to misunderstanding or preconception of the dose, pump misuse, or dilution error), incorrect administration route, underdosing, and omission are common causes of medication error that occur perioperatively. Drug omission and calculation mistakes occur commonly in the ICU. Medication errors can occur perioperatively during preparation, administration, or record keeping. Numerous human and system errors can be blamed for the occurrence of medication errors. The need of the hour is to stop the blame game, accept mistakes, and develop a safe and 'just' culture in order to prevent medication errors. Newly devised systems like VEINROM, a fluid delivery system, are a novel approach to preventing drug errors due to the most commonly used medications in anesthesia. Such developments, along with vigilant doctors, a safe workplace culture, and organizational support, can together help prevent these errors. Copyright © 2016. Published by Elsevier Editora Ltda.

  19. How to Avoid Errors in Error Propagation: Prediction Intervals and Confidence Intervals in Forest Biomass

    NASA Astrophysics Data System (ADS)

    Lilly, P.; Yanai, R. D.; Buckley, H. L.; Case, B. S.; Woollons, R. C.; Holdaway, R. J.; Johnson, J.

    2016-12-01

    Calculations of forest biomass and elemental content require many measurements and models, each contributing uncertainty to the final estimates. While sampling error is commonly reported, based on replicate plots, error due to uncertainty in the regression used to estimate biomass from tree diameter is usually not quantified. Some published estimates of uncertainty due to the regression models have used the uncertainty in the prediction of individuals, ignoring uncertainty in the mean, while others have propagated uncertainty in the mean while ignoring individual variation. Using the simple case of the calcium concentration of sugar maple leaves, we compare the variation among individuals (the standard deviation) to the uncertainty in the mean (the standard error) and illustrate the declining importance in the prediction of individual concentrations as the number of individuals increases. For allometric models, the analogous statistics are the prediction interval (or the residual variation in the model fit) and the confidence interval (describing the uncertainty in the best fit model). The effect of propagating these two sources of error is illustrated using the mass of sugar maple foliage. The uncertainty in individual tree predictions was large for plots with few trees; for plots with 30 trees or more, the uncertainty in individuals was less important than the uncertainty in the mean. Authors of previously published analyses have reanalyzed their data to show the magnitude of these two sources of uncertainty at scales ranging from experimental plots to entire countries. The most correct analysis will take both sources of uncertainty into account, but for practical purposes, country-level reports of uncertainty in carbon stocks, as required by the IPCC, can ignore the uncertainty in individuals. Ignoring the uncertainty in the mean will lead to exaggerated estimates of confidence in estimates of forest biomass and carbon and nutrient contents.
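    The contrast between the two error sources can be sketched numerically (the standard error and standard deviation below are hypothetical values, not from the study): when predicting the average of n new individuals, the uncertainty in the estimated mean stays fixed while the contribution of individual variation shrinks as 1/sqrt(n).

```python
import math

def prediction_uncertainty(se_mean, sd_indiv, n):
    """Uncertainty when predicting the average of n new individuals:
    the uncertainty in the mean (se_mean) is fixed, while the individual
    variation (sd_indiv) averages out and shrinks as 1/sqrt(n)."""
    return math.sqrt(se_mean ** 2 + sd_indiv ** 2 / n)

# Hypothetical values: a well-estimated mean but wide individual spread.
se_mean, sd_indiv = 0.05, 0.40

# For a single individual the spread dominates; for a large sample only
# the uncertainty in the mean remains.
single = prediction_uncertainty(se_mean, sd_indiv, 1)
large = prediction_uncertainty(se_mean, sd_indiv, 300)
assert single > 0.4
assert large < 0.06
```

    This is the abstract's point about plot size: with 30 trees or more, the individual (prediction-interval) term no longer dominates, and the confidence interval around the mean is what limits accuracy.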

  20. Frequency and types of the medication errors in an academic emergency department in Iran: The emergent need for clinical pharmacy services in emergency departments.

    PubMed

    Zeraatchi, Alireza; Talebian, Mohammad-Taghi; Nejati, Amir; Dashti-Khavidaki, Simin

    2013-07-01

    Emergency departments (EDs) are characterized by the simultaneous care of multiple patients with various medical conditions. Due to the large number of patients with complex diseases, the speed and complexity of medication use, and an under-staffed and crowded working environment, medication errors are commonly perpetrated by emergency care providers. This study was designed to evaluate the incidence of medication errors among patients attending an ED in a teaching hospital in Iran. In this cross-sectional study, a total of 500 patients attending the ED were randomly assessed for the incidence and types of medication errors. Some factors related to medication errors, such as working shift, weekday, and the schedule of the trainees' educational program, were also evaluated. Nearly 22% of patients experienced at least one medication error. The rates of medication errors were 0.41 errors per patient and 0.16 errors per ordered medication. The frequency of medication errors was higher in men, middle-aged patients, the first days of the week, night-time work schedules, and the first semester of the educational year of new junior emergency medicine residents. More than 60% of errors were prescription errors by physicians, and the remainder were transcription or administration errors by nurses. More than 35% of the prescribing errors happened during the selection of drug dose and frequency. The most common medication errors by nurses during administration were omission errors (16.2%) followed by unauthorized drugs (6.4%). Most of the medication errors happened with anticoagulants and thrombolytics (41.2%), followed by antimicrobial agents (37.7%) and insulin (7.4%). In this study, at least one-fifth of the patients attending the ED experienced medication errors resulting from multiple factors. The more common prescription errors happened during the ordering of drug dose and frequency, while the more common administration errors included drug omission and unauthorized drug administration.

  1. [Classifications in forensic medicine and their logical basis].

    PubMed

    Kovalev, A V; Shmarov, L A; Ten'kov, A A

    2014-01-01

    The objective of the present study was to characterize the main requirements for the correct construction of classifications used in forensic medicine, with special reference to the errors that occur in the relevant text-books, guidelines, and manuals and the ways to avoid them. This publication continues the series of thematic articles of the authors devoted to the logical errors in the expert conclusions. The preparation of further publications is underway to report the results of the in-depth analysis of the logical errors encountered in expert conclusions, text-books, guidelines, and manuals.

  2. Human error and the search for blame

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    Human error is a frequent topic in discussions about risks in using computer systems. A rational analysis of human error leads through the consideration of mistakes to standards that designers use to avoid mistakes that lead to known breakdowns. The irrational side, however, is more interesting. It conditions people to think that breakdowns are inherently wrong and that there is ultimately someone who is responsible. This leads to a search for someone to blame which diverts attention from: learning from the mistakes; seeing the limitations of current engineering methodology; and improving the discourse of design.

  3. The finer points of writing and refereeing scientific articles.

    PubMed

    Bain, Barbara J; Littlewood, Tim J; Szydlo, Richard M

    2016-02-01

    Writing scientific papers is a skill required by all haematologists. Many also need to be able to referee papers submitted to journals. These skills are not often formally taught and as a result may not be done well. We have reviewed published evidence of errors in these processes. Such errors may be ethical, scientific or linguistic, or may result from a lack of understanding of the processes. The objective of the review is, by highlighting errors, to help writers and referees to avoid them. © 2016 John Wiley & Sons Ltd.

  4. Lead optimization mapper: automating free energy calculations for lead optimization.

    PubMed

    Liu, Shuai; Wu, Yujie; Lin, Teng; Abel, Robert; Redmann, Jonathan P; Summa, Christopher M; Jaber, Vivian R; Lim, Nathan M; Mobley, David L

    2013-09-01

    Alchemical free energy calculations hold increasing promise as an aid to drug discovery efforts. However, applications of these techniques in discovery projects have been relatively few, partly because of the difficulty of planning and setting up calculations. Here, we introduce lead optimization mapper, LOMAP, an automated algorithm to plan efficient relative free energy calculations between potential ligands within a substantial library of perhaps hundreds of compounds. In this approach, ligands are first grouped by structural similarity primarily based on the size of a (loosely defined) maximal common substructure, and then calculations are planned within and between sets of structurally related compounds. An emphasis is placed on ensuring that relative free energies can be obtained between any pair of compounds without combining the results of too many different relative free energy calculations (to avoid accumulation of error) and by providing some redundancy to allow for the possibility of error and consistency checking and provide some insight into when results can be expected to be unreliable. The algorithm is discussed in detail and a Python implementation, based on both Schrödinger's and OpenEye's APIs, has been made available freely under the BSD license.
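The planning idea described in this abstract can be illustrated with a minimal sketch: connect each compound to its most similar partners so that every pair is linked through few hops and redundant edges permit cycle-closure consistency checks. This is not the LOMAP implementation; the ligand names and similarity scores below are hypothetical stand-ins for maximal-common-substructure sizes.

```python
def plan_network(ligands, similarity, k=2):
    """Plan relative free-energy calculation edges: connect each ligand
    to its k most similar partners, so any pair is reachable in few hops
    and redundant paths allow cycle-closure consistency checks."""
    edges = set()
    for a in ligands:
        partners = sorted((b for b in ligands if b != a),
                          key=lambda b: similarity[frozenset((a, b))],
                          reverse=True)[:k]
        for b in partners:
            edges.add(frozenset((a, b)))
    return edges

# Toy similarity scores (hypothetical values, not real compounds).
ligands = ["L1", "L2", "L3", "L4"]
similarity = {frozenset(p): s for p, s in {
    ("L1", "L2"): 0.9, ("L1", "L3"): 0.4, ("L1", "L4"): 0.2,
    ("L2", "L3"): 0.8, ("L2", "L4"): 0.5, ("L3", "L4"): 0.7}.items()}

edges = plan_network(ligands, similarity)
```

With k=2 the plan links structurally similar pairs directly while still leaving a cycle through which the computed free energies can be checked for consistency.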

  5. Organisational sources of safety and danger: sociological contributions to the study of adverse events

    PubMed Central

    West, E.

    2000-01-01

    Organisational sociology has long accepted that mistakes of all kinds are a common, even normal, part of work. Medical work may be particularly prone to error because of its complexity and technological sophistication. The results can be tragic for individuals and families. This paper describes four intrinsic characteristics of organisations that are relevant to the level of risk and danger in healthcare settings—namely, the division of labour and "structural secrecy" in complex organisations; the homophily principle and social structural barriers to communication; diffusion of responsibility and the "problem of many hands"; and environmental or other pressures leading to goal displacement when organisations take their "eyes off the ball". The paper argues that each of these four intrinsic characteristics invokes specific mechanisms that increase danger in healthcare organisations but also offer the possibility of devising strategies and behaviours to increase patient safety. Stated as hypotheses, these ideas could be tested empirically, thus adding to the evidence on which the avoidance of adverse events in healthcare settings is based and contributing to the development of theory in this important area. (Quality in Health Care 2000;9:120–126) Key Words: organisation; safety; errors; adverse events PMID:11067250

  6. Partitioning degrees of freedom in hierarchical and other richly-parameterized models.

    PubMed

    Cui, Yue; Hodges, James S; Kong, Xiaoxiao; Carlin, Bradley P

    2010-02-01

    Hodges & Sargent (2001) developed a measure of a hierarchical model's complexity, degrees of freedom (DF), that is consistent with definitions for scatterplot smoothers, interpretable in terms of simple models, and that enables control of a fit's complexity by means of a prior distribution on complexity. DF describes complexity of the whole fitted model but in general it is unclear how to allocate DF to individual effects. We give a new definition of DF for arbitrary normal-error linear hierarchical models, consistent with Hodges & Sargent's, that naturally partitions the n observations into DF for individual effects and for error. The new conception of an effect's DF is the ratio of the effect's modeled variance matrix to the total variance matrix. This gives a way to describe the sizes of different parts of a model (e.g., spatial clustering vs. heterogeneity), to place DF-based priors on smoothing parameters, and to describe how a smoothed effect competes with other effects. It also avoids difficulties with the most common definition of DF for residuals. We conclude by comparing DF to the effective number of parameters p(D) of Spiegelhalter et al (2002). Technical appendices and a dataset are available online as supplemental materials.
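A minimal numerical illustration of degrees of freedom for a linear smoother (a generic ridge-type example, not the paper's exact hierarchical partition): DF is the trace of the smoothing matrix, equals the number of parameters with no smoothing, and shrinks as smoothing increases.

```python
import numpy as np

def smoother_df(X, lam):
    """Degrees of freedom of a ridge-type linear smoother:
    DF = trace(S), where S = X (X'X + lam*I)^(-1) X'."""
    n, p = X.shape
    S = X @ np.linalg.inv(X.T @ X + lam * np.eye(p)) @ X.T
    return np.trace(S)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))

df0 = smoother_df(X, 0.0)          # no smoothing: DF equals p = 5
df_strong = smoother_df(X, 100.0)  # heavy smoothing: DF shrinks toward 0
```

The residual DF is then n minus the fit's DF, which is the partition the abstract generalizes to individual effects.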

  7. [Errors in medicine. Causes, impact and improvement measures to improve patient safety].

    PubMed

    Waeschle, R M; Bauer, M; Schmidt, C E

    2015-09-01

    The guarantee of quality of care and patient safety is of major importance in hospitals even though increased economic pressure and work intensification are ubiquitously present. Nevertheless, adverse events still occur in 3-4 % of hospital stays and of these 25-50 % are estimated to be avoidable. The identification of possible causes of error and the development of measures for the prevention of medical errors are essential for patient safety. The implementation and continuous development of a constructive culture of error tolerance are fundamental. The origins of errors can be differentiated into systemic latent and individual active causes, and components of both categories are typically involved when an error occurs. Systemic causes are, for example, out-of-date structural environments, lack of clinical standards and low personnel density. These causes arise far away from the patient, e.g. in management decisions, and can remain unrecognized for a long time. Individual causes involve, e.g., confirmation bias, fixation error and prospective memory failure. These causes have a direct impact on patient care and can result in immediate injury to patients. Stress, unclear information, complex systems and a lack of professional experience can promote individual causes. Awareness of possible causes of error is a fundamental precondition to establishing appropriate countermeasures. Error prevention should include actions directly affecting the causes of error, such as checklists and standard operating procedures (SOP) to avoid fixation and prospective memory failure, and team resource management to improve communication and the generation of collective mental models. Critical incident reporting systems (CIRS) provide the opportunity to learn from previous incidents without resulting in injury to patients. Information technology (IT) support systems, such as the computerized physician order entry system, assist in the prevention of medication errors by providing information on dosage, pharmacological interactions, side effects and contraindications of medications. The major challenges for quality and risk management, for the heads of departments and for the executive board are the implementation and support of the described actions and sustained guidance of the staff involved in the change management process. The global trigger tool is suitable for improving transparency and objectifying the frequency of medical errors.

  8. Threat interferes with response inhibition.

    PubMed

    Hartikainen, Kaisa M; Siiskonen, Anna R; Ogawa, Keith H

    2012-05-09

    A potential threat, such as a spider, captures attention and engages executive functions to adjust ongoing behavior and avoid danger. We and many others have reported slowed responses to neutral targets in the context of emotional distractors. This behavioral slowing has been explained in the framework of attentional competition for limited resources, with emotional stimuli prioritized. Alternatively, slowed performance could reflect the activation of avoidance/freezing-type motor behaviors associated with threat. Although the interaction of attention and emotion has been widely studied, little is known about the interaction between emotion and executive functions. We studied how threat-related stimuli (spiders) interact with executive performance and whether the interaction profile fits a resource competition model or avoidance/freezing-type motor behaviors. Twenty-one young healthy individuals performed a Go-NoGo visual discrimination reaction time (RT) task engaging several executive functions, with threat-related and emotionally neutral distractors. The threat-related distractors had no effect on the RT or the error rate in the Go trials. The NoGo error rate, reflecting failure in response inhibition, increased significantly with threat-related distractors in contrast to neutral distractors (p < 0.05). Thus, threat-related distractors temporarily impaired response inhibition. The pattern of increased commission errors with no effect on RT does not suggest engagement of avoidance/freezing-type motor behaviors. The results fit the framework of the resource competition model. A potential threat calls for evaluation of affective significance as well as inhibition of undue emotional reactivity. We suggest that these functions tax executive resources and may render other executive functions, such as response inhibition, temporarily compromised when the demands for resources exceed availability.

  9. Errors in otology.

    PubMed

    Kartush, J M

    1996-11-01

    Practicing medicine successfully requires that errors in diagnosis and treatment be minimized. Malpractice laws encourage litigators to ascribe all medical errors to incompetence and negligence. There are, however, many other causes of unintended outcomes. This article describes common causes of errors and suggests ways to minimize mistakes in otologic practice. Widespread dissemination of knowledge about common errors and their precursors can reduce the incidence of their occurrence. Consequently, laws should be passed to allow for a system of non-punitive, confidential reporting of errors and "near misses" that can be shared by physicians nationwide.

  10. Syntactic and semantic errors in radiology reports associated with speech recognition software.

    PubMed

    Ringler, Michael D; Goss, Brian C; Bartholmai, Brian J

    2017-03-01

    Speech recognition software can increase the frequency of errors in radiology reports, which may affect patient care. We retrieved 213,977 speech recognition software-generated reports from 147 different radiologists and proofread them for errors. Errors were classified as "material" if they were believed to alter interpretation of the report. "Immaterial" errors were subclassified as intrusion/omission or spelling errors. The proportion of errors and error type were compared among individual radiologists, imaging subspecialty, and time periods. In all, 20,759 reports (9.7%) contained errors, of which 3992 (1.9%) were material errors. Among immaterial errors, spelling errors were more common than intrusion/omission errors (p < .001). Proportion of errors and fraction of material errors varied significantly among radiologists and between imaging subspecialties (p < .001). Errors were more common in cross-sectional reports, reports reinterpreting results of outside examinations, and procedural studies (all p < .001). Error rate decreased over time (p < .001), which suggests that a quality control program with regular feedback may reduce errors.
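Comparisons of error proportions like those in this abstract are typically done with a two-proportion test. The sketch below uses the standard normal-approximation z-test with hypothetical counts (the abstract does not give per-period totals, so these numbers are illustrative only).

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for comparing two error
    proportions (pooled normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: reports with errors in an early vs. a late period.
z, p = two_proportion_z(1200, 10000, 900, 10000)
```

A 12% vs. 9% error rate over 10,000 reports each yields a highly significant difference, consistent with the kind of "error rate decreased over time (p < .001)" finding reported.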

  11. A switching formation strategy for obstacle avoidance of a multi-robot system based on robot priority model.

    PubMed

    Dai, Yanyan; Kim, YoonGu; Wee, SungGil; Lee, DongHa; Lee, SukGyu

    2015-05-01

    This paper describes a switching formation strategy for multi-robots with velocity constraints to avoid and cross obstacles. In the strategy, a leader robot plans a safe path using the geometric obstacle avoidance control method (GOACM). By calculating new desired distances and bearing angles with respect to the leader robot, the follower robots switch into a safe formation. Considering collision avoidance, a novel robot priority model, based on the desired distance and bearing angle between the leader and follower robots, is designed for the obstacle avoidance process. The adaptive tracking control algorithm guarantees that the trajectory and velocity tracking errors converge to zero. To demonstrate the validity of the proposed methods, simulation and experimental results show that multi-robots effectively form and switch formation while avoiding obstacles without collisions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
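The distance-and-bearing quantities that drive the formation switch can be sketched in a few lines (an illustrative geometry helper, not the paper's GOACM controller; coordinates and heading are hypothetical):

```python
from math import atan2, hypot, pi

def distance_bearing(leader, follower, leader_heading):
    """Separation distance and bearing angle of a follower relative to
    the leader's heading (radians), as used in leader-follower formations."""
    dx = follower[0] - leader[0]
    dy = follower[1] - leader[1]
    dist = hypot(dx, dy)
    bearing = atan2(dy, dx) - leader_heading
    # wrap the bearing to (-pi, pi]
    while bearing <= -pi:
        bearing += 2 * pi
    while bearing > pi:
        bearing -= 2 * pi
    return dist, bearing

# Follower one unit behind-left of a leader heading along +x.
d, b = distance_bearing((0.0, 0.0), (-1.0, -1.0), leader_heading=0.0)
```

Switching formation then amounts to commanding each follower a new (distance, bearing) pair and letting the tracking controller drive the errors to zero.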

  12. Using warnings to reduce categorical false memories in younger and older adults.

    PubMed

    Carmichael, Anna M; Gutchess, Angela H

    2016-07-01

    Warnings about memory errors can reduce their incidence, although past work has largely focused on associative memory errors. The current study sought to explore whether warnings could be tailored to specifically reduce false recall of categorical information in both younger and older populations. Before encoding word pairs designed to induce categorical false memories, half of the younger and older participants were warned to avoid committing these types of memory errors. Older adults who received a warning committed fewer categorical memory errors, as well as other types of semantic memory errors, than those who did not receive a warning. In contrast, young adults' memory errors did not differ for the warning versus no-warning groups. Our findings provide evidence for the effectiveness of warnings at reducing categorical memory errors in older adults, perhaps by supporting source monitoring, reduction in reliance on gist traces, or through effective metacognitive strategies.

  13. Observational study of child restraining practice on Norwegian high-speed roads: restraint misuse poses a major threat to child passenger safety.

    PubMed

    Skjerven-Martinsen; Naess, P A; Hansen, T B; Staff, T; Stray-Pedersen, A

    2013-10-01

    Restraint misuse and other occupant safety errors are the major cause of fatal and severe injuries among child passengers in motor vehicle collisions. The main objectives of the present study were to provide estimates of restraining practice among children younger than 16 years traveling on Norwegian high-speed roads, and to uncover the high-risk groups associated with restraint misuse and other safety errors. A cross-sectional observational study was performed in conjunction with regular traffic control posts on high-speed roads. The seating and restraining of child occupants younger than 16 years were observed, the interior environment of the vehicles was examined, and a structured interview of the driver was conducted according to a specific protocol. In total, 1260 child occupants aged 0-15 years were included in the study. Misuse of restraints was observed in 38% of cases, with this being severe or critical in 24%. The presence of restraint misuse varied significantly with age (p<0.001), with the frequency being highest among child occupants in the age group 4-7 years. The most common error in this group was improperly routed seat belts. The highest frequency of severe and critical errors was observed among child occupants in the age group 0-3 years. The most common errors were loose or improperly routed harness straps and incorrect installations of the child restraint system. Moreover, 24% of the children were seated in vehicles with heavy, unsecured objects in the passenger compartment and/or the trunk that were likely to move into the compartment upon impact and cause injury. No totally unrestrained children were observed. This study provides a detailed description of the characteristics of restraint misuse and the occupants' exposure to unsecured objects. Future education and awareness campaigns should focus on children aged <8 years. The main challenges are to ensure correct routing and tightness of harness straps and seat belts, correct installation of child restraints, and avoidance of premature graduation from child restraints to seat belts only. Information campaigns should also advocate the use of chest clips and address the potential risks of hard, heavy objects in the passenger compartment and the importance of the placement and strapping of heavy objects in the trunk. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Errors Analysis of Solving Linear Inequalities among the Preparatory Year Students at King Saud University

    ERIC Educational Resources Information Center

    El-khateeb, Mahmoud M. A.

    2016-01-01

    The purpose of this study is to investigate the classes of errors made by preparatory year students at King Saud University, through analysis of student responses to the test items, and to identify the varieties of common errors and the ratios at which common errors occurred in solving inequalities. In the collection of the data,…

  15. Förster resonance energy transfer (FRET)-based picosecond lifetime reference for instrument response evaluation

    NASA Astrophysics Data System (ADS)

    Luchowski, R.; Kapusta, P.; Szabelski, M.; Sarkar, P.; Borejdo, J.; Gryczynski, Z.; Gryczynski, I.

    2009-09-01

    Förster resonance energy transfer (FRET) can be utilized to achieve ultrashort fluorescence responses in time-domain fluorometry. In a poly(vinyl alcohol) matrix, the presence of 60 mM Rhodamine 800 acceptor shortens the fluorescence lifetime of a pyridine 1 donor to about 20 ps. Such a fast fluorescence response is very similar to the instrument response function (IRF) obtained using scattered excitation light. A solid fluorescent sample (e.g., a film) with a picosecond lifetime is ideal for IRF measurements and particularly useful for time-resolved microscopy. Avalanche photodiode detectors, commonly used in this field, feature color-dependent timing responses. We demonstrate that recording the fluorescence decay of the proposed FRET-based reference sample yields a better IRF approximation than the conventional light-scattering method and therefore avoids systematic errors in decay curve analysis.
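The lifetime shortening exploited here follows the standard Förster relations (the general textbook formulas, not parameters specific to this record):

```latex
E = \frac{R_0^6}{R_0^6 + r^6}, \qquad \tau_{DA} = \tau_D\,(1 - E)
```

where $E$ is the transfer efficiency, $r$ the donor-acceptor distance, $R_0$ the Förster radius, $\tau_D$ the donor lifetime without acceptor, and $\tau_{DA}$ the shortened donor lifetime; a high acceptor concentration drives $E$ toward 1 and $\tau_{DA}$ into the picosecond range.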

  16. Adaptive zooming in X-ray computed tomography.

    PubMed

    Dabravolski, Andrei; Batenburg, Kees Joost; Sijbers, Jan

    2014-01-01

    In computed tomography (CT), the source-detector system commonly rotates around the object in a circular trajectory. Such a trajectory does not allow the detector to be fully exploited when scanning elongated objects. The aim was to increase the spatial resolution of the reconstructed image by zooming optimally during scanning. A new approach is proposed in which the full width of the detector is exploited for every projection angle. This approach is based on the use of prior information about the object's convex hull to move the source as close as possible to the object while avoiding truncation of the projections. Experiments show that the proposed approach can significantly improve reconstruction quality, producing reconstructions with smaller errors and revealing more details in the object. The proposed approach can lead to more accurate reconstructions and increased spatial resolution in the object compared with the conventional circular trajectory.

  17. Plasma equilibrium control during slow plasma current quench with avoidance of plasma-wall interaction in JT-60U

    NASA Astrophysics Data System (ADS)

    Yoshino, R.; Nakamura, Y.; Neyatani, Y.

    1997-08-01

    In JT-60U a vertical displacement event (VDE) is observed during slow plasma current quench (Ip quench) for a vertically elongated divertor plasma with a single null. The VDE is caused by an error in the feedback control of the vertical position of the plasma current centre (ZJ). It has been completely avoided by improving the accuracy of the real-time ZJ measurement. Furthermore, plasma-wall interaction has been successfully avoided during slow Ip quench owing to the good performance of the plasma equilibrium control system.

  18. Hospital-based transfusion error tracking from 2005 to 2010: identifying the key errors threatening patient transfusion safety.

    PubMed

    Maskens, Carolyn; Downie, Helen; Wendt, Alison; Lima, Ana; Merkley, Lisa; Lin, Yulia; Callum, Jeannie

    2014-01-01

    This report provides a comprehensive analysis of transfusion errors occurring at a large teaching hospital and aims to determine key errors that are threatening transfusion safety, despite implementation of safety measures. Errors were prospectively identified from 2005 to 2010. Error data were coded on a secure online database called the Transfusion Error Surveillance System. Errors were defined as any deviation from established standard operating procedures. Errors were identified by clinical and laboratory staff. Denominator data for volume of activity were used to calculate rates. A total of 15,134 errors were reported with a median number of 215 errors per month (range, 85-334). Overall, 9083 (60%) errors occurred on the transfusion service and 6051 (40%) on the clinical services. In total, 23 errors resulted in patient harm: 21 of these errors occurred on the clinical services and two in the transfusion service. Of the 23 harm events, 21 involved inappropriate use of blood. Errors with no harm were 657 times more common than events that caused harm. The most common high-severity clinical errors were sample labeling (37.5%) and inappropriate ordering of blood (28.8%). The most common high-severity error in the transfusion service was sample accepted despite not meeting acceptance criteria (18.3%). The cost of product and component loss due to errors was $593,337. Errors occurred at every point in the transfusion process, with the greatest potential risk of patient harm resulting from inappropriate ordering of blood products and errors in sample labeling. © 2013 American Association of Blood Banks (CME).

  19. Potential effects of reward and loss avoidance in overweight adolescents.

    PubMed

    Reyes, Sussanne; Peirano, Patricio; Luna, Beatriz; Lozoff, Betsy; Algarín, Cecilia

    2015-08-01

    Reward system and inhibitory control are brain functions that exert an influence on eating behavior regulation. We studied the differences in inhibitory control and sensitivity to reward and loss avoidance between overweight/obese and normal-weight adolescents. We assessed 51 overweight/obese and 52 normal-weight 15-y-old Chilean adolescents. The groups were similar regarding sex and intelligence quotient. Using Antisaccade and Incentive tasks, we evaluated inhibitory control and the effect of incentive trials (neutral, loss avoidance, and reward) on generating correct and incorrect responses (latency and error rate). Compared to normal-weight group participants, overweight/obese adolescents showed shorter latency for incorrect antisaccade responses (186.0 (95% CI: 176.8-195.2) vs. 201.3 ms (95% CI: 191.2-211.5), P < 0.05) and better performance reflected by lower error rate in incentive trials (43.6 (95% CI: 37.8-49.4) vs. 53.4% (95% CI: 46.8-60.0), P < 0.05). Overweight/obese adolescents were more accurate on loss avoidance (40.9 (95% CI: 33.5-47.7) vs. 49.8% (95% CI: 43.0-55.1), P < 0.05) and reward (41.0 (95% CI: 34.5-47.5) vs. 49.8% (95% CI: 43.0-55.1), P < 0.05) compared to neutral trials. Overweight/obese adolescents showed shorter latency for incorrect responses and greater accuracy in reward and loss avoidance trials. These findings could suggest that an imbalance of inhibition and reward systems influence their eating behavior.

  20. Phaeochromocytoma.

    PubMed

    Cryer, P E

    1985-02-01

    Phaeochromocytomas are uncommon among patients with hypertension, and sometimes occur in persons without known hypertension, but are important to detect because they are often lethal but commonly curable, and because they are a clue to the presence of associated conditions. Paroxysmal symptoms (especially headache, palpitations, diaphoresis and anxiety), hypertension that is intermittent, unusually labile or resistant to conventional therapy, and conditions known to be associated raise the clinical suspicion of phaeochromocytoma. Biochemical confirmation is commonly achieved by measurement of urinary catecholamines, metanephrines or VMA. Plasma noradrenaline and adrenaline measurements may be superior to measurements of urinary catecholamine metabolites, but strict attention to the details of sample collection, handling and storage, the many sources of possible biological variation and the effects of drugs is critical if diagnostic error is to be avoided. Patients should be evaluated in the drug-free state if at all possible. Anatomical localization, in the abdomen in the vast majority of cases and usually in the adrenal medullae, can generally be accomplished with computed tomographic scans. Bilateral adrenomedullary tumours are the rule in familial phaeochromocytoma. Most phaeochromocytomas are benign and can be excised totally after medical preparation with an alpha-adrenergic antagonist.

  1. Multiple Linear Regression for Reconstruction of Gene Regulatory Networks in Solving Cascade Error Problems

    PubMed Central

    Zainudin, Suhaila; Arif, Shereena M.

    2017-01-01

    Gene regulatory network (GRN) reconstruction is the process of identifying regulatory gene interactions from experimental data through computational analysis. One of the main reasons for the reduced performance of previous GRN methods has been inaccurate prediction of cascade motifs. Cascade error is defined as the wrong prediction of cascade motifs, where an indirect interaction is misinterpreted as a direct interaction. Despite the active research on various GRN prediction methods, the discussion of specific methods to solve problems related to cascade errors is still lacking. In fact, the experiments conducted in past studies were not specifically geared towards proving the ability of GRN prediction methods to avoid the occurrence of cascade errors. Hence, this research proposes Multiple Linear Regression (MLR) to infer GRNs from gene expression data and to avoid wrongly inferring an indirect interaction (A → B → C) as a direct interaction (A → C). Since the number of observations in the real experimental datasets was far less than the number of predictors, some predictors were eliminated by extracting random subnetworks from global interaction networks via an established extraction method. In addition, the experiment was extended to assess the effectiveness of MLR in dealing with cascade error by using a novel experimental procedure proposed in this work. The experiment revealed that the number of cascade errors was minimal. Apart from that, the Belsley collinearity test proved that multicollinearity greatly affected the datasets used in this experiment. All the tested subnetworks obtained satisfactory results, with AUROC values above 0.5. PMID:28250767
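The cascade-error idea can be demonstrated on simulated data: in a chain A → B → C, plain correlation suggests a strong A → C link, while a multiple linear regression of C on both A and B gives A a near-zero partial coefficient. This is a minimal sketch of the principle, not the paper's full pipeline; the coefficients and noise levels are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
a = rng.standard_normal(n)
b = 0.9 * a + 0.1 * rng.standard_normal(n)  # A -> B
c = 0.9 * b + 0.1 * rng.standard_normal(n)  # B -> C (A -> C only indirectly)

# Regress C on both candidate regulators: the partial coefficient for A
# is near zero, avoiding the cascade error A -> C.
X = np.column_stack([a, b])
coef, *_ = np.linalg.lstsq(X, c, rcond=None)
coef_a, coef_b = coef

# Plain correlation, by contrast, wrongly suggests a strong direct A -> C link.
corr_ac = np.corrcoef(a, c)[0, 1]
```

The contrast between the large marginal correlation and the negligible partial regression coefficient is exactly why regression-based inference can suppress cascade errors.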

  2. Economic turmoil, new administration to affect revenue cycle in 2009.

    PubMed

    2009-01-01

    Healthcare revenue cycle leaders will face some pressing issues in 2009, including continuing economic turmoil, increasing numbers of underinsured patients, avoiding unreimbursable medical errors, and the implementation of ICD-10.

  3. Optimization of isotherm models for pesticide sorption on biopolymer-nanoclay composite by error analysis.

    PubMed

    Narayanan, Neethu; Gupta, Suman; Gajbhiye, V T; Manjaiah, K M

    2017-04-01

    A carboxy methyl cellulose-nano organoclay (nano montmorillonite modified with 35-45 wt % dimethyl dialkyl (C14-C18) amine (DMDA)) composite was prepared by the solution intercalation method. The prepared composite was characterized by infrared spectroscopy (FTIR), X-ray diffraction (XRD) and scanning electron microscopy (SEM). The composite was evaluated for its pesticide sorption efficiency for atrazine, imidacloprid and thiamethoxam. The sorption data were fitted to the Langmuir and Freundlich isotherms using linear and nonlinear methods. The linear regression method suggested the best fit of the sorption data to the Type II Langmuir and Freundlich isotherms. In order to avoid the bias resulting from linearization, seven different error parameters were also analyzed by the nonlinear regression method. The nonlinear error analysis suggested that the sorption data fitted the Langmuir model better than the Freundlich model. The maximum sorption capacity, Q0 (μg/g), was highest for imidacloprid (2000), followed by thiamethoxam (1667) and atrazine (1429). The study suggests that the coefficient of determination from linear regression alone cannot be used for comparing the fit of the Langmuir and Freundlich models, and nonlinear error analysis needs to be done to avoid inaccurate results. Copyright © 2017 Elsevier Ltd. All rights reserved.
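Nonlinear fitting of the Langmuir isotherm, as favored by the error analysis above, can be sketched with `scipy.optimize.curve_fit`. The synthetic data below use illustrative parameter values (Q0 = 2000 μg/g, b = 0.05), not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, Q0, b):
    """Langmuir isotherm: sorbed amount q as a function of the
    equilibrium concentration C, with capacity Q0 and affinity b."""
    return Q0 * b * C / (1 + b * C)

# Synthetic sorption data generated from known parameters.
C = np.linspace(0.5, 50, 20)
q = langmuir(C, 2000.0, 0.05)

# Nonlinear least-squares fit, avoiding the bias of linearized forms.
popt, _ = curve_fit(langmuir, C, q, p0=[1000.0, 0.01])
Q0_fit, b_fit = popt
```

Fitting the nonlinear form directly avoids the error-structure distortion that linearization (e.g., the reciprocal Langmuir plot) introduces, which is the point the abstract makes.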

  4. Case report of a near medical event in stereotactic radiotherapy due to improper units of measure from a treatment planning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gladstone, D. J.; Li, S.; Jarvis, L. A.

    2011-07-15

    Purpose: The authors hereby notify the Radiation Oncology community of a potentially lethal error due to improper implementation of linear units of measure in a treatment planning system. The authors report an incident in which a patient was nearly mistreated during a stereotactic radiotherapy procedure due to inappropriate reporting of stereotactic coordinates by the radiation therapy treatment planning system in units of centimeter rather than in millimeter. The authors suggest a method to detect such errors during treatment planning so they are caught and corrected prior to the patient positioning for treatment on the treatment machine. Methods: Using pretreatment imaging, the authors found that stereotactic coordinates are reported with improper linear units by a treatment planning system. The authors have implemented a redundant, independent method of stereotactic coordinate calculation. Results: Implementation of a double check of stereotactic coordinates via redundant, independent calculation is simple and accurate. Use of this technique will avoid any future error in stereotactic treatment coordinates due to improper linear units, transcription, or other similar errors. Conclusions: The authors recommend an independent double check of stereotactic treatment coordinates during the treatment planning process in order to avoid potential mistreatment of patients.
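The recommended redundant double check can be sketched as a comparison between the planning system's coordinates and an independently recomputed set, with an explicit flag for the factor-of-10 signature of a cm-vs-mm mix-up. The function and the coordinate values are hypothetical illustrations, not the authors' actual procedure.

```python
def check_coordinates(planned_mm, independent_mm, tol_mm=1.0):
    """Redundant double check of stereotactic coordinates: report any
    axis where the two calculations disagree beyond tol_mm, and flag
    the classic cm-vs-mm factor-of-10 discrepancy."""
    problems = []
    for axis, p, q in zip("xyz", planned_mm, independent_mm):
        if abs(p - q) > tol_mm:
            note = (" (possible cm/mm unit error)"
                    if q != 0 and abs(p / q - 0.1) < 0.01 else "")
            problems.append(f"{axis}: planned {p} mm vs independent {q} mm{note}")
    return problems

# Hypothetical case: the planning system emitted cm values labeled as mm.
issues = check_coordinates((12.3, 4.56, 7.89), (123.0, 45.6, 78.9))
```

An empty return list would clear the plan; any entry halts treatment setup for investigation.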

  5. Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems

    NASA Technical Reports Server (NTRS)

    Song, Lixia; Kuchar, James K.

    2003-01-01

    Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. 
In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.
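The state-space view described above can be sketched informally: two alerting logics map the same process state to an alert stage, and dissonance is the region of state space where their outputs disagree. The threshold logics below are invented for illustration and are not the TCAS or ACM alerting criteria.

```python
# Hypothetical sketch of dissonance between two alerting systems that
# observe the same state (range in nm, closure rate in kt). Thresholds
# are illustrative assumptions, not real avionics logic.

def alert_a(range_nm, closure_kt):
    """System A: alert when projected time to zero range is short."""
    if closure_kt <= 0:
        return "NONE"
    tau = range_nm / closure_kt * 3600.0  # seconds until ranges meet
    return "ALERT" if tau < 40.0 else "NONE"

def alert_b(range_nm, closure_kt):
    """System B: alert on range alone, ignoring closure rate."""
    return "ALERT" if range_nm < 1.5 else "NONE"

def dissonant_states(states):
    """Return the subset of states where the two logics disagree."""
    return [s for s in states if alert_a(*s) != alert_b(*s)]

states = [(r, c) for r in (0.5, 1.0, 2.0, 4.0) for c in (-100.0, 200.0, 400.0)]
print(dissonant_states(states))
```

Enumerating the grid this way mirrors the paper's identification of a "dissonance space": here the logics disagree both for close, diverging traffic (B alerts, A does not) and for distant, fast-closing traffic (A alerts, B does not).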

  6. Investigation of technology needs for avoiding helicopter pilot error related accidents

    NASA Technical Reports Server (NTRS)

    Chais, R. I.; Simpson, W. E.

    1985-01-01

Pilot error, which is cited as a cause or related factor in most rotorcraft accidents, was examined. Pilot-error-related helicopter accidents were investigated to identify areas in which new technology could reduce or eliminate the underlying causes of these human errors. The aircraft accident data base at the U.S. Army Safety Center was used as the source of data on helicopter accidents. A randomly selected sample of 110 aircraft records was analyzed on a case-by-case basis to assess the nature of the problems that need to be resolved and the applicable technology implications. Six technology areas in which there appears to be a need for new or increased emphasis are identified.

  7. Trajectory specification for high capacity air traffic control

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A. (Inventor)

    2010-01-01

Method and system for analyzing and processing information on one or more aircraft flight paths, using a four-dimensional coordinate system including three Cartesian or equivalent coordinates (x, y, z) and a fourth coordinate δ that corresponds to a distance estimated along a reference flight path to a nearest reference path location corresponding to a present location of the aircraft. Use of the coordinate δ, rather than elapsed time t, avoids coupling of along-track error into aircraft altitude and reduces effects of errors on an aircraft landing site. Along-track, cross-track and/or altitude errors are estimated and compared with a permitted error bounding space surrounding the reference flight path.
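The geometry behind the abstract can be illustrated for the simplest case of a straight reference path: project the aircraft position onto the path to get the along-path distance (the role played by the fourth coordinate δ), then compute cross-track and altitude errors and test them against a permitted bound. This is an illustrative sketch, not the patented method; the function names and bounds are assumptions.

```python
# Sketch: along-track / cross-track / altitude errors relative to a
# straight, level reference path (a simplification of the patent's
# general reference flight path).
import math

def path_errors(pos, path_start, path_end):
    """Return (along_track, cross_track, altitude_error)."""
    dx = path_end[0] - path_start[0]
    dy = path_end[1] - path_start[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length            # unit vector along path
    rx, ry = pos[0] - path_start[0], pos[1] - path_start[1]
    along = rx * ux + ry * uy                    # delta-like coordinate
    cross = rx * (-uy) + ry * ux                 # signed lateral deviation
    alt_err = pos[2] - path_start[2]             # level path assumed
    return along, cross, alt_err

def within_bounds(errors, cross_max, alt_max):
    """Check cross-track and altitude errors against a permitted bound."""
    _, cross, alt_err = errors
    return abs(cross) <= cross_max and abs(alt_err) <= alt_max

e = path_errors((3.0, 0.4, 50.0), (0.0, 0.0, 0.0), (10.0, 0.0, 0.0))
print(e, within_bounds(e, cross_max=0.5, alt_max=100.0))
```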

  8. Intrusion errors in visuospatial working memory performance.

    PubMed

    Cornoldi, Cesare; Mammarella, Nicola

    2006-02-01

    This study tested the hypothesis that failure in active visuospatial working memory tasks involves a difficulty in avoiding intrusions due to information that is already activated. Two experiments are described, in which participants were required to process several series of locations on a 4 x 4 matrix and then to produce only the final location of each series. Results revealed a higher number of errors due to already activated locations (intrusions) compared with errors due to new locations (inventions). Moreover, when participants were required to pay extra attention to some irrelevant (non-final) locations by tapping on the table, intrusion errors increased. Results are discussed in terms of current models of working memory functioning.

  9. Multiple Two-Way Time Message Exchange (TTME) Time Synchronization for Bridge Monitoring Wireless Sensor Networks

    PubMed Central

    Shi, Fanrong; Tuo, Xianguo; Yang, Simon X.; Li, Huailiang; Shi, Rui

    2017-01-01

Wireless sensor networks (WSNs) have been widely used to collect valuable information in Structural Health Monitoring (SHM) of bridges, using various sensors, such as temperature, vibration and strain sensors. Since multiple sensors are distributed on the bridge, accurate time synchronization is very important for multi-sensor data fusion and information processing. Based on the shape of the bridge, a spanning tree is employed in this paper to build linear-topology WSNs and achieve time synchronization. Two-way time message exchange (TTME) and maximum likelihood estimation (MLE) are employed for clock offset estimation. Multiple TTMEs are proposed to obtain a subset of TTME observations. A timeout restriction and retry mechanism are employed to avoid the estimation errors caused by continuous clock offset and software latencies. The simulation results show that the proposed algorithm can avoid the estimation errors caused by clock drift and minimize the estimation error due to large random delay jitter. The proposed algorithm is an accurate, low-complexity time synchronization algorithm for bridge health monitoring. PMID:28471418
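The core estimator can be sketched briefly. In one TTME round with timestamps T1 (send, node A clock), T2 (receive, node B clock), T3 (reply, node B clock) and T4 (receive, node A clock), the offset of B relative to A is ((T2 − T1) − (T4 − T3)) / 2 under symmetric link delays; with Gaussian delay jitter, the MLE over several rounds reduces to the mean of the per-round estimates. The timestamp values below are invented for the demo.

```python
# Minimal sketch of TTME clock-offset estimation with MLE over
# multiple exchanges (assuming symmetric delays and Gaussian jitter).

def offset_one(t1, t2, t3, t4):
    """Clock offset of the replying node from a single TTME round."""
    return ((t2 - t1) - (t4 - t3)) / 2.0

def offset_mle(exchanges):
    """MLE under Gaussian jitter: mean over multiple TTME rounds."""
    estimates = [offset_one(*ex) for ex in exchanges]
    return sum(estimates) / len(estimates)

# (T1, T2, T3, T4) tuples; the true offset here is about +5 time units.
rounds = [
    (0.0, 6.0, 7.0, 3.0),
    (10.0, 15.8, 16.8, 12.6),
    (20.0, 25.4, 26.4, 22.2),
]
print(offset_mle(rounds))
```

The paper's timeout-and-retry mechanism would discard rounds whose estimates are corrupted by software latency before they enter this average.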

  11. An Alternative Time Metric to Modified Tau for Unmanned Aircraft System Detect And Avoid

    NASA Technical Reports Server (NTRS)

    Wu, Minghong G.; Bageshwar, Vibhor L.; Euteneuer, Eric A.

    2017-01-01

    A new horizontal time metric, Time to Protected Zone, is proposed for use in the Detect and Avoid (DAA) Systems equipped by unmanned aircraft systems (UAS). This time metric has three advantages over the currently adopted time metric, modified tau: it corresponds to a physical event, it is linear with time, and it can be directly used to prioritize intruding aircraft. The protected zone defines an area around the UAS that can be a function of each intruding aircraft's surveillance measurement errors. Even with its advantages, the Time to Protected Zone depends explicitly on encounter geometry and may be more sensitive to surveillance sensor errors than modified tau. To quantify its sensitivity, simulation of 972 encounters using realistic sensor models and a proprietary fusion tracker is performed. Two sensitivity metrics, the probability of time reversal and the average absolute time error, are computed for both the Time to Protected Zone and modified tau. Results show that the sensitivity of the Time to Protected Zone is comparable to that of modified tau if the dimensions of the protected zone are adequately defined.
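The contrast between the two metrics can be sketched numerically. Modified tau is commonly written as −(r² − DMOD²)/(r·ṙ), with range r, range rate ṙ (negative when closing) and a threshold distance DMOD; the Time to Protected Zone is approximated here as the time for range to shrink to the protected-zone radius at the current closure rate, which makes it linear in time as the abstract notes. The exact NASA formulations may differ, and the radii and rates below are illustrative assumptions.

```python
# Sketch comparing modified tau with a straight-line Time to Protected
# Zone for a closing encounter. Units: metres and metres/second.

def modified_tau(r, rdot, dmod):
    """Modified tau in seconds; infinite when the intruder is not closing."""
    if rdot >= 0:
        return float("inf")
    return -(r * r - dmod * dmod) / (r * rdot)

def time_to_protected_zone(r, rdot, r_pz):
    """Time until range reaches the protected-zone radius."""
    if r <= r_pz:
        return 0.0                      # already inside the protected zone
    if rdot >= 0:
        return float("inf")             # diverging: never reaches the zone
    return (r - r_pz) / (-rdot)

r, rdot = 6000.0, -100.0                # 6 km out, closing at 100 m/s
print(modified_tau(r, rdot, dmod=1000.0))
print(time_to_protected_zone(r, rdot, r_pz=1000.0))
```

With these numbers the Time to Protected Zone (50 s) corresponds to the physical event of zone penetration, while modified tau (about 58 s) does not.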

  12. Locked-mode avoidance and recovery without momentum input

    NASA Astrophysics Data System (ADS)

    Delgado-Aparicio, L.; Rice, J. E.; Wolfe, S.; Cziegler, I.; Gao, C.; Granetz, R.; Wukitch, S.; Terry, J.; Greenwald, M.; Sugiyama, L.; Hubbard, A.; Hugges, J.; Marmar, E.; Phillips, P.; Rowan, W.

    2015-11-01

    Error-field-induced locked-modes (LMs) have been studied in Alcator C-Mod at ITER-Bϕ, without NBI fueling and momentum input. Delay of the mode-onset and locked-mode recovery has been successfully obtained without external momentum input using Ion Cyclotron Resonance Heating (ICRH). The use of external heating in-sync with the error-field ramp-up resulted in a successful delay of the mode-onset when PICRH > 1 MW, which demonstrates the existence of a power threshold to ``unlock'' the mode; in the presence of an error field the L-mode discharge can transition into H-mode only when PICRH > 2 MW and at high densities, avoiding also the density pump-out. The effects of ion heating observed on unlocking the core plasma may be due to ICRH induced flows in the plasma boundary, or modifications of plasma profiles that changed the underlying turbulence. This work was performed under US DoE contracts including DE-FC02-99ER54512 and others at MIT, DE-FG03-96ER-54373 at University of Texas at Austin, and DE-AC02-09CH11466 at PPPL.

  13. Dopamine Reward Prediction Error Responses Reflect Marginal Utility

    PubMed Central

    Stauffer, William R.; Lak, Armin; Schultz, Wolfram

    2014-01-01

Background: Optimal choices require an accurate neuronal representation of economic value. In economics, utility functions are mathematical representations of subjective value that can be constructed from choices under risk. Utility usually exhibits a nonlinear relationship to physical reward value that corresponds to risk attitudes and reflects the increasing or decreasing marginal utility obtained with each additional unit of reward. Accordingly, neuronal reward responses coding utility should robustly reflect this nonlinearity. Results: In two monkeys, we measured utility as a function of physical reward value from meaningful choices under risk (that adhered to first- and second-order stochastic dominance). The resulting nonlinear utility functions predicted the certainty equivalents for new gambles, indicating that the functions' shapes were meaningful. The monkeys were risk seeking (convex utility function) for low reward and risk avoiding (concave utility function) with higher amounts. Critically, the dopamine prediction error responses at the time of reward itself reflected the nonlinear utility functions measured at the time of choices. In particular, the reward response magnitude depended on the first derivative of the utility function and thus reflected the marginal utility. Furthermore, dopamine responses recorded outside of the task reflected the marginal utility of unpredicted reward. Accordingly, these responses were sufficient to train reinforcement learning models to predict the behaviorally defined expected utility of gambles. Conclusions: These data suggest a neuronal manifestation of marginal utility in dopamine neurons and indicate a common neuronal basis for fundamental explanatory constructs in animal learning theory (prediction error) and economic decision theory (marginal utility). PMID:25283778
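The central claim can be illustrated with a toy calculation: if subjective value follows a nonlinear utility function u(x), a response that tracks marginal utility scales with the first derivative u′(x) rather than with the physical amount. The concave square-root utility below is an assumption for the demo, not the function fitted to the monkeys' choices.

```python
# Toy illustration: responses proportional to marginal utility shrink
# for equal physical reward increments under a concave utility function.

def utility(x):
    return x ** 0.5              # concave: risk-avoiding for larger rewards

def marginal_utility(x, h=1e-6):
    """Numerical first derivative of the utility function."""
    return (utility(x + h) - utility(x - h)) / (2 * h)

# Equal-sized rewards produce progressively smaller marginal utility.
for amount in (0.25, 1.0, 4.0):
    print(amount, round(marginal_utility(amount), 3))
```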

  14. Evaluation and mitigation of potential errors in radiochromic film dosimetry due to film curvature at scanning

    PubMed Central

    Bradley, David A.; Nisbet, Andrew

    2015-01-01

    This work considers a previously overlooked uncertainty present in film dosimetry which results from moderate curvature of films during the scanning process. Small film samples are particularly susceptible to film curling which may be undetected or deemed insignificant. In this study, we consider test cases with controlled induced curvature of film and with film raised horizontally above the scanner plate. We also evaluate the difference in scans of a film irradiated with a typical brachytherapy dose distribution with the film naturally curved and with the film held flat on the scanner. Typical naturally occurring curvature of film at scanning, giving rise to a maximum height 1 to 2 mm above the scan plane, may introduce dose errors of 1% to 4%, and considerably reduce gamma evaluation passing rates when comparing film‐measured doses with treatment planning system‐calculated dose distributions, a common application of film dosimetry in radiotherapy. The use of a triple‐channel dosimetry algorithm appeared to mitigate the error due to film curvature compared to conventional single‐channel film dosimetry. The change in pixel value and calibrated reported dose with film curling or height above the scanner plate may be due to variations in illumination characteristics, optical disturbances, or a Callier‐type effect. There is a clear requirement for physically flat films at scanning to avoid the introduction of a substantial error source in film dosimetry. Particularly for small film samples, a compression glass plate above the film is recommended to ensure flat‐film scanning. This effect has been overlooked to date in the literature. PACS numbers: 87.55.Qr, 87.56.bg, 87.55.km PMID:26103181

  15. Causes of blindness and visual impairment in Pakistan. The Pakistan national blindness and visual impairment survey

    PubMed Central

    Dineen, B; Bourne, R R A; Jadoon, Z; Shah, S P; Khan, M A; Foster, A; Gilbert, C E; Khan, M D

    2007-01-01

    Objective To determine the causes of blindness and visual impairment in adults (⩾30 years old) in Pakistan, and to explore socio‐demographic variations in cause. Methods A multi‐stage, stratified, cluster random sampling survey was used to select a nationally representative sample of adults. Each subject was interviewed, had their visual acuity measured and underwent autorefraction and fundus/optic disc examination. Those with a visual acuity of <6/12 in either eye underwent a more detailed ophthalmic examination. Causes of visual impairment were classified according to the accepted World Health Organization (WHO) methodology. An exploration of demographic variables was conducted using regression modeling. Results A sample of 16 507 adults (95.5% of those enumerated) was examined. Cataract was the most common cause of blindness (51.5%; defined as <3/60 in the better eye on presentation) followed by corneal opacity (11.8%), uncorrected aphakia (8.6%) and glaucoma (7.1%). Posterior capsular opacification accounted for 3.6% of blindness. Among the moderately visually impaired (<6/18 to ⩾6/60), refractive error was the most common cause (43%), followed by cataract (42%). Refractive error as a cause of severe visual impairment/blindness was significantly higher in rural dwellers than in urban dwellers (odds ratio (OR) 3.5, 95% CI 1.1 to 11.7). Significant provincial differences were also identified. Overall we estimate that 85.5% of causes were avoidable and that 904 000 adults in Pakistan have cataract (<6/60) requiring surgical intervention. Conclusions This comprehensive survey provides reliable estimates of the causes of blindness and visual impairment in Pakistan. Despite expanded surgical services, cataract still accounts for over half of the cases of blindness in Pakistan. One in eight blind adults has visual loss from sequelae of cataract surgery. 
Services for refractive errors need to be further expanded and integrated into eye care services, particularly those serving rural populations. PMID:17229806

  16. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    NASA Astrophysics Data System (ADS)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States Standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, by suggesting the use of the hypergeometric distribution to calculate the parameters of sampling plans avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot quality percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise, the same question arises with consumer risk which is necessarily associated with type II error. The resolution of these questions is new to the literature. 
The article presents R code throughout.
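The paper's first point, using the exact hypergeometric distribution for finite lots sampled without replacement rather than binomial or Poisson approximations, can be sketched with a short operating-characteristic calculation (in Python here, though the article itself uses R). The plan parameters below (lot size N, sample size n, acceptance number c) are illustrative.

```python
# Exact acceptance probability for an attribute sampling plan via the
# hypergeometric distribution: accept the lot if the sample of n items
# drawn from a lot of N (containing D defectives) has at most c defectives.
from math import comb

def prob_accept(N, n, c, D):
    """P(accept) = P(at most c defectives in the sample), hypergeometric."""
    return sum(comb(D, d) * comb(N - D, n - d)
               for d in range(min(c, D) + 1)) / comb(N, n)

# Operating characteristic: acceptance probability vs. lot quality.
N, n, c = 500, 50, 1
for D in (5, 25, 50):
    print(D, round(prob_accept(N, n, c, D), 4))
```

Comparing these exact values with binomial approximations for small lots shows the discrepancies the author warns about.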

  17. Mindfulness, Physical Activity and Avoidance of Secondhand Smoke: A Study of College Students in Shanghai.

    PubMed

    Gao, Yu; Shi, Lu

    2015-08-21

To better understand the documented link between mindfulness and longevity, we examine the association between mindfulness and conscious avoidance of secondhand smoke (SHS), as well as the association between mindfulness and physical activity. At Shanghai University of Finance and Economics (SUFE) we surveyed a convenience sample of 1516 college freshmen. We measured mindfulness, weekly physical activity, and conscious avoidance of secondhand smoke, along with demographic and behavioral covariates. We used a multilevel logistic regression to test the association between mindfulness and conscious avoidance of secondhand smoke, and a Tobit regression model to test the association between mindfulness and metabolic equivalent hours per week. In both models the home province of the student respondent was used as the cluster variable, and demographic and behavioral covariates were controlled for, including age, gender, smoking history, household registration status (urban vs. rural), perceived smog frequency in the home town, and asthma diagnosis. The logistic regression of consciously avoiding SHS shows that a higher level of mindfulness was associated with an increase in the odds of conscious SHS avoidance (logged odds: 0.22, standard error: 0.07, p < 0.01). The Tobit regression shows that a higher level of mindfulness was associated with more metabolic equivalent hours per week (Tobit coefficient: 4.09, standard error: 1.13, p < 0.001). This study is an innovative attempt to study the behavioral issue of secondhand smoke from the perspective of the potential victim rather than the active smoker. The observed associational patterns are consistent with previous findings that mindfulness is associated with healthier behaviors in obesity prevention and substance use. Research designs with interventions are needed to test the causal link between mindfulness and these healthy behaviors.

  18. Mindfulness, Physical Activity and Avoidance of Secondhand Smoke: A Study of College Students in Shanghai

    PubMed Central

    Gao, Yu; Shi, Lu

    2015-01-01

Introduction: To better understand the documented link between mindfulness and longevity, we examine the association between mindfulness and conscious avoidance of secondhand smoke (SHS), as well as the association between mindfulness and physical activity. Method: At Shanghai University of Finance and Economics (SUFE) we surveyed a convenience sample of 1516 college freshmen. We measured mindfulness, weekly physical activity, and conscious avoidance of secondhand smoke, along with demographic and behavioral covariates. We used a multilevel logistic regression to test the association between mindfulness and conscious avoidance of secondhand smoke, and a Tobit regression model to test the association between mindfulness and metabolic equivalent hours per week. In both models the home province of the student respondent was used as the cluster variable, and demographic and behavioral covariates were controlled for, including age, gender, smoking history, household registration status (urban vs. rural), perceived smog frequency in the home town, and asthma diagnosis. Results: The logistic regression of consciously avoiding SHS shows that a higher level of mindfulness was associated with an increase in the odds of conscious SHS avoidance (logged odds: 0.22, standard error: 0.07, p < 0.01). The Tobit regression shows that a higher level of mindfulness was associated with more metabolic equivalent hours per week (Tobit coefficient: 4.09, standard error: 1.13, p < 0.001). Discussion: This study is an innovative attempt to study the behavioral issue of secondhand smoke from the perspective of the potential victim rather than the active smoker. The observed associational patterns are consistent with previous findings that mindfulness is associated with healthier behaviors in obesity prevention and substance use. Research designs with interventions are needed to test the causal link between mindfulness and these healthy behaviors. PMID:26308029

  19. Effect of visuospatial neglect on spatial navigation and heading after stroke.

    PubMed

    Aravind, Gayatri; Lamontagne, Anouk

    2017-06-09

Visuospatial neglect (VSN) impairs the control of locomotor heading in post-stroke individuals, which may affect their ability to safely avoid moving objects while walking. We aimed to compare VSN+ and VSN- stroke individuals in terms of changes in heading and head orientation in space while avoiding obstacles approaching from different directions and reorienting toward the final target. Stroke participants with VSN (VSN+) and without VSN (VSN-) walked in a virtual environment avoiding obstacles that approached contralesionally, head-on or ipsilesionally. Measures of obstacle avoidance (onset of heading change, maximum mediolateral deviation) and target alignment (heading and head-rotation errors with respect to the target) were compared across groups and obstacle directions. In total, 26 participants with right-hemisphere stroke participated (13 VSN+ and 13 VSN-; 24 males; mean age 60.3 years, range 48 to 72 years). A larger proportion of VSN+ (75%) than VSN- (38%) participants collided with contralesional and head-on obstacles. For VSN- participants, deviating to the same side as the obstacle was a safe strategy for avoiding diagonal obstacles, whereas deviating to the opposite side led to occasional collisions. VSN+ participants deviated ipsilesionally, displaying same-side and opposite-side strategies for ipsilesional and contralesional obstacles, respectively. Overall, VSN+ participants showed greater distances at the onset of heading change, smaller maximum mediolateral deviations and larger errors in target alignment compared with VSN- participants. The ipsilesional bias arising from VSN influences the modulation of heading in response to obstacles and, along with the adoption of "riskier" strategies, contributes to the higher number of colliders and poorer goal-directed walking abilities in stroke survivors with VSN. Future research should focus on developing assessment and training tools for complex locomotor tasks such as obstacle avoidance in this population.
Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  20. Cleared for the visual approach: Human factor problems in air carrier operations

    NASA Technical Reports Server (NTRS)

    Monan, W. P.

    1983-01-01

In the study described herein, a set of 353 ASRS reports of unique aviation occurrences significantly involving visual approaches was examined to identify hazards and pitfalls embedded in the visual approach procedure and to consider operational practices that might help avoid future mishaps. Analysis of the report set identified nine aspects of the visual approach procedure that appeared to be predisposing conditions for inducing or exacerbating the effects of operational errors by flight crew members or controllers. Predisposing conditions, errors, and operational consequences of the errors are discussed. In summary, operational policies that might mitigate the problems are examined.

  1. Using media to teach how not to do psychotherapy.

    PubMed

    Gabbard, Glen; Horowitz, Mardi

    2010-01-01

    This article describes how using media depictions of psychotherapy may help in teaching psychiatric residents. Using the HBO series In Treatment as a model, the authors suggest how boundary transgressions and technical errors may inform residents about optimal psychotherapeutic approaches. The psychotherapy vignettes depicted in In Treatment show how errors in judgment may grow out of therapists' good intentions. These errors can be understood and used constructively for teaching. With the growing interest in depicting psychotherapy on popular TV series, the use of these sessions avoids confidentiality problems and may be a useful adjunct for teaching psychotherapy.

  2. Association of medication errors with drug classifications, clinical units, and consequence of errors: Are they related?

    PubMed

    Muroi, Maki; Shen, Jay J; Angosta, Alona

    2017-02-01

Registered nurses (RNs) play an important role in safe medication administration and patient safety. This study examined a total of 1276 medication error (ME) incident reports made by RNs in hospital inpatient settings in the southwestern region of the United States. The most common drug class associated with MEs was cardiovascular drugs (24.7%); among this class, anticoagulants had the most errors (11.3%). Antimicrobials were the second most common drug class associated with errors (19.1%), and vancomycin was the most common antimicrobial causing errors in this category (6.1%). MEs occurred more frequently in the medical-surgical and intensive care units than in any other hospital units. Ten percent of MEs reached the patients with harm and 11% reached the patients with increased monitoring. Understanding the contributing factors related to MEs, addressing and eliminating risks of error across hospital units, and providing education and resources for nurses may help reduce MEs. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. 32 CFR 1701.30 - Policy and applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... INTELLIGENCE ADMINISTRATION OF RECORDS UNDER THE PRIVACY ACT OF 1974 Routine Uses Applicable to More Than One... routine uses to foster simplicity and economy and to avoid redundancy or error by duplication in multiple...

  4. Laying the Groundwork.

    ERIC Educational Resources Information Center

    Kretchmer, Mark R.

    2000-01-01

Discusses how to avoid costly errors in high-tech retrofits through proper planning and coordination. Guidelines are offered for selecting cable installers, using a multidisciplinary consulting engineering firm, and planning space when making high-tech retrofits. (GR)

  5. Interpretation of physiological indicators of motivation: Caveats and recommendations.

    PubMed

    Richter, Michael; Slade, Kate

    2017-09-01

    Motivation scientists employing physiological measures to gather information about motivation-related states are at risk of committing two fundamental errors: overstating the inferences that can be drawn from their physiological measures and circular reasoning. We critically discuss two complementary approaches, Cacioppo and colleagues' model of psychophysiological relations and construct validation theory, to highlight the conditions under which these errors are committed and provide guidance on how to avoid them. In particular, we demonstrate that the direct inference from changes in a physiological measure to changes in a motivation-related state requires the demonstration that the measure is not related to other relevant psychological states. We also point out that circular reasoning can be avoided by separating the definition of the motivation-related state from the hypotheses that are empirically tested. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Techniques for avoiding discrimination errors in the dynamic sampling of condensable vapors

    NASA Technical Reports Server (NTRS)

    Lincoln, K. A.

    1983-01-01

In the mass spectrometric sampling of dynamic systems, measurements of the relative concentrations of condensable and noncondensable vapors can be significantly distorted if some subtle, but important, instrumental factors are overlooked. Even with in situ measurements, the condensables are readily lost to the container walls, and the noncondensables can persist within the vacuum chamber and yield a disproportionately high output signal. Where single pulses of vapor are sampled, this source of error is avoided by gating either the mass spectrometer "on" or the data acquisition instrumentation "on" only during the very brief time window when the initial vapor cloud emanating directly from the vapor source passes through the ionizer. Instrumentation for these techniques is detailed, and its effectiveness is demonstrated by comparing gated and nongated spectra obtained from the pulsed-laser vaporization of several materials.
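The gating idea above can be illustrated with a trivial sketch: only detector samples falling inside a brief window around the arrival of the initial vapor cloud are accepted, so the persistent noncondensable background outside the window does not dominate the integrated signal. The timestamps, intensities, and window edges are invented for the example.

```python
# Toy time-gated acquisition: keep only samples inside the gate window.

def gate(samples, t_open, t_close):
    """Keep (time_ms, intensity) samples whose time falls inside the gate."""
    return [(t, i) for (t, i) in samples if t_open <= t <= t_close]

# Condensable burst near t = 1.0 ms; low noncondensable background at
# all other times that would otherwise accumulate in the signal.
samples = [(0.2, 5.0), (0.9, 40.0), (1.0, 55.0), (1.1, 38.0),
           (3.0, 6.0), (5.0, 5.0)]
gated = gate(samples, t_open=0.8, t_close=1.2)
print(sum(i for _, i in gated))   # signal integrated over the gate only
```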

  7. CT Colonography with Computer-aided Detection: Recognizing the Causes of False-Positive Reader Results

    PubMed Central

    Dachman, Abraham H.; Wroblewski, Kristen; Vannier, Michael W.; Horne, John M.

    2014-01-01

    Computed tomography (CT) colonography is a screening modality used to detect colonic polyps before they progress to colorectal cancer. Computer-aided detection (CAD) is designed to decrease errors of detection by finding and displaying polyp candidates for evaluation by the reader. CT colonography CAD false-positive results are common and have numerous causes. The relative frequency of CAD false-positive results and their effect on reader performance on the basis of a 19-reader, 100-case trial shows that the vast majority of CAD false-positive results were dismissed by readers. Many CAD false-positive results are easily disregarded, including those that result from coarse mucosa, reconstruction, peristalsis, motion, streak artifacts, diverticulum, rectal tubes, and lipomas. CAD false-positive results caused by haustral folds, extracolonic candidates, diminutive lesions (<6 mm), anal papillae, internal hemorrhoids, varices, extrinsic compression, and flexural pseudotumors are almost always recognized and disregarded. The ileocecal valve and tagged stool are common sources of CAD false-positive results associated with reader false-positive results. Nondismissable CAD soft-tissue polyp candidates larger than 6 mm are another common cause of reader false-positive results that may lead to further evaluation with follow-up CT colonography or optical colonoscopy. Strategies for correctly evaluating CAD polyp candidates are important to avoid pitfalls from common sources of CAD false-positive results. ©RSNA, 2014 PMID:25384290

  8. The spectrum of medical errors: when patients sue

    PubMed Central

    Kels, Barry D; Grant-Kels, Jane M

    2012-01-01

    Inarguably medical errors constitute a serious, dangerous, and expensive problem for the twenty-first-century US health care system. This review examines the incidence, nature, and complexity of alleged medical negligence and medical malpractice. The authors hope this will constitute a road map to medical providers so that they can better understand the present climate and hopefully avoid the “Scylla and Charybdis” of medical errors and medical malpractice. Despite some documented success in reducing medical errors, adverse events and medical errors continue to represent an indelible stain upon the practice, reputation, and success of the US health care industry. In that regard, what may be required to successfully attack the unacceptably high severity and volume of medical errors is a locally directed and organized initiative sponsored by individual health care organizations that is coordinated, supported, and guided by state and federal governmental and nongovernmental agencies. PMID:22924008

  9. Topological quantum computing with a very noisy network and local error rates approaching one percent.

    PubMed

    Nickerson, Naomi H; Li, Ying; Benjamin, Simon C

    2013-01-01

    A scalable quantum computer could be built by networking together many simple processor cells, thus avoiding the need to create a single complex structure. The difficulty is that realistic quantum links are very error prone. A solution is for cells to repeatedly communicate with each other and so purify any imperfections; however, prior studies suggest that the cells themselves must then have prohibitively low internal error rates. Here we describe a method by which even error-prone cells can perform purification: groups of cells generate shared resource states, which then enable stabilization of topologically encoded data. Given a realistically noisy network (≥10% error rate) we find that our protocol can succeed provided that intra-cell error rates for initialisation, state manipulation and measurement are below 0.82%. This level of fidelity is already achievable in several laboratory systems.

  10. Hessian matrix approach for determining error field sensitivity to coil deviations

    NASA Astrophysics Data System (ADS)

    Zhu, Caoxiang; Hudson, Stuart R.; Lazerson, Samuel A.; Song, Yuntao; Wan, Yuanxi

    2018-05-01

    The presence of error fields has been shown to degrade plasma confinement and drive instabilities. Error fields can arise from many sources, but are predominantly attributed to deviations in the coil geometry. In this paper, we introduce a Hessian matrix approach for determining error field sensitivity to coil deviations. A primary cost function used for designing stellarator coils, the surface integral of normalized normal field errors, was adopted to evaluate the deviation of the generated magnetic field from the desired magnetic field. The FOCUS code (Zhu et al 2018 Nucl. Fusion 58 016008) is utilized to provide fast and accurate calculations of the Hessian. The sensitivities of error fields to coil displacements are then determined by the eigenvalues of the Hessian matrix. A proof-of-principle example is given on a CNT-like configuration. We anticipate that this new method could provide information to avoid dominant coil misalignments and simplify coil designs for stellarators.
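    The core idea above, ranking coil-displacement directions by the eigenvalues of the cost-function Hessian, can be sketched numerically. This is a toy quadratic cost with an assumed 3×3 Hessian, not the FOCUS implementation:

```python
import numpy as np

# For a cost that is locally quadratic in small coil displacements dx,
# f(dx) ~ 0.5 * dx^T H dx, the eigenvalues of the Hessian H rank how
# strongly each perturbation direction (eigenvector) degrades the field.
H = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 1.0]])  # assumed symmetric Hessian of the cost

eigvals, eigvecs = np.linalg.eigh(H)  # eigenvalues in ascending order
worst = eigvecs[:, -1]                # most sensitive displacement direction

def cost_increase(dx):
    """Second-order cost increase for a small coil displacement dx."""
    return 0.5 * dx @ H @ dx

eps = 1e-3  # small displacement magnitude
print(cost_increase(eps * worst))          # largest increase among unit directions
print(cost_increase(eps * eigvecs[:, 0]))  # smallest increase
```

For a unit-norm eigenvector, the cost increase reduces to 0.5·eps²·λ, so the dominant eigenvalue directly identifies the coil misalignment mode to avoid.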

  11. A study of partial coherence for identifying interior noise sources and paths on general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Howlett, J. T.

    1979-01-01

    The partial coherence analysis method for noise source/path determination is summarized and the application to a two input, single output system with coherence between the inputs is illustrated. The augmentation of the calculations on a digital computer interfaced with a two channel, real time analyzer is also discussed. The results indicate possible sources of error in the computations and suggest procedures for avoiding these errors.

  12. Comparison of two reconfigurable N×N interconnects for a recurrent neural network

    NASA Astrophysics Data System (ADS)

    Berger, Christoph; Collings, Neil; Pourzand, Ali R.; Völkel, Reinhard

    1996-11-01

    Two different methods of pattern replication (conventional and interlaced fan-out) have been investigated and experimentally tested in a reconfigurable 5×5 optical interconnect. Similar alignment problems due to imaging errors (field curvature) were observed in both systems. We conclude that of the two methods the interlaced fan-out is better suited to avoid these imaging errors, to reduce system size and to implement an optical feedback loop.

  13. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The commonly used standard error in the Ms:mb event screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.

  14. Utility-preserving anonymization for health data publishing.

    PubMed

    Lee, Hyukki; Kim, Soohyung; Kim, Jong Wook; Chung, Yon Dohn

    2017-07-11

    Publishing raw electronic health records (EHRs) may be considered a breach of individuals' privacy because the records usually contain sensitive information. A common practice in privacy-preserving data publishing is to anonymize the data before publishing so as to satisfy privacy models such as k-anonymity. Among various anonymization techniques, generalization is the most commonly used in medical/health data processing. Generalization inevitably causes information loss, and various methods have been proposed to reduce this loss. However, existing generalization-based anonymization methods cannot avoid excessive information loss and thus fail to preserve data utility. We propose a utility-preserving anonymization method for privacy-preserving data publishing (PPDP). To preserve data utility, the proposed method comprises three parts: (1) a utility-preserving model, (2) counterfeit record insertion, and (3) a catalog of the counterfeit records. We also propose an anonymization algorithm using the proposed method, which applies a full-domain generalization algorithm. We evaluate our method against an existing method on two aspects: information loss, measured through various quality metrics, and the error rate of analysis results. Across all quality metrics, the proposed method shows lower information loss than the existing method. In real-world EHR analysis, the results show only a small error between data anonymized by the proposed method and the original data. Through experiments on various datasets, we show that the utility of EHRs anonymized by the proposed method is significantly better than that of EHRs anonymized by previous approaches.
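    The generalization-plus-k-anonymity check that such methods build on can be sketched as follows. The records, quasi-identifiers, and generalization rules here are invented for illustration; this is not the authors' algorithm:

```python
from collections import Counter

# Toy records: "age" and "zip" are quasi-identifiers, "disease" is sensitive.
records = [
    {"age": 23, "zip": "12345", "disease": "flu"},
    {"age": 27, "zip": "12345", "disease": "cold"},
    {"age": 25, "zip": "12345", "disease": "flu"},
    {"age": 41, "zip": "54321", "disease": "asthma"},
    {"age": 44, "zip": "54321", "disease": "flu"},
]

def generalize(rec):
    """Full-domain generalization: age -> 10-year bin, zip -> 3-digit prefix."""
    low = (rec["age"] // 10) * 10
    return (f"{low}-{low + 9}", rec["zip"][:3] + "**")

def is_k_anonymous(recs, k):
    """True if every generalized quasi-identifier group has >= k records."""
    counts = Counter(generalize(r) for r in recs)
    return all(c >= k for c in counts.values())

print(is_k_anonymous(records, 2))  # True: both QI groups have >= 2 records
print(is_k_anonymous(records, 3))  # False: the second group has only 2
```

The paper's contribution sits on top of a step like this: counterfeit records are inserted so that less aggressive generalization still satisfies k, and a catalog records which entries are counterfeit.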

  15. Computing the binding affinity of a ligand buried deep inside a protein with the hybrid steered molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villarreal, Oscar D.; Yu, Lili; Department of Laboratory Medicine, Yancheng Vocational Institute of Health Sciences, Yancheng, Jiangsu 224006

    Computing the ligand-protein binding affinity (or the Gibbs free energy) with chemical accuracy has long been a challenge for which many methods/approaches have been developed and refined with various successful applications. False positives and, even more harmful, false negatives have been and still are a common occurrence in practical applications. Inevitable in all approaches are the errors in the force field parameters we obtain from quantum mechanical computation and/or empirical fittings for the intra- and inter-molecular interactions. These errors propagate to the final results of the computed binding affinities even if we were able to perfectly implement the statistical mechanics of all the processes relevant to a given problem. And they are actually amplified to various degrees even in the mature, sophisticated computational approaches. In particular, the free energy perturbation (alchemical) approaches amplify the errors in the force field parameters because they rely on extracting the small differences between similarly large numbers. In this paper, we develop a hybrid steered molecular dynamics (hSMD) approach to the difficult binding problems of a ligand buried deep inside a protein. Sampling the transition along a physical (not alchemical) dissociation path of opening up the binding cavity, pulling out the ligand, and closing the cavity back, we can avoid the problem of error amplification by not relying on small differences between similar numbers. We tested this new form of hSMD on retinol inside cellular retinol-binding protein 1 and three cases of a ligand (a benzylacetate, a 2-nitrothiophene, and a benzene) inside a T4 lysozyme L99A/M102Q(H) double mutant. In all cases, we obtained binding free energies in close agreement with the experimentally measured values. This indicates that the force field parameters we employed are accurate and that hSMD (a brute force, unsophisticated approach) is free from the problem of error amplification suffered by many sophisticated approaches in the literature.

  16. Glucocorticosteroid-free versus glucocorticosteroid-containing immunosuppression for liver transplanted patients.

    PubMed

    Fairfield, Cameron; Penninga, Luit; Powell, James; Harrison, Ewen M; Wigmore, Stephen J

    2018-04-09

    Liver transplantation is an established treatment option for end-stage liver failure. Now that newer, more potent immunosuppressants have been developed, glucocorticosteroids may no longer be needed and their removal may prevent adverse effects. To assess the benefits and harms of glucocorticosteroid avoidance (excluding intra-operative use or treatment of acute rejection) or withdrawal versus glucocorticosteroid-containing immunosuppression following liver transplantation. We searched the Cochrane Hepato-Biliary Group Controlled Trials Register, Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, Embase, Science Citation Index Expanded and Conference Proceedings Citation Index - Science, Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS), World Health Organization International Clinical Trials Registry Platform, ClinicalTrials.gov, and The Transplant Library until May 2017. Randomised clinical trials assessing glucocorticosteroid avoidance or withdrawal versus glucocorticosteroid-containing immunosuppression for liver transplanted people. Our inclusion criteria stated that participants should have received the same co-interventions. We included trials that assessed complete glucocorticosteroid avoidance (excluding intra-operative use or treatment of acute rejection) versus short-term glucocorticosteroids, as well as trials that assessed short-term glucocorticosteroids versus long-term glucocorticosteroids. We used RevMan to conduct meta-analyses, calculating risk ratio (RR) for dichotomous variables and mean difference (MD) for continuous variables, both with 95% confidence intervals (CIs). We used a random-effects model and a fixed-effect model and reported both results where a discrepancy existed; otherwise we reported only the results from the fixed-effect model. We assessed the risk of systematic errors using 'Risk of bias' domains. We controlled for random errors by performing Trial Sequential Analysis.
We presented our results in a 'Summary of findings' table. We included 17 completed randomised clinical trials, but only 16 studies with 1347 participants provided data for the meta-analyses. Ten of the 16 trials assessed complete postoperative glucocorticosteroid avoidance (excluding intra-operative use or treatment of acute rejection) versus short-term glucocorticosteroids (782 participants) and six trials assessed short-term glucocorticosteroids versus long-term glucocorticosteroids (565 participants). One additional study assessed complete post-operative glucocorticosteroid avoidance but could only be incorporated into qualitative analysis of the results due to limited data published in an abstract. All trials were at high risk of bias. Only eight trials reported on the type of donor used. Overall, we found no statistically significant difference for mortality (RR 1.15, 95% CI 0.93 to 1.44; low-quality evidence), graft loss including death (RR 1.15, 95% CI 0.90 to 1.46; low-quality evidence), or infection (RR 0.88, 95% CI 0.73 to 1.05; very low-quality evidence) when glucocorticosteroid avoidance or withdrawal was compared with glucocorticosteroid-containing immunosuppression. Acute rejection and glucocorticosteroid-resistant rejection were statistically significantly more frequent when glucocorticosteroid avoidance or withdrawal was compared with glucocorticosteroid-containing immunosuppression (RR 1.33, 95% CI 1.08 to 1.64; low-quality evidence; and RR 2.14, 95% CI 1.13 to 4.02; very low-quality evidence). Diabetes mellitus and hypertension were statistically significantly less frequent when glucocorticosteroid avoidance or withdrawal was compared with glucocorticosteroid-containing immunosuppression (RR 0.81, 95% CI 0.66 to 0.99; low-quality evidence; and RR 0.76, 95% CI 0.65 to 0.90; low-quality evidence). We performed Trial Sequential Analysis for all outcomes. None of the outcomes crossed the monitoring boundaries or reached the required information size. 
Hence, we cannot exclude random errors from the results of the conventional meta-analyses. Many of the benefits and harms of glucocorticosteroid avoidance or withdrawal remain uncertain because of the limited number of published randomised clinical trials, limited numbers of participants and outcomes, and high risk of bias in the trials. Glucocorticosteroid avoidance or withdrawal appears to reduce diabetes mellitus and hypertension whilst increasing acute rejection, glucocorticosteroid-resistant rejection, and renal impairment. We could identify no other benefits or harms of glucocorticosteroid avoidance or withdrawal. Glucocorticosteroid avoidance or withdrawal may be of benefit in selected patients, especially those at low risk of rejection and high risk of hypertension or diabetes mellitus. The optimal duration of glucocorticosteroid administration remains unclear. More randomised clinical trials assessing glucocorticosteroid avoidance or withdrawal are needed. These should be large, high-quality trials that minimise the risk of random and systematic error.
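    For readers unfamiliar with the statistics reported above, a risk ratio and its 95% confidence interval are derived from arm-level event counts as follows (the counts here are illustrative, not data from the review):

```python
import math

def risk_ratio_ci(events_t, n_t, events_c, n_c, z=1.96):
    """Risk ratio of treatment vs control with a 95% CI on the log scale."""
    rr = (events_t / n_t) / (events_c / n_c)
    # Standard error of log(RR) for a 2x2 table
    se_log = math.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# e.g. 40/400 events in the avoidance arm vs 35/400 with glucocorticosteroids
rr, lo, hi = risk_ratio_ci(40, 400, 35, 400)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

When the CI spans 1.0, as in this illustrative case, the difference is not statistically significant, which is how results such as "RR 1.15, 95% CI 0.93 to 1.44" are read.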

  17. Driving safely into the future with applied technology

    DOT National Transportation Integrated Search

    1999-10-01

    Driver error remains the leading cause of highway crashes. Through the Intelligent Vehicle Initiative (IVI), the Department of Transportation hopes to reduce crashes by helping drivers avoid hazardous mistakes. IVI aims to accelerate the development ...

  18. Asynchronous error-correcting secure communication scheme based on fractional-order shifting chaotic system

    NASA Astrophysics Data System (ADS)

    Chao, Luo

    2015-11-01

    In this paper, a novel digital secure communication scheme is proposed. Unlike the usual secure communication schemes based on chaotic synchronization, the proposed scheme employs asynchronous communication, which avoids the weakness of synchronous systems, namely their susceptibility to environmental interference. Moreover, regarding transmission errors and data loss in the communication process, the proposed scheme can check and correct errors in real time. To guarantee security, a fractional-order complex chaotic system with a shifting order is utilized to modulate the transmitted signal, which has high nonlinearity and complexity in both the frequency and time domains. The corresponding numerical simulations demonstrate the effectiveness and feasibility of the scheme.

  19. Exploring Plant Co-Expression and Gene-Gene Interactions with CORNET 3.0.

    PubMed

    Van Bel, Michiel; Coppens, Frederik

    2017-01-01

    Selecting and filtering a reference expression and interaction dataset when studying specific pathways and regulatory interactions can be a very time-consuming and error-prone task. In order to reduce the duplicated effort required to amass such datasets, we have created the CORNET (CORrelation NETworks) platform, which allows easy access to a wide variety of data types: coexpression data, protein-protein interactions, regulatory interactions, and functional annotations. The CORNET platform outputs its results either in text format or through the Cytoscape framework, which is automatically launched by the CORNET website. CORNET 3.0 is the third iteration of the web platform designed for user exploration of the coexpression space of plant genomes, with a focus on the model species Arabidopsis thaliana. Here we describe the platform: the tools, data, and best practices when using the platform. We indicate how the platform can be used to infer networks from a set of input genes, such as upregulated genes from an expression experiment. By exploring the network, new target and regulator genes can be discovered, allowing for follow-up experiments and more in-depth study. We also indicate how to avoid common pitfalls when evaluating the networks and how to avoid over-interpretation of the results. All CORNET versions are available at http://bioinformatics.psb.ugent.be/cornet/ .

  20. Adaptive projection intensity adjustment for avoiding saturation in three-dimensional shape measurement

    NASA Astrophysics Data System (ADS)

    Chen, Chao; Gao, Nan; Wang, Xiangjun; Zhang, Zonghua

    2018-03-01

    Phase-based fringe projection methods have been commonly used for three-dimensional (3D) measurement. However, image saturation results in incorrect intensities in captured fringe pattern images, leading to phase and measurement errors, and existing solutions are complex. This paper proposes an adaptive projection intensity adjustment method to avoid image saturation and maintain good fringe modulation when measuring objects with a wide range of surface reflectivities. The adapted fringe patterns are created using only one prior step of fringe-pattern projection and image capture. First, a set of phase-shifted fringe patterns with a maximum projection intensity of 255 and a uniform gray-level pattern are projected onto the surface of an object. The patterns are reflected from and deformed by the object surface and captured by a digital camera. The best projection intensity corresponding to each saturated-pixel cluster is determined by fitting a polynomial function that transforms captured intensities to projected intensities. Subsequently, the adapted fringe patterns are constructed using the best projection intensity at each projector pixel coordinate. Finally, the adapted fringe patterns are projected for phase recovery and 3D shape calculation. The experimental results demonstrate that the proposed method achieves high measurement accuracy even for objects with a wide range of surface reflectivities.
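    The central fitting step, a polynomial map from captured to projected intensities that is then evaluated at a target unsaturated capture level, might look like this. The camera response values below are assumed toy data, not the authors' measurements:

```python
import numpy as np

# Assumed calibration pairs: projected gray level vs. captured gray level.
# The camera clips at 255, so the saturated last pair is excluded from the fit.
projected = np.array([50, 100, 150, 200, 255], dtype=float)
captured = np.array([40, 95, 160, 230, 255], dtype=float)

# Fit captured -> projected with a quadratic, dropping the saturated point.
coeffs = np.polyfit(captured[:-1], projected[:-1], deg=2)

def best_projection(target_capture=250.0):
    """Projection intensity predicted to yield `target_capture` (unsaturated)."""
    return float(np.polyval(coeffs, target_capture))

p = best_projection()
print(p)  # adapted projection intensity for this pixel cluster
```

In the full method a fit like this is applied per saturated-pixel cluster, and the resulting intensities are written into the adapted fringe patterns at the corresponding projector coordinates.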

  1. Effect of refractive error on temperament and character properties.

    PubMed

    Kalkan Akcay, Emine; Canan, Fatih; Simavli, Huseyin; Dal, Derya; Yalniz, Hacer; Ugurlu, Nagihan; Gecici, Omer; Cagil, Nurullah

    2015-01-01

    To determine the effect of refractive error on temperament and character properties using Cloninger's psychobiological model of personality. Using the Temperament and Character Inventory (TCI), the temperament and character profiles of 41 participants with refractive errors (17 with myopia, 12 with hyperopia, and 12 with myopic astigmatism) were compared to those of 30 healthy control participants. Here, temperament comprised the traits of novelty seeking, harm avoidance, and reward dependence, while character comprised the traits of self-directedness, cooperativeness, and self-transcendence. Participants with refractive error showed significantly lower scores on purposefulness, cooperativeness, empathy, helpfulness, and compassion (P<0.05, P<0.01, P<0.05, P<0.05, and P<0.01, respectively). Refractive error might have a negative influence on some character traits, and different types of refractive error might have different temperament and character properties. These personality traits may be implicated in the onset and/or perpetuation of refractive errors and may be a productive focus for psychotherapy.

  2. Potential effects of reward and loss avoidance in overweight adolescents

    PubMed Central

    Reyes, Sussanne; Peirano, Patricio; Luna, Beatriz; Lozoff, Betsy; Algarín, Cecilia

    2015-01-01

    Background Reward system and inhibitory control are brain functions that exert an influence on eating behavior regulation. We studied the differences in inhibitory control and sensitivity to reward and loss avoidance between overweight/obese and normal-weight adolescents. Methods We assessed 51 overweight/obese and 52 normal-weight 15-y-old Chilean adolescents. The groups were similar regarding sex and intelligence quotient. Using Antisaccade and Incentive tasks, we evaluated inhibitory control and the effect of incentive trials (neutral, loss avoidance, and reward) on generating correct and incorrect responses (latency and error rate). Results Compared to normal-weight group participants, overweight/obese adolescents showed shorter latency for incorrect antisaccade responses (186.0 (95% CI: 176.8–195.2) vs. 201.3 ms (95% CI: 191.2–211.5), P < 0.05) and better performance reflected by lower error rate in incentive trials (43.6 (95% CI: 37.8–49.4) vs. 53.4% (95% CI: 46.8–60.0), P < 0.05). Overweight/obese adolescents were more accurate on loss avoidance (40.9 (95% CI: 33.5–47.7) vs. 49.8% (95% CI: 43.0–55.1), P < 0.05) and reward (41.0 (95% CI: 34.5–47.5) vs. 49.8% (95% CI: 43.0–55.1), P < 0.05) compared to neutral trials. Conclusion Overweight/obese adolescents showed shorter latency for incorrect responses and greater accuracy in reward and loss avoidance trials. These findings could suggest that an imbalance of inhibition and reward systems influence their eating behavior. PMID:25927543

  3. A Sensor Dynamic Measurement Error Prediction Model Based on NAPSO-SVM.

    PubMed

    Jiang, Minlan; Jiang, Lan; Jiang, Dingde; Li, Fei; Song, Houbing

    2018-01-15

    Dynamic measurement error correction is an effective way to improve sensor precision, and dynamic measurement error prediction is an important part of error correction. The support vector machine (SVM) is often used to predict the dynamic measurement errors of sensors, but traditionally the SVM parameters were set manually, which cannot ensure the model's performance. In this paper, an SVM method based on an improved particle swarm optimization (NAPSO) is proposed to predict the dynamic measurement errors of sensors. Natural selection and simulated annealing are added to the PSO to improve its ability to avoid local optima. To verify the performance of NAPSO-SVM, three algorithms are selected to optimize the SVM's parameters: the particle swarm optimization algorithm (PSO), the improved PSO algorithm (NAPSO), and glowworm swarm optimization (GSO). The dynamic measurement error data of two sensors are used as test data, and the root mean squared error and mean absolute percentage error are employed to evaluate the prediction models' performance. The experimental results show that among the three tested algorithms, the NAPSO-SVM method has better prediction precision and smaller prediction errors, and it is an effective method for predicting the dynamic measurement errors of sensors.
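    A plain PSO parameter search of the kind NAPSO improves on can be sketched with a toy error surface standing in for the SVM's cross-validation error over its two parameters (C, gamma). The quadratic objective and all swarm constants below are assumptions for illustration, not the paper's NAPSO:

```python
import random

def objective(x, y):
    """Hypothetical error surface standing in for SVM CV error; minimum at (3, 0.5)."""
    return (x - 3.0) ** 2 + (y - 0.5) ** 2

random.seed(0)
n, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5  # swarm size, iterations, inertia, pulls
pos = [[random.uniform(0, 10), random.uniform(0, 5)] for _ in range(n)]
vel = [[0.0, 0.0] for _ in range(n)]
pbest = [p[:] for p in pos]                               # each particle's best
gbest = min(pbest, key=lambda p: objective(*p))[:]        # swarm's best

for _ in range(iters):
    for i in range(n):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            # Velocity update: inertia + pull toward personal and global bests
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if objective(*pos[i]) < objective(*pbest[i]):
            pbest[i] = pos[i][:]
            if objective(*pos[i]) < objective(*gbest):
                gbest = pos[i][:]

print(gbest)  # converges near (3.0, 0.5)
```

NAPSO's additions, natural selection among particles and a simulated-annealing acceptance step, modify this basic loop to reduce the chance of the swarm stalling in a local optimum.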

  4. The Comparative Study of Metacognition: Sharper Paradigms, Safer Inferences

    PubMed Central

    Smith, J. David; Beran, Michael J.; Couchman, Justin J.; Coutinho, Mariana V. C.

    2015-01-01

    Results that point to animals’ metacognitive capacity bear a heavy burden given the potential for competing behavioral descriptions. This article uses formal models to evaluate the force of these descriptions. One example is that many existing studies have directly rewarded so-called “uncertainty” responses. Modeling confirms that this practice is an interpretative danger because it supports associative processes and encourages simpler interpretations. Another example is that existing studies raise the concern that animals avoid difficult stimuli not because of uncertainty monitored but because of aversion given error-causing or reinforcement-lean stimuli. Modeling also justifies this concern and shows that this problem is not addressed by the common practice of comparing performance on Chosen and Forced trials. The models and related discussion have utility for metacognition researchers and theorists broadly because they specify the experimental operations that will best indicate a metacognitive capacity in humans or animals by eliminating alternative behavioral accounts. PMID:18792496

  5. Epidemiological and clinical features of three clustered cases co-infected with Lyme disease and rickettsioses.

    PubMed

    Xuefei, D; Qin, H; Xiaodi, G; Zhen, G; Wei, L; Xuexia, H; Jiazhen, G; Xiuping, F; Meimei, T; Jingshan, Z; Yunru, L; Xiaoling, F; Kanglin, W; Xingwang, L

    2013-11-01

    Lyme disease and rickettsioses are two common diseases in China. However, the concomitant occurrence of both diseases in a single individual has been reported infrequently in the literature. We report three related female patients admitted to Beijing Ditan Hospital from October to December 2010. They had similar epidemiological histories. Each initially received only a single diagnosis, but after specific screenings the final diagnoses were made. Because arthropods can harbour more than one disease-causing agent, patients can be infected with more than one pathogen at the same time, so the possibility of co-infection could be higher than previously thought. These observations suggest that clinicians should perform complete screening for arthropod-related infectious diseases so as to make an accurate diagnosis and avoid diagnostic errors. © 2012 Blackwell Verlag GmbH.

  6. Nonparametric variational optimization of reaction coordinates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banushkina, Polina V.; Krivov, Sergei V., E-mail: s.krivov@leeds.ac.uk

    State-of-the-art realistic simulations of complex atomic processes commonly produce trajectories of large size, making the development of automated analysis tools very important. A popular approach aimed at extracting dynamical information consists of projecting these trajectories onto optimally selected reaction coordinates or collective variables. For equilibrium dynamics between any two boundary states, the committor function, also known as the folding probability in protein folding studies, is often considered the optimal coordinate. To determine it, one selects a functional form with many parameters and trains it on the trajectories using various criteria. A major problem with such an approach is that a poor initial choice of the functional form may lead to sub-optimal results. Here, we describe an approach which allows one to optimize the reaction coordinate without selecting its functional form, thus avoiding this source of error.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, Prabhat Kumar; Rabehl, Roger

    Thermo-acoustic oscillations are a commonly observed phenomenon in helium cryogenic systems, especially in tubes connecting hot and cold areas. The open ends of these tubes are connected to the lower temperature (typically 4.2 K), and the closed ends are connected to the higher temperature (300 K). Cryogenic instrumentation installations provide ideal conditions for these oscillations to occur because of the steep temperature gradient along the tubing. These oscillations create errors in measurements as well as an undesirable heat load on the system. The work presented here develops engineering guidelines for designing oscillation-free helium piping. It also studies the effect of different piping inserts and shows how the proper geometrical combinations must be chosen to avoid thermo-oscillations. The effect of an 80 K intercept is also studied, showing that thermo-oscillations can be dampened by placing the intercept at an appropriate location.

  8. Physician Practice Audit Targets Now Become Hospital and Health System Compliance Risks.

    PubMed

    Hirsch, Ronald L

    2015-01-01

    In 2013, 22% of the federal budget was spent on Medicare and Medicaid. The Medicare Trust Fund is forecast to be depleted in 2030. More than 12% of Medicare fee-for-service payments in 2014 were made in error. These factors have led Congress to apply more pressure to reduce improper payments. Although hospitals were the initial targets because of their higher reimbursement, recent efforts have shifted to physician billing. Hospitals and health systems continue to acquire physician practices, making them liable for the billing activities of physicians. And for physicians who remain independent, the cost and effort required to respond to audits and denials can be financially devastating, further demonstrating the importance of prevention. This article addresses some of the common audit targets and mistakes made by physicians and provides strategies for physician practices and health systems to respond to and, ultimately, avoid these denials.

  9. Communication Patterns in Preschool Education Institutions – Practical Examples

    PubMed Central

    Radic-Hozo, Endica

    2014-01-01

    Introduction: Proper communication in preschool education institutions is of undeniable importance to the development of the child, as evidenced by numerous studies. After birth, the child enters the most complex of its early phases: preschool education. Only a high-quality, synergistic parent-child-educator triad, combined with the modern postulates of preschool education, warrants successful preschool education. Methods and materials: Using description and examples from daily practice in a large preschool education institution, we marked the critical points along the complex path of child education and the many pitfalls encountered by both parents and educators. Errors in communication are considered, with proposed solutions for avoiding them in practice. Conclusion: Proper daily communication in the preschool education institution, within the parent-child-educator relationship, with mutual consultation, respect, acceptance, and facilitation, results in the successful common goal: the proper education and socialization of children in preschool education institutions. PMID:25568636

  10. All-optical simultaneous multichannel quadrature phase shift keying signal regeneration based on phase-sensitive amplification

    NASA Astrophysics Data System (ADS)

    Wang, Hongxiang; Wang, Qi; Bai, Lin; Ji, Yuefeng

    2018-01-01

    A scheme is proposed to realize all-optical phase regeneration of four-channel quadrature phase shift keying (QPSK) signals based on phase-sensitive amplification. By utilizing a conjugate pump and a common pump in a highly nonlinear optical fiber, a degenerate four-wave mixing process is observed, and the QPSK signals are regenerated. The number of waves is reduced to decrease the cross talk caused by undesired nonlinear interactions during the coherent superposition process. In addition, to avoid the effect of overlapping frequencies, the frequency spans between pumps and signals are set to nonintegral multiples. The optical signal-to-noise ratio (OSNR) improvement is validated by bit error rate (BER) measurements. Compared with single-channel regeneration, multichannel regeneration incurs a 0.4-dB OSNR penalty at a BER of 10^-3, which shows that the cross talk in the regeneration process is negligible.

  11. Spontaneous Endometriosis Within a Primary Umbilical Hernia

    PubMed Central

    Yheulon, Christopher G

    2017-01-01

    Umbilical hernias are rather common in the general surgery clinic; however, endometriosis of an umbilical hernia is rare. It is especially unusual for endometriosis of an umbilical hernia to occur spontaneously rather than at the site of a prior surgery. We present a case of spontaneous endometriosis of an umbilical hernia in a patient with no prior surgery to the umbilicus. She had not presented with the usual symptoms of endometriosis, and it was not considered as a diagnosis prior to surgery. Umbilical endometriosis is rare but usually occurs after prior laparoscopic surgery. We believe this is the second reported case in the English literature, and the first reported from North America, of spontaneous endometriosis of an umbilical hernia. This case highlights the importance of a full review of systems and of qualifying the type and occurrence of pain. Additionally, it is always important to analyze surgical specimens in pathology to avoid errors in diagnosis. PMID:29164008

  12. Spontaneous Endometriosis Within a Primary Umbilical Hernia.

    PubMed

    Laferriere, Nicole R; Yheulon, Christopher G

    2017-11-01

    Umbilical hernias are rather common in the General Surgery clinic; however, endometriosis of an umbilical hernia is rare. It is especially unusual for endometriosis of an umbilical hernia to occur spontaneously rather than at the site of a prior surgery. We present a case of spontaneous endometriosis within an umbilical hernia in a patient with no prior umbilical surgery. She did not present with the usual symptoms of endometriosis, and it was not considered as a diagnosis before surgery. Umbilical endometriosis is rare and usually occurs after prior laparoscopic surgery. We believe this is the second reported case in the English literature, and the first from North America, of spontaneous endometriosis of an umbilical hernia. This case highlights the importance of a full review of systems and of qualifying the type and occurrence of pain. Additionally, it is always important to analyze surgical specimens in pathology to avoid errors in diagnosis.

  13. Model Error Budgets

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    2008-01-01

    An error budget is a commonly used tool in the design of complex aerospace systems. It represents system performance requirements in terms of allowable errors and flows these down through a hierarchical structure to lower assemblies and components. The requirements may simply be 'allocated' based upon heuristics or experience, or they may be designed through use of physics-based models. This paper presents a basis for developing an error budget for models of the system, as opposed to the system itself. The need for model error budgets arises when system models are a principal design agent, as is increasingly common for poorly testable, high-performance space systems.

  14. Patterns of technical error among surgical malpractice claims: an analysis of strategies to prevent injury to surgical patients.

    PubMed

    Regenbogen, Scott E; Greenberg, Caprice C; Studdert, David M; Lipsitz, Stuart R; Zinner, Michael J; Gawande, Atul A

    2007-11-01

    To identify the most prevalent patterns of technical errors in surgery, and evaluate commonly recommended interventions in light of these patterns. The majority of surgical adverse events involve technical errors, but little is known about the nature and causes of these events. We examined characteristics of technical errors and common contributing factors among closed surgical malpractice claims. Surgeon reviewers analyzed 444 randomly sampled surgical malpractice claims from four liability insurers. Among 258 claims in which injuries due to error were detected, 52% (n = 133) involved technical errors. These technical errors were further analyzed with a structured review instrument designed by qualitative content analysis. Forty-nine percent of the technical errors caused permanent disability; an additional 16% resulted in death. Two-thirds (65%) of the technical errors were linked to manual error, 9% to errors in judgment, and 26% to both manual and judgment error. A minority of technical errors involved advanced procedures requiring special training ("index operations"; 16%), surgeons inexperienced with the task (14%), or poorly supervised residents (9%). The majority involved experienced surgeons (73%), and occurred in routine, rather than index, operations (84%). Patient-related complexities-including emergencies, difficult or unexpected anatomy, and previous surgery-contributed to 61% of technical errors, and technology or systems failures contributed to 21%. Most technical errors occur in routine operations with experienced surgeons, under conditions of increased patient complexity or systems failure. Commonly recommended interventions, including restricting high-complexity operations to experienced surgeons, additional training for inexperienced surgeons, and stricter supervision of trainees, are likely to address only a minority of technical errors. Surgical safety research should instead focus on improving decision-making and performance in routine operations for complex patients and circumstances.

  15. SBL-Online: Implementing Studio-Based Learning Techniques in an Online Introductory Programming Course to Address Common Programming Errors and Misconceptions

    ERIC Educational Resources Information Center

    Polo, Blanca J.

    2013-01-01

    Much research has been done in regards to student programming errors, online education and studio-based learning (SBL) in computer science education. This study furthers this area by bringing together this knowledge and applying it to proactively help students overcome impasses caused by common student programming errors. This project proposes a…

  16. Surgical errors and risks – the head and neck cancer patient

    PubMed Central

    Harréus, Ulrich

    2013-01-01

    Head and neck surgery is one of the basic principles of head and neck cancer therapy. Surgical errors and malpractice can have fatal consequences for the treated patients: they can lead to functional impairment and can affect the patient's chances of disease-related survival. There are many risks for head and neck surgeons that can cause errors and malpractice. To avoid surgical mistakes, thorough preoperative management of patients is mandatory, including ensuring operability, cautious evaluation of preoperative diagnostics, and careful operative planning. Moreover, knowledge of the anatomical structures of the head and neck and of the relevant medical studies and data, qualification in modern surgical techniques, and the surgeon's capacity for critical self-assessment are basic and important prerequisites for recognizing risks and preventing mistakes. Additionally, it is important to have profound knowledge of nutrition management for cancer patients and of wound healing, and to be able to recognize and deal with complications when they occur. Despite all precaution and surgical care, errors and mistakes cannot always be avoided; it is therefore important to be able to deal with mistakes and to establish appropriate, clear communication and management procedures for such events. The manuscript comments on the recognition and prevention of risks and mistakes in the preoperative, operative, and postoperative phases of head and neck cancer surgery. PMID:24403972

  17. Remediating Common Math Errors.

    ERIC Educational Resources Information Center

    Wagner, Rudolph F.

    1981-01-01

    Explanations and remediation suggestions for five types of mathematics errors due either to perceptual or cognitive difficulties are given. Error types include directionality problems, mirror writing, visually misperceived signs, diagnosed directionality problems, and mixed process errors. (CL)

  18. Air quality impacts of intercity freight. Volume 1 : guidebook

    DOT National Transportation Integrated Search

    2000-01-01

    Driver error remains the leading cause of highway crashes. Through the Intelligent Vehicle Initiative (IVI), the Department of Transportation hopes to reduce crashes by helping drivers avoid hazardous mistakes. IVI aims to accelerate the development ...

  19. Endodontic Procedural Errors: Frequency, Type of Error, and the Most Frequently Treated Tooth.

    PubMed

    Yousuf, Waqas; Khan, Moiz; Mehdi, Hasan

    2015-01-01

    Introduction. The aim of this study is to determine the most common endodontically treated tooth and the most common error produced during treatment and to note the association of particular errors with particular teeth. Material and Methods. Periapical radiographs were taken of all the included teeth and were stored and assessed using DIGORA Optime. Teeth in each group were evaluated for presence or absence of procedural errors (i.e., overfill, underfill, ledge formation, perforations, apical transportation, and/or instrument separation), and the most frequent tooth to undergo endodontic treatment was also noted. Results. A total of 1748 root canal treated teeth were assessed, out of which 574 (32.8%) contained a procedural error. Out of these, 397 (22.7%) were overfilled, 155 (8.9%) were underfilled, 16 (0.9%) had instrument separation, and 7 (0.4%) had apical transportation. The most frequently treated tooth was the right permanent mandibular first molar (11.3%). The least commonly treated teeth were the permanent mandibular third molars (0.1%). Conclusion. Practitioners should show greater care to maintain accuracy of the working length throughout the procedure, as errors in length accounted for the vast majority of errors, and special care should be taken when working on molars.

  20. Effect of harmane, an endogenous β-carboline, on learning and memory in rats.

    PubMed

    Celikyurt, Ipek Komsuoglu; Utkan, Tijen; Gocmez, Semil Selcen; Hudson, Alan; Aricioglu, Feyza

    2013-01-01

    Our aim was to investigate the effects of acute harmane administration upon the learning and memory performance of rats using the three-panel runway paradigm and the passive avoidance test. Male rats received harmane (2.5, 5, and 7.5 mg/kg, i.p.) or saline 30 min before each session of experiments. In the three-panel runway paradigm, harmane did not affect the number of errors or the latency in reference memory. Harmane significantly increased working-memory errors at doses of 5 mg/kg and 7.5 mg/kg. Latency changed significantly only at 7.5 mg/kg in comparison to the control group. Animals were given a pre-training injection of harmane in the passive avoidance test in order to assess learning function. Harmane treatment decreased the retention latency significantly and dose-dependently, which indicates an impairment in learning. In this study, harmane impaired working memory in the three-panel runway test and learning in the passive avoidance test. As an endogenous bioactive molecule, harmane might have a critical role in the modulation of learning and memory functions. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Adaptive control of nonlinear system using online error minimum neural networks.

    PubMed

    Jia, Chao; Li, Xiaoli; Wang, Kang; Ding, Dawei

    2016-11-01

    In this paper, a new learning algorithm named OEM-ELM (Online Error Minimized-ELM) is proposed, based on the ELM (Extreme Learning Machine) neural network algorithm and an expansion of its basic structure. The core idea of the OEM-ELM algorithm is online learning, evaluation of network performance, and incremental addition of hidden nodes. It combines the advantages of OS-ELM and EM-ELM, which can improve identification capability and avoid network redundancy. An adaptive controller based on the proposed OEM-ELM algorithm is set up, which has a stronger capability to adapt to changes in the environment. Adaptive control of a Continuous Stirred Tank Reactor (CSTR) chemical process is also given as an application. The simulation results show that, with respect to the traditional ELM algorithm, the proposed algorithm can avoid network redundancy and greatly improve control performance. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
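    The abstract builds on the basic ELM idea: hidden-layer weights are drawn at random and only the output weights are fitted, in closed form, by least squares. A minimal background sketch of that base algorithm is below (function names and the toy regression task are illustrative; this is not the paper's OEM-ELM with online node growth):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def train_elm(X, y, n_hidden=50):
        """Fit a basic ELM: fixed random hidden layer, least-squares output weights."""
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
        b = rng.normal(size=n_hidden)                 # random biases (never trained)
        H = np.tanh(X @ W + b)                        # hidden-layer activations
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights in closed form
        return W, b, beta

    def predict_elm(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    # Toy task: learn y = sin(x) on [0, pi]
    X = np.linspace(0, np.pi, 200).reshape(-1, 1)
    y = np.sin(X).ravel()
    W, b, beta = train_elm(X, y, n_hidden=50)
    err = np.max(np.abs(predict_elm(X, W, b, beta) - y))
    ```

    OEM-ELM, per the abstract, would wrap this core in an online loop that monitors the fit and adds hidden nodes only when performance degrades, rather than fixing `n_hidden` in advance.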

  2. Ironic and Reinvestment Effects in Baseball Pitching: How Information About an Opponent Can Influence Performance Under Pressure.

    PubMed

    Gray, Rob; Orn, Anders; Woodman, Tim

    2017-02-01

    Are pressure-induced performance errors in experts associated with novice-like skill execution (as predicted by reinvestment/conscious processing theories) or expert execution toward a result that the performer typically intends to avoid (as predicted by ironic processes theory)? The present study directly compared these predictions using a baseball pitching task with two groups of experienced pitchers. One group was shown only their target, while the other group was shown the target and an ironic (avoid) zone. Both groups demonstrated significantly fewer target hits under pressure. For the target-only group, this was accompanied by significant changes in expertise-related kinematic variables. In the ironic group, the number of pitches thrown in the ironic zone was significantly higher under pressure, and there were no significant changes in kinematics. These results suggest that information about an opponent can influence the mechanisms underlying pressure-induced performance errors.

  3. Exploring Common Misconceptions and Errors about Fractions among College Students in Saudi Arabia

    ERIC Educational Resources Information Center

    Alghazo, Yazan M.; Alghazo, Runna

    2017-01-01

    The purpose of this study was to investigate what common errors and misconceptions about fractions exist among Saudi Arabian college students. Moreover, the study aimed at investigating the possible explanations for the existence of such misconceptions among students. A researcher developed mathematical test aimed at identifying common errors…

  4. Performance of cardiopulmonary resuscitation feedback systems in a long-distance train with distributed traction.

    PubMed

    González-Otero, Digna M; de Gauna, Sofía Ruiz; Ruiz, Jesus; Rivero, Raquel; Gutierrez, J J; Saiz, Purificación; Russell, James K

    2018-04-20

    Out-of-hospital cardiac arrest is common in public locations, including public transportation sites. Feedback devices are increasingly being used to improve chest-compression quality; however, their performance during public transportation has not yet been studied. The aim was to test two CPR feedback devices representative of current technologies (accelerometer-based and electromagnetic-field-based) in a long-distance train. Volunteers applied compressions on a manikin during the train route using both feedback devices. Depth and rate measurements computed by the devices were compared to the gold-standard values. Sixty-four 4-min records were acquired. The accelerometer-based device provided visual help in all experiments. Median absolute errors in depth and rate were 2.4 mm and 1.3 compressions per minute (cpm) during conventional speed, and 2.5 mm and 1.2 cpm during high speed. The electromagnetic-field-based device never provided CPR feedback; alert messages were shown instead. However, measurements were stored in its internal memory. Absolute errors for depth and rate were 2.6 mm and 0.7 cpm during conventional speed, and 2.6 mm and 0.7 cpm during high speed. Both devices were accurate despite the accelerations and the electromagnetic interferences induced by the train. However, the electromagnetic-field-based device would require modifications to avoid excessive alerts impeding feedback.

  5. Tumor Burden Analysis on Computed Tomography by Automated Liver and Tumor Segmentation

    PubMed Central

    Linguraru, Marius George; Richbourg, William J.; Liu, Jianfei; Watt, Jeremy M.; Pamulapati, Vivek; Wang, Shijun; Summers, Ronald M.

    2013-01-01

    The paper presents the automated computation of hepatic tumor burden from abdominal CT images of diseased populations with inconsistent enhancement. The automated segmentation of livers is addressed first. A novel three-dimensional (3D) affine invariant shape parameterization is employed to compare local shape across organs. By generating a regular sampling of the organ's surface, this parameterization can be effectively used to compare features of a set of closed 3D surfaces point-to-point, while avoiding common problems with the parameterization of concave surfaces. From an initial segmentation of the livers, the areas of atypical local shape are determined using training sets. A geodesic active contour locally corrects the segmentations of the livers in abnormal images. Graph cuts segment the hepatic tumors using shape and enhancement constraints. Liver segmentation errors are reduced significantly and all tumors are detected. Finally, support vector machines and feature selection are employed to reduce the number of false tumor detections. A tumor detection true position fraction of 100% is achieved at 2.3 false positives/case, and the tumor burden is estimated with 0.9% error. Results from the test data demonstrate the method's robustness in analyzing livers from difficult clinical cases, allowing the temporal monitoring of patients with hepatic cancer. PMID:22893379
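    The tumor-burden figure reported above is, in the end, a volume ratio computed from the liver and tumor segmentation masks. That final step can be sketched as follows (the mask shapes and voxel volume here are illustrative assumptions, not values from the paper):

    ```python
    import numpy as np

    def tumor_burden(liver_mask, tumor_mask, voxel_volume_ml=1.0):
        """Tumor burden: tumor volume as a fraction of total liver volume,
        plus absolute tumor volume given a per-voxel volume."""
        liver_vox = np.count_nonzero(liver_mask)
        tumor_vox = np.count_nonzero(tumor_mask & liver_mask)  # count tumor voxels inside the liver
        return tumor_vox / liver_vox, tumor_vox * voxel_volume_ml

    # Toy 3-D masks: a 10x10x10-voxel "liver" containing a 3x3x3-voxel "tumor"
    liver = np.zeros((20, 20, 20), dtype=bool)
    liver[5:15, 5:15, 5:15] = True
    tumor = np.zeros_like(liver)
    tumor[6:9, 6:9, 6:9] = True

    frac, vol = tumor_burden(liver, tumor)  # frac = 27 / 1000 = 0.027
    ```

    Tracking `frac` over successive scans is what enables the temporal monitoring the abstract mentions; the hard part of the paper is producing accurate masks, not this ratio.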

  6. Spiral Gradient Coil Design for Use in Cylindrical MRI Systems.

    PubMed

    Wang, Yaohui; Xin, Xuegang; Liu, Feng; Crozier, Stuart

    2018-04-01

    In magnetic resonance imaging, the stream function based method is commonly used in the design of gradient coils. However, this method can be prone to errors associated with the discretization of continuous current density and wire connections. In this paper, we propose a novel gradient coil design scheme that works directly in the wire space, avoiding the system errors that may appear in the stream function approaches. Specifically, the gradient coil pattern is described with dedicated spiral functions adjusted to allow the coil to produce the required field gradients in the imaging area, minimal stray field, and other engineering terms. The performance of a designed spiral gradient coil was compared with its stream-function counterpart. The numerical evaluation shows that, when compared with the conventional solution, the inductance and resistance were reduced by 20.9 and 10.5%, respectively. The overall coil performance (evaluated by the figure of merit, FoM) was improved by up to 26.5% for the x-gradient coil design; for the z-gradient coil design, the inductance and resistance were reduced by 15.1 and 6.7% respectively, and the FoM was increased by 17.7%. In addition, by directly controlling the wire distributions, the spiral gradient coil design was much sparser than conventional coils.

  7. Solution algorithm of dwell time in slope-based figuring model

    NASA Astrophysics Data System (ADS)

    Li, Yong; Zhou, Lin

    2017-10-01

    Surface slope profile is commonly used to evaluate X-ray reflective optics for synchrotron radiation beamlines. Moreover, the measurement result of the measuring instruments for X-ray reflective optics is usually the surface slope profile rather than the surface height profile. To avoid the conversion error, a slope-based figuring model is introduced for processing X-ray reflective optics, instead of the conventional surface height-based model. However, the pulse iteration method, which can quickly obtain the dwell time solution of the traditional height-based figuring model, does not apply to the slope-based figuring model, because the slope removal function takes both positive and negative values and has a complex asymmetric structure. To overcome this problem, we established an optimal mathematical model for the dwell time solution by introducing upper and lower limits on the dwell time and a time gradient constraint. We then used a constrained least squares algorithm to solve for the dwell time in the slope-based figuring model. To validate the proposed algorithm, simulations and experiments were conducted. A flat mirror with an effective aperture of 80 mm was polished on an ion beam machine. After three polishing iterations, the surface slope profile error of the workpiece converged from RMS 5.65 μrad to RMS 1.12 μrad.
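    The core numerical problem described above is a least squares solve with box constraints on the dwell times. A generic sketch of such a solve, using projected gradient descent on a toy removal matrix with signed entries, is shown below (this is a stand-in under assumed data, not the authors' solver, and it omits their time gradient constraint):

    ```python
    import numpy as np

    def bounded_lstsq(A, b, lo, hi, iters=5000):
        """Minimize ||A t - b||^2 subject to lo <= t <= hi,
        via projected gradient descent."""
        t = np.full(A.shape[1], lo, dtype=float)
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step: 1 / Lipschitz constant of the gradient
        for _ in range(iters):
            grad = A.T @ (A @ t - b)
            t = np.clip(t - step * grad, lo, hi)  # gradient step, then project onto the box
        return t

    # Toy removal matrix with signed entries (as a slope removal function has)
    rng = np.random.default_rng(1)
    A = rng.normal(size=(30, 10))
    t_true = rng.uniform(0.1, 0.9, size=10)       # "true" dwell times, strictly inside the bounds
    b = A @ t_true                                 # target slope removal
    t = bounded_lstsq(A, b, lo=0.0, hi=1.0)
    resid = np.linalg.norm(A @ t - b)              # should be near zero for this consistent system
    ```

    In practice a dedicated solver (e.g. an active-set or trust-region bounded least squares routine) would replace the plain projected-gradient loop, but the problem shape is the same.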

  8. Gaming machine addiction: the role of avoidance, accessibility and social support.

    PubMed

    Thomas, Anna C; Allen, Felicity L; Phillips, James; Karantzas, Gery

    2011-12-01

    Commonality in etiology and clinical expression plus high comorbidity between pathological gambling and substance use disorders suggest common underlying motives. It is important to understand common motivators and differentiating factors. An overarching framework of addiction was used to examine predictors of problem gambling in current electronic gaming machine (EGM) gamblers. Path analysis was used to examine the relationships between antecedent factors (stressors, coping habits, social support), gambling motivations (avoidance, accessibility, social) and gambling behavior. Three hundred and forty seven (229 females: M = 29.20 years, SD = 14.93; 118 males: M = 29.64 years, SD = 12.49) people participated. Consistent with stress, coping and addiction theory, situational life stressors and general avoidance coping were positively related to avoidance-motivated gambling. In turn, avoidance-motivated gambling was positively related to EGM gambling frequency and problems. Consistent with exposure theory, life stressors were positively related to accessibility-motivated gambling, and accessibility-motivated gambling was positively related to EGM gambling frequency and gambling problems. These findings are consistent with other addiction research and suggest avoidance-motivated gambling is part of a more generalized pattern of avoidance coping with relative accessibility to EGM gambling explaining its choice as a method of avoidance. Findings also showed social support acted as a direct protective factor in relation to gambling frequency and problems and indirectly via avoidance and accessibility gambling motivations. Finally, life stressors were positively related to socially motivated gambling but this motivation was not related to either social support or gambling behavior suggesting it has little direct influence on gambling problems.

  9. Errors Affect Hypothetical Intertemporal Food Choice in Women

    PubMed Central

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2014-01-01

    Growing evidence suggests that the ability to control behavior is enhanced in contexts in which errors are more frequent. Here we investigated whether pairing desirable food with errors could decrease impulsive choice during hypothetical temporal decisions about food. To this end, healthy women performed a Stop-signal task in which one food cue predicted high-error rate, and another food cue predicted low-error rate. Afterwards, we measured participants’ intertemporal preferences during decisions between smaller-immediate and larger-delayed amounts of food. We expected reduced sensitivity to smaller-immediate amounts of food associated with high-error rate. Moreover, taking into account that deprivational states affect sensitivity for food, we controlled for participants’ hunger. Results showed that pairing food with high-error likelihood decreased temporal discounting. This effect was modulated by hunger, indicating that, the lower the hunger level, the more participants showed reduced impulsive preference for the food previously associated with a high number of errors as compared with the other food. These findings reveal that errors, which are motivationally salient events that recruit cognitive control and drive avoidance learning against error-prone behavior, are effective in reducing impulsive choice for edible outcomes. PMID:25244534

  10. Error Analysis of Indonesian Junior High School Student in Solving Space and Shape Content PISA Problem Using Newman Procedure

    NASA Astrophysics Data System (ADS)

    Sumule, U.; Amin, S. M.; Fuad, Y.

    2018-01-01

    This study aims to determine the types and causes of errors, as well as the efforts attempted to overcome the mistakes made by junior high school students in completing PISA space and shape content. Two subjects were selected based on the mathematical ability test results with the most errors, yet able to communicate orally and in writing. The two selected subjects then worked on the PISA ability test questions and were interviewed to find out the types and causes of the errors, and were then given scaffolding based on the types of mistakes made. The results of this study show that the types of errors the students made were comprehension and transformation errors. The causes were that students were not able to identify the keywords in the question, write down what was known or given, or specify formulas or devise a plan. To overcome these errors, students were given scaffolding. The scaffolding given to overcome misunderstandings was reviewing and restructuring, while the scaffolding given to overcome transformation errors was reviewing, restructuring, explaining, and developing representational tools. Teachers are advised to use scaffolding to resolve errors so that students are able to avoid them.

  11. Risk prediction and aversion by anterior cingulate cortex.

    PubMed

    Brown, Joshua W; Braver, Todd S

    2007-12-01

    The recently proposed error-likelihood hypothesis suggests that anterior cingulate cortex (ACC) and surrounding areas will become active in proportion to the perceived likelihood of an error. The hypothesis was originally derived from a computational model prediction. The same computational model now makes a further prediction that ACC will be sensitive not only to predicted error likelihood, but also to the predicted magnitude of the consequences, should an error occur. The product of error likelihood and predicted error consequence magnitude collectively defines the general "expected risk" of a given behavior in a manner analogous but orthogonal to subjective expected utility theory. New fMRI results from an incentive-change signal task now replicate the error-likelihood effect, validate the further predictions of the computational model, and suggest why some segments of the population may fail to show an error-likelihood effect. In particular, error-likelihood effects and expected risk effects in general indicate greater sensitivity to earlier predictors of errors and are seen in risk-averse but not risk-tolerant individuals. Taken together, the results are consistent with an expected risk model of ACC and suggest that ACC may generally contribute to cognitive control by recruiting brain activity to avoid risk.

  12. Theoretical and experimental errors for in situ measurements of plant water potential.

    PubMed

    Shackel, K A

    1984-07-01

    Errors in psychrometrically determined values of leaf water potential caused by tissue resistance to water vapor exchange and by lack of thermal equilibrium were evaluated using commercial in situ psychrometers (Wescor Inc., Logan, UT) on leaves of Tradescantia virginiana (L.). Theoretical errors in the dewpoint method of operation for these sensors were demonstrated. After correction for these errors, in situ measurements of leaf water potential indicated substantial errors caused by tissue resistance to water vapor exchange (4 to 6% reduction in apparent water potential per second of cooling time used) resulting from humidity depletions in the psychrometer chamber during the Peltier condensation process. These errors were avoided by use of a modified procedure for dewpoint measurement. Large changes in apparent water potential were caused by leaf and psychrometer exposure to moderate levels of irradiance. These changes were correlated with relatively small shifts in psychrometer zero offsets (-0.6 to -1.0 megapascals per microvolt), indicating substantial errors caused by nonisothermal conditions between the leaf and the psychrometer. Explicit correction for these errors is not possible with the current psychrometer design.
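    The abstract quantifies the tissue-resistance error as roughly a 4 to 6% reduction in apparent water potential per second of cooling time. One simple way to read such a cooling-time-dependent error is as something removable by extrapolating readings back to zero cooling time; the sketch below illustrates that idea with hypothetical readings (this is an assumed linear-extrapolation illustration, not the paper's modified dewpoint procedure):

    ```python
    import numpy as np

    # Hypothetical psychrometer readings (MPa) taken at different Peltier
    # cooling times (s); the apparent water potential drops as cooling time grows.
    cooling_times = np.array([2.0, 4.0, 6.0, 8.0])
    readings = np.array([-1.10, -1.21, -1.32, -1.43])

    # Fit a line and take its intercept: the apparent water potential
    # extrapolated to zero cooling time, free of the depletion error.
    slope, intercept = np.polyfit(cooling_times, readings, 1)
    psi_corrected = intercept
    ```

    For these made-up readings the fitted intercept is about -0.99 MPa, noticeably less negative than any single measurement, which is the direction of bias the abstract describes.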

  13. Theoretical and Experimental Errors for In Situ Measurements of Plant Water Potential 1

    PubMed Central

    Shackel, Kenneth A.

    1984-01-01

    Errors in psychrometrically determined values of leaf water potential caused by tissue resistance to water vapor exchange and by lack of thermal equilibrium were evaluated using commercial in situ psychrometers (Wescor Inc., Logan, UT) on leaves of Tradescantia virginiana (L.). Theoretical errors in the dewpoint method of operation for these sensors were demonstrated. After correction for these errors, in situ measurements of leaf water potential indicated substantial errors caused by tissue resistance to water vapor exchange (4 to 6% reduction in apparent water potential per second of cooling time used) resulting from humidity depletions in the psychrometer chamber during the Peltier condensation process. These errors were avoided by use of a modified procedure for dewpoint measurement. Large changes in apparent water potential were caused by leaf and psychrometer exposure to moderate levels of irradiance. These changes were correlated with relatively small shifts in psychrometer zero offsets (−0.6 to −1.0 megapascals per microvolt), indicating substantial errors caused by nonisothermal conditions between the leaf and the psychrometer. Explicit correction for these errors is not possible with the current psychrometer design. PMID:16663701

  14. Comparative Cost-Effectiveness Analysis of Three Different Automated Medication Systems Implemented in a Danish Hospital Setting.

    PubMed

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2018-02-01

    Automated medication systems have been found to reduce errors in the medication process, but little is known about the cost-effectiveness of such systems. The objective of this study was to perform a model-based indirect cost-effectiveness comparison of three different, real-world automated medication systems compared with current standard practice. The considered automated medication systems were a patient-specific automated medication system (psAMS), a non-patient-specific automated medication system (npsAMS), and a complex automated medication system (cAMS). The economic evaluation used original effect and cost data from prospective, controlled, before-and-after studies of medication systems implemented at a Danish hematological ward and an acute medical unit. Effectiveness was described as the proportion of clinical and procedural error opportunities that were associated with one or more errors. An error was defined as a deviation from the electronic prescription, from standard hospital policy, or from written procedures. The cost assessment was based on 6-month standardization of observed cost data. The model-based comparative cost-effectiveness analyses were conducted with system-specific assumptions of the effect size and costs in scenarios with consumptions of 15,000, 30,000, and 45,000 doses per 6-month period. With 30,000 doses the cost-effectiveness model showed that the cost-effectiveness ratio expressed as the cost per avoided clinical error was €24 for the psAMS, €26 for the npsAMS, and €386 for the cAMS. Comparison of the cost-effectiveness of the three systems in relation to different valuations of an avoided error showed that the psAMS was the most cost-effective system regardless of error type or valuation. The model-based indirect comparison against the conventional practice showed that psAMS and npsAMS were more cost-effective than the cAMS alternative, and that psAMS was more cost-effective than npsAMS.
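    The "cost per avoided error" figures above are incremental cost-effectiveness ratios: extra cost of the system divided by the number of errors it avoids. A sketch of that calculation is below; the cost and error counts are made-up figures chosen only so the ratio comes out at €24, matching the order of magnitude reported for the psAMS (they are not the study's data):

    ```python
    def cost_per_avoided_error(cost_system, cost_standard,
                               errors_standard, errors_system):
        """Incremental cost-effectiveness ratio: extra cost per error avoided."""
        extra_cost = cost_system - cost_standard          # added cost of the automated system
        errors_avoided = errors_standard - errors_system  # errors prevented vs. standard practice
        return extra_cost / errors_avoided

    # Hypothetical 6-month figures for a 30,000-dose scenario
    ratio = cost_per_avoided_error(cost_system=60_000, cost_standard=36_000,
                                   errors_standard=2_400, errors_system=1_400)
    # 24,000 extra euros / 1,000 avoided errors = 24 euros per avoided error
    ```

    The same ratio computed with a much costlier system and a similar error reduction is what drives the far larger €386 figure for the cAMS.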

  15. Global Warming Estimation From Microwave Sounding Unit

    NASA Technical Reports Server (NTRS)

    Prabhakara, C.; Iacovazzi, R., Jr.; Yoo, J.-M.; Dalu, G.

    1998-01-01

    Microwave Sounding Unit (MSU) Ch 2 data sets, collected from sequential, polar-orbiting, Sun-synchronous National Oceanic and Atmospheric Administration operational satellites, contain systematic calibration errors that are coupled to the diurnal temperature cycle over the globe. Since these coupled errors in MSU data differ between successive satellites, it is necessary to make compensatory adjustments to these multisatellite data sets in order to determine long-term global temperature change. With the aid of the observations during overlapping periods of successive satellites, we can determine such adjustments and use them to account for the coupled errors in the long-term time series of MSU Ch 2 global temperature. In turn, these adjusted MSU Ch 2 data sets can be used to yield global temperature trend. In a pioneering study, Spencer and Christy (SC) (1990) developed a procedure to derive the global temperature trend from MSU Ch 2 data. Such a procedure can leave unaccounted residual errors in the time series of the temperature anomalies deduced by SC, which could lead to a spurious long-term temperature trend derived from their analysis. In the present study, we have developed a method that avoids the shortcomings of the SC procedure, the magnitude of the coupled errors is not determined explicitly. Furthermore, based on some assumptions, these coupled errors are eliminated in three separate steps. Such a procedure can leave unaccounted residual errors in the time series of the temperature anomalies deduced by SC, which could lead to a spurious long-term temperature trend derived from their analysis. In the present study, we have developed a method that avoids the shortcomings of the SC procedures. Based on our analysis, we find there is a global warming of 0.23+/-0.12 K between 1980 and 1991. 
Also, in this study, the time series of global temperature anomalies constructed by removing the global mean annual temperature cycle compares favorably with a similar time series obtained from conventional observations of temperature.
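
    The overlap-adjustment step described above can be illustrated with a toy sketch (all numbers are invented for illustration; this is not the actual MSU Ch 2 processing): the successor satellite's calibration offset is estimated as the mean difference over the overlap period and removed before the records are concatenated.

```python
def estimate_bias(ref, succ):
    """Mean difference over the months where both satellites report."""
    pairs = [(r, s) for r, s in zip(ref, succ)
             if r is not None and s is not None]
    return sum(s - r for r, s in pairs) / len(pairs)

# Illustrative anomaly series (K) on a common monthly axis; None = no data.
sat_a = [0.10, 0.12, 0.11, 0.13, None, None]   # reference satellite
sat_b = [None, None, 0.41, 0.43, 0.42, 0.45]   # carries a +0.30 K offset

bias = estimate_bias(sat_a, sat_b)             # ~ +0.30 K
sat_b_adj = [None if s is None else s - bias for s in sat_b]

# Merge into one long record, preferring the reference where available.
merged = [a if a is not None else b for a, b in zip(sat_a, sat_b_adj)]
```

    In the real problem the offsets are coupled to the diurnal cycle, which is why a constant-offset picture like this is only a starting point.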

  16. Design and performance evaluation of a distributed OFDMA-based MAC protocol for MANETs.

    PubMed

    Park, Jaesung; Chung, Jiyoung; Lee, Hyungyu; Lee, Jung-Ryun

    2014-01-01

    In this paper, we propose a distributed MAC protocol for OFDMA-based wireless mobile ad hoc multihop networks, in which the resource reservation and data transmission procedures are operated in a distributed manner. A frame format is designed around the characteristic of OFDMA that each node can transmit or receive data to or from multiple nodes simultaneously. Under this frame structure, we propose a distributed resource management method including network state estimation and resource reservation processes. We categorize five types of logical errors according to their root causes and show that two of the logical errors are inevitable while the other three are avoided under the proposed distributed MAC protocol. In addition, we provide a systematic method to determine the advertisement period of each node by presenting a clear relation between the accuracy of estimated network states and the signaling overhead. We evaluate the performance of the proposed protocol with respect to the reservation success rate and the data transmission success rate. Since our method focuses on avoiding logical errors, it could easily be placed on top of other resource allocation methods that focus on the physical layer issues of the resource management problem and be interworked with them.

  17. Effect of heteroscedasticity treatment in residual error models on model calibration and prediction uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Sun, Ruochen; Yuan, Huiling; Liu, Xiaoli

    2017-11-01

    The heteroscedasticity treatment in residual error models directly impacts model calibration and prediction uncertainty estimation. This study compares three methods of dealing with heteroscedasticity: the explicit linear modeling (LM) method, the nonlinear modeling (NL) method using a hyperbolic tangent function, and the implicit Box-Cox transformation (BC). A combined approach (CA) is then proposed to exploit the advantages of both the LM and BC methods. In conjunction with the first-order autoregressive model and the skew exponential power (SEP) distribution, four residual error models are generated, namely LM-SEP, NL-SEP, BC-SEP and CA-SEP, and their corresponding likelihood functions are applied to the Variable Infiltration Capacity (VIC) hydrologic model over the Huaihe River basin, China. Results show that LM-SEP yields the poorest streamflow predictions, with the widest uncertainty band and unrealistic negative flows. The NL and BC methods deal with the heteroscedasticity better and hence their corresponding predictive performances are improved, yet the negative flows cannot be avoided. The CA-SEP produces the most accurate predictions with the highest reliability and effectively avoids the negative flows, because the CA approach is capable of addressing the complicated heteroscedasticity over the study basin.
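
    The two explicit treatments compared here can be sketched in a few lines (a minimal illustration with assumed parameter values, not the study's calibrated error models): the LM method lets the residual standard deviation grow linearly with the predicted flow, while the Box-Cox transform compresses high flows so the transformed residuals are closer to homoscedastic.

```python
import math

def lm_sigma(mu, a=0.5, b=0.1):
    """Linear heteroscedasticity model: sigma = a + b * mu
    (a and b are illustrative, not calibrated values)."""
    return a + b * mu

def box_cox(y, lam=0.3):
    """Box-Cox transform; as lam -> 0 it reduces to log(y)."""
    return (y ** lam - 1.0) / lam if lam != 0 else math.log(y)

flows = [1.0, 10.0, 100.0]              # predicted streamflow (arbitrary units)
sigmas = [lm_sigma(q) for q in flows]   # residual spread widens with flow
z = [box_cox(q) for q in flows]         # transformed flows: high values compressed
```

    Under LM the error model is explicit in flow space; under BC the same heteroscedasticity is handled implicitly in the transformed space, which is the trade-off a combined approach tries to balance.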

  18. What Do Spelling Errors Tell Us? Classification and Analysis of Errors Made by Greek Schoolchildren with and without Dyslexia

    ERIC Educational Resources Information Center

    Protopapas, Athanassios; Fakou, Aikaterini; Drakopoulou, Styliani; Skaloumbakas, Christos; Mouzaki, Angeliki

    2013-01-01

    In this study we propose a classification system for spelling errors and determine the most common spelling difficulties of Greek children with and without dyslexia. Spelling skills of 542 children from the general population and 44 children with dyslexia, Grades 3-4 and 7, were assessed with a dictated common word list and age-appropriate…

  19. Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error

    PubMed Central

    Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee

    2017-01-01

    Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to the errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146

  20. Fringe order correction for the absolute phase recovered by two selected spatial frequency fringe projections in fringe projection profilometry.

    PubMed

    Ding, Yi; Peng, Kai; Yu, Miao; Lu, Lei; Zhao, Kun

    2017-08-01

    The performance of the two selected spatial frequency phase unwrapping methods is limited by a phase error bound beyond which errors will occur in the fringe order, leading to a significant error in the recovered absolute phase map. In this paper, we propose a method to detect and correct the wrong fringe orders. Two constraints are introduced during the fringe order determination of two selected spatial frequency phase unwrapping methods. A strategy to detect and correct the wrong fringe orders is also described. Compared with the existing methods, we do not need to estimate a threshold on absolute phase values to determine the fringe order error, which makes the method more reliable and avoids the search procedure used in detecting and correcting successive fringe order errors. The effectiveness of the proposed method is validated by experimental results.
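
    For context, the fringe order determination that such corrections guard against can be sketched generically (a standard two-frequency temporal unwrapping step with toy numbers, not the authors' specific algorithm): a coarse absolute phase predicts the integer fringe order of the wrapped phase, and a coarse-phase error larger than pi pushes the rounding to the wrong integer.

```python
import math

TWO_PI = 2.0 * math.pi

def unwrap_with_reference(phi_wrapped, phi_coarse):
    """Absolute phase from a wrapped phase plus a coarse absolute
    estimate: fringe order k = round((phi_coarse - phi_wrapped) / 2*pi)."""
    k = round((phi_coarse - phi_wrapped) / TWO_PI)
    return phi_wrapped + k * TWO_PI

true_phase = 25.0                        # ground-truth absolute phase (rad)
phi_w = math.fmod(true_phase, TWO_PI)    # wrapped phase actually measured

# A coarse estimate within pi of the truth yields the right fringe order...
ok = unwrap_with_reference(phi_w, true_phase + 0.3)
# ...but a coarse error beyond pi flips k by one, shifting the result 2*pi:
bad = unwrap_with_reference(phi_w, true_phase + 3.5)
```

    The second case, a fringe order off by one, is the failure mode that detection-and-correction strategies like the one above target.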

  1. A STUDY ON REASONS OF ERRORS OF OLD SURVEY MAPS IN CADASTRAL SYSTEM

    NASA Astrophysics Data System (ADS)

    Yanase, Norihiko

    This paper explicates the sources of errors in survey maps made in the 19th century. The present cadastral system rests on registers and survey maps that were compiled when the land taxation system was changed in the Meiji era. Many Japanese recognize the commonly cited reasons for such errors, which amount to several to more than ten percent of the mapped area: poor survey technique by farmers, deliberately lengthened measures used to avoid heavy taxation, careless official checks, and other deception. Based on an analysis of the old survey regulations, the history of map making, and studies of the cadastral system, the author maintains that such errors, called nawa-nobi, were lawful under the survey regulations then in force. In addition, a further source of survey map errors should be pointed out: the easy subdivision system, under which subdivisions could be approved without an actual survey, and the disposal of state property with inadequate surveys.

  2. Medication errors as malpractice-a qualitative content analysis of 585 medication errors by nurses in Sweden.

    PubMed

    Björkstén, Karin Sparring; Bergqvist, Monica; Andersén-Karlsson, Eva; Benson, Lina; Ulfvarson, Johanna

    2016-08-24

    Many studies address the prevalence of medication errors, but few address medication errors serious enough to be regarded as malpractice. Other studies have analyzed the individual and system contributory factors leading to a medication error. Nurses have a key role in medication administration, and there are contradictory reports on nurses' work experience in relation to the risk and type of medication errors. All medication errors where a nurse was held responsible for malpractice (n = 585) during 11 years in Sweden were included. A qualitative content analysis and classification according to type and to individual and system contributory factors was made. In order to test for possible differences between nurses' work experience and associations within and between the errors and contributory factors, Fisher's exact test was used, and Cohen's kappa (k) was performed to estimate the magnitude and direction of the associations. There were a total of 613 medication errors in the 585 cases, the most common being "Wrong dose" (41 %), "Wrong patient" (13 %) and "Omission of drug" (12 %). In 95 % of the cases, an average of 1.4 individual contributory factors was found; the most common being "Negligence, forgetfulness or lack of attentiveness" (68 %), "Proper protocol not followed" (25 %), "Lack of knowledge" (13 %) and "Practice beyond scope" (12 %). In 78 % of the cases, an average of 1.7 system contributory factors was found; the most common being "Role overload" (36 %), "Unclear communication or orders" (30 %) and "Lack of adequate access to guidelines or unclear organisational routines" (30 %). The errors "Wrong patient due to mix-up of patients" and "Wrong route" and the contributory factors "Lack of knowledge" and "Negligence, forgetfulness or lack of attentiveness" were more common in less experienced nurses. 
The experienced nurses were more prone to "Practice beyond scope of practice" and to make errors in spite of "Lack of adequate access to guidelines or unclear organisational routines". Medication errors regarded as malpractice in Sweden were of the same character as medication errors worldwide. A complex interplay between individual and system factors often contributed to the errors.
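
    The association measure used here, Cohen's kappa, is straightforward to compute from a contingency table; below is a generic sketch with invented counts (not the study's data).

```python
def cohens_kappa(table):
    """Cohen's kappa for a square contingency table:
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = sum(sum(row) for row in table)
    p_o = sum(table[i][i] for i in range(len(table))) / n
    p_e = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical co-occurrence counts of an error type and a contributory
# factor (rows: error present/absent; columns: factor present/absent):
kappa = cohens_kappa([[40, 10], [15, 35]])   # moderate positive association
```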

  3. Runway safety : it's everybody's business

    DOT National Transportation Integrated Search

    2001-07-01

    This booklet tells pilots and controllers what they can do to help prevent runway incursions by helping them avoid situations that lead to errors and by alerting them to situations where extra vigilance is required. It also provides information on how co...

  4. Experimental magic state distillation for fault-tolerant quantum computing.

    PubMed

    Souza, Alexandre M; Zhang, Jingfu; Ryan, Colm A; Laflamme, Raymond

    2011-01-25

    Any physical quantum device for quantum information processing (QIP) is subject to errors in implementation. In order to be reliable and efficient, quantum computers will need error-correcting or error-avoiding methods. Fault-tolerance achieved through quantum error correction will be an integral part of quantum computers. Of the many methods that have been discovered to implement it, a highly successful approach has been to use transversal gates and specific initial states. A critical element for its implementation is the availability of high-fidelity initial states, such as |0〉 and the 'magic state'. Here, we report an experiment, performed in a nuclear magnetic resonance (NMR) quantum processor, showing sufficient quantum control to improve the fidelity of imperfect initial magic states by distilling five of them into one with higher fidelity.

  5. First order error corrections in common introductory physics experiments

    NASA Astrophysics Data System (ADS)

    Beckey, Jacob; Baker, Andrew; Aravind, Vasudeva; Clarion Team

    As a part of introductory physics courses, students perform different standard lab experiments. Almost all of these experiments are prone to errors owing to factors like friction, misalignment of equipment, air drag, etc. Usually these types of errors are ignored by students, and little thought is given to their sources. However, paying attention to the factors that give rise to errors helps students make better physics models and understand the physical phenomena behind experiments in more detail. In this work, we explore common causes of errors in introductory physics experiments and suggest changes that will mitigate the errors, or suggest models that take the sources of these errors into consideration. This work helps students build better, more refined physical models and understand physics concepts in greater detail. We thank the Clarion University undergraduate student grant for financial support of this project.

  6. AN ASSESSMENT OF SUNSPOT NUMBER DATA COMPOSITES OVER 1845–2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockwood, M.; Owens, M. J.; Barnard, L.

    2016-06-10

    New sunspot data composites, some of which are radically different in the character of their long-term variation, are evaluated over the interval 1845–2014. The method commonly used to calibrate historic sunspot data, relative to modern-day data, is “daisy-chaining,” whereby calibration is passed from one data subset to the neighboring one, usually using regressions of the data subsets for the intervals of their overlap. Recent studies have illustrated serious pitfalls in these regressions, and the resulting errors can be compounded by their repeated use as the data sequence is extended back in time. Hence, the recent composite data series by Usoskin et al., R_UEA, is a very important advance because it avoids regressions, daisy-chaining, and other common, but invalid, assumptions: this is achieved by comparing the statistics of “active-day” fractions to those for a single reference data set. We study six sunspot data series, including R_UEA and the new “backbone” data series (R_BB, recently generated by Svalgaard and Schatten by employing both regression and daisy-chaining). We show that all six can be used with a continuity model to reproduce the main features of the open solar flux variation for 1845–2014, as reconstructed from geomagnetic activity data. However, some differences can be identified that are consistent with tests using a basket of other proxies for solar magnetic fields. Using data from a variety of sunspot observers, we illustrate problems with the method employed in generating R_BB that cause it to increasingly overestimate sunspot numbers going back in time, and we recommend using R_UEA because it employs more robust procedures that avoid such problems.

  7. Exploring Business Strategy in Health Information Exchange Organizations.

    PubMed

    Langabeer, James R; Champagne, Tiffany

    2016-01-01

    Unlike consumer goods industries, healthcare has been slow to implement technologies that support exchange of data in patients' health records. This results in avoidable medication errors, avoidable hospital readmissions, unnecessary duplicate testing, and other inefficient or wasteful practices. Community-based regional health information exchange (HIE) organizations have evolved in response to federal aims to encourage interoperability, yet little is known about their strategic approach. We use the lens of institutional and strategic management theories to empirically explore the differences in business strategies deployed in HIEs that are, to date, financially sustainable versus those that are not. We developed a 20-question survey targeted to CEOs to assess HIE business strategies. Our sample consisted of 60 community-based exchanges distributed throughout the United States, and we achieved a 58% response rate. Questions centered on competitive strategy and financial sustainability. We relied on logistic regression methods to explore relationships between variables. Our regression identified characteristics common to sustainable organizations. We defined sustainability as revenues exceeding operational costs. Seventeen of the 35 organizations (49%) defined themselves as currently sustainable. Focus and cost leadership strategies were significantly associated with sustainability. Growth strategies, which were much more common than other strategies, were not associated with sustainability. We saw little evidence of a differentiation strategy (i.e., the basis of competition whereby the attributes of a product or service are unmatched by rivals). Most CEOs had a relatively optimistic outlook, with 60% stating they were confident of surviving over the next 5 years; however, nearly 9% of the organizations were in some phase of divestiture or exit from the market. 
HIEs are evolving differently based on local leadership decisions, yet their strategic approach is isomorphic (or similar). Further insight into successful business strategies could help ensure the long-term survival of HIEs.

  8. Orbital-free bond breaking via machine learning

    NASA Astrophysics Data System (ADS)

    Snyder, John C.; Rupp, Matthias; Hansen, Katja; Blooston, Leo; Müller, Klaus-Robert; Burke, Kieron

    2013-12-01

    Using a one-dimensional model, we explore the ability of machine learning to approximate the non-interacting kinetic energy density functional of diatomics. This nonlinear interpolation between Kohn-Sham reference calculations can (i) accurately dissociate a diatomic, (ii) be systematically improved with increased reference data and (iii) generate accurate self-consistent densities via a projection method that avoids directions with no data. With relatively few densities, the error due to the interpolation is smaller than typical errors in standard exchange-correlation functionals.

  9. Data error and highly parameterized groundwater models

    USGS Publications Warehouse

    Hill, M.C.

    2008-01-01

    Strengths and weaknesses of highly parameterized models, in which the number of parameters exceeds the number of observations, are demonstrated using a synthetic test case. Results suggest that the approach can yield close matches to observations but also serious errors in system representation. It is proposed that avoiding the difficulties of highly parameterized models requires close evaluation of: (1) model fit, (2) performance of the regression, and (3) estimated parameter distributions. Comparisons to hydrogeologic information are expected to be critical to obtaining credible models. Copyright © 2008 IAHS Press.

  10. Dopamine reward prediction error responses reflect marginal utility.

    PubMed

    Stauffer, William R; Lak, Armin; Schultz, Wolfram

    2014-11-03

    Optimal choices require an accurate neuronal representation of economic value. In economics, utility functions are mathematical representations of subjective value that can be constructed from choices under risk. Utility usually exhibits a nonlinear relationship to physical reward value that corresponds to risk attitudes and reflects the increasing or decreasing marginal utility obtained with each additional unit of reward. Accordingly, neuronal reward responses coding utility should robustly reflect this nonlinearity. In two monkeys, we measured utility as a function of physical reward value from meaningful choices under risk (that adhered to first- and second-order stochastic dominance). The resulting nonlinear utility functions predicted the certainty equivalents for new gambles, indicating that the functions' shapes were meaningful. The monkeys were risk seeking (convex utility function) for low reward and risk avoiding (concave utility function) with higher amounts. Critically, the dopamine prediction error responses at the time of reward itself reflected the nonlinear utility functions measured at the time of choices. In particular, the reward response magnitude depended on the first derivative of the utility function and thus reflected the marginal utility. Furthermore, dopamine responses recorded outside of the task reflected the marginal utility of unpredicted reward. Accordingly, these responses were sufficient to train reinforcement learning models to predict the behaviorally defined expected utility of gambles. These data suggest a neuronal manifestation of marginal utility in dopamine neurons and indicate a common neuronal basis for fundamental explanatory constructs in animal learning theory (prediction error) and economic decision theory (marginal utility). Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
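
    The link between the utility function's first derivative and the reward response can be made concrete with a toy calculation (an assumed concave utility, not the utilities fitted from the monkeys' choices):

```python
def utility(x):
    """Assumed risk-averse (concave) utility: u(x) = sqrt(x)."""
    return x ** 0.5

def marginal_utility(x, h=1e-6):
    """Numerical first derivative u'(x) by central differences."""
    return (utility(x + h) - utility(x - h)) / (2.0 * h)

# Diminishing marginal utility: each extra unit of reward adds less value,
# so a prediction error response that scales with u' shrinks as well.
rewards = [1.0, 4.0, 9.0]
mu = [marginal_utility(x) for x in rewards]   # approx. 0.5, 0.25, 0.167
```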

  11. Reliable Channel-Adapted Error Correction: Bacon-Shor Code Recovery from Amplitude Damping

    NASA Astrophysics Data System (ADS)

    Piedrafita, Álvaro; Renes, Joseph M.

    2017-12-01

    We construct two simple error correction schemes adapted to amplitude damping noise for Bacon-Shor codes and investigate their prospects for fault-tolerant implementation. Both consist solely of Clifford gates and require far fewer qubits, relative to the standard method, to achieve exact correction to a desired order in the damping rate. The first, employing one-bit teleportation and single-qubit measurements, needs only one-fourth as many physical qubits, while the second, using just stabilizer measurements and Pauli corrections, needs only half. The improvements stem from the fact that damping events need only be detected, not corrected, and that effective phase errors arising due to undamped qubits occur at a lower rate than damping errors. For error correction that is itself subject to damping noise, we show that existing fault-tolerance methods can be employed for the latter scheme, while the former can be made to avoid potential catastrophic errors and can easily cope with damping faults in ancilla qubits.

  12. Theoretical study of the accuracy of the elution by characteristic points method for bi-langmuir isotherms.

    PubMed

    Ravald, L; Fornstedt, T

    2001-01-26

    The bi-Langmuir equation has recently been proven essential for describing chiral chromatographic surfaces, and we therefore investigated the accuracy of the elution by characteristic points (ECP) method for the estimation of bi-Langmuir isotherm parameters. The ECP calculations were done on elution profiles generated by the equilibrium-dispersive model of chromatography for five different sets of bi-Langmuir parameters. The ECP method generates two different errors: (i) the error of the ECP-calculated isotherm and (ii) the model error of the fitting to the ECP isotherm. Both errors decreased with increasing column efficiency. Moreover, the model error was strongly affected by the weight of the bi-Langmuir function fitted. For some bi-Langmuir compositions the error of the ECP-calculated isotherm is too large even at high column efficiencies. Guidelines are given on surface types to be avoided and on the column efficiencies and loading factors required for adequate parameter estimation with ECP.
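
    For reference, the bi-Langmuir isotherm under study is simply the sum of two Langmuir terms; a small sketch with hypothetical parameters (not one of the paper's five sets):

```python
def bi_langmuir(c, qa, ba, qb, bb):
    """q(C) = qa*ba*C/(1 + ba*C) + qb*bb*C/(1 + bb*C):
    one abundant low-energy site type plus one sparse high-energy type."""
    return qa * ba * c / (1.0 + ba * c) + qb * bb * c / (1.0 + bb * c)

# Hypothetical chiral-surface parameters: many nonselective sites (qa, ba)
# and a few strongly binding enantioselective sites (qb, bb).
params = dict(qa=100.0, ba=0.01, qb=5.0, bb=1.0)
concs = [0.1, 1.0, 10.0, 100.0]
q = [bi_langmuir(c, **params) for c in concs]
# q(C) rises monotonically and saturates toward qa + qb = 105.
```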

  13. Hessian matrix approach for determining error field sensitivity to coil deviations.

    DOE PAGES

    Zhu, Caoxiang; Hudson, Stuart R.; Lazerson, Samuel A.; ...

    2018-03-15

    The presence of error fields has been shown to degrade plasma confinement and drive instabilities. Error fields can arise from many sources, but are predominantly attributed to deviations in the coil geometry. In this paper, we introduce a Hessian matrix approach for determining error field sensitivity to coil deviations. A primary cost function used for designing stellarator coils, the surface integral of normalized normal field errors, was adopted to evaluate the deviation of the generated magnetic field from the desired magnetic field. The FOCUS code [Zhu et al., Nucl. Fusion 58(1):016008 (2018)] is utilized to provide fast and accurate calculations of the Hessian. The sensitivities of error fields to coil displacements are then determined by the eigenvalues of the Hessian matrix. A proof-of-principle example is given on a CNT-like configuration. We anticipate that this new method could provide information to avoid dominant coil misalignments and simplify coil designs for stellarators.
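
    The eigenvalue-based sensitivity ranking described above can be sketched for a two-parameter case, where a symmetric 2x2 Hessian has closed-form eigenvalues (the matrix entries are invented, not FOCUS output):

```python
import math

def sym2x2_eigen(a, b, c):
    """Eigen-decomposition of the symmetric 2x2 matrix [[a, b], [b, c]]:
    returns (eigenvalues, unit eigenvector of the largest eigenvalue)."""
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(tr * tr / 4.0 - det)
    lam1, lam2 = tr / 2.0 + disc, tr / 2.0 - disc
    if abs(b) > 1e-12:
        v = (b, lam1 - a)
    else:
        v = (1.0, 0.0) if a >= c else (0.0, 1.0)
    n = math.hypot(*v)
    return (lam1, lam2), (v[0] / n, v[1] / n)

# Hypothetical Hessian of the normal-field cost w.r.t. two coil parameters:
(lam1, lam2), v1 = sym2x2_eigen(4.0, 1.0, 2.0)
# lam1 > lam2: coil deviations along v1 degrade the field fastest, while
# deviations along the orthogonal direction are comparatively benign.
```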

  14. Feature Migration in Time: Reflection of Selective Attention on Speech Errors

    PubMed Central

    Nozari, Nazbanou; Dell, Gary S.

    2012-01-01

    This paper describes an initial study of the effect of focused attention on phonological speech errors. In three experiments, participants recited four-word tongue-twisters, and focused attention on one (or none) of the words. The attended word was singled out differently in each experiment; participants were under instructions to either avoid errors on the attended word, to stress it, or to say it silently. The experiments showed that all methods of attending to a word decreased errors on that word, while increasing errors on the surrounding words. However, this error increase did not result from a relative increase in phonemic migrations originating from the attended word. This pattern is inconsistent with conceptualizing attention either as higher activation of the attended word or greater inhibition of the unattended words throughout the production of the sequence. Instead, it is consistent with a model which presumes that attention exerts its effect at the time of production of the attended word, without lingering effects on the past or the future. PMID:22268910

  16. Simulation of co-phase error correction of optical multi-aperture imaging system based on stochastic parallel gradient descent algorithm

    NASA Astrophysics Data System (ADS)

    He, Xiaojun; Ma, Haotong; Luo, Chuanxin

    2016-10-01

    The optical multi-aperture imaging system is an effective way to enlarge the aperture and increase the resolution of a telescope optical system, the difficulty of which lies in detecting and correcting the co-phase error. This paper presents a method based on the stochastic parallel gradient descent (SPGD) algorithm to correct the co-phase error. Compared with current methods, the SPGD method can avoid explicitly detecting the co-phase error. This paper analyzes the influence of piston error and tilt error on image quality in a double-aperture imaging system, introduces the basic principle of the SPGD algorithm, and discusses the influence of the SPGD algorithm's key parameters (the gain coefficient and the disturbance amplitude) on error control performance. The results show that SPGD can efficiently correct the co-phase error. The convergence speed of the SPGD algorithm improves as the gain coefficient and disturbance amplitude increase, but the stability of the algorithm is reduced. An adaptive gain coefficient can solve this problem appropriately. These results can provide a theoretical reference for the co-phase error correction of multi-aperture imaging systems.
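
    The SPGD update at the heart of this record is compact enough to sketch (a generic SPGD ascent loop on a toy quadratic sharpness metric; the gain and disturbance values are assumptions, not the paper's settings):

```python
import random

def spgd_maximize(metric, u0, gain=1.0, delta=0.1, iters=500, seed=0):
    """Generic SPGD ascent: apply a random +/-delta disturbance to all
    parameters in parallel, measure the metric change dJ, and step along
    the disturbance weighted by dJ."""
    rng = random.Random(seed)
    u = list(u0)
    for _ in range(iters):
        d = [rng.choice((-delta, delta)) for _ in u]
        dj = (metric([a + b for a, b in zip(u, d)])
              - metric([a - b for a, b in zip(u, d)]))
        u = [a + gain * dj * b for a, b in zip(u, d)]
    return u

# Toy sharpness metric peaking when the segment phases reach (1.0, 2.0):
target = (1.0, 2.0)
metric = lambda u: -sum((a - t) ** 2 for a, t in zip(u, target))
u_final = spgd_maximize(metric, [0.0, 0.0])
```

    The key property, visible above, is that the loop only ever evaluates the metric, never the co-phase error itself.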

  17. Human Rhabdomyosarcoma Cell Lines for Rhabdomyosarcoma Research: Utility and Pitfalls

    PubMed Central

    Hinson, Ashley R. P.; Jones, Rosanne; Crose, Lisa E. S.; Belyea, Brian C.; Barr, Frederic G.; Linardic, Corinne M.

    2013-01-01

    Rhabdomyosarcoma (RMS) is the most common soft tissue sarcoma of childhood and adolescence. Despite intergroup clinical trials conducted in Europe and North America, outcomes for high risk patients with this disease have not significantly improved in the last several decades, and survival of metastatic or relapsed disease remains extremely poor. Accrual into new clinical trials is slow and difficult, so in vitro cell-line research and in vivo xenograft models present an attractive alternative for preclinical research for this cancer type. Currently, 30 commonly used human RMS cell lines exist, with differing origins, karyotypes, histologies, and methods of validation. Selecting an appropriate cell line for RMS research has important implications for outcomes. There are also potential pitfalls in using certain cell lines including contamination with murine stromal cells, cross-contamination between cell lines, discordance between the cell line and its associated original tumor, imposter cell lines, and nomenclature errors that result in the circulation of two or more presumed unique cell lines that are actually from the same origin. These pitfalls can be avoided by testing for species-specific isoenzymes, microarray analysis, assays for subtype-specific fusion products, and short tandem repeat analysis. PMID:23882450

  18. Evaluation the effect of energetic particles in solar flares on satellite's life time

    NASA Astrophysics Data System (ADS)

    Bagheri, Z.; Davoudifar, P.

    2016-09-01

    As satellites play multiple roles in human life, damage to their segments and the resulting failures cause problems and great expense, so evaluating the different types of failures in satellite segments is crucial. Solar particles are one of the most important causes of segment damage (hard and soft), both during solar events and in quiet times. During a solar event these particles may cause extensive damage that can even be permanent (hard errors). To mitigate these effects and design shielding, we need to know the SEP (solar energetic particle) flux and the MTTF (mean time to failure) of the segments. In the present work, we calculated the SEP flux incident on a satellite under quiet conditions at different altitudes. The OMERE software was used to determine the coordinates and specifications of a satellite assumed, in the simulations, to have been launched into space. We then considered a common electronic computer part and calculated its MTTF. In the same way, the SEP fluxes were calculated during different solar flares of different solar cycles, and the MTTFs during those flares were evaluated. A relation between solar flare energy and the lifetime (hours) of the satellite electronic part was thus obtained.

  19. Halitosis

    MedlinePlus

    ... can also affect your breath. Common examples of foods and beverages that may cause bad breath include onions, garlic, ... and vegetables every day. Eat less meat. Avoid foods that cause you to have bad breath. Also try to avoid alcoholic beverages, which often cause bad breath. Avoid using tobacco ...

  20. A Sensor Dynamic Measurement Error Prediction Model Based on NAPSO-SVM

    PubMed Central

    Jiang, Minlan; Jiang, Lan; Jiang, Dingde; Li, Fei

    2018-01-01

    Dynamic measurement error correction is an effective way to improve sensor precision. Dynamic measurement error prediction is an important part of error correction, and the support vector machine (SVM) is often used for predicting the dynamic measurement errors of sensors. Traditionally, the SVM parameters were always set manually, which cannot ensure the model’s performance. In this paper, an SVM method based on an improved particle swarm optimization algorithm (NAPSO) is proposed to predict the dynamic measurement errors of sensors. Natural selection and simulated annealing are added to the PSO to raise its ability to avoid local optima. To verify the performance of NAPSO-SVM, three types of algorithms are selected to optimize the SVM’s parameters: the particle swarm optimization algorithm (PSO), the improved PSO optimization algorithm (NAPSO), and the glowworm swarm optimization (GSO). The dynamic measurement error data of two sensors are applied as the test data. The root mean squared error and mean absolute percentage error are employed to evaluate the prediction models’ performances. The experimental results show that among the three tested algorithms the NAPSO-SVM method has better prediction precision and smaller prediction errors, and it is an effective method for predicting the dynamic measurement errors of sensors. PMID:29342942
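
    As a rough illustration of the hybrid idea, PSO with a simulated-annealing acceptance rule for escaping local optima, consider this minimal sketch (generic and with assumed hyperparameters; it is not the paper's NAPSO and omits the natural-selection step):

```python
import math
import random

def annealed_pso(f, dim, n=20, iters=200, seed=1):
    """Minimize f with PSO; worse candidate personal bests are
    occasionally accepted via a Metropolis rule with decaying temperature."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pval = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    temp = 1.0
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            # Simulated-annealing acceptance of the new personal best:
            if val < pval[i] or rng.random() < math.exp((pval[i] - val) / temp):
                pbest[i], pval[i] = pos[i][:], val
                if val < gval:
                    gbest, gval = pos[i][:], val
        temp *= 0.95  # cool down: accept worse moves less and less often

    return gbest, gval

sphere = lambda x: sum(v * v for v in x)
best, best_val = annealed_pso(sphere, dim=2)
```

    Early on, the high temperature lets personal bests wander past shallow local minima; as the temperature decays the loop reverts to plain PSO and settles.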

  1. Impact of Internally Developed Electronic Prescription on Prescribing Errors at Discharge from the Emergency Department

    PubMed Central

    Hitti, Eveline; Tamim, Hani; Bakhti, Rinad; Zebian, Dina; Mufarrij, Afif

    2017-01-01

    Introduction Medication errors are common, with studies reporting at least one error per patient encounter. At hospital discharge, medication errors vary from 15%–38%. However, studies assessing the effect of an internally developed electronic (E)-prescription system at discharge from an emergency department (ED) are comparatively few. Additionally, commercially available electronic solutions are cost-prohibitive in many resource-limited settings. We assessed the impact of introducing an internally developed, low-cost E-prescription system, with a list of commonly prescribed medications, on prescription error rates at discharge from the ED, compared to handwritten prescriptions. Methods We conducted a pre- and post-intervention study comparing error rates in a randomly selected sample of discharge prescriptions (handwritten versus electronic) five months before and four months after the introduction of the E-prescription. The internally developed E-prescription system included a list of 166 commonly prescribed medications with the generic name, strength, dose, frequency and duration. We included a total of 2,883 prescriptions in this study: 1,475 in the pre-intervention phase were handwritten (HW) and 1,408 in the post-intervention phase were electronic. We calculated rates of 14 different errors and compared them between the pre- and post-intervention periods. Results Overall, E-prescriptions included fewer prescription errors than HW prescriptions. Specifically, E-prescriptions reduced missing dose (11.3% to 4.3%, p < 0.0001), missing frequency (3.5% to 2.2%, p = 0.04), missing strength (32.4% to 10.2%, p < 0.0001) and legibility errors (0.7% to 0.2%, p = 0.005). E-prescriptions, however, were associated with a significant increase in duplication errors, specifically with home medication (1.7% to 3%, p = 0.02). 
Conclusion A basic, internally developed E-prescription system, featuring commonly used medications, effectively reduced medication errors in a low-resource setting where the costs of sophisticated commercial electronic solutions are prohibitive. PMID:28874948

  2. Impact of Internally Developed Electronic Prescription on Prescribing Errors at Discharge from the Emergency Department.

    PubMed

    Hitti, Eveline; Tamim, Hani; Bakhti, Rinad; Zebian, Dina; Mufarrij, Afif

    2017-08-01

    Medication errors are common, with studies reporting at least one error per patient encounter. At hospital discharge, medication errors vary from 15%-38%. However, studies assessing the effect of an internally developed electronic (E)-prescription system at discharge from an emergency department (ED) are comparatively few. Additionally, commercially available electronic solutions are cost-prohibitive in many resource-limited settings. We assessed the impact of introducing an internally developed, low-cost E-prescription system, with a list of commonly prescribed medications, on prescription error rates at discharge from the ED, compared to handwritten prescriptions. We conducted a pre- and post-intervention study comparing error rates in a randomly selected sample of discharge prescriptions (handwritten versus electronic) five months before and four months after the introduction of the E-prescription. The internally developed E-prescription system included a list of 166 commonly prescribed medications with the generic name, strength, dose, frequency and duration. We included a total of 2,883 prescriptions in this study: 1,475 in the pre-intervention phase were handwritten (HW) and 1,408 in the post-intervention phase were electronic. We calculated rates of 14 different errors and compared them between the pre- and post-intervention periods. Overall, E-prescriptions included fewer prescription errors than HW prescriptions. Specifically, E-prescriptions reduced missing dose (11.3% to 4.3%, p < 0.0001), missing frequency (3.5% to 2.2%, p = 0.04), missing strength (32.4% to 10.2%, p < 0.0001) and legibility errors (0.7% to 0.2%, p = 0.005). E-prescriptions, however, were associated with a significant increase in duplication errors, specifically with home medication (1.7% to 3%, p = 0.02). 
A basic, internally developed E-prescription system, featuring commonly used medications, effectively reduced medication errors in a low-resource setting where the costs of sophisticated commercial electronic solutions are prohibitive.
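    The pre/post comparisons reported above boil down to two-proportion tests on 2×2 tables. A minimal sketch of that calculation follows; the counts are reconstructed from the reported percentages and sample sizes, so they are approximate.

```python
# Hedged sketch: a two-proportion (2x2) Pearson chi-square of the kind used to
# compare pre- vs post-intervention error rates. Counts are reconstructed from
# the reported percentages and sample sizes, so they are approximate.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    plus a flag for the usual expected-count >= 5 validity condition."""
    n = a + b + c + d
    expected_ok = all(
        row * col / n >= 5
        for row in (a + b, c + d)
        for col in (a + c, b + d)
    )
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, expected_ok

# Missing-dose errors: ~11.3% of 1,475 handwritten vs ~4.3% of 1,408 electronic
hw_err = round(0.113 * 1475)
e_err = round(0.043 * 1408)
chi2, ok = chi_square_2x2(hw_err, 1475 - hw_err, e_err, 1408 - e_err)
# chi2 lands far above 10.83 (the 0.001 critical value), consistent with p < 0.0001
```

    The same statistic, referred to a chi-square distribution with one degree of freedom, yields the p-values quoted in the abstract.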

  3. Medical error and related factors during internship and residency.

    PubMed

    Ahmadipour, Habibeh; Nahid, Mortazavi

    2015-01-01

    It is difficult to determine the real incidence of medical errors due to the lack of a precise definition of errors, as well as the failure to report them under certain circumstances. We carried out a cross-sectional study at Kerman University of Medical Sciences, Iran, in 2013. The participants were selected through the census method. The data were collected using a self-administered questionnaire, which consisted of questions on the participants' demographic data and questions on the medical errors committed. The data were analysed with SPSS 19. It was found that 270 participants had committed medical errors. There was no significant difference in the frequency of errors committed by interns and residents. Among residents, the most common error was misdiagnosis; among interns, it was errors related to history-taking and physical examination. Considering that medical errors are common in the clinical setting, the education system should train interns and residents to prevent their occurrence. In addition, the system should develop a positive attitude among them so that they can deal better with medical errors.

  4. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  5. Small mammal community composition in cornfields, roadside ditches, and prairies in eastern Nebraska

    USGS Publications Warehouse

    Kirsch, E.M.

    1997-01-01

    Community composition of small mammals was examined in prairies, cornfields, and their adjacent roadside ditches in eastern Nebraska. Western harvest mice (Reithrodontomys megalotis) and meadow voles (Microtus pennsylvanicus) were associated with prairie habitat, were common in ditches, but avoided cornfields. Prairie voles (M. ochrogaster) and white-footed mice (Peromyscus leucopus) were associated with ditch habitat, were common in prairies, but avoided cornfields. Short-tailed shrews (Blarina brevicauda) avoided cornfields, were associated with ditches next to cornfields, but were common in prairies and ditches next to prairies. Deer mice (P. maniculatus) were associated with cornfields but were relatively common in prairies and ditches. House mice (Mus musculus) were most common in ditches next to cornfields, occurred in cornfields and ditches next to prairies, but were not captured in prairies. Although community composition appears to differ among prairies, ditches, and cornfields, ditches support a more complete suite of the native small mammal species in large and relatively even numbers, whereas cornfields only support deer mice in large numbers.

  6. Sources of error in the retracted scientific literature.

    PubMed

    Casadevall, Arturo; Steen, R Grant; Fang, Ferric C

    2014-09-01

    Retraction of flawed articles is an important mechanism for correction of the scientific literature. We recently reported that the majority of retractions are associated with scientific misconduct. In the current study, we focused on the subset of retractions for which no misconduct was identified, in order to identify the major causes of error. Analysis of the retraction notices for 423 articles indexed in PubMed revealed that the most common causes of error-related retraction are laboratory errors, analytical errors, and irreproducible results. The most common laboratory errors are contamination and problems relating to molecular biology procedures (e.g., sequencing, cloning). Retractions due to contamination were more common in the past, whereas analytical errors are now increasing in frequency. A number of publications that have not been retracted despite being shown to contain significant errors suggest that barriers to retraction may impede correction of the literature. In particular, few cases of retraction due to cell line contamination were found despite recognition that this problem has affected numerous publications. An understanding of the errors leading to retraction can guide practices to improve laboratory research and the integrity of the scientific literature. Perhaps most important, our analysis has identified major problems in the mechanisms used to rectify the scientific literature and suggests a need for action by the scientific community to adopt protocols that ensure the integrity of the publication process. © FASEB.

  7. Medication errors in chemotherapy preparation and administration: a survey conducted among oncology nurses in Turkey.

    PubMed

    Ulas, Arife; Silay, Kamile; Akinci, Sema; Dede, Didem Sener; Akinci, Muhammed Bulent; Sendur, Mehmet Ali Nahit; Cubukcu, Erdem; Coskun, Hasan Senol; Degirmenci, Mustafa; Utkan, Gungor; Ozdemir, Nuriye; Isikdogan, Abdurrahman; Buyukcelik, Abdullah; Inanc, Mevlude; Bilici, Ahmet; Odabasi, Hatice; Cihan, Sener; Avci, Nilufer; Yalcin, Bulent

    2015-01-01

    Medication errors in oncology may cause severe clinical problems due to the low therapeutic indices and high toxicity of chemotherapeutic agents. We aimed to investigate unintentional medication errors and their underlying factors during chemotherapy preparation and administration, based on a systematic survey designed to reflect oncology nurses' experience. This study was conducted in 18 adult chemotherapy units with the voluntary participation of 206 nurses. A survey was developed by the primary investigators; medication errors (MEs) were defined as preventable errors during the prescription, ordering, preparation, or administration of medication. The survey consisted of 4 parts: demographic features of the nurses; workload of the chemotherapy units; errors, and their estimated monthly number, during chemotherapy preparation and administration; and evaluation of the possible factors responsible for MEs. The survey was conducted by face-to-face interview, and data analyses were performed with descriptive statistics. Chi-square or Fisher exact tests were used for comparative analysis of categorical data. Some 83.4% of the 210 nurses reported one or more errors during chemotherapy preparation and administration. Prescribing or ordering of wrong doses by physicians (65.7%) and noncompliance with administration sequences during chemotherapy administration (50.5%) were the most common errors. The most common estimated average monthly error was not following the administration sequence of the chemotherapeutic agents (4.1 times/month, range 1-20). The most important underlying reasons for medication errors were heavy workload (49.7%) and insufficient staff numbers (36.5%). Our findings suggest that the probability of medication error is very high during chemotherapy preparation and administration, with prescribing and ordering errors the most common. Further studies must address strategies to minimize medication errors in patients receiving chemotherapy, determine sufficient protective measures, and establish multistep control mechanisms.

  8. Bias in the Counseling Process: How to Recognize and Avoid It.

    ERIC Educational Resources Information Center

    Morrow, Kelly A.; Deidan, Cecilia T.

    1992-01-01

    Notes that counselors' vulnerability to inferential bias during counseling process may result in misdiagnosis and improper interventions. Discusses these inferential biases: availability and representativeness heuristics; fundamental attribution error; anchoring, prior knowledge, and labeling; confirmatory hypothesis testing; and reconstructive…

  9. The Treatment of Capital Costs in Educational Projects

    ERIC Educational Resources Information Center

    Bezeau, Lawrence

    1975-01-01

    Failure to account for the cost and depreciation of capital leads to suboptimal investments in education, specifically to excessively capital intensive instructional technologies. This type of error, which is particularly serious when planning for developing countries, can be easily avoided. (Author)

  10. Perspective: Uses and misuses of thresholds in diagnostic decision making.

    PubMed

    Warner, Jeremy L; Najarian, Robert M; Tierney, Lawrence M

    2010-03-01

    The concept of thresholds plays a vital role in decisions involving the initiation, continuation, and completion of diagnostic testing. Much research has focused on the development of explicit thresholds, in the form of practice guidelines and decision analyses. However, these tools are used infrequently; most medical decisions are made at the bedside, using implicit thresholds. Study of these thresholds can lead to a deeper understanding of clinical decision making. The authors examine some factors constituting individual clinicians' implicit thresholds. They propose a model for static thresholds using the concept of situational gravity to explain why some thresholds are high, and some low. Next, they consider the hypothetical effects of incorrect placement of thresholds (miscalibration) and changes to thresholds during diagnosis (manipulation). They demonstrate these concepts using common clinical scenarios. Through analysis of miscalibration of thresholds, the authors demonstrate some common maladaptive clinical behaviors, which are nevertheless internally consistent. They then explain how manipulation of thresholds gives rise to common cognitive heuristics including premature closure and anchoring. They also discuss the case where no threshold has been exceeded despite exhaustive collection of data, which commonly leads to application of the availability or representativeness heuristics. Awareness of implicit thresholds allows for a more effective understanding of the processes of medical decision making and, possibly, the avoidance of detrimental heuristics and their associated medical errors. Research toward accurately defining these thresholds for individual physicians and toward determining their dynamic properties during the diagnostic process may yield valuable insights.

  11. Steering Law Design for Redundant Single Gimbal Control Moment Gyro Systems. M.S. Thesis - Massachusetts Inst. of Technology.

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth Sarkis

    1987-01-01

    The correspondence between robotic manipulators and single gimbal Control Moment Gyro (CMG) systems was exploited to aid in the understanding and design of single gimbal CMG steering laws. A test for null motion near a singular CMG configuration was derived which is able to distinguish between escapable and inescapable singular states. Detailed analysis of the Jacobian matrix null-space was performed, and the results were used to develop and test a variety of single gimbal CMG steering laws. Computer simulations showed that all existing singularity avoidance methods are unable to avoid elliptic internal singularities. A new null motion algorithm using the Moore-Penrose pseudoinverse, however, was shown by simulation to avoid elliptic-type singularities under certain conditions. The SR-inverse, with appropriate null motion, was proposed as a general approach to singularity avoidance because of its ability to avoid singularities through limited introduction of torque error. Simulation results confirmed the superior performance of this method compared to the other available and proposed pseudoinverse-based steering laws.
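    The SR-inverse (singularity-robust, or damped least-squares) mapping mentioned above computes gimbal rates as J^T (J J^T + λI)^(-1) τ, trading a small torque error for bounded rates near singularity. The sketch below is a generic illustration of that formula, not the thesis's actual steering law: the damping schedule and the test Jacobian are invented for illustration.

```python
import math

# Hedged sketch of an SR-inverse steering step: rates = J^T (J J^T + lam*I)^-1 * tau,
# with damping lam grown as det(J J^T) -> 0 (approach to singularity).
# The damping schedule and any Jacobian used with it are illustrative.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def inv3(M):
    # inverse of a 3x3 matrix via the adjugate
    a, b, c = M[0]; d, e, f = M[1]; g, h, i = M[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [[e * i - f * h, c * h - b * i, b * f - c * e],
           [f * g - d * i, a * i - c * g, c * d - a * f],
           [d * h - e * g, b * g - a * h, a * e - b * d]]
    return [[x / det for x in row] for row in adj]

def sr_inverse_rates(J, torque, lam0=0.01):
    """Gimbal rates for a 3xN Jacobian J and a 3-vector torque command."""
    JJt = mat_mul(J, transpose(J))
    det = (JJt[0][0] * (JJt[1][1] * JJt[2][2] - JJt[1][2] * JJt[2][1])
           - JJt[0][1] * (JJt[1][0] * JJt[2][2] - JJt[1][2] * JJt[2][0])
           + JJt[0][2] * (JJt[1][0] * JJt[2][1] - JJt[1][1] * JJt[2][0]))
    lam = lam0 * math.exp(-det)  # more damping near singularity (illustrative)
    damped = [[JJt[i][j] + (lam if i == j else 0.0) for j in range(3)]
              for i in range(3)]
    rates = mat_mul(transpose(J), mat_mul(inv3(damped), [[t] for t in torque]))
    return [r[0] for r in rates]
```

    Away from singularity the damping is tiny, so J times the returned rates reproduces the commanded torque almost exactly; near singularity the rates stay bounded at the cost of a deliberate torque error, which is exactly the trade the abstract describes.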

  12. The Relationship Between Technical Errors and Decision Making Skills in the Junior Resident

    PubMed Central

    Nathwani, J. N.; Fiers, R.M.; Ray, R.D.; Witt, A.K.; Law, K. E.; DiMarco, S.M.; Pugh, C.M.

    2017-01-01

    Objective The purpose of this study is to co-evaluate resident technical errors and decision-making capabilities during placement of a subclavian central venous catheter (CVC). We hypothesize that there will be significant correlations between scenario-based decision-making skills and technical proficiency in central line insertion. We also predict that residents will have problems anticipating common difficulties and generating solutions associated with line placement. Design Participants were asked to insert a subclavian central line on a simulator. After completion, residents were presented with a real-life patient photograph depicting CVC placement and asked to anticipate difficulties and generate solutions. Error rates were analyzed using chi-square tests and a 5% expected error rate. Correlations were sought by comparing technical errors and scenario-based decision making. Setting This study was carried out at seven tertiary care centers. Participants Study participants (N=46) consisted largely of first-year research residents who could be followed longitudinally. Second-year research and clinical residents were not excluded. Results Six checklist errors were committed more often than anticipated. Residents performed an average of 1.9 errors, significantly more than the at most 1 error per person expected (t(44)=3.82, p<.001). The most common error was performance of the procedure steps in the wrong order (28.5%, p<.001). Some residents (24%) had no errors, 30% committed one error, and 46% committed more than one error. The number of technical errors committed correlated negatively with the total number of commonly identified difficulties and generated solutions (r(33)= −.429, p=.021 and r(33)= −.383, p=.044, respectively). Conclusions Almost half of the surgical residents committed multiple errors while performing subclavian CVC placement. The correlation between technical errors and decision-making skills suggests a critical need to train residents in both technique and error management. ACGME Competencies: Medical Knowledge, Practice-Based Learning and Improvement, Systems-Based Practice. PMID:27671618

  13. Robot learning and error correction

    NASA Technical Reports Server (NTRS)

    Friedman, L.

    1977-01-01

    A model of robot learning is described that associates previously unknown perceptions with the sensed known consequences of robot actions. For these actions, both the categories of outcomes and the corresponding sensory patterns are incorporated in a knowledge base by the system designer. Thus the robot is able to predict the outcome of an action and compare the expectation with the experience. New knowledge about what to expect in the world may then be incorporated by the robot in a pre-existing structure whether it detects accordance or discrepancy between a predicted consequence and experience. Errors committed during plan execution are detected by the same type of comparison process and learning may be applied to avoiding the errors.

  14. The psychology of doing nothing: forms of decision avoidance result from reason and emotion.

    PubMed

    Anderson, Christopher J

    2003-01-01

    Several independent lines of research bear on the question of why individuals avoid decisions by postponing them, failing to act, or accepting the status quo. This review relates findings across several different disciplines and uncovers 4 decision avoidance effects that offer insight into this common but troubling behavior: choice deferral, status quo bias, omission bias, and inaction inertia. These findings are related by common antecedents and consequences in a rational-emotional model of the factors that predispose humans to do nothing. Prominent components of the model include cost-benefit calculations, anticipated regret, and selection difficulty. Other factors affecting decision avoidance through these key components, such as anticipatory negative emotions, decision strategies, counterfactual thinking, and preference uncertainty, are also discussed.

  15. Prevalence of amblyopia and patterns of refractive error in the amblyopic children of a tertiary eye care center of Nepal.

    PubMed

    Sapkota, K; Pirouzian, A; Matta, N S

    2013-01-01

    Refractive error is a common cause of amblyopia. We aimed to determine the prevalence of amblyopia and the pattern and types of refractive error in children with amblyopia at a tertiary eye hospital of Nepal. A retrospective chart review was conducted of children diagnosed with amblyopia at the Nepal Eye Hospital (NEH) from July 2006 to June 2011. Children aged 13 or older, or with any ocular pathology, were excluded. Cycloplegic refraction and an ophthalmological examination were performed for all children. The pattern of refractive error and the association between types of refractive error and types of amblyopia were determined. Amblyopia was found in 0.7% (440) of the 62,633 children examined at NEH during this period. All amblyopic eyes had refractive error. Fifty-six percent (248) of the patients were male, and the mean age was 7.74 ± 2.97 years. Anisometropia was the most common cause of amblyopia (p < 0.001). About one third (29%) of the subjects had bilateral amblyopia due to high ametropia. Forty percent of eyes had severe amblyopia, with visual acuity of 20/120 or worse. About two-thirds (59.2%) of the eyes had astigmatism. The prevalence of amblyopia at the Nepal Eye Hospital is thus 0.7%. Anisometropia is the most common cause of amblyopia, and astigmatism is the most common type of refractive error in amblyopic eyes. © NEPjOPH.

  16. Avoidance of APOBEC3B-induced mutation by error-free lesion bypass

    PubMed Central

    Hoopes, James I.; Hughes, Amber L.; Hobson, Lauren A.; Cortez, Luis M.; Brown, Alexander J.

    2017-01-01

    APOBEC cytidine deaminases mutate cancer genomes by converting cytidines into uridines within ssDNA during replication. Although uracil DNA glycosylases limit APOBEC-induced mutation, it is unknown if subsequent base excision repair (BER) steps function on replication-associated ssDNA. Hence, we measured APOBEC3B-induced CAN1 mutation frequencies in yeast deficient in BER endonucleases or DNA damage tolerance proteins. Strains lacking Apn1, Apn2, Ntg1, Ntg2 or Rev3 displayed wild-type frequencies of APOBEC3B-induced canavanine resistance (CanR). However, strains without error-free lesion bypass proteins Ubc13, Mms2 and Mph1 displayed respective 4.9-, 2.8- and 7.8-fold higher frequency of APOBEC3B-induced CanR. These results indicate that mutations resulting from APOBEC activity are avoided by deoxyuridine conversion to abasic sites ahead of nascent lagging strand DNA synthesis and subsequent bypass by error-free template switching. We found this mechanism also functions during telomere re-synthesis, but with a diminished requirement for Ubc13. Interestingly, reduction of G to C substitutions in Ubc13-deficient strains uncovered a previously unknown role of Ubc13 in controlling the activity of the translesion synthesis polymerase, Rev1. Our results highlight a novel mechanism for error-free bypass of deoxyuridines generated within ssDNA and suggest that the APOBEC mutation signature observed in cancer genomes may under-represent the genomic damage these enzymes induce. PMID:28334887

  17. Statistical Orbit Determination using the Particle Filter for Incorporating Non-Gaussian Uncertainties

    NASA Technical Reports Server (NTRS)

    Mashiku, Alinda; Garrison, James L.; Carpenter, J. Russell

    2012-01-01

    The tracking of space objects requires frequent and accurate monitoring for collision avoidance. Because even collision events with very low probability are important, accurate prediction of collisions requires the representation of the full probability density function (PDF) of the random orbit state. By representing the full PDF of the orbit state for orbit maintenance and collision avoidance, we can take advantage of the statistical information present in heavy-tailed distributions, more accurately representing orbit states of low probability. The classical methods of orbit determination (i.e., the Kalman Filter and its derivatives) provide state estimates based on only the second moments of the state and measurement errors, which are captured by assuming a Gaussian distribution. Although the measurement errors can be accurately assumed to have a Gaussian distribution, errors with a non-Gaussian distribution could arise during propagation between observations. Moreover, unmodeled dynamics in the orbit model could introduce non-Gaussian errors into the process noise. A Particle Filter (PF) is proposed as a nonlinear filtering technique that is capable of propagating and estimating a more complete representation of the state distribution as an accurate approximation of the full PDF. The PF uses Monte Carlo runs to generate particles that approximate the full PDF representation. The PF is applied in the estimation and propagation of a highly eccentric orbit, and the results are compared to the Extended Kalman Filter and Splitting Gaussian Mixture algorithms to demonstrate its proficiency.
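    The particle-filter mechanics described above (propagate, weight, estimate, resample) can be sketched on a 1-D toy state. The real application is the full orbit state; the dynamics and noise values below are illustrative assumptions.

```python
import math
import random

# Hedged sketch of a bootstrap particle filter on a 1-D toy state. Each particle
# is a sample from the state PDF; the weighted ensemble approximates the full
# PDF, not just its first two moments. All model values are illustrative.

def particle_filter(observations, n_particles=500, proc_std=0.5, meas_std=1.0, seed=1):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 2.0) for _ in range(n_particles)]  # prior samples
    estimates = []
    for z in observations:
        # propagate: each particle evolves under the process model plus noise
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # weight: likelihood of the measurement under each particle
        weights = [math.exp(-0.5 * ((z - p) / meas_std) ** 2) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # estimate: weighted mean (any moment or tail of the PDF is available)
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # resample (multinomial) to concentrate particles in likely regions
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

est = particle_filter([3.0] * 20)
```

    Because the particle ensemble is an arbitrary sample set, heavy-tailed or multimodal state distributions are carried through the update with no Gaussian assumption, which is the property the abstract exploits for low-probability collision events.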

  18. [Adverse events in general surgery. A prospective analysis of 13,950 consecutive patients].

    PubMed

    Rebasa, Pere; Mora, Laura; Vallverdú, Helena; Luna, Alexis; Montmany, Sandra; Romaguera, Andreu; Navarro, Salvador

    2011-11-01

    Adverse event (AE) rates in General Surgery vary, according to different authors and recording methods, between 2% and 30%. Six years ago we designed a prospective AE recording system to change the patient safety culture in our Department. We present the results of this work after a 6-year follow-up. The AEs, sequelae, and health care errors in a University Hospital surgery department were recorded. An analysis of each recorded incident was performed by a reviewer. The data were entered into a database for rapid access and consultation. The results were routinely presented in Departmental morbidity-mortality sessions. Of 13,950 patients, 5,142 (36.9% of admissions) suffered a total of 11,254 AEs. A total of 920 patients were subjected to at least one health care error (6.6% of admissions), meaning that 6.6% of our patients suffered an avoidable AE. The overall mortality at 5 years in our department was 2.72% (380 deaths). An adverse event was implicated in the death of the patient in 180 cases (1.29% of admissions). In 49 cases (0.35% of admissions), mortality could be attributed to an avoidable AE. After 6 years, the incidence of errors has tended steadily downward. The exhaustive and prospective recording of AEs leads to changes in the patient safety culture of a Surgery Department and helps decrease the incidence of health care errors. Copyright © 2011 AEC. Published by Elsevier Espana. All rights reserved.

  19. Medical students' experiences with medical errors: an analysis of medical student essays.

    PubMed

    Martinez, William; Lo, Bernard

    2008-07-01

    This study aimed to examine medical students' experiences with medical errors. In 2001 and 2002, 172 fourth-year medical students wrote an anonymous description of a significant medical error they had witnessed or committed during their clinical clerkships. The assignment represented part of a required medical ethics course. We analysed 147 of these essays using thematic content analysis. Many medical students made or observed significant errors. In either situation, some students experienced distress that seemingly went unaddressed. Furthermore, this distress was sometimes severe and persisted after the initial event. Some students also experienced considerable uncertainty as to whether an error had occurred and how to prevent future errors. Many errors may not have been disclosed to patients, and some students who desired to discuss or disclose errors were apparently discouraged from doing so by senior doctors. Some students criticised senior doctors who attempted to hide errors or avoid responsibility. By contrast, students who witnessed senior doctors take responsibility for errors and candidly disclose errors to patients appeared to recognise the importance of honesty and integrity and said they aspired to these standards. There are many missed opportunities to teach students how to respond to and learn from errors. Some faculty members and housestaff may at times respond to errors in ways that appear to contradict professional standards. Medical educators should increase exposure to exemplary responses to errors and help students to learn from and cope with errors.

  20. UAS Well Clear Recovery Against Non-Cooperative Intruders Using Vertical Maneuvers

    NASA Technical Reports Server (NTRS)

    Cone, Andrew C.; Thipphavong, David; Lee, Seung Man; Santiago, Confesor

    2017-01-01

    This paper documents a study that drove the development of a mathematical expression in the detect-and-avoid (DAA) minimum operational performance standards (MOPS) for unmanned aircraft systems (UAS). This equation describes the conditions under which vertical maneuver guidance should be provided during recovery of DAA well clear separation with a non-cooperative VFR aircraft. Although the original hypothesis was that vertical maneuvers for DAA well clear recovery should only be offered when sensor vertical rate errors are small, this paper suggests that UAS climb and descent performance should be considered, in addition to sensor errors for vertical position and vertical rate, when determining whether to offer vertical guidance. A fast-time simulation study involving 108,000 encounters between a UAS and a non-cooperative visual-flight-rules aircraft was conducted. Results are presented showing that, when vertical maneuver guidance for DAA well clear recovery was suppressed, the minimum vertical separation increased by roughly 50 feet (or horizontal separation by 500 to 800 feet). However, the percentage of encounters with a risk of collision when performing vertical well clear recovery maneuvers was reduced as UAS vertical rate performance increased and sensor vertical rate errors decreased. A class of encounter is identified for which vertical-rate error had a large effect on the efficacy of horizontal maneuvers, due to the difficulty of making the correct left/right turn decision: a crossing conflict with the intruder changing altitude. Overall, these results support logic that would allow vertical maneuvers when UAS vertical performance is sufficient to avoid the intruder, based on the intruder's estimated vertical position and vertical rate, as well as the vertical rate error of the UAS' sensor.
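    The MOPS expression itself is not reproduced in this abstract. The following is a hypothetical sketch of the kind of feasibility check the study motivates: offer vertical guidance only if the UAS can out-climb the intruder's worst-case predicted altitude given the sensor errors. Every threshold and parameter name here is invented for illustration.

```python
# Hedged, hypothetical sketch of a vertical-maneuver feasibility check of the
# kind the study motivates. All names, units, and thresholds are illustrative
# assumptions, NOT the actual MOPS equation.

def vertical_maneuver_feasible(own_climb_fpm, intruder_vrate_fpm,
                               vrate_error_fpm, alt_error_ft,
                               required_sep_ft=450.0, horizon_min=1.0):
    # worst case: the intruder climbs as fast as the sensor error allows
    worst_intruder_gain_ft = (intruder_vrate_fpm + vrate_error_fpm) * horizon_min
    own_gain_ft = own_climb_fpm * horizon_min
    # vertical guidance is only useful if our altitude gain beats the
    # intruder's worst-case gain plus position uncertainty plus separation
    return own_gain_ft >= (worst_intruder_gain_ft + alt_error_ft
                           + required_sep_ft)
```

    This captures the abstract's conclusion qualitatively: better climb performance or smaller vertical-rate errors make the check pass more often, so vertical guidance is offered in more encounters.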

  1. Common Mistakes in Teaching Elementary Math--And How to Avoid Them

    ERIC Educational Resources Information Center

    Liu, Fuchang

    2017-01-01

    Learn the most effective ways to teach elementary math, no matter how much experience you have with the subject. In this book, Fuchang Liu takes you through many common mistakes in math instruction and explains the misunderstandings behind them. He points out practices that should be avoided, helping you to adjust your lessons so that all students…

  2. Anal Health Care Basics

    PubMed Central

    Chang, Jason; McLemore, Elisabeth; Tejirian, Talar

    2016-01-01

    Despite the fact that countless patients suffer from anal problems, there tends to be a lack of understanding of anal health care. Unfortunately, this leads to incorrect diagnoses and treatments. When treating a patient with an anal complaint, the primary goals are to first diagnose the etiology of the symptoms correctly, then to provide an effective and appropriate treatment strategy. The first step in this process is to take an accurate history and physical examination. Specific questions include details about bowel habits, anal hygiene, and fiber supplementation. Specific components of the physical examination include an external anal examination, a digital rectal examination, and anoscopy if appropriate. Common diagnoses include pruritus ani, anal fissures, hemorrhoids, anal abscess or fistula, fecal incontinence, and anal skin tags. However, each problem presents differently and requires a different approach for management. It is of paramount importance that the correct diagnosis is reached. Common errors include an inaccurate diagnosis of hemorrhoids when other pathology is present and subsequent treatment with a steroid product, which is harmful to the anal area. Most of these problems can be avoided by improving bowel habits. Adequate fiber intake with 30 g to 40 g daily is important for many reasons, including improving the quality of stool and preventing colorectal and anal diseases. In this Special Report, we provide an overview of commonly encountered anal problems, their presentation, initial treatment options, and recommendations for referral to specialists. PMID:27723447

  3. Medication errors: an overview for clinicians.

    PubMed

    Wittich, Christopher M; Burkle, Christopher M; Lanier, William L

    2014-08-01

    Medication error is an important cause of patient morbidity and mortality, yet it can be a confusing and underappreciated concept. This article provides a review for practicing physicians that focuses on medication error (1) terminology and definitions, (2) incidence, (3) risk factors, (4) avoidance strategies, and (5) disclosure and legal consequences. A medication error is any error that occurs at any point in the medication use process. It has been estimated by the Institute of Medicine that medication errors cause 1 of 131 outpatient and 1 of 854 inpatient deaths. Medication factors (eg, similar sounding names, low therapeutic index), patient factors (eg, poor renal or hepatic function, impaired cognition, polypharmacy), and health care professional factors (eg, use of abbreviations in prescriptions and other communications, cognitive biases) can precipitate medication errors. Consequences faced by physicians after medication errors can include loss of patient trust, civil actions, criminal charges, and medical board discipline. Methods to prevent medication errors from occurring (eg, use of information technology, better drug labeling, and medication reconciliation) have been used with varying success. When an error is discovered, patients expect disclosure that is timely, given in person, and accompanied with an apology and communication of efforts to prevent future errors. Learning more about medication errors may enhance health care professionals' ability to provide safe care to their patients. Copyright © 2014 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  4. Refractive Errors

    MedlinePlus

    ... and lens of your eye helps you focus. Refractive errors are vision problems that happen when the shape ... cornea, or aging of the lens. Four common refractive errors are Myopia, or nearsightedness - clear vision close up ...

  5. Minimally invasive surgical technique for tethered surgical drains

    PubMed Central

    Hess, Shane R; Satpathy, Jibanananda; Waligora, Andrew C; Ugwu-Oju, Obinna

    2017-01-01

    A feared complication of temporary surgical drain placement is from the technical error of accidentally suturing the surgical drain into the wound. Postoperative discovery of a tethered drain can frequently necessitate return to the operating room if it cannot be successfully removed with nonoperative techniques. Formal wound exploration increases anesthesia and infection risk as well as cost and is best avoided if possible. We present a minimally invasive surgical technique that can avoid the morbidity associated with a full surgical wound exploration to remove a tethered drain when other nonoperative techniques fail. PMID:28400669

  6. Angular and Seasonal Variation of Spectral Surface Reflectance Ratios: Implications for the Remote Sensing of Aerosol over Land

    NASA Technical Reports Server (NTRS)

    Remer, L. A.; Wald, A. E.; Kaufman, Y. J.

    1999-01-01

    We obtain valuable information on the angular and seasonal variability of surface reflectance using a hand-held spectrometer from a light aircraft. The data are used to test a procedure that allows us to estimate visible surface reflectance from the longer wavelength 2.1 micrometer channel (mid-IR). Estimating or avoiding surface reflectance in the visible is a vital first step in most algorithms that retrieve aerosol optical thickness over land targets. The data indicate that specular reflection found when viewing targets from the forward direction can severely corrupt the relationships between the visible and 2.1 micrometer reflectance that were derived from nadir data. There is a month-by-month variation in the ratios between the visible and the mid-IR, weakly correlated to the Normalized Difference Vegetation Index (NDVI). If specular reflection is not avoided, the errors resulting from estimating surface reflectance from the mid-IR exceed the acceptable limit of Δρ ≈ 0.01 in roughly 40% of the cases, using the current algorithm. This is reduced to 25% of the cases if specular reflection is avoided. An alternative method that uses path radiance rather than explicitly estimating visible surface reflectance results in similar errors. The two methods have different strengths and weaknesses that require further study.

  7. Individual variation in the neural processes of motor decisions in the stop signal task: the influence of novelty seeking and harm avoidance personality traits.

    PubMed

    Hu, Jianping; Lee, Dianne; Hu, Sien; Zhang, Sheng; Chao, Herta; Li, Chiang-Shan R

    2016-06-01

    Personality traits contribute to variation in human behavior, including the propensity to take risk. Extant work targeted risk-taking processes with an explicit manipulation of reward, but it remains unclear whether personality traits influence simple decisions such as speeded versus delayed responses during cognitive control. We explored this issue in an fMRI study of the stop signal task, in which participants varied in response time trial by trial, speeding up and risking a stop error or slowing down to avoid errors. Regional brain activations to speeded versus delayed motor responses (risk-taking) were correlated to novelty seeking (NS), harm avoidance (HA) and reward dependence (RD), with age and gender as covariates, in a whole brain regression. At a corrected threshold, the results showed a positive correlation between NS and risk-taking responses in the dorsomedial prefrontal, bilateral orbitofrontal, and frontopolar cortex, and between HA and risk-taking responses in the parahippocampal gyrus and putamen. No regional activations varied with RD. These findings demonstrate that personality traits influence the neural processes of executive control beyond behavioral tasks that involve explicit monetary reward. The results also speak broadly to the importance of characterizing inter-subject variation in studies of cognition and brain functions.

  8. Medication prescribing errors in the medical intensive care unit of Tikur Anbessa Specialized Hospital, Addis Ababa, Ethiopia.

    PubMed

    Sada, Oumer; Melkie, Addisu; Shibeshi, Workineh

    2015-09-16

    Medication errors (MEs) are important problems in all hospitalized populations, especially in the intensive care unit (ICU). Little is known about the prevalence of medication prescribing errors in the ICUs of hospitals in Ethiopia. The aim of this study was to assess medication prescribing errors in the ICU of Tikur Anbessa Specialized Hospital using a retrospective cross-sectional analysis of patient cards and medication charts. A total of 220 patient charts were reviewed, covering 1311 patient-days and 882 prescription episodes. In all, 359 MEs were detected, a prevalence of 40 per 100 orders. The most common prescribing errors were omission errors (154; 42.89%), wrong combination (101; 28.13%), wrong abbreviation (48; 13.37%), wrong dose (30; 8.36%), wrong frequency (18; 5.01%), and wrong indication (8; 2.23%). The present study shows that medication errors are common in the medical ICU of Tikur Anbessa Specialized Hospital. These results suggest future targets of prevention strategies to reduce the rate of medication error.
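    The headline figures in this record follow directly from the reported counts; a quick check in plain Python, using only numbers taken from the abstract:

```python
# Counts reported in the abstract.
orders = 882        # prescription episodes reviewed
errors = 359        # medication errors (MEs) detected
omissions = 154     # most common error type (omission errors)

# Prevalence per 100 orders (the abstract rounds this to 40).
prevalence = 100 * errors / orders
print(f"{prevalence:.1f} MEs per 100 orders")   # → 40.7 MEs per 100 orders

# Share of omission errors among all MEs (abstract: 42.89%).
print(f"{100 * omissions / errors:.1f}% omission errors")  # → 42.9% omission errors
```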

  9. Differences among Job Positions Related to Communication Errors at Construction Sites

    NASA Astrophysics Data System (ADS)

    Takahashi, Akiko; Ishida, Toshiro

    In a previous study, we classified the communication errors at construction sites as faulty intention and message pattern, inadequate channel pattern, and faulty comprehension pattern. This study seeks to evaluate the degree of risk of communication errors and to investigate differences among people in various job positions in perception of communication error risk. Questionnaires based on the previous study were administered to construction workers (n=811; 149 administrators, 208 foremen and 454 workers). Administrators evaluated all patterns of communication error risk equally. However, foremen and workers evaluated communication error risk differently in each pattern. The common contributing factors to all patterns were inadequate arrangements before work and inadequate confirmation. Some factors were common among patterns but other factors were particular to a specific pattern. To help prevent future accidents at construction sites, administrators should understand how people in various job positions perceive communication errors and propose human factors measures to prevent such errors.

  10. Reduction of Serious Adverse Events Demanding Study Exclusion in Model Development: Extracorporeal Life Support Resuscitation of Ventricular Fibrillation Cardiac Arrest in Rats.

    PubMed

    Warenits, Alexandra-Maria; Sterz, Fritz; Schober, Andreas; Ettl, Florian; Magnet, Ingrid Anna Maria; Högler, Sandra; Teubenbacher, Ursula; Grassmann, Daniel; Wagner, Michael; Janata, Andreas; Weihs, Wolfgang

    2016-12-01

    Extracorporeal life support is a promising concept for selected patients in refractory cardiogenic shock and for advanced life support of persistent ventricular fibrillation cardiac arrest. Animal models of ventricular fibrillation cardiac arrest could help to investigate new treatment strategies for successful resuscitation. Associated procedural pitfalls in establishing a rat model of extracorporeal life support resuscitation need to be replaced, refined, reduced, and reported. Anesthetized male Sprague-Dawley rats (350-600 g) (n = 126) underwent cardiac arrest induced with a pacing catheter placed into the right ventricle via a jugular cannula. Rats were resuscitated with extracorporeal life support, mechanical ventilation, defibrillation, and medication. Catheter and cannula explantation was performed if restoration of spontaneous circulation was achieved. All observed serious adverse events (SAEs) occurring in each of the experimental phases were analyzed. Restoration of spontaneous circulation could be achieved in 68 of 126 rats (54%); SAEs were observed in 76 (60%) experiments. Sixty-two SAEs (82%) were related to experimental procedures and 14 (18%) were avoidable human errors. The most common serious adverse events were caused by insertion or explantation of the venous bypass cannula and resulted in lethal bleeding, cannula dislocation, or air embolism. Establishing an extracorporeal life support model in rats confronted us with technical challenges. Even advancements in small-animal critical care management over the years, delivered by an experienced team, and technical modifications were not able to totally avoid such serious adverse events. Reports on replacement, refinement, and reduction of serious adverse events demanding study exclusion, which would help conserve animal resources, are missing and are presented hereby.

  11. How Jordan and Saudi Arabia are avoiding a tragedy of the commons over shared groundwater

    NASA Astrophysics Data System (ADS)

    Müller, Marc F.; Müller-Itten, Michèle C.; Gorelick, Steven M.

    2017-07-01

    Transboundary aquifers are ubiquitous and strategically important to global food and water security. Yet these shared resources are being depleted at an alarming rate. Focusing on the Disi aquifer, a key nonrenewable source of groundwater shared by Jordan and Saudi Arabia, this study develops a two-stage game that evaluates optimal transboundary strategies of common-pool resource exploitation under various assumptions. The analysis relies on estimates of agricultural water use from satellite imagery, which were obtained using three independent remote sensing approaches. Drawdown response to pumping is simulated using a 2-D regional aquifer model. Jordan and Saudi Arabia developed a buffer-zone strategy with a prescribed minimum distance between each country's pumping centers. We show that by limiting the marginal impact of pumping decisions on the other country's pumping costs, this strategy will likely avoid an impending tragedy of the commons for at least 60 years. Our analysis underscores the role played by distance between wells and disparities in groundwater exploitation costs in common-pool overdraft. In effect, if pumping centers are distant enough, a shared aquifer no longer behaves as a common-pool resource and a tragedy of the commons can be avoided. The 2015 Disi aquifer pumping agreement between Jordan and Saudi Arabia, which in practice relies on a joint technical commission to enforce exclusion zones, is the first agreement of this type between sovereign countries and has a promising potential to avoid conflicts or resolve potential transboundary groundwater disputes over comparable aquifer systems elsewhere.

  12. Needle path planning and steering in a three-dimensional non-static environment using two-dimensional ultrasound images

    PubMed Central

    Vrooijink, Gustaaf J.; Abayazid, Momen; Patil, Sachin; Alterovitz, Ron; Misra, Sarthak

    2015-01-01

    Needle insertion is commonly performed in minimally invasive medical procedures such as biopsy and radiation cancer treatment. During such procedures, accurate needle tip placement is critical for correct diagnosis or successful treatment. Accurate placement of the needle tip inside tissue is challenging, especially when the target moves and anatomical obstacles must be avoided. We develop a needle steering system capable of autonomously and accurately guiding a steerable needle using two-dimensional (2D) ultrasound images. The needle is steered to a moving target while avoiding moving obstacles in a three-dimensional (3D) non-static environment. Using a 2D ultrasound imaging device, our system accurately tracks the needle tip motion in 3D space in order to estimate the tip pose. The needle tip pose is used by a rapidly exploring random tree-based motion planner to compute a feasible needle path to the target. The motion planner is sufficiently fast such that replanning can be performed repeatedly in a closed-loop manner. This enables the system to correct for perturbations in needle motion, and movement in obstacle and target locations. Our needle steering experiments in a soft-tissue phantom achieved maximum targeting errors of 0.86 ± 0.35 mm (without obstacles) and 2.16 ± 0.88 mm (with a moving obstacle). PMID:26279600
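    The planner in this record couples 3D tip-pose estimation from 2D ultrasound with closed-loop replanning; the underlying rapidly exploring random tree (RRT) idea can, however, be sketched in a much simpler 2D form. Everything below (workspace bounds, step size, goal bias, obstacle geometry) is illustrative and not taken from the paper:

```python
import math, random

def rrt_plan(start, goal, obstacles, step=0.5, goal_tol=0.5, max_iters=5000, seed=0):
    """Grow a rapidly exploring random tree in a 2D plane toward `goal`,
    rejecting extensions that enter any circular obstacle (cx, cy, r)."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    def collides(p):
        return any(math.hypot(p[0] - cx, p[1] - cy) <= r for cx, cy, r in obstacles)
    for _ in range(max_iters):
        # Goal-biased sampling: steer toward the target 10% of the time.
        sample = goal if rng.random() < 0.1 else (rng.uniform(0, 10), rng.uniform(0, 10))
        # Extend the nearest tree node one fixed step toward the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        dx, dy = sample[0] - nodes[i][0], sample[1] - nodes[i][1]
        d = math.hypot(dx, dy) or 1.0
        new = (nodes[i][0] + step * dx / d, nodes[i][1] + step * dy / d)
        if collides(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            # Walk parent links back to the root to recover the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

path = rrt_plan((0, 0), (9, 9), obstacles=[(5, 5, 1.5)])
```

    The closed-loop behavior described in the abstract amounts to rerunning such a planner from the current estimated tip pose whenever the target or obstacles move.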

  13. Survey of blindness and low vision in Egbedore, South-Western Nigeria.

    PubMed

    Kolawole, O U; Ashaye, A O; Adeoti, C O; Mahmoud, A O

    2010-01-01

    Developing efficient and cost-effective eye care programmes for communities in Nigeria has been hampered by inadequate and inaccurate data on blindness and low vision. To determine the prevalence and causes of blindness and low vision among adults 50 years and older in South-Western Nigeria in order to develop a viable eye care programme for the community. Twenty clusters of 60 subjects aged 50 years and older were selected by systematic random cluster sampling. Information was collected and ocular examinations were conducted on each consenting subject. Data were recorded in a specially designed questionnaire and analysed using descriptive statistical methods. Of the 1200 subjects enrolled for the study, 1183 (98.6%) were interviewed and examined. Seventy-five (6.3%) of the 1183 subjects were bilaterally blind and 223 (18.9%) had bilateral low vision according to the WHO definition of blindness and low vision. Blindness was about 1.6 times more common in men than in women. Cataract, glaucoma and posterior segment disorders were the major causes of bilateral blindness. Bilateral low vision was mainly due to cataract, refractive errors and posterior segment disorders. The prevalence of blindness and low vision in this study population was high. The main causes are avoidable. Elimination of avoidable blindness and low vision calls for attention and commitment from government and eye care workers in South-Western Nigeria.

  14. Teaching Common Errors in Applying a Procedure.

    ERIC Educational Resources Information Center

    Marcone, Stephen; Reigeluth, Charles M.

    1988-01-01

    Discusses study that investigated whether or not the teaching of matched examples and nonexamples in the form of common errors could improve student performance in undergraduate music theory courses. Highlights include hypotheses tested, pretests and posttests, and suggestions for further research with different age groups. (19 references)…

  15. At the cross-roads: an on-road examination of driving errors at intersections.

    PubMed

    Young, Kristie L; Salmon, Paul M; Lenné, Michael G

    2013-09-01

    A significant proportion of road trauma occurs at intersections. Understanding the nature of driving errors at intersections therefore has the potential to lead to significant injury reductions. To further understand how the complexity of modern intersections shapes driver behaviour, errors made at intersections are compared to errors made mid-block, and the role of wider systems failures in intersection error causation is investigated in an on-road study. Twenty-five participants drove a pre-determined urban route incorporating 25 intersections. Two in-vehicle observers recorded the errors made while a range of other data was collected, including driver verbal protocols, video, driver eye glance behaviour and vehicle data (e.g., speed, braking and lane position). Participants also completed a post-trial cognitive task analysis interview. Participants were found to make 39 specific error types, with speeding violations the most common. Participants made significantly more errors at intersections compared to mid-block, with misjudgement, action and perceptual/observation errors more commonly observed at intersections. Traffic signal configuration was found to play a key role in intersection error causation, with drivers making more errors at partially signalised compared to fully signalised intersections. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. [Diagnostic Errors in Medicine].

    PubMed

    Buser, Claudia; Bankova, Andriyana

    2015-12-09

    The recognition of diagnostic errors in everyday practice can help improve patient safety. The most common diagnostic errors are cognitive errors, followed by system-related errors and no-fault errors. Cognitive errors often result from mental shortcuts, known as heuristics. The rate of cognitive errors can be reduced by a better understanding of heuristics and the use of checklists. The autopsy, as a retrospective quality assessment of clinical diagnosis, has a crucial role in learning from diagnostic errors. Diagnostic errors occur more often in primary care than in hospital settings. On the other hand, inpatient errors are more severe than outpatient errors.

  17. Avoidable errors in deposited macromolecular structures: an impediment to efficient data mining.

    PubMed

    Dauter, Zbigniew; Wlodawer, Alexander; Minor, Wladek; Jaskolski, Mariusz; Rupp, Bernhard

    2014-05-01

    Whereas the vast majority of the more than 85 000 crystal structures of macromolecules currently deposited in the Protein Data Bank are of high quality, some suffer from a variety of imperfections. Although this fact has been pointed out in the past, it is still worth periodic updates so that the metadata obtained by global analysis of the available crystal structures, as well as the utilization of the individual structures for tasks such as drug design, should be based on only the most reliable data. Here, selected abnormal deposited structures have been analysed based on the Bayesian reasoning that the correctness of a model must be judged against both the primary evidence as well as prior knowledge. These structures, as well as information gained from the corresponding publications (if available), have emphasized some of the most prevalent types of common problems. The errors are often perfect illustrations of the nature of human cognition, which is frequently influenced by preconceptions that may lead to fanciful results in the absence of proper validation. Common errors can be traced to negligence and a lack of rigorous verification of the models against electron density, creation of non-parsimonious models, generation of improbable numbers, application of incorrect symmetry, illogical presentation of the results, or violation of the rules of chemistry and physics. Paying more attention to such problems, not only in the final validation stages but during the structure-determination process as well, is necessary not only in order to maintain the highest possible quality of the structural repositories and databases but most of all to provide a solid basis for subsequent studies, including large-scale data-mining projects. 
For many scientists PDB deposition is a rather infrequent event, so the need for proper training and supervision is emphasized, as well as the need for constant alertness of reason and critical judgment as absolutely necessary safeguarding measures against such problems. Ways of identifying more problematic structures are suggested so that their users may be properly alerted to their possible shortcomings.

  18. Avoidable errors in deposited macromolecular structures: an impediment to efficient data mining

    PubMed Central

    Dauter, Zbigniew; Wlodawer, Alexander; Minor, Wladek; Jaskolski, Mariusz; Rupp, Bernhard

    2014-01-01

    Whereas the vast majority of the more than 85 000 crystal structures of macromolecules currently deposited in the Protein Data Bank are of high quality, some suffer from a variety of imperfections. Although this fact has been pointed out in the past, it is still worth periodic updates so that the metadata obtained by global analysis of the available crystal structures, as well as the utilization of the individual structures for tasks such as drug design, should be based on only the most reliable data. Here, selected abnormal deposited structures have been analysed based on the Bayesian reasoning that the correctness of a model must be judged against both the primary evidence as well as prior knowledge. These structures, as well as information gained from the corresponding publications (if available), have emphasized some of the most prevalent types of common problems. The errors are often perfect illustrations of the nature of human cognition, which is frequently influenced by preconceptions that may lead to fanciful results in the absence of proper validation. Common errors can be traced to negligence and a lack of rigorous verification of the models against electron density, creation of non-parsimonious models, generation of improbable numbers, application of incorrect symmetry, illogical presentation of the results, or violation of the rules of chemistry and physics. Paying more attention to such problems, not only in the final validation stages but during the structure-determination process as well, is necessary not only in order to maintain the highest possible quality of the structural repositories and databases but most of all to provide a solid basis for subsequent studies, including large-scale data-mining projects. 
For many scientists PDB deposition is a rather infrequent event, so the need for proper training and supervision is emphasized, as well as the need for constant alertness of reason and critical judgment as absolutely necessary safeguarding measures against such problems. Ways of identifying more problematic structures are suggested so that their users may be properly alerted to their possible shortcomings. PMID:25075337

  19. Physics and Control of Locked Modes in the DIII-D Tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Volpe, Francesco

    This Final Technical Report summarizes an investigation, carried out under the auspices of the DOE Early Career Award, of the physics and control of non-rotating magnetic islands (“locked modes”) in tokamak plasmas. Locked modes are one of the main causes of disruptions in present tokamaks, and could be an even bigger concern in ITER, due to its relatively high beta (favoring the formation of Neoclassical Tearing Mode islands) and low rotation (favoring locking). For these reasons, this research had the goal of studying and learning how to control locked modes in the DIII-D National Fusion Facility under ITER-relevant conditions of high pressure and low rotation. Major results included: the first full suppression of locked modes and avoidance of the associated disruptions; the demonstration of error field detection from the interaction between locked modes, applied rotating fields and intrinsic errors; and the analysis of a vast database of disruptive locked modes, which led to criteria for disruption prediction and avoidance.

  20. Extensibility in local sensor based planning for hyper-redundant manipulators (robot snakes)

    NASA Technical Reports Server (NTRS)

    Choset, Howie; Burdick, Joel

    1994-01-01

    Partial Shape Modification (PSM) is a local sensor feedback method used for hyper-redundant robot manipulators, in which the redundancy is very large or infinite, such as that of a robot snake. This redundancy enables local obstacle avoidance and end-effector placement in real time. Due to the large number of joints or actuators in a hyper-redundant manipulator, small displacement errors easily accumulate into large errors in the position of the tip relative to the base. Accuracy can be improved by a local sensor based planning method in which sensors are distributed along the length of the hyper-redundant robot. This paper extends the local sensor based planning strategy beyond the limitations of the fixed length of such a manipulator when its joint limits are met. This is achieved with an algorithm in which the length of the deforming part of the robot is variable. Thus, the robot's local avoidance of obstacles is improved through the enhancement of its extensibility.
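    The error-accumulation problem this record addresses is easy to illustrate with planar forward kinematics. The link length, joint counts, and 1 mrad per-joint error below are arbitrary illustration values, not figures from the paper:

```python
import math

def planar_tip(angles, link_len=1.0):
    """Forward kinematics of an n-link planar arm: each joint angle is
    relative to the previous link; returns the (x, y) tip position."""
    x = y = heading = 0.0
    for a in angles:
        heading += a
        x += link_len * math.cos(heading)
        y += link_len * math.sin(heading)
    return x, y

def tip_error(n_links, joint_err=0.001):
    """Tip displacement when every joint of a nominally straight n-link
    arm is off by the same small angle (a common-mode joint error)."""
    nominal = planar_tip([0.0] * n_links)
    perturbed = planar_tip([joint_err] * n_links)
    return math.dist(nominal, perturbed)

# The same 1 mrad per-joint error grows roughly quadratically with the
# number of links, because each heading error displaces every link after it.
for n in (5, 10, 30):
    print(n, round(tip_error(n), 4))
```

    Distributing sensors along the arm, as in the local sensor based planning of this record, is one way to bound this growth by measuring the actual shape rather than inferring it from the base.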

  1. [Building questions in forensic medicine and their logical basis].

    PubMed

    Kovalev, D; Shmarov, K; Ten'kov, D

    2015-01-01

    The authors characterize in brief the requirements to the correct formulation of the questions posed to forensic medical experts with special reference to the mistakes made in building the questions and the ways to avoid them. This article actually continues the series of publications of the authors concerned with the major logical errors encountered in expert conclusions. Further publications will be dedicated to the results of the in-depth analysis of the logical errors contained in the questions posed to forensic medical experts and encountered in the expert conclusions.

  2. System safety management: A new discipline

    NASA Technical Reports Server (NTRS)

    Pope, W. C.

    1971-01-01

    Systems theory is discussed in relation to safety management. It is suggested that system safety management, as a new discipline, holds great promise for reducing operating errors, conserving labor resources, avoiding operating costs due to mistakes, and improving managerial techniques. It is pointed out that managerial failures or system breakdowns are the basic reasons for human errors and condition defects. In this respect, a recommendation is made that safety engineers stop visualizing the problem as residing only with the individual (supervisor or employee) and see the problem from the systems point of view.

  3. Prevalence of teen driver errors leading to serious motor vehicle crashes.

    PubMed

    Curry, Allison E; Hafetz, Jessica; Kallan, Michael J; Winston, Flaura K; Durbin, Dennis R

    2011-07-01

    Motor vehicle crashes are the leading cause of adolescent deaths. Programs and policies should target the most common and modifiable reasons for crashes. We estimated the frequency of critical reasons for crashes involving teen drivers, and examined in more depth specific teen driver errors. The National Highway Traffic Safety Administration's (NHTSA) National Motor Vehicle Crash Causation Survey collected data at the scene of a nationally representative sample of 5470 serious crashes between 7/05 and 12/07. NHTSA researchers assigned a single driver, vehicle, or environmental factor as the critical reason for the event immediately leading to each crash. We analyzed crashes involving 15-18 year old drivers. 822 teen drivers were involved in 795 serious crashes, representing 335,667 teens in 325,291 crashes. Driver error was by far the most common reason for crashes (95.6%), as opposed to vehicle or environmental factors. Among crashes with a driver error, a teen made the error 79.3% of the time (75.8% of all teen-involved crashes). Recognition errors (e.g., inadequate surveillance, distraction) accounted for 46.3% of all teen errors, followed by decision errors (e.g., following too closely, too fast for conditions) (40.1%) and performance errors (e.g., loss of control) (8.0%). Inadequate surveillance, driving too fast for conditions, and distracted driving together accounted for almost half of all crashes. Aggressive driving behavior, drowsy driving, and physical impairments were less commonly cited as critical reasons. Males and females had similar proportions of broadly classified errors, although females were specifically more likely to make inadequate surveillance errors. Our findings support prioritization of interventions targeting driver distraction and surveillance and hazard awareness training. Copyright © 2010 Elsevier Ltd. All rights reserved.
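    The percentages in this record are internally consistent, which is worth a quick check: 79.3% of the 95.6% of crashes attributable to driver error yields the reported 75.8% of all teen-involved crashes. A small sketch using only figures from the abstract:

```python
# Figures taken from the abstract.
driver_error_share = 0.956       # crashes with a driver error as the critical reason
teen_error_given_driver = 0.793  # of those, the teen driver made the error

teen_error_share = driver_error_share * teen_error_given_driver
print(f"{100 * teen_error_share:.1f}% of all teen-involved crashes")  # → 75.8% ...

# The three broad error classes cover most teen errors.
recognition, decision, performance = 46.3, 40.1, 8.0
print(f"{recognition + decision + performance:.1f}% of teen errors classified")
```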

  4. Global Vision Impairment and Blindness Due to Uncorrected Refractive Error, 1990-2010.

    PubMed

    Naidoo, Kovin S; Leasher, Janet; Bourne, Rupert R; Flaxman, Seth R; Jonas, Jost B; Keeffe, Jill; Limburg, Hans; Pesudovs, Konrad; Price, Holly; White, Richard A; Wong, Tien Y; Taylor, Hugh R; Resnikoff, Serge

    2016-03-01

    The purpose of this systematic review was to estimate worldwide the number of people with moderate and severe visual impairment (MSVI; presenting visual acuity <6/18, ≥3/60) or blindness (presenting visual acuity <3/60) due to uncorrected refractive error (URE), to estimate trends in prevalence from 1990 to 2010, and to analyze regional differences. The review focuses on uncorrected refractive error, which is now the most common cause of avoidable visual impairment globally. The systematic review of 14,908 relevant manuscripts from 1990 to 2010 using Medline, Embase, and WHOLIS yielded 243 high-quality, population-based cross-sectional studies which informed a meta-analysis of trends by region. The results showed that in 2010, 6.8 million (95% confidence interval [CI]: 4.7-8.8 million) people were blind (7.9% increase from 1990) and 101.2 million (95% CI: 87.88-125.5 million) vision impaired due to URE (15% increase since 1990), while the global population increased by 30% (1990-2010). The all-age age-standardized prevalence of URE blindness decreased 33% from 0.2% (95% CI: 0.1-0.2%) in 1990 to 0.1% (95% CI: 0.1-0.1%) in 2010, whereas the prevalence of URE MSVI decreased 25% from 2.1% (95% CI: 1.6-2.4%) in 1990 to 1.5% (95% CI: 1.3-1.9%) in 2010. In 2010, URE contributed 20.9% (95% CI: 15.2-25.9%) of all blindness and 52.9% (95% CI: 47.2-57.3%) of all MSVI worldwide. The contribution of URE to all MSVI ranged from 44.2 to 48.1% in all regions except South Asia, which was at 65.4% (95% CI: 62-72%). We conclude that in 2010, uncorrected refractive error continued as the leading cause of vision impairment and the second leading cause of blindness worldwide, affecting a total of 108 million people, or 1 in 90 persons.

  5. Factors correlated with traffic accidents as a basis for evaluating Advanced Driver Assistance Systems.

    PubMed

    Staubach, Maria

    2009-09-01

    This study aims to identify factors which influence and cause errors in traffic accidents and to use these as a basis for information to guide the application and design of driver assistance systems. A total of 474 accidents were examined in depth for this study by means of a psychological survey, data from accident reports, and technical reconstruction information. An error analysis was subsequently carried out, taking into account the driver, environment, and vehicle sub-systems. Results showed that all accidents were influenced by errors as a consequence of distraction and reduced activity. For crossroad accidents, there were further errors resulting from sight obstruction, masked stimuli, focus errors, and law infringements. Lane departure crashes were additionally caused by errors as a result of masked stimuli, law infringements, expectation errors as well as objective and action slips, while same direction accidents occurred additionally because of focus errors, expectation errors, and objective and action slips. Most accidents were influenced by multiple factors. There is a safety potential for Advanced Driver Assistance Systems (ADAS), which support the driver in information assimilation and help to avoid distraction and reduced activity. The design of the ADAS is dependent on the specific influencing factors of the accident type.

  6. Out-of-This-World Calculations

    ERIC Educational Resources Information Center

    Kalb, Kristina S.; Gravett, Julie M.

    2012-01-01

    By following learned rules rather than reasoning, students often fall into common error patterns, something every experienced teacher has observed in the classroom. In their effort to circumvent the developing common error patterns of their students, the authors decided to supplement their math text with two weeklong investigations. The first was…

  7. Making Sense of Low Back Pain and Pain-Related Fear.

    PubMed

    Bunzli, Samantha; Smith, Anne; Schütze, Robert; Lin, Ivan; O'Sullivan, Peter

    2017-09-01

    Pain-related fear is implicated in the transition from acute to chronic low back pain and the persistence of disabling low back pain, making it a key target for physical therapy intervention. The current understanding of pain-related fear is that it is a psychopathological problem, whereby people who catastrophize about the meaning of pain become trapped in a vicious cycle of avoidance behavior, pain, and disability, as recognized in the fear-avoidance model. However, there is evidence that pain-related fear can also be seen as a common-sense response to deal with low back pain, for example, when one is told that one's back is vulnerable, degenerating, or damaged. In this instance, avoidance is a common-sense response to protect a "damaged" back. While the fear-avoidance model proposes that when someone first develops low back pain, the confrontation of normal activity in the absence of catastrophizing leads to recovery, the pathway to recovery for individuals trapped in the fear-avoidance cycle is less clear. Understanding pain-related fear from a common-sense perspective enables physical therapists to offer individuals with low back pain and high fear a pathway to recovery by altering how they make sense of their pain. Drawing on a body of published work exploring the lived experience of pain-related fear in people with low back pain, this clinical commentary illustrates how Leventhal's common-sense model may assist physical therapists to understand the broader sense-making processes involved in the fear-avoidance cycle, and how they can be altered to facilitate fear reduction by applying strategies established in the behavioral medicine literature. J Orthop Sports Phys Ther 2017;47(9):628-636. Epub 13 Jul 2017. doi:10.2519/jospt.2017.7434.

  8. Systematic Errors in an Air Track Experiment.

    ERIC Educational Resources Information Center

    Ramirez, Santos A.; Ham, Joe S.

    1990-01-01

    Errors found in a common physics experiment to measure acceleration resulting from gravity using a linear air track are investigated. Glider position at release and initial velocity are shown to be sources of systematic error. (CW)

  9. New connectors coming for enteral feeding tubes; marqibo and risk of errors; angeliq is not a birth control pill.

    PubMed

    Cohen, Michael R; Smetzer, Judy L

    2014-07-01

    These medication errors have occurred in health care facilities at least once. They will happen again-perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers' names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters' wishes as to the level of detail included in publications.

  10. Dynamic performance of an aero-assist spacecraft - AFE

    NASA Technical Reports Server (NTRS)

    Chang, Ho-Pen; French, Raymond A.

    1992-01-01

    Dynamic performance of the Aero-assist Flight Experiment (AFE) spacecraft was investigated using a high-fidelity 6-DOF simulation model. Baseline guidance logic, control logic, and a strapdown navigation system to be used on the AFE spacecraft are also modeled in the 6-DOF simulation. During the AFE mission, uncertainties in the environment and the spacecraft are described by an error space which includes both correlated and uncorrelated error sources. The principal error sources modeled in this study include navigation errors, initial state vector errors, atmospheric variations, aerodynamic uncertainties, center-of-gravity off-sets, and weight uncertainties. The impact of the perturbations on the spacecraft performance is investigated using Monte Carlo repetitive statistical techniques. During the Solid Rocket Motor (SRM) deorbit phase, a target flight path angle of -4.76 deg at entry interface (EI) offers very high probability of avoiding SRM casing skip-out from the atmosphere. Generally speaking, the baseline designs of the guidance, navigation, and control systems satisfy most of the science and mission requirements.

  11. Identifying the causes of road crashes in Europe

    PubMed Central

    Thomas, Pete; Morris, Andrew; Talbot, Rachel; Fagerlind, Helen

    2013-01-01

    This research applies a recently developed model of accident causation, developed to investigate industrial accidents, to a specially gathered sample of 997 crashes investigated in-depth in 6 countries. Based on the work of Hollnagel the model considers a collision to be a consequence of a breakdown in the interaction between road users, vehicles and the organisation of the traffic environment. 54% of road users experienced interpretation errors while 44% made observation errors and 37% planning errors. In contrast to other studies only 11% of drivers were identified as distracted and 8% inattentive. There was remarkably little variation in these errors between the main road user types. The application of the model to future in-depth crash studies offers the opportunity to identify new measures to improve safety and to mitigate the social impact of collisions. Examples given include the potential value of co-driver advisory technologies to reduce observation errors and predictive technologies to avoid conflicting interactions between road users. PMID:24406942

  12. ISMP Medication Error Report Analysis

    PubMed Central

    Cohen, Michael R.; Smetzer, Judy L.

    2017-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again—perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers' names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters' wishes as to the level of detail included in publications. PMID:28179735

  13. ISMP Medication Error Report Analysis

    PubMed Central

    Cohen, Michael R.; Smetzer, Judy L.

    2017-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again—perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your in-service training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers’ names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters’ wishes as to the level of detail included in publications. PMID:29276260

  14. Effective Algorithm for Detection and Correction of the Wave Reconstruction Errors Caused by the Tilt of Reference Wave in Phase-shifting Interferometry

    NASA Astrophysics Data System (ADS)

    Xu, Xianfeng; Cai, Luzhong; Li, Dailin; Mao, Jieying

    2010-04-01

    In phase-shifting interferometry (PSI) the reference wave is usually assumed to be an on-axis plane wave. In practice, however, a slight tilt of the reference wave often occurs, and this tilt introduces unexpected errors into the reconstructed object wavefront. Usually the least-squares method with iterations, which is time consuming, is employed to analyze the phase errors caused by the tilt of the reference wave. Here a simple, effective algorithm is suggested to detect and then correct this kind of error. The method uses only simple mathematical operations, avoiding the least-squares equations needed in most previously reported methods. It can be used for generalized phase-shifting interferometry with two or more frames, for both smooth and diffusing objects, and its excellent performance has been verified by computer simulations, which show that the wave reconstruction errors can be reduced by two orders of magnitude.
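
    The abstract does not detail the authors' algorithm, but the core idea of correcting a reference-wave tilt in closed form, without least-squares iterations, can be illustrated: for a linear phase ramp, the slope along each axis equals the argument of the averaged product of neighboring pixels. This is a generic sketch under that assumption (the 64x64 test field and tilt values are illustrative), not the method of the paper:

```python
import numpy as np

def remove_linear_tilt(field):
    """Estimate and remove a linear phase ramp from a complex wavefront.
    Each slope is the argument of the summed product of neighboring
    pixels -- a closed-form estimate with no least-squares fitting."""
    kx = np.angle(np.sum(np.conj(field[:, :-1]) * field[:, 1:]))
    ky = np.angle(np.sum(np.conj(field[:-1, :]) * field[1:, :]))
    yy, xx = np.indices(field.shape)
    return field * np.exp(-1j * (kx * xx + ky * yy)), kx, ky

# a flat wavefront corrupted by a small reference tilt (illustrative values)
yy, xx = np.indices((64, 64))
tilted = np.exp(1j * (0.05 * xx - 0.02 * yy))
corrected, kx, ky = remove_linear_tilt(tilted)
```

    After correction the residual field is flat, i.e. the tilt-induced phase ramp has been removed.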

  15. Errors analysis of problem solving using the Newman stage after applying cooperative learning of TTW type

    NASA Astrophysics Data System (ADS)

    Rr Chusnul, C.; Mardiyana, S., Dewi Retno

    2017-12-01

    Problem solving is the basis of mathematics learning. Problem solving teaches us to clarify an issue coherently in order to avoid misunderstanding information. Mistakes in problem solving may arise from misunderstanding the issue, choosing a wrong concept, or misapplying a concept. The problem-solving test was carried out after students were given treatment on learning by using cooperative learning of the TTW type. The purpose of this study was to elucidate students' problem-solving errors after learning by using cooperative learning of the TTW type. Newman stages were used to identify the problem-solving errors. The research used a descriptive method to find problem-solving errors in students. The subjects in this study were 10th-grade students of a Vocational Senior High School (SMK). Tests and interviews were conducted for data collection. The results describe students' problem-solving errors, classified by Newman stage, after learning by using cooperative learning of the TTW type.

  16. ISMP Medication Error Report Analysis

    PubMed Central

    Cohen, Michael R.; Smetzer, Judy L.

    2016-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again—perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers' names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters' wishes as to the level of detail included in publications. PMID:28057945

  17. ISMP Medication Error Report Analysis

    PubMed Central

    Cohen, Michael R.; Smetzer, Judy L.

    2016-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again—perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers' names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters' wishes as to the level of detail included in publications. PMID:27928183

  18. Portable and Error-Free DNA-Based Data Storage.

    PubMed

    Yazdi, S M Hossein Tabatabaei; Gabrys, Ryan; Milenkovic, Olgica

    2017-07-10

    DNA-based data storage is an emerging nonvolatile memory technology of potentially unprecedented density, durability, and replication efficiency. The basic system implementation steps include synthesizing DNA strings that contain user information and subsequently retrieving them via high-throughput sequencing technologies. Existing architectures enable reading and writing but do not offer random-access and error-free data recovery from low-cost, portable devices, which is crucial for making the storage technology competitive with classical recorders. Here we show for the first time that a portable, random-access platform may be implemented in practice using nanopore sequencers. The novelty of our approach is to design an integrated processing pipeline that encodes data to avoid costly synthesis and sequencing errors, enables random access through addressing, and leverages efficient portable sequencing via new iterative alignment and deletion error-correcting codes. Our work represents the only known random access DNA-based data storage system that uses error-prone nanopore sequencers, while still producing error-free readouts with the highest reported information rate/density. As such, it represents a crucial step towards practical employment of DNA molecules as storage media.

  19. Patient-Controlled Analgesia Basal Infusion Overdose; Life-threatening Errors with Flecainide Suspension in Children; Medical Product Error-Prevention Efforts Need to Be Shared and Harmonized Internationally

    PubMed Central

    Cohen, Michael R.; Smetzer, Judy L.

    2015-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again—perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers’ names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters’ wishes as to the level of detail included in publications. PMID:26715797

  20. What to use to express the variability of data: Standard deviation or standard error of mean?

    PubMed

    Barde, Mohini P; Barde, Prajakt J

    2012-07-01

    Statistics plays a vital role in biomedical research. It helps present data precisely and draw meaningful conclusions. While presenting data, one should be aware of using adequate statistical measures. In biomedical journals, the Standard Error of the Mean (SEM) and the Standard Deviation (SD) are used interchangeably to express variability, though they measure different parameters. SEM quantifies uncertainty in the estimate of the mean, whereas SD indicates dispersion of the data from the mean. As readers are generally interested in knowing the variability within the sample, descriptive data should be summarized with SD. Use of SEM should be limited to computing the CI, which measures the precision of the population estimate. Journals can avoid such errors by requiring authors to adhere to their guidelines.
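
    The SD/SEM distinction can be made concrete in a few lines of code; the sample values below are illustrative:

```python
import math
import statistics

def summarize(sample, z=1.96):
    """Summarize a sample: SD describes the spread of the data,
    SEM describes the precision of the mean, and the confidence
    interval is built from SEM (not SD)."""
    n = len(sample)
    mean = statistics.mean(sample)
    sd = statistics.stdev(sample)          # dispersion of the data from the mean
    sem = sd / math.sqrt(n)                # uncertainty in the estimate of the mean
    ci = (mean - z * sem, mean + z * sem)  # ~95% CI for the population mean
    return mean, sd, sem, ci

mean, sd, sem, ci = summarize([4.1, 5.0, 4.6, 5.3, 4.8, 5.1, 4.4, 4.9])
```

    Reporting "mean ± SD" describes the sample itself, while "mean (95% CI)" describes how precisely the population mean has been estimated; note that SEM is always smaller than SD, which is one reason it is sometimes misused to make data appear less variable.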

  1. Solutions to decrease a systematic error related to AAPH addition in the fluorescence-based ORAC assay.

    PubMed

    Mellado-Ortega, Elena; Zabalgogeazcoa, Iñigo; Vázquez de Aldana, Beatriz R; Arellano, Juan B

    2017-02-15

    The oxygen radical absorbance capacity (ORAC) assay in 96-well multi-detection plate readers is a rapid method to determine total antioxidant capacity (TAC) in biological samples. A disadvantage of this method is that the antioxidant inhibition reaction does not start in all 96 wells at the same time, owing to technical limitations when dispensing the free radical-generating azo initiator 2,2'-azobis(2-methyl-propanimidamide) dihydrochloride (AAPH). The time delay between wells yields a systematic error that causes statistically significant differences in TAC determination of antioxidant solutions depending on their plate position. We propose two alternative solutions to avoid this AAPH-dependent error in ORAC assays. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Lessons Learned.

    ERIC Educational Resources Information Center

    Hassell, Kim Dale

    2000-01-01

    Discusses the common mistakes in school design and construction and how to avoid them. Mistake avoidance in mastering planning, site acquisition, drawing changes, budgeting, school design process, construction management, and the architect's role are highlighted. (GR)

  3. Effects of skilled nursing facility structure and process factors on medication errors during nursing home admission.

    PubMed

    Lane, Sandi J; Troyer, Jennifer L; Dienemann, Jacqueline A; Laditka, Sarah B; Blanchette, Christopher M

    2014-01-01

    Older adults are at greatest risk of medication errors during the transition period of the first 7 days after admission and readmission to a skilled nursing facility (SNF). The aim of this study was to evaluate structure- and process-related factors that contribute to medication errors and harm during transition periods at an SNF. Data for medication errors and potential medication errors during the 7-day transition period for residents entering North Carolina SNFs were from the Medication Error Quality Initiative-Individual Error database from October 2006 to September 2007. The impact of SNF structure and process measures on the number of reported medication errors and harm from errors was examined using bivariate and multivariate model methods. A total of 138 SNFs reported 581 transition period medication errors; 73 (12.6%) caused harm. Chain affiliation was associated with a reduction in the volume of errors during the transition period. One third of all reported transition errors occurred during the medication administration phase of the medication use process, where dose omissions were the most common type of error; however, dose omissions caused harm less often than wrong-dose errors did. Prescribing errors were much less common than administration errors but were much more likely to cause harm. Both structure and process measures of quality were related to the volume of medication errors. However, process quality measures may play a more important role in predicting harm from errors during the transition of a resident into an SNF. Medication errors during transition could be reduced by improving both prescribing processes and the transcription and documentation of orders.

  4. A Wolf in Sheep's Clothing?

    ERIC Educational Resources Information Center

    Good, Geoff

    1997-01-01

    Safety qualifications for adventure education are not intended to prevent the enjoyment of adventure. Good training enables participants to avoid basic errors and tackle greater adventure sooner. Discusses the need to balance individual freedom with responsibility, and how the Lyme Bay canoeing deaths prompted increased concern in Great Britain…

  5. Development of an Ontology to Model Medical Errors, Information Needs, and the Clinical Communication Space

    PubMed Central

    Stetson, Peter D.; McKnight, Lawrence K.; Bakken, Suzanne; Curran, Christine; Kubose, Tate T.; Cimino, James J.

    2002-01-01

    Medical errors are common, costly and often preventable. Work in understanding the proximal causes of medical errors demonstrates that systems failures predispose to adverse clinical events. Most of these systems failures are due to lack of appropriate information at the appropriate time during the course of clinical care. Problems with clinical communication are common proximal causes of medical errors. We have begun a project designed to measure the impact of wireless computing on medical errors. We report here on our efforts to develop an ontology representing the intersection of medical errors, information needs and the communication space. We will use this ontology to support the collection, storage and interpretation of project data. The ontology’s formal representation of the concepts in this novel domain will help guide the rational deployment of our informatics interventions. A real-life scenario is evaluated using the ontology in order to demonstrate its utility.

  6. Model-based error diffusion for high fidelity lenticular screening.

    PubMed

    Lau, Daniel; Smith, Trebor

    2006-04-17

    Digital halftoning is the process of converting a continuous-tone image into an arrangement of black and white dots for binary display devices such as digital ink-jet and electrophotographic printers. As printers achieve print resolutions exceeding 1,200 dots per inch, it is becoming increasingly important for halftoning algorithms to consider the variations and interactions in the size and shape of printed dots between neighboring pixels. In the case of lenticular screening, where statistically independent images are spatially multiplexed together, ignoring these variations and interactions, such as dot overlap, will result in poor lenticular image quality. To this end, we describe our use of model-based error diffusion for the lenticular screening problem, where statistical independence between component images is achieved by restricting the diffusion of error to only those pixels of the same component image. To avoid instabilities, the proposed approach involves a novel error-clipping procedure.
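
    The printer model itself is not given in the abstract, but the underlying mechanism, error diffusion with a clipped error term, can be sketched with the standard Floyd-Steinberg weights. The `clip` threshold below is an illustrative assumption, not the authors' procedure:

```python
import numpy as np

def error_diffuse(img, clip=0.5):
    """Floyd-Steinberg error diffusion with an error-clipping step.
    `img` is a float array in [0, 1]; `clip` bounds the diffused error
    to keep the filter stable (the threshold here is illustrative)."""
    f = img.astype(float).copy()
    h, w = f.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            new = 1.0 if f[y, x] >= 0.5 else 0.0   # threshold the pixel
            out[y, x] = new
            err = float(np.clip(f[y, x] - new, -clip, clip))  # clip the error
            if x + 1 < w:
                f[y, x + 1] += err * 7 / 16        # diffuse to neighbors
            if y + 1 < h:
                if x > 0:
                    f[y + 1, x - 1] += err * 3 / 16
                f[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    f[y + 1, x + 1] += err * 1 / 16
    return out

# halftone a flat 25%-gray patch; about a quarter of pixels should be "on"
halftone = error_diffuse(np.full((32, 32), 0.25))
```

    On a flat gray patch the diffused error forces roughly the right fraction of pixels on, which is how error diffusion preserves average tone; the lenticular variant in the paper additionally restricts which neighbors may receive the error.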

  7. Random Weighting, Strong Tracking, and Unscented Kalman Filter for Soft Tissue Characterization.

    PubMed

    Shin, Jaehyun; Zhong, Yongmin; Oetomo, Denny; Gu, Chengfan

    2018-05-21

    This paper presents a new nonlinear filtering method based on the Hunt-Crossley model for online nonlinear soft tissue characterization. This method overcomes the problem of performance degradation in the unscented Kalman filter due to contact model error. It adopts the concept of Mahalanobis distance to identify contact model error, and further incorporates a scaling factor in the predicted state covariance to compensate for the identified model error. This scaling factor is determined according to the principle of innovation orthogonality to avoid the cumbersome computation of the Jacobian matrix, and the random weighting concept is adopted to improve the estimation accuracy of the innovation covariance. A master-slave robotic indentation system is developed to validate the performance of the proposed method. Simulation and experimental results, as well as comparison analyses, demonstrate the efficacy of the proposed method for online characterization of soft tissue parameters in the presence of contact model error.
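
    As a rough sketch of the mechanism described above (not the paper's exact formulation, which additionally uses random weighting and strong-tracking ideas), a Kalman measurement update can gate the innovation's squared Mahalanobis distance against a chi-square threshold and inflate the predicted covariance when the model appears inconsistent with the measurement:

```python
import numpy as np

def adaptive_update(x_pred, P_pred, z, H, R, gate=7.815):
    """Kalman measurement update with a Mahalanobis-distance test on the
    innovation. When the squared distance exceeds `gate` (chi-square,
    3 dof, 95%), the predicted covariance is inflated by a scaling
    factor so the filter trusts the model less. Illustrative sketch."""
    v = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    d2 = float(v @ np.linalg.solve(S, v))   # squared Mahalanobis distance
    if d2 > gate:                           # model error detected
        P_pred = (d2 / gate) * P_pred       # scaling factor > 1
        S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x_pred + K @ v
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P, d2

# demo: a small innovation passes the gate; a gross model error trips it
H, R, P0 = np.eye(3), 0.1 * np.eye(3), np.eye(3)
x_ok, _, d2_ok = adaptive_update(np.zeros(3), P0, np.array([0.1, 0.0, 0.0]), H, R)
x_bad, _, d2_bad = adaptive_update(np.zeros(3), P0, np.array([10.0, 0.0, 0.0]), H, R)
```

    Inflating the covariance rather than rejecting the measurement keeps the filter responsive when the contact model, not the sensor, is at fault.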

  8. Error management for musicians: an interdisciplinary conceptual framework

    PubMed Central

    Kruse-Weber, Silke; Parncutt, Richard

    2014-01-01

    Musicians tend to strive for flawless performance and perfection, avoiding errors at all costs. Dealing with errors while practicing or performing is often frustrating and can lead to anger and despair, which can explain musicians’ generally negative attitude toward errors and the tendency to aim for flawless learning in instrumental music education. But even the best performances are rarely error-free, and research in general pedagogy and psychology has shown that errors provide useful information for the learning process. Research in instrumental pedagogy is still neglecting error issues; the benefits of risk management (before the error) and error management (during and after the error) are still underestimated. It follows that dealing with errors is a key aspect of music practice at home, teaching, and performance in public. And yet, to be innovative, or to make their performance extraordinary, musicians need to risk errors. Currently, most music students only acquire the ability to manage errors implicitly – or not at all. A more constructive, creative, and differentiated culture of errors would balance error tolerance and risk-taking against error prevention in ways that enhance music practice and music performance. The teaching environment should lay the foundation for the development of such an approach. In this contribution, we survey recent research in aviation, medicine, economics, psychology, and interdisciplinary decision theory that has demonstrated that specific error-management training can promote metacognitive skills that lead to better adaptive transfer and better performance skills. We summarize how this research can be applied to music, and survey relevant research that is specifically tailored to the needs of musicians, including generic guidelines for risk and error management in music teaching and performance. On this basis, we develop a conceptual framework for risk management that can provide orientation for further music education and musicians at all levels. PMID:25120501

  9. Self-calibration method without joint iteration for distributed small satellite SAR systems

    NASA Astrophysics Data System (ADS)

    Xu, Qing; Liao, Guisheng; Liu, Aifei; Zhang, Juan

    2013-12-01

    The performance of distributed small satellite synthetic aperture radar systems degrades significantly due to unavoidable array errors, including gain, phase, and position errors, in real operating scenarios. In the conventional method proposed in (IEEE T Aero. Elec. Sys. 42:436-451, 2006), the spectrum components within one Doppler bin are considered as calibration sources. However, it is found in this article that the gain error estimation and the position error estimation in the conventional method can interact with each other. The conventional method may converge to suboptimal solutions under large position errors, since it requires joint iteration between gain-phase error estimation and position error estimation. In addition, it is also found that phase errors can be estimated well, regardless of position errors, when the zero Doppler bin is chosen. In this article, we propose a method obtained by modifying the conventional one, based on these two observations. In this modified method, gain errors are first estimated and compensated, which eliminates the interaction between gain error estimation and position error estimation. Then, using the zero Doppler bin data, the phase error estimation can be performed independently of position errors. Finally, position errors are estimated based on a Taylor-series expansion. The joint iteration between gain-phase error estimation and position error estimation is not required, so the problem of suboptimal convergence that occurs in the conventional method is avoided at low computational cost. The modified method has the merits of faster convergence and lower estimation error compared to the conventional one. Theoretical analysis and computer simulation results verify the effectiveness of the modified method.

  10. Error management for musicians: an interdisciplinary conceptual framework.

    PubMed

    Kruse-Weber, Silke; Parncutt, Richard

    2014-01-01

    Musicians tend to strive for flawless performance and perfection, avoiding errors at all costs. Dealing with errors while practicing or performing is often frustrating and can lead to anger and despair, which can explain musicians' generally negative attitude toward errors and the tendency to aim for flawless learning in instrumental music education. But even the best performances are rarely error-free, and research in general pedagogy and psychology has shown that errors provide useful information for the learning process. Research in instrumental pedagogy is still neglecting error issues; the benefits of risk management (before the error) and error management (during and after the error) are still underestimated. It follows that dealing with errors is a key aspect of music practice at home, teaching, and performance in public. And yet, to be innovative, or to make their performance extraordinary, musicians need to risk errors. Currently, most music students only acquire the ability to manage errors implicitly - or not at all. A more constructive, creative, and differentiated culture of errors would balance error tolerance and risk-taking against error prevention in ways that enhance music practice and music performance. The teaching environment should lay the foundation for the development of such an approach. In this contribution, we survey recent research in aviation, medicine, economics, psychology, and interdisciplinary decision theory that has demonstrated that specific error-management training can promote metacognitive skills that lead to better adaptive transfer and better performance skills. We summarize how this research can be applied to music, and survey relevant research that is specifically tailored to the needs of musicians, including generic guidelines for risk and error management in music teaching and performance. On this basis, we develop a conceptual framework for risk management that can provide orientation for further music education and musicians at all levels.

  11. A novel multi-planar radiography method for three dimensional pose reconstruction of the patellofemoral and tibiofemoral joints after arthroplasty.

    PubMed

    Amiri, Shahram; Wilson, David R; Masri, Bassam A; Sharma, Gulshan; Anglin, Carolyn

    2011-06-03

    Determining the 3D pose of the patella after total knee arthroplasty is challenging. The commonly used single-plane fluoroscopy is prone to large errors in the clinically relevant mediolateral direction. A conventional fixed bi-planar setup is limited in the minimum angular distance between the imaging planes necessary for visualizing the patellar component, and requires a highly flexible setup to adjust for subject-specific geometries. As an alternative solution, this study investigated the use of a novel multi-planar imaging setup consisting of a C-arm tracked by an external optoelectric tracking system, to acquire calibrated radiographs from multiple orientations. To determine the accuracies, a knee prosthesis was implanted on artificial bones and imaged in simulated 'Supine' and 'Weightbearing' configurations. The results were compared with measures from a coordinate measuring machine as the ground-truth reference. The weightbearing configuration was the preferred imaging direction, with RMS errors of 0.48 mm and 1.32° for mediolateral shift and tilt of the patella, respectively, the two most clinically relevant measures. The 'imaging accuracies' of the system, defined as the accuracies in 3D reconstruction of a cylindrical ball-bearing phantom (so as to avoid the influence of the shape and orientation of the imaged object), showed an order-of-magnitude (11.5 times) reduction in the out-of-plane RMS errors in comparison to single-plane fluoroscopy. With this new method, the complete 3D pose of the patellofemoral and tibiofemoral joints during quasi-static activities can be determined with a many-fold improvement (up to 8 times; 3.4 mm) in the out-of-plane accuracies compared to a conventional single-plane fluoroscopy setup. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Detecting small-study effects and funnel plot asymmetry in meta-analysis of survival data: A comparison of new and existing tests.

    PubMed

    Debray, Thomas P A; Moons, Karel G M; Riley, Richard D

    2018-03-01

    Small-study effects are a common threat in systematic reviews and may indicate publication bias. Their existence is often verified by visual inspection of the funnel plot. Formal tests to assess the presence of funnel plot asymmetry typically estimate the association between the reported effect size and its standard error, the total sample size, or the inverse of the total sample size. In this paper, we demonstrate that the application of these tests may be less appropriate in meta-analysis of survival data, where censoring influences statistical significance of the hazard ratio. We subsequently propose 2 new tests that are based on the total number of observed events and adopt a multiplicative variance component. We compare the performance of the various funnel plot asymmetry tests in an extensive simulation study where we varied the true hazard ratio (0.5 to 1), the number of published trials (N=10 to 100), the degree of censoring within trials (0% to 90%), and the mechanism leading to participant dropout (noninformative versus informative). Results demonstrate that previous well-known tests for detecting funnel plot asymmetry suffer from low power or excessive type-I error rates in meta-analysis of survival data, particularly when trials are affected by participant dropout. Because our novel test (adopting estimates of the asymptotic precision as study weights) yields reasonable power and maintains appropriate type-I error rates, we recommend its use to evaluate funnel plot asymmetry in meta-analysis of survival data. The use of funnel plot asymmetry tests should, however, be avoided when there are few trials available for any meta-analysis. © 2017 The Authors. Research Synthesis Methods Published by John Wiley & Sons, Ltd.
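    The general shape of such an asymmetry test can be sketched in a few lines. The version below is the classic Egger-type regression (standardized effect against precision, nonzero intercept suggesting asymmetry); the paper's proposed tests, which weight by observed event counts and use a multiplicative variance component, differ in detail.

```python
# Illustrative Egger-type funnel-plot asymmetry test; NOT the
# event-weighted tests proposed in the paper above.
import numpy as np
from scipy import stats

def egger_test(effect, se):
    """Regress standardized effects on precision; a nonzero intercept
    suggests funnel-plot asymmetry. Returns (intercept, p-value)."""
    effect = np.asarray(effect, float)
    se = np.asarray(se, float)
    z = effect / se                      # standardized effect sizes
    x = 1.0 / se                         # precisions
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    n, k = X.shape
    sigma2 = resid @ resid / (n - k)     # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t = beta[0] / np.sqrt(cov[0, 0])     # t statistic for the intercept
    p = 2 * stats.t.sf(abs(t), df=n - k)
    return beta[0], p
```

    Applied to log hazard ratios and their standard errors from a set of trials, a small p-value flags asymmetry; as the abstract cautions, such tests behave poorly with heavy censoring or few trials.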

  13. Avoiding Human Error in Mission Operations: Cassini Flight Experience

    NASA Technical Reports Server (NTRS)

    Burk, Thomas A.

    2012-01-01

    Operating spacecraft is a never-ending challenge and the risk of human error is ever-present. Many missions have been significantly affected by human error on the part of ground controllers. The Cassini mission at Saturn has not been immune to human error, but Cassini operations engineers use tools and follow processes that find and correct most human errors before they reach the spacecraft. What is needed are skilled engineers with good technical knowledge, good interpersonal communications, quality ground software, regular peer reviews, up-to-date procedures, as well as careful attention to detail and the discipline to test and verify all commands that will be sent to the spacecraft. Two areas of special concern are changes to flight software and response to in-flight anomalies. The Cassini team has a lot of practical experience in all these areas and they have found that well-trained engineers with good tools who follow clear procedures can catch most errors before they get into command sequences to be sent to the spacecraft. Finally, having a robust and fault-tolerant spacecraft that allows ground controllers excellent visibility of its condition is the most important way to ensure human error does not compromise the mission.

  14. A Motion Planning Approach to Automatic Obstacle Avoidance during Concentric Tube Robot Teleoperation

    PubMed Central

    Torres, Luis G.; Kuntz, Alan; Gilbert, Hunter B.; Swaney, Philip J.; Hendrick, Richard J.; Webster, Robert J.; Alterovitz, Ron

    2015-01-01

    Concentric tube robots are thin, tentacle-like devices that can move along curved paths and can potentially enable new, less invasive surgical procedures. Safe and effective operation of this type of robot requires that the robot’s shaft avoid sensitive anatomical structures (e.g., critical vessels and organs) while the surgeon teleoperates the robot’s tip. However, the robot’s unintuitive kinematics makes it difficult for a human user to manually ensure obstacle avoidance along the entire tentacle-like shape of the robot’s shaft. We present a motion planning approach for concentric tube robot teleoperation that enables the robot to interactively maneuver its tip to points selected by a user while automatically avoiding obstacles along its shaft. We achieve automatic collision avoidance by precomputing a roadmap of collision-free robot configurations based on a description of the anatomical obstacles, which are attainable via volumetric medical imaging. We also mitigate the effects of kinematic modeling error in reaching the goal positions by adjusting motions based on robot tip position sensing. We evaluate our motion planner on a teleoperated concentric tube robot and demonstrate its obstacle avoidance and accuracy in environments with tubular obstacles. PMID:26413381

  15. A Motion Planning Approach to Automatic Obstacle Avoidance during Concentric Tube Robot Teleoperation.

    PubMed

    Torres, Luis G; Kuntz, Alan; Gilbert, Hunter B; Swaney, Philip J; Hendrick, Richard J; Webster, Robert J; Alterovitz, Ron

    2015-05-01

    Concentric tube robots are thin, tentacle-like devices that can move along curved paths and can potentially enable new, less invasive surgical procedures. Safe and effective operation of this type of robot requires that the robot's shaft avoid sensitive anatomical structures (e.g., critical vessels and organs) while the surgeon teleoperates the robot's tip. However, the robot's unintuitive kinematics makes it difficult for a human user to manually ensure obstacle avoidance along the entire tentacle-like shape of the robot's shaft. We present a motion planning approach for concentric tube robot teleoperation that enables the robot to interactively maneuver its tip to points selected by a user while automatically avoiding obstacles along its shaft. We achieve automatic collision avoidance by precomputing a roadmap of collision-free robot configurations based on a description of the anatomical obstacles, which are attainable via volumetric medical imaging. We also mitigate the effects of kinematic modeling error in reaching the goal positions by adjusting motions based on robot tip position sensing. We evaluate our motion planner on a teleoperated concentric tube robot and demonstrate its obstacle avoidance and accuracy in environments with tubular obstacles.
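    The roadmap idea in the two records above can be caricatured in a few lines: among precomputed configurations whose shaft is known to be collision-free, choose the one whose tip lands closest to the user-selected goal. This is a deliberately simplified sketch; the actual planner additionally connects configurations into a roadmap graph and corrects for kinematic modeling error using tip position sensing.

```python
import numpy as np

def roadmap_query(tip_positions, collision_free, goal):
    """Among precomputed configurations, return the index of the
    collision-free one whose tip is nearest the goal point."""
    tips = np.asarray(tip_positions, float)
    free = np.asarray(collision_free, bool)
    dist = np.linalg.norm(tips - np.asarray(goal, float), axis=1)
    dist[~free] = np.inf      # blocked configurations are never chosen
    return int(np.argmin(dist))
```

    Precomputing the collision checks against segmented anatomy is what makes the interactive teleoperation rate achievable, since only the nearest-neighbor query runs online.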

  16. Reliability Generalization: The Importance of Considering Sample Specificity, Confidence Intervals, and Subgroup Differences.

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Daniel, Larry G.

    The purposes of this paper are to identify common errors made by researchers when dealing with reliability coefficients and to outline best practices for reporting and interpreting reliability coefficients. Common errors that researchers make are: (1) stating that the instruments are reliable; (2) incorrectly interpreting correlation coefficients;…

  17. The Effectiveness of Chinese NNESTs in Teaching English Syntax

    ERIC Educational Resources Information Center

    Chou, Chun-Hui; Bartz, Kevin

    2007-01-01

    This paper evaluates the effect of Chinese non-native English-speaking teachers (NNESTs) on Chinese ESL students' struggles with English syntax. The paper first classifies Chinese learners' syntactic errors into 10 common types. It demonstrates how each type of error results from an internal attempt to translate a common Chinese construction into…

  18. Extreme deconvolution: Inferring complete distribution functions from noisy, heterogeneous and incomplete observations

    NASA Astrophysics Data System (ADS)

    Bovy Jo; Hogg, David W.; Roweis, Sam T.

    2011-06-01

    We generalize the well-known mixtures of Gaussians approach to density estimation and the accompanying Expectation-Maximization technique for finding the maximum likelihood parameters of the mixture to the case where each data point carries an individual d-dimensional uncertainty covariance and has unique missing data properties. This algorithm reconstructs the error-deconvolved or "underlying" distribution function common to all samples, even when the individual data points are samples from different distributions, obtained by convolving the underlying distribution with the heteroskedastic uncertainty distribution of the data point and projecting out the missing data directions. We show how this basic algorithm can be extended with conjugate priors on all of the model parameters and a "split-and-merge" procedure designed to avoid local maxima of the likelihood. We demonstrate the full method by applying it to the problem of inferring the three-dimensional velocity distribution of stars near the Sun from noisy two-dimensional, transverse velocity measurements from the Hipparcos satellite.
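    A minimal sketch of the EM idea, for the single-Gaussian, one-dimensional special case with per-point noise variances (the full algorithm handles mixtures, d dimensions, and missing-data projections):

```python
import numpy as np

def deconvolve_gaussian(x, noise_var, n_iter=300):
    """EM for the model x_i ~ N(mu, V + S_i): recover the underlying
    mean mu and variance V when each point carries its own noise
    variance S_i. 1-D, single-Gaussian special case only."""
    x = np.asarray(x, float)
    S = np.asarray(noise_var, float)
    mu, V = x.mean(), max(x.var() - S.mean(), 1e-6)  # moment-based start
    for _ in range(n_iter):
        # E-step: posterior mean and variance of each true value
        w = V / (V + S)
        b = mu + w * (x - mu)
        B = w * S
        # M-step: update the parameters of the underlying distribution
        mu = b.mean()
        V = np.mean(B + (b - mu) ** 2)
    return mu, V
```

    The key property, as in the full method, is that V is estimated below the raw sample variance of x, because the per-point noise variances S_i are deconvolved out.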

  19. Musculoskeletal Simulation Model Generation from MRI Data Sets and Motion Capture Data

    NASA Astrophysics Data System (ADS)

    Schmid, Jérôme; Sandholm, Anders; Chung, François; Thalmann, Daniel; Delingette, Hervé; Magnenat-Thalmann, Nadia

    Today computer models and computer simulations of the musculoskeletal system are widely used to study the mechanisms behind human gait and its disorders. The common way of creating musculoskeletal models is to use a generic musculoskeletal model based on data derived from anatomical and biomechanical studies of cadaverous specimens. To adapt this generic model to a specific subject, the usual approach is to scale it. This scaling has been reported to introduce several errors because it does not always account for subject-specific anatomical differences. As a result, a novel semi-automatic workflow is proposed that creates subject-specific musculoskeletal models from magnetic resonance imaging (MRI) data sets and motion capture data. Based on subject-specific medical data and a model-based automatic segmentation approach, an accurate modeling of the anatomy can be produced while avoiding the scaling operation. This anatomical model coupled with motion capture data, joint kinematics information, and muscle-tendon actuators is finally used to create a subject-specific musculoskeletal model.

  20. Seven Pervasive Statistical Flaws in Cognitive Training Interventions

    PubMed Central

    Moreau, David; Kirk, Ian J.; Waldie, Karen E.

    2016-01-01

    The prospect of enhancing cognition is undoubtedly among the most exciting research questions currently bridging psychology, neuroscience, and evidence-based medicine. Yet, convincing claims in this line of work stem from designs that are prone to several shortcomings, thus threatening the credibility of training-induced cognitive enhancement. Here, we present seven pervasive statistical flaws in intervention designs: (i) lack of power; (ii) sampling error; (iii) continuous variable splits; (iv) erroneous interpretations of correlated gain scores; (v) single transfer assessments; (vi) multiple comparisons; and (vii) publication bias. Each flaw is illustrated with a Monte Carlo simulation to present its underlying mechanisms, gauge its magnitude, and discuss potential remedies. Although not restricted to training studies, these flaws are typically exacerbated in such designs, due to ubiquitous practices in data collection or data analysis. The article reviews these practices, so as to avoid common pitfalls when designing or analyzing an intervention. More generally, it is also intended as a reference for anyone interested in evaluating claims of cognitive enhancement. PMID:27148010
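    One of the listed flaws, multiple transfer assessments without correction (vi), is easy to reproduce with a few lines of Monte Carlo; the numbers below are illustrative and not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, n, n_outcomes = 2000, 30, 8

false_pos = 0
for _ in range(n_sims):
    # null world: training has no effect on any of the 8 transfer measures
    treat = rng.normal(size=(n, n_outcomes))
    ctrl = rng.normal(size=(n, n_outcomes))
    diff = treat.mean(0) - ctrl.mean(0)
    se = np.sqrt(treat.var(0, ddof=1) / n + ctrl.var(0, ddof=1) / n)
    t = diff / se
    # declaring success if ANY outcome clears |t| > 2 (roughly p < .05)
    if np.any(np.abs(t) > 2.0):
        false_pos += 1

family_wise = false_pos / n_sims   # ≈ 1 - 0.95**8 ≈ 0.34, not 0.05
```

    With eight uncorrected transfer measures, the chance of at least one spurious "significant" gain is roughly a third, which is the mechanism the article's simulations gauge for each flaw.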

  1. Dual-stage deep learning framework for pigment epithelium detachment segmentation in polypoidal choroidal vasculopathy

    PubMed Central

    Xu, Yupeng; Yan, Ke; Kim, Jinman; Wang, Xiuying; Li, Changyang; Su, Li; Yu, Suqin; Xu, Xun; Feng, Dagan David

    2017-01-01

    Worldwide, polypoidal choroidal vasculopathy (PCV) is a common vision-threatening exudative maculopathy, and pigment epithelium detachment (PED) is an important clinical characteristic. Thus, precise and efficient PED segmentation is necessary for PCV clinical diagnosis and treatment. We propose a dual-stage learning framework via deep neural networks (DNN) for automated PED segmentation in PCV patients to avoid issues associated with manual PED segmentation (subjectivity, manual segmentation errors, and high time consumption). The optical coherence tomography scans of fifty patients were quantitatively evaluated with different algorithms and clinicians. Dual-stage DNN outperformed existing PED segmentation methods for all segmentation accuracy parameters, including true positive volume fraction (85.74 ± 8.69%), dice similarity coefficient (85.69 ± 8.08%), positive predictive value (86.02 ± 8.99%) and false positive volume fraction (0.38 ± 0.18%). Dual-stage DNN achieves accurate PED quantitative information, works with multiple types of PEDs and agrees well with manual delineation, suggesting that it is a potential automated assistant for PCV management. PMID:28966847

  2. Dual-stage deep learning framework for pigment epithelium detachment segmentation in polypoidal choroidal vasculopathy.

    PubMed

    Xu, Yupeng; Yan, Ke; Kim, Jinman; Wang, Xiuying; Li, Changyang; Su, Li; Yu, Suqin; Xu, Xun; Feng, Dagan David

    2017-09-01

    Worldwide, polypoidal choroidal vasculopathy (PCV) is a common vision-threatening exudative maculopathy, and pigment epithelium detachment (PED) is an important clinical characteristic. Thus, precise and efficient PED segmentation is necessary for PCV clinical diagnosis and treatment. We propose a dual-stage learning framework via deep neural networks (DNN) for automated PED segmentation in PCV patients to avoid issues associated with manual PED segmentation (subjectivity, manual segmentation errors, and high time consumption). The optical coherence tomography scans of fifty patients were quantitatively evaluated with different algorithms and clinicians. Dual-stage DNN outperformed existing PED segmentation methods for all segmentation accuracy parameters, including true positive volume fraction (85.74 ± 8.69%), dice similarity coefficient (85.69 ± 8.08%), positive predictive value (86.02 ± 8.99%) and false positive volume fraction (0.38 ± 0.18%). Dual-stage DNN achieves accurate PED quantitative information, works with multiple types of PEDs and agrees well with manual delineation, suggesting that it is a potential automated assistant for PCV management.

  3. Magnetic Control of Locked Modes in Present Devices and ITER

    NASA Astrophysics Data System (ADS)

    Volpe, F. A.; Sabbagh, S.; Sweeney, R.; Hender, T.; Kirk, A.; La Haye, R. J.; Strait, E. J.; Ding, Y. H.; Rao, B.; Fietz, S.; Maraschek, M.; Frassinetti, L.; in, Y.; Jeon, Y.; Sakakihara, S.

    2014-10-01

    The toroidal phase of non-rotating ("locked") neoclassical tearing modes was controlled in several devices by means of applied magnetic perturbations. Evidence is presented from various tokamaks (ASDEX Upgrade, DIII-D, JET, J-TEXT, KSTAR), spherical tori (MAST, NSTX) and a reversed field pinch (EXTRAP-T2R). Furthermore, the phase of interchange modes was controlled in the LHD helical device. These results share a common interpretation in terms of torques acting on the mode. Based on this interpretation, it is predicted that control-coil currents will be sufficient to control the phase of locking in ITER. This will be possible both with the internal coils and with the external error-field-correction coils, and might have promising consequences for disruption avoidance (by aiding the electron cyclotron current drive stabilization of locked modes), as well as for spatially distributing heat loads during disruptions. This work was supported in part by the US Department of Energy under DE-SC0008520, DE-FC-02-04ER54698 and DE-AC02-09CH11466.

  4. ImmuneDB: a system for the analysis and exploration of high-throughput adaptive immune receptor sequencing data.

    PubMed

    Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri

    2017-01-15

    As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast amounts of heavy chain variable region sequences and exploring the resulting data. It can take as input raw FASTA/FASTQ data, identify genes, determine clones, construct lineages, and provide information such as selection pressure and mutation analysis. It uses an industry-leading database, MySQL, to provide fast analysis and avoid the complexities of using error-prone flat files. ImmuneDB is freely available at http://immunedb.com. A demo of the ImmuneDB web interface is available at http://immunedb.com/demo. Contact: Uh25@drexel.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Significance of screening electrocardiogram before the initiation of amitriptyline therapy in children with functional abdominal pain.

    PubMed

    Patra, Kamakshya P; Sankararaman, Senthilkumar; Jackson, Robert; Hussain, Sunny Z

    2012-09-01

    Amitriptyline (AMT) is commonly used in the management of children with irritable bowel syndrome. AMT is pro-arrhythmogenic and increases the risk of sudden cardiac death. However, there is not enough data regarding the cardiac toxicity of therapeutic doses of AMT in children and the need for a screening electrocardiogram (EKG). Errors in computer EKG interpretation are not uncommon. In a risk-prevention study, the authors sought to identify the true incidence of prolonged corrected QT (QTc) interval and other arrhythmias in children with irritable bowel syndrome before the initiation of AMT. Out of the 760 EKGs screened, 3 EKGs demonstrated a true prolonged QTc after careful manual reading by a pediatric cardiologist, and these were not detected by the computer-generated reading. The authors conclude that a screening EKG should always be performed on children before initiating AMT therapy. Also, the computer-generated EKG needs to be verified by a pediatric cardiologist to avoid serious misinterpretations.

  6. Simplification of the DPPH assay for estimating the antioxidant activity of wine and wine by-products.

    PubMed

    Carmona-Jiménez, Yolanda; García-Moreno, M Valme; Igartuburu, Jose M; Garcia Barroso, Carmelo

    2014-12-15

    The DPPH assay is one of the most commonly employed methods for measuring antioxidant activity. Even though this method is considered very simple and efficient, it does present various limitations which make it complicated to perform. The range of linearity between the DPPH inhibition percentage and sample concentration has been studied with a view to simplifying the method for characterising samples of wine origin. It has been concluded that all the samples are linear in a range of inhibition below 40%, which allows the analysis to be simplified. A new parameter more appropriate for the simplification, the EC20, has been proposed to express the assay results. Additionally, the reaction time was analysed with the object of avoiding the need for kinetic studies in the method. The simplifications considered offer a more functional method, without significant errors, which could be used for routine analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
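    The proposed simplification (fit only the linear region, inhibition below 40%, and report the EC20) can be sketched as follows; the dilution series is made up for illustration.

```python
import numpy as np

def ec20(conc, inhibition, linear_max=40.0):
    """Fit % inhibition vs. concentration on the linear region only
    (inhibition < linear_max) and solve for the concentration that
    gives 20% inhibition."""
    conc = np.asarray(conc, float)
    inh = np.asarray(inhibition, float)
    mask = inh < linear_max          # keep only the linear region
    slope, intercept = np.polyfit(conc[mask], inh[mask], 1)
    return (20.0 - intercept) / slope

# hypothetical dilution series; the 62% point lies outside the linear range
c = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
i_pct = np.array([5.2, 10.1, 19.8, 38.9, 62.0])
```

    For these made-up points the fit uses only the first four concentrations and returns an EC20 near 2.0; unlike the EC50, no kinetic study or extrapolation beyond the linear range is needed.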

  7. Clarity: An Open Source Manager for Laboratory Automation

    PubMed Central

    Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.

    2013-01-01

    Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169

  8. Nitrous oxide in pediatric anesthesia: friend or foe?

    PubMed

    Schmitt, Erica L; Baum, Victor C

    2008-06-01

    Nitrous oxide has been used in clinical practice for over 150 years, often for pediatric procedures. Not only are there problems when used in patients with a variety of inborn errors of metabolism, but effects of nitrous oxide on the developing human brain are unknown. A recent adult human trial found that the use of nitrous oxide was associated with increased adverse outcome. Animal studies in several species have shown that nitrous oxide can be associated with apoptosis in the developing brain. Nitrous oxide can also inhibit major enzymatic pathways and repeated exposure may lead to neurologic damage. Single nucleotide polymorphisms in at least one of these enzymes are common in the population. There is a growing body of evidence that supports avoidance of nitrous oxide in both pediatric and adult patients, but the thousands of patients who have been exposed to nitrous oxide without apparent complications would suggest that further studies on long-term side effects and possible neurologic consequences need to be done.

  9. The global burden of diagnostic errors in primary care

    PubMed Central

    Singh, Hardeep; Schiff, Gordon D; Graber, Mark L; Onakpoya, Igho; Thompson, Matthew J

    2017-01-01

    Diagnosis is one of the most important tasks performed by primary care physicians. The World Health Organization (WHO) recently prioritized patient safety areas in primary care, and included diagnostic errors as a high-priority problem. In addition, a recent report from the Institute of Medicine in the USA, ‘Improving Diagnosis in Health Care’, concluded that most people will likely experience a diagnostic error in their lifetime. In this narrative review, we discuss the global significance, burden and contributory factors related to diagnostic errors in primary care. We synthesize available literature to discuss the types of presenting symptoms and conditions most commonly affected. We then summarize interventions based on available data and suggest next steps to reduce the global burden of diagnostic errors. Research suggests that we are unlikely to find a ‘magic bullet’ and confirms the need for a multifaceted approach to understand and address the many systems and cognitive issues involved in diagnostic error. Because errors involve many common conditions and are prevalent across all countries, the WHO’s leadership at a global level will be instrumental to address the problem. Based on our review, we recommend that the WHO consider bringing together primary care leaders, practicing frontline clinicians, safety experts, policymakers, the health IT community, medical education and accreditation organizations, researchers from multiple disciplines, patient advocates, and funding bodies among others, to address the many common challenges and opportunities to reduce diagnostic error. This could lead to prioritization of practice changes needed to improve primary care as well as setting research priorities for intervention development to reduce diagnostic error. PMID:27530239

  10. A Comparison of Medication Histories Obtained by a Pharmacy Technician Versus Nurses in the Emergency Department.

    PubMed

    Markovic, Marija; Mathis, A Scott; Ghin, Hoytin Lee; Gardiner, Michelle; Fahim, Germin

    2017-01-01

    To compare the medication history error rate of the emergency department (ED) pharmacy technician with that of nursing staff and to describe the workflow environment. Fifty medication histories performed by an ED nurse followed by the pharmacy technician were evaluated for discrepancies (RN-PT group). A separate 50 medication histories performed by the pharmacy technician and observed with necessary intervention by the ED pharmacist were evaluated for discrepancies (PT-RPh group). Discrepancies were totaled and categorized by type of error and therapeutic category of the medication. The workflow description was obtained by observation and staff interview. A total of 474 medications in the RN-PT group and 521 in the PT-RPh group were evaluated. Nurses made at least one error in all 50 medication histories (100%), compared to 18 medication histories for the pharmacy technician (36%). In the RN-PT group, 408 medications had at least one error, corresponding to an accuracy rate of 14% for nurses. In the PT-RPh group, 30 medications had an error, corresponding to an accuracy rate of 94.4% for the pharmacy technician (P < 0.0001). The most common error made by nurses was a missing medication (n = 109), while the most common error for the pharmacy technician was a wrong medication frequency (n = 19). The most common drug class with documented errors for ED nurses was cardiovascular medications (n = 100), while the pharmacy technician made the most errors in gastrointestinal medications (n = 11). Medication histories obtained by the pharmacy technician were significantly more accurate than those obtained by nurses in the emergency department.
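    The reported accuracy gap can be checked against the abstract's own counts (474 nurse-documented and 521 technician-documented medications). The test choice below is illustrative, since the abstract does not state which test produced P < 0.0001.

```python
from scipy.stats import fisher_exact

# counts as reported in the abstract: correct vs. >=1 error per medication
table = [[66, 408],    # ED nurses: ~14% accuracy
         [491, 30]]    # pharmacy technician: ~94% accuracy
odds_ratio, p = fisher_exact(table)
```

    The resulting p-value is far below 0.0001, consistent with the abstract's conclusion.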

  11. Geolocation error tracking of ZY-3 three line cameras

    NASA Astrophysics Data System (ADS)

    Pan, Hongbo

    2017-01-01

    The high-accuracy geolocation of high-resolution satellite images (HRSIs) is a key issue for mapping and integrating multi-temporal, multi-sensor images. In this manuscript, we propose a new geometric frame for analysing the geometric error of a stereo HRSI, in which the geolocation error can be divided into three parts: the epipolar direction, cross base direction, and height direction. With this frame, we proved that the height error of three line cameras (TLCs) is independent of nadir images, and that the terrain effect has a limited impact on the geolocation errors. For ZY-3 error sources, the drift error in both the pitch and roll angle and its influence on the geolocation accuracy are analysed. Epipolar and common tie-point constraints are proposed to study the bundle adjustment of HRSIs. Epipolar constraints explain that the relative orientation can reduce the number of compensation parameters in the cross base direction and have a limited impact on the height accuracy. The common tie points adjust the pitch-angle errors to be consistent with each other for TLCs. Therefore, free-net bundle adjustment of a single strip cannot significantly improve the geolocation accuracy. Furthermore, the epipolar and common tie-point constraints cause the error to propagate into the adjacent strip when multiple strips are involved in the bundle adjustment, which results in the same attitude uncertainty throughout the whole block. Two adjacent strips (Orbit 305 and Orbit 381, covering 7 and 12 standard scenes, respectively) and 308 ground control points (GCPs) were used for the experiments. The experiments validate the aforementioned theory. The planimetric and height root mean square errors were 2.09 and 1.28 m, respectively, when two GCPs were settled at the beginning and end of the block.
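    Planimetric and height RMSE figures like those reported above are conventionally computed from per-axis check-point residuals; a small sketch (the East/North/Height residual convention is an assumption, not from the paper):

```python
import numpy as np

def geolocation_rmse(residuals_enh):
    """Per-axis RMSE from check-point residuals given as
    (East, North, Height) rows in metres; the planimetric RMSE
    combines the East and North components."""
    r = np.asarray(residuals_enh, float)
    per_axis = np.sqrt((r ** 2).mean(axis=0))     # RMSE along each axis
    planimetric = np.hypot(per_axis[0], per_axis[1])
    return planimetric, per_axis[2]
```

    Feeding in residuals at independent check points (GCPs withheld from the adjustment) yields the planimetric and height accuracy pair quoted in the abstract.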

  12. Data-driven region-of-interest selection without inflating Type I error rate.

    PubMed

    Brooks, Joseph L; Zoumpoulaki, Alexia; Bowman, Howard

    2017-01-01

    In ERP and other large multidimensional neuroscience data sets, researchers often select regions of interest (ROIs) for analysis. The method of ROI selection can critically affect the conclusions of a study by causing the researcher to miss effects in the data or to detect spurious effects. In practice, to avoid inflating Type I error rate (i.e., false positives), ROIs are often based on a priori hypotheses or independent information. However, this can be insensitive to experiment-specific variations in effect location (e.g., latency shifts) reducing power to detect effects. Data-driven ROI selection, in contrast, is nonindependent and uses the data under analysis to determine ROI positions. Therefore, it has potential to select ROIs based on experiment-specific information and increase power for detecting effects. However, data-driven methods have been criticized because they can substantially inflate Type I error rate. Here, we demonstrate, using simulations of simple ERP experiments, that data-driven ROI selection can indeed be more powerful than a priori hypotheses or independent information. Furthermore, we show that data-driven ROI selection using the aggregate grand average from trials (AGAT), despite being based on the data at hand, can be safely used for ROI selection under many circumstances. However, when there is a noise difference between conditions, using the AGAT can inflate Type I error and should be avoided. We identify critical assumptions for use of the AGAT and provide a basis for researchers to use, and reviewers to assess, data-driven methods of ROI localization in ERP and other studies. © 2016 Society for Psychophysiological Research.
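    The AGAT idea (localize the ROI on the grand average pooled over all trials of all conditions, then test the condition contrast inside it) can be sketched on synthetic ERP-like data. The simulation below is illustrative and assumes, as the record requires, equal noise across conditions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_time = 40, 200
# Gaussian-shaped "component" peaking at sample 120
signal = np.exp(-0.5 * ((np.arange(n_time) - 120) / 10.0) ** 2)

# two conditions sharing the component, with equal noise levels
cond_a = 1.2 * signal + rng.normal(0.0, 1.0, (n_trials, n_time))
cond_b = 0.8 * signal + rng.normal(0.0, 1.0, (n_trials, n_time))

# AGAT: aggregate grand average over ALL trials of BOTH conditions,
# used only to place the ROI (peak sample +/- 5 samples here)
agat = np.vstack([cond_a, cond_b]).mean(axis=0)
peak = int(np.argmax(agat))
roi = slice(max(peak - 5, 0), peak + 6)

# the condition contrast is then tested on per-trial ROI means
diff = cond_a[:, roi].mean(axis=1).mean() - cond_b[:, roi].mean(axis=1).mean()
```

    Because both conditions contribute equally to the AGAT, the localization step is blind to the direction of the contrast, which is why it avoids the Type I inflation of fully nonindependent selection, except in the noise-difference case the record warns about.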

  13. Current Assessment and Classification of Suicidal Phenomena using the FDA 2012 Draft Guidance Document on Suicide Assessment: A Critical Review.

    PubMed

    Sheehan, David V; Giddens, Jennifer M; Sheehan, Kathy Harnett

    2014-09-01

    Standard international classification criteria require that classification categories be comprehensive to avoid type II error. Categories should be mutually exclusive and definitions should be clear and unambiguous (to avoid type I and type II errors). In addition, the classification system should be robust enough to last over time and provide comparability between data collections. This article was designed to evaluate the extent to which the classification system contained in the United States Food and Drug Administration 2012 Draft Guidance for the prospective assessment and classification of suicidal ideation and behavior in clinical trials meets these criteria. A critical review is used to assess the extent to which the proposed categories contained in the Food and Drug Administration 2012 Draft Guidance are comprehensive, unambiguous, and robust. Assumptions that underlie the classification system are also explored. The Food and Drug Administration classification system contained in the 2012 Draft Guidance does not capture the full range of suicidal ideation and behavior (type II error). Definitions, moreover, are frequently ambiguous (susceptible to multiple interpretations), and the potential for misclassification (type I and type II errors) is compounded by frequent mismatches in category titles and definitions. These issues have the potential to compromise data comparability within clinical trial sites, across sites, and over time. These problems need to be remedied because of the potential for flawed data output and consequent threats to public health, to research on the safety of medications, and to the search for effective medication treatments for suicidality.

  14. Error Analysis in Mathematics. Technical Report #1012

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei

    2012-01-01

    Error analysis is a method commonly used to identify the cause of student errors when they make consistent mistakes. It is a process of reviewing a student's work and then looking for patterns of misunderstanding. Errors in mathematics can be factual, procedural, or conceptual, and may occur for a number of reasons. Reasons why students make…

  15. Information-Gathering Patterns Associated with Higher Rates of Diagnostic Error

    ERIC Educational Resources Information Center

    Delzell, John E., Jr.; Chumley, Heidi; Webb, Russell; Chakrabarti, Swapan; Relan, Anju

    2009-01-01

    Diagnostic errors are an important source of medical errors. Problematic information-gathering is a common cause of diagnostic errors among physicians and medical students. The objectives of this study were to (1) determine if medical students' information-gathering patterns formed clusters of similar strategies, and if so (2) to calculate the…

  16. Lexical Errors and Accuracy in Foreign Language Writing. Second Language Acquisition

    ERIC Educational Resources Information Center

    del Pilar Agustin Llach, Maria

    2011-01-01

    Lexical errors are a determinant in gaining insight into vocabulary acquisition, vocabulary use and writing quality assessment. Lexical errors are very frequent in the written production of young EFL learners, but they decrease as learners gain proficiency. Misspellings are the most common category, but formal errors give way to semantic-based…

  17. More on Systematic Error in a Boyle's Law Experiment

    ERIC Educational Resources Information Center

    McCall, Richard P.

    2012-01-01

    A recent article in "The Physics Teacher" describes a method for analyzing a systematic error in a Boyle's law laboratory activity. Systematic errors are important to consider in physics labs because they tend to bias the results of measurements. There are numerous laboratory examples and resources that discuss this common source of error.

  18. Is neonatal neurological damage in the delivery room avoidable? Experience of 33 levels I and II maternity units of a French perinatal network.

    PubMed

    Dupuis, O; Dupont, C; Gaucherand, P; Rudigoz, R-C; Fernandez, M P; Peigne, E; Labaune, J M

    2007-09-01

To determine the frequency of avoidable neonatal neurological damage, we carried out a retrospective study from January 1st to December 31st 2003, including all children transferred from a level I or II maternity unit for suspected neurological damage (SND). Only cases confirmed by a persistent abnormality on clinical examination, EEG, transfontanelle ultrasound scan, CT scan or cerebral MRI were retained. Each case was studied in detail by an expert committee and classified as "avoidable", "unavoidable" or "of indeterminate avoidability". The management of "avoidable" cases was analysed to identify potentially avoidable factors (PAFs): failure to take into account a major risk factor (PAF1), diagnostic errors (PAF2), a suboptimal decision-to-delivery interval (PAF3) and mechanical complications (PAF4). In total, 77 children were transferred for SND; two cases were excluded (inaccessible medical files). Forty of the 75 included cases of SND were confirmed: 29 were "avoidable", 8 were "unavoidable" and 3 were "of indeterminate avoidability". Analysis of the 29 avoidable cases identified 39 PAFs: 18 PAF1, 5 PAF2, 10 PAF3 and 6 PAF4. Five children had no classifiable PAF (no deaths), 11 had one type of PAF (one death), 11 had two types of PAF (3 deaths), and 2 had three types of PAF (2 deaths). Three quarters of the confirmed cases of neurological damage occurring in levels I and II maternity units of the Aurore network in 2003 were avoidable. Five out of six cases resulting in early death involved several potentially avoidable factors.

  19. Random measurement error: Why worry? An example of cardiovascular risk factors.

    PubMed

    Brakenhoff, Timo B; van Smeden, Maarten; Visseren, Frank L J; Groenwold, Rolf H H

    2018-01-01

With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of the studied exposure-outcome relation (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of the effect of random measurement error on the estimated exposure-outcome relation can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of the effect of measurement error in their manuscripts, unless appropriate inferential tools are used to study or remove the impact of measurement error in the analysis.
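The counterintuitive point of this abstract can be checked with a small simulation (illustrative only, not taken from the paper): when a null exposure effect is adjusted for a confounder that is measured with classical random error, the adjusted exposure estimate is biased upward rather than attenuated, because the noisy confounder only partially controls the confounding.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
conf = rng.normal(size=n)            # true confounder
x = conf + rng.normal(size=n)        # exposure, correlated with the confounder
y = 1.0 * conf + rng.normal(size=n)  # outcome: true exposure effect is zero

def adjusted_slope(x, c, y):
    """Coefficient on x from an OLS regression of y on (1, x, c)."""
    design = np.column_stack([np.ones_like(x), x, c])
    return np.linalg.lstsq(design, y, rcond=None)[0][1]

b_clean = adjusted_slope(x, conf, y)                       # ~0.0 (correct)
b_noisy = adjusted_slope(x, conf + rng.normal(size=n), y)  # ~0.33 (overestimated)
print(round(b_clean, 2), round(b_noisy, 2))
```

Here the exposure itself is measured perfectly; the bias comes entirely from residual confounding left by the mismeasured confounder, which is exactly the mechanism the authors warn is hard to anticipate.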

  20. A new accuracy measure based on bounded relative error for time series forecasting

    PubMed Central

Chen, Chao; Twycross, Jamie; Garibaldi, Jonathan M.

    2017-01-01

    Many accuracy measures have been proposed in the past for time series forecasting comparisons. However, many of these measures suffer from one or more issues such as poor resistance to outliers and scale dependence. In this paper, while summarising commonly used accuracy measures, a special review is made on the symmetric mean absolute percentage error. Moreover, a new accuracy measure called the Unscaled Mean Bounded Relative Absolute Error (UMBRAE), which combines the best features of various alternative measures, is proposed to address the common issues of existing measures. A comparative evaluation on the proposed and related measures has been made with both synthetic and real-world data. The results indicate that the proposed measure, with user selectable benchmark, performs as well as or better than other measures on selected criteria. Though it has been commonly accepted that there is no single best accuracy measure, we suggest that UMBRAE could be a good choice to evaluate forecasting methods, especially for cases where measures based on geometric mean of relative errors, such as the geometric mean relative absolute error, are preferred. PMID:28339480

  1. A new accuracy measure based on bounded relative error for time series forecasting.

    PubMed

    Chen, Chao; Twycross, Jamie; Garibaldi, Jonathan M

    2017-01-01

    Many accuracy measures have been proposed in the past for time series forecasting comparisons. However, many of these measures suffer from one or more issues such as poor resistance to outliers and scale dependence. In this paper, while summarising commonly used accuracy measures, a special review is made on the symmetric mean absolute percentage error. Moreover, a new accuracy measure called the Unscaled Mean Bounded Relative Absolute Error (UMBRAE), which combines the best features of various alternative measures, is proposed to address the common issues of existing measures. A comparative evaluation on the proposed and related measures has been made with both synthetic and real-world data. The results indicate that the proposed measure, with user selectable benchmark, performs as well as or better than other measures on selected criteria. Though it has been commonly accepted that there is no single best accuracy measure, we suggest that UMBRAE could be a good choice to evaluate forecasting methods, especially for cases where measures based on geometric mean of relative errors, such as the geometric mean relative absolute error, are preferred.
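The bounded-relative-error idea behind UMBRAE can be sketched as follows. The formula below (bounded error |e|/(|e| + |e*|) against a benchmark's error e*, averaged and then unscaled) follows the definition given by Chen et al. (2017); treat the exact form as an assumption if working from the abstract alone.

```python
import numpy as np

def umbrae(actual, forecast, benchmark):
    """Unscaled Mean Bounded Relative Absolute Error (sketch).

    Each error is bounded to [0, 1] relative to a benchmark forecast's
    error, so single outliers cannot dominate; the mean is then mapped
    back to a relative-error scale. Values below 1 mean the forecast
    beats the benchmark.
    """
    e = np.abs(np.asarray(actual) - np.asarray(forecast))
    e_star = np.abs(np.asarray(actual) - np.asarray(benchmark))
    brae = e / (e + e_star)        # bounded relative absolute error, in [0, 1]
    mbrae = brae.mean()
    return mbrae / (1.0 - mbrae)   # unscale the mean

# toy check: a forecast whose errors are half the benchmark's
actual = np.array([10.0, 12.0, 14.0, 16.0])
good = actual + 0.5
naive = actual + 1.0
print(umbrae(actual, good, naive))  # ~0.5: errors half the benchmark's
```

A production version would also need to handle ties where both errors are zero (the ratio is then 0/0); the sketch omits that for brevity.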

  2. Economic measurement of medical errors using a hospital claims database.

    PubMed

    David, Guy; Gunnarsson, Candace L; Waters, Heidi C; Horblyuk, Ruslan; Kaplan, Harold S

    2013-01-01

    The primary objective of this study was to estimate the occurrence and costs of medical errors from the hospital perspective. Methods from a recent actuarial study of medical errors were used to identify medical injuries. A visit qualified as an injury visit if at least 1 of 97 injury groupings occurred at that visit, and the percentage of injuries caused by medical error was estimated. Visits with more than four injuries were removed from the population to avoid overestimation of cost. Population estimates were extrapolated from the Premier hospital database to all US acute care hospitals. There were an estimated 161,655 medical errors in 2008 and 170,201 medical errors in 2009. Extrapolated to the entire US population, there were more than 4 million unique injury visits containing more than 1 million unique medical errors each year. This analysis estimated that the total annual cost of measurable medical errors in the United States was $985 million in 2008 and just over $1 billion in 2009. The median cost per error to hospitals was $892 for 2008 and rose to $939 in 2009. Nearly one third of all medical injuries were due to error in each year. Medical errors directly impact patient outcomes and hospitals' profitability, especially since 2008 when Medicare stopped reimbursing hospitals for care related to certain preventable medical errors. Hospitals must rigorously analyze causes of medical errors and implement comprehensive preventative programs to reduce their occurrence as the financial burden of medical errors shifts to hospitals. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  3. Measurement of precipitation using lysimeters

    NASA Astrophysics Data System (ADS)

    Fank, Johann; Klammler, Gernot

    2013-04-01

    Austria's alpine foothill aquifers contain important drinking water resources, but are also used intensively for agricultural production. These groundwater bodies are generally recharged by infiltrating precipitation. Sustainable water resources management of these aquifers requires quantifying actual evapotranspiration (ET), groundwater recharge (GR), precipitation (P) and soil water storage change (ΔS). While GR and ΔS can be measured directly by weighable lysimeters and P by separate precipitation gauges, ET is determined by solving the climatic water balance ET = P − GR ± ΔS. According to WMO (2008), the measurement of rainfall is strongly influenced by precipitation gauge errors. The most significant errors result from wind loss, wetting loss, evaporation loss, and in- and out-splashing of water. Measuring errors can be reduced by a larger measuring surface and by positioning the collecting vessel at ground level. Modern weighable lysimeters commonly have a surface of 1 m², are integrated into their typical surroundings of vegetation cover (to avoid oasis effects) and record the mass change of monolithic soil columns with high accuracy (0.01 mm water equivalent) and high temporal resolution. Thus, precipitation can also be quantified by measuring the positive mass changes of the lysimeter. According to Meissner et al. (2007), dew, fog and rime can also be determined by means of highly precise weighable lysimeters. Furthermore, measuring precipitation with lysimeters avoids the common measuring errors (WMO 2008) at the point scale. However, this method is subject to external effects (background noise, influence of vegetation and wind) which affect the mass time series.
While the background noise of the weighing system is rather well known and can be filtered out of the mass time series, the influence of wind, which blows through the vegetation and affects the measured lysimeter mass, cannot be corrected easily, since there is no clear relation between wind speed and the measured outliers in lysimeter mass. Moreover, the influence of wind seems to vary between lysimeters. At the agricultural test site Wagna, Austria, two precipitation gauges with high temporal resolution (a weighing-recording gauge and a tipping-bucket gauge; both 200 cm² surface; measuring height 1.5 m) are installed. Furthermore, mass time series of various lysimeters cultivated with different vegetation are available for the same location. Appropriate methods to compensate for the influence of wind on measuring precipitation with lysimeters are investigated, and results from the different measuring devices are compared. Results show that precipitation measured with lysimeters is generally higher, especially compared with the weighing-recording gauge. In addition, the data interval of the lysimeter mass time series used for quantifying precipitation (e.g., 1 day, 1 hour, 30 minutes, 10 minutes) is a crucial factor and influences the result. In summary, the potential of highly precise weighable lysimeters for measuring precipitation at the point scale is rather high. However, the methods used to compensate for external effects on lysimeter weighing have to be enhanced before lysimeters can be used globally as precipitation gauges. Meissner, R., J. Seeger, H. Rupp, M. Seyfarth & H. Borg, 2007: Measurement of dew, fog, and rime with a high-precision gravitation lysimeter. J. Plant Nutr. Soil Sci. 170, p. 335-344. WMO (World Meteorological Organization), 2008: Guide to Meteorological Instruments and Methods of Observation. WMO-No. 8, 140 pp.
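The climatic water balance ET = P − GR ± ΔS used in this record reduces to simple bookkeeping once sign conventions are fixed; a minimal illustration with made-up daily values (the sign convention, storage change counted as mass gain, is an assumption):

```python
# Climatic water balance at a weighable lysimeter (illustrative numbers).
p  = 2.4   # precipitation, mm/day (positive lysimeter mass change)
gr = 0.6   # groundwater recharge, mm/day (seepage water collected below)
ds = -0.3  # soil water storage change, mm/day (negative = mass loss)

# Water that neither recharged groundwater nor stayed in storage evaporated.
et = p - gr - ds
print(round(et, 2))  # 2.1 mm/day
```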

  4. ELLIPTICAL WEIGHTED HOLICs FOR WEAK LENSING SHEAR MEASUREMENT. III. THE EFFECT OF RANDOM COUNT NOISE ON IMAGE MOMENTS IN WEAK LENSING ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okura, Yuki; Futamase, Toshifumi, E-mail: yuki.okura@nao.ac.jp, E-mail: tof@astr.tohoku.ac.jp

    This is the third paper on the improvement of systematic errors in weak lensing analysis using an elliptical weight function, referred to as E-HOLICs. In previous papers, we succeeded in avoiding errors that depend on the ellipticity of the background image. In this paper, we investigate the systematic error that depends on the signal-to-noise ratio of the background image. We find that the origin of this error is the random count noise that comes from the Poisson noise of sky counts. The random count noise introduces additional moments and centroid-shift error; the first-order effects are canceled in averaging, but the second-order effects are not. We derive formulae that correct this systematic error due to random count noise in measuring the moments and ellipticity of the background image. The correction formulae obtained are expressed as combinations of complex moments of the image, and thus can correct the systematic errors caused by each object. We test their validity using a simulated image and find that the systematic error becomes less than 1% in the measured ellipticity for objects with an IMCAT significance threshold of ν ≈ 11.7.

  5. Risk-Aware Planetary Rover Operation: Autonomous Terrain Classification and Path Planning

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Fuchs, Thomas J.; Steffy, Amanda; Maimone, Mark; Yen, Jeng

    2015-01-01

    Identifying and avoiding terrain hazards (e.g., soft soil and pointy embedded rocks) are crucial for the safety of planetary rovers. This paper presents a newly developed ground-based Mars rover operation tool that mitigates terrain risks by automatically identifying hazards on the terrain, evaluating their risks, and suggesting safe path options to operators that avoid potential risks while achieving specified goals. The tool will benefit rover operations by reducing operation cost, reducing the cognitive load of rover operators, preventing human errors, and, most importantly, significantly reducing the risk of loss of rovers.

  6. An observational study of drug administration errors in a Malaysian hospital (study of drug administration errors).

    PubMed

    Chua, S S; Tea, M H; Rahman, M H A

    2009-04-01

    Drug administration errors were the second most frequent type of medication error, after prescribing errors, but the latter were often intercepted; hence, administration errors were more likely to reach patients. Therefore, this study was conducted to determine the frequency and types of drug administration errors in a Malaysian hospital ward. This is a prospective study that involved direct, undisguised observation of drug administrations in a hospital ward. A researcher was stationed in the ward under study for 15 days to observe all drug administrations, which were recorded in a data collection form and then compared with the drugs prescribed for the patient. A total of 1118 opportunities for error were observed and 127 administrations had errors. This gave an error rate of 11.4% [95% confidence interval (CI) 9.5-13.3]. If incorrect-time errors were excluded, the error rate fell to 8.7% (95% CI 7.1-10.4). The most common types of drug administration errors were incorrect time (25.2%), followed by incorrect technique of administration (16.3%) and unauthorized drug errors (14.1%). In terms of clinical significance, 10.4% of the administration errors were considered potentially life-threatening. Intravenous routes were more likely to be associated with an administration error than oral routes (21.3% vs. 7.9%, P < 0.001). The study indicates that the frequency of drug administration errors in developing countries such as Malaysia is similar to that in developed countries. Incorrect-time errors were also the most common type of drug administration error. A non-punitive system of reporting medication errors should be established to encourage more information to be documented so that a risk management protocol can be developed and implemented.
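The headline rate and interval in this record can be reproduced from the raw counts with a normal-approximation (Wald) confidence interval for a proportion; the paper's exact interval method is not stated in the abstract, so this is an assumption:

```python
import math

errors, opportunities = 127, 1118
p = errors / opportunities                   # observed error rate
se = math.sqrt(p * (1 - p) / opportunities)  # standard error of a proportion
lo, hi = p - 1.96 * se, p + 1.96 * se        # 95% Wald interval
print(f"{100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f})")
```

This gives 11.4% (9.5-13.2); the published upper bound of 13.3 presumably reflects a slightly different interval method or rounding.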

  7. 33 CFR 210.2 - Notice of award.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 210.2 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROCUREMENT ACTIVITIES OF THE CORPS OF ENGINEERS § 210.2 Notice of award. The successful bidder... accompany the contract papers which are forwarded for execution. To avoid error, or confusing the notice of...

  8. 33 CFR 210.2 - Notice of award.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 210.2 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROCUREMENT ACTIVITIES OF THE CORPS OF ENGINEERS § 210.2 Notice of award. The successful bidder... accompany the contract papers which are forwarded for execution. To avoid error, or confusing the notice of...

  9. 33 CFR 210.2 - Notice of award.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 210.2 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROCUREMENT ACTIVITIES OF THE CORPS OF ENGINEERS § 210.2 Notice of award. The successful bidder... accompany the contract papers which are forwarded for execution. To avoid error, or confusing the notice of...

  10. 33 CFR 210.2 - Notice of award.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 210.2 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROCUREMENT ACTIVITIES OF THE CORPS OF ENGINEERS § 210.2 Notice of award. The successful bidder... accompany the contract papers which are forwarded for execution. To avoid error, or confusing the notice of...

  11. Conflict Monitoring in Dual Process Theories of Thinking

    ERIC Educational Resources Information Center

    De Neys, Wim; Glumicic, Tamara

    2008-01-01

    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision-making errors, there are widely differing views on the efficiency of the process. Kahneman…

  12. 12 CFR 202.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Credit means the right granted by a creditor to an applicant to defer payment of a debt, incur debt and defer its payment, or purchase property or services and defer payment therefor. (k) Credit card means... notwithstanding the maintenance of procedures reasonably adapted to avoid such errors. (t) Judgmental system of...

  13. 12 CFR 202.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Credit means the right granted by a creditor to an applicant to defer payment of a debt, incur debt and defer its payment, or purchase property or services and defer payment therefor. (k) Credit card means... notwithstanding the maintenance of procedures reasonably adapted to avoid such errors. (t) Judgmental system of...

  14. 12 CFR 202.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Credit means the right granted by a creditor to an applicant to defer payment of a debt, incur debt and defer its payment, or purchase property or services and defer payment therefor. (k) Credit card means... notwithstanding the maintenance of procedures reasonably adapted to avoid such errors. (t) Judgmental system of...

  15. Sampling command generator corrects for noise and dropouts in recorded data

    NASA Technical Reports Server (NTRS)

    Anderson, T. O.

    1973-01-01

    Generator measures period between zero crossings of reference signal and accepts as correct timing points only those zero crossings which occur acceptably close to nominal time predicted from last accepted command. Unidirectional crossover points are used exclusively so errors from analog nonsymmetry of crossover detector are avoided.

  16. La parole, vue et prise par les etudiants (Speech as Seen and Understood by Student).

    ERIC Educational Resources Information Center

    Gajo, Laurent, Ed.; Jeanneret, Fabrice, Ed.

    1998-01-01

    Articles on speech and second language learning include: "Les sequences de correction en classe de langue seconde: evitement du 'non' explicite" ("Error Correction Sequences in Second Language Class: Avoidance of the Explicit 'No'") (Anne-Lise de Bosset); "Analyse hierarchique et fonctionnelle du discours: conversations…

  17. Single-Event Upset Characterization of Common First- and Second-Order All-Digital Phase-Locked Loops

    NASA Astrophysics Data System (ADS)

    Chen, Y. P.; Massengill, L. W.; Kauppila, J. S.; Bhuva, B. L.; Holman, W. T.; Loveless, T. D.

    2017-08-01

    The single-event upset (SEU) vulnerability of common first- and second-order all-digital phase-locked loops (ADPLLs) is investigated through field-programmable gate array-based fault injection experiments. SEUs in the highest-order pole of the loop filter and in fraction-based phase detectors (PDs) may result in the worst-case error response, i.e., limit cycle errors, often requiring system restart. SEUs in integer-based linear PDs may result in loss-of-lock errors, while SEUs in bang-bang PDs result only in temporary frequency errors. ADPLLs with the same frequency tuning range but fewer bits in the control word exhibit better overall SEU performance.

  18. Prevalence and cost of hospital medical errors in the general and elderly United States populations.

    PubMed

    Mallow, Peter J; Pandya, Bhavik; Horblyuk, Ruslan; Kaplan, Harold S

    2013-12-01

    The primary objective of this study was to quantify the differences in the prevalence rate and costs of hospital medical errors between the general population and an elderly population aged ≥65 years. Methods from an actuarial study of medical errors were modified to identify medical errors in the Premier Hospital Database using data from 2009. Visits with more than four medical errors were removed from the population to avoid overestimation of cost. Prevalence rates were calculated based on the total number of inpatient visits. There were 3,466,596 total inpatient visits in 2009. Of these, 1,230,836 (36%) occurred in people aged ≥65. The prevalence rate was 49 medical errors per 1000 inpatient visits in the general cohort and 79 medical errors per 1000 inpatient visits in the elderly cohort. The top 10 medical errors accounted for more than 80% of the total in both the general cohort and the 65+ cohort. The most costly medical error for the general population was postoperative infection ($569,287,000). Pressure ulcers were most costly ($347,166,257) in the elderly population. This study was conducted with a hospital administrative database, and assumptions were necessary to identify medical errors in the database. Further, there was no method to identify errors of omission or misdiagnoses within the database. This study indicates that the prevalence of hospital medical errors for the elderly is greater than for the general population and that the associated cost of medical errors in the elderly population is quite substantial. Hospitals that focus further attention on medical errors in the elderly population may see a significant reduction in costs due to medical errors, as a disproportionate percentage of medical errors occur in this age group.
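The general-cohort rate quoted here can be cross-checked against the 2009 error count reported in the Premier-database study earlier in this listing (David et al.); assuming both studies draw on the same 2009 Premier data, the rate per 1000 visits follows directly:

```python
errors_2009 = 170_201   # medical errors identified in 2009 (David et al., above)
visits_2009 = 3_466_596 # total inpatient visits in 2009 (this record)

rate = 1000 * errors_2009 / visits_2009
print(round(rate))      # ~49 errors per 1000 visits, matching the general cohort
```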

  19. Refractive errors in Mercyland Specialist Hospital, Osogbo, Western Nigeria.

    PubMed

    Adeoti, C O; Egbewale, B E

    2008-06-01

    The study was conducted to determine the magnitude and pattern of refractive errors in order to provide facilities for their management. A prospective study of 3601 eyes of 1824 consecutive patients was conducted. Information obtained included age, sex, occupation, visual acuity, and the type and degree of refractive error. The data were analysed using SPSS (Statistical Package for the Social Sciences, version 11.0) computer software. Refractive error was found in 1824 (53.71%) patients. There were 832 (45.61%) males and 992 (54.39%) females with a mean age of 35.55 years. Myopia was the commonest error (1412 eyes; 39.21%). Others included hypermetropia (840 eyes; 23.33%) and astigmatism (785 eyes; 21.80%), and 820 patients (1640 eyes) had presbyopia. Anisometropia was present in 791 (44.51%) of the 1777 patients who had bilateral refractive errors. Two thousand two hundred and fifty-two eyes had spherical errors. Of these 2252 eyes, 1308 (58.08%) had errors of -0.50 to +0.50 dioptres; 567 (25.18%) had errors less than -0.50 dioptres, of which 63 eyes (2.80%) had errors less than -5.00 dioptres; and 377 (16.74%) had errors greater than +0.50 dioptres, of which 81 eyes (3.60%) had errors greater than +2.00 dioptres. The highest error was 20.00 dioptres for myopia and 18.00 dioptres for hypermetropia. Refractive error is common in this environment. Adequate provision should be made for its correction, bearing in mind the common types and degrees.

  20. Clinical Dental Faculty Members' Perceptions of Diagnostic Errors and How to Avoid Them.

    PubMed

    Nikdel, Cathy; Nikdel, Kian; Ibarra-Noriega, Ana; Kalenderian, Elsbeth; Walji, Muhammad F

    2018-04-01

    Diagnostic errors are increasingly recognized as a source of preventable harm in medicine, yet little is known about their occurrence in dentistry. The aim of this study was to gain a deeper understanding of clinical dental faculty members' perceptions of diagnostic errors, types of errors that may occur, and possible contributing factors. The authors conducted semi-structured interviews with ten domain experts at one U.S. dental school in May-August 2016 about their perceptions of diagnostic errors and their causes. The interviews were analyzed using an inductive process to identify themes and key findings. The results showed that the participants varied in their definitions of diagnostic errors. While all identified missed diagnosis and wrong diagnosis, only four participants perceived that a delay in diagnosis was a diagnostic error. Some participants perceived that an error occurs only when the choice of treatment leads to harm. Contributing factors associated with diagnostic errors included the knowledge and skills of the dentist, not taking adequate time, lack of communication among colleagues, and cognitive biases such as premature closure based on previous experience. Strategies suggested by the participants to prevent these errors were taking adequate time when investigating a case, forming study groups, increasing communication, and putting more emphasis on differential diagnosis. These interviews revealed differing perceptions of dental diagnostic errors among clinical dental faculty members. To address the variations, the authors recommend adopting shared language developed by the medical profession to increase understanding.

  1. Passive avoidance and complex maze learning in the senescence accelerated mouse (SAM): age and strain comparisons of SAM P8 and R1.

    PubMed

    Spangler, Edward L; Patel, Namisha; Speer, Dorey; Hyman, Michael; Hengemihle, John; Markowska, Alicja; Ingram, Donald K

    2002-02-01

    Two strains of the senescence accelerated mouse, P8 and R1, were tested in footshock-motivated passive avoidance (PA; P8, 3-21 months; R1, 3-24 months) and 14-unit T-maze (P8 and R1, 9 and 15 months) tasks. For PA, entry into a dark chamber from a lighted chamber was followed by a brief shock. Latency to enter the dark chamber 24 hours later served as a measure of retention. Two days of active avoidance training in a straight runway preceded 2 days (8 trials/day) of testing in the 14-unit T-maze. For PA retention, older P8 mice entered the dark chamber more quickly than older R1 mice, whereas no differences were observed between young P8 and R1 mice. In the 14-unit T-maze, age-related learning deficits were reflected in higher error scores for older mice. P8 mice were actually superior learners; that is, they had lower error scores than their age-matched R1 counterparts. Although the PA learning results were in agreement with other reports, the results obtained in the 14-unit T-maze were not consistent with previous reports of learning impairments in the P8 senescence accelerated mouse.

  2. Abnormal decision-making in generalized anxiety disorder: Aversion of risk or stimulus-reinforcement impairment?

    PubMed

    Teng, Cindy; Otero, Marcela; Geraci, Marilla; Blair, R J R; Pine, Daniel S; Grillon, Christian; Blair, Karina S

    2016-03-30

    There is preliminary data indicating that patients with generalized anxiety disorder (GAD) show impairment on decision-making tasks requiring the appropriate representation of reinforcement value. The current study aimed to extend this literature using the passive avoidance (PA) learning task, in which the participant has to learn to respond to stimuli that engender reward and to avoid responding to stimuli that engender punishment. Six stimuli engendering reward and six engendering punishment are presented once per block for 10 blocks of trials. Thirty-nine medication-free patients with GAD and 29 age-, IQ-, and gender-matched healthy comparison individuals performed the task. In addition, indexes of social functioning as assessed by the Global Assessment of Functioning (GAF) scale were obtained to allow correlational analyses of potential relations between cognitive and social impairments. The results revealed a Group-by-Error Type-by-Block interaction; patients with GAD committed significantly more commission (passive avoidance) errors than comparison individuals in the later blocks (blocks 7, 8, and 9). In addition, the extent of impairment on these blocks was associated with their functional impairment as measured by the GAF scale. These results link GAD with anomalous decision-making and indicate that a problem in reinforcement representation may contribute to the severity of expression of the disorder. Copyright © 2016. Published by Elsevier Ireland Ltd.

  3. Reduction of construction wastes by improving construction contract management: a multinational evaluation.

    PubMed

    Mendis, Daylath; Hewage, Kasun N; Wrzesniewski, Joanna

    2013-10-01

    The Canadian construction industry generates 30% of the total municipal solid waste deposited in landfills. Ample evidence can be found in the published literature of rework and waste generation due to ambiguity and errors in contract documents. The literature also notes that disclaimer clauses are included in contractual agreements to forestall contractor claims, and that these clauses often cause rework. Our professional practice has likewise noted several disclaimer clauses in standard contract documents that have the potential to cause rework (and associated waste). This article presents a comparative study of standard contractual documents and their potential to create rework (and associated waste) in different regions of the world. The objectives of this study are (1) to analyse standard contractual documents in Canada, the USA and Australia in terms of their potential to generate rework and waste, and (2) to propose changes/amendments to the existing standard contract documents to minimise/avoid rework. In terms of construction waste management, all the reviewed standard contract documents have deficiencies. The parties that produce the contract documents include exculpatory clauses to avoid the other party's claims. This approach tends to result in rework and construction waste. Contractual agreements and contract documents should be free from errors, deficiencies, ambiguity and unfair risk transfers to minimise or avoid the potential to generate rework and waste.

  4. "An integrative formal model of motivation and decision making: The MGPM*": Correction to Ballard et al. (2016).

    PubMed

    2017-02-01

    Reports an error in "An integrative formal model of motivation and decision making: The MGPM*" by Timothy Ballard, Gillian Yeo, Shayne Loft, Jeffrey B. Vancouver and Andrew Neal (Journal of Applied Psychology, 2016[Sep], Vol 101[9], 1240-1265). Equation A3 contained an error; the corrected equation is provided in the erratum. (The following abstract of the original article appeared in record 2016-28692-001.) We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit.
(PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Cost and benefit estimates of partially-automated vehicle collision avoidance technologies.

    PubMed

    Harper, Corey D; Hendrickson, Chris T; Samaras, Constantine

    2016-10-01

    Many light-duty vehicle crashes occur due to human error and distracted driving. Partially-automated crash avoidance features offer the potential to reduce the frequency and severity of vehicle crashes that occur due to distracted driving and/or human error by assisting in maintaining control of the vehicle or issuing alerts if a potentially dangerous situation is detected. This paper evaluates the benefits and costs of fleet-wide deployment of blind spot monitoring, lane departure warning, and forward collision warning crash avoidance systems within the US light-duty vehicle fleet. The three crash avoidance technologies could collectively prevent or reduce the severity of as many as 1.3 million U.S. crashes a year, including 133,000 injury crashes and 10,100 fatal crashes. For this paper we made two estimates of potential benefits in the United States: (1) the upper bound fleet-wide technology diffusion benefits by assuming all relevant crashes are avoided and (2) the lower bound fleet-wide benefits of the three technologies based on observed insurance data. The latter represents a lower bound, since effectiveness should improve over time and costs should fall with scale economies and technological refinement. All three technologies could collectively provide a lower bound annual benefit of about $18 billion if equipped on all light-duty vehicles. With 2015 pricing of safety options, the total annual costs to equip all light-duty vehicles with the three technologies would be about $13 billion, resulting in an annual net benefit of about $4 billion or a $20 per vehicle net benefit. By assuming all relevant crashes are avoided, the total upper bound annual net benefit from all three technologies combined is about $202 billion or an $861 per vehicle net benefit, at current technology costs. The technologies we are exploring in this paper represent an early form of vehicle automation, and a positive net benefit suggests the fleet-wide adoption of these technologies would be beneficial from an economic and social perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Secondary School Teachers' Pedagogical Content Knowledge of Some Common Student Errors and Misconceptions in Sets

    ERIC Educational Resources Information Center

    Kolitsoe Moru, Eunice; Qhobela, Makomosela

    2013-01-01

    The study investigated teachers' pedagogical content knowledge of common students' errors and misconceptions in sets. Five mathematics teachers from one Lesotho secondary school were the sample of the study. Questionnaires and interviews were used for data collection. The results show that teachers were able to identify the following students'…

  7. Addressing Common Student Errors with Classroom Voting in Multivariable Calculus

    ERIC Educational Resources Information Center

    Cline, Kelly; Parker, Mark; Zullo, Holly; Stewart, Ann

    2012-01-01

    One technique for identifying and addressing common student errors is the method of classroom voting, in which the instructor presents a multiple-choice question to the class, and after a few minutes for consideration and small group discussion, each student votes on the correct answer, often using a hand-held electronic clicker. If a large number…

  8. The Nature of Error in Adolescent Student Writing

    ERIC Educational Resources Information Center

    Wilcox, Kristen Campbell; Yagelski, Robert; Yu, Fang

    2014-01-01

    This study examined the nature and frequency of error in high school native English speaker (L1) and English learner (L2) writing. Four main research questions were addressed: Are there significant differences in students' error rates in English language arts (ELA) and social studies? Do the most common errors made by students differ in ELA…

  9. False Positives in Multiple Regression: Unanticipated Consequences of Measurement Error in the Predictor Variables

    ERIC Educational Resources Information Center

    Shear, Benjamin R.; Zumbo, Bruno D.

    2013-01-01

    Type I error rates in multiple regression, and hence the chance for false positive research findings, can be drastically inflated when multiple regression models are used to analyze data that contain random measurement error. This article shows the potential for inflated Type I error rates in commonly encountered scenarios and provides new…
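The mechanism described above can be reproduced in a small Monte Carlo sketch (an illustrative setup, not the authors' simulation design): an outcome driven only by a confounder `x1`, a focal predictor `x2` with no true effect, and a noisy measurement `w` of the confounder. Controlling for `w` instead of `x1` leaves residual confounding, so the nominal 5% test on `x2` rejects far too often. All names and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def type1_rate(n=100, reps=1000, reliability=0.5, r=0.7, t_crit=1.984):
    """Share of datasets in which the no-effect predictor x2 is 'significant'.

    y depends only on the confounder x1; x1 is observed as w with
    measurement error, so controlling for w leaves residual confounding
    that the t-test on x2 picks up as a false positive.
    """
    rejections = 0
    for _ in range(reps):
        x1 = rng.standard_normal(n)                               # true confounder
        x2 = r * x1 + np.sqrt(1 - r**2) * rng.standard_normal(n)  # focal predictor, no true effect
        y = x1 + rng.standard_normal(n)                           # outcome depends only on x1
        w = x1 + np.sqrt((1 - reliability) / reliability) * rng.standard_normal(n)
        X = np.column_stack([np.ones(n), w, x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        s2 = resid @ resid / (n - X.shape[1])
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[2, 2])
        rejections += abs(beta[2] / se) > t_crit  # nominal two-sided alpha = 0.05
    return rejections / reps

print(type1_rate())  # far above the nominal 0.05
```

With perfect measurement (`reliability=1.0`) the rejection rate returns to roughly the nominal level, which is the contrast the article's simulations turn on.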

  10. Comparing Measurement Error between Two Different Methods of Measurement of Various Magnitudes

    ERIC Educational Resources Information Center

    Zavorsky, Gerald S.

    2010-01-01

    Measurement error is a common problem in several fields of research such as medicine, physiology, and exercise science. The standard deviation of repeated measurements on the same person is the measurement error. One way of presenting measurement error is called the repeatability, which is 2.77 multiplied by the within subject standard deviation.…
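The repeatability statistic mentioned above (2.77, i.e. about 1.96·√2, times the within-subject standard deviation) can be illustrated with a short sketch; the measurement values below are invented for illustration.

```python
import numpy as np

# Hypothetical test-retest data: two measurements per subject.
m1 = np.array([12.1, 15.3, 14.2, 16.8, 13.5])
m2 = np.array([12.5, 14.9, 14.8, 16.1, 13.1])

# Within-subject SD estimated from paired differences: Sw^2 = mean(d^2) / 2.
d = m1 - m2
sw = np.sqrt(np.mean(d ** 2) / 2.0)

# Repeatability: the absolute difference between two measurements on the
# same subject is expected to fall below this value for ~95% of pairs.
repeatability = 2.77 * sw  # 2.77 ~= 1.96 * sqrt(2)
print(round(repeatability, 2))
```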

  11. [Errors in prescriptions and their preparation at the outpatient pharmacy of a regional hospital].

    PubMed

    Alvarado A, Carolina; Ossa G, Ximena; Bustos M, Luis

    2017-01-01

    Adverse effects of medications are an important cause of morbidity and hospital admissions. Errors in the prescription or preparation of medications by pharmacy personnel are a factor that may influence the occurrence of these adverse effects. Aim: To assess the frequency and type of errors in prescriptions and in their preparation at the pharmacy unit of a regional public hospital. Prescriptions received by ambulatory patients and by those being discharged from the hospital were reviewed using a 12-item checklist. The preparation of such prescriptions at the pharmacy unit was also reviewed using a seven-item checklist. Seventy-two percent of prescriptions had at least one error. The most common mistake was the impossibility of determining the concentration of the prescribed drug. Prescriptions for patients being discharged from the hospital had the highest number of errors. When a prescription had more than two drugs, the risk of error increased 2.4 times. Twenty-four percent of prescription preparations had at least one error. The most common mistake was the labeling of drugs with incomplete medical indications. When a preparation included more than three drugs, the risk of preparation error increased 1.8 times. Prescription and preparation of medications delivered to patients had frequent errors. The most important risk factor for errors was the number of drugs prescribed.

  12. Reduction in chemotherapy order errors with computerized physician order entry.

    PubMed

    Meisenberg, Barry R; Wright, Robert R; Brady-Copertino, Catherine J

    2014-01-01

    To measure the number and type of errors associated with chemotherapy order composition under three sequential methods of ordering: handwritten orders, preprinted orders, and computerized physician order entry (CPOE) embedded in the electronic health record. From 2008 to 2012, a sample of completed chemotherapy orders was reviewed by a pharmacist for the number and type of errors as part of routine performance improvement monitoring. Error frequencies for each of the three distinct methods of composing chemotherapy orders were compared using statistical methods. The rate of problematic order sets (those requiring significant rework for clarification) was reduced from 30.6% with handwritten orders to 12.6% with preprinted orders (preprinted v handwritten, P < .001) to 2.2% with CPOE (preprinted v CPOE, P < .001). The incidence of errors capable of causing harm was reduced from 4.2% with handwritten orders to 1.5% with preprinted orders (preprinted v handwritten, P < .001) to 0.1% with CPOE (CPOE v preprinted, P < .001). The number of problem- and error-containing chemotherapy orders was reduced sequentially by preprinted order sets and then by CPOE. CPOE is associated with low error rates, but it did not eliminate all errors, and the technology can introduce novel types of errors not seen with traditional handwritten or preprinted orders. Vigilance even with CPOE is still required to avoid patient harm.

  13. Data quality assurance and control in cognitive research: Lessons learned from the PREDICT-HD study.

    PubMed

    Westervelt, Holly James; Bernier, Rachel A; Faust, Melanie; Gover, Mary; Bockholt, H Jeremy; Zschiegner, Roland; Long, Jeffrey D; Paulsen, Jane S

    2017-09-01

    We discuss the strategies employed in data quality control and quality assurance for the cognitive core of Neurobiological Predictors of Huntington's Disease (PREDICT-HD), a long-term observational study of over 1,000 participants with prodromal Huntington disease. In particular, we provide details regarding the training and continual evaluation of cognitive examiners, methods for error corrections, and strategies to minimize errors in the data. We present five important lessons learned to help other researchers avoid certain assumptions that could potentially lead to inaccuracies in their cognitive data. Copyright © 2017 John Wiley & Sons, Ltd.

  14. The epoch state navigation filter. [for maximum likelihood estimates of position and velocity vectors

    NASA Technical Reports Server (NTRS)

    Battin, R. H.; Croopnick, S. R.; Edwards, J. A.

    1977-01-01

    The formulation of a recursive maximum likelihood navigation system employing reference position and velocity vectors as state variables is presented. Convenient forms of the required variational equations of motion are developed together with an explicit form of the associated state transition matrix needed to refer measurement data from the measurement time to the epoch time. Computational advantages accrue from this design in that the usual forward extrapolation of the covariance matrix of estimation errors can be avoided without incurring unacceptable system errors. Simulation data for earth orbiting satellites are provided to substantiate this assertion.

  15. A Rasch Perspective

    ERIC Educational Resources Information Center

    Schumacker, Randall E.; Smith, Everett V., Jr.

    2007-01-01

    Measurement error is a common theme in classical measurement models used in testing and assessment. In classical measurement models, the definition of measurement error and the subsequent reliability coefficients differ on the basis of the test administration design. Internal consistency reliability specifies error due primarily to poor item…

  16. Fuzzy Inference Based Obstacle Avoidance Control of Electric Powered Wheelchair Considering Driving Risk

    NASA Astrophysics Data System (ADS)

    Kiso, Atsushi; Murakami, Hiroki; Seki, Hirokazu

    This paper describes a novel obstacle avoidance control scheme for electric powered wheelchairs aimed at realizing safe driving in various environments. The “electric powered wheelchair”, which generates driving force with electric motors, is expected to be widely used as a mobility support system for elderly and disabled people; however, its driving performance must be further improved, because the number of driving accidents caused by elderly operators' narrow sight and joystick operation errors is increasing. This paper proposes a novel obstacle avoidance control scheme based on a fuzzy algorithm to prevent driving accidents. The proposed control system determines the driving direction by fuzzy inference from the joystick operation and the distances to obstacles measured by ultrasonic sensors. The fuzzy rules that determine the driving direction are designed to reliably avoid passers-by and walls while taking the operator's intent and the driving environment into account. Driving experiments in practical situations show the effectiveness of the proposed control system.
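A heavily simplified sketch of the kind of rule blending described above (not the authors' controller; the membership shapes, distance thresholds, and steering gains are all invented): obstacle distances determine how strongly an avoidance steering term overrides the joystick command, with a weighted-average defuzzification.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def steer(joystick_deg, dist_left_m, dist_right_m):
    """Blend the operator's command with obstacle-avoidance corrections.

    Illustrative rules: 'if an obstacle is NEAR on the left, steer right'
    and vice versa; rule strength is the NEAR membership of each distance.
    The output is a weighted average of the rule consequents.
    """
    near_l = tri(dist_left_m, -0.5, 0.0, 1.5)   # NEAR: peaks at 0 m, gone by 1.5 m
    near_r = tri(dist_right_m, -0.5, 0.0, 1.5)
    w_user = max(1.0 - max(near_l, near_r), 0.1)  # trust the joystick when clear
    num = w_user * joystick_deg + near_l * 45.0 + near_r * (-45.0)
    return num / (w_user + near_l + near_r)

print(steer(0.0, 2.0, 2.0))      # no obstacles: follow the joystick -> 0.0
print(steer(0.0, 0.3, 2.0) > 0)  # obstacle close on the left: steer right -> True
```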

  17. Asymmetric interaction paired with a super-rational strategy might resolve the tragedy of the commons without requiring recognition or negotiation.

    PubMed

    He, Jun-Zhou; Wang, Rui-Wu; Jensen, Christopher X J; Li, Yao-Tang

    2015-01-14

    Avoiding the tragedy of the commons requires that one or more individuals in a group or partnership "volunteer", benefiting the group at a cost to themselves. Recognition and negotiation with social partners can maintain cooperation, but are often not possible. If recognition and negotiation are not always the mechanism by which cooperative partnerships avoid collective tragedies, what might explain the diverse social cooperation observed in nature? Assuming that individuals interact asymmetrically and that both "weak" and "strong" players employ a super-rational strategy, we find that the tragedy of the commons can be avoided without requiring either recognition or negotiation. Whereas in the volunteer's dilemma game a rational "strong" player is less likely to volunteer to provide a common good in larger groups, we show that under a wide range of conditions a super-rational "strong" player is more likely to provide a common good. These results imply that the integration of super-rationality and asymmetric interaction might have the potential to resolve the tragedy of the commons. By illuminating the conditions under which players are likely to volunteer, we shed light on the patterns of volunteerism observed in a variety of well-studied cooperative social systems, and explore how societies might avert social tragedies.
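For context, the baseline result this abstract contrasts against, that a rational player volunteers less often as the group grows, follows from the symmetric mixed-strategy equilibrium of the standard volunteer's dilemma. The sketch below is that textbook computation, not the paper's asymmetric super-rational model; `benefit` and `cost` are generic payoff parameters.

```python
def volunteer_prob(n, benefit=1.0, cost=0.3):
    """Symmetric mixed-strategy equilibrium of the N-player volunteer's dilemma.

    Indifference condition: benefit - cost = benefit * P(someone else volunteers),
    i.e. cost = benefit * (1 - p)**(n - 1), giving
    p = 1 - (cost/benefit)**(1/(n - 1)).
    """
    assert n >= 2 and 0 < cost < benefit
    return 1.0 - (cost / benefit) ** (1.0 / (n - 1))

# Each player's equilibrium volunteering probability falls as the group grows.
for n in (2, 5, 10, 50):
    print(n, round(volunteer_prob(n), 3))
```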

  18. Asymmetric interaction paired with a super-rational strategy might resolve the tragedy of the commons without requiring recognition or negotiation

    PubMed Central

    He, Jun-Zhou; Wang, Rui-Wu; Jensen, Christopher X. J.; Li, Yao-Tang

    2015-01-01

    Avoiding the tragedy of the commons requires that one or more individuals in a group or partnership “volunteer”, benefiting the group at a cost to themselves. Recognition and negotiation with social partners can maintain cooperation, but are often not possible. If recognition and negotiation are not always the mechanism by which cooperative partnerships avoid collective tragedies, what might explain the diverse social cooperation observed in nature? Assuming that individuals interact asymmetrically and that both “weak” and “strong” players employ a super-rational strategy, we find that the tragedy of the commons can be avoided without requiring either recognition or negotiation. Whereas in the volunteer's dilemma game a rational “strong” player is less likely to volunteer to provide a common good in larger groups, we show that under a wide range of conditions a super-rational “strong” player is more likely to provide a common good. These results imply that the integration of super-rationality and asymmetric interaction might have the potential to resolve the tragedy of the commons. By illuminating the conditions under which players are likely to volunteer, we shed light on the patterns of volunteerism observed in a variety of well-studied cooperative social systems, and explore how societies might avert social tragedies. PMID:25586876

  19. Asymmetric interaction paired with a super-rational strategy might resolve the tragedy of the commons without requiring recognition or negotiation

    NASA Astrophysics Data System (ADS)

    He, Jun-Zhou; Wang, Rui-Wu; Jensen, Christopher X. J.; Li, Yao-Tang

    2015-01-01

    Avoiding the tragedy of the commons requires that one or more individuals in a group or partnership ``volunteer'', benefiting the group at a cost to themselves. Recognition and negotiation with social partners can maintain cooperation, but are often not possible. If recognition and negotiation are not always the mechanism by which cooperative partnerships avoid collective tragedies, what might explain the diverse social cooperation observed in nature? Assuming that individuals interact asymmetrically and that both ``weak'' and ``strong'' players employ a super-rational strategy, we find that the tragedy of the commons can be avoided without requiring either recognition or negotiation. Whereas in the volunteer's dilemma game a rational ``strong'' player is less likely to volunteer to provide a common good in larger groups, we show that under a wide range of conditions a super-rational ``strong'' player is more likely to provide a common good. These results imply that the integration of super-rationality and asymmetric interaction might have the potential to resolve the tragedy of the commons. By illuminating the conditions under which players are likely to volunteer, we shed light on the patterns of volunteerism observed in a variety of well-studied cooperative social systems, and explore how societies might avert social tragedies.

  20. Evaluation of alignment error due to a speed artifact in stereotactic ultrasound image guidance.

    PubMed

    Salter, Bill J; Wang, Brian; Szegedi, Martin W; Rassiah-Szegedi, Prema; Shrieve, Dennis C; Cheng, Roger; Fuss, Martin

    2008-12-07

    Ultrasound (US) image guidance systems used in radiotherapy are typically calibrated for soft tissue applications, thus introducing errors in depth-from-transducer representation when used in media with a different speed of sound propagation (e.g. fat). This error is commonly referred to as the speed artifact. In this study we utilized a standard US phantom to demonstrate the existence of the speed artifact when using a commercial US image guidance system to image through layers of simulated body fat, and we compared the results with calculated/predicted values. A general purpose US phantom (speed of sound (SOS) = 1540 m/s) was imaged on a multi-slice CT scanner at a 0.625 mm slice thickness and 0.5 mm x 0.5 mm axial pixel size. Target-simulating wires inside the phantom were contoured and later transferred to the US guidance system. Layers of various thickness (1-8 cm) of commercially manufactured fat-simulating material (SOS = 1435 m/s) were placed on top of the phantom to study the depth-related alignment error. In order to demonstrate that the speed artifact is not caused by adding additional layers on top of the phantom, we repeated these measurements in an identical setup using commercially manufactured tissue-simulating material (SOS = 1540 m/s) for the top layers. For the fat-simulating material used in this study, we observed the magnitude of the depth-related alignment errors resulting from the speed artifact to be 0.7 mm per cm of fat imaged through. The measured alignment errors caused by the speed artifact agreed with the calculated values within one standard deviation for all of the different thicknesses of fat-simulating material studied here. We demonstrated the depth-related alignment error due to the speed artifact when using US image guidance for radiation treatment alignment and note that the presence of fat causes the target to be aliased to a depth greater than it actually is. For typical US guidance systems in use today, this will lead to delivery of the high dose region at a position slightly posterior to the intended region for a supine patient. When possible, care should be taken to avoid imaging through a thick layer of fat for larger patients in US alignments or, if unavoidable, the spatial inaccuracies introduced by the artifact should be considered by the physician during the formulation of the treatment plan.
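The roughly 0.7 mm per cm figure is consistent with simple time-of-flight arithmetic: the scanner converts echo time to depth using its soft-tissue calibration speed, while sound actually travels more slowly through fat, so the target is drawn too deep. A minimal sketch using the two speeds quoted in the abstract:

```python
# The system converts round-trip echo time to depth with the soft-tissue
# calibration speed (1540 m/s); fat is slower (1435 m/s), so echoes return
# late and the target is displayed deeper than it really is.
SOS_CAL = 1540.0  # m/s, system calibration speed
SOS_FAT = 1435.0  # m/s, fat-simulating material

def depth_error_mm_per_cm_fat(sos_cal=SOS_CAL, sos_medium=SOS_FAT):
    # For each 10 mm of fat traversed, round-trip time is 2*d/sos_medium,
    # but the system converts that time back to depth using sos_cal.
    true_mm = 10.0
    displayed_mm = true_mm * sos_cal / sos_medium
    return displayed_mm - true_mm

print(depth_error_mm_per_cm_fat())  # ~0.73 mm per cm of fat, matching the ~0.7 mm/cm measured
```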

  1. [The effect of the new nootropic dipeptide GVS-111 in different functional disorders of the escape reaction].

    PubMed

    Inozemtsev, A N; Trofimov, S S; Borlikova, G G; Firova, F A; Pragina, L L; Gudasheva, T A; Ostrovskaia, R U; Tushmalova, N A; Voronina, T A

    1998-01-01

    The authors studied the effect of GVS-111 (the ethyl ester of N-phenylacetyl-L-prolylglycine), a new nootropic agent with anxiolytic properties, on the formation of the avoidance reaction (AR) in rats and on its functional disorders, which were induced by two methods. In the first, the learned relationship between stimulus, reaction, and consequence was abruptly disrupted: the animal's crossing to the other half of the chamber in response to a conditioned stimulus failed to terminate and prevent the painful electrical stimulation for three successive trials (AR error). In the second, the spatial layout of the experiment was altered by changing the position of the opening through which the animal avoided the stimulus (spatial remodeling). Intraperitoneal injection of GVS-111 (0.1 mg/kg/day) improved learning, although the effect varied from experiment to experiment. In addition, the dipeptide prevented AR disturbance during the error condition and accelerated restoration of the habit after spatial remodeling. It was shown earlier that AR disorders during an error are prevented by both anxiolytics and nootropic agents, whereas those during spatial remodeling are prevented only by nootropic agents. It may therefore be assumed that the positive effect of GVS-111 on AR under functional disorders is due to its nootropic action.

  2. Determination of drill paths for percutaneous cochlear access accounting for target positioning error

    NASA Astrophysics Data System (ADS)

    Noble, Jack H.; Warren, Frank M.; Labadie, Robert F.; Dawant, Benoit; Fitzpatrick, J. Michael

    2007-03-01

    In cochlear implant surgery an electrode array is permanently implanted to stimulate the auditory nerve and allow deaf people to hear. Current surgical techniques require wide excavation of the mastoid region of the temporal bone and one to three hours time to avoid damage to vital structures. Recently a far less invasive approach has been proposed: percutaneous cochlear access, in which a single hole is drilled from skull surface to the cochlea. The drill path is determined by attaching a fiducial system to the patient's skull and then choosing, on a pre-operative CT, an entry point and a target point. The drill is advanced to the target, the electrodes placed through the hole, and a stimulator implanted at the surface of the skull. The major challenge is the determination of a safe and effective drill path, which with high probability avoids specific vital structures (the facial nerve, the ossicles, and the external ear canal) and arrives at the basal turn of the cochlea. These four features lie within a few millimeters of each other, the drill is one millimeter in diameter, and errors in the determination of the target position are on the order of 0.5 mm root-mean-square. Thus, path selection is both difficult and critical to the success of the surgery. This paper presents a method for finding optimally safe and effective paths while accounting for target positioning error.

  3. Flip-avoiding interpolating surface registration for skull reconstruction.

    PubMed

    Xie, Shudong; Leow, Wee Kheng; Lee, Hanjing; Lim, Thiam Chye

    2018-03-30

    Skull reconstruction is an important and challenging task in craniofacial surgery planning, forensic investigation and anthropological studies. Existing methods typically reconstruct approximating surfaces that regard corresponding points on the target skull as soft constraints, thus incurring non-zero error even for non-defective parts and high overall reconstruction error. This paper proposes a novel geometric reconstruction method that non-rigidly registers an interpolating reference surface that regards corresponding target points as hard constraints, thus achieving low reconstruction error. To overcome the shortcoming of interpolating a surface, a flip-avoiding method is used to detect and exclude conflicting hard constraints that would otherwise cause surface patches to flip and self-intersect. Comprehensive test results show that our method is more accurate and robust than existing skull reconstruction methods. By incorporating symmetry constraints, it can produce more symmetric and normal results than other methods in reconstructing defective skulls with a large number of defects. It is robust against severe outliers such as radiation artifacts in computed tomography due to dental implants. In addition, test results also show that our method outperforms thin-plate spline for model resampling, which enables the active shape model to yield more accurate reconstruction results. As the reconstruction accuracy of defective parts varies with the use of different reference models, we also study the implication of reference model selection for skull reconstruction. Copyright © 2018 John Wiley & Sons, Ltd.

  4. Assumption-free estimation of the genetic contribution to refractive error across childhood.

    PubMed

    Guggenheim, Jeremy A; St Pourcain, Beate; McMahon, George; Timpson, Nicholas J; Evans, David M; Williams, Cathy

    2015-01-01

    Studies in relatives have generally yielded high heritability estimates for refractive error: twins 75-90%, families 15-70%. However, because related individuals often share a common environment, these estimates are inflated (via misallocation of unique/common environment variance). We calculated a lower-bound heritability estimate for refractive error free from such bias. Between the ages 7 and 15 years, participants in the Avon Longitudinal Study of Parents and Children (ALSPAC) underwent non-cycloplegic autorefraction at regular research clinics. At each age, an estimate of the variance in refractive error explained by single nucleotide polymorphism (SNP) genetic variants was calculated using genome-wide complex trait analysis (GCTA) using high-density genome-wide SNP genotype information (minimum N at each age=3,404). The variance in refractive error explained by the SNPs ("SNP heritability") was stable over childhood: Across age 7-15 years, SNP heritability averaged 0.28 (SE=0.08, p<0.001). The genetic correlation for refractive error between visits varied from 0.77 to 1.00 (all p<0.001) demonstrating that a common set of SNPs was responsible for the genetic contribution to refractive error across this period of childhood. Simulations suggested lack of cycloplegia during autorefraction led to a small underestimation of SNP heritability (adjusted SNP heritability=0.35; SE=0.09). To put these results in context, the variance in refractive error explained (or predicted) by the time participants spent outdoors was <0.005 and by the time spent reading was <0.01, based on a parental questionnaire completed when the child was aged 8-9 years old. Genetic variation captured by common SNPs explained approximately 35% of the variation in refractive error between unrelated subjects. 
This value sets an upper limit for predicting refractive error using existing SNP genotyping arrays, although higher-density genotyping in larger samples and inclusion of interaction effects is expected to raise this figure toward twin- and family-based heritability estimates. The same SNPs influenced refractive error across much of childhood. Notwithstanding the strong evidence of association between time outdoors and myopia, and time reading and myopia, less than 1% of the variance in myopia at age 15 was explained by crude measures of these two risk factors, indicating that their effects may be limited, at least when averaged over the whole population.

  5. The global burden of diagnostic errors in primary care.

    PubMed

    Singh, Hardeep; Schiff, Gordon D; Graber, Mark L; Onakpoya, Igho; Thompson, Matthew J

    2017-06-01

    Diagnosis is one of the most important tasks performed by primary care physicians. The World Health Organization (WHO) recently prioritized patient safety areas in primary care, and included diagnostic errors as a high-priority problem. In addition, a recent report from the Institute of Medicine in the USA, 'Improving Diagnosis in Health Care', concluded that most people will likely experience a diagnostic error in their lifetime. In this narrative review, we discuss the global significance, burden and contributory factors related to diagnostic errors in primary care. We synthesize available literature to discuss the types of presenting symptoms and conditions most commonly affected. We then summarize interventions based on available data and suggest next steps to reduce the global burden of diagnostic errors. Research suggests that we are unlikely to find a 'magic bullet' and confirms the need for a multifaceted approach to understand and address the many systems and cognitive issues involved in diagnostic error. Because errors involve many common conditions and are prevalent across all countries, the WHO's leadership at a global level will be instrumental to address the problem. Based on our review, we recommend that the WHO consider bringing together primary care leaders, practicing frontline clinicians, safety experts, policymakers, the health IT community, medical education and accreditation organizations, researchers from multiple disciplines, patient advocates, and funding bodies among others, to address the many common challenges and opportunities to reduce diagnostic error. This could lead to prioritization of practice changes needed to improve primary care as well as setting research priorities for intervention development to reduce diagnostic error. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  6. A Comparison of Medication Histories Obtained by a Pharmacy Technician Versus Nurses in the Emergency Department

    PubMed Central

    Markovic, Marija; Mathis, A. Scott; Ghin, Hoytin Lee; Gardiner, Michelle; Fahim, Germin

    2017-01-01

    Purpose: To compare the medication history error rate of the emergency department (ED) pharmacy technician with that of nursing staff and to describe the workflow environment. Methods: Fifty medication histories performed by an ED nurse followed by the pharmacy technician were evaluated for discrepancies (RN-PT group). A separate 50 medication histories performed by the pharmacy technician and observed with necessary intervention by the ED pharmacist were evaluated for discrepancies (PT-RPh group). Discrepancies were totaled and categorized by type of error and therapeutic category of the medication. The workflow description was obtained by observation and staff interview. Results: A total of 474 medications in the RN-PT group and 521 in the PT-RPh group were evaluated. Nurses made at least one error in all 50 medication histories (100%), compared to 18 medication histories for the pharmacy technician (36%). In the RN-PT group, 408 medications had at least one error, corresponding to an accuracy rate of 14% for nurses. In the PT-RPh group, 30 medications had an error, corresponding to an accuracy rate of 94.4% for the pharmacy technician (P < 0.0001). The most common error made by nurses was a missing medication (n = 109), while the most common error for the pharmacy technician was a wrong medication frequency (n = 19). The most common drug class with documented errors for ED nurses was cardiovascular medications (n = 100), while the pharmacy technician made the most errors in gastrointestinal medications (n = 11). Conclusion: Medication histories obtained by the pharmacy technician were significantly more accurate than those obtained by nurses in the emergency department. PMID:28090164

  7. Stopping the Spread of Germs at Home, Work and School

    MedlinePlus

    ... influenza viruses that research indicates will be most common during the upcoming season. There are several flu vaccine options for the 2017-2018 flu season . Good Health Habits Avoid close contact. Avoid close contact ...

  8. Refractive error among the elderly in rural Southern Harbin, China.

    PubMed

    Li, Zhijian; Sun, Dianjun; Cui, Hao; Zhang, Liqiong; Liu, Ping; Yang, Hongbin; Bai, Jie

    2009-01-01

    To estimate the prevalence and associated factors of refractive errors among the elderly in a rural area of Southern Harbin, China. Five thousand and fifty-seven subjects (age > or = 50 years) were enumerated for a population-based study. All participants underwent complete ophthalmic evaluation. Refraction was performed by ophthalmic personnel trained in the study procedures. Myopia was defined as spherical equivalent worse than -0.50 diopters (D) and hyperopia as spherical equivalent worse than +0.50 D. Astigmatism was defined as a cylindrical error worse than 0.75 D. Association of refractive errors with age, sex, and education was analyzed. Of the 5,057 responders (91.0%), 4,979 were eligible. The mean age was 60.5 (range 50-96) years. The prevalence of myopia was 9.5% (95% confidence interval [CI], 8.5-10.1) and of hyperopia was 8.9% (95% CI, 7.9-9.5). Astigmatism was evident in 7.6% of the subjects. Myopia, hyperopia and astigmatism increased with increasing age (p<0.001, respectively). Myopia and astigmatism were more common in males, whereas hyperopia was more common in females. We also found that the prevalence of refractive error was associated with education. Myopia was more common in those with higher degrees of education, whereas hyperopia and astigmatism were more common in those with no formal education. This report has provided details of the refractive status in a rural population of Harbin. The prevalence of refractive errors in this population is lower than those reported in other regions of the world.

  9. Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain

    PubMed Central

    Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young

    2010-01-01

    Background Statistical analysis is essential for obtaining objective reliability in medical research. However, medical researchers do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention to improve the statistical quality of the journal. Methods All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 papers and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%) followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers. Among the errors of omission, "no statistics used even though statistical methods were required" was the most common (40.6%). The errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only in applying statistical procedures but also in the reviewing process to improve the value of the article. PMID:20552071
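The most common error of commission reported above, "parametric inference for nonparametric data", comes down to applying a mean-based statistic where a rank-based one is required. A minimal pure-Python sketch of the two kinds of statistic, with illustrative function names and made-up sample data (not from the article):

```python
import math
from statistics import mean, stdev

def welch_t(a, b):
    """Parametric: Welch's two-sample t statistic (assumes roughly normal data)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / math.sqrt(va / len(a) + vb / len(b))

def mann_whitney_u(a, b):
    """Nonparametric: Mann-Whitney U statistic, computed from ranks only."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5  # ties contribute half a point
    return u

# Heavily skewed samples: the rank-based U is the appropriate choice here,
# because the outlier in `placebo` distorts the mean-based t statistic.
placebo = [1.1, 1.3, 1.2, 1.4, 9.0]   # one extreme outlier
treated = [2.1, 2.4, 2.2, 2.6, 2.3]
print(mann_whitney_u(treated, placebo))  # 20.0: treated > placebo in 20 of 25 pairs
```

In practice one would reach for a vetted statistics library rather than hand-rolled statistics; the sketch is only meant to make the parametric/nonparametric distinction concrete.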

  10. Innovations in Medication Preparation Safety and Wastage Reduction: Use of a Workflow Management System in a Pediatric Hospital.

    PubMed

    Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara

    2017-01-01

    Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing the quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.

  11. [Improving blood safety: errors management in transfusion medicine].

    PubMed

    Bujandrić, Nevenka; Grujić, Jasmina; Krga-Milanović, Mirjana

    2014-01-01

    The concept of blood safety includes the entire transfusion chain, starting with the collection of blood from the blood donor and ending with blood transfusion to the patient. The concept involves a quality management system for the systematic monitoring of adverse reactions and incidents regarding the blood donor or patient. Monitoring of near-miss errors shows the critical points in the working process and increases transfusion safety. The aim of the study was to present the analysis results of adverse and unexpected events in transfusion practice with a potential risk to the health of blood donors and patients. This one-year retrospective study was based on the collection, analysis and interpretation of written reports on medical errors in the Blood Transfusion Institute of Vojvodina. Errors were distributed according to the type, frequency and part of the working process where they occurred. Possible causes and corrective actions were described for each error. The study showed that there were no errors with potential health consequences for the blood donor/patient. Errors with potentially damaging consequences for patients were detected throughout the entire transfusion chain. Most of the errors were identified in the preanalytical phase. The human factor was responsible for the largest number of errors. An error reporting system has an important role in error management and in reducing the transfusion-related risk of adverse events and incidents. The ongoing analysis reveals the strengths and weaknesses of the entire process and indicates the necessary changes. Errors in transfusion medicine can be avoided in a large percentage of cases, and prevention is cost-effective, systematic and applicable.

  12. Memory and the Moses illusion: failures to detect contradictions with stored knowledge yield negative memorial consequences.

    PubMed

    Bottoms, Hayden C; Eslick, Andrea N; Marsh, Elizabeth J

    2010-08-01

    Although contradictions with stored knowledge are common in daily life, people often fail to notice them. For example, in the Moses illusion, participants fail to notice errors in questions such as "How many animals of each kind did Moses take on the Ark?" despite later showing knowledge that the Biblical reference is to Noah, not Moses. We examined whether error prevalence affected participants' ability to detect distortions in questions, and whether this in turn had memorial consequences. Many of the errors were overlooked, but participants were better able to catch them when they were more common. More generally, the failure to detect errors had negative memorial consequences, increasing the likelihood that the errors were used to answer later general knowledge questions. Methodological implications of this finding are discussed, as it suggests that typical analyses likely underestimate the size of the Moses illusion. Overall, answering distorted questions can yield errors in the knowledge base; most importantly, prior knowledge does not protect against these negative memorial consequences.

  13. Prescribing errors in adult congenital heart disease patients admitted to a pediatric cardiovascular intensive care unit.

    PubMed

    Echeta, Genevieve; Moffett, Brady S; Checchia, Paul; Benton, Mary Kay; Klouda, Leda; Rodriguez, Fred H; Franklin, Wayne

    2014-01-01

    Adults with congenital heart disease (CHD) are often cared for at pediatric hospitals. There are no data describing the incidence or type of medication prescribing errors in adult patients admitted to a pediatric cardiovascular intensive care unit (CVICU). A review of patients >18 years of age admitted to the pediatric CVICU at our institution from 2009 to 2011 occurred. A comparator group <18 years of age but >70 kg (a typical adult weight) was identified. Medication prescribing errors were determined according to a commonly used adult drug reference. An independent panel consisting of a physician specializing in the care of adult CHD patients, a nurse, and a pharmacist evaluated all errors. Medication prescribing orders were classified as appropriate, underdose, overdose, or nonstandard (dosing per weight instead of standard adult dosing), and severity of error was classified. Eighty-five adult (74 patients) and 33 pediatric admissions (32 patients) met study criteria (mean age 27.5 ± 9.4 years, 53% male vs. 14.9 ± 1.8 years, 63% male). A cardiothoracic surgical procedure occurred in 81.4% of admissions. Adult admissions weighed less than pediatric admissions (72.8 ± 22.4 kg vs. 85.6 ± 14.9 kg, P < .01) but hospital length of stay was similar. (Adult 6 days [range 1-216 days]; pediatric 5 days [Range 2-123 days], P = .52.) A total of 112 prescribing errors were identified and they occurred less often in adults (42.4% of admissions vs. 66.7% of admissions, P = .02). Adults had a lower mean number of errors (0.7 errors per adult admission vs. 1.7 errors per pediatric admission, P < .01). Prescribing errors occurred most commonly with antimicrobials (n = 27). Underdosing was the most common category of prescribing error. Most prescribing errors were determined to have not caused harm to the patient. Prescribing errors occur frequently in adult patients admitted to a pediatric CVICU but occur more often in pediatric patients of adult weight. 
© 2013 Wiley Periodicals, Inc.

  14. Software fault-tolerance by design diversity DEDIX: A tool for experiments

    NASA Technical Reports Server (NTRS)

    Avizienis, A.; Gunningberg, P.; Kelly, J. P. J.; Lyu, R. T.; Strigini, L.; Traverse, P. J.; Tso, K. S.; Voges, U.

    1986-01-01

    The use of multiple versions of a computer program, independently designed from a common specification, to reduce the effects of an error is discussed. If these versions are designed by independent programming teams, it is expected that a fault in one version will not have the same behavior as any fault in the other versions. Since the errors in the output of the versions are different and uncorrelated, it is possible to run the versions concurrently, cross-check their results at prespecified points, and mask errors. A DEsign DIversity eXperiments (DEDIX) testbed was implemented to study the influence of common mode errors which can result in a failure of the entire system. The layered design of DEDIX and its decision algorithm are described.
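The cross-check-and-mask scheme described here is, at its core, a majority vote over the outputs of independently developed versions. A hedged sketch of that idea (this is not DEDIX's actual decision algorithm, whose details are in the paper; the function name is illustrative):

```python
from collections import Counter

def masked_result(version_outputs):
    """Majority vote across N independently designed program versions.

    A fault in a single version is masked as long as the remaining
    versions agree; when no majority exists, the disagreement is
    surfaced instead of silently picking an answer.
    """
    value, count = Counter(version_outputs).most_common(1)[0]
    if count * 2 > len(version_outputs):
        return value
    raise RuntimeError("no majority among versions: %r" % (version_outputs,))

# Three versions of the same computation; the third has a design fault.
print(masked_result([42, 42, 41]))  # 42 -- the faulty version is outvoted
```

With three versions, one fault is masked; uncorrelated faults in two versions can only be detected, not masked, which is why the independence of the design teams matters.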

  15. Drug error in paediatric anaesthesia: current status and where to go now.

    PubMed

    Anderson, Brian J

    2018-06-01

    Medication errors in paediatric anaesthesia and the perioperative setting continue to occur despite widespread recognition of the problem and published advice for reduction of this predicament at international, national, local and individual levels. Current literature was reviewed to ascertain drug error rates and to appraise causes and proposed solutions to reduce these errors. The medication error incidence remains high. There is documentation of reduction through identification of causes with consequent education and application of safety analytics and quality improvement programs in anaesthesia departments. Children remain at higher risk than adults because of additional complexities such as drug dose calculations, increased susceptibility to some adverse effects and changes associated with growth and maturation. Major improvements are best made through institutional system changes rather than a commitment to do better on the part of each practitioner. Medication errors in paediatric anaesthesia represent an important risk to children and most are avoidable. There is now an understanding of the genesis of adverse drug events and this understanding should facilitate the implementation of known effective countermeasures. An institution-wide commitment and strategy are the basis for a worthwhile and sustained improvement in medication safety.

  16. Symmetric and Asymmetric Patterns of Attraction Errors in Producing Subject-Predicate Agreement in Hebrew: An Issue of Morphological Structure

    ERIC Educational Resources Information Center

    Deutsch, Avital; Dank, Maya

    2011-01-01

    A common characteristic of subject-predicate agreement errors (usually termed attraction errors) in complex noun phrases is an asymmetrical pattern of error distribution, depending on the inflectional state of the nouns comprising the complex noun phrase. That is, attraction is most likely to occur when the head noun is the morphologically…

  17. Mimicking Aphasic Semantic Errors in Normal Speech Production: Evidence from a Novel Experimental Paradigm

    ERIC Educational Resources Information Center

    Hodgson, Catherine; Lambon Ralph, Matthew A.

    2008-01-01

    Semantic errors are commonly found in semantic dementia (SD) and some forms of stroke aphasia and provide insights into semantic processing and speech production. Low error rates are found in standard picture naming tasks in normal controls. In order to increase error rates and thus provide an experimental model of aphasic performance, this study…

  18. Refractive errors in Aminu Kano Teaching Hospital, Kano Nigeria.

    PubMed

    Lawan, Abdu; Eme, Okpo

    2011-12-01

    The aim of the study is to retrospectively determine the pattern of refractive errors seen in the eye clinic of Aminu Kano Teaching Hospital, Kano, Nigeria from January to December, 2008. The clinic refraction register was used to retrieve the case folders of all patients refracted during the review period. Information extracted includes the patient's age, sex, and type of refractive error. All patients had a basic eye examination (to rule out other causes of subnormal vision) including intraocular pressure measurement and streak retinoscopy at a two-thirds meter working distance. The final subjective refraction correction given to the patients was used to categorise the type of refractive error. Refractive errors were observed in 1584 patients and accounted for 26.9% of clinic attendance. There were more females than males (M:F = 1.0:1.2). The common types of refractive errors were presbyopia in 644 patients (40%), various types of astigmatism in 527 patients (33%), myopia in 216 patients (14%), hypermetropia in 171 patients (11%) and aphakia in 26 patients (2%). Refractive errors are common causes of presentation in the eye clinic. Identification and correction of refractive errors should be an integral part of eye care delivery.

  19. Teaching Common Errors in Applying a Procedure. IDD&E Working Paper No. 18.

    ERIC Educational Resources Information Center

    Garduno, Alberto O.; And Others

    The purpose of this study was to replicate the Bentti, Golden, and Reigeluth study (1983), which explored the use of nonexamples to teach common errors as an effective strategy in teaching a procedure. A total of 24 undergraduate students enrolled in the Syracuse University Symphonic Band were randomly assigned to an experimental group and a…

  20. The Effect of Error in Item Parameter Estimates on the Test Response Function Method of Linking.

    ERIC Educational Resources Information Center

    Kaskowitz, Gary S.; De Ayala, R. J.

    2001-01-01

    Studied the effect of item parameter estimation for computation of linking coefficients for the test response function (TRF) linking/equating method. Simulation results showed that linking was more accurate when there was less error in the parameter estimates, and that 15 or 25 common items provided better results than 5 common items under both…

  1. Children's Overtensing Errors: Phonological and Lexical Effects on Syntax

    ERIC Educational Resources Information Center

    Stemberger, Joseph Paul

    2007-01-01

    Overtensing (the use of an inflected form in place of a nonfinite form, e.g. *"didn't broke" for target "didn't break") is common in early syntax. In a CHILDES-based study of 36 children acquiring English, I examine the effects of phonological and lexical factors. For irregulars, errors are more common with verbs of low frequency and when…

  2. ISMP Medication Error Report Analysis: Understanding Human Over-reliance on Technology; It's Exelan, Not Exelon; Crash Cart Drug Mix-up; Risk with Entering a "Test Order".

    PubMed

    Cohen, Michael R; Smetzer, Judy L

    2017-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again-perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers' names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters' wishes as to the level of detail included in publications.

  3. Compounded Pain Creams and Adverse Effects; Postanesthesia Care Unit ADC Selection Error; Docetaxel Product Has Unusual Concentration; Tragic Vaccine Diluent Mix-ups.

    PubMed

    Cohen, Michael R; Smetzer, Judy L

    2015-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again-perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers' names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters' wishes as to the level of detail included in publications.

  4. Decimal Commas Are a Problem; Actiq Is Not for Sore Throats; Dosing Error with Tasigna; Repackaging of Imbruvica Is Approved

    PubMed Central

    Cohen, Michael R.; Smetzer, Judy L.

    2014-01-01

    These medication errors have occurred in health care facilities at least once. They will happen again—perhaps where you work. Through education and alertness of personnel and procedural safeguards, they can be avoided. You should consider publishing accounts of errors in your newsletters and/or presenting them at your inservice training programs. Your assistance is required to continue this feature. The reports described here were received through the Institute for Safe Medication Practices (ISMP) Medication Errors Reporting Program. Any reports published by ISMP will be anonymous. Comments are also invited; the writers’ names will be published if desired. ISMP may be contacted at the address shown below. Errors, close calls, or hazardous conditions may be reported directly to ISMP through the ISMP Web site (www.ismp.org), by calling 800-FAIL-SAFE, or via e-mail at ismpinfo@ismp.org. ISMP guarantees the confidentiality and security of the information received and respects reporters’ wishes as to the level of detail included in publications. PMID:25477591

  5. Negative Expertise: Comparing Differently Tenured Elder Care Nurses' Negative Knowledge

    ERIC Educational Resources Information Center

    Gartmeier, Martin; Lehtinen, Erno; Gruber, Hans; Heid, Helmut

    2011-01-01

    Negative expertise is conceptualised as the professional's ability to avoid errors during practice due to certain cognitive agencies. In this study, negative knowledge (i.e. knowledge about what is wrong in a certain context and situation) is conceptualised as one such agency. This study compares and investigates the negative knowledge of elder…

  6. An Evaluation of the Response Modulation Hypothesis in Relation to Attention-Deficit/ Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Farmer, Richard F.; Rucklidge, Julia J.

    2006-01-01

    Several hypotheses related to Newman's (e.g., Patterson & Newman, 1993) response modulation hypothesis were examined among adolescents with attention-deficit/hyperactivity disorder (ADHD; n = 18) and normal controls (n = 23). Consistent with predictions, youth with ADHD committed more passive avoidance errors (PAEs) than controls during the latter…

  7. Teaching Culture and Identifying Language Interference Errors through Films

    ERIC Educational Resources Information Center

    Argynbayev, Arman; Kabylbekova, Dana; Yaylaci, Yusuf

    2014-01-01

    This study reflects intermediate level learners' opinion about employing films in the EFL classroom for teaching culture and avoiding negative language transfer. A total of 63 participants, aged 21-23, took part in the experiment in the Faculty of Philology at Suleyman Demirel University in Almaty, Kazakhstan. During the experiment the subjects…

  8. Pollution, Health, and Avoidance Behavior: Evidence from the Ports of Los Angeles

    ERIC Educational Resources Information Center

    Moretti, Enrico; Neidell, Matthew

    2011-01-01

    A pervasive problem in estimating the costs of pollution is that optimizing individuals may compensate for increases in pollution by reducing their exposure, resulting in estimates that understate the full welfare costs. To account for this issue, measurement error, and environmental confounding, we estimate the health effects of ozone using daily…

  9. Headaches associated with refractive errors: myth or reality?

    PubMed

    Gil-Gouveia, R; Martins, I P

    2002-04-01

    Headache and refractive errors are very common conditions in the general population, and those with headache often attribute their pain to a visual problem. The International Headache Society (IHS) criteria for the classification of headache includes an entity of headache associated with refractive errors (HARE), but indicates that its importance is widely overestimated. To compare overall headache frequency and HARE frequency in healthy subjects with uncorrected or miscorrected refractive errors and a control group. We interviewed 105 individuals with uncorrected refractive errors and a control group of 71 subjects (with properly corrected or without refractive errors) regarding their headache history. We compared the occurrence of headache and its diagnosis in both groups and assessed its relation to their habits of visual effort and type of refractive errors. Headache frequency was similar in both subjects and controls. Headache associated with refractive errors was the only headache type significantly more common in subjects with refractive errors than in controls (6.7% versus 0%). It was associated with hyperopia and was unrelated to visual effort or to the severity of visual error. With adequate correction, 72.5% of the subjects with headache and refractive error reported improvement in their headaches, and 38% had complete remission of headache. Regardless of the type of headache present, headache frequency was significantly reduced in these subjects (t = 2.34, P =.02). Headache associated with refractive errors was rarely identified in individuals with refractive errors. In those with chronic headache, proper correction of refractive errors significantly improved headache complaints and did so primarily by decreasing the frequency of headache episodes.

  10. Resolving occlusion and segmentation errors in multiple video object tracking

    NASA Astrophysics Data System (ADS)

    Cheng, Hsu-Yung; Hwang, Jenq-Neng

    2009-02-01

    In this work, we propose a method to integrate the Kalman filter and adaptive particle sampling for multiple video object tracking. The proposed framework is able to detect occlusion and segmentation error cases and perform adaptive particle sampling for accurate measurement selection. Compared with traditional particle filter based tracking methods, the proposed method generates particles only when necessary. With the concept of adaptive particle sampling, we can avoid the degeneracy problem because the sampling position and range are dynamically determined by parameters that are updated by Kalman filters. There is no need to spend time on processing particles with very small weights. The adaptive appearance model for the occluded object uses the prediction results of the Kalman filters to determine the region that should be updated, avoiding the problem of updating the appearance with inadequate information under occlusion. The experimental results have shown that a small number of particles is sufficient to achieve high positioning and scaling accuracy. Also, the employment of adaptive appearance substantially improves the positioning and scaling accuracy of the tracking results.
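The role the Kalman filter plays in the scheme above — predicting where to sample particles and how wide a range to use — rests on the standard scalar predict/update cycle. A minimal sketch with hypothetical noise parameters q (process) and r (measurement), not tied to the paper's implementation:

```python
def kalman_step(x, p, z, q=0.01, r=1.0):
    """One predict/update cycle of a scalar Kalman filter.

    x, p: prior state estimate and its variance
    z:    new measurement
    q, r: process and measurement noise variances (assumed values here)
    Returns the posterior estimate and variance; in a tracker, x would
    set the particle sampling position and p its sampling range.
    """
    p = p + q                 # predict: uncertainty grows by process noise
    k = p / (p + r)           # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)       # update: move the estimate toward the measurement
    p = (1.0 - k) * p         # update: uncertainty shrinks after the measurement
    return x, p

x, p = 0.0, 1.0
for _ in range(50):           # feed a constant measurement of 10.0
    x, p = kalman_step(x, p, 10.0)
print(round(x, 2))            # estimate converges close to 10.0
```

As the variance p shrinks, the gain k falls and the filter stops chasing noisy measurements, which is exactly the behavior that makes it useful for bounding a particle sampling range.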

  11. Avoiding Errors in the Management of Pediatric Polytrauma Patients.

    PubMed

    Chin, Kenneth; Abzug, Joshua; Bae, Donald S; Horn, Bernard D; Herman, Martin; Eberson, Craig P

    2016-01-01

    Management of pediatric polytrauma patients is one of the most difficult challenges for orthopaedic surgeons. Multisystem injuries frequently include complex orthopaedic surgical problems that require intervention. The physiology and anatomy of children and adolescent trauma patients differ from the physiology and anatomy of an adult trauma patient, which alters the types of injuries sustained and the ideal methods for management. Errors of pediatric polytrauma care are included in two broad categories: missed injuries and inadequate fracture treatment. Diagnoses may be missed most frequently because of a surgeon's inability to reliably assess patients who have traumatic brain injuries and painful distracting injuries. Cervical spine injuries are particularly difficult to identify in a child with polytrauma and may have devastating consequences. In children who have multiple injuries, the stabilization of long bone fractures with pediatric fixation techniques, such as elastic nails and other implants, allows for easier care and more rapid mobilization compared with cast treatments. Adolescent polytrauma patients who are approaching skeletal maturity, however, are ideally treated as adults to avoid complications, such as loss of fixation, and to speed rehabilitation.

  12. Online Estimation of Allan Variance Coefficients Based on a Neural-Extended Kalman Filter

    PubMed Central

    Miao, Zhiyong; Shen, Feng; Xu, Dingjie; He, Kunpeng; Tian, Chunmiao

    2015-01-01

    As a noise analysis method for inertial sensors, the traditional Allan variance method requires the storage of a large amount of data and manual analysis of an Allan variance graph. Although existing online estimation methods avoid storing the data and the tedious procedure of drawing slope lines for estimation, they require complex transformations and can even introduce errors during the modeling of the dynamic Allan variance. To solve these problems, a new nonlinear state-space model that directly models the stochastic errors of inertial sensors was first established. Then, a neural-extended Kalman filter algorithm was used to estimate the Allan variance coefficients. The real noises of an ADIS16405 IMU and fiber-optic gyro sensors were analyzed by the proposed method and the traditional methods. The experimental results show that the proposed method is more suitable than the traditional methods for estimating the Allan variance coefficients. Moreover, the proposed method effectively avoids the storage of data and can easily be implemented on an online processor. PMID:25625903
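    For reference, the traditional batch computation that the abstract contrasts against is straightforward: average the rate data in clusters of m samples and take half the mean-squared difference of successive cluster means. A minimal non-overlapping sketch (the function name is ours):

    ```python
    def allan_variance(rate, m):
        """Non-overlapping Allan variance of a rate sequence for a
        cluster size of m samples (averaging time tau = m * sample_period)."""
        n = len(rate) // m               # number of complete clusters
        means = [sum(rate[i * m:(i + 1) * m]) / m for i in range(n)]
        diffs = [means[i + 1] - means[i] for i in range(n - 1)]
        return sum(d * d for d in diffs) / (2 * (n - 1))
    ```

    Repeating this for every cluster size m is exactly what forces the batch method to store the whole record, which is the cost the online neural-extended Kalman filter approach avoids.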

  13. A case of the birth and death of a high reliability healthcare organisation.

    PubMed

    Roberts, K H; Madsen, P; Desai, V; Van Stralen, D

    2005-06-01

    High reliability organisations (HROs) are those in which errors rarely occur. To accomplish this, they conduct relatively error-free operations over long periods of time and make consistently good decisions, resulting in high quality and reliability. Some organisational processes that characterise HROs are process auditing, implementing appropriate reward systems, avoiding quality degradation, appropriately perceiving that risk exists and developing strategies to deal with it, and command and control. Command and control processes include migrating decision making, redundancy in people or hardware, developing situational awareness, formal rules and procedures, and training. These processes must be tailored to the specific organisation implementing them. They were applied to a paediatric intensive care unit (PICU) where care was derived from a problem-solving methodology rather than protocol. After a leadership change, the unit returned to the hierarchical medical model of care, and important outcome variables, such as infant mortality, patient returns to the PICU after discharge, days spent in the PICU, and air transports, degraded. Implications for clinical practice include providing caregivers with sufficient flexibility to meet changing situations, encouraging teamwork, and avoiding shaming, naming, and blaming.

  14. A manifesto for cardiovascular imaging: addressing the human factor

    PubMed Central

    Fraser, Alan G

    2017-01-01

    Our use of modern cardiovascular imaging tools has not kept pace with their technological development. Diagnostic errors are common but seldom investigated systematically. Rather than more impressive pictures, our main goal should be more precise tests of function which we select because their appropriate use has therapeutic implications which in turn have a beneficial impact on morbidity or mortality. We should practise analytical thinking, use checklists to avoid diagnostic pitfalls, and apply strategies that will reduce biases and avoid overdiagnosis. We should develop normative databases, so that we can apply diagnostic algorithms that take account of variations with age and risk factors and that allow us to calculate pre-test probability and report the post-test probability of disease. We should report the imprecision of a test, or its confidence limits, so that reference change values can be considered in daily clinical practice. We should develop decision support tools to improve the quality and interpretation of diagnostic imaging, so that we choose the single best test irrespective of modality. New imaging tools should be evaluated rigorously, so that their diagnostic performance is established before they are widely disseminated; this should be a shared responsibility of manufacturers with clinicians, leading to cost-effective implementation. Trials should evaluate diagnostic strategies against independent reference criteria. We should exploit advances in machine learning to analyse digital data sets and identify those features that best predict prognosis or responses to treatment. Addressing these human factors will reap benefit for patients, while technological advances continue unpredictably. PMID:29029029
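    The pre-test/post-test probability calculation the manifesto calls for is standard Bayes' theorem in odds form, driven by a test's likelihood ratios. A minimal sketch (the function name and example figures are ours, not from the article):

    ```python
    def post_test_probability(pre_test_prob, sensitivity, specificity,
                              positive=True):
        """Convert a pre-test probability of disease into a post-test
        probability using likelihood ratios (Bayes in odds form)."""
        # LR+ = sens / (1 - spec); LR- = (1 - sens) / spec
        lr = (sensitivity / (1.0 - specificity) if positive
              else (1.0 - sensitivity) / specificity)
        pre_odds = pre_test_prob / (1.0 - pre_test_prob)
        post_odds = pre_odds * lr
        return post_odds / (1.0 + post_odds)
    ```

    For instance, a hypothetical test with 90% sensitivity and 80% specificity raises a 20% pre-test probability to roughly 53% when positive, which is the kind of figure the manifesto argues should appear in the report rather than the image alone.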

  15. Economic and health risk trade-offs of swim closures at a Lake Michigan beach

    USGS Publications Warehouse

    Rabinovici, Sharyl M.; Bernknopf, Richard L.; Wein, Anne M.; Coursey, Don L.; Whitman, Richard L.

    2004-01-01

    This paper presents a framework for analyzing the economic, health, and recreation implications of swim closures related to high fecal indicator bacteria (FIB) levels. The framework utilizes benefit transfer policy analysis to provide a practical procedure for estimating the effectiveness of recreational water quality policies. Evaluation criteria include the rates of intended and unintended management outcomes, whether the chosen protocols generate closures with positive net economic benefits to swimmers, and the number of predicted illnesses the policy is able to prevent. We demonstrate the framework through a case study of a Lake Michigan freshwater beach using existing water quality and visitor data from 1998 to 2001. We find that a typical closure causes a net economic loss among would-be swimmers totaling $1274 to $37,030 per day, depending on the value assumptions used. Unnecessary closures, caused by high indicator variability and a 24-h delay between when samples are taken and when the management decision can be made, occurred on 14 (12%) of 118 monitored summer days. Days with high FIB levels when the swim area is open are also common but do relatively little economic harm in comparison. Moreover, even if the closure policy could be implemented daily and perfectly without error, only about 42% of predicted illnesses would be avoided. These conclusions are sensitive to the relative values and risk preferences that swimmers hold for recreation access and for avoiding health effects, suggesting a need for further study of the impacts of recreational water quality policies on individuals.
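    The "positive net economic benefits" criterion reduces to a simple comparison: health costs avoided by the closure versus recreation value lost by would-be swimmers. A toy sketch of that trade-off (the function and all numbers are illustrative assumptions, not the study's values):

    ```python
    def closure_net_benefit(swimmers, value_per_visit,
                            illnesses_prevented, cost_per_illness):
        """Net economic benefit of a one-day swim closure: illness costs
        avoided minus the recreation value lost by would-be swimmers.
        A closure passes the criterion only if this is positive."""
        return illnesses_prevented * cost_per_illness - swimmers * value_per_visit
    ```

    An unnecessary closure is the worst case under this rule: zero illnesses are actually prevented, so the net benefit is simply the negative of the swimmers' lost recreation value.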

  16. Teamwork, Communication, Formula-One Racing and the Outcomes of Cardiac Surgery

    PubMed Central

    Merry, Alan F.; Weller, Jennifer; Mitchell, Simon J.

    2014-01-01

    Most cardiac units achieve excellent results today, but the risk of cardiac surgery is still relatively high, and avoidable harm is common. The story of the Green Lane Cardiothoracic Unit provides an exemplar of excellence, but also illustrates the challenges associated with changes over time and with increases in the size of a unit and the complexity of practice today. The ultimate aim of cardiac surgery should be the best outcomes for (often very sick) patients rather than an undue focus on the prevention of error or adverse events. Measurement is fundamental to improving quality in health care, and the framework of structure, process, and outcome is helpful in considering how best to achieve this. A combination of outcomes (including some indicators of important morbidity) with key measures of process is advocated. There is substantial evidence that failures in teamwork and communication contribute to inefficiency and avoidable harm in cardiac surgery. Minor events are as important as major ones. Six approaches to improving teamwork (and hence outcomes) in cardiac surgery are suggested. These are: 1) subspecialize and replace tribes with teams; 2) sort out the leadership while flattening the gradients of authority; 3) introduce explicit training in effective communication; 4) use checklists, briefings, and debriefings and engage in the process; 5) promote a culture of respect alongside a commitment to excellence and a focus on patients; 6) focus on the performance of the team, not on individuals. PMID:24779113

  17. Teamwork, communication, formula-one racing and the outcomes of cardiac surgery.

    PubMed

    Merry, Alan F; Weller, Jennifer; Mitchell, Simon J

    2014-03-01

    Most cardiac units achieve excellent results today, but the risk of cardiac surgery is still relatively high, and avoidable harm is common. The story of the Green Lane Cardiothoracic Unit provides an exemplar of excellence, but also illustrates the challenges associated with changes over time and with increases in the size of a unit and the complexity of practice today. The ultimate aim of cardiac surgery should be the best outcomes for (often very sick) patients rather than an undue focus on the prevention of error or adverse events. Measurement is fundamental to improving quality in health care, and the framework of structure, process, and outcome is helpful in considering how best to achieve this. A combination of outcomes (including some indicators of important morbidity) with key measures of process is advocated. There is substantial evidence that failures in teamwork and communication contribute to inefficiency and avoidable harm in cardiac surgery. Minor events are as important as major ones. Six approaches to improving teamwork (and hence outcomes) in cardiac surgery are suggested. These are: 1) subspecialize and replace tribes with teams; 2) sort out the leadership while flattening the gradients of authority; 3) introduce explicit training in effective communication; 4) use checklists, briefings, and debriefings and engage in the process; 5) promote a culture of respect alongside a commitment to excellence and a focus on patients; 6) focus on the performance of the team, not on individuals.

  18. Exhaustive Thresholds and Resistance Checkpoints

    NASA Technical Reports Server (NTRS)

    Easton, Charles; Khuzadi, Mbuyi

    2008-01-01

    Once deployed, all intricate systems that operate for a long time (such as an airplane or a chemical processing plant) experience degraded performance during their operational lifetime. Degradation can result from losses of integrity in subsystems and parts that generally do not materially impact the operation of the vehicle (e.g., the light behind the button that opens the sliding door of a minivan), or it can result from the loss of more critical parts or subsystems. Such losses need to be handled quickly in order to avoid loss of personnel, loss of the mission, or loss of part of the system itself. To manage a degraded system, knowledge of its potential problem areas and the means by which these problems are detected should be developed during the initial development of the system. Once determined, a web of sensors is employed, and their outputs are monitored along with other system parameters while the system is in preparation or operation. Gathering the data, however, is only part of the story: the interpretation of the data and the response of the system must be carefully developed as well to avoid a mishap. Typically, systems use a test-threshold-response paradigm to process potential system faults. However, such processing subsystems can suffer from errors and oversights of a consistent type, causing aberrant system behavior instead of the expected system and recovery operations. In our study, we developed a complete checklist for determining the completeness of a fault system and its robustness to common processing and response difficulties.
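    The test-threshold-response paradigm described above can be sketched for a single sensor channel. This is an illustrative toy, not NASA's implementation; the class name, the persistence count, and the response string are our assumptions. The persistence counter is one common guard against exactly the kind of consistent oversight the abstract warns about: declaring a fault on a single noisy sample.

    ```python
    class FaultMonitor:
        """Test-threshold-response check for one sensor channel.  A fault
        is declared only after the threshold has been exceeded for
        `persistence` consecutive samples, filtering one-sample glitches."""

        def __init__(self, threshold, persistence=3, response="safe_mode"):
            self.threshold = threshold
            self.persistence = persistence
            self.response = response
            self.count = 0               # consecutive out-of-limits samples

        def sample(self, value):
            if value > self.threshold:   # the "test" against the threshold
                self.count += 1
            else:
                self.count = 0           # reset on any in-limits reading
            if self.count >= self.persistence:
                return self.response     # trigger the configured response
            return None                  # no fault declared
    ```

    A completeness checklist of the kind the study proposes would then ask, for every such monitor: is the threshold reachable, is the persistence count appropriate for the sensor's noise, and does the response actually lead to recovery rather than aberrant behavior?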

  19. Finite-time sliding surface constrained control for a robot manipulator with an unknown deadzone and disturbance.

    PubMed

    Ik Han, Seong; Lee, Jangmyung

    2016-11-01

    This paper presents finite-time sliding mode control (FSMC) with predefined constraints for the tracking error and sliding surface in order to obtain robust positioning of a robot manipulator with input nonlinearity due to an unknown deadzone and external disturbance. An assumed model feedforward FSMC was designed to avoid tedious identification procedures for the manipulator parameters and to obtain a fast response time. Two constraint switching control functions based on the tracking error and finite-time sliding surface were added to the FSMC to guarantee the predefined tracking performance despite the presence of an unknown deadzone and disturbance. The tracking error due to the deadzone and disturbance can be suppressed within the predefined error boundary simply by tuning the gain value of the constraint switching function and without the addition of an extra compensator. Therefore, the designed constraint controller has a simpler structure than conventional transformed error constraint methods and the sliding surface constraint scheme can also indirectly guarantee the tracking error constraint while being more stable than the tracking error constraint control. A simulation and experiment were performed on an articulated robot manipulator to validate the proposed control schemes. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
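    The flavor of a boundary-layer sliding-mode controller rejecting a bounded disturbance can be shown on a toy one-degree-of-freedom (double-integrator) plant. This is not the paper's FSMC for a manipulator with a deadzone; the surface s = xd + lam*x, the gains, and the disturbance value below are illustrative assumptions, with the standard requirement that the switching gain k exceed the disturbance bound.

    ```python
    def simulate_smc(t_end=5.0, dt=0.001, lam=2.0, k=5.0, phi=0.05, d=1.0):
        """Track x_d = 0 for a double integrator xdd = u + d using the
        boundary-layer sliding-mode law u = -lam*xd - k*sat(s/phi),
        where the sliding surface is s = xd + lam*x.  Returns the final
        position error."""
        x, xd = 1.0, 0.0                 # initial position error of 1.0
        for _ in range(int(t_end / dt)):
            s = xd + lam * x
            sat = max(-1.0, min(1.0, s / phi))   # smooth sign() in the layer
            u = -lam * xd - k * sat
            xdd = u + d                  # bounded disturbance, |d| < k
            xd += xdd * dt               # simple Euler integration
            x += xd * dt
        return x
    ```

    Inside the boundary layer the surface settles near s = d*phi/k, so the steady-state error is about s/lam; shrinking phi tightens the error bound at the cost of harder switching, which is the same trade-off the constraint switching functions in the paper are designed to manage explicitly.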

  20. Preemption versus Entrenchment: Towards a Construction-General Solution to the Problem of the Retreat from Verb Argument Structure Overgeneralization

    PubMed Central

    Ambridge, Ben; Bidgood, Amy; Twomey, Katherine E.; Pine, Julian M.; Rowland, Caroline F.; Freudenthal, Daniel

    2015-01-01

    Participants aged 5;2-6;8, 9;2-10;6 and 18;1-22;2 (72 at each age) rated verb argument structure overgeneralization errors (e.g., *Daddy giggled the baby) using a five-point scale. The study was designed to investigate the feasibility of two proposed construction-general solutions to the question of how children retreat from, or avoid, such errors. No support was found for the prediction of the preemption hypothesis that the greater the frequency of the verb in the single most nearly synonymous construction (for this example, the periphrastic causative; e.g., Daddy made the baby giggle), the lower the acceptability of the error. Support was found, however, for the prediction of the entrenchment hypothesis that the greater the overall frequency of the verb, regardless of construction, the lower the acceptability of the error, at least for the two older groups. Thus while entrenchment appears to be a robust solution to the problem of the retreat from error, and one that generalizes across different error types, we did not find evidence that this is the case for preemption. The implication is that the solution to the retreat from error lies not with specialized mechanisms, but rather in a probabilistic process of construction competition. PMID:25919003
