Sample records for reducing diagnostic errors

  1. [Diagnostic Errors in Medicine].

    PubMed

    Buser, Claudia; Bankova, Andriyana

    2015-12-09

    The recognition of diagnostic errors in everyday practice can help improve patient safety. The most common diagnostic errors are cognitive errors, followed by system-related errors and no-fault errors. Cognitive errors often result from mental shortcuts, known as heuristics. The rate of cognitive errors can be reduced by a better understanding of heuristics and the use of checklists. The autopsy, as a retrospective quality assessment of clinical diagnosis, has a crucial role in learning from diagnostic errors. Diagnostic errors occur more often in primary care than in hospital settings; inpatient errors, however, tend to be more severe than outpatient errors.

  2. Commentary: Reducing diagnostic errors: another role for checklists?

    PubMed

    Winters, Bradford D; Aswani, Monica S; Pronovost, Peter J

    2011-03-01

    Diagnostic errors are a widespread problem, although the true magnitude is unknown because they cannot currently be measured validly. These errors have received relatively little attention despite alarming estimates of associated harm and death. One promising intervention to reduce preventable harm is the checklist. This intervention has proven successful in aviation, in which situations are linear and deterministic (one alarm goes off and a checklist guides the flight crew to evaluate the cause). In health care, problems are multifactorial and complex. A checklist has been used to reduce central-line-associated bloodstream infections in intensive care units. Nevertheless, this checklist was incorporated in a culture-based safety program that engaged and changed behaviors and used robust measurement of infections to evaluate progress. In this issue, Ely and colleagues describe how three checklists could reduce the cognitive biases and mental shortcuts that underlie diagnostic errors, but point out that these tools still need to be tested. To be effective, they must reduce diagnostic errors (efficacy) and be routinely used in practice (effectiveness). Such tools must intuitively support how the human brain works, and under time pressures, clinicians rarely think in conditional probabilities when making decisions. To move forward, it is necessary to accurately measure diagnostic errors (which could come from mapping out the diagnostic process as the medication process has done and measuring errors at each step) and pilot test interventions such as these checklists to determine whether they work.

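    The commentary's point about conditional probabilities can be made concrete with a small worked example of Bayesian updating. The numbers below are invented for illustration and are not taken from the article above:

    ```latex
    % Bayesian updating in odds form: post-test odds = pre-test odds * likelihood ratio (LR).
    % Illustrative values only: pre-test probability 10%, positive likelihood ratio LR+ = 8.
    \[
    \text{pre-test odds} = \frac{0.10}{1-0.10} \approx 0.11, \qquad
    \text{post-test odds} = 0.11 \times 8 \approx 0.89,
    \]
    \[
    \text{post-test probability} = \frac{0.89}{1+0.89} \approx 0.47 .
    \]
    ```

    Even a fairly strong positive result lifts a 10% pre-test probability only to roughly 47%, which is the kind of calculation the commentary suggests clinicians rarely carry out under time pressure.
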
  3. Diagnostic Errors in Ambulatory Care: Dimensions and Preventive Strategies

    ERIC Educational Resources Information Center

    Singh, Hardeep; Weingart, Saul N.

    2009-01-01

    Despite an increasing focus on patient safety in ambulatory care, progress in understanding and reducing diagnostic errors in this setting lags behind many other safety concerns such as medication errors. To explore the extent and nature of diagnostic errors in ambulatory care, we identified five dimensions of ambulatory care from which errors may…

  4. The global burden of diagnostic errors in primary care

    PubMed Central

    Singh, Hardeep; Schiff, Gordon D; Graber, Mark L; Onakpoya, Igho; Thompson, Matthew J

    2017-01-01

    Diagnosis is one of the most important tasks performed by primary care physicians. The World Health Organization (WHO) recently prioritized patient safety areas in primary care, and included diagnostic errors as a high-priority problem. In addition, a recent report from the Institute of Medicine in the USA, ‘Improving Diagnosis in Health Care’, concluded that most people will likely experience a diagnostic error in their lifetime. In this narrative review, we discuss the global significance, burden and contributory factors related to diagnostic errors in primary care. We synthesize available literature to discuss the types of presenting symptoms and conditions most commonly affected. We then summarize interventions based on available data and suggest next steps to reduce the global burden of diagnostic errors. Research suggests that we are unlikely to find a ‘magic bullet’ and confirms the need for a multifaceted approach to understand and address the many systems and cognitive issues involved in diagnostic error. Because errors involve many common conditions and are prevalent across all countries, the WHO’s leadership at a global level will be instrumental to address the problem. Based on our review, we recommend that the WHO consider bringing together primary care leaders, practicing frontline clinicians, safety experts, policymakers, the health IT community, medical education and accreditation organizations, researchers from multiple disciplines, patient advocates, and funding bodies among others, to address the many common challenges and opportunities to reduce diagnostic error. This could lead to prioritization of practice changes needed to improve primary care as well as setting research priorities for intervention development to reduce diagnostic error. PMID:27530239

  5. System-Related Interventions to Reduce Diagnostic Error: A Narrative Review

    PubMed Central

    Singh, Hardeep; Graber, Mark L.; Kissam, Stephanie M.; Sorensen, Asta V.; Lenfestey, Nancy F.; Tant, Elizabeth M.; Henriksen, Kerm; LaBresh, Kenneth A.

    2013-01-01

    Background: Diagnostic errors (missed, delayed, or wrong diagnosis) have gained recent attention and are associated with significant preventable morbidity and mortality. We reviewed the recent literature to identify interventions that have been, or could be, implemented to address systems-related factors that contribute directly to diagnostic error. Methods: We conducted a comprehensive search using multiple search strategies. We first identified candidate articles in English published between 2000 and 2009 from a PubMed search restricted to articles related to diagnostic error or delay. We then sought additional papers from references in the initial dataset, searches of additional databases, and subject matter experts. Articles were included if they formally evaluated an intervention to prevent or reduce diagnostic error; however, we also included papers in which interventions were suggested but not tested, in order to inform the state of the science on the topic. We categorized interventions according to the step in the diagnostic process they targeted: the patient-provider encounter; performance and interpretation of diagnostic tests; follow-up and tracking of diagnostic information; subspecialty- and referral-related steps; and patient-specific factors. Results: We identified 43 articles for full review, of which 6 reported tested interventions and 37 contained suggestions for possible interventions. Empirical studies, though somewhat positive, were non-experimental or quasi-experimental and included a small number of clinicians or health care sites. Outcome measures in general were underdeveloped and varied markedly between studies, depending on the setting or step in the diagnostic process involved. Conclusions: Despite a number of suggested interventions in the literature, few empirical studies have tested interventions to reduce diagnostic error in the last decade. Advancing the science of diagnostic error prevention will require more robust study designs and rigorous definitions of diagnostic processes and outcomes to measure intervention effects. PMID:22129930

  6. The global burden of diagnostic errors in primary care.

    PubMed

    Singh, Hardeep; Schiff, Gordon D; Graber, Mark L; Onakpoya, Igho; Thompson, Matthew J

    2017-06-01

    Diagnosis is one of the most important tasks performed by primary care physicians. The World Health Organization (WHO) recently prioritized patient safety areas in primary care, and included diagnostic errors as a high-priority problem. In addition, a recent report from the Institute of Medicine in the USA, 'Improving Diagnosis in Health Care', concluded that most people will likely experience a diagnostic error in their lifetime. In this narrative review, we discuss the global significance, burden and contributory factors related to diagnostic errors in primary care. We synthesize available literature to discuss the types of presenting symptoms and conditions most commonly affected. We then summarize interventions based on available data and suggest next steps to reduce the global burden of diagnostic errors. Research suggests that we are unlikely to find a 'magic bullet' and confirms the need for a multifaceted approach to understand and address the many systems and cognitive issues involved in diagnostic error. Because errors involve many common conditions and are prevalent across all countries, the WHO's leadership at a global level will be instrumental to address the problem. Based on our review, we recommend that the WHO consider bringing together primary care leaders, practicing frontline clinicians, safety experts, policymakers, the health IT community, medical education and accreditation organizations, researchers from multiple disciplines, patient advocates, and funding bodies among others, to address the many common challenges and opportunities to reduce diagnostic error. This could lead to prioritization of practice changes needed to improve primary care as well as setting research priorities for intervention development to reduce diagnostic error.

  7. Reducing Diagnostic Errors through Effective Communication: Harnessing the Power of Information Technology

    PubMed Central

    Naik, Aanand Dinkar; Rao, Raghuram; Petersen, Laura Ann

    2008-01-01

    Diagnostic errors are poorly understood despite being a frequent cause of medical errors. Recent efforts have aimed to advance the "basic science" of diagnostic error prevention by tracing errors to their most basic origins. Although a refined theory of diagnostic error prevention will take years to formulate, we focus on communication breakdown, a major contributor to diagnostic errors and an increasingly recognized preventable factor in medical mishaps. We describe a comprehensive framework that integrates the potential sources of communication breakdowns within the diagnostic process and identifies vulnerable steps in the diagnostic process where various types of communication breakdowns can precipitate error. We then discuss potential information technology-based interventions that may have efficacy in preventing one or more forms of these breakdowns. These possible intervention strategies include using new technologies to enhance communication between health providers and health systems, improve patient involvement, and facilitate management of information in the medical record. PMID:18373151

  8. The next organizational challenge: finding and addressing diagnostic error.

    PubMed

    Graber, Mark L; Trowbridge, Robert; Myers, Jennifer S; Umscheid, Craig A; Strull, William; Kanter, Michael H

    2014-03-01

    Although health care organizations (HCOs) are intensely focused on improving the safety of health care, efforts to date have almost exclusively targeted treatment-related issues. The literature confirms that the approaches HCOs use to identify adverse medical events are not effective in finding diagnostic errors, so the initial challenge is to identify cases of diagnostic error. WHY HEALTH CARE ORGANIZATIONS NEED TO GET INVOLVED: HCOs are preoccupied with many quality- and safety-related operational and clinical issues, including performance measures. The case for paying attention to diagnostic errors, however, is based on the following four points: (1) diagnostic errors are common and harmful, (2) high-quality health care requires high-quality diagnosis, (3) diagnostic errors are costly, and (4) HCOs are well positioned to lead the way in reducing diagnostic error. FINDING DIAGNOSTIC ERRORS: Current approaches to identifying diagnostic errors, such as occurrence screens, incident reports, autopsy, and peer review, were not designed to detect diagnostic issues (or problems of omission in general) and/or rely on voluntary reporting. The realization that the existing tools are inadequate has spurred efforts to identify novel tools that could be used to discover diagnostic errors or breakdowns in the diagnostic process that are associated with errors. Two new approaches are described in case studies: Maine Medical Center's case-finding of diagnostic errors by facilitating direct reports from physicians, and Kaiser Permanente's electronic health record-based reports that detect process breakdowns in the follow-up of abnormal findings. By raising awareness and implementing targeted programs that address diagnostic error, HCOs may begin to play an important role in addressing the problem of diagnostic error.

  9. EPs welcome new focus on reducing diagnostic errors.

    PubMed

    2015-12-01

    Emergency medicine leaders welcome a major new report from the Institute of Medicine (IOM) calling on providers, policy makers, and government agencies to institute changes to reduce the incidence of diagnostic errors. The 369-page report, "Improving Diagnosis in Health Care," states that the rate of diagnostic errors in this country is unacceptably high and offers a long list of recommendations aimed at addressing the problem. These include large, systemic changes that involve improvements in multiple areas, including health information technology (HIT), professional education, teamwork, and payment reform. Further, of particular interest to emergency physicians are recommended changes to the liability system. The authors of the IOM report state that while most people will likely experience a significant diagnostic error in their lifetime, the importance of this problem is under-appreciated. According to conservative estimates, the report says 5% of adults who seek outpatient care each year experience a diagnostic error. The report also notes that research over many decades shows diagnostic errors contribute to roughly 10% of all deaths. The report says more steps need to be taken to facilitate inter-professional and intra-professional teamwork throughout the diagnostic process. Experts concur with the report's finding that mechanisms need to be developed so that providers receive ongoing feedback on their diagnostic performance.

  10. Cognitive aspect of diagnostic errors.

    PubMed

    Phua, Dong Haur; Tan, Nigel C K

    2013-01-01

    Diagnostic errors can result in tangible harm to patients. Despite our advances in medicine, the mental processes required to make a diagnosis exhibit shortcomings, causing diagnostic errors. Cognitive factors are found to be an important cause of diagnostic errors. With new understanding from psychology and social sciences, clinical medicine is now beginning to appreciate that our clinical reasoning can take the form of analytical reasoning or heuristics. Different factors like cognitive biases and affective influences can also impel unwary clinicians to make diagnostic errors. Various strategies have been proposed to reduce the effect of cognitive biases and affective influences when clinicians make diagnoses; however, evidence for the efficacy of these methods is still sparse. This paper aims to introduce the reader to the cognitive aspect of diagnostic errors, in the hope that clinicians can use this knowledge to improve diagnostic accuracy and patient outcomes.

  11. Reducing diagnostic errors in medicine: what's the goal?

    PubMed

    Graber, Mark; Gordon, Ruthanna; Franklin, Nancy

    2002-10-01

    This review considers the feasibility of reducing or eliminating the three major categories of diagnostic errors in medicine: "No-fault errors" occur when the disease is silent, presents atypically, or mimics something more common. These errors will inevitably decline as medical science advances, new syndromes are identified, and diseases can be detected more accurately or at earlier stages. These errors can never be eradicated, unfortunately, because new diseases emerge, tests are never perfect, patients are sometimes noncompliant, and physicians will inevitably, at times, choose the most likely diagnosis over the correct one, illustrating the concept of necessary fallibility and the probabilistic nature of choosing a diagnosis. "System errors" play a role when diagnosis is delayed or missed because of latent imperfections in the health care system. These errors can be reduced by system improvements, but can never be eliminated because these improvements lag behind and degrade over time, and each new fix creates the opportunity for novel errors. Tradeoffs also guarantee that system errors will persist when resources are merely shifted. "Cognitive errors" reflect misdiagnosis from faulty data collection or interpretation, flawed reasoning, or incomplete knowledge. The limitations of human processing and the inherent biases in using heuristics guarantee that these errors will persist. Opportunities exist, however, for improving the cognitive aspect of diagnosis by adopting system-level changes (e.g., second opinions, decision-support systems, enhanced access to specialists) and by training designed to improve cognition or cognitive awareness. Diagnostic error can be substantially reduced, but never eradicated.

  12. [The factors affecting the results of obstructive jaundice management].

    PubMed

    Malkov, I S; Shaimardanov, R Sh; Korobkov, V N; Filippov, V A; Khisamiev, I G

    The aim was to improve the results of obstructive jaundice management through rational diagnostic and treatment strategies. Outcomes of 820 patients with obstructive jaundice syndrome were analyzed. Diagnostic and tactical mistakes were made at the pre-hospital stage in 143 (17.4%) patients and at the hospital stage in 105 (12.8%); in 53 (6.5%) cases, errors were observed at both stages. Retrospective analysis of severe postoperative complications and lethal outcomes in patients with obstructive jaundice showed that in 23.8% of cases they were explained by diagnostic and tactical mistakes at various stages of examination and treatment. We developed an algorithm for obstructive jaundice management that reduced the frequency of diagnostic and tactical errors, lowering the rate of postoperative complications to 16.5% and the mortality rate to 3.0%.

  13. Advancing the research agenda for diagnostic error reduction.

    PubMed

    Zwaan, Laura; Schiff, Gordon D; Singh, Hardeep

    2013-10-01

    Diagnostic errors remain an underemphasised and understudied area of patient safety research. We briefly summarise the methods that have been used to conduct research on epidemiology, contributing factors and interventions related to diagnostic error and outline directions for future research. Research methods that have studied the epidemiology of diagnostic error provide some estimates of diagnostic error rates. However, there appears to be a large variability in the reported rates due to the heterogeneity of definitions and study methods used. Thus, future methods should focus on obtaining more precise estimates in different settings of care. This would lay the foundation for measuring error rates over time to evaluate improvements. Research methods have studied contributing factors for diagnostic error in both naturalistic and experimental settings. Both approaches have revealed important and complementary information. Newer conceptual models from outside healthcare are needed to advance the depth and rigour of analysis of the systems-related and cognitive causes of error. While the literature has suggested many potentially fruitful interventions for reducing diagnostic errors, most have not been systematically evaluated and/or widely implemented in practice. Research is needed to study promising intervention areas such as enhanced patient involvement in diagnosis, improving diagnosis through the use of electronic tools and identification and reduction of specific diagnostic process 'pitfalls' (eg, failure to conduct appropriate diagnostic evaluation of a breast lump after a 'normal' mammogram). The last decade of research on diagnostic error has made promising steps and laid a foundation for more rigorous methods to advance the field.

  14. Missed opportunities for diagnosis: lessons learned from diagnostic errors in primary care.

    PubMed

    Goyder, Clare R; Jones, Caroline H D; Heneghan, Carl J; Thompson, Matthew J

    2015-12-01

    Because of the difficulties inherent in diagnosis in primary care, it is inevitable that diagnostic errors will occur. However, despite the important consequences associated with diagnostic errors and their estimated high prevalence, teaching and research on diagnostic error is a neglected area. The aim was to ascertain the key learning points from GPs' experiences of diagnostic errors and the approaches to clinical decision making associated with these. A secondary analysis of 36 qualitative interviews with GPs in Oxfordshire, UK, was performed, combining two datasets of semi-structured interviews. Questions focused on GPs' experiences of diagnosis and diagnostic errors (or near misses) in routine primary care and out of hours. Interviews were audio-recorded, transcribed verbatim, and analysed thematically. Learning points include GPs' reliance on 'pattern recognition' and the failure of this strategy to identify atypical presentations; the importance of considering all potentially serious conditions using a 'restricted rule out' approach; and identifying and acting on a sense of unease. Strategies to help manage uncertainty in primary care were also discussed. Learning from previous examples of diagnostic errors is essential if these events are to be reduced in the future, and this should be incorporated into GP training. At a practice level, learning points from experiences of diagnostic errors should be discussed more frequently, and more should be done to integrate these lessons nationally to understand and characterise diagnostic errors.

  15. Diagnosis is a team sport - partnering with allied health professionals to reduce diagnostic errors.

    PubMed

    Thomas, Dana B; Newman-Toker, David E

    2016-06-01

    Diagnostic errors are the most common, most costly, and most catastrophic of medical errors. Interdisciplinary teamwork has been shown to reduce harm from therapeutic errors, but sociocultural barriers may impact the engagement of allied health professionals (AHPs) in the diagnostic process. We present a qualitative case study of the experience at a single institution around involvement of an AHP in the diagnostic process for acute dizziness and vertigo. We detail five diagnostic error cases in which the input of a physical therapist was central to correct diagnosis. We further describe evolution of the sociocultural milieu at the institution as it relates to AHP engagement in diagnosis. Five patients with acute vestibular symptoms were initially misdiagnosed by physicians and then correctly diagnosed based on input from a vestibular physical therapist. These included missed labyrinthine concussion and post-traumatic benign paroxysmal positional vertigo (BPPV); BPPV called gastroenteritis; BPPV called stroke; stroke called BPPV; and multiple sclerosis called BPPV. As a consequence of surfacing these diagnostic errors, initial resistance to physical therapy input to aid medical diagnosis has gradually declined, creating a more collaborative environment for 'team diagnosis' of patients with dizziness and vertigo at the institution. Barriers to AHP engagement in 'team diagnosis' include sociocultural norms that establish medical diagnosis as something reserved only for physicians. Drawing attention to the valuable diagnostic contributions of AHPs may help facilitate cultural change. Future studies should seek to measure diagnostic safety culture and then implement proven strategies to break down sociocultural barriers that inhibit effective teamwork and transdisciplinary diagnosis.

  16. Diagnosis is a team sport - partnering with allied health professionals to reduce diagnostic errors: A case study on the role of a vestibular therapist in diagnosing dizziness.

    PubMed

    Thomas, Dana B; Newman-Toker, David E

    2016-06-01

    Diagnostic errors are the most common, most costly, and most catastrophic of medical errors. Interdisciplinary teamwork has been shown to reduce harm from therapeutic errors, but sociocultural barriers may impact the engagement of allied health professionals (AHPs) in the diagnostic process. We present a qualitative case study of the experience at a single institution around involvement of an AHP in the diagnostic process for acute dizziness and vertigo. We detail five diagnostic error cases in which the input of a physical therapist was central to correct diagnosis. We further describe evolution of the sociocultural milieu at the institution as it relates to AHP engagement in diagnosis. Five patients with acute vestibular symptoms were initially misdiagnosed by physicians and then correctly diagnosed based on input from a vestibular physical therapist. These included missed labyrinthine concussion and post-traumatic benign paroxysmal positional vertigo (BPPV); BPPV called gastroenteritis; BPPV called stroke; stroke called BPPV; and multiple sclerosis called BPPV. As a consequence of surfacing these diagnostic errors, initial resistance to physical therapy input to aid medical diagnosis has gradually declined, creating a more collaborative environment for 'team diagnosis' of patients with dizziness and vertigo at the institution. Barriers to AHP engagement in 'team diagnosis' include sociocultural norms that establish medical diagnosis as something reserved only for physicians. Drawing attention to the valuable diagnostic contributions of AHPs may help facilitate cultural change. Future studies should seek to measure diagnostic safety culture and then implement proven strategies to break down sociocultural barriers that inhibit effective teamwork and transdisciplinary diagnosis.

  17. Diagnostic decision-making and strategies to improve diagnosis.

    PubMed

    Thammasitboon, Satid; Cutrer, William B

    2013-10-01

    A significant portion of diagnostic errors arises through cognitive errors resulting from inadequate knowledge, faulty data gathering, and/or faulty verification. Experts estimate that 75% of diagnostic failures can be attributed to clinician diagnostic thinking failure. The cognitive processes that underlie diagnostic thinking of clinicians are complex and intriguing, and it is imperative that clinicians explicitly appreciate and apply different cognitive approaches in order to make better decisions. A dual-process model that unifies many theories of decision-making has emerged as a promising template for understanding how clinicians think and judge efficiently in a diagnostic reasoning process. The identification and implementation of strategies for decreasing or preventing such diagnostic errors has become a growing area of interest and research. Suggested strategies to decrease diagnostic error incidence include increasing clinicians' clinical expertise and avoiding inherent cognitive errors. Implementing interventions focused solely on avoiding errors may work effectively for patient safety issues such as medication errors. Addressing cognitive errors, however, requires equal effort on expanding the individual clinician's expertise. Providing cognitive support to clinicians for robust diagnostic decision-making serves as the final strategic target for decreasing diagnostic errors. Clinical guidelines and algorithms offer another method for streamlining decision-making and decreasing the likelihood of cognitive diagnostic errors. Addressing cognitive processing errors is undeniably the most challenging task in reducing diagnostic errors. While many suggested approaches exist, they are mostly based on theories and sciences in cognitive psychology, decision-making, and education. The proposed interventions are primarily suggestions and very few of them have been tested in actual practice settings. Collaborative research effort is required to effectively address cognitive processing errors. Researchers in various areas, including patient safety/quality improvement, decision-making, and problem solving, must work together to make medical diagnosis more reliable.

  18. Reducing Cognitive Skill Decay and Diagnostic Error: Theory-Based Practices for Continuing Education in Health Care

    ERIC Educational Resources Information Center

    Weaver, Sallie J.; Newman-Toker, David E.; Rosen, Michael A.

    2012-01-01

    Missed, delayed, or wrong diagnoses can have a severe impact on patients, providers, and the entire health care system. One mechanism implicated in such diagnostic errors is the deterioration of cognitive diagnostic skills that are used rarely or not at all over a prolonged period of time. Existing evidence regarding maintenance of effective…

  19. Understanding diagnostic errors in medicine: a lesson from aviation

    PubMed Central

    Singh, H; Petersen, L A; Thomas, E J

    2006-01-01

    The impact of diagnostic errors on patient safety in medicine is increasingly being recognized. Despite the current progress in patient safety research, the understanding of such errors and how to prevent them is inadequate. Preliminary research suggests that diagnostic errors have both cognitive and systems origins. Situational awareness is a model that is primarily used in aviation human factors research that can encompass both the cognitive and the systems roots of such errors. This conceptual model offers a unique perspective in the study of diagnostic errors. The applicability of this model is illustrated by the analysis of a patient whose diagnosis of spinal cord compression was substantially delayed. We suggest how the application of this framework could lead to potential areas of intervention and outline some areas of future research. It is possible that the use of such a model in medicine could help reduce errors in diagnosis and lead to significant improvements in patient care. Further research is needed, including the measurement of situational awareness and correlation with health outcomes. PMID:16751463

  20. Effectiveness of Toyota process redesign in reducing thyroid gland fine-needle aspiration error.

    PubMed

    Raab, Stephen S; Grzybicki, Dana Marie; Sudilovsky, Daniel; Balassanian, Ronald; Janosky, Janine E; Vrbin, Colleen M

    2006-10-01

    Our objective was to determine whether the Toyota Production System process redesign resulted in diagnostic error reduction for patients who underwent cytologic evaluation of thyroid nodules. In this longitudinal, nonconcurrent cohort study, we compared the diagnostic error frequency of a thyroid aspiration service before and after implementation of error reduction initiatives consisting of adoption of a standardized diagnostic terminology scheme and an immediate interpretation service. A total of 2,424 patients underwent aspiration. Following terminology standardization, the false-negative rate decreased from 41.8% to 19.1% (P = .006), the specimen nondiagnostic rate increased from 5.8% to 19.8% (P < .001), and the sensitivity increased from 70.2% to 90.6% (P < .001). Cases with an immediate interpretation had a lower noninterpretable specimen rate than those without immediate interpretation (P < .001). Toyota process change led to significantly fewer diagnostic errors for patients who underwent thyroid fine-needle aspiration.

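    For readers who want to see how a before-and-after comparison of error rates like the one above is typically tested, the sketch below applies Fisher's exact test to a hypothetical 2x2 table. It is a minimal illustration assuming SciPy is available; the counts are invented and are not the Raab et al. data:

    ```python
    # Hypothetical before/after comparison of diagnostic error rates.
    # Counts are invented for illustration only; they are not taken from the study above.
    from scipy.stats import fisher_exact

    before_errors, before_total = 46, 110   # pre-intervention: errors out of total cases
    after_errors, after_total = 21, 110     # post-intervention: errors out of total cases

    table = [
        [before_errors, before_total - before_errors],  # pre-intervention row: errors, non-errors
        [after_errors, after_total - after_errors],     # post-intervention row: errors, non-errors
    ]

    odds_ratio, p_value = fisher_exact(table)  # two-sided test by default
    print(f"error rate before: {before_errors / before_total:.1%}")
    print(f"error rate after:  {after_errors / after_total:.1%}")
    print(f"Fisher's exact p-value: {p_value:.4f}")
    ```

    The general pattern (count errors and non-errors before and after a process change, then test the difference in proportions) is the usual way such p-values are obtained; the study's own statistical methods are described in the full paper.
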
  1. Errors in imaging patients in the emergency setting

    PubMed Central

    Reginelli, Alfonso; Lo Re, Giuseppe; Midiri, Federico; Muzj, Carlo; Romano, Luigia; Brunese, Luca

    2016-01-01

    Emergency and trauma care produces a “perfect storm” for radiological errors: uncooperative patients, inadequate histories, time-critical decisions, concurrent tasks and often junior personnel working after hours in busy emergency departments. The main cause of diagnostic errors in the emergency department is the failure to correctly interpret radiographs, and the majority of diagnoses missed on radiographs are fractures. Missed diagnoses potentially have important consequences for patients, clinicians and radiologists. Radiologists play a pivotal role in the diagnostic assessment of polytrauma patients and of patients with non-traumatic craniothoracoabdominal emergencies, and key elements to reduce errors in the emergency setting are knowledge, experience and the correct application of imaging protocols. This article aims to highlight the definition and classification of errors in radiology, the causes of errors in emergency radiology and the spectrum of diagnostic errors in radiography, ultrasonography and CT in the emergency setting. PMID:26838955

  2. Errors in imaging patients in the emergency setting.

    PubMed

    Pinto, Antonio; Reginelli, Alfonso; Pinto, Fabio; Lo Re, Giuseppe; Midiri, Federico; Muzj, Carlo; Romano, Luigia; Brunese, Luca

    2016-01-01

    Emergency and trauma care produces a "perfect storm" for radiological errors: uncooperative patients, inadequate histories, time-critical decisions, concurrent tasks and often junior personnel working after hours in busy emergency departments. The main cause of diagnostic errors in the emergency department is the failure to correctly interpret radiographs, and the majority of diagnoses missed on radiographs are fractures. Missed diagnoses potentially have important consequences for patients, clinicians and radiologists. Radiologists play a pivotal role in the diagnostic assessment of polytrauma patients and of patients with non-traumatic craniothoracoabdominal emergencies, and key elements to reduce errors in the emergency setting are knowledge, experience and the correct application of imaging protocols. This article aims to highlight the definition and classification of errors in radiology, the causes of errors in emergency radiology and the spectrum of diagnostic errors in radiography, ultrasonography and CT in the emergency setting.

  3. Heuristics and Cognitive Error in Medical Imaging.

    PubMed

    Itri, Jason N; Patel, Sohil H

    2018-05-01

    The field of cognitive science has provided important insights into mental processes underlying the interpretation of imaging examinations. Despite these insights, diagnostic error remains a major obstacle in the goal to improve quality in radiology. In this article, we describe several types of cognitive bias that lead to diagnostic errors in imaging and discuss approaches to mitigate cognitive biases and diagnostic error. Radiologists rely on heuristic principles to reduce complex tasks of assessing probabilities and predicting values into simpler judgmental operations. These mental shortcuts allow rapid problem solving based on assumptions and past experiences. Heuristics used in the interpretation of imaging studies are generally helpful but can sometimes result in cognitive biases that lead to significant errors. An understanding of the causes of cognitive biases can lead to the development of educational content and systematic improvements that mitigate errors and improve the quality of care provided by radiologists.

  4. Reducing Diagnostic Error with Computer-Based Clinical Decision Support

    ERIC Educational Resources Information Center

    Greenes, Robert A.

    2009-01-01

    Information technology approaches to delivering diagnostic clinical decision support (CDS) are the subject of the papers to follow in the proceedings. These will address the history of CDS and present day approaches (Miller), evaluation of diagnostic CDS methods (Friedman), and the role of clinical documentation in supporting diagnostic decision…

  5. Diagnostic Error in Stroke-Reasons and Proposed Solutions.

    PubMed

    Bakradze, Ekaterina; Liberman, Ava L

    2018-02-13

    We discuss the frequency of stroke misdiagnosis and identify subgroups of stroke at high risk for specific diagnostic errors. In addition, we review common reasons for misdiagnosis and propose solutions to decrease error. According to a recent report by the National Academy of Medicine, most people in the USA are likely to experience a diagnostic error during their lifetimes. Nearly half of such errors result in serious disability and death. Stroke misdiagnosis is a major health care concern, with initial misdiagnosis estimated to occur in 9% of all stroke patients in the emergency setting. Under- or missed diagnosis (false negative) of stroke can result in adverse patient outcomes due to the preclusion of acute treatments and failure to initiate secondary prevention strategies. On the other hand, the overdiagnosis of stroke can result in inappropriate treatment, delayed identification of actual underlying disease, and increased health care costs. Young patients, women, minorities, and patients presenting with non-specific, transient, or posterior circulation stroke symptoms are at increased risk of misdiagnosis. Strategies to decrease diagnostic error in stroke have largely focused on early stroke detection via bedside examination strategies and clinical decision rules. Targeted interventions to improve the diagnostic accuracy of stroke diagnosis among high-risk groups as well as symptom-specific clinical decision supports are needed. There are a number of open questions in the study of stroke misdiagnosis. To improve patient outcomes, existing strategies to improve stroke diagnostic accuracy should be more broadly adopted and novel interventions devised and tested to reduce diagnostic errors.

  6. Cardiac examination and the effect of dual-processing instruction in a cardiopulmonary simulator.

    PubMed

    Sibbald, Matt; McKinney, James; Cavalcanti, Rodrigo B; Yu, Eric; Wood, David A; Nair, Parvathy; Eva, Kevin W; Hatala, Rose

    2013-08-01

    Use of dual processing has been widely touted as a strategy to reduce diagnostic error in clinical medicine. However, this strategy has not been tested among medical trainees with complex diagnostic problems. We sought to determine whether dual-processing instruction could reduce diagnostic error across a spectrum of experience with trainees undertaking cardiac physical exam. Three experiments were conducted using a similar design to teach cardiac physical exam using a cardiopulmonary simulator. One experiment was conducted in each of three groups: experienced, intermediate and novice trainees. In all three experiments, participants were randomized to receive undirected or dual-processing verbal instruction during teaching, practice and testing phases. When tested, dual-processing instruction did not change the probability assigned to the correct diagnosis in any of the three experiments. Among intermediates, there was an apparent interaction between the diagnosis tested and the effect of dual-processing instruction. Among relative novices, dual-processing instruction may have dampened the harmful effect of a bias away from the correct diagnosis. Further work is needed to define the role of dual-processing instruction to reduce cognitive error. This study suggests that it cannot be blindly applied to complex diagnostic problems such as cardiac physical exam.

  7. The Causes of Errors in Clinical Reasoning: Cognitive Biases, Knowledge Deficits, and Dual Process Thinking.

    PubMed

    Norman, Geoffrey R; Monteiro, Sandra D; Sherbino, Jonathan; Ilgen, Jonathan S; Schmidt, Henk G; Mamede, Silvia

    2017-01-01

    Contemporary theories of clinical reasoning espouse a dual processing model, which consists of a rapid, intuitive component (Type 1) and a slower, logical and analytical component (Type 2). Although the general consensus is that this dual processing model is a valid representation of clinical reasoning, the causes of diagnostic errors remain unclear. Cognitive theories about human memory propose that such errors may arise from both Type 1 and Type 2 reasoning. Errors in Type 1 reasoning may be a consequence of the associative nature of memory, which can lead to cognitive biases. However, the literature indicates that, with increasing expertise (and knowledge), the likelihood of errors decreases. Errors in Type 2 reasoning may result from the limited capacity of working memory, which constrains computational processes. In this article, the authors review the medical literature to answer two substantial questions that arise from this work: (1) To what extent do diagnostic errors originate in Type 1 (intuitive) processes versus in Type 2 (analytical) processes? (2) To what extent are errors a consequence of cognitive biases versus a consequence of knowledge deficits? The literature suggests that both Type 1 and Type 2 processes contribute to errors. Although it is possible to experimentally induce cognitive biases, particularly availability bias, the extent to which these biases actually contribute to diagnostic errors is not well established. Educational strategies directed at the recognition of biases are ineffective in reducing errors; conversely, strategies focused on the reorganization of knowledge to reduce errors have small but consistent benefits.

  8. Reexamining our bias against heuristics.

    PubMed

    McLaughlin, Kevin; Eva, Kevin W; Norman, Geoff R

    2014-08-01

    Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources of bias in the literature implicating the use of heuristics in diagnostic error and highlight the fact that there are also data suggesting that under certain circumstances using heuristics may lead to better decisions than formal analysis. They suggest that diagnostic error is frequently misattributed to the use of heuristics and propose an alternative view whereby content knowledge is the root cause of diagnostic performance and heuristics lie on the causal pathway between knowledge and diagnostic error or success.

  9. Characteristics of medical professional liability claims in patients treated by family medicine physicians.

    PubMed

    Flannery, Frank T; Parikh, Parul Divya; Oetgen, William J

    2010-01-01

    This study describes a large database of closed medical professional liability (MPL) claims involving family physicians in the United States. The purpose of this report is to provide information for practicing family physicians that will be useful in improving the quality of care, thereby reducing the incidence of patient injury and the consequent frequency of MPL claims. The Physician Insurers Association of America (PIAA) established a registry of closed MPL claims in 1985. This registry contains data describing 239,756 closed claims in the United States through 2008. The registry is maintained for educational programs that are designed to improve quality of care and reduce patient injury and MPL claims. We summarized this closed claims database. Of 239,756 closed claims, 27,556 (11.5%) involved family physicians. Of these 27,556 closed claims, 8797 (31.9%) resulted in a payment, and the average payment was $164,107. In the entire registry, 29.5% of closed claims were paid, and the average payment was $209,156. The most common allegation among family medicine closed claims was diagnostic error, and the most prevalent diagnosis was acute myocardial infarction, which represented 24.1% of closed claims with diagnostic errors. Diagnostic errors related to patients with breast cancer represented the next most common condition, accounting for 21.3% of closed claims with diagnostic errors. MPL issues are common and are important to all practicing family physicians. Knowledge of the details of liability claims should assist practicing family physicians in improving quality of care, reducing patient injury, and reducing the incidence of MPL claims.

  10. Diagnostic Reasoning and Cognitive Biases of Nurse Practitioners.

    PubMed

    Lawson, Thomas N

    2018-04-01

    Diagnostic reasoning is often used colloquially to describe the process by which nurse practitioners and physicians come to the correct diagnosis, but a rich definition and description of this process have been lacking in the nursing literature. A literature review was conducted with theoretical sampling seeking conceptual insight into diagnostic reasoning. Four common themes emerged: Cognitive Biases and Debiasing Strategies, the Dual Process Theory, Diagnostic Error, and Patient Harm. Relevant cognitive biases are discussed, followed by debiasing strategies and application of the dual process theory to reduce diagnostic error and harm. The accuracy of diagnostic reasoning of nurse practitioners may be improved by incorporating these items into nurse practitioner education and practice. [J Nurs Educ. 2018;57(4):203-208.]

  11. Decision Making for Borderline Cases in Pass/Fail Clinical Anatomy Courses: The Practical Value of the Standard Error of Measurement and Likelihood Ratio in a Diagnostic Test

    ERIC Educational Resources Information Center

    Severo, Milton; Silva-Pereira, Fernanda; Ferreira, Maria Amelia

    2013-01-01

    Several studies have shown that the standard error of measurement (SEM) can be used as an additional “safety net” to reduce the frequency of false-positive or false-negative student grading classifications. Practical examinations in clinical anatomy are often used as diagnostic tests to admit students to course final examinations. The aim of this…

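    The record above is truncated, but the psychometric quantities it refers to are standard. A minimal sketch under the usual classical-test-theory assumptions, with invented numbers rather than values from the study:

    ```latex
    % Standard error of measurement (SEM): SD is the score standard deviation, r the test reliability.
    \[
    \mathrm{SEM} = \mathrm{SD}\sqrt{1 - r}
    \]
    % Invented example: SD = 8 points and r = 0.84 give SEM = 8 * 0.4 = 3.2 points.
    % A borderline band around the cut score flags candidates whose true score may lie on either side:
    \[
    \text{borderline band} \approx \text{cut score} \pm 1.96 \times \mathrm{SEM}
    \]
    ```

    Scores falling inside such a band are the borderline cases for which the abstract's title suggests weighing additional evidence, such as a likelihood ratio, before a final pass/fail decision.
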
  12. A national physician survey of diagnostic error in paediatrics.

    PubMed

    Perrem, Lucy M; Fanshawe, Thomas R; Sharif, Farhana; Plüddemann, Annette; O'Neill, Michael B

    2016-10-01

    This cross-sectional survey explored paediatric physician perspectives regarding diagnostic errors. All paediatric consultants and specialist registrars in Ireland were invited to participate in this anonymous online survey. The response rate for the study was 54% (n = 127). Respondents had a median of 9 years' clinical experience (interquartile range (IQR) 4-20 years). A diagnostic error was reported at least monthly by 19 (15.0%) respondents. Consultants reported significantly fewer diagnostic errors than trainees (p value = 0.01). Cognitive error was the top-ranked contributing factor to diagnostic error, with incomplete history and examination considered to be the principal cognitive error. Seeking a second opinion and close follow-up of patients to ensure that the diagnosis is correct were the highest-ranked, clinician-based solutions to diagnostic error. Inadequate staffing levels and excessive workload were the most highly ranked system-related and situational factors. Increased access to and availability of consultants and experts was the most highly ranked system-based solution to diagnostic error. We found a low level of self-perceived diagnostic error in an experienced group of paediatricians, at variance with the literature and warranting further clarification. The results identify perceptions on the major cognitive, system-related and situational factors contributing to diagnostic error and also key preventative strategies. What is Known: • Diagnostic errors are an important source of preventable patient harm and have an estimated incidence of 10-15%. • They are multifactorial in origin and include cognitive, system-related and situational factors. What is New: • We identified a low rate of self-perceived diagnostic error in contrast to the existing literature. • Incomplete history and examination, inadequate staffing levels and excessive workload are cited as the principal contributing factors to diagnostic error in this study.

  13. Rhythmic chaos: irregularities of computer ECG diagnosis.

    PubMed

    Wang, Yi-Ting Laureen; Seow, Swee-Chong; Singh, Devinder; Poh, Kian-Keong; Chai, Ping

    2017-09-01

    Diagnostic errors can occur when physicians rely solely on computer electrocardiogram interpretation. Cardiologists often receive referrals for computer misdiagnoses of atrial fibrillation. Patients may have been inappropriately anticoagulated for pseudo atrial fibrillation. Anticoagulation carries significant risks, and such errors may carry a high cost. Have we become overreliant on machines and technology? In this article, we illustrate three such cases and briefly discuss how we can reduce these errors.

  14. Meaningful Peer Review in Radiology: A Review of Current Practices and Potential Future Directions.

    PubMed

    Moriarity, Andrew K; Hawkins, C Matthew; Geis, J Raymond; Dreyer, Keith J; Kamer, Aaron P; Khandheria, Paras; Morey, Jose; Whitfill, James; Wiggins, Richard H; Itri, Jason N

    2016-12-01

    The current practice of peer review within radiology is well developed and widely implemented compared with other medical specialties. However, many factors limit the ability of current peer review practices to reduce diagnostic errors and improve patient care. The development of "meaningful peer review" requires a transition away from compliance toward quality improvement, whereby the information and insights gained facilitate education and drive systematic improvements that reduce the frequency and impact of diagnostic error. The next generation of peer review requires significant improvements in IT functionality and integration, enabling features such as anonymization, adjudication by multiple specialists, categorization and analysis of errors, tracking, feedback, and easy export into teaching files and other media; achieving this will require strong partnerships with vendors. In this article, the authors assess various peer review practices, with focused discussion on current limitations and future needs for meaningful peer review in radiology.

  15. The challenges in defining and measuring diagnostic error.

    PubMed

    Zwaan, Laura; Singh, Hardeep

    2015-06-01

    Diagnostic errors have emerged as a serious patient safety problem but they are hard to detect and complex to define. At the research summit of the 2013 Diagnostic Error in Medicine 6th International Conference, we convened a multidisciplinary expert panel to discuss challenges in defining and measuring diagnostic errors in real-world settings. In this paper, we synthesize these discussions and outline key research challenges in operationalizing the definition and measurement of diagnostic error. Some of these challenges include 1) difficulties in determining error when the disease or diagnosis is evolving over time and in different care settings, 2) accounting for a balance between underdiagnosis and overaggressive diagnostic pursuits, and 3) determining disease diagnosis likelihood and severity in hindsight. We also build on these discussions to describe how some of these challenges can be addressed while conducting research on measuring diagnostic error.

  16. Diagnostic reasoning: where we've been, where we're going.

    PubMed

    Monteiro, Sandra M; Norman, Geoffrey

    2013-01-01

    Recently, clinical diagnostic reasoning has been characterized by "dual processing" models, which postulate a fast, unconscious (System 1) component and a slow, logical, analytical (System 2) component. However, there are a number of variants of this basic model, which may lead to conflicting claims. This paper critically reviews current theories and evidence about the nature of clinical diagnostic reasoning. We begin by briefly discussing the history of research in clinical reasoning. We then focus more specifically on the evidence to support dual-processing models. We conclude by identifying knowledge gaps about clinical reasoning and provide suggestions for future research. In contrast to work on analytical and nonanalytical knowledge as a basis for reasoning, these theories focus on the thinking process, not the nature of the knowledge retrieved. Ironically, this appears to be a revival of an outdated concept. Rather than defining diagnostic performance by problem-solving skills, it is now being defined by processing strategy. The version of dual processing that has received most attention in the literature in medical diagnosis might be labeled a "default/interventionist" model (17), which suggests that a default system of cognitive processes (System 1) is responsible for cognitive biases that lead to diagnostic errors and that System 2 intervenes to correct these errors. Consequently, from this model, the best strategy for reducing errors is to make students aware of the biases and to encourage them to rely more on System 2. However, an accumulation of evidence suggests that (a) strategies directed at increasing analytical (System 2) processing (by slowing down, reducing distractions, and paying conscious attention) and (b) strategies directed at making students aware of the effect of cognitive biases have no impact on error rates. Conversely, strategies based on increasing application of relevant knowledge appear to have some success and are consistent with basic research on concept formation.

  17. Colour coding for blood collection tube closures - a call for harmonisation.

    PubMed

    Simundic, Ana-Maria; Cornes, Michael P; Grankvist, Kjell; Lippi, Giuseppe; Nybo, Mads; Ceriotti, Ferruccio; Theodorsson, Elvar; Panteghini, Mauro

    2015-02-01

    At least one in 10 patients experience adverse events while receiving hospital care. Many of the errors are related to laboratory diagnostics. Efforts to reduce laboratory errors over recent decades have primarily focused on the measurement process while pre- and post-analytical errors including errors in sampling, reporting and decision-making have received much less attention. Proper sampling and additives to the samples are essential. Tubes and additives are identified not only in writing on the tubes but also by the colour of the tube closures. Unfortunately these colours have not been standardised, running the risk of error when tubes from one manufacturer are replaced by the tubes from another manufacturer that use different colour coding. EFLM therefore supports the worldwide harmonisation of the colour coding for blood collection tube closures and labels in order to reduce the risk of pre-analytical errors and improve the patient safety.

  18. Gear noise, vibration, and diagnostic studies at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Zakrajsek, James J.; Oswald, Fred B.; Townsend, Dennis P.; Coy, John J.

    1990-01-01

    The NASA Lewis Research Center and the U.S. Army Aviation Systems Command are involved in a joint research program to advance the technology of rotorcraft transmissions. This program consists of analytical as well as experimental efforts to achieve the overall goals of reducing weight, noise, and vibration, while increasing life and reliability. Recent analytical activities are highlighted in the areas of gear noise, vibration, and diagnostics performed in-house and through NASA and U.S. Army sponsored grants and contracts. These activities include studies of gear tooth profiles to reduce transmission error and vibration as well as gear housing and rotordynamic modeling to reduce structural vibration transmission and noise radiation, and basic research into current gear failure diagnostic methodologies. Results of these activities are presented along with an overview of near term research plans in the gear noise, vibration, and diagnostics area.

  19. Using Fault Trees to Advance Understanding of Diagnostic Errors.

    PubMed

    Rogith, Deevakar; Iyengar, M Sriram; Singh, Hardeep

    2017-11-01

    Diagnostic errors annually affect at least 5% of adults in the outpatient setting in the United States. Formal analytic techniques are only infrequently used to understand them, in part because of the complexity of diagnostic processes and clinical work flows involved. In this article, diagnostic errors were modeled using fault tree analysis (FTA), a form of root cause analysis that has been successfully used in other high-complexity, high-risk contexts. The article demonstrates how factors contributing to diagnostic errors can be systematically modeled by FTA to inform error understanding and error prevention. A team of three experts reviewed 10 published cases of diagnostic error and constructed fault trees. The fault trees were modeled according to currently available conceptual frameworks characterizing diagnostic error. The 10 trees were then synthesized into a single fault tree to identify common contributing factors and pathways leading to diagnostic error. FTA is a visual, structured, deductive approach that depicts the temporal sequence of events and their interactions in a formal logical hierarchy. The visual FTA enables easier understanding of causative processes and cognitive and system factors, as well as rapid identification of common pathways and interactions in a unified fashion. In addition, it enables calculation of empirical estimates for causative pathways. Thus, fault trees might provide a useful framework for both quantitative and qualitative analysis of diagnostic errors. Future directions include establishing validity and reliability by modeling a wider range of error cases, conducting quantitative evaluations, and undertaking deeper exploration of other FTA capabilities.

  20. Clinical Dental Faculty Members' Perceptions of Diagnostic Errors and How to Avoid Them.

    PubMed

    Nikdel, Cathy; Nikdel, Kian; Ibarra-Noriega, Ana; Kalenderian, Elsbeth; Walji, Muhammad F

    2018-04-01

    Diagnostic errors are increasingly recognized as a source of preventable harm in medicine, yet little is known about their occurrence in dentistry. The aim of this study was to gain a deeper understanding of clinical dental faculty members' perceptions of diagnostic errors, types of errors that may occur, and possible contributing factors. The authors conducted semi-structured interviews with ten domain experts at one U.S. dental school in May-August 2016 about their perceptions of diagnostic errors and their causes. The interviews were analyzed using an inductive process to identify themes and key findings. The results showed that the participants varied in their definitions of diagnostic errors. While all identified missed diagnosis and wrong diagnosis, only four participants perceived that a delay in diagnosis was a diagnostic error. Some participants perceived that an error occurs only when the choice of treatment leads to harm. Contributing factors associated with diagnostic errors included the knowledge and skills of the dentist, not taking adequate time, lack of communication among colleagues, and cognitive biases such as premature closure based on previous experience. Strategies suggested by the participants to prevent these errors were taking adequate time when investigating a case, forming study groups, increasing communication, and putting more emphasis on differential diagnosis. These interviews revealed differing perceptions of dental diagnostic errors among clinical dental faculty members. To address the variations, the authors recommend adopting shared language developed by the medical profession to increase understanding.

  1. A review of ultrasonographic methods for the assessment of the anterior cruciate ligament in patients with knee instability – diagnostics using a posterior approach

    PubMed Central

    Kielar, Maciej

    2016-01-01

    Aim: The purpose of the study was to improve the ultrasonographic assessment of the anterior cruciate ligament by the inclusion of a dynamic element. The proposed functional modification aims to restore normal posterior cruciate ligament tension, which is associated with a visible change in the ligament shape. This method reduces the risk of an error resulting from subjectively assessing the shape of the posterior cruciate ligament. It should also be emphasized that the method, combined with other ultrasound anterior cruciate ligament assessment techniques, helps increase diagnostic accuracy. Methods: Ultrasonography is used as an adjunctive technique in the diagnosis of anterior cruciate ligament injury. The paper presents a sonographic technique for the assessment of suspected anterior cruciate ligament insufficiency supplemented by the use of a dynamic examination. This technique can be recommended as an additional procedure in routine ultrasound diagnostics of anterior cruciate ligament injuries. Results: Supplementing routine ultrasonography with the dynamic assessment of posterior cruciate ligament shape changes in patients with suspected anterior cruciate ligament injury reduces the risk of subjective errors and increases diagnostic accuracy. This is especially important in cases of minor anterior knee instability and bilateral anterior knee instability. Conclusions: An assessment of changes in the posterior cruciate ligament using a dynamic ultrasound examination effectively complements routine sonographic diagnostic techniques for anterior cruciate ligament insufficiency. PMID:27679732

  2. Reducing cognitive skill decay and diagnostic error: theory-based practices for continuing education in health care.

    PubMed

    Weaver, Sallie J; Newman-Toker, David E; Rosen, Michael A

    2012-01-01

    Missed, delayed, or wrong diagnoses can have a severe impact on patients, providers, and the entire health care system. One mechanism implicated in such diagnostic errors is the deterioration of cognitive diagnostic skills that are used rarely or not at all over a prolonged period of time. Existing evidence regarding maintenance of effective cognitive reasoning skills in the clinical education, organizational training, and human factors literatures suggests that continuing education plays a critical role in mitigating and managing diagnostic skill decay. Recent models also underscore the role of system level factors (eg, cognitive decision support tools, just-in-time training opportunities) in supporting the clinical reasoning process. The purpose of this manuscript is to offer a multidisciplinary review of cognitive models of clinical decision making skills in order to provide a list of best practices for supporting continuous improvement and maintenance of cognitive diagnostic processes through continuing education. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.

  3. The thinking doctor: clinical decision making in contemporary medicine.

    PubMed

    Trimble, Michael; Hamilton, Paul

    2016-08-01

    Diagnostic errors are responsible for a significant number of adverse events. Logical reasoning and good decision-making skills are key factors in reducing such errors, but little emphasis has traditionally been placed on how these thought processes occur, and how errors could be minimised. In this article, we explore key cognitive ideas that underpin clinical decision making and suggest that by employing some simple strategies, physicians might be better able to understand how they make decisions and how the process might be optimised. © 2016 Royal College of Physicians.

  4. Diagnostic Error in Correctional Mental Health: Prevalence, Causes, and Consequences.

    PubMed

    Martin, Michael S; Hynes, Katie; Hatcher, Simon; Colman, Ian

    2016-04-01

    While they have important implications for inmates and resourcing of correctional institutions, diagnostic errors are rarely discussed in correctional mental health research. This review seeks to estimate the prevalence of diagnostic errors in prisons and jails and explores potential causes and consequences. Diagnostic errors are defined as discrepancies in an inmate's diagnostic status depending on who is responsible for conducting the assessment and/or the methods used. It is estimated that at least 10% to 15% of all inmates may be incorrectly classified in terms of the presence or absence of a mental illness. Inmate characteristics, relationships with staff, and cognitive errors stemming from the use of heuristics when faced with time constraints are discussed as possible sources of error. A policy example of screening for mental illness at intake to prison is used to illustrate when the risk of diagnostic error might be increased and to explore strategies to mitigate this risk. © The Author(s) 2016.

  5. Information-Gathering Patterns Associated with Higher Rates of Diagnostic Error

    ERIC Educational Resources Information Center

    Delzell, John E., Jr.; Chumley, Heidi; Webb, Russell; Chakrabarti, Swapan; Relan, Anju

    2009-01-01

    Diagnostic errors are an important source of medical errors. Problematic information-gathering is a common cause of diagnostic errors among physicians and medical students. The objectives of this study were to (1) determine if medical students' information-gathering patterns formed clusters of similar strategies, and if so (2) to calculate the…

  6. Correcting AUC for Measurement Error.

    PubMed

    Rosner, Bernard; Tworoger, Shelley; Qiu, Weiliang

    2015-12-01

    Diagnostic biomarkers are used frequently in epidemiologic and clinical work. The ability of a diagnostic biomarker to discriminate between subjects who develop disease (cases) and subjects who do not (controls) is often measured by the area under the receiver operating characteristic curve (AUC). The diagnostic biomarkers are usually measured with error. Ignoring measurement error can cause biased estimation of AUC, which results in misleading interpretation of the efficacy of a diagnostic biomarker. Several methods have been proposed to correct AUC for measurement error, most of which required the normality assumption for the distributions of diagnostic biomarkers. In this article, we propose a new method to correct AUC for measurement error and derive approximate confidence limits for the corrected AUC. The proposed method does not require the normality assumption. Both real data analyses and simulation studies show good performance of the proposed measurement error correction method.
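
    The sketch below is not the correction method proposed in the article; it is a small simulation, under assumed normal distributions and an assumed error variance, showing the attenuation of the empirical (Mann-Whitney) AUC that motivates such corrections.

```python
# Simulation sketch: additive measurement error attenuates the observed AUC.
# Distributions, effect size, and error variance are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def auc_mann_whitney(cases, controls):
    """Empirical AUC: P(case value > control value), ties counted as 1/2."""
    cases = np.asarray(cases)[:, None]
    controls = np.asarray(controls)[None, :]
    return np.mean((cases > controls) + 0.5 * (cases == controls))

n = 2000
true_controls = rng.normal(0.0, 1.0, n)
true_cases = rng.normal(1.0, 1.0, n)           # true separation of 1 SD
sigma_e = 1.0                                   # assumed measurement-error SD
obs_controls = true_controls + rng.normal(0.0, sigma_e, n)
obs_cases = true_cases + rng.normal(0.0, sigma_e, n)

print(f"AUC, error-free biomarker: {auc_mann_whitney(true_cases, true_controls):.3f}")
print(f"AUC, noisy biomarker:      {auc_mann_whitney(obs_cases, obs_controls):.3f}")
# Under normality the theoretical values are Phi(1/sqrt(2)) vs Phi(1/sqrt(2 + 2*sigma_e**2)).
```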

  7. BREAST: a novel method to improve the diagnostic efficacy of mammography

    NASA Astrophysics Data System (ADS)

    Brennan, P. C.; Tapia, K.; Ryan, J.; Lee, W.

    2013-03-01

    High quality breast imaging and accurate image assessment are critical to the early diagnosis, treatment and management of women with breast cancer. Breast Screen Reader Assessment Strategy (BREAST) provides a platform, accessible by researchers and clinicians world-wide, which will contain image databases, algorithms to assess reader performance and on-line systems for image evaluation. The platform will contribute to the diagnostic efficacy of breast imaging in Australia and beyond on two fronts: reducing errors in mammography, and transforming our assessment of novel technologies and techniques. Mammography is the primary diagnostic tool for detecting breast cancer, with over 800,000 women X-rayed each year in Australia; however, it fails to detect 30% of breast cancers, a number of the missed cancers being visible on the image [1-6]. BREAST will monitor these mistakes, identify reasons for mammographic errors, and facilitate innovative solutions to reduce error rates. The BREAST platform has the potential to enable expert assessment of breast imaging innovations anywhere in the world where experts or innovations are located. Currently, innovations are often assessed by limited numbers of individuals who happen to be geographically located close to the innovation, resulting in equivocal studies with low statistical power. BREAST will transform this current paradigm by enabling large numbers of experts to assess any new method or technology using our embedded evaluation methods. We are confident that this world-first system will play an important part in the future efficacy of breast imaging.

  8. Identification of factors associated with diagnostic error in primary care.

    PubMed

    Minué, Sergio; Bermúdez-Tamayo, Clara; Fernández, Alberto; Martín-Martín, José Jesús; Benítez, Vivian; Melguizo, Miguel; Caro, Araceli; Orgaz, María José; Prados, Miguel Angel; Díaz, José Enrique; Montoro, Rafael

    2014-05-12

    Missed, delayed or incorrect diagnoses are considered to be diagnostic errors. The aim of this paper is to describe the methodology of a study to analyse cognitive aspects of the process by which primary care (PC) physicians diagnose dyspnoea. It examines the possible links between the use of heuristics, suboptimal cognitive acts and diagnostic errors, using Reason's taxonomy of human error (slips, lapses, mistakes and violations). The influence of situational factors (professional experience, perceived overwork and fatigue) is also analysed. Cohort study of new episodes of dyspnoea in patients receiving care from family physicians and residents at PC centres in Granada (Spain). With an initial expected diagnostic error rate of 20%, and a sampling error of 3%, 384 episodes of dyspnoea are calculated to be required. In addition to filling out the electronic medical record of the patients attended, each physician fills out 2 specially designed questionnaires about the diagnostic process performed in each case of dyspnoea. The first questionnaire includes questions on the physician's initial diagnostic impression, the 3 most likely diagnoses (in order of likelihood), and the diagnosis reached after the initial medical history and physical examination. It also includes items on the physicians' perceived overwork and fatigue during patient care. The second questionnaire records the confirmed diagnosis once it is reached. The complete diagnostic process is peer-reviewed to identify and classify the diagnostic errors. The possible use of heuristics of representativeness, availability, and anchoring and adjustment in each diagnostic process is also analysed. Each audit is reviewed with the physician responsible for the diagnostic process. Finally, logistic regression models are used to determine if there are differences in the diagnostic error variables based on the heuristics identified. This work sets out a new approach to studying the diagnostic decision-making process in PC, taking advantage of new technologies which allow immediate recording of the decision-making process.

  9. Identification of factors associated with diagnostic error in primary care

    PubMed Central

    2014-01-01

    Background: Missed, delayed or incorrect diagnoses are considered to be diagnostic errors. The aim of this paper is to describe the methodology of a study to analyse cognitive aspects of the process by which primary care (PC) physicians diagnose dyspnoea. It examines the possible links between the use of heuristics, suboptimal cognitive acts and diagnostic errors, using Reason’s taxonomy of human error (slips, lapses, mistakes and violations). The influence of situational factors (professional experience, perceived overwork and fatigue) is also analysed. Methods: Cohort study of new episodes of dyspnoea in patients receiving care from family physicians and residents at PC centres in Granada (Spain). With an initial expected diagnostic error rate of 20%, and a sampling error of 3%, 384 episodes of dyspnoea are calculated to be required. In addition to filling out the electronic medical record of the patients attended, each physician fills out 2 specially designed questionnaires about the diagnostic process performed in each case of dyspnoea. The first questionnaire includes questions on the physician’s initial diagnostic impression, the 3 most likely diagnoses (in order of likelihood), and the diagnosis reached after the initial medical history and physical examination. It also includes items on the physicians’ perceived overwork and fatigue during patient care. The second questionnaire records the confirmed diagnosis once it is reached. The complete diagnostic process is peer-reviewed to identify and classify the diagnostic errors. The possible use of heuristics of representativeness, availability, and anchoring and adjustment in each diagnostic process is also analysed. Each audit is reviewed with the physician responsible for the diagnostic process. Finally, logistic regression models are used to determine if there are differences in the diagnostic error variables based on the heuristics identified. Discussion: This work sets out a new approach to studying the diagnostic decision-making process in PC, taking advantage of new technologies which allow immediate recording of the decision-making process. PMID:24884984

  10. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST.

    PubMed

    Xiao, Shumei; Zang, Qing; Han, Xiaofeng; Wang, Tengfei; Yu, Jin; Zhao, Junyu

    2016-07-01

    The Thomson scattering (TS) diagnostic is an important tool for measuring electron temperature and density during plasma discharge. However, the Thomson scattering signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed by a simulation model of the diagnostic system, and simulation results show that the dump system is the primary stray light source. Based on optics theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system can reduce more than 60% of the stray light for the diagnostic system, and the influence of stray light on the error of the measured density decreases.

  11. External Quality Assessment beyond the analytical phase: an Australian perspective.

    PubMed

    Badrick, Tony; Gay, Stephanie; McCaughey, Euan J; Georgiou, Andrew

    2017-02-15

    External Quality Assessment (EQA) is the verification, on a recurring basis, that laboratory results conform to expectations for the quality required for patient care. It is now widely recognised that both the pre- and post-laboratory phase of testing, termed the diagnostic phases, are a significant source of laboratory errors. These errors have a direct impact on both the effectiveness of the laboratory and patient safety. Despite this, Australian laboratories tend to be focussed on very narrow concepts of EQA, primarily surrounding test accuracy, with little in the way of EQA programs for the diagnostic phases. There is a wide range of possibilities for the development of EQA for the diagnostic phases in Australia, such as the utilisation of scenarios and health informatics. Such programs can also be supported through advances in health information and communications technology, including electronic test ordering and clinical decision support systems. While the development of such programs will require consultation and support from the referring doctors, and their format will need careful construction to ensure that the data collected is de-identified and provides education as well as useful and informative data, we believe that there is high value in the development of such programs. Therefore, it is our opinion that all pathology laboratories should strive to be involved in an EQA program in the diagnostic phases to both monitor the diagnostic process and to identify, learn from and reduce errors and near misses in these phases in a timely fashion.

  12. External Quality Assessment beyond the analytical phase: an Australian perspective

    PubMed Central

    Gay, Stephanie; McCaughey, Euan J.; Georgiou, Andrew

    2017-01-01

    External Quality Assessment (EQA) is the verification, on a recurring basis, that laboratory results conform to expectations for the quality required for patient care. It is now widely recognised that both the pre- and post-laboratory phase of testing, termed the diagnostic phases, are a significant source of laboratory errors. These errors have a direct impact on both the effectiveness of the laboratory and patient safety. Despite this, Australian laboratories tend to be focussed on very narrow concepts of EQA, primarily surrounding test accuracy, with little in the way of EQA programs for the diagnostic phases. There is a wide range of possibilities for the development of EQA for the diagnostic phases in Australia, such as the utilisation of scenarios and health informatics. Such programs can also be supported through advances in health information and communications technology, including electronic test ordering and clinical decision support systems. While the development of such programs will require consultation and support from the referring doctors, and their format will need careful construction to ensure that the data collected is de-identified and provides education as well as useful and informative data, we believe that there is high value in the development of such programs. Therefore, it is our opinion that all pathology laboratories should strive to be involved in an EQA program in the diagnostic phases to both monitor the diagnostic process and to identify, learn from and reduce errors and near misses in these phases in a timely fashion. PMID:28392728

  13. Dual Processing and Diagnostic Errors

    ERIC Educational Resources Information Center

    Norman, Geoff

    2009-01-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical,…

  14. How common are cognitive errors in cases presented at emergency medicine resident morbidity and mortality conferences?

    PubMed

    Chu, David; Xiao, Jane; Shah, Payal; Todd, Brett

    2018-06-20

    Cognitive errors are a major contributor to medical error. Traditionally, medical errors at teaching hospitals are analyzed in morbidity and mortality (M&M) conferences. We aimed to describe the frequency of cognitive errors in relation to the occurrence of diagnostic and other error types, in cases presented at an emergency medicine (EM) resident M&M conference. We conducted a retrospective study of all cases presented at a suburban US EM residency monthly M&M conference from September 2011 to August 2016. Each case was reviewed using the electronic medical record (EMR) and notes from the M&M case by two EM physicians. Each case was categorized by type of primary medical error that occurred as described by Okafor et al. When a diagnostic error occurred, the case was reviewed for contributing cognitive and non-cognitive factors. Finally, when a cognitive error occurred, the case was classified into faulty knowledge, faulty data gathering or faulty synthesis, as described by Graber et al. Disagreements in error type were mediated by a third EM physician. A total of 87 M&M cases were reviewed; the two reviewers agreed on 73 cases, and 14 cases required mediation by a third reviewer. Forty-eight cases involved diagnostic errors, 47 of which were cognitive errors. Of these 47 cases, 38 involved faulty synthesis, 22 involved faulty data gathering and only 11 involved faulty knowledge. Twenty cases contained more than one type of cognitive error. Twenty-nine cases involved both a resident and an attending physician, while 17 cases involved only an attending physician. Twenty-one percent of the resident cases involved all three cognitive errors, while none of the attending cases involved all three. Forty-one percent of the resident cases and only 6% of the attending cases involved faulty knowledge. One hundred percent of the resident cases and 94% of the attending cases involved faulty synthesis. Our review of 87 EM M&M cases revealed that cognitive errors are commonly involved in cases presented, and that these errors are less likely due to deficient knowledge and more likely due to faulty synthesis. M&M conferences may therefore provide an excellent forum to discuss cognitive errors and how to reduce their occurrence.

  15. Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.

    PubMed

    Olson, Andrew P J; Graber, Mark L; Singh, Hardeep

    2018-01-29

    Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested but none widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.

  16. Analysis of binary responses with outcome-specific misclassification probability in genome-wide association studies.

    PubMed

    Rekaya, Romdhane; Smith, Shannon; Hay, El Hamidi; Farhat, Nourhene; Aggrey, Samuel E

    2016-01-01

    Errors in the binary status of some response traits are frequent in human, animal, and plant applications. These error rates tend to differ between cases and controls because diagnostic and screening tests have different sensitivity and specificity. This increases the inaccuracies of classifying individuals into correct groups, giving rise to both false-positive and false-negative cases. The analysis of these noisy binary responses due to misclassification will undoubtedly reduce the statistical power of genome-wide association studies (GWAS). A threshold model that accommodates varying diagnostic errors between cases and controls was investigated. A simulation study was carried out where several binary data sets (case-control) were generated with varying effects for the most influential single nucleotide polymorphisms (SNPs) and different diagnostic error rate for cases and controls. Each simulated data set consisted of 2000 individuals. Ignoring misclassification resulted in biased estimates of true influential SNP effects and inflated estimates for true noninfluential markers. A substantial reduction in bias and increase in accuracy ranging from 12% to 32% was observed when the misclassification procedure was invoked. In fact, the majority of influential SNPs that were not identified using the noisy data were captured using the proposed method. Additionally, truly misclassified binary records were identified with high probability using the proposed method. The superiority of the proposed method was maintained across different simulation parameters (misclassification rates and odds ratios) attesting to its robustness.
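
    The following sketch is not the threshold model investigated in the study; it is a minimal simulation, with hypothetical allele frequency, effect size, sensitivity and specificity, showing how outcome-specific misclassification attenuates the odds ratio estimated for a single marker.

```python
# Toy simulation of outcome-specific misclassification in a case-control setting.
# All rates and effect sizes are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
carrier = rng.random(n) < 0.30                 # carries the risk allele
base_risk, true_or = 0.10, 1.8
odds = base_risk / (1 - base_risk) * np.where(carrier, true_or, 1.0)
disease = rng.random(n) < odds / (1 + odds)    # true case/control status

# Outcome-specific misclassification: imperfect sensitivity and specificity.
sens, spec = 0.85, 0.95                        # assumed diagnostic accuracy
observed = np.where(disease,
                    rng.random(n) < sens,      # true cases kept with prob sens
                    rng.random(n) > spec)      # true controls flipped with prob 1 - spec

def odds_ratio(status, exposure):
    a = np.sum(status & exposure); b = np.sum(status & ~exposure)
    c = np.sum(~status & exposure); d = np.sum(~status & ~exposure)
    return (a * d) / (b * c)

print(f"OR from true status:     {odds_ratio(disease, carrier):.2f}")
print(f"OR from observed status: {odds_ratio(observed, carrier):.2f}  (attenuated)")
```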

  17. Exploring Situational Awareness in Diagnostic Errors in Primary Care

    PubMed Central

    Singh, Hardeep; Giardina, Traber Davis; Petersen, Laura A.; Smith, Michael; Wilson, Lindsey; Dismukes, Key; Bhagwath, Gayathri; Thomas, Eric J.

    2013-01-01

    Objective: Diagnostic errors in primary care are harmful but poorly studied. To facilitate understanding of diagnostic errors in real-world primary care settings using electronic health records (EHRs), this study explored the use of the Situational Awareness (SA) framework from aviation human factors research. Methods: A mixed-methods study was conducted involving reviews of EHR data followed by semi-structured interviews of selected providers from two institutions in the US. The study population included 380 consecutive patients with colorectal and lung cancers diagnosed between February 2008 and January 2009. Using a pre-tested data collection instrument, trained physicians identified diagnostic errors, defined as lack of timely action on one or more established indications for diagnostic work-up for lung and colorectal cancers. Twenty-six providers involved in cases with and without errors were interviewed. Interviews probed for providers' lack of SA and how this may have influenced the diagnostic process. Results: Of 254 cases meeting inclusion criteria, errors were found in 30 (32.6%) of 92 lung cancer cases and 56 (33.5%) of 167 colorectal cancer cases. Analysis of interviews related to error cases revealed evidence of lack of one of four levels of SA applicable to primary care practice: information perception, information comprehension, forecasting future events, and choosing appropriate action based on the first three levels. In cases without error, the application of the SA framework provided insight into processes involved in attention management. Conclusions: A framework of SA can help analyze and understand diagnostic errors in primary care settings that use EHRs. PMID:21890757

  18. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Shumei; Zang, Qing, E-mail: zangq@ipp.ac.cn; Han, Xiaofeng

    The Thomson scattering (TS) diagnostic is an important tool for measuring electron temperature and density during plasma discharge. However, the Thomson scattering signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed by a simulation model of the diagnostic system, and simulation results show that the dump system is the primary stray light source. Based on optics theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system can reduce more than 60% of the stray light for the diagnostic system, and the influence of stray light on the error of the measured density decreases.

  19. Are health care provider organizations ready to tackle diagnostic error? A survey of Leapfrog-participating hospitals.

    PubMed

    Newman-Toker, David E; Austin, J Matthew; Derk, Jordan; Danforth, Melissa; Graber, Mark L

    2017-06-27

    A 2015 National Academy of Medicine report on improving diagnosis in health care made recommendations for direct action by hospitals and health systems. Little is known about how health care provider organizations are addressing diagnostic safety/quality. This study is an anonymous online survey of safety professionals from US hospitals and health systems in July-August 2016. The survey was sent to those attending a Leapfrog Group webinar on misdiagnosis (n=188). The instrument was focused on knowledge, attitudes, and capability to address diagnostic errors at the institutional level. Overall, 61 (32%) responded, including community hospitals (42%), integrated health networks (25%), and academic centers (21%). Awareness was high, but commitment and capability were low (31% of leaders understand the problem; 28% have sufficient safety resources; and 25% have made diagnosis a top institutional safety priority). Ongoing efforts to improve diagnostic safety were sparse and mostly included root cause analysis and peer review feedback around diagnostic errors. The top three barriers to addressing diagnostic error were lack of awareness of the problem, lack of measures of diagnostic accuracy and error, and lack of feedback on diagnostic performance. The top two tools viewed as critically important for locally tackling the problem were routine feedback on diagnostic performance and culture change to emphasize diagnostic safety. Although hospitals and health systems appear to be aware of diagnostic errors as a major safety imperative, most organizations (even those that appear to be making a strong commitment to patient safety) are not yet doing much to improve diagnosis. Going forward, efforts to activate health care organizations will be essential to improving diagnostic safety.

  20. Improving Papanicolaou test quality and reducing medical errors by using Toyota production system methods.

    PubMed

    Raab, Stephen S; Andrew-Jaja, Carey; Condel, Jennifer L; Dabbs, David J

    2006-01-01

    The objective of the study was to determine whether the Toyota production system process improves Papanicolaou test quality and patient safety. An 8-month nonconcurrent cohort study that included 464 case and 639 control women who had a Papanicolaou test was performed. Office workflow was redesigned using Toyota production system methods by introducing a 1-by-1 continuous flow process. We measured the frequency of Papanicolaou tests without a transformation zone component, follow-up and Bethesda System diagnostic frequency of atypical squamous cells of undetermined significance, and diagnostic error frequency. After the intervention, the percentage of Papanicolaou tests lacking a transformation zone component decreased from 9.9% to 4.7% (P = .001). The percentage of Papanicolaou tests with a diagnosis of atypical squamous cells of undetermined significance decreased from 7.8% to 3.9% (P = .007). The frequency of error per correlating cytologic-histologic specimen pair decreased from 9.52% to 7.84%. The introduction of the Toyota production system process resulted in improved Papanicolaou test quality.

  1. The biasing effect of clinical history on physical examination diagnostic accuracy.

    PubMed

    Sibbald, Matthew; Cavalcanti, Rodrigo B

    2011-08-01

    Literature on diagnostic test interpretation has shown that access to clinical history can both enhance diagnostic accuracy and increase diagnostic error. Knowledge of clinical history has also been shown to enhance the more complex cognitive task of physical examination diagnosis, possibly by enabling early hypothesis generation. However, it is unclear whether clinicians adhere to these early hypotheses in the face of unexpected physical findings, thus resulting in diagnostic error. A sample of 180 internal medicine residents received a short clinical history and conducted a cardiac physical examination on a high-fidelity simulator. Resident Doctors (Residents) were randomised to three groups based on the physical findings in the simulator. The concordant group received physical examination findings consistent with the diagnosis that was most probable based on the clinical history. Discordant groups received findings associated with plausible alternative diagnoses which either lacked expected findings (indistinct discordant) or contained unexpected findings (distinct discordant). Physical examination diagnostic accuracy and physical examination findings were analysed. Physical examination diagnostic accuracy varied significantly among groups (75 ± 44%, 2 ± 13% and 31 ± 47% in the concordant, indistinct discordant and distinct discordant groups, respectively; F(2,177) = 53, p < 0.0001). Of the 115 Residents who were diagnostically unsuccessful, 33% adhered to their original incorrect hypotheses. Residents verbalised an average of 12 findings (interquartile range: 10-14); 58 ± 17% were correct and the percentage of correct findings was similar in all three groups (p = 0.44). Residents showed substantially decreased diagnostic accuracy when faced with discordant physical findings. The majority of trainees given discordant physical findings rejected their initial hypotheses, but were still diagnostically unsuccessful. These results suggest that overcoming the bias induced by a misleading clinical history may involve two independent steps: rejection of the incorrect initial hypothesis, and selection of the correct diagnosis. Educational strategies focused solely on prompting clinicians to re-examine their hypotheses may be insufficient to reduce diagnostic error. © Blackwell Publishing Ltd 2011.

  2. Diagnostics for the detection and evaluation of laser induced damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheehan, L.; Kozlowski, M.; Rainer, F.

    1995-12-31

    The Laser Damage and Conditioning Group at LLNL is evaluating diagnostics which will help make damage testing more efficient and reduce the risk of damage during laser conditioning. The work to date has focused on photoacoustic and scattered light measurements on 1064-nm wavelength HfO{sub 2}/SiO{sub 2} multilayer mirror and polarizer coatings. Both the acoustic and scatter diagnostics have resolved 10 {mu}m diameter damage points in these coatings. Using a scanning stage, the scatter diagnostic can map both intrinsic and laser-induced scatter. Damage threshold measurements obtained using scatter diagnostics compare within experimental error with those measured using 100x Nomarski microscopy. Scatter signals measured during laser conditioning can be used to detect damage related to nodular defects.

  3. Diagnostics for the detection and evaluation of laser induced damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheehan, L.; Kozlowski, M.; Rainer, F.

    1995-01-03

    The Laser Damage and Conditioning Group at LLNL is evaluating diagnostics which will help make damage testing more efficient and reduce the risk of damage during laser conditioning. The work to date has focused on photoacoustic and scattered light measurements on 1064-nm wavelength HfO{sub 2}/SiO{sub 2} multilayer mirror and polarizer coatings. Both the acoustic and scatter diagnostics have resolved 10 {mu}m diameter damage points in these coatings. Using a scanning stage, the scatter diagnostic can map both intrinsic and laser-induced scatter. Damage threshold measurements obtained using scatter diagnostics compare within experimental error with those measured using 100x Nomarski microscopy. Scatter signals measured during laser conditioning can be used to detect damage related to nodular defects.

  4. Feasibility of Self-Reflection as a Tool to Balance Clinical Reasoning Strategies

    ERIC Educational Resources Information Center

    Sibbald, Matthew; de Bruin, Anique B. H.

    2012-01-01

    Clinicians are believed to use two predominant reasoning strategies: system 1 based pattern recognition, and system 2 based analytical reasoning. Balancing these cognitive reasoning strategies is widely believed to reduce diagnostic error. However, clinicians approach different problems with different reasoning strategies. This study explores…

  5. Investigating the Link Between Radiologists Gaze, Diagnostic Decision, and Image Content

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tourassi, Georgia; Voisin, Sophie; Paquit, Vincent C

    2013-01-01

    Objective: To investigate machine learning for linking image content, human perception, cognition, and error in the diagnostic interpretation of mammograms. Methods: Gaze data and diagnostic decisions were collected from six radiologists who reviewed 20 screening mammograms while wearing a head-mounted eye-tracker. Texture analysis was performed in mammographic regions that attracted radiologists' attention and in all abnormal regions. Machine learning algorithms were investigated to develop predictive models that link: (i) image content with gaze, (ii) image content and gaze with cognition, and (iii) image content, gaze, and cognition with diagnostic error. Both group-based and individualized models were explored. Results: By pooling the data from all radiologists, machine learning produced highly accurate predictive models linking image content, gaze, cognition, and error. Merging radiologists' gaze metrics and cognitive opinions with computer-extracted image features identified 59% of the radiologists' diagnostic errors while confirming 96.2% of their correct diagnoses. The radiologists' individual errors could be adequately predicted by modeling the behavior of their peers. However, personalized tuning appears to be beneficial in many cases to capture individual behavior more accurately. Conclusions: Machine learning algorithms combining image features with radiologists' gaze data and diagnostic decisions can be effectively developed to recognize cognitive and perceptual errors associated with the diagnostic interpretation of mammograms.
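
    As a hedged sketch of the general modeling idea only, the snippet below trains a simple classifier on a concatenation of synthetic image features, gaze metrics, and reader opinions to predict an error label. The feature set, sizes, labels, and the choice of logistic regression are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative sketch: predict diagnostic error from combined image, gaze, and
# opinion features. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_regions = 300
image_features = rng.normal(size=(n_regions, 10))     # e.g. texture descriptors
gaze_metrics = rng.normal(size=(n_regions, 3))         # e.g. dwell time, fixations
reader_opinion = rng.integers(0, 2, (n_regions, 1))    # reader's call on the region

X = np.hstack([image_features, gaze_metrics, reader_opinion])
# Synthetic "error" label loosely driven by a couple of the features.
logits = 0.8 * gaze_metrics[:, 0] - 1.2 * reader_opinion[:, 0] + rng.normal(0, 1, n_regions)
y = (logits > 0).astype(int)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```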

  6. Defining and Measuring Diagnostic Uncertainty in Medicine: A Systematic Review.

    PubMed

    Bhise, Viraj; Rajan, Suja S; Sittig, Dean F; Morgan, Robert O; Chaudhary, Pooja; Singh, Hardeep

    2018-01-01

    Physicians routinely encounter diagnostic uncertainty in practice. Despite its impact on health care utilization, costs and error, measurement of diagnostic uncertainty is poorly understood. We conducted a systematic review to describe how diagnostic uncertainty is defined and measured in medical practice. We searched OVID Medline and PsycINFO databases from inception until May 2017 using a combination of keywords and Medical Subject Headings (MeSH). Additional search strategies included manual review of references identified in the primary search, use of a topic-specific database (AHRQ-PSNet) and expert input. We specifically focused on articles that (1) defined diagnostic uncertainty; (2) conceptualized diagnostic uncertainty in terms of its sources, complexity of its attributes or strategies for managing it; or (3) attempted to measure diagnostic uncertainty. We identified 123 articles for full review, none of which defined diagnostic uncertainty. Three attributes of diagnostic uncertainty were relevant for measurement: (1) it is a subjective perception experienced by the clinician; (2) it has the potential to impact diagnostic evaluation-for example, when inappropriately managed, it can lead to diagnostic delays; and (3) it is dynamic in nature, changing with time. Current methods for measuring diagnostic uncertainty in medical practice include: (1) asking clinicians about their perception of uncertainty (surveys and qualitative interviews), (2) evaluating the patient-clinician encounter (such as by reviews of medical records, transcripts of patient-clinician communication and observation), and (3) experimental techniques (patient vignette studies). The term "diagnostic uncertainty" lacks a clear definition, and there is no comprehensive framework for its measurement in medical practice. Based on review findings, we propose that diagnostic uncertainty be defined as a "subjective perception of an inability to provide an accurate explanation of the patient's health problem." Methodological advancements in measuring diagnostic uncertainty can improve our understanding of diagnostic decision-making and inform interventions to reduce diagnostic errors and overuse of health care resources.

  7. Cognitive balanced model: a conceptual scheme of diagnostic decision making.

    PubMed

    Lucchiari, Claudio; Pravettoni, Gabriella

    2012-02-01

    Diagnostic reasoning is a critical aspect of clinical performance, having a high impact on quality and safety of care. Although diagnosis is fundamental in medicine, we still have a poor understanding of the factors that determine its course. According to traditional understanding, all information used in diagnostic reasoning is objective and logically driven. However, these conditions are not always met. Although we would be less likely to make an inaccurate diagnosis when following rational decision making, as described by normative models, the real diagnostic process works in a different way. Recent work has described the major cognitive biases in medicine as well as a number of strategies for reducing them, collectively called debiasing techniques. However, advances have encountered obstacles in achieving implementation into clinical practice. While traditional understanding of clinical reasoning has failed to consider contextual factors, most debiasing techniques seem to fall short of producing sounder and safer medical practice. Technological solutions, being data driven, are fundamental in increasing care safety, but they need to consider human factors. Thus, balanced models, cognitive driven and technology based, are needed in day-to-day applications to actually improve the diagnostic process. The purpose of this article, then, is to provide insight into cognitive influences that have resulted in wrong, delayed or missed diagnosis. Using a cognitive approach, we describe the basis of medical error, with particular emphasis on diagnostic error. We then propose a conceptual scheme of the diagnostic process by the use of fuzzy cognitive maps. © 2011 Blackwell Publishing Ltd.

  8. Avoiding Misdiagnosis in Patients with Neurological Emergencies

    PubMed Central

    Pope, Jennifer V.; Edlow, Jonathan A.

    2012-01-01

    Approximately 5% of patients presenting to emergency departments have neurological symptoms. The most common symptoms or diagnoses include headache, dizziness, back pain, weakness, and seizure disorder. Little is known about the actual misdiagnosis of these patients, which can have disastrous consequences for both the patients and the physicians. This paper reviews the existing literature about the misdiagnosis of neurological emergencies and analyzes the reason behind the misdiagnosis by specific presenting complaint. Our goal is to help emergency physicians and other providers reduce diagnostic error, understand how these errors are made, and improve patient care. PMID:22888439

  9. Evaluation and optimization of sampling errors for the Monte Carlo Independent Column Approximation

    NASA Astrophysics Data System (ADS)

    Räisänen, Petri; Barker, W. Howard

    2004-07-01

    The Monte Carlo Independent Column Approximation (McICA) method for computing domain-average broadband radiative fluxes is unbiased with respect to the full ICA, but its flux estimates contain conditional random noise. McICA's sampling errors are evaluated here using a global climate model (GCM) dataset and a correlated-k distribution (CKD) radiation scheme. Two approaches to reduce McICA's sampling variance are discussed. The first is to simply restrict all of McICA's samples to cloudy regions. This avoids wasting precious few samples on essentially homogeneous clear skies. Clear-sky fluxes need to be computed separately for this approach, but this is usually done in GCMs for diagnostic purposes anyway. Second, accuracy can be improved by repeated sampling, and averaging those CKD terms with large cloud radiative effects. Although this naturally increases computational costs over the standard CKD model, random errors for fluxes and heating rates are reduced by typically 50% to 60%, for the present radiation code, when the total number of samples is increased by 50%. When both variance reduction techniques are applied simultaneously, globally averaged flux and heating rate random errors are reduced by a factor of ≈3.
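
    The toy sketch below mimics the second variance-reduction idea on invented numbers: in a McICA-like estimator, the spectral (CKD) terms with the largest cloud radiative effect are sampled several times and averaged, which lowers the random noise of the domain-average estimate. It is illustrative only and bears no relation to an actual GCM radiation code.

```python
# Toy McICA-like estimator: resampling high-variance spectral terms reduces noise.
# Column fluxes and term weights are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n_col, n_term = 100, 30
# Hypothetical per-column, per-term fluxes; a few terms vary strongly with cloud.
cloud_effect = np.concatenate([np.full(5, 50.0), np.full(n_term - 5, 5.0)])
flux = rng.normal(0.0, 1.0, (n_col, n_term)) * cloud_effect  # columns x terms
exact = flux.mean(axis=0).sum()            # full ICA: average over all sub-columns

def mcica_noise(samples_for_big_terms=1, n_trials=2000):
    big = cloud_effect > 20.0              # terms worth resampling
    errs = []
    for _ in range(n_trials):
        est = 0.0
        for k in range(n_term):
            m = samples_for_big_terms if big[k] else 1
            cols = rng.integers(0, n_col, m)   # one (or m) random sub-column(s)
            est += flux[cols, k].mean()
        errs.append(est - exact)
    return np.std(errs)

print(f"RMS noise, 1 sample per term:           {mcica_noise(1):.2f}")
print(f"RMS noise, 4 samples for cloudy terms:  {mcica_noise(4):.2f}")
```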

  10. Using voluntary reports from physicians to learn from diagnostic errors in emergency medicine.

    PubMed

    Okafor, Nnaemeka; Payne, Velma L; Chathampally, Yashwant; Miller, Sara; Doshi, Pratik; Singh, Hardeep

    2016-04-01

    Diagnostic errors are common in the emergency department (ED), but few studies have comprehensively evaluated their types and origins. We analysed incidents reported by ED physicians to determine disease conditions, contributory factors and patient harm associated with ED-related diagnostic errors. Between 1 March 2009 and 31 December 2013, ED physicians reported 509 incidents using a department-specific voluntary incident-reporting system that we implemented at two large academic hospital-affiliated EDs. For this study, we analysed 209 incidents related to diagnosis. A quality assurance team led by an ED physician champion reviewed each incident and interviewed physicians when necessary to confirm the presence/absence of diagnostic error and to determine the contributory factors. We generated descriptive statistics quantifying disease conditions involved, contributory factors and patient harm from errors. Among the 209 incidents, we identified 214 diagnostic errors associated with 65 unique diseases/conditions, including sepsis (9.6%), acute coronary syndrome (9.1%), fractures (8.6%) and vascular injuries (8.6%). Contributory factors included cognitive (n=317), system related (n=192) and non-remedial (n=106). Cognitive factors included faulty information verification (41.3%) and faulty information processing (30.6%) whereas system factors included high workload (34.4%) and inefficient ED processes (40.1%). Non-remediable factors included atypical presentation (31.3%) and the patients' inability to provide a history (31.3%). Most errors (75%) involved multiple factors. Major harm was associated with 34/209 (16.3%) of reported incidents. Most diagnostic errors in ED appeared to relate to common disease conditions. While sustaining diagnostic error reporting programmes might be challenging, our analysis reveals the potential value of such systems in identifying targets for improving patient safety in the ED. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  11. Dual processing and diagnostic errors.

    PubMed

    Norman, Geoff

    2009-09-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process, called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these remain available and retrievable individually. I then review studies of clinical reasoning based on these theories, and show that the two processes are equally effective; System 1, despite its reliance on idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions directed at encouraging the clinician to explicitly use both strategies can lead to a consistent reduction in error rates.

  12. Recognizing and Reducing Analytical Errors and Sources of Variation in Clinical Pathology Data in Safety Assessment Studies.

    PubMed

    Schultze, A E; Irizarry, A R

    2017-02-01

    Veterinary clinical pathologists are well positioned via education and training to assist in investigations of unexpected results or increased variation in clinical pathology data. Errors in testing and unexpected variability in clinical pathology data are sometimes referred to as "laboratory errors." These alterations may occur in the preanalytical, analytical, or postanalytical phases of studies. Most of the errors or variability in clinical pathology data occur in the preanalytical or postanalytical phases. True analytical errors occur within the laboratory and are usually the result of operator or instrument error. Analytical errors are often ≤10% of all errors in diagnostic testing, and the frequency of these types of errors has decreased in the last decade. Analytical errors and increased data variability may result from instrument malfunctions, inability to follow proper procedures, undetected failures in quality control, sample misidentification, and/or test interference. This article (1) illustrates several different types of analytical errors and situations within laboratories that may result in increased variability in data, (2) provides recommendations regarding prevention of testing errors and techniques to control variation, and (3) provides a list of references that describe and advise how to deal with increased data variability.

  13. Systematic errors in regional climate model RegCM over Europe and sensitivity to variations in PBL parameterizations

    NASA Astrophysics Data System (ADS)

    Güttler, I.

    2012-04-01

    Systematic errors in near-surface temperature (T2m), total cloud cover (CLD), shortwave albedo (ALB) and surface net longwave (SNL) and shortwave energy flux (SNS) are detected in simulations of RegCM on 50 km resolution over the European CORDEX domain when forced with ERA-Interim reanalysis. Simulated T2m is compared to CRU 3.0 and other variables to the GEWEX-SRB 3.0 dataset. Most of the systematic errors found in SNL and SNS are consistent with errors in T2m, CLD and ALB: they include prevailing negative errors in T2m and positive errors in CLD present during most of the year. Errors in T2m and CLD can be associated with the overestimation of SNL and SNS in most simulations. The impact of errors in albedo is primarily confined to north Africa, where, e.g., underestimation of albedo in JJA is consistent with associated surface heating and positive SNS and T2m errors. Sensitivity to the choice of the PBL scheme and various parameters in PBL schemes is examined from an ensemble of 20 simulations. The recently implemented prognostic PBL scheme performs with mixed success over Europe when compared to the standard diagnostic scheme, with a general increase of errors in T2m and CLD over all of the domain. Nevertheless, improvements in T2m can be found in e.g. north-eastern Europe during DJF and western Europe during JJA, where substantial warm biases existed in simulations with the diagnostic scheme. The most detectable impact, in terms of the JJA T2m errors over western Europe, comes from the variation in the formulation of the mixing length. In order to reduce the above errors, an update of the RegCM albedo values and further work in customizing the PBL scheme are suggested.

  14. Estimation of an accuracy index of a diagnostic biomarker when the reference biomarker is continuous and measured with error.

    PubMed

    Wu, Mixia; Zhang, Dianchen; Liu, Aiyi

    2016-01-01

    New biomarkers continue to be developed for the purpose of diagnosis, and their diagnostic performance is typically compared with that of an existing reference biomarker used for the same purpose. Considerable research has focused on receiver operating characteristic curve analysis when the reference biomarker is dichotomous. In the situation where the reference biomarker is measured on a continuous scale and dichotomization is not practically appealing, an index was proposed in the literature to measure the accuracy of a continuous biomarker, which is essentially a linear function of the popular Kendall's tau. We consider the issue of estimating such an accuracy index when the continuous reference biomarker is measured with error. We first investigate the impact of measurement errors on the accuracy index, and then propose methods to correct for the bias due to measurement errors. Simulation results show the effectiveness of the proposed estimator in reducing biases. The methods are exemplified with hemoglobin A1c measurements obtained from both a central lab and a local lab in a behavioral intervention study for families of youth with type 1 diabetes, in order to evaluate the accuracy of mean metered blood glucose data against the centrally measured hemoglobin A1c.
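
    As a minimal sketch of the idea, the snippet below computes Kendall's tau between a new biomarker and a continuous reference and maps it to a [0, 1] index via (tau + 1) / 2; this transform, the simulated data, and the added reference noise are assumptions for illustration, not the specific index or bias correction proposed in the article.

```python
# Sketch: a tau-based accuracy index is attenuated when the reference biomarker
# is measured with error. Data and the index transform are illustrative.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(3)
n = 500
reference = rng.normal(7.5, 1.0, n)                     # e.g. lab-measured HbA1c
new_marker = reference + rng.normal(0.0, 0.8, n)        # correlated new biomarker
reference_noisy = reference + rng.normal(0.0, 0.5, n)   # reference measured with error

def accuracy_index(x, ref):
    tau, _ = kendalltau(x, ref)
    return (tau + 1.0) / 2.0                             # maps tau in [-1, 1] to [0, 1]

print(f"Index against error-free reference: {accuracy_index(new_marker, reference):.3f}")
print(f"Index against noisy reference:      {accuracy_index(new_marker, reference_noisy):.3f}")
# The noisy reference attenuates the index, which motivates the bias correction.
```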

  15. Ametropia, retinal anatomy, and OCT abnormality patterns in glaucoma. 1. Impacts of refractive error and interartery angle

    NASA Astrophysics Data System (ADS)

    Elze, Tobias; Baniasadi, Neda; Jin, Qingying; Wang, Hui; Wang, Mengyu

    2017-12-01

    Retinal nerve fiber layer thickness (RNFLT) measured by optical coherence tomography (OCT) is widely used in clinical practice to support glaucoma diagnosis. Clinicians frequently interpret peripapillary RNFLT areas marked as abnormal by OCT machines. However, presently, clinical OCT machines do not take individual retinal anatomy variation into account, and corresponding diagnostic biases have been shown, particularly for patients with ametropia. The angle between the two major temporal retinal arteries (interartery angle, IAA) is considered a fundamental retinal ametropia marker. Here, we analyze peripapillary spectral domain OCT RNFLT scans of 691 glaucoma patients and apply multivariate logistic regression to quantitatively compare the diagnostic bias of spherical equivalent (SE) of refractive error and IAA and to identify the precise retinal locations of false-positive/negative abnormality marks. Independent of glaucoma severity (visual field mean deviation), IAA/SE variations biased abnormality marks on OCT RNFLT printouts at 36.7%/22.9% of the peripapillary area, respectively. 17.2% of the biases due to SE are not explained by IAA variation, particularly in inferonasal areas. To conclude, the inclusion of SE and IAA in OCT RNFLT norms would help to increase diagnostic accuracy. Our detailed location maps may help clinicians to reduce diagnostic bias while interpreting retinal OCT scans.

  16. Interceptive Beam Diagnostics - Signal Creation and Materials Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plum, Michael; Spallation Neutron Source, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN

    2004-11-10

    The focus of this tutorial will be on interceptive beam diagnostics such as wire scanners, screens, and harps. We will start with an overview of the various ways beams interact with materials to create signals useful for beam diagnostics systems. We will then discuss the errors in a harp or wire scanner profile measurement caused by errors in wire position, number of samples, and signal errors. Finally we will apply our results to two design examples: the SNS wire scanner system and the SNS target harp.

  17. Educational agenda for diagnostic error reduction

    PubMed Central

    Trowbridge, Robert L; Dhaliwal, Gurpreet; Cosby, Karen S

    2013-01-01

    Diagnostic errors are a major patient safety concern. Although the majority of diagnostic errors are partially attributable to cognitive mistakes, the most effective means of improving clinician cognition in order to achieve gains in diagnostic reliability are unclear. We propose a tripartite educational agenda for improving diagnostic performance among students, residents and practising physicians. This agenda includes strengthening the metacognitive abilities of clinicians, fostering intuitive reasoning and increasing awareness of the role of systems in the diagnostic process. The evidence supporting initiatives in each of these realms is reviewed and a course of future implementation and study is proposed. The barriers to designing and implementing this agenda are substantial and include limited evidence supporting these initiatives and the challenges of changing the practice patterns of practising physicians. Implementation will need to be accompanied by rigorous evaluation. PMID:23764435

  18. The role of MRI in musculoskeletal practice: a clinical perspective

    PubMed Central

    Dean Deyle, Gail

    2011-01-01

    This clinical perspective presents an overview of current and potential uses for magnetic resonance imaging (MRI) in musculoskeletal practice. Clinical practice guidelines and current evidence for improved outcomes will help providers determine the situations when an MRI is indicated. The advanced competency standard of examination used by physical therapists will be helpful to prevent overuse of musculoskeletal imaging, reduce diagnostic errors, and provide the appropriate clinical context to pathology revealed on MRI. Physical therapists are diagnostically accurate and appropriately conservative in their use of MRI consistent with evidence-based principles of diagnosis and screening. PMID:22851878

  19. Three-dimensional assessment of facial asymmetry: A systematic review.

    PubMed

    Akhil, Gopi; Senthil Kumar, Kullampalayam Palanisamy; Raja, Subramani; Janardhanan, Kumaresan

    2015-08-01

    For patients with facial asymmetry, a complete and precise diagnosis and surgical treatment to correct the underlying cause of the asymmetry are essential. Conventional diagnostic radiographs (submento-vertex projections, posteroanterior radiography) have limitations in asymmetry diagnosis because they are two-dimensional assessments of three-dimensional (3D) structures. The advent of 3D imaging has greatly reduced the magnification and projection errors common in conventional radiographs, making it a precise diagnostic aid for the assessment of facial asymmetry. Thus, this article reviews the newly introduced 3D tools for the diagnosis of more complex facial asymmetries.

  20. Dynamic diagnostics of the error fields in tokamaks

    NASA Astrophysics Data System (ADS)

    Pustovitov, V. D.

    2007-07-01

    Error field diagnostics based on magnetic measurements outside the plasma are discussed. The analysed methods rely on measuring the plasma's dynamic response to finite-amplitude external magnetic perturbations, namely the error fields and pre-programmed probing pulses. Such pulses can be created by the coils designed for static error field correction and for stabilization of resistive wall modes, a technique developed and applied in several tokamaks, including DIII-D and JET. Here, the analysis is based on theoretical predictions for resonant field amplification (RFA). To achieve the desired level of error field correction in tokamaks, the diagnostics must be sensitive to signals of several Gauss. Therefore, part of the measurements should be performed near the plasma stability boundary, where the RFA effect is stronger. While proximity to marginal stability is important, the absolute values of the plasma parameters are not. This means that the necessary measurements can be done in diagnostic discharges with parameters below the nominal operating regimes, with the stability boundary intentionally lowered. Estimates for ITER are presented. The discussed diagnostics can be tested in dedicated experiments in existing tokamaks and can be considered an extension of the 'active MHD spectroscopy' used recently in the DIII-D tokamak and the EXTRAP T2R reversed field pinch.

  1. Looking for trouble? Diagnostics expanding disease and producing patients.

    PubMed

    Hofmann, Bjørn

    2018-05-23

    Novel tests give great opportunities for earlier and more precise diagnostics. At the same time, new tests expand disease, produce patients, and cause unnecessary harm through overdiagnosis and overtreatment. How can we evaluate diagnostics to obtain the benefits and avoid the harm? One way is to pay close attention to the diagnostic process and its core concepts. Doing so reveals 3 errors that expand disease and increase overdiagnosis. The first error is to decouple diagnostics from harm, e.g., by diagnosing insignificant conditions. The second error is to bypass proper validation of the relationship between test indicator and disease, e.g., by introducing biomarkers for Alzheimer's disease before the tests are properly validated. The third error is to couple the name of a disease to insignificant or indecisive indicators, e.g., by lending the cancer name to preconditions, such as ductal carcinoma in situ. We need to avoid these errors to promote beneficial testing, bar harmful diagnostics, and evade unwarranted expansion of disease. Accordingly, we must stop identifying and testing for conditions that are only remotely associated with harm. We need more stringent verification of tests, and we must avoid naming indicators and indicative conditions after diseases. If not, we will end like ancient tragic heroes, succumbing because of our very best abilities. © 2018 John Wiley & Sons, Ltd.

  2. Toward diagnostic and phenotype markers for genetically transmitted speech delay.

    PubMed

    Shriberg, Lawrence D; Lewis, Barbara A; Tomblin, J Bruce; McSweeny, Jane L; Karlsson, Heather B; Scheer, Alison R

    2005-08-01

    Converging evidence supports the hypothesis that the most common subtype of childhood speech sound disorder (SSD) of currently unknown origin is genetically transmitted. We report the first findings toward a set of diagnostic markers to differentiate this proposed etiological subtype (provisionally termed speech delay-genetic) from other proposed subtypes of SSD of unknown origin. Conversational speech samples from 72 preschool children with speech delay of unknown origin from 3 research centers were selected from an audio archive. Participants differed on the number of biological nuclear family members (0 or 2+) classified as positive for current and/or prior speech-language disorder. Although participants in the 2 groups were found to have similar speech competence, as indexed by their Percentage of Consonants Correct scores, their speech error patterns differed significantly in 3 ways. Compared with children who may have reduced genetic load for speech delay (no affected nuclear family members), children with possibly higher genetic load (2+ affected members) had (a) a significantly higher proportion of relative omission errors on the Late-8 consonants; (b) a significantly lower proportion of relative distortion errors on these consonants, particularly on the sibilant fricatives /s/, /z/, and //; and (c) a significantly lower proportion of backed /s/ distortions, as assessed by both perceptual and acoustic methods. Machine learning routines identified a 3-part classification rule that included differential weightings of these variables. The classification rule had a diagnostic accuracy of 0.83 (95% confidence limits = 0.74-0.92), with positive and negative likelihood ratios of 9.6 (95% confidence limits = 3.1-29.9) and 0.40 (95% confidence limits = 0.24-0.68), respectively. The diagnostic accuracy findings are viewed as promising. The error pattern for this proposed subtype of SSD is viewed as consistent with the cognitive-linguistic processing deficits that have been reported for genetically transmitted verbal disorders.

  3. Error Consistency in Acquired Apraxia of Speech with Aphasia: Effects of the Analysis Unit

    ERIC Educational Resources Information Center

    Haley, Katarina L.; Cunningham, Kevin T.; Eaton, Catherine Torrington; Jacks, Adam

    2018-01-01

    Purpose: Diagnostic recommendations for acquired apraxia of speech (AOS) have been contradictory concerning whether speech sound errors are consistent or variable. Studies have reported divergent findings that, on face value, could argue either for or against error consistency as a diagnostic criterion. The purpose of this study was to explain…

  4. [Cognitive errors in diagnostic decision making].

    PubMed

    Gäbler, Martin

    2017-10-01

    Approximately 10-15% of our diagnostic decisions are faulty and may lead to unfavorable and dangerous outcomes that could be avoided. These diagnostic errors are mainly caused by cognitive biases in the diagnostic reasoning process. Our medical diagnostic decision making is based on intuitive "System 1" and analytical "System 2" reasoning and can be derailed by unconscious cognitive biases. These deviations can be positively influenced at both a systemic and an individual level. For the individual, metacognition (an internal withdrawal from the decision-making process) and debiasing strategies, such as verification, falsification and ruling out worst-case scenarios, can lead to improved diagnostic decision making.

  5. A Combined Fabrication and Instrumentation Platform for Sample Preparation.

    PubMed

    Guckenberger, David J; Thomas, Peter C; Rothbauer, Jacob; LaVanway, Alex J; Anderson, Meghan; Gilson, Dan; Fawcett, Kevin; Berto, Tristan; Barrett, Kevin; Beebe, David J; Berry, Scott M

    2014-06-01

    While potentially powerful, access to molecular diagnostics is substantially limited in the developing world. Here we present an approach to reduced cost molecular diagnostic instrumentation that has the potential to empower developing world communities by reducing costs through streamlining the sample preparation process. In addition, this instrument is capable of producing its own consumable devices on demand, reducing reliance on assay suppliers. Furthermore, this instrument is designed with an "open" architecture, allowing users to visually observe the assay process and make modifications as necessary (as opposed to traditional "black box" systems). This open environment enables integration of microfluidic fabrication and viral RNA purification onto an easy-to-use modular system via the use of interchangeable trays. Here we employ this system to develop a protocol to fabricate microfluidic devices and then use these devices to isolate viral RNA from serum for the measurement of human immunodeficiency virus (HIV) viral load. Results obtained from this method show significantly reduced error compared with similar nonautomated sample preparation processes. © 2014 Society for Laboratory Automation and Screening.

  6. [Medical errors: inevitable but preventable].

    PubMed

    Giard, R W

    2001-10-27

    Medical errors are increasingly reported in the lay press. Studies have shown dramatic error rates of 10 percent or even higher. From a methodological point of view, studying the frequency and causes of medical errors is far from simple. Clinical decisions on diagnostic or therapeutic interventions are always taken within a clinical context. Reviewing outcomes of interventions without taking into account both the intentions and the arguments for a particular action will limit the conclusions of a study on the rate and preventability of errors. The interpretation of the preventability of medical errors is fraught with difficulties and probably highly subjective. Blaming the doctor personally does not do justice to the actual situation, and especially not to the organisational framework. Attention to and improvement of the organisational aspects of error are far more important than litigating against the person. To err is and will remain human, and if we want to reduce the incidence of faults we must be able to learn from our mistakes. That requires an open attitude towards medical mistakes, a continuous effort in their detection, a sound analysis and, where feasible, the institution of preventive measures.

  7. Error-Monitoring in Response to Social Stimuli in Individuals with Higher-Functioning Autism Spectrum Disorder

    PubMed Central

    McMahon, Camilla M.; Henderson, Heather A.

    2014-01-01

    Error-monitoring, or the ability to recognize one's mistakes and implement behavioral changes to prevent further mistakes, may be impaired in individuals with Autism Spectrum Disorder (ASD). Children and adolescents (ages 9-19) with ASD (n = 42) and typical development (n = 42) completed two face processing tasks that required discrimination of either the gender or affect of standardized face stimuli. Post-error slowing and the difference in Error-Related Negativity amplitude between correct and incorrect responses (ERNdiff) were used to index error-monitoring ability. Overall, ERNdiff increased with age. On the Gender Task, individuals with ASD had a smaller ERNdiff than individuals with typical development; however, on the Affect Task, there were no significant diagnostic group differences on ERNdiff. Individuals with ASD may have ERN amplitudes similar to those observed in individuals with typical development in more social contexts compared to less social contexts due to greater consequences for errors, more effortful processing, and/or reduced processing efficiency in these contexts. Across all participants, more post-error slowing on the Affect Task was associated with better social cognitive skills. PMID:25066088

  8. Influence of perceived difficulty of cases on student osteopaths' diagnostic reasoning: a cross sectional study.

    PubMed

    Noyer, Aurelien L; Esteves, Jorge E; Thomson, Oliver P

    2017-01-01

    Diagnostic reasoning refers to the cognitive processes by which clinicians formulate diagnoses. Despite the implications for patient safety and professional identity, research on diagnostic reasoning in osteopathy remains largely theoretical. The aim of this study was to investigate the influence of perceived task difficulty on the diagnostic reasoning of student osteopaths. Using a single-blinded, cross-sectional design, sixteen final-year pre-registration osteopathy students diagnosed two standardized cases under two context conditions (complex versus control). Context difficulty was manipulated verbally, and case order was randomized and counterbalanced across subjects to ensure that each case was diagnosed evenly under both conditions (i.e. half of the subjects performed either case A or B first). After diagnosis, participants were presented with items (literal, inferred and filler) designed to represent analytical and non-analytical reasoning. Response time and error rate for each item were measured. A repeated measures analysis of variance (concept type × context) was performed to identify differences across conditions and make inferences about diagnostic reasoning. Participants made significantly more errors when judging literal concepts and took significantly less time to recognize filler concepts in the complex context. No significant difference in the ability to judge inferred concepts across contexts was found. Although speculative and preliminary, our findings suggest that the perception of complexity led to an increased reliance on analytical reasoning to the detriment of non-analytical reasoning. To reduce the associated cognitive load, osteopathic educational institutions could consider developing the intuitive diagnostic capabilities of pre-registration students. Postgraduate mentorship opportunities could be considered to enhance the diagnostic reasoning of professional osteopaths, particularly recent graduates. Further research exploring the influence of expertise is required to enhance the validity of this study.

  9. Feasibility of streamlining an interactive Bayesian-based diagnostic support tool designed for clinical practice

    NASA Astrophysics Data System (ADS)

    Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa

    2016-03-01

    In radiology, diagnostic errors occur either through failure of detection or through incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigation. In this work, we focus on reducing incorrect interpretation of known imaging features. The existing literature categorizes the cognitive biases that lead a radiologist to an incorrect diagnosis despite correctly recognizing the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process: they modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are shown on the screen, serving both as a checklist and as the input mechanism. As the radiologist enters the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions that closely couple the complex mathematics of conditional probability in Bayesian networks with practice.
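
    The core updating step of such a tool can be sketched as a sequential Bayesian update under a conditional-independence (naive Bayes) assumption; the diagnoses, imaging features, and probabilities below are invented for illustration and are not taken from the authors' networks.

    ```python
    # Toy sketch of sequential Bayesian updating of diagnosis probabilities as imaging
    # features are entered; all numbers are hypothetical.
    import numpy as np

    diagnoses = ["toxic/metabolic", "neurodegenerative", "vascular"]
    posterior = np.array([0.2, 0.3, 0.5])        # pre-test probabilities (hypothetical)

    # P(feature present | diagnosis), hypothetical values
    likelihood = {
        "T1 hyperintensity":     np.array([0.80, 0.10, 0.20]),
        "diffusion restriction": np.array([0.10, 0.15, 0.70]),
    }

    for feature, present in [("T1 hyperintensity", True), ("diffusion restriction", False)]:
        p = likelihood[feature] if present else 1.0 - likelihood[feature]
        posterior = posterior * p
        posterior = posterior / posterior.sum()  # renormalize after each feature
        print(f"{feature} {'present' if present else 'absent'} ->",
              dict(zip(diagnoses, posterior.round(3))))
    ```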

  10. Imperfect Gold Standards for Kidney Injury Biomarker Evaluation

    PubMed Central

    Betensky, Rebecca A.; Emerson, Sarah C.; Bonventre, Joseph V.

    2012-01-01

    Clinicians have used serum creatinine in diagnostic testing for acute kidney injury for decades, despite its imperfect sensitivity and specificity. Novel tubular injury biomarkers may revolutionize the diagnosis of acute kidney injury; however, even if a novel tubular injury biomarker is 100% sensitive and 100% specific, it may appear inaccurate when using serum creatinine as the gold standard. Acute kidney injury, as defined by serum creatinine, may not reflect tubular injury, and the absence of changes in serum creatinine does not assure the absence of tubular injury. In general, the apparent diagnostic performance of a biomarker depends not only on its ability to detect injury, but also on disease prevalence and the sensitivity and specificity of the imperfect gold standard. Assuming that, at a certain cutoff value, serum creatinine is 80% sensitive and 90% specific and disease prevalence is 10%, a new perfect biomarker with a true 100% sensitivity may seem to have only 47% sensitivity compared with serum creatinine as the gold standard. Minimizing misclassification by using more strict criteria to diagnose acute kidney injury will reduce the error when evaluating the performance of a biomarker under investigation. Apparent diagnostic errors using a new biomarker may be a reflection of errors in the imperfect gold standard itself, rather than poor performance of the biomarker. The results of this study suggest that small changes in serum creatinine alone should not be used to define acute kidney injury in biomarker or interventional studies. PMID:22021710
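
    The 47% figure in the abstract can be reproduced with a few lines of arithmetic: among reference-positive patients, only those with true disease test positive on a perfect new biomarker.

    ```python
    # Reproducing the worked example: 10% prevalence, reference test 80% sensitive / 90% specific.
    prevalence = 0.10
    ref_sens, ref_spec = 0.80, 0.90

    ref_pos_diseased = prevalence * ref_sens             # truly diseased, reference-positive (0.08)
    ref_pos_healthy = (1 - prevalence) * (1 - ref_spec)  # healthy but reference-positive (0.09)

    # A perfect biomarker is positive only in the truly diseased subset.
    apparent_sensitivity = ref_pos_diseased / (ref_pos_diseased + ref_pos_healthy)
    print(f"apparent sensitivity of a perfect biomarker: {apparent_sensitivity:.0%}")  # ~47%
    ```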

  11. Periodic Application of Concurrent Error Detection in Processor Array Architectures. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chen, Paul Peichuan

    1993-01-01

    Processor arrays can provide an attractive architecture for some applications. Featuring modularity, regular interconnection and high parallelism, such arrays are well-suited for VLSI/WSI implementations, and applications with high computational requirements, such as real-time signal processing. Preserving the integrity of results can be of paramount importance for certain applications. In these cases, fault tolerance should be used to ensure reliable delivery of a system's service. One aspect of fault tolerance is the detection of errors caused by faults. Concurrent error detection (CED) techniques offer the advantage that transient and intermittent faults may be detected with greater probability than with off-line diagnostic tests. Applying time-redundant CED techniques can reduce hardware redundancy costs. However, most time-redundant CED techniques degrade a system's performance.

  12. Enhancements to the timing of the OMEGA laser system to improve illumination uniformity

    NASA Astrophysics Data System (ADS)

    Donaldson, W. R.; Katz, J.; Kosc, T. Z.; Kelly, J. H.; Hill, E. M.; Bahr, R. E.

    2016-09-01

    Two diagnostics have been developed to improve the uniformity on the OMEGA Laser System, which is used for inertial confinement fusion (ICF) research. The first diagnostic measures the phase of an optical modulator (used for the spectral dispersion technique employed on OMEGA to enhance spatial smoothing), which adds bandwidth to the optical pulse. Setting this phase precisely is required to reduce pointing errors. The second diagnostic ensures that the arrival times of all the beams are synchronized. The arrival of each of the 60 OMEGA beams is measured by placing a 1-mm diffusing sphere at target chamber center. By comparing the arrival time of each beam with respect to a reference pulse, the measured timing spread of the OMEGA Laser System is now 3.8 ps.

  13. Modelling the influence of noise of the image sensor for blood cells recognition in computer microscopy

    NASA Astrophysics Data System (ADS)

    Nikitaev, V. G.; Nagornov, O. V.; Pronichev, A. N.; Polyakov, E. V.; Dmitrieva, V. V.

    2017-12-01

    The first stage in the diagnosis of blood cancer is the analysis of blood smears. Decision-support systems could reduce the subjectivity of the diagnostic process and help avoid errors that can result in often irreversible changes in the patient's condition. Solving this problem therefore requires modern technology. Texture features are one of the tools used for automated classification of blood cells, and identifying which of them are informative is a promising task. This paper uses mathematical modelling to investigate the effect of image sensor noise on informative texture features.
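
    A minimal sketch of the kind of experiment the abstract describes (not the authors' model): computing two common grey-level co-occurrence matrix (GLCM) texture features on a synthetic image before and after adding Gaussian sensor noise. It assumes scikit-image ≥ 0.19 for the graycomatrix/graycoprops names; the texture and noise level are arbitrary.

    ```python
    # Illustration only: synthetic texture, arbitrary noise level.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(2)
    clean = (np.indices((64, 64)).sum(axis=0) % 32 * 8).astype(np.uint8)  # synthetic texture
    noisy = np.clip(clean + rng.normal(0, 15, clean.shape), 0, 255).astype(np.uint8)

    def texture_features(img):
        glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                            symmetric=True, normed=True)
        return {prop: float(graycoprops(glcm, prop)[0, 0])
                for prop in ("contrast", "homogeneity")}

    print("clean:", texture_features(clean))
    print("noisy:", texture_features(noisy))
    ```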

  14. Sharing Vital Signs between mobile phone applications.

    PubMed

    Karlen, Walter; Dumont, Guy A; Scheffer, Cornie

    2014-01-01

    We propose a communication library, ShareVitalSigns, for the standardized exchange of vital sign information between health applications running on mobile platforms. The library allows an application to request one or multiple vital signs from independent measurement applications on the Android OS. Compatible measurement applications are automatically detected and can be launched from within the requesting application, simplifying the work flow for the user and reducing typing errors. Data is shared between applications using intents, a passive data structure available on Android OS. The library is accompanied by a test application which serves as a demonstrator. The secure exchange of vital sign information using a standardized library like ShareVitalSigns will facilitate the integration of measurement applications into diagnostic and other high level health monitoring applications and reduce errors due to manual entry of information.

  15. Benefit-risk Evaluation for Diagnostics: A Framework (BED-FRAME).

    PubMed

    Evans, Scott R; Pennello, Gene; Pantoja-Galicia, Norberto; Jiang, Hongyu; Hujer, Andrea M; Hujer, Kristine M; Manca, Claudia; Hill, Carol; Jacobs, Michael R; Chen, Liang; Patel, Robin; Kreiswirth, Barry N; Bonomo, Robert A

    2016-09-15

    The medical community needs systematic and pragmatic approaches for evaluating the benefit-risk trade-offs of diagnostics that assist in medical decision making. Benefit-Risk Evaluation of Diagnostics: A Framework (BED-FRAME) is a strategy for pragmatic evaluation of diagnostics designed to supplement traditional approaches. BED-FRAME evaluates diagnostic yield and addresses 2 key issues: (1) that diagnostic yield depends on prevalence, and (2) that different diagnostic errors carry different clinical consequences. As such, evaluating and comparing diagnostics depends on prevalence and the relative importance of potential errors. BED-FRAME provides a tool for communicating the expected clinical impact of diagnostic application and the expected trade-offs of diagnostic alternatives. BED-FRAME is a useful fundamental supplement to the standard analysis of diagnostic studies that will aid in clinical decision making. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
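
    The two central points, that diagnostic yield depends on prevalence and that different errors carry different consequences, can be illustrated with a short calculation; the sensitivity, specificity, and error weights below are arbitrary placeholders.

    ```python
    # Sketch: predictive values vary with prevalence; weighting a missed case more heavily
    # than a false alarm changes the expected per-patient "cost" of a test.
    sens, spec = 0.95, 0.90
    weight_fn, weight_fp = 5.0, 1.0   # e.g. a missed case judged 5x worse than a false alarm

    for prevalence in (0.01, 0.10, 0.30):
        ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
        npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
        cost = weight_fn * (1 - sens) * prevalence + weight_fp * (1 - spec) * (1 - prevalence)
        print(f"prevalence {prevalence:.0%}: PPV {ppv:.2f}, NPV {npv:.2f}, "
              f"weighted errors per patient {cost:.3f}")
    ```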

  16. Influence of ECG measurement accuracy on ECG diagnostic statements.

    PubMed

    Zywietz, C; Celikag, D; Joseph, G

    1996-01-01

    Computer analysis of electrocardiograms (ECGs) provides a large amount of ECG measurement data, which may be used for diagnostic classification and storage in ECG databases. Until now, error limits for ECG measurements have not been specified, nor has their influence on diagnostic statements been systematically investigated. An analytical method is presented to estimate the influence of measurement errors on the accuracy of diagnostic ECG statements. Systematic (offset) errors will usually result in an increase of false positive or false negative statements, since they cause a shift of the working point on the receiver operating characteristic curve. Measurement error dispersion broadens the distribution functions of discriminative measurement parameters and therefore usually increases the overlap between them. This results in a flattening of the receiver operating characteristic curve and an increase in false positive and false negative classifications. The method was applied to ECG conduction defect diagnoses using the interval measurement tolerance limits proposed by the International Electrotechnical Commission. These limits appear too large, because more than 30% of false positive atrial conduction defect statements and 10-18% of false intraventricular conduction defect statements could be expected due to the tolerated measurement errors. To assure the long-term usability of ECG measurement databases, it is recommended that systems provide their error tolerance limits obtained on a defined test set.
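
    The distinction between offset and dispersion errors can be illustrated with a small simulation: a fixed offset shifts the working point at a given decision threshold, while extra dispersion lowers the area under the ROC curve. The "QRS duration" values below are arbitrary and only illustrate the mechanism, not the Commission's tolerance limits.

    ```python
    # Simulation sketch: effect of systematic (offset) vs. random (dispersion) measurement
    # error on a threshold-based ECG classification.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 2000
    normal = rng.normal(95, 8, n)     # e.g. QRS duration (ms), non-diseased
    disease = rng.normal(115, 10, n)  # conduction defect

    def auc(neg, pos):
        """Probability that a diseased measurement exceeds a non-diseased one (rank AUC)."""
        return (pos[:, None] > neg[None, :]).mean()

    threshold = 110.0
    for label, offset, extra_sd in [("no error", 0.0, 0.0),
                                    ("offset +5 ms", 5.0, 0.0),
                                    ("dispersion +10 ms", 0.0, 10.0)]:
        meas_neg = normal + offset + rng.normal(0, extra_sd, n)
        meas_pos = disease + offset + rng.normal(0, extra_sd, n)
        fpr = (meas_neg >= threshold).mean()
        fnr = (meas_pos < threshold).mean()
        print(f"{label}: FPR {fpr:.2f}, FNR {fnr:.2f}, AUC {auc(meas_neg, meas_pos):.3f}")
    ```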

  17. Multiparameter measurement utilizing poloidal polarimeter for burning plasma reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imazawa, Ryota; Kawano, Yasunori; Itami, Kiyoshi

    2014-08-21

    The authors have carried out basic and applied research on the polarimeter for plasma diagnostics. Recently, the authors proposed an application of multiparameter measurement (magnetic field B, electron density n_e, electron temperature T_e, and total plasma current I_p) utilizing a polarimeter in future fusion reactors. In this proceedings, a brief review of the polarimeter, the principle of the multiparameter measurement and the progress of the research on the multiparameter measurement are explained. The proposed measurement method is suitable for a reactor for the following reasons: multiple parameters can be obtained from a small number of diagnostics, the method does not depend on time history, and the far-infrared light utilized by the polarimeter is less sensitive to degradation of the optical components. Taking the measurement error into account, a performance assessment of the proposed method was carried out. Assuming that the errors Δθ and Δε were 0.1° and 0.6°, respectively, the errors of the reconstructed j_φ, n_e and T_e were 12%, 8.4% and 31%, respectively. This study has shown that the reconstruction error can be decreased by increasing the number of wavelengths of the probing laser and by increasing the number of viewing chords. For example, by increasing the number of viewing chords to forty-five, the errors of j_φ, n_e and T_e were reduced to 4.4%, 4.4%, and 17%, respectively.

  18. Patient Safety in the Context of Neonatal Intensive Care: Research and Educational Opportunities

    PubMed Central

    Raju, Tonse N. K.; Suresh, Gautham; Higgins, Rosemary D.

    2012-01-01

    Case reports and observational studies continue to report adverse events from medical errors. However, despite considerable attention to patient safety in the popular media, this topic is not a regular component of medical education, and much research needs to be carried out to understand the causes, consequences, and prevention of healthcare-related adverse events during neonatal intensive care. To address the knowledge gaps and to formulate a research and educational agenda in neonatology, the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) invited a panel of experts to a workshop in August 2010. Patient safety issues discussed were: the reasons for errors, including systems design, working conditions, and worker fatigue; a need to develop a “culture” of patient safety; the role of electronic medical records, information technology, and simulators in reducing errors; error disclosure practices; medico-legal concerns; and educational needs. Specific neonatology-related topics discussed were: errors during resuscitation, mechanical ventilation, and performance of invasive procedures; medication errors including those associated with milk feedings; diagnostic errors; and misidentification of patients. This article provides an executive summary of the workshop. PMID:21386749

  19. Medical errors in primary care clinics – a cross sectional study

    PubMed Central

    2012-01-01

    Background Patient safety is vital in patient care. There is a lack of studies on medical errors in primary care settings. The aim of the study is to determine the extent of diagnostic inaccuracies and management errors in publicly funded primary care clinics. Methods This was a cross-sectional study conducted in twelve publicly funded primary care clinics in Malaysia. A total of 1753 medical records were randomly selected in 12 primary care clinics in 2007 and were reviewed by trained family physicians for diagnostic, management and documentation errors, potential errors causing serious harm and likelihood of preventability of such errors. Results The majority of patient encounters (81%) were with medical assistants. Diagnostic errors were present in 3.6% (95% CI: 2.2, 5.0) of medical records and management errors in 53.2% (95% CI: 46.3, 60.2). For management errors, medication errors were present in 41.1% (95% CI: 35.8, 46.4) of records, investigation errors in 21.7% (95% CI: 16.5, 26.8) and decision making errors in 14.5% (95% CI: 10.8, 18.2). A total of 39.9% (95% CI: 33.1, 46.7) of these errors had the potential to cause serious harm. Problems of documentation including illegible handwriting were found in 98.0% (95% CI: 97.0, 99.1) of records. Nearly all errors (93.5%) detected were considered preventable. Conclusions The occurrence of medical errors was high in primary care clinics, particularly documentation and medication errors. Nearly all were preventable. Remedial interventions addressing completeness of documentation and prescriptions are likely to reduce errors. PMID:23267547

  20. Speech Characteristics and Intelligibility in Adults with Mild and Moderate Intellectual Disabilities

    PubMed Central

    Coppens-Hofman, Marjolein C.; Terband, Hayo; Snik, Ad F.M.; Maassen, Ben A.M.

    2017-01-01

    Purpose Adults with intellectual disabilities (ID) often show reduced speech intelligibility, which affects their social interaction skills. This study aims to establish the main predictors of this reduced intelligibility in order to ultimately optimise management. Method Spontaneous speech and picture naming tasks were recorded in 36 adults with mild or moderate ID. Twenty-five naïve listeners rated the intelligibility of the spontaneous speech samples. Performance on the picture-naming task was analysed by means of a phonological error analysis based on expert transcriptions. Results The transcription analyses showed that the phonemic and syllabic inventories of the speakers were complete. However, multiple errors at the phonemic and syllabic level were found. The frequencies of specific types of errors were related to intelligibility and quality ratings. Conclusions The development of the phonemic and syllabic repertoire appears to be completed in adults with mild-to-moderate ID. The charted speech difficulties can be interpreted to indicate speech motor control and planning difficulties. These findings may aid the development of diagnostic tests and speech therapies aimed at improving speech intelligibility in this specific group. PMID:28118637

  1. Draft versus finished sequence data for DNA and protein diagnostic signature development

    PubMed Central

    Gardner, Shea N.; Lam, Marisa W.; Smith, Jason R.; Torres, Clinton L.; Slezak, Tom R.

    2005-01-01

    Sequencing pathogen genomes is costly, demanding careful allocation of limited sequencing resources. We built a computational Sequencing Analysis Pipeline (SAP) to guide decisions regarding the amount of genomic sequencing necessary to develop high-quality diagnostic DNA and protein signatures. SAP uses simulations to estimate the number of target genomes and close phylogenetic relatives (near neighbors or NNs) to sequence. We use SAP to assess whether draft data are sufficient or finished sequencing is required using Marburg and variola virus sequences. Simulations indicate that intermediate to high-quality draft with error rates of 10−3–10−5 (∼8× coverage) of target organisms is suitable for DNA signature prediction. Low-quality draft with error rates of ∼1% (3× to 6× coverage) of target isolates is inadequate for DNA signature prediction, although low-quality draft of NNs is sufficient, as long as the target genomes are of high quality. For protein signature prediction, sequencing errors in target genomes substantially reduce the detection of amino acid sequence conservation, even if the draft is of high quality. In summary, high-quality draft of target and low-quality draft of NNs appears to be a cost-effective investment for DNA signature prediction, but may lead to underestimation of predicted protein signatures. PMID:16243783

  2. Minimising human error in malaria rapid diagnosis: clarity of written instructions and health worker performance.

    PubMed

    Rennie, Waverly; Phetsouvanh, Rattanaxay; Lupisan, Socorro; Vanisaveth, Viengsay; Hongvanthong, Bouasy; Phompida, Samlane; Alday, Portia; Fulache, Mila; Lumagui, Richard; Jorgensen, Pernille; Bell, David; Harvey, Steven

    2007-01-01

    The usefulness of rapid diagnostic tests (RDT) in malaria case management depends on the accuracy of the diagnoses they provide. Despite their apparent simplicity, previous studies indicate that RDT accuracy is highly user-dependent. As malaria RDTs will frequently be used in remote areas with little supervision or support, minimising mistakes is crucial. This paper describes the development of new instructions (job aids) to improve health worker performance, based on observations of common errors made by remote health workers and villagers in preparing and interpreting RDTs, in the Philippines and Laos. Initial preparation using the instructions provided by the manufacturer was poor, but improved significantly with the job aids (e.g. correct use both of the dipstick and cassette increased in the Philippines by 17%). However, mistakes in preparation remained commonplace, especially for dipstick RDTs, as did mistakes in interpretation of results. A short orientation on correct use and interpretation further improved accuracy, from 70% to 80%. The results indicate that apparently simple diagnostic tests can be poorly performed and interpreted, but provision of clear, simple instructions can reduce these errors. Preparation of appropriate instructions and training as well as monitoring of user behaviour are an essential part of rapid test implementation.

  3. Discrepancies in reporting the CAG repeat lengths for Huntington's disease

    PubMed Central

    Quarrell, Oliver W; Handley, Olivia; O'Donovan, Kirsty; Dumoulin, Christine; Ramos-Arroyo, Maria; Biunno, Ida; Bauer, Peter; Kline, Margaret; Landwehrmeyer, G Bernhard

    2012-01-01

    Huntington's disease results from a CAG repeat expansion within the Huntingtin gene; this is measured routinely in diagnostic laboratories. The European Huntington's Disease Network REGISTRY project centrally measures CAG repeat lengths on fresh samples; these were compared with the original results from 121 laboratories across 15 countries. We report on 1326 duplicate results; a discrepancy in reporting the upper allele occurred in 51% of cases, which fell to 13.3% and 9.7% when we applied the acceptable measurement errors proposed by the American College of Medical Genetics and the Draft European Best Practice Guidelines, respectively. Duplicate results were available for 1250 lower alleles; discrepancies occurred in 40% of cases. Clinically significant discrepancies occurred in 4.0% of cases, with a potential unexplained misdiagnosis rate of 0.3%. There was considerable variation in the discrepancy rate among 10 of the countries participating in this study. Out of 1326 samples, 348 were re-analysed by an accredited diagnostic laboratory based in Germany, with concordance rates of 93% and 94% for the upper and lower alleles, respectively. This became 100% when the acceptable measurement errors were applied. The central laboratory correctly reported allele sizes for six standard reference samples, blind to the known results. Our study differs from external quality assessment (EQA) schemes in that these are duplicate results obtained from a large sample of patients across the whole diagnostic range. We strongly recommend that laboratories state an error rate for their measurements on the report, participate in EQA schemes and regularly use reference materials to adjust their own internal standards. PMID:21811303

  4. Philosophy of science and the diagnostic process.

    PubMed

    Willis, Brian H; Beebee, Helen; Lasserson, Daniel S

    2013-10-01

    This is an overview of the principles that underpin the philosophy of science and of how they may provide a framework for the diagnostic process. Although philosophy dates back to antiquity, it is only more recently that philosophers have begun to articulate the scientific method. Since Aristotle formulated deduction, other modes of reasoning have emerged, including induction, inference to the best explanation, falsificationism, theory-laden observation and Bayesian inference. Thus, rather than representing a single overriding dogma, the scientific method is a toolkit of ideas and principles of reasoning. Here we demonstrate that the diagnostic process is an example of science in action and is therefore subject to the principles encompassed by the scientific method. Although clinicians readily use a number of these forms of reasoning in practice, without a clear understanding of their pitfalls and of the assumptions on which they are based, doctors remain open to diagnostic error. We conclude by providing a case example from the medico-legal literature in which diagnostic errors were made, to illustrate how applying the scientific method may mitigate the chance of diagnostic error.

  5. Two-point motional Stark effect diagnostic for Madison Symmetric Torus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, J.; Den Hartog, D. J.; Caspary, K. J.

    2010-10-15

    A high-precision spectral motional Stark effect (MSE) diagnostic provides internal magnetic field measurements for Madison Symmetric Torus (MST) plasmas. Currently, MST uses two spatial views: one on the magnetic axis and one at the mid-minor (off-axis) radius, the latter added recently. A new analysis scheme has been developed to infer both the pitch angle and the magnitude of the magnetic field from MSE spectra. Systematic errors are reduced by using atomic data from the Atomic Data and Analysis Structure (ADAS) in the fit. Reconstructed current density and safety factor profiles are more strongly and globally constrained with the addition of the off-axis measurement than with the on-axis measurement alone.

  6. The Sensitivity of Adverse Event Cost Estimates to Diagnostic Coding Error

    PubMed Central

    Wardle, Gavin; Wodchis, Walter P; Laporte, Audrey; Anderson, Geoffrey M; Baker, Ross G

    2012-01-01

    Objective To examine the impact of diagnostic coding error on estimates of hospital costs attributable to adverse events. Data Sources Original and reabstracted medical records of 9,670 complex medical and surgical admissions at 11 hospital corporations in Ontario from 2002 to 2004. Patient specific costs, not including physician payments, were retrieved from the Ontario Case Costing Initiative database. Study Design Adverse events were identified among the original and reabstracted records using ICD10-CA (Canadian adaptation of ICD10) codes flagged as postadmission complications. Propensity score matching and multivariate regression analysis were used to estimate the cost of the adverse events and to determine the sensitivity of cost estimates to diagnostic coding error. Principal Findings Estimates of the cost of the adverse events ranged from $16,008 (metabolic derangement) to $30,176 (upper gastrointestinal bleeding). Coding errors caused the total cost attributable to the adverse events to be underestimated by 16 percent. The impact of coding error on adverse event cost estimates was highly variable at the organizational level. Conclusions Estimates of adverse event costs are highly sensitive to coding error. Adverse event costs may be significantly underestimated if the likelihood of error is ignored. PMID:22091908

  7. Clinical decision-making: heuristics and cognitive biases for the ophthalmologist.

    PubMed

    Hussain, Ahsen; Oestreicher, James

    Diagnostic errors have a significant impact on health care outcomes and patient care. The underlying causes and development of diagnostic error are complex, with flaws in health care systems, as well as human error, playing a role. Cognitive biases and failures of decision-making shortcuts (heuristics) are human factors that can compromise the diagnostic process. We describe these mechanisms and their role for the clinician, and provide clinical scenarios to highlight the various points at which biases may emerge. We discuss strategies to modify the development and influence of these processes, and the vulnerability of heuristics, in order to provide insight and improve clinical outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Evaluating the Performance Diagnostic Checklist-Human Services to Assess Incorrect Error-Correction Procedures by Preschool Paraprofessionals

    ERIC Educational Resources Information Center

    Bowe, Melissa; Sellers, Tyra P.

    2018-01-01

    The Performance Diagnostic Checklist-Human Services (PDC-HS) has been used to assess variables contributing to undesirable staff performance. In this study, three preschool teachers completed the PDC-HS to identify the factors contributing to four paraprofessionals' inaccurate implementation of error-correction procedures during discrete trial…

  9. Analysis of Students' Error in Learning of Quadratic Equations

    ERIC Educational Resources Information Center

    Zakaria, Effandi; Ibrahim; Maat, Siti Mistima

    2010-01-01

    The purpose of the study was to determine the students' error in learning quadratic equation. The samples were 30 form three students from a secondary school in Jambi, Indonesia. Diagnostic test was used as the instrument of this study that included three components: factorization, completing the square and quadratic formula. Diagnostic interview…

  10. Accelerated Compressed Sensing Based CT Image Reconstruction.

    PubMed

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier-based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.
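
    A toy sketch of the weighted compressed-sensing idea only (not the authors' pseudopolar Fourier / rebinned fan-beam implementation): a sparse signal is recovered from few linear measurements by minimizing a weighted l1-regularized least-squares objective with ISTA, where the per-coefficient weights stand in for the statistically derived weights described in the abstract.

    ```python
    # Weighted-l1 reconstruction via ISTA on a toy problem; A is a random matrix, not a CT operator.
    import numpy as np

    rng = np.random.default_rng(4)
    n, m, k = 200, 80, 10                       # signal length, measurements, nonzeros
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

    A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # toy measurement operator
    b = A @ x_true + rng.normal(0, 0.01, m)     # noisy measurements

    w = np.ones(n)                              # placeholder per-coefficient weights
    lam = 0.05
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # ISTA step size from the Lipschitz constant

    x = np.zeros(n)
    for _ in range(500):
        z = x - step * (A.T @ (A @ x - b))                            # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)  # weighted soft threshold

    print("relative reconstruction error:",
          np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```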

  12. Educational Diagnostic Assessment.

    ERIC Educational Resources Information Center

    Bejar, Isaac I.

    1984-01-01

    Approaches proposed for educational diagnostic assessment are reviewed and identified as deficit assessment and error analysis. The development of diagnostic instruments may require a reexamination of existing psychometric models and development of alternative ones. The psychometric and content demands of diagnostic assessment all but require test…

  13. The current and ideal state of anatomic pathology patient safety.

    PubMed

    Raab, Stephen Spencer

    2014-01-01

    An anatomic pathology diagnostic error may be secondary to a number of active and latent technical and/or cognitive components, which may occur anywhere along the total testing process in the clinical and/or laboratory domains. For the pathologist's interpretive steps of diagnosis, we examine Kahneman's framework of slow and fast thinking to explain different causes of error in precision (agreement) and in accuracy (truth). The pathologist's cognitive diagnostic process involves image pattern recognition, and a slow-thinking error may be caused by the application of different rationally constructed mental maps of image criteria/patterns by different pathologists. This type of error is partly related to a system failure to standardize the application of these maps. A fast-thinking error involves the flawed leap from image pattern to an incorrect diagnosis. In the ideal state, anatomic pathology systems would target these cognitive causes of error as well as the latent technical factors that lead to error.

  14. Investigating the Association of Eye Gaze Pattern and Diagnostic Error in Mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voisin, Sophie; Pinto, Frank M; Xu, Songhua

    2013-01-01

    The objective of this study was to investigate the association between eye-gaze patterns and the diagnostic accuracy of radiologists for the task of assessing the likelihood of malignancy of mammographic masses. Six radiologists (2 expert breast imagers and 4 Radiology residents of variable training) assessed the likelihood of malignancy of 40 biopsy-proven mammographic masses (20 malignant and 20 benign) on a computer monitor. Eye-gaze data were collected using a commercial remote eye-tracker. Upon reviewing each mass, the radiologists were also asked to provide their assessment of the probability of malignancy of the depicted mass as well as a rating of the perceived difficulty of the diagnostic task. The collected data were analyzed using established algorithms, and various quantitative metrics were extracted to characterize the recorded gaze patterns. The extracted metrics were correlated with the radiologists' diagnostic decisions and perceived complexity scores. Results showed that the visual gaze patterns of radiologists vary substantially, depending not only on their experience level but also among individuals. However, some eye-gaze metrics appear to correlate with diagnostic error and perceived complexity more consistently. These results suggest that although gaze patterns are generally associated with diagnostic error and the human-perceived difficulty of the diagnostic task, there are substantial individual differences that are not explained simply by the experience level of the individual performing the diagnostic task.

  15. Precise signal amplitude retrieval for a non-homogeneous diagnostic beam using complex interferometry approach

    NASA Astrophysics Data System (ADS)

    Krupka, M.; Kalal, M.; Dostal, J.; Dudzak, R.; Juha, L.

    2017-08-01

    Classical interferometry has become a widely used method of active optical diagnostics. Its more advanced version, which allows reconstruction of three sets of data from just one specially designed interferogram (a so-called complex interferogram), was developed in the past and became known as complex interferometry. Along with the phase shift, which can also be retrieved using classical interferometry, the amplitude modification of the probing part of the diagnostic beam caused by the object under study (to be called the signal amplitude) as well as the contrast of the interference fringes can be retrieved using the complex interferometry approach. In order to partially compensate for errors in the reconstruction due to imperfections in the diagnostic beam intensity structure, as well as for errors caused by a non-ideal optical setup of the interferometer itself (including the quality of its optical components), a reference interferogram can be put to good use. This method of interferogram analysis of experimental data has been successfully implemented in practice. However, in the majority of interferometer setups (especially those employing wavefront division) the probe and the reference part of the diagnostic beam feature different intensity distributions over their respective cross sections. This introduces an additional error into the reconstruction of the signal amplitude and the fringe contrast, which cannot be resolved using the reference interferogram alone. To deal with this error, it was found that additional, separately recorded images of the intensity distribution of the probe and the reference part of the diagnostic beam (with no signal present) are needed. For the best results, sufficient shot-to-shot stability of the whole diagnostic system is required. In this paper, the efficiency of the complex interferometry approach for obtaining the highest possible accuracy of the signal amplitude reconstruction is verified using computer-generated complex and reference interferograms containing artificially introduced intensity variations in the probe and the reference part of the diagnostic beam. These data sets are subsequently analyzed and the errors of the signal amplitude reconstruction are evaluated.

  16. Should learners reason one step at a time? A randomised trial of two diagnostic scheme designs.

    PubMed

    Blissett, Sarah; Morrison, Deric; McCarty, David; Sibbald, Matthew

    2017-04-01

    Making a diagnosis can be difficult for learners as they must integrate multiple clinical variables. Diagnostic schemes can help learners with this complex task. A diagnostic scheme is an algorithm that organises possible diagnoses by assigning signs or symptoms (e.g. systolic murmur) to groups of similar diagnoses (e.g. aortic stenosis and aortic sclerosis) and provides distinguishing features to help discriminate between similar diagnoses (e.g. carotid pulse). The current literature does not identify whether scheme layouts should guide learners to reason one step at a time in a terminally branching scheme or to weigh multiple variables simultaneously in a hybrid scheme. We compared diagnostic accuracy, perceptual errors and cognitive load between two scheme layouts for cardiac auscultation. Focused on the task of identifying murmurs on Harvey, a cardiopulmonary simulator, 86 internal medicine residents used two scheme layouts. The terminally branching scheme organised the information into single-variable decisions. The hybrid scheme combined single-variable decisions with a chart integrating multiple distinguishing features. Using a crossover design, participants completed one set of murmurs (diastolic or systolic) with either the terminally branching or the hybrid scheme. The second set of murmurs was completed with the other scheme. A repeated measures MANOVA was performed to compare diagnostic accuracy, perceptual errors and cognitive load between the scheme layouts. There was a main effect of scheme layout (Wilks' λ = 0.841, F(3,80) = 5.1, p = 0.003). Use of the terminally branching scheme was associated with increased diagnostic accuracy (65 versus 53%, p = 0.02), fewer perceptual errors (0.61 versus 0.98 errors, p = 0.001) and lower cognitive load (3.1 versus 3.5/7, p = 0.023). The terminally branching scheme was associated with improved diagnostic accuracy, fewer perceptual errors and lower cognitive load, suggesting that terminally branching schemes are effective for improving diagnostic accuracy. These findings can inform the design of schemes and other clinical decision aids. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  17. Predicting diagnostic error in Radiology via eye-tracking and image analytics: Application in mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voisin, Sophie; Pinto, Frank M; Morin-Ducote, Garnetta

    2013-01-01

    Purpose: The primary aim of the present study was to test the feasibility of predicting diagnostic errors in mammography by merging radiologists' gaze behavior and image characteristics. A secondary aim was to investigate group-based and personalized predictive models for radiologists of variable experience levels. Methods: The study was performed for the clinical task of assessing the likelihood of malignancy of mammographic masses. Eye-tracking data and diagnostic decisions for 40 cases were acquired from 4 Radiology residents and 2 breast imaging experts as part of an IRB-approved pilot study. Gaze behavior features were extracted from the eye-tracking data. Computer-generated and BIRADS image features were extracted from the images. Finally, machine learning algorithms were used to merge gaze and image features for predicting human error. Feature selection was thoroughly explored to determine the relative contribution of the various features. Group-based and personalized user modeling was also investigated. Results: Diagnostic error can be predicted reliably by merging gaze behavior characteristics from the radiologist and textural characteristics from the image under review. Leveraging data collected from multiple readers produced a reasonable group model (AUC = 0.79). Personalized user modeling was far more accurate for the more experienced readers (average AUC of 0.837 ± 0.029) than for the less experienced ones (average AUC of 0.667 ± 0.099). The best performing group-based and personalized predictive models involved combinations of both gaze and image features. Conclusions: Diagnostic errors in mammography can be predicted reliably by leveraging the radiologists' gaze behavior and image content.
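
    A minimal sketch of the general approach described in this record: gaze-behavior and image features are concatenated and a classifier is scored with cross-validated AUC. The feature names, sample sizes, and the use of logistic regression are illustrative assumptions; the study explored several machine learning algorithms, feature selection, and reader-specific models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two feature families (names and sizes are assumptions)
n_cases = 240                           # e.g. 40 cases read by 6 readers
gaze  = rng.normal(size=(n_cases, 6))   # dwell time, fixation count, saccade length, ...
image = rng.normal(size=(n_cases, 8))   # texture / BIRADS-style descriptors
made_error = (0.8 * gaze[:, 0] - 0.6 * image[:, 2] + rng.normal(size=n_cases)) > 0.5

X = np.hstack([gaze, image])            # merge gaze and image features
y = made_error.astype(int)              # 1 = diagnostic error on this reading

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"group-model AUC: {auc.mean():.3f} +/- {auc.std():.3f}")
```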

  18. Assessing FPAR Source and Parameter Optimization Scheme in Application of a Diagnostic Carbon Flux Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, D P; Ritts, W D; Wharton, S

    2009-02-26

    The combination of satellite remote sensing and carbon cycle models provides an opportunity for regional to global scale monitoring of terrestrial gross primary production, ecosystem respiration, and net ecosystem production. FPAR (the fraction of photosynthetically active radiation absorbed by the plant canopy) is a critical input to diagnostic models; however, little is known about the relative effectiveness of FPAR products from different satellite sensors, nor about the sensitivity of flux estimates to different parameterization approaches. In this study, we used multiyear observations of carbon flux at four eddy covariance flux tower sites within the conifer biome to evaluate these factors. FPAR products from the MODIS and SeaWiFS sensors, and the effects of single-site vs. cross-site parameter optimization, were tested with the CFLUX model. The SeaWiFS FPAR product showed greater dynamic range across sites and resulted in slightly reduced flux estimation errors relative to the MODIS product when using cross-site optimization. With site-specific parameter optimization, the flux model was effective in capturing seasonal and interannual variation in the carbon fluxes at these sites. The cross-site prediction errors were lower when using parameters from a cross-site optimization compared to parameter sets from optimization at single sites. These results support the practice of multisite optimization within a biome for parameterization of diagnostic carbon flux models.
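
    The single-site versus cross-site optimization contrast can be illustrated with a toy light-use-efficiency model standing in for CFLUX; the model form, its single parameter, and the synthetic tower data below are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

def gpp_model(params, fpar, par):
    """Toy light-use-efficiency stand-in for a diagnostic flux model (not CFLUX itself)."""
    (lue,) = params
    return lue * fpar * par

# Synthetic "tower" observations for three sites with slightly different true parameters
sites = []
for true_lue in (1.1, 1.3, 1.5):
    fpar = rng.uniform(0.3, 0.9, 200)                  # satellite FPAR driver
    par = rng.uniform(5.0, 12.0, 200)                  # photosynthetically active radiation
    gpp = true_lue * fpar * par + rng.normal(0, 0.5, 200)
    sites.append((fpar, par, gpp))

def residuals(params, data):
    return np.concatenate([gpp - gpp_model(params, fpar, par) for fpar, par, gpp in data])

# Cross-site optimization: one parameter set fitted to all sites jointly
cross_fit = least_squares(residuals, x0=[1.0], args=(sites,))

# Single-site optimization: one parameter set per site
single_fits = [least_squares(residuals, x0=[1.0], args=([site],)) for site in sites]

print("cross-site LUE estimate:", cross_fit.x.round(3))
print("single-site LUE estimates:", [round(f.x[0], 3) for f in single_fits])
```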

  19. Should we confirm our clinical diagnostic certainty by autopsies?

    PubMed

    Podbregar, M; Voga, G; Krivec, B; Skale, R; Pareznik, R; Gabrscek, L

    2001-11-01

    To evaluate the frequency of diagnostic errors assessed by autopsies. Retrospective review of medical and pathological records in an 11-bed closed medical intensive care unit (ICU) at an 860-bed general hospital. Patients who died in the ICU between January 1998 and December 1999. Medical diagnoses were rated into three levels of clinical diagnostic certainty: complete certainty (group L1), minor diagnostic uncertainty (group L2), and major diagnostic uncertainty (group L3). The patients were divided into three error groups: group A, the autopsy confirmed the clinical diagnosis; group B, the autopsy demonstrated a new relevant diagnosis which would probably not have influenced the therapy and outcome; group C, the autopsy demonstrated a new relevant diagnosis which would probably have changed the therapy and outcome. The overall mortality was 20.3% (270/1331 patients). Autopsies were performed in 126 patients (46.9% of deaths), more often in younger patients (66.6+/-13.9 years vs 72.7+/-12.0 years, p<0.001), in patients with shorter ICU stay (4.7+/-5.6 days vs 6.7+/-8.7 days, p=0.054), and in patients in group L3 without chronic diseases (15/126 vs 1/144, p<0.001). Fatal but potentially treatable errors [group C, 12 patients (9.5%)] were found in 8.7%, 10.0%, and 10.5% of patients in groups L1, L2, and L3, respectively (NS between groups). An ICU length of stay shorter than 24 h was not related to the frequency of group C errors. Autopsies are performed more often in younger patients without chronic disease and in patients with a low clinical diagnostic certainty. No level of clinical diagnostic certainty could predict the pathological findings.

  20. A fast Monte Carlo EM algorithm for estimation in latent class model analysis with an application to assess diagnostic accuracy for cervical neoplasia in women with AGC

    PubMed Central

    Kang, Le; Carter, Randy; Darcy, Kathleen; Kauderer, James; Liao, Shu-Yuan

    2013-01-01

    In this article, we use a latent class model (LCM) with prevalence modeled as a function of covariates to assess diagnostic test accuracy in situations where the true disease status is not observed, but observations on three or more conditionally independent diagnostic tests are available. A fast Monte Carlo EM (MCEM) algorithm with binary (disease) diagnostic data is implemented to estimate parameters of interest; namely, sensitivity, specificity, and prevalence of the disease as a function of covariates. To obtain standard errors for confidence interval construction of estimated parameters, the missing information principle is applied to adjust information matrix estimates. We compare the adjusted information matrix based standard error estimates with the bootstrap standard error estimates, both obtained using the fast MCEM algorithm, through an extensive Monte Carlo study. Simulation demonstrates that the adjusted information matrix approach yields standard error estimates similar to those of the bootstrap methods under certain scenarios. The bootstrap percentile intervals have satisfactory coverage probabilities. We then apply the LCM analysis to a real data set of 122 subjects from a Gynecologic Oncology Group (GOG) study of significant cervical lesion (S-CL) diagnosis in women with atypical glandular cells of undetermined significance (AGC) to compare the diagnostic accuracy of a histology-based evaluation, a CA-IX biomarker-based test and a human papillomavirus (HPV) DNA test. PMID:24163493
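
    A minimal sketch of a plain EM fit for a two-class latent class model with three conditionally independent binary tests is given below. The article's fast MCEM additionally models prevalence as a function of covariates and adjusts the information matrix via the missing information principle; neither extension is reproduced here, and all numbers are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate three conditionally independent binary diagnostic tests (true values assumed)
n, prevalence = 2000, 0.30
true_sens = np.array([0.90, 0.85, 0.80])
true_spec = np.array([0.95, 0.90, 0.92])
disease = rng.random(n) < prevalence
p_positive = np.where(disease[:, None], true_sens, 1 - true_spec)
tests = (rng.random((n, 3)) < p_positive).astype(int)

# Plain EM for the two-class latent class model (starting above chance to avoid label switching)
pi, se, sp = 0.5, np.full(3, 0.7), np.full(3, 0.7)
for _ in range(500):
    # E-step: posterior probability of disease given each subject's test pattern
    like_d = pi * np.prod(np.where(tests == 1, se, 1 - se), axis=1)
    like_h = (1 - pi) * np.prod(np.where(tests == 1, 1 - sp, sp), axis=1)
    w = like_d / (like_d + like_h)
    # M-step: update prevalence, sensitivities, and specificities
    pi = w.mean()
    se = (w[:, None] * tests).sum(axis=0) / w.sum()
    sp = ((1 - w)[:, None] * (1 - tests)).sum(axis=0) / (1 - w).sum()

print("estimated prevalence:", round(pi, 3))
print("estimated sensitivities:", se.round(3))
print("estimated specificities:", sp.round(3))
```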

  1. Improved assessment of gross and net primary productivity of Canada's landmass

    NASA Astrophysics Data System (ADS)

    Gonsamo, Alemu; Chen, Jing M.; Price, David T.; Kurz, Werner A.; Liu, Jane; Boisvenue, Céline; Hember, Robbie A.; Wu, Chaoyang; Chang, Kuo-Hsien

    2013-12-01

    We assess Canada's gross primary productivity (GPP) and net primary productivity (NPP) using the boreal ecosystem productivity simulator (BEPS) at 250 m spatial resolution with improved input parameter and driver fields and improved phenology and nutrient release parameterization schemes. BEPS is a process-based two-leaf enzyme kinetic terrestrial ecosystem model designed to simulate energy, water, and carbon (C) fluxes using spatial data sets of meteorology, remotely sensed land surface variables, soil properties, and photosynthesis and respiration rate parameters. Two improved key land surface variables, leaf area index (LAI) and land cover type, are derived at 250 m from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. For diagnostic error assessment, we use nine forest flux tower sites where all measured C flux, meteorology, and ancillary data sets are available. The errors due to input drivers and parameters are then independently corrected for Canada-wide GPP and NPP simulations. The optimized LAI use, for example, reduced the absolute bias in GPP from 20.7% to 1.1% for hourly BEPS simulations. Following the error diagnostics and corrections, daily GPP and NPP are simulated over Canada at 250 m spatial resolution, the highest resolution simulation yet for the country or any other comparable region. Total NPP (GPP) for Canada's land area was 1.27 (2.68) Pg C for 2008, with forests contributing 1.02 (2.2) Pg C. The annual comparisons between measured and simulated GPP show that the mean differences are not statistically significant (p > 0.05, paired t test). The main BEPS simulation error sources are from the driver fields.

  2. Imperfect practice makes perfect: error management training improves transfer of learning.

    PubMed

    Dyre, Liv; Tabor, Ann; Ringsted, Charlotte; Tolsgaard, Martin G

    2017-02-01

    Traditionally, trainees are instructed to practise with as few errors as possible during simulation-based training. However, transfer of learning may improve if trainees are encouraged to commit errors. The aim of this study was to assess the effects of error management instructions compared with error avoidance instructions during simulation-based ultrasound training. Medical students (n = 60) with no prior ultrasound experience were randomised to error management training (EMT) (n = 32) or error avoidance training (EAT) (n = 28). The EMT group was instructed to deliberately make errors during training. The EAT group was instructed to follow the simulator instructions and to commit as few errors as possible. Training consisted of 3 hours of simulation-based ultrasound training focusing on fetal weight estimation. Simulation-based tests were administered before and after training. Transfer tests were performed on real patients 7-10 days after the completion of training. Primary outcomes were transfer test performance scores and diagnostic accuracy. Secondary outcomes included performance scores and diagnostic accuracy during the simulation-based pre- and post-tests. A total of 56 participants completed the study. On the transfer test, EMT group participants attained higher performance scores (mean score: 67.7%, 95% confidence interval [CI]: 62.4-72.9%) than EAT group members (mean score: 51.7%, 95% CI: 45.8-57.6%) (p < 0.001; Cohen's d = 1.1, 95% CI: 0.5-1.7). There was a moderate improvement in diagnostic accuracy in the EMT group compared with the EAT group (16.7%, 95% CI: 10.2-23.3% weight deviation versus 26.6%, 95% CI: 16.5-36.7% weight deviation [p = 0.082; Cohen's d = 0.46, 95% CI: -0.06 to 1.0]). No significant interaction effects between group and performance improvements between the pre- and post-tests were found in either performance scores (p = 0.25) or diagnostic accuracy (p = 0.09). The provision of error management instructions during simulation-based training improves the transfer of learning to the clinical setting compared with error avoidance instructions. Rather than teaching to avoid errors, the use of errors for learning should be explored further in medical education theory and practice. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  3. Systematic errors in Monsoon simulation: importance of the equatorial Indian Ocean processes

    NASA Astrophysics Data System (ADS)

    Annamalai, H.; Taguchi, B.; McCreary, J. P., Jr.; Nagura, M.; Miyama, T.

    2015-12-01

    In climate models, simulating the monsoon precipitation climatology remains a grand challenge. Compared to CMIP3, the multi-model-mean (MMM) errors for the Asian-Australian monsoon (AAM) precipitation climatology in CMIP5, relative to GPCP observations, have shown little improvement. One implication is that uncertainties in the future projections of time-mean changes to AAM rainfall may not have been reduced from CMIP3 to CMIP5. Despite dedicated efforts by the modeling community, progress in monsoon modeling is rather slow. This leads us to wonder: has the scientific community reached a "plateau" in modeling mean monsoon precipitation? Our focus here is to better understand the coupled air-sea interactions and moist processes that govern the precipitation characteristics over the tropical Indian Ocean, where large-scale errors persist. A series of idealized coupled model experiments is performed to test the hypothesis that errors in the coupled processes along the equatorial Indian Ocean during inter-monsoon seasons could potentially influence systematic errors during the monsoon season. Moist static energy budget diagnostics have been performed to identify the leading moist and radiative processes that account for the large-scale errors in the simulated precipitation. As a way forward, we propose three coordinated efforts: (i) idealized coupled model experiments; (ii) process-based diagnostics; and (iii) direct observations to constrain model physics. We argue that a systematic and coordinated approach to identifying the various interactive processes that shape the precipitation basic state needs to be carried out, and that high-quality observations over the data-sparse monsoon region are needed to validate models and further improve model physics.

  4. Real-time sensing and gas jet mitigation of VDEs on Alcator C-Mod

    NASA Astrophysics Data System (ADS)

    Granetz, R. S.; Wolfe, S. M.; Izzo, V. A.; Reinke, M. L.; Terry, J. L.; Hughes, J. W.; Zhurovich, K.; Whyte, D. G.; Bakhtiari, M.; Wurden, G.

    2006-10-01

    Experiments have been carried out in Alcator C-Mod to test the effectiveness of gas jet disruption mitigation of VDEs with real-time detection and triggering by the C-Mod digital plasma control system (DPCS). The DPCS continuously computes the error in the plasma vertical position from the magnetics diagnostics. When this error exceeds an adjustable preset value, the DPCS triggers the gas jet valve (with a negligible latency time). The high-pressure gas (argon) only takes a few milliseconds to enter the vacuum chamber and begin affecting the plasma, but this is comparable to the VDE timescale on C-Mod. Nevertheless, gas jet injection reduced the halo current, increased the radiated power fraction, and reduced the heating of the divertor compared to unmitigated disruptions, but not quite as well as in earlier mitigation experiments with vertically stable plasmas. Presumably a faster overall response time would be beneficial, and several ways to achieve this will also be discussed.

  5. The Method of Fundamental Solutions using the Vector Magnetic Dipoles for Calculation of the Magnetic Fields in the Diagnostic Problems Based on Full-Scale Modelling Experiment

    NASA Astrophysics Data System (ADS)

    Bakhvalov, Yu A.; Grechikhin, V. V.; Yufanova, A. L.

    2016-04-01

    The article describes the calculation of magnetic fields in diagnostic problems of technical systems based on a full-scale modelling experiment. Using the gridless method of fundamental solutions and its variants in combination with grid methods (finite differences and finite elements) makes it possible to considerably reduce the dimensionality of the field calculation task and hence to reduce calculation time. Fictitious magnetic charges are used when implementing the method. In addition, much attention is given to the calculation accuracy: errors occur when the distance between the charges is chosen incorrectly. The authors propose to use vector magnetic dipoles to improve the accuracy of the magnetic field calculation, and examples of this approach are given. The results of the research allow the authors to recommend the use of this approach in the method of fundamental solutions for full-scale modelling tests of technical systems.
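
    A toy two-dimensional sketch of the method of fundamental solutions with vector dipoles is given below: fictitious dipoles placed on an auxiliary contour inside the object are fitted by least squares to "measured" sensor values, and the resulting moments are used to evaluate the field elsewhere. The geometry, source counts, and synthetic measurements are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def dipole_potential(r, r0, m):
    """Scalar potential of a 2-D line dipole with moment m located at r0."""
    d = r - r0
    return (d @ m) / (2 * np.pi * (d @ d))

# "Measurement" ring outside the object (e.g. sensor positions in a full-scale test)
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
sensors = 1.5 * np.c_[np.cos(theta), np.sin(theta)]

# Pretend ground-truth source inside the object (used only to fake the measurements)
true_src, true_m = np.array([0.2, 0.1]), np.array([1.0, 0.5])
measured = np.array([dipole_potential(p, true_src, true_m) for p in sensors])

# Fictitious vector dipoles placed on an auxiliary circle inside the object (MFS sources)
phi = np.linspace(0, 2 * np.pi, 24, endpoint=False)
sources = 0.5 * np.c_[np.cos(phi), np.sin(phi)]

# Collocation matrix: each fictitious dipole contributes two unknowns (mx, my)
A = np.zeros((len(sensors), 2 * len(sources)))
for i, p in enumerate(sensors):
    for j, s in enumerate(sources):
        d = p - s
        A[i, 2 * j:2 * j + 2] = d / (2 * np.pi * (d @ d))

moments, *_ = np.linalg.lstsq(A, measured, rcond=None)

# Evaluate the reconstructed potential at a point away from the sensor ring and compare
pt = np.array([1.1, -0.4])
recon = sum(dipole_potential(pt, s, moments[2 * j:2 * j + 2])
            for j, s in enumerate(sources))
print("reconstructed:", round(float(recon), 5), " true:",
      round(float(dipole_potential(pt, true_src, true_m)), 5))
```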

  6. Adaptive algorithm of selecting optimal variant of errors detection system for digital means of automation facility of oil and gas complex

    NASA Astrophysics Data System (ADS)

    Poluyan, A. Y.; Fugarov, D. D.; Purchina, O. A.; Nesterchuk, V. V.; Smirnova, O. V.; Petrenkova, S. B.

    2018-05-01

    To date, the problems associated with the detection of errors in digital equipment (DE) systems for the automation of explosion-hazardous facilities of the oil and gas complex are highly relevant. This is especially true for facilities where a loss of DE accuracy would inevitably lead to man-made disasters and substantial material damage; at such facilities, diagnostics of the accuracy of DE operation is one of the main elements of the industrial safety management system. This work addresses the problem of selecting the optimal variant of the error detection system according to a validation criterion. Known methods for solving such problems have exponential computational complexity. Thus, in order to reduce the time needed to solve the problem, the selection based on the validation criterion is implemented as an adaptive bionic algorithm. Bionic algorithms (BA) have proven effective in solving optimization problems. The advantages of bionic search include adaptability, learning ability, parallelism, and the ability to build hybrid systems based on combining approaches [1].

  7. Predicting diagnostic error in radiology via eye-tracking and image analytics: Preliminary investigation in mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voisin, Sophie; Tourassi, Georgia D.; Pinto, Frank

    2013-10-15

    Purpose: The primary aim of the present study was to test the feasibility of predicting diagnostic errors in mammography by merging radiologists’ gaze behavior and image characteristics. A secondary aim was to investigate group-based and personalized predictive models for radiologists of variable experience levels. Methods: The study was performed for the clinical task of assessing the likelihood of malignancy of mammographic masses. Eye-tracking data and diagnostic decisions for 40 cases were acquired from four Radiology residents and two breast imaging experts as part of an IRB-approved pilot study. Gaze behavior features were extracted from the eye-tracking data. Computer-generated and BIRADS image features were extracted from the images. Finally, machine learning algorithms were used to merge gaze and image features for predicting human error. Feature selection was thoroughly explored to determine the relative contribution of the various features. Group-based and personalized user modeling was also investigated. Results: Machine learning can be used to predict diagnostic error by merging gaze behavior characteristics from the radiologist and textural characteristics from the image under review. Leveraging data collected from multiple readers produced a reasonable group model [area under the ROC curve (AUC) = 0.792 ± 0.030]. Personalized user modeling was far more accurate for the more experienced readers (AUC = 0.837 ± 0.029) than for the less experienced ones (AUC = 0.667 ± 0.099). The best performing group-based and personalized predictive models involved combinations of both gaze and image features. Conclusions: Diagnostic errors in mammography can be predicted to a good extent by leveraging the radiologists’ gaze behavior and image content.

  8. First measurements of error fields on W7-X using flux surface mapping

    DOE PAGES

    Lazerson, Samuel A.; Otte, Matthias; Bozhenkov, Sergey; ...

    2016-08-03

    Error fields have been detected and quantified using the flux surface mapping diagnostic system on Wendelstein 7-X (W7-X). A low-field iota-bar = 1/2 magnetic configuration (iota-bar = ι/2π), sensitive to error fields, was developed in order to detect their presence using the flux surface mapping diagnostic. In this configuration, a vacuum flux surface with rotational transform of n/m = 1/2 is created at the mid-radius of the vacuum flux surfaces. If no error fields are present, a vanishingly small n/m = 5/10 island chain should be present. Modeling indicates that if an n = 1 perturbing field is applied by the trim coils, a large n/m = 1/2 island chain will be opened. This island chain is used to create a perturbation large enough to be imaged by the diagnostic. Phase and amplitude scans of the applied field allow the measurement of a small ~0.04 m intrinsic island chain with a 130° phase relative to the first module of the W7-X experiment. Lastly, these error fields are determined to be small and easily correctable by the trim coil system.

  9. A real-time diagnostic and performance monitor for UNIX. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Dong, Hongchao

    1992-01-01

    There are now over one million UNIX sites and the pace at which new installations are added is steadily increasing. Along with this increase comes a need to develop simple, efficient, effective and adaptable ways of simultaneously collecting real-time diagnostic and performance data. This need exists because distributed systems can give rise to complex failure situations that are often unidentifiable with single-machine diagnostic software. The simultaneous collection of error and performance data is also important for research in failure prediction and error/performance studies. This paper introduces a portable method to concurrently collect real-time diagnostic and performance data on a distributed UNIX system. The combined diagnostic/performance data collection is implemented on a distributed multi-computer system using SUN4's as servers. The approach uses existing UNIX system facilities to gather system dependability information such as error and crash reports. In addition, performance data such as CPU utilization, disk usage, I/O transfer rate and network contention are also collected. In the future, the collected data will be used to identify dependability bottlenecks and to analyze the impact of failures on system performance.
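
    The thesis gathered such data with SunOS/UNIX facilities on SUN4 servers; as a rough modern illustration of the same idea, the sketch below samples comparable performance metrics with the third-party psutil package (an assumption for this sketch, not the tooling used in the work) and appends them to a log for later correlation with error reports.

```python
import json
import time

import psutil  # third-party library; an assumption here, not the facility used in the thesis

def sample_performance():
    """Collect one snapshot of the performance metrics mentioned in the abstract."""
    io = psutil.disk_io_counters()
    net = psutil.net_io_counters()
    return {
        "timestamp": time.time(),
        "cpu_percent": psutil.cpu_percent(interval=1),   # CPU utilization
        "disk_used_percent": psutil.disk_usage("/").percent,
        "disk_read_bytes": io.read_bytes,                # cumulative I/O transfer volume
        "disk_write_bytes": io.write_bytes,
        "net_bytes_sent": net.bytes_sent,                # cumulative network traffic
        "net_bytes_recv": net.bytes_recv,
    }

if __name__ == "__main__":
    # Append one record per interval to a log for later error/performance correlation.
    with open("perf_log.jsonl", "a") as log:
        for _ in range(5):
            log.write(json.dumps(sample_performance()) + "\n")
```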

  10. Evaluating concentration estimation errors in ELISA microarray experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daly, Don S.; White, Amanda M.; Varnum, Susan M.

    Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to predict a protein concentration in a sample. Deploying ELISA in a microarray format permits simultaneous prediction of the concentrations of numerous proteins in a small sample. These predictions, however, are uncertain due to processing error and biological variability. Evaluating prediction error is critical to interpreting biological significance and improving the ELISA microarray process. Evaluating prediction error must be automated to realize a reliable high-throughput ELISA microarray system. Methods: In this paper, we present a statistical method based on propagation of error to evaluate prediction errors in the ELISA microarray process. Although propagation of error is central to this method, it is effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization and statistical diagnostics when evaluating ELISA microarray prediction errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of prediction errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error.
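
    A simplified sketch of propagation of error through an inverted calibration curve using the delta method is shown below. The linear standard-curve form, the numbers, and the error sources included are illustrative assumptions and not the paper's actual standard-curve model or software.

```python
import numpy as np

# Fit a linear standard curve (the paper fits a standard curve too, but its exact
# functional form and error model are not reproduced here; this is a simplified sketch).
conc   = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])          # known standards
signal = np.array([0.05, 0.40, 0.83, 1.58, 3.22, 6.35])     # measured intensities (made up)
(b, a), cov = np.polyfit(conc, signal, 1, cov=True)          # signal ~ a + b * conc

def predict_conc(y0, sd_y0):
    """Invert the curve and propagate error with the delta method."""
    x0 = (y0 - a) / b
    # Gradient of x0 = (y0 - a)/b with respect to (b, a, y0)
    g = np.array([-(y0 - a) / b**2, -1.0 / b, 1.0 / b])
    full_cov = np.zeros((3, 3))
    full_cov[:2, :2] = cov                                   # uncertainty of the fitted curve
    full_cov[2, 2] = sd_y0**2                                # replicate/assay variability
    var_x0 = g @ full_cov @ g
    return x0, np.sqrt(var_x0)

est, se = predict_conc(y0=2.0, sd_y0=0.1)
print(f"predicted concentration: {est:.2f} +/- {se:.2f}")
```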

  11. Quality and Safety in Health Care, Part XIV: The External Environment and Research for Diagnostic Processes.

    PubMed

    Harolds, Jay A

    2016-09-01

    The work system in which diagnosis takes place is affected by the external environment, which includes requirements such as certification, accreditation, and regulations. How errors are reported, malpractice, and the system for payment are some other aspects of the external environment. Improving the external environment is expected to decrease errors in diagnosis. More research on improving the diagnostic process is needed.

  12. Automated detection of heuristics and biases among pathologists in a computer-based system.

    PubMed

    Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-08-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to diagnostic errors. The authors conducted the study using a computer-based system to view and diagnose virtual slide cases. The software recorded participant responses throughout the diagnostic process, and automatically classified participant actions based on definitions of eight common heuristics and/or biases. The authors measured frequency of heuristic use and bias across three levels of training. Biases studied were detected at varying frequencies, with availability and search satisficing observed most frequently. There were few significant differences by level of training. For representativeness and anchoring, the heuristic was used appropriately as often or more often than it was used in biased judgment. Approximately half of the diagnostic errors were associated with one or more biases. We conclude that heuristic use and biases were observed among physicians at all levels of training using the virtual slide system, although their frequencies varied. The system can be employed to detect heuristic use and to test methods for decreasing diagnostic errors resulting from cognitive biases.

  13. Quality and patient safety in the diagnosis of breast cancer.

    PubMed

    Raab, Stephen S; Swain, Justin; Smith, Natasha; Grzybicki, Dana M

    2013-09-01

    The media, medical-legal, and safety science perspectives of a laboratory medical error differ and assign variable levels of responsibility to individuals and systems. We examine how the media identifies, communicates, and interprets information related to anatomic pathology breast diagnostic errors compared to groups using a safety science Lean-based quality improvement perspective. The media approach focuses on the outcome of error from the patient perspective, and some errors have catastrophic consequences. The medical safety science perspective does not ignore the importance of patient outcome, but focuses on causes, including the active events and latent factors that contribute to the error. Lean improvement methods deconstruct work into individual steps consisting of tasks, communications, and flow in order to understand the effect of system design on current-state levels of quality. In the Lean model, system redesign to reduce errors depends on front-line staff knowledge and engagement to change the components of active work to develop best practices. In addition, Lean improvement methods require organizational and environmental alignment with the front-line change in order to improve the latent conditions affecting components such as regulation, education, and safety culture. Although we examine instances of laboratory error for a specific test in surgical pathology, the same model of change applies to all areas of the laboratory. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Overcoming the errors of in-house PCR used in the clinical laboratory for the diagnosis of extrapulmonary tuberculosis.

    PubMed

    Kunakorn, M; Raksakai, K; Pracharktam, R; Sattaudom, C

    1999-03-01

    Our experiences from 1993 to 1997 in the development and use of IS6110-based PCR for the diagnosis of extrapulmonary tuberculosis in a routine clinical setting revealed that error-correcting processes can improve existing diagnostic methodology. The reamplification method initially used had a sensitivity of 90.91% and a specificity of 93.75%. The concern was focused on the false positive results of this method caused by product-carryover contamination. This method was changed to single-round PCR with carryover prevention by uracil DNA glycosylase (UDG), resulting in a 100% specificity but only 63% sensitivity. Dot blot hybridization was added after the single-round PCR, increasing the sensitivity to 87.50%. However, false positivity resulted from the nonspecific dot blot hybridization signal, reducing the specificity to 89.47%. The PCR hybridization step was then changed to a Southern blot with a new oligonucleotide probe, giving a sensitivity of 85.71% and raising the specificity to 99.52%. We conclude that the PCR protocol for routine clinical use should include UDG for carryover prevention and hybridization with specific probes to optimize diagnostic sensitivity and specificity in extrapulmonary tuberculosis testing.

  15. Modern Diagnostic Techniques for the Assessment of Ocular Blood Flow in Myopia: Current State of Knowledge.

    PubMed

    Grudzińska, Ewa; Modrzejewska, Monika

    2018-01-01

    Myopia is the most common refractive error and the subject of interest of various studies assessing ocular blood flow. Increasing refractive error and axial elongation of the eye result in the stretching and thinning of the scleral, choroid, and retinal tissues and the decrease in retinal vessel diameter, disturbing ocular blood flow. Local and systemic factors known to change ocular blood flow include glaucoma, medications and fluctuations in intraocular pressure, and metabolic parameters. Techniques and tools assessing ocular blood flow include, among others, laser Doppler flowmetry (LDF), retinal function imager (RFI), laser speckle contrast imaging (LSCI), magnetic resonance imaging (MRI), optical coherence tomography angiography (OCTA), pulsatile ocular blood flowmeter (POBF), fundus pulsation amplitude (FPA), colour Doppler imaging (CDI), and Doppler optical coherence tomography (DOCT). Many researchers consistently reported lower blood flow parameters in myopic eyes regardless of the used diagnostic method. It is unclear whether this is a primary change that causes secondary thinning of ocular tissues or quite the opposite; that is, the mechanical stretching of the eye wall reduces its thickness and causes a secondary lower demand of tissues for oxygen. This paper presents a review of studies assessing ocular blood flow in myopes.

  16. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors

    PubMed Central

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-01-01

    Objectives Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. Methods This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. Results After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P <0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Conclusion Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors. PMID:29062553

  17. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors: A quality initiative.

    PubMed

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-08-01

    Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% ( P <0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors.

  18. Advanced error diagnostics of the CMAQ and Chimere modelling systems within the AQMEII3 model evaluation framework

    NASA Astrophysics Data System (ADS)

    Solazzo, Efisio; Hogrefe, Christian; Colette, Augustin; Garcia-Vivanco, Marta; Galmarini, Stefano

    2017-09-01

    The work here complements the overview analysis of the modelling systems participating in the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3) by focusing on the performance for hourly surface ozone of two modelling systems, Chimere for Europe and CMAQ for North America. The evaluation strategy outlined over the course of the three phases of the AQMEII activity, aimed at building up a diagnostic methodology for model evaluation, is pursued here, and novel diagnostic methods are proposed. In addition to evaluating the base case simulation, in which all model components are configured in their standard mode, the analysis also makes use of sensitivity simulations in which the models have been applied by altering and/or zeroing lateral boundary conditions, emissions of anthropogenic precursors, and ozone dry deposition. To help understand the causes of model deficiencies, the error components (bias, variance, and covariance) of the base case and of the sensitivity runs are analysed in conjunction with timescale considerations and error modelling using the available error fields of temperature, wind speed, and NOx concentration. The results reveal the effectiveness and diagnostic power of the methods devised (which remains the main scope of this study), allowing the detection of the timescale and the fields that the two models are most sensitive to. The representation of planetary boundary layer (PBL) dynamics is pivotal to both models. In particular, (i) the fluctuations slower than ~1.5 days account for 70-85 % of the mean square error of the full (undecomposed) ozone time series; (ii) a recursive, systematic error with daily periodicity is detected, responsible for 10-20 % of the quadratic total error; (iii) errors in representing the timing of the daily transition between stability regimes in the PBL are responsible for a covariance error as large as 9 ppb (as much as the standard deviation of the network-average ozone observations in summer in both Europe and North America); (iv) the CMAQ ozone error has a weak/negligible dependence on the errors in NO2, while the error in NO2 significantly impacts the ozone error produced by Chimere; (v) the response of the models to variations of anthropogenic emissions and boundary conditions shows a pronounced spatial heterogeneity, while the seasonal variability of the response is found to be less marked. Only during the winter season does the zeroing of boundary values for North America produce a spatially uniform deterioration of the model accuracy across the majority of the continent.
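
    The bias, variance, and covariance error components referred to above follow the standard decomposition MSE = (mean_m - mean_o)^2 + (sigma_m - sigma_o)^2 + 2 sigma_m sigma_o (1 - r). The sketch below applies it to synthetic hourly series; the paper additionally separates the error by timescale before decomposing, which is omitted here.

```python
import numpy as np

def mse_decomposition(model, obs):
    """Decompose mean square error into bias, variance, and covariance components:
       MSE = (mean_m - mean_o)^2 + (sd_m - sd_o)^2 + 2 * sd_m * sd_o * (1 - r)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias2 = (model.mean() - obs.mean()) ** 2
    variance = (model.std() - obs.std()) ** 2
    r = np.corrcoef(model, obs)[0, 1]
    covariance = 2 * model.std() * obs.std() * (1 - r)
    return {"mse": ((model - obs) ** 2).mean(), "bias^2": bias2,
            "variance": variance, "covariance": covariance}

# Synthetic hourly ozone-like series (illustrative only)
rng = np.random.default_rng(3)
t = np.arange(24 * 30)
obs = 40 + 15 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 4, t.size)
mod = 45 + 12 * np.sin(2 * np.pi * (t - 2) / 24) + rng.normal(0, 4, t.size)

parts = mse_decomposition(mod, obs)
print({k: round(v, 2) for k, v in parts.items()})
print("sum of components:",
      round(parts["bias^2"] + parts["variance"] + parts["covariance"], 2))
```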

  19. Automatic lung segmentation using control feedback system: morphology and texture paradigm.

    PubMed

    Noor, Norliza M; Than, Joel C M; Rijal, Omar M; Kassim, Rosminah M; Yunus, Ashari; Zeki, Amir A; Anzidei, Michele; Saba, Luca; Suri, Jasjit S

    2015-03-01

    Interstitial Lung Disease (ILD) encompasses a wide array of diseases that share some common radiologic characteristics. When diagnosing such diseases, radiologists can be affected by heavy workload and fatigue, thus decreasing diagnostic accuracy. Automatic segmentation is the first step in implementing a Computer-Aided Diagnosis (CAD) system that will help radiologists to improve diagnostic accuracy, thereby reducing manual interpretation. The proposed automatic segmentation uses an initial thresholding and morphology-based segmentation coupled with feedback that detects large deviations with a corrective segmentation. This feedback is analogous to a control system which allows detection of abnormal or severe lung disease and provides feedback to an online segmentation, improving the overall performance of the system. This feedback system encompasses a texture paradigm. This study included 48 male and 48 female patients, consisting of 15 normal and 81 abnormal cases. A senior radiologist chose the five levels needed for ILD diagnosis. The results of segmentation were displayed by showing the comparison of the automated and ground truth boundaries (courtesy of ImgTracer™ 1.0, AtheroPoint™ LLC, Roseville, CA, USA). The left lung's performance of segmentation was 96.52% for Jaccard Index and 98.21% for Dice Similarity, 0.61 mm for Polyline Distance Metric (PDM), -1.15% for Relative Area Error and 4.09% for Area Overlap Error. The right lung's performance of segmentation was 97.24% for Jaccard Index, 98.58% for Dice Similarity, 0.61 mm for PDM, -0.03% for Relative Area Error and 3.53% for Area Overlap Error. Overall, the segmentation has a similarity of 98.4%. The proposed segmentation is an accurate and fully automated system.
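
    For reference, the overlap metrics reported above can be computed from binary masks as in the sketch below; the area overlap error here uses one common definition (1 minus the Jaccard index), which may differ from the paper's exact formula, and the Polyline Distance Metric is omitted because it requires boundary polylines.

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Overlap metrics between a predicted and a ground-truth binary mask."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return {
        "jaccard_%": 100 * inter / union,
        "dice_%": 100 * 2 * inter / (pred.sum() + truth.sum()),
        "relative_area_error_%": 100 * (pred.sum() - truth.sum()) / truth.sum(),
        "area_overlap_error_%": 100 * (1 - inter / union),   # one common definition
    }

# Toy example: two overlapping rectangles standing in for lung masks
truth = np.zeros((100, 100), bool); truth[20:80, 30:70] = True
pred  = np.zeros((100, 100), bool); pred[22:82, 32:72] = True
print({k: round(v, 2) for k, v in segmentation_metrics(pred, truth).items()})
```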

  20. MFP scanner motion characterization using self-printed target

    NASA Astrophysics Data System (ADS)

    Kim, Minwoong; Bauer, Peter; Wagner, Jerry K.; Allebach, Jan P.

    2015-01-01

    Multifunctional printers (MFP) are products that combine the functions of a printer, scanner, and copier. Our goal is to help customers to be able to easily diagnose scanner or print quality issues with their products by developing an automated diagnostic system embedded in the product. We specifically focus on the characterization of scanner motions, which may be defective due to irregular movements of the scan-head. The novel design of our test page and two-stage diagnostic algorithm are described in this paper. The most challenging issue is to evaluate the scanner performance properly when both printer and scanner units contribute to the motion errors. In the first stage called the uncorrected-print-error-stage, aperiodic and periodic motion behaviors are characterized in both the spatial and frequency domains. Since it is not clear how much of the error is contributed by each unit, the scanned input is statistically analyzed in the second stage called the corrected-print-error-stage. Finally, the described diagnostic algorithms output the estimated scan error and print error separately as RMS values of the displacement of the scan and print lines, respectively, from their nominal positions in the scanner or printer motion direction. We validate our test page design and approaches by ground truth obtained from a high-precision, chrome-on-glass reticle manufactured using semiconductor chip fabrication technologies.

  1. Insight into biases and sequencing errors for amplicon sequencing with the Illumina MiSeq platform.

    PubMed

    Schirmer, Melanie; Ijaz, Umer Z; D'Amore, Rosalinda; Hall, Neil; Sloan, William T; Quince, Christopher

    2015-03-31

    With read lengths of currently up to 2 × 300 bp, high throughput and low sequencing costs Illumina's MiSeq is becoming one of the most utilized sequencing platforms worldwide. The platform is manageable and affordable even for smaller labs. This enables quick turnaround on a broad range of applications such as targeted gene sequencing, metagenomics, small genome sequencing and clinical molecular diagnostics. However, Illumina error profiles are still poorly understood and programs are therefore not designed for the idiosyncrasies of Illumina data. A better knowledge of the error patterns is essential for sequence analysis and vital if we are to draw valid conclusions. Studying true genetic variation in a population sample is fundamental for understanding diseases, evolution and origin. We conducted a large study on the error patterns for the MiSeq based on 16S rRNA amplicon sequencing data. We tested state-of-the-art library preparation methods for amplicon sequencing and showed that the library preparation method and the choice of primers are the most significant sources of bias and cause distinct error patterns. Furthermore we tested the efficiency of various error correction strategies and identified quality trimming (Sickle) combined with error correction (BayesHammer) followed by read overlapping (PANDAseq) as the most successful approach, reducing substitution error rates on average by 93%. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Beyond Correctness: Development and Validation of Concept-Based Categorical Scoring Rubrics for Diagnostic Purposes

    ERIC Educational Resources Information Center

    Arieli-Attali, Meirav; Liu, Ying

    2016-01-01

    Diagnostic assessment approaches intend to provide fine-grained reports of what students know and can do, focusing on their areas of strengths and weaknesses. However, current application of such diagnostic approaches is limited by the scoring method for item responses; important diagnostic information, such as type of errors and strategy use is…

  3. Measures to Improve Diagnostic Safety in Clinical Practice

    PubMed Central

    Singh, Hardeep; Graber, Mark L; Hofer, Timothy P

    2016-01-01

    Timely and accurate diagnosis is foundational to good clinical practice and an essential first step to achieving optimal patient outcomes. However, a recent Institute of Medicine report concluded that most of us will experience at least one diagnostic error in our lifetime. The report argues for efforts to improve the reliability of the diagnostic process through better measurement of diagnostic performance. The diagnostic process is a dynamic team-based activity that involves uncertainty, plays out over time, and requires effective communication and collaboration among multiple clinicians, diagnostic services, and the patient. Thus, it poses special challenges for measurement. In this paper, we discuss how the need to develop measures to improve diagnostic performance could move forward at a time when the scientific foundation needed to inform measurement is still evolving. We highlight challenges and opportunities for developing potential measures of “diagnostic safety” related to clinical diagnostic errors and associated preventable diagnostic harm. In doing so, we propose a starter set of measurement concepts for initial consideration that seem reasonably related to diagnostic safety, and call for these to be studied and further refined. This would enable safe diagnosis to become an organizational priority and facilitate quality improvement. Health care systems should consider measurement and evaluation of diagnostic performance as essential to timely and accurate diagnosis and to the reduction of preventable diagnostic harm. PMID:27768655

  4. Is forceps more useful than visualization for measurement of colon polyp size?

    PubMed Central

    Kim, Jae Hyun; Park, Seun Ja; Lee, Jong Hoon; Kim, Tae Oh; Kim, Hyun Jin; Kim, Hyung Wook; Lee, Sang Heon; Baek, Dong Hoon; (BIGS), Busan Ulsan Gyeongnam Intestinal Study Group Society

    2016-01-01

    AIM: To identify whether the forceps estimation is more useful than visual estimation in the measurement of colon polyp size. METHODS: We recorded colonoscopy video clips that included scenes visualizing the polyp and scenes using open biopsy forceps in association with the polyp, which were used for an exam. A total of 40 endoscopists from the Busan Ulsan Gyeongnam Intestinal Study Group Society (BIGS) participated in this study. Participants watched 40 pairs of video clips of the scenes for visual estimation and forceps estimation, and wrote down the estimated polyp size on the exam paper. When analyzing the results of the exam, we assessed inter-observer differences, diagnostic accuracy, and error range in the measurement of the polyp size. RESULTS: The overall intra-class correlation coefficients (ICC) of inter-observer agreement for forceps estimation and visual estimation were 0.804 (95%CI: 0.731-0.873, P < 0.001) and 0.743 (95%CI: 0.656-0.828, P < 0.001), respectively. The ICCs of each group for forceps estimation were higher than those for visual estimation (Beginner group, 0.761 vs 0.693; Expert group, 0.887 vs 0.840, respectively). The overall diagnostic accuracy for visual estimation was 0.639 and for forceps estimation was 0.754 (P < 0.001). In the beginner group and the expert group, the diagnostic accuracy for the forceps estimation was significantly higher than that of the visual estimation (Beginner group, 0.734 vs 0.613, P < 0.001; Expert group, 0.784 vs 0.680, P < 0.001, respectively). The overall error range for visual estimation and forceps estimation were 1.48 ± 1.18 and 1.20 ± 1.10, respectively (P < 0.001). The error ranges of each group for forceps estimation were significantly smaller than those for visual estimation (Beginner group, 1.38 ± 1.08 vs 1.68 ± 1.30, P < 0.001; Expert group, 1.12 ± 1.11 vs 1.42 ± 1.11, P < 0.001, respectively). CONCLUSION: Application of the open biopsy forceps method when measuring colon polyp size could help reduce inter-observer differences and error rates. PMID:27003999

  5. Is forceps more useful than visualization for measurement of colon polyp size?

    PubMed

    Kim, Jae Hyun; Park, Seun Ja; Lee, Jong Hoon; Kim, Tae Oh; Kim, Hyun Jin; Kim, Hyung Wook; Lee, Sang Heon; Baek, Dong Hoon; Bigs, Busan Ulsan Gyeongnam Intestinal Study Group Society

    2016-03-21

    To identify whether the forceps estimation is more useful than visual estimation in the measurement of colon polyp size. We recorded colonoscopy video clips that included scenes visualizing the polyp and scenes using open biopsy forceps in association with the polyp, which were used for an exam. A total of 40 endoscopists from the Busan Ulsan Gyeongnam Intestinal Study Group Society (BIGS) participated in this study. Participants watched 40 pairs of video clips of the scenes for visual estimation and forceps estimation, and wrote down the estimated polyp size on the exam paper. When analyzing the results of the exam, we assessed inter-observer differences, diagnostic accuracy, and error range in the measurement of the polyp size. The overall intra-class correlation coefficients (ICC) of inter-observer agreement for forceps estimation and visual estimation were 0.804 (95%CI: 0.731-0.873, P < 0.001) and 0.743 (95%CI: 0.656-0.828, P < 0.001), respectively. The ICCs of each group for forceps estimation were higher than those for visual estimation (Beginner group, 0.761 vs 0.693; Expert group, 0.887 vs 0.840, respectively). The overall diagnostic accuracy for visual estimation was 0.639 and for forceps estimation was 0.754 (P < 0.001). In the beginner group and the expert group, the diagnostic accuracy for the forceps estimation was significantly higher than that of the visual estimation (Beginner group, 0.734 vs 0.613, P < 0.001; Expert group, 0.784 vs 0.680, P < 0.001, respectively). The overall error range for visual estimation and forceps estimation were 1.48 ± 1.18 and 1.20 ± 1.10, respectively (P < 0.001). The error ranges of each group for forceps estimation were significantly smaller than those for visual estimation (Beginner group, 1.38 ± 1.08 vs 1.68 ± 1.30, P < 0.001; Expert group, 1.12 ± 1.11 vs 1.42 ± 1.11, P < 0.001, respectively). Application of the open biopsy forceps method when measuring colon polyp size could help reduce inter-observer differences and error rates.

  6. When do latent class models overstate accuracy for diagnostic and other classifiers in the absence of a gold standard?

    PubMed

    Spencer, Bruce D

    2012-06-01

    Latent class models are increasingly used to assess the accuracy of medical diagnostic tests and other classifications when no gold standard is available and the true state is unknown. When the latent class is treated as the true class, the latent class models provide measures of components of accuracy including specificity and sensitivity and their complements, type I and type II error rates. The error rates according to the latent class model differ from the true error rates, however, and empirical comparisons with a gold standard suggest the true error rates often are larger. We investigate conditions under which the true type I and type II error rates are larger than those provided by the latent class models. Results from Uebersax (1988, Psychological Bulletin 104, 405-416) are extended to accommodate random effects and covariates affecting the responses. The results are important for interpreting the results of latent class analyses. An error decomposition is presented that incorporates an error component from invalidity of the latent class model. © 2011, The International Biometric Society.

  7. “Reducing unnecessary testing in a CPOE system through implementation of a targeted CDS intervention”

    PubMed Central

    2013-01-01

    Background We describe and evaluate the development and use of a Clinical Decision Support (CDS) intervention; an alert, in response to an identified medical error of overuse of a diagnostic laboratory test in a Computerized Physician Order Entry (CPOE) system. CPOE with embedded CDS has been shown to improve quality of care and reduce medical errors. CPOE can also improve resource utilization through more appropriate use of laboratory tests and diagnostic studies. Observational studies are necessary in order to understand how these technologies can be successfully employed by healthcare providers. Methods The error was identified by the Test Utilization Committee (TUC) in September, 2008 when they noticed critical care patients were being tested daily, and sometimes twice daily, for B-Type Natriuretic Peptide (BNP). Repeat and/or serial BNP testing is inappropriate for guiding the management of heart failure and may be clinically misleading. The CDS intervention consists of an expert rule that searches the system for a BNP lab value on the patient. If there is a value and the value is within the current hospital stay, an advisory is displayed to the ordering clinician. In order to isolate the impact of this intervention on unnecessary BNP testing we applied multiple regression analysis to the sample of 41,306 patient admissions with at least one BNP test at LVHN between January, 2008 and September, 2011. Results Our regression results suggest the CDS intervention reduced BNP orders by 21% relative to the mean. The financial impact of the rule was also significant. Multiplying by the direct supply cost of $28.04 per test, the intervention saved approximately $92,000 per year. Conclusions The use of alerts has great positive potential to improve care, but should be used judiciously and in the appropriate environment. While these savings may not be generalizable to other interventions, the experience at LVHN suggests that appropriately designed and carefully implemented CDS interventions can have a substantial impact on the efficiency of care provision. PMID:23566021

  8. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan

    PubMed Central

    2017-01-01

    Background Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Materials and Methods Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician’s request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. Results The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani hospitals focus only on the analytical phase, and none of the pre-analytical errors were recorded. Interestingly, none of the labs were internationally accredited; therefore, corrective actions are needed at these hospitals to ensure better health outcomes. Internal and External Quality Assessment Schemes (EQAS) for the pre-analytical phase at Sulaimani clinical laboratories should be implemented at public hospitals. Furthermore, lab personnel, particularly phlebotomists, need continuous training on the importance of sample quality to obtain accurate test results. PMID:28107395
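
    The proportional Z test mentioned above can be run, for example, with statsmodels as sketched below; the counts are purely hypothetical, since the abstract reports percentages rather than the raw counts behind its comparison.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts for illustration only: the abstract reports percentages for the
# hospitals compared, not the raw sample counts behind its proportional Z test.
rejected = [45, 22]        # non-conforming samples at hospital A and hospital B
observed = [1500, 1300]    # total samples observed at each hospital

stat, p_value = proportions_ztest(count=rejected, nobs=observed)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
```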

  9. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    PubMed

    Najat, Dereen

    2017-01-01

    Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most laboratory errors have traditionally been attributed to the analytical phase. However, recent studies have shown that up to 70% of laboratory errors originate in the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation and specimen collection, transportation and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and to examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani hospitals focus only on the analytical phase, and none of the pre-analytical errors were recorded. Interestingly, none of the labs were internationally accredited; therefore, corrective actions are needed at these hospitals to ensure better health outcomes. Internal and external quality assessment schemes (EQAS) covering the pre-analytical phase should be implemented at Sulaimani public hospital laboratories. Furthermore, lab personnel, particularly phlebotomists, need continuous training on the importance of sample quality to obtain accurate test results.

  10. Examining the diagnostic utility of the DSM-5 PTSD symptoms among male and female returning veterans.

    PubMed

    Green, Jonathan D; Annunziata, Anthony; Kleiman, Sarah E; Bovin, Michelle J; Harwell, Aaron M; Fox, Annie M L; Black, Shimrit K; Schnurr, Paula P; Holowka, Darren W; Rosen, Raymond C; Keane, Terence M; Marx, Brian P

    2017-08-01

    Posttraumatic stress disorder (PTSD) diagnostic criteria have been criticized for including symptoms that overlap with commonly comorbid disorders, which critics argue undermines the validity of the diagnosis and inflates psychiatric comorbidity rates. In response, the upcoming 11th edition of the International Classification of Diseases (ICD-11) will offer PTSD diagnostic criteria that are intended to promote diagnostic accuracy. However, diagnostic utility analyses have not yet assessed whether these criteria minimize diagnostic errors. The present study examined the diagnostic utility of each PTSD symptom in the fifth edition of the Diagnostic and Statistical Manual for Mental Disorders (DSM-5) for males and females. Participants were 1,347 individuals enrolled in a longitudinal national registry of returning veterans receiving care at a Department of Veterans Affairs (VA) facility. Doctoral level clinicians assessed all participants using the PTSD module of the Structured Clinical Interview for DSM. Of the 20 symptoms examined, the majority performed in the fair to poor range on test quality indices. Although a few items did perform in the good (or better) range, only half were ICD-11 symptoms. None of the 20 symptoms demonstrated good quality of efficiency. Results demonstrated few sex differences across indices. There were no differences in the proportion of comorbid psychiatric disorders or functional impairment between DSM-5 and ICD-11 criteria. ICD-11 PTSD criteria demonstrate neither greater diagnostic specificity nor reduced rates of comorbidity relative to DSM-5 criteria and, as such, do not perform as intended. Modifications to existing symptoms or new symptoms may improve differential diagnosis. © 2017 Wiley Periodicals, Inc.

  11. Diagnostic grade wireless ECG monitoring.

    PubMed

    Garudadri, Harinath; Chi, Yuejie; Baker, Steve; Majumdar, Somdeb; Baheti, Pawan K; Ballard, Dan

    2011-01-01

    In remote monitoring of Electrocardiogram (ECG), it is very important to ensure that the diagnostic integrity of signals is not compromised by sensing artifacts and channel errors. It is also important for the sensors to be extremely power efficient to enable wearable form factors and long battery life. We present an application of Compressive Sensing (CS) as an error mitigation scheme at the application layer for wearable, wireless sensors in diagnostic grade remote monitoring of ECG. In our previous work, we described an approach to mitigate errors due to packet losses by projecting ECG data to a random space and recovering a faithful representation using sparse reconstruction methods. Our contributions in this work are twofold. First, we present an efficient hardware implementation of random projection at the sensor. Second, we validate the diagnostic integrity of the reconstructed ECG after packet loss mitigation. We validate our approach on MIT and AHA databases comprising more than 250,000 normal and abnormal beats using EC57 protocols adopted by the Food and Drug Administration (FDA). We show that sensitivity and positive predictivity of a state-of-the-art ECG arrhythmia classifier is essentially invariant under CS based packet loss mitigation for both normal and abnormal beats even at high packet loss rates. In contrast, the performance degrades significantly in the absence of any error mitigation scheme, particularly for abnormal beats such as Ventricular Ectopic Beats (VEB).
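
    The scheme relies on random projection at the sensor and sparse reconstruction at the receiver. The snippet below is a generic compressive-sensing sketch (synthetic signal, DCT sparsity basis, orthogonal matching pursuit), not the authors' hardware implementation or their reconstruction algorithm.

```python
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m = 256, 128                                   # signal length, measurements received

# Synthetic signal that is exactly sparse in the DCT domain (toy stand-in for ECG)
a_true = np.zeros(n)
a_true[[3, 17, 40]] = [1.0, 0.6, -0.4]
x = idct(a_true, norm="ortho")

Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # random projection applied at the sensor
y = Phi @ x                                       # measurements that survive transmission

Psi = idct(np.eye(n), axis=0, norm="ortho")       # columns are DCT atoms, so x = Psi @ a
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10).fit(Phi @ Psi, y)
x_hat = Psi @ omp.coef_                           # sparse reconstruction of the segment

print("relative error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```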

  12. Second opinion oral pathology referrals in New Zealand.

    PubMed

    Seo, B; Hussaini, H M; Rich, A M

    2017-04-01

    Referral for a second opinion is an important aspect of pathology practice that reduces the rate of diagnostic error and ensures diagnostic consistency. The Oral Pathology Centre (OPC) is the only specialist oral diagnostic centre in New Zealand. The OPC provides diagnostic services to dentists and dental specialists throughout New Zealand and acts as a referral centre for second opinions on oral pathology specimens that have been sent to anatomical pathologists. The aim of this study was to review second opinion referral cases sent to the OPC over a 15-year period and to assess the levels of concordance between the original and final diagnoses. The findings indicated that the majority of referred cases were odontogenic lesions, followed by connective tissue, epithelial and salivary lesions. The most prevalent diagnoses were ameloblastoma and keratocystic odontogenic tumour, followed by oral squamous cell carcinoma. Discordant diagnoses were recorded in 24% of cases. Diagnostic discrepancies were higher in odontogenic and salivary gland lesions, resulting in changes to the diagnosis. Second opinions on oral pathology cases should be encouraged in view of the relative rarity of these lesions in general pathology laboratories and the rates of diagnostic discrepancy, particularly for odontogenic and salivary gland lesions. Copyright © 2017 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  13. Multistrip western blotting to increase quantitative data output.

    PubMed

    Kiyatkin, Anatoly; Aksamitiene, Edita

    2009-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per single blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same loading of the sample, and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data, and therefore is beneficial to apply in biomedical diagnostics, systems biology, and cell signaling research.

  14. "First, know thyself": cognition and error in medicine.

    PubMed

    Elia, Fabrizio; Aprà, Franco; Verhovez, Andrea; Crupi, Vincenzo

    2016-04-01

    Although error is an integral part of the world of medicine, physicians have always been little inclined to take their own mistakes into account, and the extraordinary technological progress observed in recent decades does not seem to have resulted in a significant reduction in the percentage of diagnostic errors. This failure to reduce diagnostic errors, notwithstanding the considerable investment in human and economic resources, has paved the way to new strategies made available by the development of cognitive psychology, the branch of psychology that aims at understanding the mechanisms of human reasoning. This new approach led us to realize that we are not fully rational agents able to take decisions on the basis of logical and probabilistically appropriate evaluations. In us, two different and mostly independent modes of reasoning coexist: a fast or non-analytical mode, which tends to be largely automatic and fast-reacting, and a slow or analytical mode, which permits rationally founded answers. One of the features of the fast mode of reasoning is the employment of standardized rules, termed "heuristics." Heuristics lead physicians to correct choices in a large percentage of cases. Unfortunately, cases exist wherein the heuristic triggered fails to fit the target problem, so that the fast mode of reasoning can lead us to unreflectively perform actions that expose us and others to variable degrees of risk. Cognitive errors arise as a result of these cases. Our review illustrates how cognitive errors can cause diagnostic problems in clinical practice.

  15. An Instructor's Diagnostic Aid for Feedback in Training.

    ERIC Educational Resources Information Center

    Andrews, Dee H.; Uliano, Kevin C.

    1988-01-01

    Instructor's Diagnostic Aid for Feedback in Training (IDAFT) is a computer-assisted method based on error analysis, domains of learning, and events of instruction. Its use with Navy team instructors is currently being explored. (JOW)

  16. Error tolerance analysis of wave diagnostic based on coherent modulation imaging in high power laser system

    NASA Astrophysics Data System (ADS)

    Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang

    2018-02-01

    Coherent modulation imaging, which provides fast convergence and high resolution from a single diffraction pattern, is a promising technique for satisfying the urgent demand for on-line multi-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the final calculated parameters of interest has not yet been investigated. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of the HPLF, a quantitative analysis based on statistical results was carried out for the first time, considering five different error sources. We found that detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis provide potential directions for further improving the final accuracy of parameter diagnostics, which is critically important for its formal application in the daily routines of the HPLF.

  17. Comparing diagnostic tests on benefit-risk.

    PubMed

    Pennello, Gene; Pantoja-Galicia, Norberto; Evans, Scott

    2016-01-01

    Comparing diagnostic tests on accuracy alone can be inconclusive. For example, a test may have better sensitivity than another test yet worse specificity. Comparing tests on benefit risk may be more conclusive because clinical consequences of diagnostic error are considered. For benefit-risk evaluation, we propose diagnostic yield, the expected distribution of subjects with true positive, false positive, true negative, and false negative test results in a hypothetical population. We construct a table of diagnostic yield that includes the number of false positive subjects experiencing adverse consequences from unnecessary work-up. We then develop a decision theory for evaluating tests. The theory provides additional interpretation to quantities in the diagnostic yield table. It also indicates that the expected utility of a test relative to a perfect test is a weighted accuracy measure, the average of sensitivity and specificity weighted for prevalence and relative importance of false positive and false negative testing errors, also interpretable as the cost-benefit ratio of treating non-diseased and diseased subjects. We propose plots of diagnostic yield, weighted accuracy, and relative net benefit of tests as functions of prevalence or cost-benefit ratio. Concepts are illustrated with hypothetical screening tests for colorectal cancer with test positive subjects being referred to colonoscopy.
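
    As a rough illustration of the quantities described above, the sketch below tabulates diagnostic yield for a hypothetical screened population and computes a weighted accuracy; the abstract does not give the exact weighting, so the prevalence- and cost-weighted average shown here is an assumed form.

```python
def diagnostic_yield(prevalence, sensitivity, specificity, n=100_000):
    """Expected TP/FN/TN/FP counts in a hypothetical screened population of size n."""
    diseased = n * prevalence
    healthy = n * (1 - prevalence)
    return {
        "TP": diseased * sensitivity,
        "FN": diseased * (1 - sensitivity),
        "TN": healthy * specificity,
        "FP": healthy * (1 - specificity),
    }

def weighted_accuracy(prevalence, sensitivity, specificity, r):
    """Average of sensitivity and specificity weighted by prevalence and by r, the
    assumed relative importance (cost-benefit ratio) of false positive vs false
    negative results."""
    w = prevalence
    return (w * sensitivity + r * (1 - w) * specificity) / (w + r * (1 - w))

# Hypothetical colorectal cancer screening test
print(diagnostic_yield(prevalence=0.005, sensitivity=0.92, specificity=0.87))
print(weighted_accuracy(prevalence=0.005, sensitivity=0.92, specificity=0.87, r=0.02))
```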

  18. Clinical laboratory: bigger is not always better.

    PubMed

    Plebani, Mario

    2018-06-27

    Laboratory services around the world are undergoing substantial consolidation and change through mechanisms ranging from mergers and acquisitions to outsourcing, primarily based on expectations of improving efficiency, increasing volumes and reducing the cost per test. However, the relationship between volume and costs is not linear, and numerous variables influence the end cost per test. In particular, the relationship between volumes and costs does not hold across the entire spectrum of clinical laboratories: high costs are associated with low volumes up to a threshold of 1 million tests per year. Above this threshold, there is no linear association between volumes and costs, as laboratory organization rather than test volume more significantly affects the final costs. Currently, data on laboratory errors and the associated diagnostic errors and risk of patient harm emphasize the need for a paradigmatic shift: from a focus on volumes and efficiency to a patient-centered vision restoring the nature of laboratory services as an integral part of the diagnostic and therapeutic process. Process and outcome quality indicators are effective tools to measure and improve laboratory services by stimulating competition based on intra- and extra-analytical performance specifications, intermediate outcomes and customer satisfaction. Rather than competing on economic value alone, clinical laboratories should adopt a strategy based on a set of harmonized quality indicators and performance specifications, active laboratory stewardship, and improved patient safety.

  19. Elastic scattering spectroscopy for detection of cancer risk in Barrett's esophagus: experimental and clinical validation of error removal by orthogonal subtraction for increasing accuracy

    NASA Astrophysics Data System (ADS)

    Zhu, Ying; Fearn, Tom; MacKenzie, Gary; Clark, Ben; Dunn, Jason M.; Bigio, Irving J.; Bown, Stephen G.; Lovat, Laurence B.

    2009-07-01

    Elastic scattering spectroscopy (ESS) may be used to detect high-grade dysplasia (HGD) or cancer in Barrett's esophagus (BE). When spectra are measured in vivo by a hand-held optical probe, variability among replicated spectra from the same site can hinder the development of a diagnostic model for cancer risk. An experiment was carried out on excised tissue to investigate how two potential sources of this variability, pressure and angle, influence spectral variability, and the results were compared with the variations observed in spectra collected in vivo from patients with Barrett's esophagus. A statistical method called error removal by orthogonal subtraction (EROS) was applied to the in vivo data to model and remove this measurement variability, which accounted for 96.6% of the variation in the spectra. Its removal allowed the construction of a diagnostic model with specificity improved from 67% to 82% (with sensitivity fixed at 90%). The improvement was maintained in predictions on an independent in vivo data set. EROS works well as an effective pretreatment for Barrett's in vivo data by identifying measurement variability and ameliorating its effect. The procedure reduces the complexity and increases the accuracy and interpretability of the model for classification and detection of cancer risk in Barrett's esophagus.
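
    EROS is a specific published pre-treatment; the snippet below is only a generic sketch of the underlying idea (estimate the measurement-variability subspace from replicate-difference spectra and project it out of the data), with invented array shapes.

```python
import numpy as np

def orthogonal_subtraction(X, D, k=3):
    """Project spectra X (rows = spectra) onto the complement of the k leading
    directions of replicate-difference variation D. Generic sketch of an
    EROS-style pre-treatment, not the authors' exact algorithm."""
    _, _, Vt = np.linalg.svd(D - D.mean(axis=0), full_matrices=False)
    V = Vt[:k].T                              # (n_wavelengths, k) nuisance directions
    return X - (X @ V) @ V.T                  # remove the nuisance subspace

# X: in vivo spectra, D: differences between replicate spectra from the same site
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 300))            # invented placeholder data
D = rng.standard_normal((120, 300))
X_clean = orthogonal_subtraction(X, D, k=3)
```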

  20. Concept for tremor compensation for a handheld OCT-laryngoscope

    NASA Astrophysics Data System (ADS)

    Donner, Sabine; Deutsch, Stefanie; Bleeker, Sebastian; Ripken, Tammo; Krüger, Alexander

    2013-06-01

    Optical coherence tomography (OCT) is a non-invasive imaging technique which can create optical tissue sections, enabling diagnosis of vocal cord tissue. To take full advantage of this non-contact imaging technique, OCT was adapted to an indirect laryngoscope to work on awake patients. Using OCT in a handheld diagnostic device raises the challenges of rapid working-distance adjustment and tracking of axial motion. The optical focus of the endoscopic sample arm and the reference-arm length can be adjusted over a range of 40 mm to 90 mm. Automatic working-distance adjustment is based on image analysis of OCT B-scans, which identifies off-depth images as well as position errors. The movable focal plane and reference plane are used to adjust the working distance to match the sample depth and to stabilise the sample in the desired axial position of the OCT scans. The autofocus adjusts the working distance within a maximum of 2.7 seconds for the maximum initial displacement of 40 mm. The amplitude of hand tremor during 60 s of handheld scanning was reduced to 50%, and it was shown that the image stabilisation keeps the position error below 0.5 mm. Fast automatic working-distance adjustment is crucial to minimise the duration of the diagnostic procedure. The image stabilisation compensates for relative axial movements during handheld scanning.

  1. Random Versus Nonrandom Peer Review: A Case for More Meaningful Peer Review.

    PubMed

    Itri, Jason N; Donithan, Adam; Patel, Sohil H

    2018-05-10

    Random peer review programs are not optimized to discover cases with diagnostic error and thus have inherent limitations with respect to educational and quality improvement value. Nonrandom peer review offers an alternative approach in which diagnostic error cases are targeted for collection during routine clinical practice. The objective of this study was to compare error cases identified through random and nonrandom peer review approaches at an academic center. During the 1-year study period, the number of discrepancy cases and score of discrepancy were determined from each approach. The nonrandom peer review process collected 190 cases, of which 60 were scored as 2 (minor discrepancy), 94 as 3 (significant discrepancy), and 36 as 4 (major discrepancy). In the random peer review process, 1,690 cases were reviewed, of which 1,646 were scored as 1 (no discrepancy), 44 were scored as 2 (minor discrepancy), and none were scored as 3 or 4. Several teaching lessons and quality improvement measures were developed as a result of analysis of error cases collected through the nonrandom peer review process. Our experience supports the implementation of nonrandom peer review as a replacement to random peer review, with nonrandom peer review serving as a more effective method for collecting diagnostic error cases with educational and quality improvement value. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  2. Performance of Physical Examination Skills in Medical Students during Diagnostic Medicine Course in a University Hospital of Northwest China

    PubMed Central

    Li, Yan; Li, Na; Han, Qunying; He, Shuixiang; Bae, Ricard S.; Liu, Zhengwen; Lv, Yi; Shi, Bingyin

    2014-01-01

    This study was conducted to evaluate the performance of physical examination (PE) skills during our diagnostic medicine course and to analyze the characteristics of the collected data in order to provide practical guidance for improving the quality of teaching. Seventy-two fourth-year medical students were enrolled in the study. All received an assessment of PE skills after receiving a 17-week formal training course and systematic teaching. Their performance was evaluated and recorded in detail using a checklist, which included 5 aspects of PE skills: examination techniques, communication and care skills, content items, appropriateness of examination sequence, and time taken. Error frequency and type were designated as the assessment parameters in the survey. The results showed that the distribution and percentage of examination errors between male and female students and among the different body parts examined were significantly different (p<0.001). The average error frequency per student in females (0.875) was lower than in males (1.375), although the difference was not statistically significant (p = 0.167). The average error frequency per student in cardiac (1.267) and pulmonary (1.389) examinations was higher than in abdominal (0.867) and head, neck and nervous system examinations (0.917). Female students had a lower average error frequency than males in cardiac examinations (p = 0.041). Additionally, errors in examination technique were the most frequent type of error among the 5 aspects of PE skills, irrespective of participant gender and assessment content (p<0.001). These data suggest that PE skills in cardiac and pulmonary examinations and examination techniques may be a main focus for improving the teaching of diagnostics to these medical students. PMID:25329685

  3. Performance of physical examination skills in medical students during diagnostic medicine course in a University Hospital of Northwest China.

    PubMed

    Li, Yan; Li, Na; Han, Qunying; He, Shuixiang; Bae, Ricard S; Liu, Zhengwen; Lv, Yi; Shi, Bingyin

    2014-01-01

    This study was conducted to evaluate the performance of physical examination (PE) skills during our diagnostic medicine course and to analyze the characteristics of the collected data in order to provide practical guidance for improving the quality of teaching. Seventy-two fourth-year medical students were enrolled in the study. All received an assessment of PE skills after receiving a 17-week formal training course and systematic teaching. Their performance was evaluated and recorded in detail using a checklist, which included 5 aspects of PE skills: examination techniques, communication and care skills, content items, appropriateness of examination sequence, and time taken. Error frequency and type were designated as the assessment parameters in the survey. The results showed that the distribution and percentage of examination errors between male and female students and among the different body parts examined were significantly different (p<0.001). The average error frequency per student in females (0.875) was lower than in males (1.375), although the difference was not statistically significant (p = 0.167). The average error frequency per student in cardiac (1.267) and pulmonary (1.389) examinations was higher than in abdominal (0.867) and head, neck and nervous system examinations (0.917). Female students had a lower average error frequency than males in cardiac examinations (p = 0.041). Additionally, errors in examination technique were the most frequent type of error among the 5 aspects of PE skills, irrespective of participant gender and assessment content (p<0.001). These data suggest that PE skills in cardiac and pulmonary examinations and examination techniques may be a main focus for improving the teaching of diagnostics to these medical students.

  4. The Relationship of Error and Correction of Error in Oral Reading to Visual-Form Perception and Word Attack Skills.

    ERIC Educational Resources Information Center

    Clayman, Deborah P. Goldweber

    The ability of 100 second-grade boys and girls to self-correct oral reading errors was studied in relationship to visual-form perception, phonic skills, response speed, and reading level. Each child was tested individually with the Bender-Error Test, the Gray Oral Paragraphs, and the Roswell-Chall Diagnostic Reading Test and placed into a group of…

  5. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments. In doing so, IDA addresses typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions allowing the comparison and integration of different diagnostic results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA capabilities of nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and the design of diagnostics. To cope with next-step fusion device requirements, appropriate techniques are explored for fast analysis applications.
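
    As a minimal illustration of the IDA idea of combining diagnostics with quantified uncertainties, the sketch below merges two independent Gaussian measurement likelihoods of the same quantity under a flat prior (inverse-variance weighting); the numbers are hypothetical and the real framework handles far richer error models.

```python
import numpy as np

def combine_gaussian_measurements(means, sigmas):
    """Combine independent Gaussian measurements of the same quantity under a flat
    prior: inverse-variance weighting gives the joint (posterior) mean and width."""
    means = np.asarray(means, dtype=float)
    weights = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mu = np.sum(weights * means) / np.sum(weights)
    sigma = 1.0 / np.sqrt(np.sum(weights))
    return mu, sigma

# Hypothetical electron-temperature estimates (keV) from two diagnostics
print(combine_gaussian_measurements([2.1, 2.4], [0.2, 0.3]))
```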

  6. Impact of input field characteristics on vibrational femtosecond coherent anti-Stokes Raman scattering thermometry.

    PubMed

    Yang, Chao-Bo; He, Ping; Escofet-Martin, David; Peng, Jiang-Bo; Fan, Rong-Wei; Yu, Xin; Dunn-Rankin, Derek

    2018-01-10

    In this paper, three ultrashort-pulse coherent anti-Stokes Raman scattering (CARS) thermometry approaches are summarized with a theoretical time-domain model. The differences between the approaches can be attributed to variations in the input field characteristics of the time-domain model. That is, all three approaches to ultrashort-pulse CARS thermometry can be simulated with the unified model by changing only the input field features. As a specific example, hybrid femtosecond/picosecond CARS is assessed for its use in combustion flow diagnostics; the examination of how the input fields affect thermometry therefore focuses on vibrational hybrid femtosecond/picosecond CARS. Beginning with the general model of ultrashort-pulse CARS, spectra with different input field parameters are simulated. To analyze the temperature measurement error introduced by the input fields, the spectra are fitted and compared with fits from a model that neglects the influence of the input fields. The results demonstrate that, however the input pulses are characterized, temperature errors would still be introduced during an experiment. With proper field characterization, however, the significance of the error can be reduced.

  7. Numerical Error Estimation with UQ

    NASA Astrophysics Data System (ADS)

    Ackmann, Jan; Korn, Peter; Marotzke, Jochem

    2014-05-01

    Ocean models are still in need of means to quantify model errors, which are inevitably made when running numerical experiments. The total model error can formally be decomposed into two parts, the formulation error and the discretization error. The formulation error arises from the continuous formulation of the model not fully describing the studied physical process. The discretization error arises from having to solve a discretized model instead of the continuously formulated model. Our work on error estimation is concerned with the discretization error. Given a solution of a discretized model, our general problem statement is to find a way to quantify the uncertainties due to discretization in physical quantities of interest (diagnostics), which are frequently used in Geophysical Fluid Dynamics. The approach we use to tackle this problem is called the "Goal Error Ensemble method". The basic idea of the Goal Error Ensemble method is that errors in diagnostics can be translated into a weighted sum of local model errors, which makes it conceptually based on the Dual Weighted Residual method from Computational Fluid Dynamics. In contrast to the Dual Weighted Residual method, these local model errors are not considered deterministically but are interpreted as local model uncertainty and described stochastically by a random process. The parameters for the random process are tuned with high-resolution near-initial model information. However, the original Goal Error Ensemble method, introduced in [1], was successfully evaluated only in the case of inviscid flows without lateral boundaries in a shallow-water framework and is hence only of limited use in a numerical ocean model. Our work consists of extending the method to bounded, viscous flows in a shallow-water framework. As our numerical model, we use the ICON-Shallow-Water model. In viscous flows our high-resolution information depends on the viscosity parameter, making our uncertainty measures viscosity-dependent. We will show that a sensible parameter can be chosen by using the Reynolds number as a criterion. Another topic we will discuss is the choice of the underlying distribution of the random process. This is especially important in the presence of lateral boundaries. We will present resulting error estimates for different height- and velocity-based diagnostics applied to the Munk gyre experiment. References [1] F. RAUSER: Error Estimation in Geophysical Fluid Dynamics through Learning; PhD Thesis, IMPRS-ESM, Hamburg, 2010 [2] F. RAUSER, J. MAROTZKE, P. KORN: Ensemble-type numerical uncertainty quantification from single model integrations; SIAM/ASA Journal on Uncertainty Quantification, submitted

  8. Multistrip Western blotting: a tool for comparative quantitative analysis of multiple proteins.

    PubMed

    Aksamitiene, Edita; Hoek, Jan B; Kiyatkin, Anatoly

    2015-01-01

    The qualitative and quantitative measurements of protein abundance and modification states are essential in understanding their functions in diverse cellular processes. Typical Western blotting, though sensitive, is prone to produce substantial errors and is not readily adapted to high-throughput technologies. Multistrip Western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip Western blotting increases data output per single blotting cycle up to tenfold; allows concurrent measurement of the expression of up to nine different total and/or posttranslationally modified proteins from the same loading of the sample; and substantially improves the data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated sets of data and therefore is advantageous to apply in biomedical diagnostics, systems biology, and cell signaling research.

  9. Tapping into the wisdom in the room: results from participant discussion at the 7th International Conference on Diagnostic Error in Medicine facilitated by a World Café technique.

    PubMed

    Cosby, Karen S; Zipperer, Lorri; Balik, Barbara

    2015-09-01

    The patient safety literature is full of exhortations to approach medical error from a system perspective and seek multidisciplinary solutions from groups including clinicians, patients themselves, as well as experts outside the traditional medical domain. The 7th annual International Conference on Diagnostic Error in Medicine sought to attract a multispecialty audience, and attempted to capture some of the conversations by engaging participants in a World Café, a technique used to stimulate discussion and preserve insight gained during the conference. We present the ideas generated in this session, discuss them in the context of psychological safety, and demonstrate the application of this novel technique.

  10. [Using some modern mathematical models of postmortem cooling of the human body for the time of death determination].

    PubMed

    Vavilov, A Iu; Viter, V I

    2007-01-01

    Mathematical aspects of the data errors of modern thermometric models of postmortem cooling of the human body are considered. The main diagnostic sites used for thermometry are analyzed with a view to minimizing these errors. The authors propose practical recommendations to decrease errors in determining the time since death.

  11. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

    Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
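
    The abstract does not disclose the authors' algorithms; purely as an illustration of the kind of no-reference checks described, the sketch below flags possible focus drift (low high-frequency content via the variance of the Laplacian) and exposure clipping, using invented thresholds.

```python
import numpy as np
from scipy import ndimage

def quality_flags(gray_u8, blur_thresh=50.0, clip_frac=0.05):
    """Very simple no-reference checks on an 8-bit grayscale frame.
    Thresholds are invented and would need tuning for a real camera fleet."""
    img = gray_u8.astype(float)
    return {
        # low high-frequency content suggests the camera has drifted out of focus
        "possible_focus_drift": ndimage.laplace(img).var() < blur_thresh,
        # large fractions of near-black or near-white pixels suggest exposure problems
        "possible_underexposure": np.mean(img < 10) > clip_frac,
        "possible_overexposure": np.mean(img > 245) > clip_frac,
    }
```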

  12. Long-term surface EMG monitoring using K-means clustering and compressive sensing

    NASA Astrophysics Data System (ADS)

    Balouchestani, Mohammadreza; Krishnan, Sridhar

    2015-05-01

    In this work, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory in combination with the K-Singular Value Decomposition (K-SVD) method for clustering long-term recordings of surface Electromyography (sEMG) signals. Long-term monitoring of sEMG signals aims at recording the electrical activity produced by muscles, a procedure that is very useful for treatment and diagnostic purposes as well as for the detection of various pathologies. The proposed algorithm is examined for three scenarios of sEMG signals: a healthy person (sEMG-Healthy), a patient with myopathy (sEMG-Myopathy), and a patient with neuropathy (sEMG-Neuropathy). The proposed algorithm can easily scan large sEMG datasets of long-term sEMG recordings. We test the proposed algorithm with Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC) dimensionality reduction methods. Then, the output of the proposed algorithm is fed to K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers in order to calculate the clustering performance. The proposed algorithm achieves a classification accuracy of 99.22%. This corresponds to reductions of 17% in Average Classification Error (ACE), 9% in Training Error (TE), and 18% in Root Mean Square Error (RMSE). The proposed algorithm also reduces clustering energy consumption by 14% compared to the existing K-means clustering algorithm.
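
    The full pipeline combines compressed sensing, K-SVD and K-means; the snippet below sketches only the clustering stage (PCA for dimensionality reduction followed by K-means on windowed sEMG features), with synthetic data standing in for real recordings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
windows = rng.standard_normal((300, 512))   # 300 windowed sEMG segments of 512 samples (synthetic)

features = PCA(n_components=10).fit_transform(windows)                # dimensionality reduction
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
# three clusters standing in for the healthy / myopathy / neuropathy scenarios
print(np.bincount(labels))
```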

  13. Informatics applied to cytology

    PubMed Central

    Hornish, Maryanne; Goulart, Robert A.

    2008-01-01

    Automation and emerging information technologies are being adopted by cytology laboratories to augment Pap test screening and improve diagnostic accuracy. As a result, informatics, the application of computers and information systems to information management, has become essential for the successful operation of the cytopathology laboratory. This review describes how laboratory information management systems can be used to achieve an automated and seamless workflow process. The utilization of software, electronic databases and spreadsheets to perform necessary quality control measures are discussed, as well as a Lean production system and Six Sigma approach, to reduce errors in the cytopathology laboratory. PMID:19495402

  14. Burnout of Radiologists: Frequency, Risk Factors, and Remedies: A Report of the ACR Commission on Human Resources.

    PubMed

    Harolds, Jay A; Parikh, Jay R; Bluth, Edward I; Dutton, Sharon C; Recht, Michael P

    2016-04-01

    Burnout is a concern for radiologists. The burnout rate is greater among diagnostic radiologists than the mean for all physicians, while radiation oncologists have a slightly lower burnout rate. Burnout can result in unprofessional behavior, thoughts of suicide, premature retirement, and errors in patient care. Strategies to reduce burnout include addressing the sources of job dissatisfaction, instilling lifestyle balance, finding reasons to work other than money, improving money management, developing a support group, and seeking help when needed. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  15. Artificial intelligence in medicine: humans need not apply?

    PubMed

    Diprose, William; Buist, Nicholas

    2016-05-06

    Artificial intelligence (AI) is a rapidly growing field with a wide range of applications. Driven by economic constraints and the potential to reduce human error, we believe that over the coming years AI will perform a significant amount of the diagnostic and treatment decision-making traditionally performed by the doctor. Humans would continue to be an important part of healthcare delivery, but in many situations, less expensive fit-for-purpose healthcare workers could be trained to 'fill the gaps' where AI is less capable. As a result, the role of the doctor as an expensive problem-solver would become redundant.

  16. Errors of logic and scholarship concerning dissociative identity disorder.

    PubMed

    Ross, Colin A

    2009-01-01

    The author reviewed a two-part critique of dissociative identity disorder published in the Canadian Journal of Psychiatry. The two papers contain errors of logic and scholarship. Contrary to the conclusions in the critique, dissociative identity disorder has established diagnostic reliability and concurrent validity, the trauma histories of affected individuals can be corroborated, and the existing prospective treatment outcome literature demonstrates improvement in individuals receiving psychotherapy for the disorder. The available evidence supports the inclusion of dissociative identity disorder in future editions of the Diagnostic and Statistical Manual of Mental Disorders.

  17. A Study on the Reliability of Sasang Constitutional Body Trunk Measurement

    PubMed Central

    Jang, Eunsu; Kim, Jong Yeol; Lee, Haejung; Kim, Honggie; Baek, Younghwa; Lee, Siwoo

    2012-01-01

    Objective. Body trunk measurement in humans plays an important diagnostic role not only in conventional medicine but also in Sasang constitutional medicine (SCM). The Sasang constitutional body trunk measurement (SCBTM) consists of the 5 widths and the 8 circumferences at standard locations currently employed in the SCM community. This study examines to what extent comprehensive training can improve the reliability of the SCBTM. Methods. We recruited 10 male subjects and 5 male observers with no experience of anthropometric measurement. We conducted measurements twice, before and after comprehensive training. Relative technical errors of measurement (%TEMs) were calculated to assess intra- and inter-observer reliabilities. Results. Post-training intra-observer %TEMs of the SCBTM ranged from 0.27% to 1.85%, down from 0.27% to 6.26% pre-training. Post-training inter-observer %TEMs ranged from 0.56% to 1.66%, down from 1.00% to 9.60% pre-training. Post-training total %TEMs, which represent overall reliability, ranged from 0.68% to 2.18%, down from a maximum of 10.18%. Conclusion. Comprehensive training makes the SCBTM more reliable, providing a sufficiently trustworthy diagnostic tool. It is strongly recommended that comprehensive training be given before the SCBTM is taken. PMID:21822442
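
    For context, the relative technical error of measurement reported above is commonly computed from paired repeated measurements; below is a minimal sketch using the standard two-trial formula (assumed here, since the paper's exact computation is not reproduced), with invented measurements.

```python
import numpy as np

def relative_tem(trial1, trial2):
    """%TEM for two repeated measurements per subject (standard two-trial formula)."""
    t1 = np.asarray(trial1, dtype=float)
    t2 = np.asarray(trial2, dtype=float)
    tem = np.sqrt(np.sum((t1 - t2) ** 2) / (2 * len(t1)))
    return 100.0 * tem / np.mean(np.concatenate([t1, t2]))

# Invented chest-circumference measurements (cm) taken twice by one observer
print(relative_tem([88.1, 92.4, 79.8, 101.2], [88.6, 91.9, 80.3, 100.7]))
```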

  18. A manifesto for cardiovascular imaging: addressing the human factor

    PubMed Central

    Fraser, Alan G

    2017-01-01

    Abstract Our use of modern cardiovascular imaging tools has not kept pace with their technological development. Diagnostic errors are common but seldom investigated systematically. Rather than more impressive pictures, our main goal should be more precise tests of function which we select because their appropriate use has therapeutic implications which in turn have a beneficial impact on morbidity or mortality. We should practise analytical thinking, use checklists to avoid diagnostic pitfalls, and apply strategies that will reduce biases and avoid overdiagnosis. We should develop normative databases, so that we can apply diagnostic algorithms that take account of variations with age and risk factors and that allow us to calculate pre-test probability and report the post-test probability of disease. We should report the imprecision of a test, or its confidence limits, so that reference change values can be considered in daily clinical practice. We should develop decision support tools to improve the quality and interpretation of diagnostic imaging, so that we choose the single best test irrespective of modality. New imaging tools should be evaluated rigorously, so that their diagnostic performance is established before they are widely disseminated; this should be a shared responsibility of manufacturers with clinicians, leading to cost-effective implementation. Trials should evaluate diagnostic strategies against independent reference criteria. We should exploit advances in machine learning to analyse digital data sets and identify those features that best predict prognosis or responses to treatment. Addressing these human factors will reap benefit for patients, while technological advances continue unpredictably. PMID:29029029
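
    One concrete instance of the pre-test/post-test reporting advocated above is Bayes' rule in odds form; the sketch below converts a pre-test probability and a test's likelihood ratio into a post-test probability, with hypothetical numbers.

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes' rule in odds form: post-test odds = pre-test odds x likelihood ratio."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical example: 20% pre-test probability, positive result on a test with LR+ of 6
print(post_test_probability(0.20, 6.0))   # ~0.60
```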

  19. Types of diagnostic errors in neurological emergencies in the emergency department.

    PubMed

    Dubosh, Nicole M; Edlow, Jonathan A; Lefton, Micah; Pope, Jennifer V

    2015-02-01

    Neurological emergencies often pose diagnostic challenges for emergency physicians because these patients often present with atypical symptoms and standard imaging tests are imperfect. Misdiagnosis occurs due to a variety of errors. These can be classified as knowledge gaps, cognitive errors, and systems-based errors. The goal of this study was to describe these errors through review of quality assurance (QA) records. This was a retrospective pilot study of patients with neurological emergency diagnoses that were missed or delayed at one urban, tertiary academic emergency department. Cases meeting inclusion criteria were identified through review of QA records. Three emergency physicians independently reviewed each case and determined the type of error that led to the misdiagnosis. Proportions, confidence intervals, and a reliability coefficient were calculated. During the study period, 1168 cases were reviewed. Forty-two cases were found to include a neurological misdiagnosis and twenty-nine were determined to be the result of an error. The distribution of error types was as follows: knowledge gap 45.2% (95% CI 29.2, 62.2), cognitive error 29.0% (95% CI 15.9, 46.8), and systems-based error 25.8% (95% CI 13.5, 43.5). Cerebellar strokes were the most common type of stroke misdiagnosed, accounting for 27.3% of missed strokes. All three error types contributed to the misdiagnosis of neurological emergencies. Misdiagnosis of cerebellar lesions and erroneous radiology resident interpretations of neuroimaging were the most common mistakes. Understanding the types of errors may enable emergency physicians to develop possible solutions and avoid them in the future.

  20. An audit of request forms submitted in a multidisciplinary diagnostic center in Lagos.

    PubMed

    Oyedeji, Olufemi Abiola; Ogbenna, Abiola Ann; Iwuala, Sandra Omozehio

    2015-01-01

    Request forms are important means of communication between physicians and diagnostic service providers. Pre-analytical errors account for over two thirds of errors encountered in diagnostic service provision. The importance of adequate completion of request forms is usually underestimated by physicians which may result in medical errors or delay in instituting appropriate treatment. The aim of this study was to audit the level of completion of request forms presented at a multidisciplinary diagnostic center. A review of all requests forms for investigations which included radiologic, laboratory and cardiac investigations received between July and December 2011 was performed to assess their level of completeness. The data was entered into a spreadsheet and analyzed. Only 1.3% of the 7,841 request forms reviewed were fully completed. Patient's names, the referring physician's name and gender were the most completed information on the forms evaluated with 99.0%, 99.0% and 90.3% completion respectively. Patient's age was provided in 68.0%, request date in 88.2%, and clinical notes/ diagnosis in 65.9% of the requests. Patient's full address was provided in only 5.6% of requests evaluated. This study shows that investigation request forms are inadequately filled by physicians in our environment. Continuous medical education of physicians on the need for adequate completion of request forms is needed.

  1. Awareness of Diagnostic Error among Japanese Residents: a Nationwide Study.

    PubMed

    Nishizaki, Yuji; Shinozaki, Tomohiro; Kinoshita, Kensuke; Shimizu, Taro; Tokuda, Yasuharu

    2018-04-01

    Residents' understanding of diagnostic error may differ between countries. We sought to explore the relationship between diagnostic error knowledge and self-study, clinical knowledge, and experience. Our nationwide study involved postgraduate year 1 and 2 (PGY-1 and -2) Japanese residents. The Diagnostic Error Knowledge Assessment Test (D-KAT) and General Medicine In-Training Examination (GM-ITE) were administered at the end of the 2014 academic year. D-KAT scores were compared with the benchmark scores of US residents. Associations between D-KAT score and gender, PGY, emergency department (ED) rotations per month, mean number of inpatients handled at any given time, and mean daily minutes of self-study were also analyzed, both with and without adjusting for GM-ITE scores. Student's t test was used for comparisons, and linear mixed models and structural equation models (SEM) were used to explore associations with D-KAT or GM-ITE scores. The mean D-KAT score among Japanese PGY-2 residents was significantly lower than that of their US PGY-2 counterparts (6.2 vs. 8.3, p < 0.001). GM-ITE scores correlated with ED rotations (≥6 rotations: 2.14; 0.16-4.13; p = 0.03), inpatient caseloads (5-9 patients: 1.79; 0.82-2.76; p < 0.001), and average daily minutes of self-study (≥91 min: 2.05; 0.56-3.53; p = 0.01). SEM revealed that D-KAT scores were directly associated with GM-ITE scores (β = 0.37, 95% CI: 0.34-0.41) and indirectly associated with ED rotations (β = 0.06, 95% CI: 0.02-0.10), inpatient caseload (β = 0.04, 95% CI: 0.003-0.08), and average daily minutes of study (β = 0.13, 95% CI: 0.09-0.17). Knowledge regarding diagnostic error among Japanese residents was poor compared with that among US residents. D-KAT scores correlated strongly with GM-ITE scores, and the latter scores were positively associated with a greater number of ED rotations, larger caseloads (though only up to 15 patients), and more time spent studying.

  2. Estimation of genetic connectedness diagnostics based on prediction errors without the prediction error variance-covariance matrix.

    PubMed

    Holmes, John B; Dodds, Ken G; Lee, Michael A

    2017-03-02

    An important issue in genetic evaluation is the comparability of random effects (breeding values), particularly between pairs of animals in different contemporary groups. This is usually referred to as genetic connectedness. While various measures of connectedness have been proposed in the literature, there is general agreement that the most appropriate measure is some function of the prediction error variance-covariance matrix. However, obtaining the prediction error variance-covariance matrix is computationally demanding for large-scale genetic evaluations. Many alternative statistics have been proposed that avoid the computational cost of obtaining the prediction error variance-covariance matrix, such as counts of genetic links between contemporary groups, gene flow matrices, and functions of the variance-covariance matrix of estimated contemporary group fixed effects. In this paper, we show that a correction to the variance-covariance matrix of estimated contemporary group fixed effects will produce the exact prediction error variance-covariance matrix averaged by contemporary group for univariate models in the presence of single or multiple fixed effects and one random effect. We demonstrate the correction for a series of models and show that approximations to the prediction error matrix based solely on the variance-covariance matrix of estimated contemporary group fixed effects are inappropriate in certain circumstances. Our method allows for the calculation of a connectedness measure based on the prediction error variance-covariance matrix by calculating only the variance-covariance matrix of estimated fixed effects. Since the number of fixed effects in genetic evaluation is usually orders of magnitudes smaller than the number of random effect levels, the computational requirements for our method should be reduced.

  3. Observer detection of image degradation caused by irreversible data compression processes

    NASA Astrophysics Data System (ADS)

    Chen, Ji; Flynn, Michael J.; Gross, Barry; Spizarny, David

    1991-05-01

    Irreversible data compression methods have been proposed to reduce the data storage and communication requirements of digital imaging systems. In general, the error produced by compression increases as an algorithm's compression ratio is increased. We have studied the relationship between compression ratios and the detection of induced error using radiologic observers. The nature of the errors was characterized by calculating the power spectrum of the difference image. In contrast with studies designed to test whether detected errors alter diagnostic decisions, this study was designed to test whether observers could detect the induced error. A paired-film observer study was designed to test whether induced errors were detected. The study was conducted with chest radiographs selected and ranked for subtle evidence of interstitial disease, pulmonary nodules, or pneumothoraces. Images were digitized at 86 microns (4K X 5K) and 2K X 2K regions were extracted. A full-frame discrete cosine transform method was used to compress images at ratios varying between 6:1 and 60:1. The decompressed images were reprinted next to the original images in a randomized order with a laser film printer. The use of a film digitizer and a film printer which can reproduce all of the contrast and detail in the original radiograph makes the results of this study insensitive to instrument performance and primarily dependent on radiographic image quality. The results of this study define conditions for which errors associated with irreversible compression cannot be detected by radiologic observers. The results indicate that an observer can detect the errors introduced by this compression algorithm for compression ratios of 10:1 (1.2 bits/pixel) or higher.
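
    The study used a full-frame discrete cosine transform method; as a rough illustration of that family of methods (not the study's exact codec), the sketch below compresses an image by zeroing small DCT coefficients and characterizes the induced error by the power spectrum of the difference image.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_compress(image, keep_fraction=0.10):
    """Toy full-frame DCT compression: keep only the largest-magnitude coefficients
    (a stand-in for the quantization used in the study)."""
    c = dctn(image, norm="ortho")
    thresh = np.quantile(np.abs(c), 1.0 - keep_fraction)
    return idctn(np.where(np.abs(c) >= thresh, c, 0.0), norm="ortho")

def error_power_spectrum(original, reconstructed):
    """Power spectrum of the difference image, used to characterize the induced error."""
    diff = original - reconstructed
    return np.abs(np.fft.fftshift(np.fft.fft2(diff))) ** 2
```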

  4. An assessment of envelope-based demodulation in case of proximity of carrier and modulation frequencies

    NASA Astrophysics Data System (ADS)

    Shahriar, Md Rifat; Borghesani, Pietro; Randall, R. B.; Tan, Andy C. C.

    2017-11-01

    Demodulation is a necessary step in the field of diagnostics to reveal faults whose signatures appear as an amplitude and/or frequency modulation. The Hilbert transform has conventionally been used for the calculation of the analytic signal required in the demodulation process. However, the carrier and modulation frequencies must meet the conditions set by the Bedrosian identity for the Hilbert transform to be applicable for demodulation. This condition, basically requiring the carrier frequency to be sufficiently higher than the frequency of the modulation harmonics, is usually satisfied in many traditional diagnostic applications (e.g. vibration analysis of gear and bearing faults) due to the order-of-magnitude ratio between the carrier and modulation frequency. However, the diversification of the diagnostic approaches and applications shows cases (e.g. electrical signature analysis-based diagnostics) where the carrier frequency is in close proximity to the modulation frequency, thus challenging the applicability of the Bedrosian theorem. This work presents an analytic study to quantify the error introduced by the Hilbert transform-based demodulation when the Bedrosian identity is not satisfied and proposes a mitigation strategy to combat the error. An experimental study is also carried out to verify the analytical results. The outcome of the error analysis sets a confidence limit on the estimated modulation (both shape and magnitude) achieved through the Hilbert transform-based demodulation in case of violated Bedrosian theorem. However, the proposed mitigation strategy is found effective in combating the demodulation error aroused in this scenario, thus extending applicability of the Hilbert transform-based demodulation.
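
    As a small, self-contained illustration of the conventional approach whose error the paper analyses, the sketch below performs amplitude demodulation via the analytic signal; when the modulation frequency approaches or exceeds the carrier (violating the Bedrosian condition), the recovered envelope degrades. The signal parameters are arbitrary.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)

def envelope_rmse(carrier_f, mod_f, depth=0.5):
    """RMS error of Hilbert-based envelope recovery for an amplitude-modulated tone."""
    true_env = 1.0 + depth * np.cos(2 * np.pi * mod_f * t)
    x = true_env * np.cos(2 * np.pi * carrier_f * t)
    est_env = np.abs(hilbert(x))                  # envelope from the analytic signal
    return np.sqrt(np.mean((est_env - true_env) ** 2))

print(envelope_rmse(carrier_f=50.0, mod_f=5.0))   # carrier >> modulation: small error
print(envelope_rmse(carrier_f=50.0, mod_f=60.0))  # modulation above carrier: Bedrosian violated
```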

  5. Measures to Improve Diagnostic Safety in Clinical Practice.

    PubMed

    Singh, Hardeep; Graber, Mark L; Hofer, Timothy P

    2016-10-20

    Timely and accurate diagnosis is foundational to good clinical practice and an essential first step to achieving optimal patient outcomes. However, a recent Institute of Medicine report concluded that most of us will experience at least one diagnostic error in our lifetime. The report argues for efforts to improve the reliability of the diagnostic process through better measurement of diagnostic performance. The diagnostic process is a dynamic team-based activity that involves uncertainty, plays out over time, and requires effective communication and collaboration among multiple clinicians, diagnostic services, and the patient. Thus, it poses special challenges for measurement. In this paper, we discuss how the need to develop measures to improve diagnostic performance could move forward at a time when the scientific foundation needed to inform measurement is still evolving. We highlight challenges and opportunities for developing potential measures of "diagnostic safety" related to clinical diagnostic errors and associated preventable diagnostic harm. In doing so, we propose a starter set of measurement concepts for initial consideration that seem reasonably related to diagnostic safety and call for these to be studied and further refined. This would enable safe diagnosis to become an organizational priority and facilitate quality improvement. Health-care systems should consider measurement and evaluation of diagnostic performance as essential to timely and accurate diagnosis and to the reduction of preventable diagnostic harm. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CCBY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.

  6. Anatomical and/or pathological predictors for the “incorrect” classification of red dot markers on wrist radiographs taken following trauma

    PubMed Central

    Kranz, R

    2015-01-01

    Objective: To establish the prevalence of red dot markers in a sample of wrist radiographs and to identify any anatomical and/or pathological characteristics that predict "incorrect" red dot classification. Methods: Accident and emergency (A&E) wrist cases from a digital imaging and communications in medicine/digital teaching library were examined for red dot prevalence and for the presence of several anatomical and pathological features. Binary logistic regression analyses were run to establish if any of these features were predictors of incorrect red dot classification. Results: 398 cases were analysed. Red dot was "incorrectly" classified in 8.5% of cases; 6.3% were "false negatives" ("FNs") and 2.3% false positives (FPs) (to one decimal place). Old fractures [odds ratio (OR), 5.070 (1.256–20.471)] and reported degenerative change [OR, 9.870 (2.300–42.359)] were found to predict FPs. Frykman V [OR, 9.500 (1.954–46.179)], Frykman VI [OR, 6.333 (1.205–33.283)] and non-Frykman positive abnormalities [OR, 4.597 (1.264–16.711)] predict "FNs". Old fractures and Frykman VI were predictive of error at 90% confidence interval (CI); the rest at 95% CI. Conclusion: The five predictors of incorrect red dot classification may inform the image interpretation training of radiographers and other professionals to reduce diagnostic error. Verification with larger samples would reinforce these findings. Advances in knowledge: All healthcare providers strive to eradicate diagnostic error. By examining specific anatomical and pathological predictors on radiographs for such error, as well as extrinsic factors that may affect reporting accuracy, image interpretation training can focus on these "problem" areas and influence which radiographic abnormality detection schemes are appropriate to implement in A&E departments. PMID:25496373

  7. Clinical Reasoning Education at US Medical Schools: Results from a National Survey of Internal Medicine Clerkship Directors.

    PubMed

    Rencic, Joseph; Trowbridge, Robert L; Fagan, Mark; Szauter, Karen; Durning, Steven

    2017-11-01

    Recent reports, including the Institute of Medicine's Improving Diagnosis in Health Care, highlight the pervasiveness and underappreciated harm of diagnostic error, and recommend enhancing health care professional education in diagnostic reasoning. However, little is known about clinical reasoning curricula at US medical schools. To describe clinical reasoning curricula at US medical schools and to determine the attitudes of internal medicine clerkship directors toward teaching of clinical reasoning. Cross-sectional multicenter study. US institutional members of the Clerkship Directors in Internal Medicine (CDIM). Examined responses to a survey that was emailed in May 2015 to CDIM institutional representatives, who reported on their medical school's clinical reasoning curriculum. The response rate was 74% (91/123). Most respondents reported that a structured curriculum in clinical reasoning should be taught in all phases of medical education, including the preclinical years (64/85; 75%), clinical clerkships (76/87; 87%), and the fourth year (75/88; 85%), and that more curricular time should be devoted to the topic. Respondents indicated that most students enter the clerkship with only poor (25/85; 29%) to fair (47/85; 55%) knowledge of key clinical reasoning concepts. Most institutions (52/91; 57%) surveyed lacked sessions dedicated to these topics. Lack of curricular time (59/67, 88%) and faculty expertise in teaching these concepts (53/76, 69%) were identified as barriers. Internal medicine clerkship directors believe that clinical reasoning should be taught throughout the 4 years of medical school, with the greatest emphasis in the clinical years. However, only a minority reported having teaching sessions devoted to clinical reasoning, citing a lack of curricular time and faculty expertise as the largest barriers. Our findings suggest that additional institutional and national resources should be dedicated to developing clinical reasoning curricula to improve diagnostic accuracy and reduce diagnostic error.

  8. Diagnostic Lumbar Puncture

    PubMed Central

    Doherty, Carolynne M; Forbes, Raeburn B

    2014-01-01

    Diagnostic Lumbar Puncture is one of the most commonly performed invasive tests in clinical medicine. Evaluation of an acute headache and investigation of inflammatory or infectious disease of the nervous system are the most common indications. Serious complications are rare, and correct technique will minimise diagnostic error and maximise patient comfort. We review the technique of diagnostic Lumbar Puncture including anatomy, needle selection, needle insertion, measurement of opening pressure, Cerebrospinal Fluid (CSF) specimen handling and after care. We also make some quality improvement suggestions for those designing services incorporating diagnostic Lumbar Puncture. PMID:25075138

  9. Diagnostic Hypothesis Generation and Human Judgment

    ERIC Educational Resources Information Center

    Thomas, Rick P.; Dougherty, Michael R.; Sprenger, Amber M.; Harbison, J. Isaiah

    2008-01-01

    Diagnostic hypothesis-generation processes are ubiquitous in human reasoning. For example, clinicians generate disease hypotheses to explain symptoms and help guide treatment, auditors generate hypotheses for identifying sources of accounting errors, and laypeople generate hypotheses to explain patterns of information (i.e., data) in the…

  10. The influence of non-rigid anatomy and patient positioning on endoscopy-CT image registration in the head and neck.

    PubMed

    Ingram, W Scott; Yang, Jinzhong; Wendt, Richard; Beadle, Beth M; Rao, Arvind; Wang, Xin A; Court, Laurence E

    2017-08-01

    To assess the influence of non-rigid anatomy and differences in patient positioning between CT acquisition and endoscopic examination on endoscopy-CT image registration in the head and neck. Radiotherapy planning CTs and 31-35 daily treatment-room CTs were acquired for nineteen patients. Diagnostic CTs were acquired for thirteen of the patients. The surfaces of the airways were segmented on all scans and triangular meshes were created to render virtual endoscopic images with a calibrated pinhole model of an endoscope. The virtual images were used to take projective measurements throughout the meshes, with reference measurements defined as those taken on the planning CTs and test measurements defined as those taken on the daily or diagnostic CTs. The influence of non-rigid anatomy was quantified by 3D distance errors between reference and test measurements on the daily CTs, and the influence of patient positioning was quantified by 3D distance errors between reference and test measurements on the diagnostic CTs. The daily CT measurements were also used to investigate the influences of camera-to-surface distance, surface angle, and the interval of time between scans. Average errors in the daily CTs were 0.36 ± 0.61 cm in the nasal cavity, 0.58 ± 0.83 cm in the naso- and oropharynx, and 0.47 ± 0.73 cm in the hypopharynx and larynx. Average errors in the diagnostic CTs in those regions were 0.52 ± 0.69 cm, 0.65 ± 0.84 cm, and 0.69 ± 0.90 cm, respectively. All CTs had errors heavily skewed towards 0, albeit with large outliers. Large camera-to-surface distances were found to increase the errors, but the angle at which the camera viewed the surface had no effect. The errors in the Day 1 and Day 15 CTs were found to be significantly smaller than those in the Day 30 CTs (P < 0.05). Inconsistencies of patient positioning have a larger influence than non-rigid anatomy on projective measurement errors. In general, these errors are largest when the camera is in the superior pharynx, where it sees large distances and a lot of muscle motion. The errors are larger when the interval of time between CT acquisitions is longer, which suggests that the interval of time between the CT acquisition and the endoscopic examination should be kept short. The median errors found in this study are comparable to acceptable levels of uncertainty in deformable CT registration. Large errors are possible even when image alignment is very good, indicating that projective measurements must be made carefully to avoid these outliers. © 2017 American Association of Physicists in Medicine.

  11. Measurement of small lesions near metallic implants with mega-voltage cone beam CT

    NASA Astrophysics Data System (ADS)

    Grigorescu, Violeta; Prevrhal, Sven; Pouliot, Jean

    2008-03-01

    Metallic objects severely limit diagnostic CT imaging because of their high X-ray attenuation in the diagnostic energy range. In contrast, radiation therapy linear accelerators now offer CT imaging with X-ray energies in the megavolt range, where the attenuation coefficients of metals are significantly lower. We hypothesized that Mega electron-Voltage Cone-Beam CT (MVCT) implemented on a radiation therapy linear accelerator can detect and quantify small features in the vicinity of metallic implants with accuracy comparable to clinical Kilo electron-Voltage CT (KVCT) for imaging. Our test application was detection of osteolytic lesions formed near the metallic stem of a hip prosthesis, a condition of severe concern in hip replacement surgery. Both MVCT and KVCT were used to image a phantom containing simulated osteolytic bone lesions centered around a Chrome-Cobalt hip prosthesis stem with hemispherical lesions with sizes and densities ranging from 0.5 to 4 mm radius and 0 to 500 mg·cm⁻³, respectively. Images for both modalities were visually graded to establish lower limits of lesion visibility as a function of their size. Lesion volumes and mean density were determined and compared to reference values. Volume determination errors were reduced from 34%, on KVCT, to 20% for all lesions on MVCT, and density determination errors were reduced from 71% on KVCT to 10% on MVCT. Localization and quantification of lesions was improved with MVCT imaging. MVCT offers a viable alternative to clinical CT in cases where accurate 3D imaging of small features near metallic hardware is critical. These results need to be extended to other metallic objects of different composition and geometry.

  12. Rapidly-steered single-element ultrasound for real-time volumetric imaging and guidance

    NASA Astrophysics Data System (ADS)

    Stauber, Mark; Western, Craig; Solek, Roman; Salisbury, Kenneth; Hristov, Dmitre; Schlosser, Jeffrey

    2016-03-01

    Volumetric ultrasound (US) imaging has the potential to provide real-time anatomical imaging with high soft-tissue contrast in a variety of diagnostic and therapeutic guidance applications. However, existing volumetric US machines utilize "wobbling" linear phased array or matrix phased array transducers which are costly to manufacture and necessitate bulky external processing units. To drastically reduce cost, improve portability, and reduce footprint, we propose a rapidly-steered single-element volumetric US imaging system. In this paper we explore the feasibility of this system with a proof-of-concept single-element volumetric US imaging device. The device uses a multi-directional raster-scan technique to generate a series of two-dimensional (2D) slices that were reconstructed into three-dimensional (3D) volumes. At 15 cm depth, 90° lateral field of view (FOV), and 20° elevation FOV, the device produced 20-slice volumes at a rate of 0.8 Hz. Imaging performance was evaluated using a US phantom. Spatial resolution was 2.0 mm, 4.7 mm, and 5.0 mm in the axial, lateral, and elevational directions at 7.5 cm. Relative motion of phantom targets was automatically tracked within US volumes with a mean error of -0.3 ± 0.3 mm, -0.3 ± 0.3 mm, and -0.1 ± 0.5 mm in the axial, lateral, and elevational directions, respectively. The device exhibited a mean spatial distortion error of 0.3 ± 0.9 mm, 0.4 ± 0.7 mm, and -0.3 ± 1.9 mm in the axial, lateral, and elevational directions. With a production cost near $1000, the performance characteristics of the proposed system make it an ideal candidate for diagnostic and image-guided therapy applications where form factor and low cost are paramount.

  13. Automated working distance adjustment for a handheld OCT-Laryngoscope

    NASA Astrophysics Data System (ADS)

    Donner, Sabine; Bleeker, Sebastian; Ripken, Tammo; Krueger, Alexander

    2014-03-01

    Optical coherence tomography (OCT) is an imaging technique which enables diagnosis of vocal cord tissue structure by non-contact optical biopsies rather than invasive tissue biopsies. For diagnosis on awake patients OCT was adapted to a rigid indirect laryngoscope. The working distance must match the probe-sample distance, which varies from patient to patient. Therefore the endoscopic OCT sample arm has a variable working distance of 40 mm to 80 mm. The current axial position is identified by automated working distance adjustments based on image processing. The OCT reference plane and the focal plane of the sample arm are moved according to position errors. Repeated position adjustment during the whole diagnostic procedure keeps the tissue sample at the optimal axial position. The auto focus identifies and adjusts the working distance within the range of 50 mm within a maximum time of 2.7 s. Continuous image stabilisation reduces axial sample movement within the sampling depth for handheld OCT scanning. Rapid autofocus reduces the duration of the diagnostic procedure and axial position stabilisation eases the use of the OCT laryngoscope. Therefore this work is an important step towards the integration of OCT into indirect laryngoscopes.

  14. [Usefulness of imaging examinations in preoperative diagnosis of acute appendicitis].

    PubMed

    Nitoń, Tomasz; Górecka-Nitoń, Aleksandra

    2014-01-01

    Acute appendicitis (AA) is the cause of some of the most common emergency operations performed in general surgery departments. The lifetime frequency of acute appendicitis ranges from 6-8% of the population. The clinical presentation is frequently nonspecific and, despite the common occurrence of the disease, leads to many diagnostic difficulties. Diagnosis of acute appendicitis includes clinical examination, laboratory tests, diagnostic scoring systems, computer programs as physician aids, and imaging examinations. About 30-45% of patients suspected of acute appendicitis have an atypical clinical presentation, and here the use of US or CT is very helpful. Longstanding use of US has resulted in high AA evaluation accuracy, with high sensitivity (75-90%) and specificity (84-100%). CT demonstrates a rate of correct diagnoses above 95% and reduces negative appendectomy rates and perforation rates as well as unnecessary observations. CT sensitivity and specificity are estimated at 83-100% by different authors. Expedited AA diagnosis and surgery and reduced hospitalization time are possible advantages of imaging tests. Additionally, these tests can detect alternative diseases imitating acute appendicitis. Use of imaging tests, especially CT, is beneficial in women of childbearing age, because frequent genito-urinary disorders in this group lead to the most diagnostic errors. However, there are contraindications to the use of CT; for example, it cannot be performed in early pregnancy.

  15. The Effect of Random Error on Diagnostic Accuracy Illustrated with the Anthropometric Diagnosis of Malnutrition

    PubMed Central

    2016-01-01

    Background: It is often thought that random measurement error has a minor effect upon the results of an epidemiological survey. Theoretically, errors of measurement should always increase the spread of a distribution. Defining an illness by having a measurement outside an established healthy range will lead to an inflated prevalence of that condition if there are measurement errors. Methods and results: A Monte Carlo simulation was conducted of anthropometric assessment of children with malnutrition. Random errors of increasing magnitude were imposed upon the simulated populations; the standard deviation increased with each error, and the increase became exponentially greater with the magnitude of the error. The potential magnitude of the resulting error in the reported prevalence of malnutrition was compared with published international data and found to be sufficient to make a number of surveys, and the numerous reports and analyses that used these data, unreliable. Conclusions: The effect of random error in public health surveys and the data upon which diagnostic cut-off points are derived to define "health" has been underestimated. Even quite modest random errors can more than double the reported prevalence of conditions such as malnutrition. Increasing sample size does not address this problem, and may even result in less accurate estimates. More attention needs to be paid to the selection, calibration and maintenance of instruments, measurer selection, training & supervision, routine estimation of the likely magnitude of errors using standardization tests, use of statistical likelihood of error to exclude data from analysis and full reporting of these procedures in order to judge the reliability of survey reports. PMID:28030627
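
    A minimal Monte Carlo sketch of the effect described above, with assumed population parameters rather than the study's data: adding zero-mean random error to anthropometric z-scores widens the observed distribution and inflates the prevalence of values below a -2 SD diagnostic cut-off.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      true_z = rng.normal(loc=-0.4, scale=1.0, size=n)     # hypothetical z-score population
      cutoff = -2.0                                        # diagnostic cut-off (e.g. wasting)
      print("true prevalence:", round(float(np.mean(true_z < cutoff)), 3))

      for error_sd in (0.25, 0.5, 1.0):                    # increasing random measurement error
          observed = true_z + rng.normal(0.0, error_sd, size=n)
          print(f"error SD {error_sd:.2f}: observed SD {observed.std():.2f}, "
                f"prevalence {np.mean(observed < cutoff):.3f}")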

  16. Figure and ground in physician misdiagnosis: metacognition and diagnostic norms.

    PubMed

    Hamm, Robert M

    2014-01-01

    Meta-cognitive awareness, or self-reflection informed by the "heuristics and biases" theory of how experts make cognitive errors, has been offered as a partial solution for diagnostic errors in medicine. I argue that this approach is neither as easy nor as effective as one might hope. We should also promote mastery of the basic principles of diagnosis in medical school, continuing medical education, and routine reflection and review. While it may seem difficult to attend to both levels simultaneously, there is more to be gained from attending to both than from focusing only on one.

  17. Diagnostics for insufficiencies of posterior calculations in Bayesian signal inference.

    PubMed

    Dorn, Sebastian; Oppermann, Niels; Ensslin, Torsten A

    2013-11-01

    We present an error-diagnostic validation method for posterior distributions in Bayesian signal inference, an advancement of previous work. It transfers deviations from the correct posterior into characteristic deviations from a uniform distribution of a quantity constructed for this purpose. We show that this method is able to reveal and discriminate several kinds of numerical and approximation errors, as well as their impact on the posterior distribution. For this we present four typical analytical examples of posteriors with incorrect variance, skewness, position of the maximum, or normalization. We further show how this test can be applied to multidimensional signals.
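
    The general idea can be sketched with a toy Gaussian model (an assumption for illustration, not the authors' implementation): if a claimed posterior is correct, the posterior cumulative distribution evaluated at the true signal value is uniformly distributed over repeated simulations, so deviations from uniformity flag numerical or approximation errors.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n_runs, sigma_prior, sigma_noise = 2_000, 1.0, 0.5

      u = []
      for _ in range(n_runs):
          s = rng.normal(0.0, sigma_prior)            # true signal drawn from the prior
          d = s + rng.normal(0.0, sigma_noise)        # observed data
          post_var = 1.0 / (1.0 / sigma_prior**2 + 1.0 / sigma_noise**2)
          post_mean = post_var * d / sigma_noise**2   # exact Gaussian posterior
          # Evaluate a deliberately *wrong* posterior (inflated variance) to see the test react:
          u.append(stats.norm.cdf(s, loc=post_mean, scale=np.sqrt(2.0 * post_var)))

      # If the posterior were correct, u would be uniform on [0, 1]; a KS test flags the mismatch.
      print(stats.kstest(u, "uniform"))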

  18. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a process for a quality-control-sample-based approach of quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to incorrect pipetting volume due to air bubbles in a sample), which can both lead to inaccurate results and risks for patients.
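
    A hedged sketch of the flagging rule implied by this definition is given below; the tolerance formula and the numbers are illustrative assumptions, not a validated clinical criterion.

      def is_irregular_error(routine_result, reference_result,
                             expanded_uncertainty, method_bias):
          """Flag a single-sample result whose deviation from the reference
          measurement exceeds what routine measurement uncertainty plus known
          method bias can explain (illustrative tolerance rule, assumed)."""
          deviation = abs(routine_result - reference_result)
          tolerance = expanded_uncertainty + abs(method_bias)
          return deviation > tolerance

      # Hypothetical routine immunoassay result vs. an LC-IDMS reference value:
      print(is_irregular_error(routine_result=142.0, reference_result=100.0,
                               expanded_uncertainty=12.0, method_bias=5.0))   # True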

  19. Finding Useful Questions: On Bayesian Diagnosticity, Probability, Impact, and Information Gain

    ERIC Educational Resources Information Center

    Nelson, Jonathan D.

    2005-01-01

    Several norms for how people should assess a question's usefulness have been proposed, notably Bayesian diagnosticity, information gain (mutual information), Kullback-Leibler distance, probability gain (error minimization), and impact (absolute change). Several probabilistic models of previous experiments on categorization, covariation assessment,…

  20. Combining principles of Cognitive Load Theory and diagnostic error analysis for designing job aids: Effects on motivation and diagnostic performance in a process control task.

    PubMed

    Kluge, Annette; Grauel, Britta; Burkolter, Dina

    2013-03-01

    Two studies are presented in which the design of a procedural aid and the impact of an additional decision aid for process control were assessed. In Study 1, a procedural aid was developed that avoids imposing unnecessary extraneous cognitive load on novices when controlling a complex technical system. This newly designed procedural aid positively affected germane load, attention, satisfaction, motivation, knowledge acquisition and diagnostic speed for novel faults. In Study 2, the effect of a decision aid for use before the procedural aid was investigated, which was developed based on an analysis of diagnostic errors committed in Study 1. Results showed that novices were able to diagnose both novel faults and practised faults, and were even faster at diagnosing novel faults. This research contributes to the question of how to optimally support novices in dealing with technical faults in process control. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Understanding reliability and some limitations of the images and spectra reconstructed from a multi-monochromatic x-ray imager

    DOE PAGES

    Nagayama, T.; Mancini, R. C.; Mayes, D.; ...

    2015-11-18

    Temperature and density asymmetry diagnosis is critical to advance inertial confinement fusion (ICF) science. A multi-monochromatic x-ray imager (MMI) is an attractive diagnostic for this purpose. The MMI records the spectral signature from an ICF implosion core with time resolution, 2-D space resolution, and spectral resolution. While narrow-band images and 2-D space-resolved spectra from the MMI data constrain temperature and density spatial structure of the core, the accuracy of the images and spectra depends not only on the quality of the MMI data but also on the reliability of the post-processing tools. In this paper, we synthetically quantify the accuracy of images and spectra reconstructed from MMI data. Errors in the reconstructed images are less than a few percent when the space-resolution effect is applied to the modeled images. The errors in the reconstructed 2-D space-resolved spectra are also less than a few percent except those for the peripheral regions. Spectra reconstructed for the peripheral regions have slightly but systematically lower intensities by ~6% due to the instrumental spatial-resolution effects. However, this does not alter the relative line ratios and widths and thus does not affect the temperature and density diagnostics. We also investigate the impact of the pinhole size variation on the extracted images and spectra. A 10% pinhole size variation could introduce spatial bias to the images and spectra of ~10%. A correction algorithm is developed, and it successfully reduces the errors to a few percent. Finally, it is desirable to perform similar synthetic investigations to fully understand the reliability and limitations of each MMI application.

  2. Functional neurological symptom disorders in a pediatric emergency room: diagnostic accuracy, features, and outcome.

    PubMed

    de Gusmão, Claudio M; Guerriero, Réjean M; Bernson-Leung, Miya Elizabeth; Pier, Danielle; Ibeziako, Patricia I; Bujoreanu, Simona; Maski, Kiran P; Urion, David K; Waugh, Jeff L

    2014-08-01

    In children, functional neurological symptom disorders are frequently the basis for presentation for emergency care. Pediatric epidemiological and outcome data remain scarce. Assess diagnostic accuracy of trainees' first impressions in our pediatric emergency room; describe manner of presentation, demographic data, socioeconomic impact, and clinical outcomes, including parental satisfaction. (1) Over more than 1 year, psychiatry consultations for neurology patients with a functional neurological symptom disorder were retrospectively reviewed. (2) For 3 months, all children whose emergency room presentation suggested the diagnosis were prospectively collected. (3) Three to six months after prospective collection, families completed a structured telephone interview on outcome measures. Twenty-seven patients were retrospectively assessed; 31 patients were prospectively collected. Trainees accurately predicted the diagnosis in 93% of the retrospective and 94% of the prospective cohort. Mixed presentations were most common (usually sensory-motor changes, e.g. weakness and/or paresthesias). Associated stressors were mundane and ubiquitous, rarely severe. Families were substantially affected, reporting mean symptom duration of 7.4 (standard error of the mean ± 1.33) weeks, missing 22.4 (standard error of the mean ± 5.47) days of school, and 8.3 (standard error of the mean ± 2.88) parental workdays (prospective cohort). At follow-up, 78% were symptom free. Parental dissatisfaction was rare, attributed to poor rapport and/or insufficient information conveyed. Trainees' clinical impression was accurate in predicting a later diagnosis of functional neurological symptom disorder. Extraordinary life stressors are not required to trigger the disorder in children. Although prognosis is favorable, families incur substantial economic burden and negative educational impact. Improving recognition and appropriately communicating the diagnosis may speed access to treatment and potentially reduce the disability and cost of this disorder. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. M-Health for Improving Screening Accuracy of Acute Malnutrition in a Community-Based Management of Acute Malnutrition Program in Mumbai Informal Settlements.

    PubMed

    Chanani, Sheila; Wacksman, Jeremy; Deshmukh, Devika; Pantvaidya, Shanti; Fernandez, Armida; Jayaraman, Anuja

    2016-12-01

    Acute malnutrition is linked to child mortality and morbidity. Community-Based Management of Acute Malnutrition (CMAM) programs can be instrumental in large-scale detection and treatment of undernutrition. The World Health Organization (WHO) 2006 weight-for-height/length tables are diagnostic tools available to screen for acute malnutrition. Frontline workers (FWs) in a CMAM program in Dharavi, Mumbai, were using CommCare, a mobile application, for monitoring and case management of children in combination with the paper-based WHO simplified tables. A strategy was undertaken to digitize the WHO tables into the CommCare application. To measure differences in diagnostic accuracy in community-based screening for acute malnutrition, by FWs, using a mobile-based solution. Twenty-seven FWs initially used the paper-based tables and then switched to an updated mobile application that included a nutritional grade calculator. Human error rates specifically associated with grade classification were calculated by comparison of the grade assigned by the FW to the grade each child should have received based on the same WHO tables. Cohen kappa coefficient, sensitivity and specificity rates were also calculated and compared for paper-based grade assignments and calculator grade assignments. Comparing FWs (N = 14) who completed at least 40 screenings without and 40 with the calculator, the error rates were 5.5% and 0.7%, respectively (p < .0001). Interrater reliability (κ) increased to an almost perfect level (>.90), from .79 to .97, after switching to the mobile calculator. Sensitivity and specificity also improved significantly. The mobile calculator significantly reduces an important component of human error in using the WHO tables to assess acute malnutrition at the community level. © The Author(s) 2016.
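
    The error-rate and agreement statistics reported above can be reproduced on any labelled screening data with a few lines of Python; the grades and counts below are invented for illustration, and scikit-learn's cohen_kappa_score stands in for whatever statistics package the authors used.

      from sklearn.metrics import cohen_kappa_score

      # Invented grades: "SAM" severe, "MAM" moderate acute malnutrition, "normal".
      reference   = ["SAM", "MAM", "normal", "MAM", "normal", "SAM", "normal", "MAM"]
      paper_based = ["SAM", "normal", "normal", "MAM", "MAM", "SAM", "normal", "MAM"]
      calculator  = ["SAM", "MAM", "normal", "MAM", "normal", "SAM", "normal", "MAM"]

      def error_rate(assigned, ref):
          return sum(a != r for a, r in zip(assigned, ref)) / len(ref)

      for label, grades in [("paper tables", paper_based), ("mobile calculator", calculator)]:
          print(label,
                "error rate:", round(error_rate(grades, reference), 2),
                "kappa:", round(cohen_kappa_score(reference, grades), 2))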

  4. Understanding reliability and some limitations of the images and spectra reconstructed from a multi-monochromatic x-ray imager

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagayama, T.; Mancini, R. C.; Mayes, D.

    2015-11-15

    Temperature and density asymmetry diagnosis is critical to advance inertial confinement fusion (ICF) science. A multi-monochromatic x-ray imager (MMI) is an attractive diagnostic for this purpose. The MMI records the spectral signature from an ICF implosion core with time resolution, 2-D space resolution, and spectral resolution. While narrow-band images and 2-D space-resolved spectra from the MMI data constrain temperature and density spatial structure of the core, the accuracy of the images and spectra depends not only on the quality of the MMI data but also on the reliability of the post-processing tools. Here, we synthetically quantify the accuracy of images and spectra reconstructed from MMI data. Errors in the reconstructed images are less than a few percent when the space-resolution effect is applied to the modeled images. The errors in the reconstructed 2-D space-resolved spectra are also less than a few percent except those for the peripheral regions. Spectra reconstructed for the peripheral regions have slightly but systematically lower intensities by ∼6% due to the instrumental spatial-resolution effects. However, this does not alter the relative line ratios and widths and thus does not affect the temperature and density diagnostics. We also investigate the impact of the pinhole size variation on the extracted images and spectra. A 10% pinhole size variation could introduce spatial bias to the images and spectra of ∼10%. A correction algorithm is developed, and it successfully reduces the errors to a few percent. It is desirable to perform similar synthetic investigations to fully understand the reliability and limitations of each MMI application.

  5. Understanding reliability and some limitations of the images and spectra reconstructed from a multi-monochromatic x-ray imager.

    PubMed

    Nagayama, T; Mancini, R C; Mayes, D; Tommasini, R; Florido, R

    2015-11-01

    Temperature and density asymmetry diagnosis is critical to advance inertial confinement fusion (ICF) science. A multi-monochromatic x-ray imager (MMI) is an attractive diagnostic for this purpose. The MMI records the spectral signature from an ICF implosion core with time resolution, 2-D space resolution, and spectral resolution. While narrow-band images and 2-D space-resolved spectra from the MMI data constrain temperature and density spatial structure of the core, the accuracy of the images and spectra depends not only on the quality of the MMI data but also on the reliability of the post-processing tools. Here, we synthetically quantify the accuracy of images and spectra reconstructed from MMI data. Errors in the reconstructed images are less than a few percent when the space-resolution effect is applied to the modeled images. The errors in the reconstructed 2-D space-resolved spectra are also less than a few percent except those for the peripheral regions. Spectra reconstructed for the peripheral regions have slightly but systematically lower intensities by ∼6% due to the instrumental spatial-resolution effects. However, this does not alter the relative line ratios and widths and thus does not affect the temperature and density diagnostics. We also investigate the impact of the pinhole size variation on the extracted images and spectra. A 10% pinhole size variation could introduce spatial bias to the images and spectra of ∼10%. A correction algorithm is developed, and it successfully reduces the errors to a few percent. It is desirable to perform similar synthetic investigations to fully understand the reliability and limitations of each MMI application.

  6. Planetary Transmission Diagnostics

    NASA Technical Reports Server (NTRS)

    Lewicki, David G. (Technical Monitor); Samuel, Paul D.; Conroy, Joseph K.; Pines, Darryll J.

    2004-01-01

    This report presents a methodology for detecting and diagnosing gear faults in the planetary stage of a helicopter transmission. This diagnostic technique is based on the constrained adaptive lifting algorithm. The lifting scheme, developed by Wim Sweldens of Bell Labs, is a time domain, prediction-error realization of the wavelet transform that allows for greater flexibility in the construction of wavelet bases. Classic lifting analyzes a given signal using wavelets derived from a single fundamental basis function. A number of researchers have proposed techniques for adding adaptivity to the lifting scheme, allowing the transform to choose from a set of fundamental bases the basis that best fits the signal. This characteristic is desirable for gear diagnostics as it allows the technique to tailor itself to a specific transmission by selecting a set of wavelets that best represent vibration signals obtained while the gearbox is operating under healthy-state conditions. However, constraints on certain basis characteristics are necessary to enhance the detection of local wave-form changes caused by certain types of gear damage. The proposed methodology analyzes individual tooth-mesh waveforms from a healthy-state gearbox vibration signal that was generated using the vibration separation (synchronous signal-averaging) algorithm. Each waveform is separated into analysis domains using zeros of its slope and curvature. The bases selected in each analysis domain are chosen to minimize the prediction error, and constrained to have the same-sign local slope and curvature as the original signal. The resulting set of bases is used to analyze future-state vibration signals and the lifting prediction error is inspected. The constraints allow the transform to effectively adapt to global amplitude changes, yielding small prediction errors. However, local wave-form changes associated with certain types of gear damage are poorly adapted, causing a significant change in the prediction error. The constrained adaptive lifting diagnostic algorithm is validated using data collected from the University of Maryland Transmission Test Rig and the results are discussed.
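
    The prediction-error idea behind the lifting scheme can be illustrated with a single, non-adaptive lifting step (a generic linear predictor, not the constrained adaptive algorithm of this report): split the signal into even and odd samples, predict the odd samples from their even neighbours, and monitor the prediction error, which jumps where the local waveform changes.

      import numpy as np

      def lifting_prediction_error(signal):
          """One non-adaptive lifting step: predict each odd sample as the mean of
          its even neighbours and return the prediction error (detail signal)."""
          even, odd = signal[0::2], signal[1::2]
          n = min(len(even) - 1, len(odd))
          prediction = 0.5 * (even[:n] + even[1:n + 1])
          return odd[:n] - prediction

      t = np.linspace(0.0, 1.0, 512, endpoint=False)
      healthy = np.sin(2 * np.pi * 16 * t)             # smooth tooth-mesh-like waveform
      damaged = healthy.copy()
      damaged[250:256] += 0.5                          # localized waveform change ("fault")
      print("healthy max |error|:", np.max(np.abs(lifting_prediction_error(healthy))))
      print("damaged max |error|:", np.max(np.abs(lifting_prediction_error(damaged))))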

  7. Outcomes and genotype-phenotype correlations in 52 individuals with VLCAD deficiency diagnosed by NBS and enrolled in the IBEM-IS database

    PubMed Central

    Pena, Loren D.M.; van Calcar, Sandra C.; Hansen, Joyanna; Edick, Mathew J.; Vockley, Cate Walsh; Leslie, Nancy; Cameron, Cynthia; Mohsen, Al-Walid; Berry, Susan A; Arnold, Georgianne L; Vockley, Jerry

    2016-01-01

    Very long chain acyl-CoA dehydrogenase (VLCAD) deficiency can present at various ages from the neonatal period to adulthood, and poses the greatest risk of complications during intercurrent illness or after prolonged fasting. Early diagnosis, treatment, and surveillance can reduce mortality; hence, the disorder is included in the newborn Recommended Uniform Screening Panel (RUSP) in the United States. The Inborn Errors of Metabolism Information System (IBEM-IS) was established in 2007 to collect longitudinal information on individuals with inborn errors of metabolism included in newborn screening (NBS) programs, including VLCAD deficiency. We retrospectively analyzed early outcomes for individuals who were diagnosed with VLCAD deficiency by NBS and describe initial presentations, diagnosis, clinical outcomes and treatment in a cohort of 52 individuals ages 1–18 years. Maternal prenatal symptoms were not reported, and most newborns remained asymptomatic. Cardiomyopathy was uncommon in the cohort, diagnosed in 2/52 cases. Elevations in creatine kinase were a common finding, and usually first occurred during the toddler period (1–3 years of age). Diagnostic evaluations required several testing modalities, most commonly plasma acylcarnitine profiles and molecular testing. Functional testing, including fibroblast acylcarnitine profiling and white blood cell or fibroblast enzyme assay, is a useful diagnostic adjunct if uncharacterized mutations are identified. PMID:27209629

  8. Decomposition of the Mean Squared Error and NSE Performance Criteria: Implications for Improving Hydrological Modelling

    NASA Technical Reports Server (NTRS)

    Gupta, Hoshin V.; Kling, Harald; Yilmaz, Koray K.; Martinez-Baquero, Guillermo F.

    2009-01-01

    The mean squared error (MSE) and the related normalization, the Nash-Sutcliffe efficiency (NSE), are the two criteria most widely used for calibration and evaluation of hydrological models with observed data. Here, we present a diagnostically interesting decomposition of NSE (and hence MSE), which facilitates analysis of the relative importance of its different components in the context of hydrological modelling, and show how model calibration problems can arise due to interactions among these components. The analysis is illustrated by calibrating a simple conceptual precipitation-runoff model to daily data for a number of Austrian basins having a broad range of hydro-meteorological characteristics. Evaluation of the results clearly demonstrates the problems that can be associated with any calibration based on the NSE (or MSE) criterion. While we propose and test an alternative criterion that can help to reduce model calibration problems, the primary purpose of this study is not to present an improved measure of model performance. Instead, we seek to show that there are systematic problems inherent with any optimization based on formulations related to the MSE. The analysis and results have implications to the manner in which we calibrate and evaluate environmental models; we discuss these and suggest possible ways forward that may move us towards an improved and diagnostically meaningful approach to model performance evaluation and identification.
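
    The decomposition referred to above expresses NSE in terms of the linear correlation r, the ratio of simulated to observed standard deviations (alpha), and the bias normalised by the observed standard deviation (beta_n): NSE = 2*alpha*r - alpha^2 - beta_n^2. A short Python check with synthetic flows (illustrative data only) confirms the identity against the direct formula.

      import numpy as np

      def nse_decomposition(sim, obs):
          """NSE = 2*alpha*r - alpha**2 - beta_n**2 (Gupta et al., 2009), with
          r the linear correlation, alpha the ratio of standard deviations and
          beta_n the mean bias normalised by the observed standard deviation."""
          r = np.corrcoef(sim, obs)[0, 1]
          alpha = sim.std() / obs.std()
          beta_n = (sim.mean() - obs.mean()) / obs.std()
          return 2 * alpha * r - alpha**2 - beta_n**2, r, alpha, beta_n

      # Check the identity against the direct NSE formula on synthetic daily flows.
      rng = np.random.default_rng(0)
      obs = rng.gamma(2.0, 5.0, size=365)
      sim = 0.9 * obs + rng.normal(0.0, 2.0, size=365)
      nse_direct = 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
      print(nse_decomposition(sim, obs)[0], nse_direct)   # the two values agree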

  9. Diagnostic Testing in Mathematics: An Extension of the PIAT?

    ERIC Educational Resources Information Center

    Algozzine, Bob; McGraw, Karen

    1980-01-01

    The article addresses the usefulness of the Peabody Individual Achievement Test (PIAT) in assessing various levels of arithmetic performance. The mathematics subtest of the PIAT is considered in terms of purpose; mathematical abilities subsections (foundations, basic facts, applications); diagnostic testing (the error analysis matrix); and poor…

  10. Validity Arguments for Diagnostic Assessment Using Automated Writing Evaluation

    ERIC Educational Resources Information Center

    Chapelle, Carol A.; Cotos, Elena; Lee, Jooyoung

    2015-01-01

    Two examples demonstrate an argument-based approach to validation of diagnostic assessment using automated writing evaluation (AWE). "Criterion"® was developed by Educational Testing Service to analyze students' papers grammatically, providing sentence-level error feedback. An interpretive argument was developed for its use as part of…

  11. Identification of facilitators and barriers to residents' use of a clinical reasoning tool.

    PubMed

    DiNardo, Deborah; Tilstra, Sarah; McNeil, Melissa; Follansbee, William; Zimmer, Shanta; Farris, Coreen; Barnato, Amber E

    2018-03-28

    While there is some experimental evidence to support the use of cognitive forcing strategies to reduce diagnostic error in residents, the potential usability of such strategies in the clinical setting has not been explored. We sought to test the effect of a clinical reasoning tool on diagnostic accuracy and to obtain feedback on its usability and acceptability. We conducted a randomized behavioral experiment testing the effect of this tool on diagnostic accuracy on written cases among post-graduate 3 (PGY-3) residents at a single internal medical residency program in 2014. Residents completed written clinical cases in a proctored setting with and without prompts to use the tool. The tool encouraged reflection on concordant and discordant aspects of each case. We used random effects regression to assess the effect of the tool on diagnostic accuracy of the independent case sets, controlling for case complexity. We then conducted audiotaped structured focus group debriefing sessions and reviewed the tapes for facilitators and barriers to use of the tool. Of 51 eligible PGY-3 residents, 34 (67%) participated in the study. The average diagnostic accuracy increased from 52% to 60% with the tool, a difference that just met the test for statistical significance in adjusted analyses (p=0.05). Residents reported that the tool was generally acceptable and understandable but did not recognize its utility for use with simple cases, suggesting the presence of overconfidence bias. A clinical reasoning tool improved residents' diagnostic accuracy on written cases. Overconfidence bias is a potential barrier to its use in the clinical setting.

  12. Measuring diagnoses: ICD code accuracy.

    PubMed

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-10-01

    To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.

  13. Assessing Diagnostic Tests I: You Can't Be Too Sensitive.

    PubMed

    Jupiter, Daniel C

    2015-01-01

    Clinicians and patients are always interested in less invasive, cheaper, and faster diagnostic tests. When introducing such a test, physicians must ensure that it is reliable in its diagnoses and does not commit errors. In this article, I discuss several ways that new tests are compared against gold standard diagnostics. Copyright © 2015 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  14. Radiology's Achilles' heel: error and variation in the interpretation of the Röntgen image.

    PubMed

    Robinson, P J

    1997-11-01

    The performance of the human eye and brain has failed to keep pace with the enormous technical progress in the first full century of radiology. Errors and variations in interpretation now represent the weakest aspect of clinical imaging. Those interpretations which differ from the consensus view of a panel of "experts" may be regarded as errors; where experts fail to achieve consensus, differing reports are regarded as "observer variation". Errors arise from poor technique, failures of perception, lack of knowledge and misjudgments. Observer variation is substantial and should be taken into account when different diagnostic methods are compared; in many cases the difference between observers outweighs the difference between techniques. Strategies for reducing error include attention to viewing conditions, training of the observers, availability of previous films and relevant clinical data, dual or multiple reporting, standardization of terminology and report format, and assistance from computers. Digital acquisition and display will probably not affect observer variation but the performance of radiologists, as measured by receiver operating characteristic (ROC) analysis, may be improved by computer-directed search for specific image features. Other current developments show that where image features can be comprehensively described, computer analysis can replace the perception function of the observer, whilst the function of interpretation can in some cases be performed better by artificial neural networks. However, computer-assisted diagnosis is still in its infancy and complete replacement of the human observer is as yet a remote possibility.

  15. Digital Intraoral Imaging Re-Exposure Rates of Dental Students.

    PubMed

    Senior, Anthea; Winand, Curtis; Ganatra, Seema; Lai, Hollis; Alsulfyani, Noura; Pachêco-Pereira, Camila

    2018-01-01

    A guiding principle of radiation safety is ensuring that radiation dosage is as low as possible while yielding the necessary diagnostic information. Intraoral images taken with conventional dental film have a higher re-exposure rate when taken by dental students compared to experienced staff. The aim of this study was to examine the prevalence of and reasons for re-exposure of digital intraoral images taken by third- and fourth-year dental students in a dental school clinic. At one dental school in Canada, the total number of intraoral images taken by third- and fourth-year dental students, re-exposures, and error descriptions were extracted from patient clinical records for an eight-month period (September 2015 to April 2016). The data were categorized to distinguish between digital images taken with solid-state sensors or photostimulable phosphor plates (PSP). The results showed that 9,397 intraoral images were made, and 1,064 required re-exposure. The most common error requiring re-exposure for bitewing images was an error in placement of the receptor too far mesially or distally (29% for sensors and 18% for PSP). The most common error requiring re-exposure for periapical images was inadequate capture of the periapical area (37% for sensors and 6% for PSP). A retake rate of 11% was calculated, and the common technique errors causing image deficiencies were identified. Educational intervention can now be specifically designed to reduce the retake rate and radiation dose for future patients.

  16. Hierarchical mixture of experts and diagnostic modeling approach to reduce hydrologic model structural uncertainty: STRUCTURAL UNCERTAINTY DIAGNOSTICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2016-04-01

    In most water resources applications, a single model structure might be inadequate to capture the dynamic multi-scale interactions among different hydrological processes. Calibrating single models for dynamic catchments, where multiple dominant processes exist, can result in displacement of errors from structure to parameters, which in turn leads to over-correction and biased predictions. An alternative to a single model structure is to develop local expert structures that are effective in representing the dominant components of the hydrologic process and adaptively integrate them based on an indicator variable. In this study, the Hierarchical Mixture of Experts (HME) framework is applied to integrate expert model structures representing the different components of the hydrologic process. Various signature diagnostic analyses are used to assess the presence of multiple dominant processes and the adequacy of a single model, as well as to identify the structures of the expert models. The approaches are applied to two distinct catchments, the Guadalupe River (Texas) and the French Broad River (North Carolina) from the Model Parameter Estimation Experiment (MOPEX), using different structures of the HBV model. The results show that the HME approach performs better than the single model for the Guadalupe catchment, where multiple dominant processes are witnessed through diagnostic measures. In contrast, the diagnostics and aggregated performance measures show that the French Broad has a homogeneous catchment response, making the single model adequate to capture the response.

  17. Perspective: Whither the problem list? Organ-based documentation and deficient synthesis by medical trainees.

    PubMed

    Kaplan, Daniel M

    2010-10-01

    The author argues that the well-formulated problem list is essential for both organizing and evaluating diagnostic thinking. He considers evidence of deficiencies in problem lists in the medical record. He observes a trend among medical trainees toward organizing notes in the medical record according to lists of organ systems or medical subspecialties and hypothesizes that system-based documentation may undermine the art of problem formulation and diagnostic synthesis. Citing research linking more sophisticated problem representation with diagnostic success, he suggests that documentation style and clinical reasoning are closely connected and that organ-based documentation may predispose trainees to several varieties of cognitive diagnostic error and deficient synthesis. These include framing error, premature or absent closure, failure to integrate related findings, and failure to recognize the level of diagnostic resolution attained for a given problem. He acknowledges the pitfalls of higher-order diagnostic resolution, including the application of labels unsupported by firm evidence, while maintaining that diagnostic resolution as far as evidence permits is essential to both rational care of patients and rigorous education of learners. He proposes further research, including comparison of diagnostic efficiency between organ- and problem-oriented thinkers. He hypothesizes that the subspecialty-based structure of academic medical services helps perpetuate organ-system-based thinking, and calls on clinical educators to renew their emphasis on the formulation and documentation of complete and precise problem lists and progressively refined diagnoses by trainees.

  18. Errors of Logic and Scholarship Concerning Dissociative Identity Disorder

    ERIC Educational Resources Information Center

    Ross, Colin A.

    2009-01-01

    The author reviewed a two-part critique of dissociative identity disorder published in the "Canadian Journal of Psychiatry". The two papers contain errors of logic and scholarship. Contrary to the conclusions in the critique, dissociative identity disorder has established diagnostic reliability and concurrent validity, the trauma histories of…

  19. Proposed Interventions to Decrease the Frequency of Missed Test Results

    ERIC Educational Resources Information Center

    Wahls, Terry L.; Cram, Peter

    2009-01-01

    Numerous studies have identified that delays in diagnosis related to the mishandling of abnormal test results are an important contributor to diagnostic errors. Factors contributing to missed results included organizational factors, provider factors and patient-related factors. At the diagnosis error conference continuing medical education conference…

  20. DIAGNOSTIC STUDY ON FINE PARTICULATE MATTER PREDICTIONS OF CMAQ IN THE SOUTHEASTERN U.S.

    EPA Science Inventory

    In this study, the authors use the process analysis tool embedded in CMAQ to examine major processes that govern the fate of key pollutants, identify the most influential processes that contribute to model errors, and guide the diagnostic and sensitivity studies aimed at improvin...

  1. Intelligent Diagnostic Assistant for Complicated Skin Diseases through C5's Algorithm.

    PubMed

    Jeddi, Fatemeh Rangraz; Arabfard, Masoud; Kermany, Zahra Arab

    2017-09-01

    An intelligent diagnostic assistant can be used for the complicated diagnosis of skin diseases, which are among the most common causes of disability. The aim of this study was to design and implement a computerized intelligent diagnostic assistant for complicated skin diseases through C5's algorithm. An applied-developmental study was conducted in 2015. The knowledge base was developed from interviews with dermatologists through questionnaires and checklists. Knowledge representation was obtained from the training data in the database using Microsoft Office Excel. Clementine software and the C5 algorithm were applied to build the decision tree. Analysis of test accuracy was performed based on rules extracted using inference chains. The rules extracted from the decision tree were entered into the CLIPS programming environment, and the intelligent diagnostic assistant was then designed. The rules were defined using a forward-chaining inference technique and were entered into the CLIPS programming environment as RULE constructs. The accuracy and error rates obtained from the decision tree in the training phase were 99.56% and 0.44%, respectively. In the test phase, the accuracy of the decision tree was 98% and the error was 2%. The intelligent diagnostic assistant can be used as a reliable system with high accuracy, sensitivity, specificity, and agreement.
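
    The same train-then-test workflow can be sketched in Python; scikit-learn's CART-style decision tree is used here only as a stand-in for Clementine's C5 algorithm, and the iris dataset replaces the unavailable skin-disease records, so the numbers are illustrative rather than comparable to the reported accuracies.

      from sklearn.datasets import load_iris
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier, export_text

      X, y = load_iris(return_X_y=True)                  # stand-in data, not skin-disease records
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
      train_acc, test_acc = tree.score(X_train, y_train), tree.score(X_test, y_test)
      print(f"train accuracy {train_acc:.2%} (error {1 - train_acc:.2%})")
      print(f"test accuracy  {test_acc:.2%} (error {1 - test_acc:.2%})")

      # The induced tree can be exported as IF-THEN style rules, analogous to the
      # rules the authors loaded into the CLIPS expert-system shell.
      print(export_text(tree))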

  2. An intelligent advisory system for pre-launch processing

    NASA Technical Reports Server (NTRS)

    Engrand, Peter A.; Mitchell, Tami

    1991-01-01

    The shuttle system of interest in this paper is the shuttle's data processing system (DPS). The DPS is composed of the following: (1) general purpose computers (GPC); (2) a multifunction CRT display system (MCDS); (3) mass memory units (MMU); and (4) a multiplexer/demultiplexer (MDM) and related software. In order to ensure correct functioning, some level of automatic error detection has been incorporated into all shuttle systems. For the DPS, error detection equipment has been incorporated into all of its subsystems. The automated diagnostic system described here, the MCDS diagnostic tool, aids in more efficient processing of the DPS.

  3. Modeling and Measurement Constraints in Fault Diagnostics for HVAC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Najafi, Massieh; Auslander, David M.; Bartlett, Peter L.

    2010-05-30

    Many studies have shown that energy savings of five to fifteen percent are achievable in commercial buildings by detecting and correcting building faults, and optimizing building control systems. However, in spite of good progress in developing tools for determining HVAC diagnostics, methods to detect faults in HVAC systems are still generally undeveloped. Most approaches use numerical filtering or parameter estimation methods to compare data from energy meters and building sensors to predictions from mathematical or statistical models. They are effective when models are relatively accurate and data contain few errors. In this paper, we address the case where models are imperfect and data are variable, uncertain, and can contain error. We apply a Bayesian updating approach that is systematic in managing and accounting for most forms of model and data errors. The proposed method uses both knowledge of first principle modeling and empirical results to analyze the system performance within the boundaries defined by practical constraints. We demonstrate the approach by detecting faults in commercial building air handling units. We find that the limitations that exist in air handling unit diagnostics due to practical constraints can generally be effectively addressed through the proposed approach.
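
    A toy sketch of Bayesian updating for fault detection is shown below; the single-sensor residual model, prior, and parameter values are assumptions for illustration and not the authors' air-handling-unit formulation.

      from scipy import stats

      def update_fault_probability(prior_fault, residual,
                                   sigma_healthy=0.5, fault_offset=2.0):
          """One Bayesian update step: compare the likelihood of a measured
          residual (prediction minus measurement) under a healthy model versus
          a faulty model with a shifted mean (assumed toy models)."""
          like_healthy = stats.norm.pdf(residual, loc=0.0, scale=sigma_healthy)
          like_faulty = stats.norm.pdf(residual, loc=fault_offset, scale=sigma_healthy)
          evidence = like_faulty * prior_fault + like_healthy * (1.0 - prior_fault)
          return like_faulty * prior_fault / evidence

      p_fault = 0.05                                  # prior belief that the unit is faulty
      for residual in [0.2, 1.8, 2.1, 1.9]:           # supply-air temperature residuals [deg C]
          p_fault = update_fault_probability(p_fault, residual)
          print(f"residual {residual:+.1f} C -> P(fault) = {p_fault:.2f}")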

  4. Patient safety priorities in mental healthcare in Switzerland: a modified Delphi study.

    PubMed

    Mascherek, Anna C; Schwappach, David L B

    2016-08-05

    Identifying patient safety priorities in mental healthcare is an emerging issue. A variety of aspects of patient safety in medical care apply for patient safety in mental care as well. However, specific aspects may be different as a consequence of special characteristics of patients, setting and treatment. The aim of the present study was to combine knowledge from the field and research and bundle existing initiatives and projects to define patient safety priorities in mental healthcare in Switzerland. The present study draws on national expert panels, namely, round-table discussion and modified Delphi consensus method. As preparation for the modified Delphi questionnaire, two round-table discussions and one semistructured questionnaire were conducted. Preparative work was conducted between May 2015 and October 2015. The modified Delphi was conducted to gauge experts' opinion on priorities in patient safety in mental healthcare in Switzerland. In two independent rating rounds, experts made private ratings. The modified Delphi was conducted in winter 2015. Nine topics were defined along the treatment pathway: diagnostic errors, non-drug treatment errors, medication errors, errors related to coercive measures, errors related to aggression management against self and others, errors in treatment of suicidal patients, communication errors, errors at interfaces of care and structural errors. Patient safety is considered as an important topic of quality in mental healthcare among experts, but it has been seriously neglected up until now. Activities in research and in practice are needed. Structural errors and diagnostics were given highest priority. From the topics identified, some are overlapping with important aspects of patient safety in medical care; however, some core aspects are unique. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  5. Patient safety priorities in mental healthcare in Switzerland: a modified Delphi study

    PubMed Central

    Mascherek, Anna C

    2016-01-01

    Objective Identifying patient safety priorities in mental healthcare is an emerging issue. A variety of aspects of patient safety in medical care apply for patient safety in mental care as well. However, specific aspects may be different as a consequence of special characteristics of patients, setting and treatment. The aim of the present study was to combine knowledge from the field and research and bundle existing initiatives and projects to define patient safety priorities in mental healthcare in Switzerland. The present study draws on national expert panels, namely, round-table discussion and modified Delphi consensus method. Design As preparation for the modified Delphi questionnaire, two round-table discussions and one semistructured questionnaire were conducted. Preparative work was conducted between May 2015 and October 2015. The modified Delphi was conducted to gauge experts' opinion on priorities in patient safety in mental healthcare in Switzerland. In two independent rating rounds, experts made private ratings. The modified Delphi was conducted in winter 2015. Results Nine topics were defined along the treatment pathway: diagnostic errors, non-drug treatment errors, medication errors, errors related to coercive measures, errors related to aggression management against self and others, errors in treatment of suicidal patients, communication errors, errors at interfaces of care and structural errors. Conclusions Patient safety is considered as an important topic of quality in mental healthcare among experts, but it has been seriously neglected up until now. Activities in research and in practice are needed. Structural errors and diagnostics were given highest priority. From the topics identified, some are overlapping with important aspects of patient safety in medical care; however, some core aspects are unique. PMID:27496233

  6. Improvements on the accuracy of beam bugs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y.J.; Fessenden, T.

    1998-08-17

    At LLNL, resistive wall monitors are used to measure the current and position of intense electron beams in electron induction linacs and beam transport lines. These monitors, known locally as ''beam bugs'', have been used throughout linear induction accelerators as essential diagnostics of beam current and location. The bugs used on ETA-II show a droop in signal due to a fast redistribution time constant of the signals. Recently, the development of a fast beam kicker has required improved accuracy in measuring the position of beams. By picking off signals at more than the usual four positions around the monitor, beam position measurement error can be greatly reduced. A second significant source of error is the mechanical variation of the resistor around the bug. This paper presents the analysis and experimental tests of the beam bugs used for beam current and position measurements in and after the fast kicker, and concludes with an outline of present and future changes that can be made to improve the accuracy of these beam bugs.

  7. Improvements on the accuracy of beam bugs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y J; Fessenden, T

    1998-09-02

    At LLNL, resistive wall monitors are used to measure the current and position of intense electron beams in electron induction linacs and beam transport lines. These monitors, known locally as "beam bugs", have been used throughout linear induction accelerators as essential diagnostics of beam current and location. The bugs used on ETA-II show a droop in signal due to a fast redistribution time constant of the signals. Recently, the development of a fast beam kicker has required improved accuracy in measuring the position of beams. By picking off signals at more than the usual four positions around the monitor, beam position measurement error can be greatly reduced. A second significant source of error is the mechanical variation of the resistor around the bug. This paper presents the analysis and experimental tests of the beam bugs used for beam current and position measurements in and after the fast kicker, and concludes with an outline of present and future changes that can be made to improve the accuracy of these beam bugs.
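
    The idea that more pickoff positions reduce position error can be sketched with the standard first-order model of wall current on a circular pipe. This is not the LLNL electronics: the Fourier-moment estimate below is a textbook approximation, and the pipe radius and signal values are made up.

```python
# Illustrative sketch (not the LLNL beam-bug readout): estimate beam position
# from N equally spaced wall-current pickoff signals. To first order the wall
# current varies as 1 + 2(r/b)cos(phi - phi0), so the first Fourier moment of
# the pickoff amplitudes gives the centroid; with N = 4 this reduces to the
# familiar difference-over-sum formula, and larger N averages down
# per-pickoff errors such as resistor variation.
import numpy as np

def beam_position(signals, pipe_radius_mm):
    """First-order centroid estimate from pickoff amplitudes."""
    s = np.asarray(signals, dtype=float)
    phi = 2 * np.pi * np.arange(len(s)) / len(s)   # pickoff angles
    x = pipe_radius_mm * np.sum(s * np.cos(phi)) / np.sum(s)
    y = pipe_radius_mm * np.sum(s * np.sin(phi)) / np.sum(s)
    return x, y

# Hypothetical 8-pickoff readout (arbitrary units) for a 50 mm radius pipe.
signals = [1.10, 1.07, 1.00, 0.93, 0.90, 0.93, 1.00, 1.07]
print(beam_position(signals, pipe_radius_mm=50.0))  # ~2.5 mm offset toward phi = 0
```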

  8. Delamination detection in smart composite beams using Lamb waves

    NASA Astrophysics Data System (ADS)

    Ip, Kim-Ho; Mai, Yiu-Wing

    2004-06-01

    This paper presents a feasibility study on using Lamb waves to detect and locate through-width delamination in fiber-reinforced plastic beams. An active diagnostic system is proposed for clamped-free specimens. It consists of a piezoelectric patch and an accelerometer both mounted near the support. Such a system can locate damage in an absolute sense, that is, a priori knowledge on the response from pristine specimens is not required. The fundamental anti-symmetric Lamb wave mode is chosen as the diagnostic wave. It is generated by applying a voltage in the form of sinusoidal bursts to the piezoelectric patch. The proposed system was applied to locate delaminations in some fabricated Kevlar/epoxy beam specimens. With an appropriate actuating frequency, distortions of waveforms due to boundary reflections can be reduced. Based on their arrival times and the known propagating speed of Lamb waves, the delaminations can be located. The errors associated with the predicted damage positions range from 4.5% to 8.5%.
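
    The location step the abstract describes, converting an echo's arrival time and the known propagation speed into a damage position, amounts to simple pulse-echo arithmetic. The sketch below uses hypothetical numbers (group velocity and arrival times are assumptions, not values from the paper).

```python
# Hedged sketch of the pulse-echo arithmetic implied by the abstract: a burst
# launched near the clamped end reflects off the delamination front, so the
# extra transit time of the echo relative to the outgoing wave corresponds to
# a round trip between the sensor and the damage.
def delamination_distance(t_incident_us, t_echo_us, group_velocity_m_per_s):
    """Distance from the sensor to the delamination front, in metres."""
    round_trip_s = (t_echo_us - t_incident_us) * 1e-6
    return group_velocity_m_per_s * round_trip_s / 2.0

# Assumed A0-mode group velocity of 1500 m/s and arrival times read from the
# sensed waveform.
d = delamination_distance(t_incident_us=40.0, t_echo_us=300.0,
                          group_velocity_m_per_s=1500.0)
print(f"estimated delamination location: {d * 1000:.0f} mm from the sensor")  # 195 mm
```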

  9. Artificial Intelligence in Medical Practice: The Question to the Answer?

    PubMed

    Miller, D Douglas; Brown, Eric W

    2018-02-01

    Computer science advances and ultra-fast computing speeds find artificial intelligence (AI) broadly benefitting modern society: forecasting weather, recognizing faces, detecting fraud, and deciphering genomics. AI's future role in medical practice remains an unanswered question. Machines (computers) learn to detect patterns not decipherable using biostatistics by processing massive datasets (big data) through layered mathematical models (algorithms). Correcting algorithm mistakes (training) increases confidence in AI predictive models. AI is being successfully applied to image analysis in radiology, pathology, and dermatology, with diagnostic speed exceeding, and accuracy paralleling, that of medical experts. While diagnostic confidence never reaches 100%, combining machines with physicians reliably enhances system performance. Cognitive programs are affecting medical practice by applying natural language processing to read the rapidly expanding scientific literature and to collate years of diverse electronic medical records. In these and other ways, AI may optimize the care trajectory of chronic disease patients, suggest precision therapies for complex illnesses, reduce medical errors, and improve subject enrollment into clinical trials. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Investigation of influence of errors of cutting machines with CNC on displacement trajectory accuracy of their actuating devices

    NASA Astrophysics Data System (ADS)

    Fedonin, O. N.; Petreshin, D. I.; Ageenko, A. V.

    2018-03-01

    In this article, the problem of increasing the accuracy of a CNC lathe by compensating for the machine's static and dynamic errors is investigated. An algorithm and a diagnostic system for a CNC machine tool are considered, which allow the errors of the machine to be determined for subsequent compensation. The results of experimental studies on diagnosing and improving the accuracy of a CNC lathe are presented.

  11. Error Variability and the Differentiation between Apraxia of Speech and Aphasia with Phonemic Paraphasia

    ERIC Educational Resources Information Center

    Haley, Katarina L.; Jacks, Adam; Cunningham, Kevin T.

    2013-01-01

    Purpose: This study was conducted to evaluate the clinical utility of error variability for differentiating between apraxia of speech (AOS) and aphasia with phonemic paraphasia. Method: Participants were 32 individuals with aphasia after left cerebral injury. Diagnostic groups were formed on the basis of operationalized measures of recognized…

  12. The Application of Strength of Association Statistics to the Item Analysis of an In-Training Examination in Diagnostic Radiology.

    ERIC Educational Resources Information Center

    Diamond, James J.; McCormick, Janet

    1986-01-01

    Using item responses from an in-training examination in diagnostic radiology, the application of a strength of association statistic to the general problem of item analysis is illustrated. Criteria for item selection, general issues of reliability, and error of measurement are discussed. (Author/LMO)

  13. A five-year experience with throat cultures.

    PubMed

    Shank, J C; Powell, T A

    1984-06-01

    This study addresses the usefulness of the throat culture in a family practice residency setting and explores the following questions: (1) Do faculty physicians clinically identify streptococcal pharyngitis better than residents? (2) With time, will residents and faculty physicians improve in their diagnostic accuracy? (3) Should the throat culture be used always, selectively, or never? A total of 3,982 throat cultures were obtained over a five-year study period with 16 percent positive for beta-hemolytic streptococci. The results were compared with the physician's clinical diagnosis of either "nonstreptococcal" (category A) or "streptococcal" (category B). Within category A, 363 of 3,023 patients had positive cultures (12 percent clinical diagnostic error rate). Within category B, 665 of 959 patients had negative cultures (69 percent clinical diagnostic error rate). Faculty were significantly better than residents in diagnosing streptococcal pharyngitis, but not in diagnosing nonstreptococcal sore throats. Neither faculty nor residents improved their diagnostic accuracy over time. Regarding age-specific recommendations, the findings support utilizing a throat culture in all children aged 2 to 15 years with sore throat, but in adults only when the physician suspects streptococcal pharyngitis.
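
    The two clinical diagnostic error rates quoted above follow directly from the category counts given in the abstract; a quick arithmetic check is shown below.

```python
# Quick check of the reported clinical diagnostic error rates from the counts
# given in the abstract.
false_negatives, clinically_nonstrep = 363, 3023   # category A: called "nonstreptococcal"
false_positives, clinically_strep = 665, 959       # category B: called "streptococcal"

print(f"category A error rate: {false_negatives / clinically_nonstrep:.0%}")  # ~12%
print(f"category B error rate: {false_positives / clinically_strep:.0%}")     # ~69%
```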

  14. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principle Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  15. Uncharted territory: measuring costs of diagnostic errors outside the medical record.

    PubMed

    Schwartz, Alan; Weiner, Saul J; Weaver, Frances; Yudkowsky, Rachel; Sharma, Gunjan; Binns-Calvey, Amy; Preyss, Ben; Jordan, Neil

    2012-11-01

    In a past study using unannounced standardised patients (USPs), substantial rates of diagnostic and treatment errors were documented among internists. Because the authors know the correct disposition of these encounters and obtained the physicians' notes, they can identify necessary treatment that was not provided and unnecessary treatment. They can also discern which errors can be identified exclusively from a review of the medical records. To estimate the avoidable direct costs incurred by physicians making errors in our previous study. In the study, USPs visited 111 internal medicine attending physicians. They presented variants of four previously validated cases that jointly manipulate the presence or absence of contextual and biomedical factors that could lead to errors in management if overlooked. For example, in a patient with worsening asthma symptoms, a complicating biomedical factor was the presence of reflux disease and a complicating contextual factor was inability to afford the currently prescribed inhaler. Costs of missed or unnecessary services were computed using Medicare cost-based reimbursement data. Fourteen practice locations, including two academic clinics, two community-based primary care networks with multiple sites, a core safety net provider, and three Veteran Administration government facilities. Contribution of errors to costs of care. Overall, errors in care resulted in predicted costs of approximately $174,000 across 399 visits, of which only $8745 was discernible from a review of the medical records alone (without knowledge of the correct diagnoses). The median cost of error per visit with an incorrect care plan differed by case and by presentation variant within case. Chart reviews alone underestimate costs of care because they typically reflect appropriate treatment decisions conditional on (potentially erroneous) diagnoses. Important information about patient context is often entirely missing from medical records. Experimental methods, including the use of USPs, reveal the substantial costs of these errors.

  16. Three-class ROC analysis--the equal error utility assumption and the optimality of three-class ROC surface using the ideal observer.

    PubMed

    He, Xin; Frey, Eric C

    2006-08-01

    Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate the three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption will not apply for all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that by making assumptions for both the MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and the resulting ROC surface contains the maximum likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
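
    The MEU decision rule at the heart of this model is simple to state: pick the class whose expected utility, computed from the posterior class probabilities and a utility matrix, is largest. The sketch below is a generic illustration of that rule, with a hypothetical utility matrix chosen so that the two incorrect decisions for any given true class share the same utility (the equal error utility assumption); it is not the paper's ROC-surface construction.

```python
# Generic maximum-expected-utility (MEU) decision rule for a three-class task.
# U[d][c] is the (hypothetical) utility of deciding class d when the true
# class is c; within each column the two incorrect decisions share one value,
# matching the equal error utility assumption.
import numpy as np

classes = ["normal", "benign", "malignant"]
U = np.array([
    [1.0, 0.2, 0.0],
    [0.2, 1.0, 0.0],
    [0.2, 0.2, 1.0],
])

posterior = np.array([0.15, 0.30, 0.55])   # assumed P(true class | observation)

expected_utility = U @ posterior           # one value per candidate decision
decision = classes[int(np.argmax(expected_utility))]
print(dict(zip(classes, expected_utility.round(3))), "->", decision)  # -> malignant
```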

  17. Clinical epidemiology.

    PubMed

    Martin, S W; Bonnett, B

    1987-06-01

    Rational clinical practice requires deductive particularization of diagnostic findings, prognoses, and therapeutic responses from groups of animals (herds) to the individual animal (herd) under consideration. This process utilizes concepts, skills, and methods of epidemiology, as they relate to the study of the distribution and determinants of health and disease in populations, and casts them in a clinical perspective. We briefly outline diagnostic strategies and introduce a measure of agreement, called kappa, between clinical diagnoses. This statistic is useful not only as a measure of diagnostic accuracy, but also as a means of quantifying and understanding disagreement between diagnosticians. It is disconcerting to many, clinicians included, that given a general deficit of data on sensitivity and specificity, the level of agreement between many clinical diagnoses is only moderate at best, with kappa values of 0.3 to 0.6. Sensitivity, specificity, pretest odds, and posttest probability of disease are defined and related to the interpretation of clinical findings and ancillary diagnostic test results. An understanding of these features, and of how they relate to ruling in or ruling out a diagnosis or minimizing diagnostic errors, will greatly enhance the diagnostic accuracy of the practitioner and reduce the frequency of clinical disagreement. The approach of running multiple tests on every patient is not only wasteful and expensive, it is unlikely to improve the ability of the clinician to establish the correct diagnosis. We conclude with a discussion of how to decide on the best therapy, a discussion which centers on, and outlines the key features of, the well-designed clinical trial. Like a diagnosis, the results from a clinical trial may not always be definitive; nonetheless, it is the best available method of gleaning information about treatment efficacy.
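
    The pretest-to-posttest arithmetic referred to above can be made concrete with a short worked example. The prevalence, sensitivity, and specificity values below are hypothetical, not from the article.

```python
# Worked sketch of the pretest-odds -> posttest-probability arithmetic.
# Sensitivity and specificity define the likelihood ratio of a positive
# test; multiplying pretest odds by that ratio gives posttest odds.
def posttest_probability(pretest_prob, sensitivity, specificity):
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    lr_positive = sensitivity / (1.0 - specificity)
    posttest_odds = pretest_odds * lr_positive
    return posttest_odds / (1.0 + posttest_odds)

# A herd with an assumed 20% pretest probability of disease and a test with
# 90% sensitivity and 80% specificity:
print(f"{posttest_probability(0.20, 0.90, 0.80):.2f}")  # ~0.53 after a positive test
```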

  18. Effects of response bias and judgment framing on operator use of an automated aid in a target detection task.

    PubMed

    Rice, Stephen; McCarley, Jason S

    2011-12-01

    Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in operators' cognitive responses to different forms of automation error. The present experiments therefore examined the effects of automation false alarms and misses on human performance under conditions in which the different forms of error were matched in their perceptual characteristics. Young adult participants performed a simulated baggage x-ray screening task while assisted by an automated diagnostic aid. Judgments from the aid were rendered as text messages presented at the onset of each trial, and every trial was followed by a second text message providing response feedback. Thus, misses and false alarms from the aid were matched for their perceptual salience. Experiment 1 found that even under these conditions, false alarms from the aid produced poorer human performance and engendered lower automation use than misses from the aid. Experiment 2, however, found that the asymmetry between misses and false alarms was reduced when the aid's false alarms were framed as neutral messages rather than explicit misjudgments. Results suggest that automation false alarms and misses differ in their inherent cognitive salience and imply that changes in diagnosis framing may allow designers to encourage better use of imperfectly reliable automated aids.

  19. Massively parallel digital high resolution melt for rapid and absolutely quantitative sequence profiling

    NASA Astrophysics Data System (ADS)

    Velez, Daniel Ortiz; Mack, Hannah; Jupe, Julietta; Hawker, Sinead; Kulkarni, Ninad; Hedayatnia, Behnam; Zhang, Yang; Lawrence, Shelley; Fraley, Stephanie I.

    2017-02-01

    In clinical diagnostics and pathogen detection, profiling of complex samples for low-level genotypes represents a significant challenge. Advances in speed, sensitivity, and extent of multiplexing of molecular pathogen detection assays are needed to improve patient care. We report the development of an integrated platform enabling the identification of bacterial pathogen DNA sequences in complex samples in less than four hours. The system incorporates a microfluidic chip and instrumentation to accomplish universal PCR amplification, High Resolution Melting (HRM), and machine learning within 20,000 picoliter scale reactions, simultaneously. Clinically relevant concentrations of bacterial DNA molecules are separated by digitization across 20,000 reactions and amplified with universal primers targeting the bacterial 16S gene. Amplification is followed by HRM sequence fingerprinting in all reactions, simultaneously. The resulting bacteria-specific melt curves are identified by Support Vector Machine learning, and individual pathogen loads are quantified. The platform reduces reaction volumes by 99.995% and achieves a greater than 200-fold increase in dynamic range of detection compared to traditional PCR HRM approaches. Type I and II error rates are reduced by 99% and 100% respectively, compared to intercalating dye-based digital PCR (dPCR) methods. This technology could impact a number of quantitative profiling applications, especially infectious disease diagnostics.
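
    The classification step, identifying organism-specific melt-curve shapes with a support vector machine, can be illustrated with a toy example. The sketch below is not the authors' pipeline: the sigmoidal "melt curves", melting temperatures, and organism labels are synthetic.

```python
# Toy illustration of melt-curve classification with a support vector
# machine: each "curve" is a vector of fluorescence values over a temperature
# ramp, and an SVC learns to separate two synthetic organism-specific shapes.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
temps = np.linspace(75, 95, 60)

def melt_curve(tm):
    """Sigmoidal drop in fluorescence around an assumed melting temperature."""
    return 1.0 / (1.0 + np.exp(temps - tm)) + rng.normal(0, 0.02, temps.size)

X = np.array([melt_curve(84.0) for _ in range(30)] +
             [melt_curve(87.5) for _ in range(30)])
y = np.array(["organism_A"] * 30 + ["organism_B"] * 30)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([melt_curve(87.5)]))   # -> ['organism_B']
```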

  20. Impact of Measurement Error on Synchrophasor Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yilu; Gracia, Jose R.; Ewing, Paul D.

    2015-07-01

    Phasor measurement units (PMUs), a type of synchrophasor, are powerful diagnostic tools that can help avert catastrophic failures in the power grid. Because of this, PMU measurement errors are particularly worrisome. This report examines the internal and external factors contributing to PMU phase angle and frequency measurement errors and gives a reasonable explanation for them. It also analyzes the impact of those measurement errors on several synchrophasor applications: event location detection, oscillation detection, islanding detection, and dynamic line rating. The primary finding is that dynamic line rating is more likely to be influenced by measurement error. Other findings include the possibility of reporting nonoscillatory activity as an oscillation as the result of error, failing to detect oscillations submerged by error, and the unlikely impact of error on event location and islanding detection.

  1. Presentation of nursing diagnosis content in fundamentals of nursing textbooks.

    PubMed

    Mahon, S M; Spies, M A; Aukamp, V; Barrett, J T; Figgins, M J; Meyer, G A; Young, V K

    1997-01-01

    The technique and rationale for the use of nursing diagnosis generally are introduced early in the undergraduate curriculum. The three purposes of this descriptive study were to describe the general characteristics and presentation of content on nursing diagnosis in fundamentals of nursing textbooks; describe how the content from the theoretical chapter(s) in nursing diagnosis is carried through in the clinical chapters; and describe how content on diagnostic errors is presented. Although most of the textbooks presented content on nursing diagnosis in a similar fashion, the clinical chapters of the books did not follow the same pattern. Content on diagnostic errors was inconsistent. Educators may find this an effective methodology for reviewing textbooks.

  2. A hybrid framework for quantifying the influence of data in hydrological model calibration

    NASA Astrophysics Data System (ADS)

    Wright, David P.; Thyer, Mark; Westra, Seth; McInerney, David

    2018-06-01

    Influence diagnostics aim to identify a small number of influential data points that have a disproportionate impact on the model parameters and/or predictions. The key issues with current influence diagnostic techniques are that the regression-theory approaches do not provide hydrologically relevant influence metrics, while the case-deletion approaches are computationally expensive to calculate. The main objective of this study is to introduce a new two-stage hybrid framework that overcomes these challenges by delivering hydrologically relevant influence metrics in a computationally efficient manner. Stage one uses computationally efficient regression-theory influence diagnostics to identify the most influential points based on Cook's distance. Stage two then uses case-deletion influence diagnostics to quantify the influence of points using hydrologically relevant metrics. To illustrate the application of the hybrid framework, we conducted three experiments on 11 hydro-climatologically diverse Australian catchments using the GR4J hydrological model. The first experiment investigated how many data points from stage one need to be retained in order to reliably identify those points that have the highest influence on hydrologically relevant metrics. We found that a choice of 30-50 is suitable for hydrological applications similar to those explored in this study (30 points identified the most influential data 98% of the time and reduced the required recalibrations by 99% for a 10 year calibration period). The second experiment found little evidence of a change in the magnitude of influence with increasing calibration period length from 1, 2, 5 to 10 years. Even for 10 years the impact of influential points can still be high (>30% influence on maximum predicted flows). The third experiment compared the standard least squares (SLS) objective function with the weighted least squares (WLS) objective function on a 10 year calibration period. In two out of three flow metrics there was evidence that SLS, with the assumption of homoscedastic residual error, identified data points with higher influence (largest changes of 40%, 10%, and 44% for the maximum, mean, and low flows, respectively) than WLS, with the assumption of heteroscedastic residual errors (largest changes of 26%, 6%, and 6% for the maximum, mean, and low flows, respectively). The hybrid framework complements existing model diagnostic tools and can be applied to a wide range of hydrological modelling scenarios.
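
    The stage-one screening idea, ranking calibration points by Cook's distance and passing only the top candidates to the expensive case-deletion stage, can be sketched on a simple linear surrogate. The sketch below is illustrative only; the hybrid framework applies the same ranking to GR4J calibration data, and the rainfall-runoff relationship, data, and cut-off here are made up.

```python
# Hedged sketch of the stage-one screening: rank points by Cook's distance
# from a cheap regression-theory approximation and keep the top few for
# case-deletion analysis. A simple linear model stands in for the
# hydrological model here.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
rainfall = rng.gamma(2.0, 5.0, size=100)
flow = 0.6 * rainfall + rng.normal(0, 2.0, size=100)
flow[7] += 25.0                                   # one suspiciously influential point

results = sm.OLS(flow, sm.add_constant(rainfall)).fit()
cooks_d = results.get_influence().cooks_distance[0]

top = np.argsort(cooks_d)[::-1][:5]               # stage-one candidates
print("most influential indices:", top)           # index 7 should appear first
```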

  3. The 3 faces of clinical reasoning: Epistemological explorations of disparate error reduction strategies.

    PubMed

    Monteiro, Sandra; Norman, Geoff; Sherbino, Jonathan

    2018-06-01

    There is general consensus that clinical reasoning involves 2 stages: a rapid stage where 1 or more diagnostic hypotheses are advanced and a slower stage where these hypotheses are tested or confirmed. The rapid hypothesis generation stage is considered inaccessible for analysis or observation. Consequently, recent research on clinical reasoning has focused specifically on improving the accuracy of the slower, hypothesis confirmation stage. Three perspectives have developed in this line of research, and each proposes different error reduction strategies for clinical reasoning. This paper considers these 3 perspectives and examines the underlying assumptions. Additionally, this paper reviews the evidence, or lack of, behind each class of error reduction strategies. The first perspective takes an epidemiological stance, appealing to the benefits of incorporating population data and evidence-based medicine in every day clinical reasoning. The second builds on the heuristic and bias research programme, appealing to a special class of dual process reasoning models that theorizes a rapid error prone cognitive process for problem solving with a slower more logical cognitive process capable of correcting those errors. Finally, the third perspective borrows from an exemplar model of categorization that explicitly relates clinical knowledge and experience to diagnostic accuracy. © 2018 John Wiley & Sons, Ltd.

  4. Making a structured psychiatric diagnostic interview faithful to the nomenclature.

    PubMed

    Robins, Lee N; Cottler, Linda B

    2004-10-15

    Psychiatric diagnostic interviews to be used in epidemiologic studies by lay interviewers have, since the 1970s, attempted to operationalize existing psychiatric nomenclatures. How to maximize the chances that they do so successfully has not previously been spelled out. In this article, the authors discuss strategies for each of the seven steps involved in writing, updating, or modifying a diagnostic interview and its supporting materials: 1) writing questions that match the nomenclature's criteria, 2) checking that respondents will be willing and able to answer the questions, 3) choosing a format acceptable to interviewers that maximizes accurate answering and recording of answers, 4) constructing a data entry and cleaning program that highlights errors to be corrected, 5) creating a diagnostic scoring program that matches the nomenclature's algorithms, 6) developing an interviewer training program that maximizes reliability, and 7) computerizing the interview. For each step, the authors discuss how to identify errors, correct them, and validate the revisions. Although operationalization will never be perfect because of ambiguities in the nomenclature, specifying methods for minimizing divergence from the nomenclature is timely as users modify existing interviews and look forward to updating interviews based on the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition, and the International Classification of Diseases, Eleventh Revision.

  5. Cost-effective Diagnostic Checklists for Meningitis in Resource Limited Settings

    PubMed Central

    Durski, Kara N.; Kuntz, Karen M.; Yasukawa, Kosuke; Virnig, Beth A.; Meya, David B.; Boulware, David R.

    2013-01-01

    Background Checklists can standardize patient care, reduce errors, and improve health outcomes. For meningitis in resource-limited settings, with high patient loads and limited financial resources, CNS diagnostic algorithms may be useful to guide diagnosis and treatment. However, the cost-effectiveness of such algorithms is unknown. Methods We used decision analysis methodology to evaluate the costs, diagnostic yield, and cost-effectiveness of diagnostic strategies for adults with suspected meningitis in resource limited settings with moderate/high HIV prevalence. We considered three strategies: 1) comprehensive “shotgun” approach of utilizing all routine tests; 2) “stepwise” strategy with tests performed in a specific order with additional TB diagnostics; 3) “minimalist” strategy of sequential ordering of high-yield tests only. Each strategy resulted in one of four meningitis diagnoses: bacterial (4%), cryptococcal (59%), TB (8%), or other (aseptic) meningitis (29%). In model development, we utilized prevalence data from two Ugandan sites and published data on test performance. We validated the strategies with data from Malawi, South Africa, and Zimbabwe. Results The current comprehensive testing strategy resulted in 93.3% correct meningitis diagnoses costing $32.00/patient. A stepwise strategy had 93.8% correct diagnoses costing an average of $9.72/patient, and a minimalist strategy had 91.1% correct diagnoses costing an average of $6.17/patient. The incremental cost effectiveness ratio was $133 per additional correct diagnosis for the stepwise over minimalist strategy. Conclusions Through strategically choosing the order and type of testing coupled with disease prevalence rates, algorithms can deliver more care more efficiently. The algorithms presented herein are generalizable to East Africa and Southern Africa. PMID:23466647
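
    The incremental cost-effectiveness ratio reported above is just the extra cost per additional correct diagnosis when moving from the minimalist to the stepwise strategy. Using the per-patient figures quoted in the abstract, the arithmetic is shown below (small rounding differences from the published $133 are expected).

```python
# Quick check of the incremental cost-effectiveness ratio (ICER) using the
# per-patient cost and accuracy figures quoted in the abstract.
strategies = {
    "minimalist": {"cost": 6.17, "correct": 0.911},
    "stepwise":   {"cost": 9.72, "correct": 0.938},
}
d_cost = strategies["stepwise"]["cost"] - strategies["minimalist"]["cost"]
d_effect = strategies["stepwise"]["correct"] - strategies["minimalist"]["correct"]
print(f"ICER = ${d_cost / d_effect:.0f} per additional correct diagnosis")  # ~$131
```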

  6. Development of extended WRF variational data assimilation system (WRFDA) for WRF non-hydrostatic mesoscale model

    NASA Astrophysics Data System (ADS)

    Pattanayak, Sujata; Mohanty, U. C.

    2018-06-01

    This paper presents the development of the extended Weather Research and Forecasting data assimilation (WRFDA) system within the framework of the non-hydrostatic mesoscale model core of the Weather Research and Forecasting system (WRF-NMM), an important aspect of numerical modeling studies. Though WRFDA originally provides improved initial conditions for the Advanced Research WRF, we have successfully developed a unified WRFDA utility that can be used by the WRF-NMM core as well. After critical evaluation, it was decided to develop a code to merge the WRFDA framework and WRF-NMM output. In this paper, we provide a few selected implementations and initial results through a single-observation test and background error statistics such as eigenvalues, eigenvectors and length scales, among others, which showcase the successful development of the extended WRFDA code for the WRF-NMM model. Furthermore, the extended WRFDA system is applied to the forecast of three severe cyclonic storms: Nargis (27 April-3 May 2008), Aila (23-26 May 2009) and Jal (4-8 November 2010), formed over the Bay of Bengal. Model results are compared and contrasted with the analysis fields and later with high-resolution model forecasts. The mean initial position error is reduced by 33% with WRFDA as compared to the GFS analysis. The vector displacement errors in the track forecasts are reduced by 33, 31, 30 and 20% for the 24, 48, 72 and 96 hr forecasts, respectively, in the data assimilation experiments as compared to the control run. The model diagnostics indicate successful implementation of WRFDA within the WRF-NMM system.

  7. Measuring quality in anatomic pathology.

    PubMed

    Raab, Stephen S; Grzybicki, Dana Marie

    2008-06-01

    This article focuses mainly on diagnostic accuracy in measuring quality in anatomic pathology, noting that measuring any quality metric is complex and demanding. The authors discuss standardization and its variability within and across areas of care delivery and efforts involving defining and measuring error to achieve pathology quality and patient safety. They propose that data linking error to patient outcome are critical for developing quality improvement initiatives targeting errors that cause patient harm in addition to using methods of root cause analysis, beyond those traditionally used in cytologic-histologic correlation, to assist in the development of error reduction and quality improvement plans.

  8. A weighted generalized score statistic for comparison of predictive values of diagnostic tests.

    PubMed

    Kosinski, Andrzej S

    2013-03-15

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose their re-formulations that are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we presented, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.

  9. A weighted generalized score statistic for comparison of predictive values of diagnostic tests

    PubMed Central

    Kosinski, Andrzej S.

    2013-01-01

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose their re-formulations which are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we present, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic which incorporates empirical covariance matrix with newly proposed weights. This statistic is simple to compute, it always reduces to the score statistic in the independent samples situation, and it preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the weighted generalized score test statistic in a general GEE setting. PMID:22912343
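
    The comparison problem addressed in the two records above, testing whether two tests applied to the same patients differ in predictive value, can be illustrated with a simple patient-level bootstrap. This sketch is not the weighted generalized score statistic proposed by the author; it only shows the paired-design structure on hypothetical data, and a naive bootstrap interval stands in for the formal test.

```python
# Illustration of a paired-design PPV comparison (not the WGS statistic):
# every patient receives both tests, disease status is known, and patients
# are resampled so the pairing is preserved.
import numpy as np

rng = np.random.default_rng(0)
n = 400
disease = rng.binomial(1, 0.3, n)
test1 = np.where(disease == 1, rng.binomial(1, 0.85, n), rng.binomial(1, 0.15, n))
test2 = np.where(disease == 1, rng.binomial(1, 0.80, n), rng.binomial(1, 0.10, n))

def ppv(test, status):
    return status[test == 1].mean()

diffs = []
for _ in range(2000):
    idx = rng.integers(0, n, n)            # resample patients, keeping the pairing
    diffs.append(ppv(test1[idx], disease[idx]) - ppv(test2[idx], disease[idx]))

lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"PPV1 - PPV2 = {ppv(test1, disease) - ppv(test2, disease):+.3f} "
      f"(95% bootstrap CI {lo:+.3f} to {hi:+.3f})")
```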

  10. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    ERIC Educational Resources Information Center

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…
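
    Association-rule mining of the kind described here rests on two quantities, support and confidence, computed over per-student records. The sketch below works through both on hypothetical error categories; the rule, categories, and counts are not from the study.

```python
# Toy sketch of the association-rule idea: compute support and confidence for
# the hypothetical rule "relative-clause error -> passive-voice error" from
# per-student error logs.
records = [
    {"relative_clause", "passive_voice"},
    {"relative_clause", "passive_voice", "technical_vocab"},
    {"technical_vocab"},
    {"relative_clause"},
    {"relative_clause", "passive_voice"},
    {"passive_voice"},
]

antecedent, consequent = {"relative_clause"}, {"passive_voice"}
n = len(records)
n_ante = sum(antecedent <= r for r in records)              # records with the antecedent
n_both = sum((antecedent | consequent) <= r for r in records)

print(f"support    = {n_both / n:.2f}")        # 0.50
print(f"confidence = {n_both / n_ante:.2f}")   # 0.75
```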

  11. Uncertainties in climate data sets

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1992-01-01

    Climate diagnostics are constructed from either analyzed fields or from observational data sets. Those that have been commonly used are normally considered ground truth. However, in most of these collections, errors and uncertainties exist which are generally ignored due to the consistency of usage over time. Examples of uncertainties and errors are described in NMC and ECMWF analyses and in satellite observational sets-OLR, TOVS, and SMMR. It is suggested that these errors can be large, systematic, and not negligible in climate analysis.

  12. VLSI (Very Large Scale Integrated Circuits) Design with the MacPitts Silicon Compiler.

    DTIC Science & Technology

    1985-09-01

    the background. If the algorithm is not fully debugged, then issue instead macpitts basename herald so MacPitts diagnostics and Liszt diagnostics both...command interpreter. Upon compilation, however, the following LISP compiler (Liszt) diagnostic results, Error: Non-number to minus nil where the first...language used in the MacPitts source code. The more instructive solution is to write the Franz LISP code to decide if a jumper wire is needed, and if so, to

  13. Model Performance Evaluation and Scenario Analysis ...

    EPA Pesticide Factsheets

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components, and the reconstruction back to time series, provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool
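
    One of the goodness-of-fit measures listed above, the Nash-Sutcliffe efficiency, has a standard textbook definition that is easy to show; the sketch below uses that formula on made-up flows and is not MPESA's own code.

```python
# Standard Nash-Sutcliffe efficiency: 1 minus the ratio of the simulation's
# squared error to the variance of the observations (1.0 is a perfect fit).
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = [3.1, 4.8, 10.2, 6.5, 2.9]   # hypothetical daily flows
sim = [2.8, 5.1, 9.0, 7.2, 3.3]
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```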

  14. Evaluation of beam divergence of a negative hydrogen ion beam using Doppler shift spectroscopy diagnostics

    NASA Astrophysics Data System (ADS)

    Deka, A. J.; Bharathi, P.; Pandya, K.; Bandyopadhyay, M.; Bhuyan, M.; Yadav, R. K.; Tyagi, H.; Gahlaut, A.; Chakraborty, A.

    2018-01-01

    The Doppler Shift Spectroscopy (DSS) diagnostic is at the conceptual stage for estimating the beam divergence, stripping losses, and beam uniformity of the 100 keV hydrogen Diagnostic Neutral Beam of the International Thermonuclear Experimental Reactor. This DSS diagnostic is intended to measure the above-mentioned parameters with an error of less than 10%. To aid the design calculations and to establish a methodology for estimating the beam divergence, DSS measurements were carried out on the existing prototype ion source, the RF Operated Beam Source in India for Negative ion Research. Emissions of the fast excited neutrals generated from the extracted negative ions were collected in the target tank, and the line broadening of these emissions was used to estimate the beam divergence. The observed broadening is a convolution of broadenings due to beam divergence, collection optics, voltage ripple, beam focusing, and instrumental broadening. Hence, for estimating the beam divergence from the observed line broadening, a systematic line profile analysis was performed. To minimize the error in the divergence measurements, a study of error propagation in the beam divergence measurements was carried out and the error was estimated. The measurements of beam divergence were made at a constant RF power of 50 kW and a source pressure of 0.6 Pa by varying the extraction voltage from 4 kV to 10 kV and the acceleration voltage from 10 kV to 15 kV. These measurements were then compared with the calorimetric divergence, and the results agreed within 10%. A minimum beam divergence of ˜3° was obtained when the source was operated at an extraction voltage of ˜5 kV and an acceleration voltage of ˜10 kV, i.e., at a total applied voltage of 15 kV. This is in agreement with values reported for experiments carried out on similar sources elsewhere.

  15. A Conceptual Framework for Decision-making Support in Uncertainty- and Risk-based Diagnosis of Rare Clinical Cases by Specialist Physicians.

    PubMed

    Santos, Adriano A; Moura, J Antão B; de Araújo, Joseana Macêdo Fechine Régis

    2015-01-01

    Mitigating uncertainty and risks faced by specialist physicians in analysis of rare clinical cases is something desired by anyone who needs health services. The number of clinical cases never seen by these experts, with little documentation, may introduce errors in decision-making. Such errors negatively affect well-being of patients, increase procedure costs, rework, health insurance premiums, and impair the reputation of specialists and medical systems involved. In this context, IT and Clinical Decision Support Systems (CDSS) play a fundamental role, supporting decision-making process, making it more efficient and effective, reducing a number of avoidable medical errors and enhancing quality of treatment given to patients. An investigation has been initiated to look into characteristics and solution requirements of this problem, model it, propose a general solution in terms of a conceptual risk-based, automated framework to support rare-case medical diagnostics and validate it by means of case studies. A preliminary validation study of the proposed framework has been carried out by interviews conducted with experts who are practicing professionals, academics, and researchers in health care. This paper summarizes the investigation and its positive results. These results motivate continuation of research towards development of the conceptual framework and of a software tool that implements the proposed model.

  16. Managing the patient identification crisis in healthcare and laboratory medicine.

    PubMed

    Lippi, Giuseppe; Mattiuzzi, Camilla; Bovo, Chiara; Favaloro, Emmanuel J

    2017-07-01

    Identification errors have emerged as critical issues in health care, as attested by the ample scientific literature on this topic. Despite available evidence suggesting that the frequency of misidentification in in vitro laboratory diagnostic testing may be relatively low compared with that of other laboratory errors (i.e., usually between 0.01 and 0.1% of all specimens received), the potential adverse consequences remain particularly worrying: 10-20% of these errors would not only translate into serious harm for the patient, but may also erode considerable human and economic resources, so the entire healthcare system should be re-engineered to act proactively and limit the burden of this important problem. The most important paradigms for reducing the chance of misidentification in healthcare entail the widespread use of more than two unique patient identifiers, accurate education and training of healthcare personnel, the delivery of more resources for patient safety (i.e., implementation of safer technological tools), and the use of customized solutions according to local organization and resources. Moreover, after weighing advantages and drawbacks, labeling blood collection tubes before and not after venipuncture may be considered the safer practice for safeguarding patient safety and optimizing the phlebotomist's activity. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  17. Diagnostic x-ray dosimetry using Monte Carlo simulation.

    PubMed

    Ioppolo, J L; Price, R I; Tuchyna, T; Buckley, C E

    2002-05-21

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 x 10^7) than were required for the calculation of dose profiles (1 x 10^9). The EGS4 code was able to satisfactorily predict absorbed dose and thereby provide an instrument for reducing patient and staff effective dose imparted during radiological investigations.

  18. Diagnostic x-ray dosimetry using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Ioppolo, J. L.; Price, R. I.; Tuchyna, T.; Buckley, C. E.

    2002-05-01

    An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10^7) than were required for the calculation of dose profiles (1 × 10^9). The EGS4 code was able to satisfactorily predict absorbed dose and thereby provide an instrument for reducing patient and staff effective dose imparted during radiological investigations.
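
    The Monte Carlo principle behind dose codes such as EGS4, sampling random photon histories and tallying deposited energy, can be hinted at with a deliberately crude one-dimensional sketch. This toy ignores scattering, secondary electrons, spectra, and geometry, so it is nowhere near a real shower simulation; the attenuation coefficient, energy, and phantom thickness are assumptions.

```python
# Toy 1-D Monte Carlo: photon free paths are sampled from an exponential
# distribution and absorbed energy is tallied per depth bin. Illustrative
# only; not an EGS4-style shower simulation.
import numpy as np

rng = np.random.default_rng(0)
mu = 0.25            # assumed linear attenuation coefficient, 1/cm
thickness = 20.0     # phantom thickness, cm
energy_kev = 60.0    # hypothetical mono-energetic beam
n_photons = 100_000

depths = rng.exponential(1.0 / mu, n_photons)   # sampled interaction depths
absorbed = depths[depths < thickness]           # the rest exit the phantom

counts, _ = np.histogram(absorbed, bins=20, range=(0, thickness))
energy_per_bin = counts * energy_kev / n_photons   # mean keV deposited per 1 cm bin

print(f"fraction absorbed: {absorbed.size / n_photons:.3f}")
print("keV per photon deposited in the first three bins:", energy_per_bin[:3].round(1))
```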

  19. On-line data collection platform for national dose surveys in diagnostic and interventional radiology.

    PubMed

    Vassileva, J; Simeonov, F; Avramova-Cholakova, S

    2015-07-01

    According to the Bulgarian regulation for radiation protection in medical exposure, the National Centre of Radiobiology and Radiation Protection (NCRRP) is responsible for performing national dose surveys in diagnostic and interventional radiology and nuclear medicine and for establishing national diagnostic reference levels (DRLs). The next national dose survey is under preparation, to be performed in the period 2015-16, with the aim of covering conventional radiography, mammography, conventional fluoroscopy, interventional and fluoroscopy-guided procedures and CT. It will be performed electronically using a centralised on-line data collection platform established by the NCRRP. The aim is to increase the response rate and to improve accuracy by reducing human errors. The concept of the on-line dose data collection platform is presented. Radiological facilities are provided with a tool to determine local typical patient doses, and the NCRRP with a means to establish national DRLs. Future work will include automatic retrieval of dose data from hospital picture archiving and communication systems. The on-line data collection platform is expected to facilitate the process of dose audit and optimisation of radiological procedures in Bulgarian hospitals. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Why patients' disruptive behaviours impair diagnostic reasoning: a randomised experiment.

    PubMed

    Mamede, Sílvia; Van Gog, Tamara; Schuit, Stephanie C E; Van den Berge, Kees; Van Daele, Paul L A; Bueving, Herman; Van der Zee, Tim; Van den Broek, Walter W; Van Saase, Jan L C M; Schmidt, H G

    2017-01-01

    Patients who display disruptive behaviours in the clinical encounter (the so-called 'difficult patients') may negatively affect doctors' diagnostic reasoning, thereby causing diagnostic errors. The present study aimed at investigating the mechanisms underlying the negative influence of difficult patients' behaviours on doctors' diagnostic performance. A randomised experiment with 74 internal medicine residents. Doctors diagnosed eight written clinical vignettes that were exactly the same except for the patients' behaviours (either difficult or neutral). Each participant diagnosed half of the vignettes in a difficult patient version and the other half in a neutral version in a counterbalanced design. After diagnosing each vignette, participants were asked to recall the patient's clinical findings and behaviours. Main measurements were: diagnostic accuracy scores; time spent on diagnosis, and amount of information recalled from patients' clinical findings and behaviours. Mean diagnostic accuracy scores (range 0-1) were significantly lower for difficult than neutral patients' vignettes (0.41 vs 0.51; p<0.01). Time spent on diagnosing was similar. Participants recalled fewer clinical findings (mean=29.82% vs mean=32.52%; p<0.001) and more behaviours (mean=25.51% vs mean=17.89%; p<0.001) from difficult than from neutral patients. Difficult patients' behaviours induce doctors to make diagnostic errors, apparently because doctors spend part of their mental resources on dealing with the difficult patients' behaviours, impeding adequate processing of clinical findings. Efforts should be made to increase doctors' awareness of the potential negative influence of difficult patients' behaviours on diagnostic decisions and their ability to counteract such influence. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  1. Challenges in pediatric chronic inflammatory demyelinating polyneuropathy.

    PubMed

    Haliloğlu, Göknur; Yüksel, Deniz; Temoçin, Cağri Mesut; Topaloğlu, Haluk

    2016-12-01

    Chronic inflammatory demyelinating neuropathy, a treatable immune-mediated disease of the peripheral nervous system, is less common in childhood than in adults. Despite different sets of diagnostic criteria, the lack of a reliable biologic marker leads to challenges in diagnosis, follow-up and treatment. Our first aim was to review clinical presentation, course, response to treatment, and prognosis in our childhood patients. We also aimed to document diagnostic and therapeutic pitfalls and challenges at the bedside. Our original cohort consisted of 23 pediatric patients who were referred to us with a clinical diagnosis of chronic inflammatory demyelinating neuropathy. Seven patients who reached an alternative diagnosis were excluded. In the remaining patients, diagnostic, treatment and follow-up data were compared between typical patients, who satisfied both clinical and electrodiagnostic criteria, and atypical patients, who failed to meet minimal research chronic inflammatory demyelinating neuropathy electrodiagnostic requirements. Eight of 16 patients (50%) met the minimal chronic inflammatory demyelinating neuropathy research diagnostic requirements. The only statistically significant difference between the two groups (p = 0.010) was in the European Neuromuscular Centre childhood chronic inflammatory demyelinating neuropathy mandatory clinical diagnostic criteria. Misdiagnosis due to errors in electrophysiological interpretation (100%, n = 8), cerebrospinal fluid cytoalbuminologic dissociation (100%, n = 4) and/or subjective improvement on any immunotherapy modality (80 ± 19.27%) was frequent. Pediatric CIDP is challenging in terms of diagnostic and therapeutic pitfalls at the bedside. Diagnostic errors due to electrophysiological interpretation, cerebrospinal fluid cytoalbuminologic dissociation, and/or subjective improvement on immunotherapy should be considered. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Investigation into diagnostic agreement using automated computer-assisted histopathology pattern recognition image analysis.

    PubMed

    Webster, Joshua D; Michalowski, Aleksandra M; Dwyer, Jennifer E; Corps, Kara N; Wei, Bih-Rong; Juopperi, Tarja; Hoover, Shelley B; Simpson, R Mark

    2012-01-01

    The extent to which histopathology pattern recognition image analysis (PRIA) agrees with microscopic assessment has not been established. Thus, a commercial PRIA platform was evaluated in two applications using whole-slide images. Substantial agreement, lacking significant constant or proportional errors, between PRIA and manual morphometric image segmentation was obtained for pulmonary metastatic cancer areas (Passing/Bablok regression). Bland-Altman analysis indicated heteroscedastic measurements and a tendency toward increasing variance with increasing tumor burden, but no significant trend in mean bias. The average between-methods percent tumor content difference was -0.64. Analysis of between-methods measurement differences relative to the percent tumor magnitude revealed that method disagreement had an impact primarily in the smallest measurements (tumor burden <3%). Regression-based 95% limits of agreement indicated substantial agreement for method interchangeability. Repeated measures revealed a concordance correlation of >0.988, indicating high reproducibility for both methods, yet PRIA reproducibility was superior (CV: PRIA = 7.4, manual = 17.1). Evaluation of PRIA on morphologically complex teratomas led to diagnostic agreement with pathologist assessments of pluripotency on subsets of teratomas. Accommodation of the diversity of teratoma histologic features frequently resulted in detrimental trade-offs, increasing PRIA error elsewhere in images. PRIA error was nonrandom and influenced by variations in histomorphology. File-size limitations encountered while training algorithms and consequences of spectral image processing dominance contributed to diagnostic inaccuracies experienced for some teratomas. PRIA appeared better suited for tissues with limited phenotypic diversity. Technical improvements may enhance diagnostic agreement, and consistent pathologist input will benefit further development and application of PRIA.
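
    For readers unfamiliar with the agreement statistics used above, the following short Python sketch computes a Bland-Altman mean bias and 95% limits of agreement for hypothetical PRIA versus manual percent-tumor measurements (the data values are invented for illustration only):

    ```python
    import numpy as np

    def bland_altman(method_a, method_b):
        """Mean bias and 95% limits of agreement between two measurement methods."""
        a = np.asarray(method_a, dtype=float)
        b = np.asarray(method_b, dtype=float)
        diff = a - b                                 # between-methods difference
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)         # half-width of the limits of agreement
        return bias, (bias - half_width, bias + half_width)

    # Hypothetical percent-tumor-content measurements (PRIA vs manual segmentation)
    pria = [1.2, 2.8, 5.5, 10.1, 20.3, 35.0]
    manual = [1.5, 3.0, 5.1, 10.8, 19.6, 36.2]
    bias, limits = bland_altman(pria, manual)
    print(f"mean bias = {bias:.2f}%, limits of agreement = ({limits[0]:.2f}%, {limits[1]:.2f}%)")
    ```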

  3. Determining the Numeracy and Algebra Errors of Students in a Two-Year Vocational School

    ERIC Educational Resources Information Center

    Akyüz, Gözde

    2015-01-01

    The goal of this study was to determine the mathematics achievement level in basic numeracy and algebra concepts of students in a two-year program in a technical vocational school of higher education and determine the errors that they make in these topics. The researcher developed a diagnostic mathematics achievement test related to numeracy and…

  4. "Finding Useful Questions: On Bayesian Diagnosticity, Probability, Impact, and Information Gain": Correction to Nelson (2005)

    ERIC Educational Resources Information Center

    Nelson, Jonathan D.

    2007-01-01

    Reports an error in "Finding Useful Questions: On Bayesian Diagnosticity, Probability, Impact, and Information Gain" by Jonathan D. Nelson (Psychological Review, 2005[Oct], Vol 112[4], 979-999). In Table 13, the data should indicate that 7% of females had short hair and 93% of females had long hair. The calculations and discussion in the article…

  5. Modelling and analysis of flux surface mapping experiments on W7-X

    NASA Astrophysics Data System (ADS)

    Lazerson, Samuel; Otte, Matthias; Bozhenkov, Sergey; Sunn Pedersen, Thomas; Bräuer, Torsten; Gates, David; Neilson, Hutch; W7-X Team

    2015-11-01

    The measurement and compensation of error fields in W7-X will be key to the device achieving high beta steady state operations. Flux surface mapping utilizes the vacuum magnetic flux surfaces, a feature unique to stellarators and heliotrons, to allow direct measurement of magnetic topology, and thereby allows a highly accurate determination of remnant magnetic field errors. As will be reported separately at this meeting, the first measurements confirming the existence of nested flux surfaces in W7-X have been made. In this presentation, a synthetic diagnostic for the flux surface mapping diagnostic is presented. It utilizes Poincaré traces to construct an image of the flux surface consistent with the measured camera geometry, fluorescent rod sweep plane, and emitter beam position. Forward modeling of the high-iota configuration will be presented demonstrating an ability to measure the intrinsic error field using the U.S. supplied trim coil system on W7-X, and a first experimental assessment of error fields in W7-X will be presented. This work has been authored by Princeton University under Contract Number DE-AC02-09CH11466 with the US Department of Energy.

  6. Perceived association between diagnostic and non-diagnostic cues of women's sexual interest: General Recognition Theory predictors of risk for sexual coercion.

    PubMed

    Farris, Coreen; Viken, Richard J; Treat, Teresa A

    2010-01-01

    Young men's errors in sexual perception have been linked to sexual coercion. The current investigation sought to explicate the perceptual and decisional sources of these social perception errors, as well as their link to risk for sexual violence. General Recognition Theory (GRT; [Ashby, F. G., & Townsend, J. T. (1986). Varieties of perceptual independence. Psychological Review, 93, 154-179]) was used to estimate participants' ability to discriminate between affective cues and clothing style cues and to measure illusory correlations between men's perception of women's clothing style and sexual interest. High-risk men were less sensitive to the distinction between women's friendly and sexual interest cues relative to other men. In addition, they were more likely to perceive an illusory correlation between women's diagnostic sexual interest cues (e.g., facial affect) and non-diagnostic cues (e.g., provocative clothing), which increases the probability that high-risk men will misperceive friendly women as intending to communicate sexual interest. The results provide information about the degree of risk conferred by individual differences in perceptual processing of women's interest cues, and also illustrate how translational scientists might adapt GRT to examine research questions about individual differences in social perception.

  7. Uniform-penalty inversion of multiexponential decay data. II. Data spacing, T(2) data, systematic data errors, and diagnostics.

    PubMed

    Borgia, G C; Brown, R J; Fantazzini, P

    2000-12-01

    The basic method of UPEN (uniform penalty inversion of multiexponential decay data) is given in an earlier publication (Borgia et al., J. Magn. Reson. 132, 65-77 (1998)), which also discusses the effects of noise, constraints, and smoothing on the resolution or apparent resolution of features of a computed distribution of relaxation times. UPEN applies negative feedback to a regularization penalty, allowing stronger smoothing for a broad feature than for a sharp line. This avoids unnecessarily broadening the sharp line and/or breaking the wide peak or tail into several peaks that the relaxation data do not demand to be separate. The experimental and artificial data presented earlier were T(1) data, and all had fixed data spacings, uniform in log-time. However, for T(2) data, usually spaced uniformly in linear time, or for data spaced in any manner, we have found that the data spacing does not enter explicitly into the computation. The present work shows the extension of UPEN to T(2) data, including the averaging of data in windows and the use of the corresponding weighting factors in the computation. Measures are implemented to control portions of computed distributions extending beyond the data range. The input smoothing parameters in UPEN are normally fixed, rather than data dependent. A major problem arises, especially at high signal-to-noise ratios, when UPEN is applied to data sets with systematic errors due to instrumental nonidealities or adjustment problems. For instance, a relaxation curve for a wide line can be narrowed by an artificial downward bending of the relaxation curve. Diagnostic parameters are generated to help identify data problems, and the diagnostics are applied in several examples, with particular attention to the meaningful resolution of two closely spaced peaks in a distribution of relaxation times. Where feasible, processing with UPEN in nearly real time should help identify data problems while further instrument adjustments can still be made. The need for the nonnegative constraint is greatly reduced in UPEN, and preliminary processing without this constraint helps identify data sets for which application of the nonnegative constraint is too expensive in terms of error of fit for the data set to represent sums of decaying positive exponentials plus random noise. Copyright 2000 Academic Press.
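
    UPEN itself adapts its smoothing penalty locally via negative feedback; the sketch below shows only the simplest fixed-penalty analogue, a Tikhonov-regularised non-negative least-squares inversion of a synthetic T(2) decay, to illustrate the kind of kernel inversion being discussed (all values are illustrative, not from the paper):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Synthetic T2 decay with two components (30 ms and 200 ms); all values invented.
    rng = np.random.default_rng(0)
    t = np.linspace(0.5e-3, 1.0, 400)                  # echo times, s
    signal = 0.6 * np.exp(-t / 0.03) + 0.4 * np.exp(-t / 0.2)
    signal = signal + rng.normal(0, 0.002, t.size)

    # Log-spaced relaxation-time grid and kernel K[i, j] = exp(-t_i / T2_j)
    T2 = np.logspace(-3, 0.5, 100)
    K = np.exp(-t[:, None] / T2[None, :])

    # Fixed penalty on the amplitude vector with a non-negativity constraint.
    # UPEN instead adapts the penalty locally; this is only the simplest analogue.
    alpha = 0.1
    K_aug = np.vstack([K, alpha * np.eye(T2.size)])
    y_aug = np.concatenate([signal, np.zeros(T2.size)])
    amplitudes, _ = nnls(K_aug, y_aug)                 # approximate T2 distribution
    print("T2 at the largest peak (s):", T2[np.argmax(amplitudes)])
    ```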

  8. Identifying Vulnerable Plaques with Acoustic Radiation Force Impulse Imaging

    NASA Astrophysics Data System (ADS)

    Doherty, Joshua Ryan

    The rupture of arterial plaques is the most common cause of ischemic complications including stroke, the fourth leading cause of death and number one cause of long-term disability in the United States. Unfortunately, because conventional diagnostic tools fail to identify plaques that confer the highest risk, often a disabling stroke and/or sudden death is the first sign of disease. A diagnostic method capable of characterizing plaque vulnerability would likely enhance the predictive ability and ultimately the treatment of stroke before the onset of clinical events. This dissertation evaluates the hypothesis that Acoustic Radiation Force Impulse (ARFI) imaging can noninvasively identify lipid regions, which have been shown to increase a plaque's propensity to rupture, within carotid artery plaques in vivo. The work detailed herein describes development efforts and results from simulations and experiments that were performed to evaluate this hypothesis. To first demonstrate feasibility and evaluate potential safety concerns, finite-element method simulations are used to model the response of carotid artery plaques to an acoustic radiation force excitation. Lipid pool visualization is shown to vary as a function of lipid pool geometry and stiffness. A comparison of the resulting Von Mises stresses indicates that stresses induced by an ARFI excitation are three orders of magnitude lower than those induced by blood pressure. This dissertation also presents the development of a novel pulse inversion harmonic tracking method to reduce clutter-imposed errors in ultrasound-based tissue displacement estimates. This method is validated in phantoms and was found to reduce bias and jitter displacement errors for a marked improvement in image quality in vivo. Lastly, this dissertation presents results from a preliminary in vivo study that compares ARFI imaging-derived plaque stiffness with spatially registered composition determined by a Magnetic Resonance Imaging (MRI) gold standard in human carotid artery plaques. It is shown in this capstone experiment that lipid-filled regions in MRI correspond to areas of increased displacement in ARFI imaging, while calcium and loose matrix components in MRI correspond to uniformly low displacements in ARFI imaging. This dissertation provides evidence to support that ARFI imaging may provide important prognostic and diagnostic information regarding stroke risk via measurements of plaque stiffness. More generally, the results have important implications for all acoustic radiation force based imaging methods used clinically.

  9. Localized Glaucomatous Change Detection within the Proper Orthogonal Decomposition Framework

    PubMed Central

    Balasubramanian, Madhusudhanan; Kriegman, David J.; Bowd, Christopher; Holst, Michael; Weinreb, Robert N.; Sample, Pamela A.; Zangwill, Linda M.

    2012-01-01

    Purpose. To detect localized glaucomatous structural changes using proper orthogonal decomposition (POD) framework with false-positive control that minimizes confirmatory follow-ups, and to compare the results to topographic change analysis (TCA). Methods. We included 167 participants (246 eyes) with ≥4 Heidelberg Retina Tomograph (HRT)-II exams from the Diagnostic Innovations in Glaucoma Study; 36 eyes progressed by stereo-photographs or visual fields. All other patient eyes (n = 210) were non-progressing. Specificities were evaluated using 21 normal eyes. Significance of change at each HRT superpixel between each follow-up and its nearest baseline (obtained using POD) was estimated using mixed-effects ANOVA. Locations with significant reduction in retinal height (red pixels) were determined using Bonferroni, Lehmann-Romano k-family-wise error rate (k-FWER), and Benjamini-Hochberg false discovery rate (FDR) type I error control procedures. Observed positive rate (OPR) in each follow-up was calculated as a ratio of number of red pixels within disk to disk size. Progression by POD was defined as one or more follow-ups with OPR greater than the anticipated false-positive rate. TCA was evaluated using the recently proposed liberal, moderate, and conservative progression criteria. Results. Sensitivity in progressors, specificity in normals, and specificity in non-progressors, respectively, were POD-Bonferroni = 100%, 0%, and 0%; POD k-FWER = 78%, 86%, and 43%; POD-FDR = 78%, 86%, and 43%; POD k-FWER with retinal height change ≥50 μm = 61%, 95%, and 60%; TCA-liberal = 86%, 62%, and 21%; TCA-moderate = 53%, 100%, and 70%; and TCA-conservative = 17%, 100%, and 84%. Conclusions. With a stronger control of type I errors, k-FWER in POD framework minimized confirmatory follow-ups while providing diagnostic accuracy comparable to TCA. Thus, POD with k-FWER shows promise to reduce the number of confirmatory follow-ups required for clinical care and studies evaluating new glaucoma treatments. (ClinicalTrials.gov number, NCT00221897.) PMID:22491406
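
    The type I error control procedures named above are standard; as a minimal illustration (not the authors' implementation), the following Python sketch flags "red" superpixels under Bonferroni and Benjamini-Hochberg control given a vector of per-superpixel p-values, here simulated:

    ```python
    import numpy as np

    def significant_superpixels(p_values, alpha=0.05):
        """Flag superpixels with significant retinal-height reduction under
        Bonferroni (family-wise) and Benjamini-Hochberg (FDR) control."""
        p = np.asarray(p_values, dtype=float)
        m = p.size

        bonferroni = p < alpha / m                   # strictest family-wise control

        # Benjamini-Hochberg step-up procedure
        order = np.argsort(p)
        thresholds = alpha * (np.arange(1, m + 1) / m)
        below = p[order] <= thresholds
        fdr = np.zeros(m, dtype=bool)
        if below.any():
            cutoff = np.max(np.nonzero(below)[0])
            fdr[order[: cutoff + 1]] = True
        return bonferroni, fdr

    # Hypothetical per-superpixel p-values from the mixed-effects ANOVA
    rng = np.random.default_rng(0)
    p_vals = rng.uniform(0.0, 1.0, 1000)
    bonf, fdr = significant_superpixels(p_vals)
    print(bonf.sum(), "red pixels by Bonferroni;", fdr.sum(), "by FDR")
    ```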

  10. Automated and unsupervised detection of malarial parasites in microscopic images.

    PubMed

    Purwar, Yashasvi; Shah, Sirish L; Clarke, Gwen; Almugairi, Areej; Muehlenbachs, Atis

    2011-12-13

    Malaria is a serious infectious disease. According to the World Health Organization, it is responsible for nearly one million deaths each year. There are various techniques to diagnose malaria, of which manual microscopy is considered the gold standard. However, due to the number of steps required in manual assessment, this diagnostic method is time-consuming (leading to late diagnosis) and prone to human error (leading to erroneous diagnosis), even in experienced hands. The focus of this study is to develop a robust, unsupervised and sensitive malaria screening technique with low material cost and one that has an advantage over other techniques in that it minimizes human reliance and is, therefore, more consistent in applying diagnostic criteria. A method based on digital image processing of Giemsa-stained thin smear images is developed to facilitate the diagnostic process. The diagnosis procedure is divided into two parts: enumeration and identification. The image-based method presented here is designed to automate the process of enumeration and identification, with the main advantage being its ability to carry out the diagnosis in an unsupervised manner and yet achieve high sensitivity, thus reducing cases of false negatives. The image-based method is tested on more than 500 images from two independent laboratories. The aim is to distinguish between positive and negative cases of malaria using thin smear blood slide images. Due to the unsupervised nature of the method, it requires minimal human intervention, thus speeding up the whole process of diagnosis. Overall sensitivity to capture cases of malaria is 100%, and specificity ranges from 50% to 88% across all species of malaria parasites. The image-based screening method will speed up the whole process of diagnosis and is advantageous over laboratory procedures that are prone to errors and where pathological expertise is minimal. Further, this method provides a consistent and robust way of generating parasite clearance curves.
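
    The abstract does not give the algorithm itself, so the sketch below is only a schematic Python illustration of the two stages mentioned: enumeration by thresholding and labelling of stained objects, and slide-level sensitivity/specificity. The image, area threshold and tallies are invented placeholders.

    ```python
    import numpy as np
    from skimage import color, filters, measure

    def count_stained_objects(rgb_image, min_area=20):
        """Schematic enumeration step: threshold a Giemsa-stained thin-smear image
        and count dark stained objects above a minimum area (in pixels)."""
        gray = color.rgb2gray(rgb_image)
        mask = gray < filters.threshold_otsu(gray)   # stained structures are darker
        labels = measure.label(mask)
        return sum(1 for r in measure.regionprops(labels) if r.area >= min_area)

    def sensitivity_specificity(tp, fn, tn, fp):
        """Slide-level screening performance against expert microscopy."""
        return tp / (tp + fn), tn / (tn + fp)

    # Placeholder image and invented slide-level tallies, for illustration only
    demo_image = np.random.default_rng(0).random((128, 128, 3))
    print("candidate objects:", count_stained_objects(demo_image))
    sens, spec = sensitivity_specificity(tp=120, fn=0, tn=300, fp=60)
    print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
    ```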

  11. Improvements to the ion Doppler spectrometer diagnostic on the HIT-SI experiments.

    PubMed

    Hossack, Aaron; Chandra, Rian; Everson, Chris; Jarboe, Tom

    2018-03-01

    An ion Doppler spectrometer diagnostic system measuring impurity ion temperature and velocity on the HIT-SI and HIT-SI3 spheromak devices has been improved with higher spatiotemporal resolution and lower error than previously described devices. Hardware and software improvements to the established technique have resulted in a record of 6.9 μs temporal and ≤2.8 cm spatial resolution in the midplane of each device. These allow C III and O II flow, displacement, and temperature profiles to be observed simultaneously. With 72 fused-silica fiber channels in two independent bundles, and an f/8.5 Czerny-Turner spectrometer coupled to a video camera, frame rates of up to ten times the imposed magnetic perturbation frequency of 14.5 kHz were achieved in HIT-SI, viewing the upper half of the midplane. In HIT-SI3, frame rates of up to eight times the perturbation frequency were achieved viewing both halves of the midplane. Biorthogonal decomposition is used as a novel filtering tool, reducing uncertainty in ion temperature from ≲13 to ≲5 eV (with an instrument temperature of 8-16 eV) and uncertainty in velocity from ≲2 to ≲1 km/s. Doppler shift and broadening are calculated via the Levenberg-Marquardt algorithm, after which the errors in velocity and temperature are uniquely specified. Axisymmetric temperature profiles on HIT-SI3 for C III peaked near the inboard current separatrix at ≈40 eV are observed. Axisymmetric plasma displacement profiles have been measured on HIT-SI3, peaking at ≈6 cm at the outboard separatrix. Both profiles agree with the upper half of the midplane observable by HIT-SI. With its complete midplane view, HIT-SI3 has unambiguously extracted axisymmetric, toroidal current dependent rotation of up to 3 km/s. Analysis of the temporal phase of the displacement uncovers a coherent structure, locked to the applied perturbation. Previously described diagnostic systems could not achieve such results.
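
    As an illustration of the fitting step described above, the sketch below fits a Gaussian to a synthetic spectral line with scipy's curve_fit (which uses Levenberg-Marquardt for unconstrained problems) and converts the line shift and width to a flow velocity and ion temperature. The rest wavelength and all data values are assumptions for illustration, not values taken from the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.constants import atomic_mass, c, e

    def gaussian(x, amp, center, sigma, offset):
        return amp * np.exp(-0.5 * ((x - center) / sigma) ** 2) + offset

    # Synthetic spectral line, expressed as wavelength offset from the rest
    # wavelength in picometres. The C III rest wavelength (taken here as roughly
    # 465 nm) and every data value below are assumptions for illustration.
    lam0 = 464.74e-9                                   # rest wavelength, m
    dlam_pm = np.linspace(-80.0, 80.0, 80)             # offsets, pm
    true = gaussian(dlam_pm, 1.0, 4.6, 28.0, 0.05)     # 4.6 pm shift, 28 pm sigma
    rng = np.random.default_rng(0)
    spectrum = true + rng.normal(0, 0.01, dlam_pm.size)

    # curve_fit defaults to Levenberg-Marquardt when no bounds are given
    popt, pcov = curve_fit(gaussian, dlam_pm, spectrum, p0=[1.0, 0.0, 20.0, 0.0])
    amp, center_pm, sigma_pm, offset = popt

    velocity = c * (center_pm * 1e-12) / lam0                        # Doppler shift -> m/s
    mass = 12.0 * atomic_mass                                        # carbon ion mass, kg
    temperature_eV = mass * c**2 * (sigma_pm * 1e-12 / lam0) ** 2 / e
    print(f"v = {velocity / 1e3:.2f} km/s, T = {temperature_eV:.1f} eV")
    ```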

  12. Improvements to the ion Doppler spectrometer diagnostic on the HIT-SI experiments

    NASA Astrophysics Data System (ADS)

    Hossack, Aaron; Chandra, Rian; Everson, Chris; Jarboe, Tom

    2018-03-01

    An ion Doppler spectrometer diagnostic system measuring impurity ion temperature and velocity on the HIT-SI and HIT-SI3 spheromak devices has been improved with higher spatiotemporal resolution and lower error than previously described devices. Hardware and software improvements to the established technique have resulted in a record of 6.9 μs temporal and ≤2.8 cm spatial resolution in the midplane of each device. These allow C III and O II flow, displacement, and temperature profiles to be observed simultaneously. With 72 fused-silica fiber channels in two independent bundles, and an f/8.5 Czerny-Turner spectrometer coupled to a video camera, frame rates of up to ten times the imposed magnetic perturbation frequency of 14.5 kHz were achieved in HIT-SI, viewing the upper half of the midplane. In HIT-SI3, frame rates of up to eight times the perturbation frequency were achieved viewing both halves of the midplane. Biorthogonal decomposition is used as a novel filtering tool, reducing uncertainty in ion temperature from ≲13 to ≲5 eV (with an instrument temperature of 8-16 eV) and uncertainty in velocity from ≲2 to ≲1 km/s. Doppler shift and broadening are calculated via the Levenberg-Marquardt algorithm, after which the errors in velocity and temperature are uniquely specified. Axisymmetric temperature profiles on HIT-SI3 for C III peaked near the inboard current separatrix at ≈40 eV are observed. Axisymmetric plasma displacement profiles have been measured on HIT-SI3, peaking at ≈6 cm at the outboard separatrix. Both profiles agree with the upper half of the midplane observable by HIT-SI. With its complete midplane view, HIT-SI3 has unambiguously extracted axisymmetric, toroidal current dependent rotation of up to 3 km/s. Analysis of the temporal phase of the displacement uncovers a coherent structure, locked to the applied perturbation. Previously described diagnostic systems could not achieve such results.

  13. Improvements to the Ion Doppler Spectrometer Diagnostic on the HIT-SI Experiments

    DOE PAGES

    Hossack, Aaron; Chandra, Rian; Everson, Christopher; ...

    2018-03-09

    An Ion Doppler Spectrometer diagnostic system measuring impurity ion temperature and velocity on the HIT-SI and HIT-SI3 spheromak devices has been improved with higher spatiotemporal resolution and lower error than previously described devices. Hardware and software improvements to the established technique have resulted in a record 6.9 µs temporal and ≤2.8 cm spatial resolution in the midplane of each device. These allow C III and O II flow, displacement, and temperature profiles to be simultaneously observed. With 72 fused-silica fiber channels in two independent bundles, and an f/8.5 Czerny-Turner spectrometer coupled to a video camera, frame rates of up to ten times the imposed magnetic perturbation frequency of 14.5 kHz were achieved in HIT-SI, viewing the upper half of the midplane. In HIT-SI3, frame rates of up to eight times the perturbation frequency were achieved viewing both halves of the midplane. Biorthogonal decomposition is used as a novel filtering tool, reducing uncertainty in ion temperature from ≤13 to ≤5 eV (with an instrument temperature of 8-16 eV), and uncertainty in velocity from ≤2 to ≤1 km/s. Doppler shift and broadening are calculated via the Levenberg-Marquardt algorithm, after which errors in velocity and temperature are uniquely specified. Axisymmetric temperature profiles on HIT-SI3 for C III peaked near the inboard current separatrix at approximately 40 eV are observed. Axisymmetric plasma displacement profiles have been measured on HIT-SI3, peaking at approximately 6 cm at the outboard separatrix. Both profiles agree with the upper half of the midplane observable by HIT-SI. With its complete midplane view, HIT-SI3 has unambiguously extracted axisymmetric, toroidal current dependent rotation of up to 3 km/s. Analysis of the temporal phase of the displacement uncovers a coherent structure, locked to the applied perturbation. Previously described diagnostic systems could not achieve such results.

  14. Improvements to the Ion Doppler Spectrometer Diagnostic on the HIT-SI Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hossack, Aaron; Chandra, Rian; Everson, Christopher

    An Ion Doppler Spectrometer diagnostic system measuring impurity ion temperature and velocity on the HIT-SI and HIT-SI3 spheromak devices has been improved with higher spatiotemporal resolution and lower error than previously described devices. Hardware and software improvements to the established technique have resulted in a record 6.9 µs temporal and ≤2.8 cm spatial resolution in the midplane of each device. These allow C III and O II flow, displacement, and temperature profiles to be simultaneously observed. With 72 fused-silica fiber channels in two independent bundles, and an f/8.5 Czerny-Turner spectrometer coupled to a video camera, frame rates of up to ten times the imposed magnetic perturbation frequency of 14.5 kHz were achieved in HIT-SI, viewing the upper half of the midplane. In HIT-SI3, frame rates of up to eight times the perturbation frequency were achieved viewing both halves of the midplane. Biorthogonal decomposition is used as a novel filtering tool, reducing uncertainty in ion temperature from ≤13 to ≤5 eV (with an instrument temperature of 8-16 eV), and uncertainty in velocity from ≤2 to ≤1 km/s. Doppler shift and broadening are calculated via the Levenberg-Marquardt algorithm, after which errors in velocity and temperature are uniquely specified. Axisymmetric temperature profiles on HIT-SI3 for C III peaked near the inboard current separatrix at approximately 40 eV are observed. Axisymmetric plasma displacement profiles have been measured on HIT-SI3, peaking at approximately 6 cm at the outboard separatrix. Both profiles agree with the upper half of the midplane observable by HIT-SI. With its complete midplane view, HIT-SI3 has unambiguously extracted axisymmetric, toroidal current dependent rotation of up to 3 km/s. Analysis of the temporal phase of the displacement uncovers a coherent structure, locked to the applied perturbation. Previously described diagnostic systems could not achieve such results.

  15. The role of blood vessels in high-resolution volume conductor head modeling of EEG.

    PubMed

    Fiederer, L D J; Vorwerk, J; Lucka, F; Dannhauer, M; Yang, S; Dümpelmann, M; Schulze-Bonhage, A; Aertsen, A; Speck, O; Wolters, C H; Ball, T

    2016-03-01

    Reconstruction of the electrical sources of human EEG activity at high spatio-temporal accuracy is an important aim in neuroscience and neurological diagnostics. Over the last decades, numerous studies have demonstrated that realistic modeling of head anatomy improves the accuracy of source reconstruction of EEG signals. For example, including a cerebro-spinal fluid compartment and the anisotropy of white matter electrical conductivity were both shown to significantly reduce modeling errors. Here, we quantify for the first time the role of detailed reconstructions of the cerebral blood vessels in volume conductor head modeling for EEG. To study the role of the highly arborized cerebral blood vessels, we created a submillimeter head model based on ultra-high-field-strength (7T) structural MRI datasets. Blood vessels (arteries and emissary/intraosseous veins) were segmented using Frangi multi-scale vesselness filtering. The final head model consisted of a geometry-adapted cubic mesh with over 17×10^6 nodes. We solved the forward model using a finite-element-method (FEM) transfer matrix approach, which reduced computation times substantially, and quantified the importance of the blood vessel compartment by computing the forward and inverse errors that result from ignoring the blood vessels. Our results show that ignoring emissary veins piercing the skull leads to focal localization errors of approximately 5 to 15 mm. Large errors (>2 cm) were observed due to the carotid arteries and the dense arterial vasculature in areas such as the insula or the medial temporal lobe. Thus, in such predisposed areas, errors caused by neglecting blood vessels can reach similar magnitudes as those previously reported for neglecting white matter anisotropy, the CSF or the dura, structures that are generally considered important components of realistic EEG head models. Our findings thus imply that including a realistic blood vessel compartment in EEG head models will be helpful to improve the accuracy of EEG source analyses, particularly when high accuracy is required in brain areas with dense vasculature. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
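
    Vessel segmentation of the kind described can be prototyped with the multi-scale Frangi vesselness filter; the sketch below (using scikit-image, with a random placeholder volume and an arbitrary threshold) shows only the basic step of turning an angiographic volume into a vessel mask that could then serve as an extra conductivity compartment:

    ```python
    import numpy as np
    from skimage.filters import frangi

    # Placeholder 3-D volume standing in for a 7T angiographic dataset,
    # normalised to [0, 1]; bright tubular structures correspond to vessels.
    volume = np.random.default_rng(0).random((64, 64, 64))

    # Multi-scale Frangi vesselness: responds strongly to tube-like structures
    # at the requested scales (sigmas, in voxels).
    vesselness = frangi(volume, sigmas=range(1, 4), black_ridges=False)

    # A simple threshold (value chosen arbitrarily here) turns the vesselness
    # map into a blood-vessel mask for the volume conductor model.
    vessel_mask = vesselness > 0.05
    print("vessel voxels:", int(vessel_mask.sum()))
    ```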

  16. The frontal-anatomic specificity of design fluency repetitions and their diagnostic relevance for behavioral variant frontotemporal dementia.

    PubMed

    Possin, Katherine L; Chester, Serana K; Laluz, Victor; Bostrom, Alan; Rosen, Howard J; Miller, Bruce L; Kramer, Joel H

    2012-09-01

    On tests of design fluency, an examinee draws as many different designs as possible in a specified time limit while avoiding repetition. The neuroanatomical substrates and diagnostic group differences of design fluency repetition errors and total correct scores were examined in 110 individuals diagnosed with dementia, 53 with mild cognitive impairment (MCI), and 37 neurologically healthy controls. The errors correlated significantly with volumes in the right and left orbitofrontal cortex (OFC), the right and left superior frontal gyrus, the right inferior frontal gyrus, and the right striatum, but did not correlate with volumes in any parietal or temporal lobe regions. Regression analyses indicated that the lateral OFC may be particularly crucial for preventing these errors, even after excluding patients with behavioral variant frontotemporal dementia (bvFTD) from the analysis. Total correct correlated more diffusely with volumes in the right and left frontal and parietal cortex, the right temporal cortex, and the right striatum and thalamus. Patients diagnosed with bvFTD made significantly more repetition errors than patients diagnosed with MCI, Alzheimer's disease, semantic dementia, progressive supranuclear palsy, or corticobasal syndrome. In contrast, total correct design scores did not differentiate the dementia patients. These results highlight the frontal-anatomic specificity of design fluency repetitions. In addition, the results indicate that the propensity to make these errors supports the diagnosis of bvFTD. (JINS, 2012, 18, 1-11).

  17. Methodological, technical, and ethical issues of a computerized data system.

    PubMed

    Rice, C A; Godkin, M A; Catlin, R J

    1980-06-01

    This report examines some methodological, technical, and ethical issues which need to be addressed in designing and implementing a valid and reliable computerized clinical data base. The report focuses on the data collection system used by four residency based family health centers, affiliated with the University of Massachusetts Medical Center. It is suggested that data reliability and validity can be maximized by: (1) standardizing encounter forms at affiliated health centers to eliminate recording biases and ensure data comparability; (2) using forms with a diagnosis checklist to reduce coding errors and increase the number of diagnoses recorded per encounter; (3) developing uniform diagnostic criteria; (4) identifying sources of error, including discrepancies of clinical data as recorded in medical records, encounter forms, and the computer; and (5) improving provider cooperation in recording data by distributing data summaries which reinforce the data's applicability to service provision. Potential applications of the data for research purposes are restricted by personnel and computer costs, confidentiality considerations, programming related issues, and, most importantly, health center priorities, largely focused on patient care, not research.

  18. Diagnostic Techniques to Elucidate the Aerodynamic Performance of Acoustic Liners

    NASA Technical Reports Server (NTRS)

    June, Jason; Bertolucci, Brandon; Ukeiley, Lawrence; Cattafesta, Louis N., III; Sheplak, Mark

    2017-01-01

    In support of Topic A.2.8 of NASA NRA NNH10ZEA001N, the University of Florida (UF) has investigated the use of flow field optical diagnostic and micromachined sensor-based techniques for assessing the wall shear stress on an acoustic liner. Stereoscopic particle image velocimetry (sPIV) was used to study the velocity field over a liner in the Grazing Flow Impedance Duct (GFID). The results indicate that the use of a control volume based method to determine the wall shear stress is prone to significant error. The skin friction over the liner as measured using velocity curve fitting techniques was shown to be locally reduced behind an orifice, relative to the hard wall case in a streamwise plane centered on the orifice. The capacitive wall shear stress sensor exhibited a linear response for a range of shear stresses over a hard wall. PIV over the liner is consistent with lifting of the near wall turbulent structure as it passes over an orifice, followed by a region of low wall shear stress.
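
    The study's velocity curve-fitting approach is not detailed in this abstract; as a much simpler illustration of how wall shear stress follows from near-wall velocity data, the sketch below fits the viscous-sublayer relation tau_w = mu * du/dy to hypothetical sPIV samples (the data values, viscosity and number of points used are assumptions, and this is not the method used in the study):

    ```python
    import numpy as np

    def wall_shear_stress(y, u, mu=1.81e-5):
        """Estimate wall shear stress from near-wall velocity samples by fitting
        u(y) linearly in the viscous sublayer, where tau_w = mu * du/dy at the wall."""
        dudy = np.polyfit(np.asarray(y, dtype=float), np.asarray(u, dtype=float), 1)[0]
        return mu * dudy

    # Hypothetical near-wall sPIV samples: wall-normal positions (m) and velocities (m/s)
    y = np.array([0.05e-3, 0.10e-3, 0.15e-3, 0.20e-3])
    u = np.array([0.9, 1.8, 2.6, 3.5])
    print(f"tau_w = {wall_shear_stress(y, u):.3f} Pa")   # mu is the viscosity of air, Pa*s
    ```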

  19. Study on Unified Chaotic System-Based Wind Turbine Blade Fault Diagnostic System

    NASA Astrophysics Data System (ADS)

    Kuo, Ying-Che; Hsieh, Chin-Tsung; Yau, Her-Terng; Li, Yu-Chung

    At present, vibration signals are processed and analyzed mostly in the frequency domain. The spectrum clearly shows the signal structure, and the specific characteristic frequency band is analyzed, but the number of calculations required is huge, resulting in delays. Therefore, this study uses the characteristics of a nonlinear system, loading the complete vibration signal into the unified chaotic system, applying the dynamic error to analyze the wind turbine vibration signal, and adopting extenics theory for artificial-intelligence-based fault diagnosis of the analyzed signal. Hence, a fault diagnostic system has been developed for wind turbine rotating blades. This study simulates three wind turbine blade states, namely stress rupture, screw loosening and blade loss, and validates the methods. The experimental results show that the unified chaotic system used in this paper has a significant effect on vibration signal analysis. Thus, the operating conditions of wind turbines can be quickly determined from this fault diagnostic system, and the maintenance schedule can be arranged before faults worsen, making the management and operation of wind turbines smoother and reducing many unnecessary costs.
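
    The unified chaotic system referred to above is the standard one-parameter family that spans the Lorenz, Lü and Chen systems. How exactly the vibration signal is loaded into it and how the dynamic error is defined are not specified in this abstract, so the Python sketch below is only a hedged illustration in which the signal drives the x equation and the dynamic error is the distance from an undriven reference trajectory:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def unified_chaotic(t, state, alpha, drive):
        """Unified chaotic system (spans the Lorenz family for alpha in [0, 1]);
        here an external signal u(t) is added to the x equation as a drive term."""
        x, y, z = state
        u = drive(t)
        dx = (25 * alpha + 10) * (y - x) + u
        dy = (28 - 35 * alpha) * x - x * z + (29 * alpha - 1) * y
        dz = x * y - (8 + alpha) / 3 * z
        return [dx, dy, dz]

    # Hypothetical blade vibration signal and a zero reference signal
    vibration = lambda t: 0.5 * np.sin(2 * np.pi * 30 * t)
    silence = lambda t: 0.0

    t_eval = np.linspace(0.0, 10.0, 2000)
    driven = solve_ivp(unified_chaotic, (0, 10), [1.0, 1.0, 1.0],
                       args=(0.5, vibration), t_eval=t_eval, max_step=1e-3)
    reference = solve_ivp(unified_chaotic, (0, 10), [1.0, 1.0, 1.0],
                          args=(0.5, silence), t_eval=t_eval, max_step=1e-3)

    # One possible "dynamic error": distance between driven and reference trajectories
    dynamic_error = np.linalg.norm(driven.y - reference.y, axis=0)
    print("mean dynamic error:", dynamic_error.mean())
    ```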

  20. Lyme Disease Testing in a High-Incidence State: Clinician Knowledge and Patterns.

    PubMed

    Conant, Joanna L; Powers, Julia; Sharp, Gregory; Mead, Paul S; Nelson, Christina A

    2018-02-17

    Lyme disease (LD) incidence is increasing, but data suggest some clinicians are not fully aware of recommended procedures for ordering and interpreting diagnostic tests. The study objective was to assess clinicians' knowledge and practices regarding LD testing in a high-incidence region. We distributed surveys to 1,142 clinicians in the University of Vermont Medical Center region, of which 144 were completed (12.6% response rate). We also examined LD laboratory test results and logs of calls to laboratory customer service over a period of 2.5 years and 6 months, respectively. Most clinicians demonstrated basic knowledge of diagnostic protocols, but many misinterpreted Western blot results. For example, 42.4% incorrectly interpreted a positive immunoglobulin M result as an overall positive test in a patient with longstanding symptoms. Many also reported receiving patient requests for unvalidated tests. Additional education and modifications to LD test ordering and reporting systems would likely reduce errors and improve patient care. © American Society for Clinical Pathology, 2018. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  1. Dichroic beamsplitter for high energy laser diagnostics

    DOEpatents

    LaFortune, Kai N [Livermore, CA; Hurd, Randall [Tracy, CA; Fochs, Scott N [Livermore, CA; Rotter, Mark D [San Ramon, CA; Hackel, Lloyd [Livermore, CA

    2011-08-30

    Wavefront control techniques are provided for the alignment and performance optimization of optical devices. A Shack-Hartmann wavefront sensor can be used to measure the wavefront distortion and a control system generates feedback error signal to optics inside the device to correct the wavefront. The system can be calibrated with a low-average-power probe laser. An optical element is provided to couple the optical device to a diagnostic/control package in a way that optimizes both the output power of the optical device and the coupling of the probe light into the diagnostics.

  2. Differentiating School-Aged Children with and without Language Impairment Using Tense and Grammaticality Measures from a Narrative Task

    ERIC Educational Resources Information Center

    Guo, Ling-Yu; Schneider, Phyllis

    2016-01-01

    Purpose: To determine the diagnostic accuracy of the finite verb morphology composite (FVMC), number of errors per C-unit (Errors/CU), and percent grammatical C-units (PGCUs) in differentiating school-aged children with language impairment (LI) and those with typical language development (TL). Method: Participants were 61 six-year-olds (50 TL, 11…

  3. An Application of M[subscript 2] Statistic to Evaluate the Fit of Cognitive Diagnostic Models

    ERIC Educational Resources Information Center

    Liu, Yanlou; Tian, Wei; Xin, Tao

    2016-01-01

    The fit of cognitive diagnostic models (CDMs) to response data needs to be evaluated, since CDMs might yield misleading results when they do not fit the data well. Limited-information statistic M[subscript 2] and the associated root mean square error of approximation (RMSEA[subscript 2]) in item factor analysis were extended to evaluate the fit of…

  4. Erratum: “Multi-point, high-speed passive ion velocity distribution diagnostic on the Pegasus Toroidal Experiment” [Rev. Sci. Instrum. 83, 10D516 (2012)]

    DOE PAGES

    Burke, Marcus G.; Fonck, Raymond J.; Bongard, Michael W.; ...

    2016-07-18

    This article corrects an error in M.G. Burke et al., 'Multi-point, high-speed passive ion velocity distribution diagnostic on the Pegasus Toroidal Experiment,' Rev. Sci. Instrum. 83, 10D516 (2012) pertaining to ion temperature. The conclusions of this paper are not altered by the revised ion temperature measurements.

  5. Specialist integrated haematological malignancy diagnostic services: an Activity Based Cost (ABC) analysis of a networked laboratory service model.

    PubMed

    Dalley, C; Basarir, H; Wright, J G; Fernando, M; Pearson, D; Ward, S E; Thokula, P; Krishnankutty, A; Wilson, G; Dalton, A; Talley, P; Barnett, D; Hughes, D; Porter, N R; Reilly, J T; Snowden, J A

    2015-04-01

    Specialist Integrated Haematological Malignancy Diagnostic Services (SIHMDS) were introduced as a standard of care within the UK National Health Service to reduce diagnostic error and improve clinical outcomes. Two broad models of service delivery have become established: 'co-located' services operating from a single site, and 'networked' services, with geographically separated laboratories linked by common management and information systems. Detailed systematic cost analysis has never been published on any established SIHMDS model. We used Activity Based Costing (ABC) to construct a cost model for our regional 'networked' SIHMDS covering a two-million population, based on activity in 2011. Overall estimated running costs were £1 056 260 per annum (£733 400 excluding consultant costs), with individual running costs for the diagnosis, staging, disease monitoring and end of treatment assessment components of £723 138, £55 302, £184 152 and £94 134 per annum, respectively. The cost distribution by department was 28.5% for haematology, 29.5% for histopathology and 42% for genetics laboratories. Costs of the diagnostic pathways varied considerably; pathways for myelodysplastic syndromes and lymphoma were the most expensive, and those for essential thrombocythaemia and polycythaemia vera the least. ABC analysis enables estimation of the running costs of a SIHMDS model composed of 'networked' laboratories. Similar cost analyses for other SIHMDS models covering varying populations are warranted to optimise quality and cost-effectiveness in delivery of modern haemato-oncology diagnostic services in the UK as well as internationally. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  6. Do patients' disruptive behaviours influence the accuracy of a doctor's diagnosis? A randomised experiment.

    PubMed

    Schmidt, H G; Van Gog, Tamara; Schuit, Stephanie Ce; Van den Berge, Kees; Van Daele, Paul L; Bueving, Herman; Van der Zee, Tim; Van den Broek, Walter W; Van Saase, Jan L; Mamede, Sílvia

    2017-01-01

    Literature suggests that patients who display disruptive behaviours in the consulting room fuel negative emotions in doctors. These emotions, in turn, are said to cause diagnostic errors. Evidence substantiating this claim is, however, lacking. The purpose of the present experiment was to study the effect of such difficult patients' behaviours on doctors' diagnostic performance. We created six vignettes in which patients were depicted as difficult (displaying distressing behaviours) or neutral. Three clinical cases were deemed diagnostically simple and three diagnostically complex. Sixty-three family practice residents were asked to evaluate the vignettes and make the patient's diagnosis quickly and then through deliberate reflection. In addition, the amount of time needed to arrive at a diagnosis was measured. Finally, the participants rated the patient's likability. Mean diagnostic accuracy scores (range 0-1) were significantly lower for difficult than for neutral patients (0.54 vs 0.64; p=0.017). Overall diagnostic accuracy was higher for simple than for complex cases. Deliberate reflection upon the case improved initial diagnostic accuracy, regardless of case complexity and of patient behaviours (0.60 vs 0.68, p=0.002). The amount of time needed to diagnose the case was similar regardless of the patient's behaviour. Finally, average likability ratings were lower for difficult than for neutral-patient cases. Disruptive behaviours displayed by patients seem to induce doctors to make diagnostic errors. Interestingly, the confrontation with difficult patients does not, however, cause the doctor to spend less time on such cases. Time can therefore not be considered an intermediary between the way the patient is perceived, his or her likability, and diagnostic performance. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  7. Evaluation of malaria rapid diagnostic test (RDT) use by community health workers: a longitudinal study in western Kenya.

    PubMed

    Boyce, Matthew R; Menya, Diana; Turner, Elizabeth L; Laktabai, Jeremiah; Prudhomme-O'Meara, Wendy

    2018-05-18

    Malaria rapid diagnostic tests (RDTs) are a simple, point-of-care technology that can improve the diagnosis and subsequent treatment of malaria. They are an increasingly common diagnostic tool, but concerns remain about their use by community health workers (CHWs). These concerns regard the long-term trends relating to infection prevention measures, the interpretation of test results and adherence to treatment protocols. This study assessed whether CHWs maintained their competency at conducting RDTs over a 12-month timeframe, and if this competency varied with specific CHW characteristics. From June to September, 2015, CHWs (n = 271) were trained to conduct RDTs using a 3-day validated curriculum and a baseline assessment was completed. Between June and August, 2016, CHWs (n = 105) were randomly selected and recruited for follow-up assessments using a 20-step checklist that classified steps as relating to safety, accuracy, and treatment; 103 CHWs participated in follow-up assessments. Poisson regressions were used to test for associations between CHW characteristics and error counts at follow-up, and Poisson regression models fit using generalized estimating equations were used to compare data across time-points. At both baseline and follow-up observations, at least 80% of CHWs correctly completed 17 of the 20 steps. CHW age of 50 years or older was associated with increased total errors and safety errors at baseline and follow-up. At follow-up, prior experience conducting RDTs was associated with fewer errors. Performance, as it related to the correct completion of all checklist steps and safety steps, did not decline over the 12 months, and performance of accuracy steps improved (mean error ratio: 0.51; 95% CI 0.40-0.63). Visual interpretation of RDT results yielded a CHW sensitivity of 92.0% and a specificity of 97.3% when compared to interpretation by the research team. None of the characteristics investigated was found to be significantly associated with RDT interpretation. With training, most CHWs performing RDTs maintain diagnostic testing competency over at least 12 months. CHWs generally perform RDTs safely and accurately interpret results. Younger age and prior experience with RDTs were associated with better testing performance. Future research should investigate the mode by which CHW characteristics impact RDT procedures.
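
    As an illustration of the kind of count-data model described (a plain Poisson regression here; the study's across-time-point comparisons additionally used generalized estimating equations), the sketch below regresses simulated follow-up error counts on two CHW characteristics and reports incidence-rate ratios. All data and variable names are invented for illustration.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Simulated follow-up checklist data: one row per assessed CHW (all values invented)
    rng = np.random.default_rng(0)
    n = 103
    data = pd.DataFrame({
        "age_50_plus": rng.integers(0, 2, n),
        "prior_rdt_experience": rng.integers(0, 2, n),
        "errors": rng.poisson(1.5, n),               # total checklist errors at follow-up
    })

    # Poisson regression of error counts on CHW characteristics
    exog = sm.add_constant(data[["age_50_plus", "prior_rdt_experience"]])
    result = sm.GLM(data["errors"], exog, family=sm.families.Poisson()).fit()
    print(np.exp(result.params))                     # incidence-rate ratios
    ```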

  8. Influence of nuclei segmentation on breast cancer malignancy classification

    NASA Astrophysics Data System (ADS)

    Jelen, Lukasz; Fevens, Thomas; Krzyzak, Adam

    2009-02-01

    Breast cancer is one of the deadliest cancers affecting middle-aged women. Accurate diagnosis and prognosis are crucial to reduce the high death rate. Nowadays there are numerous diagnostic tools for breast cancer diagnosis. In this paper we discuss the role of nuclei segmentation from fine needle aspiration biopsy (FNA) slides and its influence on malignancy classification. Classification of malignancy plays a very important role during the diagnosis process of breast cancer. Out of all cancer diagnostic tools, FNA slides provide the most valuable information about the cancer malignancy grade, which helps to choose an appropriate treatment. This process involves assessing numerous nuclear features, and therefore precise segmentation of nuclei is very important. In this work we compare three powerful segmentation approaches and test their impact on the classification of breast cancer malignancy. The studied approaches involve level set segmentation, fuzzy c-means segmentation and textural segmentation based on co-occurrence matrices. Segmented nuclei were used to extract nuclear features for malignancy classification. For classification purposes four different classifiers were trained and tested with the previously extracted features. The compared classifiers are Multilayer Perceptron (MLP), Self-Organizing Maps (SOM), Principal Component-based Neural Network (PCA) and Support Vector Machines (SVM). The presented results show that level set segmentation yields the best results of the three compared approaches and leads to good feature extraction, with the lowest average error rate of 6.51% over the four classifiers. The best individual performance was recorded for the multilayer perceptron, with an error rate of 3.07% using fuzzy c-means segmentation.
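
    The classification stage that follows segmentation can be sketched as below: a support vector machine (one of the four classifiers compared) trained on placeholder nuclear feature vectors, with cross-validated error rate as the figure of merit. The feature values and labels are random stand-ins, not data from the study.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Placeholder nuclear feature vectors (e.g. area, perimeter, texture statistics),
    # one row per FNA slide; labels 0 = low, 1 = high malignancy grade. All random.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 8))
    y = rng.integers(0, 2, 200)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated error rate: {1 - scores.mean():.2%}")
    ```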

  9. The Effect of Multiple Sulfatase Deficiency (MSD) on Dental Development: Can We Use the Teeth as an Early Diagnostic Tool?

    PubMed

    Zilberman, Uri; Bibi, Haim

    2016-01-01

    Multiple sulfatase deficiency (MSD) is a rare autosomal recessive inborn error of metabolism due to reduced catalytic activity of the different sulfatases. Affected individuals show neurologic deterioration with mental retardation, skeletal anomalies, organomegaly, and skin changes as in X-linked ichthyosis. The only tissue that has not been examined in MSD patients is the dentition. To evaluate the effect of the metabolic error on dental development in a patient with the intermediate severe late-infantile form of MSD (S155P), histological and chemical studies were performed on three deciduous and five permanent teeth from the MSD patient and pair-matched normal patients. Tooth germ size and enamel thickness were reduced in both deciduous and permanent MSD teeth, and the scalloping feature of the DEJ was missing in MSD teeth, causing enamel to break off from the dentin. The mineral components in the enamel and dentin were different. The metabolic error affects the teeth at the stage of organogenesis in both the deciduous and permanent dentition. The end result is teeth with very sharp cusp tips, thin hypomineralized enamel, and exposed dentin due to the breaking off of enamel. These findings are different from those in all other types of MPS syndromes. Clinically, the phenotype of the intermediate severe late-infantile form of MSD appeared during the third year of life. In children of parents who are carriers, the disease can be diagnosed as early as birth using a radiograph of the anterior upper region, or as early as 6-8 months when the first deciduous teeth erupt, and very early treatment can be considered to ameliorate the symptoms.

  10. Standardizing Plasmodium falciparum infection prevalence measured via microscopy versus rapid diagnostic test.

    PubMed

    Mappin, Bonnie; Cameron, Ewan; Dalrymple, Ursula; Weiss, Daniel J; Bisanzio, Donal; Bhatt, Samir; Gething, Peter W

    2015-11-17

    Large-scale mapping of Plasmodium falciparum infection prevalence relies on opportunistic assemblies of infection prevalence data arising from thousands of P. falciparum parasite rate (PfPR) surveys conducted worldwide. Variance in these data is driven by both signal, the true underlying pattern of infection prevalence, and a range of factors contributing to 'noise', including sampling error, differing age ranges of subjects and differing parasite detection methods. Whilst the former two noise components have been addressed in previous studies, the effect of different diagnostic methods used to determine PfPR in different studies has not. In particular, the majority of PfPR data are based on positivity rates determined by either microscopy or rapid diagnostic test (RDT), yet these approaches are not equivalent; therefore a method is needed for standardizing RDT and microscopy-based prevalence estimates prior to use in mapping. Twenty-five recent Demographic and Health Surveys (DHS) datasets from sub-Saharan Africa provide child diagnostic test results derived using both RDT and microscopy for each individual. These prevalence estimates were aggregated across level one administrative zones and a Bayesian probit regression model fit to the microscopy- versus RDT-derived prevalence relationship. An errors-in-variables approach was employed to account for sampling error in both the dependent and independent variables. In addition to the diagnostic outcome, RDT type, fever status and recent anti-malarial treatment were extracted from the datasets in order to analyse their effect on observed malaria prevalence. A strong non-linear relationship between the microscopy- and RDT-derived prevalence was found. The results of regressions stratified by the additional diagnostic variables (RDT type, fever status and recent anti-malarial treatment) indicate that there is a distinct and consistent difference in the relationship when the data are stratified by febrile status and RDT brand. The relationships defined in this research can be applied to RDT-derived PfPR data to effectively convert them to an estimate of the parasite prevalence expected using microscopy (or vice versa), thereby standardizing the dataset and improving the signal-to-noise ratio. Additionally, the results provide insight on the importance of RDT brands, febrile status and recent anti-malarial treatment for explaining inconsistencies between observed prevalence derived from different diagnostics.
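
    The published model is a Bayesian probit regression with an errors-in-variables treatment of sampling error; the sketch below is only a simplified least-squares analogue of the probit-link mapping from RDT-derived to microscopy-derived prevalence, using invented admin-unit prevalence pairs:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def probit_link(pr_rdt, a, b):
        """Microscopy prevalence as a probit-linear function of RDT prevalence."""
        clipped = np.clip(pr_rdt, 1e-6, 1 - 1e-6)
        return norm.cdf(a + b * norm.ppf(clipped))

    # Invented admin-unit prevalence pairs (RDT-derived, microscopy-derived)
    pr_rdt = np.array([0.05, 0.10, 0.20, 0.35, 0.50, 0.65])
    pr_microscopy = np.array([0.03, 0.07, 0.14, 0.26, 0.38, 0.52])

    params, _ = curve_fit(probit_link, pr_rdt, pr_microscopy, p0=[0.0, 1.0])
    print("standardized prevalence at PfPR(RDT) = 0.30:",
          round(float(probit_link(0.30, *params)), 3))
    ```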

  11. Reduced error signalling in medication-naive children with ADHD: associations with behavioural variability and post-error adaptations

    PubMed Central

    Plessen, Kerstin J.; Allen, Elena A.; Eichele, Heike; van Wageningen, Heidi; Høvik, Marie Farstad; Sørensen, Lin; Worren, Marius Kalsås; Hugdahl, Kenneth; Eichele, Tom

    2016-01-01

    Background We examined the blood-oxygen level–dependent (BOLD) activation in brain regions that signal errors and their association with intraindividual behavioural variability and adaptation to errors in children with attention-deficit/hyperactivity disorder (ADHD). Methods We acquired functional MRI data during a Flanker task in medication-naive children with ADHD and healthy controls aged 8–12 years and analyzed the data using independent component analysis. For components corresponding to performance monitoring networks, we compared activations across groups and conditions and correlated them with reaction times (RT). Additionally, we analyzed post-error adaptations in behaviour and motor component activations. Results We included 25 children with ADHD and 29 controls in our analysis. Children with ADHD displayed reduced activation to errors in cingulo-opercular regions and higher RT variability, but no differences of interference control. Larger BOLD amplitude to error trials significantly predicted reduced RT variability across all participants. Neither group showed evidence of post-error response slowing; however, post-error adaptation in motor networks was significantly reduced in children with ADHD. This adaptation was inversely related to activation of the right-lateralized ventral attention network (VAN) on error trials and to task-driven connectivity between the cingulo-opercular system and the VAN. Limitations Our study was limited by the modest sample size and imperfect matching across groups. Conclusion Our findings show a deficit in cingulo-opercular activation in children with ADHD that could relate to reduced signalling for errors. Moreover, the reduced orienting of the VAN signal may mediate deficient post-error motor adaptions. Pinpointing general performance monitoring problems to specific brain regions and operations in error processing may help to guide the targets of future treatments for ADHD. PMID:26441332

  12. The Difference between Uncertainty and Information, and Why This Matters

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2016-12-01

    Earth science investigation and arbitration (for decision making) is very often organized around a concept of uncertainty. It seems relatively straightforward that the purpose of our science is to reduce uncertainty about how environmental systems will react and evolve under different conditions. I propose here that approaching a science of complex systems as a process of quantifying and reducing uncertainty is a mistake, and specifically a mistake that is rooted in certain rather historic logical errors. Instead I propose that we should be asking questions about information. I argue here that an information-based perspective facilitates almost trivial answers to environmental science questions that are either difficult or theoretically impossible to answer when posed as questions about uncertainty. In particular, I propose that an information-centric perspective leads to: Coherent and non-subjective hypothesis tests for complex system models. Process-level diagnostics for complex systems models. Methods for building complex systems models that allow for inductive inference without the need for a priori specification of likelihood functions or ad hoc error metrics. Asymptotically correct quantification of epistemic uncertainty. To put this in slightly more basic terms, I propose that an information-theoretic philosophy of science has the potential to resolve certain important aspects of the Demarcation Problem and the Duhem-Quine Problem, and that Hydrology and other Earth Systems Sciences can immediately capitalize on this to address some of our most difficult and persistent problems.

  13. When policy meets physiology: the challenge of reducing resident work hours.

    PubMed

    Lockley, Steven W; Landrigan, Christopher P; Barger, Laura K; Czeisler, Charles A

    2006-08-01

    Considerable controversy exists regarding optimal work hours for physicians and surgeons in training. In a series of studies, we assessed the effect of extended work hours on resident sleep and health as well as patient safety. In a validated nationwide survey, we found that residents who had worked 24 hours or longer were 2.3 times more likely to have a motor vehicle crash following that shift than when they worked < 24 hours, and that the monthly risk of a crash increased by 16.2% after each extended duration shift. We also found in a randomized trial that interns working a traditional on-call schedule slept 5.8 hours less per week, had twice as many attentional failures on duty overnight, and made 36% more serious medical errors and nearly six times more serious diagnostic errors than when working on a schedule that limited continuous duty to 16 hours. While numerous opinions have been published opposing reductions in extended work hours due to concerns regarding continuity of patient care, reduced educational opportunities, and traditionally-defined professionalism, there are remarkably few objective data in support of continuing to schedule medical trainees to work shifts > 24 hours. An evidence-based approach is needed to minimize the well-documented risk that current work hour practices confer on resident health and patient safety while optimizing education and continuity of care.

  14. Missed Diagnosis of Cardiovascular Disease in Outpatient General Medicine: Insights from Malpractice Claims Data.

    PubMed

    Quinn, Gene R; Ranum, Darrell; Song, Ellen; Linets, Margarita; Keohane, Carol; Riah, Heather; Greenberg, Penny

    2017-10-01

    Diagnostic errors are an underrecognized source of patient harm, and cardiovascular disease can be challenging to diagnose in the ambulatory setting. Although malpractice data can inform diagnostic error reduction efforts, no studies have examined outpatient cardiovascular malpractice cases in depth. A study was conducted to examine the characteristics of outpatient cardiovascular malpractice cases brought against general medicine practitioners. Some 3,407 closed malpractice claims in outpatient general medicine were analyzed from CRICO Strategies' Comparative Benchmarking System database, the largest detailed database of paid and unpaid malpractice claims in the world, and multivariate models were created to determine the factors that predicted case outcomes. Among the 153 patients in cardiovascular malpractice cases for whom comorbidities were coded, the majority (63%) had at least one traditional cardiac risk factor, such as diabetes, tobacco use, or previous cardiovascular disease. Compared to noncardiovascular cases, cardiovascular malpractice cases were more likely to involve an allegation of error in diagnosis (75% vs. 47%, p <0.0001), to have high clinical severity (86% vs. 49%, p <0.0001) and to result in death (75% vs. 27%, p <0.0001). Initial diagnoses of nonspecific chest pain and mimics of cardiovascular pain (for example, esophageal disease) were common and independently increased the likelihood of a claim resulting in a payment (p <0.01). Cardiovascular malpractice cases against outpatient general medicine physicians mostly occur in patients with conventional risk factors for coronary artery disease, who are often initially diagnosed with common mimics of cardiovascular pain. These findings suggest that such patients may be high-yield targets for preventing diagnostic errors in the ambulatory setting. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Effects of Correlated Errors on the Analysis of Space Geodetic Data

    NASA Technical Reports Server (NTRS)

    Romero-Wolf, Andres; Jacobs, C. S.

    2011-01-01

    As thermal errors are reduced, instrumental and troposphere correlated errors will become increasingly important. Work in progress shows that troposphere covariance error models improve data analysis results. We expect to see stronger effects with higher data rates. Temperature modeling of delay errors may further reduce temporal correlations in the data.

  16. Feasibility, strategy, methodology, and analysis of probe measurements in plasma under high gas pressure

    NASA Astrophysics Data System (ADS)

    Demidov, V. I.; Koepke, M. E.; Kurlyandskaya, I. P.; Malkov, M. A.

    2018-02-01

    This paper reviews existing theories for interpreting probe measurements of electron distribution functions (EDF) at high gas pressure, when collisions of electrons with atoms and/or molecules near the probe are pervasive. An explanation of whether or not the measurements are realizable and reliable, an enumeration of the most common sources of measurement error, and an outline of proper probe-experiment design elements that inherently limit or avoid error are presented. Additionally, we describe recently expanded plasma-condition compatibility for EDF measurement, including in applications of large wall-probe plasma diagnostics. This summary of the authors' experience gained over decades of practicing and developing probe diagnostics is intended to inform, guide, suggest, and detail the advantages and disadvantages of probe application in plasma research.

  17. Error field measurement, correction and heat flux balancing on Wendelstein 7-X

    DOE PAGES

    Lazerson, Samuel A.; Otte, Matthias; Jakubowski, Marcin; ...

    2017-03-10

    The measurement and correction of error fields in Wendelstein 7-X (W7-X) is critical to long-pulse, high-beta operation, as small error fields may cause overloading of divertor plates in some configurations. Accordingly, as part of a broad collaborative effort, the detection and correction of error fields on the W7-X experiment has been performed using the trim coil system in conjunction with the flux surface mapping diagnostic and a high-resolution infrared camera. In the early commissioning phase of the experiment, the trim coils were used to open an n/m = 1/2 island chain in a specially designed magnetic configuration. The flux surface mapping diagnostic was then able to directly image the magnetic topology of the experiment, allowing the inference of a small (~4 cm) intrinsic island chain. The suspected main sources of the error field, slight misalignments and deformations of the superconducting coils, are then confirmed through experimental modeling using the detailed measurements of the coil positions. Observations of the limiter temperatures in module 5 show a clear dependence of the limiter heat flux pattern as the perturbing fields are rotated. Plasma experiments without applied correcting fields show a significant asymmetry in neutral pressure (centered in module 4) and light emission (visible, H-alpha, CII, and CIII). This pressure asymmetry is associated with plasma-wall (limiter) interaction asymmetries between the modules. Application of trim coil fields with an n = 1 waveform corrects the imbalance. Confirmation of the error fields allows the assessment of magnetic fields which resonate with the n/m = 5/5 island chain.

  18. Errors in imaging of traumatic injuries.

    PubMed

    Scaglione, Mariano; Iaselli, Francesco; Sica, Giacomo; Feragalli, Beatrice; Nicola, Refky

    2015-10-01

    The advent of multi-detector computed tomography (MDCT) has drastically improved the outcomes of patients with multiple traumatic injuries. However, there are still diagnostic challenges to be considered. A missed or delayed diagnosis in trauma patients can sometimes be related to perceptual or other non-visual errors, while other errors are due to poor technique or poor image quality. In order to avoid serious complications, it is important for the practicing radiologist to be cognizant of the most common types of errors. The objective of this article is to review the various types of errors in the evaluation of patients with multiple trauma injuries or polytrauma with MDCT.

  19. Scoping a field experiment: error diagnostics of TRMM precipitation radar estimates in complex terrain as a basis for IPHEx2014

    NASA Astrophysics Data System (ADS)

    Duan, Y.; Wilson, A. M.; Barros, A. P.

    2014-10-01

    A diagnostic analysis of the space-time structure of error in Quantitative Precipitation Estimates (QPE) from the Precipitation Radar (PR) on the Tropical Rainfall Measurement Mission (TRMM) satellite is presented here in preparation for the Integrated Precipitation and Hydrology Experiment (IPHEx) in 2014. IPHEx is the first NASA ground-validation field campaign after the launch of the Global Precipitation Measurement (GPM) satellite. In anticipation of GPM, a science-grade high-density raingauge network was deployed at mid to high elevations in the Southern Appalachian Mountains, USA since 2007. This network allows for direct comparison between ground-based measurements from raingauges and satellite-based QPE (specifically, PR 2A25 V7 using 5 years of data 2008-2013). Case studies were conducted to characterize the vertical profiles of reflectivity and rain rate retrievals associated with large discrepancies with respect to ground measurements. The spatial and temporal distribution of detection errors (false alarm, FA, and missed detection, MD) and magnitude errors (underestimation, UND, and overestimation, OVR) for stratiform and convective precipitation are examined in detail toward elucidating the physical basis of retrieval error. The diagnostic error analysis reveals that detection errors are linked to persistent stratiform light rainfall in the Southern Appalachians, which explains the high occurrence of FAs throughout the year, as well as the diurnal MD maximum at midday in the cold season (fall and winter), and especially in the inner region. Although UND dominates the magnitude error budget, underestimation of heavy rainfall conditions accounts for less than 20% of the total consistent with regional hydrometeorology. The 2A25 V7 product underestimates low level orographic enhancement of rainfall associated with fog, cap clouds and cloud to cloud feeder-seeder interactions over ridges, and overestimates light rainfall in the valleys by large amounts, though this behavior is strongly conditioned by the coarse spatial resolution (5 km) of the terrain topography mask used to remove ground clutter effects. Precipitation associated with small-scale systems (< 25 km2) and isolated deep convection tends to be underestimated, which we attribute to non-uniform beam-filling effects due to spatial averaging of reflectivity at the PR resolution. Mixed precipitation events (i.e., cold fronts and snow showers) fall into OVR or FA categories, but these are also the types of events for which observations from standard ground-based raingauge networks are more likely subject to measurement uncertainty, that is raingauge underestimation errors due to under-catch and precipitation phase. Overall, the space-time structure of the errors shows strong links among precipitation, envelope orography, landform (ridge-valley contrasts), and local hydrometeorological regime that is strongly modulated by the diurnal cycle, pointing to three major error causes that are inter-related: (1) representation of concurrent vertically and horizontally varying microphysics; (2) non uniform beam filling (NUBF) effects and ambiguity in the detection of bright band position; and (3) spatial resolution and ground clutter correction.
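
    The error diagnostics above sort paired satellite and gauge values into detection errors (false alarm, missed detection) and magnitude errors (under- and overestimation). The snippet below is a minimal sketch of that categorization for illustration only; the 0.1 mm/h rain/no-rain threshold and the sample values are assumptions, not the IPHEx analysis.

```python
# Minimal sketch: classifying paired satellite and raingauge precipitation
# estimates into the FA / MD / UND / OVR categories discussed above.
import numpy as np

def classify_errors(sat, gauge, threshold=0.1):
    """Return counts of false alarms, missed detections, under- and overestimates."""
    sat = np.asarray(sat, dtype=float)
    gauge = np.asarray(gauge, dtype=float)
    sat_rain = sat >= threshold
    gauge_rain = gauge >= threshold
    fa = np.sum(sat_rain & ~gauge_rain)      # false alarm: satellite rains, gauge dry
    md = np.sum(~sat_rain & gauge_rain)      # missed detection: gauge rains, satellite dry
    both = sat_rain & gauge_rain
    und = np.sum(both & (sat < gauge))       # underestimation
    ovr = np.sum(both & (sat > gauge))       # overestimation
    return {"FA": int(fa), "MD": int(md), "UND": int(und), "OVR": int(ovr)}

sat   = [0.0, 0.5, 2.0, 0.3, 0.0, 4.0]
gauge = [0.4, 0.0, 3.5, 0.2, 0.0, 2.0]
print(classify_errors(sat, gauge))           # {'FA': 1, 'MD': 1, 'UND': 1, 'OVR': 2}
```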

  20. Scoping a field experiment: error diagnostics of TRMM precipitation radar estimates in complex terrain as a basis for IPHEx2014

    NASA Astrophysics Data System (ADS)

    Duan, Y.; Wilson, A. M.; Barros, A. P.

    2015-03-01

    A diagnostic analysis of the space-time structure of error in quantitative precipitation estimates (QPEs) from the precipitation radar (PR) on the Tropical Rainfall Measurement Mission (TRMM) satellite is presented here in preparation for the Integrated Precipitation and Hydrology Experiment (IPHEx) in 2014. IPHEx is the first NASA ground-validation field campaign after the launch of the Global Precipitation Measurement (GPM) satellite. In anticipation of GPM, a science-grade high-density raingauge network was deployed at mid to high elevations in the southern Appalachian Mountains, USA, since 2007. This network allows for direct comparison between ground-based measurements from raingauges and satellite-based QPE (specifically, PR 2A25 Version 7 using 5 years of data 2008-2013). Case studies were conducted to characterize the vertical profiles of reflectivity and rain rate retrievals associated with large discrepancies with respect to ground measurements. The spatial and temporal distribution of detection errors (false alarm, FA; missed detection, MD) and magnitude errors (underestimation, UND; overestimation, OVR) for stratiform and convective precipitation are examined in detail toward elucidating the physical basis of retrieval error. The diagnostic error analysis reveals that detection errors are linked to persistent stratiform light rainfall in the southern Appalachians, which explains the high occurrence of FAs throughout the year, as well as the diurnal MD maximum at midday in the cold season (fall and winter) and especially in the inner region. Although UND dominates the error budget, underestimation of heavy rainfall conditions accounts for less than 20% of the total, consistent with regional hydrometeorology. The 2A25 V7 product underestimates low-level orographic enhancement of rainfall associated with fog, cap clouds and cloud to cloud feeder-seeder interactions over ridges, and overestimates light rainfall in the valleys by large amounts, though this behavior is strongly conditioned by the coarse spatial resolution (5 km) of the topography mask used to remove ground-clutter effects. Precipitation associated with small-scale systems (< 25 km2) and isolated deep convection tends to be underestimated, which we attribute to non-uniform beam-filling effects due to spatial averaging of reflectivity at the PR resolution. Mixed precipitation events (i.e., cold fronts and snow showers) fall into OVR or FA categories, but these are also the types of events for which observations from standard ground-based raingauge networks are more likely subject to measurement uncertainty, that is raingauge underestimation errors due to undercatch and precipitation phase. Overall, the space-time structure of the errors shows strong links among precipitation, envelope orography, landform (ridge-valley contrasts), and a local hydrometeorological regime that is strongly modulated by the diurnal cycle, pointing to three major error causes that are inter-related: (1) representation of concurrent vertically and horizontally varying microphysics; (2) non-uniform beam filling (NUBF) effects and ambiguity in the detection of bright band position; and (3) spatial resolution and ground-clutter correction.

  1. Examining the cosmic acceleration with the latest Union2 supernova data

    NASA Astrophysics Data System (ADS)

    Li, Zhengxiang; Wu, Puxun; Yu, Hongwei

    2011-01-01

    In this Letter, by reconstructing the Om diagnostic and the deceleration parameter q from the latest Union2 Type Ia supernova sample, with and without the systematic error, along with the baryon acoustic oscillation (BAO) and cosmic microwave background (CMB) data, we study the cosmic expansion history using the Chevallier-Polarski-Linder (CPL) parametrization. We find that Union2+BAO favor an expansion with a decreasing acceleration at z<0.3. However, once the CMB data are added to the analysis, the cosmic acceleration is found to be still increasing, indicating a tension between the low-redshift and high-redshift data. To reduce this tension significantly, two different methods are considered and two different subsamples of Union2 are selected. We then find that the two subsamples+BAO+CMB give completely different results on the cosmic expansion history when the systematic error is ignored, with one suggesting a decreasing cosmic acceleration and the other just the opposite, although both of them, along with BAO, support that the cosmic acceleration is slowing down. However, once the systematic error is considered, the two subsamples of Union2 along with BAO and CMB all favor an increase of the present cosmic acceleration. Therefore, a clear-cut answer on whether the cosmic acceleration is slowing down calls for more consistent data and more reliable methods of analysis.

  2. Error-free pathology: applying lean production methods to anatomic pathology.

    PubMed

    Condel, Jennifer L; Sharbaugh, David T; Raab, Stephen S

    2004-12-01

    The current state of our health care system calls for dramatic changes. In their pathology department, the authors believe these changes may be accomplished by accepting the long-term commitment of applying a lean production system. The ideal state of zero pathology errors is one that should be pursued by consistently asking, "Why can't we?" The philosophy of lean production systems began in the manufacturing industry: "All we are doing is looking at the time from the moment the customer gives us an order to the point when we collect the cash. And we are reducing that time line by removing non-value added wastes". The ultimate goals in pathology and overall health care are not so different. The authors' intention is to provide the patient (customer) with the most accurate diagnostic information in a timely and efficient manner. Their lead histotechnologist recently summarized this philosophy: she indicated that she felt she could sleep better at night knowing she truly did the best job she could. Her chances of making an error (in cutting or labeling) were dramatically decreased in the one-by-one continuous flow work process compared with previous practices. By designing a system that enables employees to be successful in meeting customer demand, and by empowering the frontline staff in the development and problem solving processes, one can meet the challenges of eliminating waste and build an improved, efficient system.

  3. Quad-phased data mining modeling for dementia diagnosis.

    PubMed

    Bang, Sunjoo; Son, Sangjoon; Roh, Hyunwoong; Lee, Jihye; Bae, Sungyun; Lee, Kyungwon; Hong, Changhyung; Shin, Hyunjung

    2017-05-18

    The number of people with dementia is increasing along with the worldwide ageing of the population. Therefore, various studies aim to improve the dementia diagnosis process in the field of computer-aided diagnosis (CAD) technology. The most significant issue is that the evaluation processes by physicians, which are based on medical information about patients and questionnaires from their guardians, are time consuming, subjective and prone to error. This problem can be addressed by an overall data mining model that supports the intuitive decisions of clinicians. Therefore, in this paper we propose a quad-phased data mining model consisting of 4 modules. In the Proposer Module, significant diagnostic criteria that are effective for diagnosis are selected. Then, in the Predictor Module, a model is constructed to predict and diagnose dementia based on a machine learning algorithm. To help clinical physicians better understand the results of the predictive model, the Descriptor Module interprets the causes of the diagnoses by profiling patient groups. Lastly, the Visualization Module provides visualization to effectively explore the characteristics of patient groups. The proposed model is applied to the CREDOS study, which contains clinical data collected from 37 university-affiliated hospitals in the Republic of Korea from 2005 to 2013. The result is an intelligent system enabling intuitive collaboration between the CAD system and physicians, and the improved evaluation process can effectively reduce the time and cost incurred by clinicians and patients.
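
    As a rough illustration of the four-stage propose / predict / describe / visualize workflow, the sketch below chains feature selection, classification, and group profiling with scikit-learn. The surrogate data, feature count, and algorithm choices are assumptions for illustration; this is not the CREDOS implementation.

```python
# Illustrative four-stage sketch in the spirit of the quad-phased model above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.cluster import KMeans

# Surrogate data standing in for clinical diagnostic criteria.
X, y = make_classification(n_samples=400, n_features=20, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# 1) Proposer: select the most discriminative diagnostic criteria.
selector = SelectKBest(f_classif, k=6).fit(X_tr, y_tr)
X_tr_sel, X_te_sel = selector.transform(X_tr), selector.transform(X_te)

# 2) Predictor: train a classifier on the selected criteria.
clf = LogisticRegression(max_iter=1000).fit(X_tr_sel, y_tr)
print("held-out accuracy:", round(clf.score(X_te_sel, y_te), 3))

# 3) Descriptor: profile groups among predicted-positive patients.
positives = X_te_sel[clf.predict(X_te_sel) == 1]
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(positives)

# 4) Visualization stage would plot the group profiles; as a stand-in,
#    print each group's mean criterion values.
for g in np.unique(groups):
    print(f"group {g} mean profile:", np.round(positives[groups == g].mean(axis=0), 2))
```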

  4. Reactor protection system with automatic self-testing and diagnostic

    DOEpatents

    Gaubatz, Donald C.

    1996-01-01

    A reactor protection system having four divisions, with quad-redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on a need for reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 division vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Automatic detection of and discrimination against failed sensors allows the reactor protection system to automatically enter a known state when sensor failures occur. Cross-communication of sensor readings allows comparison of four theoretically "identical" values. This permits identification of sensor errors such as drift or malfunction. A diagnostic request for service is issued for errant sensor data. Automated self-test and diagnostic monitoring, from sensor input through output relay logic, virtually eliminate the need for manual surveillance testing. This provides an ability for each division to cross-check all divisions and to sense failures of the hardware logic.
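
    The two-level voting scheme described above (divisional 2-out-of-3-with-spare, then a 2-out-of-4 vote across divisions) can be sketched in a few lines. The trip setpoint, sensor values, and failed-sensor handling below are illustrative assumptions, not the patented implementation.

```python
# Minimal sketch of the two-level voting logic described in the record above.

def divisional_trip(readings, setpoint, failed):
    """2/3-with-spare vote: ignore sensors flagged as failed, trip if at
    least 2 of the remaining readings exceed the setpoint."""
    valid = [r for r, bad in zip(readings, failed) if not bad]
    exceed = sum(r > setpoint for r in valid)
    return exceed >= 2

def reactor_scram(division_trips):
    """Hardware logic panel: scram on a 2-out-of-4 division vote."""
    return sum(division_trips) >= 2

readings = [98.0, 103.0, 104.0, 55.0]         # one sensor reading low/errant
failed = [False, False, False, True]          # diagnostics flagged sensor 4 as failed
divisions = [divisional_trip(readings, setpoint=100.0, failed=failed)
             for _ in range(4)]               # all four divisions see the same shared data
print("scram:", reactor_scram(divisions))     # True: 2 of 3 valid sensors exceed the setpoint
```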

  5. Reactor protection system with automatic self-testing and diagnostic

    DOEpatents

    Gaubatz, D.C.

    1996-12-17

    A reactor protection system is disclosed having four divisions, with quad-redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on a need for reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 division vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Automatic detection of and discrimination against failed sensors allows the reactor protection system to automatically enter a known state when sensor failures occur. Cross-communication of sensor readings allows comparison of four theoretically "identical" values. This permits identification of sensor errors such as drift or malfunction. A diagnostic request for service is issued for errant sensor data. Automated self-test and diagnostic monitoring, from sensor input through output relay logic, virtually eliminate the need for manual surveillance testing. This provides an ability for each division to cross-check all divisions and to sense failures of the hardware logic. 16 figs.

  6. Using warnings to reduce categorical false memories in younger and older adults.

    PubMed

    Carmichael, Anna M; Gutchess, Angela H

    2016-07-01

    Warnings about memory errors can reduce their incidence, although past work has largely focused on associative memory errors. The current study sought to explore whether warnings could be tailored to specifically reduce false recall of categorical information in both younger and older populations. Before encoding word pairs designed to induce categorical false memories, half of the younger and older participants were warned to avoid committing these types of memory errors. Older adults who received a warning committed fewer categorical memory errors, as well as other types of semantic memory errors, than those who did not receive a warning. In contrast, young adults' memory errors did not differ for the warning versus no-warning groups. Our findings provide evidence for the effectiveness of warnings at reducing categorical memory errors in older adults, perhaps by supporting source monitoring, reduction in reliance on gist traces, or through effective metacognitive strategies.

  7. Google glass based immunochromatographic diagnostic test analysis

    NASA Astrophysics Data System (ADS)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
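
    The quantitative step described above amounts to fitting a calibration curve that maps measured test-line intensity to analyte concentration, then inverting it for new images. The sketch below shows that step only, with made-up intensity values; the actual Glass pipeline performs the analysis server-side and is considerably more involved.

```python
# Hedged sketch of the quantitative-readout step: intensity-to-concentration
# calibration for an RDT. Intensities below are fabricated for illustration.
import numpy as np

# Known PSA concentrations (ng/mL) and measured test-line intensities (a.u.)
conc = np.array([0.0, 12.5, 25.0, 50.0, 100.0, 200.0])
intensity = np.array([2.0, 10.0, 19.0, 36.0, 68.0, 131.0])

# Simple linear calibration (least squares); a 4-parameter logistic fit is
# common in practice when the response saturates.
slope, intercept = np.polyfit(conc, intensity, deg=1)

def concentration_from_intensity(i):
    return (i - intercept) / slope

new_intensity = 45.0
print(f"estimated PSA: {concentration_from_intensity(new_intensity):.1f} ng/mL")
```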

  8. Mid-infrared laser-absorption diagnostic for vapor-phase measurements in an evaporating n-decane aerosol

    NASA Astrophysics Data System (ADS)

    Porter, J. M.; Jeffries, J. B.; Hanson, R. K.

    2009-09-01

    A novel three-wavelength mid-infrared laser-based absorption/extinction diagnostic has been developed for simultaneous measurement of temperature and vapor-phase mole fraction in an evaporating hydrocarbon fuel aerosol (vapor and liquid droplets). The measurement technique was demonstrated for an n-decane aerosol with D50 ≈ 3 μm in steady and shock-heated flows with a measurement bandwidth of 125 kHz. Laser wavelengths were selected from FTIR measurements of the C-H stretching band of vapor and liquid n-decane near 3.4 μm (3000 cm⁻¹), and from modeled light scattering from droplets. Measurements were made for vapor mole fractions below 2.3 percent with errors less than 10 percent, and simultaneous temperature measurements over the range 300 K < T < 900 K were made with errors less than 3 percent. The measurement technique is designed to provide accurate values of temperature and vapor mole fraction in evaporating polydispersed aerosols with small mean diameters (D50 < 10 μm), where near-infrared laser-based scattering corrections are prone to error.

  9. Assimilation of surface NO2 and O3 observations into the SILAM chemistry transport model

    NASA Astrophysics Data System (ADS)

    Vira, J.; Sofiev, M.

    2014-08-01

    This paper describes assimilation of trace gas observations into the chemistry transport model SILAM using the 3D-Var method. Assimilation results for year 2012 are presented for the prominent photochemical pollutants ozone (O3) and nitrogen dioxide (NO2). Both species are covered by the Airbase observation database, which provides the observational dataset used in this study. Attention is paid to the background and observation error covariance matrices, which are obtained primarily by iterative application of a posteriori diagnostics. The diagnostics are computed separately for two months representing summer and winter conditions, and further disaggregated by time of day. This allows deriving background and observation error covariance definitions which include both seasonal and diurnal variation. The consistency of the obtained covariance matrices is verified using χ2 diagnostics. The analysis scores are computed for a control set of observation stations withheld from assimilation. Compared to a free-running model simulation, the correlation coefficient for daily maximum values is improved from 0.8 to 0.9 for O3 and from 0.53 to 0.63 for NO2.
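
    To make the 3D-Var machinery concrete, the sketch below performs a single analysis update of a tiny state vector and a χ² consistency check of the innovation statistics. The state size, covariance shapes, and observation operator are illustrative assumptions, not the SILAM configuration.

```python
# Bare-bones sketch of a single 3D-Var-style analysis update and a chi-square
# consistency diagnostic. All matrices here are toy examples.
import numpy as np

n, m = 5, 2                                  # state size, number of observations
xb = np.array([40., 42., 45., 50., 55.])     # background surface concentrations
H = np.zeros((m, n)); H[0, 1] = 1.0; H[1, 3] = 1.0   # observe grid points 1 and 3
y = np.array([38.0, 58.0])                   # observed values

# Background error covariance with exponential spatial correlation; diagonal R.
B = 4.0 * np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 2.0)
R = 1.0 * np.eye(m)

# Optimal (BLUE / 3D-Var) analysis: xa = xb + K (y - H xb), K = B H^T (H B H^T + R)^-1
innovation = y - H @ xb
S = H @ B @ H.T + R
K = B @ H.T @ np.linalg.inv(S)
xa = xb + K @ innovation
print("analysis:", np.round(xa, 2))

# Chi-square diagnostic: d^T S^-1 d should average ~m if B and R are consistent.
chi2 = innovation @ np.linalg.inv(S) @ innovation
print("chi2 / m:", round(chi2 / m, 2))
```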

  10. Research of Fast DAQ system in KSTAR Thomson scattering diagnostic

    NASA Astrophysics Data System (ADS)

    Lee, J. H.; Kim, H. J.; Yamada, I.; Funaba, H.; Kim, Y. G.; Kim, D. Y.

    2017-12-01

    The Thomson scattering diagnostic is one of the most important diagnostic systems in fusion plasma research. It provides reliable electron temperature and density profiles in magnetically confined plasma. A Q-switched Nd:YAG Thomson system was installed several years ago in KSTAR tokamak to measure the electron temperature and density profiles. For the KSTAR Thomson scattering system, a Charge-to-Digital Conversion (QDC) type data acquisition system was used to measure a pulse type Thomson signal. Recently, however, an error was found during the Te, ne calculation, because the QDC system had integrated the pulse Thomson signal that included a signal similar to stray light. To overcome such errors, we introduce a fast data acquisition (F-DAQ) system. To test this, we use CAEN V1742 5 GS/s, a Versa Module Eurocard Bus (VMEbus) type 12-bit switched capacitor digitizer with 32 channels. In this experiment, we compare the calculated Te results of Thomson scattering data measured simultaneously using QDC and F-DAQ. In the F-DAQ system, the shape of the pulse was restored by fitting.

  11. A malaria diagnostic tool based on computer vision screening and visualization of Plasmodium falciparum candidate areas in digitized blood smears.

    PubMed

    Linder, Nina; Turkki, Riku; Walliander, Margarita; Mårtensson, Andreas; Diwan, Vinod; Rahtu, Esa; Pietikäinen, Matti; Lundin, Mikael; Lundin, Johan

    2014-01-01

    Microscopy is the gold standard for diagnosis of malaria, however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and Scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. The method achieved a diagnostic sensitivity and specificity of 95% and 100% as well as 90% and 100% for the two readers respectively using the diagnostic tool. Parasitemia was separately calculated by the automated system and the correlation coefficient between manual and automated parasitemia counts was 0.97. We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for visual examination and has a potential to increase the throughput in malaria diagnostics.
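
    The core screening idea above is to score candidate regions with a classifier and present only the highest-probability panel to the reader. The sketch below mimics that ranking step with synthetic feature vectors standing in for the texture descriptors; it is a schematic illustration, not the published pipeline.

```python
# Schematic sketch: rank candidate regions by parasite probability with an SVM
# and keep a top-128 panel for visual review. Features and data are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

# Pretend feature vectors (e.g., texture histograms + local contrast) for
# labelled candidate regions: 1 = parasite, 0 = artefact/uninfected.
X_train = np.vstack([rng.normal(0.0, 1.0, (200, 16)),
                     rng.normal(1.0, 1.0, (200, 16))])
y_train = np.r_[np.zeros(200), np.ones(200)]

clf = make_pipeline(StandardScaler(), SVC(probability=True)).fit(X_train, y_train)

# New slide: score all candidate regions and keep the 128 most probable ones.
X_slide = rng.normal(0.5, 1.2, (5000, 16))
scores = clf.predict_proba(X_slide)[:, 1]
panel = np.argsort(scores)[::-1][:128]
print("highest candidate probability:", round(scores[panel[0]], 3))
```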

  12. Estimation of diagnostic test accuracy without full verification: a review of latent class methods

    PubMed Central

    Collins, John; Huynh, Minh

    2014-01-01

    The performance of a diagnostic test is best evaluated against a reference test that is without error. For many diseases, this is not possible, and an imperfect reference test must be used. However, diagnostic accuracy estimates may be biased if inaccurately verified status is used as the truth. Statistical models have been developed to handle this situation by treating disease as a latent variable. In this paper, we conduct a systematized review of statistical methods using latent class models for estimating test accuracy and disease prevalence in the absence of complete verification. PMID:24910172
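
    The bias that motivates these latent class methods is easy to demonstrate by simulation: when an imperfect reference test is treated as the truth, the apparent accuracy of an index test is distorted. The prevalence and test characteristics below are made-up numbers chosen only to show the effect.

```python
# Small simulation of the problem motivating latent class methods: apparent
# sensitivity/specificity of an index test measured against an imperfect
# reference differs from the true values. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n, prevalence = 100_000, 0.2
true_se, true_sp = 0.90, 0.95        # index test
ref_se, ref_sp = 0.85, 0.90          # imperfect reference test

disease = rng.random(n) < prevalence
index = np.where(disease, rng.random(n) < true_se, rng.random(n) >= true_sp)
ref = np.where(disease, rng.random(n) < ref_se, rng.random(n) >= ref_sp)

apparent_se = index[ref].mean()       # sensitivity measured against the reference
apparent_sp = (~index[~ref]).mean()   # specificity measured against the reference
print(f"true Se/Sp:     {true_se:.2f} / {true_sp:.2f}")
print(f"apparent Se/Sp: {apparent_se:.2f} / {apparent_sp:.2f}")   # biased estimates
```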

  13. Development and Evaluation of a Diagnostic Documentation Support System using Knowledge Processing

    NASA Astrophysics Data System (ADS)

    Makino, Kyoko; Hayakawa, Rumi; Terai, Koichi; Fukatsu, Hiroshi

    In this paper, we introduce a system that supports the creation of diagnostic reports. Diagnostic reports are documents in which radiologists describe the presence or absence of abnormalities in inspection images, such as CT and MRI, and summarize a patient's state and disease. Our system indicates insufficiencies in reports created by younger doctors, using knowledge processing based on a medical knowledge dictionary. These indications cover not only clerical errors: the system also analyzes the purpose of the inspection and determines whether a comparison with a former inspection is required, or whether there is any shortage in the description. We verified our system using actual data of 2,233 report pairs, each pair comprising a report written by a younger doctor and the check result of that report by an experienced doctor. The results of the verification showed that the string-analysis rules for detecting clerical errors and sentence wordiness obtained a recall of over 90% and a precision of over 75%. Moreover, the rules based on the medical knowledge dictionary for detecting a missing but required comparison with a former inspection, or a shortage in the description of the inspection purpose, obtained a recall of over 70%. From these results, we confirmed that our system contributes to improving the quality of diagnostic reports. We expect that our system can comprehensively support diagnostic documentation by cooperating with the interface that refers to inspection images or past reports.

  14. Impact of lossy compression on diagnostic accuracy of radiographs for periapical lesions

    NASA Technical Reports Server (NTRS)

    Eraso, Francisco E.; Analoui, Mostafa; Watson, Andrew B.; Rebeschini, Regina

    2002-01-01

    OBJECTIVES: The purpose of this study was to evaluate lossy Joint Photographic Experts Group (JPEG) compression for endodontic pretreatment digital radiographs. STUDY DESIGN: Fifty clinical charge-coupled device-based digital radiographs depicting periapical areas were selected. Each image was compressed at compression ratios of 2, 4, 8, 16, 32, 48, and 64. One root per image was marked for examination. Images were randomized and viewed by four clinical observers under standardized viewing conditions. Each observer read the image set three times, with at least two weeks between readings. Three pre-selected sites per image (mesial, distal, apical) were scored on a five-point confidence scale. A panel of three examiners scored the uncompressed images, producing a consensus score for each site. The consensus score was used as the baseline for assessing the impact of lossy compression on the diagnostic value of the images. The mean absolute error between consensus and observer scores was computed for each observer, site, and reading session. RESULTS: Balanced one-way analysis of variance for all observers indicated that, for compression ratios 48 and 64, there was a significant difference between the mean absolute error of uncompressed and compressed images (P <.05). After converting the five-point scores to two-level diagnostic values, diagnostic accuracy was strongly correlated (R(2) = 0.91) with the compression ratio. CONCLUSION: The results of this study suggest that high compression ratios can have a severe impact on the diagnostic quality of digital radiographs for the detection of periapical lesions.

  15. Mistake proofing: changing designs to reduce error

    PubMed Central

    Grout, J R

    2006-01-01

    Mistake proofing uses changes in the physical design of processes to reduce human error. It can be used to change designs in ways that prevent errors from occurring, to detect errors after they occur but before harm occurs, to allow processes to fail safely, or to alter the work environment to reduce the chance of errors. Effective mistake proofing design changes should initially be effective in reducing harm, be inexpensive, and easily implemented. Over time these design changes should make life easier and speed up the process. Ideally, the design changes should increase patients' and visitors' understanding of the process. These designs should themselves be mistake proofed and follow the good design practices of other disciplines. PMID:17142609

  16. The Syndrome of Catatonia

    PubMed Central

    Wilcox, James Allen; Reid Duffy, Pam

    2015-01-01

    Catatonia is a psychomotor syndrome which has historically been associated with schizophrenia. Many clinicians have thought that the prevalence of this condition has been decreasing over the past few decades. This review reminds clinicians that catatonia is not exclusively associated with schizophrenia and is still common in clinical practice. Many cases are related to affective disorders or are of an idiopathic nature. The illusion of reduced prevalence has been due to evolving diagnostic systems that failed to capture catatonic syndromes. This systematic error has remained unchallenged and has been potentiated by the failure to perform adequate neurological evaluations and catatonia screening exams on psychiatric patients. We find that current data support that catatonic syndromes are still common, often severe, and of continuing clinical importance. Effective treatment is relatively easy and can greatly reduce organ failure associated with prolonged psychomotor symptoms. Prompt identification and treatment can produce a robust improvement in most cases. The ongoing prevalence of this syndrome requires that psychiatrists recognize catatonia and its presentations, the range of associated etiologies, and the importance of timely treatment. PMID:26690229

  17. Using Healthcare Failure Mode and Effect Analysis to reduce medication errors in the process of drug prescription, validation and dispensing in hospitalised patients.

    PubMed

    Vélez-Díaz-Pallarés, Manuel; Delgado-Silveira, Eva; Carretero-Accame, María Emilia; Bermejo-Vicedo, Teresa

    2013-01-01

    To identify actions to reduce medication errors in the process of drug prescription, validation and dispensing, and to evaluate the impact of their implementation. A Health Care Failure Mode and Effect Analysis (HFMEA) was supported by a before-and-after medication error study to measure the actual impact on error rate after the implementation of corrective actions in the process of drug prescription, validation and dispensing in wards equipped with computerised physician order entry (CPOE) and unit-dose distribution system (788 beds out of 1080) in a Spanish university hospital. The error study was carried out by two observers who reviewed medication orders on a daily basis to register prescription errors by physicians and validation errors by pharmacists. Drugs dispensed in the unit-dose trolleys were reviewed for dispensing errors. Error rates were expressed as the number of errors for each process divided by the total opportunities for error in that process times 100. A reduction in prescription errors was achieved by providing training for prescribers on CPOE, updating prescription procedures, improving clinical decision support and automating the software connection to the hospital census (relative risk reduction (RRR), 22.0%; 95% CI 12.1% to 31.8%). Validation errors were reduced after optimising time spent in educating pharmacy residents on patient safety, developing standardised validation procedures and improving aspects of the software's database (RRR, 19.4%; 95% CI 2.3% to 36.5%). Two actions reduced dispensing errors: reorganising the process of filling trolleys and drawing up a protocol for drug pharmacy checking before delivery (RRR, 38.5%; 95% CI 14.1% to 62.9%). HFMEA facilitated the identification of actions aimed at reducing medication errors in a healthcare setting, as the implementation of several of these led to a reduction in errors in the process of drug prescription, validation and dispensing.
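
    The study summarizes each intervention's effect as a relative risk reduction (RRR) with a 95% confidence interval. As a small worked illustration of how such a summary is obtained from before/after error counts, the sketch below uses the Katz log method; the counts are invented examples, not the study data.

```python
# Illustrative computation of a relative risk reduction (RRR) and its 95% CI
# from before/after error counts. Counts below are made up.
import math

def rrr_with_ci(err_before, n_before, err_after, n_after, z=1.96):
    p1, p2 = err_before / n_before, err_after / n_after
    rr = p2 / p1
    # Katz log method for the CI of the relative risk
    se_log_rr = math.sqrt((1 - p1) / err_before + (1 - p2) / err_after)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    # RRR = 1 - RR; the CI bounds swap under the transformation
    return 1 - rr, 1 - hi, 1 - lo

rrr, lo, hi = rrr_with_ci(err_before=220, n_before=2000, err_after=172, n_after=2000)
print(f"RRR = {100*rrr:.1f}% (95% CI {100*lo:.1f}% to {100*hi:.1f}%)")
```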

  18. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Pesticide Factsheets

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.
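
    As a hedged illustration of that distinction, the sketch below computes one measure that is sensitive only to magnitude (error between sorted series), one that is sensitive only to sequence (rank correlation), and one combined measure (RMSE). These are generic examples of the idea, not the MPESA metrics themselves.

```python
# Generic goodness-of-fit sketch: magnitude-only, sequence-only, and combined errors.
import numpy as np

def magnitude_error(sim, obs):
    """RMSE after sorting both series: insensitive to timing/sequence."""
    return np.sqrt(np.mean((np.sort(sim) - np.sort(obs)) ** 2))

def sequence_agreement(sim, obs):
    """Spearman-style rank correlation: sensitive to sequence, not magnitude."""
    rs, ro = np.argsort(np.argsort(sim)), np.argsort(np.argsort(obs))
    return np.corrcoef(rs, ro)[0, 1]

def combined_error(sim, obs):
    """Plain RMSE: mixes magnitude and sequence errors."""
    return np.sqrt(np.mean((np.asarray(sim) - np.asarray(obs)) ** 2))

obs = np.array([1.0, 3.0, 7.0, 4.0, 2.0])
sim = np.array([1.2, 2.5, 6.0, 5.0, 2.2])
print(magnitude_error(sim, obs), sequence_agreement(sim, obs), combined_error(sim, obs))
```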

  19. Complex clinical reasoning in the critical care unit - difficulties, pitfalls and adaptive strategies.

    PubMed

    Shaw, M; Singh, S

    2015-04-01

    Diagnostic error has implications for both clinical outcome and resource utilisation, and may often be traced to impaired data gathering, processing or synthesis because of the influence of cognitive bias. Factors inherent to the intensive/acute care environment afford multiple additional opportunities for such errors to occur. This article illustrates many of these with reference to a case encountered on our intensive care unit. Strategies to improve completeness of data gathering, processing and synthesis in the acute care environment are critically appraised in the context of early detection and amelioration of cognitive bias. These include reflection, targeted simulation training and the integration of social media and IT based aids in complex diagnostic processes. A framework which can be quickly and easily employed in a variety of clinical environments is then presented. © 2015 John Wiley & Sons Ltd.

  20. The real malady of Marcel Proust and what it reveals about diagnostic errors in medicine.

    PubMed

    Douglas, Yellowlees

    2016-05-01

    Marcel Proust, author of À La Recherche du Temps Perdu, was considered a hypochondriac not only by the numerous specialists he consulted during his lifetime but also by every literary critic who ventured an opinion on his health, among them several clinicians. However, Proust's voluminous correspondence, as detailed in its attention to his every symptom as his novel, provides valuable clues to Proust's real, organic, and rare illness. Proust, in fact, was not only genuinely ill but far sicker than even he believed, most likely suffering from the vascular subtype of Ehlers-Danlos syndrome. Ironically, Proust's own doctors and his clinician-critics replicated the same kinds of diagnostic errors clinicians still routinely make today, shedding light on the plight of patients with rare illnesses. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Reliability assessment of fiber optic communication lines depending on external factors and diagnostic errors

    NASA Astrophysics Data System (ADS)

    Bogachkov, I. V.; Lutchenko, S. S.

    2018-05-01

    The article deals with a method for assessing the reliability of fiber optic communication lines (FOCL), taking into account the effect of optical fiber tension, the influence of temperature, and errors of the first kind (false alarms) of the built-in diagnostic equipment. Reliability is assessed in terms of the availability factor using the theory of Markov chains and probabilistic mathematical modeling. To obtain a mathematical model, the following steps are performed: the FOCL states are defined and validated; the state graph and system transitions are described; the system transitions of states that occur at a certain point are specified; and the real and observed times of system presence in the considered states are identified. From the permissible value of the availability factor, it is possible to determine the limiting frequency of FOCL maintenance.
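
    As a toy illustration of the availability-factor idea, the sketch below solves for the steady-state probabilities of a three-state continuous-time Markov chain in which the link is either operating, under repair after a real failure, or taken out of service by a diagnostic false alarm. The states and rates are illustrative assumptions, not the model from the article.

```python
# Toy sketch: availability factor as the steady-state probability of the
# "operating" state of a small continuous-time Markov chain. Rates are examples.
import numpy as np

# States: 0 = operating, 1 = repair after a real failure,
#         2 = unnecessary maintenance triggered by a false alarm (error of the first kind).
lam, mu = 1e-4, 1e-2        # failure and repair rates (1/h)
phi, rho = 5e-5, 5e-2       # false-alarm and inspection-clearing rates (1/h)

Q = np.array([[-(lam + phi), lam,  phi],
              [mu,          -mu,   0.0],
              [rho,          0.0, -rho]])

# Solve pi Q = 0 subject to sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"availability factor: {pi[0]:.5f}")
```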

  2. Quality Aware Compression of Electrocardiogram Using Principal Component Analysis.

    PubMed

    Gupta, Rajarshi

    2016-05-01

    Electrocardiogram (ECG) compression finds wide application in various patient monitoring purposes. Quality control in ECG compression ensures reconstruction quality and its clinical acceptance for diagnostic decision making. In this paper, a quality aware compression method of single lead ECG is described using principal component analysis (PCA). After pre-processing, beat extraction and PCA decomposition, two independent quality criteria, namely, bit rate control (BRC) or error control (EC) criteria were set to select optimal principal components, eigenvectors and their quantization level to achieve desired bit rate or error measure. The selected principal components and eigenvectors were finally compressed using a modified delta and Huffman encoder. The algorithms were validated with 32 sets of MIT Arrhythmia data and 60 normal and 30 sets of diagnostic ECG data from PTB Diagnostic ECG data ptbdb, all at 1 kHz sampling. For BRC with a CR threshold of 40, an average Compression Ratio (CR), percentage root mean squared difference normalized (PRDN) and maximum absolute error (MAE) of 50.74, 16.22 and 0.243 mV respectively were obtained. For EC with an upper limit of 5 % PRDN and 0.1 mV MAE, the average CR, PRDN and MAE of 9.48, 4.13 and 0.049 mV respectively were obtained. For mitdb data 117, the reconstruction quality could be preserved up to CR of 68.96 by extending the BRC threshold. The proposed method yields better results than recently published works on quality controlled ECG compression.
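
    The heart of the method above is projecting fixed-length beats onto a few principal components and reconstructing from them. The condensed sketch below shows that core step with synthetic beats and reports PRDN and a crude compression ratio; the preprocessing, quantization, and entropy coding stages of the published method are omitted, and all parameters are illustrative.

```python
# Condensed sketch of PCA-based beat compression with synthetic ECG-like beats.
import numpy as np

rng = np.random.default_rng(0)
L, n_beats = 300, 120
t = np.linspace(0, 1, L)
template = np.exp(-((t - 0.5) ** 2) / 0.002)                 # crude QRS-like shape
beats = template + 0.05 * rng.normal(size=(n_beats, L))      # beat matrix (n_beats x L)

mean = beats.mean(axis=0)
X = beats - mean
U, S, Vt = np.linalg.svd(X, full_matrices=False)             # PCA via SVD

k = 5                                                        # retained principal components
coeffs = X @ Vt[:k].T                                        # compressed representation
recon = coeffs @ Vt[:k] + mean

prdn = 100 * np.linalg.norm(beats - recon) / np.linalg.norm(beats - beats.mean())
cr = beats.size / (coeffs.size + Vt[:k].size + mean.size)    # crude compression ratio
print(f"PRDN = {prdn:.2f}%, CR ≈ {cr:.1f}")
```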

  3. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    NASA Astrophysics Data System (ADS)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, as it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on error correction data from the past. We find that, using these estimated error rates, the probability of error correction failures can be significantly reduced, by a factor that increases with the code distance.
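
    The monitoring idea, estimating a slowly drifting error rate from windowed error-correction statistics and smoothing or extrapolating it with Gaussian-process regression, can be sketched as below. The synthetic drift, window size, and kernel are assumptions for illustration; this is not the authors' protocol.

```python
# Hedged sketch: track a drifting error rate from windowed counts and
# smooth/extrapolate it with Gaussian-process regression (scikit-learn).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 40)[:, None]                  # time (arbitrary units)
true_rate = 0.01 + 0.004 * np.sin(0.6 * t.ravel())   # slowly drifting physical error rate

# Observed rates: fraction of flagged errors in N correction rounds per window.
N = 2000
observed = rng.binomial(N, true_rate) / N

kernel = RBF(length_scale=2.0) + WhiteKernel(noise_level=1e-6)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, observed)

t_future = np.array([[10.5], [11.0]])
pred, std = gp.predict(t_future, return_std=True)
for ti, p, s in zip(t_future.ravel(), pred, std):
    print(f"t = {ti:4.1f}: estimated error rate {p:.4f} ± {s:.4f}")
```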

  4. Automated drug dispensing system reduces medication errors in an intensive care setting.

    PubMed

    Chapuis, Claire; Roustit, Matthieu; Bal, Gaëlle; Schwebel, Carole; Pansu, Pascal; David-Tchouda, Sandra; Foroni, Luc; Calop, Jean; Timsit, Jean-François; Allenet, Benoît; Bosson, Jean-Luc; Bedouch, Pierrick

    2010-12-01

    We aimed to assess the impact of an automated dispensing system on the incidence of medication errors related to picking, preparation, and administration of drugs in a medical intensive care unit. We also evaluated the clinical significance of such errors and user satisfaction. Preintervention and postintervention study involving a control and an intervention medical intensive care unit. Two medical intensive care units in the same department of a 2,000-bed university hospital. Adult medical intensive care patients. After a 2-month observation period, we implemented an automated dispensing system in one of the units (study unit) chosen randomly, with the other unit being the control. The overall error rate was expressed as a percentage of total opportunities for error. The severity of errors was classified according to National Coordinating Council for Medication Error Reporting and Prevention categories by an expert committee. User satisfaction was assessed through self-administered questionnaires completed by nurses. A total of 1,476 medications for 115 patients were observed. After automated dispensing system implementation, we observed a reduced percentage of total opportunities for error in the study compared to the control unit (13.5% and 18.6%, respectively; p<.05); however, no significant difference was observed before automated dispensing system implementation (20.4% and 19.3%, respectively; not significant). Before-and-after comparisons in the study unit also showed a significantly reduced percentage of total opportunities for error (20.4% and 13.5%; p<.01). An analysis of detailed opportunities for error showed a significant impact of the automated dispensing system in reducing preparation errors (p<.05). Most errors caused no harm (National Coordinating Council for Medication Error Reporting and Prevention category C). The automated dispensing system did not reduce errors causing harm. Finally, the mean for working conditions improved from 1.0±0.8 to 2.5±0.8 on the four-point Likert scale. The implementation of an automated dispensing system reduced overall medication errors related to picking, preparation, and administration of drugs in the intensive care unit. Furthermore, most nurses favored the new drug dispensation organization.

  5. The diagnostic performance of reduced-dose CT for suspected appendicitis in paediatric and adult patients: A systematic review and diagnostic meta-analysis.

    PubMed

    Yoon, Hee Mang; Suh, Chong Hyun; Cho, Young Ah; Kim, Jeong Rye; Lee, Jin Seong; Jung, Ah Young; Kim, Jung Heon; Lee, Jeong-Yong; Kim, So Yeon

    2018-06-01

    To evaluate the diagnostic performance of reduced-dose CT for suspected appendicitis. A systematic search of the MEDLINE and EMBASE databases was carried out through to 10 January 2017. Studies evaluating the diagnostic performance of reduced-dose CT for suspected appendicitis in paediatric and adult patients were selected. Pooled summary estimates of sensitivity and specificity were calculated using hierarchical logistic regression modelling. Meta-regression was performed. Fourteen original articles with a total of 3,262 patients were included. For all studies using reduced-dose CT, the summary sensitivity was 96 % (95 % CI 93-98) with a summary specificity of 94 % (95 % CI 92-95). For the 11 studies providing a head-to-head comparison between reduced-dose CT and standard-dose CT, reduced-dose CT demonstrated a comparable summary sensitivity of 96 % (95 % CI 91-98) and specificity of 94 % (95 % CI 93-96) without any significant differences (p=.41). In meta-regression, there were no significant factors affecting the heterogeneity. The median effective radiation dose of the reduced-dose CT was 1.8 mSv (1.46-4.16 mSv), which was a 78 % reduction in effective radiation dose compared to the standard-dose CT. Reduced-dose CT shows excellent diagnostic performance for suspected appendicitis. • Reduced-dose CT shows excellent diagnostic performance for evaluating suspected appendicitis. • Reduced-dose CT has a comparable diagnostic performance to standard-dose CT. • Median effective radiation dose of reduced-dose CT was 1.8 mSv (1.46-4.16). • Reduced-dose CT achieved a 78 % dose reduction compared to standard-dose CT.

  6. Determining relative error bounds for the CVBEM

    USGS Publications Warehouse

    Hromadka, T.V.

    1985-01-01

    The Complex Variable Boundary Element Method (CVBEM) provides a measure of relative error which can be utilized to subsequently reduce the error or provide information for further modeling analysis. By maximizing the relative error norm on each boundary element, a bound on the total relative error for each boundary element can be evaluated. This bound can be utilized to test CVBEM convergence, to analyze the effect of additional boundary nodal points in reducing the modeling error, and to evaluate the sensitivity of the resulting modeling error within a boundary element to the error produced in another boundary element as a function of geometric distance. © 1985.

  7. Utilization of a CRT display light pen in the design of feedback control systems

    NASA Technical Reports Server (NTRS)

    Thompson, J. G.; Young, K. R.

    1972-01-01

    A hierarchical structure of the interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are considered which provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description, and analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.

  8. Effect of a limited-enforcement intelligent tutoring system in dermatopathology on student errors, goals and solution paths.

    PubMed

    Payne, Velma L; Medvedeva, Olga; Legowski, Elizabeth; Castine, Melissa; Tseytlin, Eugene; Jukic, Drazen; Crowley, Rebecca S

    2009-11-01

    Determine effects of a limited-enforcement intelligent tutoring system in dermatopathology on student errors, goals and solution paths. Determine if limited enforcement in a medical tutoring system inhibits students from learning the optimal and most efficient solution path. Describe the type of deviations from the optimal solution path that occur during tutoring, and how these deviations change over time. Determine if the size of the problem-space (domain scope), has an effect on learning gains when using a tutor with limited enforcement. Analyzed data mined from 44 pathology residents using SlideTutor-a Medical Intelligent Tutoring System in Dermatopathology that teaches histopathologic diagnosis and reporting skills based on commonly used diagnostic algorithms. Two subdomains were included in the study representing sub-algorithms of different sizes and complexities. Effects of the tutoring system on student errors, goal states and solution paths were determined. Students gradually increase the frequency of steps that match the tutoring system's expectation of expert performance. Frequency of errors gradually declines in all categories of error significance. Student performance frequently differs from the tutor-defined optimal path. However, as students continue to be tutored, they approach the optimal solution path. Performance in both subdomains was similar for both errors and goal differences. However, the rate at which students progress toward the optimal solution path differs between the two domains. Tutoring in superficial perivascular dermatitis, the larger and more complex domain was associated with a slower rate of approximation towards the optimal solution path. Students benefit from a limited-enforcement tutoring system that leverages diagnostic algorithms but does not prevent alternative strategies. Even with limited enforcement, students converge toward the optimal solution path.

  9. Reduction of Orifice-Induced Pressure Errors

    NASA Technical Reports Server (NTRS)

    Plentovich, Elizabeth B.; Gloss, Blair B.; Eves, John W.; Stack, John P.

    1987-01-01

    Use of porous-plug orifice reduces or eliminates errors, induced by orifice itself, in measuring static pressure on airfoil surface in wind-tunnel experiments. Piece of sintered metal press-fitted into static-pressure orifice so it matches surface contour of model. Porous material reduces orifice-induced pressure error associated with conventional orifice of same or smaller diameter. Also reduces or eliminates additional errors in pressure measurement caused by orifice imperfections. Provides more accurate measurements in regions with very thin boundary layers.

  10. Reducing errors benefits the field-based learning of a fundamental movement skill in children.

    PubMed

    Capio, C M; Poolton, J M; Sit, C H P; Holmstrom, M; Masters, R S W

    2013-03-01

    Proficient fundamental movement skills (FMS) are believed to form the basis of more complex movement patterns in sports. This study examined the development of the FMS of overhand throwing in children through either an error-reduced (ER) or error-strewn (ES) training program. Students (n = 216), aged 8-12 years (M = 9.16, SD = 0.96), practiced overhand throwing in either a program that reduced errors during practice (ER) or one that was ES. The ER program reduced errors by incrementally raising the task difficulty, while the ES program incrementally lowered the task difficulty. Process-oriented assessment of throwing movement form (Test of Gross Motor Development-2) and product-oriented assessment of throwing accuracy (absolute error) were performed. Changes in performance were examined among children in the upper and lower quartiles of the pretest throwing accuracy scores. ER training participants showed greater gains in movement form and accuracy, and performed throwing more effectively with a concurrent secondary cognitive task. Movement form improved among girls, while throwing accuracy improved among children with low ability. FMS training that reduced performance errors resulted in greater learning than a program that did not restrict errors. The reduced cognitive processing costs (effective dual-task performance) associated with such an approach suggest its potential benefits for children with developmental conditions. © 2011 John Wiley & Sons A/S.

  11. Information systems and human error in the lab.

    PubMed

    Bissell, Michael G

    2004-01-01

    Health system costs in clinical laboratories are incurred daily due to human error. Indeed, a major impetus for automating clinical laboratories has always been the opportunity it presents to simultaneously reduce cost and improve quality of operations by decreasing human error. But merely automating these processes is not enough. To the extent that the introduction of these systems results in operators having less practice in dealing with unexpected events, or becoming deskilled in problem solving, new kinds of error will likely appear. Clinical laboratories could potentially benefit by integrating findings on human error from modern behavioral science into their operations. Fully understanding human error requires a deep understanding of human information processing and cognition. Predicting and preventing negative consequences requires application of this understanding to laboratory operations. Although the occurrence of a particular error at a particular instant cannot be absolutely prevented, human error rates can be reduced. The following principles are key: an understanding of the process of learning in relation to error; understanding the origin of errors, since this knowledge can be used to reduce their occurrence; optimal systems should be forgiving to the operator by absorbing errors, at least for a time; although much is known by industrial psychologists about how to write operating procedures and instructions in ways that reduce the probability of error, this expertise is hardly ever put to use in the laboratory; and a feedback mechanism must be designed into the system that enables the operator to recognize in real time that an error has occurred.

  12. Errors as a Means of Reducing Impulsive Food Choice.

    PubMed

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2016-06-05

    Nowadays, the increasing incidence of eating disorders due to poor self-control has given rise to increased obesity and other chronic weight problems, and ultimately, to reduced life expectancy. The capacity to refrain from automatic responses is usually high in situations in which making errors is highly likely. The protocol described here aims at reducing imprudent preference in women during hypothetical intertemporal choices about appetitive food by associating it with errors. First, participants undergo an error task where two different edible stimuli are associated with two different error likelihoods (high and low). Second, they make intertemporal choices about the two edible stimuli, separately. As a result, this method decreases the discount rate for future amounts of the edible reward that cued higher error likelihood, selectively. This effect is under the influence of the self-reported hunger level. The present protocol demonstrates that errors, well known as motivationally salient events, can induce the recruitment of cognitive control, thus being ultimately useful in reducing impatient choices for edible commodities.

  13. Errors as a Means of Reducing Impulsive Food Choice

    PubMed Central

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2016-01-01

    Nowadays, the increasing incidence of eating disorders due to poor self-control has given rise to increased obesity and other chronic weight problems, and ultimately, to reduced life expectancy. The capacity to refrain from automatic responses is usually high in situations in which making errors is highly likely. The protocol described here aims at reducing imprudent preference in women during hypothetical intertemporal choices about appetitive food by associating it with errors. First, participants undergo an error task where two different edible stimuli are associated with two different error likelihoods (high and low). Second, they make intertemporal choices about the two edible stimuli, separately. As a result, this method decreases the discount rate for future amounts of the edible reward that cued higher error likelihood, selectively. This effect is under the influence of the self-reported hunger level. The present protocol demonstrates that errors, well known as motivationally salient events, can induce the recruitment of cognitive control, thus being ultimately useful in reducing impatient choices for edible commodities. PMID:27341281

  14. Unforced errors and error reduction in tennis

    PubMed Central

    Brody, H

    2006-01-01

    Only at the highest level of tennis is the number of winners comparable to the number of unforced errors. As the average player loses many more points due to unforced errors than due to winners by an opponent, if the rate of unforced errors can be reduced, it should lead to an increase in points won. This article shows how players can improve their game by understanding and applying the laws of physics to reduce the number of unforced errors. PMID:16632568

  15. Load Sharing Behavior of Star Gearing Reducer for Geared Turbofan Engine

    NASA Astrophysics Data System (ADS)

    Mo, Shuai; Zhang, Yidu; Wu, Qiong; Wang, Feiming; Matsumura, Shigeki; Houjoh, Haruo

    2017-07-01

    Load sharing behavior is very important for power-split gearing systems; the star gearing reducer, as a new and special type of transmission system, can be used in many industrial fields. However, there is little literature regarding the key multiple-split load sharing issue in the main gearbox used in the new type of geared turbofan engine. A further mechanism analysis is made of the load sharing behavior among the star gears of a star gearing reducer for a geared turbofan engine. A comprehensive meshing error analysis is conducted for the eccentricity error, gear thickness error, base pitch error, assembly error, and bearing error of the star gearing reducer. The floating meshing error resulting from meshing clearance variation, caused by the simultaneous floating of the sun gear and annular gear, is taken into account. A refined mathematical model for load sharing coefficient calculation is established in consideration of the different meshing stiffness and supporting stiffness of the components. The regular curves of the load sharing coefficient under the influence of interactions, single action, and single variation of various component errors are obtained, and the sensitivity of the load sharing coefficient to the different errors is determined. The load sharing coefficient of the star gearing reducer is 1.033 and the maximum meshing force on a gear tooth is about 3010 N. This paper provides theoretical evidence for optimal parameter design and proper tolerance distribution in the advanced development and manufacturing process, so as to achieve optimal effects in economy and technology.
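
    The refined load-sharing model in the paper couples component errors with meshing and supporting stiffness; that derivation is not reproduced here. The sketch below only illustrates the usual end point of such an analysis, the load sharing coefficient defined as the largest branch load divided by the ideal (mean) branch load, with forces chosen purely for illustration so that they are consistent with the values quoted in the abstract:

```python
import numpy as np

def load_sharing_coefficient(branch_forces):
    """Largest branch mesh force divided by the ideal (mean) branch force."""
    branch_forces = np.asarray(branch_forces, dtype=float)
    return branch_forces.max() / branch_forces.mean()

# Illustrative per-star-gear mesh forces [N] for one load step (not the paper's data)
forces = [3010.0, 2900.0, 2880.0, 2905.0, 2875.0]
print(round(load_sharing_coefficient(forces), 3))   # 1.033, i.e. a 3.3% overload on the worst branch
```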

  16. Advances toward fully automated in vivo assessment of oral epithelial dysplasia by nuclear endomicroscopy-A pilot study.

    PubMed

    Liese, Jan; Winter, Karsten; Glass, Änne; Bertolini, Julia; Kämmerer, Peer Wolfgang; Frerich, Bernhard; Schiefke, Ingolf; Remmerbach, Torsten W

    2017-11-01

    Uncertainties in detection of oral epithelial dysplasia (OED) frequently result from sampling error, especially in inflammatory oral lesions. Endomicroscopy allows non-invasive, "en face" imaging of upper oral epithelium, but parameters of OED are unknown. Mucosal nuclei were imaged in 34 toluidine blue-stained oral lesions with a commercial endomicroscopy system. Histopathological diagnosis placed four biopsies in the "dys-/neoplastic," 23 in the "inflammatory," and seven in the "others" disease groups. The strength of different assessment strategies of nuclear scoring, nuclear count, and automated nuclear analysis was measured by the area under the ROC curve (AUC) for identifying the histopathological "dys-/neoplastic" group. Nuclear objects from automated image analysis were visually corrected. The best-performing parameters of nuclear-to-image ratios were the count of large nuclei (AUC=0.986) and the 6-nearest neighborhood relation (AUC=0.896), and the best parameters of nuclear polymorphism were the count of atypical nuclei (AUC=0.996) and the compactness of nuclei (AUC=0.922). Excluding low-grade OED, nuclear scoring and count reached 100% sensitivity and 98% specificity for detection of dys-/neoplastic lesions. In automated analysis, combination of parameters enhanced diagnostic strength. Sensitivity of 100% and specificity of 87% were seen for distances of 6-nearest neighbors and aspect ratios even in uncorrected objects. Correction improved measures of nuclear polymorphism only. The hue of the background color was a stronger predictor of the dys-/neoplastic group than nuclear density (AUC=0.779 vs 0.687), indicating that the macroscopic appearance is biased. Nuclear-to-image ratios are applicable for automated optical in vivo diagnostics for oral potentially malignant disorders. Nuclear endomicroscopy may promote non-invasive, early detection of dys-/neoplastic lesions by reducing sampling error. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. A comparison of endoscopic localization error rate between operating surgeons and referring endoscopists in colorectal cancer.

    PubMed

    Azin, Arash; Saleh, Fady; Cleghorn, Michelle; Yuen, Andrew; Jackson, Timothy; Okrainec, Allan; Quereshy, Fayez A

    2017-03-01

    Colonoscopy for colorectal cancer (CRC) has a localization error rate as high as 21 %. Such errors can have substantial clinical consequences, particularly in laparoscopic surgery. The primary objective of this study was to compare accuracy of tumor localization at initial endoscopy performed by either the operating surgeon or a non-operating referring endoscopist. All patients who underwent surgical resection for CRC at a large tertiary academic hospital between January 2006 and August 2014 were identified. The exposure of interest was the initial endoscopist: (1) the surgeon who also performed the definitive operation (operating surgeon group); and (2) the referring gastroenterologist or general surgeon (referring endoscopist group). The outcome measure was localization error, defined as a difference of at least one anatomic segment between initial endoscopy and final operative location. Multivariate logistic regression was used to explore the association between localization error rate and the initial endoscopist. A total of 557 patients were included in the study; 81 patients in the operating surgeon cohort and 476 patients in the referring endoscopist cohort. Initial diagnostic colonoscopy performed by the operating surgeon, compared to the referring endoscopist, demonstrated a statistically significantly lower intraoperative localization error rate (1.2 vs. 9.0 %, P = 0.016); a shorter mean time from endoscopy to surgery (52.3 vs. 76.4 days, P = 0.015); a higher tattoo localization rate (32.1 vs. 21.0 %, P = 0.027); and a lower preoperative repeat endoscopy rate (8.6 vs. 40.8 %, P < 0.001). Initial endoscopy performed by the operating surgeon was protective against localization error on both univariate analysis, OR 7.94 (95 % CI 1.08-58.52; P = 0.016), and multivariate analysis, OR 7.97 (95 % CI 1.07-59.38; P = 0.043). This study demonstrates that diagnostic colonoscopies performed by an operating surgeon are independently associated with a lower localization error rate. Further research exploring the factors influencing localization accuracy and why operating surgeons have lower error rates relative to non-operating endoscopists is necessary to understand differences in care.

  18. Benchmarking observational uncertainties for hydrology (Invited)

    NASA Astrophysics Data System (ADS)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has become more common for hydrologists to use multiple data types and sources within a single study. This may be driven by complex water management questions which integrate water quantity, quality and ecology; or by recognition of the value of auxiliary data to understand hydrological processes. We discuss briefly the impact of data uncertainty on the increasingly popular use of diagnostic signatures for hydrological process understanding and model development.

  19. Twice cutting method reduces tibial cutting error in unicompartmental knee arthroplasty.

    PubMed

    Inui, Hiroshi; Taketomi, Shuji; Yamagami, Ryota; Sanada, Takaki; Tanaka, Sakae

    2016-01-01

    Bone cutting error can be one of the causes of malalignment in unicompartmental knee arthroplasty (UKA). The amount of cutting error in total knee arthroplasty has been reported. However, none have investigated cutting error in UKA. The purpose of this study was to reveal the amount of cutting error in UKA when an open cutting guide was used and to clarify whether cutting the tibia horizontally twice using the same cutting guide reduced the cutting errors in UKA. We measured the alignment of the tibial cutting guides, the first-cut cutting surfaces and the second-cut cutting surfaces using the navigation system in 50 UKAs. Cutting error was defined as the angular difference between the cutting guide and cutting surface. The mean absolute first-cut cutting error was 1.9° (1.1° varus) in the coronal plane and 1.1° (0.6° anterior slope) in the sagittal plane, whereas the mean absolute second-cut cutting error was 1.1° (0.6° varus) in the coronal plane and 1.1° (0.4° anterior slope) in the sagittal plane. Cutting the tibia horizontally twice reduced the cutting errors in the coronal plane significantly (P<0.05). Our study demonstrated that in UKA, cutting the tibia horizontally twice using the same cutting guide reduced cutting error in the coronal plane. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. An educational and audit tool to reduce prescribing error in intensive care.

    PubMed

    Thomas, A N; Boxall, E M; Laha, S K; Day, A J; Grundy, D

    2008-10-01

    To reduce prescribing errors in an intensive care unit by providing prescriber education in tutorials, ward-based teaching and feedback in 3-monthly cycles with each new group of trainee medical staff. Prescribing audits were conducted three times in each 3-month cycle, once pretraining, once post-training and a final audit after 6 weeks. The audit information was fed back to prescribers with their correct prescribing rates, rates for individual error types and total error rates together with anonymised information about other prescribers' error rates. The percentage of prescriptions with errors decreased over each 3-month cycle (pretraining 25%, 19%, (one missing data point), post-training 23%, 6%, 11%, final audit 7%, 3%, 5% (p<0.0005)). The total number of prescriptions and error rates varied widely between trainees (data collection one; cycle two: range of prescriptions written: 1-61, median 18; error rate: 0-100%; median: 15%). Prescriber education and feedback reduce manual prescribing errors in intensive care.

  1. Uncertainty vs. Information (Invited)

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  2. Exploring Reactions to Pilot Reliability Certification and Changing Attitudes on the Reduction of Errors

    ERIC Educational Resources Information Center

    Boedigheimer, Dan

    2010-01-01

    Approximately 70% of aviation accidents are attributable to human error. The greatest opportunity for further improving aviation safety is found in reducing human errors in the cockpit. The purpose of this quasi-experimental, mixed-method research was to evaluate whether there was a difference in pilot attitudes toward reducing human error in the…

  3. Quality of referral: What information should be included in a request for diagnostic imaging when a patient is referred to a clinical radiologist?

    PubMed

    G Pitman, Alexander

    2017-06-01

    Referral to a clinical radiologist is the prime means of communication between the referrer and the radiologist. Current Australian and New Zealand government regulations do not prescribe what clinical information should be included in a referral. This work presents a qualitative compilation of clinical radiologist opinion, relevant professional recommendations, governmental regulatory positions and prior work on diagnostic error to synthesise recommendations on what clinical information should be included in a referral. Recommended requirements on what clinical information should be included in a referral to a clinical radiologist are as follows: an unambiguous referral; identity of the patient; identity of the referrer; and sufficient clinical detail to justify performance of the diagnostic imaging examination and to confirm appropriate choice of the examination and modality. Recommended guideline on the content of clinical detail clarifies when the information provided in a referral meets these requirements. High-quality information provided in a referral allows the clinical radiologist to ensure that exposure of patients to medical radiation is justified. It also minimises the incidence of perceptual and interpretational diagnostic error. Recommended requirements and guideline on the clinical detail to be provided in a referral to a clinical radiologist have been formulated for professional debate and adoption. © 2017 The Royal Australian and New Zealand College of Radiologists.

  4. Intelligent monitoring of critical pathological events during anesthesia.

    PubMed

    Gohil, Bhupendra; Gholamhhosseini, Hamid; Harrison, Michael J; Lowe, Andrew; Al-Jumaily, Ahmed

    2007-01-01

    Expert algorithms in the field of intelligent patient monitoring have rapidly revolutionized patient care, thereby improving patient safety. Patient monitoring during anesthesia requires cautious attention by anesthetists, who are monitoring many modalities, diagnosing clinically critical events and performing patient management tasks simultaneously. The mishaps that occur during day-to-day anesthesia and cause disastrous errors in anesthesia administration were classified and studied by Reason [1]. Human errors in anesthesia account for 82% of the preventable mishaps [2]. The aim of this paper is to develop a clinically useful diagnostic alarm system for detecting critical events during anesthesia administration. The development of an expert diagnostic alarm system called 'RT-SAAM' for detecting critical pathological events in the operating theatre is presented. This system provides decision support to the anesthetist by presenting the diagnostic results on an integrative, ergonomic display, thus enhancing patient safety. The performance of the system was validated through a series of offline and real-time tests in the operating theatre. When detecting absolute hypovolaemia (AHV), a moderate level of agreement was observed between RT-SAAM and the human expert (anesthetist) during surgical procedures. RT-SAAM is a clinically useful diagnostic tool which can be easily modified for diagnosing additional critical pathological events like relative hypovolaemia, fall in cardiac output, sympathetic response and malignant hyperpyrexia during surgical procedures. RT-SAAM is currently being tested at the Auckland City Hospital with ethical approval from the local ethics committees.

  5. The most common mistakes on dermatoscopy of melanocytic lesions

    PubMed Central

    Kamińska-Winciorek, Grażyna

    2015-01-01

    Dermatoscopy is a method of in vivo evaluation of the structures within the epidermis and dermis. Currently, it may be the most precise pre-surgical method of diagnosing melanocytic lesions. Diagnostic errors may result in unnecessary removal of benign lesions or what is even worse, they can cause early and very early melanomas to be overlooked. Errors in assessment of dermatoscopy can be divided into those arising from failure to maintain proper test procedures (procedural and technical errors) and knowledge based mistakes related to the lack of sufficient familiarity and experience in dermatoscopy. The article discusses the most common mistakes made by beginner or inexperienced dermatoscopists. PMID:25821425

  6. Radiologic Errors in Patients With Lung Cancer

    PubMed Central

    Forrest, John V.; Friedman, Paul J.

    1981-01-01

    Some 20 percent to 50 percent of detectable malignant lesions are missed or misdiagnosed at the time of their first radiologic appearance. These errors can result in delayed diagnosis and treatment, which may affect a patient's survival. Use of moderately high (130 to 150) kilovolt peak films, awareness of portions of the lung where lesions are often missed (such as lung apices and paramediastinal and hilar areas), careful comparison of current roentgenograms with those taken previously and the use of an independent second observer can help to minimize the rate of radiologic diagnostic errors in patients with lung cancer. PMID:7257363

  7. Benign phyllodes tumor with tubular adenoma-like epithelial component in FNAC: A diagnostic pitfall.

    PubMed

    Panda, Kishori M

    2016-01-01

    Benign phyllodes tumor (BPT) is a biphasic neoplasm composed of bland stromal and epithelial elements. Although cytologic diagnostic criteria of BPT are documented in the literature, diagnostic pitfalls in fine-needle aspiration cytology (FNAC) may occur due to sampling error, high cellularity, ductal hyperplasia, paucity of the stromal component, and occasional dissociation of epithelial cells. Here, we describe a case of BPT diagnosed by histology in a 19-year-old female, in which FNAC features were inconclusive due to paucity of the stromal component, predominance of a tubular adenoma-like epithelial component, and the presence of other features overlapping with fibroadenoma.

  8. Rapid acquisition of magnetic resonance imaging of the shoulder using three-dimensional fast spin echo sequence with compressed sensing.

    PubMed

    Lee, Seung Hyun; Lee, Young Han; Song, Ho-Taek; Suh, Jin-Suck

    2017-10-01

    To evaluate the feasibility of 3D fast spin-echo (FSE) imaging with compressed sensing (CS) for assessment of the shoulder. Twenty-nine patients who underwent shoulder MRI, including image sets of an axial 3D-FSE sequence without CS and with CS (acceleration factor of 1.5), were included. Quantitative assessment was performed by calculating the root mean square error (RMSE) and structural similarity index (SSIM). Two musculoskeletal radiologists compared the image quality of the 3D-FSE sequences without CS and with CS, and scored the qualitative agreement between sequences using a five-point scale. Diagnostic agreement for pathologic shoulder lesions between the two sequences was evaluated. The acquisition time of 3D-FSE MRI was reduced using CS (3min 23s vs. 2min 22s). Quantitative evaluations showed a significant correlation between the two sequences (r=0.872-0.993, p<0.05) and SSIM was in an acceptable range (0.940-0.993; mean±standard deviation, 0.968±0.018). Qualitative image quality showed good to excellent agreement between 3D-FSE images without CS and with CS. Diagnostic agreement for pathologic shoulder lesions between the two sequences was very good (κ=0.915-1). The 3D-FSE sequence with CS is feasible for evaluating the shoulder joint, with reduced scan time compared to 3D-FSE without CS. Copyright © 2017 Elsevier Inc. All rights reserved.
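
    A minimal sketch of the quantitative comparison described above, computing RMSE and SSIM between paired slices with and without compressed sensing. scikit-image's structural_similarity is used here as a stand-in for whatever implementation the study employed, and the synthetic data are purely illustrative:

```python
import numpy as np
from skimage.metrics import structural_similarity

def compare_slices(img_no_cs, img_cs):
    """RMSE and SSIM between a reference 3D-FSE slice and its compressed-sensing counterpart.

    img_no_cs, img_cs : 2D numpy arrays of the same shape and intensity scale.
    """
    diff = img_cs.astype(float) - img_no_cs.astype(float)
    rmse = np.sqrt(np.mean(diff ** 2))
    data_range = float(img_no_cs.max() - img_no_cs.min())
    ssim = structural_similarity(img_no_cs, img_cs, data_range=data_range)
    return rmse, ssim

# Synthetic stand-in for a paired shoulder MRI slice (illustration only)
rng = np.random.default_rng(0)
ref = rng.normal(100.0, 20.0, size=(128, 128))
accel = ref + rng.normal(0.0, 5.0, size=ref.shape)   # mimics mild CS reconstruction error
print(compare_slices(ref, accel))
```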

  9. The Systematics of Strong Lens Modeling Quantified: The Effects of Constraint Selection and Redshift Information on Magnification, Mass, and Multiple Image Predictability

    NASA Astrophysics Data System (ADS)

    Johnson, Traci L.; Sharon, Keren

    2016-11-01

    Until now, systematic errors in strong gravitational lens modeling have been acknowledged but have never been fully quantified. Here, we launch an investigation into the systematics induced by constraint selection. We model the simulated cluster Ares 362 times using random selections of image systems with and without spectroscopic redshifts and quantify the systematics using several diagnostics: image predictability, accuracy of model-predicted redshifts, enclosed mass, and magnification. We find that for models with >15 image systems, the image plane rms does not decrease significantly when more systems are added; however, the rms values quoted in the literature may be misleading as to the ability of a model to predict new multiple images. The mass is well constrained near the Einstein radius in all cases, and systematic error drops to <2% for models using >10 image systems. Magnification errors are smallest along the straight portions of the critical curve, and the value of the magnification is systematically lower near curved portions. For >15 systems, the systematic error on magnification is ∼2%. We report no trend in magnification error with the fraction of spectroscopic image systems when selecting constraints at random; however, when using the same selection of constraints, increasing this fraction up to ∼0.5 will increase model accuracy. The results suggest that the selection of constraints, rather than quantity alone, determines the accuracy of the magnification. We note that spectroscopic follow-up of at least a few image systems is crucial because models without any spectroscopic redshifts are inaccurate across all of our diagnostics.

  10. Next Generation Quality: Assessing the Physician in Clinical History Completeness and Diagnostic Interpretations Using Funnel Plots and Normalized Deviations Plots in 3,854 Prostate Biopsies.

    PubMed

    Bonert, Michael; El-Shinnawy, Ihab; Carvalho, Michael; Williams, Phillip; Salama, Samih; Tang, Damu; Kapoor, Anil

    2017-01-01

    Observational data and funnel plots are routinely used outside of pathology to understand trends and improve performance. To extract diagnostic rate (DR) information from free-text surgical pathology reports with synoptic elements, and to assess whether inter-rater variation and clinical history completeness information useful for continuous quality improvement (CQI) can be obtained. All in-house prostate biopsies in a 6-year period at two large teaching hospitals were extracted and then diagnostically categorized using string matching, fuzzy string matching, and hierarchical pruning. DRs were then stratified by the submitting physicians and pathologists. Funnel plots were created to assess for diagnostic bias. 3,854 prostate biopsies were found and all could be diagnostically classified. Two audits involving the review of 700 reports and a comparison of the synoptic elements with the free-text interpretations suggest a categorization error rate of <1%. Twenty-seven pathologists each read >40 cases and together assessed 3,690 biopsies. There was considerable inter-rater variability and a trend toward more World Health Organization/International Society of Urologic Pathology Grade 1 cancers in older pathologists. Normalized deviations plots, constructed using the median DR and standard error, can elucidate associated over- and under-calls for an individual pathologist in relation to their practice group. Clinical history completeness by submitting medical doctor varied significantly (100% to 22%). Free-text data analyses have some limitations; however, they could be used for data-driven CQI in anatomical pathology, and could lead to the next generation in quality of care.
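
    Funnel-plot control limits for a proportion-type diagnostic rate are commonly drawn as the pooled rate plus or minus a multiple of its binomial standard error at each case volume; a hedged sketch of that common construction (the paper's exact formulation may differ):

```python
import numpy as np

def funnel_limits(pooled_rate, volumes, z=1.96):
    """Approximate 95% funnel-plot control limits for a diagnostic rate versus case volume.

    pooled_rate : overall diagnostic rate across all pathologists (the funnel's centre line)
    volumes     : case counts per pathologist
    """
    volumes = np.asarray(volumes, dtype=float)
    se = np.sqrt(pooled_rate * (1.0 - pooled_rate) / volumes)   # binomial standard error
    return pooled_rate - z * se, pooled_rate + z * se

# Illustrative only: a 30% overall rate and a range of per-pathologist volumes
lower, upper = funnel_limits(0.30, [40, 80, 150, 300])
print(np.round(lower, 3), np.round(upper, 3))   # limits narrow as case volume increases
```

    Points falling outside these limits flag pathologists whose diagnostic rates deviate more than chance alone would explain at their case volume.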

  11. Diagnosis support system based on clinical guidelines: comparison between case-based fuzzy cognitive maps and Bayesian networks.

    PubMed

    Douali, Nassim; Csaba, Huszka; De Roo, Jos; Papageorgiou, Elpiniki I; Jaulent, Marie-Christine

    2014-01-01

    Several studies have described the prevalence and severity of diagnostic errors. Diagnostic errors can arise from cognitive, training, educational and other issues. Examples of cognitive issues include flawed reasoning, incomplete knowledge, faulty information gathering or interpretation, and inappropriate use of decision-making heuristics. We describe a new approach, case-based fuzzy cognitive maps, for medical diagnosis and evaluate it by comparison with Bayesian belief networks. We created a semantic web framework that supports the two reasoning methods. We used a database of 174 anonymous patients from several European hospitals: 80 of the patients were female and 94 male, with an average age of 45±16 years (mean±SD). Thirty of the 80 female patients were pregnant. For each patient, signs/symptoms/observables/age/sex were taken into account by the system. We used a statistical approach to compare the two methods. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Case-based clinical reasoning in feline medicine: 2: Managing cognitive error.

    PubMed

    Canfield, Paul J; Whitehead, Martin L; Johnson, Robert; O'Brien, Carolyn R; Malik, Richard

    2016-03-01

    This is Article 2 of a three-part series on clinical reasoning that encourages practitioners to explore and understand how they think and make case-based decisions. It is hoped that, in the process, they will learn to trust their intuition but, at the same time, put in place safeguards to diminish the impact of bias and misguided logic on their diagnostic decision-making. Article 1, published in the January 2016 issue of JFMS, discussed the relative merits and shortcomings of System 1 thinking (immediate and unconscious) and System 2 thinking (effortful and analytical). This second article examines ways of managing cognitive error, particularly the negative impact of bias, when making a diagnosis. Article 3, to appear in the May 2016 issue, explores the use of heuristics (mental short cuts) and illness scripts in diagnostic reasoning. © The Author(s) 2016.

  13. Two-Photon Laser-Induced Fluorescence O and N Atoms for the Study of Heterogeneous Catalysis in a Diffusion Reactor

    NASA Technical Reports Server (NTRS)

    Pallix, Joan B.; Copeland, Richard A.; Arnold, James O. (Technical Monitor)

    1995-01-01

    Advanced laser-based diagnostics have been developed to examine catalytic effects and atom/surface interactions on thermal protection materials. This study establishes the feasibility of using laser-induced fluorescence for detection of O and N atom loss in a diffusion tube to measure surface catalytic activity. The experimental apparatus is versatile in that it allows fluorescence detection to be used for measuring species selective recombination coefficients as well as diffusion tube and microwave discharge diagnostics. Many of the potential sources of error in measuring atom recombination coefficients by this method have been identified and taken into account. These include scattered light, detector saturation, sample surface cleanliness, reactor design, gas pressure and composition, and selectivity of the laser probe. Recombination coefficients and their associated errors are reported for N and O atoms on a quartz surface at room temperature.

  14. OSA severity assessment based on sleep breathing analysis using ambient microphone.

    PubMed

    Dafna, E; Tarasiuk, A; Zigel, Y

    2013-01-01

    In this paper, an audio-based system for severity estimation of obstructive sleep apnea (OSA) is proposed. The system estimates the apnea-hypopnea index (AHI), which is the average number of apneic events per hour of sleep. This system is based on a Gaussian mixture regression algorithm that was trained and validated on full-night audio recordings. A feature selection process using a genetic algorithm was applied to select the best features extracted from the time and spectral domains. A total of 155 subjects referred for an in-laboratory polysomnography (PSG) study were recruited. Using the PSG-derived AHI score as a gold standard, the performance of the proposed system was evaluated using Pearson correlation, AHI error, and diagnostic agreement methods. A correlation of R=0.89, an AHI error of 7.35 events/hr, and a diagnostic agreement of 77.3% were achieved, showing encouraging performance and a reliable non-contact alternative method for OSA severity estimation.
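
    A hedged sketch of the evaluation side only (Pearson correlation, mean absolute AHI error, and diagnostic agreement over the standard AHI severity bins); the Gaussian mixture regression estimator itself is not reproduced, and all numbers below are illustrative:

```python
import numpy as np
from scipy.stats import pearsonr

def ahi_category(ahi):
    """Standard AHI severity bins: normal <5, mild 5-15, moderate 15-30, severe >=30 events/hr."""
    return np.digitize(ahi, [5, 15, 30])

def evaluate(ahi_psg, ahi_audio):
    ahi_psg = np.asarray(ahi_psg, dtype=float)
    ahi_audio = np.asarray(ahi_audio, dtype=float)
    r, _ = pearsonr(ahi_psg, ahi_audio)                         # Pearson correlation
    mae = np.mean(np.abs(ahi_psg - ahi_audio))                  # "AHI error" in events/hr
    agreement = np.mean(ahi_category(ahi_psg) == ahi_category(ahi_audio))
    return r, mae, agreement

# Illustrative values only (the study reports R=0.89, error 7.35 events/hr, agreement 77.3%)
psg = [2, 8, 12, 22, 35, 48, 60]
audio = [4, 6, 15, 18, 41, 45, 55]
print(evaluate(psg, audio))
```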

  15. Fully automatic registration and segmentation of first-pass myocardial perfusion MR image sequences.

    PubMed

    Gupta, Vikas; Hendriks, Emile A; Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F

    2010-11-01

    Derivation of diagnostically relevant parameters from first-pass myocardial perfusion magnetic resonance images involves the tedious and time-consuming manual segmentation of the myocardium in a large number of images. To reduce the manual interaction and expedite the perfusion analysis, we propose an automatic registration and segmentation method for the derivation of perfusion linked parameters. A complete automation was accomplished by first registering misaligned images using a method based on independent component analysis, and then using the registered data to automatically segment the myocardium with active appearance models. We used 18 perfusion studies (100 images per study) for validation in which the automatically obtained (AO) contours were compared with expert drawn contours on the basis of point-to-curve error, Dice index, and relative perfusion upslope in the myocardium. Visual inspection revealed successful segmentation in 15 out of 18 studies. Comparison of the AO contours with expert drawn contours yielded 2.23 ± 0.53 mm and 0.91 ± 0.02 as point-to-curve error and Dice index, respectively. The average difference between manually and automatically obtained relative upslope parameters was found to be statistically insignificant (P = .37). Moreover, the analysis time per slice was reduced from 20 minutes (manual) to 1.5 minutes (automatic). We proposed an automatic method that significantly reduced the time required for analysis of first-pass cardiac magnetic resonance perfusion images. The robustness and accuracy of the proposed method were demonstrated by the high spatial correspondence and statistically insignificant difference in perfusion parameters, when AO contours were compared with expert drawn contours. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.
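
    A minimal sketch of the Dice index used above to compare automatically obtained and expert-drawn myocardial contours, assuming the contours have already been rasterized to binary masks; the point-to-curve error and the registration/segmentation steps themselves are not reproduced:

```python
import numpy as np

def dice_index(mask_auto, mask_manual):
    """Dice similarity coefficient between two binary segmentation masks."""
    a = np.asarray(mask_auto, dtype=bool)
    b = np.asarray(mask_manual, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# Illustrative example: two slightly shifted ring-shaped "myocardium" masks
yy, xx = np.mgrid[:64, :64]
ring = lambda cx, cy: ((np.hypot(xx - cx, yy - cy) > 10) &
                       (np.hypot(xx - cx, yy - cy) < 16))
print(round(dice_index(ring(32, 32), ring(33, 32)), 3))   # close to, but below, 1.0
```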

  16. Bayesian multi-scale smoothing of photon-limited images with applications to astronomy and medicine

    NASA Astrophysics Data System (ADS)

    White, John

    Multi-scale models for smoothing Poisson signals or images have gained much attention over the past decade. A new Bayesian model is developed using the concept of the Chinese restaurant process to find structures in two-dimensional images when performing image reconstruction or smoothing. This new model performs very well when compared to other leading methodologies for the same problem. It is developed and evaluated theoretically and empirically throughout Chapter 2. The newly developed Bayesian model is extended to three-dimensional images in Chapter 3. The third dimension has numerous possible interpretations, such as an energy spectrum, another spatial index, or possibly a temporal dimension. Simulation studies show that this method is promising for reducing error. A further development removes background noise in the image. This removal can further reduce the error and is done using a modeling adjustment and post-processing techniques. These details are given in Chapter 4. Applications to real-world problems are given throughout. Photon-based images are common in astronomical imaging due to the collection of different types of energy such as X-rays. Applications to real astronomical images are given, and these consist of X-ray images from the Chandra X-ray Observatory satellite. Diagnostic medicine uses many types of imaging, such as magnetic resonance imaging and computed tomography, that can also benefit from smoothing techniques such as the one developed here. Reducing the amount of radiation a patient receives makes images noisier, but this can be mitigated through the use of image smoothing techniques. Both types of images represent the potential real-world use for these methods.

  17. Main sources of errors in diagnosis of chronic radiation sickness (in Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soldatova, V.A.

    1973-11-01

    With the aim of finding out the main sources of errors in the diagnosis of chronic radiation sickness, the author analyzed a total of 500 cases of this sickness in roentgenologists and radiologists sent to the clinic to be examined according to occupational indications. It was shown that the main source of errors when interpreting the observed deviations as occupational was underestimation of the etiological significance of functional and organic diseases of the nervous system, endocrine-vascular dystonia, and also such diseases as hypochromic anemia and chronic infection. The majority of diagnostic errors are explained by insufficient knowledge of the main regularity of forming the picture of chronic radiation sickness and by the absence of the necessary differential diagnosis with general somatic diseases. (auth)

  18. Assimilation of surface NO2 and O3 observations into the SILAM chemistry transport model

    NASA Astrophysics Data System (ADS)

    Vira, J.; Sofiev, M.

    2015-02-01

    This paper describes the assimilation of trace gas observations into the chemistry transport model SILAM (System for Integrated modeLling of Atmospheric coMposition) using the 3D-Var method. Assimilation results for the year 2012 are presented for the prominent photochemical pollutants ozone (O3) and nitrogen dioxide (NO2). Both species are covered by the AirBase observation database, which provides the observational data set used in this study. Attention was paid to the background and observation error covariance matrices, which were obtained primarily by the iterative application of a posteriori diagnostics. The diagnostics were computed separately for 2 months representing summer and winter conditions, and further disaggregated by time of day. This enabled the derivation of background and observation error covariance definitions, which included both seasonal and diurnal variation. The consistency of the obtained covariance matrices was verified using χ2 diagnostics. The analysis scores were computed for a control set of observation stations withheld from assimilation. Compared to a free-running model simulation, the correlation coefficient for daily maximum values was improved from 0.8 to 0.9 for O3 and from 0.53 to 0.63 for NO2.
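
    The χ2 consistency check mentioned above is commonly implemented by testing whether the normalized innovation statistic averages the number of assimilated observations. A hedged, self-contained illustration with synthetic covariances follows; it shows the general diagnostic, not the SILAM/3D-Var implementation:

```python
import numpy as np

def chi2_diagnostic(d, H, B, R):
    """Normalized innovation statistic; close to len(d) when B and R are consistent.

    d : innovation vector (observations minus background mapped to observation space)
    H : observation operator, B : background error covariance, R : observation error covariance
    """
    S = H @ B @ H.T + R
    return float(d @ np.linalg.solve(S, d))

# Synthetic check: draw innovations from the assumed covariance and verify chi2 ~ n_obs
rng = np.random.default_rng(1)
n_obs, n_state = 50, 200
H = rng.normal(size=(n_obs, n_state)) / np.sqrt(n_state)
B = np.eye(n_state) * 0.5
R = np.eye(n_obs) * 0.2
d = rng.multivariate_normal(np.zeros(n_obs), H @ B @ H.T + R)
print(chi2_diagnostic(d, H, B, R))   # fluctuates around 50 for consistent covariances
```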

  19. Analysis of the hydrological response of a distributed physically-based model using post-assimilation (EnKF) diagnostics of streamflow and in situ soil moisture observations

    NASA Astrophysics Data System (ADS)

    Trudel, Mélanie; Leconte, Robert; Paniconi, Claudio

    2014-06-01

    Data assimilation techniques not only enhance model simulations and forecasts but also provide the opportunity to obtain a diagnostic of both the model and the observations used in the assimilation process. In this research, an ensemble Kalman filter was used to assimilate streamflow observations at a basin outlet and at interior locations, as well as soil moisture at two different depths (15 and 45 cm). The simulation model is the distributed physically-based hydrological model CATHY (CATchment HYdrology) and the study site is the Des Anglais watershed, a 690 km2 river basin located in southern Quebec, Canada. Use of Latin hypercube sampling instead of a conventional Monte Carlo method to generate the ensemble reduced the size of the ensemble, and therefore the calculation time. Different post-assimilation diagnostics, based on innovations (observation minus background), analysis residuals (observation minus analysis), and analysis increments (analysis minus background), were used to evaluate assimilation optimality. An important issue in data assimilation is the estimation of error covariance matrices. These diagnostics were also used in a calibration exercise to determine the standard deviation of model parameters, forcing data, and observations that led to optimal assimilations. The analysis of innovations showed a lag between the model forecast and the observation during rainfall events. Assimilation of streamflow observations corrected this discrepancy. Assimilation of outlet streamflow observations improved the Nash-Sutcliffe efficiencies (NSE) between the model forecast (one day) and the observation at both outlet and interior point locations, owing to the structure of the state vector used. However, assimilation of streamflow observations systematically increased the simulated soil moisture values.
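
    A minimal sketch of the ensemble-generation step the abstract credits with reducing ensemble size: Latin hypercube sampling of perturbed parameters or forcings via scipy.stats.qmc. The Gaussian marginals and the perturbed quantities are assumptions made for illustration, not details taken from the study:

```python
import numpy as np
from scipy.stats import qmc, norm

def lhs_ensemble(n_members, means, stds, seed=0):
    """Latin hypercube ensemble of perturbed quantities (e.g. a parameter and a forcing multiplier)."""
    sampler = qmc.LatinHypercube(d=len(means), seed=seed)
    u = sampler.random(n=n_members)                 # stratified samples in [0, 1)^d
    return norm.ppf(u) * np.asarray(stds) + np.asarray(means)

# 32 members, two perturbed quantities with assumed Gaussian spreads
ensemble = lhs_ensemble(32, means=[1.0, 0.0], stds=[0.2, 0.5])
print(ensemble.shape, ensemble.mean(axis=0).round(2))   # stratification keeps sample means close to target
```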

  20. Near field communications technology and the potential to reduce medication errors through multidisciplinary application

    PubMed Central

    Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J.; Tabirca, Sabin; O’Driscoll, Aoife; Corrigan, Mark

    2016-01-01

    Background Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. Methods An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. Results A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with a paper system to a mean of 0.80 errors per round using NFC (P<0.015). A mean satisfaction score of 2.30 was reported by users (rated on a seven-point scale with 1 denoting total satisfaction with system use and 7 denoting total dissatisfaction). Conclusions An NFC-based medication system may be used to effectively reduce medication errors in a simulated ward environment. PMID:28293602

  1. Near field communications technology and the potential to reduce medication errors through multidisciplinary application.

    PubMed

    O'Connell, Emer; Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J; Tabirca, Sabin; O'Driscoll, Aoife; Corrigan, Mark

    2016-01-01

    Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with a paper system to a mean of 0.80 errors per round using NFC (P<0.015). A mean satisfaction score of 2.30 was reported by users (rated on a seven-point scale with 1 denoting total satisfaction with system use and 7 denoting total dissatisfaction). An NFC-based medication system may be used to effectively reduce medication errors in a simulated ward environment.

  2. Evaluating a medical error taxonomy.

    PubMed

    Brixey, Juliana; Johnson, Todd R; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a standard language for reporting medication errors. This project maps the NCC MERP taxonomy of medication error to MedWatch medical errors involving infusion pumps. Of particular interest are human factors associated with medical device errors. The NCC MERP taxonomy of medication errors is limited in mapping information from MedWatch because of the focus on the medical device and the format of reporting.

  3. Comparison of the accuracy of cone beam computed tomography and medical computed tomography: implications for clinical diagnostics with guided surgery.

    PubMed

    Abboud, Marcus; Calvo-Guirado, Jose Luis; Orentlicher, Gary; Wahl, Gerhard

    2013-01-01

    This study compared the accuracy of cone beam computed tomography (CBCT) and medical-grade CT in the context of evaluating the diagnostic value and accuracy of fiducial marker localization for reference marker-based guided surgery systems. Cadaver mandibles with attached radiopaque gutta-percha markers, as well as glass balls and composite cylinders of known dimensions, were measured manually with a highly accurate digital caliper. The objects were then scanned using a medical-grade CT scanner (Philips Brilliance 64) and five different CBCT scanners (Sirona Galileos, Morita 3D Accuitomo 80, Vatech PaX-Reve3D, 3M Imtech Iluma, and Planmeca ProMax 3D). The data were then imported into commercially available software, and measurements were made of the scanned markers and objects. CT and CBCT measurements were compared to each other and to the caliper measurements. The difference between the CBCT measurements and the caliper measurements was larger than the difference between the CT measurements and the caliper measurements. Measurements of the cadaver mandible and the geometric reference markers were highly accurate with CT. The average absolute errors of the human mandible measurements were 0.03 mm for CT and 0.23 mm for CBCT. The measurement errors of the geometric objects based on CT ranged between 0.00 and 0.12 mm, compared to an error range between 0.00 and 2.17 mm with the CBCT scanners. CT provided the most accurate images in this study, closely followed by one CBCT of the five tested. Although there were differences in the distance measurements of the hard tissue of the human mandible between CT and CBCT, these differences may not be of clinical significance for most diagnostic purposes. The fiducial marker localization error caused by some CBCT scanners may be a problem for guided surgery systems.

  4. Jackknife variance of the partial area under the empirical receiver operating characteristic curve.

    PubMed

    Bandos, Andriy I; Guo, Ben; Gur, David

    2017-04-01

    Receiver operating characteristic analysis provides an important methodology for assessing traditional (e.g., imaging technologies and clinical practices) and new (e.g., genomic studies, biomarker development) diagnostic problems. The area under the clinically/practically relevant part of the receiver operating characteristic curve (partial area or partial area under the receiver operating characteristic curve) is an important performance index summarizing diagnostic accuracy at multiple operating points (decision thresholds) that are relevant to actual clinical practice. A robust estimate of the partial area under the receiver operating characteristic curve is provided by the area under the corresponding part of the empirical receiver operating characteristic curve. We derive a closed-form expression for the jackknife variance of the partial area under the empirical receiver operating characteristic curve. Using the derived analytical expression, we investigate the differences between the jackknife variance and a conventional variance estimator. The relative properties in finite samples are demonstrated in a simulation study. The developed formula enables an easy way to estimate the variance of the empirical partial area under the receiver operating characteristic curve, thereby substantially reducing the computation burden, and provides important insight into the structure of the variability. We demonstrate that when compared with the conventional approach, the jackknife variance has substantially smaller bias, and leads to a more appropriate type I error rate of the Wald-type test. The use of the jackknife variance is illustrated in the analysis of a data set from a diagnostic imaging study.
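
    The closed-form jackknife variance derived in the paper is not reproduced here; the brute-force sketch below shows what it replaces, namely recomputing the empirical partial AUC with each subject deleted in turn and applying the usual jackknife variance formula. All details (FPR range, grid, synthetic scores) are illustrative assumptions:

```python
import numpy as np

def empirical_pauc(neg, pos, fpr_max=0.2, grid=200):
    """Area under the empirical ROC curve restricted to FPR in [0, fpr_max]."""
    thr = np.sort(np.concatenate([neg, pos]))[::-1]          # descending thresholds
    fpr = np.array([(neg >= t).mean() for t in thr])
    tpr = np.array([(pos >= t).mean() for t in thr])
    uf = np.unique(fpr)                                       # unique FPR values, ascending
    ut = np.array([tpr[fpr == f].max() for f in uf])          # best TPR at each FPR
    g = np.linspace(0.0, fpr_max, grid)
    t = np.interp(g, uf, ut, left=0.0)
    return float(np.sum((t[1:] + t[:-1]) / 2.0 * np.diff(g)))  # trapezoidal rule

def jackknife_variance(neg, pos, **kw):
    """Delete-one-subject jackknife variance of the empirical partial AUC."""
    estimates = [empirical_pauc(np.delete(neg, i), pos, **kw) for i in range(len(neg))]
    estimates += [empirical_pauc(neg, np.delete(pos, j), **kw) for j in range(len(pos))]
    estimates = np.array(estimates)
    n = len(estimates)
    return (n - 1) / n * np.sum((estimates - estimates.mean()) ** 2)

rng = np.random.default_rng(2)
neg, pos = rng.normal(0, 1, 100), rng.normal(1, 1, 100)       # synthetic test scores
print(empirical_pauc(neg, pos), jackknife_variance(neg, pos))
```

    The paper's contribution is a closed-form expression that yields this variance without the repeated recomputation, which is what substantially reduces the computational burden.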

  5. A Complementary Note to 'A Lag-1 Smoother Approach to System-Error Estimation': The Intrinsic Limitations of Residual Diagnostics

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo

    2015-01-01

    Recently, this author studied an approach to the estimation of system error based on combining observation residuals derived from a sequential filter and fixed lag-1 smoother. While extending the methodology to a variational formulation, experimenting with simple models and making sure consistency was found between the sequential and variational formulations, the limitations of the residual-based approach came clearly to the surface. This note uses the sequential assimilation application to simple nonlinear dynamics to highlight the issue. Only when some of the underlying error statistics are assumed known is it possible to estimate the unknown component. In general, when considerable uncertainties exist in the underlying statistics as a whole, attempts to obtain separate estimates of the various error covariances are bound to lead to misrepresentation of errors. The conclusions are particularly relevant to present-day attempts to estimate observation-error correlations from observation residual statistics. A brief illustration of the issue is also provided by comparing estimates of error correlations derived from a quasi-operational assimilation system and a corresponding Observing System Simulation Experiments framework.

  6. Prevention of prescription errors by computerized, on-line, individual patient related surveillance of drug order entry.

    PubMed

    Oliven, A; Zalman, D; Shilankov, Y; Yeshurun, D; Odeh, M

    2002-01-01

    Computerized prescription of drugs is expected to reduce the number of many preventable drug ordering errors. In the present study we evaluated the usefulness of a computerized drug order entry (CDOE) system in reducing prescription errors. A department of internal medicine using a comprehensive CDOE system, which also included patient-related drug-laboratory, drug-disease and drug-allergy on-line surveillance, was compared to a similar department in which drug orders were handwritten. CDOE reduced prescription errors to 25-35%. The causes of errors remained similar, and most errors, in both departments, were associated with abnormal renal function and electrolyte balance. Residual errors remaining in the CDOE-using department were due to handwriting on the typed order, failure to enter patients' diseases into the system, and system failures. The use of CDOE was associated with a significant reduction in mean hospital stay and in the number of changes made to the prescription. The findings of this study both quantify the impact of comprehensive CDOE on prescription errors and delineate the causes of the remaining errors.

  7. COMPLEX VARIABLE BOUNDARY ELEMENT METHOD: APPLICATIONS.

    USGS Publications Warehouse

    Hromadka, T.V.; Yen, C.C.; Guymon, G.L.

    1985-01-01

    The complex variable boundary element method (CVBEM) is used to approximate several potential problems where analytical solutions are known. A modeling result produced from the CVBEM is a measure of relative error in matching the known boundary condition values of the problem. A CVBEM error-reduction algorithm is used to reduce the relative error of the approximation by adding nodal points in boundary regions where error is large. From the test problems, overall error is reduced significantly by utilizing the adaptive integration algorithm.

  8. Reliability of HIV rapid diagnostic tests for self-testing compared with testing by health-care workers: a systematic review and meta-analysis.

    PubMed

    Figueroa, Carmen; Johnson, Cheryl; Ford, Nathan; Sands, Anita; Dalal, Shona; Meurant, Robyn; Prat, Irena; Hatzold, Karin; Urassa, Willy; Baggaley, Rachel

    2018-06-01

    The ability of individuals to use HIV self-tests correctly is debated. To inform the 2016 WHO recommendation on HIV self-testing, we assessed the reliability and performance of HIV rapid diagnostic tests when used by self-testers. In this systematic review and meta-analysis, we searched PubMed, PopLine, and Embase, conference abstracts, and additional grey literature between Jan 1, 1995, and April 30, 2016, for observational and experimental studies reporting on HIV self-testing performance. We excluded studies evaluating home specimen collection because patients did not interpret their own test results. We extracted data independently, using standardised extraction forms. Outcomes of interest were agreement between self-testers and health-care workers, sensitivity, and specificity. We calculated κ to establish the level of agreement and pooled κ estimates using a random-effects model, by approach (directly assisted or unassisted) and type of specimen (blood or oral fluid). We examined heterogeneity with the I² statistic. 25 studies met inclusion criteria (22 to 5662 participants). Quality assessment with QUADAS-2 showed studies had low risk of bias and incomplete reporting in accordance with the STARD checklist. Raw proportion of agreement ranged from 85·4% to 100%, and reported κ ranged from fair (κ 0·277, p<0·001) to almost perfect (κ 0·99, n=25). Pooled κ suggested almost perfect agreement for both types of approaches (directly assisted 0·98, 95% CI 0·96-0·99 and unassisted 0·97, 0·96-0·98; I²=34·5%, 0-97·8). Excluding two outliers, sensitivity and specificity were higher for blood-based rapid diagnostic tests (4/16) than for oral fluid rapid diagnostic tests (13/16). The most common error that affected test performance was incorrect specimen collection (oral swab or finger prick). Study limitations included the use of different reference standards and no disaggregation of results by individuals taking antiretrovirals. Self-testers can reliably and accurately do HIV rapid diagnostic tests, as compared with trained health-care workers. Errors in performance might be reduced through the improvement of rapid diagnostic tests for self-testing, particularly to make sample collection easier and to simplify instructions for use. The Bill & Melinda Gates Foundation and Unitaid.
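
    The pooling step described here can be reproduced with a generic DerSimonian-Laird random-effects combination of study-level estimates. The sketch below uses hypothetical κ values and variances chosen only for illustration; the review's actual study-level data are reported in the paper.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects (DerSimonian-Laird) pooling of study-level estimates."""
        effects, variances = np.asarray(effects, float), np.asarray(variances, float)
        w = 1.0 / variances                                # fixed-effect (inverse-variance) weights
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed) ** 2)             # Cochran's Q
        df = len(effects) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                      # between-study variance
        w_star = 1.0 / (variances + tau2)
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0   # heterogeneity (I^2, %)
        return pooled, se, i2

    # Hypothetical study-level kappa estimates and variances, for illustration only.
    print(dersimonian_laird([0.95, 0.98, 0.90, 0.99], [0.0004, 0.0002, 0.0009, 0.0001]))
    ```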

  9. Gram-stain plus MALDI-TOF MS (Matrix-Assisted Laser Desorption Ionization-Time of Flight Mass Spectrometry) for a rapid diagnosis of urinary tract infection.

    PubMed

    Burillo, Almudena; Rodríguez-Sánchez, Belén; Ramiro, Ana; Cercenado, Emilia; Rodríguez-Créixems, Marta; Bouza, Emilio

    2014-01-01

    Microbiological confirmation of a urinary tract infection (UTI) takes 24-48 h. In the meantime, patients are usually given empirical antibiotics, sometimes inappropriately. We assessed the feasibility of sequentially performing a Gram stain and matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) on urine samples to anticipate clinically useful information. In May-June 2012, we randomly selected 1000 urine samples from patients with suspected UTI. All were Gram stained, and those yielding bacteria of a single morphotype were processed for MALDI-TOF MS. Our sequential algorithm was correlated with the standard semiquantitative urine culture result as follows: Match, the information provided was anticipative of the culture result; Minor error, the information provided was partially anticipative of the culture result; Major error, the information provided was incorrect, potentially leading to inappropriate changes in antimicrobial therapy. A positive culture was obtained in 242/1000 samples. The Gram stain revealed a single morphotype in 207 samples, which were subjected to MALDI-TOF MS. The diagnostic performance of the Gram stain was: sensitivity (Se) 81.3%, specificity (Sp) 93.2%, positive predictive value (PPV) 81.3%, negative predictive value (NPV) 93.2%, positive likelihood ratio (+LR) 11.91, negative likelihood ratio (-LR) 0.20 and accuracy 90.0%, while that of MALDI-TOF MS was: Se 79.2%, Sp 73.5%, +LR 2.99, -LR 0.28 and accuracy 78.3%. The use of both techniques provided information anticipative of the culture result in 82.7% of cases, information with minor errors in 13.4% and information with major errors in 3.9%. Results were available within 1 h. Our serial algorithm provided information that was consistent or showed only minor errors for 96.1% of urine samples from patients with suspected UTI. The clinical impact of this rapid UTI diagnosis strategy needs to be assessed through indicators of adequacy of treatment, such as a reduced time to appropriate empirical treatment or earlier withdrawal of unnecessary antibiotics.
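
    The performance indices quoted above all follow from a 2x2 contingency table of test results against culture. A minimal helper (shown with hypothetical counts, not the study's raw data) illustrates the calculations:

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard test-performance indices from true/false positive and negative counts."""
        se = tp / (tp + fn)                     # sensitivity
        sp = tn / (tn + fp)                     # specificity
        return {
            "Se": se,
            "Sp": sp,
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn),
            "+LR": se / (1 - sp),
            "-LR": (1 - se) / sp,
            "accuracy": (tp + tn) / (tp + fp + fn + tn),
        }

    # Hypothetical counts for illustration only.
    print(diagnostic_metrics(tp=39, fp=9, fn=9, tn=123))
    ```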

  10. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning.

    PubMed

    Aris-Brosou, Stephane; Kim, James; Li, Li; Liu, Hui

    2018-05-15

    Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but instead lead a customer to complain. We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently; the aim of this study was therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. QC data from five selected in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data covering the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix them before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement.
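
    The modeling step described here (decision trees and adaptive boosting evaluated by cross-validation) can be prototyped in a few lines with scikit-learn. The sketch below substitutes synthetic features for the engineered QC features, which are not detailed in this record:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic stand-in for QC-derived features and complaint-cause labels.
    X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                               n_classes=3, n_clusters_per_class=1, random_state=0)

    classifiers = {
        "decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
        "adaptive boosting": AdaBoostClassifier(n_estimators=200, random_state=0),
    }
    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=10)         # 10-fold cross-validation
        print(f"{name}: cross-validated error rate = {1 - scores.mean():.3f}")
    ```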

  11. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning

    PubMed Central

    Kim, James; Li, Li; Liu, Hui

    2018-01-01

    Background Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but instead lead a customer to complain. Objective We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently; the aim of this study was therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. Methods QC data from five selected in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data covering the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. Results The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. Conclusions This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix them before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. PMID:29764796

  12. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting.

    PubMed

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-11-01

    Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required.
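
    The pre/post comparisons above rest on Fisher's exact test applied to 2x2 tables of nurses who did or did not err. The sketch below recomputes the syringe-volume comparison; note that the resulting p-value depends on whether a one- or two-sided alternative is chosen, which this record does not state.

    ```python
    from scipy.stats import fisher_exact

    # Syringe-volume verification: 16 of 18 nurses erred pre-intervention, 11 of 19 post-intervention.
    table = [[16, 18 - 16],
             [11, 19 - 11]]
    print(fisher_exact(table, alternative="two-sided")[1])   # two-sided p-value
    print(fisher_exact(table, alternative="greater")[1])     # one-sided p-value
    ```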

  13. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting

    PubMed Central

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-01-01

    Background Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. Objective The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. Methods The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Results Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Conclusions Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. PMID:24906806

  14. The associations of insomnia with costly workplace accidents and errors: results from the America Insomnia Survey.

    PubMed

    Shahly, Victoria; Berglund, Patricia A; Coulouvrat, Catherine; Fitzgerald, Timothy; Hajak, Goeran; Roth, Thomas; Shillington, Alicia C; Stephenson, Judith J; Walsh, James K; Kessler, Ronald C

    2012-10-01

    Insomnia is a common and seriously impairing condition that often goes unrecognized. To examine associations of broadly defined insomnia (ie, meeting inclusion criteria for a diagnosis from International Statistical Classification of Diseases, 10th Revision, DSM-IV, or Research Diagnostic Criteria/International Classification of Sleep Disorders, Second Edition) with costly workplace accidents and errors after excluding other chronic conditions among workers in the America Insomnia Survey (AIS). A national cross-sectional telephone survey (65.0% cooperation rate) of commercially insured health plan members selected from the more than 34 million in the HealthCore Integrated Research Database. Four thousand nine hundred ninety-one employed AIS respondents. Costly workplace accidents or errors in the 12 months before the AIS interview were assessed with one question about workplace accidents "that either caused damage or work disruption with a value of $500 or more" and another about other mistakes "that cost your company $500 or more." Current insomnia with duration of at least 12 months was assessed with the Brief Insomnia Questionnaire, a validated (area under the receiver operating characteristic curve, 0.86 compared with diagnoses based on blinded clinical reappraisal interviews), fully structured diagnostic interview. Eighteen other chronic conditions were assessed with medical/pharmacy claims records and validated self-report scales. Insomnia had a significant odds ratio with workplace accidents and/or errors controlled for other chronic conditions (1.4). The odds ratio did not vary significantly with respondent age, sex, educational level, or comorbidity. The average costs of insomnia-related accidents and errors ($32 062) were significantly higher than those of other accidents and errors ($21 914). Simulations estimated that insomnia was associated with 7.2% of all costly workplace accidents and errors and 23.7% of all the costs of these incidents. These proportions are higher than for any other chronic condition, with annualized US population projections of 274 000 costly insomnia-related workplace accidents and errors having a combined value of US $31.1 billion. Effectiveness trials are needed to determine whether expanded screening, outreach, and treatment of workers with insomnia would yield a positive return on investment for employers.

  15. Use of Mobile Apps Among Medical and Nursing Students in Iran.

    PubMed

    Sheikhtaheri, Abbas; Kermani, Farzaneh

    2018-01-01

    Mobile technologies have a positive impact on patient care, leading to improved decision making, fewer medical errors and better communication within the care team. The purpose of this study was to investigate the use of mobile technologies by medical and nursing students and their intended use in the future. This study was conducted among 372 medical and nursing students of Tehran University of Medical Science. Smartphones were used by 60.8% of medical students and 62.4% of nursing students. The most commonly used apps among medical students were medical dictionaries, drug apps, medical calculators and anatomical atlases, and among nursing students medical dictionaries, anatomical atlases and nursing care guides. The use of decision support systems, remote monitoring, patient imagery and remote diagnosis, patient records documentation, diagnostic guidelines and laboratory tests is also expected to increase in the future.

  16. Errors Affect Hypothetical Intertemporal Food Choice in Women

    PubMed Central

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2014-01-01

    Growing evidence suggests that the ability to control behavior is enhanced in contexts in which errors are more frequent. Here we investigated whether pairing desirable food with errors could decrease impulsive choice during hypothetical temporal decisions about food. To this end, healthy women performed a Stop-signal task in which one food cue predicted high-error rate, and another food cue predicted low-error rate. Afterwards, we measured participants’ intertemporal preferences during decisions between smaller-immediate and larger-delayed amounts of food. We expected reduced sensitivity to smaller-immediate amounts of food associated with high-error rate. Moreover, taking into account that deprivational states affect sensitivity for food, we controlled for participants’ hunger. Results showed that pairing food with high-error likelihood decreased temporal discounting. This effect was modulated by hunger, indicating that, the lower the hunger level, the more participants showed reduced impulsive preference for the food previously associated with a high number of errors as compared with the other food. These findings reveal that errors, which are motivationally salient events that recruit cognitive control and drive avoidance learning against error-prone behavior, are effective in reducing impulsive choice for edible outcomes. PMID:25244534

  17. Reducing the Familiarity of Conjunction Lures with Pictures

    ERIC Educational Resources Information Center

    Lloyd, Marianne E.

    2013-01-01

    Four experiments were conducted to test whether conjunction errors were reduced after pictorial encoding and whether the semantic overlap between study and conjunction items would impact error rates. Across 4 experiments, compound words studied with a single picture had lower conjunction error rates during a recognition test than those words…

  18. Recommendations to Improve the Accuracy of Estimates of Physical Activity Derived from Self Report

    PubMed Central

    Ainsworth, Barbara E; Caspersen, Carl J; Matthews, Charles E; Mâsse, Louise C; Baranowski, Tom; Zhu, Weimo

    2013-01-01

    Context Assessment of physical activity using self-report has the potential for measurement error that can lead to incorrect inferences about physical activity behaviors and bias study results. Objective To provide recommendations to improve the accuracy of physical activity derived from self report. Process We provide an overview of presentations and a compilation of perspectives shared by the authors of this paper and workgroup members. Findings We identified a conceptual framework for reducing errors using physical activity self-report questionnaires. The framework identifies six steps to reduce error: (1) identifying the need to measure physical activity, (2) selecting an instrument, (3) collecting data, (4) analyzing data, (5) developing a summary score, and (6) interpreting data. Underlying the first four steps are behavioral parameters of type, intensity, frequency, and duration of physical activities performed, activity domains, and the location where activities are performed. We identified ways to reduce measurement error at each step and made recommendations for practitioners, researchers, and organizational units to reduce error in questionnaire assessment of physical activity. Conclusions Self-report measures of physical activity have a prominent role in research and practice settings. Measurement error can be reduced by applying the framework discussed in this paper. PMID:22287451

  19. Respiratory monitoring system based on the nasal pressure technique for the analysis of sleep breathing disorders: Reduction of static and dynamic errors, and comparisons with thermistors and pneumotachographs

    NASA Astrophysics Data System (ADS)

    Alves de Mesquita, Jayme; Lopes de Melo, Pedro

    2004-03-01

    Thermally sensitive devices—thermistors—have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique would allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) describe the development of a flexible NPR device and (2) evaluate the performance of this device when compared to pneumotachographs (PNTs) and thermistors. After the design details are described, the system static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p<0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations, and the results showed that the developed device response was as good as that of the PNT and around 100 times faster (τ=5.3 ms) than thermistors (τ=512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples, and confirmed the good features achieved in engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improve the diagnoses of sleep-breathing disorders.
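
    The practical consequence of the two time constants can be illustrated with a simple first-order low-pass model of each sensor. In the sketch below (illustrative parameters only), a brief 300 ms flow event is passed through both models; the fast NPR channel tracks it almost fully, while the thermistor output is strongly attenuated:

    ```python
    import numpy as np

    def first_order_response(signal, dt, tau):
        """Discrete first-order (single time constant) low-pass: dy/dt = (x - y) / tau."""
        y = np.zeros_like(signal)
        alpha = dt / (tau + dt)
        for k in range(1, len(signal)):
            y[k] = y[k - 1] + alpha * (signal[k] - y[k - 1])
        return y

    dt = 0.001                                        # 1 ms sampling interval
    t = np.arange(0.0, 3.0, dt)
    flow = ((t > 1.0) & (t < 1.3)).astype(float)      # a brief 300 ms respiratory event
    npr = first_order_response(flow, dt, tau=0.0053)           # tau = 5.3 ms (NPR device)
    thermistor = first_order_response(flow, dt, tau=0.512)     # tau = 512 ms (thermistor)
    print(npr.max(), thermistor.max())                # ~1.0 versus ~0.4: the slow sensor smears the event
    ```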

  20. Pathology-related cases in the Norwegian System of Patient Injury Compensation in the period 2010-2015.

    PubMed

    Alfsen, G Cecilie; Chen, Ying; Kähler, Hanne; Bukholm, Ida Rashida Khan

    2016-12-01

    The Norwegian System of Patient Injury Compensation (NPE) processes compensation claims from patients who complain about malpractice in the health services. A wrong diagnosis in pathology may cause serious injury to the patient, but the incidence of compensation claims is unknown, because pathology is not specified as a separate category in NPE’s statistics. Knowledge about errors is required to assess quality-enhancing measures. We have therefore searched through the NPE records to identify cases whose background stems from errors committed in pathology departments and laboratories. We have searched through the NPE records for cases related to pathology for the years 2010 – 2015. During this period the NPE processed a total of 26 600 cases, of which 93 were related to pathology. The compensation claim was upheld in 66 cases, resulting in total compensation payments amounting to NOK 63 million. False-negative results in the form of undetected diagnoses were the most frequent grounds for compensation claims (63 cases), with an undetected malignant melanoma (n = 23) or atypia in cell samples from the cervix uteri (n = 16) as the major groups. Sixteen cases involved non-diagnostic issues such as mix-up of samples (n = 8), contamination of samples (n = 4) or delayed responses (n = 4). The number of compensation claims caused by errors in pathology diagnostics is low in relative terms. The errors may, however, be of a serious nature, especially if malignant conditions are overlooked or samples mixed up.

  1. Teaching clinical reasoning: case-based and coached.

    PubMed

    Kassirer, Jerome P

    2010-07-01

    Optimal medical care is critically dependent on clinicians' skills to make the right diagnosis and to recommend the most appropriate therapy, and acquiring such reasoning skills is a key requirement at every level of medical education. Teaching clinical reasoning is grounded in several fundamental principles of educational theory. Adult learning theory posits that learning is best accomplished by repeated, deliberate exposure to real cases, that case examples should be selected for their reflection of multiple aspects of clinical reasoning, and that the participation of a coach augments the value of an educational experience. The theory proposes that memory of clinical medicine and clinical reasoning strategies is enhanced when errors in information, judgment, and reasoning are immediately pointed out and discussed. Rather than using cases artificially constructed from memory, real cases are greatly preferred because they often reflect the false leads, the polymorphisms of actual clinical material, and the misleading test results encountered in everyday practice. These concepts foster the teaching and learning of the diagnostic process, the complex trade-offs between the benefits and risks of diagnostic tests and treatments, and cognitive errors in clinical reasoning. The teaching of clinical reasoning need not and should not be delayed until students gain a full understanding of anatomy and pathophysiology. Concepts such as hypothesis generation, pattern recognition, context formulation, diagnostic test interpretation, differential diagnosis, and diagnostic verification provide both the language and the methods of clinical problem solving. Expertise is attainable even though the precise mechanisms of achieving it are not known.

  2. THE SYSTEMATICS OF STRONG LENS MODELING QUANTIFIED: THE EFFECTS OF CONSTRAINT SELECTION AND REDSHIFT INFORMATION ON MAGNIFICATION, MASS, AND MULTIPLE IMAGE PREDICTABILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Traci L.; Sharon, Keren, E-mail: tljohn@umich.edu

    Until now, systematic errors in strong gravitational lens modeling have been acknowledged but have never been fully quantified. Here, we launch an investigation into the systematics induced by constraint selection. We model the simulated cluster Ares 362 times using random selections of image systems with and without spectroscopic redshifts and quantify the systematics using several diagnostics: image predictability, accuracy of model-predicted redshifts, enclosed mass, and magnification. We find that for models with >15 image systems, the image plane rms does not decrease significantly when more systems are added; however, the rms values quoted in the literature may be misleading as to the ability of a model to predict new multiple images. The mass is well constrained near the Einstein radius in all cases, and systematic error drops to <2% for models using >10 image systems. Magnification errors are smallest along the straight portions of the critical curve, and the value of the magnification is systematically lower near curved portions. For >15 systems, the systematic error on magnification is ∼2%. We report no trend in magnification error with the fraction of spectroscopic image systems when selecting constraints at random; however, when using the same selection of constraints, increasing this fraction up to ∼0.5 will increase model accuracy. The results suggest that the selection of constraints, rather than quantity alone, determines the accuracy of the magnification. We note that spectroscopic follow-up of at least a few image systems is crucial because models without any spectroscopic redshifts are inaccurate across all of our diagnostics.

  3. Reliability, Validity, and Classification Accuracy of the DSM-5 Diagnostic Criteria for Gambling Disorder and Comparison to DSM-IV.

    PubMed

    Stinchfield, Randy; McCready, John; Turner, Nigel E; Jimenez-Murcia, Susana; Petry, Nancy M; Grant, Jon; Welte, John; Chapman, Heather; Winters, Ken C

    2016-09-01

    The DSM-5 was published in 2013 and it included two substantive revisions for gambling disorder (GD). These changes are the reduction in the threshold from five to four criteria and elimination of the illegal activities criterion. The purpose of this study was twofold: first, to assess the reliability, validity and classification accuracy of the DSM-5 diagnostic criteria for GD; second, to compare the DSM-5 and DSM-IV on reliability, validity, and classification accuracy, including an examination of the effect of the elimination of the illegal acts criterion on diagnostic accuracy. To compare DSM-5 and DSM-IV, eight datasets from three different countries (Canada, USA, and Spain; total N = 3247) were used. All datasets were based on similar research methods. Participants were recruited from outpatient gambling treatment services to represent the group with a GD and from the community to represent the group without a GD. All participants were administered a standardized measure of diagnostic criteria. The DSM-5 yielded satisfactory reliability, validity and classification accuracy. In comparing the DSM-5 to the DSM-IV, most comparisons of reliability, validity and classification accuracy showed more similarities than differences. There was evidence of modest improvements in classification accuracy for DSM-5 over DSM-IV, particularly in reduction of false negative errors. This reduction in false negative errors was largely a function of lowering the cut score from five to four and this revision is an improvement over DSM-IV. From a statistical standpoint, eliminating the illegal acts criterion did not make a significant impact on diagnostic accuracy. From a clinical standpoint, illegal acts can still be addressed in the context of the DSM-5 criterion of lying to others.

  4. How and when do expert emergency physicians generate and evaluate diagnostic hypotheses? A qualitative study using head-mounted video cued-recall interviews.

    PubMed

    Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Ammirati, Christine; Bertrand, Catherine; Dory, Valérie; Charlin, Bernard

    2014-12-01

    The ability to make a diagnosis is a crucial skill in emergency medicine. Little is known about the way emergency physicians reach a diagnosis. This study aims to identify how and when, during the initial patient examination, emergency physicians generate and evaluate diagnostic hypotheses. We carried out a qualitative research project based on semistructured interviews with emergency physicians. The interviews concerned management of an emergency situation during routine medical practice. They were associated with viewing the video recording of emergency situations filmed in an "own-point-of-view" perspective. The emergency physicians generated an average of 5 diagnostic hypotheses. Most of these hypotheses were generated before meeting the patient or within the first 5 minutes of the meeting. The hypotheses were then rank ordered within the context of a verification procedure based on identifying key information. These tasks were usually accomplished without conscious effort. No hypothesis was completely confirmed or refuted until the results of investigations were available. The generation and rank ordering of diagnostic hypotheses is based on the activation of cognitive processes, enabling expert emergency physicians to process environmental information and link it to past experiences. The physicians seemed to strive to avoid the risk of error by remaining aware of the possibility of alternative hypotheses as long as they did not have the results of investigations. Understanding the diagnostic process used by emergency physicians provides interesting ideas for training residents in a specialty in which the prevalence of reasoning errors leading to incorrect diagnoses is high. Copyright © 2014 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  5. Potential Biases Introduced by Conflating Screening and Diagnostic Testing in Colorectal Cancer Screening Surveillance

    PubMed Central

    Becker, Elizabeth A.; Griffith, Derek M.; West, Brady T.; Janz, Nancy K.; Resnicow, Ken; Morris, Arden M.

    2015-01-01

    Background Screening and post-symptomatic diagnostic testing are often conflated in cancer screening surveillance research. We examined the error in estimated colorectal cancer (CRC) screening prevalence due to the conflation of screening and diagnostic testing. Methods Using data from the 2008 National Health Interview Survey, we compared weighted prevalence estimates of the use of all testing (screening and diagnostic) and screening in at-risk adults, and calculated the overestimation of screening prevalence across socio-demographic groups. Results The population screening prevalence was overestimated by 23.3%, and the level of overestimation varied widely across socio-demographic groups (median 22.6%, mean 24.8%). The highest levels of overestimation were in non-Hispanic White females (27.4%), adults ages 50–54 (32.0%), and those with the highest socioeconomic vulnerability (low educational attainment (31.3%), low poverty ratio (32.5%), no usual source of health care (54.4%) and not insured (51.6%)) (all p-values < 0.001). Conclusions When the impetus for testing was not included, CRC screening prevalence was overestimated, and patterns of overestimation often aligned with social and economic vulnerability. These results are of concern to researchers who utilize survey data from the Behavioral Risk Factor Surveillance System (BRFSS) to assess cancer screening behaviors, as it is currently not designed to distinguish diagnostic testing from screening. Impact Surveillance research in cancer screening that does not consider the impetus for testing risks measurement error of screening prevalence, impeding progress toward improving population health. Ultimately, in order to craft relevant screening benchmarks and interventions, we must look beyond ‘what’ and ‘when’ and include ‘why.’ PMID:26491056

  6. Adenocarcinoma in situ of the cervix.

    PubMed

    Schoolland, Meike; Segal, Amanda; Allpress, Stephen; Miranda, Alina; Frost, Felicity A; Sterrett, Gregory F

    2002-12-25

    The current study examines 1) the sensitivity of detection and 2) sampling and screening/diagnostic error in the cytologic diagnosis of adenocarcinoma in situ (AIS) of the cervix. The data were taken from public and private sector screening laboratories reporting 25,000 and 80,000 smears, respectively, each year. The study group was comprised of women with a biopsy diagnosis of AIS or AIS combined with a high-grade squamous intraepithelial lesion (HSIL) who were accessioned by the Western Australian Cervical Cytology Registry (WACCR) between 1993-1998. Cervical smears reported by the Western Australia Centre for Pathology and Medical Research (PathCentre) or Western Diagnostic Pathology (WDP) in the 36 months before the index biopsy was obtained were retrieved. A true measure of the sensitivity of detection could not be determined because to the authors' knowledge the exact prevalence of disease is unknown at present. For the current study, sensitivity was defined as the percentage of smears reported as demonstrating a possible or definite high-grade epithelial abnormality (HGEA), either glandular or squamous. Sampling error was defined as the percentage of smears found to have no HGEA on review. Screening/diagnostic error was defined as the percentage of smears in which HGEA was not diagnosed initially but review demonstrated possible or definite HGEA. Sensitivity also was calculated for a randomly selected control group of biopsy proven cases of Grade 3 cervical intraepithelial neoplasia (CIN 3) accessioned at the WACCR in 1999. For biopsy findings of AIS alone, the diagnostic "sensitivity" of a single smear was 47.6% for the PathCentre and 54.3% for WDP. Nearly all the abnormalities were reported as glandular. The sampling and screening/diagnostic errors were 47.6% and 4.8%, respectively, for the PathCentre and 33.3% and 12.3%, respectively, for WDP. The results from the PathCentre were better for AIS plus HSIL than for AIS alone, but the results from WDP were similar for both groups. For the CIN 3 control cases, the "sensitivity" of a single smear was 42.5%. To the authors' knowledge epidemiologic studies published to date have not demonstrated a benefit from screening for precursors of cervical adenocarcinoma. However, in the study laboratories as in many others, reasonable expertise in diagnosing AIS has been acquired only within the last 10-15 years, which may be too short a period in which to demonstrate a significant effect. The results of the current study provide some encouraging baseline data regarding the sensitivity of the Papanicolaou smear in detecting AIS. Further improvements in sampling and cytodiagnosis may be possible. Copyright 2002 American Cancer Society.

  7. Automated Classification of Selected Data Elements from Free-text Diagnostic Reports for Clinical Research.

    PubMed

    Löpprich, Martin; Krauss, Felix; Ganzinger, Matthias; Senghas, Karsten; Riezler, Stefan; Knaup, Petra

    2016-08-05

    In the Multiple Myeloma clinical registry at Heidelberg University Hospital, most data are extracted from discharge letters. Our aim was to analyze whether the manual documentation process can be made more efficient by using methods of natural language processing for multiclass classification of free-text diagnostic reports to automatically document the diagnosis and state of disease of myeloma patients. The first objective was to create a corpus consisting of free-text diagnosis paragraphs of patients with multiple myeloma from German diagnostic reports, and its manual annotation of relevant data elements by documentation specialists. The second objective was to construct and evaluate a framework using different NLP methods to enable automatic multiclass classification of relevant data elements from free-text diagnostic reports. The main diagnosis paragraph was extracted from the clinical reports of one third of the patients in the multiple myeloma research database of Heidelberg University Hospital, selected at random (737 patients in total). An EDC system was set up, and two data entry specialists independently performed manual documentation of at least nine specific data elements for multiple myeloma characterization. Both data entries were compared and assessed by a third specialist, and an annotated text corpus was created. A framework was constructed, consisting of a self-developed package to split multiple diagnosis sequences into several subsequences, four different preprocessing steps to normalize the input data, and two classifiers: a maximum entropy classifier (MEC) and a support vector machine (SVM). In total, 15 different pipelines were examined and assessed by ten-fold cross-validation, reiterated 100 times. As quality indicators, the average error rate and the average F1-score were calculated. For significance testing, the approximate randomization test was used. The created annotated corpus consists of 737 different diagnosis paragraphs with a total of 865 coded diagnoses. The dataset is publicly available in the supplementary online files for training and testing of further NLP methods. Both classifiers showed low average error rates (MEC: 1.05; SVM: 0.84) and high F1-scores (MEC: 0.89; SVM: 0.92). However, the results varied widely depending on the classified data element. Preprocessing methods increased this effect and had significant impact on the classification, both positive and negative. The automatic diagnosis splitter increased the average error rate significantly, even though the F1-score decreased only slightly. The low average error rates and high average F1-scores of each pipeline demonstrate the suitability of the investigated NLP methods. However, it was also shown that there is no best practice for automatic classification of data elements from free-text diagnostic reports.
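
    A comparable multiclass pipeline can be assembled with standard tooling. The sketch below is not the authors' framework (their preprocessing steps, diagnosis splitter and maximum entropy classifier are not reproduced); it only shows ten-fold cross-validation of a TF-IDF plus linear SVM text classifier on a tiny, entirely hypothetical corpus:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Tiny hypothetical stand-in for the annotated diagnosis paragraphs and their labels.
    texts = ["multiples myelom iga kappa stadium iii",
             "multiples myelom igg lambda stadium i",
             "plasmozytom ohne endorganschaden",
             "multiples myelom igg kappa stadium ii"] * 25
    labels = ["IgA-kappa", "IgG-lambda", "other", "IgG-kappa"] * 25

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
    scores = cross_val_score(clf, texts, labels, cv=10, scoring="f1_macro")  # 10-fold CV
    print(scores.mean())
    ```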

  8. Methods and apparatus for reducing peak wind turbine loads

    DOEpatents

    Moroz, Emilian Mieczyslaw

    2007-02-13

    A method for reducing peak loads of wind turbines in a changing wind environment includes measuring or estimating an instantaneous wind speed and direction at the wind turbine and determining a yaw error of the wind turbine relative to the measured instantaneous wind direction. The method further includes comparing the yaw error to a yaw error trigger that has different values at different wind speeds and shutting down the wind turbine when the yaw error exceeds the yaw error trigger corresponding to the measured or estimated instantaneous wind speed.
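
    A minimal sketch of the described logic, using hypothetical trigger values (the actual wind-speed-dependent thresholds would come from the turbine's load analysis), might look like this:

    ```python
    import numpy as np

    # Hypothetical lookup of allowable yaw error (degrees) versus wind speed (m/s).
    trigger_wind_speeds = np.array([4.0, 8.0, 12.0, 16.0, 20.0, 25.0])
    trigger_yaw_errors = np.array([45.0, 35.0, 25.0, 18.0, 12.0, 8.0])

    def should_shut_down(wind_speed, wind_direction, nacelle_direction):
        """Shut down when the yaw error exceeds the wind-speed-dependent trigger."""
        yaw_error = abs((wind_direction - nacelle_direction + 180.0) % 360.0 - 180.0)
        trigger = np.interp(wind_speed, trigger_wind_speeds, trigger_yaw_errors)
        return yaw_error > trigger

    print(should_shut_down(wind_speed=18.0, wind_direction=270.0, nacelle_direction=240.0))  # True
    ```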

  9. Simulation: learning from mistakes while building communication and teamwork.

    PubMed

    Kuehster, Christina R; Hall, Carla D

    2010-01-01

    Medical errors are one of the leading causes of death annually in the United States. Many of these errors are related to poor communication and/or lack of teamwork. Using simulation as a teaching modality provides a dual role in helping to reduce these errors. Thorough integration of clinical practice with teamwork and communication in a safe environment increases the likelihood of reducing the error rates in medicine. By allowing practitioners to make potential errors in a safe environment, such as simulation, these valuable lessons improve retention and will rarely be repeated.

  10. Edge profile analysis of Joint European Torus (JET) Thomson scattering data: Quantifying the systematic error due to edge localised mode synchronisation.

    PubMed

    Leyland, M J; Beurskens, M N A; Flanagan, J C; Frassinetti, L; Gibson, K J; Kempenaars, M; Maslov, M; Scannell, R

    2016-01-01

    The Joint European Torus (JET) high resolution Thomson scattering (HRTS) system measures radial electron temperature and density profiles. One of the key capabilities of this diagnostic is measuring the steep pressure gradient, termed the pedestal, at the edge of JET plasmas. The pedestal is susceptible to limiting instabilities, such as Edge Localised Modes (ELMs), characterised by a periodic collapse of the steep gradient region. A common method to extract the pedestal width, gradient, and height, used on numerous machines, is by performing a modified hyperbolic tangent (mtanh) fit to overlaid profiles selected from the same region of the ELM cycle. This process of overlaying profiles, termed ELM synchronisation, maximises the number of data points defining the pedestal region for a given phase of the ELM cycle. When fitting to HRTS profiles, it is necessary to incorporate the diagnostic radial instrument function, particularly important when considering the pedestal width. A deconvolved fit is determined by a forward convolution method requiring knowledge of only the instrument function and profiles. The systematic error due to the deconvolution technique incorporated into the JET pedestal fitting tool has been documented by Frassinetti et al. [Rev. Sci. Instrum. 83, 013506 (2012)]. This paper seeks to understand and quantify the systematic error introduced to the pedestal width due to ELM synchronisation. Synthetic profiles, generated with error bars and point-to-point variation characteristic of real HRTS profiles, are used to evaluate the deviation from the underlying pedestal width. We find on JET that the ELM synchronisation systematic error is negligible in comparison to the statistical error when assuming ten overlaid profiles (typical for a pre-ELM fit to HRTS profiles). This confirms that fitting a mtanh to ELM synchronised profiles is a robust and practical technique for extracting the pedestal structure.
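
    For context, the mtanh fit itself is straightforward to set up. The sketch below uses one common parameterisation on synthetic data and omits the instrument-function deconvolution and ELM synchronisation that the paper is actually concerned with:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def mtanh(x, b):
        """Modified tanh: ((1 + b*x)*exp(x) - exp(-x)) / (exp(x) + exp(-x))."""
        return ((1.0 + b * x) * np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

    def pedestal(r, height, width, position, offset, slope):
        """Pedestal profile of given height, width and position on top of a constant offset."""
        return offset + 0.5 * height * (mtanh((position - r) / (width / 2.0), slope) + 1.0)

    r = np.linspace(3.55, 3.85, 60)                     # hypothetical major-radius grid (m)
    true_params = (3.0, 0.03, 3.78, 0.3, 0.05)          # height, width, position, offset, core slope
    te = pedestal(r, *true_params) + 0.05 * np.random.default_rng(1).normal(size=r.size)

    popt, _ = curve_fit(pedestal, r, te, p0=(2.0, 0.05, 3.75, 0.2, 0.02))
    print(popt)   # recovered parameters; the width is the quantity whose error the paper studies
    ```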

  11. Using Automated Writing Evaluation to Reduce Grammar Errors in Writing

    ERIC Educational Resources Information Center

    Liao, Hui-Chuan

    2016-01-01

    Despite the recent development of automated writing evaluation (AWE) technology and the growing interest in applying this technology to language classrooms, few studies have looked at the effects of using AWE on reducing grammatical errors in L2 writing. This study identified the primary English grammatical error types made by 66 Taiwanese…

  12. Using Six Sigma to reduce medication errors in a home-delivery pharmacy service.

    PubMed

    Castle, Lon; Franzblau-Isaac, Ellen; Paulsen, Jim

    2005-06-01

    Medco Health Solutions, Inc. conducted a project to reduce medication errors in its home-delivery service, which is composed of eight prescription-processing pharmacies, three dispensing pharmacies, and six call-center pharmacies. Medco uses the Six Sigma methodology to reduce process variation, establish procedures to monitor the effectiveness of medication safety programs, and determine when these efforts do not achieve performance goals. A team reviewed the processes in home-delivery pharmacy and suggested strategies to improve the data-collection and medication-dispensing practices. A variety of improvement activities were implemented, including a procedure for developing, reviewing, and enhancing sound-alike/look-alike (SALA) alerts and system enhancements to improve processing consistency across the pharmacies. "External nonconformances" were reduced for several categories of medication errors, including wrong-drug selection (33%), wrong directions (49%), and SALA errors (69%). Control charts demonstrated evidence of sustained process improvement and actual reduction in specific medication error elements. Establishing a continuous quality improvement process to ensure that medication errors are minimized is critical to any health care organization providing medication services.

  13. Reducing patient identification errors related to glucose point-of-care testing.

    PubMed

    Alreja, Gaurav; Setia, Namrata; Nichols, James; Pantanowitz, Liron

    2011-01-01

    Patient identification (ID) errors in point-of-care testing (POCT) can cause test results to be transferred to the wrong patient's chart or prevent results from being transmitted and reported. Despite the implementation of patient barcoding and ongoing operator training at our institution, patient ID errors still occur with glucose POCT. The aim of this study was to develop a solution to reduce identification errors with POCT. Glucose POCT was performed by approximately 2,400 clinical operators throughout our health system. Patients are identified by scanning in wristband barcodes or by manual data entry using portable glucose meters. Meters are docked to upload data to a database server which then transmits data to any medical record matching the financial number of the test result. With a new model, meters connect to an interface manager where the patient ID (a nine-digit account number) is checked against patient registration data from admission, discharge, and transfer (ADT) feeds and only matched results are transferred to the patient's electronic medical record. With the new process, the patient ID is checked prior to testing, and testing is prevented until ID errors are resolved. When averaged over a period of a month, ID errors were reduced to 3 errors/month (0.015%) in comparison with 61.5 errors/month (0.319%) before implementing the new meters. Patient ID errors may occur with glucose POCT despite patient barcoding. The verification of patient identification should ideally take place at the bedside before testing occurs so that the errors can be addressed in real time. The introduction of an ADT feed directly to glucose meters reduced patient ID errors in POCT.
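
    A schematic of the verification flow, with entirely hypothetical names and data structures (the actual interface manager and ADT integration are vendor-specific), could look like this:

    ```python
    # Hypothetical sketch: hold a point-of-care result unless the scanned account number
    # matches an active registration from the ADT feed.
    adt_registry = {"123456789": "ward 5A", "987654321": "ICU"}   # account number -> admission

    def commit_result(account_number, glucose_mg_dl, results_db):
        """Only file results whose patient ID matches an active ADT registration."""
        if account_number not in adt_registry:
            # Unresolved ID: hold the result and flag it for correction instead of filing it blindly.
            print(f"ID error: {account_number} not found in ADT feed; result held for review")
            return False
        results_db.setdefault(account_number, []).append(glucose_mg_dl)
        return True

    db = {}
    commit_result("123456789", 104.0, db)    # matched: filed to the correct chart
    commit_result("123456780", 212.0, db)    # mismatched: blocked until the ID error is resolved
    print(db)
    ```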

  14. Reducing patient identification errors related to glucose point-of-care testing

    PubMed Central

    Alreja, Gaurav; Setia, Namrata; Nichols, James; Pantanowitz, Liron

    2011-01-01

    Background: Patient identification (ID) errors in point-of-care testing (POCT) can cause test results to be transferred to the wrong patient's chart or prevent results from being transmitted and reported. Despite the implementation of patient barcoding and ongoing operator training at our institution, patient ID errors still occur with glucose POCT. The aim of this study was to develop a solution to reduce identification errors with POCT. Materials and Methods: Glucose POCT was performed by approximately 2,400 clinical operators throughout our health system. Patients are identified by scanning in wristband barcodes or by manual data entry using portable glucose meters. Meters are docked to upload data to a database server which then transmits data to any medical record matching the financial number of the test result. With a new model, meters connect to an interface manager where the patient ID (a nine-digit account number) is checked against patient registration data from admission, discharge, and transfer (ADT) feeds and only matched results are transferred to the patient's electronic medical record. With the new process, the patient ID is checked prior to testing, and testing is prevented until ID errors are resolved. Results: When averaged over a period of a month, ID errors were reduced to 3 errors/month (0.015%) in comparison with 61.5 errors/month (0.319%) before implementing the new meters. Conclusion: Patient ID errors may occur with glucose POCT despite patient barcoding. The verification of patient identification should ideally take place at the bedside before testing occurs so that the errors can be addressed in real time. The introduction of an ADT feed directly to glucose meters reduced patient ID errors in POCT. PMID:21633490

  15. Fine needle aspiration cytology of oral and oropharyngeal lesions with an emphasis on the diagnostic utility and pitfalls.

    PubMed

    Gupta, Nalini; Banik, Tarak; Rajwanshi, Arvind; Radotra, Bishan D; Panda, Naresh; Dey, Pranab; Srinivasan, Radhika; Nijhawan, Raje

    2012-01-01

    This study was undertaken to evaluate the diagnostic utility and pitfalls of fine needle aspiration cytology (FNAC) in oral and oropharyngeal lesions. This was a retrospective audit of oral and oropharyngeal lesions diagnosed with FNAC over a period of six years (2005-2010). Oral/oropharyngeal lesions [n=157] comprised 0.35% of the total FNAC load. Ages ranged from 1 to 80 years, with a male:female ratio of 1.4:1. Aspirates were inadequate in 7% of cases. Histopathology was available in 73/157 (46.5%) cases. The palate was the most common site of involvement [n=66], followed by tongue [n=35], buccal mucosa [n=18], floor of the mouth [n=17], tonsil [n=10], alveolus [n=5], retromolar trigone [n=3], and posterior pharyngeal wall [n=3]. Cytodiagnoses were categorized into infective/inflammatory lesions and benign cysts, and benign and malignant tumours. Uncommon lesions included ectopic lingual thyroid and adult rhabdomyoma of the tongue, and solitary fibrous tumor (SFT) and leiomyosarcoma of the buccal mucosa. A single false-positive case was dense inflammation with squamous cells misinterpreted as squamous cell carcinoma (SCC) on cytology. There were eight false-negative cases, mainly due to sampling error. One false-negative case due to interpretation error was in a salivary gland tumor. The sensitivity of FNAC in diagnosing oral/oropharyngeal lesions was 71.4%; specificity was 97.8% with diagnostic accuracy of 87.7%. Salivary gland tumors and SCC are the most common lesions seen in the oral cavity. FNAC proves to be highly effective in diagnosing the spectrum of different lesions in this region. Sampling error is the main cause of false-negative cases in this region.

  16. An easy-to-use diagnostic system development shell

    NASA Technical Reports Server (NTRS)

    Tsai, L. C.; Ross, J. B.; Han, C. Y.; Wee, W. G.

    1987-01-01

    The Diagnostic System Development Shell (DSDS), an expert system development shell for diagnostic systems, is described. The major objective of building the DSDS is to create a very easy to use and friendly environment for knowledge engineers and end-users. The DSDS is written in OPS5 and CommonLisp. It runs on a VAX/VMS system. A set of domain independent, generalized rules is built in the DSDS, so the users need not be concerned about building the rules. The facts are explicitly represented in a unified format. A powerful check facility which helps the user to check the errors in the created knowledge bases is provided. A judgement facility and other useful facilities are also available. A diagnostic system based on the DSDS system is question driven and can call or be called by other knowledge based systems written in OPS5 and CommonLisp. A prototype diagnostic system for diagnosing a Philips constant potential X-ray system has been built using the DSDS.

  17. Correction of the spectral calibration of the Joint European Torus core light detecting and ranging Thomson scattering diagnostic using ray tracing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawke, J.; Scannell, R.; Maslov, M.

    2013-10-15

    This work isolated the cause of the observed discrepancy between the electron temperature (Te) measurements before and after the JET Core LIDAR Thomson Scattering (TS) diagnostic was upgraded. In the upgrade process, stray light filters positioned just before the detectors were removed from the system. Modelling showed that the shift imposed on the stray light filters transmission functions due to the variations in the incidence angles of the collected photons impacted plasma measurements. To correct for this identified source of error, correction factors were developed using ray tracing models for the calibration and operational states of the diagnostic. The application of these correction factors resulted in an increase in the observed Te, resulting in the partial if not complete removal of the observed discrepancy in the measured Te between the JET core LIDAR TS diagnostic, High Resolution Thomson Scattering, and the Electron Cyclotron Emission diagnostics.

  18. Temporal Decompostion of a Distribution System Quasi-Static Time-Series Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry A; Hunsberger, Randolph J

    This paper documents the first phase of an investigation into reducing runtimes of complex OpenDSS models through parallelization. As the method seems promising, future work will quantify - and further mitigate - errors arising from this process. In this initial report, we demonstrate how, through the use of temporal decomposition, the runtime of a complex distribution-system-level quasi-static time-series simulation can be reduced roughly in proportion to the level of parallelization. Using this method, the monolithic model runtime of 51 hours was reduced to a minimum of about 90 minutes. As expected, this comes at the expense of control and voltage errors at the time-slice boundaries. All evaluations were performed using a real distribution circuit model with the addition of 50 PV systems, representing a mock complex PV impact study. We are able to reduce the induced transition errors through the addition of controls initialization, though small errors persist. The time savings with parallelization are so significant that we feel additional investigation to reduce control errors is warranted.
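
    The decomposition scheme is described only at a high level; as a rough illustration of the idea (not the OpenDSS workflow), the sketch below runs independent time slices of a toy quasi-static simulation in parallel and gives each slice a short warm-up window so that controller-like state settles before the slice's official start, mimicking the controls initialization mentioned above. All names, dynamics, and parameters are assumptions.

    ```python
    # Hedged sketch of temporal decomposition: run independent time slices in
    # parallel, each with a discarded warm-up window to reduce boundary errors.
    # The toy "simulate" step stands in for a power-flow solve at each time step.
    from concurrent.futures import ProcessPoolExecutor

    def simulate_slice(args):
        start, stop, warmup = args
        state = 0.0
        results = []
        for t in range(max(start - warmup, 0), stop):
            state = 0.9 * state + 0.1 * (t % 24)   # toy dynamics with "control" lag
            if t >= start:                          # discard warm-up points
                results.append((t, round(state, 3)))
        return results

    def run_parallel(n_steps=96, n_slices=4, warmup=6):
        bounds = [(i * n_steps // n_slices, (i + 1) * n_steps // n_slices, warmup)
                  for i in range(n_slices)]
        with ProcessPoolExecutor() as pool:
            slices = list(pool.map(simulate_slice, bounds))
        return [pt for s in slices for pt in s]

    if __name__ == "__main__":
        print(len(run_parallel()), "time steps simulated across 4 parallel slices")
    ```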

  19. Prospective memory in an air traffic control simulation: External aids that signal when to act

    PubMed Central

    Loft, Shayne; Smith, Rebekah E.; Bhaskara, Adella

    2011-01-01

    At work and in our personal life we often need to remember to perform intended actions at some point in the future, referred to as Prospective Memory. Individuals sometimes forget to perform intentions in safety-critical work contexts. Holding intentions can also interfere with ongoing tasks. We applied theories and methods from the experimental literature to test the effectiveness of external aids in reducing prospective memory error and costs to ongoing tasks in an air traffic control simulation. Participants were trained to accept and hand-off aircraft, and to detect aircraft conflicts. For the prospective memory task participants were required to substitute alternative actions for routine actions when accepting target aircraft. Across two experiments, external display aids were provided that presented the details of target aircraft and associated intended actions. We predicted that aids would only be effective if they provided information that was diagnostic of target occurrence and in this study we examined the utility of aids that directly cued participants when to allocate attention to the prospective memory task. When aids were set to flash when the prospective memory target aircraft needed to be accepted, prospective memory error and costs to ongoing tasks of aircraft acceptance and conflict detection were reduced. In contrast, aids that did not alert participants specifically when the target aircraft were present provided no advantage compared to when no aids were used. These findings have practical implications for the potential relative utility of automated external aids for occupations where individuals monitor multi-item dynamic displays. PMID:21443381

  20. Prospective memory in an air traffic control simulation: external aids that signal when to act.

    PubMed

    Loft, Shayne; Smith, Rebekah E; Bhaskara, Adella

    2011-03-01

    At work and in our personal life we often need to remember to perform intended actions at some point in the future, referred to as Prospective Memory. Individuals sometimes forget to perform intentions in safety-critical work contexts. Holding intentions can also interfere with ongoing tasks. We applied theories and methods from the experimental literature to test the effectiveness of external aids in reducing prospective memory error and costs to ongoing tasks in an air traffic control simulation. Participants were trained to accept and hand-off aircraft and to detect aircraft conflicts. For the prospective memory task, participants were required to substitute alternative actions for routine actions when accepting target aircraft. Across two experiments, external display aids were provided that presented the details of target aircraft and associated intended actions. We predicted that aids would only be effective if they provided information that was diagnostic of target occurrence, and in this study, we examined the utility of aids that directly cued participants when to allocate attention to the prospective memory task. When aids were set to flash when the prospective memory target aircraft needed to be accepted, prospective memory error and costs to ongoing tasks of aircraft acceptance and conflict detection were reduced. In contrast, aids that did not alert participants specifically when the target aircraft were present provided no advantage compared to when no aids were used. These findings have practical implications for the potential relative utility of automated external aids for occupations where individuals monitor multi-item dynamic displays.

  1. The effects of divided attention at study and reporting procedure on regulation and monitoring for episodic recall.

    PubMed

    Sauer, James; Hope, Lorraine

    2016-09-01

    Eyewitnesses regulate the level of detail (grain size) reported to balance competing demands for informativeness and accuracy. However, research to date has predominantly examined metacognitive monitoring for semantic memory tasks, and used relatively artificial phased reporting procedures. Further, although the established role of confidence in this regulation process may affect the confidence-accuracy relation for volunteered responses in predictable ways, previous investigations of the confidence-accuracy relation for eyewitness recall have largely overlooked the regulation of response granularity. Using a non-phased paradigm, Experiment 1 compared reporting and monitoring following optimal and sub-optimal (divided attention) encoding conditions. Participants showed evidence of sacrificing accuracy for informativeness, even when memory quality was relatively weak. Participants in the divided (cf. full) attention condition showed reduced accuracy for fine- but not coarse-grained responses. However, indices of discrimination and confidence diagnosticity showed no effect of divided attention. Experiment 2 compared the effects of divided attention at encoding on reporting and monitoring using both non-phased and 2-phase procedures. Divided attention effects were consistent with Experiment 1. However, compared to those in the non-phased condition, participants in the 2-phase condition displayed a more conservative control strategy, and confidence ratings were less diagnostic of accuracy. When memory quality was reduced, although attempts to balance informativeness and accuracy increased the chance of fine-grained response errors, confidence provided an index of the likely accuracy of volunteered fine-grained responses for both conditions. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Diagnostic aids: the Surgical Sieve revisited.

    PubMed

    Chai, Jason; Evans, Lloyd; Hughes, Tom

    2017-08-01

    Diagnostic errors are well documented in the literature and emphasise the need to teach diagnostic skills at an early stage in medical school to create effective and safe clinicians. Hence, there may be a place for diagnostic aids (such as the Surgical Sieve) that provide a framework for generating ideas about diagnoses. With repeated use of the Surgical Sieve in teaching sessions with students, and prompted by the traditional handheld wheels used in antenatal clinics, we developed the Compass Medicine, a handheld diagnostic wheel comprising three concentric discs attached at the centre. We report a preliminary study comparing the Surgical Sieve and the Compass Medicine in generating differential diagnoses. A total of 48 third-year medical students from Cardiff University participated in a study aimed at measuring the efficacy of diagnostic aids (Surgical Sieve and Compass Medicine) in generating diagnoses. We quantified the effect each aid had on the number of diagnoses generated, and compared the size of the effect between the two diagnostic aids. The study suggests that both diagnostic aids prompted users to generate a greater number of diagnoses, but there was no significant difference in the size of effect between the two aids. We hope that our study with diagnostic aids will encourage the use of robust tools to teach medical students an easily visualised framework for diagnostic thinking. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  3. [A web-based Colour Vision Test as a Tool for Qualitative Evaluation of Pseudoisochromatic Pflüger Trident Colour Plates].

    PubMed

    Kuchenbecker, Joern

    2018-05-22

    Pseudoisochromatic colour plates are constructed according to specific principles and can vary considerably in quality. To check their diagnostic quality, they have to be tested on a large number of subjects, but this procedure can be tedious and expensive. Therefore, the use of a standardised web-based test is recommended. Eight Pflüger trident colour plates (including 1 demo plate) from the 1980 Velhagen edition were digitised and inserted into a web-based colour vision test (www.color-vision-test.info). After visual display calibration and 2 demonstrations of the demo plate (#1) to introduce the test procedure, 7 red-green colour plates (#3, 4, 10, 11, 12, 13, 16) were presented in a randomised order, in 3 different randomised positions, each for 10 seconds. The user had to specify the opening of the Pflüger trident by mouse click or arrow keys. 6360 evaluations of all plates from 2120 randomised subjects were included. Without an error limit, the detection rates of the plates were between 72.2% (plate #3) and 90.7% (plate #16; n = 6360). With an error limit of 7 errors per test, the detection rates were between 21.6% (plate #3) and 67.7% (plate #16; n = 1556). With an error limit of 14 errors, the detection rates were between 10.9% (plate #11) and 40.1% (plate #16; n = 606). Plate #16 showed the highest detection rate at the zero-error level as well as at the 7- and 14-error limits; the diagnostic quality of this plate was therefore low. After its colourimetric data were improved, the detection rate was significantly lower. The differences in quality of pseudoisochromatic Pflüger trident colour plates can thus be tested without great effort using a web-based test, and a poor-quality colour plate can then be optimised. Georg Thieme Verlag KG Stuttgart · New York.

  4. Decision making in trauma settings: simulation to improve diagnostic skills.

    PubMed

    Murray, David J; Freeman, Brad D; Boulet, John R; Woodhouse, Julie; Fehr, James J; Klingensmith, Mary E

    2015-06-01

    In the setting of acute injury, a wrong, missed, or delayed diagnosis can impact survival. Clinicians rely on pattern recognition and heuristics to rapidly assess injuries, but an overreliance on these approaches can result in a diagnostic error. Simulation has been advocated as a method for practitioners to learn how to recognize the limitations of heuristics and develop better diagnostic skills. The objective of this study was to determine whether simulation could be used to provide teams the experiences in managing scenarios that require the use of heuristic as well as analytic diagnostic skills to effectively recognize and treat potentially life-threatening injuries. Ten scenarios were developed to assess the ability of trauma teams to provide initial care to a severely injured patient. Seven standard scenarios simulated severe injuries that once diagnosed could be effectively treated using standard Advanced Trauma Life Support algorithms. Because diagnostic error occurs more commonly in complex clinical settings, 3 complex scenarios required teams to use more advanced diagnostic skills to uncover a coexisting condition and treat the patient. Teams composed of 3 to 5 practitioners were evaluated in the performance of 7 (of 10) randomly selected scenarios (5 standard, 2 complex). Expert raters scored teams using standardized checklists and global scores. Eighty-three surgery, emergency medicine, and anesthesia residents constituted 21 teams. Expert raters were able to reliably score the scenarios. Teams accomplished fewer checklist actions and received lower global scores on the 3 analytic scenarios (73.8% [12.3%] and 5.9 [1.6], respectively) compared with the 7 heuristic scenarios (83.2% [11.7%] and 6.6 [1.3], respectively; P < 0.05 for both). Teams led by more junior residents received higher global scores on the analytic scenarios (6.4 [1.3]) than the more senior team leaders (5.3 [1.7]). This preliminary study indicates that teams led by more senior residents received higher scores when managing heuristic scenarios but were less effective when managing the scenarios that require a more analytic approach. Simulation can be used to provide teams with decision-making experiences in trauma settings and could be used to improve diagnostic skills as well as study the decision-making process.

  5. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    PubMed

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework study is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
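
    CDM itself is not shown in the record; the sketch below illustrates the kind of learning-curve monitoring it describes, using scikit-learn on synthetic stand-in data so a coordinator can see whether additional patient inclusions still improve cross-validated performance. The estimator, metric, and data are assumptions for illustration only.

    ```python
    # Hedged illustration (not CDM itself): a learning curve shows whether additional
    # patient inclusions still improve a diagnostic model's predictive performance.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import learning_curve

    X, y = make_classification(n_samples=600, n_features=20, random_state=0)  # stand-in for eCRF data

    sizes, train_scores, val_scores = learning_curve(
        LogisticRegression(max_iter=1000), X, y,
        train_sizes=np.linspace(0.1, 1.0, 8), cv=5, scoring="roc_auc")

    for n, score in zip(sizes, val_scores.mean(axis=1)):
        print(f"{n:4d} patients -> cross-validated AUC {score:.3f}")
    # When the curve flattens, further inclusions add little predictive value.
    ```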

  6. Error reduction in EMG signal decomposition

    PubMed Central

    Kline, Joshua C.

    2014-01-01

    Decomposition of the electromyographic (EMG) signal into constituent action potentials and the identification of individual firing instances of each motor unit in the presence of ambient noise are inherently probabilistic processes, whether performed manually or with automated algorithms. Consequently, they are subject to errors. We set out to classify and reduce these errors by analyzing 1,061 motor-unit action-potential trains (MUAPTs), obtained by decomposing surface EMG (sEMG) signals recorded during human voluntary contractions. Decomposition errors were classified into two general categories: location errors representing variability in the temporal localization of each motor-unit firing instance and identification errors consisting of falsely detected or missed firing instances. To mitigate these errors, we developed an error-reduction algorithm that combines multiple decomposition estimates to determine a more probable estimate of motor-unit firing instances with fewer errors. The performance of the algorithm is governed by a trade-off between the yield of MUAPTs obtained above a given accuracy level and the time required to perform the decomposition. When applied to a set of sEMG signals synthesized from real MUAPTs, the identification error was reduced by an average of 1.78%, improving the accuracy to 97.0%, and the location error was reduced by an average of 1.66 ms. The error-reduction algorithm in this study is not limited to any specific decomposition strategy. Rather, we propose it be used for other decomposition methods, especially when analyzing precise motor-unit firing instances, as occurs when measuring synchronization. PMID:25210159
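
    The published error-reduction algorithm is not reproduced in the record; the sketch below shows one simple way to merge several estimates of motor-unit firing instants by keeping only instants supported by a majority of decompositions within a small time tolerance. The tolerance, vote threshold, and data are assumptions, not the authors' parameters.

    ```python
    # Hedged sketch: merge several estimates of motor-unit firing instants by
    # majority vote; a simplified illustration, not the published algorithm.
    import numpy as np

    def consensus_firings(estimates, tol=0.005, min_votes=2):
        """estimates: list of 1-D arrays of firing times (s); keep instants supported
        by at least `min_votes` estimates within `tol` seconds of each other."""
        pooled = np.sort(np.concatenate(estimates))
        kept, i = [], 0
        while i < len(pooled):
            cluster = pooled[(pooled >= pooled[i]) & (pooled <= pooled[i] + tol)]
            if len(cluster) >= min_votes:
                kept.append(float(cluster.mean()))   # consensus localization
            i += len(cluster)
        return np.array(kept)

    est_a = np.array([0.100, 0.250, 0.410])
    est_b = np.array([0.102, 0.252, 0.600])
    est_c = np.array([0.099, 0.251, 0.412])
    print(consensus_firings([est_a, est_b, est_c]))   # ~[0.100, 0.251, 0.411]
    ```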

  7. Mobile phone imaging and cloud-based analysis for standardized malaria detection and reporting.

    PubMed

    Scherr, Thomas F; Gupta, Sparsh; Wright, David W; Haselton, Frederick R

    2016-06-27

    Rapid diagnostic tests (RDTs) have been widely deployed in low-resource settings. These tests are typically read by visual inspection, and accurate record keeping and data aggregation remains a substantial challenge. A successful malaria elimination campaign will require new strategies that maximize the sensitivity of RDTs, reduce user error, and integrate results reporting tools. In this report, an unmodified mobile phone was used to photograph RDTs, which were subsequently uploaded into a globally accessible database, REDCap, and then analyzed three ways: with an automated image processing program, visual inspection, and a commercial lateral flow reader. The mobile phone image processing detected 20.6 malaria parasites/microliter of blood, compared to the commercial lateral flow reader which detected 64.4 parasites/microliter. Experienced observers visually identified positive malaria cases at 12.5 parasites/microliter, but encountered reporting errors and false negatives. Visual interpretation by inexperienced users resulted in only an 80.2% true negative rate, with substantial disagreement in the lower parasitemia range. We have demonstrated that combining a globally accessible database, such as REDCap, with mobile phone based imaging of RDTs provides objective, secure, and automated data collection and result reporting. This simple combination of existing technologies would appear to be an attractive tool for malaria elimination campaigns.
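
    The image-processing program itself is not described in detail; as a rough illustration of the kind of measurement an automated RDT reader might make, the sketch below compares the mean pixel intensity in an assumed test-line region against local background on a synthetic strip image. The region coordinates and decision threshold are assumptions.

    ```python
    # Hedged sketch of an automated RDT read-out: a darker test line relative to
    # the local background gives a larger positive signal. Synthetic data only.
    import numpy as np

    def rdt_line_signal(gray_image, line_rows, bg_rows):
        line = gray_image[line_rows[0]:line_rows[1]].mean()
        background = gray_image[bg_rows[0]:bg_rows[1]].mean()
        return background - line      # darker line -> larger positive signal

    rng = np.random.default_rng(0)
    strip = rng.normal(200, 3, size=(120, 40))   # synthetic bright strip
    strip[55:65] -= 25                           # synthetic faint test line
    signal = rdt_line_signal(strip, line_rows=(55, 65), bg_rows=(0, 20))
    print("test-line signal:", round(signal, 1), "-> positive" if signal > 10 else "-> negative")
    ```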

  8. Correction of nonuniformity error of Gafchromic EBT2 and EBT3.

    PubMed

    Katsuda, Toshizo; Gotanda, Rumi; Gotanda, Tatsuhiro; Akagawa, Takuya; Tanki, Nobuyoshi; Kuwano, Tadao; Yabunaka, Kouichi

    2016-05-08

    This study investigates an X-ray dose measurement method for computed tomography using Gafchromic films. Nonuniformity of the active layer is a major problem in Gafchromic films. In radiotherapy, nonuniformity error is reduced by applying the double-exposure technique, but this is impractical in diagnostic radiology because of the heel effect. Therefore, we propose replacing the X-rays in the double-exposure technique with ultraviolet (UV)-A irradiation of Gafchromic EBT2 and EBT3. To improve the reproducibility of the scan position, Gafchromic EBT2 and EBT3 films were attached to a 3-mm-thick acrylic plate. The samples were then irradiated with a 10 W UV-A fluorescent lamp placed at a distance of 72 cm for 30, 60, and 90 minutes. The profile curves were evaluated along the long and short axes of the film center, and the standard deviations of the pixel values were calculated over large areas of the films. A paired t-test was performed. UV-A irradiation exerted a significant effect on Gafchromic EBT2 (paired t-test; p = 0.0275) but not on EBT3 (paired t-test; p = 0.2785). Similarly, the homogeneity was improved in Gafchromic EBT2 but not in EBT3. Therefore, the double-exposure technique under UV-A irradiation is suitable only for EBT2 films.

  9. Mobile phone imaging and cloud-based analysis for standardized malaria detection and reporting

    NASA Astrophysics Data System (ADS)

    Scherr, Thomas F.; Gupta, Sparsh; Wright, David W.; Haselton, Frederick R.

    2016-06-01

    Rapid diagnostic tests (RDTs) have been widely deployed in low-resource settings. These tests are typically read by visual inspection, and accurate record keeping and data aggregation remains a substantial challenge. A successful malaria elimination campaign will require new strategies that maximize the sensitivity of RDTs, reduce user error, and integrate results reporting tools. In this report, an unmodified mobile phone was used to photograph RDTs, which were subsequently uploaded into a globally accessible database, REDCap, and then analyzed three ways: with an automated image processing program, visual inspection, and a commercial lateral flow reader. The mobile phone image processing detected 20.6 malaria parasites/microliter of blood, compared to the commercial lateral flow reader which detected 64.4 parasites/microliter. Experienced observers visually identified positive malaria cases at 12.5 parasites/microliter, but encountered reporting errors and false negatives. Visual interpretation by inexperienced users resulted in only an 80.2% true negative rate, with substantial disagreement in the lower parasitemia range. We have demonstrated that combining a globally accessible database, such as REDCap, with mobile phone based imaging of RDTs provides objective, secure, and automated data collection and result reporting. This simple combination of existing technologies would appear to be an attractive tool for malaria elimination campaigns.

  10. Applying Model-Based Reasoning to the FDIR of the Command and Data Handling Subsystem of the International Space Station

    NASA Technical Reports Server (NTRS)

    Robinson, Peter; Shirley, Mark; Fletcher, Daryl; Alena, Rick; Duncavage, Dan; Lee, Charles

    2003-01-01

    All of the International Space Station (ISS) systems that require computer control depend upon the hardware and software of the Command and Data Handling (C&DH) system, currently a network of over 30 386-class computers called Multiplexer/Demultiplexers (MDMs) [18]. The Caution and Warning (C&W) system [7], a set of software tasks that runs on the MDMs, is responsible for detecting, classifying, and reporting errors in all ISS subsystems, including the C&DH. Fault Detection, Isolation and Recovery (FDIR) of these errors is typically handled with a combination of automatic and human effort. We are developing an Advanced Diagnostic System (ADS) to augment the C&W system with decision support tools that aid in root cause analysis and resolve differing human and machine estimates of C&DH state. These tools, which draw on model-based reasoning [16, 29], will improve the speed and accuracy of flight controllers by reducing the uncertainty in C&DH state estimation, allowing for a more complete assessment of risk. We have run tests with ISS telemetry, focusing on those C&W events that relate to the C&DH system itself. This paper describes our initial results and subsequent plans.

  11. A variational assimilation method for satellite and conventional data: Development of basic model for diagnosis of cyclone systems

    NASA Technical Reports Server (NTRS)

    Achtemeier, G. L.; Ochs, H. T., III; Kidder, S. Q.; Scott, R. W.; Chen, J.; Isard, D.; Chance, B.

    1986-01-01

    A three-dimensional diagnostic model for the assimilation of satellite and conventional meteorological data is developed with the variational method of undetermined multipliers. Gridded fields of data of differing type, quality, location, and measurement source are weighted according to measurement accuracy and merged using least-squares criteria so that the two nonlinear horizontal momentum equations, the hydrostatic equation, and an integrated continuity equation are satisfied. The model is used to compare multivariate variational objective analyses with and without satellite data against initial analyses and the observations, using criteria determined by the dynamical constraints, the observations, and pattern recognition. It is also shown that the diagnosed local tendencies of the horizontal velocity components are in good agreement with the observed patterns and with tendencies calculated from unadjusted data. In addition, it is found that the day-night differences in TOVS biases are statistically different (95% confidence) at most levels. Also developed is a hybrid nonlinear sigma vertical coordinate that eliminates hydrostatic truncation error in the middle and upper troposphere and reduces truncation error in the lower troposphere. Finally, it is found that the technique used to grid the initial data causes boundary effects to intrude into the interior of the analysis a distance equal to the average separation between observations.

  12. Perspective-taking abilities in the balance between autism tendencies and psychosis proneness.

    PubMed

    Abu-Akel, Ahmad M; Wood, Stephen J; Hansen, Peter C; Apperly, Ian A

    2015-06-07

    Difficulties with the ability to appreciate the perspective of others (mentalizing) are central to both autism and schizophrenia spectrum disorders. While the disorders are diagnostically independent, they can co-occur in the same individual. The effect of such co-morbidity is hypothesized to worsen mentalizing abilities. The recent influential 'diametric brain theory', however, suggests that the disorders are etiologically and phenotypically diametrical, predicting opposing effects on one's mentalizing abilities. To test these contrasting hypotheses, we evaluated the effect of psychosis and autism tendencies on the perspective-taking (PT) abilities of 201 neurotypical adults, on the assumption that autism tendencies and psychosis proneness are heritable dimensions of normal variation. We show that while both autism tendencies and psychosis proneness induce PT errors, their interaction reduced these errors. Our study is, to our knowledge, the first to observe that co-occurring autistic and psychotic traits can exert opposing influences on performance, producing a normalizing effect possibly by way of their diametrical effects on socio-cognitive abilities. This advances the notion that some individuals may, to some extent, be buffered against developing either illness or present fewer symptoms owing to a balanced expression of autistic and psychosis liability. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  13. Probe pressure effects on human skin diffuse reflectance and fluorescence spectroscopy measurements

    PubMed Central

    Lim, Liang; Nichols, Brandon; Rajaram, Narasimhan; Tunnell, James W.

    2011-01-01

    Diffuse reflectance and fluorescence spectroscopy are popular research techniques for noninvasive disease diagnostics. Most systems include an optical fiber probe that transmits and collects optical spectra in contact with the suspected lesion. The purpose of this study is to investigate probe pressure effects on human skin spectroscopic measurements. We conduct an in-vivo experiment on human skin tissue to study the short-term (<2 s) and long-term (>30 s) effects of probe pressure on diffuse reflectance and fluorescence measurements. Short-term light probe pressure (P0 < 9 mN∕mm2) effects are within 0 ± 10% on all physiological properties extracted from diffuse reflectance and fluorescence measurements, and less than 0 ± 5% for diagnostically significant physiological properties. Absorption decreases with site-specific variations due to blood being compressed out of the sampled volume. Reduced scattering coefficient variation is site specific. Intrinsic fluorescence shows a large standard error, although no specific pressure-related trend is observed. Differences in tissue structure and morphology contribute to site-specific probe pressure effects. Therefore, the effects of pressure can be minimized when the pressure is small and applied for a short amount of time; however, long-term and large pressures induce significant distortions in measured spectra. PMID:21280899

  14. Post-therapy lesions in patients with non-Hodgkin's lymphoma characterized by 18F-FDG PET/CT-guided biopsy using automated robotic biopsy arm.

    PubMed

    Radhakrishnan, Renjith K; Mittal, Bhagwant R; Basher, Rajender K; Prakash, Gaurav; Malhotra, Pankaj; Kalra, Naveen; Das, Ashim

    2018-01-01

    The aim of this study was to analyse the positive predictive value (PPV) of post-therapy fluorine-18-fluorodeoxyglucose (F-FDG) PET/CT performed for response or recurrence evaluation in patients with non-Hodgkin's lymphoma (NHL) and to appraise the diagnostic utility of F-FDG PET/CT-guided biopsy in this setting. A total of 17 patients with NHL showing F-FDG-avid lesions on F-FDG PET/CT performed for response or recurrence assessment underwent F-FDG PET/CT-guided biopsy using an automated robotic biopsy arm needle navigation technique. The objectives were analysed in reference to histopathology. In all, 15 of the 17 (88.5%) procedures yielded adequate representative tissue samples. Nine of the 15 lesions were positive for residual disease and the remainder revealed benign findings on histopathology. One patient with an inconclusive biopsy underwent surgical resection, and histopathology confirmed the presence of residual disease. The PPV of F-FDG PET/CT was observed to be 62.5% (10/16). F-FDG PET/CT for response evaluation in NHL has a low PPV and hence warrants histopathological correlation when F-FDG PET/CT findings influence management decisions. The diagnostic yield of F-FDG PET/CT-guided biopsy is high and has the potential to reduce sampling errors.

  15. High energy Coulomb-scattered electrons for relativistic particle beams and diagnostics

    DOE PAGES

    Thieberger, P.; Altinbas, Z.; Carlson, C.; ...

    2016-03-29

    A new system used for monitoring energetic Coulomb-scattered electrons as the main diagnostic for accurately aligning the electron and ion beams in the new Relativistic Heavy Ion Collider (RHIC) electron lenses is described in detail. The theory of electron scattering from relativistic ions is developed and applied to the design and implementation of the system used to achieve and maintain the alignment. Commissioning with gold and 3He beams is then described, as well as the successful utilization of the new system during the 2015 RHIC polarized proton run. Systematic errors of the new method are then estimated. Lastly, some possible future applications of Coulomb-scattered electrons for beam diagnostics are briefly discussed.

  16. Use of machine learning methods to reduce predictive error of groundwater models.

    PubMed

    Xu, Tianfang; Valocchi, Albert J; Choi, Jaesik; Amir, Eyal

    2014-01-01

    Quantitative analyses of groundwater flow and transport typically rely on a physically-based model, which is inherently subject to error. Errors in model structure, parameters, and data lead to both random and systematic error even in the output of a calibrated model. We develop complementary data-driven models (DDMs) to reduce the predictive error of physically-based groundwater models. Two machine learning techniques, the instance-based weighting and support vector regression, are used to build the DDMs. This approach is illustrated using two real-world case studies of the Republican River Compact Administration model and the Spokane Valley-Rathdrum Prairie model. The two groundwater models have different hydrogeologic settings, parameterization, and calibration methods. In the first case study, cluster analysis is introduced for data preprocessing to make the DDMs more robust and computationally efficient. The DDMs reduce the root-mean-square error (RMSE) of the temporal, spatial, and spatiotemporal prediction of piezometric head of the groundwater model by 82%, 60%, and 48%, respectively. In the second case study, the DDMs reduce the RMSE of the temporal prediction of piezometric head of the groundwater model by 77%. It is further demonstrated that the effectiveness of the DDMs depends on the existence and extent of the structure in the error of the physically-based model. © 2013, National GroundWater Association.
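
    The record names instance-based weighting and support vector regression as the data-driven models; the sketch below illustrates the general idea with support vector regression only, trained on the residual between synthetic "observed" heads and a deliberately biased stand-in for a physically-based model. The data, features, and hyperparameters are assumptions for illustration, not the case-study models.

    ```python
    # Hedged sketch: train a data-driven model on the residuals of a physics-based
    # model so that (physics prediction + learned correction) reduces RMSE.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 10, size=(400, 2))                  # e.g. pumping rate, recharge (assumed)
    true_head = 50 - 1.5 * X[:, 0] + 0.8 * X[:, 1]
    physics_head = 50 - 1.3 * X[:, 0] + 0.6 * X[:, 1]      # biased "calibrated" model
    observed = true_head + rng.normal(0, 0.2, size=400)

    X_tr, X_te, obs_tr, obs_te, phys_tr, phys_te = train_test_split(
        X, observed, physics_head, random_state=0)

    svr = SVR(kernel="rbf", C=10.0).fit(X_tr, obs_tr - phys_tr)   # learn the structured error
    corrected_te = phys_te + svr.predict(X_te)

    rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
    print("RMSE physics only :", round(rmse(phys_te, obs_te), 3))
    print("RMSE with DDM     :", round(rmse(corrected_te, obs_te), 3))
    ```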

  17. Modeling the Financial and Clinical Implications of Malaria Rapid Diagnostic Tests in the Case-management of Older Children and Adults in Kenya

    PubMed Central

    Zurovac, Dejan; Larson, Bruce A.; Skarbinski, Jacek; Slutsker, Laurence; Snow, Robert W.; Hamel, Mary J.

    2008-01-01

    Using data on clinical practices for outpatients 5 years and older, test accuracy, and malaria prevalence, we model financial and clinical implications of malaria rapid diagnostic tests (RDTs) under the new artemether-lumefantrine (AL) treatment policy in one high and one low malaria prevalence district in Kenya. In the high transmission district, RDTs as actually used would improve malaria treatment (61% less over-treatment but 8% more under-treatment) and lower costs (21% less). Nonetheless, the majority of patients with malaria would not be correctly treated with AL. In the low transmission district, especially because the treatment policy was new and AL was not widely used, RDTs as actually used would yield a minor reduction in under-treatment errors (36% less but the base is small) with 41% higher costs. In both districts, adherence to revised clinical practices with RDTs has the potential to further decrease treatment errors with acceptable costs. PMID:18541764
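
    The record does not reproduce the model's inputs; purely to make the structure of such a calculation concrete, the sketch below works through expected costs and treatment errors per 1,000 febrile patients under presumptive treatment versus RDT-guided treatment. The prevalence, test accuracy, and unit costs are invented placeholders rather than the study's Kenyan data.

    ```python
    # Hedged, purely illustrative arithmetic (all inputs are assumptions):
    # costs and treatment errors per 1,000 febrile patients.
    prevalence = 0.30          # fraction of febrile patients with malaria (assumed)
    sens, spec = 0.95, 0.90    # assumed RDT accuracy
    cost_rdt, cost_al = 0.60, 2.40   # assumed unit costs (USD)

    n = 1000
    malaria = n * prevalence
    non_malaria = n - malaria

    # Presumptive: everyone gets AL; all non-malaria patients are over-treated.
    presumptive_cost = n * cost_al
    presumptive_over = non_malaria

    # RDT-guided: treat only positives; under-treatment = false negatives.
    treated = malaria * sens + non_malaria * (1 - spec)
    rdt_cost = n * cost_rdt + treated * cost_al
    under_treated = malaria * (1 - sens)
    over_treated = non_malaria * (1 - spec)

    print(f"presumptive: cost ${presumptive_cost:.0f}, over-treated {presumptive_over:.0f}")
    print(f"RDT-guided : cost ${rdt_cost:.0f}, over-treated {over_treated:.0f}, under-treated {under_treated:.0f}")
    ```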

  18. High-resolution, low-delay, and error-resilient medical ultrasound video communication using H.264/AVC over mobile WiMAX networks.

    PubMed

    Panayides, Andreas; Antoniou, Zinonas C; Mylonas, Yiannos; Pattichis, Marios S; Pitsillides, Andreas; Pattichis, Constantinos S

    2013-05-01

    In this study, we describe an effective video communication framework for the wireless transmission of H.264/AVC medical ultrasound video over mobile WiMAX networks. Medical ultrasound video is encoded using diagnostically-driven, error resilient encoding, where quantization levels are varied as a function of the diagnostic significance of each image region. We demonstrate how our proposed system allows for the transmission of high-resolution clinical video that is encoded at the clinical acquisition resolution and can then be decoded with low-delay. To validate performance, we perform OPNET simulations of mobile WiMAX Medium Access Control (MAC) and Physical (PHY) layers characteristics that include service prioritization classes, different modulation and coding schemes, fading channels conditions, and mobility. We encode the medical ultrasound videos at the 4CIF (704 × 576) resolution that can accommodate clinical acquisition that is typically performed at lower resolutions. Video quality assessment is based on both clinical (subjective) and objective evaluations.
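
    The encoder configuration is not given in the record; as a rough illustration of diagnostically driven encoding, the sketch below assigns a finer quantization parameter (QP) to macroblocks inside a clinician-marked region of interest and a coarser QP elsewhere. The QP values and region coordinates are assumptions, not the authors' settings.

    ```python
    # Hedged sketch of region-variable quantization: macroblocks inside a
    # diagnostically important region of interest get a lower (finer) QP.
    def qp_map(n_mb_rows, n_mb_cols, roi, qp_roi=24, qp_bg=36):
        """roi = (top, left, bottom, right) in macroblock units (assumed layout)."""
        top, left, bottom, right = roi
        return [[qp_roi if top <= r < bottom and left <= c < right else qp_bg
                 for c in range(n_mb_cols)]
                for r in range(n_mb_rows)]

    for row in qp_map(6, 8, roi=(1, 2, 5, 6)):
        print(row)
    ```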

  19. Metabolic emergencies and the emergency physician.

    PubMed

    Fletcher, Janice Mary

    2016-02-01

    Fifty percent of inborn errors of metabolism present in later childhood and adulthood, with crises commonly precipitated by minor viral illnesses or increased protein ingestion. Many physicians only consider an IEM after more common conditions (such as sepsis) have been considered. In view of the large number of inborn errors, it might appear that their diagnosis requires precise knowledge of a large number of biochemical pathways and their interrelationships. In fact, an adequate diagnostic approach can be based on the proper use of only a few screening tests. A detailed history of antecedent events, together with these simple screening tests, can be diagnostic, leading to life-saving, targeted treatments for many disorders. Unrecognised, IEM can lead to significant mortality and morbidity. Advice is available 24/7 through the metabolic service based at the major paediatric hospital in each state and through Starship Children's Health in New Zealand. © 2016 The Author. Journal of Paediatrics and Child Health © 2016 Paediatrics and Child Health Division (Royal Australasian College of Physicians).

  20. [The experience of implementation of system of quality management in the Department of Laboratory Diagnostic of the N.V. Sklifosofskiy Research Institute of Emergency Care of Moscow Health Department: a lecture].

    PubMed

    Zenina, L P; Godkov, M A

    2013-08-01

    The article presents the experience of implementing a quality management system in the practice of a multi-field laboratory of an emergency medical care hospital. An analysis of laboratory errors is presented and methods for their prevention are demonstrated. The ratings of the department of laboratory diagnostics of the N. V. Sklifosofskiy research institute of emergency care in the EQAS (USA) Monthly Clinical Chemistry program from 2007 onward are presented. Implementation of the quality management system for laboratory analysis in the department made it possible to provide physicians in clinical departments with reliable information, and clinicians' confidence in the reported results increased. The effectiveness of laboratory diagnostics increased owing to lower analysis costs without a negative impact on the quality of the curative process.

  1. Errors from approximation of ODE systems with reduced order models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevska, Tanya

    2016-12-30

    This code calculates the error incurred when systems of ordinary differential equations (ODEs) are approximated using Proper Orthogonal Decomposition (POD) reduced-order model (ROM) methods, and compares and analyzes the errors for two POD ROM variants. The first variant is the standard POD ROM; the second is a modification of the method that uses the values of the time derivatives (a.k.a. time-derivative snapshots). The code compares the errors from the two variants under different conditions.
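
    The released code itself is not shown in the record; the sketch below illustrates, under assumed problem sizes, how the two variants could be compared on a small linear ODE: a standard POD basis built from solution snapshots versus a basis built from solution plus time-derivative snapshots.

    ```python
    # Hedged sketch (not the released code): compare the reconstruction error of a
    # standard POD ROM with a variant whose snapshot set also includes
    # time-derivative snapshots, for a small linear ODE dx/dt = A x.
    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(0)
    n, r = 40, 4                                  # full dimension, ROM rank (assumed)
    A = -np.diag(np.linspace(0.5, 5.0, n)) + 0.05 * rng.standard_normal((n, n))
    x0 = rng.standard_normal(n)
    t_eval = np.linspace(0.0, 2.0, 80)

    full = solve_ivp(lambda t, x: A @ x, (0, 2), x0, t_eval=t_eval).y   # n x 80 snapshots

    def pod_rom_error(snapshots):
        U = np.linalg.svd(snapshots, full_matrices=False)[0][:, :r]     # POD basis
        Ar, z0 = U.T @ A @ U, U.T @ x0
        red = solve_ivp(lambda t, z: Ar @ z, (0, 2), z0, t_eval=t_eval).y
        return np.linalg.norm(U @ red - full) / np.linalg.norm(full)

    print("standard POD ROM error        :", pod_rom_error(full))
    print("POD with derivative snapshots :", pod_rom_error(np.hstack([full, A @ full])))
    ```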

  2. Demonstration of Fuel Hot-Spot Pressure in Excess of 50 Gbar for Direct-Drive, Layered Deuterium-Tritium Implosions on OMEGA

    NASA Astrophysics Data System (ADS)

    Regan, S. P.; Goncharov, V. N.; Igumenshchev, I. V.; Sangster, T. C.; Betti, R.; Bose, A.; Boehly, T. R.; Bonino, M. J.; Campbell, E. M.; Cao, D.; Collins, T. J. B.; Craxton, R. S.; Davis, A. K.; Delettrez, J. A.; Edgell, D. H.; Epstein, R.; Forrest, C. J.; Frenje, J. A.; Froula, D. H.; Gatu Johnson, M.; Glebov, V. Yu.; Harding, D. R.; Hohenberger, M.; Hu, S. X.; Jacobs-Perkins, D.; Janezic, R.; Karasik, M.; Keck, R. L.; Kelly, J. H.; Kessler, T. J.; Knauer, J. P.; Kosc, T. Z.; Loucks, S. J.; Marozas, J. A.; Marshall, F. J.; McCrory, R. L.; McKenty, P. W.; Meyerhofer, D. D.; Michel, D. T.; Myatt, J. F.; Obenschain, S. P.; Petrasso, R. D.; Radha, P. B.; Rice, B.; Rosenberg, M. J.; Schmitt, A. J.; Schmitt, M. J.; Seka, W.; Shmayda, W. T.; Shoup, M. J.; Shvydky, A.; Skupsky, S.; Solodov, A. A.; Stoeckl, C.; Theobald, W.; Ulreich, J.; Wittman, M. D.; Woo, K. M.; Yaakobi, B.; Zuegel, J. D.

    2016-07-01

    A record fuel hot-spot pressure P_hs = 56 ± 7 Gbar was inferred from x-ray and nuclear diagnostics for direct-drive inertial confinement fusion cryogenic, layered deuterium-tritium implosions on the 60-beam, 30-kJ, 351-nm OMEGA Laser System. When hydrodynamically scaled to the energy of the National Ignition Facility, these implosions achieved a Lawson parameter ~60% of the value required for ignition [A. Bose et al., Phys. Rev. E 93, LM15119ER (2016)], similar to indirect-drive implosions [R. Betti et al., Phys. Rev. Lett. 114, 255003 (2015)], and nearly half of the direct-drive ignition-threshold pressure. Relative to symmetric, one-dimensional simulations, the inferred hot-spot pressure is approximately 40% lower. Three-dimensional simulations suggest that low-mode distortion of the hot spot seeded by laser-drive nonuniformity and target-positioning error reduces target performance.

  3. Demonstration of fuel hot-spot pressure in excess of 50 Gbar for direct-drive, layered deuterium-tritium implosions on OMEGA

    DOE PAGES

    Regan, S. P.; Goncharov, V. N.; Igumenshchev, I. V.; ...

    2016-07-07

    A record fuel hot-spot pressure P_hs = 56 ± 7 Gbar was inferred from x-ray and nuclear diagnostics for direct-drive inertial confinement fusion cryogenic, layered deuterium–tritium implosions on the 60-beam, 30-kJ, 351-nm OMEGA Laser System. When hydrodynamically scaled to the energy of the National Ignition Facility (NIF), these implosions achieved a Lawson parameter ~60% of the value required for ignition [A. Bose et al., Phys. Rev. E (in press)], similar to indirect-drive implosions [R. Betti et al., Phys. Rev. Lett. 114, 255003 (2015)], and nearly half of the direct-drive ignition-threshold pressure. Relative to symmetric, one-dimensional simulations, the inferred hot-spot pressure is ~40% lower. Furthermore, three-dimensional simulations suggest that low-mode distortion of the hot spot seeded by laser-drive nonuniformity and target-positioning error reduces target performance.

  4. Demonstration of fuel hot-spot pressure in excess of 50 Gbar for direct-drive, layered deuterium-tritium implosions on OMEGA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Regan, S. P.; Goncharov, V. N.; Igumenshchev, I. V.

    A record fuel hot-spot pressure P_hs = 56 ± 7 Gbar was inferred from x-ray and nuclear diagnostics for direct-drive inertial confinement fusion cryogenic, layered deuterium–tritium implosions on the 60-beam, 30-kJ, 351-nm OMEGA Laser System. When hydrodynamically scaled to the energy of the National Ignition Facility (NIF), these implosions achieved a Lawson parameter ~60% of the value required for ignition [A. Bose et al., Phys. Rev. E (in press)], similar to indirect-drive implosions [R. Betti et al., Phys. Rev. Lett. 114, 255003 (2015)], and nearly half of the direct-drive ignition-threshold pressure. Relative to symmetric, one-dimensional simulations, the inferred hot-spot pressure is ~40% lower. Furthermore, three-dimensional simulations suggest that low-mode distortion of the hot spot seeded by laser-drive nonuniformity and target-positioning error reduces target performance.

  5. Three-dimensional photogrammetric measurement of magnetic field lines in the WEGA stellarator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drewelow, Peter; Braeuer, Torsten; Otte, Matthias

    2009-12-15

    The magnetic confinement of plasmas in fusion experiments can degrade significantly due to perturbations of the magnetic field. A precise analysis of the magnetic field in a stellarator-type experiment utilizes electrons as test particles following the magnetic field lines. The usual fluorescent detector for this electron beam limits the information provided to two-dimensional cut views at certain toroidal positions. The technique described in this article, however, allows measuring the three-dimensional structure of the magnetic field by means of close-range photogrammetry. After testing and optimizing the main diagnostic components, measurements of the magnetic field lines were accomplished with a spatial resolution of 5 mm. The results agree with numeric calculations, qualifying this technique as an additional tool to investigate magnetic field configurations in a stellarator. For possible future applications, ways to reduce experimental error sources are indicated.

  6. Patient-Centered Mobile Health Data Management Solution for the German Health Care System (The DataBox Project).

    PubMed

    Brinker, Titus Josef; Rudolph, Stefanie; Richter, Daniela; von Kalle, Christof

    2018-05-11

    This article describes the DataBox project, which offers a perspective on a new health data management solution in Germany. DataBox was initially conceptualized as a repository of individual lung cancer patient data (structured and unstructured). The patient is the owner of the data and is able to share his or her data with different stakeholders. Data are transferred, displayed, and stored online, but not archived. In the long run, the project aims to replace the conventional method of paper- and storage-device-based handling of data for all patients in Germany, leading to better organization and availability of data, which reduces duplicate diagnostic procedures and treatment errors and enables the training and use of artificial intelligence algorithms on large datasets. ©Titus Josef Brinker, Stefanie Rudolph, Daniela Richter, Christof von Kalle. Originally published in JMIR Cancer (http://cancer.jmir.org), 11.05.2018.

  7. Cancer Risks Associated with External Radiation From Diagnostic Imaging Procedures

    PubMed Central

    Linet, Martha S.; Slovis, Thomas L.; Miller, Donald L.; Kleinerman, Ruth; Lee, Choonsik; Rajaraman, Preetha; de Gonzalez, Amy Berrington

    2012-01-01

    The 600% increase in medical radiation exposure to the US population since 1980 has provided immense benefit but also poses potential future cancer risks to patients. Most of the increase is from diagnostic radiologic procedures. The objectives of this review are to summarize epidemiologic data on cancer risks associated with diagnostic procedures, describe how exposures from recent diagnostic procedures relate to radiation levels linked with cancer occurrence, and propose a framework of strategies to reduce radiation from diagnostic imaging in patients. We briefly review radiation dose definitions, mechanisms of radiation carcinogenesis, key epidemiologic studies of medical and other radiation sources and cancer risks, and dose trends from diagnostic procedures. We describe cancer risks from experimental studies, future projected risks from current imaging procedures, and the potential for higher risks in genetically susceptible populations. To reduce future projected cancers from diagnostic procedures, we advocate widespread use of evidence-based appropriateness criteria for decisions about imaging procedures, oversight of equipment to deliver reliably the minimum radiation required to attain clinical objectives, development of electronic lifetime records of imaging procedures for patients and their physicians, and commitment by medical training programs, professional societies, and radiation protection organizations to educate all stakeholders in reducing radiation from diagnostic procedures. PMID:22307864

  8. Classification and reduction of pilot error

    NASA Technical Reports Server (NTRS)

    Rogers, W. H.; Logan, A. L.; Boley, G. D.

    1989-01-01

    Human error is a primary or contributing factor in about two-thirds of commercial aviation accidents worldwide. With the ultimate goal of reducing pilot error accidents, this contract effort is aimed at understanding the factors underlying error events and reducing the probability of certain types of errors by modifying underlying factors such as flight deck design and procedures. A review of the literature relevant to error classification was conducted. Classification includes categorizing types of errors, the information processing mechanisms and factors underlying them, and identifying factor-mechanism-error relationships. The classification scheme developed by Jens Rasmussen was adopted because it provided a comprehensive yet basic error classification shell or structure that could easily accommodate addition of details on domain-specific factors. For these purposes, factors specific to the aviation environment were incorporated. Hypotheses concerning the relationship of a small number of underlying factors, information processing mechanisms, and error types identified in the classification scheme were formulated. ASRS data were reviewed and a simulation experiment was performed to evaluate and quantify the hypotheses.

  9. Quantifying the impact of material-model error on macroscale quantities-of-interest using multiscale a posteriori error-estimation techniques

    DOE PAGES

    Brown, Judith A.; Bishop, Joseph E.

    2016-07-20

    An a posteriori error-estimation framework is introduced to quantify and reduce modeling errors resulting from approximating complex mesoscale material behavior with a simpler macroscale model. Such errors may be prevalent when modeling welds and additively manufactured structures, where spatial variations and material textures may be present in the microstructure. We consider a case where a <100> fiber texture develops in the longitudinal scanning direction of a weld. Transversely isotropic elastic properties are obtained through homogenization of a microstructural model with this texture and are considered the reference weld properties within the error-estimation framework. Conversely, isotropic elastic properties are considered approximate weld properties since they contain no representation of texture. Errors introduced by using isotropic material properties to represent a weld are assessed through a quantified error bound in the elastic regime. Lastly, an adaptive error reduction scheme is used to determine the optimal spatial variation of the isotropic weld properties to reduce the error bound.

  10. Characteristics associated with requests by pathologists for second opinions on breast biopsies

    PubMed Central

    Geller, Berta M; Nelson, Heidi D; Weaver, Donald L; Frederick, Paul D; Allison, Kimberly H; Onega, Tracy; Carney, Patricia A; Tosteson, Anna N A; Elmore, Joann G

    2018-01-01

    Aims: Second opinions in pathology improve patient safety by reducing diagnostic errors, leading to more appropriate clinical treatment decisions. Little objective data are available regarding the factors triggering a request for second opinion, despite second opinion consultations being part of the diagnostic system of pathology. We therefore sought to assess the breast biopsy case and interpreting pathologist characteristics associated with second opinion requests. Methods: Pathologist surveys and their interpretations of 60 test set cases were used to explore the relationships between case characteristics, pathologist characteristics and case perceptions, and requests for second opinions. Data were evaluated by logistic regression and generalised estimating equations. Results: 115 pathologists provided 6900 assessments; pathologists requested second opinions on 70% (4827/6900) of their assessments, and 36% (1731/4827) of these would not have been required by policy. All associations between case characteristics and requesting second opinions were statistically significant, including diagnostic category, breast density, biopsy type, and number of diagnoses noted per case. Exclusive of institutional policies, pathologists wanted second opinions most frequently for atypia (66%) and least frequently for invasive cancer (20%). Second opinion rates were higher when the pathologist had lower assessment confidence, in cases with higher perceived difficulty, and in cases with borderline diagnoses. Conclusions: Pathologists request second opinions for challenging cases, particularly those with atypia, high breast density, core needle biopsies, or many co-existing diagnoses. Further studies should evaluate whether the case characteristics identified in this study could be used as clinical criteria to prompt system-level strategies for mandating second opinions. PMID:28465449

  11. Piecewise compensation for the nonlinear error of fiber-optic gyroscope scale factor

    NASA Astrophysics Data System (ADS)

    Zhang, Yonggang; Wu, Xunfeng; Yuan, Shun; Wu, Lei

    2013-08-01

    Fiber-Optic Gyroscope (FOG) scale factor nonlinear error results in errors in a Strapdown Inertial Navigation System (SINS). In order to reduce the nonlinear error of the FOG scale factor in SINS, a compensation method based on piecewise curve fitting of the FOG output is proposed in this paper. Firstly, the causes of FOG scale factor error are introduced and the definition of nonlinear degree is provided. The output range of the FOG is then divided into several small pieces, and curve fitting is performed in each piece to obtain the scale factor parameters; different parameters are used in different pieces to improve FOG output precision. These parameters are identified using a three-axis turntable, and the nonlinear error of the FOG scale factor can thereby be reduced. Finally, a three-axis swing experiment with the SINS verifies that the proposed method reduces attitude output errors of the SINS by compensating the nonlinear error of the FOG scale factor, improving navigation precision. The experimental results also demonstrate that the compensation scheme is easy to implement and can effectively compensate the nonlinear error of the FOG scale factor with only slightly increased computational complexity. This method can be used in FOG-based inertial technology to improve precision.
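
    The identification procedure is described only at a high level; the sketch below illustrates the piecewise idea with synthetic turntable data: the FOG output range is split into segments, a separate low-order fit is computed per segment, and the segment-specific fit is applied at run time. The breakpoints, noise level, and nonlinearity are assumptions, not the paper's values.

    ```python
    # Hedged sketch of piecewise scale-factor compensation with synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    rate_true = np.linspace(-200, 200, 401)                     # deg/s, turntable reference
    fog_out = rate_true * (1 + 5e-4 * np.tanh(rate_true / 80))  # synthetic nonlinear output
    fog_out += rng.normal(0, 0.02, rate_true.size)

    edges = np.linspace(fog_out.min(), fog_out.max(), 9)        # 8 output-range pieces
    coeffs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (fog_out >= lo) & (fog_out <= hi)
        coeffs.append(np.polyfit(fog_out[m], rate_true[m], 1))  # linear fit per piece

    def compensate(x):
        """Apply the segment-specific fit for raw FOG output x."""
        i = min(np.searchsorted(edges, x, side="right") - 1, len(coeffs) - 1)
        return np.polyval(coeffs[max(i, 0)], x)

    for x in (-150.0, -20.0, 90.0):
        print(f"raw {x:7.1f} deg/s -> compensated {compensate(x):7.2f} deg/s")
    ```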

  12. Diagnostics of wear in aeronautical systems

    NASA Technical Reports Server (NTRS)

    Wedeven, L. D.

    1979-01-01

    The use of appropriate diagnostic tools for aircraft oil-wetted components is reviewed, noting that it can reduce direct operating costs through reduced unscheduled maintenance, particularly in helicopter engine and transmission systems where bearing failures are a significant cost factor. Engine and transmission wear modes are described, and diagnostic methods for oil and wear particle analysis, the spectrometric oil analysis program, chip detectors, ferrography, in-line oil monitors and radioactive isotope tagging are discussed, noting that they are effective over a limited range of particle sizes but complement each other if used in parallel. Fine filtration can potentially increase time between overhauls, but reduces the effectiveness of conventional oil monitoring techniques, so alternative diagnostic techniques must be used. It is concluded that the development of a diagnostic system should proceed in parallel with, and be integral to, the development of the mechanical system.

  13. Interventions to reduce medication errors in neonatal care: a systematic review

    PubMed Central

    Nguyen, Minh-Nha Rhylie; Mosel, Cassandra

    2017-01-01

    Background: Medication errors represent a significant but often preventable cause of morbidity and mortality in neonates. The objective of this systematic review was to determine the effectiveness of interventions to reduce neonatal medication errors. Methods: A systematic review was undertaken of all comparative and noncomparative studies published in any language, identified from searches of PubMed and EMBASE and reference-list checking. Eligible studies were those investigating the impact of any medication safety interventions aimed at reducing medication errors in neonates in the hospital setting. Results: A total of 102 studies were identified that met the inclusion criteria, including 86 comparative and 16 noncomparative studies. Medication safety interventions were classified into six themes: technology (n = 38; e.g. electronic prescribing), organizational (n = 16; e.g. guidelines, policies, and procedures), personnel (n = 13; e.g. staff education), pharmacy (n = 9; e.g. clinical pharmacy service), hazard and risk analysis (n = 8; e.g. error detection tools), and multifactorial (n = 18; e.g. any combination of previous interventions). Significant variability was evident across all included studies, with differences in intervention strategies, trial methods, types of medication errors evaluated, and how medication errors were identified and evaluated. Most studies demonstrated an appreciable risk of bias. The vast majority of studies (>90%) demonstrated a reduction in medication errors. A similar median reduction of 50–70% in medication errors was evident across studies included within each of the identified themes, but findings varied considerably from a 16% increase in medication errors to a 100% reduction in medication errors. Conclusion: While neonatal medication errors can be reduced through multiple interventions aimed at improving the medication use process, no single intervention appeared clearly superior. Further research is required to evaluate the relative cost-effectiveness of the various medication safety interventions to facilitate decisions regarding uptake and implementation into clinical practice. PMID:29387337

  14. An undulator based soft x-ray source for microscopy on the Duke electron storage ring

    NASA Astrophysics Data System (ADS)

    Johnson, Lewis Elgin

    1998-09-01

    This dissertation describes the design, development, and installation of an undulator-based soft x-ray source on the Duke Free Electron Laser laboratory electron storage ring. Insertion device and soft x-ray beamline physics and technology are all discussed in detail. The Duke/NIST undulator is a 3.64-m long hybrid design constructed by the Brobeck Division of Maxwell Laboratories. Originally built for an FEL project at the National Institute of Standards and Technology, the undulator was acquired by Duke in 1992 for use as a soft x-ray source for the FEL laboratory. Initial Hall probe measurements on the magnetic field distribution of the undulator revealed field errors of more than 0.80%. Initial phase errors for the device were more than 11 degrees. Through a series of in situ and off-line measurements and modifications we have re-tuned the magnet field structure of the device to produce strong spectral characteristics through the 5th harmonic. A low operating K has served to reduce the effects of magnetic field errors on the harmonic spectral content. Although rms field errors remained at 0.75%, we succeeded in reducing phase errors to less than 5 degrees. Using trajectory simulations from magnetic field data, we have computed the spectral output given the interaction of the Duke storage ring electron beam and the NIST undulator. Driven by a series of concerns and constraints over maximum utility, personnel safety and funding, we have also constructed a unique front end beamline for the undulator. The front end has been designed for maximum throughput of the 1st harmonic around 40A in its standard mode of operation. The front end has an alternative mode of operation which transmits the 3rd and 5th harmonics. This compact system also allows for the extraction of some of the bend magnet produced synchrotron and transition radiation from the storage ring. As with any well designed front end system, it also provides excellent protection to personnel and to the storage ring. A diagnostic beamline consisting of a transmission grating spectrometer and scanning wire beam profile monitor was constructed to measure the spatial and spectral characteristics of the undulator radiation. Test of the system with a circulating electron beam has confirmed the magnetic and focusing properties of the undulator, and verified that it can be used without perturbing the orbit of the beam.

  15. Precise method of compensating radiation-induced errors in a hot-cathode-ionization gauge with correcting electrode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saeki, Hiroshi, E-mail: saeki@spring8.or.jp; Magome, Tamotsu, E-mail: saeki@spring8.or.jp

    2014-10-06

    To compensate pressure-measurement errors caused by a synchrotron radiation environment, a precise method using a hot-cathode-ionization-gauge head with a correcting electrode was developed and tested in a simulation experiment with excess electrons in the SPring-8 storage ring. This precise method to improve the measurement accuracy can correctly reduce the pressure-measurement errors caused by electrons originating from the external environment, and originating from the primary gauge filament influenced by spatial conditions of the installed vacuum-gauge head. As a result of the simulation experiment to confirm the performance in reducing the errors caused by the external environment, the pressure-measurement error using this method was approximately less than several percent in the pressure range from 10⁻⁵ Pa to 10⁻⁸ Pa. After the experiment, to confirm the performance in reducing the error caused by spatial conditions, an additional experiment was carried out using a sleeve and showed that the improved function was available.

  16. Diagnostics of Robust Growth Curve Modeling Using Student's "t" Distribution

    ERIC Educational Resources Information Center

    Tong, Xin; Zhang, Zhiyong

    2012-01-01

    Growth curve models with different types of distributions of random effects and of intraindividual measurement errors for robust analysis are compared. After demonstrating the influence of distribution specification on parameter estimation, 3 methods for diagnosing the distributions for both random effects and intraindividual measurement errors…

  17. 49 CFR 395.16 - Electronic on-board recording devices.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... transfer through wired and wireless methods to portable computers used by roadside safety assurance... the results of power-on self-tests and diagnostic error codes. (e) Date and time. (1) The date and... part. Wireless communication information interchange methods must comply with the requirements of the...

  18. Maximum likelihood phase-retrieval algorithm: applications.

    PubMed

    Nahrstedt, D A; Southwell, W H

    1984-12-01

    The maximum likelihood estimator approach is shown to be effective in determining the wave front aberration in systems involving laser and flow field diagnostics and optical testing. The robustness of the algorithm enables convergence even in cases of severe wave front error and real, nonsymmetrical, obscured amplitude distributions.

  19. 21 CFR 886.1770 - Manual refractor.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1770 Manual refractor. (a) Identification. A manual refractor is a device that is a set of lenses of various dioptric powers intended to measure the refractive error of the eye. (b) Classification. Class I (general controls). The device is exempt from the...

  20. Learning a locomotor task: with or without errors?

    PubMed

    Marchal-Crespo, Laura; Schneider, Jasmin; Jaeger, Lukas; Riener, Robert

    2014-03-04

    Robotic haptic guidance is the most commonly used robotic training strategy to reduce performance errors while training. However, research on motor learning has emphasized that errors are a fundamental neural signal that drives motor adaptation. Thus, researchers have proposed robotic therapy algorithms that amplify movement errors rather than decrease them. However, to date, no study has precisely analyzed which training strategy is most appropriate for learning a simple task. In this study, the impact of robotic training strategies that amplify or reduce errors on muscle activation and motor learning of a simple locomotor task was investigated in twenty-two healthy subjects. The experiment was conducted with the MAgnetic Resonance COmpatible Stepper (MARCOS), a special robotic device developed for investigations in the MR scanner. The robot moved the dominant leg passively and the subject was requested to actively synchronize the non-dominant leg to achieve an alternating stepping-like movement. Learning with four different training strategies that reduce or amplify errors was evaluated: (i) haptic guidance: errors were eliminated by passively moving the limbs; (ii) no guidance: no robot disturbances were presented; (iii) error amplification: existing errors were amplified with repulsive forces; (iv) noise disturbance: errors were evoked intentionally with a randomly varying force disturbance on top of the no-guidance strategy. Additionally, the activation of four lower limb muscles was measured by means of surface electromyography (EMG). Strategies that reduce or do not amplify errors limit muscle activation during training and result in poor learning gains. Adding random disturbing forces during training seems to increase attention, and therefore improve motor learning. Error amplification seems to be the most suitable strategy for initially less skilled subjects, perhaps because subjects could better detect their errors and correct them. Error strategies have great potential to evoke higher muscle activation and provoke better motor learning of simple tasks. Neuroimaging evaluation of brain regions involved in learning can provide valuable information on observed behavioral outcomes related to learning processes. The impact of these strategies on neurological patients needs further investigation.

  1. Does exposure to simulated patient cases improve accuracy of clinicians' predictive value estimates of diagnostic test results? A within-subjects experiment at St Michael's Hospital, Toronto, Canada.

    PubMed

    Armstrong, Bonnie; Spaniol, Julia; Persaud, Nav

    2018-02-13

    Clinicians often overestimate the probability of a disease given a positive test result (positive predictive value; PPV) and the probability of no disease given a negative test result (negative predictive value; NPV). The purpose of this study was to investigate whether experiencing simulated patient cases (ie, an 'experience format') would promote more accurate PPV and NPV estimates compared with a numerical format. Participants were presented with information about three diagnostic tests for the same fictitious disease and were asked to estimate the PPV and NPV of each test. Tests varied with respect to sensitivity and specificity. Information about each test was presented once in the numerical format and once in the experience format. The study used a 2 (format: numerical vs experience) × 3 (diagnostic test: gold standard vs low sensitivity vs low specificity) within-subjects design. The study was completed online, via Qualtrics (Provo, Utah, USA). 50 physicians (12 clinicians and 38 residents) from the Department of Family and Community Medicine at St Michael's Hospital in Toronto, Canada, completed the study. All participants had completed at least 1 year of residency. Estimation accuracy was quantified by the mean absolute error (MAE; absolute difference between estimate and true predictive value). PPV estimation errors were larger in the numerical format (MAE=32.6%, 95% CI 26.8% to 38.4%) compared with the experience format (MAE=15.9%, 95% CI 11.8% to 20.0%, d =0.697, P<0.001). Likewise, NPV estimation errors were larger in the numerical format (MAE=24.4%, 95% CI 14.5% to 34.3%) than in the experience format (MAE=11.0%, 95% CI 6.5% to 15.5%, d =0.303, P=0.015). Exposure to simulated patient cases promotes accurate estimation of predictive values in clinicians. This finding carries implications for diagnostic training and practice. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
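
    The predictive values that participants were asked to estimate follow directly from Bayes' theorem, so the "numerical format" reduces to a few lines of arithmetic. This is a minimal sketch with arbitrary illustrative test characteristics, not the study's fictitious tests:

    ```python
    def predictive_values(sensitivity, specificity, prevalence):
        """Return (PPV, NPV) for a test applied at a given disease prevalence."""
        p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        ppv = sensitivity * prevalence / p_pos
        npv = specificity * (1 - prevalence) / (1 - p_pos)
        return ppv, npv

    # Example: a fairly accurate test still has a modest PPV at low prevalence.
    ppv, npv = predictive_values(sensitivity=0.90, specificity=0.90, prevalence=0.05)
    print(f"PPV = {ppv:.2f}, NPV = {npv:.3f}")   # PPV ~= 0.32, NPV ~= 0.994
    ```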

  2. Accuracy of vaginal symptom self-diagnosis algorithms for deployed military women.

    PubMed

    Ryan-Wenger, Nancy A; Neal, Jeremy L; Jones, Ashley S; Lowe, Nancy K

    2010-01-01

    Deployed military women have an increased risk for development of vaginitis due to extreme temperatures, primitive sanitation, hygiene and laundry facilities, and unavailable or unacceptable healthcare resources. The Women in the Military Self-Diagnosis (WMSD) and treatment kit was developed as a field-expedient solution to this problem. The primary study aims were to evaluate the accuracy of women's self-diagnosis of vaginal symptoms and eight diagnostic algorithms and to predict potential self-medication omission and commission error rates. Participants included 546 active duty, deployable Army (43.3%) and Navy (53.6%) women with vaginal symptoms who sought healthcare at troop medical clinics on base. In the clinic lavatory, women conducted a self-diagnosis using a sterile cotton swab to obtain vaginal fluid, a FemExam card to measure positive or negative pH and amines, and the investigator-developed WMSD Decision-Making Guide. Potential self-diagnoses were "bacterial infection" (bacterial vaginosis [BV] and/or trichomonas vaginitis [TV]), "yeast infection" (candida vaginitis [CV]), "no infection/normal," or "unclear." The Affirm VPIII laboratory reference standard was used to detect clinically significant amounts of vaginal fluid DNA for organisms associated with BV, TV, and CV. Women's self-diagnostic accuracy was 56% for BV/TV and 69.2% for CV. False-positives would have led to a self-medication commission error rate of 20.3% for BV/TV and 8% for CV. Potential self-medication omission error rates due to false-negatives were 23.7% for BV/TV and 24.8% for CV. The positive predictive value of diagnostic algorithms ranged from 0% to 78.1% for BV/TV and 41.7% for CV. The algorithms were based on clinical diagnostic standards. The nonspecific nature of vaginal symptoms, mixed infections, and a faulty device intended to measure vaginal pH and amines explain why none of the algorithms reached the goal of 95% accuracy. The next prototype of the WMSD kit will not include nonspecific vaginal signs and symptoms in favor of recently available point-of-care devices that identify antigens or enzymes of the causative BV, TV, and CV organisms.

  3. Influence of Tooth Spacing Error on Gears With and Without Profile Modifications

    NASA Technical Reports Server (NTRS)

    Padmasolala, Giri; Lin, Hsiang H.; Oswald, Fred B.

    2000-01-01

    A computer simulation was conducted to investigate the effectiveness of profile modification for reducing dynamic loads in gears with different tooth spacing errors. The simulation examined varying amplitudes of spacing error and differences in the span of teeth over which the error occurs. The modification considered included both linear and parabolic tip relief. The analysis considered spacing error that varies around most of the gear circumference (similar to a typical sinusoidal error pattern) as well as a shorter span of spacing errors that occurs on only a few teeth. The dynamic analysis was performed using a revised version of a NASA gear dynamics code, modified to add tooth spacing errors to the analysis. Results obtained from the investigation show that linear tip relief is more effective in reducing dynamic loads on gears with small spacing errors but parabolic tip relief becomes more effective as the amplitude of spacing error increases. In addition, the parabolic modification is more effective for the more severe error case where the error is spread over a longer span of teeth. The findings of this study can be used to design robust tooth profile modification for improving dynamic performance of gear sets with different tooth spacing errors.

  4. ALGORITHM TO REDUCE APPROXIMATION ERROR FROM THE COMPLEX-VARIABLE BOUNDARY-ELEMENT METHOD APPLIED TO SOIL FREEZING.

    USGS Publications Warehouse

    Hromadka, T.V.; Guymon, G.L.

    1985-01-01

    An algorithm is presented for the numerical solution of the Laplace equation boundary-value problem, which is assumed to apply to soil freezing or thawing. The Laplace equation is numerically approximated by the complex-variable boundary-element method. The algorithm aids in reducing integrated relative error by providing a true measure of modeling error along the solution domain boundary. This measure of error can be used to select locations for adding, removing, or relocating nodal points on the boundary or to provide bounds for the integrated relative error of unknown nodal variable values along the boundary.

  5. Digital radiography can reduce scoliosis x-ray exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kling, T.F. Jr.; Cohen, M.J.; Lindseth, R.E.

    1990-09-01

    Digital radiology is a new computerized system of acquiring x-rays in a digital (electronic) format. It possesses a greatly expanded dose response curve that allows a very broad range of x-ray dose to produce a diagnostic image. Potential advantages include significantly reduced radiation exposure without loss of image quality, acquisition of images of constant density irrespective of under or over exposure, and reduced repeat rates for unsatisfactory films. The authors prospectively studied 30 adolescents with scoliosis who had both conventional (full dose) and digital (full, one-half, or one-third dose) x-rays. They found digital made AP and lateral images with all anatomic areas clearly depicted at full and one-half dose. Digital laterals were better at full dose and equal to conventional at one-half dose. Cobb angles were easily measured on all one-third dose AP and on 8 of 10 one-third dose digital laterals. Digital clearly depicted the Risser sign at one-half and one-third dose and the repeat rate was nil in this study, indicating digital compensates well for exposure errors. The study indicates that digital does allow radiation dose to be reduced by at least one-half in scoliosis patients and that it does have improved image quality with good contrast over a wide range of x-ray exposure.

  6. Interruption Practice Reduces Errors

    DTIC Science & Technology

    2014-01-01

    dangers of errors at the PCS. Electronic health record systems are used to reduce certain errors related to poor handwriting and dosage... 10.16, MSE = .31, p < .05, η² = .18. A significant interaction between the number of interruptions and interrupted trials suggests that trials... the variance when calculating whether a memory has a higher signal than interference. If something in addition to activation contributes to goal...

  7. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools into the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with high inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime from the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
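
    The kind of online aggregation the prototype tool performs, averaging a fine-grid field onto a user-defined coarse grid before anything is written to disk, can be sketched conceptually. The actual interface lives inside the Fortran-based MESSy/ICON infrastructure; the standalone numpy illustration below, including its grid sizes and function names, is assumed for illustration only.

    ```python
    import numpy as np

    def aggregate_to_coarse(field, factor):
        """Block-average a 2-D field onto a coarser grid (factor x factor cells per box)."""
        ny, nx = field.shape
        assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly"
        return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

    fine = np.random.rand(180, 360)        # e.g. a 1-degree model field
    coarse = aggregate_to_coarse(fine, 4)  # averaged onto a 4-degree grid
    print(coarse.shape)                    # (45, 90) -> far less data to write out
    ```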

  8. Intraoperative visualisation of functional structures facilitates safe frameless stereotactic biopsy in the motor eloquent regions of the brain.

    PubMed

    Zhang, Jia-Shu; Qu, Ling; Wang, Qun; Jin, Wei; Hou, Yuan-Zheng; Sun, Guo-Chen; Li, Fang-Ye; Yu, Xin-Guang; Xu, Ban-Nan; Chen, Xiao-Lei

    2017-12-20

    For stereotactic brain biopsy involving motor eloquent regions, the surgical objective is to enhance diagnostic yield and preserve neurological function. To achieve this aim, we implemented functional neuro-navigation and intraoperative magnetic resonance imaging (iMRI) into the biopsy procedure. The impact of this integrated technique on the surgical outcome and postoperative neurological function was investigated and evaluated. Thirty nine patients with lesions involving motor eloquent structures underwent frameless stereotactic biopsy assisted by functional neuro-navigation and iMRI. Intraoperative visualisation was realised by integrating anatomical and functional information into a navigation framework to improve biopsy trajectories and preserve eloquent structures. iMRI was conducted to guarantee the biopsy accuracy and detect intraoperative complications. The perioperative change of motor function and biopsy error before and after iMRI were recorded, and the role of functional information in trajectory selection and the relationship between the distance from sampling site to nearby eloquent structures and the neurological deterioration were further analyzed. Functional neuro-navigation helped modify the original trajectories and sampling sites in 35.90% (16/39) of cases to avoid the damage of eloquent structures. Even though all the lesions were high-risk of causing neurological deficits, no significant difference was found between preoperative and postoperative muscle strength. After data analysis, 3mm was supposed to be the safe distance for avoiding transient neurological deterioration. During surgery, the use of iMRI significantly reduced the biopsy errors (p = 0.042) and potentially increased the diagnostic yield from 84.62% (33/39) to 94.87% (37/39). Moreover, iMRI detected intraoperative haemorrhage in 5.13% (2/39) of patients, all of them benefited from the intraoperative strategies based on iMRI findings. Intraoperative visualisation of functional structures could be a feasible, safe and effective technique. Combined with intraoperative high-field MRI, it contributed to enhance the biopsy accuracy and lower neurological complications in stereotactic brain biopsy involving motor eloquent areas.

  9. DIAGNOSTIC MODEL EVALUATION FOR CARBONACEOUS PM 2.5 USING ORGANIC MARKERS MEASURED IN THE SOUTHEASTERN U.S.

    EPA Science Inventory

    Summertime concentrations of fine particulate carbon in the southeastern United States are consistently underestimated by air quality models. In an effort to understand the cause of this error, the Community Multiscale Air Quality (CMAQ) model is instrumented to track primary org...

  10. [Description of clinical thinking by the dual-process theory].

    PubMed

    Peña G, Luis

    2012-06-01

    Clinical thinking is a very complex process that can be described by dual-process theory: it has an intuitive part (which recognizes patterns) and an analytical part (which tests hypotheses). It is vulnerable to cognitive biases that professionals must be aware of in order to minimize diagnostic errors.

  11. Teaching Statistics with Minitab II.

    ERIC Educational Resources Information Center

    Ryan, T. A., Jr.; And Others

    Minitab is a statistical computing system which uses simple language, produces clear output, and keeps track of bookkeeping automatically. Error checking with English diagnostics and inclusion of several default options help to facilitate use of the system by students. Minitab II is an improved and expanded version of the original Minitab which…

  12. Interlanguage Variation: A Point Missed?

    ERIC Educational Resources Information Center

    Tice, Bradley Scott

    A study investigated patterns in phonological errors occurring in the speaker's second language in both formal and informal speaking situations. Subjects were three adult learners of English as a second language, including a native Spanish-speaker and two Asians. Their speech was recorded during diagnostic testing (formal speech) and in everyday…

  13. Investigation of an Error Theory for Conjoint Measurement Methodology.

    DTIC Science & Technology

    1983-05-01

    1ybren, 1982; Srinivasan and Shocker, 1973a, 1973b; Ullrich and Cummins, 1973; Takane, Young, and de Leeuw, 190C; Young, 1972... procedures as a diagnostic tool. Specifically, they used the computed STRESS value and a measure of fit they called PRECAP that could be obtained

  14. The School-Based Multidisciplinary Team and Nondiscriminatory Assessment.

    ERIC Educational Resources Information Center

    Pfeiffer, Steven I.

    The potential of multidisciplinary teams to control for possible errors in diagnosis, classification, and placement and to provide a vehicle for ensuring effective outcomes of diagnostic practices is illustrated. The present functions of the school-based multidisciplinary team (also called, for example, assessment team, child study team, placement…

  15. Strategies for Teaching Fractions: Using Error Analysis for Intervention and Assessment

    ERIC Educational Resources Information Center

    Spangler, David B.

    2011-01-01

    Many students struggle with fractions and must understand them before learning higher-level math. Veteran educator David B. Spangler provides research-based tools that are aligned with NCTM and Common Core State Standards. He outlines powerful diagnostic methods for analyzing student work and providing timely, specific, and meaningful…

  16. 78 FR 9060 - Request for Nominations for Voting Members on Public Advisory Panels or Committees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-07

    ... diagnostic assays, e.g., hepatologists; molecular biologists. Molecular and Clinical Genetics: 2, June 1, 2013.... Individuals with training in inborn errors of metabolism, biochemical and/or molecular genetics, population genetics, epidemiology and related statistical training, and clinical molecular genetics testing (e.g...

  17. Cross-Proportions: A Conceptual Method for Developing Quantitative Problem-Solving Skills

    ERIC Educational Resources Information Center

    Cook, Elzbieta; Cook, Stephen L.

    2005-01-01

    The cross-proportion method allows both the instructor and the student to easily determine where an error is made during problem solving. The C-P method supports a strong cognitive foundation upon which students can develop other diagnostic methods as they advance in chemistry and scientific careers.

  18. Decision-Making Accuracy of CBM Progress-Monitoring Data

    ERIC Educational Resources Information Center

    Hintze, John M.; Wells, Craig S.; Marcotte, Amanda M.; Solomon, Benjamin G.

    2018-01-01

    This study examined the diagnostic accuracy associated with decision making as is typically conducted with curriculum-based measurement (CBM) approaches to progress monitoring. Using previously published estimates of the standard errors of estimate associated with CBM, 20,000 progress-monitoring data sets were simulated to model student reading…

  19. A Report on IRI Scoring and Interpretation.

    ERIC Educational Resources Information Center

    Anderson, Betty

    Noting that most classroom teachers use informal reading inventories (IRI) as diagnostic instruments, a study examined what oral reading accuracy level is most appropriate for the instructional level and whether repetitions should be counted as oral reading errors. Randomly selected students from the second through fifth grades at two elementary…

  20. Sleep Patterns and Its Relationship to Schooling and Family.

    ERIC Educational Resources Information Center

    Jones, Franklin Ross

    Diagnostic classifications of sleep and arousal disorders have been categorized in four major areas: disorders of initiating and maintaining sleep, disorders of excessive sleepiness, disorders of the sleep/wake pattern, and the parasomnias such as sleep walking, talking, and night terrors. Another nomenclature classifies them into DIMS (disorders…

  1. Asymmetries in Predictive and Diagnostic Reasoning

    ERIC Educational Resources Information Center

    Fernbach, Philip M.; Darlow, Adam; Sloman, Steven A.

    2011-01-01

    In this article, we address the apparent discrepancy between causal Bayes net theories of cognition, which posit that judgments of uncertainty are generated from causal beliefs in a way that respects the norms of probability, and evidence that probability judgments based on causal beliefs are systematically in error. One purported source of bias…

  2. Reduction in chemotherapy order errors with computerized physician order entry.

    PubMed

    Meisenberg, Barry R; Wright, Robert R; Brady-Copertino, Catherine J

    2014-01-01

    To measure the number and type of errors in chemotherapy order composition associated with three sequential methods of ordering: handwritten orders, preprinted orders, and computerized physician order entry (CPOE) embedded in the electronic health record. From 2008 to 2012, a sample of completed chemotherapy orders was reviewed by a pharmacist for the number and type of errors as part of routine performance improvement monitoring. Error frequencies for each of the three distinct methods of composing chemotherapy orders were compared using statistical methods. The rate of problematic order sets (those requiring significant rework for clarification) was reduced from 30.6% with handwritten orders to 12.6% with preprinted orders (preprinted v handwritten, P < .001) to 2.2% with CPOE (preprinted v CPOE, P < .001). The incidence of errors capable of causing harm was reduced from 4.2% with handwritten orders to 1.5% with preprinted orders (preprinted v handwritten, P < .001) to 0.1% with CPOE (CPOE v preprinted, P < .001). The number of problem- and error-containing chemotherapy orders was reduced sequentially by preprinted order sets and then by CPOE. CPOE is associated with low error rates, but it did not eliminate all errors, and the technology can introduce novel types of errors not seen with traditional handwritten or preprinted orders. Vigilance even with CPOE is still required to avoid patient harm.

  3. A case of poor substructure diagnostics

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.

    1992-01-01

    The NASTRAN Manuals in the substructuring area are all geared toward instant success, but the solution paths are fraught with many traps for human error. Thus, the probability of suffering a fatal abort is high. In such circumstances, the necessity for diagnostics that are user friendly is paramount. This paper is written in the spirit of improving the diagnostics as well as the documentation in one area where the author felt he was backed into a blind corner as a result of his having committed a data oversight. This topic is aired by referring to an analysis of a particular structure. The structure under discussion used a number of local coordinate systems that simplified the preparation of input data. The principal features of this problem are introduced by reference to a series of figures.

  4. Comment on "Hydrogen Balmer beta: The separation between line peaks for plasma electron density diagnostics and self-absorption test"

    NASA Astrophysics Data System (ADS)

    Gautam, Ghaneshwar; Surmick, David M.; Parigger, Christian G.

    2015-07-01

    In this letter, we present a brief comment regarding the recently published paper by Ivković et al., J Quant Spectrosc Radiat Transf 2015;154:1-8. Reference is made to previous experimental results to indicate that self-absorption must have occurred; however, when carefully considering error propagation, both widths and peak separation predict electron densities within the error margins. Yet the diagnosis method and the presented details on the use of the hydrogen beta peak separation are viewed as a welcome contribution in studies of laser-induced plasma.

  5. 2D electron density profile measurement in tokamak by laser-accelerated ion-beam probe.

    PubMed

    Chen, Y H; Yang, X Y; Lin, C; Wang, L; Xu, M; Wang, X G; Xiao, C J

    2014-11-01

    A new concept of Heavy Ion Beam Probe (HIBP) diagnostic has been proposed, of which the key is to replace the electrostatic accelerator of traditional HIBP by a laser-driven ion accelerator. Due to the large energy spread of ions, the laser-accelerated HIBP can measure the two-dimensional (2D) electron density profile of tokamak plasma. In a preliminary simulation, a 2D density profile was reconstructed with a spatial resolution of about 2 cm, and with the error below 15% in the core region. Diagnostics of 2D density fluctuation is also discussed.

  6. Speech abilities in preschool children with speech sound disorder with and without co-occurring language impairment.

    PubMed

    Macrae, Toby; Tyler, Ann A

    2014-10-01

    The authors compared preschool children with co-occurring speech sound disorder (SSD) and language impairment (LI) to children with SSD only in their numbers and types of speech sound errors. In this post hoc quasi-experimental study, independent samples t tests were used to compare the groups in the standard score from different tests of articulation/phonology, percent consonants correct, and the number of omission, substitution, distortion, typical, and atypical error patterns used in the production of different wordlists that had similar levels of phonetic and structural complexity. In comparison with children with SSD only, children with SSD and LI used similar numbers but different types of errors, including more omission patterns ( p < .001, d = 1.55) and fewer distortion patterns ( p = .022, d = 1.03). There were no significant differences in substitution, typical, and atypical error pattern use. Frequent omission error pattern use may reflect a more compromised linguistic system characterized by absent phonological representations for target sounds (see Shriberg et al., 2005). Research is required to examine the diagnostic potential of early frequent omission error pattern use in predicting later diagnoses of co-occurring SSD and LI and/or reading problems.

  7. Spatial calibration of a tokamak neutral beam diagnostic using in situ neutral beam emission

    DOE PAGES

    Chrystal, Colin; Burrell, Keith H.; Grierson, Brian A.; ...

    2015-10-20

    Neutral beam injection is used in tokamaks to heat, apply torque, drive non-inductive current, and diagnose plasmas. Neutral beam diagnostics need accurate spatial calibrations to benefit from the measurement localization provided by the neutral beam. A new technique has been developed that uses in-situ measurements of neutral beam emission to determine the spatial location of the beam and the associated diagnostic views. This technique was developed to improve the charge exchange recombination diagnostic (CER) at the DIII-D tokamak and uses measurements of the Doppler shift and Stark splitting of neutral beam emission made by that diagnostic. These measurements contain information about the geometric relation between the diagnostic views and the neutral beams when they are injecting power. This information is combined with standard spatial calibration measurements to create an integrated spatial calibration that provides a more complete description of the neutral beam-CER system. The integrated spatial calibration results are very similar to the standard calibration results and derived quantities from CER measurements are unchanged within their measurement errors. Lastly, the methods developed to perform the integrated spatial calibration could be useful for tokamaks with limited physical access.

  8. Spatial calibration of a tokamak neutral beam diagnostic using in situ neutral beam emission

    NASA Astrophysics Data System (ADS)

    Chrystal, C.; Burrell, K. H.; Grierson, B. A.; Pace, D. C.

    2015-10-01

    Neutral beam injection is used in tokamaks to heat, apply torque, drive non-inductive current, and diagnose plasmas. Neutral beam diagnostics need accurate spatial calibrations to benefit from the measurement localization provided by the neutral beam. A new technique has been developed that uses in situ measurements of neutral beam emission to determine the spatial location of the beam and the associated diagnostic views. This technique was developed to improve the charge exchange recombination (CER) diagnostic at the DIII-D tokamak and uses measurements of the Doppler shift and Stark splitting of neutral beam emission made by that diagnostic. These measurements contain information about the geometric relation between the diagnostic views and the neutral beams when they are injecting power. This information is combined with standard spatial calibration measurements to create an integrated spatial calibration that provides a more complete description of the neutral beam-CER system. The integrated spatial calibration results are very similar to the standard calibration results and derived quantities from CER measurements are unchanged within their measurement errors. The methods developed to perform the integrated spatial calibration could be useful for tokamaks with limited physical access.

  9. A novel diagnosis method for a Hall plates-based rotary encoder with a magnetic concentrator.

    PubMed

    Meng, Bumin; Wang, Yaonan; Sun, Wei; Yuan, Xiaofang

    2014-07-31

    In the last few years, rotary encoders based on two-dimensional complementary metal-oxide-semiconductor (CMOS) Hall plates with a magnetic concentrator have been developed for contactless absolute angle measurement. Various error factors influence the measuring accuracy, and they are difficult to locate after the encoder has been assembled. In this paper, a model-based rapid diagnosis method is presented. Based on an analysis of the error mechanism, an error model is built to compare the minimum residual angle error and to quantify the error factors. Additionally, a modified particle swarm optimization (PSO) algorithm is used to reduce the computational load. Simulation and experimental results show that this diagnosis method is feasible for quantifying the causes of the error and significantly reduces the number of iterations.
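
    The diagnosis idea, fitting the parameters of an error model so that the modelled angle error best matches the measured residual, can be sketched with a generic particle swarm optimiser. The toy first-harmonic error model and the plain (unmodified) PSO below are simplifying assumptions, not the paper's model or its modified algorithm:

    ```python
    import numpy as np

    def model_error(theta, params):
        """Toy angle-error model: offset plus a first-harmonic term (radians)."""
        offset, amp, phase = params
        return offset + amp * np.sin(theta + phase)

    def pso_fit(theta, measured_err, n_particles=30, n_iter=200):
        """Plain PSO minimising the RMS mismatch between model and measurement."""
        rng = np.random.default_rng(0)
        dim = 3
        cost = lambda p: np.sqrt(np.mean((model_error(theta, p) - measured_err) ** 2))
        pos = rng.uniform(-0.1, 0.1, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_cost = np.array([cost(p) for p in pos])
        gbest = pbest[pbest_cost.argmin()].copy()
        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, dim))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = pos + vel
            c = np.array([cost(p) for p in pos])
            improved = c < pbest_cost
            pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
            gbest = pbest[pbest_cost.argmin()].copy()
        return gbest, pbest_cost.min()   # best-fit error-model parameters and residual
    ```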

  10. ReQON: a Bioconductor package for recalibrating quality scores from next-generation sequencing data

    PubMed Central

    2012-01-01

    Background Next-generation sequencing technologies have become important tools for genome-wide studies. However, the quality scores that are assigned to each base have been shown to be inaccurate. If the quality scores are used in downstream analyses, these inaccuracies can have a significant impact on the results. Results Here we present ReQON, a tool that recalibrates the base quality scores from an input BAM file of aligned sequencing data using logistic regression. ReQON also generates diagnostic plots showing the effectiveness of the recalibration. We show that ReQON produces quality scores that are both more accurate, in the sense that they more closely correspond to the probability of a sequencing error, and do a better job of discriminating between sequencing errors and non-errors than the original quality scores. We also compare ReQON to other available recalibration tools and show that ReQON is less biased and performs favorably in terms of quality score accuracy. Conclusion ReQON is an open source software package, written in R and available through Bioconductor, for recalibrating base quality scores for next-generation sequencing data. ReQON produces a new BAM file with more accurate quality scores, which can improve the results of downstream analysis, and produces several diagnostic plots showing the effectiveness of the recalibration. PMID:22946927
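
    ReQON itself is an R/Bioconductor package; the underlying idea, regressing observed error status on the reported quality score and converting the fitted error probability back to a Phred scale, can be sketched in a few lines. The single-covariate logistic model and scikit-learn usage below are illustrative assumptions, not ReQON's actual feature set or code:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def recalibrate_quality(reported_q, is_error):
        """Fit P(error | reported quality) and convert back to Phred-scaled scores."""
        model = LogisticRegression()
        model.fit(reported_q.reshape(-1, 1), is_error)
        p_err = model.predict_proba(reported_q.reshape(-1, 1))[:, 1]
        return -10 * np.log10(np.clip(p_err, 1e-6, 1.0))   # recalibrated Phred scores

    # reported_q: per-base quality scores; is_error: 1 where the aligned base mismatches the reference.
    # Synthetic example in which the reported scores understate the true error rate by a factor of two.
    reported_q = np.random.randint(10, 40, size=10_000).astype(float)
    is_error = (np.random.rand(10_000) < 2 * 10 ** (-reported_q / 10)).astype(int)
    new_q = recalibrate_quality(reported_q, is_error)
    ```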

  11. TPS, CA 19-9, VEGF-A, and CEA as diagnostic and prognostic factors in patients with mass lesions in the pancreatic head.

    PubMed

    Sandblom, Gabriel; Granroth, Sofie; Rasmussen, Ib Christian

    2008-01-01

    Although numerous tumour markers are available for periampullary tumours, including pancreatic cancer, their specificity and sensitivity have been questioned. To assess the diagnostic and prognostic values of tissue polypeptide specific antigen (TPS), carbohydrate antigen 19-9 (CA 19-9), vascular endothelial growth factor (VEGF-A), and carcinoembryonic antigen (CEA), we took serum samples in 56 patients with mass lesions in the pancreatic head. Among these patients, further investigations revealed pancreatic cancer in 20 patients, other malignant diseases in 12 and benign conditions in 24. Median CEA in all patients was 3.4 microg/L (range 0.5-585.0), median CA 19-9 was 105 kU/L (range 0.6-1 300 00), median TPS 123.5 U/L (range 15.0-3350) and median VEGF-A 132.5 ng/L (range 60.0-4317). Area under the curve was 0.747 (standard error [SE] = 0.075) for CEA, 0.716 (SE = 0.078) for CA 19-9 and 0.822 (SE = 0.086) for TPS in ROC plots based on the ability of the markers to distinguish between benign and malignant conditions. None of the markers significantly predicted survival in the subgroup of patients with pancreatic cancer. Our study shows that the markers may be used as fairly reliable diagnostic tools, but cannot be used to predict survival.

  12. Demonstration of spectral calibration for stellar interferometry

    NASA Technical Reports Server (NTRS)

    Demers, Richard T.; An, Xin; Tang, Hong; Rud, Mayer; Wayne, Leonard; Kissil, Andrew; Kwack, Eug-Yun

    2006-01-01

    A breadboard is under development to demonstrate the calibration of spectral errors in microarcsecond stellar interferometers. Analysis shows that thermally and mechanically stable hardware in addition to careful optical design can reduce the wavelength dependent error to tens of nanometers. Calibration of the hardware can further reduce the error to the level of picometers. The results of thermal, mechanical and optical analysis supporting the breadboard design will be shown.

  13. Uncertainty analysis technique for OMEGA Dante measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M. J.; Widmann, K.; Sorce, C.

    2010-10-15

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  14. Uncertainty Analysis Technique for OMEGA Dante Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M J; Widmann, K; Sorce, C

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
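
    The Monte Carlo parameter-variation technique described in these two records can be sketched generically: perturb each channel's signal with its one-sigma Gaussian error, push every perturbed set through the unfold, and take statistics over the resulting fluxes. The placeholder unfold function, channel count, and error magnitudes below are illustrative assumptions, not the Dante unfold algorithm:

    ```python
    import numpy as np

    def unfold(voltages):
        """Placeholder for the spectral unfold; a simple sum stands in for the total flux."""
        return voltages.sum()

    def monte_carlo_flux_error(voltages, sigma, n_trials=1000, seed=0):
        """Propagate per-channel Gaussian uncertainties through the unfold."""
        rng = np.random.default_rng(seed)
        fluxes = np.array([
            unfold(voltages * (1 + sigma * rng.standard_normal(voltages.size)))
            for _ in range(n_trials)
        ])
        return fluxes.mean(), fluxes.std()   # central value and one-sigma error bar

    volts = np.abs(np.random.default_rng(1).normal(1.0, 0.3, size=18))  # 18 synthetic channel signals
    mean_flux, flux_err = monte_carlo_flux_error(volts, sigma=0.07)
    ```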

  15. The pursuit of better diagnostic performance: a human factors perspective.

    PubMed

    Henriksen, Kerm; Brady, Jeff

    2013-10-01

    Despite the relatively slow start in treating diagnostic error as an amenable research topic at the beginning of the patient safety movement, interest has steadily increased over the past few years in the form of solicitations for research, regularly scheduled conferences, an expanding literature and even a new professional society. Yet improving diagnostic performance increasingly is recognised as a multifaceted challenge. With the aid of a human factors perspective, this paper addresses a few of these challenges, including questions that focus on who owns the problem, treating cognitive and system shortcomings as separate issues, why knowledge in the head is not enough, and what we are learning from health information technology (IT) and the use of checklists. To encourage empirical testing of interventions that aim to improve diagnostic performance, a systems engineering approach making use of rapid-cycle prototyping and simulation is proposed. To gain a fuller understanding of the complexity of the sociotechnical space where diagnostic work is performed, a final note calls for the formation of substantive partnerships with those in disciplines beyond the clinical domain.

  16. An ROC-type measure of diagnostic accuracy when the gold standard is continuous-scale.

    PubMed

    Obuchowski, Nancy A

    2006-02-15

    ROC curves and summary measures of accuracy derived from them, such as the area under the ROC curve, have become the standard for describing and comparing the accuracy of diagnostic tests. Methods for estimating ROC curves rely on the existence of a gold standard which dichotomizes patients into disease present or absent. There are, however, many examples of diagnostic tests whose gold standards are not binary-scale, but rather continuous-scale. Unnatural dichotomization of these gold standards leads to bias and inconsistency in estimates of diagnostic accuracy. In this paper, we propose a non-parametric estimator of diagnostic test accuracy which does not require dichotomization of the gold standard. This estimator has an interpretation analogous to the area under the ROC curve. We propose a confidence interval for test accuracy and a statistical test for comparing accuracies of tests from paired designs. We compare the performance (i.e. CI coverage, type I error rate, power) of the proposed methods with several alternatives. An example is presented where the accuracies of two quick blood tests for measuring serum iron concentrations are estimated and compared.
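
    The estimator proposed here avoids dichotomising the gold standard; in the same spirit, a simple nonparametric sketch is the probability that the test orders a random patient pair the same way as the continuous reference, with ties on the test counted as one half. This concordance-style illustration is an assumption about the flavour of the measure, not necessarily the paper's exact estimator:

    ```python
    import numpy as np
    from itertools import combinations

    def concordance_accuracy(test, gold):
        """Probability that the test ranks a random patient pair the same way as a
        continuous gold standard (ties on the test count as half)."""
        n_pairs, score = 0, 0.0
        for i, j in combinations(range(len(test)), 2):
            if gold[i] == gold[j]:
                continue                     # pair is uninformative about ordering
            n_pairs += 1
            same = (test[i] - test[j]) * (gold[i] - gold[j])
            score += 1.0 if same > 0 else (0.5 if same == 0 else 0.0)
        return score / n_pairs

    # Two tests measured against a continuous reference (synthetic data)
    rng = np.random.default_rng(0)
    gold = rng.normal(size=100)
    test_a = gold + rng.normal(scale=0.5, size=100)   # more accurate test
    test_b = gold + rng.normal(scale=2.0, size=100)   # noisier test
    print(concordance_accuracy(test_a, gold), concordance_accuracy(test_b, gold))
    ```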

  17. Localization of Diagnostically Relevant Regions of Interest in Whole Slide Images: a Comparative Study.

    PubMed

    Mercan, Ezgi; Aksoy, Selim; Shapiro, Linda G; Weaver, Donald L; Brunyé, Tad T; Elmore, Joann G

    2016-08-01

    Whole slide digital imaging technology enables researchers to study pathologists' interpretive behavior as they view digital slides and gain new understanding of the diagnostic medical decision-making process. In this study, we propose a simple yet important analysis to extract diagnostically relevant regions of interest (ROIs) from tracking records using only pathologists' actions as they viewed biopsy specimens in the whole slide digital imaging format (zooming, panning, and fixating). We use these extracted regions in a visual bag-of-words model based on color and texture features to predict diagnostically relevant ROIs on whole slide images. Using a logistic regression classifier in a cross-validation setting on 240 digital breast biopsy slides and viewport tracking logs of three expert pathologists, we produce probability maps that show 74 % overlap with the actual regions at which pathologists looked. We compare different bag-of-words models by changing dictionary size, visual word definition (patches vs. superpixels), and training data (automatically extracted ROIs vs. manually marked ROIs). This study is a first step in understanding the scanning behaviors of pathologists and the underlying reasons for diagnostic errors.
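
    A minimal sketch of the pipeline this record describes: patch features quantised against a learned visual dictionary, a bag-of-words histogram per region, and a logistic regression classifier that scores regions as diagnostically relevant. The synthetic features, dictionary size, and scikit-learn usage below are illustrative assumptions, not the study's colour and texture features:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression

    def bow_histogram(patch_features, kmeans):
        """Quantise a region's patches against the visual dictionary; return a normalised histogram."""
        words = kmeans.predict(patch_features)
        hist = np.bincount(words, minlength=kmeans.n_clusters).astype(float)
        return hist / hist.sum()

    # Synthetic stand-ins: each region is a set of patch feature vectors (e.g. colour/texture descriptors)
    rng = np.random.default_rng(0)
    regions = [rng.normal(size=(200, 16)) + (i % 2) for i in range(60)]   # 60 regions, 16-D patch features
    labels = np.array([i % 2 for i in range(60)])                         # 1 = diagnostically relevant

    kmeans = KMeans(n_clusters=32, n_init=10, random_state=0).fit(np.vstack(regions))
    X = np.array([bow_histogram(r, kmeans) for r in regions])
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    relevance_prob = clf.predict_proba(X)[:, 1]   # per-region probability of being a relevant ROI
    ```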

  18. Blood venous sample collection: Recommendations overview and a checklist to improve quality.

    PubMed

    Giavarina, Davide; Lippi, Giuseppe

    2017-07-01

    The extra-analytical phases of the total testing process have substantial impact on managed care, as well as an inherent high risk of vulnerability to errors which is often greater than that of the analytical phase. The collection of biological samples is a crucial preanalytical activity. Problems or errors occurring shortly before, or soon after, this preanalytical step may impair sample quality and characteristics, or else modify the final results of testing. The standardization of fasting requirements, rest, patient position and psychological state of the patient are therefore crucial for mitigating the impact of preanalytical variability. Moreover, the quality of materials used for collecting specimens, along with their compatibility, can guarantee sample quality and persistence of chemical and physical characteristics of the analytes over time, so safeguarding the reliability of testing. Appropriate techniques and sampling procedures are effective to prevent problems such as hemolysis, undue clotting in the blood tube, draw of insufficient sample volume and modification of analyte concentration. An accurate identification of both patient and blood samples is a key priority as for other healthcare activities. Good laboratory practice and appropriate training of operators, by specifically targeting collection of biological samples, blood in particular, may greatly improve this issue, thus lowering the risk of errors and their adverse clinical consequences. The implementation of a simple and rapid check-list, including verification of blood collection devices, patient preparation and sampling techniques, was found to be effective for enhancing sample quality and reducing some preanalytical errors associated with these procedures. The use of this tool, along with implementation of objective and standardized systems for detecting non-conformities related to unsuitable samples, can be helpful for standardizing preanalytical activities and improving the quality of laboratory diagnostics, ultimately helping to reaffirm a "preanalytical" culture founded on knowledge and real risk perception. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  19. Reducing medication errors in critical care: a multimodal approach

    PubMed Central

    Kruer, Rachel M; Jarrell, Andrew S; Latif, Asad

    2014-01-01

    The Institute of Medicine has reported that medication errors are the single most common type of error in health care, representing 19% of all adverse events, while accounting for over 7,000 deaths annually. The frequency of medication errors in adult intensive care units can be as high as 947 per 1,000 patient-days, with a median of 105.9 per 1,000 patient-days. The formulation of drugs is a potential contributor to medication errors. Challenges related to drug formulation are specific to the various routes of medication administration, though errors associated with medication appearance and labeling occur among all drug formulations and routes of administration. Addressing these multifaceted challenges requires a multimodal approach. Changes in technology, training, systems, and safety culture are all strategies to potentially reduce medication errors related to drug formulation in the intensive care unit. PMID:25210478

  20. The effect of covariate mean differences on the standard error and confidence interval for the comparison of treatment means.

    PubMed

    Liu, Xiaofeng Steven

    2011-05-01

    The use of covariates is commonly believed to reduce the unexplained error variance and the standard error for the comparison of treatment means, but the reduction in the standard error is neither guaranteed nor uniform over different sample sizes. The covariate mean differences between the treatment conditions can inflate the standard error of the covariate-adjusted mean difference and can actually produce a larger standard error for the adjusted mean difference than that for the unadjusted mean difference. When the covariate observations are conceived of as randomly varying from one study to another, the covariate mean differences can be related to a Hotelling's T². Using this Hotelling's T² statistic, one can always find a minimum sample size to achieve a high probability of reducing the standard error and confidence interval width for the adjusted mean difference. ©2010 The British Psychological Society.
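
    The phenomenon described, that covariate adjustment can inflate rather than shrink the standard error when the treatment groups differ on the covariate, is easy to reproduce with the textbook ANCOVA formula in a small simulation. The simulated effect sizes below are arbitrary; this is a sketch of the general point, not the paper's derivation:

    ```python
    import numpy as np

    def se_unadjusted(y1, y2):
        """Standard error of the raw mean difference (pooled variance)."""
        n1, n2 = len(y1), len(y2)
        s2 = ((n1 - 1) * np.var(y1, ddof=1) + (n2 - 1) * np.var(y2, ddof=1)) / (n1 + n2 - 2)
        return np.sqrt(s2 * (1 / n1 + 1 / n2))

    def se_adjusted(y1, x1, y2, x2):
        """ANCOVA standard error of the covariate-adjusted mean difference."""
        n1, n2 = len(y1), len(y2)
        xc = np.concatenate([x1 - x1.mean(), x2 - x2.mean()])   # within-group centred covariate
        yc = np.concatenate([y1 - y1.mean(), y2 - y2.mean()])
        b = (xc @ yc) / (xc @ xc)                               # pooled within-group slope
        mse = ((yc - b * xc) ** 2).sum() / (n1 + n2 - 3)
        return np.sqrt(mse * (1 / n1 + 1 / n2 + (x1.mean() - x2.mean()) ** 2 / (xc @ xc)))

    rng = np.random.default_rng(0)
    n = 20
    x1, x2 = rng.normal(0.0, 1.0, n), rng.normal(1.2, 1.0, n)  # groups differ markedly on the covariate
    y1 = 0.2 * x1 + rng.normal(0.0, 1.0, n)                    # covariate only weakly predicts the outcome
    y2 = 0.5 + 0.2 * x2 + rng.normal(0.0, 1.0, n)
    # With a weak covariate effect and a large covariate imbalance, the adjusted SE can exceed the unadjusted one.
    print(se_unadjusted(y1, y2), se_adjusted(y1, x1, y2, x2))
    ```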

  1. Evidence in clinical reasoning: a computational linguistics analysis of 789,712 medical case summaries 1983-2012.

    PubMed

    Seidel, Bastian M; Campbell, Steven; Bell, Erica

    2015-03-21

    Better understanding of clinical reasoning could reduce diagnostic error linked to 8% of adverse medical events and 30% of malpractice cases. To a greater extent than the evidence-based movement, the clinical reasoning literature asserts the importance of practitioner intuition—unconscious elements of diagnostic reasoning. The study aimed to analyse the content of case report summaries in ways that explored the importance of an evidence concept, not only in relation to research literature but also intuition. The study sample comprised all 789,712 abstracts in English for case reports contained in the database PUBMED for the period 1 January 1983 to 31 December 2012. It was hypothesised that, if evidence and intuition concepts were viewed by these clinical authors as essential to understanding their case reports, they would be more likely to be found in the abstracts. Computational linguistics software was used in 1) concept mapping of 21,631,481 instances of 201 concepts, and 2) specific concept analyses examining 200 paired co-occurrences for 'evidence' and research 'literature' concepts. 'Evidence' is a fundamentally patient-centred, intuitive concept linked to less common concepts about underlying processes, suspected disease mechanisms and diagnostic hunches. In contrast, the use of research literature in clinical reasoning is linked to more common reasoning concepts about specific knowledge and descriptions or presenting features of cases. 'Literature' is by far the most dominant concept, increasing in relevance since 2003, with an overall relevance of 13% versus 5% for 'evidence' which has remained static. The fact that the least present types of reasoning concepts relate to diagnostic hunches to do with underlying processes, such as what is suspected, raises questions about whether intuitive practitioner evidence-making, found in a constellation of dynamic, process concepts, has become less important. The study adds support to the existing corpus of research on clinical reasoning, by suggesting that intuition involves a complex constellation of concepts important to how the construct of evidence is understood. The list of concepts the study generated offers a basis for reflection on the nature of evidence in diagnostic reasoning and the importance of intuition to that reasoning.

  2. A String of Mistakes: The Importance of Cascade Analysis in Describing, Counting, and Preventing Medical Errors

    PubMed Central

    Woolf, Steven H.; Kuzel, Anton J.; Dovey, Susan M.; Phillips, Robert L.

    2004-01-01

    BACKGROUND Notions about the most common errors in medicine currently rest on conjecture and weak epidemiologic evidence. We sought to determine whether cascade analysis is of value in clarifying the epidemiology and causes of errors and whether physician reports are sensitive to the impact of errors on patients. METHODS Eighteen US family physicians participating in a 6-country international study filed 75 anonymous error reports. The narratives were examined to identify the chain of events and the predominant proximal errors. We tabulated the consequences to patients, both reported by physicians and inferred by investigators. RESULTS A chain of errors was documented in 77% of incidents. Although 83% of the errors that ultimately occurred were mistakes in treatment or diagnosis, 2 of 3 were set in motion by errors in communication. Fully 80% of the errors that initiated cascades involved informational or personal miscommunication. Examples of informational miscommunication included communication breakdowns among colleagues and with patients (44%), misinformation in the medical record (21%), mishandling of patients’ requests and messages (18%), inaccessible medical records (12%), and inadequate reminder systems (5%). When asked whether the patient was harmed, physicians answered affirmatively in 43% of cases in which their narratives described harms. Psychological and emotional effects accounted for 17% of physician-reported consequences but 69% of investigator-inferred consequences. CONCLUSIONS Cascade analysis of physicians’ error reports is helpful in understanding the precipitant chain of events, but physicians provide incomplete information about how patients are affected. Miscommunication appears to play an important role in propagating diagnostic and treatment mistakes. PMID:15335130

  3. TU-AB-202-03: Prediction of PET Transfer Uncertainty by DIR Error Estimating Software, AUTODIRECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Phillips, J

    2016-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool, but DIR errors can adversely affect its clinical applications. To estimate voxel-specific DIR uncertainty, a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), has been developed and validated. This work tests the ability of this software to predict uncertainty for the transfer of standard uptake values (SUV) from positron-emission tomography (PET) with DIR. Methods: Virtual phantoms are used for this study. Each phantom has a planning computed tomography (CT) image and a diagnostic PET-CT image set. A deformation was digitally applied to the diagnostic CT to create the planning CT image and establish a known deformation between the images. One lung and three rectum patient datasets were employed to create the virtual phantoms. Both of these sites have difficult deformation scenarios associated with them, which can affect DIR accuracy (lung tissue sliding and changes in rectal filling). The virtual phantoms were created to simulate these scenarios by introducing discontinuities in the deformation field at the lung and rectum borders. The DIR algorithm from Plastimatch software was applied to these phantoms. The SUV mapping errors from the DIR were then compared to those predicted by AUTODIRECT. Results: The SUV error distributions closely followed the AUTODIRECT-predicted error distribution for the 4 test cases. The minimum and maximum PET SUVs were produced from AUTODIRECT at the 95% confidence interval before applying gradient-based SUV segmentation for each of these volumes. Notably, 93.5% of the target volume warped by the true deformation was included within the AUTODIRECT-predicted maximum SUV volume after the segmentation, while 78.9% of the target volume was within the target volume warped by Plastimatch. Conclusion: The AUTODIRECT framework is able to predict PET transfer uncertainty caused by DIR, which enables an understanding of the associated target volume uncertainty.
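
    The reported containment figures (93.5% and 78.9%) reduce to a simple voxel-overlap metric. The sketch below shows one way to compute it, assuming binary target masks on a common voxel grid; the array names and toy volumes are illustrative and do not reproduce AUTODIRECT or Plastimatch behaviour.

```python
# Minimal sketch of the containment metric reported above: the fraction of the
# ground-truth (true-deformation) target volume that falls inside another
# segmented volume (e.g., a predicted maximum-SUV volume or a warped volume).
# Inputs are assumed to be 3-D boolean masks on the same voxel grid.
import numpy as np

def containment_fraction(truth_mask: np.ndarray, test_mask: np.ndarray) -> float:
    """Fraction of truth voxels also present in the test volume."""
    truth = truth_mask.astype(bool)
    test = test_mask.astype(bool)
    n_truth = truth.sum()
    if n_truth == 0:
        return float("nan")
    return (truth & test).sum() / n_truth

if __name__ == "__main__":
    truth = np.zeros((32, 32, 32), dtype=bool)
    truth[10:20, 10:20, 10:20] = True            # toy "target volume"
    shifted = np.roll(truth, shift=2, axis=0)    # toy "warped" volume with error
    print(f"fraction of target inside warped volume: "
          f"{containment_fraction(truth, shifted):.3f}")
```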

  4. Missed diagnostic opportunities within South Africa's early infant diagnosis program, 2010-2015.

    PubMed

    Haeri Mazanderani, Ahmad; Moyo, Faith; Sherman, Gayle G

    2017-01-01

    Samples submitted for HIV PCR testing that fail to yield a positive or negative result represent missed diagnostic opportunities. We describe HIV PCR test rejections and indeterminate results, and the associated delay in diagnosis, within South Africa's early infant diagnosis (EID) program from 2010 to 2015. HIV PCR test data from January 2010 to December 2015 were extracted from the National Health Laboratory Service Corporate Data Warehouse, a central data repository of all registered test-sets within the public health sector in South Africa, by laboratory number, result, date, facility, and testing laboratory. Samples that failed to yield either a positive or negative result were categorized according to the rejection code on the laboratory information system, and descriptive analysis performed using Microsoft Excel. Delay in diagnosis was calculated for patients who had a missed diagnostic opportunity registered between January 2013 and December 2015 by means of a patient linking-algorithm employing demographic details. Between 2010 and 2015, 2 178 582 samples were registered for HIV PCR testing of which 6.2% (n = 134 339) failed to yield either a positive or negative result, decreasing proportionally from 7.0% (n = 20 556) in 2010 to 4.4% (n = 21 388) in 2015 (p<0.001). Amongst 76 972 coded missed diagnostic opportunities, 49 585 (64.4%) were a result of pre-analytical error and 27 387 (35.6%) analytical error. Amongst 49 694 patients searched for follow-up results, 16 895 (34.0%) had at least one subsequent HIV PCR test registered after a median of 29 days (IQR: 13-57), of which 8.4% tested positive compared with 3.6% of all samples submitted for the same period. Routine laboratory data provides the opportunity for near real-time surveillance and quality improvement within the EID program. Delay in diagnosis and wastage of resources associated with missed diagnostic opportunities must be addressed and infants actively followed-up as South Africa works towards elimination of mother-to-child transmission.
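
    The core of this surveillance analysis is straightforward to express in code. The sketch below, with assumed column names and invented records, flags submissions that yielded neither a positive nor a negative result and estimates the delay to the patient's next conclusive test; the actual study linked records by demographic details rather than a unique patient identifier.

```python
# Minimal sketch, under assumed column names, of the routine-data analysis
# described above: flag HIV PCR submissions with no positive/negative result,
# then link each to the same patient's next conclusive test to estimate delay.
import pandas as pd

records = pd.DataFrame(
    {
        "patient_id": ["A", "A", "B", "C", "C"],
        "collection_date": pd.to_datetime(
            ["2014-01-05", "2014-02-03", "2014-01-10", "2014-03-01", "2014-03-30"]
        ),
        "result": ["rejected", "negative", "negative", "indeterminate", "positive"],
    }
)

missed = records[~records["result"].isin(["positive", "negative"])]

delays = []
for _, row in missed.iterrows():
    later = records[
        (records["patient_id"] == row["patient_id"])
        & (records["collection_date"] > row["collection_date"])
        & (records["result"].isin(["positive", "negative"]))
    ]
    if not later.empty:
        delays.append((later["collection_date"].min() - row["collection_date"]).days)

print(f"missed diagnostic opportunities: {len(missed)} of {len(records)}")
if delays:
    print(f"median delay to next conclusive result: {pd.Series(delays).median():.0f} days")
```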

  5. Parents' versus physicians' values for clinical outcomes in young febrile children.

    PubMed

    Kramer, M S; Etezadi-Amoli, J; Ciampi, A; Tange, S M; Drummond, K N; Mills, E L; Bernstein, M L; Leduc, D G

    1994-05-01

    To compare how parents and physicians value potential clinical outcomes in young children who have a fever but no focus of bacterial infection. Cross-sectional study of 100 parents of well children aged 3 to 24 months, 61 parents of febrile children aged 3 to 24 months, and 56 attending staff physicians working in a children's hospital emergency department. A pretested visual analog scale was used to assess values on a 0-to-1 scale (where 0 is the value of the worst possible outcome, and 1 is the value for the best) for 22 scenarios, grouped in three categories according to severity. From the three or four common attributes comprising the scenarios in a given group, each respondent's value function was estimated statistically using multiattribute utility theory. For outcomes in group 1 (rapidly resolving viral infection with one or more diagnostic tests), no significant group differences were observed. For outcomes in groups 2 (acute infections without long-term sequelae) and 3 (long-term sequelae of urinary tract infection or bacterial meningitis), parents of well children and parents of febrile children had values that were similar to each other but significantly lower than physicians' values for pneumonia with delayed diagnosis, false-positive diagnosis of urinary tract infection, viral meningitis, and unilateral hearing loss. For bacterial meningitis with or without delay, however, the reverse pattern was observed; physicians' values were lower than parents'. In arriving at their judgment for group 2 and 3 scenarios, parents gave significantly greater weight to attributes involving the pain and discomfort of diagnostic tests and to diagnostic error, whereas physicians gave significantly greater weight to attributes involving both short- and long-term morbidity and long-term worry and inconvenience. Parents were significantly more likely than physicians to be risk-seeking in the way they weighted the attributes comprising group 2 and 3 scenarios, i.e., they were more willing to risk rare but severe morbidity to avoid the short-term adverse effects of testing. Parents and physicians show fundamental value differences concerning diagnostic testing, diagnostic error, and short- and long-term morbidity; these differences have important implications for diagnostic decision making in the young febrile child.
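
    A simplified way to estimate such a value function is an additive model fitted by least squares, sketched below with invented attribute codings and ratings; the study's multiattribute utility approach is richer and fitted per respondent, so this is only an illustration of the general idea.

```python
# Simplified sketch of estimating an additive multiattribute value function
# from visual-analog-scale ratings. Attribute codings and ratings are invented;
# multiattribute utility theory also allows non-additive forms.
import numpy as np

# Each scenario is described by attribute levels (1 = present, 0 = absent):
# [diagnostic tests, short-term morbidity, long-term sequelae]
scenarios = np.array(
    [
        [0, 0, 0],
        [1, 0, 0],
        [1, 1, 0],
        [0, 1, 0],
        [1, 1, 1],
        [0, 1, 1],
    ],
    dtype=float,
)
# Respondent's 0-to-1 ratings of the same scenarios (1 = best outcome).
ratings = np.array([1.00, 0.90, 0.55, 0.60, 0.10, 0.15])

# Fit value = intercept + sum(weight_i * attribute_i) by least squares.
design = np.column_stack([np.ones(len(scenarios)), scenarios])
coef, *_ = np.linalg.lstsq(design, ratings, rcond=None)
intercept, weights = coef[0], coef[1:]

labels = ["diagnostic tests", "short-term morbidity", "long-term sequelae"]
for label, w in zip(labels, weights):
    print(f"weight on {label}: {w:+.2f}")
```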

  6. Cervicocephalic kinesthetic sensibility and postural balance in patients with nontraumatic chronic neck pain--a pilot study.

    PubMed

    Palmgren, Per J; Andreasson, Daniel; Eriksson, Magnus; Hägglund, Andreas

    2009-06-30

    Although cervical pain is widespread, most victims are only mildly and occasionally affected. A minority, however, suffer chronic pain and/or functional impairments. Although there is abundant literature regarding nontraumatic neck pain, little focuses on diagnostic criteria. During the last decade, research on neck pain has been designed to evaluate underlying pathophysiological mechanisms, without noteworthy success. Independent researchers have investigated postural balance and cervicocephalic kinesthetic sensibility among patients with chronic neck pain, and have (in most cases) concluded the source of the problem is a reduced ability in the neck's proprioceptive system. Here, we investigated cervicocephalic kinesthetic sensibility and postural balance among patients with nontraumatic chronic neck pain. Ours was a two-group, observational pilot study of patients with complaints of continuous neck pain during the 3 months prior to recruitment. Thirteen patients with chronic neck pain of nontraumatic origin were recruited from an institutional outpatient clinic. Sixteen healthy persons were recruited as a control group. Cervicocephalic kinesthetic sensibility was assessed by exploring head repositioning accuracy and postural balance was measured with computerized static posturography. Parameters of cervicocephalic kinesthetic sensibility were not reduced. However, in one of six test movements (flexion), global repositioning errors were significantly larger in the experimental group than in the control group (p < .05). Measurements did not demonstrate any general impaired postural balance, and varied substantially among participants in both groups. In patients with nontraumatic chronic neck pain, we found statistically significant global repositioning errors in only one of six test movements. In this cohort, we found no evidence of impaired postural balance. Head repositioning accuracy and computerized static posturography are imperfect measures of functional proprioceptive impairments. Validity of (and procedures for using) these instruments demand further investigation. Current Controlled Trials ISRCTN96873990.

  7. [Paradigmatic shifts in clinical practice in the last generation].

    PubMed

    Benbassat, J

    1996-05-01

    Physicians have always used theoretical models (paradigms) to interpret clinical reality, and have changed the prevailing model only when it could no longer satisfy clinical needs. The purpose of this essay is to review some of the paradigmatic changes in clinical reasoning that have occurred since my undergraduate medical education. My training in the 50's was along the bio-medical model that reduced all diseases to structural or biochemical dysfunctions. Within this framework, causes were perceived as leading inevitably rather than probabilistically to their consequences, and chance and ambiguity had a very small role in the explication of pathophysiologic mechanisms and in diagnostic reasoning. The doctor-patient relationship was paternalistic, and the orientation to extending survival rejected notions of quality of life and involved parsimonious utilization of health care resources. Today, however, clinical reasoning has shifted from deductive and deterministic to inductive (evidence-based) and probabilistic. Disease is believed to result from multiple factors rather than from single causes, and there is increasing acceptance of psycho-social factors of disease. Awareness of the confounding effects of false-positive and false-negative tests has changed the attitude to diagnostic evaluation. Terms such as risk indicators of disease, predictive value of tests, and risk-benefit ratio are increasingly used in discussing clinical decisions. We respect the patient's autonomy more than we did in the past, and consider his/her preferences and quality of life in clinical decision-making. Fair distribution of medical resources is considered as an ethical principle. Finally, clinical guidelines are no longer viewed as counter-intuitive, but rather as effective means to reduce the disturbingly high rates of medical error.

  8. [Ambient air interference in oxygen intake measurements in liquid incubating media with the use of open polarographic cells].

    PubMed

    Miniaev, M V; Voronchikhina, L I

    2007-01-01

    A model of oxygen intake by aerobic bio-objects in liquid incubating media was applied to investigate the influence of the air-media interface area on the accuracy of measuring the oxygen intake and on the error value. It was shown that intrusion of air oxygen increases the relative error to 24% in open polarographic cells and to 13% in cells with a reduced interface area. Results of modeling passive media oxygenation laid the basis for proposing a method to reduce the relative error by 66% for open cells and by 15% for cells with a reduced interface area.

  9. Attentional effects on orientation judgements are dependent on memory consolidation processes.

    PubMed

    Haskell, Christie; Anderson, Britt

    2016-11-01

    Are the effects of memory and attention on perception synergistic, antagonistic, or independent? Tested separately, memory and attention have been shown to affect the accuracy of orientation judgements. When multiple stimuli are presented sequentially versus simultaneously, error variance is reduced. When a target is validly cued, precision is increased. What if they are manipulated together? We combined memory and attention manipulations in an orientation judgement task to answer this question. Two circular gratings were presented sequentially or simultaneously. On some trials a brief luminance cue preceded the stimuli. Participants were cued to report the orientation of one of the two gratings by rotating a response grating. We replicated the finding that error variance is reduced on sequential trials. Critically, we found interacting effects of memory and attention. Valid cueing reduced the median, absolute error only when two stimuli appeared together and improved it to the level of performance on uncued sequential trials, whereas invalid cueing always increased error. This effect was not mediated by cue predictiveness; however, predictive cues reduced the standard deviation of the error distribution, whereas nonpredictive cues reduced "guessing". Our results suggest that, when the demand on memory is greater than a single stimulus, attention is a bottom-up process that prioritizes stimuli for consolidation. Thus attention and memory are synergistic.

  10. Improving patient safety using the sterile cockpit principle during medication administration: a collaborative, unit-based project.

    PubMed

    Fore, Amanda M; Sculli, Gary L; Albee, Doreen; Neily, Julia

    2013-01-01

    To implement the sterile cockpit principle to decrease interruptions and distractions during high-volume medication administration and reduce the number of medication errors. While some studies have described the importance of reducing interruptions as a tactic to reduce medication errors, work is needed to assess the impact on patient outcomes. Data regarding the type and frequency of distractions were collected during the first 11 weeks of implementation. Medication error rates were tracked for 1 year before and 1 year after implementation. Simple regression analysis showed a decrease in the mean number of distractions over time (β = -0.193, P = 0.02). The medication error rate decreased by 42.78% (P = 0.04) after implementation of the sterile cockpit principle. The use of crew resource management techniques, including the sterile cockpit principle, applied to medication administration has a significant impact on patient safety. Applying the sterile cockpit principle to inpatient medical units is a feasible approach to reduce the number of distractions during the administration of medication, thus reducing the likelihood of medication error. 'Do Not Disturb' signs and vests are inexpensive, simple interventions that can be used as reminders to decrease distractions. © 2012 Blackwell Publishing Ltd.
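
    The trend analysis reported above amounts to a simple linear regression of weekly distraction counts on time, plus a before/after comparison of error rates. The sketch below illustrates both with invented numbers; it does not reproduce the study's data or its reported coefficients.

```python
# Minimal sketch of the trend analysis described above: a linear regression of
# the weekly distraction count on week number, plus a before/after rate change.
# All values are invented for illustration.
import numpy as np

weeks = np.arange(1, 12)                         # 11 weeks of observation
distractions = np.array([14, 12, 13, 10, 9, 9, 8, 7, 7, 6, 5], dtype=float)

slope, intercept = np.polyfit(weeks, distractions, deg=1)
print(f"estimated change per week: {slope:+.2f} distractions")

# Before/after medication error rates can then be compared directly, e.g.:
errors_per_1000_doses_before, errors_per_1000_doses_after = 2.45, 1.40
pct_change = 100 * (errors_per_1000_doses_after - errors_per_1000_doses_before) \
             / errors_per_1000_doses_before
print(f"relative change in error rate: {pct_change:+.1f}%")
```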

  11. Current pulse: can a production system reduce medical errors in health care?

    PubMed

    Printezis, Antonios; Gopalakrishnan, Mohan

    2007-01-01

    One of the reasons for rising health care costs is medical errors, a majority of which result from faulty systems and processes. Health care in the past has used process-based initiatives such as Total Quality Management, Continuous Quality Improvement, and Six Sigma to reduce errors. These initiatives to redesign health care, reduce errors, and improve overall efficiency and customer satisfaction have had moderate success. The current trend is to apply the successful Toyota Production System (TPS) to health care, since its organizing principles have led to tremendous improvement in productivity and quality for Toyota and other businesses that have adapted them. This article presents insights on the effectiveness of TPS principles in health care and the challenges that lie ahead in successfully integrating this approach with other quality initiatives.

  12. Graphical user interface simplifies infusion pump programming and enhances the ability to detect pump-related faults.

    PubMed

    Syroid, Noah; Liu, David; Albert, Robert; Agutter, James; Egan, Talmage D; Pace, Nathan L; Johnson, Ken B; Dowdle, Michael R; Pulsipher, Daniel; Westenskow, Dwayne R

    2012-11-01

    Drug administration errors are frequent and are often associated with the misuse of IV infusion pumps. One source of these errors may be the infusion pump's user interface. We used failure modes-and-effects analyses to identify programming errors and to guide the design of a new syringe pump user interface. We designed the new user interface to clearly show the pump's operating state simultaneously in more than 1 monitoring location. We evaluated anesthesia residents in laboratory and simulated environments, comparing programming accuracy and error detection with the new user interface versus the user interface of a commercially available infusion pump. With the new user interface, the number of programming errors was reduced by 81%, the number of keystrokes per task was reduced from 9.2 ± 5.0 to 7.5 ± 5.5 (mean ± SD), the time required per task was reduced from 18.1 ± 14.1 seconds to 10.9 ± 9.5 seconds, and perceived workload was significantly lower. Residents detected 38 of 70 (54%) of the events with the new user interface and 37 of 70 (53%) with the existing user interface, despite no experience with the new user interface and extensive experience with the existing interface. The number of programming errors and workload were reduced partly because it took less time and fewer keystrokes to program the pump when using the new user interface. Despite minimal training, residents quickly identified preexisting infusion pump problems with the new user interface. Intuitive and easy-to-program infusion pump interfaces may reduce drug administration errors and infusion pump-related adverse events.

  13. Safety coaches in radiology: decreasing human error and minimizing patient harm.

    PubMed

    Dickerson, Julie M; Koch, Bernadette L; Adams, Janet M; Goodfriend, Martha A; Donnelly, Lane F

    2010-09-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program.

  14. GOCI Yonsei aerosol retrieval version 2 aerosol products: improved algorithm description and error analysis with uncertainty estimation from 5-year validation over East Asia

    NASA Astrophysics Data System (ADS)

    Choi, M.; Kim, J.; Lee, J.; KIM, M.; Park, Y. J.; Holben, B. N.; Eck, T. F.; Li, Z.; Song, C. H.

    2017-12-01

    The Geostationary Ocean Color Imager (GOCI) Yonsei aerosol retrieval (YAER) version 1 algorithm was developed for retrieving hourly aerosol optical depth at 550 nm (AOD) and other subsidiary aerosol optical properties over East Asia. The GOCI YAER AOD showed accuracy comparable to ground-based and other satellite-based observations, but still had errors due to uncertainties in surface reflectance and simple cloud masking. Also, it was not capable of near-real-time (NRT) processing because it required a monthly database of each year encompassing the day of retrieval for the determination of surface reflectance. This study describes the improvement of the GOCI YAER algorithm to version 2 (V2) for NRT processing with improved accuracy, achieved by modifying the cloud masking, determining surface reflectance from a multi-year Rayleigh-corrected reflectance and wind speed database, and selecting inversion channels according to surface conditions. The improved GOCI V2 AOD is therefore more consistent with Moderate Resolution Imaging Spectroradiometer (MODIS) and Visible Infrared Imaging Radiometer Suite (VIIRS) AOD than V1 of the YAER algorithm. In validation against Aerosol Robotic Network (AERONET) AOD from 2011 to 2016, V2 shows reduced median bias and an increased fraction of retrievals within the expected error range (i.e. the absolute expected error range of MODIS AOD) compared to V1. The validation using the Sun-Sky Radiometer Observation Network (SONET) over China also shows similar results. The error bias remains within the -0.1 to 0.1 range as a function of AERONET AOD and AE, scattering angle, NDVI, cloud fraction and homogeneity of retrieved AOD, observation time, month, and year. The diagnostic and prognostic expected errors (DEE and PEE, respectively) of the V2 AOD are also estimated. The estimated multiple PEE of the GOCI V2 AOD matches the actual error well over East Asia, and the GOCI V2 AOD over Korea shows a higher fraction of retrievals within the PEE than over China and Japan. Hourly AOD products based on the improved GOCI YAER algorithm could contribute to a better understanding of aerosols over East Asia, for both long-term climate studies and short-term air quality monitoring and forecasting, especially of rapid diurnal variation and transboundary transport.
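
    The "fraction within expected error" used in such validations is easy to compute once satellite and AERONET AODs are collocated. The sketch below uses the commonly cited MODIS over-land envelope of ±(0.05 + 0.15 × AOD) and invented paired values; the GOCI V2 work defines its own DEE/PEE envelopes, so this is only a generic illustration of the metric.

```python
# Minimal sketch of the validation metric referred to above: the fraction of
# satellite AOD retrievals falling within an expected-error envelope around
# collocated AERONET AOD. The envelope form and the paired values below are
# illustrative, not taken from the GOCI V2 study.
import numpy as np

aeronet_aod = np.array([0.10, 0.25, 0.40, 0.60, 0.85, 1.20])
satellite_aod = np.array([0.12, 0.20, 0.47, 0.55, 1.05, 1.10])

envelope = 0.05 + 0.15 * aeronet_aod
within = np.abs(satellite_aod - aeronet_aod) <= envelope

bias = np.median(satellite_aod - aeronet_aod)
print(f"fraction within expected error: {within.mean():.2f}")
print(f"median bias: {bias:+.3f}")
```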

  15. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  16. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  17. Urine cytology of nonurothelial malignancies-a 10-year experience in a large multihospital healthcare system.

    PubMed

    Savant, Deepika; Bajaj, Jaya; Gimenez, Cecilia; Rafael, Oana C; Mirzamani, Neda; Chau, Karen; Klein, Melissa; Das, Kasturi

    2017-01-01

    Urine cytology is the most frequently utilized test to detect urothelial cancer. Secondary bladder neoplasms need to be recognized as this impacts patient management. We report our experience on nonurothelial malignancies (NUM) detected in urine cytology over a 10-year period. A 10-year retrospective search for patients with biopsy-proven NUM to the urothelial tract yielded 25 urine samples from 14 patients. Two cytopathologists blinded to the original cytology diagnosis reviewed the cytology and histology slides. The incidence, cytomorphologic features, diagnostic accuracy, factors influencing the diagnostic accuracy, and clinical impact of the cytology result were studied. The incidence of NUM was <1%. The male:female ratio was 1.3. An abnormality was detected in 60% of the cases; however, in only 4% of the cases was a primary site identified accurately. Of the false negatives, 96% were deemed sampling errors and 4% interpretational. Patient management was not impacted in any of the false-negative cases due to concurrent or past tissue diagnosis. Colon cancer was the most frequent secondary tumor. Sampling error accounted for most of the false-negative results. Necrosis and a dirty background were often associated with metastatic lesions from the colon. Obtaining a history of a primary tumor elsewhere was a key factor in the diagnosis of a metastatic lesion. Hematopoietic malignancies remain a diagnostic challenge. Cytospin preparations were superior for evaluating nuclear detail and background material as opposed to monolayer (ThinPrep) technology. Diagnostic accuracy was improved by obtaining immunohistochemistry. Diagn. Cytopathol. 2017;45:22-28. © 2016 Wiley Periodicals, Inc.

  18. Quantitative analysis of in situ optical diagnostics for inferring particle/aggregate parameters in flames: Implications for soot surface growth and total emissivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koeylue, U.O.

    1997-05-01

    An in situ particulate diagnostic/analysis technique is outlined based on the Rayleigh-Debye-Gans polydisperse fractal aggregate (RDG/PFA) scattering interpretation of absolute angular light scattering and extinction measurements. Using proper particle refractive index, the proposed data analysis method can quantitatively yield all aggregate parameters (particle volume fraction, f_v, fractal dimension, D_f, primary particle diameter, d_p, particle number density, n_p, and aggregate size distribution, pdf(N)) without any prior knowledge about the particle-laden environment. The present optical diagnostic/interpretation technique was applied to two different soot-containing laminar and turbulent ethylene/air nonpremixed flames in order to assess its reliability. The aggregate interpretation of optical measurements yielded D_f, d_p, and pdf(N) that are in excellent agreement with ex situ thermophoretic sampling/transmission electron microscope (TS/TEM) observations within experimental uncertainties. However, volume-equivalent single particle models (Rayleigh/Mie) overestimated d_p by about a factor of 3, causing an order of magnitude underestimation in n_p. Consequently, soot surface areas and growth rates were in error by a factor of 3, emphasizing that aggregation effects need to be taken into account when using optical diagnostics for a reliable understanding of soot formation/evolution mechanism in flames. The results also indicated that total soot emissivities were generally underestimated using Rayleigh analysis (up to 50%), mainly due to the uncertainties in soot refractive indices at infrared wavelengths. This suggests that aggregate considerations may not be essential for reasonable radiation heat transfer predictions from luminous flames because of fortuitous error cancellation, resulting in typically a 10 to 30% net effect.
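
    The link between the factor-of-3 error in primary particle diameter and the order-of-magnitude error in number density follows directly from n_p = 6 f_v / (π d_p³), since for a fixed volume fraction the count of primary particles scales with the inverse cube of their diameter. The short numerical illustration below uses invented values for f_v and d_p.

```python
# Numerical illustration of the sensitivity noted above: for a fixed soot
# volume fraction f_v, the primary-particle number density is
# n_p = 6 * f_v / (pi * d_p**3), so a ~3x overestimate of d_p drives a
# ~27x (order-of-magnitude) underestimate of n_p. Values are illustrative.
import math

f_v = 1.0e-6                      # soot volume fraction (dimensionless)
d_p_true = 30e-9                  # primary particle diameter from RDG/PFA, m
d_p_rayleigh = 3 * d_p_true       # volume-equivalent single-sphere estimate

def number_density(f_v, d_p):
    return 6.0 * f_v / (math.pi * d_p**3)

n_true = number_density(f_v, d_p_true)
n_rayleigh = number_density(f_v, d_p_rayleigh)
print(f"n_p (aggregate interpretation):     {n_true:.2e} particles/m^3")
print(f"n_p (single-sphere interpretation): {n_rayleigh:.2e} particles/m^3")
print(f"underestimation factor: {n_true / n_rayleigh:.0f}x")
```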

  19. Quality of data regarding diagnoses of spinal disorders in administrative databases. A multicenter study.

    PubMed

    Faciszewski, T; Broste, S K; Fardon, D

    1997-10-01

    The purpose of the present study was to evaluate the accuracy of data regarding diagnoses of spinal disorders in administrative databases at eight different institutions. The records of 189 patients who had been managed for a disorder of the lumbar spine were independently reviewed by a physician who assigned the appropriate diagnostic codes according to the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM). The age range of the 189 patients was seventeen to eighty-four years. The six major diagnostic categories studied were herniation of a lumbar disc, a previous operation on the lumbar spine, spinal stenosis, cauda equina syndrome, acquired spondylolisthesis, and congenital spondylolisthesis. The diagnostic codes assigned by the physician were compared with the codes that had been assigned during the ordinary course of events by personnel in the medical records department of each of the eight hospitals. The accuracy of coding was also compared among the eight hospitals, and it was found to vary depending on the diagnosis. Although there were both false-negative and false-positive codes at each institution, most errors were related to the low sensitivity of coding for previous spinal operations: only seventeen (28 per cent) of sixty-one such diagnoses were coded correctly. Other errors in coding were less frequent, but their implications for conclusions drawn from the information in administrative databases depend on the frequency of a diagnosis and its importance in an analysis. This study demonstrated that the accuracy of a diagnosis of a spinal disorder recorded in an administrative database varies according to the specific condition being evaluated. It is necessary to document the relative accuracy of specific ICD-9-CM diagnostic codes in order to improve the ability to validate the conclusions derived from investigations based on administrative databases.

  20. Failure mode analysis in adrenal vein sampling: a single-center experience.

    PubMed

    Trerotola, Scott O; Asmar, Melissa; Yan, Yan; Fraker, Douglas L; Cohen, Debbie L

    2014-10-01

    To analyze failure modes in a high-volume adrenal vein sampling (AVS) practice in an effort to identify preventable causes of nondiagnostic sampling. A retrospective database was constructed containing 343 AVS procedures performed over a 10-year period. Each nondiagnostic AVS procedure was reviewed for failure mode and correlated with results of any repeat AVS. Data collected included selectivity index, lateralization index, adrenalectomy outcomes if performed, and details of AVS procedure. All AVS procedures were performed after cosyntropin stimulation, using sequential technique. AVS was nondiagnostic in 12 of 343 (3.5%) primary procedures and 2 secondary procedures. Failure was right-sided in 8 (57%) procedures, left-sided in 4 (29%) procedures, bilateral in 1 procedure, and neither in 1 procedure (laboratory error). Failure modes included diluted sample from correctly identified vein (n = 7 [50%]; 3 right and 4 left), vessel misidentified as adrenal vein (n = 3 [21%]; all right), failure to locate an adrenal vein (n = 2 [14%]; both right), cosyntropin stimulation failure (n = 1 [7%]; diagnostic by nonstimulated criteria), and laboratory error (n = 1 [7%]; specimen loss). A second AVS procedure was diagnostic in three of five cases (60%), and a third AVS procedure was diagnostic in one of one case (100%). Among the eight patients in whom AVS ultimately was not diagnostic, four underwent adrenalectomy based on diluted AVS samples, and one underwent adrenalectomy based on imaging; all five experienced improvement in aldosteronism. A substantial percentage of AVS failures occur on the left, all related to dilution. Even when technically nondiagnostic per strict criteria, some "failed" AVS procedures may be sufficient to guide therapy. Repeat AVS has a good yield. Copyright © 2014 SIR. Published by Elsevier Inc. All rights reserved.
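
    The selectivity and lateralization indices mentioned above are simple ratios. The sketch below uses commonly cited post-cosyntropin definitions and cutoffs (selectivity ≥5, lateralization ≥4) with invented hormone values; the definitions and thresholds vary between centers and are not taken from this study.

```python
# Minimal sketch of the two indices referred to above, under commonly used
# definitions: the selectivity index (adrenal-vein cortisol / peripheral
# cortisol) establishes that an adrenal vein was truly sampled, and the
# lateralization index compares cortisol-corrected aldosterone between sides.
# Cutoffs and hormone values below are illustrative only.

def selectivity_index(adrenal_cortisol, peripheral_cortisol):
    return adrenal_cortisol / peripheral_cortisol

def lateralization_index(aldo_left, cort_left, aldo_right, cort_right):
    left_ratio = aldo_left / cort_left
    right_ratio = aldo_right / cort_right
    high, low = max(left_ratio, right_ratio), min(left_ratio, right_ratio)
    return high / low

if __name__ == "__main__":
    # Example stimulated values (aldosterone ng/dL, cortisol ug/dL), invented.
    si_right = selectivity_index(adrenal_cortisol=620, peripheral_cortisol=28)
    si_left = selectivity_index(adrenal_cortisol=540, peripheral_cortisol=28)
    li = lateralization_index(aldo_left=2100, cort_left=540,
                              aldo_right=95, cort_right=620)
    print(f"selectivity index R/L: {si_right:.1f} / {si_left:.1f} (>=5 = adequate)")
    print(f"lateralization index: {li:.1f} (>=4 suggests unilateral disease)")
```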
