Krauss, J K; Halve, B
2004-04-01
There is no agreement on the best diagnostic criteria for selecting patients with normal pressure hydrocephalus (NPH) for CSF shunting. The primary objective of the present study was to provide a contemporary survey of diagnostic algorithms and therapeutic decision-making in clinical practice. The secondary objective was to estimate the incidence of NPH. Standardized questionnaires with sections on the incidence of NPH and the frequency of shunting, evaluation of clinical symptoms and signs, diagnostic studies, therapeutic decision-making and operative techniques, postoperative outcome and complications, and the profiles of different centers were sent to 82 neurosurgical centers in Germany known to participate in the care of patients with NPH. Data were analyzed from 49 of the 53 centers that responded to the survey (65%). The estimated annual incidence of NPH was 1.8 cases/100,000 inhabitants. Gait disturbance was rated as the most important sign of NPH (61%). There was wide variation in the choice of diagnostic tests. Cisternography was performed routinely at only a few centers. Diagnostic CSF removal was used with varying frequency by all centers except one, but the amount of CSF removed by lumbar puncture differed markedly between centers. There was poor agreement on criteria for evaluation of continuous intracranial pressure recordings regarding both the amplitude and the relative frequency of B-waves. Both periventricular and deep white matter lesions were present in about 50% of patients being shunted, indicating that vascular comorbidity in NPH patients has gained more acceptance. Programmable shunts were used by more than half of the centers, and newer valve types such as gravitational valves have become more popular. According to the present survey, new diagnostic and therapeutic concepts on NPH have penetrated daily routine to a certain extent. Wide variability, however, still exists among different neurosurgical centers.
[Contemporary molecular-genetic methods used for etiologic diagnostics of sepsis].
Gavrilov, S N; Skachkova, T S; Shipulina, O Yu; Savochkina, Yu A; Shipulin, G A; Maleev, V V
2016-01-01
Etiologic diagnostics of sepsis is one of the most difficult problems of contemporary medicine due to the wide variety of sepsis causative agents, many of which are components of the normal human microflora. The disadvantages of the contemporary "gold standard" of microbiologic diagnosis of sepsis etiology, blood culture for sterility, are the long duration of cultivation, limited detection of non-culturable forms of microorganisms, and the significant effect of prior empiric antibiotic therapy on the results of the analysis. Molecular diagnostic methods, which have been actively developed and integrated over the last decade, are free of these disadvantages. The main contemporary methods of molecular-biological diagnostics are examined in this review, and current data on their diagnostic characteristics are provided. Special attention is given to PCR-based diagnostics, including novel Russian developments. Methods of nucleic acid hybridization and proteomic analysis are examined comparatively. The application and development prospects of molecular diagnostic methods for sepsis are evaluated.
Toward detecting deception in intelligent systems
NASA Astrophysics Data System (ADS)
Santos, Eugene, Jr.; Johnson, Gregory, Jr.
2004-08-01
Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources including electronic sources such as knowledge-based diagnostic or decision support systems or through data mining techniques. As the decision maker becomes more dependent on these electronic information sources, detecting deceptive information from these sources becomes vital to making a correct, or at least more informed, decision. This applies to unintentional misinformation as well as intentional disinformation. Our ongoing research focuses on employing models of deception and deception detection from the fields of psychology and cognitive science to these systems as well as implementing deception detection algorithms for probabilistic intelligent systems. The deception detection algorithms are used to detect, classify and correct attempts at deception. Algorithms for detecting unexpected information rely upon a prediction algorithm from the collaborative filtering domain to predict agent responses in a multi-agent system.
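As an illustration of the prediction step described above, the following hedged sketch uses a simple user-based collaborative-filtering predictor to estimate an agent's expected response from the responses of similar agents and flags large deviations as potentially deceptive. The response scale, agent history, and deviation threshold are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Rows: agents, columns: queries; entries are past responses on a 0-1 scale.
# The data and the 0.3 deviation threshold are illustrative assumptions.
history = np.array([
    [0.9, 0.8, 0.1, 0.7],
    [0.8, 0.9, 0.2, 0.6],
    [0.2, 0.1, 0.9, 0.3],
])

def predict_response(history: np.ndarray, target: int, query: int) -> float:
    """Predict agent `target`'s response to `query` from similar agents' responses."""
    others = [a for a in range(history.shape[0]) if a != target]
    past = [q for q in range(history.shape[1]) if q != query]
    target_vec = history[target, past]
    weights, values = [], []
    for a in others:
        vec = history[a, past]
        # Cosine similarity over the shared response history.
        sim = np.dot(target_vec, vec) / (np.linalg.norm(target_vec) * np.linalg.norm(vec))
        weights.append(sim)
        values.append(history[a, query])
    weights = np.array(weights)
    return float(np.dot(weights, values) / np.sum(np.abs(weights)))

observed = 0.15                      # response actually reported by agent 0
expected = predict_response(history, target=0, query=3)
if abs(observed - expected) > 0.3:   # deviation threshold (assumed)
    print(f"Unexpected response: observed {observed:.2f}, expected {expected:.2f}")
```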
Fennema-Notestine, Christine; Ozyurt, I. Burak; Clark, Camellia P.; Morris, Shaunna; Bischoff-Grethe, Amanda; Bondi, Mark W.; Jernigan, Terry L.; Fischl, Bruce; Segonne, Florent; Shattuck, David W.; Leahy, Richard M.; Rex, David E.; Toga, Arthur W.; Zou, Kelly H.; BIRN, Morphometry; Brown, Gregory G.
2008-01-01
Performance of automated methods to isolate brain from nonbrain tissues in magnetic resonance (MR) structural images may be influenced by MR signal inhomogeneities, type of MR image set, regional anatomy, and age and diagnosis of subjects studied. The present study compared the performance of four methods: Brain Extraction Tool (BET; Smith [2002] Hum Brain Mapp 17:143–155); 3dIntracranial (Ward [1999] Milwaukee: Biophysics Research Institute, Medical College of Wisconsin; in AFNI); a Hybrid Watershed algorithm (HWA, Segonne et al. [2004] Neuroimage 22:1060–1075; in FreeSurfer); and Brain Surface Extractor (BSE, Sandor and Leahy [1997] IEEE Trans Med Imag 16:41–54; Shattuck et al. [2001] Neuroimage 13:856–876) to manually stripped images. The methods were applied to uncorrected and bias-corrected datasets; Legacy and Contemporary T1-weighted image sets; and four diagnostic groups (depressed, Alzheimer's, young control, and elderly control). To provide a criterion for outcome assessment, two experts manually stripped six sagittal sections for each dataset in locations where brain and nonbrain tissue are difficult to distinguish. Methods were compared on Jaccard similarity coefficients, Hausdorff distances, and an Expectation-Maximization algorithm. Methods tended to perform better on contemporary datasets; bias correction did not significantly improve method performance. Mesial sections were most difficult for all methods. Although AD image sets were most difficult to strip, HWA and BSE were more robust across diagnostic groups compared with 3dIntracranial and BET. With respect to specificity, BSE tended to perform best across all groups, whereas HWA was more sensitive than other methods. The results of this study may direct users towards a method appropriate to their T1-weighted datasets and improve the efficiency of processing for large, multisite neuroimaging studies. PMID:15986433
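Two of the outcome measures named above, the Jaccard similarity coefficient and the Hausdorff distance, can be computed for a pair of binary masks as in the following sketch; the toy masks stand in for an automated and a manual brain mask and are purely illustrative.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def jaccard(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Jaccard similarity coefficient (intersection over union) of two boolean masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

def hausdorff(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Symmetric Hausdorff distance between the foreground voxels of two masks."""
    pts_a = np.argwhere(mask_a).astype(float)
    pts_b = np.argwhere(mask_b).astype(float)
    return max(directed_hausdorff(pts_a, pts_b)[0],
               directed_hausdorff(pts_b, pts_a)[0])

# Toy 2D "slices": automated vs. manual brain masks (illustrative only).
auto = np.zeros((10, 10), dtype=bool); auto[2:8, 2:8] = True
manual = np.zeros((10, 10), dtype=bool); manual[3:8, 2:9] = True
print(f"Jaccard = {jaccard(auto, manual):.3f}, Hausdorff = {hausdorff(auto, manual):.1f}")
```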
[SCAN system--semi-structured interview based on diagnostic criteria].
Adamowski, Tomasz; Kiejna, Andrzej; Hadryś, Tomasz
2006-01-01
This paper presents the main features of contemporary diagnostic systems as implemented in the SCAN, a modern, semi-structured diagnostic interview. The concepts behind further development of the classifications, the rationale for operationalized diagnostic criteria, and the divisional approach to mental diagnoses are discussed. The structure and components of SCAN ver. 2.1 (WHO), i.e. Present State Examination--10th edition, Item Group Checklist, Clinical History Schedule, Glossary of Definitions and the computer software with the diagnostic algorithm, I-Shell, as well as rules for reliable use of the diagnostic rating scales, are described within the scope of this paper. The materials and training sets necessary for learning proper use of the SCAN, especially training sets for SCAN Training Centers and the Reference Manual, a guidebook for the SCAN, are also introduced. Finally, the paper presents evidence that the SCAN is an instrument feasible in different cultural settings. Reliability and validity data for the SCAN are also discussed, indicating that the SCAN could be widely used in research studies as well as in everyday clinical practice, facilitating a more detailed diagnostic approach to the patient.
[Criteria of the molecular pathology testing of lung cancer].
Tímár, József
2014-06-01
In contemporary pathologic diagnostics of lung cancer, the tissue obtained is a key issue, since small biopsies and cytology still play a major role. In the non-small cell lung cancer era, cytology is considered equal to biopsy; in recent years, however, it has proved unable to provide a quality diagnosis and must be replaced by biopsy. Different molecular techniques can handle different tissue samples, which must be considered during molecular pathology diagnosis. Moreover, the tumor cell to normal cell ratio in the obtained tissue, as well as the absolute tumor cell number, is of great significance, and this information must be provided with the primary lung cancer diagnosis. Last but not least, sustainable molecular diagnostics of lung cancer requires rational algorithms, affordable technology, and appropriate reimbursement.
Robotic-assisted sacrocolpopexy for pelvic organ prolapse.
White, Wesley M; Pickens, Ryan B; Elder, Robert F; Firoozi, Farzeen
2014-11-01
The demand for surgical correction of pelvic organ prolapse is expected to grow as the aging population remains active and focused on quality of life. Definitive correction of pelvic organ prolapse can be accomplished through both vaginal and abdominal approaches. This article provides a contemporary reference source that specifically addresses the historical framework, diagnostic algorithm, and therapeutic options for the treatment of female pelvic organ prolapse. Particular emphasis is placed on the role and technique of abdominal-based reconstruction using robotic technology and the evolving controversy regarding the use of synthetic vaginal mesh. Copyright © 2014 Elsevier Inc. All rights reserved.
Preserving the Person in Contemporary Psychiatry.
Gabbard, Glen O
2018-06-01
Psychodynamic psychiatry is a way of thinking that places the person at the heart of diagnostic understanding and treatment. This emphasis on unique characteristics of an individual is at odds with much of contemporary psychiatric thought, which is geared to identifying a set of criteria designed to identify discrete diagnostic categories with biological underpinnings. This article addresses component parts of the person that are linked to psychodynamic constructs and lie at the heart of diagnostic understanding and treatment in psychodynamic psychiatry. Copyright © 2018 Elsevier Inc. All rights reserved.
Dunbar, R; Naidoo, P; Beyers, N; Langley, I
2017-04-01
This study, set in Cape Town, South Africa, aimed to compare the diagnostic yield for smear/culture and Xpert® MTB/RIF algorithms and to investigate the mechanisms influencing tuberculosis (TB) yield. We developed and validated an operational model of the TB diagnostic process, first with the smear/culture algorithm and then with the Xpert algorithm. We modelled scenarios by varying TB prevalence, adherence to diagnostic algorithms and human immunodeficiency virus (HIV) status. This enabled direct comparisons of diagnostic yield in the two algorithms to be made. Routine data showed that diagnostic yield had decreased over the period of the Xpert algorithm roll-out compared to the yield when the smear/culture algorithm was in place. However, modelling yield under identical conditions indicated a 13.3% increase in diagnostic yield from the Xpert algorithm compared to smear/culture. The model demonstrated that the extensive use of culture in the smear/culture algorithm and the decline in TB prevalence are the main factors contributing to not finding an increase in diagnostic yield in the routine data. We demonstrate the benefits of an operational model to determine the effect of scale-up of a new diagnostic algorithm, and recommend that policy makers use operational modelling to make appropriate decisions before new diagnostic algorithms are scaled up.
Two Meanings of Algorithmic Mathematics.
ERIC Educational Resources Information Center
Maurer, Stephen B.
1984-01-01
Two mathematical topics are interpreted from the viewpoints of traditional (performing algorithms) and contemporary (creating algorithms and thinking in terms of them for solving problems and developing theory) algorithmic mathematics. The two topics are Horner's method for evaluating polynomials and Gauss's method for solving systems of linear…
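For readers unfamiliar with the first topic, a minimal sketch of Horner's method is given below; the example polynomial is arbitrary.

```python
def horner(coeffs, x):
    """Evaluate a polynomial with the given coefficients (highest degree first) at x.

    Horner's method rewrites a0*x^n + ... + an as (((a0*x + a1)*x + a2)*x + ...),
    using n multiplications and n additions instead of evaluating each power separately.
    """
    result = 0
    for c in coeffs:
        result = result * x + c
    return result

# 2x^3 - 6x^2 + 2x - 1 evaluated at x = 3  ->  2*27 - 6*9 + 6 - 1 = 5
print(horner([2, -6, 2, -1], 3))  # 5
```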
Jahnova, Helena; Dvorakova, Lenka; Vlaskova, Hana; Hulkova, Helena; Poupetova, Helena; Hrebicek, Martin; Jesina, Pavel
2014-09-19
Niemann-Pick disease type C (NPC) is a rare, fatal neurovisceral disorder with autosomal recessive inheritance, and featuring striking clinical variability dependent on the age at onset of neurological symptoms. We report data from a large cohort of 56 Czech patients with NPC diagnosed over a period of 37 years. An observational, retrospective analysis of historic and current clinical and laboratory information was performed among all NPC patients originating from the area of the contemporary Czech Republic and diagnosed between 1975 and 2012. All patients with ≥1 positive diagnostic test and relevant clinical information were included. Data on diagnostic methods (histopathological and/or ultrastructural; biochemical; genetic), clinical status and general information on treatment were collated. Data were examined in accordance with international guidelines for the management of NPC. Between 1975 and 1985 diagnoses were based exclusively on specific histopathological findings, often at autopsy. Bone marrow smear (BMS) analyses have proved to be a very specific indicator for NPC and have become an important part of our diagnostic algorithm. Filipin staining and cholesterol esterification assays became the definitive diagnostic tests after 1985 and were applied in 24 of our patients. Since 2005, more and more patients have been assessed using NPC1/NPC2 gene sequencing. Twelve patients were diagnosed with neonatal/early-infantile onset NPC, 13 with the late-infantile onset form, 20 with the juvenile onset form, and nine with the adolescent/adult onset form. Two diagnosed patients remained neurologically asymptomatic at study completion. Nineteen patients were siblings. Causal NPC1 mutations were determined in 38 patients; two identical NPC2 mutations were identified in one patient. In total, 30 different mutations were identified, 14 of which have been confirmed as novel. The frequency of individual mutated NPC1 alleles in our cohort differs compared with previous published data: the most frequent mutant NPC1 allele was p.R1186H (n = 13), followed by p.P1007A (n = 8), p.S954L (n = 8) and p.I1061T (n = 4). These data demonstrate the evolution of the diagnostic process in NPC over the last four decades. We estimate the contemporary birth prevalence of NPC in the Czech Republic at 0.93 per 100,000.
The oblique perspective: philosophical diagnostics of contemporary life sciences research.
Zwart, Hub
2017-12-01
This paper indicates how continental philosophy may contribute to a diagnostics of contemporary life sciences research, as part of a "diagnostics of the present" (envisioned by continental thinkers, from Hegel up to Foucault). First, I describe (as a "practicing" philosopher) various options for an oblique (or symptomatic) reading of emerging scientific discourse, bent on uncovering the basic "philosophemes" of science (i.e. the guiding ideas, the basic conceptions of nature, life and technology at work in contemporary life sciences research practices). Subsequently, I outline a number of radical transformations occurring both at the object-pole and at the subject-pole of the current knowledge relationship, namely the technification of the object and the anonymisation or collectivisation of the subject, under the sway of automation, ICT and big machines. Finally, I further elaborate the specificity of the oblique perspective with the help of Lacan's theorem of the four discourses. Philosophical reflections on contemporary life sciences concur neither with a Master's discourse (which aims to strengthen the legitimacy and credibility of canonical sources), nor with university discourse (which aims to establish professional expertise), nor with what Lacan refers to as hysterical discourse (which aims to challenge representatives of the power establishment), but rather with the discourse of the analyst, listening with evenly-poised attention to the scientific files in order to bring to the fore the cupido sciendi (i.e. the will to know, but also to optimise and to control) which both inspires and disrupts contemporary life sciences discourse.
Time-critical multirate scheduling using contemporary real-time operating system services
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.
1983-01-01
Although real-time operating systems provide many of the task control services necessary to process time-critical applications (i.e., applications with fixed, invariant deadlines), it may still be necessary to provide a scheduling algorithm at a level above the operating system in order to coordinate a set of synchronized, time-critical tasks executing at different cyclic rates. This paper examines the scheduling requirements for such applications and develops scheduling algorithms using services provided by contemporary real-time operating systems.
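The report's own algorithms are not reproduced here, but a standard feasibility check for fixed-priority scheduling of periodic tasks, the Liu and Layland utilization bound, illustrates the kind of multirate reasoning involved; the task set below is hypothetical.

```python
def rm_schedulable(tasks):
    """Sufficient (not necessary) schedulability test for rate-monotonic scheduling.

    `tasks` is a list of (worst_case_execution_time, period) pairs for periodic,
    independent tasks. The set is accepted if total utilization does not exceed
    the Liu & Layland bound n*(2^(1/n) - 1).
    """
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization <= bound, utilization, bound

# Three hypothetical cyclic tasks: (execution time, period) in milliseconds.
ok, u, bound = rm_schedulable([(10, 40), (15, 80), (20, 160)])
print(f"utilization={u:.3f}, bound={bound:.3f}, schedulable={ok}")
```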
Exploring Genome-Wide Expression Profiles Using Machine Learning Techniques.
Kebschull, Moritz; Papapanou, Panos N
2017-01-01
Although contemporary high-throughput -omics methods produce high-dimensional data, the resulting wealth of information is difficult to assess using traditional statistical procedures. Machine learning methods facilitate the detection of additional patterns, beyond the mere identification of lists of features that differ between groups.Here, we demonstrate the utility of (1) supervised classification algorithms in class validation, and (2) unsupervised clustering in class discovery. We use data from our previous work that described the transcriptional profiles of gingival tissue samples obtained from subjects suffering from chronic or aggressive periodontitis (1) to test whether the two diagnostic entities were also characterized by differences on the molecular level, and (2) to search for a novel, alternative classification of periodontitis based on the tissue transcriptomes.Using machine learning technology, we provide evidence for diagnostic imprecision in the currently accepted classification of periodontitis, and demonstrate that a novel, alternative classification based on differences in gingival tissue transcriptomes is feasible. The outlined procedures allow for the unbiased interrogation of high-dimensional datasets for characteristic underlying classes, and are applicable to a broad range of -omics data.
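A minimal sketch of the two analysis modes described above, supervised class validation and unsupervised class discovery, is shown below using scikit-learn on a synthetic expression matrix; the data and parameter choices are assumptions and do not reproduce the cited study.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Synthetic "expression matrix": 120 samples x 500 genes, two nominal diagnoses.
X, y = make_classification(n_samples=120, n_features=500, n_informative=20,
                           random_state=0)

# (1) Supervised class validation: can the labels be recovered from the profiles?
clf = SVC(kernel="linear")
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")

# (2) Unsupervised class discovery: do the samples form transcriptome-based clusters?
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: silhouette={silhouette_score(X, labels):.2f}")
```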
Echocardiography in Infective Endocarditis: State of the Art.
Afonso, Luis; Kottam, Anupama; Reddy, Vivek; Penumetcha, Anirudh
2017-10-25
In this review, we examine the central role of echocardiography in the diagnosis, prognosis, and management of infective endocarditis (IE). 2D transthoracic echocardiography (TTE) and transesophageal echocardiography (TEE) have complementary roles and are unequivocally the mainstay of diagnostic imaging in IE. The advent of 3D and multiplanar imaging has greatly enhanced the ability of the imager to evaluate cardiac structure and function. Technologic advances in 3D imaging allow for the reconstruction of realistic anatomic images that in turn have positively impacted IE-related surgical planning and intervention. CT and metabolic imaging appear to be emerging as promising ancillary diagnostic tools that could be deployed in select scenarios to circumvent some of the limitations of echocardiography. Our review summarizes the indispensable and central role of various echocardiographic modalities in the management of infective endocarditis. The complementary roles of 2D TTE and TEE are discussed and areas where 3D TEE offers incremental value highlighted. An algorithm summarizing a contemporary approach to the workup of endocarditis is provided and major societal guidelines for timing of surgery are reviewed.
Casella, Francesco; Rana, Bushra; Casazza, Giovanni; Bhan, Amit; Kapetanakis, Stam; Omigie, Joe; Reiken, Joseph; Monaghan, Mark J
2009-09-01
Between 1987 and 1994, several studies demonstrated transthoracic echocardiography (TTE) to be less sensitive than transesophageal echocardiography (TEE) in detecting native valve endocarditis. Recent technologic advances, especially the introduction of harmonic imaging and digital processing and storage, have improved TTE image quality. The aim of this study was to determine the diagnostic accuracy of contemporary TTE. Between 2003 and 2007, 75 patients underwent both TTE and TEE for clinically suspected infective endocarditis. The diagnostic accuracy of TTE was assessed using TEE as the gold standard for diagnosis of endocarditis. Of the 75 patients in this study, 33 were found to be positive by TEE. The sensitivity for detection of infective endocarditis by TTE was 81.8%. It provided good image quality in 81.5% of cases; in these patients sensitivity was even greater (89.3%). Contemporary TTE has improved the diagnostic accuracy of infective endocarditis by ameliorating image quality; it provides an accurate assessment of endocarditis and may reduce the need for TEE.
[Contemporary methods for preterm labor diagnostics].
Kolev, N; Kovachev, E; Ivanov, S; Kornovski, Y; Tsvetkov, K; Angelova, M; Tsonev, A; Ismail, E
2013-01-01
The authors track current trends in preterm labor diagnostics. Emphasis is placed on biochemical tests for fetal fibronectin (fFN) and insulin-like growth factor-binding protein-1 (IGFBP-1) in cervical and vaginal secretions, as well as on ultrasound assessment of cervical length.
Integration of On-Line and Off-Line Diagnostic Algorithms for Aircraft Engine Health Management
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2007-01-01
This paper investigates the integration of on-line and off-line diagnostic algorithms for aircraft gas turbine engines. The on-line diagnostic algorithm is designed for in-flight fault detection. It continuously monitors engine outputs for anomalous signatures induced by faults. The off-line diagnostic algorithm is designed to track engine health degradation over the lifetime of an engine. It estimates engine health degradation periodically over the course of the engine's life. The estimate generated by the off-line algorithm is used to update the on-line algorithm. Through this integration, the on-line algorithm becomes aware of engine health degradation, and its effectiveness to detect faults can be maintained while the engine continues to degrade. The benefit of this integration is investigated in a simulation environment using a nonlinear engine model.
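A hedged sketch of the integration idea follows: an off-line estimate of slow health degradation periodically re-baselines the residual used by an on-line fault detector. The signals, class names, and thresholds are illustrative and are not taken from the paper.

```python
import numpy as np

class OnlineDetector:
    """Flags a fault when the measured output deviates too far from the expected output."""
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.health_offset = 0.0          # updated by the off-line trend estimate

    def update_baseline(self, offset: float):
        self.health_offset = offset       # make the detector aware of degradation

    def check(self, measured: float, nominal: float) -> bool:
        residual = measured - (nominal + self.health_offset)
        return abs(residual) > self.threshold

def offline_degradation_estimate(history: np.ndarray) -> float:
    """Estimate slow drift (e.g., efficiency loss) as the mean residual over a long window."""
    return float(np.mean(history))

detector = OnlineDetector(threshold=2.0)
past_residuals = np.array([0.1, 0.4, 0.7, 0.9, 1.2])   # slow drift over many flights
detector.update_baseline(offline_degradation_estimate(past_residuals))
print(detector.check(measured=101.5, nominal=100.0))    # drift alone -> False
print(detector.check(measured=104.0, nominal=100.0))    # abrupt fault -> True
```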
Diagnostic Algorithm Benchmarking
NASA Technical Reports Server (NTRS)
Poll, Scott
2011-01-01
A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.
Diagnostic work-up and loss of tuberculosis suspects in Jogjakarta, Indonesia.
Ahmad, Riris Andono; Matthys, Francine; Dwihardiani, Bintari; Rintiswati, Ning; de Vlas, Sake J; Mahendradhata, Yodi; van der Stuyft, Patrick
2012-02-15
Early and accurate diagnosis of pulmonary tuberculosis (TB) is critical for successful TB control. To assist in the diagnosis of smear-negative pulmonary TB, the World Health Organisation (WHO) recommends the use of a diagnostic algorithm. Our study evaluated the implementation of the national tuberculosis programme's diagnostic algorithm in routine health care settings in Jogjakarta, Indonesia. The diagnostic algorithm is based on the WHO TB diagnostic algorithm, which had already been implemented in the health facilities. We prospectively documented the diagnostic work-up of all new tuberculosis suspects until a diagnosis was reached. We used clinical audit forms to record each step chronologically. Data on the patient's gender, age, symptoms, examinations (types, dates, and results), and final diagnosis were collected. Information was recorded for 754 TB suspects, 43.5% of whom were lost during the diagnostic work-up in health centres, compared with 0% in lung clinics. Among the TB suspects who completed diagnostic work-ups, 51.1% and 100.0% were diagnosed without following the national TB diagnostic algorithm in health centres and lung clinics, respectively. However, the work-up in the health centres and lung clinics generally conformed to international standards for tuberculosis care (ISTC). Diagnostic delays were significantly longer in health centres compared to lung clinics. The high rate of patients lost in health centres needs to be addressed through the implementation of TB suspect tracing and better programme supervision. The national TB algorithm needs to be revised and differentiated according to the level of care.
Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie
2009-01-01
Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
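The following sketch illustrates, under assumed scenario and metric definitions, how benchmark metrics of this kind (detection latency, false alarms, isolation accuracy) might be computed from recorded fault-injection scenarios; it is not the framework described in the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Scenario:
    fault_injected_at: Optional[float]   # seconds; None for nominal scenarios
    true_fault: Optional[str]
    detected_at: Optional[float]         # None if the DA never raised a detection
    isolated_fault: Optional[str]

def benchmark(scenarios):
    """Illustrative summary metrics for a diagnostic algorithm over recorded scenarios."""
    latencies, false_alarms, correct_isolations, faulty = [], 0, 0, 0
    for s in scenarios:
        if s.fault_injected_at is None:
            false_alarms += int(s.detected_at is not None)
            continue
        faulty += 1
        if s.detected_at is not None:
            latencies.append(s.detected_at - s.fault_injected_at)
        correct_isolations += int(s.isolated_fault == s.true_fault)
    return {
        "mean_detection_latency_s": sum(latencies) / len(latencies) if latencies else None,
        "false_alarms": false_alarms,
        "isolation_accuracy": correct_isolations / faulty if faulty else None,
    }

# Hypothetical fault names and timings for three recorded runs.
runs = [
    Scenario(10.0, "relay_stuck_open", 12.5, "relay_stuck_open"),
    Scenario(5.0, "sensor_bias", 9.0, "relay_stuck_open"),
    Scenario(None, None, None, None),
]
print(benchmark(runs))
```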
ERIC Educational Resources Information Center
Bishop, Dorothy V. M.; Whitehouse, Andrew J. O.; Watt, Helen J.; Line, Elizabeth A.
2008-01-01
Rates of diagnosis of autism have risen since 1980, raising the question of whether some children who previously had other diagnoses are now being diagnosed with autism. We applied contemporary diagnostic criteria for autism to adults with a history of developmental language disorder, to discover whether diagnostic substitution has taken place. A…
ERIC Educational Resources Information Center
Hus, Vanessa; Lord, Catherine
2014-01-01
The recently published Autism Diagnostic Observation Schedule, 2nd edition (ADOS-2) includes revised diagnostic algorithms and standardized severity scores for modules used to assess younger children. A revised algorithm and severity scores are not yet available for Module 4, used with verbally fluent adults. The current study revises the Module 4…
ERIC Educational Resources Information Center
Oosterling, Iris; Roos, Sascha; de Bildt, Annelies; Rommelse, Nanda; de Jonge, Maretha; Visser, Janne; Lappenschaar, Martijn; Swinkels, Sophie; van der Gaag, Rutger Jan; Buitelaar, Jan
2010-01-01
Recently, Gotham et al. ("2007") proposed revised algorithms for the Autism Diagnostic Observation Schedule (ADOS) with improved diagnostic validity. The aim of the current study was to replicate predictive validity, factor structure, and correlations with age and verbal and nonverbal IQ of the ADOS revised algorithms for Modules 1 and 2…
Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Jensen, David; Poll, Scott
2009-01-01
Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.
A Hybrid Neural Network-Genetic Algorithm Technique for Aircraft Engine Performance Diagnostics
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2001-01-01
In this paper, a model-based diagnostic method, which utilizes Neural Networks and Genetic Algorithms, is investigated. Neural networks are applied to estimate the engine internal health, and Genetic Algorithms are applied for sensor bias detection and estimation. This hybrid approach takes advantage of the nonlinear estimation capability provided by neural networks while improving the robustness to measurement uncertainty through the application of Genetic Algorithms. The hybrid diagnostic technique also has the ability to rank multiple potential solutions for a given set of anomalous sensor measurements in order to reduce false alarms and missed detections. The performance of the hybrid diagnostic technique is evaluated through some case studies derived from a turbofan engine simulation. The results show this approach is promising for reliable diagnostics of aircraft engines.
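The genetic-algorithm portion of such a scheme can be illustrated with a hedged sketch that estimates a constant bias on one sensor by minimizing residuals against a toy model of expected readings; the model, population settings, and bias value are assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "engine model": expected sensor readings for a healthy engine (assumed values).
expected = np.array([520.0, 14.7, 2200.0])
# Measured readings contain noise plus an unknown bias of +8.0 on sensor 0.
measured = expected + np.array([8.0, 0.0, 0.0]) + rng.normal(0, 0.2, 3)

def fitness(bias_vector: np.ndarray) -> float:
    """Negative squared error between measurements and model plus candidate biases."""
    return -np.sum((measured - (expected + bias_vector)) ** 2)

def genetic_search(pop_size=50, generations=100, mutation_sigma=0.5):
    pop = rng.normal(0, 5, size=(pop_size, expected.size))      # candidate bias vectors
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]                   # truncation selection
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        children = children + rng.normal(0, mutation_sigma, children.shape)  # mutation
        pop = np.vstack([parents, children])
    return pop[np.argmax([fitness(ind) for ind in pop])]

print(np.round(genetic_search(), 2))   # should land close to [8.0, 0.0, 0.0]
```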
Adeshina, A M; Hashim, R
2017-03-01
Diagnostic radiology is a core and integral part of modern medicine, supporting primary care physicians in disease diagnosis, treatment, and therapy management. Recent standard healthcare procedures have benefited immensely from contemporary information technology, which has transformed how diagnostic data are acquired, stored, and shared for efficient and timely diagnosis of disease. The connected health network was introduced as an alternative to the ageing traditional concept of the healthcare system, improving hospital-physician connectivity and clinical collaboration. This modern approach has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, but the challenge of successfully encrypting both image and textual data persists. Furthermore, achieving acceptable encryption-decryption processing times for medical datasets at a considerably lower computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secured radiology-diagnostic data framework for the connected health network using a high-performance, GPU-accelerated Advanced Encryption Standard. The framework was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patient notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's ability to encrypt and decrypt textual data in the form of medical reports. The framework not only accurately encrypts and decrypts medical image datasets, but also successfully encrypts and decrypts textual data in Microsoft Word, Microsoft Excel, and Portable Document Format files, the conventional formats for documenting medical records. The entire encryption and decryption procedure was achieved at a lower computational cost using regular hardware and software resources, without compromising either the quality of the decrypted data or the security level of the algorithms.
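As a hedged, CPU-based illustration of the encryption layer (not the GPU-accelerated implementation described in the study), the sketch below uses AES-256-GCM from the Python cryptography package to encrypt and decrypt arbitrary file bytes, whether an image or a textual report; the file name is hypothetical.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(path: str, key: bytes) -> bytes:
    """Encrypt any file (DICOM image, Word report, PDF, ...) as opaque bytes."""
    with open(path, "rb") as f:
        plaintext = f.read()
    nonce = os.urandom(12)                      # 96-bit nonce recommended for GCM
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                   # prepend nonce for later decryption

def decrypt_blob(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)       # AES-256 key
blob = encrypt_file("brain_mr_slice.dcm", key)  # hypothetical file name
assert decrypt_blob(blob, key) == open("brain_mr_slice.dcm", "rb").read()
```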
SWIM: A Semi-Analytical Ocean Color Inversion Algorithm for Optically Shallow Waters
NASA Technical Reports Server (NTRS)
McKinna, Lachlan I. W.; Werdell, P. Jeremy; Fearns, Peter R. C. S.; Weeks, Scarla J.; Reichstetter, Martina; Franz, Bryan A.; Shea, Donald M.; Feldman, Gene C.
2014-01-01
Ocean color remote sensing provides synoptic-scale, near-daily observations of marine inherent optical properties (IOPs). Whilst contemporary ocean color algorithms are known to perform well in deep oceanic waters, they have difficulty operating in optically clear, shallow marine environments where light reflected from the seafloor contributes to the water-leaving radiance. The effect of benthic reflectance in optically shallow waters is known to adversely affect algorithms developed for optically deep waters [1, 2]. Whilst adapted versions of optically deep ocean color algorithms have been applied to optically shallow regions with reasonable success [3], there is presently no approach that directly corrects for bottom reflectance using existing knowledge of bathymetry and benthic albedo. To address the issue of optically shallow waters, we have developed a semi-analytical ocean color inversion algorithm: the Shallow Water Inversion Model (SWIM). SWIM uses existing bathymetry and a derived benthic albedo map to correct for bottom reflectance using the semi-analytical model of Lee et al [4]. The algorithm was incorporated into the NASA Ocean Biology Processing Group's L2GEN program and tested in optically shallow waters of the Great Barrier Reef, Australia. In lieu of readily available in situ matchup data, we present a comparison between SWIM and two contemporary ocean color algorithms, the Generalized Inherent Optical Property Algorithm (GIOP) and the Quasi-Analytical Algorithm (QAA).
ERIC Educational Resources Information Center
Zander, Eric; Sturm, Harald; Bölte, Sven
2015-01-01
The diagnostic validity of the new research algorithms of the Autism Diagnostic Interview-Revised and the revised algorithms of the Autism Diagnostic Observation Schedule was examined in a clinical sample of children aged 18-47 months. Validity was determined for each instrument separately and their combination against a clinical consensus…
ERIC Educational Resources Information Center
Kim, So Hyun; Lord, Catherine
2012-01-01
Autism Diagnostic Interview-Revised (Rutter et al. in "Autism diagnostic interview-revised." Western Psychological Services, Los Angeles, 2003) diagnostic algorithms specific to toddlers and young preschoolers were created using 829 assessments of children aged from 12 to 47 months with ASD, nonspectrum disorders, and typical development. The…
SUMIE, HIROAKI; SUMIE, SHUJI; NAKAHARA, KEITA; WATANABE, YASUTOMO; MATSUO, KEN; MUKASA, MICHITA; SAKAI, TAKESHI; YOSHIDA, HIKARU; TSURUTA, OSAMU; SATA, MICHIO
2014-01-01
The usefulness of magnifying endoscopy with narrow-band imaging (ME-NBI) for the diagnosis of early gastric cancer is well known; however, there are no established evaluation criteria. The aim of this study was to devise and evaluate a novel diagnostic algorithm for ME-NBI in depressed early gastric cancer. Between August 2007 and May 2011, 90 patients with a total of 110 depressed gastric lesions were enrolled in the study. A diagnostic algorithm was devised based on ME-NBI microvascular findings: microvascular irregularity and abnormal microvascular patterns (fine network, corkscrew and unclassified patterns). The diagnostic efficiency of the algorithm for gastric cancer and histological grade was assessed by measuring its mean sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy. Furthermore, inter- and intra-observer variation were measured. In the differential diagnosis of gastric cancer from non-cancerous lesions, the mean sensitivity, specificity, PPV, NPV, and accuracy of the diagnostic algorithm were 86.7, 48.0, 94.4, 26.7, and 83.2%, respectively. Furthermore, in the differential diagnosis of undifferentiated adenocarcinoma from differentiated adenocarcinoma, the mean sensitivity, specificity, PPV, NPV, and accuracy of the diagnostic algorithm were 61.6, 86.3, 69.0, 84.8, and 79.1%, respectively. For the ME-NBI final diagnosis using this algorithm, the mean κ values for inter- and intra-observer agreement were 0.50 and 0.77, respectively. In conclusion, the diagnostic algorithm based on ME-NBI microvascular findings was convenient and had high diagnostic accuracy, reliability and reproducibility in the differential diagnosis of depressed gastric lesions. PMID:24649321
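The reported figures are the standard 2x2 diagnostic-test metrics plus Cohen's kappa; the sketch below shows how they are typically derived from a confusion matrix, using invented counts rather than the study's data.

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard 2x2 diagnostic-test metrics, expressed as percentages."""
    return {
        "sensitivity": 100 * tp / (tp + fn),
        "specificity": 100 * tn / (tn + fp),
        "ppv":         100 * tp / (tp + fp),
        "npv":         100 * tn / (tn + fn),
        "accuracy":    100 * (tp + tn) / (tp + fp + fn + tn),
    }

def cohens_kappa(tp: int, fp: int, fn: int, tn: int) -> float:
    """Chance-corrected agreement between two classifications of the same lesions."""
    n = tp + fp + fn + tn
    observed = (tp + tn) / n
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    return (observed - expected) / (1 - expected)

# Invented counts: 85 cancers and 25 non-cancerous lesions read by one endoscopist.
print(diagnostic_metrics(tp=74, fp=13, fn=11, tn=12))
print(round(cohens_kappa(tp=74, fp=13, fn=11, tn=12), 2))
```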
Computer-aided US diagnosis of breast lesions by using cell-based contour grouping.
Cheng, Jie-Zhi; Chou, Yi-Hong; Huang, Chiun-Sheng; Chang, Yeun-Chung; Tiu, Chui-Mei; Chen, Kuei-Wu; Chen, Chung-Ming
2010-06-01
The aims of this study were to develop a computer-aided diagnostic algorithm with automatic boundary delineation for the differential diagnosis of benign and malignant breast lesions at ultrasonography (US) and to investigate the effect of boundary quality on the performance of a computer-aided diagnostic algorithm. This was an institutional review board-approved retrospective study with waiver of informed consent. A cell-based contour grouping (CBCG) segmentation algorithm was used to delineate the lesion boundaries automatically. Seven morphologic features were extracted. The classifier was a logistic regression function. Five hundred twenty breast US scans were obtained from 520 subjects (age range, 15-89 years), including 275 benign (mean size, 15 mm; range, 5-35 mm) and 245 malignant (mean size, 18 mm; range, 8-29 mm) lesions. The newly developed computer-aided diagnostic algorithm was evaluated on the basis of boundary quality and differentiation performance. The segmentation algorithms and features in two conventional computer-aided diagnostic algorithms were used for comparative study. The CBCG-generated boundaries were shown to be comparable with the manually delineated boundaries. The area under the receiver operating characteristic curve (AUC) and differentiation accuracy were 0.968 ± 0.010 and 93.1% ± 0.7%, respectively, for all 520 breast lesions. At the 5% significance level, the newly developed algorithm was shown to be superior to the use of the boundaries and features of the two conventional computer-aided diagnostic algorithms in terms of AUC (0.974 ± 0.007 versus 0.890 ± 0.008 and 0.788 ± 0.024, respectively). The newly developed computer-aided diagnostic algorithm that used a CBCG segmentation method to measure boundaries achieved a high differentiation performance. Copyright RSNA, 2010
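A hedged sketch of the classification stage alone (a logistic regression over extracted morphologic features, evaluated by cross-validated AUC) is shown below on synthetic data; it does not implement the CBCG segmentation itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic stand-ins for seven morphologic features extracted from lesion contours
# (e.g., irregularity, aspect ratio); labels: 0 = benign, 1 = malignant.
n = 300
y = rng.integers(0, 2, n)
X = rng.normal(0, 1, (n, 7)) + y[:, None] * np.array([1.0, 0.8, 0.5, 0.3, 0.2, 0.1, 0.0])

clf = LogisticRegression(max_iter=1000)
scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print(f"cross-validated AUC = {roc_auc_score(y, scores):.3f}")
```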
Andreini, Daniele; Lin, Fay Y; Rizvi, Asim; Cho, Iksung; Heo, Ran; Pontone, Gianluca; Bartorelli, Antonio L; Mushtaq, Saima; Villines, Todd C; Carrascosa, Patricia; Choi, Byoung Wook; Bloom, Stephen; Wei, Han; Xing, Yan; Gebow, Dan; Gransar, Heidi; Chang, Hyuk-Jae; Leipsic, Jonathon; Min, James K
2018-06-01
Motion artifact can reduce the diagnostic accuracy of coronary CT angiography (CCTA) for coronary artery disease (CAD). The purpose of this study was to compare the diagnostic performance of an algorithm dedicated to correcting coronary motion artifact with the performance of standard reconstruction methods in a prospective international multicenter study. Patients referred for clinically indicated invasive coronary angiography (ICA) for suspected CAD prospectively underwent an investigational CCTA examination free from heart rate-lowering medications before they underwent ICA. Blinded core laboratory interpretations of motion-corrected and standard reconstructions for obstructive CAD (≥ 50% stenosis) were compared with ICA findings. Segments unevaluable owing to artifact were considered obstructive. The primary endpoint was per-subject diagnostic accuracy of the intracycle motion correction algorithm for obstructive CAD found at ICA. Among 230 patients who underwent CCTA with the motion correction algorithm and standard reconstruction, 92 (40.0%) had obstructive CAD on the basis of ICA findings. At a mean heart rate of 68.0 ± 11.7 beats/min, the motion correction algorithm reduced the number of nondiagnostic scans compared with standard reconstruction (20.4% vs 34.8%; p < 0.001). Diagnostic accuracy for obstructive CAD with the motion correction algorithm (62%; 95% CI, 56-68%) was not significantly different from that of standard reconstruction on a per-subject basis (59%; 95% CI, 53-66%; p = 0.28) but was superior on a per-vessel basis: 77% (95% CI, 74-80%) versus 72% (95% CI, 69-75%) (p = 0.02). The motion correction algorithm was superior in subgroups of patients with severely obstructive (≥ 70%) stenosis, heart rate ≥ 70 beats/min, and vessels in the atrioventricular groove. The motion correction algorithm studied reduces artifacts and improves diagnostic performance for obstructive CAD on a per-vessel basis and in selected subgroups on a per-subject basis.
Using qualitative research to inform development of a diagnostic algorithm for UTI in children.
de Salis, Isabel; Whiting, Penny; Sterne, Jonathan A C; Hay, Alastair D
2013-06-01
Diagnostic and prognostic algorithms can help reduce clinical uncertainty. The selection of candidate symptoms and signs to be measured in case report forms (CRFs) for potential inclusion in diagnostic algorithms needs to be comprehensive, clearly formulated and relevant for end users. The aim was to investigate whether qualitative methods could assist in designing CRFs in research developing diagnostic algorithms. Specifically, the study sought to establish whether qualitative methods could have assisted in designing the CRF for the Health Technology Assessment-funded Diagnosis of Urinary Tract infection in Young children (DUTY) study, which will develop a diagnostic algorithm to improve recognition of urinary tract infection (UTI) in children aged <5 years presenting acutely unwell to primary care. Qualitative methods were applied using semi-structured interviews of 30 UK doctors and nurses working with young children in primary care and a Children's Emergency Department. We elicited features that clinicians believed useful in diagnosing UTI and compared these for presence or absence and terminology with the DUTY CRF. Despite much agreement between clinicians' accounts and the DUTY CRFs, we identified a small number of potentially important symptoms and signs not included in the CRF and some included items that could have been reworded to improve understanding and final data analysis. This study uniquely demonstrates the role of qualitative methods in the design and content of CRFs used for developing diagnostic (and prognostic) algorithms. Research groups developing such algorithms should consider using qualitative methods to inform the selection and wording of candidate symptoms and signs.
ERIC Educational Resources Information Center
Pugliese, Cara E.; Kenworthy, Lauren; Bal, Vanessa Hus; Wallace, Gregory L.; Yerys, Benjamin E.; Maddox, Brenna B.; White, Susan W.; Popal, Haroon; Armour, Anna Chelsea; Miller, Judith; Herrington, John D.; Schultz, Robert T.; Martin, Alex; Anthony, Laura Gutermuth
2015-01-01
Recent updates have been proposed to the Autism Diagnostic Observation Schedule-2 Module 4 diagnostic algorithm. This new algorithm, however, has not yet been validated in an independent sample without intellectual disability (ID). This multi-site study compared the original and revised algorithms in individuals with ASD without ID. The revised…
SWIM: A Semi-Analytical Ocean Color Inversion Algorithm for Optically Shallow Waters
NASA Technical Reports Server (NTRS)
McKinna, Lachlan I. W.; Werdell, P. Jeremy; Fearns, Peter R. C. S.; Weeks, Scarla J.; Reichstetter, Martina; Franz, Bryan A.; Bailey, Sean W.; Shea, Donald M.; Feldman, Gene C.
2014-01-01
In clear shallow waters, light that is transmitted downward through the water column can reflect off the sea floor and thereby influence the water-leaving radiance signal. This effect can confound contemporary ocean color algorithms designed for deep waters where the seafloor has little or no effect on the water-leaving radiance. Thus, inappropriate use of deep water ocean color algorithms in optically shallow regions can lead to inaccurate retrievals of inherent optical properties (IOPs) and therefore have a detrimental impact on IOP-based estimates of marine parameters, including chlorophyll-a and the diffuse attenuation coefficient. In order to improve IOP retrievals in optically shallow regions, a semi-analytical inversion algorithm, the Shallow Water Inversion Model (SWIM), has been developed. Unlike established ocean color algorithms, SWIM considers both the water column depth and the benthic albedo. A radiative transfer study was conducted that demonstrated how SWIM and two contemporary ocean color algorithms, the Generalized Inherent Optical Properties algorithm (GIOP) and Quasi-Analytical Algorithm (QAA), performed in optically deep and shallow scenarios. The results showed that SWIM performed well, whilst both GIOP and QAA showed distinct positive bias in IOP retrievals in optically shallow waters. The SWIM algorithm was also applied to a test region: the Great Barrier Reef, Australia. Using a single test scene and time series data collected by NASA's MODIS-Aqua sensor (2002-2013), a comparison of IOPs retrieved by SWIM, GIOP and QAA was conducted.
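A much-simplified, single-wavelength illustration of why the seafloor matters is given below: subsurface remote-sensing reflectance is modeled as a water-column term that approaches its deep-water value with depth plus a bottom term that decays with depth. The two-way attenuation form and the coefficient values are illustrative simplifications, not the SWIM or Lee et al. parameterization.

```python
import numpy as np

def rrs_shallow(rrs_deep: float, bottom_albedo: float, kd: float, depth_m: np.ndarray):
    """Simplified subsurface remote-sensing reflectance over an optically shallow bottom.

    The water-column contribution grows toward its deep-water value as depth increases,
    while the bottom contribution (albedo/pi) is attenuated along the two-way path.
    """
    two_way = np.exp(-2.0 * kd * depth_m)
    return rrs_deep * (1.0 - two_way) + (bottom_albedo / np.pi) * two_way

depths = np.array([1.0, 5.0, 10.0, 30.0])           # metres
print(rrs_shallow(rrs_deep=0.005, bottom_albedo=0.3, kd=0.1, depth_m=depths))
# Over a bright sandy bottom the signal at 1-5 m is dominated by the seafloor,
# and it approaches the deep-water value by roughly 30 m.
```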
Ferreira, António Miguel; Marques, Hugo; Tralhão, António; Santos, Miguel Borges; Santos, Ana Rita; Cardoso, Gonçalo; Dores, Hélder; Carvalho, Maria Salomé; Madeira, Sérgio; Machado, Francisco Pereira; Cardim, Nuno; de Araújo Gonçalves, Pedro
2016-11-01
Current guidelines recommend the use of the Modified Diamond-Forrester (MDF) method to assess the pre-test likelihood of obstructive coronary artery disease (CAD). We aimed to compare the performance of the MDF method with two contemporary algorithms derived from multicenter trials that additionally incorporate cardiovascular risk factors: the calculator-based 'CAD Consortium 2' method, and the integer-based CONFIRM score. We assessed 1069 consecutive patients without known CAD undergoing coronary CT angiography (CCTA) for stable chest pain. Obstructive CAD was defined as the presence of coronary stenosis ≥50% on 64-slice dual-source CT. The three methods were assessed for calibration, discrimination, net reclassification, and changes in proposed downstream testing based upon calculated pre-test likelihoods. The observed prevalence of obstructive CAD was 13.8% (n=147). Overestimations of the likelihood of obstructive CAD were 140.1%, 9.8%, and 18.8%, respectively, for the MDF, CAD Consortium 2 and CONFIRM methods. The CAD Consortium 2 showed greater discriminative power than the MDF method, with a C-statistic of 0.73 vs. 0.70 (p<0.001), while the CONFIRM score did not (C-statistic 0.71, p=0.492). Reclassification of pre-test likelihood using the 'CAD Consortium 2' or CONFIRM scores resulted in a net reclassification improvement of 0.19 and 0.18, respectively, which would change the diagnostic strategy in approximately half of the patients. Newer risk factor-encompassing models allow for a more precise estimation of pre-test probabilities of obstructive CAD than the guideline-recommended MDF method. Adoption of these scores may improve disease prediction and change the diagnostic pathway in a significant proportion of patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
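A categorical net reclassification improvement of the kind reported above can be computed as in the following sketch; the reclassification counts are invented for illustration.

```python
def net_reclassification_improvement(events_up, events_down, n_events,
                                     nonevents_down, nonevents_up, n_nonevents):
    """Categorical NRI: credit upward moves for events and downward moves for non-events."""
    nri_events = (events_up - events_down) / n_events
    nri_nonevents = (nonevents_down - nonevents_up) / n_nonevents
    return nri_events + nri_nonevents

# Invented reclassification counts when moving from the MDF-style model to a newer score:
# of 147 patients with obstructive CAD, 30 move to a higher risk category and 10 lower;
# of 922 without obstructive CAD, 120 move lower and 75 higher.
print(round(net_reclassification_improvement(30, 10, 147, 120, 75, 922), 2))
```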
Diagnostic Utility of the ADI-R and DSM-5 in the Assessment of Latino Children and Adolescents.
Magaña, Sandy; Vanegas, Sandra B
2017-05-01
Latino children in the US are systematically underdiagnosed with Autism Spectrum Disorder (ASD); therefore, it is important that recent changes to the diagnostic process do not exacerbate this pattern of under-identification. Previous research has found that the Autism Diagnostic Interview-Revised (ADI-R) algorithm, based on the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR), has limitations with Latino children of Spanish speaking parents. We evaluated whether an ADI-R algorithm based on the new DSM-5 classification for ASD would be more sensitive in identifying Latino children of Spanish speaking parents who have a clinical diagnosis of ASD. Findings suggest that the DSM-5 algorithm shows better sensitivity than the DSM-IV-TR algorithm for Latino children.
Unlu, Ezgi; Akay, Bengu N; Erdem, Cengizhan
2014-07-01
Dermatoscopic analysis of melanocytic lesions using the CASH algorithm has rarely been described in the literature. The purpose of this study was to compare the sensitivity, specificity, and diagnostic accuracy rates of the ABCD rule of dermatoscopy, the seven-point checklist, the three-point checklist, and the CASH algorithm in the diagnosis and dermatoscopic evaluation of melanocytic lesions on the hairy skin. One hundred and fifteen melanocytic lesions of 115 patients were examined retrospectively using dermatoscopic images and compared with the histopathologic diagnosis. Four dermatoscopic algorithms were carried out for all lesions. The ABCD rule of dermatoscopy showed sensitivity of 91.6%, specificity of 60.4%, and diagnostic accuracy of 66.9%. The seven-point checklist showed sensitivity, specificity, and diagnostic accuracy of 87.5, 65.9, and 70.4%, respectively; the three-point checklist 79.1, 62.6, 66%; and the CASH algorithm 91.6, 64.8, and 70.4%, respectively. To our knowledge, this is the first study that compares the sensitivity, specificity and diagnostic accuracy of the ABCD rule of dermatoscopy, the three-point checklist, the seven-point checklist, and the CASH algorithm for the diagnosis of melanocytic lesions on the hairy skin. In our study, the ABCD rule of dermatoscopy and the CASH algorithm showed the highest sensitivity for the diagnosis of melanoma. © 2014 Japanese Dermatological Association.
Self-reported HIV-positive status but subsequent HIV-negative test result using rapid diagnostic testing algorithms among seven sub...
2017-07-07
HIV rapid diagnostic tests (RDTs) combined in an algorithm are the current standard for HIV diagnosis in many sub-Saharan African countries, and extensive laboratory testing has confirmed that HIV RDTs have excellent sensitivity and specificity. However...
Mahmood, Khalid; Jung, Chol-Hee; Philip, Gayle; Georgeson, Peter; Chung, Jessica; Pope, Bernard J; Park, Daniel J
2017-05-16
Genetic variant effect prediction algorithms are used extensively in clinical genomics and research to determine the likely consequences of amino acid substitutions on protein function. It is vital that we better understand their accuracies and limitations because published performance metrics are confounded by serious problems of circularity and error propagation. Here, we derive three independent, functionally determined human mutation datasets, UniFun, BRCA1-DMS and TP53-TA, and employ them, alongside previously described datasets, to assess the pre-eminent variant effect prediction tools. Apparent accuracies of variant effect prediction tools were influenced significantly by the benchmarking dataset. Benchmarking with the assay-determined datasets UniFun and BRCA1-DMS yielded areas under the receiver operating characteristic curves in the modest ranges of 0.52 to 0.63 and 0.54 to 0.75, respectively, considerably lower than observed for other, potentially more conflicted datasets. These results raise concerns about how such algorithms should be employed, particularly in a clinical setting. Contemporary variant effect prediction tools are unlikely to be as accurate at the general prediction of functional impacts on proteins as previously reported. Use of functional assay-based datasets that avoid prior dependencies promises to be valuable for the ongoing development and accurate benchmarking of such tools.
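The benchmarking measure referred to above, the area under the receiver operating characteristic curve, can be computed as in this sketch for a hypothetical tool's scores against assay-determined labels.

```python
from sklearn.metrics import roc_auc_score

# Hypothetical benchmark: assay-determined labels (1 = damaging, 0 = tolerated)
# and one prediction tool's scores for the same amino acid substitutions.
labels = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
scores = [0.91, 0.40, 0.77, 0.35, 0.62, 0.55, 0.10, 0.48, 0.80, 0.30]

print(f"AUC = {roc_auc_score(labels, scores):.2f}")
```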
Eliseev, Platon; Balantcev, Grigory; Nikishova, Elena; Gaida, Anastasia; Bogdanova, Elena; Enarson, Donald; Ornstein, Tara; Detjen, Anne; Dacombe, Russell; Gospodarevskaya, Elena; Phillips, Patrick P J; Mann, Gillian; Squire, Stephen Bertel; Mariandyshev, Andrei
2016-01-01
In the Arkhangelsk region of Northern Russia, multidrug-resistant (MDR) tuberculosis (TB) rates in new cases are amongst the highest in the world. In 2014, MDR-TB rates reached 31.7% among new cases and 56.9% among retreatment cases. The development of new diagnostic tools allows for faster detection of both TB and MDR-TB and should lead to reduced transmission by earlier initiation of anti-TB therapy. The PROVE-IT (Policy Relevant Outcomes from Validating Evidence on Impact) Russia study aimed to assess the impact of implementing line probe assay (LPA) as part of an LPA-based diagnostic algorithm for patients with presumptive MDR-TB, with time from the first care-seeking visit to the initiation of MDR-TB treatment, rather than diagnostic accuracy, as the primary outcome, and to assess treatment outcomes. We hypothesized that the implementation of LPA would result in a faster time to treatment initiation and better treatment outcomes. A culture-based diagnostic algorithm used prior to LPA implementation was compared to an LPA-based algorithm that replaced BacTAlert and Löwenstein Jensen (LJ) for drug sensitivity testing. A total of 295 MDR-TB patients were included in the study, 163 diagnosed with the culture-based algorithm and 132 with the LPA-based algorithm. Among smear-positive patients, the implementation of the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 50 and 66 days compared to the culture-based algorithm (BacTAlert and LJ respectively, p<0.001). In smear-negative patients, the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 78 days when compared to the culture-based algorithm (LJ, p<0.001). However, several weeks were still needed for treatment initiation with the LPA-based algorithm: 24 days in smear-positive and 62 days in smear-negative patients. Overall treatment outcomes were better with the LPA-based algorithm than with the culture-based algorithm (p = 0.003). Treatment success rates at 20 months of treatment were higher in patients diagnosed with the LPA-based algorithm (65.2%) than in those diagnosed with the culture-based algorithm (44.8%). Mortality was also lower in the LPA-based algorithm group (7.6%) compared to the culture-based algorithm group (15.9%). There was no statistically significant difference in smear and culture conversion rates between the two algorithms. The results of the study suggest that the introduction of LPA leads to a faster time to MDR diagnosis and earlier treatment initiation, as well as better treatment outcomes for patients with MDR-TB. These findings also highlight the need for further improvements within the health system to reduce both patient and diagnostic delays to truly optimize the impact of new, rapid diagnostics.
Pugliese, Cara E; Kenworthy, Lauren; Bal, Vanessa Hus; Wallace, Gregory L; Yerys, Benjamin E; Maddox, Brenna B; White, Susan W; Popal, Haroon; Armour, Anna Chelsea; Miller, Judith; Herrington, John D; Schultz, Robert T; Martin, Alex; Anthony, Laura Gutermuth
2015-12-01
Recent updates have been proposed to the Autism Diagnostic Observation Schedule-2 Module 4 diagnostic algorithm. This new algorithm, however, has not yet been validated in an independent sample without intellectual disability (ID). This multi-site study compared the original and revised algorithms in individuals with ASD without ID. The revised algorithm demonstrated increased sensitivity, but lower specificity in the overall sample. Estimates were highest for females, individuals with a verbal IQ below 85 or above 115, and ages 16 and older. Best practice diagnostic procedures should include the Module 4 in conjunction with other assessment tools. Balancing needs for sensitivity and specificity depending on the purpose of assessment (e.g., clinical vs. research) and demographic characteristics mentioned above will enhance its utility.
Hus, Vanessa; Lord, Catherine
2014-08-01
The recently published Autism Diagnostic Observation Schedule, 2nd edition (ADOS-2) includes revised diagnostic algorithms and standardized severity scores for modules used to assess younger children. A revised algorithm and severity scores are not yet available for Module 4, used with verbally fluent adults. The current study revises the Module 4 algorithm and calibrates raw overall and domain totals to provide metrics of autism spectrum disorder (ASD) symptom severity. Sensitivity and specificity of the revised Module 4 algorithm exceeded 80 % in the overall sample. Module 4 calibrated severity scores provide quantitative estimates of ASD symptom severity that are relatively independent of participant characteristics. These efforts increase comparability of ADOS scores across modules and should facilitate efforts to examine symptom trajectories from toddler to adulthood.
Implementation of an Algorithm for Prosthetic Joint Infection: Deviations and Problems.
Mühlhofer, Heinrich M L; Kanz, Karl-Georg; Pohlig, Florian; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; von Eisenhart-Rothe, Ruediger; Schauwecker, Johannes
The outcome of revision surgery in arthroplasty is based on a precise diagnosis. In addition, the treatment varies based on whether the prosthetic failure is caused by aseptic or septic loosening. Algorithms can help to identify periprosthetic joint infections (PJI) and standardize diagnostic steps; however, algorithms tend to oversimplify the treatment of complex cases. We conducted a process analysis during the implementation of a PJI algorithm to determine problems and deviations associated with the implementation of this algorithm. Fifty patients who were treated after implementing a standardized algorithm were monitored retrospectively. Their treatment plans and diagnostic cascades were analyzed for deviations from the implemented algorithm. Each diagnostic procedure was recorded, compared with the algorithm, and evaluated statistically. We detected 52 deviations while treating 50 patients. In 25 cases, no discrepancy was observed. Synovial fluid aspiration was not performed in 31.8% of patients (95% confidence interval [CI], 18.1%-45.6%), while white blood cell counts (WBCs) and neutrophil differentiation were assessed in 54.5% of patients (95% CI, 39.8%-69.3%). We also observed that the prolonged incubation of cultures was not requested in 13.6% of patients (95% CI, 3.5%-23.8%). In seven of 13 cases (63.6%; 95% CI, 35.2%-92.1%), arthroscopic biopsy was performed; 6 arthroscopies were performed in discordance with the algorithm (12%; 95% CI, 3%-21%). Self-critical analysis of diagnostic processes and monitoring of deviations using algorithms are important and could increase the quality of treatment by revealing recurring faults.
Strategy in the Surgical Treatment of Primary Spinal Tumors
Williams, Richard; Foote, Matthew; Deverall, Hamish
2012-01-01
Primary spine tumors are rare, accounting for only 4% of all tumors of the spine. A minority of the more common primary benign lesions will require surgical treatment, and most amenable malignant lesions will proceed to attempted resection. The rarity of malignant primary lesions has resulted in a paucity of historical data regarding optimal surgical and adjuvant treatment and, although we now derive benefit from standardized guidelines of overall care, management of each neoplasm often proceeds on a case-by-case basis, taking into account the individual characteristics of patient operability, tumor resectability, and biological potential. This article aims to provide an overview of diagnostic techniques, staging algorithms and the authors' experience of surgical treatment alternatives that have been employed in the care of selected benign and malignant lesions. Although broadly a review of contemporary management, it is hoped that the case illustrations given will serve as additional “arrows in the quiver” of the treating surgeon. PMID:24353976
Current perspectives of CASA applications in diverse mammalian spermatozoa.
van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S
2018-03-26
Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.
Smeets, Miek; Degryse, Jan; Janssens, Stefan; Matheï, Catharina; Wallemacq, Pierre; Vanoverschelde, Jean-Louis; Aertgeerts, Bert; Vaes, Bert
2016-10-06
Different diagnostic algorithms for non-acute heart failure (HF) exist. Our aim was to compare the ability of these algorithms to identify HF in symptomatic patients aged 80 years and older and identify those patients at highest risk for mortality. Diagnostic accuracy and validation study. General practice, Belgium. 365 patients with HF symptoms aged 80 years and older (BELFRAIL cohort). Participants underwent a full clinical assessment, including a detailed echocardiographic examination at home. The diagnostic accuracy of 4 different algorithms was compared using an intention-to-diagnose analysis. The European Society of Cardiology (ESC) definition of HF was used as the reference standard for HF diagnosis. Kaplan-Meier curves for 5-year all-cause mortality were plotted and HRs and corresponding 95% CIs were calculated to compare the mortality risk predicting abilities of the different algorithms. Net reclassification improvement (NRI) was calculated. The prevalence of HF was 20% (n=74). The 2012 ESC algorithm yielded the highest sensitivity (92%, 95% CI 83% to 97%) as well as the highest referral rate (71%, n=259), whereas the Oudejans algorithm yielded the highest specificity (73%, 95% CI 68% to 78%) and the lowest referral rate (36%, n=133). These differences could be ascribed to differences in N-terminal pro-brain natriuretic peptide cut-off values (125 vs 400 pg/mL). The Kelder and Oudejans algorithms exhibited NRIs of 12% (95% CI 0.7% to 22%, p=0.04) and 22% (95% CI 9% to 32%, p<0.001), respectively, compared with the ESC algorithm. All algorithms detected patients at high risk for mortality, with HRs ranging from 1.9 (95% CI 1.4 to 2.5; Kelder) to 2.3 (95% CI 1.7 to 3.1; Oudejans). No significant differences were observed among the algorithms with respect to mortality risk predicting abilities. Choosing a diagnostic algorithm for non-acute HF in elderly patients represents a trade-off between sensitivity and specificity, mainly depending on differences between cut-off values for natriuretic peptides. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
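The net reclassification improvement used above to compare the Kelder and Oudejans algorithms with the ESC algorithm can be computed from paired referral decisions against the echocardiographic reference standard. The sketch below is a minimal two-category (refer / do not refer) NRI on made-up decisions, not the BELFRAIL data.

```python
# Minimal sketch of a two-category net reclassification improvement (NRI):
# compares a "new" referral algorithm against an "old" one, using the
# reference-standard HF diagnosis. Data below are illustrative only.

def nri(reference, old_positive, new_positive):
    """reference: 1 if HF present; old/new_positive: 1 if the algorithm refers."""
    events = [(o, n) for r, o, n in zip(reference, old_positive, new_positive) if r == 1]
    nonevents = [(o, n) for r, o, n in zip(reference, old_positive, new_positive) if r == 0]

    # "up" = reclassified from negative to positive by the new algorithm.
    up_e = sum(1 for o, n in events if n > o)
    down_e = sum(1 for o, n in events if n < o)
    up_ne = sum(1 for o, n in nonevents if n > o)
    down_ne = sum(1 for o, n in nonevents if n < o)

    return (up_e - down_e) / len(events) + (down_ne - up_ne) / len(nonevents)

reference    = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
old_positive = [1, 0, 0, 1, 0, 1, 0, 1, 0, 1]
new_positive = [1, 1, 0, 0, 0, 1, 0, 1, 0, 0]
print(f"NRI = {nri(reference, old_positive, new_positive):+.2f}")
```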
Cho-Vega, Jeong Hee
2016-07-01
Atypical spitzoid tumors are a morphologically diverse group of rare melanocytic lesions most frequently seen in children and young adults. As atypical spitzoid tumors bear striking resemblance to Spitz nevus and spitzoid melanomas clinically and histopathologically, it is crucial to determine their malignant potential and predict their clinical behavior. To date, many researchers have attempted to differentiate atypical spitzoid tumors from unequivocal melanomas based on morphological, immunohistochemical, and molecular diagnostic differences. A diagnostic algorithm is proposed here to assess the malignant potential of atypical spitzoid tumors by using a combination of immunohistochemical and cytogenetic/molecular tests. Together with classical morphological evaluation, this algorithm includes a set of immunohistochemistry assays (p16(Ink4a), a dual-color Ki67/MART-1, and HMB45), fluorescence in situ hybridization (FISH) with five probes (6p25, 8q24, 11q13, CEN9, and 9p21), and an array-based comparative genomic hybridization. This review discusses details of the algorithm, the rationale of each test used in the algorithm, and the utility of this algorithm in routine dermatopathology practice. This algorithmic approach will provide a comprehensive diagnostic tool that complements conventional histological criteria and will significantly contribute to improving the diagnosis and prediction of the clinical behavior of atypical spitzoid tumors.
Portable Health Algorithms Test System
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Wong, Edmond; Fulton, Christopher E.; Sowers, Thomas S.; Maul, William A.
2010-01-01
A document discusses the Portable Health Algorithms Test (PHALT) System, which has been designed as a means for evolving the maturity and credibility of algorithms developed to assess the health of aerospace systems. Comprising an integrated hardware-software environment, the PHALT system allows systems health management algorithms to be developed in a graphical programming environment, to be tested and refined using system simulation or test data playback, and to be evaluated in a real-time hardware-in-the-loop mode with a live test article. The integrated hardware and software development environment provides a seamless transition from algorithm development to real-time implementation. The portability of the hardware makes it quick and easy to transport between test facilities. This hardware/software architecture is flexible enough to support a variety of diagnostic applications and test hardware, and the GUI-based rapid prototyping capability is sufficient to support development, execution, and testing of custom diagnostic algorithms. The PHALT operating system supports execution of diagnostic algorithms under real-time constraints. PHALT can perform real-time capture and playback of test rig data with the ability to augment/modify the data stream (e.g., inject simulated faults). It performs algorithm testing using a variety of data input sources, including real-time data acquisition, test data playback, and system simulations, and also provides system feedback to evaluate closed-loop diagnostic response and mitigation control.
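The playback-with-fault-injection capability described above can be pictured, in software terms, as a loop that streams recorded samples to a diagnostic routine while optionally perturbing selected channels. The sketch below is a generic illustration of that idea; the channel names, bias fault model and threshold check are assumptions, not the PHALT implementation.

```python
# Generic sketch of test-data playback with injected faults feeding a
# diagnostic routine (illustrative of the idea, not the PHALT system).

def playback(records, inject=None):
    """Yield samples one at a time, applying an optional fault injector."""
    for sample in records:
        sample = dict(sample)
        if inject is not None:
            sample = inject(sample)
        yield sample

def bias_fault(channel, offset):
    """Return an injector that adds a constant bias to one channel."""
    def inject(sample):
        sample[channel] = sample[channel] + offset
        return sample
    return inject

def simple_diagnostic(sample, limit=5.0):
    """Flag any channel whose value exceeds a fixed limit (toy logic)."""
    return [ch for ch, v in sample.items() if abs(v) > limit]

recorded = [{"bus_voltage": 3.1, "pump_current": 1.2},
            {"bus_voltage": 3.0, "pump_current": 1.3}]

for sample in playback(recorded, inject=bias_fault("pump_current", 6.0)):
    flags = simple_diagnostic(sample)
    print(sample, "->", flags or "nominal")
```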
Image quality enhancement for skin cancer optical diagnostics
NASA Astrophysics Data System (ADS)
Bliznuks, Dmitrijs; Kuzmina, Ilona; Bolocko, Katrina; Lihachev, Alexey
2017-12-01
The research presents image quality analysis and enhancement proposals in the biophotonics area. The sources of image problems are reviewed and analyzed. The problems with the greatest impact in biophotonics are analyzed in the context of a specific task: skin cancer diagnostics. The results indicate that the main problem for skin cancer analysis is uneven skin illumination. Since it is often not possible to prevent illumination problems, the paper proposes an image post-processing algorithm based on low-frequency filtering. Practical results show an improvement in diagnostic results after applying the proposed filter. Moreover, the filter does not reduce diagnostic quality for images without illumination defects. The current filtering algorithm requires empirical tuning of its parameters. Further work is needed to test the algorithm in other biophotonic applications and to propose automatic filter parameter selection.
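One common way to suppress slowly varying illumination, in the spirit of the low-frequency filtering proposed above, is to estimate the illumination field with a heavy Gaussian blur and divide it out. The sketch below applies that generic approach to a synthetic image; the sigma value and the divide-out scheme are assumptions, not the authors' exact filter.

```python
# Sketch: flat-field style correction of a slowly varying illumination
# gradient by dividing out a heavily blurred (low-frequency) estimate.
# Generic illustration; parameters are not taken from the paper.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)

# Synthetic "skin" image: fine texture plus a strong illumination gradient.
texture = rng.normal(1.0, 0.05, size=(256, 256))
_, xx = np.mgrid[0:256, 0:256]
illumination = 0.5 + 0.5 * (xx / 255.0)          # brighter on the right
image = texture * illumination

# Estimate the low-frequency illumination and divide it out.
low_freq = gaussian_filter(image, sigma=40)       # sigma chosen empirically
corrected = image / np.clip(low_freq, 1e-6, None)

print("std of column means before:", image.mean(axis=0).std().round(3))
print("std of column means after: ", corrected.mean(axis=0).std().round(3))
```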
Kros, Johan M; Huizer, Karin; Hernández-Laín, Aurelio; Marucci, Gianluca; Michotte, Alex; Pollo, Bianca; Rushing, Elisabeth J; Ribalta, Teresa; French, Pim; Jaminé, David; Bekka, Nawal; Lacombe, Denis; van den Bent, Martin J; Gorlia, Thierry
2015-06-10
With the rapid discovery of prognostic and predictive molecular parameters for glioma, the status of histopathology in the diagnostic process should be scrutinized. Our project aimed to construct a diagnostic algorithm for gliomas based on molecular and histologic parameters with independent prognostic values. The pathology slides of 636 patients with gliomas who had been included in the EORTC 26951 and 26882 trials were reviewed using virtual microscopy by a panel of six neuropathologists who independently scored 18 histologic features and provided an overall diagnosis. The molecular data for IDH1, 1p/19q loss, EGFR amplification, loss of chromosome 10 and chromosome arm 10q, gain of chromosome 7, and hypermethylation of the promoter of MGMT were available for some of the cases. The slides were divided into discovery (n = 426) and validation (n = 210) sets. The diagnostic algorithm resulting from analysis of the discovery set was validated in the latter. In 66% of cases, consensus on the overall diagnosis was present. A diagnostic algorithm consisting of two molecular markers and one consensus histologic feature was created by conditional inference tree analysis. The order of prognostic significance was: 1p/19q loss, EGFR amplification, and astrocytic morphology, which resulted in the identification of four diagnostic nodes. Validation of the nodes in the validation set confirmed the prognostic value (P < .001). We succeeded in creating a timely diagnostic algorithm for anaplastic glioma based on multivariable analysis of consensus histopathology and molecular parameters. © 2015 by American Society of Clinical Oncology.
Hus, Vanessa; Lord, Catherine
2014-01-01
The Autism Diagnostic Observation Schedule, 2nd Edition includes revised diagnostic algorithms and standardized severity scores for modules used to assess children and adolescents of varying language abilities. Comparable revisions have not yet been applied to the Module 4, used with verbally fluent adults. The current study revises the Module 4 algorithm and calibrates raw overall and domain totals to provide metrics of ASD symptom severity. Sensitivity and specificity of the revised Module 4 algorithm exceeded 80% in the overall sample. Module 4 calibrated severity scores provide quantitative estimates of ASD symptom severity that are relatively independent of participant characteristics. These efforts increase comparability of ADOS scores across modules and should facilitate efforts to increase understanding of adults with ASD. PMID:24590409
Shah, Benoy N; MacNab, Anita; Lynch, Jane; Hampson, Reinette; Senior, Roxy; Steeds, Richard P
2018-01-01
Stress echocardiography (SE) is a widely utilised test in patients with known or suspected coronary artery disease (CAD), valvular heart disease and cardiomyopathies. Its advantages include the ubiquitous availability of echocardiography, lack of ionising radiation, choice of physiological or pharmacological stressors, good diagnostic accuracy and a robust supporting evidence base. SE has evolved significantly as a technique over the past three decades and has benefitted considerably from improvements in overall image quality (superior resolution), machine technology (e.g. digital cine-loop acquisition and side-by-side image display) and the development of second-generation ultrasound contrast agents that have improved reader confidence and diagnostic accuracy. The purpose of this article is to review the breadth of SE in contemporary clinical cardiology and discuss the recently launched British Society of Echocardiography (BSE) Stress Echocardiography accreditation scheme. PMID:29358185
Translocations, inversions and other chromosome rearrangements.
Morin, Scott J; Eccles, Jennifer; Iturriaga, Amanda; Zimmerman, Rebekah S
2017-01-01
Chromosomal rearrangements have long been known to significantly impact fertility and miscarriage risk. Advancements in molecular diagnostics are challenging contemporary clinicians and patients in accurately characterizing the reproductive risk of a given abnormality. Initial attempts at preimplantation genetic diagnosis were limited by the inability to simultaneously evaluate aneuploidy and missed up to 70% of aneuploidy in chromosomes unrelated to the rearrangement. Contemporary platforms are more accurate and less susceptible to technical errors. These techniques also offer the ability to improve outcomes through diagnosis of uniparental disomy and may soon be able to consistently distinguish between normal and balanced translocation karyotypes. Although an accurate projection of the anticipated number of unbalanced embryos is not possible at present, confirmation of normal/balanced status results in high pregnancy rates (PRs) and diagnostic accuracy. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Xu, Li-Qian; Yang, Yun-Mei; Tong, Hong; Xu, Chang-Fu
2018-04-01
Although cardiac troponin is the cornerstone in the diagnosis of acute myocardial infarction (AMI), its accuracy is still suboptimal in the early hours after chest pain onset. Due to its small size, heart-type fatty acid-binding protein (H-FABP) has been reported to be accurate in the diagnosis of AMI; however, this remains undetermined. The aim is to investigate the diagnostic performance of H-FABP alone and in conjunction with high-sensitivity troponin (hs-Tn) within 6 hours of symptom onset. Furthermore, accuracy in the 0h/3h algorithm was also assessed. Medline and EMBASE databases were searched; sensitivity, specificity and area under the ROC curve (AUC) were used as measures of diagnostic accuracy. We pooled data using bivariate modelling; threshold effect and publication bias analyses were applied to assess heterogeneity. Twenty-two studies with a total of 6602 participants were included; pooled sensitivity, specificity and AUC of H-FABP were 0.75 (0.68-0.81), 0.81 (0.75-0.86) and 0.85 (0.82-0.88) within 6 hours. Similar sensitivity (0.76, 0.69-0.82), specificity (0.80, 0.71-0.87) and AUC (0.85, 0.82-0.88) of H-FABP were observed in 4185 (63%) patients in the 0h/3h algorithm. The additional use of H-FABP improved the sensitivity of hs-Tn alone but worsened its specificity (all p<0.001), and resulted in no improvement of AUC (p>0.99). There was no threshold effect (p=0.18) or publication bias (p=0.31) in this study. H-FABP has modest accuracy for early diagnosis of AMI within 3 and 6 hours of symptom onset. The incremental value of H-FABP in addition to hs-Tn seemed much smaller and of uncertain clinical significance in patients with suspected AMI. Routine use of H-FABP in early presentation does not seem warranted. Copyright © 2017 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
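The accuracy measures pooled above all derive from the basic two-by-two table of a diagnostic study. The sketch below computes sensitivity, specificity, likelihood ratios and the diagnostic odds ratio for a single hypothetical study; the counts are illustrative, and the calculation ignores the bivariate random-effects pooling used in the meta-analysis.

```python
# Per-study diagnostic accuracy from a 2x2 table (illustrative counts only;
# real meta-analytic pooling would use a bivariate random-effects model).

def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)           # positive likelihood ratio
    lr_neg = (1 - sens) / spec           # negative likelihood ratio
    dor = lr_pos / lr_neg                # diagnostic odds ratio
    return {"sensitivity": sens, "specificity": spec,
            "LR+": lr_pos, "LR-": lr_neg, "DOR": dor}

# Hypothetical H-FABP-style study counts (not taken from any included study).
for k, v in diagnostic_metrics(tp=75, fp=40, fn=25, tn=160).items():
    print(f"{k}: {v:.2f}")
```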
ERIC Educational Resources Information Center
de Bildt, Annelies; Sytema, Sjoerd; Meffert, Harma; Bastiaansen, Jojanneke A. C. J.
2016-01-01
This study examined the discriminative ability of the revised Autism Diagnostic Observation Schedule module 4 algorithm (Hus and Lord in "J Autism Dev Disord" 44(8):1996-2012, 2014) in 93 Dutch males with Autism Spectrum Disorder (ASD), schizophrenia, psychopathy or controls. Discriminative ability of the revised algorithm ASD cut-off…
Medley, S S; Donné, A J H; Kaita, R; Kislyakov, A I; Petrov, M P; Roquemore, A L
2008-01-01
An overview of developments since circa 1980 in the instrumentation and application of charge exchange neutral particle diagnostics on magnetic fusion energy experiments is presented. First, spectrometers that employ only electric fields and hence provide ion energy resolution but not mass resolution are discussed. Next, spectrometers that use various geometrical combinations of both electric and magnetic fields to provide both energy and mass resolution are reviewed. Finally, neutral particle diagnostics based on the utilization of time-of-flight techniques are presented.
Cone beam tomographic imaging anatomy of the maxillofacial region.
Angelopoulos, Christos
2008-10-01
Multiplanar imaging is a fairly new concept in diagnostic imaging available with a number of contemporary imaging modalities such as CT, MR imaging, diagnostic ultrasound, and others. This modality allows reconstruction of images in different planes (flat or curved) from a volume of data that was acquired previously. This concept makes the diagnostic process more interactive, and proper use may increase diagnostic potential. At the same time, the complexity of the anatomical structures of the maxillofacial region may make these images harder to interpret. This article reviews the anatomy of maxillofacial structures on planar imaging, and more specifically on cone-beam CT images.
An accelerated diagnostic protocol for the early, safe discharge of low-risk chest pain patients.
Altherwi, Tawfeeq; Grad, Willis B
2015-07-01
Can an accelerated 2-hour diagnostic protocol using the cardiac troponin I (cTnI) measurement as the only biomarker be implemented to allow an earlier and safe discharge of low-risk chest pain patients? Than M, Cullen L, Aldous S, et al. 2-Hour accelerated diagnostic protocol to assess patients with chest pain symptoms using contemporary troponins as the only biomarker: the ADAPT trial. J Am Coll Cardiol 2012;59(23):2091-8. To determine whether an accelerated diagnostic protocol (ADP) for possible cardiac chest pain could identify low-risk patients suitable for early discharge using cTnI as the sole biomarker.
Issues in contemporary and potential future molecular diagnostics for dengue.
Sekaran, Shamala Devi; Soe, Hui Jen
2017-03-01
Dengue has been the most common arbovirus infection worldwide, with 2.5 billion people living in over 100 endemic tropical and subtropical regions. Due to the high number of asymptomatic cases and the signs and symptoms being rather nonspecific, dengue cases are often under-reported, which might influence dengue surveillance programs. Therefore, a rapid, easy to use, inexpensive, and highly sensitive and specific diagnostic tool is essential for early and accurate diagnosis to ease the clinical management of patients as well as for the development of new interventions. Areas covered: This report discusses contemporary dengue diagnostic tools, mainly from the aspect of molecular diagnosis, and includes an overview of several nucleic acid amplification tests. Potential molecular diagnostic tools such as biosensors and microarrays are also discussed in this report. Expert commentary: Rapidness and accuracy in terms of sensitivity and specificity are imperative in dengue diagnosis, for both clinical management and surveillance of dengue, to ensure that early treatment and corrective control measures can be carried out. In the next five years it is expected that there will be newer tests developed using not only lateral flow techniques but, more specifically, biosensors and nanotechnology. These new technologies will have to be validated with the appropriate number and category of samples and will need to address the issue of cross-reactivity.
NASA Technical Reports Server (NTRS)
Ricks, Brian W.; Mengshoel, Ole J.
2009-01-01
Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.
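ProDiagnose compiles Bayesian network models to arithmetic circuits for efficiency; the sketch below only illustrates the underlying probabilistic idea, using brute-force enumeration over single-fault hypotheses and noisy pass/fail sensor readings. The component names, priors and likelihoods are invented.

```python
# Toy probabilistic diagnosis by enumeration: posterior over single-fault
# hypotheses given noisy pass/fail sensor readings. This only illustrates
# the Bayesian idea; it is not the ProDiagnose arithmetic-circuit approach.

priors = {"healthy": 0.95, "battery_fault": 0.02,
          "relay_fault": 0.02, "sensor_fault": 0.01}

# P(sensor reads "fail") under each hypothesis (invented numbers).
fail_prob = {
    "voltage_ok":   {"healthy": 0.01, "battery_fault": 0.90,
                     "relay_fault": 0.10, "sensor_fault": 0.50},
    "load_powered": {"healthy": 0.01, "battery_fault": 0.80,
                     "relay_fault": 0.95, "sensor_fault": 0.05},
}

def posterior(observations):
    """observations: dict mapping sensor name -> 'fail' or 'pass'."""
    unnorm = {}
    for h, p in priors.items():
        like = 1.0
        for sensor, reading in observations.items():
            pf = fail_prob[sensor][h]
            like *= pf if reading == "fail" else (1.0 - pf)
        unnorm[h] = p * like
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

obs = {"voltage_ok": "pass", "load_powered": "fail"}
for h, p in sorted(posterior(obs).items(), key=lambda kv: -kv[1]):
    print(f"{h}: {p:.3f}")
```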
NASA Astrophysics Data System (ADS)
Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.
2010-03-01
A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.
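The multivariate classification and cross-validation described above can be illustrated with a generic cross-validated classifier on synthetic kinetics features. The feature construction and classifier choice below are assumptions, not the published statistical algorithms.

```python
# Generic sketch of cross-validated classification of acetowhitening
# kinetics features (synthetic data; not the published algorithm).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_sites = 137

# Toy features per site: e.g., peak whitening intensity and decay rate.
labels = rng.integers(0, 2, n_sites)               # 1 = CIN lesion (toy label)
features = np.column_stack([
    labels * 0.8 + rng.normal(0, 1, n_sites),      # peak intensity
    labels * 0.5 + rng.normal(0, 1, n_sites),      # decay rate
])

clf = LogisticRegression()
auc_scores = cross_val_score(clf, features, labels, cv=5, scoring="roc_auc")
print("cross-validated AUC per fold:", auc_scores.round(2))
print("mean AUC:", auc_scores.mean().round(2))
```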
FPGA based charge acquisition algorithm for soft x-ray diagnostics system
NASA Astrophysics Data System (ADS)
Wojenski, A.; Kasprowicz, G.; Pozniak, K. T.; Zabolotny, W.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Krawczyk, R. D.; Zienkiewicz, P.; Chernyshova, M.; Czarski, T.
2015-09-01
Soft X-ray (SXR) measurement systems working in tokamaks or with laser-generated plasma can expect high photon fluxes. It is therefore necessary to focus on data processing algorithms to achieve the best possible efficiency in terms of processed photon events per second. This paper describes a recently designed algorithm and data flow for implementing charge data acquisition in an FPGA. The algorithms are currently at the implementation stage for the soft X-ray diagnostics system. In addition to the charge processing algorithm, the paper also provides a general firmware overview and describes data storage methods and other key components of the measurement system. The simulation section presents the algorithm's performance and the expected maximum photon rate.
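In software terms, the charge acquisition described above amounts to detecting threshold crossings in the sampled detector waveform and integrating the pulse area for each photon event. The sketch below is a simplified host-side model of that processing on synthetic samples; the threshold, window length and baseline handling are assumptions, not the FPGA firmware.

```python
# Simplified software model of charge acquisition: find threshold
# crossings in a sampled waveform and integrate each pulse's charge.
# Synthetic data and parameters; not the FPGA firmware itself.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
baseline = 10.0
waveform = baseline + rng.normal(0, 0.3, n)

# Add a few synthetic photon pulses (triangular shapes).
for start in (300, 900, 1500):
    waveform[start:start + 10] += np.concatenate([np.linspace(0, 8, 5),
                                                  np.linspace(8, 0, 5)])

def extract_events(samples, threshold=2.0, window=12):
    """Return (start_index, integrated_charge) for each detected pulse."""
    events, i = [], 0
    above = samples - baseline
    while i < len(samples):
        if above[i] > threshold:
            charge = above[i:i + window].clip(min=0).sum()
            events.append((i, charge))
            i += window                      # dead time: skip past the pulse
        else:
            i += 1
    return events

for start, charge in extract_events(waveform):
    print(f"event near sample {start}: charge ~ {charge:.1f}")
```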
Akberov, R F; Gorshkov, A N
1997-01-01
The X-ray endoscopic semiotics of precancerous gastric mucosal changes (epithelial dysplasia, intestinal epithelial rearrangement) was examined on the basis of the results of 1574 gastric examinations. A diagnostic algorithm was developed for radiation studies in the diagnosis of the above pathology.
2012 HIV Diagnostics Conference: the molecular diagnostics perspective.
Branson, Bernard M; Pandori, Mark
2013-04-01
2012 HIV Diagnostic Conference Atlanta, GA, USA, 12-14 December 2012. This report highlights the presentations and discussions from the 2012 National HIV Diagnostic Conference held in Atlanta (GA, USA), on 12-14 December 2012. Reflecting changes in the evolving field of HIV diagnostics, the conference provided a forum for evaluating developments in molecular diagnostics and their role in HIV diagnosis. In 2010, the HIV Diagnostics Conference concluded with the proposal of a new diagnostic algorithm which included nucleic acid testing to resolve discordant screening and supplemental antibody test results. The 2012 meeting, picking up where the 2010 meeting left off, focused on scientific presentations that assessed this new algorithm and the role played by RNA testing and new developments in molecular diagnostics, including detection of total and integrated HIV-1 DNA, detection and quantification of HIV-2 RNA, and rapid formats for detection of HIV-1 RNA.
Alfa, Michelle J; Sepehri, Shadi
2013-01-01
BACKGROUND: There has been a growing interest in developing an appropriate laboratory diagnostic algorithm for Clostridium difficile, mainly as a result of increases in both the number and severity of cases of C difficile infection in the past decade. A C difficile diagnostic algorithm is necessary because diagnostic kits, mostly for the detection of toxins A and B or glutamate dehydrogenase (GDH) antigen, are not sufficient as stand-alone assays for optimal diagnosis of C difficile infection. In addition, conventional reference methods for C difficile detection (eg, toxigenic culture and cytotoxin neutralization [CTN] assays) are not routinely practiced in diagnostic laboratory settings. OBJECTIVE: To review the four-step algorithm used at Diagnostic Services of Manitoba sites for the laboratory diagnosis of toxigenic C difficile. RESULT: One year of retrospective C difficile data using the proposed algorithm was reported. Of 5695 stool samples tested, 9.1% (n=517) had toxigenic C difficile. Sixty per cent (310 of 517) of toxigenic C difficile stools were detected following the first two steps of the algorithm. CTN confirmation of GDH-positive, toxin A- and B-negative assays resulted in detection of an additional 37.7% (198 of 517) of toxigenic C difficile. Culture of the third specimen, from patients who had two previous negative specimens, detected an additional 2.32% (12 of 517) of toxigenic C difficile samples. DISCUSSION: Using GDH antigen as the screening and toxin A and B as confirmatory test for C difficile, 85% of specimens were reported negative or positive within 4 h. Without CTN confirmation for GDH antigen and toxin A and B discordant results, 37% (195 of 517) of toxigenic C difficile stools would have been missed. Following the algorithm, culture was needed for only 2.72% of all specimens submitted for C difficile testing. CONCLUSION: The overview of the data illustrated the significance of each stage of this four-step C difficile algorithm and emphasized the value of using CTN assay and culture as parts of an algorithm that ensures accurate diagnosis of toxigenic C difficile. PMID:24421808
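Read as pseudocode, the four-step algorithm above screens on GDH antigen, confirms with toxin A/B, resolves discordant GDH-positive/toxin-negative results by CTN, and falls back to culture of a third specimen after two negatives. The sketch below encodes that reading as a simple decision function; the branch conditions and result labels are inferred from the abstract and should be treated as an approximation, not the laboratory's standard operating procedure.

```python
# Approximate encoding of the described four-step C. difficile algorithm.
# Branch logic is inferred from the abstract, not taken from the lab SOP.

def c_difficile_algorithm(gdh_positive, toxin_positive,
                          ctn_positive=None, culture_toxigenic=None,
                          prior_negative_specimens=0):
    # Steps 1-2: GDH antigen screen plus toxin A/B confirmation.
    if gdh_positive and toxin_positive:
        return "toxigenic C. difficile detected"
    if not gdh_positive:
        # Step 4: culture the third specimen after two previous negatives.
        if prior_negative_specimens >= 2 and culture_toxigenic is not None:
            return ("toxigenic C. difficile detected (culture)"
                    if culture_toxigenic else "negative (culture)")
        return "negative (GDH screen)"
    # Step 3: GDH-positive but toxin-negative -> cytotoxin neutralization.
    if ctn_positive is None:
        return "discordant: CTN assay required"
    return ("toxigenic C. difficile detected (CTN)"
            if ctn_positive else "negative (CTN)")

print(c_difficile_algorithm(gdh_positive=True, toxin_positive=False,
                            ctn_positive=True))
print(c_difficile_algorithm(gdh_positive=False, toxin_positive=False,
                            prior_negative_specimens=2,
                            culture_toxigenic=False))
```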
ERIC Educational Resources Information Center
de Bildt, Annelies; Sytema, Sjoerd; Zander, Eric; Bölte, Sven; Sturm, Harald; Yirmiya, Nurit; Yaari, Maya; Charman, Tony; Salomone, Erica; LeCouteur, Ann; Green, Jonathan; Bedia, Ricardo Canal; Primo, Patricia García; van Daalen, Emma; de Jonge, Maretha V.; Guðmundsdóttir, Emilía; Jóhannsdóttir, Sigurrós; Raleva, Marija; Boskovska, Meri; Rogé, Bernadette; Baduel, Sophie; Moilanen, Irma; Yliherva, Anneli; Buitelaar, Jan; Oosterling, Iris J.
2015-01-01
The current study aimed to investigate the Autism Diagnostic Interview-Revised (ADI-R) algorithms for toddlers and young preschoolers (Kim and Lord, "J Autism Dev Disord" 42(1):82-93, 2012) in a non-US sample from ten sites in nine countries (n = 1,104). The construct validity indicated a good fit of the algorithms. The diagnostic…
Rajpara, S M; Botello, A P; Townend, J; Ormerod, A D
2009-09-01
Dermoscopy improves the diagnostic accuracy of the unaided eye for melanoma, and digital dermoscopy with artificial intelligence or computer diagnosis has also been shown to be useful for the diagnosis of melanoma. At present there is no clear evidence regarding the diagnostic accuracy of dermoscopy compared with artificial intelligence. To evaluate the diagnostic accuracy of dermoscopy and digital dermoscopy/artificial intelligence for melanoma diagnosis and to compare the diagnostic accuracy of the different dermoscopic algorithms with each other and with digital dermoscopy/artificial intelligence for the detection of melanoma. A literature search on dermoscopy and digital dermoscopy/artificial intelligence for melanoma diagnosis was performed using several databases. Titles and abstracts of the retrieved articles were screened using a literature evaluation form. A quality assessment form was developed to assess the quality of the included studies. Heterogeneity among the studies was assessed. Pooled data were analysed using meta-analytical methods and comparisons between different algorithms were performed. Of 765 articles retrieved, 30 studies were eligible for meta-analysis. Pooled sensitivity for artificial intelligence was slightly higher than for dermoscopy (91% vs. 88%; P = 0.076). Pooled specificity for dermoscopy was significantly better than for artificial intelligence (86% vs. 79%; P < 0.001). The pooled diagnostic odds ratio was 51.5 for dermoscopy and 57.8 for artificial intelligence, which were not significantly different (P = 0.783). There were no significant differences in diagnostic odds ratio among the different dermoscopic diagnostic algorithms. Dermoscopy and artificial intelligence performed equally well for the diagnosis of melanocytic skin lesions. There was no significant difference in the diagnostic performance of various dermoscopy algorithms. The three-point checklist, the seven-point checklist and the Menzies score had better diagnostic odds ratios than the others; however, these results need to be confirmed by a large-scale, high-quality, population-based study.
Ehteshami Bejnordi, Babak; Veta, Mitko; Johannes van Diest, Paul; van Ginneken, Bram; Karssemeijer, Nico; Litjens, Geert; van der Laak, Jeroen A W M; Hermsen, Meyke; Manson, Quirine F; Balkenhol, Maschenka; Geessink, Oscar; Stathonikos, Nikolaos; van Dijk, Marcory Crf; Bult, Peter; Beca, Francisco; Beck, Andrew H; Wang, Dayong; Khosla, Aditya; Gargeya, Rishab; Irshad, Humayun; Zhong, Aoxiao; Dou, Qi; Li, Quanzheng; Chen, Hao; Lin, Huang-Jing; Heng, Pheng-Ann; Haß, Christian; Bruni, Elia; Wong, Quincy; Halici, Ugur; Öner, Mustafa Ümit; Cetin-Atalay, Rengul; Berseth, Matt; Khvatkov, Vitali; Vylegzhanin, Alexei; Kraus, Oren; Shaban, Muhammad; Rajpoot, Nasir; Awan, Ruqayya; Sirinukunwattana, Korsuk; Qaiser, Talha; Tsang, Yee-Wah; Tellez, David; Annuscheit, Jonas; Hufnagl, Peter; Valkonen, Mira; Kartasalo, Kimmo; Latonen, Leena; Ruusuvuori, Pekka; Liimatainen, Kaisa; Albarqouni, Shadi; Mungal, Bharti; George, Ami; Demirci, Stefanie; Navab, Nassir; Watanabe, Seiryo; Seno, Shigeto; Takenaka, Yoichi; Matsuda, Hideo; Ahmady Phoulady, Hady; Kovalev, Vassili; Kalinovsky, Alexander; Liauchuk, Vitali; Bueno, Gloria; Fernandez-Carrobles, M Milagro; Serrano, Ismael; Deniz, Oscar; Racoceanu, Daniel; Venâncio, Rui
2017-12-12
Application of deep learning algorithms to whole-slide pathology images can potentially improve diagnostic accuracy and efficiency. Assess the performance of automated deep learning algorithms at detecting metastases in hematoxylin and eosin-stained tissue sections of lymph nodes of women with breast cancer and compare it with pathologists' diagnoses in a diagnostic setting. Researcher challenge competition (CAMELYON16) to develop automated solutions for detecting lymph node metastases (November 2015-November 2016). A training data set of whole-slide images from 2 centers in the Netherlands with (n = 110) and without (n = 160) nodal metastases verified by immunohistochemical staining were provided to challenge participants to build algorithms. Algorithm performance was evaluated in an independent test set of 129 whole-slide images (49 with and 80 without metastases). The same test set of corresponding glass slides was also evaluated by a panel of 11 pathologists with time constraint (WTC) from the Netherlands to ascertain likelihood of nodal metastases for each slide in a flexible 2-hour session, simulating routine pathology workflow, and by 1 pathologist without time constraint (WOTC). Deep learning algorithms submitted as part of a challenge competition or pathologist interpretation. The presence of specific metastatic foci and the absence vs presence of lymph node metastasis in a slide or image using receiver operating characteristic curve analysis. The 11 pathologists participating in the simulation exercise rated their diagnostic confidence as definitely normal, probably normal, equivocal, probably tumor, or definitely tumor. The area under the receiver operating characteristic curve (AUC) for the algorithms ranged from 0.556 to 0.994. The top-performing algorithm achieved a lesion-level, true-positive fraction comparable with that of the pathologist WOTC (72.4% [95% CI, 64.3%-80.4%]) at a mean of 0.0125 false-positives per normal whole-slide image. For the whole-slide image classification task, the best algorithm (AUC, 0.994 [95% CI, 0.983-0.999]) performed significantly better than the pathologists WTC in a diagnostic simulation (mean AUC, 0.810 [range, 0.738-0.884]; P < .001). The top 5 algorithms had a mean AUC that was comparable with the pathologist interpreting the slides in the absence of time constraints (mean AUC, 0.960 [range, 0.923-0.994] for the top 5 algorithms vs 0.966 [95% CI, 0.927-0.998] for the pathologist WOTC). In the setting of a challenge competition, some deep learning algorithms achieved better diagnostic performance than a panel of 11 pathologists participating in a simulation exercise designed to mimic routine pathology workflow; algorithm performance was comparable with an expert pathologist interpreting whole-slide images without time constraints. Whether this approach has clinical utility will require evaluation in a clinical setting.
Manna, Raffaele; Cauda, Roberto; Feriozzi, Sandro; Gambaro, Giovanni; Gasbarrini, Antonio; Lacombe, Didier; Livneh, Avi; Martini, Alberto; Ozdogan, Huri; Pisani, Antonio; Riccio, Eleonora; Verrecchia, Elena; Dagna, Lorenzo
2017-10-01
Fever of unknown origin (FUO) is a rather rare clinical syndrome representing a major diagnostic challenge. The occurrence of more than three febrile attacks with fever-free intervals of variable duration during 6 months of observation has recently been proposed as a subcategory of FUO, Recurrent FUO (RFUO). A substantial number of patients with RFUO have auto-inflammatory genetic fevers, but many patients remain undiagnosed. We hypothesize that this undiagnosed subgroup may include, at least in part, a number of rare genetic febrile diseases such as Fabry disease. We aimed to identify key features or potential diagnostic clues for Fabry disease as a model of rare genetic febrile diseases causing RFUO, and to develop diagnostic guidelines for RFUO, using Fabry disease as an example of inserting other rare diseases into the existing FUO algorithms. An international panel of specialists in recurrent fevers and rare diseases, including internists, infectious disease specialists, rheumatologists, gastroenterologists, nephrologists, and medical geneticists, convened to review the existing diagnostic algorithms and to suggest recommendations for arriving at accurate diagnoses on the basis of the available literature and clinical experience. By combining specific features of rare diseases with other diagnostic considerations, guidelines have been designed to raise awareness and identify rare diseases among other causes of FUO. The proposed guidelines may be useful for the inclusion of rare diseases in the diagnostic algorithms for FUO. A wide spectrum of patients will be needed to validate the algorithm in different clinical settings.
Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhury, Indranil
2010-01-01
We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition, in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on the provided example scenarios, and discuss the issues faced and lessons learned from implementing the approach.
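The qualitative fault isolation described above can be pictured as matching the observed sequence of measurement deviations against the deviation signatures each fault is predicted to produce. The sketch below shows that matching idea on invented signatures; it is a simplification of the event-based approach and omits the identification step.

```python
# Toy qualitative event-based isolation: keep only the fault candidates
# whose predicted deviation signature is consistent with the observed
# sequence of (sensor, direction) deviations. Signatures are invented.

# Predicted deviation direction per sensor for each fault:
# '+' above nominal, '-' below nominal, '0' no deviation expected.
signatures = {
    "battery_degraded": {"voltage": "-", "current": "-", "temperature": "0"},
    "load_short":       {"voltage": "-", "current": "+", "temperature": "+"},
    "sensor_bias":      {"voltage": "+", "current": "0", "temperature": "0"},
}

def isolate(observed_events, candidates):
    """observed_events: ordered list of (sensor, direction) deviations."""
    remaining = set(candidates)
    for sensor, direction in observed_events:
        remaining = {f for f in remaining
                     if signatures[f].get(sensor, "0") == direction}
    return remaining

events = [("voltage", "-"), ("current", "+")]
print(isolate(events, signatures))   # -> {'load_short'}
```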
Bernabeu-Mestre, Josep; Santos, Ana Paula Cid; Pellicer, Josep Xavier Esplugues; Galiana-Sánchez, María Eugenia
2008-01-01
Chlorosis and Neurasthenia are two classical examples of pathological dissociations and the difficulties involved in approaching their diagnosis using scientific-naturalistic criteria. In the realm of those difficulties, the study examines the androcentric viewpoint and the ideological perspective of Contemporary Spanish Medicine when addressing the feminine nature and women's pathologies. Moreover, based on the similarities with present-day pain and fatigue syndromes, the study underlines the need to review the clinical approach to these illnesses by attempting to overcome the existing biomedical limitations.
Pragmatic ethical basis for radiation protection in diagnostic radiology.
Malone, Jim; Zölzer, Friedo
2016-01-01
Medical ethics has a tried and tested literature and a global active research community. Even among health professionals, literate and fluent in medical ethics, there is low recognition of radiation protection principles such as justification and optimization. On the other hand, many in healthcare environments misunderstand dose limitation obligations and incorrectly believe patients are protected by norms including a dose limit. Implementation problems for radiation protection in medicine possibly flow from apparent inadequacies of the International Commission on Radiological Protection (ICRP) principles taken on their own, coupled with their failure to transfer successfully to the medical world. Medical ethics, on the other hand, is essentially global, is acceptable in most cultures, is intuitively understood in hospitals, and its expectations are monitored, even by managements. This article presents an approach to ethics in diagnostic imaging rooted in the medical tradition, and alert to contemporary social expectations. ICRP and the International Radiation Protection Association (IRPA), both alert to growing ethical concerns, organized a series of consultations on ethics for general radiation protection in the last few years. The literature on medical ethics and implicit ICRP ethical values were reviewed qualitatively, with a view to identifying a system that will help guide contemporary behaviour in radiation protection of patients. Application of the system is illustrated in six clinical scenarios. The proposed system is designed, as far as is possible, so as not to be in conflict with the conclusions emerging from the ICRP/IRPA consultations. A widely recognized and well-respected system of medical ethics was identified that has global reach and claims acceptance in all cultures. Three values based on this system are grouped with two additional values to provide an ethical framework for application in diagnostic imaging. This system has the potential to be robust and to reach conclusions that are in accord with contemporary medical, social and ethical thinking. The system is not intended to replace the ICRP principles. Rather, it is intended as a well-informed interim approach that will help judge and analyse situations that arouse ethical concerns in radiology. Six scenarios illustrate the practicality of the value system in alerting one to possible deficits in practice. Five widely recognized values and the basis for them are identified to support the contemporary practice of diagnostic radiology. These are essential to complement the widely used ICRP principles pending further development in the area.
A Framework to Debug Diagnostic Matrices
NASA Technical Reports Server (NTRS)
Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann
2013-01-01
Diagnostics is an important concept in system health and monitoring of space operations. Many existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also known as a diagnostic dictionary, fault signature matrix or reachability matrix) gleaned from physical models. Sometimes, however, this matrix is not consistent enough to obtain high diagnostic performance. In such a case, it is important to modify the D-matrix based on knowledge obtained from other sources, such as time-series data streams (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure covers conditions ranging from modifying 0s and 1s in the matrix to adding or removing rows (failure sources) and columns (tests). We will experiment with this framework on datasets from the DX Challenge 2009.
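A D-matrix can be represented as a boolean failure-source-by-test matrix, and the kind of inconsistency targeted by the update procedure shows up when its entries disagree with observed test outcomes for a known fault. The sketch below shows a minimal representation and a simple entry correction; the matrix contents and the correction rule are illustrative assumptions, not the DME algorithm itself.

```python
# Minimal D-matrix sketch: rows = failure sources, columns = tests,
# entry 1 means the test is expected to fail when that source has failed.
# The correction step below is an illustrative stand-in for DME-style
# updates, not the actual diagnostic modeling evaluator procedure.
import numpy as np

faults = ["pump", "valve", "sensor"]
tests = ["t_pressure", "t_flow", "t_bias"]
d_matrix = np.array([[1, 1, 0],     # pump
                     [0, 1, 0],     # valve
                     [0, 0, 1]])    # sensor

def candidates(observed_failed):
    """Faults whose signature covers every observed failed test."""
    failed_idx = [tests.index(t) for t in observed_failed]
    return [f for i, f in enumerate(faults)
            if all(d_matrix[i, j] == 1 for j in failed_idx)]

# Labeled maintenance case: the pump had failed and t_bias also failed,
# which the current matrix does not predict -> cheapest fix first.
case_fault, case_failed = "pump", ["t_pressure", "t_flow", "t_bias"]
if case_fault not in candidates(case_failed):
    row = faults.index(case_fault)
    for t in case_failed:
        d_matrix[row, tests.index(t)] = 1    # set the disagreeing 0s to 1

print(candidates(["t_pressure", "t_flow", "t_bias"]))  # now includes 'pump'
```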
Eosinophilic pustular folliculitis: A proposal of diagnostic and therapeutic algorithms.
Nomura, Takashi; Katoh, Mayumi; Yamamoto, Yosuke; Miyachi, Yoshiki; Kabashima, Kenji
2016-11-01
Eosinophilic pustular folliculitis (EPF) is a sterile inflammatory dermatosis of unknown etiology. In addition to classic EPF, which affects otherwise healthy individuals, an immunocompromised state can cause immunosuppression-associated EPF (IS-EPF), which may be referred to dermatologists in inpatient services for assessments. Infancy-associated EPF (I-EPF) is the least characterized subtype, being observed mainly in non-Japanese infants. Diagnosis of EPF is challenging because its lesions mimic those of other common diseases, such as acne and dermatomycosis. Furthermore, there is no consensus regarding the treatment for each subtype of EPF. Here, we created procedure algorithms that facilitate the diagnosis and selection of therapeutic options on the basis of published work available in the public domain. Our diagnostic algorithm comprised a simple flowchart to direct physicians toward proper diagnosis. Recommended regimens were summarized in an easy-to-comprehend therapeutic algorithm for each subtype of EPF. These algorithms would facilitate the diagnostic and therapeutic procedure of EPF. © 2016 Japanese Dermatological Association.
Liou, Kevin; Negishi, Kazuaki; Ho, Suyen; Russell, Elizabeth A; Cranney, Greg; Ooi, Sze-Yuan
2016-08-01
Global longitudinal strain (GLS) is well validated and has important applications in contemporary clinical practice. The aim of this analysis was to evaluate the accuracy of resting peak GLS in the diagnosis of obstructive coronary artery disease (CAD). A systematic literature search was performed through July 2015 using four databases. Data were extracted independently by two authors and correlated before analyses. Using a random-effect model, the pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio, and summary area under the curve for GLS were estimated with their respective 95% CIs. Screening of 1,669 articles yielded 10 studies with 1,385 patients appropriate for inclusion in the analysis. The mean age and left ventricular ejection fraction were 59.9 years and 61.1%. On the whole, 54.9% and 20.9% of the patients had hypertension and diabetes, respectively. Overall, abnormal GLS detected moderate to severe CAD with a pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio of 74.4%, 72.1%, 2.9, and 0.35 respectively. The area under the curve and diagnostic odds ratio were 0.81 and 8.5. The mean values of GLS for those with and without CAD were -16.5% (95% CI, -15.8% to -17.3%) and -19.7% (95% CI, -18.8% to -20.7%), respectively. Subgroup analyses for patients with severe CAD and normal left ventricular ejection fractions yielded similar results. Current evidence supports the use of GLS in the detection of moderate to severe obstructive CAD in symptomatic patients. GLS may complement existing diagnostic algorithms and act as an early adjunctive marker of cardiac ischemia. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
Corbett, E. L.; MacPherson, P.
2014-01-01
SUMMARY Twenty years of sky-high tuberculosis (TB) incidence rates and high TB mortality in high human immunodeficiency virus (HIV) prevalence countries have so far not been matched by the same magnitude or breadth of responses as seen in malaria or HIV programmes. Instead, recommendations have been narrowly focused on people presenting to health facilities for investigation of TB symptoms, or for HIV testing and care. However, despite the recent major investment and scale-up of TB and HIV services, undiagnosed TB remains highly prevalent at community level, implying that diagnosis of TB remains slow and incomplete. This maintains high transmission rates and exposes people living with HIV to high rates of morbidity and mortality. More intensive use of TB screening, with broader definitions of target populations, expanded indications for screening both inside and outside of health facilities, and appropriate selection of new diagnostic tools, offers the prospect of rapidly improving population-level control of TB. Diagnostic accuracy of suitable (high throughput) algorithms remains the major barrier to realising this goal. In the present study, we review the evidence available to guide expanded TB screening in HIV-prevalent settings, ideally through combined TB-HIV interventions that provide screening for both TB and HIV, and maximise entry to HIV and TB care and prevention. Ideally, we would systematically test, treat and prevent TB and HIV comprehensively, offering both TB and HIV screening to all health facility attendees, TB households and all adults in the highest risk communities. However, we are still held back by inadequate diagnostics, financing and paucity of population-impact data. Relevant contemporary research showing the high need for potential gains, and pitfalls from expanded and intensified TB screening in high HIV prevalence settings are discussed in this review. PMID:23928165
Towards dosimetry for photodynamic diagnosis with the low-level dose of photosensitizer.
Buzalewicz, Igor; Hołowacz, Iwona; Ulatowska-Jarża, Agnieszka; Podbielska, Halina
2017-08-01
Contemporary medicine does not address the issue of dosimetry in photodynamic diagnosis (PDD) but follows the photosensitizer (PS) producers' recommendations. Most preclinical and clinical PDD studies indicate a considerable variation in the possibility of visualization and treatment, as, e.g., in the case of cervical lesions. Although some of these variations can be caused by different histological subtypes or various tumor geometries, the issue of varying PS concentration in the tumor tissue volume is definitely an important factor. Therefore, there is a need to establish an objective and systematic PDD dosimetry protocol regarding doses of light and photosensitizers. Four different irradiation sources investigated in PDD (literature) were used for PS excitation. The PS luminescence was examined by means of non-imaging (spectroscopic) and imaging (wide- and narrow-field-of-view) techniques. A methodology for low-level-intensity photoluminescence (PL) characterization and a dedicated image-processing algorithm for the analysis of PS luminescence images were proposed. Further, the penetration of HeLa cell cultures by the PS was studied by confocal microscopy. Reducing the PS dose with the choice of proper photoexcitation conditions decreases the costs and side effects of the PDD procedure without affecting diagnostic efficiency. We determined in vitro the minimum incubation time and photosensitizer concentration of Photolon for diagnostic purposes, for which the Photolon PL can still be observed. It was demonstrated that quantification of PS concentration, choice of a proper photoexcitation source, appropriate adjustment of the light dose and PS penetration of cancer cells may improve the performance of low-level-luminescence photodynamic diagnostics. The practical effectiveness of PDD strongly depends on irradiation source parameters (bandwidth, maximum intensity, half-width), and their optimization is the main conditioning factor for low-level-intensity and low-cost PDD. Copyright © 2017 Elsevier B.V. All rights reserved.
On the qi deficiency in traditional Chinese medicine.
Chiang, Hui-Chu; Chang, Hen-Hong; Huang, Po-Yu; Hsu, Mutsu
2014-09-01
Qi deficiency (QD), one of the most common disorders in Traditional Chinese medicine (TCM), is relevant to many disorders in obstetrics and gynecology. This study aimed to identify the common processes and criteria for diagnosing QD among contemporary proficient TCM practitioners. Steps of decision tree analysis and modified Delphi method were merged together into four-round postal questionnaires to collect qualitative and quantitative data. Open-ended questions and content analysis were used to explore the proficient TCM practitioners' cognitive activities used for diagnosis. The statements obtained from the qualitative responses were used to develop the items for subsequent questionnaires. Based on the TCM practitioners' responses, the diagnostic processes and criteria for making diagnosis were generated. Twenty-eight out of the 30 participants completed all four questionnaires from June 2007 to January 2010. The 11 diagnostic procedures identified in the returned first round of questionnaires were used as the alternatives to select and rank for all the steps to diagnose QD. After three more rounds of postal surveys, an algorithm with a five-stage diagnostic process as well as sets of decision criteria were identified. Although the priorities of procedures and descriptions of reasoning were varied, the content revealed the major themes in the model. The criteria to differentiate signs and symptoms (S/S) included five principles for correlating S/S with QD, and 17 S/S should be differentiated carefully. The results demonstrate that the TCM practitioners precisely diagnosed QD using a number of specific procedures and criteria that could be used as a reference to understand women complaining of S/S that could be similar to QD. Copyright © 2014. Published by Elsevier B.V.
Duraipandian, Shiyamala; Sylvest Bergholt, Mads; Zheng, Wei; Yu Ho, Khek; Teh, Ming; Guan Yeoh, Khay; Bok Yan So, Jimmy; Shabbir, Asim; Huang, Zhiwei
2012-08-01
Optical spectroscopic techniques including reflectance, fluorescence and Raman spectroscopy have shown promising potential for in vivo precancer and cancer diagnostics in a variety of organs. However, data analysis has mostly been limited to post-processing and off-line algorithm development. In this work, we develop a fully automated on-line Raman spectral diagnostics framework integrated with a multimodal image-guided Raman technique for real-time in vivo cancer detection at endoscopy. A total of 2748 in vivo gastric tissue spectra (2465 normal and 283 cancer) were acquired from 305 patients recruited to construct a spectral database for diagnostic algorithm development. The novel diagnostic scheme implements on-line preprocessing and outlier detection based on principal component analysis statistics (i.e., Hotelling's T2 and Q-residuals) for tissue Raman spectra verification, as well as organ-specific probabilistic diagnostics using different diagnostic algorithms. A free-running optical diagnosis and processing time of < 0.5 s can be achieved, which is critical to realizing real-time in vivo tissue diagnostics during clinical endoscopic examination. The optimized partial least squares-discriminant analysis (PLS-DA) models based on the randomly resampled training database (80% for learning and 20% for testing) provide a diagnostic accuracy of 85.6% [95% confidence interval (CI): 82.9% to 88.2%] [sensitivity of 80.5% (95% CI: 71.4% to 89.6%) and specificity of 86.2% (95% CI: 83.6% to 88.7%)] for the detection of gastric cancer. The PLS-DA algorithms were further applied prospectively on 10 gastric patients at gastroscopy, achieving a predictive accuracy of 80.0% (60/75) [sensitivity of 90.0% (27/30) and specificity of 73.3% (33/45)] for in vivo diagnosis of gastric cancer. The receiver operating characteristic curves further confirmed the efficacy of Raman endoscopy together with PLS-DA algorithms for in vivo prospective diagnosis of gastric cancer. This work successfully moves the biomedical Raman spectroscopic technique into real-time, on-line clinical cancer diagnosis, especially in routine endoscopic diagnostic applications.
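As an illustration of the PLS-DA step described above, the following is a minimal sketch that treats PLS-DA as PLS regression on a binary class label with a 0.5 decision threshold and an 80/20 split. The synthetic arrays, component count and threshold are placeholders; none of the paper's preprocessing or outlier-rejection steps are reproduced.

```python
# Minimal PLS-DA sketch: PLS regression on a 0/1 label, thresholded at 0.5.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for tissue Raman spectra: 500 spectra x 300 wavenumber bins.
X = rng.normal(size=(500, 300))
y = rng.integers(0, 2, size=500)            # 0 = normal, 1 = cancer (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

plsda = PLSRegression(n_components=5)        # number of latent variables is a free choice
plsda.fit(X_tr, y_tr)
y_score = plsda.predict(X_te).ravel()        # continuous output, roughly in [0, 1]
y_pred = (y_score >= 0.5).astype(int)        # threshold to a class label

print(f"hold-out accuracy: {(y_pred == y_te).mean():.3f}")
```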
de Bildt, Annelies; Sytema, Sjoerd; Meffert, Harma; Bastiaansen, Jojanneke A C J
2016-01-01
This study examined the discriminative ability of the revised Autism Diagnostic Observation Schedule module 4 algorithm (Hus and Lord in J Autism Dev Disord 44(8):1996-2012, 2014) in 93 Dutch males with Autism Spectrum Disorder (ASD), schizophrenia, psychopathy or controls. The discriminative ability of the revised algorithm's ASD cut-off resembled that of the original algorithm's ASD cut-off: highly specific for psychopathy and controls, with lower sensitivity than Hus and Lord (2014; i.e. ASD .61, AD .53). The revised algorithm's AD cut-off improved sensitivity over the original algorithm. Discriminating ASD from schizophrenia was still challenging, but the better-balanced sensitivity (.53) and specificity (.78) of the revised algorithm's AD cut-off may aid clinicians' differential diagnosis. The findings support using the revised algorithm, which is conceptually consistent with the other modules, thus improving comparability across the lifespan.
Severson, Carl A; Pendharkar, Sachin R; Ronksley, Paul E; Tsai, Willis H
2015-01-01
To assess the ability of electronic health data and existing screening tools to identify clinically significant obstructive sleep apnea (OSA), as defined by symptomatic or severe OSA. The present retrospective cohort study of 1041 patients referred for sleep diagnostic testing was undertaken at a tertiary sleep centre in Calgary, Alberta. A diagnosis of clinically significant OSA or an alternative sleep diagnosis was assigned to each patient through blinded independent chart review by two sleep physicians. Predictive variables were identified from online questionnaire data, and diagnostic algorithms were developed. The performance of electronically derived algorithms for identifying patients with clinically significant OSA was determined. Diagnostic performance of these algorithms was compared with versions of the STOP-Bang questionnaire and adjusted neck circumference score (ANC) derived from electronic data. Electronic questionnaire data were highly sensitive (>95%) at identifying clinically significant OSA, but not specific. Sleep diagnostic testing-determined respiratory disturbance index was very specific (specificity ≥95%) for clinically relevant disease, but not sensitive (<35%). Derived algorithms had similar accuracy to the STOP-Bang or ANC, but required fewer questions and calculations. These data suggest that a two-step process using a small number of clinical variables (maximizing sensitivity) and objective diagnostic testing (maximizing specificity) is required to identify clinically significant OSA. When used in an online setting, simple algorithms can identify clinically relevant OSA with similar performance to existing decision rules such as the STOP-Bang or ANC.
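A minimal sketch of the two-step idea described above (a sensitive questionnaire screen followed by a specific objective test) is given below; the STOP-Bang ≥ 3 screening threshold follows common convention, and the RDI cut-off of 15 events/h is a placeholder rather than a value from the study.

```python
# Sketch of a two-step OSA pathway: sensitive screen first, specific objective test second.
from typing import Optional

def two_step_osa(stop_bang_yes_count: int, rdi_events_per_hour: Optional[float]) -> str:
    if stop_bang_yes_count < 3:                  # assumed screening threshold (maximizes sensitivity)
        return "low risk: clinically significant OSA unlikely"
    if rdi_events_per_hour is None:
        return "screen positive: refer for sleep diagnostic testing"
    if rdi_events_per_hour >= 15.0:              # placeholder RDI cut-off (maximizes specificity)
        return "clinically significant OSA"
    return "OSA not confirmed on objective testing"

print(two_step_osa(4, 22.0))
```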
Chen, Xiao-Yang
2007-01-01
In contemporary China, physicians tend to require more diagnostic work-ups and prescribe more expensive medications than are clearly medically indicated. These practices have been interpreted as defensive medicine in response to a rising threat of potential medical malpractice lawsuits. After outlining recent changes in Chinese malpractice law, this essay contends that the overuse of expensive diagnostic and therapeutic interventions cannot be attributed to malpractice concerns alone. These practice patterns are due as well, if not primarily, to the corruption of medical decision-making: under the constraints of an ill-structured governmental policy, physicians are motivated to earn supplementary income through the overuse of expensive diagnostic and therapeutic interventions. To respond to these difficulties of Chinese health care policy, China will need not only to reform the particular policies that encourage these behaviors, but also to nurture a moral understanding that can place the pursuit of profit within the pursuit of virtue. This can be done by drawing on Confucian moral resources that integrate the pursuit of profit within an appreciation of benevolence. It is this Confucian moral account that can formulate a medical care policy suitable to China's contemporary market economy.
J.-M. Charcot and simulated neurologic disease: attitudes and diagnostic strategies.
Goetz, Christopher G
2007-07-03
Neurologists have long wrestled with the diagnosis of elaborated or feigned disease. Studies have not focused on early techniques utilized to diagnose malingering. To analyze cases of purposeful neurologic malingering among patients treated by the 19th century neurologist J.-M. Charcot, describe his attitudes, and study his methods to separate malingering from primary neurologic diseases. A study was conducted of Charcot's printed and original documents from the Bibliothèque Charcot, Paris, and added documents on American neurology. Charcot recognized that purposeful simulation occurred in isolation as well as in established neurologic disorders. Charcot was strict with subjects motivated by greed or spite, but showed forbearance and wonder in those who created illness as "art for art's sake." Charcot developed diagnostic equipment that measured inspiratory depth and muscle activity as a strategy to identify malingerers. His approach strikingly contrasted with contemporary military medical treatises on malingering and S.W. Mitchell's civilian neurologic approaches that unmasked patients through more aggressive strategies. Charcot provided an academically professional approach to the assessment of neurologic malingering, with a stern, often patronizing attitude, but without categorical condemnation. His diagnostic techniques are echoed by contemporary approaches and emphasized an attention to enhanced and inconsistent patterns of behaviors by malingerers.
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2002-01-01
As part of the NASA Aviation Safety Program, a unique model-based diagnostics method that employs neural networks and genetic algorithms for aircraft engine performance diagnostics has been developed and demonstrated at the NASA Glenn Research Center against a nonlinear gas turbine engine model. Neural networks are applied to estimate the internal health condition of the engine, and genetic algorithms are used for sensor fault detection, isolation, and quantification. This hybrid architecture combines the excellent nonlinear estimation capabilities of neural networks with the capability to rank the likelihood of various faults given a specific sensor suite signature. The method requires a significantly smaller data training set than a neural network approach alone does, and it performs the combined engine health monitoring objectives of performance diagnostics and sensor fault detection and isolation in the presence of nominal and degraded engine health conditions.
The PHQ-8 as a measure of current depression in the general population.
Kroenke, Kurt; Strine, Tara W; Spitzer, Robert L; Williams, Janet B W; Berry, Joyce T; Mokdad, Ali H
2009-04-01
The eight-item Patient Health Questionnaire depression scale (PHQ-8) is established as a valid diagnostic and severity measure for depressive disorders in large clinical studies. Our objectives were to assess the PHQ-8 as a depression measure in a large, epidemiological population-based study, and to determine the comparability of depression as defined by the PHQ-8 diagnostic algorithm vs. a PHQ-8 cutpoint ≥ 10. Random-digit-dialed telephone survey of 198,678 participants in the 2006 Behavioral Risk Factor Surveillance System (BRFSS), a population-based survey in the United States. Current depression as defined by either the DSM-IV based diagnostic algorithm (i.e., major depressive or other depressive disorder) of the PHQ-8 or a PHQ-8 score ≥ 10; respondent sociodemographic characteristics; number of days of impairment in the past 30 days in multiple domains of health-related quality of life (HRQoL). The prevalence of current depression was similar whether defined by the diagnostic algorithm or a PHQ-8 score ≥ 10 (9.1% vs. 8.6%). Depressed patients had substantially more days of impairment across multiple domains of HRQoL, and the impairment was nearly identical in depressed groups defined by either method. Of the 17,040 respondents with a PHQ-8 score ≥ 10, major depressive disorder was present in 49.7%, other depressive disorder in 23.9%, depressed mood or anhedonia in another 22.8%, and no evidence of depressive disorder or depressive symptoms in only 3.5%. The PHQ-8 diagnostic algorithm rather than an independent structured psychiatric interview was used as the criterion standard. The PHQ-8 is a useful depression measure for population-based studies, and either its diagnostic algorithm or a cutpoint ≥ 10 can be used for defining current depression.
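To make the two definitions of current depression concrete, here is a minimal sketch. The total-score cutpoint of 10 is taken from the abstract, while the algorithm branch is a paraphrase of the DSM-IV-based PHQ-8 scoring rule (items scored 0-3, symptoms counted when endorsed at least "more than half the days", with depressed mood or anhedonia required) and should be checked against the original instrument.

```python
# Sketch of the two PHQ-8 definitions; the algorithm rule below is an approximation.
from typing import List

def phq8_cutpoint(items: List[int], cutpoint: int = 10) -> bool:
    """Current depression if the 0-24 total score reaches the cutpoint."""
    return sum(items) >= cutpoint

def phq8_algorithm(items: List[int]) -> str:
    """Approximate algorithm reading: count items endorsed at score >= 2
    ('more than half the days'); require depressed mood or anhedonia (items 1-2)."""
    endorsed = sum(1 for s in items if s >= 2)
    cardinal = items[0] >= 2 or items[1] >= 2
    if cardinal and endorsed >= 5:
        return "major depressive disorder (approx.)"
    if cardinal and endorsed >= 2:
        return "other depressive disorder (approx.)"
    return "no depressive disorder (approx.)"

example = [2, 1, 3, 2, 0, 2, 1, 0]   # hypothetical responses to the 8 items
print(phq8_cutpoint(example), phq8_algorithm(example))
```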
An Efficient Reachability Analysis Algorithm
NASA Technical Reports Server (NTRS)
Vatan, Farrokh; Fijany, Amir
2008-01-01
A document discusses a new algorithm for generating higher-order dependencies for diagnostic and sensor placement analysis when a system is described with a causal modeling framework. This innovation will be used in diagnostic and sensor optimization and analysis tools. Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in-situ platforms. This algorithm will serve as a powerful tool for technologies that satisfy a key requirement of autonomous spacecraft, including science instruments and in-situ missions.
Prosthetic joint infection development of an evidence-based diagnostic algorithm.
Mühlhofer, Heinrich M L; Pohlig, Florian; Kanz, Karl-Georg; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; Kelch, Sarah; Harrasser, Norbert; von Eisenhart-Rothe, Rüdiger; Schauwecker, Johannes
2017-03-09
Increasing rates of prosthetic joint infection (PJI) have presented challenges for general practitioners, orthopedic surgeons and the health care system in the recent years. The diagnosis of PJI is complex; multiple diagnostic tools are used in the attempt to correctly diagnose PJI. Evidence-based algorithms can help to identify PJI using standardized diagnostic steps. We reviewed relevant publications between 1990 and 2015 using a systematic literature search in MEDLINE and PUBMED. The selected search results were then classified into levels of evidence. The keywords were prosthetic joint infection, biofilm, diagnosis, sonication, antibiotic treatment, implant-associated infection, Staph. aureus, rifampicin, implant retention, pcr, maldi-tof, serology, synovial fluid, c-reactive protein level, total hip arthroplasty (THA), total knee arthroplasty (TKA) and combinations of these terms. From an initial 768 publications, 156 publications were stringently reviewed. Publications with class I-III recommendations (EAST) were considered. We developed an algorithm for the diagnostic approach to display the complex diagnosis of PJI in a clear and logically structured process according to ISO 5807. The evidence-based standardized algorithm combines modern clinical requirements and evidence-based treatment principles. The algorithm provides a detailed transparent standard operating procedure (SOP) for diagnosing PJI. Thus, consistently high, examiner-independent process quality is assured to meet the demands of modern quality management in PJI diagnosis.
Grude, Nils; Lindbaek, Morten
2015-01-01
Objective. To compare the clinical outcome of patients presenting with symptoms of uncomplicated cystitis who were seen by a doctor, with patients who were given treatment following a diagnostic algorithm. Design. Randomized controlled trial. Setting. Out-of-hours service, Oslo, Norway. Intervention. Women with typical symptoms of uncomplicated cystitis were included in the trial in the time period September 2010–November 2011. They were randomized into two groups. One group received standard treatment according to the diagnostic algorithm, the other group received treatment after a regular consultation by a doctor. Subjects. Women (n = 441) aged 16–55 years. Mean age in both groups 27 years. Main outcome measures. Number of days until symptomatic resolution. Results. No significant differences were found between the groups in the basic patient demographics, severity of symptoms, or percentage of urine samples with single culture growth. A median of three days until symptomatic resolution was found in both groups. By day four 79% in the algorithm group and 72% in the regular consultation group were free of symptoms (p = 0.09). The number of patients who contacted a doctor again in the follow-up period and received alternative antibiotic treatment was insignificantly higher (p = 0.08) after regular consultation than after treatment according to the diagnostic algorithm. There were no cases of severe pyelonephritis or hospital admissions during the follow-up period. Conclusion. Using a diagnostic algorithm is a safe and efficient method for treating women with symptoms of uncomplicated cystitis at an out-of-hours service. This simplification of treatment strategy can lead to a more rational use of consultation time and a stricter adherence to National Antibiotic Guidelines for a common disorder. PMID:25961367
Bollestad, Marianne; Grude, Nils; Lindbaek, Morten
2015-06-01
To compare the clinical outcome of patients presenting with symptoms of uncomplicated cystitis who were seen by a doctor, with patients who were given treatment following a diagnostic algorithm. Randomized controlled trial. Out-of-hours service, Oslo, Norway. Women with typical symptoms of uncomplicated cystitis were included in the trial in the time period September 2010-November 2011. They were randomized into two groups. One group received standard treatment according to the diagnostic algorithm, the other group received treatment after a regular consultation by a doctor. Women (n = 441) aged 16-55 years. Mean age in both groups 27 years. Number of days until symptomatic resolution. No significant differences were found between the groups in the basic patient demographics, severity of symptoms, or percentage of urine samples with single culture growth. A median of three days until symptomatic resolution was found in both groups. By day four 79% in the algorithm group and 72% in the regular consultation group were free of symptoms (p = 0.09). The number of patients who contacted a doctor again in the follow-up period and received alternative antibiotic treatment was insignificantly higher (p = 0.08) after regular consultation than after treatment according to the diagnostic algorithm. There were no cases of severe pyelonephritis or hospital admissions during the follow-up period. Using a diagnostic algorithm is a safe and efficient method for treating women with symptoms of uncomplicated cystitis at an out-of-hours service. This simplification of treatment strategy can lead to a more rational use of consultation time and a stricter adherence to National Antibiotic Guidelines for a common disorder.
Veta, Mitko; Johannes van Diest, Paul; van Ginneken, Bram; Karssemeijer, Nico; Litjens, Geert; van der Laak, Jeroen A. W. M.; Hermsen, Meyke; Manson, Quirine F; Balkenhol, Maschenka; Geessink, Oscar; Stathonikos, Nikolaos; van Dijk, Marcory CRF; Bult, Peter; Beca, Francisco; Beck, Andrew H; Wang, Dayong; Khosla, Aditya; Gargeya, Rishab; Irshad, Humayun; Zhong, Aoxiao; Dou, Qi; Li, Quanzheng; Chen, Hao; Lin, Huang-Jing; Heng, Pheng-Ann; Haß, Christian; Bruni, Elia; Wong, Quincy; Halici, Ugur; Öner, Mustafa Ümit; Cetin-Atalay, Rengul; Berseth, Matt; Khvatkov, Vitali; Vylegzhanin, Alexei; Kraus, Oren; Shaban, Muhammad; Rajpoot, Nasir; Awan, Ruqayya; Sirinukunwattana, Korsuk; Qaiser, Talha; Tsang, Yee-Wah; Tellez, David; Annuscheit, Jonas; Hufnagl, Peter; Valkonen, Mira; Kartasalo, Kimmo; Latonen, Leena; Ruusuvuori, Pekka; Liimatainen, Kaisa; Albarqouni, Shadi; Mungal, Bharti; George, Ami; Demirci, Stefanie; Navab, Nassir; Watanabe, Seiryo; Seno, Shigeto; Takenaka, Yoichi; Matsuda, Hideo; Ahmady Phoulady, Hady; Kovalev, Vassili; Kalinovsky, Alexander; Liauchuk, Vitali; Bueno, Gloria; Fernandez-Carrobles, M. Milagro; Serrano, Ismael; Deniz, Oscar; Racoceanu, Daniel; Venâncio, Rui
2017-01-01
Importance Application of deep learning algorithms to whole-slide pathology images can potentially improve diagnostic accuracy and efficiency. Objective Assess the performance of automated deep learning algorithms at detecting metastases in hematoxylin and eosin–stained tissue sections of lymph nodes of women with breast cancer and compare it with pathologists’ diagnoses in a diagnostic setting. Design, Setting, and Participants Researcher challenge competition (CAMELYON16) to develop automated solutions for detecting lymph node metastases (November 2015-November 2016). A training data set of whole-slide images from 2 centers in the Netherlands with (n = 110) and without (n = 160) nodal metastases verified by immunohistochemical staining were provided to challenge participants to build algorithms. Algorithm performance was evaluated in an independent test set of 129 whole-slide images (49 with and 80 without metastases). The same test set of corresponding glass slides was also evaluated by a panel of 11 pathologists with time constraint (WTC) from the Netherlands to ascertain likelihood of nodal metastases for each slide in a flexible 2-hour session, simulating routine pathology workflow, and by 1 pathologist without time constraint (WOTC). Exposures Deep learning algorithms submitted as part of a challenge competition or pathologist interpretation. Main Outcomes and Measures The presence of specific metastatic foci and the absence vs presence of lymph node metastasis in a slide or image using receiver operating characteristic curve analysis. The 11 pathologists participating in the simulation exercise rated their diagnostic confidence as definitely normal, probably normal, equivocal, probably tumor, or definitely tumor. Results The area under the receiver operating characteristic curve (AUC) for the algorithms ranged from 0.556 to 0.994. The top-performing algorithm achieved a lesion-level, true-positive fraction comparable with that of the pathologist WOTC (72.4% [95% CI, 64.3%-80.4%]) at a mean of 0.0125 false-positives per normal whole-slide image. For the whole-slide image classification task, the best algorithm (AUC, 0.994 [95% CI, 0.983-0.999]) performed significantly better than the pathologists WTC in a diagnostic simulation (mean AUC, 0.810 [range, 0.738-0.884]; P < .001). The top 5 algorithms had a mean AUC that was comparable with the pathologist interpreting the slides in the absence of time constraints (mean AUC, 0.960 [range, 0.923-0.994] for the top 5 algorithms vs 0.966 [95% CI, 0.927-0.998] for the pathologist WOTC). Conclusions and Relevance In the setting of a challenge competition, some deep learning algorithms achieved better diagnostic performance than a panel of 11 pathologists participating in a simulation exercise designed to mimic routine pathology workflow; algorithm performance was comparable with an expert pathologist interpreting whole-slide images without time constraints. Whether this approach has clinical utility will require evaluation in a clinical setting. PMID:29234806
Xu, Jin; Xu, Zhao-Xia; Lu, Ping; Guo, Rui; Yan, Hai-Xia; Xu, Wen-Jie; Wang, Yi-Qin; Xia, Chun-Ming
2016-11-01
To develop an effective Chinese Medicine (CM) diagnostic model of coronary heart disease (CHD) and to confirm the scientific validity of the CM theoretical basis from an algorithmic viewpoint. Four types of objective diagnostic data were collected from 835 CHD patients by using a self-developed CM inquiry scale for the diagnosis of heart problems, a tongue diagnosis instrument, a ZBOX-I pulse digital collection instrument, and a sound acquisition system. These diagnostic data were analyzed and a CM diagnostic model was established using a multi-label learning algorithm (REAL). REAL was employed to establish a Xin (Heart) qi deficiency, Xin yang deficiency, Xin yin deficiency, blood stasis, and phlegm five-card CM diagnostic model, which had recognition rates of 80.32%, 89.77%, 84.93%, 85.37%, and 69.90%, respectively. The multi-label learning method established using four diagnostic models based on mutual information feature selection yielded good recognition results. The characteristic model parameters were selected by maximizing the mutual information for each card type. The four diagnostic methods used to obtain information in CM, i.e., observation, auscultation and olfaction, inquiry, and pulse diagnosis, can be characterized by these parameters, which is consistent with CM theory.
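The following is a minimal sketch of the mutual-information feature selection step, with a generic multi-label classifier standing in for the REAL algorithm used in the paper; the array shapes, the number of retained features and the logistic-regression base learner are illustrative assumptions.

```python
# Sketch: mutual-information feature ranking, then a generic multi-label classifier.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.multioutput import MultiOutputClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(835, 60))            # 60 candidate features from the four diagnostic methods
Y = rng.integers(0, 2, size=(835, 5))     # 5 pattern labels (qi/yang/yin deficiency, stasis, phlegm)

# Rank features by mutual information with one label and keep the top 20.
mi = mutual_info_classif(X, Y[:, 0], random_state=0)
keep = np.argsort(mi)[-20:]

clf = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X[:, keep], Y)
print(clf.predict(X[:5, keep]))           # one 0/1 prediction per pattern label
```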
Seizures in the elderly: development and validation of a diagnostic algorithm.
Dupont, Sophie; Verny, Marc; Harston, Sandrine; Cartz-Piver, Leslie; Schück, Stéphane; Martin, Jennifer; Puisieux, François; Alecu, Cosmin; Vespignani, Hervé; Marchal, Cécile; Derambure, Philippe
2010-05-01
Seizures are frequent in the elderly, but their diagnosis can be challenging. The objective of this work was to develop and validate an expert-based algorithm for the diagnosis of seizures in elderly people. A multidisciplinary group of neurologists and geriatricians developed a diagnostic algorithm using a combination of selected clinical, electroencephalographical and radiological criteria. The algorithm was validated by multicentre retrospective analysis of data of patients referred for specific symptoms and classified by the experts as epileptic patients or not. The algorithm was applied to all the patients, and the diagnosis provided by the algorithm was compared to the clinical diagnosis of the experts. Twenty-nine clinical, electroencephalographical and radiological criteria were selected for the algorithm. According to the criteria combination, seizures were classified in four levels of diagnosis: certain, highly probable, possible or improbable. To validate the algorithm, the medical records of 269 elderly patients were analyzed (138 with epileptic seizures, 131 with non-epileptic manifestations). Patients were mainly referred for a transient focal deficit (40%), confusion (38%) or unconsciousness (27%). The algorithm best classified certain and probable seizures versus possible and improbable seizures, with 86.2% sensitivity and 67.2% specificity. Using logistic regression, two simplified models were developed, the first with 13 criteria (Se 85.5%, Sp 90.1%) and the second with 7 criteria only (Se 84.8%, Sp 88.6%). In conclusion, the present study validated the use of a revised diagnostic algorithm to help diagnose epileptic seizures in the elderly. A prospective study is planned to further validate this algorithm. Copyright 2010 Elsevier B.V. All rights reserved.
Bar-Cohen, Yaniv; Khairy, Paul; Morwood, James; Alexander, Mark E; Cecchin, Frank; Berul, Charles I
2006-07-01
ECG algorithms used to localize accessory pathways (AP) in patients with Wolff-Parkinson-White (WPW) syndrome have been validated in adults, but less is known of their use in children, especially in patients with congenital heart disease (CHD). We hypothesize that these algorithms have low diagnostic accuracy in children and even lower in those with CHD. Pre-excited ECGs in 43 patients with WPW and CHD (median age 5.4 years [0.9-32 years]) were evaluated and compared to 43 consecutive WPW control patients without CHD (median age 14.5 years [1.8-18 years]). Two blinded observers predicted AP location using 2 adult and 1 pediatric WPW algorithms, and a third blinded observer served as a tiebreaker. Predicted locations were compared with ablation-verified AP location to identify (a) exact match for AP location and (b) match for laterality (left-sided vs right-sided AP). In control children, adult algorithms were accurate in only 56% and 60%, while the pediatric algorithm was correct in 77%. In 19 patients with Ebstein's anomaly, diagnostic accuracy was similar to controls with at times an even better ability to predict laterality. In non-Ebstein's CHD, however, the algorithms were markedly worse (29% for the adult algorithms and 42% for the pediatric algorithms). A relatively large degree of interobserver variability was seen (kappa values from 0.30 to 0.58). Adult localization algorithms have poor diagnostic accuracy in young patients with and without CHD. Both adult and pediatric algorithms are particularly misleading in non-Ebstein's CHD patients and should be interpreted with caution.
Moon, Hee-Won; Kim, Hyeong Nyeon; Hur, Mina; Shim, Hee Sook; Kim, Heejung; Yun, Yeo-Min
2016-01-01
Since every single test has some limitations for detecting toxigenic Clostridium difficile, multistep algorithms are recommended. This study aimed to compare the current, representative diagnostic algorithms for detecting toxigenic C. difficile, using VIDAS C. difficile toxin A&B (toxin ELFA), VIDAS C. difficile GDH (GDH ELFA, bioMérieux, Marcy-l'Etoile, France), and Xpert C. difficile (Cepheid, Sunnyvale, California, USA). In 271 consecutive stool samples, toxigenic culture, toxin ELFA, GDH ELFA, and Xpert C. difficile were performed. We simulated two algorithms: screening by GDH ELFA and confirmation by Xpert C. difficile (GDH + Xpert), and a combined algorithm of GDH ELFA, toxin ELFA, and Xpert C. difficile (GDH + Toxin + Xpert). The performance of each assay and algorithm was assessed. The agreement of Xpert C. difficile and the two algorithms (GDH + Xpert and GDH + Toxin + Xpert) with toxigenic culture was strong (Kappa, 0.848, 0.857, and 0.868, respectively). The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the algorithms (GDH + Xpert and GDH + Toxin + Xpert) were 96.7%, 95.8%, 85.0%, 98.1%, and 94.5%, 95.8%, 82.3%, 98.5%, respectively. There were no significant differences between Xpert C. difficile and the two algorithms in sensitivity, specificity, PPV and NPV. The performances of both algorithms for detecting toxigenic C. difficile were comparable to that of Xpert C. difficile. Either algorithm would be useful in clinical laboratories and can be optimized in the diagnostic workflow of C. difficile depending on costs, test volume, and clinical needs.
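A minimal sketch of the GDH + Xpert two-step logic and of the accuracy measures quoted above; the confusion-matrix counts in the example are invented for illustration and are not the study's data.

```python
# Two-step screen-and-confirm logic plus the standard 2x2 accuracy measures.
def gdh_xpert_algorithm(gdh_positive: bool, xpert_positive: bool) -> bool:
    """Report toxigenic C. difficile only if the GDH screen is positive and
    the confirmatory Xpert PCR is also positive."""
    if not gdh_positive:
        return False            # negative screen: no confirmatory test performed
    return xpert_positive

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only (not the study's data).
print(diagnostic_metrics(tp=58, fp=10, fn=2, tn=201))
```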
Bone, Daniel; Bishop, Somer; Black, Matthew P.; Goodwin, Matthew S.; Lord, Catherine; Narayanan, Shrikanth S.
2016-01-01
Background Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely-used ASD screening and diagnostic tools. Methods The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders (DD), split at age 10. Algorithms were created via a robust ML classifier, support vector machine (SVM), while targeting best-estimate clinical diagnosis of ASD vs. non-ASD. Parameter settings were tuned in multiple levels of cross-validation. Results The created algorithms were more effective (higher performing) than current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. Conclusions ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. PMID:27090613
Bone, Daniel; Bishop, Somer L; Black, Matthew P; Goodwin, Matthew S; Lord, Catherine; Narayanan, Shrikanth S
2016-08-01
Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely used ASD screening and diagnostic tools. The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders, split at age 10. Algorithms were created via a robust ML classifier, support vector machine, while targeting best-estimate clinical diagnosis of ASD versus non-ASD. Parameter settings were tuned in multiple levels of cross-validation. The created algorithms were more effective (higher performing) than the current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight the limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. © 2016 Association for Child and Adolescent Mental Health.
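As a rough illustration of the approach described in the two records above, here is a sketch of a linear SVM trained on a small number of selected item codes with cross-validation; the synthetic data, the univariate feature-selection step and the parameter grid are assumptions and do not reproduce the authors' ML pipeline.

```python
# Sketch: keep five item codes, fit a linear SVM, estimate accuracy with nested CV.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(2)
X = rng.integers(0, 4, size=(600, 40)).astype(float)   # 40 instrument item codes scored 0-3
y = rng.integers(0, 2, size=600)                        # 1 = ASD, 0 = non-ASD (placeholder labels)

# Mirror the "five or fewer codes" idea with a univariate top-5 selection step.
model = make_pipeline(SelectKBest(f_classif, k=5), StandardScaler(), LinearSVC(C=1.0, dual=False))
inner = GridSearchCV(model, {"linearsvc__C": [0.1, 1.0, 10.0]}, cv=5)
outer_scores = cross_val_score(inner, X, y, cv=5)       # nested cross-validation estimate
print(f"mean CV accuracy: {outer_scores.mean():.3f}")
```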
HIV misdiagnosis in sub-Saharan Africa: performance of diagnostic algorithms at six testing sites
Kosack, Cara S.; Shanks, Leslie; Beelaert, Greet; Benson, Tumwesigye; Savane, Aboubacar; Ng’ang’a, Anne; Andre, Bita; Zahinda, Jean-Paul BN; Fransen, Katrien; Page, Anne-Laure
2017-01-01
Abstract Introduction: We evaluated the diagnostic accuracy of HIV testing algorithms at six programmes in five sub-Saharan African countries. Methods: In this prospective multisite diagnostic evaluation study (Conakry, Guinea; Kitgum, Uganda; Arua, Uganda; Homa Bay, Kenya; Douala, Cameroon and Baraka, Democratic Republic of Congo), samples from clients (≥ five years of age) testing for HIV were collected and compared to a state-of-the-art algorithm from the AIDS reference laboratory at the Institute of Tropical Medicine, Belgium. The reference algorithm consisted of an enzyme-linked immuno-sorbent assay, a line-immunoassay, a single antigen-enzyme immunoassay and a DNA polymerase chain reaction test. Results: Between August 2011 and January 2015, over 14,000 clients were tested for HIV at 6 HIV counselling and testing sites. Of those, 2786 (median age: 30; 38.1% males) were included in the study. Sensitivity of the testing algorithms ranged from 89.5% in Arua to 100% in Douala and Conakry, while specificity ranged from 98.3% in Douala to 100% in Conakry. Overall, 24 (0.9%) clients, and as many as 8 per site (1.7%), were misdiagnosed, with 16 false-positive and 8 false-negative results. Six false-negative specimens were retested with the on-site algorithm on the same sample and were found to be positive. Conversely, 13 false-positive specimens were retested: 8 remained false-positive with the on-site algorithm. Conclusions: The performance of algorithms at several sites failed to meet expectations and thresholds set by the World Health Organization, with unacceptably high rates of false results. Alongside the careful selection of rapid diagnostic tests and the validation of algorithms, strictly observing correct procedures can reduce the risk of false results. In the meantime, to identify false-positive diagnoses at initial testing, patients should be retested upon initiating antiretroviral therapy. PMID:28691437
HIV misdiagnosis in sub-Saharan Africa: performance of diagnostic algorithms at six testing sites.
Kosack, Cara S; Shanks, Leslie; Beelaert, Greet; Benson, Tumwesigye; Savane, Aboubacar; Ng'ang'a, Anne; Andre, Bita; Zahinda, Jean-Paul Bn; Fransen, Katrien; Page, Anne-Laure
2017-07-03
We evaluated the diagnostic accuracy of HIV testing algorithms at six programmes in five sub-Saharan African countries. In this prospective multisite diagnostic evaluation study (Conakry, Guinea; Kitgum, Uganda; Arua, Uganda; Homa Bay, Kenya; Douala, Cameroon and Baraka, Democratic Republic of Congo), samples from clients (≥ five years of age) testing for HIV were collected and compared to a state-of-the-art algorithm from the AIDS reference laboratory at the Institute of Tropical Medicine, Belgium. The reference algorithm consisted of an enzyme-linked immuno-sorbent assay, a line-immunoassay, a single antigen-enzyme immunoassay and a DNA polymerase chain reaction test. Between August 2011 and January 2015, over 14,000 clients were tested for HIV at 6 HIV counselling and testing sites. Of those, 2786 (median age: 30; 38.1% males) were included in the study. Sensitivity of the testing algorithms ranged from 89.5% in Arua to 100% in Douala and Conakry, while specificity ranged from 98.3% in Douala to 100% in Conakry. Overall, 24 (0.9%) clients, and as many as 8 per site (1.7%), were misdiagnosed, with 16 false-positive and 8 false-negative results. Six false-negative specimens were retested with the on-site algorithm on the same sample and were found to be positive. Conversely, 13 false-positive specimens were retested: 8 remained false-positive with the on-site algorithm. The performance of algorithms at several sites failed to meet expectations and thresholds set by the World Health Organization, with unacceptably high rates of false results. Alongside the careful selection of rapid diagnostic tests and the validation of algorithms, strictly observing correct procedures can reduce the risk of false results. In the meantime, to identify false-positive diagnoses at initial testing, patients should be retested upon initiating antiretroviral therapy.
Spitzer, C; Dahlenburg, B; Freyberger, H J
2006-07-01
Caspar David Friedrich (1774-1840) is one of the most important German Romantic painters. His paintings prototypically represent the melancholy that was noted by his contemporaries and later biographers. Art historians have also referred to his melancholy in interpreting his work. From a medical point of view, there are only two pathographies, and they remain inconclusive. Having applied diagnostic criteria for psychiatric disorders to his letters and publications, to statements of his contemporaries and to his art, we propose that he suffered from a recurrent major depression which occurred in 1799 for the first time. At least three depressive episodes followed before he suffered a stroke in 1835. There are epidemiological, psychodynamic and personality-typological reasons supporting our diagnostic assumption. The course of his depression corresponds to phases of reduced creativity and to the chosen techniques and motifs. Finally we discuss the implications of our approach for the pathographical method in general.
Sex-specific performance of pre-imaging diagnostic algorithms for pulmonary embolism.
van Mens, T E; van der Pol, L M; van Es, N; Bistervels, I M; Mairuhu, A T A; van der Hulle, T; Klok, F A; Huisman, M V; Middeldorp, S
2018-05-01
Essentials Decision rules for pulmonary embolism are used indiscriminately despite possible sex-differences. Various pre-imaging diagnostic algorithms have been investigated in several prospective studies. When analysed at an individual patient data level the algorithms perform similarly in both sexes. Estrogen use and male sex were associated with a higher prevalence in suspected pulmonary embolism. Background In patients suspected of pulmonary embolism (PE), clinical decision rules are combined with D-dimer testing to rule out PE, avoiding the need for imaging in those at low risk. Despite sex differences in several aspects of the disease, including its diagnosis, these algorithms are used indiscriminately in women and men. Objectives To compare the performance, defined as efficiency and failure rate, of three pre-imaging diagnostic algorithms for PE between women and men: the Wells rule with fixed or with age-adjusted D-dimer cut-off, and a recently validated algorithm (YEARS). A secondary aim was to determine the sex-specific prevalence of PE. Methods Individual patient data were obtained from six studies using the Wells rule (fixed D-dimer, n = 5; age adjusted, n = 1) and from one study using the YEARS algorithm. All studies prospectively enrolled consecutive patients with suspected PE. Main outcomes were efficiency (proportion of patients in which the algorithm ruled out PE without imaging) and failure rate (proportion of patients with PE not detected by the algorithm). Outcomes were estimated using (multilevel) logistic regression models. Results The main outcomes showed no sex differences in any of the separate algorithms. With all three, the prevalence of PE was lower in women (OR, 0.66, 0.68 and 0.74). In women, estrogen use, adjusted for age, was associated with lower efficiency and higher prevalence and D-dimer levels. Conclusions The investigated pre-imaging diagnostic algorithms for patients suspected of PE show no sex differences in performance. Male sex and estrogen use are both associated with a higher probability of having the disease. © 2018 International Society on Thrombosis and Haemostasis.
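A minimal sketch of the rule-out logic compared in the paper; the two-tier Wells threshold (≤ 4 points) and the age-adjusted D-dimer cut-off (age × 10 µg/L above age 50) follow commonly used conventions and should be checked against the individual pooled studies.

```python
# Sketch of "Wells + D-dimer" rule-out: PE excluded without imaging only when the
# clinical decision rule is low risk and the D-dimer is below the (optionally
# age-adjusted) cut-off.
def pe_ruled_out_without_imaging(wells_score: float, d_dimer_ug_per_l: float,
                                 age_years: int, age_adjusted: bool = True) -> bool:
    if wells_score > 4:                        # "PE likely": proceed straight to imaging
        return False
    cutoff = 500.0                             # conventional fixed cut-off (ug/L FEU)
    if age_adjusted and age_years > 50:
        cutoff = age_years * 10.0              # e.g. 750 ug/L at age 75
    return d_dimer_ug_per_l < cutoff

print(pe_ruled_out_without_imaging(wells_score=3, d_dimer_ug_per_l=700, age_years=75))
```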
An EEG blind source separation algorithm based on a weak exclusion principle.
Lan Ma; Blu, Thierry; Wang, William S-Y
2016-08-01
The question of how to separate individual brain and non-brain signals, mixed by volume conduction in electroencephalographic (EEG) and other electrophysiological recordings, is a significant problem in contemporary neuroscience. This study proposes and evaluates a novel EEG Blind Source Separation (BSS) algorithm based on a weak exclusion principle (WEP). The chief point in which it differs from most previous EEG BSS algorithms is that the proposed algorithm is not based upon the hypothesis that the sources are statistically independent. Our first step was to investigate algorithm performance on simulated signals which have ground truth. The purpose of this simulation is to illustrate the proposed algorithm's efficacy. The results show that the proposed algorithm has good separation performance. Then, we used the proposed algorithm to separate real EEG signals from a memory study using a revised version of Sternberg Task. The results show that the proposed algorithm can effectively separate the non-brain and brain sources.
Cassani, Raymundo; Falk, Tiago H.; Fraga, Francisco J.; Kanda, Paulo A. M.; Anghinah, Renato
2014-01-01
Over the last decade, electroencephalography (EEG) has emerged as a reliable tool for the diagnosis of cortical disorders such as Alzheimer's disease (AD). EEG signals, however, are susceptible to several artifacts, such as ocular, muscular, movement, and environmental. To overcome this limitation, existing diagnostic systems commonly depend on experienced clinicians to manually select artifact-free epochs from the collected multi-channel EEG data. Manual selection, however, is a tedious and time-consuming process, rendering the diagnostic system “semi-automated.” Notwithstanding, a number of EEG artifact removal algorithms have been proposed in the literature. The (dis)advantages of using such algorithms in automated AD diagnostic systems, however, have not been documented; this paper aims to fill this gap. Here, we investigate the effects of three state-of-the-art automated artifact removal (AAR) algorithms (both alone and in combination with each other) on AD diagnostic systems based on four different classes of EEG features, namely, spectral, amplitude modulation rate of change, coherence, and phase. The three AAR algorithms tested are statistical artifact rejection (SAR), blind source separation based on second order blind identification and canonical correlation analysis (BSS-SOBI-CCA), and wavelet enhanced independent component analysis (wICA). Experimental results based on 20-channel resting-awake EEG data collected from 59 participants (20 patients with mild AD, 15 with moderate-to-severe AD, and 24 age-matched healthy controls) showed the wICA algorithm alone outperforming other enhancement algorithm combinations across three tasks: diagnosis (control vs. mild vs. moderate), early detection (control vs. mild), and disease progression (mild vs. moderate), thus opening the doors for fully-automated systems that can assist clinicians with early detection of AD, as well as disease severity progression assessment. PMID:24723886
Dionne, Audrey; Meloche-Dumas, Léamarie; Desjardins, Laurent; Turgeon, Jean; Saint-Cyr, Claire; Autmizguine, Julie; Spigelblatt, Linda; Fournier, Anne; Dahdah, Nagib
2017-03-01
Diagnosis of Kawasaki disease (KD) can be challenging in the absence of a confirmatory test or pathognomonic finding, especially when clinical criteria are incomplete. We recently proposed serum N-terminal pro-B-type natriuretic peptide (NT-proBNP) as an adjunctive diagnostic test. We retrospectively tested a new algorithm to help KD diagnosis based on NT-proBNP, coronary artery dilation (CAD) at onset, and abnormal serum albumin or C-reactive protein (CRP). The goal was to assess the performance of the algorithm and compare its performance with that of the 2004 American Heart Association (AHA)/American Academy of Pediatrics (AAP) algorithm. The algorithm was tested on 124 KD patients with NT-proBNP measured on admission at the present institutions between 2007 and 2013. Age at diagnosis was 3.4 ± 3.0 years, with a median of five diagnostic criteria; and 55 of the 124 patients (44%) had incomplete KD. CA complications occurred in 64 (52%), with aneurysm in 14 (11%). Using this algorithm, 120/124 (97%) were to be treated, based on high NT-proBNP alone for 79 (64%); on onset CAD for 14 (11%); and on high CRP or low albumin for 27 (22%). Using the AHA/AAP algorithm, 22/47 (47%) of the eligible patients with incomplete KD would not have been referred for treatment, compared with 3/55 (5%) with the NT-proBNP algorithm (P < 0.001). This NT-proBNP-based algorithm is efficient to identify and treat patients with KD, including those with incomplete KD. This study paves the way for a prospective validation trial of the algorithm. © 2016 Japan Pediatric Society.
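The decision logic described above can be sketched as follows; the CRP and albumin thresholds are placeholders (the abstract does not state the cut-offs), so only the structure of the algorithm is illustrated.

```python
# Sketch of the proposed algorithm: treat if NT-proBNP is high, or coronary artery
# dilation (CAD) is present at onset, or CRP is high / albumin is low.
def treat_for_kd(nt_probnp_high: bool, cad_at_onset: bool,
                 crp_mg_per_l: float, albumin_g_per_l: float) -> bool:
    if nt_probnp_high:
        return True
    if cad_at_onset:
        return True
    # Placeholder thresholds, not the paper's cut-offs.
    return crp_mg_per_l > 30.0 or albumin_g_per_l < 30.0

print(treat_for_kd(False, False, crp_mg_per_l=45.0, albumin_g_per_l=38.0))
```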
Dey, Nilanjan; Bose, Soumyo; Das, Achintya; Chaudhuri, Sheli Sinha; Saba, Luca; Shafique, Shoaib; Nicolaides, Andrew; Suri, Jasjit S
2016-04-01
Embedding of diagnostic and health care information requires secure encryption and watermarking. This research paper presents a comprehensive study of the behavior of some well-established frequency-domain watermarking algorithms with respect to the preservation of stroke-based diagnostic parameters. Two different sets of watermarking algorithms, namely two correlation-based (binary logo hiding) and two singular value decomposition (SVD)-based (gray logo hiding) watermarking algorithms, are used for embedding an ownership logo. The diagnostic parameters in atherosclerotic plaque ultrasound video are: (a) bulb identification and recognition, which consists of identifying the bulb edge points in the far and near carotid walls; (b) carotid bulb diameter; and (c) carotid lumen thickness all along the carotid artery. The tested data set consists of carotid atherosclerotic movies acquired under IRB protocol in a University of Indiana Hospital (USA)-AtheroPoint™ (Roseville, CA, USA) joint pilot study. ROC (receiver operating characteristic) analysis was performed on the bulb detection process and showed an accuracy and sensitivity of 100% each. The diagnostic preservation (DPsystem) for the SVD-based approach was above 99% with PSNR (peak signal-to-noise ratio) above 41, indicating that the diagnostic parameters are retained without degradation as an effect of watermarking. Thus, the fully automated proposed system proved to be an efficient method for watermarking atherosclerotic ultrasound video for stroke applications.
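For context, here is a generic sketch of additive SVD-domain watermark embedding together with the PSNR measure; this is a textbook-style scheme with arbitrary parameters, not the specific correlation-based or SVD-based algorithms benchmarked in the paper.

```python
# Generic SVD-domain watermarking sketch: perturb the singular values of a frame
# with the logo, then measure distortion with PSNR.
import numpy as np

def embed_svd_watermark(frame: np.ndarray, logo: np.ndarray, alpha: float = 0.05) -> np.ndarray:
    U, S, Vt = np.linalg.svd(frame, full_matrices=False)
    S_marked = S + alpha * logo.ravel()[: S.size]     # additive embedding in the singular values
    return (U * S_marked) @ Vt

def psnr(original: np.ndarray, marked: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((original.astype(float) - marked.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(3)
frame = rng.integers(0, 256, size=(128, 128)).astype(float)   # stand-in ultrasound frame
logo = rng.random(128)                                         # stand-in gray logo row
marked = embed_svd_watermark(frame, logo)
print(f"PSNR: {psnr(frame, marked):.1f} dB")
```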
Understanding and Counseling Narcissistic Clients.
ERIC Educational Resources Information Center
Stevens, Michael J.; And Others
1984-01-01
Provides counselors with an overview of narcissism and its treatment. In the first section, dysfunctional narcissism is described, drawing on the diagnostic indicators presented in the DSM-III and the contemporary object relations theories of Heinz Kohut and Otto Kernberg. The second section focuses on counseling narcissistic clients. (Author/JAC)
Pragmatic ethical basis for radiation protection in diagnostic radiology
Zölzer, Friedo
2016-01-01
Objective: Medical ethics has a tried and tested literature and a global active research community. Even among health professionals, literate and fluent in medical ethics, there is low recognition of radiation protection principles such as justification and optimization. On the other hand, many in healthcare environments misunderstand dose limitation obligations and incorrectly believe patients are protected by norms including a dose limit. Implementation problems for radiation protection in medicine possibly flow from apparent inadequacies of the International Commission on Radiological Protection (ICRP) principles taken on their own, coupled with their failure to transfer successfully to the medical world. Medical ethics, on the other hand, is essentially global, is acceptable in most cultures, is intuitively understood in hospitals, and its expectations are monitored, even by managements. This article presents an approach to ethics in diagnostic imaging rooted in the medical tradition, and alert to contemporary social expectations. ICRP and the International Radiation Protection Association (IRPA), both alert to growing ethical concerns, organized a series of consultations on ethics for general radiation protection in the last few years. Methods: The literature on medical ethics and implicit ICRP ethical values were reviewed qualitatively, with a view to identifying a system that will help guide contemporary behaviour in radiation protection of patients. Application of the system is illustrated in six clinical scenarios. The proposed system is designed, as far as is possible, so as not to be in conflict with the conclusions emerging from the ICRP/IRPA consultations. Results and conclusion: A widely recognized and well-respected system of medical ethics was identified that has global reach and claims acceptance in all cultures. Three values based on this system are grouped with two additional values to provide an ethical framework for application in diagnostic imaging. This system has the potential to be robust and to reach conclusions that are in accord with contemporary medical, social and ethical thinking. The system is not intended to replace the ICRP principles. Rather, it is intended as a well-informed interim approach that will help judge and analyse situations that arouse ethical concerns in radiology. Six scenarios illustrate the practicality of the value system in alerting one to possible deficits in practice. Advances in knowledge: Five widely recognized values and the basis for them are identified to support the contemporary practice of diagnostic radiology. These are essential to complement the widely used ICRP principles pending further development in the area. PMID:26796852
Gearbox vibration diagnostic analyzer
NASA Technical Reports Server (NTRS)
1992-01-01
This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.
Anatomy-Based Algorithms for Detecting Oral Cancer Using Reflectance and Fluorescence Spectroscopy
McGee, Sasha; Mardirossian, Vartan; Elackattu, Alphi; Mirkovic, Jelena; Pistey, Robert; Gallagher, George; Kabani, Sadru; Yu, Chung-Chieh; Wang, Zimmern; Badizadegan, Kamran; Grillone, Gregory; Feld, Michael S.
2010-01-01
Objectives We used reflectance and fluorescence spectroscopy to noninvasively and quantitatively distinguish benign from dysplastic/malignant oral lesions. We designed diagnostic algorithms to account for differences in the spectral properties among anatomic sites (gingiva, buccal mucosa, etc). Methods In vivo reflectance and fluorescence spectra were collected from 71 patients with oral lesions. The tissue was then biopsied and the specimen evaluated by histopathology. Quantitative parameters related to tissue morphology and biochemistry were extracted from the spectra. Diagnostic algorithms specific for combinations of sites with similar spectral properties were developed. Results Discrimination of benign from dysplastic/malignant lesions was most successful when algorithms were designed for individual sites (area under the receiver operator characteristic curve [ROC-AUC], 0.75 for the lateral surface of the tongue) and was least accurate when all sites were combined (ROC-AUC, 0.60). The combination of sites with similar spectral properties (floor of mouth and lateral surface of the tongue) yielded an ROC-AUC of 0.71. Conclusions Accurate spectroscopic detection of oral disease must account for spectral variations among anatomic sites. Anatomy-based algorithms for single sites or combinations of sites demonstrated good diagnostic performance in distinguishing benign lesions from dysplastic/malignant lesions and consistently performed better than algorithms developed for all sites combined. PMID:19999369
Merlyn J. Paulson
1979-01-01
This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...
Ronald, L A; Ling, D I; FitzGerald, J M; Schwartzman, K; Bartlett-Esquilant, G; Boivin, J-F; Benedetti, A; Menzies, D
2017-05-01
An increasing number of studies are using health administrative databases for tuberculosis (TB) research. However, there are limitations to using such databases for identifying patients with TB. To summarise validated methods for identifying TB in health administrative databases. We conducted a systematic literature search in two databases (Ovid Medline and Embase, January 1980-January 2016). We limited the search to diagnostic accuracy studies assessing algorithms derived from drug prescription, International Classification of Diseases (ICD) diagnostic code and/or laboratory data for identifying patients with TB in health administrative databases. The search identified 2413 unique citations. Of the 40 full-text articles reviewed, we included 14 in our review. Algorithms and diagnostic accuracy outcomes to identify TB varied widely across studies, with positive predictive value ranging from 1.3% to 100% and sensitivity ranging from 20% to 100%. Diagnostic accuracy measures of algorithms using out-patient, in-patient and/or laboratory data to identify patients with TB in health administrative databases vary widely across studies. Use solely of ICD diagnostic codes to identify TB, particularly when using out-patient records, is likely to lead to incorrect estimates of case numbers, given the current limitations of ICD systems in coding TB.
Intelligent Diagnostic Assistant for Complicated Skin Diseases through C5's Algorithm.
Jeddi, Fatemeh Rangraz; Arabfard, Masoud; Kermany, Zahra Arab
2017-09-01
An intelligent diagnostic assistant can be used for the complicated diagnosis of skin diseases, which are among the most common causes of disability. The aim of this study was to design and implement a computerized intelligent diagnostic assistant for complicated skin diseases through C5's algorithm. An applied-developmental study was done in 2015. The knowledge base was developed from interviews with dermatologists using questionnaires and checklists. Knowledge representation was obtained from the training data in the database using Microsoft Office Excel. Clementine software and the C5 algorithm were applied to draw the decision tree. Analysis of test accuracy was performed based on rules extracted using inference chains. The rules extracted from the decision tree were entered into the CLIPS programming environment, and the intelligent diagnostic assistant was then designed. The rules were defined using a forward-chaining inference technique and were entered into the CLIPS programming environment as RULEs. The accuracy and error rates obtained from the decision tree in the training phase were 99.56% and 0.44%, respectively. The accuracy of the decision tree was 98% and the error was 2% in the test phase. The intelligent diagnostic assistant can be used as a reliable system with high accuracy, sensitivity, specificity, and agreement.
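A minimal sketch of the decision-tree-plus-rules idea: scikit-learn's CART implementation stands in for the C5 algorithm (which is usually run through Clementine/SPSS Modeler or R's C50 package), and the symptom features and disease classes below are synthetic placeholders.

```python
# Sketch: train a decision tree on symptom features and print its rules, which could
# then be translated into CLIPS forward-chaining rules.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)
feature_names = ["itching", "scaling", "erythema", "lesion_border", "duration_weeks"]
X = rng.integers(0, 4, size=(300, len(feature_names)))   # graded symptom scores 0-3
y = rng.integers(0, 3, size=300)                          # placeholder disease classes

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
print(export_text(tree, feature_names=feature_names))     # human-readable rule set
```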
Yiadom, Maame Yaa A B; Mumma, Bryn E; Baugh, Christopher W; Patterson, Brian W; Mills, Angela M; Salazar, Gilberto; Tanski, Mary; Jenkins, Cathy A; Vogus, Timothy J; Miller, Karen F; Jackson, Brittney E; Lehmann, Christoph U; Dorner, Stephen C; West, Jennifer L; Wang, Thomas J; Collins, Sean P; Dittus, Robert S; Bernard, Gordon R; Storrow, Alan B; Liu, Dandan
2018-05-03
Advances in ST-segment elevation myocardial infarction (STEMI) management have involved improving the clinical processes connecting patients with timely emergency cardiovascular care. Screening upon emergency department (ED) arrival for an early ECG to diagnose STEMI, however, is not optimal for all patients. In addition, the degree to which timely screening and diagnosis are associated with improved time to intervention and postpercutaneous coronary intervention outcomes, under more contemporary practice conditions, is not known. We present the methods for a retrospective multicentre cohort study anticipated to include 1220 patients across seven EDs to (1) evaluate the relationship between timely screening and diagnosis with treatment and postintervention clinical outcomes; (2) introduce novel measures for cross-facility performance comparisons of screening and diagnostic care team performance including: door-to-screening, door-to-diagnosis and door-to-catheterisation laboratory arrival times and (3) describe the use of electronic health record data in tandem with an existing disease registry. The completion of this study will provide critical feedback on the quality of screening and diagnostic performance within the contemporary STEMI care pathway that can be used to (1) improve emergency care delivery for patients with STEMI presenting to the ED, (2) present novel metrics for the comparison of screening and diagnostic care and (3) inform the development of screening and diagnostic support tools that could be translated to other care environments. We will disseminate our results via publication and quality performance data sharing with each site. Institutional ethics review approval was received prior to study initiation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
In this presentation, three diagnostic evaluation methods of model performance for carbonaceous aerosol are reviewed. The EC-tracer method is used to distinguish primary and secondary carbon, radiocarbon data are used to distinguish fossil-fuel and contemporary carbon, and organ...
Combined algorithmic and GPU acceleration for ultra-fast circular conebeam backprojection
NASA Astrophysics Data System (ADS)
Brokish, Jeffrey; Sack, Paul; Bresler, Yoram
2010-04-01
In this paper, we describe the first implementation and performance of a fast O(N³ log N) hierarchical backprojection algorithm for cone beam CT with a circular trajectory, developed on a modern Graphics Processing Unit (GPU). The resulting tomographic backprojection system for 3D cone beam geometry combines speedup through algorithmic improvements provided by the hierarchical backprojection algorithm with speedup from a massively parallel hardware accelerator. For data parameters typical in diagnostic CT and using a mid-range GPU card, we report reconstruction speeds of up to 360 frames per second, and relative speedup of almost 6x compared to conventional backprojection on the same hardware. The significance of these results is twofold. First, they demonstrate that the reduction in operation counts demonstrated previously for the FHBP algorithm can be translated to a comparable run-time improvement in a massively parallel hardware implementation, while preserving stringent diagnostic image quality. Second, the dramatic speedup and throughput numbers achieved indicate the feasibility of systems based on this technology, which achieve real-time 3D reconstruction for state-of-the-art diagnostic CT scanners with small footprint, high reliability, and affordable cost.
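The paper's O(N³ log N) hierarchical cone-beam algorithm and its GPU port are not reproduced here; as a point of reference only, the sketch below implements the conventional (unfiltered) 2-D parallel-beam backprojection whose brute-force cost, O(N³) per slice, is what both the algorithmic and the hardware speedups attack. Geometry and test data are toy choices.

```python
# Minimal 2-D parallel-beam backprojection baseline (NumPy). This is the conventional
# O(N^3)-per-slice operation, not the paper's cone-beam FHBP implementation.
import numpy as np

def backproject(sinogram, angles_deg):
    """Unfiltered backprojection of a parallel-beam sinogram of shape (n_views, n_detectors)."""
    n_views, n_det = sinogram.shape
    size = n_det
    c = size // 2
    ys, xs = np.meshgrid(np.arange(size) - c, np.arange(size) - c, indexing="ij")
    recon = np.zeros((size, size))
    for view, theta in zip(sinogram, np.deg2rad(angles_deg)):
        t = xs * np.cos(theta) + ys * np.sin(theta)         # detector coordinate per pixel
        idx = np.clip(np.round(t).astype(int) + c, 0, n_det - 1)
        recon += view[idx]                                  # nearest-neighbour interpolation
    return recon * (np.pi / n_views)

# Tiny smoke test: backproject the sinogram of a single point at the isocentre
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sino = np.zeros((180, 128))
sino[:, 64] = 1.0
img = backproject(sino, angles)
print(img.shape, float(img.max()))
```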
Ahn, Hye Shin; Kim, Sun Mi; Jang, Mijung; Yun, Bo La; Kim, Bohyoung; Ko, Eun Sook; Han, Boo-Kyung; Chang, Jung Min; Yi, Ann; Cho, Nariya; Moon, Woo Kyung; Choi, Hye Young
2014-01-01
To compare new full-field digital mammography (FFDM) with and without use of an advanced post-processing algorithm to improve image quality, lesion detection, diagnostic performance, and priority rank. During a 22-month period, we prospectively enrolled 100 cases of specimen FFDM mammography (Brestige®), which was performed alone or in combination with a post-processing algorithm developed by the manufacturer: group A (SMA), specimen mammography without application of "Mammogram enhancement ver. 2.0"; group B (SMB), specimen mammography with application of "Mammogram enhancement ver. 2.0". Two sets of specimen mammographies were randomly reviewed by five experienced radiologists. Image quality, lesion detection, diagnostic performance, and priority rank with regard to image preference were evaluated. Three aspects of image quality (overall quality, contrast, and noise) of the SMB were significantly superior to those of SMA (p < 0.05). SMB was significantly superior to SMA for visualizing calcifications (p < 0.05). Diagnostic performance, as evaluated by cancer score, was similar between SMA and SMB. SMB was preferred to SMA by four of the five reviewers. The post-processing algorithm may improve image quality with better image preference in FFDM than without use of the software.
Colonoscopy video quality assessment using hidden Markov random fields
NASA Astrophysics Data System (ADS)
Park, Sun Young; Sargent, Dusty; Spofford, Inbar; Vosburgh, Kirby
2011-03-01
With colonoscopy becoming a common procedure for individuals aged 50 or more who are at risk of developing colorectal cancer (CRC), colon video data is being accumulated at an ever-increasing rate. However, the clinically valuable information contained in these videos is not being maximally exploited to improve patient care and accelerate the development of new screening methods. One of the well-known difficulties in colonoscopy video analysis is the abundance of frames with no diagnostic information. Approximately 40%-50% of the frames in a colonoscopy video are contaminated by noise, acquisition errors, glare, blur, and uneven illumination. Therefore, filtering out low-quality frames containing no diagnostic information can significantly improve the efficiency of colonoscopy video analysis. To address this challenge, we present a quality assessment algorithm to detect and remove low-quality, uninformative frames. The goal of our algorithm is to discard low-quality frames while retaining all diagnostically relevant information. Our algorithm is based on a hidden Markov model (HMM) in combination with two measures of data quality to filter out uninformative frames. Furthermore, we present a two-level framework based on an embedded hidden Markov model (EHMM) to incorporate the proposed quality assessment algorithm into a complete, automated diagnostic image analysis system for colonoscopy video.
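A hedged sketch of the core idea only: two hidden states (informative vs. uninformative) over per-frame quality features, fitted with the third-party hmmlearn package. The blur/glare features and the simulated video are stand-ins for the paper's quality measures, and the two-level embedded HMM is not reproduced.

```python
# Two-state Gaussian HMM over per-frame quality features, used to filter uninformative frames.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(2)

# Simulate a video: alternating runs of informative (1) and uninformative (0) frames
true_state = np.repeat([1, 0, 1, 0, 1], [200, 80, 300, 120, 300])
blur = np.where(true_state == 1, rng.normal(0.2, 0.05, true_state.size),
                                 rng.normal(0.7, 0.10, true_state.size))
glare = np.where(true_state == 1, rng.normal(0.1, 0.05, true_state.size),
                                  rng.normal(0.5, 0.15, true_state.size))
X = np.column_stack([blur, glare])

model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100,
                    random_state=0).fit(X)
labels = model.predict(X)                                  # Viterbi state sequence

# The hidden state with the lower mean blur is taken as "informative"
informative = int(np.argmin(model.means_[:, 0]))
keep = labels == informative
print(f"kept {keep.sum()} of {len(keep)} frames as diagnostically informative")
```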
New method for detection of gastric cancer by hyperspectral imaging: a pilot study
NASA Astrophysics Data System (ADS)
Kiyotoki, Shu; Nishikawa, Jun; Okamoto, Takeshi; Hamabe, Kouichi; Saito, Mari; Goto, Atsushi; Fujita, Yusuke; Hamamoto, Yoshihiko; Takeuchi, Yusuke; Satori, Shin; Sakaida, Isao
2013-02-01
We developed a new, easy, and objective method to detect gastric cancer using hyperspectral imaging (HSI) technology combining spectroscopy and imaging. A total of 16 gastroduodenal tumors removed by endoscopic resection or surgery from 14 patients at Yamaguchi University Hospital, Japan, were recorded using a hyperspectral camera (HSC) equipped with HSI technology. Corrected spectral reflectance was obtained from 10 samples of normal mucosa and 10 samples of tumors for each case. The 16 cases were divided into eight training cases (160 training samples) and eight test cases (160 test samples). We established a diagnostic algorithm with training samples and evaluated it with test samples. Diagnostic capability of the algorithm for each tumor was validated, and enhancement of tumors by image processing using the HSC was evaluated. The diagnostic algorithm used the 726-nm wavelength, with a cutoff point established from training samples. The sensitivity, specificity, and accuracy rates of the algorithm's diagnostic capability in the test samples were 78.8% (63/80), 92.5% (74/80), and 85.6% (137/160), respectively. Tumors in HSC images of 13 (81.3%) cases were well enhanced by image processing. Differences in spectral reflectance between tumors and normal mucosa suggested that tumors can be clearly distinguished from background mucosa with HSI technology.
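The published rule is a single-wavelength cutoff learned from training samples; the sketch below mimics that structure on synthetic 726-nm reflectance values. The direction of the tumour/normal difference and the actual cutoff are assumptions, since they are not given in the abstract.

```python
# Illustrative single-wavelength threshold classifier with sensitivity/specificity on test samples.
import numpy as np

rng = np.random.default_rng(3)
# Synthetic corrected reflectance at 726 nm (assumed: tumours darker than normal mucosa)
train_normal, train_tumour = rng.normal(0.60, 0.08, 80), rng.normal(0.45, 0.08, 80)
test_normal,  test_tumour  = rng.normal(0.60, 0.08, 80), rng.normal(0.45, 0.08, 80)

cutoff = 0.5 * (train_normal.mean() + train_tumour.mean())   # simple midpoint rule

tp = np.sum(test_tumour < cutoff)          # tumour called tumour
fn = np.sum(test_tumour >= cutoff)
tn = np.sum(test_normal >= cutoff)
fp = np.sum(test_normal < cutoff)

n_test = len(test_normal) + len(test_tumour)
print(f"cutoff = {cutoff:.3f}")
print(f"sensitivity = {tp / (tp + fn):.1%}")
print(f"specificity = {tn / (tn + fp):.1%}")
print(f"accuracy    = {(tp + tn) / n_test:.1%}")
```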
Vertigo in childhood: proposal for a diagnostic algorithm based upon clinical experience.
Casani, A P; Dallan, I; Navari, E; Sellari Franceschini, S; Cerchiai, N
2015-06-01
The aim of this paper is to analyse all relevant anamnestic features, drawing on clinical experience with a series of patients with established diagnoses and on a review of the literature, in order to build a simple diagnostic algorithm for vertigo in childhood. This study is a retrospective chart review. A series of 37 children underwent complete clinical and instrumental vestibular examination. Only neurological disorders or genetic diseases represented exclusion criteria. All diagnoses were reviewed after applying the most recent diagnostic guidelines. In our experience, the most common aetiology for dizziness is vestibular migraine (38%), followed by acute labyrinthitis/neuritis (16%) and somatoform vertigo (16%). Benign paroxysmal vertigo was diagnosed in 4 patients (11%), and paroxysmal torticollis was diagnosed in a 1-year-old child. In 8% of cases (3 patients), the dizziness had a post-traumatic origin: 1 canalolithiasis of the posterior semicircular canal and 2 labyrinthine concussions. Menière's disease was diagnosed in 2 cases. Bilateral vestibular failure of unknown origin caused chronic dizziness in 1 patient. In conclusion, this algorithm could represent a good tool for guiding clinical suspicion towards the correct diagnostic assessment in dizzy children in whom no neurological findings are detectable. The algorithm has just a few simple steps, based mainly on two aspects to be investigated early: the temporal features of the vertigo and the presence of hearing impairment. A different algorithm has been proposed for cases in which a traumatic origin is suspected.
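A schematic rendering of the two anamnestic axes the algorithm is built on (temporal pattern of the vertigo and presence of hearing impairment). The branch labels below are common clinical associations used purely for illustration; they are not a transcription of the authors' published algorithm, which contains additional steps, including a separate post-traumatic branch.

```python
# Simplified, illustrative two-axis triage for childhood vertigo (not the published algorithm).
def suspected_category(temporal_pattern: str, hearing_impairment: bool) -> str:
    if temporal_pattern == "recurrent_episodes":
        return ("consider Meniere's disease work-up" if hearing_impairment
                else "consider vestibular migraine / benign paroxysmal vertigo")
    if temporal_pattern == "single_prolonged_attack":
        return ("consider labyrinthitis" if hearing_impairment
                else "consider vestibular neuritis")
    if temporal_pattern == "chronic_persistent":
        return "consider somatoform vertigo or bilateral vestibular failure"
    return "unclassified - complete clinical and instrumental vestibular examination"

print(suspected_category("recurrent_episodes", hearing_impairment=False))
```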
Guglielmi, Valeria; Bellia, Alfonso; Pecchioli, Serena; Medea, Gerardo; Parretti, Damiano; Lauro, Davide; Sbraccia, Paolo; Federici, Massimo; Cricelli, Iacopo; Cricelli, Claudio; Lapi, Francesco
2016-11-15
There are some inconsistencies in prevalence estimates of familial hypercholesterolemia (FH) in the general population across Europe due to variable application of its diagnostic criteria. We aimed to investigate the epidemiology of FH in Italy by applying the Dutch Lipid Clinic Network (DLCN) score and two alternative diagnostic algorithms to a primary care database. We performed a retrospective population-based study using the Health Search IMS Health Longitudinal Patient Database (HSD), including active patients (alive and currently registered with their general practitioners (GPs)) on December 31, 2014. Cases of FH were identified by applying the DLCN score. Two further algorithms, based on either ICD9CM coding for FH or on selected clinical items adopted by the DLCN, were tested against the DLCN itself as the gold standard. We estimated a prevalence of 0.01% for "definite" and 0.18% for "definite" plus "probable" cases as per the DLCN. Algorithms 1 and 2 yielded an FH prevalence of 0.9% and 0.13%, respectively. Both algorithms showed high specificity against the DLCN (Algorithm 1: 99.10%; Algorithm 2: 99.9%), but Algorithm 2 identified true positives considerably better (sensitivity = 85.90%) than Algorithm 1 (sensitivity = 10.10%). The application of the DLCN or valid diagnostic alternatives in the Italian primary care setting provides estimates of FH prevalence consistent with those reported in other screening studies in Caucasian populations. These diagnostic criteria should therefore be fostered among GPs. Given the new therapeutic options for FH, the epidemiological picture of FH is even more relevant for forecasting costs and planning affordable reimbursement programs in Italy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Diagnostic Utility of the ADI-R and DSM-5 in the Assessment of Latino Children and Adolescents
ERIC Educational Resources Information Center
Magaña, Sandy; Vanegas, Sandra B.
2017-01-01
Latino children in the US are systematically underdiagnosed with Autism Spectrum Disorder (ASD); therefore, it is important that recent changes to the diagnostic process do not exacerbate this pattern of under-identification. Previous research has found that the Autism Diagnostic Interview-Revised (ADI-R) algorithm, based on the Diagnostic and…
Current challenges in diagnostic imaging of venous thromboembolism.
Huisman, Menno V; Klok, Frederikus A
2015-01-01
Because the clinical diagnosis of deep-vein thrombosis and pulmonary embolism is nonspecific, integrated diagnostic approaches for patients with suspected venous thromboembolism have been developed over the years, involving both non-invasive bedside tools (clinical decision rules and D-dimer blood tests) for patients with low pretest probability and diagnostic techniques (compression ultrasound for deep-vein thrombosis and computed tomography pulmonary angiography for pulmonary embolism) for those with a high pretest probability. This combination has led to standardized diagnostic algorithms with proven safety for excluding venous thrombotic disease. At the same time, it has become apparent that, as a result of the natural history of venous thrombosis, there are special patient populations in which the current standard diagnostic algorithms are not sufficient. In this review, we present 3 evidence-based patient cases to underline recent developments in the imaging diagnosis of venous thromboembolism. © 2015 by The American Society of Hematology. All rights reserved.
[Chronic diarrhoea: Definition, classification and diagnosis].
Fernández-Bañares, Fernando; Accarino, Anna; Balboa, Agustín; Domènech, Eugeni; Esteve, Maria; Garcia-Planella, Esther; Guardiola, Jordi; Molero, Xavier; Rodríguez-Luna, Alba; Ruiz-Cerulla, Alexandra; Santos, Javier; Vaquero, Eva
2016-10-01
Chronic diarrhoea is a common presenting symptom in both primary care medicine and in specialized gastroenterology clinics. It is estimated that >5% of the population has chronic diarrhoea and nearly 40% of these patients are older than 60 years. Clinicians often need to select the best diagnostic approach to these patients and choose between the multiple diagnostic tests available. In 2014 the Catalan Society of Gastroenterology formed a working group with the main objective of creating diagnostic algorithms based on clinical practice and to evaluate diagnostic tests and the scientific evidence available for their use. The GRADE system was used to classify scientific evidence and strength of recommendations. The consensus document contains 28 recommendations and 6 diagnostic algorithms. The document also describes criteria for referral from primary to specialized care. Copyright © 2015 Elsevier España, S.L.U. y AEEH y AEG. All rights reserved.
The diagnostic management of upper extremity deep vein thrombosis: A review of the literature.
Kraaijpoel, Noémie; van Es, Nick; Porreca, Ettore; Büller, Harry R; Di Nisio, Marcello
2017-08-01
Upper extremity deep vein thrombosis (UEDVT) accounts for 4% to 10% of all cases of deep vein thrombosis. UEDVT may present with localized pain, erythema, and swelling of the arm, but may also be detected incidentally by diagnostic imaging tests performed for other reasons. Prompt and accurate diagnosis is crucial to prevent pulmonary embolism and long-term complications such as the post-thrombotic syndrome of the arm. Unlike the diagnostic management of deep vein thrombosis (DVT) of the lower extremities, which is well established, the work-up of patients with clinically suspected UEDVT remains uncertain, with limited evidence from studies of small size and poor methodological quality. Currently, only one prospective study has evaluated the use of an algorithm, similar to the one used for DVT of the lower extremities, for the diagnostic work-up of clinically suspected UEDVT. The algorithm combined clinical probability assessment, D-dimer testing and ultrasonography, and appeared to safely and effectively exclude UEDVT. However, before its use can be recommended in routine clinical practice, external validation of this strategy and improvements in its efficiency are needed, especially in high-risk subgroups in whom the performance of the algorithm appeared to be suboptimal, such as hospitalized or cancer patients. In this review, we critically assess the accuracy and efficacy of current diagnostic tools and provide clinical guidance for the diagnostic management of clinically suspected UEDVT. Copyright © 2017 Elsevier Ltd. All rights reserved.
Boehnke, Mitchell; Patel, Nayana; McKinney, Kristin; Clark, Toshimasa
The Society of Radiologists in Ultrasound (SRU 2005) and the American Thyroid Association (ATA 2009 and ATA 2015) have published algorithms regarding thyroid nodule management. Kwak et al. and other groups have described models that estimate thyroid nodules' malignancy risk. The aim of our study was to use Kwak's model to evaluate the tradeoffs in sensitivity and specificity of the SRU 2005, ATA 2009 and ATA 2015 management algorithms. A total of 1,000,000 thyroid nodules were modeled in MATLAB. Ultrasound characteristics were modeled after published data. Malignancy risk was estimated per Kwak's model and assigned as a binary variable. All nodules were then assessed using the published management algorithms. With the malignancy variable as condition positivity and the algorithms' recommendation for FNA as test positivity, diagnostic performance was calculated. Modeled nodule characteristics mimic those of Kwak et al.; 12.8% of nodules were assigned as malignant (malignancy risk range of 2.0-98%). FNA was recommended for 41% of nodules by SRU 2005, 66% by ATA 2009, and 82% by ATA 2015. Sensitivity and specificity differed significantly (p < 0.0001): 49% and 60% for SRU; 81% and 36% for ATA 2009; and 95% and 20% for ATA 2015. The SRU 2005, ATA 2009 and ATA 2015 algorithms are used routinely in clinical practice to determine whether thyroid nodule biopsy is indicated. We demonstrate significant differences in these algorithms' diagnostic performance, which result in a compromise between sensitivity and specificity. Copyright © 2017 Elsevier Inc. All rights reserved.
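A structural sketch of this kind of Monte Carlo comparison, written in Python rather than MATLAB; the feature prevalences, the logistic risk model standing in for Kwak et al., and the two FNA-triggering rules are simplified placeholders, not the published SRU/ATA criteria.

```python
# Simulate nodules, assign malignancy via a hypothetical risk model, and compare the
# sensitivity/specificity of two toy biopsy-triggering rules.
import numpy as np

rng = np.random.default_rng(4)
N = 1_000_000

# Simulated ultrasound features (prevalences are illustrative)
size_cm     = rng.lognormal(mean=0.2, sigma=0.5, size=N)
hypoechoic  = rng.random(N) < 0.35
microcalcs  = rng.random(N) < 0.15
taller_wide = rng.random(N) < 0.10
irregular   = rng.random(N) < 0.20

# Hypothetical logistic risk model standing in for Kwak et al.
logit = -3.0 + 1.2*hypoechoic + 1.5*microcalcs + 1.3*taller_wide + 1.0*irregular
malignant = rng.random(N) < 1 / (1 + np.exp(-logit))       # binary ground truth

def performance(fna_recommended):
    tp = np.sum(fna_recommended & malignant)
    fp = np.sum(fna_recommended & ~malignant)
    fn = np.sum(~fna_recommended & malignant)
    tn = np.sum(~fna_recommended & ~malignant)
    return tp / (tp + fn), tn / (tn + fp)

# Two toy "guidelines": a conservative size-driven rule vs. a feature-driven rule
rule_a = size_cm > 1.5
rule_b = (size_cm > 1.0) | microcalcs | taller_wide
for name, rule in [("rule A (size-driven)", rule_a), ("rule B (feature-driven)", rule_b)]:
    sens, spec = performance(rule)
    print(f"{name}: FNA in {rule.mean():.0%}, sensitivity {sens:.0%}, specificity {spec:.0%}")
```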
NASA Astrophysics Data System (ADS)
Chandra, Malavika; Scheiman, James; Simeone, Diane; McKenna, Barbara; Purdy, Julianne; Mycek, Mary-Ann
2010-01-01
Pancreatic adenocarcinoma is one of the leading causes of cancer death, in part because of the inability of current diagnostic methods to reliably detect early-stage disease. We present the first assessment of the diagnostic accuracy of algorithms developed for pancreatic tissue classification using data from fiber optic probe-based bimodal optical spectroscopy, a real-time approach that would be compatible with minimally invasive diagnostic procedures for early cancer detection in the pancreas. A total of 96 fluorescence and 96 reflectance spectra are considered from 50 freshly excised tissue sites, including human pancreatic adenocarcinoma, chronic pancreatitis (inflammation), and normal tissues, in nine patients. Classification algorithms using linear discriminant analysis are developed to distinguish among tissues, and leave-one-out cross-validation is employed to assess the classifiers' performance. The spectral areas and ratios classifier (SpARC) algorithm employs a combination of reflectance and fluorescence data and has the best performance, with sensitivity, specificity, negative predictive value, and positive predictive value for correctly identifying adenocarcinoma being 85, 89, 92, and 80%, respectively.
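A minimal sketch of the classification framework described (linear discriminant analysis with leave-one-out cross-validation), run on synthetic three-class data in place of the extracted reflectance and fluorescence parameters; class separations and sample counts are invented.

```python
# LDA with leave-one-out cross-validation on synthetic "spectral parameter" data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(5)
classes = {"normal": 0.0, "pancreatitis": 0.8, "adenocarcinoma": 1.6}   # class mean offsets
X = np.vstack([rng.normal(mu, 1.0, (17, 4)) for mu in classes.values()])
y = np.repeat(list(classes), 17)

pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print(confusion_matrix(y, pred, labels=list(classes)))

# Sensitivity / specificity for adenocarcinoma vs. the rest
is_ca, called_ca = (y == "adenocarcinoma"), (pred == "adenocarcinoma")
print("sensitivity:", np.mean(called_ca[is_ca]))
print("specificity:", np.mean(~called_ca[~is_ca]))
```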
Implications of the Value of Hydrologic Information to Reservoir Operations--Learning from the Past
ERIC Educational Resources Information Center
Hejazi, Mohamad Issa
2009-01-01
Closing the gap between theoretical reservoir operation and the real-world implementation remains a challenge in contemporary reservoir operations. Past research has focused on optimization algorithms and establishing optimal policies for reservoir operations. In this research, we attempt to understand operators' release decisions by investigating…
[Characteristics of pain syndrome in patients with upper limbs occupational polyneuropathies].
Kochetova, O A; Mal'kova, N Yu
2015-01-01
Pain syndrome accompanies various diseases of the central and peripheral nervous system and is one of the most important problems in contemporary neurology. Many scientists are searching for effective diagnostic and therapeutic tools. The article covers the characteristics of the pain syndrome and its mechanisms in patients with occupational polyneuropathies of the upper limbs.
Prevalence and Phenotype of Childhood Apraxia of Speech in Youth with Galactosemia
ERIC Educational Resources Information Center
Shriberg, Lawrence D.; Potter, Nancy L.; Strand, Edythe A.
2011-01-01
Purpose: In this article, the authors address the hypothesis that the severe and persistent speech disorder reported in persons with galactosemia meets contemporary diagnostic criteria for Childhood Apraxia of Speech (CAS). A positive finding for CAS in this rare metabolic disorder has the potential to impact treatment of persons with galactosemia…
Wormanns, D
2016-09-01
Pulmonary nodules are the most frequent pathological finding in low-dose computed tomography (CT) scanning for early detection of lung cancer. Early stages of lung cancer are often manifested as pulmonary nodules; however, the very commonly occurring small nodules are predominantly benign. These benign nodules are responsible for the high percentage of false positive test results in screening studies. Appropriate diagnostic algorithms are necessary to reduce false positive screening results and to improve the specificity of lung cancer screening. Such algorithms are based on some of the basic principles comprehensively described in this article. Firstly, the diameter of nodules allows a differentiation between large (>8 mm) probably malignant and small (<8 mm) probably benign nodules. Secondly, some morphological features of pulmonary nodules in CT can prove their benign nature. Thirdly, growth of small nodules is the best non-invasive predictor of malignancy and is utilized as a trigger for further diagnostic work-up. Non-invasive testing using positron emission tomography (PET) and contrast enhancement as well as invasive diagnostic tests (e.g. various procedures for cytological and histological diagnostics) are briefly described in this article. Different nodule morphology using CT (e.g. solid and semisolid nodules) is associated with different biological behavior and different algorithms for follow-up are required. Currently, no obligatory algorithm is available in German-speaking countries for the management of pulmonary nodules, which reflects the current state of knowledge. The main features of some international and American recommendations are briefly presented in this article from which conclusions for the daily clinical use are derived.
Cost-effective Diagnostic Checklists for Meningitis in Resource Limited Settings
Durski, Kara N.; Kuntz, Karen M.; Yasukawa, Kosuke; Virnig, Beth A.; Meya, David B.; Boulware, David R.
2013-01-01
Background Checklists can standardize patient care, reduce errors, and improve health outcomes. For meningitis in resource-limited settings, with high patient loads and limited financial resources, CNS diagnostic algorithms may be useful to guide diagnosis and treatment. However, the cost-effectiveness of such algorithms is unknown. Methods We used decision analysis methodology to evaluate the costs, diagnostic yield, and cost-effectiveness of diagnostic strategies for adults with suspected meningitis in resource limited settings with moderate/high HIV prevalence. We considered three strategies: 1) comprehensive “shotgun” approach of utilizing all routine tests; 2) “stepwise” strategy with tests performed in a specific order with additional TB diagnostics; 3) “minimalist” strategy of sequential ordering of high-yield tests only. Each strategy resulted in one of four meningitis diagnoses: bacterial (4%), cryptococcal (59%), TB (8%), or other (aseptic) meningitis (29%). In model development, we utilized prevalence data from two Ugandan sites and published data on test performance. We validated the strategies with data from Malawi, South Africa, and Zimbabwe. Results The current comprehensive testing strategy resulted in 93.3% correct meningitis diagnoses costing $32.00/patient. A stepwise strategy had 93.8% correct diagnoses costing an average of $9.72/patient, and a minimalist strategy had 91.1% correct diagnoses costing an average of $6.17/patient. The incremental cost effectiveness ratio was $133 per additional correct diagnosis for the stepwise over minimalist strategy. Conclusions Through strategically choosing the order and type of testing coupled with disease prevalence rates, algorithms can deliver more care more efficiently. The algorithms presented herein are generalizable to East Africa and Southern Africa. PMID:23466647
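The incremental cost-effectiveness ratio reported above can be reproduced directly from the per-patient costs and diagnostic yields quoted in the abstract, as the short calculation below shows.

```python
# Worked incremental cost-effectiveness calculation using the abstract's figures.
strategies = {                      # (cost per patient in US$, proportion correctly diagnosed)
    "minimalist":    (6.17, 0.911),
    "stepwise":      (9.72, 0.938),
    "comprehensive": (32.00, 0.933),   # dominated: costlier and slightly less accurate than stepwise
}

c_min, e_min = strategies["minimalist"]
c_step, e_step = strategies["stepwise"]

# ICER = incremental cost / incremental effectiveness
icer = (c_step - c_min) / (e_step - e_min)
print(f"ICER (stepwise vs minimalist): ${icer:.0f} per additional correct diagnosis")
# -> roughly $131 per additional correct diagnosis, close to the ~$133 reported;
#    the small difference comes from rounding of the published inputs.
```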
A utility/cost analysis of breast cancer risk prediction algorithms
NASA Astrophysics Data System (ADS)
Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.
2016-03-01
Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method, under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.
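A sketch of the expected-cost bookkeeping under the same independence assumption stated above; the screening operating points and all outcome costs are illustrative placeholders, not the published model's values.

```python
# Expected per-woman cost when the risk model routes "high-risk" women to enhanced screening.
def expected_cost(prevalence, hit_rate, false_alarm_rate, std, enh, costs):
    """std/enh = (sensitivity, specificity, scan cost); costs = dict of diagnostic-outcome costs.
    Assumes screening performance is independent of predicted risk."""
    def arm(sens, spec, scan, p_cancer, weight):
        tp, fn = p_cancer * sens, p_cancer * (1 - sens)
        fp, tn = (1 - p_cancer) * (1 - spec), (1 - p_cancer) * spec
        outcome = (tp * costs["TP"] + fn * costs["FN"]
                   + fp * costs["FP"] + tn * costs["TN"])
        return weight * (scan + outcome)

    # Fraction flagged high-risk and cancer prevalence within each arm (Bayes' rule)
    p_flag = prevalence * hit_rate + (1 - prevalence) * false_alarm_rate
    p_cancer_flag = prevalence * hit_rate / p_flag
    p_cancer_rest = prevalence * (1 - hit_rate) / (1 - p_flag)
    return (arm(*enh, p_cancer_flag, p_flag)
            + arm(*std, p_cancer_rest, 1 - p_flag))

costs = {"TP": 15_000, "FN": 60_000, "FP": 800, "TN": 0}   # placeholder outcome costs
std = (0.80, 0.90, 120)    # FFDM: sensitivity, specificity, scan cost (placeholders)
enh = (0.95, 0.80, 900)    # enhanced (MRI/US) screen (placeholders)

for far in (0.05, 0.20, 0.50):        # sweep the risk model's false-alarm rate
    print(far, round(expected_cost(0.005, 0.7, far, std, enh, costs), 2))
```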
Urine trouble: should we think differently about UTI?
Price, Travis K; Hilt, Evann E; Dune, Tanaka J; Mueller, Elizabeth R; Wolfe, Alan J; Brubaker, Linda
2018-02-01
Urinary tract infection (UTI) is clinically important, given that it is one of the most common bacterial infections in adult women. However, the current understanding of UTI remains based on a now disproven concept that the urinary bladder is sterile. Thus, current standards for UTI diagnosis have significant limitations that may reduce the opportunity to improve patient care. Using data from our work and numerous other peer-reviewed studies, we identified four major limitations to the contemporary UTI description: the language of UTI, UTI diagnostic testing, the Escherichia coli-centric view of UTI, and the colony-forming units (CFU) threshold-based diagnosis. Contemporary methods and technology, combined with continued rigorous clinical research can be used to correct these limitations.
Verhaeghe, Paul; Vanheule, Stijn; De Rick, Ann
2007-10-01
Starting from a contemporary critique of the DSM-IV, this paper argues that the diagnostic categories of panic disorder, somatization, and undifferentiated somatoform disorders can be understood as belonging to a common type of psychopathology, namely the Freudian actual neuroses. In addition to their strong clinical similarity, these disorders share an etiological similarity, and the authors propose combining Freud's focus on this type of patient's inability to represent an endogenous drive arousal with the post-Freudian focus on separation anxiety. An etiological hypothesis is put forward based on contemporary psychoanalytic attachment theory, highlighting mentalization. Concrete implications for a psychoanalytically based treatment are proposed.
Romañach, Stephanie; Watling, James I.; Fletcher, Robert J.; Speroterra, Carolina; Bucklin, David N.; Brandt, Laura A.; Pearlstine, Leonard G.; Escribano, Yesenia; Mazzotti, Frank J.
2014-01-01
Climate change poses new challenges for natural resource managers. Predictive modeling of species–environment relationships using climate envelope models can enhance our understanding of climate change effects on biodiversity, assist in assessment of invasion risk by exotic organisms, and inform life-history understanding of individual species. While increasing interest has focused on the role of uncertainty in future conditions on model predictions, models also may be sensitive to the initial conditions on which they are trained. Although climate envelope models are usually trained using data on contemporary climate, we lack systematic comparisons of model performance and predictions across alternative climate data sets available for model training. Here, we seek to fill that gap by comparing variability in predictions between two contemporary climate data sets to variability in spatial predictions among three alternative projections of future climate. Overall, correlations between monthly temperature and precipitation variables were very high for both contemporary and future data. Model performance varied across algorithms, but not between two alternative contemporary climate data sets. Spatial predictions varied more among alternative general-circulation models describing future climate conditions than between contemporary climate data sets. However, we did find that climate envelope models with low Cohen's kappa scores made more discrepant spatial predictions between climate data sets for the contemporary period than did models with high Cohen's kappa scores. We suggest conservation planners evaluate multiple performance metrics and be aware of the importance of differences in initial conditions for spatial predictions from climate envelope models.
Prevalence of rosacea in the general population of Germany and Russia - The RISE study.
Tan, J; Schöfer, H; Araviiskaia, E; Audibert, F; Kerrouche, N; Berg, M
2016-03-01
There is an unmet need for general population-based epidemiological data on rosacea based on contemporary diagnostic criteria and validated population survey methodology. To evaluate the prevalence of rosacea in the general population of Germany and Russia. General population screening was conducted in 9-10 cities per country to ensure adequate geographic representation. In Part I of this two-phase study, screening of a representative sample of the general population (every fifth person or every fifth door using a fixed-step procedure on a random route sample) was expedited with use of a questionnaire and algorithm based on current diagnostic criteria for rosacea. Of the subjects that screened positive in the initial phase, a randomly selected sample (every third subject) then underwent diagnostic confirmation by a dermatologist in Part II. A total of 3052 and 3013 subjects (aged 18-65 years) were screened in Germany and Russia respectively. Rosacea prevalence was 12.3% [95%CI, 10.2-14.4] in Germany and 5.0% [95%CI, 2.8-7.2] in Russia. The profile of subjects with rosacea (75% women; mean age of 40 years; mainly skin phototype II or III, majority of subjects with sensitive facial skin) and subtype distribution were similar. Overall, 18% of subjects diagnosed with rosacea were aged 18-30 years. Over 80% were not previously diagnosed. Within the previous year, 47.5% of subjects had received no rosacea care and 23.7% had received topical and/or systemic drugs. Over one-third (35% Germany, 43% Russia) of rosacea subjects reported a moderate to severe adverse impact on quality of life. Rosacea is highly prevalent in Germany (12.3%) and Russia (5.0%). The demographic profile of rosacea subjects was similar between countries and the majority were previously undiagnosed. © 2016 The Authors. Journal of the European Academy of Dermatology and Venereology published by John Wiley & Sons Ltd on behalf of European Academy of Dermatology and Venereology.
Mohamed, Abdallah S. R.; Ruangskul, Manee-Naad; Awan, Musaddiq J.; Baron, Charles A.; Kalpathy-Cramer, Jayashree; Castillo, Richard; Castillo, Edward; Guerrero, Thomas M.; Kocak-Uzel, Esengul; Yang, Jinzhong; Court, Laurence E.; Kantor, Michael E.; Gunn, G. Brandon; Colen, Rivka R.; Frank, Steven J.; Garden, Adam S.; Rosenthal, David I.
2015-01-01
Purpose To develop a quality assurance (QA) workflow by using a robust, curated, manually segmented anatomic region-of-interest (ROI) library as a benchmark for quantitative assessment of different image registration techniques used for head and neck radiation therapy–simulation computed tomography (CT) with diagnostic CT coregistration. Materials and Methods Radiation therapy–simulation CT images and diagnostic CT images in 20 patients with head and neck squamous cell carcinoma treated with curative-intent intensity-modulated radiation therapy between August 2011 and May 2012 were retrospectively retrieved with institutional review board approval. Sixty-eight reference anatomic ROIs with gross tumor and nodal targets were then manually contoured on images from each examination. Diagnostic CT images were registered with simulation CT images rigidly and by using four deformable image registration (DIR) algorithms: atlas based, B-spline, demons, and optical flow. The resultant deformed ROIs were compared with manually contoured reference ROIs by using similarity coefficient metrics (ie, Dice similarity coefficient) and surface distance metrics (ie, 95% maximum Hausdorff distance). The nonparametric Steel test with control was used to compare different DIR algorithms with rigid image registration (RIR) by using the post hoc Wilcoxon signed-rank test for stratified metric comparison. Results A total of 2720 anatomic and 50 tumor and nodal ROIs were delineated. All DIR algorithms showed improved performance over RIR for anatomic and target ROI conformance, as shown for most comparison metrics (Steel test, P < .008 after Bonferroni correction). The performance of different algorithms varied substantially with stratification by specific anatomic structures or category and simulation CT section thickness. Conclusion Development of a formal ROI-based QA workflow for registration assessment demonstrated improved performance with DIR techniques over RIR. After QA, DIR implementation should be the standard for head and neck diagnostic CT and simulation CT alignment, especially for target delineation. © RSNA, 2014 Online supplemental material is available for this article. PMID:25380454
van't Hoog, Anna H; Cobelens, Frank; Vassall, Anna; van Kampen, Sanne; Dorman, Susan E; Alland, David; Ellner, Jerrold
2013-01-01
High costs are a limitation to scaling up the Xpert MTB/RIF assay (Xpert) for the diagnosis of tuberculosis in resource-constrained settings. A triaging strategy in which a sensitive but not necessarily highly specific rapid test is used to select patients for Xpert may result in a more affordable diagnostic algorithm. To inform the selection and development of particular diagnostics as a triage test we explored combinations of sensitivity, specificity and cost at which a hypothetical triage test will improve affordability of the Xpert assay. In a decision analytical model parameterized for Uganda, India and South Africa, we compared a diagnostic algorithm in which a cohort of patients with presumptive TB received Xpert to a triage algorithm whereby only those with a positive triage test were tested by Xpert. A triage test with sensitivity equal to Xpert, 75% specificity, and costs of US$5 per patient tested reduced total diagnostic costs by 42% in the Uganda setting, and by 34% and 39% respectively in the India and South Africa settings. When exploring triage algorithms with lower sensitivity, the use of an example triage test with 95% sensitivity relative to Xpert, 75% specificity and test costs $5 resulted in similar cost reduction, and was cost-effective by the WHO willingness-to-pay threshold compared to Xpert for all in Uganda, but not in India and South Africa. The gain in affordability of the examined triage algorithms increased with decreasing prevalence of tuberculosis among the cohort. A triage test strategy could potentially improve the affordability of Xpert for TB diagnosis, particularly in low-income countries and with enhanced case-finding. Tests and markers with lower accuracy than desired of a diagnostic test may fall within the ranges of sensitivity, specificity and cost required for triage tests and be developed as such.
ERIC Educational Resources Information Center
Kamp-Becker, Inge; Ghahreman, Mardjan; Heinzel-Gutenbrunner, Monika; Peters, Mira; Remschmidt, Helmut; Becker, Katja
2013-01-01
The Autism Diagnostic Observation Schedule (ADOS) is a semi-structured, standardized assessment designed for use in diagnostic evaluation of individuals with suspected autism spectrum disorder (ASD). The ADOS has been effective in categorizing children who definitely have autism or not, but has lower specificity and sometimes sensitivity for…
Teaching Robotics Software with the Open Hardware Mobile Manipulator
ERIC Educational Resources Information Center
Vona, M.; Shekar, N. H.
2013-01-01
The "open hardware mobile manipulator" (OHMM) is a new open platform with a unique combination of features for teaching robotics software and algorithms. On-board low- and high-level processors support real-time embedded programming and motor control, as well as higher-level coding with contemporary libraries. Full hardware designs and…
Alici, Ibrahim Onur; Yılmaz Demirci, Nilgün; Yılmaz, Aydın; Karakaya, Jale; Özaydın, Esra
2016-09-01
Several papers describe the sonographic features of mediastinal lymph nodes affected by various diseases, but none assesses the relative importance and clinical utility of these features. To determine which lymph node should be sampled in a particular nodal station during endobronchial ultrasound, we investigated the diagnostic performance of selected sonographic features and proposed an algorithmic approach. We retrospectively analyzed 1051 lymph nodes and randomly assigned them to a preliminary experimental group and a secondary study group. The diagnostic performance of the sonographic features (gray scale, echogenicity, shape, size, margin, presence of necrosis, presence of calcification and absence of a central hilar structure) was calculated, and an algorithm for lymph node sampling was derived by decision tree analysis in the experimental group. A modified algorithm was then applied to the patients in the study group to estimate its accuracy. Demographic characteristics did not differ significantly between the two groups. All of the features discriminated between malignant and benign disease. The modified algorithm's sensitivity, specificity, positive and negative predictive values, and diagnostic accuracy for detecting metastatic lymph nodes were 100%, 51.2%, 50.6%, 100% and 67.5%, respectively. In this retrospective analysis, the standardized sonographic classification system and the proposed algorithm performed well in choosing the node that should be sampled in a particular station during endobronchial ultrasound. © 2015 John Wiley & Sons Ltd.
Abraham, N S; Cohen, D C; Rivers, B; Richardson, P
2006-07-15
To validate Veterans Affairs (VA) administrative data for the diagnosis of nonsteroidal anti-inflammatory drug (NSAID)-related upper gastrointestinal events (UGIE) and to develop a diagnostic algorithm. A retrospective study of veterans prescribed an NSAID, as identified from the national pharmacy database merged with in-patient and out-patient data, followed by primary chart abstraction. Contingency tables were constructed to allow comparison with a random sample of patients prescribed an NSAID, but without UGIE. Multivariable logistic regression analysis was used to derive a predictive algorithm. Once derived, the algorithm was validated in a separate cohort of veterans. Of 906 patients, 606 had a diagnostic code for UGIE; 300 were a random subsample of 11 744 patients (control). Only 161 had a confirmed UGIE. The positive predictive value (PPV) of diagnostic codes was poor, but improved from 27% to 51% with the addition of endoscopic procedural codes. The strongest predictors of UGIE were an in-patient ICD-9 code for gastric ulcer, duodenal ulcer and haemorrhage combined with upper endoscopy. This algorithm had a PPV of 73% when limited to patients ≥65 years (c-statistic 0.79). Validation of the algorithm revealed a PPV of 80% among patients with an overlapping NSAID prescription. NSAID-related UGIE can be assessed using VA administrative data. The optimal algorithm includes an in-patient ICD-9 code for gastric or duodenal ulcer and gastrointestinal bleeding combined with a procedural code for upper endoscopy.
Yakhelef, N; Audibert, M; Varaine, F; Chakaya, J; Sitienei, J; Huerga, H; Bonnet, M
2014-05-01
In 2007, the World Health Organization recommended introducing rapid Mycobacterium tuberculosis culture into the diagnostic algorithm of smear-negative pulmonary tuberculosis (TB). To assess the cost-effectiveness of introducing a rapid non-commercial culture method (thin-layer agar), together with Löwenstein-Jensen culture to diagnose smear-negative TB at a district hospital in Kenya. Outcomes (number of true TB cases treated) were obtained from a prospective study evaluating the effectiveness of a clinical and radiological algorithm (conventional) against the alternative algorithm (conventional plus M. tuberculosis culture) in 380 smear-negative TB suspects. The costs of implementing each algorithm were calculated using a 'micro-costing' or 'ingredient-based' method. We then compared the cost and effectiveness of conventional vs. culture-based algorithms and estimated the incremental cost-effectiveness ratio. The costs of conventional and culture-based algorithms per smear-negative TB suspect were respectively €39.5 and €144. The costs per confirmed and treated TB case were respectively €452 and €913. The culture-based algorithm led to diagnosis and treatment of 27 more cases for an additional cost of €1477 per case. Despite the increase in patients started on treatment thanks to culture, the relatively high cost of a culture-based algorithm will make it difficult for resource-limited countries to afford.
ERIC Educational Resources Information Center
Treloar, Amanda Jane Commons; Lewis, Andrew J.
2009-01-01
This paper reviews the history of the recognition of borderline personality disorder as a clinical disorder, followed by a review of the contemporary practice of diagnosing borderline personality disorder in psychiatric settings. Many researchers have cautioned against the conflation of difficult patients with the diagnostic category of borderline…
NASA Astrophysics Data System (ADS)
Li, Shaoxin; Li, Linfang; Zeng, Qiuyao; Zhang, Yanjiao; Guo, Zhouyi; Liu, Zhiming; Jin, Mei; Su, Chengkang; Lin, Lin; Xu, Junfa; Liu, Songhao
2015-05-01
This study aims to characterize and classify serum surface-enhanced Raman spectroscopy (SERS) spectra from bladder cancer patients and normal volunteers using genetic algorithms (GAs) combined with linear discriminant analysis (LDA). Two groups of serum SERS spectra excited with nanoparticles were collected from healthy volunteers (n = 36) and bladder cancer patients (n = 55). Six diagnostic Raman bands in the regions of 481-486, 682-687, 1018-1034, 1313-1323, 1450-1459 and 1582-1587 cm-1, related to proteins, nucleic acids and lipids, were selected with the GA-LDA approach. Diagnostic models built with the six identified Raman bands achieved an improved diagnostic sensitivity of 90.9% and specificity of 100% for distinguishing bladder cancer patients from normal subjects. These results are superior to the sensitivity of 74.6% and specificity of 97.2% obtained with principal component analysis on the same serum SERS dataset. Receiver operating characteristic (ROC) curves further confirmed the efficiency of the diagnostic algorithm based on the GA-LDA technique. This exploratory work demonstrates that serum SERS combined with the GA-LDA technique has great potential for characterizing and non-invasively detecting bladder cancer through peripheral blood.
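A minimal sketch of GA-driven band selection wrapped around LDA, on synthetic spectra; the population size, genetic operators, and placement of informative bands are toy choices, not the paper's settings.

```python
# Genetic algorithm selecting spectral bands; fitness = cross-validated LDA accuracy.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_bands = 40
y = np.array([0] * 36 + [1] * 55)                        # normal vs. cancer labels
X = rng.normal(0, 1, (91, n_bands))                      # synthetic "spectra"
X[y == 1, 5] += 1.0; X[y == 1, 17] += 0.8; X[y == 1, 30] += 0.9   # informative bands

def fitness(mask):
    """Cross-validated LDA accuracy using only the selected bands."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(LinearDiscriminantAnalysis(), X[:, mask], y, cv=5).mean()

pop = rng.random((20, n_bands)) < 0.2                    # initial random band subsets
for _ in range(30):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]         # selection: keep the best half
    pairs = rng.integers(0, 10, (10, 2))                 # one-point crossover of parent pairs
    cuts = rng.integers(1, n_bands, 10)
    children = np.array([np.concatenate([parents[i][:c], parents[j][c:]])
                         for (i, j), c in zip(pairs, cuts)])
    children ^= rng.random(children.shape) < 0.05        # mutation: flip bands with p = 0.05
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected band indices:", np.where(best)[0])
print("cross-validated accuracy:", round(float(fitness(best)), 3))
```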
Conwell, Darwin L; Lee, Linda S; Yadav, Dhiraj; Longnecker, Daniel S; Miller, Frank H; Mortele, Koenraad J; Levy, Michael J; Kwon, Richard; Lieb, John G; Stevens, Tyler; Toskes, Phillip P; Gardner, Timothy B; Gelrud, Andres; Wu, Bechien U; Forsmark, Christopher E; Vege, Santhi S
2014-11-01
The diagnosis of chronic pancreatitis remains challenging in early stages of the disease. This report defines the diagnostic criteria useful in the assessment of patients with suspected and established chronic pancreatitis. All current diagnostic procedures are reviewed, and evidence-based statements are provided about their utility and limitations. Diagnostic criteria for chronic pancreatitis are classified as definitive, probable, or insufficient evidence. A diagnostic (STEP-wise; survey, tomography, endoscopy, and pancreas function testing) algorithm is proposed that proceeds from a noninvasive to a more invasive approach. This algorithm maximizes specificity (low false-positive rate) in subjects with chronic abdominal pain and equivocal imaging changes. Furthermore, a nomenclature is suggested to further characterize patients with established chronic pancreatitis based on TIGAR-O (toxic, idiopathic, genetic, autoimmune, recurrent, and obstructive) etiology, gland morphology (Cambridge criteria), and physiologic state (exocrine, endocrine function) for uniformity across future multicenter research collaborations. This guideline will serve as a baseline manuscript that will be modified as new evidence becomes available and our knowledge of chronic pancreatitis improves.
Teh, Seng Khoon; Zheng, Wei; Lau, David P; Huang, Zhiwei
2009-06-01
In this work, we evaluated the diagnostic ability of near-infrared (NIR) Raman spectroscopy associated with an ensemble recursive partitioning algorithm based on random forests for identifying cancer from normal tissue in the larynx. A rapid-acquisition NIR Raman system was utilized for tissue Raman measurements at 785 nm excitation, and 50 human laryngeal tissue specimens (20 normal; 30 malignant tumors) were used for the NIR Raman studies. The random forests method was introduced to develop effective diagnostic algorithms for classification of Raman spectra of different laryngeal tissues. High-quality Raman spectra in the range of 800-1800 cm(-1) could be acquired from laryngeal tissue within 5 seconds. Raman spectra differed significantly between normal and malignant laryngeal tissues. Classification results obtained from the random forests algorithm on tissue Raman spectra yielded a diagnostic sensitivity of 88.0% and specificity of 91.4% for laryngeal malignancy identification. The random forests technique also provided variable importance measures that facilitate correlation of significant Raman spectral features with cancerous transformation. This study shows that NIR Raman spectroscopy in conjunction with the random forests algorithm has great potential for the rapid diagnosis and detection of malignant tumors in the larynx.
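A brief sketch of the random-forests step on synthetic Raman "spectra", including the variable-importance output used to relate spectral bands to malignancy; sample sizes mirror the study (20 normal, 30 malignant) but the spectra, informative bands, and forest settings are invented.

```python
# Random forest classification of synthetic Raman spectra with feature importances.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(7)
wavenumbers = np.arange(800, 1800, 10)                  # 100 synthetic Raman "bands"
y = np.array([0] * 20 + [1] * 30)                       # 20 normal, 30 malignant
X = rng.normal(0, 1, (50, wavenumbers.size))
X[y == 1, 20] += 1.2                                    # ~1000 cm^-1 band differs (toy)
X[y == 1, 65] += 1.0                                    # ~1450 cm^-1 band differs (toy)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
pred = cross_val_predict(rf, X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"sensitivity {tp/(tp+fn):.0%}, specificity {tn/(tn+fp):.0%}")

rf.fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]    # most important spectral bands
print("most informative bands (cm^-1):", wavenumbers[top])
```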
Development of PET projection data correction algorithm
NASA Astrophysics Data System (ADS)
Bazhanov, P. V.; Kotina, E. D.
2017-12-01
Positron emission tomography is a modern nuclear medicine method used to examine metabolism and the function of internal organs, and it allows diseases to be diagnosed at early stages. Mathematical algorithms are widely used not only for image reconstruction but also for PET data correction. In this paper, the implementation of random coincidence and scatter correction algorithms is considered, as well as an algorithm for modeling PET projection data acquisition used to verify the corrections.
Raschke, R A; Gallo, T; Curry, S C; Whiting, T; Padilla-Jones, A; Warkentin, T E; Puri, A
2017-08-01
Essentials We previously published a diagnostic algorithm for heparin-induced thrombocytopenia (HIT). In this study, we validated the algorithm in an independent large healthcare system. The accuracy was 98%, sensitivity 82% and specificity 99%. The algorithm has potential to improve accuracy and efficiency in the diagnosis of HIT. Background Heparin-induced thrombocytopenia (HIT) is a life-threatening drug reaction caused by antiplatelet factor 4/heparin (anti-PF4/H) antibodies. Commercial tests to detect these antibodies have suboptimal operating characteristics. We previously developed a diagnostic algorithm for HIT that incorporated 'four Ts' (4Ts) scoring and a stratified interpretation of an anti-PF4/H enzyme-linked immunosorbent assay (ELISA) and yielded a discriminant accuracy of 0.97 (95% confidence interval [CI], 0.93-1.00). Objectives The purpose of this study was to validate the algorithm in an independent patient population and quantitate effects that algorithm adherence could have on clinical care. Methods A retrospective cohort comprised patients who had undergone anti-PF4/H ELISA and serotonin release assay (SRA) testing in our healthcare system from 2010 to 2014. We determined the algorithm recommendation for each patient, compared recommendations with the clinical care received, and enumerated consequences of discrepancies. Operating characteristics were calculated for algorithm recommendations using SRA as the reference standard. Results Analysis was performed on 181 patients, 10 of whom were ruled in for HIT. The algorithm accurately stratified 98% of patients (95% CI, 95-99%), ruling out HIT in 158, ruling in HIT in 10 and recommending an SRA in 13 patients. Algorithm adherence would have obviated 165 SRAs and prevented 30 courses of unnecessary antithrombotic therapy for HIT. Diagnostic sensitivity was 0.82 (95% CI, 0.48-0.98), specificity 0.99 (95% CI, 0.97-1.00), PPV 0.90 (95% CI, 0.56-0.99) and NPV 0.99 (95% CI, 0.96-1.00). Conclusions An algorithm incorporating 4Ts scoring and a stratified interpretation of the anti-PF4/H ELISA has good operating characteristics and the potential to improve management of suspected HIT patients. © 2017 International Society on Thrombosis and Haemostasis.
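A schematic of the algorithm's overall shape only: a 4Ts pretest score combined with a stratified interpretation of the ELISA optical density, escalating to an SRA when indeterminate. The numeric OD cut-offs and the exact branch structure below are placeholders for illustration, not the published thresholds, and nothing here should be used clinically.

```python
# Illustrative stratified HIT work-up logic (placeholder thresholds, not the validated algorithm).
def hit_recommendation(four_ts: int, elisa_od: float) -> str:
    if four_ts <= 3 and elisa_od < 0.4:      # low pretest score + negative ELISA (placeholder OD)
        return "HIT ruled out"
    if elisa_od >= 2.0:                      # strongly positive ELISA (placeholder OD)
        return "HIT ruled in - manage as HIT"
    if elisa_od < 0.4:
        return "HIT ruled out"
    return "indeterminate - send serotonin release assay (SRA)"

for ts, od in [(2, 0.2), (5, 2.4), (6, 0.9)]:
    print(f"4Ts={ts}, OD={od}: {hit_recommendation(ts, od)}")
```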
Climate science in the tropics: waves, vortices and PDEs
NASA Astrophysics Data System (ADS)
Khouider, Boualem; Majda, Andrew J.; Stechmann, Samuel N.
2013-01-01
Clouds in the tropics can organize the circulation on planetary scales and profoundly impact long range seasonal forecasting and climate on the entire globe, yet contemporary operational computer models are often deficient in representing these phenomena. On the other hand, contemporary observations reveal remarkably complex coherent waves and vortices in the tropics interacting across a bewildering range of scales from kilometers to ten thousand kilometers. This paper reviews the interdisciplinary contributions over the last decade through the modus operandi of applied mathematics to these important scientific problems. Novel physical phenomena, new multiscale equations, novel PDEs, and numerical algorithms are presented here with the goal of attracting mathematicians and physicists to this exciting research area.
Validation of Living Donor Nephrectomy Codes
Lam, Ngan N.; Lentine, Krista L.; Klarenbach, Scott; Sood, Manish M.; Kuwornu, Paul J.; Naylor, Kyla L.; Knoll, Gregory A.; Kim, S. Joseph; Young, Ann; Garg, Amit X.
2018-01-01
Background: Use of administrative data for outcomes assessment in living kidney donors is increasing given the rarity of complications and challenges with loss to follow-up. Objective: To assess the validity of living donor nephrectomy in health care administrative databases compared with the reference standard of manual chart review. Design: Retrospective cohort study. Setting: 5 major transplant centers in Ontario, Canada. Patients: Living kidney donors between 2003 and 2010. Measurements: Sensitivity and positive predictive value (PPV). Methods: Using administrative databases, we conducted a retrospective study to determine the validity of diagnostic and procedural codes for living donor nephrectomies. The reference standard was living donor nephrectomies identified through the province’s tissue and organ procurement agency, with verification by manual chart review. Operating characteristics (sensitivity and PPV) of various algorithms using diagnostic, procedural, and physician billing codes were calculated. Results: During the study period, there were a total of 1199 living donor nephrectomies. Overall, the best algorithm for identifying living kidney donors was the presence of 1 diagnostic code for kidney donor (ICD-10 Z52.4) and 1 procedural code for kidney procurement/excision (1PC58, 1PC89, 1PC91). Compared with the reference standard, this algorithm had a sensitivity of 97% and a PPV of 90%. The diagnostic and procedural codes performed better than the physician billing codes (sensitivity 60%, PPV 78%). Limitations: The donor chart review and validation study was performed in Ontario and may not be generalizable to other regions. Conclusions: An algorithm consisting of 1 diagnostic and 1 procedural code can be reliably used to conduct health services research that requires the accurate determination of living kidney donors at the population level. PMID:29662679
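A small sketch of how the best-performing algorithm above (one ICD-10 code Z52.4 plus one of the kidney procurement/excision procedure codes) might be applied to administrative records, with sensitivity and PPV computed against a chart-review reference standard; the record layout and the toy records are hypothetical.

```python
# Apply a code-based case-finding algorithm and validate it against a reference standard.
DONOR_DX = {"Z52.4"}                      # ICD-10 kidney donor diagnosis code (from the study)
DONOR_PX = {"1PC58", "1PC89", "1PC91"}    # kidney procurement/excision procedure codes

def flagged_as_donor(record: dict) -> bool:
    """record = {'dx': set of diagnosis codes, 'px': set of procedure codes} (hypothetical layout)."""
    return bool(DONOR_DX & record["dx"]) and bool(DONOR_PX & record["px"])

def sensitivity_ppv(records, reference):
    """reference = set of record ids confirmed as donors by chart review."""
    flagged = {rid for rid, rec in records.items() if flagged_as_donor(rec)}
    tp = len(flagged & reference)
    sensitivity = tp / len(reference)
    ppv = tp / len(flagged) if flagged else float("nan")
    return sensitivity, ppv

records = {
    1: {"dx": {"Z52.4"}, "px": {"1PC58"}},     # true donor, correctly flagged
    2: {"dx": {"Z52.4"}, "px": set()},         # donor code without a procedure code: not flagged
    3: {"dx": {"N18.3"}, "px": {"1PC91"}},     # nephrectomy for another indication
}
print(sensitivity_ppv(records, reference={1, 2}))
```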
NASA Astrophysics Data System (ADS)
Kovalev, I. A.; Rakovskii, V. G.; Isakov, N. Yu.; Sandovskii, A. V.
2016-03-01
The results of work on the development and improvement of techniques, algorithms, and software and hardware for continuously operating systems that diagnose the state of rotating units and parts of turbine equipment are presented. In particular, to provide full remote service of monitored turbine equipment using web technologies, a web version of the software for the automated systems of vibration-based diagnostics (ASVD VIDAS) was developed. Experience with automated analysis of data obtained by ASVD VIDAS forms the basis of a new algorithm for early detection of such dangerous defects as rotor deflection, a crack in the rotor, and strong misalignment of supports. The program-technical complex for monitoring and measuring the deflection of the medium-pressure rotor (PTC), which implements this algorithm, will alert power plant staff to a deflection and indicate its value, making it possible to take timely measures to prevent further growth of the defect. Repeatedly recorded cases of full or partial destruction of the shrouded shelves of rotor blades in the last stages of low-pressure cylinders of steam turbines created the need to develop a version of the automated system of blade diagnostics (ASBD SKALA) for shrouded stages. The processing, analysis, presentation, and backup of data characterizing the mechanical state of the blading are carried out with a newly developed controller of the diagnostics system. As a result of this work, the set of diagnosed parameters determining the operational safety of rotating elements of the equipment was expanded, and new tasks in monitoring the state of turbine units and parts were solved. All the algorithmic solutions and hardware-software implementations mentioned in the article were tested on test benches and applied at several power plants.
Kang, Le; Carter, Randy; Darcy, Kathleen; Kauderer, James; Liao, Shu-Yuan
2013-01-01
In this article we use a latent class model (LCM) with prevalence modeled as a function of covariates to assess diagnostic test accuracy in situations where the true disease status is not observed, but observations on three or more conditionally independent diagnostic tests are available. A fast Monte Carlo EM (MCEM) algorithm with binary (disease) diagnostic data is implemented to estimate parameters of interest; namely, sensitivity, specificity, and prevalence of the disease as a function of covariates. To obtain standard errors for confidence interval construction of estimated parameters, the missing information principle is applied to adjust information matrix estimates. We compare the adjusted information matrix based standard error estimates with the bootstrap standard error estimates both obtained using the fast MCEM algorithm through an extensive Monte Carlo study. Simulation demonstrates that the adjusted information matrix approach estimates the standard error similarly with the bootstrap methods under certain scenarios. The bootstrap percentile intervals have satisfactory coverage probabilities. We then apply the LCM analysis to a real data set of 122 subjects from a Gynecologic Oncology Group (GOG) study of significant cervical lesion (S-CL) diagnosis in women with atypical glandular cells of undetermined significance (AGC) to compare the diagnostic accuracy of a histology-based evaluation, a CA-IX biomarker-based test and a human papillomavirus (HPV) DNA test. PMID:24163493
A simple algorithm for beam profile diagnostics using a thermographic camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katagiri, Ken; Hojo, Satoru; Honma, Toshihiro
2014-03-15
A new algorithm for digital image processing apparatuses is developed to evaluate profiles of high-intensity DC beams from temperature images of irradiated thin foils. Numerical analyses are performed to examine the reliability of the algorithm. To simulate the temperature images acquired by a thermographic camera, temperature distributions are numerically calculated for 20 MeV proton beams with different parameters. Noise in the temperature images which is added by the camera sensor is also simulated to account for its effect. Using the algorithm, beam profiles are evaluated from the simulated temperature images and compared with exact solutions. We find that niobium is an appropriate material for the thin foil used in the diagnostic system. We also confirm that the algorithm is adaptable over a wide beam current range of 0.11–214 μA, even when employing a general-purpose thermographic camera with rather high noise (ΔT_NETD ≃ 0.3 K; NETD: noise equivalent temperature difference).
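The numerical experiment described above can be illustrated with a minimal sketch: a synthetic temperature image of a Gaussian beam footprint is corrupted with camera noise of a given NETD, and a horizontal beam profile is recovered by background subtraction. All parameters (beam size, peak heating, grid) are assumptions for illustration, not the paper's values.

```python
import numpy as np

# Sketch (illustrative parameters): simulate a thermographic image of a
# Gaussian beam heating a thin foil, add camera noise of a given NETD,
# and recover a horizontal beam profile by background subtraction.
rng = np.random.default_rng(0)
nx = ny = 256
x = np.linspace(-20e-3, 20e-3, nx)            # m
y = np.linspace(-20e-3, 20e-3, ny)
X, Y = np.meshgrid(x, y)

sigma_beam = 3e-3                              # assumed beam size (m)
T_ambient, dT_peak = 300.0, 40.0               # K, assumed ambient and peak heating
T_true = T_ambient + dT_peak * np.exp(-(X**2 + Y**2) / (2 * sigma_beam**2))

netd = 0.3                                     # K, noise equivalent temperature difference
T_meas = T_true + rng.normal(0.0, netd, T_true.shape)

# Profile estimate: subtract ambient, average a few rows through the centre
rows = slice(ny // 2 - 3, ny // 2 + 4)
profile = (T_meas[rows, :] - T_ambient).mean(axis=0)
profile = np.clip(profile, 0, None)
profile /= profile.sum()                       # normalised horizontal beam profile
centroid = float((x * profile).sum())
print(f"estimated beam centroid: {centroid * 1e3:.2f} mm")
```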
Diagnostic Accuracy Comparison of Artificial Immune Algorithms for Primary Headaches.
Çelik, Ufuk; Yurtay, Nilüfer; Koç, Emine Rabia; Tepe, Nermin; Güllüoğlu, Halil; Ertaş, Mustafa
2015-01-01
The present study evaluated the diagnostic accuracy of immune system algorithms with the aim of classifying the primary types of headache that are not related to any organic etiology. They are divided into four types: migraine, tension, cluster, and other primary headaches. With this objective in mind, three different neurologists entered the medical records of 850 patients into our web-based expert system hosted on the project web site. In the evaluation process, Artificial Immune Systems (AIS) were used as the classification algorithms. AIS are classification algorithms inspired by the biological immune system and its significant and distinct capabilities. These algorithms simulate immune system properties such as discrimination, learning, and memory so that they can be used for classification, optimization, or pattern recognition. According to the results, the accuracy of the classifiers used in this study ranged from 95% to 99%, except for one poorly suited algorithm that yielded 71% accuracy.
NASA Astrophysics Data System (ADS)
Smarda, M.; Alexopoulou, E.; Mazioti, A.; Kordolaimi, S.; Ploussi, A.; Priftis, K.; Efstathopoulos, E.
2015-09-01
The purpose of this study is to determine the appropriate iterative reconstruction (IR) algorithm level that combines image quality and diagnostic confidence for pediatric patients undergoing high-resolution computed tomography (HRCT). During the last 2 years, a total of 20 children up to 10 years old with a clinical presentation of chronic bronchitis underwent HRCT in our department's 64-detector row CT scanner using the iDose IR algorithm, with similar acquisition settings (80 kVp, 40-50 mAs). CT images were reconstructed with all iDose levels (levels 1 to 7) as well as with the filtered back projection (FBP) algorithm. Subjective image quality was evaluated by 2 experienced radiologists in terms of image noise, sharpness, contrast, and diagnostic acceptability using a 5-point scale (1 = excellent image, 5 = non-acceptable image). The presence of artifacts was also noted. All mean scores from both radiologists corresponded to satisfactory image quality (score ≤3), even with FBP. Almost excellent (score <2) overall image quality was achieved with iDose levels 5 to 7, but oversmoothing artifacts appearing at iDose levels 6 and 7 affected diagnostic confidence. In conclusion, the use of iDose level 5 enables almost excellent image quality without considerable artifacts affecting the diagnosis. Further evaluation is needed in order to draw more precise conclusions.
Stapleton, Brandon M; Lin, Wei-Shao; Ntounis, Athanasios; Harris, Bryan T; Morton, Dean
2014-09-01
This clinical report demonstrated the use of an implant-supported fixed dental prosthesis fabricated with a contemporary digital approach. The digital diagnostic data acquisition was completed with a digital diagnostic impression with an intraoral scanner and cone-beam computed tomography with a prefabricated universal radiographic template to design a virtual prosthetically driven implant surgical plan. A surgical template fabricated with computer-aided design and computer-aided manufacturing (CAD/CAM) was used to perform computer-guided implant surgery. The definitive digital data were then used to design the definitive CAD/CAM-fabricated fixed dental prosthesis. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2008-03-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using the helical CT scanner employed for lung cancer mass screening. A function for detailed observation of suspicious shadows is provided in a computer-aided diagnosis workstation together with these screening algorithms. We have also developed a telemedicine network based on a web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system, and a biometric face authentication system. Biometric face authentication used at the telemedicine site makes file encryption and login control effective, so that patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided diagnosis workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system built around the computer-aided diagnosis workstation and our telemedicine network can increase diagnostic speed and diagnostic accuracy and improve the security of medical information.
Improved Temperature Diagnostic for Non-Neutral Plasmas with Single-Electron Resolution
NASA Astrophysics Data System (ADS)
Shanman, Sabrina; Evans, Lenny; Fajans, Joel; Hunter, Eric; Nelson, Cheyenne; Sierra, Carlos; Wurtele, Jonathan
2016-10-01
Plasma temperature diagnostics in a Penning-Malmberg trap are essential for reliably obtaining cold, non-neutral plasmas. We have developed a setup for detecting the initial electrons that escape from a trapped pure electron plasma as the confining electrode potential is slowly reduced. The setup minimizes external noise by using a silicon photomultiplier to capture light emitted from an MCP-amplified phosphor screen. To take advantage of this enhanced resolution, we have developed a new plasma temperature diagnostic analysis procedure which takes discrete electron arrival times as input. We have run extensive simulations comparing this new discrete algorithm to our existing exponential fitting algorithm. These simulations are used to explore the behavior of these two temperature diagnostic procedures at low N and at high electronic noise. This work was supported by the DOE DE-FG02-06ER54904, and the NSF 1500538-PHY.
Embedded Reasoning Supporting Aerospace IVHM
2007-01-01
...method (BIT or health assessment algorithm) on which the monitoring diagnostic relies for input information... viewing of the current health state of all monitored subsystems, while also providing a means to probe deeper in the event anomalous operation is... seeks to integrate detection, diagnostic, and prognostic capabilities with a hierarchical diagnostic reasoning architecture into a single
NASA Astrophysics Data System (ADS)
Potlov, A. Yu.; Frolov, S. V.; Proskurin, S. G.
2018-04-01
A high-quality OCT structural image reconstruction algorithm for endoscopic optical coherence tomography of biological tissue is described. The key features of the presented algorithm are: (1) raster scanning and averaging of adjacent A-scans and pixels; (2) minimization of the speckle level. The described algorithm can be used in gastroenterology, urology, gynecology, and otorhinolaryngology for diagnostics of mucous membranes and skin in vivo and in situ.
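A minimal sketch of the first key feature, averaging adjacent A-scans of a raster-scanned B-scan to suppress speckle, is given below; the data shapes and the multiplicative speckle model are assumptions for illustration.

```python
import numpy as np

# Sketch (hypothetical data shapes): speckle reduction by averaging
# adjacent A-scans in a raster-scanned OCT B-scan.
def average_adjacent_ascans(bscan, window=4):
    """bscan: 2-D array (n_ascans, depth). Returns a B-scan in which each
    A-scan is replaced by the mean of `window` neighbouring A-scans."""
    n, depth = bscan.shape
    out = np.empty_like(bscan, dtype=float)
    half = window // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out[i] = bscan[lo:hi].mean(axis=0)     # lateral averaging suppresses speckle
    return out

rng = np.random.default_rng(1)
clean = np.tile(np.exp(-np.linspace(0, 4, 512)), (256, 1))   # toy depth profile
speckled = clean * rng.rayleigh(scale=1.0, size=clean.shape) # multiplicative speckle model
denoised = average_adjacent_ascans(speckled, window=8)
print(speckled.std(), denoised.std())                        # noise level drops
```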
Infrastructural intelligence: Contemporary entanglements between neuroscience and AI.
Bruder, Johannes
2017-01-01
In this chapter, I reflect on contemporary entanglements between artificial intelligence and the neurosciences by tracing the development of Google's recent DeepMind algorithms back to their roots in neuroscientific studies of episodic memory and imagination. Google promotes a new form of "infrastructural intelligence," which excels by constantly reassessing its cognitive architecture in exchange with a cloud of data that surrounds it, and exhibits putatively human capacities such as intuition. I argue that such (re)alignments of biological and artificial intelligence have been enabled by a paradigmatic infrastructuralization of the brain in contemporary neuroscience. This infrastructuralization is based in methodologies that epistemically liken the brain to complex systems of an entirely different scale (i.e., global logistics) and has given rise to diverse research efforts that target the neuronal infrastructures of higher cognitive functions such as empathy and creativity. What is at stake in this process is no less than the shape of brains to come and a revised understanding of the intelligent and creative social subject. © 2017 Elsevier B.V. All rights reserved.
An algorithmic approach to the brain biopsy--part I.
Kleinschmidt-DeMasters, B K; Prayson, Richard A
2006-11-01
The formulation of appropriate differential diagnoses for a slide is essential to the practice of surgical pathology but can be particularly challenging for residents and fellows. Algorithmic flow charts can help the less experienced pathologist to systematically consider all possible choices and eliminate incorrect diagnoses. They can assist pathologists-in-training in developing orderly, sequential, and logical thinking skills when confronting difficult cases. To present an algorithmic flow chart as an approach to formulating differential diagnoses for lesions seen in surgical neuropathology. An algorithmic flow chart to be used in teaching residents. Algorithms are not intended to be final diagnostic answers on any given case. Algorithms do not substitute for training received from experienced mentors nor do they substitute for comprehensive reading by trainees of reference textbooks. Algorithmic flow diagrams can, however, direct the viewer to the correct spot in reference texts for further in-depth reading once they hone down their diagnostic choices to a smaller number of entities. The best feature of algorithms is that they remind the user to consider all possibilities on each case, even if they can be quickly eliminated from further consideration. In Part I, we assist the resident in learning how to handle brain biopsies in general and how to distinguish nonneoplastic lesions that mimic tumors from true neoplasms.
An algorithmic approach to the brain biopsy--part II.
Prayson, Richard A; Kleinschmidt-DeMasters, B K
2006-11-01
The formulation of appropriate differential diagnoses for a slide is essential to the practice of surgical pathology but can be particularly challenging for residents and fellows. Algorithmic flow charts can help the less experienced pathologist to systematically consider all possible choices and eliminate incorrect diagnoses. They can assist pathologists-in-training in developing orderly, sequential, and logical thinking skills when confronting difficult cases. To present an algorithmic flow chart as an approach to formulating differential diagnoses for lesions seen in surgical neuropathology. An algorithmic flow chart to be used in teaching residents. Algorithms are not intended to be final diagnostic answers on any given case. Algorithms do not substitute for training received from experienced mentors nor do they substitute for comprehensive reading by trainees of reference textbooks. Algorithmic flow diagrams can, however, direct the viewer to the correct spot in reference texts for further in-depth reading once they hone down their diagnostic choices to a smaller number of entities. The best feature of algorithms is that they remind the user to consider all possibilities on each case, even if they can be quickly eliminated from further consideration. In Part II, we assist the resident in arriving at the correct diagnosis for neuropathologic lesions containing granulomatous inflammation, macrophages, or abnormal blood vessels.
Ambavane, Apoorva; Lindahl, Bertil; Giannitsis, Evangelos; Roiz, Julie; Mendivil, Joan; Frankenstein, Lutz; Body, Richard; Christ, Michael; Bingisser, Roland; Alquezar, Aitor; Mueller, Christian
2017-01-01
The 1-hour (h) algorithm triages patients presenting with suspected acute myocardial infarction (AMI) to the emergency department (ED) towards "rule-out," "rule-in," or "observation," depending on baseline and 1-h levels of high-sensitivity cardiac troponin (hs-cTn). The economic consequences of applying the accelerated 1-h algorithm are unknown. We performed a post-hoc economic analysis in a large, diagnostic, multicenter study of hs-cTnT using central adjudication of the final diagnosis by two independent cardiologists. Length of stay (LoS), resource utilization (RU), and predicted diagnostic accuracy of the 1-h algorithm compared to standard of care (SoC) in the ED were estimated. The ED LoS, RU, and accuracy of the 1-h algorithm were compared to those achieved by the SoC at ED discharge. Expert opinion was sought to characterize clinical implementation of the 1-h algorithm, which required blood draws at ED presentation and 1 h, after which "rule-in" patients were transferred for coronary angiography, "rule-out" patients underwent outpatient stress testing, and "observation" patients received SoC. Unit costs were for the United Kingdom, Switzerland, and Germany. The sensitivity and specificity of the 1-h algorithm were 87% and 96%, respectively, compared to 69% and 98% for SoC. The mean ED LoS for the 1-h algorithm was 4.3 h versus 6.5 h for SoC, a reduction of 33%. The 1-h algorithm was associated with reductions in RU, driven largely by the shorter LoS in the ED for patients with a diagnosis other than AMI. The estimated total costs per patient were £2,480 for the 1-h algorithm compared to £4,561 for SoC, a reduction of up to 46%. The analysis shows that the use of the 1-h algorithm is associated with a reduction in overall AMI diagnostic costs, provided it is carefully implemented in clinical practice. These results need to be prospectively validated in the future.
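A hedged sketch of the triage logic described above is given below. The abstract does not state the hs-cTnT cutoffs, so the threshold constants here are illustrative placeholders only, not the study's values.

```python
# Sketch of the 0/1-h triage logic described in the abstract. The cutoff
# values below are illustrative placeholders, NOT the study's thresholds.
RULE_OUT_BASELINE = 5.0    # ng/L, placeholder
RULE_OUT_LOW = 12.0        # ng/L, placeholder
RULE_OUT_DELTA = 3.0       # ng/L, placeholder
RULE_IN_BASELINE = 52.0    # ng/L, placeholder
RULE_IN_DELTA = 5.0        # ng/L, placeholder

def triage_1h(hs_ctnt_0h, hs_ctnt_1h):
    """Return 'rule-out', 'rule-in', or 'observation' from baseline and 1-h hs-cTnT."""
    delta = abs(hs_ctnt_1h - hs_ctnt_0h)
    if hs_ctnt_0h < RULE_OUT_BASELINE or (hs_ctnt_0h < RULE_OUT_LOW and delta < RULE_OUT_DELTA):
        return "rule-out"        # outpatient stress testing in the costing model
    if hs_ctnt_0h >= RULE_IN_BASELINE or delta >= RULE_IN_DELTA:
        return "rule-in"         # transfer for coronary angiography
    return "observation"         # standard of care

print(triage_1h(4.0, 4.5), triage_1h(60.0, 75.0), triage_1h(20.0, 21.0))
```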
A Comparative Analysis of the ADOS-G and ADOS-2 Algorithms: Preliminary Findings.
Dorlack, Taylor P; Myers, Orrin B; Kodituwakku, Piyadasa W
2018-06-01
The Autism Diagnostic Observation Schedule (ADOS) is a widely utilized observational assessment tool for diagnosis of autism spectrum disorders. The original ADOS was succeeded by the ADOS-G with noted improvements. More recently, the ADOS-2 was introduced to further increase its diagnostic accuracy. Studies examining the validity of the ADOS have produced mixed findings, and pooled relationship trends between the algorithm versions are yet to be analyzed. The current review seeks to compare the relative merits of the ADOS-G and ADOS-2 algorithms, Modules 1-3. Eight studies met inclusion criteria for the review, and six were selected for paired comparisons of the sensitivity and specificity of the ADOS. Results indicate several contradictory findings, underscoring the importance of further study.
Automatic analysis and classification of surface electromyography.
Abou-Chadi, F E; Nashar, A; Saad, M
2001-01-01
In this paper, parametric modeling of surface electromyography (SEMG) signals, which facilitates automatic feature extraction, is combined with artificial neural networks (ANN) to provide an integrated system for the automatic analysis and diagnosis of myopathic disorders. Three ANN paradigms were investigated: the multilayer backpropagation algorithm, the self-organizing feature map algorithm, and a probabilistic neural network model. The performance of the three classifiers was compared with that of a Fisher linear discriminant (FLD) classifier. The results show that the three ANN models give higher performance; the percentage of correct classification reaches 90%. Poorer diagnostic performance was obtained from the FLD classifier. The system presented here indicates that surface EMG, when properly processed, can be used to provide the physician with a diagnostic assist device.
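The pipeline described above (parametric modeling for feature extraction followed by a neural-network classifier) can be sketched as follows, here using autoregressive coefficients estimated from the Yule-Walker equations and a small scikit-learn multilayer perceptron on synthetic signals; the AR order, network size, and data are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Sketch (synthetic signals, not clinical SEMG): autoregressive (AR) model
# coefficients as features, fed to a multilayer perceptron classifier.
def ar_features(signal, order=4):
    """Yule-Walker estimate of AR coefficients for one SEMG epoch."""
    s = signal - signal.mean()
    n = len(s)
    r = np.array([np.dot(s[:n - k], s[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])  # Toeplitz
    return np.linalg.solve(R, r[1:order + 1])

rng = np.random.default_rng(2)
def make_epoch(pole):  # two toy classes differing in their dominant AR pole
    e = rng.normal(size=1024)
    x = np.zeros(1024)
    for t in range(1, 1024):
        x[t] = pole * x[t - 1] + e[t]
    return x

X = np.array([ar_features(make_epoch(p)) for p in [0.5] * 60 + [0.9] * 60])
y = np.array([0] * 60 + [1] * 60)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```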
NASA Technical Reports Server (NTRS)
Lawson, Charles L.; Krogh, Fred; Van Snyder, W.; Oken, Carol A.; Mccreary, Faith A.; Lieske, Jay H.; Perrine, Jack; Coffin, Ralph S.; Wayne, Warren J.
1994-01-01
MATH77 is a high-quality library of ANSI FORTRAN 77 subprograms implementing contemporary algorithms for basic computational processes of science and engineering. Release 4.0 of MATH77 contains 454 user-callable and 136 lower-level subprograms. The MATH77 release 4.0 subroutine library is designed to be usable on any computer system supporting the full ANSI standard FORTRAN 77 language.
Deep learning based syndrome diagnosis of chronic gastritis.
Liu, Guo-Ping; Yan, Jian-Jun; Wang, Yi-Qin; Zheng, Wu; Zhong, Tao; Lu, Xiong; Qian, Peng
2014-01-01
In Traditional Chinese Medicine (TCM), most of the algorithms used to solve problems of syndrome diagnosis are shallow-structure algorithms that do not take the brain's cognitive perspective into account. In clinical practice, however, the relationship between symptoms (signs) and syndrome is complex and nonlinear. We therefore employed deep learning and multilabel learning to construct a syndrome diagnostic model for chronic gastritis (CG) in TCM. The results showed that deep learning could improve the accuracy of syndrome recognition. Moreover, the study provides a reference for constructing syndrome diagnostic models and guiding clinical practice.
Open Energy Information System version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
OpenEIS was created to provide standard methods for authoring, sharing, testing, using, and improving algorithms for operational building energy efficiency with building managers and building owners. OpenEIS is designed as a no-cost/low-cost solution that will propagate the fault detection and diagnostic (FDD) solutions into the marketplace by providing state-of-the-art analytical and diagnostic algorithms. As OpenEIS penetrates the market, demand by control system manufacturers and integrators serving small and medium commercial customers will help push these types of commercial software tool offerings into the broader marketplace.
Fong, Simon; Deb, Suash; Yang, Xin-She; Zhuang, Yan
2014-01-01
Traditional K-means clustering algorithms have the drawback of getting stuck at local optima that depend on the random values of initial centroids. Optimization algorithms have their advantages in guiding iterative computation to search for global optima while avoiding local optima. The algorithms help speed up the clustering process by converging into a global optimum early with multiple search agents in action. Inspired by nature, some contemporary optimization algorithms which include Ant, Bat, Cuckoo, Firefly, and Wolf search algorithms mimic the swarming behavior allowing them to cooperatively steer towards an optimal objective within a reasonable time. It is known that these so-called nature-inspired optimization algorithms have their own characteristics as well as pros and cons in different applications. When these algorithms are combined with K-means clustering mechanism for the sake of enhancing its clustering quality by avoiding local optima and finding global optima, the new hybrids are anticipated to produce unprecedented performance. In this paper, we report the results of our evaluation experiments on the integration of nature-inspired optimization methods into K-means algorithms. In addition to the standard evaluation metrics in evaluating clustering quality, the extended K-means algorithms that are empowered by nature-inspired optimization methods are applied on image segmentation as a case study of application scenario.
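A minimal sketch of the hybrid idea is shown below: a simple population-based random search (a stand-in for the Bat/Firefly/Cuckoo-style optimizers named above) selects initial centroids that minimize the within-cluster sum of squares, after which standard Lloyd iterations refine them. The data and optimizer parameters are invented for illustration.

```python
import numpy as np

# Sketch of optimizer-seeded K-means on synthetic 2-D data: a population of
# candidate centroid sets swarms around the best candidate, then Lloyd
# iterations refine the winner.
rng = np.random.default_rng(3)
data = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in ((0, 0), (3, 3), (0, 3))])

def sse(centroids, X):
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).sum()

def search_init(X, k=3, agents=20, iters=50, step=0.5):
    """Population-based random search over initial centroid sets (elitist)."""
    pop = X[rng.choice(len(X), size=(agents, k), replace=True)]
    for _ in range(iters):
        scores = np.array([sse(p, X) for p in pop])
        best = pop[scores.argmin()]
        pop = best + rng.normal(0, step, pop.shape)   # swarm around the incumbent
        pop[0] = best                                  # keep the best agent (elitism)
        step *= 0.95
    return best

def kmeans(X, centroids, iters=20):
    for _ in range(iters):
        labels = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1).argmin(1)
        centroids = np.array([X[labels == j].mean(0) if np.any(labels == j) else centroids[j]
                              for j in range(len(centroids))])
    return centroids, labels

c0 = search_init(data)
centers, labels = kmeans(data, c0)
print("final SSE:", round(sse(centers, data), 2))
```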
Naidoo, Pren; van Niekerk, Margaret; du Toit, Elizabeth; Beyers, Nulda; Leon, Natalie
2015-10-28
Although new molecular diagnostic tests such as GenoType MTBDRplus and Xpert® MTB/RIF have reduced multidrug-resistant tuberculosis (MDR-TB) treatment initiation times, patients' experiences of diagnosis and treatment initiation are not known. This study aimed to explore and compare MDR-TB patients' experiences of their diagnostic and treatment initiation pathway in GenoType MTBDRplus and Xpert® MTB/RIF-based diagnostic algorithms. The study was undertaken in Cape Town, South Africa where primary health-care services provided free TB diagnosis and treatment. A smear, culture and GenoType MTBDRplus diagnostic algorithm was used in 2010, with Xpert® MTB/RIF phased in from 2011-2013. Participants diagnosed in each algorithm at four facilities were purposively sampled, stratifying by age, gender and MDR-TB risk profiles. We conducted in-depth qualitative interviews using a semi-structured interview guide. Through constant comparative analysis we induced common and divergent themes related to symptom recognition, health-care access, testing for MDR-TB and treatment initiation within and between groups. Data were triangulated with clinical information and health visit data from a structured questionnaire. We identified both enablers and barriers to early MDR-TB diagnosis and treatment. Half the patients had previously been treated for TB; most recognised recurring symptoms and reported early health-seeking. Those who attributed symptoms to other causes delayed health-seeking. Perceptions of poor public sector services were prevalent and may have contributed both to deferred health-seeking and to patient's use of the private sector, contributing to delays. However, once on treatment, most patients expressed satisfaction with public sector care. Two patients in the Xpert® MTB/RIF-based algorithm exemplified its potential to reduce delays, commencing MDR-TB treatment within a week of their first health contact. However, most patients in both algorithms experienced substantial delays. Avoidable health system delays resulted from providers not testing for TB at initial health contact, non-adherence to testing algorithms, results not being available and failure to promptly recall patients with positive results. Whilst the introduction of rapid tests such as Xpert® MTB/RIF can expedite MDR-TB diagnosis and treatment initiation, the full benefits are unlikely to be realised without reducing delays in health-seeking and addressing the structural barriers present in the health-care system.
A Testbed for Data Fusion for Helicopter Diagnostics and Prognostics
2003-03-01
...and algorithm design and tuning in order to develop advanced diagnostic and prognostic techniques for aircraft health monitoring. Here a... and development of models for diagnostics, prognostics, and anomaly detection... detections, and prognostic prediction time horizons. The VMEP system, and in particular the web component, are ideal for performing data collection
ERIC Educational Resources Information Center
Wiggins, Lisa D.; Reynolds, Ann; Rice, Catherine E.; Moody, Eric J.; Bernal, Pilar; Blaskey, Lisa; Rosenberg, Steven A.; Lee, Li-Ching; Levy, Susan E.
2015-01-01
The Study to Explore Early Development (SEED) is a multi-site case-control study designed to explore the relationship between autism spectrum disorder (ASD) phenotypes and etiologies. The goals of this paper are to (1) describe the SEED algorithm that uses the Autism Diagnostic Interview-Revised (ADI-R) and Autism Diagnostic Observation Schedule…
ERIC Educational Resources Information Center
Kim, So Hyun; Thurm, Audrey; Shumway, Stacy; Lord, Catherine
2013-01-01
Using two independent datasets provided by National Institute of Health funded consortia, the Collaborative Programs for Excellence in Autism and Studies to Advance Autism Research and Treatment (n = 641) and the National Institute of Mental Health (n = 167), diagnostic validity and factor structure of the new Autism Diagnostic Interview (ADI-R)…
Vitte, Joana; Ranque, Stéphane; Carsin, Ania; Gomez, Carine; Romain, Thomas; Cassagne, Carole; Gouitaa, Marion; Baravalle-Einaudi, Mélisande; Bel, Nathalie Stremler-Le; Reynaud-Gaubert, Martine; Dubus, Jean-Christophe; Mège, Jean-Louis; Gaudart, Jean
2017-01-01
Molecular-based allergy diagnosis yields multiple biomarker datasets. The classical diagnostic score for allergic bronchopulmonary aspergillosis (ABPA), a severe disease usually occurring in asthmatic patients and people with cystic fibrosis, comprises succinct immunological criteria formulated in 1977: total IgE, anti-Aspergillus fumigatus (Af) IgE, anti-Af "precipitins," and anti-Af IgG. Progress achieved over the last four decades has made multiple IgE and IgG(4) Af biomarkers available with quantitative, standardized, molecular-level reports. These newly available biomarkers have not been included in the current diagnostic criteria, either individually or in algorithms, despite persistent underdiagnosis of ABPA. Large numbers of individual biomarkers may hinder their use in clinical practice. Conversely, multivariate analysis using new tools may bring about a better chance of fewer diagnostic mistakes. We report here a proof-of-concept work consisting of a three-step multivariate analysis of Af IgE, IgG, and IgG4 biomarkers through a combination of principal component analysis, hierarchical ascendant classification, and classification and regression tree multivariate analysis. The resulting diagnostic algorithms might show the way for novel criteria and improved diagnostic efficiency in Af-sensitized patients at risk for ABPA.
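A sketch of the three-step pipeline named above (principal component analysis, hierarchical ascendant classification, and a classification and regression tree) is shown below using scikit-learn on synthetic "biomarker" values; the feature names and distributions are assumptions, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering
from sklearn.tree import DecisionTreeClassifier, export_text

# Sketch of PCA -> hierarchical clustering -> CART on invented biomarker data.
rng = np.random.default_rng(4)
n = 120
ige_af  = np.concatenate([rng.lognormal(0.5, 0.4, n // 2), rng.lognormal(2.0, 0.4, n // 2)])
igg_af  = np.concatenate([rng.lognormal(1.0, 0.5, n // 2), rng.lognormal(2.5, 0.5, n // 2)])
igg4_af = np.concatenate([rng.lognormal(0.2, 0.5, n // 2), rng.lognormal(1.5, 0.5, n // 2)])
X = np.log(np.column_stack([ige_af, igg_af, igg4_af]))

# 1) principal component analysis
scores = PCA(n_components=2).fit_transform(X)
# 2) hierarchical ascendant classification (Ward agglomerative clustering)
groups = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(scores)
# 3) classification and regression tree turns the clusters into explicit cutoffs
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, groups)
print(export_text(tree, feature_names=["log_IgE_Af", "log_IgG_Af", "log_IgG4_Af"]))
```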
NASA Astrophysics Data System (ADS)
Nam, Kyoung Won; Kim, In Young; Kang, Ho Chul; Yang, Hee Kyung; Yoon, Chang Ki; Hwang, Jeong Min; Kim, Young Jae; Kim, Tae Yun; Kim, Kwang Gi
2012-10-01
Accurate measurement of binocular misalignment between both eyes is important for proper preoperative management, surgical planning, and postoperative evaluation of patients with strabismus. In this study, we propose a new computerized diagnostic algorithm that can calculate the angle of binocular eye misalignment photographically by using a dedicated three-dimensional eye model mimicking the structure of the natural human eye. To evaluate the performance of the proposed algorithm, eight healthy volunteers and eight individuals with strabismus were recruited; the horizontal deviation angle, vertical deviation angle, and angle of eye misalignment were calculated, and the angular differences between the healthy and strabismus groups were evaluated using the nonparametric Mann-Whitney test and the Pearson correlation test. The experimental results demonstrated a statistically significant difference between the healthy and strabismus groups (p = 0.015 < 0.05), but no statistically significant difference between the proposed method and the Krimsky test (p = 0.912 > 0.05). The measurements of the two methods were highly correlated (r = 0.969, p < 0.05). From the experimental results, we believe that the proposed diagnostic method has the potential to be a diagnostic tool that measures the physical disorder of the human eye to diagnose the severity of strabismus non-invasively.
NASA Astrophysics Data System (ADS)
Huang, Shaohua; Wang, Lan; Chen, Weiwei; Lin, Duo; Huang, Lingling; Wu, Shanshan; Feng, Shangyuan; Chen, Rong
2014-09-01
A surface-enhanced Raman spectroscopy (SERS) approach was utilized for urine biochemical analysis with the aim of developing a label-free and non-invasive optical diagnostic method for esophagus cancer detection. SERS spectra were acquired from 31 normal urine samples and 47 malignant esophagus cancer (EC) urine samples. Tentative assignments of urine SERS bands demonstrated esophagus cancer-specific changes, including an increase in the relative amount of urea and a decrease in the percentage of uric acid in normal urine compared with EC urine. An empirical algorithm integrated with linear discriminant analysis (LDA) was employed to identify important urine SERS bands for differentiation between healthy subjects and EC urine. The empirical diagnostic approach based on the ratios of the SERS peak intensities at 527 to 1002 cm-1 and 725 to 1002 cm-1, coupled with LDA, yielded a diagnostic sensitivity of 72.3% and specificity of 96.8%, respectively. The area under the receiver operating characteristic (ROC) curve was 0.954, further confirming the performance of the diagnostic algorithm based on the peak intensity ratios combined with LDA analysis. This work demonstrates that urine SERS spectra combined with the empirical algorithm have potential for noninvasive diagnosis of esophagus cancer.
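The empirical-ratio plus LDA step can be sketched as follows: the two features are the peak intensity ratios I527/I1002 and I725/I1002 named above, computed here from synthetic stand-in spectra and fed to scikit-learn's linear discriminant analysis.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Sketch: SERS peak-intensity ratios as features for LDA classification.
# The spectra below are synthetic stand-ins, not measured urine SERS data.
rng = np.random.default_rng(5)
wn = np.arange(400, 1800, 2.0)                 # wavenumber axis, cm^-1

def peak_ratios(spectrum):
    def intensity(centre):
        return spectrum[np.argmin(np.abs(wn - centre))]
    return np.array([intensity(527) / intensity(1002), intensity(725) / intensity(1002)])

def fake_spectrum(a527, a725):
    base = rng.normal(0.02, 0.005, wn.size)
    for centre, amp in ((527, a527), (725, a725), (1002, 1.0)):
        base += amp * np.exp(-0.5 * ((wn - centre) / 8.0) ** 2)
    return base

normal = [peak_ratios(fake_spectrum(0.4, 0.3)) for _ in range(31)]
cancer = [peak_ratios(fake_spectrum(0.8, 0.7)) for _ in range(47)]
X = np.vstack(normal + cancer)
y = np.array([0] * 31 + [1] * 47)

lda = LinearDiscriminantAnalysis().fit(X, y)
pred = lda.predict(X)
sens = (pred[y == 1] == 1).mean()
spec = (pred[y == 0] == 0).mean()
print(f"resubstitution sensitivity {sens:.2f}, specificity {spec:.2f}")
```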
Volumetric visualization algorithm development for an FPGA-based custom computing machine
NASA Astrophysics Data System (ADS)
Sallinen, Sami J.; Alakuijala, Jyrki; Helminen, Hannu; Laitinen, Joakim
1998-05-01
Rendering volumetric medical images is a burdensome computational task for contemporary computers due to the large size of the data sets. Custom designed reconfigurable hardware could considerably speed up volume visualization if an algorithm suitable for the platform is used. We present an algorithm and speedup techniques for visualizing volumetric medical CT and MR images with a custom-computing machine based on a Field Programmable Gate Array (FPGA). We also present simulated performance results of the proposed algorithm calculated with a software implementation running on a desktop PC. Our algorithm is capable of generating perspective projection renderings of single and multiple isosurfaces with transparency, simulated X-ray images, and Maximum Intensity Projections (MIP). Although more speedup techniques exist for parallel projection than for perspective projection, we have constrained ourselves to perspective viewing, because of its importance in the field of radiotherapy. The algorithm we have developed is based on ray casting, and the rendering is sped up by three different methods: shading speedup by gradient precalculation, a new generalized version of Ray-Acceleration by Distance Coding (RADC), and background ray elimination by speculative ray selection.
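A plain software sketch of the core rendering principle (ray casting with a perspective camera, here producing a Maximum Intensity Projection with early ray termination) is given below on a synthetic volume; it illustrates the algorithm only and says nothing about the FPGA implementation or the RADC acceleration.

```python
import numpy as np

# Sketch: perspective ray-casting Maximum Intensity Projection (MIP) with
# nearest-neighbour sampling on a synthetic volume containing a bright sphere.
vol = np.zeros((64, 64, 64), dtype=np.float32)
zz, yy, xx = np.mgrid[0:64, 0:64, 0:64]
vol[(xx - 32) ** 2 + (yy - 32) ** 2 + (zz - 32) ** 2 < 15 ** 2] = 1.0

eye = np.array([32.0, 32.0, -80.0])           # camera in front of the volume
img_w = img_h = 64
focal = 120.0
image = np.zeros((img_h, img_w), dtype=np.float32)

for v in range(img_h):
    for u in range(img_w):
        # direction through the pixel of a virtual image plane (perspective)
        d = np.array([u - img_w / 2, v - img_h / 2, focal])
        d /= np.linalg.norm(d)
        mip = 0.0
        for t in np.arange(40.0, 200.0, 2.0):     # march along the ray
            p = eye + t * d
            i, j, k = int(round(p[2])), int(round(p[1])), int(round(p[0]))
            if 0 <= i < 64 and 0 <= j < 64 and 0 <= k < 64:
                mip = max(mip, vol[i, j, k])
                if mip >= 1.0:                     # early ray termination
                    break
        image[v, u] = mip

print("non-zero pixels in the MIP:", int((image > 0).sum()))
```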
Autism in the Faroe Islands: Diagnostic Stability from Childhood to Early Adult Life
Kočovská, Eva; Billstedt, Eva; Ellefsen, Asa; Kampmann, Hanna; Gillberg, I. Carina; Biskupstø, Rannvá; Andorsdóttir, Guðrið; Stóra, Tormóður; Minnis, Helen; Gillberg, Christopher
2013-01-01
Childhood autism or autism spectrum disorder (ASD) has been regarded as one of the most stable diagnostic categories applied to young children with psychiatric/developmental disorders. The stability over time of a diagnosis of ASD is theoretically interesting and important for various diagnostic and clinical reasons. We studied the diagnostic stability of ASD from childhood to early adulthood in the Faroe Islands: a total school age population sample (8–17-year-olds) was screened and diagnostically assessed for AD in 2002 and 2009. This paper compares both independent clinical diagnosis and Diagnostic Interview for Social and Communication Disorders (DISCO) algorithm diagnosis at two time points, separated by seven years. The stability of clinical ASD diagnosis was perfect for AD, good for “atypical autism”/PDD-NOS, and less than perfect for Asperger syndrome (AS). Stability of the DISCO algorithm subcategory diagnoses was more variable but still good for AD. Both systems showed excellent stability over the seven-year period for “any ASD” diagnosis, although a number of clear cases had been missed at the original screening in 2002. The findings support the notion that subcategories of ASD should be collapsed into one overarching diagnostic entity with subgrouping achieved on other “non-autism” variables, such as IQ and language levels and overall adaptive functioning. PMID:23476144
Investigating the Link Between Radiologists Gaze, Diagnostic Decision, and Image Content
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tourassi, Georgia; Voisin, Sophie; Paquit, Vincent C
2013-01-01
Objective: To investigate machine learning for linking image content, human perception, cognition, and error in the diagnostic interpretation of mammograms. Methods: Gaze data and diagnostic decisions were collected from six radiologists who reviewed 20 screening mammograms while wearing a head-mounted eye-tracker. Texture analysis was performed in mammographic regions that attracted the radiologists' attention and in all abnormal regions. Machine learning algorithms were investigated to develop predictive models that link: (i) image content with gaze, (ii) image content and gaze with cognition, and (iii) image content, gaze, and cognition with diagnostic error. Both group-based and individualized models were explored. Results: By pooling the data from all radiologists, machine learning produced highly accurate predictive models linking image content, gaze, cognition, and error. Merging radiologists' gaze metrics and cognitive opinions with computer-extracted image features identified 59% of the radiologists' diagnostic errors while confirming 96.2% of their correct diagnoses. The radiologists' individual errors could be adequately predicted by modeling the behavior of their peers. However, personalized tuning appears to be beneficial in many cases to capture individual behavior more accurately. Conclusions: Machine learning algorithms combining image features with radiologists' gaze data and diagnostic decisions can be effectively developed to recognize cognitive and perceptual errors associated with the diagnostic interpretation of mammograms.
NASA Astrophysics Data System (ADS)
Polverino, Pierpaolo; Esposito, Angelo; Pianese, Cesare; Ludwig, Bastian; Iwanschitz, Boris; Mai, Andreas
2016-02-01
In the current energetic scenario, Solid Oxide Fuel Cells (SOFCs) exhibit appealing features which make them suitable for environmental-friendly power production, especially for stationary applications. An example is represented by micro-combined heat and power (μ-CHP) generation units based on SOFC stacks, which are able to produce electric and thermal power with high efficiency and low pollutant and greenhouse gases emissions. However, the main limitations to their diffusion into the mass market consist in high maintenance and production costs and short lifetime. To improve these aspects, the current research activity focuses on the development of robust and generalizable diagnostic techniques, aimed at detecting and isolating faults within the entire system (i.e. SOFC stack and balance of plant). Coupled with appropriate recovery strategies, diagnosis can prevent undesired system shutdowns during faulty conditions, with consequent lifetime increase and maintenance costs reduction. This paper deals with the on-line experimental validation of a model-based diagnostic algorithm applied to a pre-commercial SOFC system. The proposed algorithm exploits a Fault Signature Matrix based on a Fault Tree Analysis and improved through fault simulations. The algorithm is characterized on the considered system and it is validated by means of experimental induction of faulty states in controlled conditions.
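A minimal sketch of fault isolation with a Fault Signature Matrix is shown below; the faults, symptoms, and matrix entries are invented for illustration and do not reproduce the paper's FSM.

```python
import numpy as np

# Sketch of diagnosis with a Fault Signature Matrix (FSM): each row is a fault,
# each column a monitored symptom (1 = symptom expected under that fault).
faults = ["air blower degradation", "fuel leakage", "reformer degradation"]
symptoms = ["stack voltage low", "cathode dP high", "anode inlet T high", "fuel flow high"]
fsm = np.array([
    [1, 1, 0, 0],   # air blower degradation
    [1, 0, 0, 1],   # fuel leakage
    [1, 0, 1, 0],   # reformer degradation
])

def isolate(observed):
    """Rank faults by how well their signature matches the observed symptom vector."""
    observed = np.asarray(observed)
    matches = (fsm == observed).sum(axis=1)          # simple Hamming-style score
    order = np.argsort(-matches)
    return [(faults[i], int(matches[i])) for i in order]

print(isolate([1, 0, 1, 0]))   # best match: reformer degradation
```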
Efficient fault diagnosis of helicopter gearboxes
NASA Technical Reports Server (NTRS)
Chin, H.; Danai, K.; Lewicki, D. G.
1993-01-01
Application of a diagnostic system to a helicopter gearbox is presented. The diagnostic system is a nonparametric pattern classifier that uses a multi-valued influence matrix (MVIM) as its diagnostic model and benefits from a fast learning algorithm that enables it to estimate its diagnostic model from a small number of measurement-fault data. To test this diagnostic system, vibration measurements were collected from a helicopter gearbox test stand during accelerated fatigue tests and at various fault instances. The diagnostic results indicate that the MVIM system can accurately detect and diagnose various gearbox faults so long as they are included in training.
Holmes, John B; Dodds, Ken G; Lee, Michael A
2017-03-02
An important issue in genetic evaluation is the comparability of random effects (breeding values), particularly between pairs of animals in different contemporary groups. This is usually referred to as genetic connectedness. While various measures of connectedness have been proposed in the literature, there is general agreement that the most appropriate measure is some function of the prediction error variance-covariance matrix. However, obtaining the prediction error variance-covariance matrix is computationally demanding for large-scale genetic evaluations. Many alternative statistics have been proposed that avoid the computational cost of obtaining the prediction error variance-covariance matrix, such as counts of genetic links between contemporary groups, gene flow matrices, and functions of the variance-covariance matrix of estimated contemporary group fixed effects. In this paper, we show that a correction to the variance-covariance matrix of estimated contemporary group fixed effects will produce the exact prediction error variance-covariance matrix averaged by contemporary group for univariate models in the presence of single or multiple fixed effects and one random effect. We demonstrate the correction for a series of models and show that approximations to the prediction error matrix based solely on the variance-covariance matrix of estimated contemporary group fixed effects are inappropriate in certain circumstances. Our method allows for the calculation of a connectedness measure based on the prediction error variance-covariance matrix by calculating only the variance-covariance matrix of estimated fixed effects. Since the number of fixed effects in genetic evaluation is usually orders of magnitudes smaller than the number of random effect levels, the computational requirements for our method should be reduced.
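For orientation, the sketch below builds the prediction error variance-covariance (PEV) matrix directly from Henderson's mixed-model equations for a toy single-trait model with one fixed effect (contemporary group) and one random effect, and summarizes it as a mean PEV of between-group differences. This is the reference quantity that connectedness measures are functions of; the paper's contribution, recovering it from the fixed-effect block alone, is not reproduced here. The identity relationship matrix and the variance components are assumptions.

```python
import numpy as np

# Toy sketch: PEV matrix of breeding values from the mixed-model equations
# for a model with one fixed effect (contemporary group) and one random
# effect (animal), assuming an identity relationship matrix.
n_animals, n_groups = 6, 2
group = np.array([0, 0, 0, 1, 1, 1])
X = np.eye(n_groups)[group]                  # incidence matrix of contemporary groups
Z = np.eye(n_animals)                        # one record per animal
sigma_e2, sigma_u2 = 2.0, 1.0
lam = sigma_e2 / sigma_u2
A_inv = np.eye(n_animals)                    # identity relationship matrix (assumption)

C = np.block([
    [X.T @ X,            X.T @ Z],
    [Z.T @ X,  Z.T @ Z + lam * A_inv],
])
C_inv = np.linalg.pinv(C)                    # generalised inverse covers confounded cases
pev = sigma_e2 * C_inv[n_groups:, n_groups:] # block corresponding to breeding values

# Average PEV of differences between animals in the two contemporary groups,
# a common connectedness-style summary
g0, g1 = np.where(group == 0)[0], np.where(group == 1)[0]
pevd = np.mean([pev[i, i] + pev[j, j] - 2 * pev[i, j] for i in g0 for j in g1])
print("mean PEV of between-group differences:", round(float(pevd), 3))
```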
New web-based algorithm to improve rigid gas permeable contact lens fitting in keratoconus.
Ortiz-Toquero, Sara; Rodriguez, Guadalupe; de Juan, Victoria; Martin, Raul
2017-06-01
To calculate and validate a new web-based algorithm for selecting the back optic zone radius (BOZR) of spherical gas permeable (GP) lenses in keratoconus eyes. A retrospective calculation (n=35; multiple regression analysis) and a subsequent prospective validation (new sample of 50 keratoconus eyes) of a new algorithm to select the BOZR of spherical KAKC design GP lenses (Conoptica) in keratoconus were conducted. The BOZR calculated with the new algorithm, the manufacturer guidelines and the APEX software were compared with the BOZR that was finally prescribed. The number of diagnostic lenses, ordered lenses and visits needed to achieve an optimal fitting was recorded and compared with that obtained for a control group [50 healthy eyes fitted with spherical GP lenses (BIAS design; Conoptica)]. The new algorithm correlated highly with the final BOZR fitted (r²=0.825, p<0.001). The BOZR of the first diagnostic lens using the new algorithm showed a smaller difference from the final BOZR prescribed (-0.01±0.12 mm, p=0.65; 58% difference ≤0.05 mm) than the manufacturer guidelines (+0.12±0.22 mm, p<0.001; 26% difference ≤0.05 mm) and the APEX software (-0.14±0.16 mm, p=0.001; 34% difference ≤0.05 mm). Similar numbers of diagnostic lenses (1.6±0.8 vs 1.3±0.5; p=0.02), ordered lenses (1.4±0.6 vs 1.1±0.3; p<0.001), and visits (3.4±0.7 vs 3.2±0.4; p=0.08) were required to fit keratoconus and healthy eyes, respectively. This new algorithm (free access at www.calculens.com) improves spherical KAKC GP fitting in keratoconus and can reduce the practitioner and patient chair time to achieve a final acceptable fit in keratoconus. The algorithm reduces the differences between keratoconus GP fitting (KAKC design) and standard GP (BIAS design) fitting in healthy eyes. Copyright © 2016 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.
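The kind of multiple-regression rule behind such a calculator can be sketched as below, predicting the prescribed BOZR from keratometry; the predictor set, coefficients, and data are invented for illustration, and the actual algorithm lives at www.calculens.com.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Sketch of a multiple-regression BOZR rule fitted on hypothetical data.
# The "true" relation and noise below are assumptions, not the published model.
rng = np.random.default_rng(6)
n = 35
flat_k = rng.uniform(6.4, 7.8, n)            # flattest corneal radius (mm)
steep_k = flat_k - rng.uniform(0.2, 1.2, n)  # steepest corneal radius (mm)
bozr = 0.55 * flat_k + 0.30 * steep_k + 1.0 + rng.normal(0, 0.05, n)

X = np.column_stack([flat_k, steep_k])
model = LinearRegression().fit(X, bozr)
print("R^2 on the fitting sample:", round(model.score(X, bozr), 3))
print("suggested BOZR for K = 7.2/6.5 mm:",
      round(float(model.predict([[7.2, 6.5]])[0]), 2), "mm")
```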
Sun, Xiang; Allison, Carrie; Auyeung, Bonnie; Zhang, Zhixiang; Matthews, Fiona E; Baron-Cohen, Simon; Brayne, Carol
2015-11-01
Research to date in mainland China has mainly focused on children with autistic disorder rather than Autism Spectrum Conditions and the diagnosis largely depended on clinical judgment without the use of diagnostic instruments. Whether children who have been diagnosed in China before meet the diagnostic criteria of Autism Spectrum Conditions is not known nor how many such children would meet these criteria. The aim of this study was to evaluate children with a known diagnosis of autism in mainland China using the Autism Diagnostic Observation Schedule and the Autism Diagnostic Interview-Revised to verify that children who were given a diagnosis of autism made by Chinese clinicians in China were mostly children with severe autism. Of 50 children with an existing diagnosis of autism made by Chinese clinicians, 47 children met the diagnosis of autism on the Autism Diagnostic Observation Schedule algorithm and 44 children met the diagnosis of autism on the Autism Diagnostic Interview-Revised algorithm. Using the Gwet's alternative chance-corrected statistic, the agreement between the Chinese diagnosis and the Autism Diagnostic Observation Schedule diagnosis was very good (AC1 = 0.94, p < 0.005, 95% confidence interval (0.86, 1.00)), so was the agreement between the Chinese diagnosis and the Autism Diagnostic Interview-Revised (AC1 = 0.91, p < 0.005, 95% confidence interval (0.81, 1.00)). The agreement between the Autism Diagnostic Observation Schedule and the Autism Diagnostic Interview-Revised was lower but still very good (AC1 = 0.83, p < 0.005). © The Author(s) 2015.
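For reference, Gwet's first-order agreement coefficient (AC1) for two raters and two categories can be computed as in the sketch below; the illustrative ratings mirror the 47-of-50 agreement scale mentioned above but are not the study's record-level data.

```python
# Sketch of Gwet's AC1 for two raters and two categories, the chance-corrected
# statistic used above to compare clinical and instrument-based diagnoses.
def gwet_ac1(rater_a, rater_b):
    """rater_a, rater_b: equal-length lists of 0/1 diagnoses."""
    n = len(rater_a)
    pa = sum(a == b for a, b in zip(rater_a, rater_b)) / n      # observed agreement
    # mean of the two raters' marginal probabilities for category 1
    pi1 = (sum(rater_a) / n + sum(rater_b) / n) / 2
    pe = 2 * pi1 * (1 - pi1)                                     # AC1 chance agreement
    return (pa - pe) / (1 - pe)

# Illustrative data: 50 children, agreement on 47, disagreement on 3
clinician = [1] * 50
ados      = [1] * 47 + [0, 0, 0]
print(round(gwet_ac1(clinician, ados), 2))   # lands near the scale reported above
```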
Decision support methods for the detection of adverse events in post-marketing data.
Hauben, M; Bate, A
2009-04-01
Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
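As one concrete example of such a statistical measure of reporting frequency, the sketch below computes the proportional reporting ratio (PRR) from a 2x2 table of report counts; the counts are invented, and the PRR is only one of several disproportionality statistics in routine use.

```python
# Sketch of the proportional reporting ratio (PRR), a widely used
# disproportionality statistic for spontaneous reporting databases,
# computed from a 2x2 contingency table of report counts (illustrative numbers).
#
#                         event of interest   all other events
#   drug of interest              a                  b
#   all other drugs               c                  d
def prr(a, b, c, d):
    return (a / (a + b)) / (c / (c + d))

a, b, c, d = 25, 975, 500, 99_500   # made-up report counts
print("PRR =", round(prr(a, b, c, d), 2))   # values well above 1 flag a potential signal
```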
Conwell, Darwin L.; Lee, Linda S.; Yadav, Dhiraj; Longnecker, Daniel S.; Miller, Frank H.; Mortele, Koenraad J.; Levy, Michael J.; Kwon, Richard; Lieb, John G.; Stevens, Tyler; Toskes, Philip P.; Gardner, Timothy B.; Gelrud, Andres; Wu, Bechien U.; Forsmark, Christopher E.; Vege, Santhi S.
2016-01-01
The diagnosis of chronic pancreatitis remains challenging in early stages of the disease. This report defines the diagnostic criteria useful in the assessment of patients with suspected and established chronic pancreatitis. All current diagnostic procedures are reviewed and evidence-based statements are provided about their utility and limitations. Diagnostic criteria for chronic pancreatitis are classified as definitive, probable or insufficient evidence. A diagnostic (STEP-wise; S-survey, T-tomography, E-endoscopy and P-pancreas function testing) algorithm is proposed that proceeds from a non-invasive to a more invasive approach. This algorithm maximizes specificity (low false positive rate) in subjects with chronic abdominal pain and equivocal imaging changes. Furthermore, a nomenclature is suggested to further characterize patients with established chronic pancreatitis based on TIGAR-O (T-toxic, I-idiopathic, G-genetic, A-autoimmune, R-recurrent and O-obstructive) etiology, gland morphology (Cambridge criteria) and physiologic state (exocrine, endocrine function) for uniformity across future multi-center research collaborations. This guideline will serve as a baseline manuscript that will be modified as new evidence becomes available and our knowledge of chronic pancreatitis improves. PMID:25333398
Arpaia, P; Cimmino, P; Girone, M; La Commara, G; Maisto, D; Manna, C; Pezzetti, M
2014-09-01
An evolutionary approach to centralized multiple-fault diagnostics is extended to distributed transducer networks monitoring large experimental systems. Given a set of anomalies detected by the transducers, each instance of the multiple-fault problem is formulated as several parallel communicating sub-tasks running on different transducers, and thus solved one-by-one on spatially separated parallel processes. A micro-genetic algorithm merges the evaluation-time efficiency arising from a small-size population distributed on parallel synchronized processors with the effectiveness of centralized evolutionary techniques due to an optimal mix of exploitation and exploration. In this way, the holistic view and effectiveness advantages of evolutionary global diagnostics are combined with the reliability and efficiency benefits of distributed parallel architectures. The proposed approach was validated both (i) by simulation at CERN, on a case study of a cold box for enhancing the cryogenics diagnostics of the Large Hadron Collider, and (ii) by experiments, under the framework of the industrial research project MONDIEVOB (Building Remote Monitoring and Evolutionary Diagnostics), co-funded by the EU and the company Del Bo srl, Napoli, Italy.
Tan, Daniel S W; Yom, Sue S; Tsao, Ming S; Pass, Harvey I; Kelly, Karen; Peled, Nir; Yung, Rex C; Wistuba, Ignacio I; Yatabe, Yasushi; Unger, Michael; Mack, Philip C; Wynes, Murry W; Mitsudomi, Tetsuya; Weder, Walter; Yankelevitz, David; Herbst, Roy S; Gandara, David R; Carbone, David P; Bunn, Paul A; Mok, Tony S K; Hirsch, Fred R
2016-07-01
Mutations in the epidermal growth factor receptor gene (EGFR) represent one of the most frequent "actionable" alterations in non-small cell lung cancer (NSCLC). Typified by high response rates to targeted therapies, EGFR tyrosine kinase inhibitors (TKIs) are now established first-line treatment options and have transformed the treatment paradigm for NSCLC. With the recent breakthrough designation and approval of the third-generation EGFR TKI osimertinib, available systemic and local treatment options have expanded, requiring new clinical algorithms that take into account individual patient molecular and clinical profiles. In this International Association for the Study of Lung Cancer commissioned consensus statement, key pathologic, diagnostic, and therapeutic considerations, such as optimal choice of EGFR TKI and management of brain metastasis, are discussed. In addition, recommendations are made for clinical guidelines and research priorities, such as the role of repeat biopsies and use of circulating free DNA for molecular studies. With the rapid pace of progress in treating EGFR-mutant NSCLC, this statement provides a state-of-the-art review of the contemporary issues in managing this unique subgroup of patients. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
Donczo, Boglarka; Guttman, Andras
2018-06-05
More than a century ago, in 1893, a revolutionary idea about fixing biological tissue specimens was introduced by Ferdinand Blum, a German physician. Since then, a plethora of fixation methods have been investigated and used. Formalin fixation with paraffin embedment became the most widely used type of fixation and preservation method, due to its proper architectural conservation of tissue structures and cellular shape. The huge collection of formalin-fixed, paraffin-embedded (FFPE) sample archives worldwide holds a large amount of unearthed information about diseases that could be the Holy Grail in contemporary biomarker research utilizing analytical omics-based molecular diagnostics. The aim of this review is to critically evaluate the omics options for FFPE tissue sample analysis in the molecular diagnostics field. Copyright © 2018. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Krokhin, G.; Pestunov, A.; Arakelyan, E.; Mukhin, V.
2017-11-01
Over the last decades there has been growing interest in various aspects of intelligent diagnostics and management in thermal power engineering based on the hybrid principle. This is because conventional static methods do not adequately reflect the actual state of a power installation. To improve the quality of diagnostics, we use various fuzzy-system techniques. In this paper, we introduce an intelligent system, called SKAIS, intended for quick and precise diagnostics of thermal power equipment. The system was developed as the result of research carried out by specialists from the National Research University "Moscow Power Engineering Institute" and Novosibirsk State University of Economics and Management. It substantially increases the level of intelligence of the automatic power plant control system.
Periprosthetic joint infections: a clinical practice algorithm.
Volpe, Luigi; Indelli, Pier Francesco; Latella, Leonardo; Poli, Paolo; Yakupoglu, Jale; Marcucci, Massimiliano
2014-01-01
Periprosthetic joint infection (PJI) accounts for 25% of failed total knee arthroplasties (TKAs) and 15% of failed total hip arthroplasties (THAs). The purpose of the present study was to design a multidisciplinary diagnostic algorithm to detect a PJI as the cause of a painful TKA or THA. From April 2010 to October 2012, 111 patients with suspected PJI were evaluated. The study group comprised 75 females and 36 males with an average age of 71 years (range, 48 to 94 years). Eighty-four patients had a painful THA, while 27 reported a painful TKA. The stepwise diagnostic algorithm, applied in all the patients, included: measurement of serum C-reactive protein (CRP) and erythrocyte sedimentation rate (ESR) levels; imaging studies, including standard radiological examination and standard technetium-99m-methylene diphosphonate (MDP) bone scan (if positive, confirmation by LeukoScan was obtained); and joint aspiration with analysis of synovial fluid. Following application of the stepwise diagnostic algorithm, 24 out of our 111 screened patients were classified as having a suspected PJI (21.7%). CRP and ESR levels were negative in 84 and positive in 17 cases; 93.7% of the patients had a positive technetium-labeled bone scan, and 23% a positive LeukoScan. Preoperative synovial fluid analysis was positive in 13.5%; analysis of synovial fluid obtained by preoperative aspiration showed a leucocyte count of > 3000 cells/μl in 52% of the patients. The present study showed that the diagnosis of PJI requires the application of a multimodal diagnostic protocol in order to avoid complications related to surgical revision of a misdiagnosed "silent" PJI. Level IV, therapeutic case series.
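A hedged sketch of the stepwise screening logic is given below. Only the synovial leucocyte cutoff (> 3000 cells/μl) comes from the abstract; the way the findings are combined into a final decision is an illustrative assumption, not the published algorithm.

```python
# Sketch of the stepwise screening logic described in the abstract: serology
# first, then bone scan with LeukoScan confirmation, then joint aspiration.
# The combination rule ("at least two concordant findings") is illustrative.
def suspect_pji(crp_esr_positive, bone_scan_positive,
                leukoscan_positive, synovial_wbc_per_ul):
    findings = 0
    if crp_esr_positive:
        findings += 1
    if bone_scan_positive and leukoscan_positive:   # LeukoScan confirms the MDP scan
        findings += 1
    if synovial_wbc_per_ul is not None and synovial_wbc_per_ul > 3000:
        findings += 1
    return findings >= 2                            # illustrative decision rule

print(suspect_pji(True, True, True, 4200))    # True  -> work up as suspected PJI
print(suspect_pji(False, True, False, 1200))  # False -> look for other causes of pain
```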
Sanjuán, Pilar; Rodríguez-Núñez, Nuria; Rábade, Carlos; Lama, Adriana; Ferreiro, Lucía; González-Barcala, Francisco Javier; Alvarez-Dobaño, José Manuel; Toubes, María Elena; Golpe, Antonio; Valdés, Luis
2014-05-01
Clinical probability scores (CPS) determine the pre-test probability of pulmonary embolism (PE) and assess the need for the tests required in these patients. Our objective was to investigate whether PE is diagnosed according to clinical practice guidelines. Retrospective study of clinically suspected PE in the emergency department between January 2010 and December 2012. A D-dimer value ≥ 500 ng/ml was considered positive. PE was diagnosed on the basis of multislice computed tomography angiography and, to a lesser extent, other imaging techniques. The CPS used was the revised Geneva scoring system. There were 3,924 cases of suspected PE (56% female). Diagnosis was confirmed in 360 patients (9.2%) and the incidence was 30.6 cases per 100,000 inhabitants/year. Sensitivity and the negative predictive value of the D-dimer test were 98.7% and 99.2%, respectively. CPS was calculated in only 24 cases (0.6%) and diagnostic algorithms were not followed in 2,125 patients (54.2%): in 682 (17.4%) because clinical probability could not be estimated, and in 482 (37.6%), 852 (46.4%) and 109 (87.9%) with low, intermediate and high clinical probability, respectively, because the diagnostic algorithms for these probabilities were not applied. CPS are rarely calculated in the diagnosis of PE and the diagnostic algorithm is rarely used in clinical practice. This may result in procedures with potentially significant side effects being performed unnecessarily, or in a high risk of underdiagnosis. Copyright © 2013 SEPAR. Published by Elsevier Espana. All rights reserved.
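A minimal sketch of the kind of work-up the abstract describes (not the study's code or the full guideline): the clinical-probability category is assumed to come from a validated score such as the revised Geneva score, and the only numeric threshold used is the D-dimer cut-off of 500 ng/ml quoted above; function and variable names are illustrative.

```python
from typing import Optional

def pe_workup(clinical_probability: str, d_dimer_ng_ml: Optional[float] = None) -> str:
    """Suggest the next diagnostic step for suspected pulmonary embolism."""
    if clinical_probability == "high":
        # High pre-test probability: proceed directly to imaging.
        return "CT pulmonary angiography"
    if d_dimer_ng_ml is None:
        return "order D-dimer"
    if d_dimer_ng_ml < 500:
        return "PE excluded (negative D-dimer, non-high clinical probability)"
    return "CT pulmonary angiography"

print(pe_workup("low", 350))           # -> PE excluded ...
print(pe_workup("intermediate", 820))  # -> CT pulmonary angiography
```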
NASA Astrophysics Data System (ADS)
Wickersham, Andrew Joseph
There are two critical research needs for the study of hydrocarbon combustion in high speed flows: 1) combustion diagnostics with adequate temporal and spatial resolution, and 2) mathematical techniques that can extract key information from large datasets. The goal of this work is to address these needs, respectively, by the use of high speed and multi-perspective chemiluminescence and advanced mathematical algorithms. To obtain the measurements, this work explored the application of high speed chemiluminescence diagnostics and the use of fiber-based endoscopes (FBEs) for non-intrusive and multi-perspective chemiluminescence imaging up to 20 kHz. Non-intrusive and full-field imaging measurements provide a wealth of information for model validation and design optimization of propulsion systems. However, it is challenging to obtain such measurements due to various implementation difficulties such as optical access, thermal management, and equipment cost. This work therefore explores the application of FBEs for non-intrusive imaging in supersonic propulsion systems. The FBEs used in this work are demonstrated to overcome many of the aforementioned difficulties and provided datasets from multiple angular positions up to 20 kHz in a supersonic combustor. The combustor operated on ethylene fuel at Mach 2 with an inlet stagnation temperature and pressure of approximately 640 degrees Fahrenheit and 70 psia, respectively. The imaging measurements were obtained from eight perspectives simultaneously, providing full-field datasets under such flow conditions for the first time and allowing the possibility of inferring multi-dimensional measurements. Due to their high speed and multi-perspective nature, such new diagnostic capabilities generate a large volume of data and call for analysis algorithms that can process the data and extract key physics effectively. To extract the key combustion dynamics from the measurements, three mathematical methods were investigated in this work: Fourier analysis, proper orthogonal decomposition (POD), and wavelet analysis (WA). These algorithms were first demonstrated and tested on imaging measurements obtained from one perspective in a subsonic combustor (up to Mach 0.2). The results show that these algorithms are effective in extracting the key physics from large datasets, including the characteristic frequencies of flow-flame interactions, especially during transient processes such as lean blow-off and ignition. After these relatively simple tests and demonstrations, the algorithms were applied to process the measurements obtained from multiple perspectives in the supersonic combustor. Compared to past analyses (which have been limited to data obtained from one perspective only), the availability of data from multiple perspectives provides further insight into the flame and flow structures in high speed flows. In summary, this work shows that high speed chemiluminescence is a simple yet powerful combustion diagnostic. Especially when combined with FBEs and the analysis algorithms described in this work, such diagnostics provide full-field imaging at high repetition rate in challenging flows. Based on such measurements, a wealth of information can be obtained from proper analysis algorithms, including characteristic frequencies, dominant flame modes, and even multi-dimensional flame and flow structures.
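Of the three analysis methods named, POD is the most compact to illustrate. The following is a minimal snapshot-POD sketch via the singular value decomposition, assuming chemiluminescence frames have been flattened into a snapshot matrix; the array shapes, variable names and synthetic data are illustrative, not the thesis implementation.

```python
import numpy as np

def pod(frames: np.ndarray, n_modes: int = 10):
    """frames: (n_snapshots, n_pixels) array of imaging snapshots."""
    mean = frames.mean(axis=0)
    X = frames - mean                      # fluctuation field
    # Economy SVD: rows of Vt are spatial POD modes, S**2 is proportional to modal energy.
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    energy = S**2 / np.sum(S**2)
    coeffs = U[:, :n_modes] * S[:n_modes]  # temporal coefficients of each mode
    return mean, Vt[:n_modes], coeffs, energy[:n_modes]

# Example with synthetic data: 500 snapshots of a 64x64 image.
rng = np.random.default_rng(0)
frames = rng.standard_normal((500, 64 * 64))
_, modes, coeffs, energy = pod(frames, n_modes=5)
print(energy)  # fraction of fluctuation energy captured by each retained mode
```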
Rehm, K; Seeley, G W; Dallas, W J; Ovitt, T W; Seeger, J F
1990-01-01
One of the goals of our research in the field of digital radiography has been to develop contrast-enhancement algorithms for eventual use in the display of chest images on video devices with the aim of preserving the diagnostic information presently available with film, some of which would normally be lost because of the smaller dynamic range of video monitors. The ASAHE algorithm discussed in this article has been tested by investigating observer performance in a difficult detection task involving phantoms and simulated lung nodules, using film as the output medium. The results of the experiment showed that the algorithm is successful in providing contrast-enhanced, natural-looking chest images while maintaining diagnostic information. The algorithm did not effect an increase in nodule detectability, but this was not unexpected because film is a medium capable of displaying a wide range of gray levels. It is sufficient at this stage to show that there is no degradation in observer performance. Future tests will evaluate the performance of the ASAHE algorithm in preparing chest images for video display.
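The ASAHE algorithm itself is not specified in this abstract; as a hedged illustration of the general idea (adaptive contrast enhancement of wide-dynamic-range chest images for a display medium with fewer gray levels), the sketch below applies contrast-limited adaptive histogram equalization (CLAHE), a related but distinct technique, using scikit-image. The clip_limit value and the random stand-in image are assumptions.

```python
import numpy as np
from skimage import exposure

def enhance_chest_image(img: np.ndarray, clip_limit: float = 0.01) -> np.ndarray:
    """img: 2-D grayscale image scaled to [0, 1]; returns a contrast-enhanced copy."""
    return exposure.equalize_adapthist(img, clip_limit=clip_limit)

img = np.random.rand(512, 512)   # stand-in for a digitized chest radiograph
enhanced = enhance_chest_image(img)
```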
Diagnosing breast cancer using Raman spectroscopy: prospective analysis
NASA Astrophysics Data System (ADS)
Haka, Abigail S.; Volynskaya, Zoya; Gardecki, Joseph A.; Nazemi, Jon; Shenk, Robert; Wang, Nancy; Dasari, Ramachandra R.; Fitzmaurice, Maryann; Feld, Michael S.
2009-09-01
We present the first prospective test of Raman spectroscopy in diagnosing normal, benign, and malignant human breast tissues. Prospective testing of spectral diagnostic algorithms allows clinicians to accurately assess the diagnostic information contained in, and any bias of, the spectroscopic measurement. In previous work, we developed an accurate, internally validated algorithm for breast cancer diagnosis based on analysis of Raman spectra acquired from fresh-frozen in vitro tissue samples. Here we evaluate the performance of this algorithm prospectively on a large ex vivo clinical data set that closely mimics the in vivo environment. Spectroscopic data were collected from freshly excised surgical specimens, and 129 tissue sites from 21 patients were examined. Prospective application of the algorithm to the clinical data set resulted in a sensitivity of 83%, a specificity of 93%, a positive predictive value of 36%, and a negative predictive value of 99% for distinguishing cancerous from normal and benign tissues. The performance of the algorithm in different patient populations is discussed. Sources of bias in the in vitro calibration and ex vivo prospective data sets, including disease prevalence and disease spectrum, are examined and analytical methods for comparison are provided.
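A small worked sketch of how these four figures are defined from a 2x2 table. The counts below are hypothetical, chosen only to be consistent with 129 sites and the rounded percentages above; they are not the study's actual tabulation. The example also shows why a low cancer prevalence yields a modest positive predictive value despite high sensitivity and specificity.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    sens = tp / (tp + fn)   # sensitivity
    spec = tn / (tn + fp)   # specificity
    ppv = tp / (tp + fp)    # positive predictive value
    npv = tn / (tn + fn)    # negative predictive value
    return sens, spec, ppv, npv

# Hypothetical counts for 129 tissue sites with few cancers.
print(diagnostic_metrics(tp=5, fp=9, tn=114, fn=1))
# -> (0.833, 0.927, 0.357, 0.991), i.e. roughly 83%, 93%, 36%, 99%
```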
NASA Technical Reports Server (NTRS)
Russell, B. Don
1989-01-01
This research concentrated on the application of advanced signal processing, expert system, and digital technologies for the detection and control of low grade, incipient faults on spaceborne power systems. The researchers have considerable experience in the application of advanced digital technologies and the protection of terrestrial power systems. This experience was used in the current contracts to develop new approaches for protecting the electrical distribution system in spaceborne applications. The project was divided into three distinct areas: (1) investigate the applicability of fault detection algorithms developed for terrestrial power systems to the detection of faults in spaceborne systems; (2) investigate the digital hardware and architectures required to monitor and control spaceborne power systems with full capability to implement new detection and diagnostic algorithms; and (3) develop a real-time expert operating system for implementing diagnostic and protection algorithms. Significant progress has been made in each of the above areas. Several terrestrial fault detection algorithms were modified to better adapt to spaceborne power system environments. Several digital architectures were developed and evaluated in light of the fault detection algorithms.
A cDNA microarray gene expression data classifier for clinical diagnostics based on graph theory.
Benso, Alfredo; Di Carlo, Stefano; Politano, Gianfranco
2011-01-01
Despite great advances in discovering cancer molecular profiles, the proper application of microarray technology to routine clinical diagnostics is still a challenge. Current practices in the classification of microarray data show two main limitations: the reliability of the training data sets used to build the classifiers, and the classifiers' performance, especially when the sample to be classified does not belong to any of the available classes. In this case, state-of-the-art algorithms usually produce a high rate of false positives that, in real diagnostic applications, are unacceptable. To address this problem, this paper presents a new cDNA microarray data classification algorithm based on graph theory that is able to overcome most of the limitations of known classification methodologies. The classifier works by analyzing gene expression data organized in an innovative data structure based on graphs, where vertices correspond to genes and edges to gene expression relationships. To demonstrate the novelty of the proposed approach, the authors present an experimental performance comparison between the proposed classifier and several state-of-the-art classification algorithms.
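The paper's exact graph construction is not described in the abstract, so the sketch below only illustrates the general idea of turning gene expression relationships into a graph (vertices = genes, edges = strongly correlated pairs). The correlation threshold and the toy data are assumptions.

```python
import numpy as np
import networkx as nx

def expression_graph(X: np.ndarray, gene_names, threshold: float = 0.8) -> nx.Graph:
    """X: (n_samples, n_genes) expression matrix; returns a gene relationship graph."""
    corr = np.corrcoef(X, rowvar=False)        # gene-by-gene correlation matrix
    g = nx.Graph()
    g.add_nodes_from(gene_names)
    n = len(gene_names)
    for i in range(n):
        for j in range(i + 1, n):
            if abs(corr[i, j]) >= threshold:
                g.add_edge(gene_names[i], gene_names[j], weight=float(corr[i, j]))
    return g

X = np.random.rand(30, 5)                      # 30 arrays, 5 genes (toy data)
g = expression_graph(X, [f"gene{i}" for i in range(5)])
print(g.number_of_edges())
```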
A Diagnostic Approach for Electro-Mechanical Actuators in Aerospace Systems
NASA Technical Reports Server (NTRS)
Balaban, Edward; Saxena, Abhinav; Bansal, Prasun; Goebel, Kai Frank; Stoelting, Paul; Curran, Simon
2009-01-01
Electro-mechanical actuators (EMA) are finding increasing use in aerospace applications, especially with the trend towards all-electric aircraft and spacecraft designs. However, electro-mechanical actuators still lack the knowledge base accumulated for other fielded actuator types, particularly with regard to fault detection and characterization. This paper presents a thorough analysis of some of the critical failure modes documented for EMAs and describes experiments conducted on detecting and isolating a subset of them. The list of failures has been prepared through an extensive Failure Modes, Effects, and Criticality Analysis (FMECA) reference, literature review, and accessible industry experience. Methods for data acquisition and validation of algorithms on EMA test stands are described. A variety of condition indicators were developed that enabled detection, identification, and isolation among the various fault modes. A diagnostic algorithm based on an artificial neural network is shown to operate successfully using these condition indicators; furthermore, the robustness of these diagnostic routines to sensor faults is demonstrated by showing their ability to distinguish between sensor faults and component failures. The paper concludes with a roadmap leading from this effort towards developing successful prognostic algorithms for electromechanical actuators.
Ecological impacts and management strategies for western larch in the face of climate-change
Gerald E. Rehfeldt; Barry C. Jaquish
2010-01-01
Approximately 185,000 forest inventory and ecological plots from both USA and Canada were used to predict the contemporary distribution of western larch (Larix occidentalis Nutt.) from climate variables. The random forests algorithm, using an 8-variable model, produced an overall error rate of about 2.9 %, nearly all of which consisted of predicting presence at...
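A hedged sketch of the kind of model described: a random forests classifier predicting species presence/absence from climate variables, with the out-of-bag error as the headline error rate. The eight predictors and labels below are toy placeholders, not the study's 8-variable climate model or inventory data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 8))                   # 8 climate predictors per plot (toy)
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)    # toy presence/absence labels

model = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
model.fit(X, y)
print(1.0 - model.oob_score_)  # out-of-bag error rate, analogous to the ~2.9% reported
```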
NASA Astrophysics Data System (ADS)
Losik, L.
A predictive medicine program allows disease and illness, including mental illness, to be predicted using tools created to identify the presence of accelerated aging (a.k.a. disease) in electrical and mechanical equipment. When illness and disease can be predicted, actions can be taken to prevent or eliminate them. A predictive medicine program uses the same tools and practices as a prognostic and health management program to process biological and engineering diagnostic data provided as analog telemetry during prelaunch readiness activities and space exploration missions. The biological and engineering diagnostic data needed to predict illness and disease are collected during pre-launch spaceflight readiness activities and during space flight, so that the ground crew can perform a prognostic analysis on the results of a diagnostic analysis. The diagnostic biological data provided in telemetry are converted to prognostic (predictive) data using predictive algorithms, which demodulate telemetry behavior and reveal the presence of accelerated aging/disease in systems that appear and function normally. Mental illness can be predicted using biological diagnostic measurements provided in CCSDS telemetry from a spacecraft such as the ISS or from a manned spacecraft in deep space. The measurements used to predict mental illness include biological and engineering data from an astronaut's circadian and ultradian rhythms. These data originate deep in the brain, in regions that are also damaged by long-term exposure to cortisol and adrenaline whenever the body's fight-or-flight response is activated. This paper defines the brain's FOFR; identifies the diagnostic biological and engineering measurements needed to predict mental illness; and identifies the predictive algorithms necessary to process the behavior in CCSDS analog telemetry to predict, and thus prevent, mental illness on human spaceflight missions.
NASA Astrophysics Data System (ADS)
Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey
2017-02-01
Background: Raman spectroscopy is a non-invasive optical technique which can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) has demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most of the previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, either in the fingerprint or high wavenumber regions. Our objective in this presentation is to explore wavenumber selection based analysis in Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably-sized wavenumber windows, which were determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or the least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions were included. Lesion measurements were divided into a training cohort (n = 518) and a testing cohort (n = 127) according to the measurement time. Results: The area under the receiver operating characteristic curve (ROC) improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for sensitivity levels of 0.99-0.90 increased respectively from 0.17-0.65 to 0.20-0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.
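A hedged sketch of LASSO-based wavenumber selection (not the authors' code): fit a sparse linear model from Raman intensities to the benign/malignant label and keep wavenumbers with non-zero coefficients. The correlation-based grouping into variably-sized windows described in the abstract is omitted, and the spectra and labels are synthetic.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_lesions, n_wavenumbers = 200, 400
X = rng.normal(size=(n_lesions, n_wavenumbers))    # toy Raman spectra
y = (X[:, 50] - X[:, 300] + 0.5 * rng.normal(size=n_lesions) > 0).astype(float)

lasso = LassoCV(cv=5).fit(X, y)                    # cross-validated sparsity level
selected = np.flatnonzero(lasso.coef_)             # wavenumbers retained by the LASSO
print(f"{selected.size} wavenumbers selected, e.g. indices {selected[:10]}")
```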
Advances in Diagnostic Bronchoscopy
Haas, Andrew R.; Vachani, Anil; Sterman, Daniel H.
2010-01-01
Diagnostic bronchoscopy has undergone two major paradigm shifts in the last 40 years. First, the advent of flexible bronchoscopy gave chest physicians improved access to the tracheobronchial tree with a rapid learning curve and greater patient comfort compared with rigid bronchoscopy. The second paradigm shift has evolved over the last 5 years with the proliferation of new technologies that have significantly enhanced the diagnostic capabilities of flexible bronchoscopy compared with traditional methods. At the forefront of these new technologies is endobronchial ultrasound. In its various forms, endobronchial ultrasound has improved diagnostic yield for pulmonary masses, nodules, intrathoracic adenopathy, and disease extent, thereby reducing the need for more invasive surgical interventions. Various navigational bronchoscopy systems have become available to increase flexible bronchoscope access to small peripheral pulmonary lesions. Furthermore, various modalities of airway assessment, including optical microscopic imaging technologies, may play significant roles in the diagnosis of a variety of pulmonary diseases in the future. Finally, the combination of new diagnostic bronchoscopy technologies and novel approaches in molecular analysis and biomarker assessment hold promise for enhanced diagnosis and personalized management of many pulmonary disorders. In this review, we provide a contemporary review of diagnostic bronchoscopy developments over the past decade. PMID:20378726
Chen, Yong; Liu, Yulun; Ning, Jing; Cormier, Janice; Chu, Haitao
2014-01-01
Systematic reviews of diagnostic tests often involve a mixture of case-control and cohort studies. The standard methods for evaluating diagnostic accuracy only focus on sensitivity and specificity and ignore the information on disease prevalence contained in cohort studies. Consequently, such methods cannot provide estimates of measures related to disease prevalence, such as population averaged or overall positive and negative predictive values, which reflect the clinical utility of a diagnostic test. In this paper, we propose a hybrid approach that jointly models the disease prevalence along with the diagnostic test sensitivity and specificity in cohort studies, and the sensitivity and specificity in case-control studies. In order to overcome the potential computational difficulties in the standard full likelihood inference of the proposed hybrid model, we propose an alternative inference procedure based on the composite likelihood. Such composite likelihood based inference does not suffer computational problems and maintains high relative efficiency. In addition, it is more robust to model mis-specifications compared to the standard full likelihood inference. We apply our approach to a review of the performance of contemporary diagnostic imaging modalities for detecting metastases in patients with melanoma. PMID:25897179
Cashin, Andrew; Gallagher, Hilary; Newman, Claire; Hughes, Mark
2012-08-01
The next iteration of the Diagnostic and Statistical Manual of Mental Disorders is due for release in May 2013. The current diagnostic criteria for autism are based on a behavioral triad of impairment, which has been helpful for diagnosis and identifying the need for intervention, but is not useful with regard to developing interventions. Revised diagnostic criteria are needed to better inform research and therapeutic intervention. This article examines the research underpinning the behavioral triad of impairment to consider alternative explanations and a more useful framing for diagnosis and intervention. Contemporary research and literature on autism were used in this study. It is proposed that the cognitive processing triad of impaired abstraction, impaired theory of mind, and impaired linguistic processing become the triad of impairment for autism in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. These are investigable at the diagnostic level and can usefully inform intervention. Further, in addressing the debate on whether restrictive and repetitive behavior should remain central to diagnosis or be replaced by a deficit in imagination, the authors argue that both behavioral manifestations are underpinned by impaired abstraction. © 2012 Wiley Periodicals, Inc.
Hush, Julia M; Marcuzzi, Anna
2012-07-01
Contemporary clinical assessment of back pain is based on the diagnostic triage paradigm. The most common diagnostic classification is nonspecific back pain, considered to be of nociceptive etiology. A small proportion are diagnosed with radicular pain, of neuropathic origin. In this study we review the body of literature on the prevalence of neuropathic features of back pain, revealing that the point prevalence is 17% in primary care, 34% in mixed clinical settings and 53% in tertiary care. There is evidence that neuropathic features of back pain are not restricted to typical clinical radicular pain phenotypes and may be under-recognized, particularly in primary care. The consequence of this is that in the clinic, diagnostic triage may erroneously classify patients with nonspecific back pain or radicular pain. A promising alternative is the development of mechanism-based pain phenotyping in patients with back pain. Timely identification of contributory pain mechanisms may enable greater opportunity to select appropriate therapeutic targets and improve patient outcomes.
Shrestha, Swastina; Dave, Amish J; Losina, Elena; Katz, Jeffrey N
2016-07-07
Administrative health care data are frequently used to study disease burden and treatment outcomes in many conditions, including osteoarthritis (OA). OA is a chronic condition with significant disease burden affecting over 27 million adults in the US. There are few studies examining the performance of administrative data algorithms to diagnose OA. The purpose of this study is to perform a systematic review of administrative data algorithms for OA diagnosis and to evaluate the diagnostic characteristics of algorithms based on restrictiveness and reference standards. Two reviewers independently screened English-language articles published in Medline, Embase, PubMed, and Cochrane databases that used administrative data to identify OA cases. Each algorithm was classified as restrictive or less restrictive based on the number and type of administrative codes required to satisfy the case definition. We recorded sensitivity and specificity of algorithms and calculated the positive likelihood ratio (LR+) and positive predictive value (PPV) based on an assumed OA prevalence of 0.1, 0.25, and 0.50. The search identified 7 studies that used 13 algorithms. Of these 13 algorithms, 5 were classified as restrictive and 8 as less restrictive. Restrictive algorithms had lower median sensitivity and higher median specificity compared to less restrictive algorithms when the reference standards were self-report and American College of Rheumatology (ACR) criteria. The algorithms compared to a reference standard of physician diagnosis had higher sensitivity and specificity than those compared to self-reported diagnosis or ACR criteria. Restrictive algorithms are more specific for OA diagnosis and can be used to identify cases when false positives have higher costs, e.g., interventional studies. Less restrictive algorithms are more sensitive and suited for studies that attempt to identify all cases, e.g., screening programs.
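A worked example of the two derived quantities mentioned above: converting an algorithm's sensitivity and specificity into a positive likelihood ratio and the positive predictive value at an assumed prevalence. The sensitivity and specificity values are illustrative, not figures from the review; only the prevalence levels (0.1, 0.25, 0.5) come from the text.

```python
def lr_positive(sens: float, spec: float) -> float:
    """Positive likelihood ratio: sensitivity / (1 - specificity)."""
    return sens / (1.0 - spec)

def ppv(sens: float, spec: float, prevalence: float) -> float:
    """PPV from Bayes' rule at an assumed prevalence."""
    tp = sens * prevalence
    fp = (1.0 - spec) * (1.0 - prevalence)
    return tp / (tp + fp)

sens, spec = 0.75, 0.95                  # a hypothetical "restrictive" algorithm
print(lr_positive(sens, spec))           # 15.0
for prev in (0.10, 0.25, 0.50):
    print(prev, round(ppv(sens, spec, prev), 3))
```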
Breast Cancer Diagnostic System Final Report CRADA No. TC02098.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubenchik, A. M.; DaSilva, L. B.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and BioTelligent, Inc., together with a Russian institution (BioFil, Ltd.), to develop a new system (diagnostic device, operating procedures, algorithms and software) to accurately distinguish between benign and malignant breast tissue (Breast Cancer Diagnostic System, BCDS).
[EBOLA HEMORRHAGIC FEVER: DIAGNOSTICS, ETIOTROPIC AND PATHOGENETIC THERAPY, PREVENTION].
Zhdanov, K V; Zakharenko, S M; Kovalenko, A N; Semenov, A V; Fisun, A Ya
2015-01-01
The data on diagnostics, etiotropic and pathogenetic therapy, prevention of Ebola hemorrhagic fever are presented including diagnostic algorithms for different clinical situations. Fundamentals of pathogenetic therapy are described. Various groups of medications used for antiviral therapy of conditions caused by Ebola virus are characterized. Experimental drugs at different stages of clinical studies are considered along with candidate vaccines being developed for the prevention of the disease.
ERIC Educational Resources Information Center
Gelhorn, Heather; Hartman, Christie; Sakai, Joseph; Stallings, Michael; Young, Susan; Rhee, So Hyun; Corley, Robin; Hewitt, John; Hopger, Christian; Crowley, Thomas D.
2008-01-01
Clinical interviews of approximately 5,587 adolescents revealed that DSM-IV diagnostic categories were found to be different in terms of the severity of alcohol use disorders (AUDs). However, a substantial inconsistency and overlap was found in severity of AUDs across categories. The need for an alternative diagnostic algorithm which considers all…
Measurements and modeling of contemporary radiocarbon in the stratosphere
NASA Astrophysics Data System (ADS)
Kanu, A. M.; Comfort, L. L.; Guilderson, T. P.; Cameron-Smith, P. J.; Bergmann, D. J.; Atlas, E. L.; Schauffler, S.; Boering, K. A.
2016-02-01
Measurements of the 14C content of carbon dioxide in air collected by high-altitude balloon flights in 2003-2005 reveal the contemporary radiocarbon distribution in the northern midlatitude stratosphere, four decades after the Limited Test Ban Treaty restricted atmospheric testing of nuclear weapons. Comparisons with results from a 3-D chemical-transport model show that the 14CO2 distribution is now largely governed by the altitude/latitude dependence of the natural cosmogenic production rate, stratospheric transport, and propagation into the stratosphere of the decreasing radiocarbon trend in tropospheric CO2 due to fossil fuel combustion. From the observed correlation of 14CO2 with N2O mixing ratios, an annual global mean net flux of 14CO2 to the troposphere of 1.6(±0.4) × 10^17 ‰ mol CO2 yr^-1 and a global production rate of 2.2(±0.6) × 10^26 atoms 14C yr^-1 are empirically derived. The results also indicate that contemporary 14CO2 observations provide highly sensitive diagnostics for stratospheric transport and residence times in models.
Diagnosis of Posttraumatic Stress Disorder in Preschool Children
ERIC Educational Resources Information Center
De Young, Alexandra C.; Kenardy, Justin A.; Cobham, Vanessa E.
2011-01-01
This study investigated the existing diagnostic algorithms for posttraumatic stress disorder (PTSD) to determine the most developmentally sensitive and valid approach for diagnosing this disorder in preschoolers. Participants were 130 parents of unintentionally burned children (1-6 years). Diagnostic interviews were conducted with parents to…
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.
2003-01-01
A diagnostic tool for detecting damage to gears was developed. Two different measurement technologies, oil debris analysis and vibration, were integrated into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared to using individual measurement technologies. This diagnostic tool was developed and evaluated experimentally by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spur Gear Fatigue Rig. An oil debris sensor and two vibration algorithms were adapted as the diagnostic tools. An inductance type oil debris sensor was selected for the oil analysis measurement technology. Gear damage data for this type of sensor was limited to data collected in the NASA Glenn test rigs. For this reason, this analysis included development of a parameter for detecting gear pitting damage using this type of sensor. The vibration data was used to calculate two previously available gear vibration diagnostic algorithms. The two vibration algorithms were selected based on their maturity and published success in detecting damage to gears. Oil debris and vibration features were then developed using fuzzy logic analysis techniques, then input into a multi-sensor data fusion process. Results show combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spur gears. As a result of this research, this new diagnostic tool has significantly improved detection of gear damage in the NASA Glenn Spur Gear Fatigue Rigs. This research also resulted in several other findings that will improve the development of future health monitoring systems. Oil debris analysis was found to be more reliable than vibration analysis for detecting pitting fatigue failure of gears and is capable of indicating damage progression. Also, some vibration algorithms are as sensitive to operational effects as they are to damage. Another finding was that clear threshold limits must be established for diagnostic tools. Based on additional experimental data obtained from the NASA Glenn Spiral Bevel Gear Fatigue Rig, the methodology developed in this study can be successfully implemented on other geared systems.
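A hedged sketch (not the NASA code) of fuzzy-logic-style fusion of an oil-debris feature and a vibration feature into a single gear-damage index, in the spirit of the data fusion process described above. The membership breakpoints, weights and fusion rule are illustrative assumptions, not calibrated thresholds from the rig tests.

```python
import numpy as np

def membership(value, low, high):
    """Piecewise-linear membership: 0 below `low`, rising to 1 at `high`."""
    return float(np.clip((value - low) / (high - low), 0.0, 1.0))

def damage_index(debris_mass_mg, vibration_metric,
                 debris_limits=(10.0, 40.0), vib_limits=(1.0, 3.0)):
    mu_debris = membership(debris_mass_mg, *debris_limits)
    mu_vib = membership(vibration_metric, *vib_limits)
    # Simple fusion rule: weight oil debris more heavily (reported as the more
    # reliable cue) and require supporting evidence from both channels.
    return 0.7 * mu_debris + 0.3 * min(mu_debris, mu_vib)

print(damage_index(debris_mass_mg=25.0, vibration_metric=2.5))  # -> 0.5
```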
Approximation algorithms for a genetic diagnostics problem.
Kosaraju, S R; Schäffer, A A; Biesecker, L G
1998-01-01
We define and study a combinatorial problem called WEIGHTED DIAGNOSTIC COVER (WDC) that models the use of a laboratory technique called genotyping in the diagnosis of an important class of chromosomal aberrations. An optimal solution to WDC would enable us to define a genetic assay that maximizes the diagnostic power for a specified cost of laboratory work. We develop approximation algorithms for WDC by making use of the well-known problem SET COVER for which the greedy heuristic has been extensively studied. We prove worst-case performance bounds on the greedy heuristic for WDC and for another heuristic we call directional greedy. We implemented both heuristics. We also implemented a local search heuristic that takes the solutions obtained by greedy and dir-greedy and applies swaps until they are locally optimal. We report their performance on a real data set that is representative of the options that a clinical geneticist faces for the real diagnostic problem. Many open problems related to WDC remain, both of theoretical interest and practical importance.
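For background only, since the abstract builds on SET COVER: the classic greedy heuristic for weighted set cover is sketched below. WDC itself adds diagnostic-power weights and constraints that are not modeled here, and the sets and costs are toy values.

```python
def greedy_set_cover(universe, sets, cost):
    """sets: dict name -> frozenset of covered elements; cost: dict name -> float."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # Pick the set with the best cost per newly covered element.
        name = min(
            (s for s in sets if sets[s] & uncovered),
            key=lambda s: cost[s] / len(sets[s] & uncovered),
        )
        chosen.append(name)
        uncovered -= sets[name]
    return chosen

universe = range(1, 7)
sets = {"A": frozenset({1, 2, 3}), "B": frozenset({3, 4}), "C": frozenset({4, 5, 6})}
cost = {"A": 2.0, "B": 1.0, "C": 2.5}
print(greedy_set_cover(universe, sets, cost))   # -> ['B', 'A', 'C']
```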
Recurrent Pneumonia in Children: A Reasoned Diagnostic Approach and a Single Centre Experience.
Montella, Silvia; Corcione, Adele; Santamaria, Francesca
2017-01-29
Recurrent pneumonia (RP), i.e., at least two episodes of pneumonia in one year or three episodes ever with intercritical radiographic clearing of densities, occurs in 7.7%-9% of children with community-acquired pneumonia. In RP, the challenge is to discriminate between children with self-limiting or minor problems, that do not require a diagnostic work-up, and those with an underlying disease. The aim of the current review is to discuss a reasoned diagnostic approach to RP in childhood. Particular emphasis has been placed on which children should undergo a diagnostic work-up and which tests should be performed. A pediatric case series is also presented, in order to document a single centre experience of RP. A management algorithm for the approach to children with RP, based on the evidence from a literature review, is proposed. Like all algorithms, it is not meant to replace clinical judgment, but it should drive physicians to adopt a systematic approach to pediatric RP and provide a useful guide to the clinician.
Walusimbi, Simon; Kwesiga, Brendan; Rodrigues, Rashmi; Haile, Melles; de Costa, Ayesha; Bogg, Lennart; Katamba, Achilles
2016-10-10
Microscopic Observation Drug Susceptibility (MODS) and Xpert MTB/Rif (Xpert) are highly sensitive tests for diagnosis of pulmonary tuberculosis (PTB). This study evaluated the cost effectiveness of utilizing MODS versus Xpert for diagnosis of active pulmonary TB in HIV-infected patients in Uganda. A decision analysis model comparing MODS versus Xpert for TB diagnosis was used. Costs were estimated by measuring and valuing the relevant resources required to perform the MODS and Xpert tests. Diagnostic accuracy data for the tests were obtained from systematic reviews involving HIV-infected patients. We calculated base values for unit costs and varied several assumptions to obtain the range estimates. Cost effectiveness was expressed as cost per TB patient diagnosed for each of the two diagnostic strategies. Base case analysis was performed using the base estimates for unit cost and diagnostic accuracy of the tests. Sensitivity analysis was performed using a range of value estimates for resources, prevalence, number of tests and diagnostic accuracy. The unit cost of MODS was US$ 6.53 versus US$ 12.41 for Xpert. Consumables accounted for 59 % (US$ 3.84 of 6.53) of the unit cost for MODS and 84 % (US$ 10.37 of 12.41) of the unit cost for Xpert. The cost effectiveness ratio of the algorithm using MODS was US$ 34 per TB patient diagnosed compared to US$ 71 for the algorithm using Xpert. The algorithm using MODS was more cost-effective compared to the algorithm using Xpert for a wide range of values of accuracy, cost and TB prevalence. The cost (threshold value) at which the algorithm using Xpert would become optimal over the algorithm using MODS was US$ 5.92. MODS was more cost-effective than Xpert for the diagnosis of PTB among HIV patients in our setting. Efforts to scale up MODS therefore need to be explored. However, since other non-economic factors may still favour the use of Xpert, the current cost of the Xpert cartridge still needs to be reduced further by more than half, in order to make it economically competitive with MODS.
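A hedged sketch of the headline cost-effectiveness ratio: cost per TB patient diagnosed = total testing cost / number of TB cases detected. Only the unit costs (US$ 6.53 for MODS, US$ 12.41 for Xpert) come from the abstract; the cohort size, prevalence and sensitivities below are illustrative placeholders, not the study's decision-tree inputs.

```python
def cost_per_case_diagnosed(unit_cost, n_tested, prevalence, sensitivity):
    total_cost = unit_cost * n_tested
    cases_detected = n_tested * prevalence * sensitivity
    return total_cost / cases_detected

for name, unit_cost, sens in [("MODS", 6.53, 0.85), ("Xpert", 12.41, 0.80)]:
    # 1,000 patients tested, 20% TB prevalence (assumed values)
    print(name, round(cost_per_case_diagnosed(unit_cost, 1000, 0.20, sens), 2))
```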
Code-based Diagnostic Algorithms for Idiopathic Pulmonary Fibrosis. Case Validation and Improvement.
Ley, Brett; Urbania, Thomas; Husson, Gail; Vittinghoff, Eric; Brush, David R; Eisner, Mark D; Iribarren, Carlos; Collard, Harold R
2017-06-01
Population-based studies of idiopathic pulmonary fibrosis (IPF) in the United States have been limited by reliance on diagnostic code-based algorithms that lack clinical validation. To validate a well-accepted International Classification of Diseases, Ninth Revision, code-based algorithm for IPF using patient-level information and to develop a modified algorithm for IPF with enhanced predictive value. The traditional IPF algorithm was used to identify potential cases of IPF in the Kaiser Permanente Northern California adult population from 2000 to 2014. Incidence and prevalence were determined overall and by age, sex, and race/ethnicity. A validation subset of cases (n = 150) underwent expert medical record and chest computed tomography review. A modified IPF algorithm was then derived and validated to optimize positive predictive value. From 2000 to 2014, the traditional IPF algorithm identified 2,608 cases among 5,389,627 at-risk adults in the Kaiser Permanente Northern California population. Annual incidence was 6.8/100,000 person-years (95% confidence interval [CI], 6.1-7.7) and was higher in patients with older age, male sex, and white race. The positive predictive value of the IPF algorithm was only 42.2% (95% CI, 30.6 to 54.6%); sensitivity was 55.6% (95% CI, 21.2 to 86.3%). The corrected incidence was estimated at 5.6/100,000 person-years (95% CI, 2.6-10.3). A modified IPF algorithm had improved positive predictive value but reduced sensitivity compared with the traditional algorithm. A well-accepted International Classification of Diseases, Ninth Revision, code-based IPF algorithm performs poorly, falsely classifying many non-IPF cases as IPF and missing a substantial proportion of IPF cases. A modification of the IPF algorithm may be useful for future population-based studies of IPF.
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1996-01-01
NASA's advanced propulsion system, the Space Shuttle Main Engine (SSME/ATD), has been undergoing extensive flight certification and developmental testing, which involves large numbers of health monitoring measurements. To enhance engine safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess its dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce the risk of catastrophic system failures and expedite the evaluation of both flight and ground test data, thereby reducing launch turn-around time. During the development of the SSME, ASRI participated in the research and development of several advanced non-linear signal diagnostic methods for health monitoring and failure prediction in turbomachinery components. However, due to the intensive computational requirements associated with such advanced analysis tasks, current SSME dynamic data analysis and diagnostic evaluation is performed off-line following flight or ground test, with a typical diagnostic turnaround time of one to two days. The objective of MSFC's MPP Prototype System is to eliminate such 'diagnostic lag time' by achieving signal processing and analysis in real time. Such an on-line diagnostic system can provide sufficient lead time to initiate corrective action and also enable efficient scheduling of inspection, maintenance and repair activities. The major objective of this project was to convert and implement a number of advanced nonlinear diagnostic DSP algorithms in a format consistent with that required for integration into the Vanderbilt Multigraph Architecture (MGA) Model Based Programming environment. This effort will allow the real-time execution of these algorithms using the MSFC MPP Prototype System. ASRI has completed the software conversion and integration of a sequence of nonlinear signal analysis techniques specified in the SOW for real-time execution on MSFC's MPP Prototype. This report documents and summarizes the results of the contract tasks, provides the complete computer source code, including all FORTRAN/C utilities, and lists all other supporting software libraries required for operation.
[Autism in children. Speech, behavior and motor activity point to diagnosis].
Neumärker, K J
2001-02-01
Autistic disorders characteristically involve specific impairments of social skills and language, together with stereotyped body movements. L. Kanner and H. Asperger were the first to describe these psychopathologic features, which still form the core of the diagnostic criteria in contemporary psychiatric classification systems, ICD-10 and DSM-IV, under the category of pervasive developmental disorders. Useful diagnostic tools have been developed to establish the clinical diagnosis. The results of research point to a predominantly genetic pathogenesis involving a complex interaction of multiple genes. While no causal treatments are available for these heterogeneous disorders, there are many therapeutic concepts. Although some treatments may achieve significant improvements, autistic disorders usually entail lifelong individual impairment.
Cremers, Charlotte H P; Dankbaar, Jan Willem; Vergouwen, Mervyn D I; Vos, Pieter C; Bennink, Edwin; Rinkel, Gabriel J E; Velthuis, Birgitta K; van der Schaaf, Irene C
2015-05-01
Tracer delay-sensitive perfusion algorithms in CT perfusion (CTP) result in an overestimation of the extent of ischemia in thromboembolic stroke. In diagnosing delayed cerebral ischemia (DCI) after aneurysmal subarachnoid hemorrhage (aSAH), delayed arrival of contrast due to vasospasm may also overestimate the extent of ischemia. We investigated the diagnostic accuracy of tracer delay-sensitive and tracer delay-insensitive algorithms for detecting DCI. From a prospectively collected series of aSAH patients admitted between 2007 and 2011, we included patients with any clinical deterioration other than rebleeding within 21 days after SAH who underwent NCCT/CTP/CTA imaging. Causes of clinical deterioration were categorized into DCI and no DCI. CTP maps were calculated with tracer delay-sensitive and tracer delay-insensitive algorithms and were visually assessed for the presence of perfusion deficits by two independent observers with different levels of experience. The diagnostic value of both algorithms was calculated for both observers. Seventy-one patients were included. For the experienced observer, the positive predictive values (PPVs) were 0.67 for the delay-sensitive and 0.66 for the delay-insensitive algorithm, and the negative predictive values (NPVs) were 0.73 and 0.74. For the less experienced observer, PPVs were 0.60 for both algorithms, and NPVs were 0.66 for the delay-sensitive and 0.63 for the delay-insensitive algorithm. Test characteristics are comparable for tracer delay-sensitive and tracer delay-insensitive algorithms for the visual assessment of CTP in diagnosing DCI. This indicates that both algorithms can be used for this purpose.
ERIC Educational Resources Information Center
Hus, Vanessa; Lord, Catherine
2013-01-01
The Autism Diagnostic Interview-Revised (ADI-R) is commonly used to inform diagnoses of autism spectrum disorders (ASD). Considering the time dedicated to using the ADI-R, it is of interest to expand the ways in which information obtained from this interview is used. The current study examines how algorithm totals reflecting past (ADI-Diagnostic)…
Nallikuzhy, Jiss J; Dandapat, S
2017-06-01
In this work, a new patient-specific approach to enhance the spatial resolution of the ECG is proposed and evaluated. The proposed model transforms a three-lead ECG into a standard twelve-lead ECG, thereby enhancing its spatial resolution. The three leads used for prediction are obtained from the standard twelve-lead ECG. The proposed model takes advantage of the improved inter-lead correlation in the wavelet domain. Since the model is patient-specific, it also selects the optimal predictor leads for a given patient using a lead selection algorithm. The lead selection algorithm is based on a new diagnostic similarity score which computes the diagnostic closeness between the original and the spatially enhanced leads. Standard closeness measures are used to assess the performance of the model. The similarity in diagnostic information between the original and the spatially enhanced leads is evaluated using various diagnostic measures. Repeatability and diagnosability analyses are performed to quantify the applicability of the model. A comparison of the proposed model is performed with existing models that transform a subset of the standard twelve-lead ECG into the standard twelve-lead ECG. From the analysis of the results, it is evident that the proposed model preserves diagnostic information better compared to other models. Copyright © 2017 Elsevier Ltd. All rights reserved.
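A hedged sketch of the general idea (not the authors' model): predict a target ECG lead from three predictor leads by linear regression on their wavelet coefficients, exploiting inter-lead correlation in the wavelet domain. The wavelet family, decomposition level, toy signals and the purely linear per-band model are assumptions for illustration.

```python
import numpy as np
import pywt

def fit_wavelet_lead_model(predictors, target, wavelet="db4", level=4):
    """predictors: (3, n_samples); target: (n_samples,). Returns per-band weights."""
    tgt_coeffs = pywt.wavedec(target, wavelet, level=level)
    pred_coeffs = [pywt.wavedec(p, wavelet, level=level) for p in predictors]
    weights = []
    for band in range(len(tgt_coeffs)):
        A = np.stack([pc[band] for pc in pred_coeffs], axis=1)   # (n_coeffs, 3)
        w, *_ = np.linalg.lstsq(A, tgt_coeffs[band], rcond=None)
        weights.append(w)
    return weights, wavelet, level

def predict_lead(predictors, model):
    weights, wavelet, level = model
    pred_coeffs = [pywt.wavedec(p, wavelet, level=level) for p in predictors]
    est = [np.stack([pc[b] for pc in pred_coeffs], axis=1) @ weights[b]
           for b in range(len(weights))]
    return pywt.waverec(est, wavelet)

# Toy signals standing in for ECG leads.
t = np.linspace(0, 10, 4096)
leads = np.vstack([np.sin(2 * np.pi * f * t) for f in (1.0, 1.3, 1.7)])
target = 0.5 * leads[0] + 0.3 * leads[1] - 0.2 * leads[2]
model = fit_wavelet_lead_model(leads, target)
err = predict_lead(leads, model)[:target.size] - target
print(np.max(np.abs(err)))   # ~0 for this purely linear toy case
```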
Development of a novel diagnostic algorithm to predict NASH in HCV-positive patients.
Gallotta, Andrea; Paneghetti, Laura; Mrázová, Viera; Bednárová, Adriana; Kružlicová, Dáša; Frecer, Vladimir; Miertus, Stanislav; Biasiolo, Alessandra; Martini, Andrea; Pontisso, Patrizia; Fassina, Giorgio
2018-05-01
Non-alcoholic steato-hepatitis (NASH) is a severe disease characterised by liver inflammation and progressive hepatic fibrosis, which may progress to cirrhosis and hepatocellular carcinoma. Clinical evidence suggests that in hepatitis C virus patients steatosis and NASH are associated with faster fibrosis progression and hepatocellular carcinoma. A safe and reliable non-invasive diagnostic method to detect NASH at its early stages is still needed to prevent progression of the disease. We prospectively enrolled 91 hepatitis C virus-positive patients with histologically proven chronic liver disease: 77 patients were included in our study; of these, 10 had NASH. For each patient, various clinical and serological variables were collected. Different algorithms combining squamous cell carcinoma antigen-immunoglobulin-M (SCCA-IgM) levels with other common clinical data were created to provide the probability of having NASH. Our analysis revealed a statistically significant correlation between the histological presence of NASH and SCCA-IgM, insulin, homeostasis model assessment, haemoglobin, high-density lipoprotein and ferritin levels, and smoking. Compared to the use of a single marker, algorithms that combined four, six or seven variables identified NASH with higher accuracy. The best diagnostic performance was obtained with the logistic regression combination, which included all seven variables correlated with NASH. The combination of SCCA-IgM with common clinical data shows promising diagnostic performance for the detection of NASH in hepatitis C virus patients.
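A hedged sketch of the combination strategy (toy data, not the study's cohort or fitted coefficients): a logistic regression over SCCA-IgM and the other routine variables that outputs a probability of NASH for a new patient. The feature ordering and random data are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 77
# Columns stand in for: SCCA-IgM, insulin, HOMA, haemoglobin, HDL, ferritin, smoking (0/1)
X = rng.normal(size=(n, 7))
y = rng.integers(0, 2, size=n)                  # placeholder NASH labels

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
new_patient = rng.normal(size=(1, 7))
print(model.predict_proba(new_patient)[0, 1])   # estimated probability of NASH
```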
Hatzichristou, Dimitris; Kirana, Paraskevi-Sofia; Banner, Linda; Althof, Stanley E; Lonnee-Hoffmann, Risa A M; Dennerstein, Lorraine; Rosen, Raymond C
2016-08-01
A detailed sexual history is the cornerstone for all sexual problem assessments and sexual dysfunction diagnoses. Diagnostic evaluation is based on an in-depth sexual history, including sexual and gender identity and orientation, sexual activity and function, current level of sexual function, overall health and comorbidities, partner relationship and interpersonal factors, and the role of cultural and personal expectations and attitudes. To propose key steps in the diagnostic evaluation of sexual dysfunctions, with special focus on the use of symptom scales and questionnaires. Critical assessment of the current literature by the International Consultation on Sexual Medicine committee. A revised algorithm for the management of sexual dysfunctions, level of evidence, and recommendation for scales and questionnaires. The International Consultation on Sexual Medicine proposes an updated algorithm for diagnostic evaluation of sexual dysfunction in men and women, with specific recommendations for sexual history taking and diagnostic evaluation. Standardized scales, checklists, and validated questionnaires are additional adjuncts that should be used routinely in sexual problem evaluation. Scales developed for specific patient groups are included. Results of this evaluation are presented with recommendations for clinical and research uses. Defined principles, an algorithm and a range of scales may provide coherent and evidence based management for sexual dysfunctions. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
Stothard, J Russell; Adams, Emily
2014-12-01
There are many reasons why detection of parasites of medical and veterinary importance is vital and where novel diagnostic and surveillance tools are required. From a medical perspective alone, these originate from a desire for better clinical management and rational use of medications. Diagnosis can be at the individual-level, at close to patient settings in testing a clinical suspicion or at the community-level, perhaps in front of a computer screen, in classification of endemic areas and devising appropriate control interventions. Thus diagnostics for parasitic diseases has a broad remit as parasites are not only tied with their definitive hosts but also in some cases with their vectors/intermediate hosts. Application of current diagnostic tools and decision algorithms in sustaining control programmes, or in elimination settings, can be problematic and even ill-fitting. For example in resource-limited settings, are current diagnostic tools sufficiently robust for operational use at scale or are they confounded by on-the-ground realities; are the diagnostic algorithms underlying public health interventions always understood and well-received within communities which are targeted for control? Within this Special Issue (SI) covering a variety of diseases and diagnostic settings some answers are forthcoming. An important theme, however, throughout the SI is to acknowledge that cross-talk and continuous feedback between development and application of diagnostic tests is crucial if they are to be used effectively and appropriately.
Maximum likelihood phase-retrieval algorithm: applications.
Nahrstedt, D A; Southwell, W H
1984-12-01
The maximum likelihood estimator approach is shown to be effective in determining the wave front aberration in systems involving laser and flow field diagnostics and optical testing. The robustness of the algorithm enables convergence even in cases of severe wave front error and real, nonsymmetrical, obscured amplitude distributions.
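For context only: the paper's maximum-likelihood estimator is not reproduced here. The sketch below shows the classical Gerchberg-Saxton iteration, a simpler phase-retrieval baseline that recovers a phase consistent with measured amplitudes in two planes; the array sizes and iteration count are arbitrary.

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, n_iter=200):
    """source_amp, target_amp: measured amplitude images of equal shape."""
    phase = np.zeros_like(source_amp)
    for _ in range(n_iter):
        field = source_amp * np.exp(1j * phase)          # impose source-plane amplitude
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))    # impose target-plane amplitude
        phase = np.angle(np.fft.ifft2(far))              # keep only the recovered phase
    return phase

src = np.ones((64, 64))
tgt = np.abs(np.fft.fft2(np.exp(1j * np.random.rand(64, 64))))
est_phase = gerchberg_saxton(src, tgt)
```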
Diagnosis and treatment of gastroesophageal reflux disease complicated by Barrett's esophagus.
Stasyshyn, Andriy
2017-08-31
The aim of the study was to evaluate the effectiveness of a diagnostic and therapeutic algorithm for gastroesophageal reflux disease (GERD) complicated by Barrett's esophagus in 46 patients. A diagnostic and therapeutic algorithm for complicated GERD was developed. To describe the changes in the esophagus with reflux esophagitis, the Los Angeles classification was used. Intestinal metaplasia of the epithelium in the lower third of the esophagus was assessed using videoendoscopy, chromoscopy, and biopsy. Quality of life was assessed with the Gastro-Intestinal Quality of Life Index. The methods used were modeling, clinical, analytical, comparative, standardized, and questionnaire-based. Among the complications of GERD, Barrett's esophagus was diagnosed in 9 (19.6%) patients, peptic ulcer of the esophagus in 10 (21.7%), peptic stricture of the esophagus in 4 (8.7%), and esophageal-gastric bleeding in 23 (50.0%), including Mallory-Weiss syndrome in 18 and erosive-ulcerous bleeding in 5. Hiatal hernia was diagnosed in 171 (87.7%) patients (sliding in 157 (91.8%), paraesophageal in 2 (1.2%), and mixed in 12 (7.0%) cases). One hundred ninety-five patients underwent laparoscopic surgery: Nissen fundoplication was performed in 176 (90.2%) patients, Toupet fundoplication in 14 (7.2%), and Dor fundoplication in 5 (2.6%). It was established that the use of the diagnostic and treatment algorithm promoted systematization and objectification of changes in complicated GERD, contributed to early diagnosis, helped in choosing treatment, and improved quality of life. Argon coagulation and the use of PPIs for 8-12 weeks before surgery led to regeneration of the mucous membrane of the esophagus. The developed diagnostic and therapeutic algorithm facilitated systematization and objectification of changes in complicated GERD, contributed to early diagnosis, helped in choosing treatment, and improved quality of life.
Quint, Jennifer K; Müllerova, Hana; DiSantostefano, Rachael L; Forbes, Harriet; Eaton, Susan; Hurst, John R; Davis, Kourtney; Smeeth, Liam
2014-01-01
Objectives: The optimal method of identifying people with chronic obstructive pulmonary disease (COPD) from electronic primary care records is not known. We assessed the accuracy of different approaches using the Clinical Practice Research Datalink (CPRD), a UK electronic health record database. Setting: 951 participants registered with a CPRD practice in the UK between 1 January 2004 and 31 December 2012. Individuals were selected for ≥1 of 8 algorithms to identify people with COPD. General practitioners were sent a brief questionnaire and additional evidence to support a COPD diagnosis was requested. All information received was reviewed independently by two respiratory physicians whose opinion was taken as the gold standard. Primary outcome measure: The primary measure of accuracy was the positive predictive value (PPV), the proportion of people identified by each algorithm for whom COPD was confirmed. Results: 951 questionnaires were sent and 738 (78%) returned. After quality control, 696 (73.2%) patients were included in the final analysis. All four algorithms including a specific COPD diagnostic code performed well. Using a diagnostic code alone, the PPV was 86.5% (77.5–92.3%), while requiring a diagnosis plus spirometry plus specific medication gave a slightly higher PPV of 89.4% (80.7–94.5%) but reduced case numbers by 10%. Algorithms without specific diagnostic codes had low PPVs (range 12.2–44.4%). Conclusions: Patients with COPD can be accurately identified from UK primary care records using specific diagnostic codes. Requiring spirometry or COPD medications only marginally improved accuracy. The high accuracy applies since the introduction of an incentivised disease register for COPD as part of the Quality and Outcomes Framework in 2004. PMID:25056980
Baltzer, Pascal A T; Dietzel, Matthias; Kaiser, Werner A
2013-08-01
In the face of multiple available diagnostic criteria in MR-mammography (MRM), a practical algorithm for lesion classification is needed. Such an algorithm should be as simple as possible and include only important independent lesion features to differentiate benign from malignant lesions. This investigation aimed to develop a simple classification tree for differential diagnosis in MRM. A total of 1,084 lesions in standardised MRM with subsequent histological verification (648 malignant, 436 benign) were investigated. Seventeen lesion criteria were assessed by 2 readers in consensus. Classification analysis was performed using the chi-squared automatic interaction detection (CHAID) method. Results include the probability for malignancy for every descriptor combination in the classification tree. A classification tree incorporating 5 lesion descriptors with a depth of 3 ramifications (1, root sign; 2, delayed enhancement pattern; 3, border, internal enhancement and oedema) was calculated. Of all 1,084 lesions, 262 (40.4 %) and 106 (24.3 %) could be classified as malignant and benign with an accuracy above 95 %, respectively. Overall diagnostic accuracy was 88.4 %. The classification algorithm reduced the number of categorical descriptors from 17 to 5 (29.4 %), resulting in a high classification accuracy. More than one third of all lesions could be classified with accuracy above 95 %. • A practical algorithm has been developed to classify lesions found in MR-mammography. • A simple decision tree consisting of five criteria reaches high accuracy of 88.4 %. • Unique to this approach, each classification is associated with a diagnostic certainty. • Diagnostic certainty of greater than 95 % is achieved in 34 % of all cases.
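A hedged illustration of the same kind of shallow classification tree: scikit-learn has no CHAID implementation, so a CART decision tree is used as a stand-in, and the five binary descriptors and labels below are toy data, not the 1,084 histologically verified lesions. The printed tree shows how each descriptor combination maps to a malignancy estimate.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 1084
# Toy binary descriptors standing in for: root sign, delayed enhancement pattern,
# border, internal enhancement, oedema (values are not real data).
X = rng.integers(0, 2, size=(n, 5))
y = ((X[:, 0] & X[:, 1]) | X[:, 4]).astype(int)   # toy "malignant" label

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=[
    "root_sign", "delayed_enhancement", "border", "internal_enhancement", "oedema"]))
# tree.predict_proba(...) gives a probability of malignancy per descriptor combination
```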
Sideroudi, Haris; Labiris, Georgios; Georgantzoglou, Kimon; Ntonti, Panagiota; Siganos, Charalambos; Kozobolis, Vassilios
2017-07-01
To develop an algorithm for the Fourier analysis of posterior corneal videokeratographic data and to evaluate the derived parameters in the diagnosis of subclinical keratoconus (SKC) and keratoconus (KC). This was a cross-sectional, observational study that took place in the Eye Institute of Thrace, Democritus University, Greece. Eighty eyes formed the KC group, 55 eyes formed the SKC group, while 50 normal eyes populated the control group. A self-developed algorithm in Visual Basic for Microsoft Excel performed a Fourier series harmonic analysis of the posterior corneal sagittal curvature data. The algorithm decomposed the obtained curvatures into a spherical component, regular astigmatism, asymmetry and higher order irregularities for the averaged central 4 mm area and for each individual ring separately (1, 2, 3 and 4 mm). The obtained values were evaluated for their diagnostic capacity using receiver operating characteristic (ROC) curves. Logistic regression was attempted for the identification of a combined diagnostic model. Significant differences were detected in regular astigmatism, asymmetry and higher order irregularities among groups. For the SKC group, the parameters with high diagnostic ability (AUC > 90%) were the higher order irregularities, the asymmetry and the regular astigmatism, mainly in the corneal periphery. Higher predictive accuracy was identified using diagnostic models that combined the asymmetry, regular astigmatism and higher order irregularities in the averaged 3 and 4 mm area (AUC: 98.4%, sensitivity: 91.7% and specificity: 100%). Fourier decomposition of posterior keratometric data provides parameters with high accuracy in differentiating SKC from normal corneas and should be included in the prompt diagnosis of KC. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.
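A hedged sketch of the harmonic decomposition of a single videokeratography ring (the study's Visual Basic implementation is not reproduced): under the usual convention, the mean gives the spherical component, the 1st harmonic the asymmetry, the 2nd harmonic the regular astigmatism, and harmonics of order 3 and above the higher-order irregularities. The sampling density and toy ring values are assumptions.

```python
import numpy as np

def decompose_ring(curvatures):
    """curvatures: keratometric values sampled at equal angles around one ring."""
    c = np.fft.rfft(curvatures) / len(curvatures)
    spherical = c[0].real                     # mean component
    asymmetry = 2 * np.abs(c[1])              # 1st harmonic amplitude
    regular_astigmatism = 2 * np.abs(c[2])    # 2nd harmonic amplitude
    higher_order = 2 * np.sum(np.abs(c[3:]))  # remaining harmonics
    return spherical, asymmetry, regular_astigmatism, higher_order

theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
ring = 6.4 + 0.15 * np.cos(2 * theta) + 0.05 * np.cos(theta)   # toy posterior ring (mm)
print(decompose_ring(ring))   # ~ (6.4, 0.05, 0.15, ~0)
```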
Diagnostic algorithm for relapsing acquired demyelinating syndromes in children.
Hacohen, Yael; Mankad, Kshitij; Chong, W K; Barkhof, Frederik; Vincent, Angela; Lim, Ming; Wassmer, Evangeline; Ciccarelli, Olga; Hemingway, Cheryl
2017-07-18
To establish whether children with relapsing acquired demyelinating syndromes (RDS) and myelin oligodendrocyte glycoprotein antibodies (MOG-Ab) show distinctive clinical and radiologic features and to generate a diagnostic algorithm for the main RDS for clinical use. A panel reviewed the clinical characteristics, MOG-Ab and aquaporin-4 (AQP4) Ab, intrathecal oligoclonal bands, and Epstein-Barr virus serology results of 110 children with RDS. A neuroradiologist blinded to the diagnosis scored the MRI scans. Clinical, radiologic, and serologic tests results were compared. The findings showed that 56.4% of children were diagnosed with multiple sclerosis (MS), 25.4% with neuromyelitis optica spectrum disorder (NMOSD), 12.7% with multiphasic disseminated encephalomyelitis (MDEM), and 5.5% with relapsing optic neuritis (RON). Blinded analysis defined baseline MRI as typical of MS in 93.5% of children with MS. Acute disseminated encephalomyelitis presentation was seen only in the non-MS group. Of NMOSD cases, 30.7% were AQP4-Ab positive. MOG-Ab were found in 83.3% of AQP4-Ab-negative NMOSD, 100% of MDEM, and 33.3% of RON. Children with MOG-Ab were younger, were less likely to present with area postrema syndrome, and had lower disability, longer time to relapse, and more cerebellar peduncle lesions than children with AQP4-Ab NMOSD. A diagnostic algorithm applicable to any episode of CNS demyelination leads to 4 main phenotypes: MS, AQP4-Ab NMOSD, MOG-Ab-associated disease, and antibody-negative RDS. Children with MS and AQP4-Ab NMOSD showed features typical of adult cases. Because MOG-Ab-positive children showed notable and distinctive clinical and MRI features, they were grouped into a unified phenotype (MOG-Ab-associated disease), included in a new diagnostic algorithm. © 2017 American Academy of Neurology.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kakinuma, Ryutaru; Moriyama, Noriyuki
2009-02-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. Moreover, there is a shortage of doctors available to read these images in Japan. To overcome these problems, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using the helical CT scanner employed for lung cancer mass screening. Functions for examining suspicious shadows in detail are provided in a computer-aided diagnosis workstation that incorporates these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system and a biometric face authentication system. Biometric face authentication used at the telemedicine site enables file encryption and controlled login. As a result, patients' private information is protected. We can share the screen of the Web medical image conference system from two or more web conference terminals at the same time. Opinions can be exchanged using a camera and a microphone connected to the workstation. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system, built around the computer-aided diagnosis workstation, and our telemedicine network can increase diagnostic speed and accuracy while improving the security of medical information.
Hebephilia: quintessence of diagnostic pretextuality.
Franklin, Karen
2010-01-01
Hebephilia is an archaic term used to describe adult sexual attraction to adolescents. Prior to the advent of contemporary sexually violent predator laws, the term was not found in any dictionary or formal diagnostic system. Almost overnight, it has been put on the fast track toward recognition as a psychiatric condition meriting inclusion in the upcoming fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. This article traces the sudden emergence and popularity of hebephilia to pressure from the legal arena and, specifically, to the legal mandate of a serious mental abnormality for civil commitment of sex offenders. Hebephilia is proposed as a quintessential example of pretextuality, in which special interests promote a pseudoscientific construct that furthers an implicit, instrumental goal. Inherent problems with the construct's reliability and validity are discussed. A warning is issued about unintended consequences if hebephilia or its relative, pedohebephilia, makes its way into the DSM-5, due out in 2013. Copyright © 2010 John Wiley & Sons, Ltd.
Yadav, Ravi K; Begum, Viquar U; Addepalli, Uday K; Senthil, Sirisha; Garudadri, Chandra S; Rao, Harsha L
2016-02-01
To compare the abilities of retinal nerve fiber layer (RNFL) parameters of variable corneal compensation (VCC) and enhanced corneal compensation (ECC) algorithms of scanning laser polarimetry (GDx) in detecting various severities of glaucoma. Two hundred and eighty-five eyes of 194 subjects from the Longitudinal Glaucoma Evaluation Study who underwent GDx VCC and ECC imaging were evaluated. Abilities of RNFL parameters of GDx VCC and ECC to diagnose glaucoma were compared using area under receiver operating characteristic curves (AUC), sensitivities at fixed specificities, and likelihood ratios. After excluding 5 eyes that failed to satisfy manufacturer-recommended quality parameters with ECC and 68 with VCC, 56 eyes of 41 normal subjects and 161 eyes of 121 glaucoma patients [36 eyes with preperimetric glaucoma, 52 eyes with early (MD>-6 dB), 34 with moderate (MD between -6 and -12 dB), and 39 with severe glaucoma (MD<-12 dB)] were included for the analysis. Inferior RNFL, average RNFL, and nerve fiber indicator parameters showed the best AUCs and sensitivities both with GDx VCC and ECC in diagnosing all severities of glaucoma. AUCs and sensitivities of all RNFL parameters were comparable between the VCC and ECC algorithms (P>0.20 for all comparisons). Likelihood ratios associated with the diagnostic categorization of RNFL parameters were comparable between the VCC and ECC algorithms. In scans satisfying the manufacturer-recommended quality parameters, which were significantly greater with ECC than VCC algorithm, diagnostic abilities of GDx ECC and VCC in glaucoma were similar.
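The comparison above rests on area under the ROC curve and sensitivity at a fixed specificity; the snippet below is a generic sketch of those two computations for a single RNFL parameter, with invented data standing in for the GDx measurements.

```python
# AUC and sensitivity at a fixed specificity (e.g. 95%) for one RNFL
# parameter, with glaucoma coded 1 and normal coded 0. Lower RNFL values
# indicate disease, so the parameter is negated to act as a risk score.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

def auc_and_sensitivity(y, rnfl, specificity=0.95):
    score = -rnfl                                  # thinner RNFL -> higher risk
    auc = roc_auc_score(y, score)
    fpr, tpr, _ = roc_curve(y, score)
    sens_at_spec = tpr[fpr <= (1.0 - specificity)].max()
    return auc, sens_at_spec

# toy usage with group sizes mirroring the abstract (56 normal, 161 glaucoma)
rng = np.random.default_rng(0)
y = np.r_[np.zeros(56), np.ones(161)]
rnfl = np.r_[rng.normal(55, 5, 56), rng.normal(45, 7, 161)]
print(auc_and_sensitivity(y, rnfl))
```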
Huerga, Helena; Ferlazzo, Gabriella; Bevilacqua, Paolo; Kirubi, Beatrice; Ardizzoni, Elisa; Wanjala, Stephen; Sitienei, Joseph; Bonnet, Maryline
2017-01-01
Determine-TB LAM assay is a urine point-of-care test useful for TB diagnosis in HIV-positive patients. We assessed the incremental diagnostic yield of adding LAM to algorithms based on clinical signs, sputum smear-microscopy, chest X-ray and Xpert MTB/RIF in HIV-positive patients with symptoms of pulmonary TB (PTB). Prospective observational cohort of ambulatory (either severely ill or CD4 <200 cells/μl or with body mass index <17 kg/m²) and hospitalized symptomatic HIV-positive adults in Kenya. Incremental diagnostic yield of adding LAM was the difference in the proportion of confirmed TB patients (positive Xpert or MTB culture) diagnosed by the algorithm with LAM compared to the algorithm without LAM. The multivariable mortality model was adjusted for age, sex, clinical severity, BMI, CD4, ART initiation, LAM result and TB confirmation. Among 474 patients included, 44.1% were severely ill, 69.6% had CD4 <200 cells/μl, 59.9% had initiated ART, and 23.2% could not produce sputum. LAM, smear-microscopy, Xpert and culture in sputum were positive in 39.0% (185/474), 21.6% (76/352), 29.1% (102/350) and 39.7% (92/232) of the patients tested, respectively. Of 156 patients with confirmed TB, 65.4% were LAM positive. Of those classified as non-TB, 84.0% were LAM negative. Adding LAM increased the diagnostic yield of the algorithms by 36.6%, from 47.4% (95%CI:39.4-55.6) to 84.0% (95%CI:77.3-89.4%), when using clinical signs and X-ray; by 19.9%, from 62.2% (95%CI:54.1-69.8) to 82.1% (95%CI:75.1-87.7), when using clinical signs and microscopy; and by 13.4%, from 74.4% (95%CI:66.8-81.0) to 87.8% (95%CI:81.6-92.5), when using clinical signs and Xpert. LAM positive patients had an increased risk of 2-month mortality (aOR:2.7; 95%CI:1.5-4.9). LAM should be included in TB diagnostic algorithms in parallel to microscopy or Xpert request for HIV-positive patients either ambulatory (severely ill or CD4 <200 cells/μl) or hospitalized. LAM allows same day treatment initiation in patients at higher risk of death and in those not able to produce sputum.
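The incremental diagnostic yield is simply the difference between two proportions of confirmed TB patients detected. A small sketch, with counts reconstructed from the percentages quoted for the clinical signs + X-ray arm and Wilson intervals standing in for whatever CI method the authors used:

```python
# Incremental yield of adding LAM: difference in the proportion of confirmed
# TB patients detected with vs. without LAM, with Wilson 95% CIs per arm.
from statsmodels.stats.proportion import proportion_confint

n_confirmed = 156
detected_without_lam = 74      # ~47.4% of 156 (clinical signs + X-ray)
detected_with_lam = 131        # ~84.0% of 156 (same algorithm + LAM)

def yield_with_ci(detected, n):
    low, high = proportion_confint(detected, n, alpha=0.05, method="wilson")
    return detected / n, (low, high)

y0, ci0 = yield_with_ci(detected_without_lam, n_confirmed)
y1, ci1 = yield_with_ci(detected_with_lam, n_confirmed)
print(f"without LAM: {y0:.1%} {ci0}, with LAM: {y1:.1%} {ci1}, "
      f"incremental yield: {y1 - y0:.1%}")
```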
Prevalence and Phenotype of Childhood Apraxia of Speech In Youth with Galactosemia
Shriberg, Lawrence D.; Potter, Nancy L.; Strand, Edythe A.
2010-01-01
Purpose We address the hypothesis that the severe and persistent speech disorder reported in persons with galactosemia meets contemporary diagnostic criteria for Childhood Apraxia of Speech (CAS). A positive finding for CAS in this rare metabolic disorder has the potential to impact treatment of persons with galactosemia and inform explanatory perspectives on CAS in neurological, neurodevelopmental, and idiopathic contexts. Method Thirty-three youth with galactosemia and significant prior or persistent speech sound disorder were assessed in their homes in 17 states. Participants completed a protocol yielding information on their cognitive, structural, sensorimotor, language, speech, prosody, and voice status and function. Results Eight of the 33 participants (24%) met contemporary diagnostic criteria for CAS. Two participants, one of whom was among the 8 with CAS, met criteria for ataxic or hyperkinetic dysarthria. Group-wise findings for the remaining 24 participants are consistent with a classification category termed Motor Speech Disorder-Not Otherwise Specified (MSD-NOS; Shriberg, Fourakis, et al., in press-a). Conclusion We estimate the prevalence of CAS in galactosemia at 18 per hundred, 180 times the estimated risk for idiopathic CAS. Findings support the need to study risk factors for the high occurrence of motor speech disorders in galactosemia, despite early compliant dietary management. PMID:20966389
Feeding Disorders in Children with Developmental Disabilities.
ERIC Educational Resources Information Center
Schwarz, Steven M.
2003-01-01
This article describes an approach to evaluating and managing feeding disorders in children with developmental disabilities and examines effects of these management strategies on growth and clinical outcomes. A structured approach is stressed and a diagnostic and treatment algorithm is presented. Use with 79 children found that diagnostic-specific…
Novotny, Tomas; Bond, Raymond; Andrsova, Irena; Koc, Lumir; Sisakova, Martina; Finlay, Dewar; Guldenring, Daniel; Spinar, Jindrich; Malik, Marek
2017-05-01
Most contemporary 12-lead electrocardiogram (ECG) devices offer computerized diagnostic proposals. The reliability of these automated diagnoses is limited. It has been suggested that incorrect computer advice can influence physician decision-making. This study analyzed the role of diagnostic proposals in the decision process by a group of fellows of cardiology and other internal medicine subspecialties. A set of 100 clinical 12-lead ECG tracings was selected covering both normal cases and common abnormalities. A team of 15 junior Cardiology Fellows and 15 Non-Cardiology Fellows interpreted the ECGs in 3 phases: without any diagnostic proposal, with a single diagnostic proposal (half of them intentionally incorrect), and with four diagnostic proposals (only one of them being correct) for each ECG. Self-rated confidence of each interpretation was collected. Availability of diagnostic proposals significantly increased the diagnostic accuracy (p<0.001). Nevertheless, in the case of a single proposal (either correct or incorrect), accuracy increased for interpretations with correct diagnostic proposals, whereas it was substantially reduced with incorrect proposals. Confidence levels correlated poorly with interpretation scores (rho≈0.2, p<0.001). Logistic regression showed that an interpreter is most likely to be correct when the ECG offers a correct diagnostic proposal (OR=10.87) or multiple proposals (OR=4.43). Diagnostic proposals affect the diagnostic accuracy of ECG interpretations. The accuracy is significantly influenced especially when a single diagnostic proposal (either correct or incorrect) is provided. The study suggests that the presentation of multiple computerized diagnoses is likely to improve the diagnostic accuracy of interpreters. Copyright © 2017 Elsevier B.V. All rights reserved.
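The odds ratios quoted above come from a logistic regression of interpretation correctness on the type of proposal shown. A hedged sketch with simulated data (the data frame and probabilities below are invented, not the study data set):

```python
# Logistic regression of interpretation correctness on the proposal type
# (none, correct single, incorrect single, multiple); odds ratios are
# exp(coefficients) relative to the no-proposal condition.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
proposal = rng.choice(["none", "correct", "incorrect", "multiple"], size=600)
p_correct = {"none": 0.45, "correct": 0.85, "incorrect": 0.30, "multiple": 0.70}
correct = rng.binomial(1, [p_correct[p] for p in proposal])
df = pd.DataFrame({"proposal": proposal, "correct": correct})

model = smf.logit("correct ~ C(proposal, Treatment(reference='none'))", data=df).fit()
odds_ratios = np.exp(model.params)     # OR relative to 'no proposal'
print(odds_ratios)
```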
Schiffman, Eric; Ohrbach, Richard; Truelove, Edmond; Look, John; Anderson, Gary; Goulet, Jean-Paul; List, Thomas; Svensson, Peter; Gonzalez, Yoly; Lobbezoo, Frank; Michelotti, Ambra; Brooks, Sharon L.; Ceusters, Werner; Drangsholt, Mark; Ettlin, Dominik; Gaul, Charly; Goldberg, Louis J.; Haythornthwaite, Jennifer A.; Hollender, Lars; Jensen, Rigmor; John, Mike T.; De Laat, Antoon; de Leeuw, Reny; Maixner, William; van der Meulen, Marylee; Murray, Greg M.; Nixdorf, Donald R.; Palla, Sandro; Petersson, Arne; Pionchon, Paul; Smith, Barry; Visscher, Corine M.; Zakrzewska, Joanna; Dworkin, Samuel F.
2015-01-01
Aims The original Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Axis I diagnostic algorithms have been demonstrated to be reliable. However, the Validation Project determined that the RDC/TMD Axis I validity was below the target sensitivity of ≥ 0.70 and specificity of ≥ 0.95. Consequently, these empirical results supported the development of revised RDC/TMD Axis I diagnostic algorithms that were subsequently demonstrated to be valid for the most common pain-related TMD and for one temporomandibular joint (TMJ) intra-articular disorder. The original RDC/TMD Axis II instruments were shown to be both reliable and valid. Working from these findings and revisions, two international consensus workshops were convened, from which recommendations were obtained for the finalization of new Axis I diagnostic algorithms and new Axis II instruments. Methods Through a series of workshops and symposia, a panel of clinical and basic science pain experts modified the revised RDC/TMD Axis I algorithms by using comprehensive searches of published TMD diagnostic literature followed by review and consensus via a formal structured process. The panel's recommendations for further revision of the Axis I diagnostic algorithms were assessed for validity by using the Validation Project's data set, and for reliability by using newly collected data from the ongoing TMJ Impact Project—the follow-up study to the Validation Project. New Axis II instruments were identified through a comprehensive search of the literature providing valid instruments that, relative to the RDC/TMD, are shorter in length, are available in the public domain, and currently are being used in medical settings. Results The newly recommended Diagnostic Criteria for TMD (DC/TMD) Axis I protocol includes both a valid screener for detecting any pain-related TMD as well as valid diagnostic criteria for differentiating the most common pain-related TMD (sensitivity ≥ 0.86, specificity ≥ 0.98) and for one intra-articular disorder (sensitivity of 0.80 and specificity of 0.97). Diagnostic criteria for other common intra-articular disorders lack adequate validity for clinical diagnoses but can be used for screening purposes. Inter-examiner reliability for the clinical assessment associated with the validated DC/TMD criteria for pain-related TMD is excellent (kappa ≥ 0.85). Finally, a comprehensive classification system that includes both the common and less common TMD is also presented. The Axis II protocol retains selected original RDC/TMD screening instruments augmented with new instruments to assess jaw function as well as behavioral and additional psychosocial factors. The Axis II protocol is divided into screening and comprehensive self-report instrument sets. The screening instruments’ 41 questions assess pain intensity, pain-related disability, psychological distress, jaw functional limitations, and parafunctional behaviors, and a pain drawing is used to assess locations of pain. The comprehensive instruments, composed of 81 questions, assess in further detail jaw functional limitations and psychological distress as well as additional constructs of anxiety and presence of comorbid pain conditions. Conclusion The recommended evidence-based new DC/TMD protocol is appropriate for use in both clinical and research settings. More comprehensive instruments augment short and simple screening instruments for Axis I and Axis II. 
These validated instruments allow for identification of patients with a range of simple to complex TMD presentations. PMID:24482784
Spreco, A; Eriksson, O; Dahlström, Ö; Timpka, T
2017-07-01
Methods for the detection of influenza epidemics and prediction of their progress have seldom been comparatively evaluated using prospective designs. This study aimed to perform a prospective comparative trial of algorithms for the detection and prediction of increased local influenza activity. Data on clinical influenza diagnoses recorded by physicians and syndromic data from a telenursing service were used. Five detection and three prediction algorithms previously evaluated in public health settings were calibrated and then evaluated over 3 years. When applied to diagnostic data, only detection using the Serfling regression method and prediction using the non-adaptive log-linear regression method showed acceptable performance during winter influenza seasons. For the syndromic data, none of the detection algorithms displayed a satisfactory performance, while non-adaptive log-linear regression was the best performing prediction method. We conclude that the available algorithms for influenza detection and prediction displayed satisfactory performance when applied to local diagnostic data during winter influenza seasons. When applied to local syndromic data, the evaluated algorithms did not display consistent performance. Further evaluation and research on combining these types of methods in public health information infrastructures for 'nowcasting' (integrated detection and prediction) of influenza activity are warranted.
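A sketch of a Serfling-style detection baseline of the kind referred to above: a linear trend plus annual harmonic terms is fitted to historical weekly counts, and weeks exceeding the upper prediction bound are flagged. The exact model specification and epidemic-period exclusion used in the study may differ.

```python
# Serfling-style baseline: trend + annual harmonics fitted by OLS; weeks above
# the upper 95% prediction bound are flagged as increased influenza activity.
import numpy as np
import statsmodels.api as sm

def serfling_flags(weekly_counts, period=52.18):
    t = np.arange(weekly_counts.size)
    X = sm.add_constant(np.column_stack([
        t,
        np.sin(2 * np.pi * t / period),
        np.cos(2 * np.pi * t / period),
    ]))
    fit = sm.OLS(weekly_counts, X).fit()
    upper = fit.get_prediction(X).summary_frame(alpha=0.05)["obs_ci_upper"]
    return weekly_counts > upper.to_numpy()       # True where activity is "increased"

# toy usage: three seasons of synthetic counts with one strong winter epidemic
t = np.arange(3 * 52)
counts = 20 + 10 * np.cos(2 * np.pi * t / 52.18) + np.random.default_rng(2).poisson(3, t.size)
counts[100:108] += 40
print(np.where(serfling_flags(counts))[0])
```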
Data decomposition of Monte Carlo particle transport simulations via tally servers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit
An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.
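A toy illustration of the tracker/tally-server split described above, using Python multiprocessing purely as a stand-in for the message passing on HPC nodes; it is not the OpenMC implementation.

```python
# Tracker processes push (tally_cell, score) updates onto a queue while a
# single server process owns the tally memory and accumulates the scores.
import multiprocessing as mp
import random

def tracker(queue, n_particles, n_cells):
    for _ in range(n_particles):
        cell = random.randrange(n_cells)          # pretend the particle scored here
        queue.put((cell, random.random()))        # send tally contribution
    queue.put(None)                               # signal this tracker is done

def tally_server(queue, n_cells, n_trackers):
    tallies = [0.0] * n_cells
    finished = 0
    while finished < n_trackers:
        msg = queue.get()
        if msg is None:
            finished += 1
        else:
            cell, score = msg
            tallies[cell] += score                # server owns all tally memory
    print("accumulated tallies:", [round(t, 2) for t in tallies])

if __name__ == "__main__":
    q = mp.Queue()
    trackers = [mp.Process(target=tracker, args=(q, 1000, 4)) for _ in range(3)]
    server = mp.Process(target=tally_server, args=(q, 4, 3))
    for p in trackers + [server]:
        p.start()
    for p in trackers + [server]:
        p.join()
```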
Comparison of Traditional and Reverse Syphilis Screening Algorithms in Medical Health Checkups.
Nah, Eun Hee; Cho, Seon; Kim, Suyoung; Cho, Han Ik; Chai, Jong Yil
2017-11-01
The syphilis diagnostic algorithms applied in different countries vary significantly depending on the local syphilis epidemiology and other considerations, including the expected workload, the need for automation in the laboratory and budget factors. This study was performed to investigate the efficacy of traditional and reverse syphilis diagnostic algorithms during general health checkups. In total, 1,000 blood specimens were obtained from 908 men and 92 women during their regular health checkups. Traditional screening and reverse screening were applied to the same specimens using automated rapid plasma reagin (RPR) and Treponema pallidum latex agglutination (TPLA) tests, respectively. Specimens that were reactive on the reverse algorithm (TPLA) were subjected to a second treponemal test performed by using the chemiluminescent microparticle immunoassay (CMIA). Of the 1,000 specimens tested, 68 (6.8%) were reactive by reverse screening (TPLA) compared with 11 (1.1%) by traditional screening (RPR). The traditional algorithm failed to detect 48 specimens [TPLA(+)/RPR(-)/CMIA(+)]. The median TPLA cutoff index (COI) was higher in CMIA-reactive cases than in CMIA-nonreactive cases (90.5 vs 12.5 U). The reverse screening algorithm could detect subjects with possible latent syphilis who were not detected by the traditional algorithm. Those individuals could be provided with opportunities for evaluating syphilis during their health checkups. The COI values of the initial TPLA test may be helpful in excluding false-positive TPLA test results in the reverse algorithm. © The Korean Society for Laboratory Medicine
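Simplified decision logic for the two screening work-flows being compared (traditional: non-treponemal RPR first; reverse: treponemal TPLA first, with CMIA confirmation). Handling of discordant results is deliberately reduced to coarse labels and does not reproduce the laboratory's full reporting rules.

```python
# Coarse decision logic for traditional vs. reverse syphilis screening.
def traditional_screen(rpr_reactive, tpla_reactive):
    if not rpr_reactive:
        return "negative"                      # RPR(-): no further testing
    return "syphilis (current/past)" if tpla_reactive else "biological false positive"

def reverse_screen(tpla_reactive, cmia_reactive, rpr_reactive):
    if not tpla_reactive:
        return "negative"                      # TPLA(-): no further testing
    if not cmia_reactive:
        return "possible false-positive TPLA"  # second treponemal test non-reactive
    return "active/recent syphilis" if rpr_reactive else "possible latent or treated syphilis"

# the 48 specimens missed by the traditional algorithm are TPLA(+)/RPR(-)/CMIA(+):
print(traditional_screen(rpr_reactive=False, tpla_reactive=True))   # -> negative
print(reverse_screen(tpla_reactive=True, cmia_reactive=True, rpr_reactive=False))
```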
PCA-based artifact removal algorithm for stroke detection using UWB radar imaging.
Ricci, Elisa; di Domenico, Simone; Cianca, Ernestina; Rossi, Tommaso; Diomedi, Marina
2017-06-01
Stroke patients should be dispatched at the highest level of care available in the shortest time. In this context, a transportable system in specialized ambulances, able to evaluate the presence of an acute brain lesion in a short time interval (i.e., few minutes), could shorten delay of treatment. UWB radar imaging is an emerging diagnostic branch that has great potential for the implementation of a transportable and low-cost device. Transportability, low cost and short response time pose challenges to the signal processing algorithms of the backscattered signals as they should guarantee good performance with a reasonably low number of antennas and low computational complexity, tightly related to the response time of the device. The paper shows that a PCA-based preprocessing algorithm can: (1) achieve good performance already with a computationally simple beamforming algorithm; (2) outperform state-of-the-art preprocessing algorithms; (3) enable a further improvement in the performance (and/or decrease in the number of antennas) by using a multistatic approach with just a modest increase in computational complexity. This is an important result toward the implementation of such a diagnostic device that could play an important role in emergency scenario.
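A hedged sketch of the PCA-based preprocessing idea: the artifact common to all channels (antenna coupling, skin reflection) dominates the first principal components, so subtracting them leaves the weaker target backscatter for the subsequent beamforming step. The number of removed components and the toy signals are illustrative.

```python
# PCA-based artifact removal for multistatic UWB channels: remove the dominant
# principal components shared across channels, keep the residual for
# delay-and-sum (or more advanced) beamforming.
import numpy as np

def pca_artifact_removal(signals, n_components=1):
    """signals: (n_channels, n_samples) matrix of received time-domain signals."""
    centered = signals - signals.mean(axis=0, keepdims=True)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    artifact = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    return centered - artifact            # residual passed to the beamformer

# toy usage: 8 channels sharing a strong common artifact plus weak noise
rng = np.random.default_rng(3)
common = np.sin(np.linspace(0, 40, 512))
signals = np.tile(common, (8, 1)) + 0.05 * rng.standard_normal((8, 512))
print(np.abs(pca_artifact_removal(signals)).max())   # residual is much smaller
```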
Mental Health Risk Adjustment with Clinical Categories and Machine Learning.
Shrestha, Akritee; Bergquist, Savannah; Montz, Ellen; Rose, Sherri
2017-12-15
To propose nonparametric ensemble machine learning for mental health and substance use disorders (MHSUD) spending risk adjustment formulas, including considering Clinical Classification Software (CCS) categories as diagnostic covariates over the commonly used Hierarchical Condition Category (HCC) system. 2012-2013 Truven MarketScan database. We implement 21 algorithms to predict MHSUD spending, as well as a weighted combination of these algorithms called super learning. The algorithm collection included seven unique algorithms that were supplied with three differing sets of MHSUD-related predictors alongside demographic covariates: HCC, CCS, and HCC + CCS diagnostic variables. Performance was evaluated based on cross-validated R² and predictive ratios. Results show that super learning had the best performance based on both metrics. The top single algorithm was random forests, which improved on ordinary least squares regression by 10 percent with respect to relative efficiency. CCS categories-based formulas were generally more predictive of MHSUD spending compared to HCC-based formulas. Literature supports the potential benefit of implementing a separate MHSUD spending risk adjustment formula. Our results suggest there is an incentive to explore machine learning for MHSUD-specific risk adjustment, as well as considering CCS categories over HCCs. © Health Research and Educational Trust.
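A sketch of the ensemble ("super learner"-style) approach using scikit-learn's StackingRegressor as a stand-in: the study combined 21 algorithms over HCC/CCS covariates, whereas this toy example stacks three base learners on synthetic data and scores them by cross-validated R².

```python
# Stacked ensemble spending model as a stand-in for super learning: base
# learners are combined by a meta-learner and judged by cross-validated R^2.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=2000, n_features=40, noise=20.0, random_state=0)

ensemble = StackingRegressor(
    estimators=[
        ("ols", LinearRegression()),
        ("ridge", Ridge(alpha=1.0)),
        ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
    ],
    final_estimator=Ridge(),    # meta-learner weights the base predictions
    cv=5,
)
print("cross-validated R^2:", cross_val_score(ensemble, X, y, cv=5, scoring="r2").mean())
```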
MacRae, J; Darlow, B; McBain, L; Jones, O; Stubbe, M; Turner, N; Dowell, A
2015-08-21
To develop a natural language processing software inference algorithm to classify the content of primary care consultations using electronic health record Big Data and subsequently test the algorithm's ability to estimate the prevalence and burden of childhood respiratory illness in primary care. Algorithm development and validation study. To classify consultations, the algorithm is designed to interrogate clinical narrative entered as free text, diagnostic (Read) codes created and medications prescribed on the day of the consultation. Thirty-six consenting primary care practices from a mixed urban and semirural region of New Zealand. Three independent sets of 1200 child consultation records were randomly extracted from a data set of all general practitioner consultations in participating practices between 1 January 2008-31 December 2013 for children under 18 years of age (n=754,242). Each consultation record within these sets was independently classified by two expert clinicians as respiratory or non-respiratory, and subclassified according to respiratory diagnostic categories to create three 'gold standard' sets of classified records. These three gold standard record sets were used to train, test and validate the algorithm. Sensitivity, specificity, positive predictive value and F-measure were calculated to illustrate the algorithm's ability to replicate judgements of expert clinicians within the 1200 record gold standard validation set. The algorithm was able to identify respiratory consultations in the 1200 record validation set with a sensitivity of 0.72 (95% CI 0.67 to 0.78) and a specificity of 0.95 (95% CI 0.93 to 0.98). The positive predictive value of algorithm respiratory classification was 0.93 (95% CI 0.89 to 0.97). The positive predictive value of the algorithm classifying consultations as being related to specific respiratory diagnostic categories ranged from 0.68 (95% CI 0.40 to 1.00; other respiratory conditions) to 0.91 (95% CI 0.79 to 1.00; throat infections). A software inference algorithm that uses primary care Big Data can accurately classify the content of clinical consultations. This algorithm will enable accurate estimation of the prevalence of childhood respiratory illness in primary care and resultant service utilisation. The methodology can also be applied to other areas of clinical care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
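An illustrative sketch of how such an inference algorithm can combine free-text keywords, Read-code prefixes and prescriptions into a respiratory/non-respiratory decision. The keyword list, code prefix and drug names below are placeholders, not the study's actual rule set.

```python
# Consultation classifier combining free text, Read codes and medications.
# Keywords, code prefixes and drug names are illustrative placeholders.
RESP_KEYWORDS = {"cough", "wheeze", "croup", "bronchiolitis", "asthma", "pneumonia"}
RESP_READ_PREFIXES = ("H",)            # hypothetical prefix for respiratory chapters
RESP_DRUGS = {"salbutamol", "amoxicillin", "prednisolone"}

def classify_consultation(free_text, read_codes, medications):
    text_hit = any(k in free_text.lower() for k in RESP_KEYWORDS)
    code_hit = any(code.startswith(RESP_READ_PREFIXES) for code in read_codes)
    drug_hit = any(m.lower() in RESP_DRUGS for m in medications)
    return "respiratory" if (text_hit or code_hit or drug_hit) else "non-respiratory"

print(classify_consultation("2yo with barking cough overnight", ["H04.."], ["salbutamol"]))
```

Sensitivity, specificity and positive predictive value of such a classifier are then estimated against the clinician-classified gold standard records, as reported in the abstract.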
Neurologic Aspects of Infections in International Travelers
Han, May H.; Zunt, Joseph R.
2009-01-01
Background As international travel for business and pleasure becomes part of the contemporary lifestyle, the clinician today is confronted with an increasing number of travelers returning ill with unfamiliar syndromes. The physician will encounter a myriad of patients with exotic infections, emerging infectious diseases, or resurgent Old-World infections. Review Summary This review article will discuss salient points of important infectious diseases associated with overseas travel, provide a syndromic approach to the traveler who returns with neurologic manifestations, and list resources for additional diagnostic, therapeutic, and preventive information. Conclusions As many of the infections acquired in other countries can directly or indirectly affect the nervous system, the care of the ill traveler often falls into the hands of neurologists. The contemporary neurologist should therefore be knowledgeable about the clinical manifestations, potential complications, and appropriate management of region-specific infections. PMID:15631642
Noureldine, Salem I; Najafian, Alireza; Aragon Han, Patricia; Olson, Matthew T; Genther, Dane J; Schneider, Eric B; Prescott, Jason D; Agrawal, Nishant; Mathur, Aarti; Zeiger, Martha A; Tufano, Ralph P
2016-07-01
Diagnostic molecular testing is used in the workup of thyroid nodules. While these tests appear to be promising in more definitively assigning a risk of malignancy, their effect on surgical decision making has yet to be demonstrated. To investigate the effect of diagnostic molecular profiling of thyroid nodules on the surgical decision-making process. A surgical management algorithm was developed and published after peer review that incorporated individual Bethesda System for Reporting Thyroid Cytopathology classifications with clinical, laboratory, and radiological results. This algorithm was created to formalize the decision-making process selected herein in managing patients with thyroid nodules. Between April 1, 2014, and March 31, 2015, a prospective study of patients who had undergone diagnostic molecular testing of a thyroid nodule before being seen for surgical consultation was performed. The recommended management undertaken by the surgeon was then prospectively compared with the corresponding one in the algorithm. Patients with thyroid nodules who did not undergo molecular testing and were seen for surgical consultation during the same period served as a control group. All pertinent treatment options were presented to each patient, and any deviation from the algorithm was recorded prospectively. To evaluate the appropriateness of any change (deviation) in management, the surgical histopathology diagnosis was correlated with the surgery performed. The study cohort comprised 140 patients who underwent molecular testing. Their mean (SD) age was 50.3 (14.6) years, and 75.0% (105 of 140) were female. Over a 1-year period, 20.3% (140 of 688) had undergone diagnostic molecular testing before surgical consultation, and 79.7% (548 of 688) had not undergone molecular testing. The surgical management deviated from the treatment algorithm in 12.9% (18 of 140) with molecular testing and in 10.2% (56 of 548) without molecular testing (P = .37). In the group with molecular testing, the surgical management plan of only 7.9% (11 of 140) was altered as a result of the molecular test. All but 1 of those patients were found to be overtreated relative to the surgical histopathology analysis. Molecular testing did not significantly affect the surgical decision-making process in this study. Among patients whose treatment was altered based on these markers, there was evidence of overtreatment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mlynar, J.; Weinzettl, V.; Imrisek, M.
2012-10-15
The contribution focuses on plasma tomography via the minimum Fisher regularisation (MFR) algorithm applied on data from the recently commissioned tomographic diagnostics on the COMPASS tokamak. The MFR expertise is based on previous applications at Joint European Torus (JET), as exemplified in a new case study of the plasma position analyses based on JET soft x-ray (SXR) tomographic reconstruction. Subsequent application of the MFR algorithm on COMPASS data from cameras with absolute extreme ultraviolet (AXUV) photodiodes disclosed a peaked radiating region near the limiter. Moreover, its time evolution indicates transient plasma edge cooling following a radial plasma shift. In the SXR data, MFR demonstrated that a high resolution plasma positioning independent of the magnetic diagnostics would be possible provided that a proper calibration of the cameras on an x-ray source is undertaken.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raskovskaya, I L
2015-08-31
A beam model with a discrete change in the cross-sectional intensity is proposed to describe refraction of laser beams formed on the basis of diffractive optical elements. In calculating the wave field of the beams of this class under conditions of strong refraction, in contrast to the traditional asymptotics of geometric optics which assumes a transition to the infinite limits of integration and obtaining an analytical solution, it is proposed to calculate the integral in the vicinity of stationary points. This approach allows the development of a fast algorithm for correct calculation of the wave field of the laser beams that are employed in probing and diagnostics of extended optically inhomogeneous media. Examples of the algorithm application for diagnostics of extended nonstationary objects in liquid are presented.
Algorithms for the diagnosis and treatment of restless legs syndrome in primary care
2011-01-01
Background Restless legs syndrome (RLS) is a neurological disorder with a lifetime prevalence of 3-10% in European studies. However, the diagnosis of RLS in primary care remains low and mistreatment is common. Methods The current article reports on the considerations of RLS diagnosis and management that were made during a European Restless Legs Syndrome Study Group (EURLSSG)-sponsored task force consisting of experts and primary care practitioners. The task force sought to develop a better understanding of barriers to diagnosis in primary care practice and overcome these barriers with diagnostic and treatment algorithms. Results The barriers to diagnosis identified by the task force include the presentation of symptoms, the language used to describe them, the actual term "restless legs syndrome" and difficulties in the differential diagnosis of RLS. Conclusion The EURLSSG task force reached a consensus and agreed on the diagnostic and treatment algorithms published here. PMID:21352569
Case-Deletion Diagnostics for Nonlinear Structural Equation Models
ERIC Educational Resources Information Center
Lee, Sik-Yum; Lu, Bin
2003-01-01
In this article, a case-deletion procedure is proposed to detect influential observations in a nonlinear structural equation model. The key idea is to develop the diagnostic measures based on the conditional expectation of the complete-data log-likelihood function in the EM algorithm. An one-step pseudo approximation is proposed to reduce the…
Sequential Test Strategies for Multiple Fault Isolation
NASA Technical Reports Server (NTRS)
Shakeri, M.; Pattipati, Krishna R.; Raghavan, V.; Patterson-Hine, Ann; Kell, T.
1997-01-01
In this paper, we consider the problem of constructing near optimal test sequencing algorithms for diagnosing multiple faults in redundant (fault-tolerant) systems. The computational complexity of solving the optimal multiple-fault isolation problem is super-exponential, that is, it is much more difficult than the single-fault isolation problem, which, by itself, is NP-hard. By employing concepts from information theory and Lagrangian relaxation, we present several static and dynamic (on-line or interactive) test sequencing algorithms for the multiple fault isolation problem that provide a trade-off between the degree of suboptimality and computational complexity. Furthermore, we present novel diagnostic strategies that generate a static diagnostic directed graph (digraph), instead of a static diagnostic tree, for multiple fault diagnosis. Using this approach, the storage complexity of the overall diagnostic strategy reduces substantially. Computational results based on real-world systems indicate that the size of a static multiple fault strategy is strictly related to the structure of the system, and that the use of an on-line multiple fault strategy can diagnose faults in systems with as many as 10,000 failure sources.
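A sketch of the information-theoretic core of test sequencing: at each step pick the unused test with the largest expected entropy reduction (mutual information) about the fault state. This illustrates the greedy heuristic only; the paper's Lagrangian relaxation and digraph construction are not reproduced.

```python
# Greedy information-gain test selection over a discrete set of fault states.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def next_best_test(prior, D, unused):
    """prior: P(fault state); D[t, s] = 1 if test t fails under state s."""
    best_t, best_gain = unused[0], -1.0
    for t in unused:
        p_fail = (prior * D[t]).sum()
        gain = entropy(prior)
        for outcome, p_o in ((1, p_fail), (0, 1 - p_fail)):
            if p_o > 0:
                post = prior * np.where(D[t] == outcome, 1.0, 0.0)
                gain -= p_o * entropy(post / post.sum())
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t

# toy usage: 4 candidate fault states, 3 binary tests
prior = np.array([0.4, 0.3, 0.2, 0.1])
D = np.array([[1, 1, 0, 0], [1, 0, 1, 0], [0, 1, 1, 1]])
print(next_best_test(prior, D, unused=[0, 1, 2]))
```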
When the bell tolls on Bell's palsy: finding occult malignancy in acute-onset facial paralysis.
Quesnel, Alicia M; Lindsay, Robin W; Hadlock, Tessa A
2010-01-01
This study reports 4 cases of occult parotid malignancy presenting with sudden-onset facial paralysis to demonstrate that failure to regain tone 6 months after onset distinguishes these patients from Bell's palsy patients with delayed recovery and to propose a diagnostic algorithm for this subset of patients. A case series of 4 patients with occult parotid malignancies presenting with acute-onset unilateral facial paralysis is reported. Initial imaging on all 4 patients did not demonstrate a parotid mass. Diagnostic delays ranged from 7 to 36 months from time of onset of facial paralysis to time of diagnosis of parotid malignancy. Additional physical examination findings, especially failure to regain tone, as well as properly protocolled radiologic studies reviewed with dedicated head and neck radiologists, were helpful in arriving at the diagnosis. An algorithm to minimize diagnostic delays in this subset of acute facial paralysis patients is presented. Careful attention to facial tone, in addition to movement, is important in the diagnostic evaluation of acute-onset facial paralysis. Copyright 2010 Elsevier Inc. All rights reserved.
A study of the Space Station Freedom response to the disturbance environment
NASA Technical Reports Server (NTRS)
Suleman, Afzal; Modi, V. J.; Venkayya, V. B.
1994-01-01
A relatively general formulation for studying the dynamics and control of an arbitrary spacecraft with interconnected flexible bodies has been developed. This self-contained and comprehensive numerical algorithm using system modes is applicable to a large class of spacecraft configurations of contemporary and future interest. Here, versatility of the approach is demonstrated through the dynamics and control studies aimed at the evolving Space Station Freedom.
Emmert-Streib, Frank; Glazko, Galina V.; Altay, Gökmen; de Matos Simoes, Ricardo
2012-01-01
In this paper, we present a systematic and conceptual overview of methods for inferring gene regulatory networks from observational gene expression data. Further, we discuss two classic approaches to infer causal structures and compare them with contemporary methods by providing a conceptual categorization thereof. We complement the above by surveying global and local evaluation measures for assessing the performance of inference algorithms. PMID:22408642
Karayiannis, N B
2000-01-01
This paper presents the development and investigates the properties of ordered weighted learning vector quantization (LVQ) and clustering algorithms. These algorithms are developed by using gradient descent to minimize reformulation functions based on aggregation operators. An axiomatic approach provides conditions for selecting aggregation operators that lead to admissible reformulation functions. Minimization of admissible reformulation functions based on ordered weighted aggregation operators produces a family of soft LVQ and clustering algorithms, which includes fuzzy LVQ and clustering algorithms as special cases. The proposed LVQ and clustering algorithms are used to perform segmentation of magnetic resonance (MR) images of the brain. The diagnostic value of the segmented MR images provides the basis for evaluating a variety of ordered weighted LVQ and clustering algorithms.
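A hedged sketch of a soft (fuzzy-membership) LVQ update, in which every prototype is attracted to each sample in proportion to a distance-based membership; the paper's ordered weighted aggregation operators are not reproduced here.

```python
# Soft LVQ/clustering: prototypes are updated with fuzzy memberships derived
# from squared distances (soft competition instead of winner-take-all).
import numpy as np

def soft_lvq(X, n_prototypes, lr=0.05, m=2.0, epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), n_prototypes, replace=False)].copy()
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            d2 = ((V - x) ** 2).sum(axis=1) + 1e-12
            u = (1.0 / d2) ** (1.0 / (m - 1.0))
            u /= u.sum()                       # fuzzy memberships (sum to 1)
            V += lr * u[:, None] * (x - V)     # soft competitive update
    return V

# toy usage on two Gaussian blobs (stand-in for MR image feature vectors)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (200, 2)), rng.normal(3, 0.3, (200, 2))])
print(soft_lvq(X, n_prototypes=2))
```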
Goebel, Georg; Seppi, Klaus; Donnemiller, Eveline; Warwitz, Boris; Wenning, Gregor K; Virgolini, Irene; Poewe, Werner; Scherfler, Christoph
2011-04-01
The purpose of this study was to develop an observer-independent algorithm for the correct classification of dopamine transporter SPECT images as Parkinson's disease (PD), multiple system atrophy parkinson variant (MSA-P), progressive supranuclear palsy (PSP) or normal. A total of 60 subjects with clinically probable PD (n = 15), MSA-P (n = 15) and PSP (n = 15), and 15 age-matched healthy volunteers, were studied with the dopamine transporter ligand [(123)I]β-CIT. Parametric images of the specific-to-nondisplaceable equilibrium partition coefficient (BP(ND)) were generated. Following a voxel-wise ANOVA, cut-off values were calculated from the voxel values of the resulting six post-hoc t-test maps. The percentages of the volume of an individual BP(ND) image remaining below and above the cut-off values were determined. The higher percentage of image volume from all six cut-off matrices was used to classify an individual's image. For validation, the algorithm was compared to a conventional region of interest analysis. The predictive diagnostic accuracy of the algorithm in the correct assignment of a [(123)I]β-CIT SPECT image was 83.3% and increased to 93.3% on merging the MSA-P and PSP groups. In contrast the multinomial logistic regression of mean region of interest values of the caudate, putamen and midbrain revealed a diagnostic accuracy of 71.7%. In contrast to a rater-driven approach, this novel method was superior in classifying [(123)I]β-CIT-SPECT images as one of four diagnostic entities. In combination with the investigator-driven visual assessment of SPECT images, this clinical decision support tool would help to improve the diagnostic yield of [(123)I]β-CIT SPECT in patients presenting with parkinsonism at their initial visit.
Modification of YAPE keypoint detection algorithm for wide local contrast range images
NASA Astrophysics Data System (ADS)
Lukoyanov, A.; Nikolaev, D.; Konovalenko, I.
2018-04-01
Keypoint detection is an important tool of image analysis, and among many contemporary keypoint detection algorithms YAPE is known for its computational performance, allowing its use in mobile and embedded systems. One of its shortcomings is high sensitivity to local contrast, which leads to high detection density in high-contrast areas while missing detections in low-contrast ones. In this work we study the contrast sensitivity of YAPE and propose a modification which compensates for this property on images with a wide local contrast range (Yet Another Contrast-Invariant Point Extractor, YACIPE). As a model example, we considered the traffic sign recognition problem, where some signs are well lighted, whereas others are in shadows and thus have low contrast. We show that the number of traffic signs for which no keypoints are detected is 40% lower for the proposed modification than for the original algorithm.
ERIC Educational Resources Information Center
Castillo, Antonio S.; Berenguer, Isabel A.; Sánchez, Alexander G.; Álvarez, Tomás R. R.
2017-01-01
This paper analyzes the results of a diagnostic study carried out with second year students of the computational sciences majors at University of Oriente, Cuba, to determine the limitations that they present in computational algorithmization. An exploratory research was developed using quantitative and qualitative methods. The results allowed…
Diagnostic Accuracy of the Veteran Affairs' Traumatic Brain Injury Screen.
Louise Bender Pape, Theresa; Smith, Bridget; Babcock-Parziale, Judith; Evans, Charlesnika T; Herrold, Amy A; Phipps Maieritsch, Kelly; High, Walter M
2018-01-31
To comprehensively estimate the diagnostic accuracy and reliability of the Department of Veterans Affairs (VA) Traumatic Brain Injury (TBI) Clinical Reminder Screen (TCRS). Cross-sectional, prospective, observational study using the Standards for Reporting of Diagnostic Accuracy criteria. Three VA Polytrauma Network Sites. Operation Iraqi Freedom, Operation Enduring Freedom veterans (N=433). TCRS, Comprehensive TBI Evaluation, Structured TBI Diagnostic Interview, Symptom Attribution and Classification Algorithm, and Clinician-Administered Posttraumatic Stress Disorder (PTSD) Scale. Forty-five percent of veterans screened positive on the TCRS for TBI. For detecting occurrence of historical TBI, the TCRS had a sensitivity of .56 to .74, a specificity of .63 to .93, a positive predictive value (PPV) of 25% to 45%, a negative predictive value (NPV) of 91% to 94%, and a diagnostic odds ratio (DOR) of 4 to 13. For accuracy of attributing active symptoms to the TBI, the TCRS had a sensitivity of .64 to .87, a specificity of .59 to .89, a PPV of 26% to 32%, an NPV of 92% to 95%, and a DOR of 6 to 9. The sensitivity was higher for veterans with PTSD (.80-.86) relative to veterans without PTSD (.57-.82). The specificity, however, was higher among veterans without PTSD (.75-.81) relative to veterans with PTSD (.36-.49). All indices of diagnostic accuracy changed when participants with questionably valid (QV) test profiles were eliminated from analyses. The utility of the TCRS to screen for mild TBI (mTBI) depends on the stringency of the diagnostic reference standard to which it is being compared, the presence/absence of PTSD, and QV test profiles. Further development, validation, and use of reproducible diagnostic algorithms for symptom attribution after possible mTBI would improve diagnostic accuracy. Published by Elsevier Inc.
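The reported indices follow directly from a 2x2 table of screen result versus reference standard; a small sketch with toy counts (not the study's data):

```python
# Sensitivity, specificity, PPV, NPV and diagnostic odds ratio (DOR) from a
# 2x2 table of screen result (TCRS) versus reference standard. Toy counts.
def diagnostic_indices(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "DOR": (sens / (1 - sens)) / ((1 - spec) / spec),  # = (tp*tn)/(fp*fn)
    }

print(diagnostic_indices(tp=70, fp=125, fn=30, tn=208))
```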
Fusar-Poli, P.; Cappucciati, M.; Rutigliano, G.; Lee, T. Y.; Beverly, Q.; Bonoldi, I.; Lelli, J.; Kaar, S. J.; Gago, E.; Rocchetti, M.; Patel, R.; Bhavsar, V.; Tognin, S.; Badger, S.; Calem, M.; Lim, K.; Kwon, J. S.; Perez, J.; McGuire, P.
2016-01-01
Background. Several psychometric instruments are available for the diagnostic interview of subjects at ultra high risk (UHR) of psychosis. Their diagnostic comparability is unknown. Methods. All referrals to the OASIS (London) or CAMEO (Cambridgeshire) UHR services from May 2013 to December 2014 were interviewed for a UHR state using both the CAARMS 12/2006 and the SIPS 5.0. Percent overall agreement, kappa, the McNemar-Bowker χ² test, equipercentile methods, and residual analyses were used to investigate diagnostic outcomes and symptom severity or frequency. A conversion algorithm (CONVERT) was validated in an independent UHR sample from the Seoul Youth Clinic (Seoul). Results. There was overall substantial CAARMS-versus-SIPS agreement in the identification of UHR subjects (n = 212, percent overall agreement = 86%; kappa = 0.781, 95% CI from 0.684 to 0.878; McNemar-Bowker test = 0.069), with the exception of the brief limited intermittent psychotic symptoms (BLIPS) subgroup. An equipercentile linking table linked symptom severity and frequency across the CAARMS and SIPS. The conversion algorithm was validated in 93 UHR subjects, showing excellent diagnostic accuracy (CAARMS to SIPS: ROC area 0.929; SIPS to CAARMS: ROC area 0.903). Conclusions. This study provides initial comparability data between the CAARMS and SIPS and will inform ongoing multicentre studies and clinical guidelines for the UHR psychometric diagnostic interview. PMID:27314005
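Percent overall agreement and kappa for the paired CAARMS/SIPS decisions can be computed as below; the labels are toy values, not the study data.

```python
# Percent overall agreement and Cohen's kappa between paired UHR decisions
# from the two instruments (1 = UHR, 0 = not UHR). Toy labels.
import numpy as np
from sklearn.metrics import cohen_kappa_score

caarms = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 1])
sips   = np.array([1, 1, 0, 1, 1, 0, 1, 0, 0, 1])

overall_agreement = (caarms == sips).mean()
kappa = cohen_kappa_score(caarms, sips)
print(f"overall agreement = {overall_agreement:.0%}, kappa = {kappa:.3f}")
```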
[Cost-effectiveness of the deep vein thrombosis diagnosis process in primary care].
Fuentes Camps, Eva; Luis del Val García, José; Bellmunt Montoya, Sergi; Hmimina Hmimina, Sara; Gómez Jabalera, Efren; Muñoz Pérez, Miguel Ángel
2016-04-01
To analyse the cost effectiveness of the application of diagnostic algorithms in patients with a first episode of suspected deep vein thrombosis (DVT) in Primary Care compared with systematic referral to specialised centres. Observational, cross-sectional, analytical study. Patients from hospital emergency rooms referred from Primary Care to complete clinical evaluation and diagnosis. A total of 138 patients with symptoms of a first episode of DVT were recruited; 22 were excluded (no Primary Care report, symptoms for more than 30 days, anticoagulant treatment, and previous DVT). Of the 116 patients finally included, 61% women and the mean age was 71 years. Variables from the Wells and Oudega clinical probability scales, D-dimer (portable and hospital), Doppler ultrasound, and direct costs generated by the three algorithms analysed: all patients were referred systematically, referral according to Wells and Oudega scale. DVT was confirmed in 18.9%. The two clinical probability scales showed a sensitivity of 100% (95% CI: 85.1 to 100) and a specificity of about 40%. With the application of the scales, one third of all referrals to hospital emergency rooms could have been avoided (P<.001). The diagnostic cost could have been reduced by € 8,620 according to Oudega and € 9,741 according to Wells, per 100 patients visited. The application of diagnostic algorithms when a DVT is suspected could lead to better diagnostic management by physicians, and a more cost effective process. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
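An illustrative sketch of the triage logic evaluated here: a clinical probability score (Wells or Oudega) combined with a point-of-care D-dimer decides between primary-care management and referral for Doppler ultrasound. The threshold values are placeholders, not the published cut-offs.

```python
# Score-plus-D-dimer triage for suspected DVT; thresholds are illustrative.
def dvt_triage(score, unlikely_threshold, d_dimer_positive):
    if score <= unlikely_threshold and not d_dimer_positive:
        return "DVT unlikely: manage in primary care, no referral"
    return "refer for Doppler ultrasound"

# per the abstract, roughly one third of referrals could be avoided this way
print(dvt_triage(score=1, unlikely_threshold=1, d_dimer_positive=False))
print(dvt_triage(score=3, unlikely_threshold=1, d_dimer_positive=False))
```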
van Mourik, Maaike S M; van Duijn, Pleun Joppe; Moons, Karel G M; Bonten, Marc J M; Lee, Grace M
2015-01-01
Objective Measuring the incidence of healthcare-associated infections (HAI) is of increasing importance in current healthcare delivery systems. Administrative data algorithms, including (combinations of) diagnosis codes, are commonly used to determine the occurrence of HAI, either to support within-hospital surveillance programmes or as free-standing quality indicators. We conducted a systematic review evaluating the diagnostic accuracy of administrative data for the detection of HAI. Methods Systematic search of Medline, Embase, CINAHL and Cochrane for relevant studies (1995–2013). Methodological quality assessment was performed using QUADAS-2 criteria; diagnostic accuracy estimates were stratified by HAI type and key study characteristics. Results 57 studies were included, the majority aiming to detect surgical site or bloodstream infections. Study designs were very diverse regarding the specification of their administrative data algorithm (code selections, follow-up) and definitions of HAI presence. One-third of studies had important methodological limitations including differential or incomplete HAI ascertainment or lack of blinding of assessors. Observed sensitivity and positive predictive values of administrative data algorithms for HAI detection were very heterogeneous and generally modest at best, both for within-hospital algorithms and for formal quality indicators; accuracy was particularly poor for the identification of device-associated HAI such as central line associated bloodstream infections. The large heterogeneity in study designs across the included studies precluded formal calculation of summary diagnostic accuracy estimates in most instances. Conclusions Administrative data had limited and highly variable accuracy for the detection of HAI, and their judicious use for internal surveillance efforts and external quality assessment is recommended. If hospitals and policymakers choose to rely on administrative data for HAI surveillance, continued improvements to existing algorithms and their robust validation are imperative. PMID:26316651
Downing, Harriet; Thomas-Jones, Emma; Gal, Micaela; Waldron, Cherry-Ann; Sterne, Jonathan; Hollingworth, William; Hood, Kerenza; Delaney, Brendan; Little, Paul; Howe, Robin; Wootton, Mandy; Macgowan, Alastair; Butler, Christopher C; Hay, Alastair D
2012-07-19
Urinary tract infection (UTI) is common in children, and may cause serious illness and recurrent symptoms. However, obtaining a urine sample from young children in primary care is challenging and not feasible for large numbers. Evidence regarding the predictive value of symptoms, signs and urinalysis for UTI in young children is urgently needed to help primary care clinicians better identify children who should be investigated for UTI. This paper describes the protocol for the Diagnosis of Urinary Tract infection in Young children (DUTY) study. The overall study aim is to derive and validate a cost-effective clinical algorithm for the diagnosis of UTI in children presenting to primary care acutely unwell. DUTY is a multicentre, diagnostic and prospective observational study aiming to recruit at least 7,000 children aged before their fifth birthday, being assessed in primary care for any acute, non-traumatic, illness of ≤ 28 days duration. Urine samples will be obtained from eligible consented children, and data collected on medical history and presenting symptoms and signs. Urine samples will be dipstick tested in general practice and sent for microbiological analysis. All children with culture positive urines and a random sample of children with urine culture results in other, non-positive categories will be followed up to record symptom duration and healthcare resource use. A diagnostic algorithm will be constructed and validated and an economic evaluation conducted. The primary outcome will be a validated diagnostic algorithm using a reference standard of a pure/predominant growth of at least >10^3, but usually >10^5 CFU/mL of one, but no more than two uropathogens. We will use logistic regression to identify the clinical predictors (i.e. demographic, medical history, presenting signs and symptoms and urine dipstick analysis results) most strongly associated with a positive urine culture result. We will then use economic evaluation to compare the cost effectiveness of the candidate prediction rules. This study will provide novel, clinically important information on the diagnostic features of childhood UTI and the cost effectiveness of a validated prediction rule, to help primary care clinicians improve the efficiency of their diagnostic strategy for UTI in young children.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kaneko, Masahiro; Kakinuma, Ryutaro; Moriyama, Noriyuki
2010-03-01
Diagnostic MDCT imaging requires a considerable number of images to be read. Moreover, there is a shortage of doctors available to read these images in Japan. Against this background, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis. We have also developed a teleradiology network using a web medical image conference system. In the teleradiology network system, the security of the information network is a critically important issue. Our teleradiology network system can hold web medical image conferences between medical institutions at remote sites using the web medical image conference system. We completed the basic proof-of-concept experiment of the web medical image conference system with an information security solution. We can share the screen of the web medical image conference system from two or more web conference terminals at the same time. Opinions can be exchanged using a camera and a microphone connected to the workstation, which incorporates the diagnostic assistance methods. Biometric face authentication used at the teleradiology site enables file encryption and controlled login. The privacy and information security technology in our security solution ensures compliance with Japanese regulations. As a result, patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new teleradiology network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system, built around the computer-aided diagnosis workstation, and our teleradiology network system can increase diagnostic speed and accuracy while improving the security of medical information.
Measurements and modeling of contemporary radiocarbon in the stratosphere
Kanu, A. M.; Comfort, L. L.; Guilderson, T. P.; ...
2016-01-29
Measurements of the 14C content of carbon dioxide in air collected by high-altitude balloon flights in 2003–2005 reveal the contemporary radiocarbon distribution in the northern midlatitude stratosphere, four decades after the Limited Test Ban Treaty restricted atmospheric testing of nuclear weapons. Comparisons with results from a 3-D chemical-transport model show that the 14CO2 distribution is now largely governed by the altitude/latitude dependence of the natural cosmogenic production rate, stratospheric transport, and propagation into the stratosphere of the decreasing radiocarbon trend in tropospheric CO2 due to fossil fuel combustion. From the observed correlation of 14CO2 with N2O mixing ratios, an annual global mean net flux of 14CO2 to the troposphere of 1.6(±0.4) × 10^17 ‰ mol CO2 yr^-1 and a global production rate of 2.2(±0.6) × 10^26 atoms 14C yr^-1 are empirically derived. Furthermore, the results also indicate that contemporary 14CO2 observations provide highly sensitive diagnostics for stratospheric transport and residence times in models.
Sellbom, Martin; Arbisi, Paul A
2017-01-01
This special section considers 9 independent articles that seek to link the Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008/2011) to contemporary models of psychopathology. Sellbom (this issue) maps the Specific Problems scales onto hierarchical psychopathology structures, whereas Romero, Toorabally, Burchett, Tarescavage, and Glassmire (this issue) and Shkalim, Almagor, and Ben-Porath (this issue) show evidence of linking the instruments' scales to diagnostic representations of common higher order psychopathology constructs. McCord, Achee, Cannon, Harrop, and Poynter (this issue) link the MMPI-2-RF scales to psychophysiological constructs inspired by the National Institute of Mental Health (NIMH) Research Domain Criteria. Sellbom and Smith (this issue) find support for MMPI-2-RF scale hypotheses in covering personality psychopathology in general, whereas Klein Haneveld, Kamphuis, Smid, and Forbey (this issue) and Kutchen et al. (this issue) demonstrate the utility of the MMPI-2-RF in capturing contemporary conceptualizations of the psychopathic personality. Finally, Franz, Harrop, and McCord (this issue) and Rogers et al. (this issue) mapped the MMPI-2-RF scales onto more specific transdiagnostic constructs reflecting interpersonal functioning and suicide behavior proneness, respectively.
[Management of acute low back pain without trauma - an algorithm].
Melcher, Carolin; Wegener, Bernd; Jansson, Volkmar; Mutschler, Wolf; Kanz, Karl-Georg; Birkenmaier, Christof
2018-05-14
Low back pain is a common problem for primary care providers, outpatient clinics and A&E departments. The predominant symptoms are those of so-called "unspecific back pain", but serious pathologies can be concealed by the clinical signs. Less experienced colleagues in particular have problems treating these patients, as - despite the multitude of recommendations and guidelines - there is no generally accepted algorithm. After a literature search (Medline/Cochrane), 158 articles were selected from 15,000 papers and classified according to their level of evidence. These were reconciled with the clinical guidelines of the orthopaedic and pain-physician associations in Europe, North America and overseas, and with the experience of specialists at LMU Munich, in order to achieve consistency with the recommendations in the literature as well as feasibility and practical relevance in everyday clinical work. An algorithm was formed to provide the crucial differential diagnoses of lumbar back pain according to their clinical relevance and to provide a plan of action offering reasonable diagnostic and therapeutic steps. As a consequence of distinct binary decisions, low back pain patients should be treated at any given time according to the guidelines, with emergencies detected, unnecessary diagnostic testing and interventions averted and reasonable treatment initiated pursuant to the underlying pathology. In the context of the available evidence, a clinical algorithm has been developed that translates the complex diagnostic testing of acute low back pain into a transparent, structured and systematic guideline. Georg Thieme Verlag KG Stuttgart · New York.
Koa-Wing, Michael; Nakagawa, Hiroshi; Luther, Vishal; Jamil-Copley, Shahnaz; Linton, Nick; Sandler, Belinda; Qureshi, Norman; Peters, Nicholas S; Davies, D Wyn; Francis, Darrel P; Jackman, Warren; Kanagaratnam, Prapa
2015-11-15
Ripple Mapping (RM) is designed to overcome the limitations of existing isochronal 3D mapping systems by representing the intracardiac electrogram as a dynamic bar on a surface bipolar voltage map that changes in height according to the electrogram voltage-time relationship, relative to a fiduciary point. We tested the hypothesis that standard approaches to atrial tachycardia CARTO™ activation maps were inadequate for RM creation and interpretation. From the results, we aimed to develop an algorithm to optimize RMs for future prospective testing on a clinical RM platform. CARTO-XP™ activation maps from atrial tachycardia ablations were reviewed by two blinded assessors on an off-line RM workstation. Ripple Maps were graded according to a diagnostic confidence scale (Grade I - high confidence with clear pattern of activation through to Grade IV - non-diagnostic). The RM-based diagnoses were corroborated against the clinical diagnoses. 43 RMs from 14 patients were classified as Grade I (5 [11.5%]); Grade II (17 [39.5%]); Grade III (9 [21%]) and Grade IV (12 [28%]). Causes of low gradings/errors included the following: insufficient chamber point density; window-of-interest<100% of cycle length (CL); <95% tachycardia CL mapped; variability of CL and/or unstable fiducial reference marker; and suboptimal bar height and scar settings. A data collection and map interpretation algorithm has been developed to optimize Ripple Maps in atrial tachycardias. This algorithm requires prospective testing on a real-time clinical platform. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
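The acquisition criteria identified above as causes of low map grades lend themselves to a simple pre-analysis checklist. The sketch below encodes those criteria; the point-density threshold is an assumed placeholder, since the abstract does not quantify it.

```python
# Minimal sketch of a map-quality checklist mirroring the criteria cited above.
# The point-density threshold is an assumed placeholder; the other thresholds
# (window of interest >= 100% of CL, >= 95% of the tachycardia CL mapped,
# stable CL and reference) come from the causes of low grading listed above.
def ripple_map_checklist(n_points, woi_percent_cl, percent_cl_mapped,
                         cl_stable, reference_stable, min_points=500):
    issues = []
    if n_points < min_points:
        issues.append("insufficient chamber point density")
    if woi_percent_cl < 100:
        issues.append("window of interest < 100% of cycle length")
    if percent_cl_mapped < 95:
        issues.append("< 95% of tachycardia cycle length mapped")
    if not cl_stable:
        issues.append("variable cycle length")
    if not reference_stable:
        issues.append("unstable fiducial reference marker")
    return issues or ["map meets the listed acquisition criteria"]

print(ripple_map_checklist(420, 100, 92, True, False))
```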
Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam
2016-01-01
The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully-susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0 and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified. Novel variants of known resistance factors and incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family, rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because those genes were not seen or seen infrequently in the test set. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.
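The headline comparison above — per-antibiotic categorical agreement between genotype-based calls and phenotypic susceptibility testing — reduces to simple counting. A minimal sketch, with illustrative calls rather than the study's isolates:

```python
# Minimal sketch of the evaluation step: categorical agreement of genotype-based
# resistance calls against phenotypic AST results. The calls below are
# illustrative placeholders, not data from the study.
def agreement(predicted, phenotype):
    matched = sum(p == t for p, t in zip(predicted, phenotype))
    return 100.0 * matched / len(phenotype)

phenotype = ["R", "S", "S", "R", "S", "R", "S", "S"]   # reference AST calls
rules     = ["R", "S", "S", "S", "S", "R", "S", "S"]   # rules-based predictions
ml_model  = ["R", "S", "R", "R", "S", "R", "S", "S"]   # machine-learning predictions

print(f"rules-based agreement: {agreement(rules, phenotype):.1f}%")
print(f"machine-learning agreement: {agreement(ml_model, phenotype):.1f}%")
```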
Rosier, Peter F W M; Giarenis, Ilias; Valentini, Francoise A; Wein, Alan; Cardozo, Linda
2014-06-01
The ICI-RS Think Tank discussed the diagnostic process for patients who present with symptoms and signs of lower urinary tract (LUT) dysfunction. This manuscript reflects the Think Tank's summary and opinion. An overview of the existing evidence and consensus regarding urodynamic testing was presented and discussed in relation to contemporary treatment strategies. Evidence of the validity of the diagnostic process in relation to the contemporary management paradigm is incomplete, scattered, and sometimes conflicting, and therefore a process redesign may be necessary. The Think Tank's suggestion, contained in this manuscript, is that the symptoms and signs that patients present with can be more precisely delineated as syndromes. The overactive bladder syndrome (OAB-S); the stress urinary incontinence syndrome (SUI-S); the urinary incontinence syndrome (UI-S); the voiding dysfunction syndrome (VD-S); and/or the neurogenic LUT dysfunction syndrome (NLUTD-S) may become evidence-based starting points for initial management. Consistent addition of the word syndrome, if adequately defined, acknowledges the uncertainty, but will improve outcomes and the selection of patients who need further (invasive) diagnosis before management. The ICI-RS Think Tank has summarized the level of evidence for UDS and discussed the evidence in association with the currently changing management paradigm. The ICI-RS Think Tank recommends that the diagnostic process for patients with LUTD can be redesigned. Carefully delineated and evidence-based LUTD syndromes may better indicate, personalize and improve the outcome of initial management, and may also contribute to improved and rational selection of patients for invasive UDS. Neurourol. Urodynam. 33:581-586, 2014. © 2014 Wiley Periodicals, Inc.
Lee, Jae-Hong; Kim, Do-Hyung; Jeong, Seong-Nyum; Choi, Seong-Ho
2018-04-01
The aim of the current study was to develop a computer-assisted detection system based on a deep convolutional neural network (CNN) algorithm and to evaluate the potential usefulness and accuracy of this system for the diagnosis and prediction of periodontally compromised teeth (PCT). Combining pretrained deep CNN architecture and a self-trained network, periapical radiographic images were used to determine the optimal CNN algorithm and weights. The diagnostic and predictive accuracy, sensitivity, specificity, positive predictive value, negative predictive value, receiver operating characteristic (ROC) curve, area under the ROC curve, confusion matrix, and 95% confidence intervals (CIs) were calculated using our deep CNN algorithm, based on a Keras framework in Python. The periapical radiographic dataset was split into training (n=1,044), validation (n=348), and test (n=348) datasets. With the deep learning algorithm, the diagnostic accuracy for PCT was 81.0% for premolars and 76.7% for molars. Using 64 premolars and 64 molars that were clinically diagnosed as severe PCT, the accuracy of predicting extraction was 82.8% (95% CI, 70.1%-91.2%) for premolars and 73.4% (95% CI, 59.9%-84.0%) for molars. We demonstrated that the deep CNN algorithm was useful for assessing the diagnosis and predictability of PCT. Therefore, with further optimization of the PCT dataset and improvements in the algorithm, a computer-aided detection system can be expected to become an effective and efficient method of diagnosing and predicting PCT.
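A transfer-learning setup of the general kind described (a pretrained convolutional base plus a self-trained classification head, built with Keras in Python) might look like the sketch below. The choice of VGG16, the input size and the head layers are assumptions; the study does not specify its exact architecture here.

```python
# Minimal sketch of a transfer-learning classifier of the kind described above
# (pretrained convolutional base plus a self-trained head) in Keras/TensorFlow.
# VGG16, the input size, and the head layers are assumptions, not the study's
# published architecture.
import tensorflow as tf

base = tf.keras.applications.VGG16(include_top=False, weights="imagenet",
                                   input_shape=(224, 224, 3))
base.trainable = False  # keep the pretrained convolutional filters fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # PCT vs non-PCT
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc"), "accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=20)  # with a radiograph dataset
```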
Rizzo, G; Capponi, A; Pietrolucci, M E; Capece, A; Aiello, E; Mammarella, S; Arduini, D
2011-08-01
To describe a novel algorithm, based on the new display technology 'OmniView', developed to visualize diagnostic sagittal and coronal planes of the fetal brain from volumes obtained by three-dimensional (3D) ultrasonography. We developed an algorithm to image standard neurosonographic planes by drawing dissecting lines through the axial transventricular view of 3D volume datasets acquired transabdominally. The algorithm was tested on 106 normal fetuses at 18-24 weeks of gestation and the visualization rates of brain diagnostic planes were evaluated by two independent reviewers. The algorithm was also applied to nine cases with proven brain defects. The two reviewers, using the algorithm on normal fetuses, found satisfactory images with visualization rates ranging between 71.7% and 96.2% for sagittal planes and between 76.4% and 90.6% for coronal planes. The agreement rate between the two reviewers, as expressed by Cohen's kappa coefficient, was > 0.93 for sagittal planes and > 0.89 for coronal planes. All nine abnormal volumes were identified by a single observer from among a series including normal brains, and eight of these nine cases were diagnosed correctly. This novel algorithm can be used to visualize standard sagittal and coronal planes in the fetal brain. This approach may simplify the examination of the fetal brain and reduce dependency of success on operator skill. Copyright © 2011 ISUOG. Published by John Wiley & Sons, Ltd.
Accounting for False Positive HIV Tests: Is Visceral Leishmaniasis Responsible?
Shanks, Leslie; Ritmeijer, Koert; Piriou, Erwan; Siddiqui, M. Ruby; Kliescikova, Jarmila; Pearce, Neil; Ariti, Cono; Muluneh, Libsework; Masiga, Johnson; Abebe, Almaz
2015-01-01
Background Co-infection with HIV and visceral leishmaniasis is an important consideration in treatment of either disease in endemic areas. Diagnosis of HIV in resource-limited settings relies on rapid diagnostic tests used together in an algorithm. A limitation of the HIV diagnostic algorithm is that it is vulnerable to falsely positive reactions due to cross reactivity. It has been postulated that visceral leishmaniasis (VL) infection can increase this risk of false positive HIV results. This cross sectional study compared the risk of false positive HIV results in VL patients with non-VL individuals. Methodology/Principal Findings Participants were recruited from 2 sites in Ethiopia. The Ethiopian algorithm of a tiebreaker using 3 rapid diagnostic tests (RDTs) was used to test for HIV. The gold standard test was the Western Blot, with indeterminate results resolved by PCR testing. Every RDT screen positive individual was included for testing with the gold standard along with 10% of all negatives. The final analysis included 89 VL and 405 non-VL patients. HIV prevalence was found to be 12.8% (47/ 367) in the VL group compared to 7.9% (200/2526) in the non-VL group. The RDT algorithm in the VL group yielded 47 positives, 4 false positives, and 38 negatives. The same algorithm for those without VL had 200 positives, 14 false positives, and 191 negatives. Specificity and positive predictive value for the group with VL was less than the non-VL group; however, the difference was not found to be significant (p = 0.52 and p = 0.76, respectively). Conclusion The test algorithm yielded a high number of HIV false positive results. However, we were unable to demonstrate a significant difference between groups with and without VL disease. This suggests that the presence of endemic visceral leishmaniasis alone cannot account for the high number of false positive HIV results in our study. PMID:26161864
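The group comparison rests on standard 2x2-table arithmetic for specificity and positive predictive value. A minimal sketch with placeholder counts (not a re-tabulation of the study data):

```python
# Minimal sketch of the 2x2-table arithmetic used to compare specificity and
# positive predictive value between groups. The counts below are illustrative
# placeholders, not a re-tabulation of the study data.
def specificity_ppv(tp, fp, tn, fn):
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return specificity, ppv

vl_spec, vl_ppv = specificity_ppv(tp=43, fp=4, tn=38, fn=1)
non_vl_spec, non_vl_ppv = specificity_ppv(tp=186, fp=14, tn=191, fn=2)
print(f"VL group:     specificity {vl_spec:.2f}, PPV {vl_ppv:.2f}")
print(f"non-VL group: specificity {non_vl_spec:.2f}, PPV {non_vl_ppv:.2f}")
```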
Kudo, Kohsuke; Uwano, Ikuko; Hirai, Toshinori; Murakami, Ryuji; Nakamura, Hideo; Fujima, Noriyuki; Yamashita, Fumio; Goodwin, Jonathan; Higuchi, Satomi; Sasaki, Makoto
2017-04-10
The purpose of the present study was to compare different software algorithms for processing DSC perfusion images of cerebral tumors with respect to i) the relative CBV (rCBV) calculated, ii) the cutoff value for discriminating low- and high-grade gliomas, and iii) the diagnostic performance for differentiating these tumors. Following institutional review board approval, informed consent was obtained from all patients. Thirty-five patients with primary glioma (grade II, 9; grade III, 8; and grade IV, 18 patients) were included. DSC perfusion imaging was performed with a 3-Tesla MRI scanner. CBV maps were generated by using 11 different algorithms from four commercially available software packages and one academic program. The rCBV of each tumor compared to normal white matter was calculated by ROI measurements. Differences in rCBV value were compared between algorithms for each tumor grade. Receiver operating characteristic analysis was conducted to evaluate the diagnostic performance of the different algorithms for differentiating between grades. Several algorithms showed significant differences in rCBV, especially for grade IV tumors. When differentiating between low- (II) and high-grade (III/IV) tumors, the area under the ROC curve (Az) was similar (range 0.85-0.87), and there were no significant differences in Az between any pair of algorithms. In contrast, the optimal cutoff values varied between algorithms (range 4.18-6.53). Tumor rCBV values and cutoff values for discriminating low- and high-grade gliomas differed between software packages, suggesting that optimal software-specific cutoff values should be used for diagnosis of high-grade gliomas.
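The analysis pipeline sketched below mirrors the steps described: a tumor-to-white-matter rCBV ratio per patient, followed by ROC analysis to obtain an AUC and a cutoff. The rCBV values, grades and the Youden-index cutoff rule are illustrative assumptions, not the study's data or its cutoff-selection method.

```python
# Minimal sketch of the rCBV/ROC analysis described above: a tumor-to-white-matter
# CBV ratio per patient, then ROC analysis for low- vs high-grade discrimination.
# The rCBV values and grades below are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rcbv       = np.array([1.8, 2.3, 3.1, 5.2, 2.7, 4.9, 5.6, 6.2, 7.4, 8.1])
high_grade = np.array([0,   0,   0,   0,   0,   1,   1,   1,   1,   1])

auc = roc_auc_score(high_grade, rcbv)
fpr, tpr, thresholds = roc_curve(high_grade, rcbv)
youden = tpr - fpr
cutoff = thresholds[np.argmax(youden)]  # one common way to pick an "optimal" cutoff
print(f"AUC = {auc:.2f}, Youden-optimal rCBV cutoff = {cutoff:.2f}")
```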
Murungi, Moses; Fulton, Travis; Reyes, Raquel; Matte, Michael; Ntaro, Moses; Mulogo, Edgar; Nyehangane, Dan; Juliano, Jonathan J; Siedner, Mark J; Boum, Yap; Boyce, Ross M
2017-05-01
Poor specificity may negatively impact rapid diagnostic test (RDT)-based diagnostic strategies for malaria. We performed real-time PCR on a subset of subjects who had undergone diagnostic testing with a multiple-antigen (histidine-rich protein 2 [HRP2] and pan-lactate dehydrogenase [pLDH]) RDT and microscopy. We determined the sensitivity and specificity of the RDT in comparison to results of PCR for the detection of Plasmodium falciparum malaria. We developed and evaluated a two-step algorithm utilizing the multiple-antigen RDT to screen patients, followed by confirmatory microscopy for those individuals with HRP2-positive (HRP2+)/pLDH-negative (pLDH-) results. In total, dried blood spots (DBS) were collected from 276 individuals. There were 124 (44.9%) individuals with an HRP2+/pLDH+ result, 94 (34.1%) with an HRP2+/pLDH- result, and 58 (21%) with a negative RDT result. The sensitivity and specificity of the RDT compared to results with real-time PCR were 99.4% (95% confidence interval [CI], 95.9 to 100.0%) and 46.7% (95% CI, 37.7 to 55.9%), respectively. Of the 94 HRP2+/pLDH- results, only 32 (34.0%) and 35 (37.2%) were positive by microscopy and PCR, respectively. The sensitivity and specificity of the two-step algorithm compared to results with real-time PCR were 95.5% (95% CI, 90.5 to 98.0%) and 91.0% (95% CI, 84.1 to 95.2%), respectively. HRP2 antigen bands demonstrated poor specificity for the diagnosis of malaria compared to that of real-time PCR in a high-transmission setting. The most likely explanation for this finding is the persistence of HRP2 antigenemia following treatment of an acute infection. The two-step diagnostic algorithm utilizing microscopy as a confirmatory test for indeterminate HRP2+/pLDH- results showed significantly improved specificity with little loss of sensitivity in a high-transmission setting. Copyright © 2017 American Society for Microbiology.
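The two-step algorithm itself is a short piece of decision logic: a negative RDT ends the workup, a concordant HRP2+/pLDH+ result is treated as malaria, and only indeterminate HRP2+/pLDH- results go to confirmatory microscopy. A minimal sketch, with illustrative field names:

```python
# Minimal sketch of the two-step algorithm described above: the multiple-antigen
# RDT screens all patients, and only indeterminate HRP2+/pLDH- results go on to
# confirmatory microscopy. Field names are illustrative assumptions.
def two_step_diagnosis(hrp2_positive, pldh_positive, microscopy_positive=None):
    if not hrp2_positive:
        return "negative"              # RDT negative: no malaria reported
    if pldh_positive:
        return "positive"              # HRP2+/pLDH+: treated as malaria
    # HRP2+/pLDH-: indeterminate, resolve with microscopy
    if microscopy_positive is None:
        return "send for confirmatory microscopy"
    return "positive" if microscopy_positive else "negative"

print(two_step_diagnosis(True, True))                               # positive
print(two_step_diagnosis(True, False))                              # needs microscopy
print(two_step_diagnosis(True, False, microscopy_positive=False))   # negative
```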
Primer on clinical acid-base problem solving.
Whittier, William L; Rutecki, Gregory W
2004-03-01
Acid-base problem solving has been an integral part of medical practice in recent generations. Diseases discovered in the last 30-plus years, for example, Bartter syndrome and Gitelman syndrome, D-lactic acidosis, and bulimia nervosa, can be diagnosed according to characteristic acid-base findings. Accuracy in acid-base problem solving is a direct result of a reproducible, systematic approach to arterial pH, partial pressure of carbon dioxide, bicarbonate concentration, and electrolytes. The 'Rules of Five' is one tool that enables clinicians to determine the cause of simple and complex disorders, even triple acid-base disturbances, with consistency. In addition, other electrolyte abnormalities that accompany acid-base disorders, such as hypokalemia, can be incorporated into algorithms that complement the Rules and contribute to efficient problem solving in a wide variety of diseases. Recently urine electrolytes have also assisted clinicians in further characterizing select disturbances. Acid-base patterns, in many ways, can serve as a 'common diagnostic pathway' shared by all subspecialties in medicine. From infectious disease (eg, lactic acidemia with highly active antiretroviral therapy) through endocrinology (eg, Conn's syndrome, high urine chloride alkalemia) to the interface between primary care and psychiatry (eg, bulimia nervosa with multiple potential acid-base disturbances), acid-base problem solving is the key to unlocking otherwise unrelated diagnoses. Inasmuch as the Rules are clinical tools, they are applied throughout this monograph to diverse pathologic conditions typical in contemporary practice.
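As a flavor of the systematic arithmetic such stepwise approaches rely on, the sketch below computes an anion gap and the expected respiratory compensation for a metabolic acidosis (Winter's formula). It is a generic illustration only, not the authors' 'Rules of Five'; the normal-anion-gap threshold is an assumption.

```python
# Generic sketch of the arithmetic underlying systematic acid-base reading:
# anion gap and expected respiratory compensation (Winter's formula) for a
# metabolic acidosis. This is an illustration, not the authors' "Rules of Five".
def anion_gap(na, cl, hco3):
    return na - (cl + hco3)

def winters_expected_pco2(hco3):
    # Expected PaCO2 (mmHg) for a simple metabolic acidosis, +/- 2 mmHg.
    return 1.5 * hco3 + 8

na, cl, hco3, pco2, ph = 140, 104, 12, 40, 7.28
gap = anion_gap(na, cl, hco3)
expected = winters_expected_pco2(hco3)
# An upper normal anion gap of ~12 is assumed here for illustration.
print(f"anion gap = {gap} (elevated)" if gap > 12 else f"anion gap = {gap}")
if abs(pco2 - expected) > 2:
    print(f"measured PaCO2 {pco2} vs expected ~{expected:.0f}: "
          "suggests a superimposed respiratory disturbance")
```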
Dynamical Vertex Approximation for the Hubbard Model
NASA Astrophysics Data System (ADS)
Toschi, Alessandro
A full understanding of correlated electron systems in the physically relevant situations of three and two dimensions represents a challenge for contemporary condensed matter theory. However, in recent years considerable progress has been achieved by means of increasingly more powerful quantum many-body algorithms, applied to the basic model for correlated electrons, the Hubbard Hamiltonian. Here, I will review the physics emerging from studies performed with the dynamical vertex approximation, which includes diagrammatic corrections to the local description of the dynamical mean field theory (DMFT). In particular, I will first discuss the phase diagram in three dimensions with a special focus on the commensurate and incommensurate magnetic phases, their (quantum) critical properties, and the impact of fluctuations on electronic lifetimes and spectral functions. In two dimensions, the effects of non-local fluctuations beyond DMFT grow enormously, determining the appearance of a low-temperature insulating behavior for all values of the interaction in the unfrustrated model: Here the prototypical features of the Mott-Hubbard metal-insulator transition, as well as the existence of magnetically ordered phases, are completely overwhelmed by antiferromagnetic fluctuations of exponentially large extension, in accordance with the Mermin-Wagner theorem. Eventually, by a fluctuation diagnostics analysis of cluster DMFT self-energies, the same magnetic fluctuations are identified as responsible for the pseudogap regime in the hole-doped frustrated case, with important implications for the theoretical modeling of the cuprate physics.
Sensor Fusion, Prognostics, Diagnostics and Failure Mode Control for Complex Aerospace Systems
2010-10-01
algorithm and to then tune the candidates individually using known metaheuristics. As will be...parallel. The result of this arrangement is that the processing is a form that is analogous to standard parallel genetic algorithms, and as such...search algorithm then uses the hybrid of fitness data to rank the results. The ETRAS controller is developed using pre-selection, showing that a
Zamdborg, Leonid; Holloway, David M.; Merelo, Juan J.; Levchenko, Vladimir F.; Spirov, Alexander V.
2015-01-01
Modern evolutionary computation utilizes heuristic optimizations based upon concepts borrowed from the Darwinian theory of natural selection. Their demonstrated efficacy has reawakened an interest in other aspects of contemporary biology as an inspiration for new algorithms. However, amongst the many excellent candidates for study, contemporary models of biological macroevolution attract special attention. We believe that a vital direction in this field must be algorithms that model the activity of “genomic parasites”, such as transposons, in biological evolution. Many evolutionary biologists posit that it is the co-evolution of populations with their genomic parasites that permits the high efficiency of evolutionary searches found in the living world. This publication is our first step in the direction of developing a minimal assortment of algorithms that simulate the role of genomic parasites. Specifically, we started in the domain of genetic algorithms (GA) and selected the Artificial Ant Problem as a test case. This navigation problem is widely known as a classical benchmark test and possesses a large body of literature. We add new objects to the standard toolkit of GA - artificial transposons and a collection of operators that operate on them. We define these artificial transposons as a fragment of an ant's code with properties that cause it to stand apart from the rest. The minimal set of operators for transposons is a transposon mutation operator, and a transposon reproduction operator that causes a transposon to multiply within the population of hosts. An analysis of the population dynamics of transposons within the course of ant evolution showed that transposons are involved in the processes of propagation and selection of blocks of ant navigation programs. During this time, the speed of evolutionary search increases significantly. We concluded that artificial transposons, analogous to real transposons, are truly capable of acting as intelligent mutators that adapt in response to an evolutionary problem in the course of co-evolution with their hosts. PMID:25767296
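The two transposon operators described — a transposon mutation operator and a reproduction operator that copies the fragment into host genomes — can be sketched on toy "ant program" genomes as follows. The instruction set, rates and representation are assumptions for illustration; the paper's GA is considerably richer.

```python
# Minimal sketch of the two transposon operators described above, on toy "ant
# program" genomes represented as symbol lists. The instruction set, rates and
# representation are assumptions for illustration only.
import random
random.seed(1)

ACTIONS = ["L", "R", "M", "IF_FOOD"]          # toy instruction set
population = [[random.choice(ACTIONS) for _ in range(12)] for _ in range(6)]
transposon = ["IF_FOOD", "M", "M"]            # a code fragment tracked as a transposon

def transposon_mutation(tn, rate=0.3):
    """Mutate symbols inside the transposon itself."""
    return [random.choice(ACTIONS) if random.random() < rate else s for s in tn]

def transposon_reproduction(tn, pop, copies=2):
    """Insert copies of the transposon into randomly chosen host genomes."""
    for host in random.sample(pop, copies):
        pos = random.randrange(len(host))
        host[pos:pos] = tn                    # splice the fragment into the host
    return pop

transposon = transposon_mutation(transposon)
population = transposon_reproduction(transposon, population)
print(population[0])
```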
Accuracy of vaginal symptom self-diagnosis algorithms for deployed military women.
Ryan-Wenger, Nancy A; Neal, Jeremy L; Jones, Ashley S; Lowe, Nancy K
2010-01-01
Deployed military women have an increased risk for development of vaginitis due to extreme temperatures, primitive sanitation, hygiene and laundry facilities, and unavailable or unacceptable healthcare resources. The Women in the Military Self-Diagnosis (WMSD) and treatment kit was developed as a field-expedient solution to this problem. The primary study aims were to evaluate the accuracy of women's self-diagnosis of vaginal symptoms and eight diagnostic algorithms and to predict potential self-medication omission and commission error rates. Participants included 546 active duty, deployable Army (43.3%) and Navy (53.6%) women with vaginal symptoms who sought healthcare at troop medical clinics on base.In the clinic lavatory, women conducted a self-diagnosis using a sterile cotton swab to obtain vaginal fluid, a FemExam card to measure positive or negative pH and amines, and the investigator-developed WMSD Decision-Making Guide. Potential self-diagnoses were "bacterial infection" (bacterial vaginosis [BV] and/or trichomonas vaginitis [TV]), "yeast infection" (candida vaginitis [CV]), "no infection/normal," or "unclear." The Affirm VPIII laboratory reference standard was used to detect clinically significant amounts of vaginal fluid DNA for organisms associated with BV, TV, and CV. Women's self-diagnostic accuracy was 56% for BV/TV and 69.2% for CV. False-positives would have led to a self-medication commission error rate of 20.3% for BV/TV and 8% for CV. Potential self-medication omission error rates due to false-negatives were 23.7% for BV/TV and 24.8% for CV. The positive predictive value of diagnostic algorithms ranged from 0% to 78.1% for BV/TV and 41.7% for CV. The algorithms were based on clinical diagnostic standards. The nonspecific nature of vaginal symptoms, mixed infections, and a faulty device intended to measure vaginal pH and amines explain why none of the algorithms reached the goal of 95% accuracy. The next prototype of the WMSD kit will not include nonspecific vaginal signs and symptoms in favor of recently available point-of-care devices that identify antigens or enzymes of the causative BV, TV, and CV organisms.
Statistical physics of medical diagnostics: Study of a probabilistic model.
Mashaghi, Alireza; Ramezanpour, Abolfazl
2018-03-01
We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.
[Diagnostic algorithm in chronic myeloproliferative diseases (CMPD)].
Haferlach, Torsten; Bacher, Ulrike; Kern, Wolfgang; Schnittger, Susanne; Haferlach, Claudia
2007-09-15
The Philadelphia-negative chronic myeloproliferative diseases (CMPD) are very complex and heterogeneous disorders. They are represented by polycythemia vera (PV), chronic idiopathic myelofibrosis (CIMF), essential thrombocythemia (ET), CMPD/unclassifiable (CMPD-U), chronic neutrophilic leukemia (CNL), and chronic eosinophilic leukemia/hypereosinophilic syndrome (CEL/HES) according to the WHO classification. In the past, diagnostics focused mainly on clinical and morphological aspects, but in recent years cytogenetics and fluorescence in situ hybridization (FISH) have entered routine workups, as chromosomal abnormalities are relevant for prognosis and classification. Recently, there has been rapid progress in the field of molecular characterization: the JAK2V617F mutation, which shows a high incidence in PV, CIMF, and ET, already plays a central role and will probably soon be included in follow-up procedures. With the detection of mutations in exon 12 of the JAK2 gene and of mutations in the MPL gene, the variety of activating mutations in the CMPD is still increasing. In CEL/HES the detection of the FIP1L1-PDGFRA fusion gene and overexpression of PDGFRA and PDGFRB have led to targeted therapy with tyrosine kinase inhibitors. Thus, diagnostics in the CMPD are shifting toward a multimodal concept based on a combination of methods - cyto-/histomorphology, cytogenetics, and individual molecular methods - which can be integrated into a diagnostic algorithm.
Lykiardopoulos, Byron; Hagström, Hannes; Fredrikson, Mats; Ignatova, Simone; Stål, Per; Hultcrantz, Rolf; Ekstedt, Mattias; Kechagias, Stergios
2016-01-01
Detection of advanced fibrosis (F3-F4) in nonalcoholic fatty liver disease (NAFLD) is important for ascertaining prognosis. Serum markers have been proposed as alternatives to biopsy. We attempted to develop a novel algorithm for detection of advanced fibrosis based on a more efficient combination of serological markers and to compare this with established algorithms. We included 158 patients with biopsy-proven NAFLD. Of these, 38 had advanced fibrosis. The following fibrosis algorithms were calculated: NAFLD fibrosis score, BARD, NIKEI, NASH-CRN regression score, APRI, FIB-4, King's score, GUCI, Lok index, Forns score, and ELF. The study population was randomly divided into a training and a validation group. A multiple logistic regression analysis using bootstrapping methods was applied to the training group. Among the many variables analyzed, age, fasting glucose, hyaluronic acid and AST were included, and a model (LINKI-1) for predicting advanced fibrosis was created. Moreover, these variables were combined with platelet count in a mathematical way exaggerating the opposing effects, and alternative models (LINKI-2) were also created. Models were compared using area under the receiver operator characteristic curves (AUROC). Of the established algorithms, FIB-4 and King's score had the best diagnostic accuracy with AUROCs 0.84 and 0.83, respectively. Higher accuracy was achieved with the novel LINKI algorithms. AUROCs in the total cohort were 0.91 for LINKI-1 and 0.89 for the LINKI-2 models. The LINKI algorithms for detection of advanced fibrosis in NAFLD showed better accuracy than established algorithms and should be validated in further studies including larger cohorts.
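As an illustration of how such serological algorithms combine routine variables, the sketch below computes FIB-4, one of the established scores compared above, from its published formula (age × AST / (platelets × √ALT)). The patient values and the quoted cutoffs are placeholders/assumptions; the LINKI models themselves are not reproduced here.

```python
# Illustration of how serological fibrosis scores combine routine variables,
# using the published FIB-4 formula: age x AST / (platelets x sqrt(ALT)).
# The patient values are placeholders; the LINKI models are not reproduced here.
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

score = fib4(age_years=57, ast_u_l=62, alt_u_l=48, platelets_10e9_l=180)
print(f"FIB-4 = {score:.2f}")
# Commonly cited FIB-4 cutoffs (assumed here for context): < 1.30 makes advanced
# fibrosis unlikely, > 2.67 makes it likely, with an indeterminate zone between.
```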
Optimizing Tissue Sampling for the Diagnosis, Subtyping, and Molecular Analysis of Lung Cancer
Ofiara, Linda Marie; Navasakulpong, Asma; Beaudoin, Stephane; Gonzalez, Anne Valerie
2014-01-01
Lung cancer has entered the era of personalized therapy with histologic subclassification and the presence of molecular biomarkers becoming increasingly important in therapeutic algorithms. At the same time, biopsy specimens are becoming increasingly smaller as diagnostic algorithms seek to establish diagnosis and stage with the least invasive techniques. Here, we review techniques used in the diagnosis of lung cancer including bronchoscopy, ultrasound-guided bronchoscopy, transthoracic needle biopsy, and thoracoscopy. In addition to discussing indications and complications, we focus our discussion on diagnostic yields and the feasibility of testing for molecular biomarkers such as epidermal growth factor receptor and anaplastic lymphoma kinase, emphasizing the importance of a sufficient tumor biopsy. PMID:25295226
Multispectral autofluorescence diagnosis of non-melanoma cutaneous tumors
NASA Astrophysics Data System (ADS)
Borisova, Ekaterina; Dogandjiiska, Daniela; Bliznakova, Irina; Avramov, Latchezar; Pavlova, Elmira; Troyanova, Petranka
2009-07-01
Fluorescence analysis of basal cell carcinoma (BCC), squamous cell carcinoma (SCC), keratoacanthoma and benign cutaneous lesions is carried out during the initial phase of a clinical trial at the National Oncological Center - Sofia. Excitation sources with emission maxima at 365, 380, 405, 450 and 630 nm are applied to better differentiate the fluorescence of nonmelanoma malignant cutaneous lesions and to discriminate it spectrally from benign pathologies. Major spectral features are addressed, and diagnostic discrimination algorithms based on the lesions' emission properties are proposed. The diagnostic algorithms and evaluation procedures developed will be applied to an optical biopsy clinical system for skin cancer detection within the National Oncological Center and other university hospital dermatological departments in our country.
[Coagulation Monitoring and Bleeding Management in Cardiac Surgery].
Bein, Berthold; Schiewe, Robert
2018-05-01
The transfusion of allogeneic blood products is associated with increased morbidity and mortality. Impaired hemostasis is frequently found in patients undergoing cardiac surgery and may in turn cause bleeding and transfusions. A goal-directed coagulation management addressing the often complex coagulation disorders requires sophisticated diagnostics and may improve both patient outcomes and costs. Recent data suggest that coagulation management based on a rational algorithm is more effective than traditional therapy based on conventional laboratory variables such as PT and INR. Platelet inhibitors, coumarins, direct oral anticoagulants and heparin need different diagnostic and therapeutic approaches. An algorithm specifically developed for use during cardiac surgery is presented. Georg Thieme Verlag KG Stuttgart · New York.
ERIC Educational Resources Information Center
Wang, Chun
2013-01-01
Cognitive diagnostic computerized adaptive testing (CD-CAT) purports to combine the strengths of both CAT and cognitive diagnosis. Cognitive diagnosis models aim at classifying examinees into the correct mastery profile group so as to pinpoint the strengths and weakness of each examinee whereas CAT algorithms choose items to determine those…
"Scientific peep show": the human body in contemporary science museums.
Canadelli, Elena
2011-01-01
The essay focuses on the discourse about the human body developed by contemporary science museums with educational and instructive purposes directed at the general public. These museums aim mostly at mediating concepts such as health and prevention. The current scenario is linked with two examples of past museums: the popular anatomical museums which emerged during the 19th century and the health museums that thrived between 1910 and 1940. Along this museological path of human body self-care, displays moved from the emotionally involving anatomical Venuses to the inexpressive Transparent Man, and from anatomical specimens of diseased organs and deformed subjects to mechanical and electronic models of the healthy body. Today the body is made transparent by new medical diagnostics and by the latest advances in endoscopy. The way museums and science centers presently display the human body involves computers, 3D animation, digital technologies, and hands-on models of large-size human parts.
ROBNCA: robust network component analysis for recovering transcription factor activities.
Noor, Amina; Ahmad, Aitzaz; Serpedin, Erchin; Nounou, Mohamed; Nounou, Hazem
2013-10-01
Network component analysis (NCA) is an efficient method of reconstructing the transcription factor activity (TFA), which makes use of the gene expression data and prior information available about transcription factor (TF)-gene regulations. Most of the contemporary algorithms either exhibit the drawback of inconsistency and poor reliability, or suffer from prohibitive computational complexity. In addition, the existing algorithms do not possess the ability to counteract the presence of outliers in the microarray data. Hence, robust and computationally efficient algorithms are needed to enable practical applications. We propose ROBust Network Component Analysis (ROBNCA), a novel iterative algorithm that explicitly models the possible outliers in the microarray data. An attractive feature of the ROBNCA algorithm is the derivation of a closed form solution for estimating the connectivity matrix, which was not available in prior contributions. The ROBNCA algorithm is compared with FastNCA and the non-iterative NCA (NI-NCA). ROBNCA estimates the TF activity profiles as well as the TF-gene control strength matrix with a much higher degree of accuracy than FastNCA and NI-NCA, irrespective of varying noise, correlation and/or amount of outliers in case of synthetic data. The ROBNCA algorithm is also tested on Saccharomyces cerevisiae data and Escherichia coli data, and it is observed to outperform the existing algorithms. The run time of the ROBNCA algorithm is comparable with that of FastNCA, and is hundreds of times faster than NI-NCA. The ROBNCA software is available at http://people.tamu.edu/∼amina/ROBNCA
VLSI (Very Large Scale Integrated Circuits) Design with the MacPitts Silicon Compiler.
1985-09-01
the background. If the algorithm is not fully debugged, then issue instead macpitts basename herald so MacPitts diagnostics and Liszt diagnostics both...command interpreter. Upon compilation, however, the following LISP compiler (Liszt) diagnostic results, Error: Non-number to minus nil where the first...language used in the MacPitts source code. The more instructive solution is to write the Franz LISP code to decide if a jumper wire is needed, and if so, to
Dust measurements in tokamaks (invited).
Rudakov, D L; Yu, J H; Boedo, J A; Hollmann, E M; Krasheninnikov, S I; Moyer, R A; Muller, S H; Pigarov, A Yu; Rosenberg, M; Smirnov, R D; West, W P; Boivin, R L; Bray, B D; Brooks, N H; Hyatt, A W; Wong, C P C; Roquemore, A L; Skinner, C H; Solomon, W M; Ratynskaia, S; Fenstermacher, M E; Groth, M; Lasnier, C J; McLean, A G; Stangeby, P C
2008-10-01
Dust production and accumulation present potential safety and operational issues for ITER. Dust diagnostics can be divided into two groups: diagnostics of dust on surfaces and diagnostics of dust in plasma. Diagnostics from both groups are employed in contemporary tokamaks; new diagnostics suitable for ITER are also being developed and tested. Dust accumulation in ITER is likely to occur in hidden areas, e.g., between tiles and under divertor baffles. A novel electrostatic dust detector for monitoring dust in these regions has been developed and tested at PPPL. In the DIII-D tokamak dust diagnostics include Mie scattering from Nd:YAG lasers, visible imaging, and spectroscopy. Laser scattering is able to resolve particles between 0.16 and 1.6 μm in diameter; using these data the total dust content in the edge plasmas and trends in the dust production rates within this size range have been established. Individual dust particles are observed by visible imaging using fast framing cameras, detecting dust particles of a few microns in diameter and larger. Dust velocities and trajectories can be determined in two dimensions with a single camera or in three dimensions using multiple cameras, but determination of particle size is challenging. In order to calibrate diagnostics and benchmark dust dynamics modeling, precharacterized carbon dust has been injected into the lower divertor of DIII-D. Injected dust is seen by cameras, and spectroscopic diagnostics observe an increase in carbon line (CI, CII, C2 dimer) and thermal continuum emissions from the injected dust. The latter observation can be used in the design of novel dust survey diagnostics.
Janjua, Naveed Zafar; Islam, Nazrul; Kuo, Margot; Yu, Amanda; Wong, Stanley; Butt, Zahid A; Gilbert, Mark; Buxton, Jane; Chapinal, Nuria; Samji, Hasina; Chong, Mei; Alvarez, Maria; Wong, Jason; Tyndall, Mark W; Krajden, Mel
2018-05-01
Large linked healthcare administrative datasets could be used to monitor programs providing prevention and treatment services to people who inject drugs (PWID). However, diagnostic codes in administrative datasets do not differentiate non-injection from injection drug use (IDU). We validated algorithms based on diagnostic codes and prescription records representing IDU in administrative datasets against interview-based IDU data. The British Columbia Hepatitis Testers Cohort (BC-HTC) includes ∼1.7 million individuals tested for HCV/HIV or reported as HBV/HCV/HIV/tuberculosis cases in BC from 1990 to 2015, linked to administrative datasets including physician visit, hospitalization and prescription drug records. IDU, assessed through interviews as part of enhanced surveillance at the time of HIV or HCV/HBV diagnosis for a subset of cases included in the BC-HTC (n = 6559), was used as the gold standard. ICD-9/ICD-10 codes for IDU and injecting-related infections (IRI) were grouped with records of opioid substitution therapy (OST) into multiple IDU algorithms in administrative datasets. We assessed the performance of IDU algorithms through calculation of sensitivity, specificity, positive predictive, and negative predictive values. Sensitivity was highest (90-94%), and specificity was lowest (42-73%), for algorithms based either on IDU or IRI and drug misuse codes. Algorithms requiring both drug misuse and IRI had lower sensitivity (57-60%) and higher specificity (90-92%). An optimal sensitivity and specificity combination was found with two medical visits or a single hospitalization for injectable drugs, with OST (83%/82%) and without OST (78%/83%), respectively. Based on algorithms that included two medical visits, a single hospitalization or OST records, there were 41,358 recent PWID in BC (1.2% of individuals aged 11-65 years in BC) based on health encounters during the 3-year period 2013-2015. Algorithms for identifying PWID using diagnostic codes in linked administrative data could be used for tracking the progress of programming aimed at PWID. With population-based datasets, this tool can be used to inform much needed estimates of PWID population size. Copyright © 2018 Elsevier B.V. All rights reserved.
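The best-performing case definition reported above (two or more physician visits, or at least one hospitalization, with injection-related codes, or any OST record) is easy to express directly, and its sensitivity and specificity against an interview gold standard follow from counting. A minimal sketch with invented records:

```python
# Minimal sketch of the best-performing case definition described above: two or
# more physician visits, or at least one hospitalization, with injection-related
# codes, or any OST record. Record fields and the gold standard are invented.
def classify_pwid(n_idu_visits, n_idu_hospitalizations, has_ost):
    return n_idu_visits >= 2 or n_idu_hospitalizations >= 1 or has_ost

def sens_spec(records, gold):
    tp = sum(classify_pwid(*r) and g for r, g in zip(records, gold))
    fn = sum((not classify_pwid(*r)) and g for r, g in zip(records, gold))
    fp = sum(classify_pwid(*r) and not g for r, g in zip(records, gold))
    tn = sum((not classify_pwid(*r)) and not g for r, g in zip(records, gold))
    return tp / (tp + fn), tn / (tn + fp)

records = [(3, 0, False), (0, 1, False), (1, 0, False), (0, 0, True), (0, 0, False)]
gold    = [True,          True,          True,          True,         False]
print("sensitivity %.2f, specificity %.2f" % sens_spec(records, gold))
```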
Hesar, Hamed Danandeh; Mohebbi, Maryam
2017-05-01
In this paper, a model-based Bayesian filtering framework called the "marginalized particle-extended Kalman filter (MP-EKF) algorithm" is proposed for electrocardiogram (ECG) denoising. This algorithm does not have the extended Kalman filter (EKF) shortcoming in handling non-Gaussian nonstationary situations because of its nonlinear framework. In addition, it has less computational complexity compared with the particle filter. This filter improves ECG denoising performance by implementing a marginalized particle filter framework while reducing its computational complexity using the EKF framework. An automatic particle weighting strategy is also proposed here that controls the reliance of our framework on the acquired measurements. We evaluated the proposed filter on several normal ECGs selected from the MIT-BIH normal sinus rhythm database. To do so, artificial white Gaussian and colored noises as well as nonstationary real muscle artifact (MA) noise over a range of low SNRs from 10 to -5 dB were added to these normal ECG segments. The benchmark methods were the EKF and extended Kalman smoother (EKS) algorithms, which are the first model-based Bayesian algorithms introduced in the field of ECG denoising. From an SNR viewpoint, the experiments showed that in the presence of Gaussian white noise, the proposed framework outperforms the EKF and EKS algorithms at lower input SNRs where the measurements and state model are not reliable. Owing to its nonlinear framework and particle weighting strategy, the proposed algorithm attained better results at all input SNRs in non-Gaussian nonstationary situations (such as the presence of pink noise, brown noise, and real MA). In addition, the impact of the proposed filtering method on the distortion of diagnostic features of the ECG was investigated and compared with the EKF/EKS methods using an ECG diagnostic distortion measure called the "Multi-Scale Entropy Based Weighted Distortion Measure" or MSEWPRD. The results revealed that our proposed algorithm had the lowest MSEWPRD for all noise types at low input SNRs. Therefore, the morphology and diagnostic information of ECG signals were much better conserved compared with the EKF/EKS frameworks, especially in non-Gaussian nonstationary situations.
Glavatskiĭ, A Ia; Guzhovskaia, N V; Lysenko, S N; Kulik, A V
2005-12-01
The authors propose an approach to preoperative diagnosis of the degree of anaplasia of supratentorial brain gliomas using statistical analysis methods. It relies on a comprehensive examination of 934 patients with anaplasia of degrees I-IV treated at the Institute of Neurosurgery from 1990 to 2004. The use of statistical analysis methods for differential diagnosis of the degree of glioma anaplasia may optimize the diagnostic algorithm, increase the reliability of the obtained data and, in some cases, avoid unwarranted surgical interventions. Clinically important signs to be used in statistical preoperative diagnosis of glioma anaplasia have been defined.
Sethi, Gaurav; Saini, B S
2015-12-01
This paper presents an abdomen disease diagnostic system based on the flexi-scale curvelet transform, which uses different optimal scales for extracting features from computed tomography (CT) images. To optimize the scale of the flexi-scale curvelet transform, we propose an improved genetic algorithm. The conventional genetic algorithm assumes that fit parents will likely produce the healthiest offspring, which leads to the least fit parents accumulating at the bottom of the population, reducing the fitness of subsequent populations and delaying the optimal solution search. In our improved genetic algorithm, combining the chromosomes of a low-fitness and a high-fitness individual increases the probability of producing high-fitness offspring. Thereby, each of the least fit parent chromosomes is combined with a high-fitness parent to produce offspring for the next population. In this way, the leftover weak chromosomes cannot damage the fitness of subsequent populations. To further facilitate the search for the optimal solution, our improved genetic algorithm adopts modified elitism. The proposed method was applied to 120 CT abdominal images: 30 images each of normal subjects, cysts, tumors and stones. The features extracted by the flexi-scale curvelet transform were more discriminative than those from conventional methods, demonstrating the potential of our method as a diagnostic tool for abdomen diseases.
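The modified mating scheme described — pairing each low-fitness parent with a high-fitness parent so that weak chromosomes never mate with each other, combined with elitism — can be sketched as follows. The bit-string representation and the toy fitness function are assumptions; the study's actual chromosomes encode curvelet scales.

```python
# Minimal sketch of the modified mating scheme described above: each low-fitness
# parent is paired with a high-fitness parent for crossover, so weak chromosomes
# do not mate with each other. Representation and fitness are toy assumptions.
import random
random.seed(0)

def fitness(chromosome):                 # toy objective: maximize the number of 1s
    return sum(chromosome)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(10)] for _ in range(8)]
ranked = sorted(population, key=fitness, reverse=True)
half = len(ranked) // 2
elite, weak = ranked[:half], ranked[half:]

offspring = [crossover(e, w) for e, w in zip(elite, weak)]   # strong x weak pairs
next_generation = elite + offspring                          # elitism: keep the elite
print(sorted(fitness(c) for c in next_generation))
```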
Song, Lele; Jia, Jia; Peng, Xiumei; Xiao, Wenhua; Li, Yuemin
2017-06-08
The SEPT9 gene methylation assay is the first FDA-approved blood assay for colorectal cancer (CRC) screening. The fecal immunochemical test (FIT), the FIT-DNA test and the CEA assay are also in vitro diagnostic (IVD) tests used in CRC screening. This meta-analysis aims to review the SEPT9 assay performance and compare it with other IVD CRC screening tests. By searching the Ovid MEDLINE, EMBASE, CBMdisc and CJFD databases, 25 out of 180 studies were identified that report the SEPT9 assay performance. 2613 CRC cases and 6030 controls were included, and sensitivity and specificity were used to evaluate its performance under various algorithms. The 1/3 algorithm exhibited the best sensitivity, while the 2/3 and 1/1 algorithms exhibited the best balance between sensitivity and specificity. The performance of the blood SEPT9 assay is superior to that of the serum protein markers and the FIT test in the symptomatic population, while it appears to be less potent than the FIT and FIT-DNA tests in the asymptomatic population. In conclusion, the 1/3 algorithm is recommended for CRC screening, and the 2/3 or 1/1 algorithms are suitable for early detection for diagnostic purposes. The SEPT9 assay exhibited better performance in the symptomatic population than in the asymptomatic population.
NASA Astrophysics Data System (ADS)
Wu, Yu-Jie; Lin, Guan-Wei
2017-04-01
Since 1999, Taiwan has experienced a rapid rise in the number of landslides, and the number even reached a peak after the 2009 Typhoon Morakot. Although it has been shown that ground-motion signals induced by slope processes can be recorded by seismographs, they are difficult to distinguish in continuous seismic records due to the lack of distinct P and S waves. In this study, we combine three common seismic detectors: the short-term average/long-term average (STA/LTA) approach and two diagnostic functions, the moving average and the scintillation index. Based on these detectors, we have established an auto-detection algorithm for landslide-quakes, and detection thresholds are defined to distinguish landslide-quakes from earthquakes and background noise. To further improve the proposed detection algorithm, we apply it to seismic archives recorded by the Broadband Array in Taiwan for Seismology (BATS) during the 2009 Typhoon Morakot, and the discrete landslide-quakes detected by the automatic algorithm are then located. The results show that the landslide-detection results are consistent with those of visual inspection and hence the algorithm can be used to automatically monitor landslide-quakes.
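A minimal STA/LTA trigger in the spirit of the detectors listed above, assuming a 1-D array of ground-motion amplitudes; the window lengths and threshold are illustrative choices, not the values used in the study, and the moving-average and scintillation-index functions are omitted.

import numpy as np

def sta_lta(signal, sta_len=50, lta_len=500):
    # Ratio of short-term to long-term average of signal energy (simple sketch).
    energy = signal ** 2
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    return sta / np.maximum(lta, 1e-12)          # avoid division by zero

# toy trace: noise with an emergent, low-frequency transient (landslide-quake-like, synthetic)
rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 10000)
trace[6000:6800] += 5 * np.sin(np.linspace(0, 20 * np.pi, 800)) * np.hanning(800)

ratio = sta_lta(trace)
triggers = np.where(ratio > 4.0)[0]              # threshold value is an assumption
print(triggers[:5], triggers.size)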
Hirose, Hitoshi; Sarosiek, Konrad; Cavarocchi, Nicholas C
2014-01-01
Gastrointestinal bleed (GIB) is a known complication in patients receiving nonpulsatile ventricular assist devices (VAD). Previously, we reported a new algorithm for the workup of GIB in VAD patients using deep bowel enteroscopy. With this new algorithm, patients underwent fewer procedures, received fewer transfusions, and required less time to reach a diagnosis than the traditional GIB algorithm group. Concurrently, we reviewed the cost-effectiveness of this new algorithm compared with the traditional workup. The procedure charges for the diagnosis and treatment of each episode of GIB were ~ $2,902 in the new algorithm group versus ~ $9,013 in the traditional algorithm group (p < 0.0001). Following the new algorithm in VAD patients with GIB resulted in fewer transfusions and diagnostic tests while attaining a substantial cost saving per episode of bleeding.
Retinex enhancement of infrared images.
Li, Ying; He, Renjie; Xu, Guizhi; Hou, Changzhi; Sun, Yunyan; Guo, Lei; Rao, Liyun; Yan, Weili
2008-01-01
With the ability to image the temperature distribution of the body, infrared imaging is promising for the diagnosis and prognosis of diseases. However, the poor quality of raw infrared images has prevented applications, and one of the essential problems is the low contrast of the imaged object. In this paper, image enhancement techniques based on the Retinex theory are studied; Retinex is a process that automatically restores visual realism to images. The algorithms, including the Frankle-McCann algorithm, the McCann99 algorithm, the single-scale Retinex algorithm, the multi-scale Retinex algorithm and the multi-scale Retinex algorithm with color restoration (MSRCR), were applied to the enhancement of infrared images. Entropy measurements along with visual inspection were compared, and the results showed that the algorithms based on Retinex theory have the ability to enhance infrared images. Of the algorithms compared, MSRCR demonstrated the best performance.
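A single-scale Retinex sketch, one of the variants compared above, assuming a grayscale infrared frame stored as a NumPy array; the Gaussian surround scale and the output stretching are arbitrary choices.

import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(image, sigma=30.0):
    # log(image) minus log of a Gaussian-blurred illumination estimate.
    img = image.astype(np.float64) + 1.0                       # avoid log(0)
    illumination = gaussian_filter(img, sigma)
    retinex = np.log(img) - np.log(illumination)
    # stretch the result back to an 8-bit range for display (arbitrary choice)
    retinex = (retinex - retinex.min()) / (retinex.max() - retinex.min() + 1e-12)
    return (255.0 * retinex).astype(np.uint8)

# toy low-contrast "infrared" frame: a gentle gradient plus a warmer square
frame = np.tile(np.linspace(100, 120, 256), (256, 1))
frame[96:160, 96:160] += 8.0
enhanced = single_scale_retinex(frame)
print(enhanced.min(), enhanced.max())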
Probabilistic numerics and uncertainty in computations
Hennig, Philipp; Osborne, Michael A.; Girolami, Mark
2015-01-01
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
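As a toy illustration of the idea of a numerical routine that returns an uncertainty alongside its result (a much simpler stand-in for the Bayesian quadrature and related methods the authors discuss), plain Monte Carlo integration can report a standard-error estimate of its own numerical error:

import numpy as np

def mc_integrate(f, a, b, n=10_000, seed=0):
    # Estimate the integral of f on [a, b] and the uncertainty of that estimate.
    rng = np.random.default_rng(seed)
    x = rng.uniform(a, b, n)
    values = (b - a) * f(x)
    estimate = values.mean()
    uncertainty = values.std(ddof=1) / np.sqrt(n)   # standard error of the mean
    return estimate, uncertainty

est, err = mc_integrate(np.sin, 0.0, np.pi)         # true value is 2
print(f"{est:.4f} +/- {err:.4f}")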
Algorithms and Object-Oriented Software for Distributed Physics-Based Modeling
NASA Technical Reports Server (NTRS)
Kenton, Marc A.
2001-01-01
The project seeks to develop methods to more efficiently simulate aerospace vehicles. The goals are to reduce model development time, increase accuracy (e.g., by allowing the integration of multidisciplinary models), facilitate collaboration by geographically distributed groups of engineers, support uncertainty analysis and optimization, reduce hardware costs, and increase execution speeds. These problems are the subject of considerable contemporary research (e.g., Biedron et al. 1999; Heath and Dick, 2000).
[Contemporary threat of influenza virus infection].
Płusa, Tadeusz
2010-01-01
The swine-origin H1N1 influenza virus (S-OIV) caused a great mobilization of health services around the world. It is now well known that a vaccine against the novel virus is expected to be the key point in that battle. In situations where the recommended treatment with neuraminidase inhibitors is not sufficient to control influenza A/H1N1 viral infection, quick and precise diagnostic procedures should be applied to save and protect our patients.
Liu, Guo-Ping; Yan, Jian-Jun; Wang, Yi-Qin; Fu, Jing-Jing; Xu, Zhao-Xia; Guo, Rui; Qian, Peng
2012-01-01
Background. In Traditional Chinese Medicine (TCM), most of the algorithms used to solve problems of syndrome diagnosis focus on only one syndrome, that is, single-label learning. However, in clinical practice, patients may simultaneously have more than one syndrome, each of which has its own symptoms (signs). Methods. We employed a multilabel learning algorithm using the relevant features for each label (REAL) to construct a syndrome diagnostic model for chronic gastritis (CG) in TCM. REAL combines feature selection methods to select the significant symptoms (signs) of CG. The method was tested on 919 patients using the standard scale. Results. The highest prediction accuracy was achieved when 20 features were selected. The features selected with information gain were more consistent with TCM theory. The lowest average accuracy was 54% using multilabel neural networks (BP-MLL), whereas the highest was 82% using REAL for constructing the diagnostic model. For coverage, hamming loss, and ranking loss, the values obtained using the REAL algorithm were the lowest at 0.160, 0.142, and 0.177, respectively. Conclusion. REAL extracts the relevant symptoms (signs) for each syndrome and improves its recognition accuracy. Moreover, this study provides a reference for constructing syndrome diagnostic models and can guide clinical practice. PMID:22719781
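The REAL implementation itself is not reproduced here; as a hedged stand-in for the general pattern it describes (per-label selection of relevant symptoms followed by one classifier per syndrome), a binary-relevance sketch with scikit-learn might look as follows, with synthetic symptom and syndrome data standing in for the 919-patient scale:

import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 40))        # 40 hypothetical symptoms/signs (binary)
Y = rng.integers(0, 2, size=(200, 3))         # 3 hypothetical syndromes (multilabel targets)

models = []
for label in range(Y.shape[1]):
    # select the k symptoms most informative for this particular syndrome, then classify
    clf = make_pipeline(SelectKBest(mutual_info_classif, k=20),
                        LogisticRegression(max_iter=1000))
    clf.fit(X, Y[:, label])
    models.append(clf)

pred = np.column_stack([m.predict(X) for m in models])   # one column of predictions per syndrome
print(pred.shape)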
Real-time plasma control based on the ISTTOK tomography diagnostic
NASA Astrophysics Data System (ADS)
Carvalho, P. J.; Carvalho, B. B.; Neto, A.; Coelho, R.; Fernandes, H.; Sousa, J.; Varandas, C.; Chávez-Alarcón, E.; Herrera-Velázquez, J. J. E.
2008-10-01
The presently available processing power in generic processing units (GPUs) combined with state-of-the-art programmable logic devices benefits the implementation of complex, real-time driven, data processing algorithms for plasma diagnostics. A tomographic reconstruction diagnostic has been developed for the ISTTOK tokamak, based on three linear pinhole cameras each with ten lines of sight. The plasma emissivity in a poloidal cross section is computed locally on a submillisecond time scale, using a Fourier-Bessel algorithm, allowing the use of the output signals for active plasma position control. The data acquisition and reconstruction (DAR) system is based on ATCA technology and consists of one acquisition board with integrated field programmable gate array (FPGA) capabilities and a dual-core Pentium module running real-time application interface (RTAI) Linux. In this paper, the DAR real-time firmware/software implementation is presented, based on (i) front-end digital processing in the FPGA; (ii) a device driver specially developed for the board which enables streaming data acquisition to the host GPU; and (iii) a fast reconstruction algorithm running in Linux RTAI. This system behaves as a module of the central ISTTOK control and data acquisition system (FIRESIGNAL). Preliminary results of the above experimental setup are presented and a performance benchmarking against the magnetic coil diagnostic is shown.
2010-01-01
A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order), and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice. PMID:20109176
Autoimmune diagnostics: the technology, the strategy and the clinical governance.
Bizzaro, Nicola; Tozzoli, Renato; Villalta, Danilo
2015-02-01
In recent years, there has been a profound change in autoimmune diagnostics. From long, tiring and inaccurate manual methods, the art of diagnostics has turned to modern, rapid and automated technology. New antibody tests have been developed, and almost all autoimmune diseases now have some specific diagnostic markers. The current need to make the most of available economic and human resources has led to the production of diagnostic algorithms and guidelines designated for optimal strategic use of the tests and to increase the diagnostic appropriateness. An important role in this scenario was assumed by the laboratory autoimmunologist, whose task is not only to govern the analytical phase, but also to help clinicians in correctly choosing the most suitable test for each clinical situation and provide consultancy support. In this review, we summarize recent advances in technology, describe the diagnostic strategies and highlight the current role of the laboratory autoimmunologist in the clinical governance of autoimmune diagnostics.
Basic primitives for molecular diagram sketching
2010-01-01
A collection of primitive operations for molecular diagram sketching has been developed. These primitives compose a concise set of operations which can be used to construct publication-quality 2D coordinates for molecular structures using a bare minimum of input bandwidth. The input requirements for each primitive consist of a small number of discrete choices, which means that these primitives can be used to form the basis of a user interface which does not require an accurate pointing device. This is particularly relevant to software designed for contemporary mobile platforms. The reduction of input bandwidth is accomplished by using algorithmic methods for anticipating probable geometries during the sketching process, and by intelligent use of template grafting. The algorithms and their uses are described in detail. PMID:20923555
Weiß, Jakob; Schabel, Christoph; Bongers, Malte; Raupach, Rainer; Clasen, Stephan; Notohamiprodjo, Mike; Nikolaou, Konstantin; Bamberg, Fabian
2017-03-01
Background Metal artifacts often impair diagnostic accuracy in computed tomography (CT) imaging. Therefore, effective and workflow implemented metal artifact reduction algorithms are crucial to gain higher diagnostic image quality in patients with metallic hardware. Purpose To assess the clinical performance of a novel iterative metal artifact reduction (iMAR) algorithm for CT in patients with dental fillings. Material and Methods Thirty consecutive patients scheduled for CT imaging and dental fillings were included in the analysis. All patients underwent CT imaging using a second generation dual-source CT scanner (120 kV single-energy; 100/Sn140 kV in dual-energy, 219 mAs, gantry rotation time 0.28-1/s, collimation 0.6 mm) as part of their clinical work-up. Post-processing included standard kernel (B49) and an iterative MAR algorithm. Image quality and diagnostic value were assessed qualitatively (Likert scale) and quantitatively (HU ± SD) by two reviewers independently. Results All 30 patients were included in the analysis, with equal reconstruction times for iMAR and standard reconstruction (17 s ± 0.5 vs. 19 s ± 0.5; P > 0.05). Visual image quality was significantly higher for iMAR as compared with standard reconstruction (3.8 ± 0.5 vs. 2.6 ± 0.5; P < 0.0001, respectively) and showed improved evaluation of adjacent anatomical structures. Similarly, HU-based measurements of degree of artifacts were significantly lower in the iMAR reconstructions as compared with the standard reconstruction (0.9 ± 1.6 vs. -20 ± 47; P < 0.05, respectively). Conclusion The tested iterative, raw-data based reconstruction MAR algorithm allows for a significant reduction of metal artifacts and improved evaluation of adjacent anatomical structures in the head and neck area in patients with dental hardware.
Update on the biological effects of ionizing radiation, relative dose factors and radiation hygiene.
White, Stuart C; Mallya, S M
2012-03-01
Diagnostic imaging is an indispensable part of contemporary medical and dental practice. Over the last few decades there has been a dramatic increase in the use of ionizing radiation for diagnostic imaging. The carcinogenic effects of high-dose exposure are well known. Does diagnostic radiation cause cancer, even if rarely? We do not know, but we should act as if it does. Accordingly, dentists should select patients wisely - only make radiographs when there is a patient-specific reason to believe the radiograph will offer unique information influencing diagnosis or treatment. Low-dose examinations should be made: for intraoral imaging, use fast film or digital sensors, thyroid collars and rectangular collimation; for panoramic and lateral cephalometric imaging, use digital systems or rare-earth film-screen combinations; and for cone beam computed tomography, use low-dose machines, restrict the field size to the region of interest, and reduce the mA and the length of the exposure arc as appropriate. © 2012 Australian Dental Association.
[Role of contemporary pathological diagnostics in the personalized treatment of cancer].
Tímár, József
2013-03-01
Due to the developments of pathology in the past decades (immunohistochemistry and molecular pathology), the classification of cancers has changed fundamentally, laying the groundwork for personalized management of cancer patients. Our picture of cancer is more complex today, identifying the genetic basis of the morphological variants. On the other hand, this picture has a much higher resolution, enabling us to subclassify similar histological cancer types based on molecular markers. This redefined classification of cancers helps us to better predict the possible biological behavior of the disease and/or its therapeutic sensitivity, opening the way toward a more personalized treatment of this disease. The redefined molecular classification of cancer may affect the universal application of treatment protocols. To achieve this goal, molecular diagnostics must be an integral and reimbursed part of routine pathological diagnostics. On the other hand, it is time to extend the multidisciplinary team with a molecular pathologist to improve the decision-making process in the management of cancer patients.
Evidence for Model-based Computations in the Human Amygdala during Pavlovian Conditioning
Prévost, Charlotte; McNamee, Daniel; Jessup, Ryan K.; Bossaerts, Peter; O'Doherty, John P.
2013-01-01
Contemporary computational accounts of instrumental conditioning have emphasized a role for a model-based system in which values are computed with reference to a rich model of the structure of the world, and a model-free system in which values are updated without encoding such structure. Much less studied is the possibility of a similar distinction operating at the level of Pavlovian conditioning. In the present study, we scanned human participants while they participated in a Pavlovian conditioning task with a simple structure while measuring activity in the human amygdala using a high-resolution fMRI protocol. After fitting a model-based algorithm and a variety of model-free algorithms to the fMRI data, we found evidence for the superiority of a model-based algorithm in accounting for activity in the amygdala compared to the model-free counterparts. These findings support an important role for model-based algorithms in describing the processes underpinning Pavlovian conditioning, as well as providing evidence of a role for the human amygdala in model-based inference. PMID:23436990
Development of an Inverse Algorithm for Resonance Inspection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Canhai; Xu, Wei; Sun, Xin
2012-10-01
Resonance inspection (RI), which employs the natural frequency spectra shift between the good and the anomalous part populations to detect defects, is a non-destructive evaluation (NDE) technique with many advantages, such as low inspection cost, high testing speed, and broad applicability to structures with complex geometry, compared to other contemporary NDE methods. It has already been widely used in the automobile industry for quality inspection of safety-critical parts. Unlike some conventionally used NDE methods, the current RI technology is unable to provide details, i.e. the location, dimension, or type, of the flaws in discrepant parts. Such a limitation severely hinders its widespread application and further development. In this study, an inverse RI algorithm based on a maximum correlation function is proposed to quantify the location and size of flaws in a discrepant part. Dog-bone-shaped stainless steel samples with and without controlled flaws are used for algorithm development and validation. The results show that multiple flaws can be accurately pinpointed using the algorithm developed, and that the prediction accuracy decreases with increasing flaw numbers and decreasing distance between flaws.
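A hedged sketch of the maximum-correlation idea: predicted resonance-frequency shift patterns for candidate flaw locations are compared against a measured pattern, and the candidate with the highest correlation is reported. The forward model below is a made-up placeholder, not the authors' model of the dog-bone sample.

import numpy as np

def forward_shift_pattern(flaw_location, n_modes=12):
    # Placeholder forward model: frequency shift of each mode vs. flaw location (illustrative only).
    modes = np.arange(1, n_modes + 1)
    return -np.abs(np.sin(modes * flaw_location))

def locate_flaw(measured_shifts, candidate_locations):
    # Pick the candidate whose predicted shift pattern best correlates with the data.
    best_loc, best_corr = None, -np.inf
    for loc in candidate_locations:
        predicted = forward_shift_pattern(loc)
        corr = np.corrcoef(measured_shifts, predicted)[0, 1]
        if corr > best_corr:
            best_loc, best_corr = loc, corr
    return best_loc, best_corr

candidates = np.linspace(0.05, 1.0, 200)
true_loc = 0.42
measured = forward_shift_pattern(true_loc) + np.random.default_rng(1).normal(0, 0.02, 12)
print(locate_flaw(measured, candidates))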
NASA Astrophysics Data System (ADS)
Keane, Tommy P.; Saber, Eli; Rhody, Harvey; Savakis, Andreas; Raj, Jeffrey
2012-04-01
Contemporary research in automated panorama creation utilizes camera calibration or extensive knowledge of camera locations and relations to each other to achieve successful results. Research in image registration attempts to restrict these same camera parameters or apply complex point-matching schemes to overcome the complications found in real-world scenarios. This paper presents a novel automated panorama creation algorithm by developing an affine transformation search based on maximized mutual information (MMI) for region-based registration. Standard MMI techniques have been limited to applications with airborne/satellite imagery or medical images. We show that a novel MMI algorithm can approximate an accurate registration between views of realistic scenes of varying depth distortion. The proposed algorithm has been developed using stationary, color, surveillance video data for a scenario with no a priori camera-to-camera parameters. This algorithm is robust for strict- and nearly-affine-related scenes, while providing a useful approximation for the overlap regions in scenes related by a projective homography or a more complex transformation, allowing for a set of efficient and accurate initial conditions for pixel-based registration.
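The core score that such an MMI search maximizes is the mutual information between overlapping regions; a minimal histogram-based computation is sketched below (the affine search itself is omitted, and the bin count is an arbitrary choice):

import numpy as np

def mutual_information(patch_a, patch_b, bins=32):
    # Mutual information of two equally sized grayscale patches via a joint histogram.
    joint, _, _ = np.histogram2d(patch_a.ravel(), patch_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
a = rng.integers(0, 256, (64, 64)).astype(float)
b = 0.5 * a + rng.normal(0, 10, (64, 64))                 # a related view of the same scene
c = rng.permutation(a.ravel()).reshape(64, 64)            # an unrelated scrambled patch
print(mutual_information(a, b), mutual_information(a, c)) # related pair scores higher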
Rodgers, M; Nixon, J; Hempel, S; Aho, T; Kelly, J; Neal, D; Duffy, S; Ritchie, G; Kleijnen, J; Westwood, M
2006-06-01
To determine the most effective diagnostic strategy for the investigation of microscopic and macroscopic haematuria in adults. Electronic databases from inception to October 2003, updated in August 2004. A systematic review was undertaken according to published guidelines. Decision analytic modelling was undertaken, based on the findings of the review, expert opinion and additional information from the literature, to assess the relative cost-effectiveness of plausible alternative tests that are part of diagnostic algorithms for haematuria. A total of 118 studies met the inclusion criteria. No studies that evaluated the effectiveness of diagnostic algorithms for haematuria or the effectiveness of screening for haematuria or investigating its underlying cause were identified. Eighteen out of 19 identified studies evaluated dipstick tests and data from these suggested that these are moderately useful in establishing the presence of, but cannot be used to rule out, haematuria. Six studies using haematuria as a test for the presence of a disease indicated that the detection of microhaematuria cannot alone be considered a useful test either to rule in or rule out the presence of a significant underlying pathology (urinary calculi or bladder cancer). Forty-eight of 80 studies addressed methods to localise the source of bleeding (renal or lower urinary tract). The methods and thresholds described in these studies varied greatly, precluding any estimate of a 'best performance' threshold that could be applied across patient groups. However, studies of red blood cell morphology that used a cut-off value of 80% dysmorphic cells for glomerular disease reported consistently high specificities (potentially useful in ruling in a renal cause for haematuria). The reported sensitivities were generally low. Twenty-eight studies included data on the accuracy of laboratory tests (tumour markers, cytology) for the diagnosis of bladder cancer. The majority of tumour marker studies evaluated nuclear matrix protein 22 or bladder tumour antigen. The sensitivity and specificity ranges suggested that neither of these would be useful either for diagnosing bladder cancer or for ruling out patients for further investigation (cystoscopy). However, the evidence remains sparse and the diagnostic accuracy estimates varied widely between studies. Fifteen studies evaluating urine cytology as a test for urinary tract malignancies were heterogeneous and poorly reported. The calculated specificity values were generally high, suggesting some possible utility in confirming malignancy. However, the evidence suggests that urine cytology has no application in ruling out malignancy or excluding patients from further investigation. Fifteen studies evaluated imaging techniques [computed tomography (CT), intravenous urography (IVU) or ultrasound scanning (US)] to detect the underlying cause of haematuria. The target condition and the reference standard varied greatly between these studies. The diagnostic accuracy data for several individual studies appeared promising but meaningful comparison of the available imaging technologies was impossible. Eight studies met the inclusion criteria but addressed different parts of the diagnostic chain (e.g. screening programmes, laboratory investigations, full urological work-up). No single study addressed the complete diagnostic process. The review also highlighted a number of methodological limitations of these studies, including their lack of generalisability to the UK context. 
Separate decision analytic models were therefore developed to progress estimation of the optimal strategy for the diagnostic management of haematuria. The economic model for the detection of microhaematuria found that immediate microscopy following a positive dipstick test would improve diagnostic efficiency as it eliminates the high number of false positives produced by dipstick testing. Strategies that use routine microscopy may be associated with high numbers of false results, but evidence was lacking regarding the accuracy of routine microscopy and estimates were adopted for the model. The model for imaging the upper urinary tract showed that US detects more tumours than IVU at one-third of the cost, and is also associated with fewer false results. For any cause of haematuria, CT was shown to have a mean incremental cost-effectiveness ratio of pounds sterling 9939 in comparison with the next best option, US. When US is followed up with CT for negative results with persistent haematuria, it dominates the initial use of CT alone, with a saving of pounds sterling 235,000 for the evaluation of 1000 patients. The model for investigation of the lower urinary tract showed that for low-risk patients the use of immediate cystoscopy could be avoided if cystoscopy were used for follow-up patients with a negative initial test using tumour markers and/or cytology, resulting in a saving of pounds sterling 483,000 for the evaluation of 1000 patients. The clinical and economic impact on delayed detection of both upper and lower urinary tract tumours through the use of follow-up testing should be evaluated in future studies. There are insufficient data currently available to derive an evidence-based algorithm of the diagnostic pathway for haematuria. A hypothetical algorithm based on the opinion and practice of clinical experts in the review team, other published algorithms and the results of economic modelling is presented in this report. This algorithm is presented, for comparative purposes, alongside current US and UK guidelines. The ideas contained in these algorithms and the specific questions outlined should form the basis of future research. Quality assessment of the diagnostic accuracy studies included in this review highlighted several areas of deficiency.
Gelhorn, Heather; Hartman, Christie; Sakai, Joseph; Stallings, Michael; Young, Susan; Rhee, Soo Hyun; Corley, Robin; Hewitt, John; Hopfer, Christian; Crowley, Thomas
2008-11-01
Item response theory analyses were used to examine alcohol abuse and dependence symptoms and diagnoses in adolescents. Previous research suggests that the DSM-IV alcohol use disorder (AUD) symptoms in adolescents may be characterized by a single dimension. The present study extends prior research with a larger and more comprehensive sample and an examination of an alternative diagnostic algorithm for AUDs. Approximately 5,587 adolescents between the ages of 12 and 18 years from adjudicated, clinical, and community samples were administered structured clinical interviews. Analyses were conducted to examine the severity of alcohol abuse and dependence symptoms and the severity of alcohol use problems (AUDs) within the diagnostic categories created by the DSM-IV. Although the DSM-IV diagnostic categories differ in severity of AUDs, there is substantial overlap and inconsistency in AUD severity of persons across these categories. Item Response Theory-based AUD severity estimates suggest that many persons diagnosed with abuse have AUD severity greater than persons with dependence. Similarly, many persons who endorse some symptoms but do not qualify for a diagnosis (i.e., diagnostic orphans) have more severe AUDs than persons with an abuse diagnosis. Additionally, two dependence items, "tolerance" and "larger/longer," show differences in severity between samples. The distinction between DSM-IV abuse and dependence based on severity can be improved using an alternative diagnostic algorithm that considers all of the alcohol abuse and dependence symptoms conjointly.
UWGSP6: a diagnostic radiology workstation of the future
NASA Astrophysics Data System (ADS)
Milton, Stuart W.; Han, Sang; Choi, Hyung-Sik; Kim, Yongmin
1993-06-01
The Univ. of Washington's Image Computing Systems Lab. (ICSL) has been involved in research into the development of a series of PACS workstations since the mid-1980s. The most recent research, a joint UW-IBM project, attempted to create a diagnostic radiology workstation using an IBM RISC System 6000 (RS6000) computer workstation and the X-Window system. While the results are encouraging, there are inherent limitations in the workstation hardware which prevent it from providing an acceptable level of functionality for diagnostic radiology. Realizing the RS6000 workstation's limitations, a parallel effort was initiated to design a workstation, UWGSP6 (Univ. of Washington Graphics System Processor #6), that provides the required functionality. This paper documents the design of UWGSP6, which not only addresses the requirements for a diagnostic radiology workstation in terms of display resolution, response time, etc., but also includes the processing performance necessary to support key functions needed in the implementation of algorithms for computer-aided diagnosis. The paper includes a description of the workstation architecture, and specifically its image processing subsystem. Verification of the design through hardware simulation is then discussed, and finally, performance of selected algorithms based on detailed simulation is provided.
Plantar fasciitis in athletes: diagnostic and treatment strategies. A systematic review
Petraglia, Federica; Ramazzina, Ileana; Costantino, Cosimo
2017-01-01
Summary Background: Plantar fasciitis (PF) is reported in different sports, mainly in running and soccer athletes. The purpose of this study is to conduct a systematic review of the published literature concerning the diagnosis and treatment of PF in both recreational and élite athletes. The review was conducted and reported in accordance with the PRISMA statement. Methods: The following electronic databases were searched: PubMed, Cochrane Library and Scopus. For PF diagnosis, we investigated the electronic databases from January 2006 to June 2016, whereas for treatments all data in the literature were investigated. Results: For both diagnosis and treatment, 17 studies matched the inclusion criteria. The results highlighted that the most frequently used diagnostic techniques were ultrasonography and magnetic resonance imaging. Conventional, complementary, and alternative treatment approaches were assessed. Conclusions: In reviewing the literature, we were unable to find any specific diagnostic algorithm for PF in athletes, because no different diagnostic strategies were used for athletes and non-athletes. As for treatment, few data are available in the literature, which makes it difficult to suggest practice guidelines. Specific studies are necessary to define the best treatment algorithm for both recreational and élite athletes. Level of evidence: Ib. PMID:28717618
Hultenmo, Maria; Caisander, Håkan; Mack, Karsten; Thilander-Klang, Anne
2016-06-01
The diagnostic image quality of 75 paediatric abdominal computed tomography (CT) examinations reconstructed with two different iterative reconstruction (IR) algorithms-adaptive statistical IR (ASiR™) and model-based IR (Veo™)-was compared. Axial and coronal images were reconstructed with 70 % ASiR with the Soft™ convolution kernel and with the Veo algorithm. The thickness of the reconstructed images was 2.5 or 5 mm depending on the scanning protocol used. Four radiologists graded the delineation of six abdominal structures and the diagnostic usefulness of the image quality. The Veo reconstruction significantly improved the visibility of most of the structures compared with ASiR in all subgroups of images. For coronal images, the Veo reconstruction resulted in significantly improved ratings of the diagnostic use of the image quality compared with the ASiR reconstruction. This was not seen for the axial images. The greatest improvement using Veo reconstruction was observed for the 2.5 mm coronal slices. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Multiplex PCR Tests for Detection of Pathogens Associated with Gastroenteritis
Zhang, Hongwei; Morrison, Scott; Tang, Yi-Wei
2016-01-01
Synopsis A wide range of enteric pathogens can cause infectious gastroenteritis. Conventional diagnostic algorithms including culture, biochemical identification, immunoassay and microscopic examination are time consuming and often lack sensitivity and specificity. Advances in molecular technology have allowed its use in clinical diagnostic tools. Multiplex PCR-based testing has made its way into the gastroenterology diagnostic arena in recent years. In this article we present a review of recent laboratory-developed multiplex PCR tests and current commercial multiplex gastrointestinal pathogen tests. We focus on two FDA-cleared commercial syndromic multiplex tests: the Luminex xTAG GPP and the BioFire FilmArray GI test. These multiplex tests can detect and identify multiple enteric pathogens in one test and provide results within hours. Multiplex PCR tests have shown superior sensitivity to conventional methods for the detection of most pathogens. The high negative predictive value of these multiplex tests has led to the suggestion that they be used as screening tools, especially in outbreaks. Although the clinical utility and benefit of multiplex PCR tests remain to be investigated further, implementing these multiplex PCR tests in gastroenterology diagnostic algorithms has the potential to improve the diagnosis of infectious gastroenteritis. PMID:26004652
NASA Technical Reports Server (NTRS)
Maul, William A.; Chicatelli, Amy; Fulton, Christopher E.; Balaban, Edward; Sweet, Adam; Hayden, Sandra Claire; Bajwa, Anupa
2005-01-01
The Propulsion IVHM Technology Experiment (PITEX) has been an on-going research effort conducted over several years. PITEX has developed and applied a model-based diagnostic system for the main propulsion system of the X-34 reusable launch vehicle, a space-launch technology demonstrator. The application was simulation-based using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, which is the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 Main Propulsion System (MPS) was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real-time on flight-like hardware. In an attempt to expose potential performance problems, these PITEX algorithms were subject to numerous real-world effects in the simulated data including noise, sensor resolution, command/valve talkback information, and nominal build variations. The current research has demonstrated the potential benefits of model-based diagnostics, defined the performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.
Communication overhead on the Intel Paragon, IBM SP2 and Meiko CS-2
NASA Technical Reports Server (NTRS)
Bokhari, Shahid H.
1995-01-01
Interprocessor communication overhead is a crucial measure of the power of parallel computing systems-its impact can severely limit the performance of parallel programs. This report presents measurements of communication overhead on three contemporary commercial multicomputer systems: the Intel Paragon, the IBM SP2 and the Meiko CS-2. In each case the time to communicate between processors is presented as a function of message length. The time for global synchronization and memory access is discussed. The performance of these machines in emulating hypercubes and executing random pairwise exchanges is also investigated. It is shown that the interprocessor communication time depends heavily on the specific communication pattern required. These observations contradict the commonly held belief that communication overhead on contemporary machines is independent of the placement of tasks on processors. The information presented in this report permits the evaluation of the efficiency of parallel algorithm implementations against standard baselines.
Minimal requirements for the molecular testing of lung cancer.
Popper, Helmut H; Tímár, József; Ryska, Ales; Olszewski, Wlodzimierz
2014-10-01
From the perspective of contemporary pathologic diagnostics of lung cancer, the tissue obtained is a key issue, since small biopsies and cytology still play a major role. In the non-small cell lung cancer era, cytology was considered equal to biopsy. In recent years, however, cytology alone is unable to provide a quality diagnosis and must be replaced by biopsy. Different molecular techniques can handle different tissue samples, which must be considered during molecular pathology diagnosis. Besides, the tumor cell to normal cell ratio in the obtained tissue, as well as the absolute tumor cell number, have great significance, and this information must be provided with the primary lung cancer diagnosis. Last but not least, for continuous, sustainable molecular diagnostics of lung cancer, rational algorithms, affordable technology and appropriate reimbursement are equally necessary.
Syndrome Diagnosis: Human Intuition or Machine Intelligence?
Braaten, Øivind; Friestad, Johannes
2008-01-01
The aim of this study was to investigate whether artificial intelligence methods can represent objective methods that are essential in syndrome diagnosis. Most syndromes have no external criterion standard of diagnosis. The predictive value of a clinical sign used in diagnosis is dependent on the prior probability of the syndrome diagnosis. Clinicians often misjudge the probabilities involved. Syndromology needs objective methods to ensure diagnostic consistency and take prior probabilities into account. We applied two basic artificial intelligence methods to a database of machine-generated patients - a 'vector method' and a set method. As reference methods we ran an ID3 algorithm, a cluster analysis and a naive Bayes' calculation on the same patient series. The overall diagnostic error rate for the vector algorithm was 0.93%, and for ID3 it was 0.97%. For the clinical signs found by the set method, the predictive values varied between 0.71 and 1.0. The artificial intelligence methods that we used proved simple, robust and powerful, and represent objective diagnostic methods. PMID:19415142
Doehner, Wolfram; Blankenberg, Stefan; Erdmann, Erland; Ertl, Georg; Hasenfuß, Gerd; Landmesser, Ulf; Pieske, Burkert; Schieffer, Bernhard; Schunkert, Heribert; von Haehling, Stephan; Zeiher, Andreas; Anker, Stefan D
2017-05-01
Iron deficiency (ID) occurs in up to 50% of patients with heart failure (HF). Even without the presence of anaemia, ID contributes to more severe symptoms, increased hospitalization and mortality. A number of randomized controlled trials have demonstrated the clinical benefit of replenishing iron stores, with improvement of symptoms and fewer hospitalizations. Assessment of iron status should therefore become routine in newly diagnosed and in symptomatic patients with HF. ID can be identified with simple and straightforward diagnostic steps. Assessment of ferritin (indicating iron stores) and transferrin saturation (TSAT, indicating the capability to mobilise internal iron stores) is sufficient to detect ID. In this review a plain diagnostic algorithm for ID is suggested. Confounding factors for the diagnosis and adequate treatment of ID in HF are discussed. A regular workup of iron deficiency parameters may benefit patients with heart failure by providing symptomatic improvements and fewer hospitalizations. © Georg Thieme Verlag KG Stuttgart · New York.
Recent Trends in the Serologic Diagnosis of Syphilis
Singh, Ameeta E.
2014-01-01
Complexities in the diagnosis of syphilis continue to challenge clinicians. While direct tests (e.g., microscopy or PCR) are helpful in early syphilis, the mainstay of diagnosis remains serologic tests. The traditional algorithm using a nontreponemal test (NTT) followed by a treponemal test (TT) remains the standard in many parts of the world. More recently, the ability to automate the TT has led to the increasingly widespread use of reverse algorithms using treponemal enzyme immunoassays (EIAs). Rapid, point-of-care TTs are in widespread use in developing countries because of low cost, ease of use, and reasonable performance. However, none of the current diagnostic algorithms are able to distinguish current from previously treated infections. In addition, the reversal of traditional syphilis algorithms has led to uncertainty in the clinical management of patients. The interpretation of syphilis tests is further complicated by the lack of a reliable gold standard for syphilis diagnostics, and the newer tests can result in false-positive reactions similar to those seen with older tests. Little progress has been made in the area of serologic diagnostics for congenital syphilis, which requires assessment of maternal treatment and serologic response as well as clinical and laboratory investigation of the neonate for appropriate management. The diagnosis of neurosyphilis continues to require the collection of cerebrospinal fluid for a combination of NTT and TT, and, while newer treponemal EIAs look promising, more studies are needed to confirm their utility. This article reviews current tests and discusses current controversies in syphilis diagnosis, with a focus on serologic tests. PMID:25428245
The PHQ-PD as a Screening Tool for Panic Disorder in the Primary Care Setting in Spain
Wood, Cristina Mae; Ruíz-Rodríguez, Paloma; Tomás-Tomás, Patricia; Gracia-Gracia, Irene; Dongil-Collado, Esperanza; Iruarrizaga, M. Iciar
2016-01-01
Introduction Panic disorder is a common anxiety disorder and is highly prevalent in Spanish primary care centres. The use of validated tools can improve the detection of panic disorder in primary care populations, thus enabling referral for specialized treatment. The aim of this study is to determine the accuracy of the Patient Health Questionnaire-Panic Disorder (PHQ-PD) as a screening and diagnostic tool for panic disorder in Spanish primary care centres. Method We compared the psychometric properties of the PHQ-PD to the reference standard, the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I) interview. General practitioners referred 178 patients who completed the entire PHQ test, including the PHQ-PD, to undergo the SCID-I. The sensitivity, specificity, positive and negative predictive values and positive and negative likelihood ratios of the PHQ-PD were assessed. Results The operating characteristics of the PHQ-PD are moderate. The best cut-off score was 5 (sensitivity .77, specificity .72). Modifications to the questionnaire's algorithms improved test characteristics (sensitivity .77, specificity .72) compared to the original algorithm. The screening question alone yielded the highest sensitivity score (.83). Conclusion Although the modified algorithm of the PHQ-PD only yielded moderate results as a diagnostic test for panic disorder, it was better than the original. Using only the first question of the PHQ-PD showed the best psychometric properties (sensitivity). Based on these findings, we suggest the use of the screening questions for screening purposes and the modified algorithm for diagnostic purposes. PMID:27525977
Awaysheh, Abdullah; Wilcke, Jeffrey; Elvinger, François; Rees, Loren; Fan, Weiguo; Zimmerman, Kurt L
2016-11-01
Inflammatory bowel disease (IBD) and alimentary lymphoma (ALA) are common gastrointestinal diseases in cats. The very similar clinical signs and histopathologic features of these diseases make the distinction between them diagnostically challenging. We tested the use of supervised machine-learning algorithms to differentiate between the 2 diseases using data generated from noninvasive diagnostic tests. Three prediction models were developed using 3 machine-learning algorithms: naive Bayes, decision trees, and artificial neural networks. The models were trained and tested on data from complete blood count (CBC) and serum chemistry (SC) results for the following 3 groups of client-owned cats: normal, inflammatory bowel disease (IBD), or alimentary lymphoma (ALA). Naive Bayes and artificial neural networks achieved higher classification accuracy (sensitivities of 70.8% and 69.2%, respectively) than the decision tree algorithm (63%, p < 0.0001). The areas under the receiver-operating characteristic curve for classifying cases into the 3 categories were 83% by naive Bayes, 79% by decision tree, and 82% by artificial neural networks. Prediction models using machine learning provided a method for distinguishing between ALA-IBD, ALA-normal, and IBD-normal. The naive Bayes and artificial neural networks classifiers used 10 and 4 of the CBC and SC variables, respectively, to outperform the C4.5 decision tree, which used 5 CBC and SC variables in classifying cats into the 3 classes. These models can provide another noninvasive diagnostic tool to assist clinicians with differentiating between IBD and ALA, and between diseased and nondiseased cats. © 2016 The Author(s).
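A hedged sketch of the modeling pattern described (naive Bayes, decision tree and neural-network classifiers trained on CBC/serum chemistry variables), using scikit-learn with synthetic placeholder data rather than the study's feline dataset:

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 14))            # 14 hypothetical CBC/serum-chemistry variables
y = rng.integers(0, 3, size=300)          # 0 = normal, 1 = IBD, 2 = ALA (synthetic labels)

for name, model in [("naive Bayes", GaussianNB()),
                    ("decision tree", DecisionTreeClassifier(max_depth=5)),
                    ("neural network", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000))]:
    acc = cross_val_score(model, X, y, cv=5).mean()   # 5-fold cross-validated accuracy
    print(f"{name}: {acc:.2f}")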
Potente, Giuseppe; Messineo, Daniela; Maggi, Claudia; Savelli, Sara
2009-03-01
The purpose of this article is to report our practical use of dynamic contrast-enhanced magnetic resonance mammography (DCE-MRM) in the diagnosis of breast lesions. In many European centers, a high-temporal-resolution acquisition of both breasts simultaneously in a large FOV was preferred. We preferred to scan single breasts, with the aim of combining the analysis of contrast uptake and washout with the morphological evaluation of breast lesions. We followed an interpretation model, based on a diagnostic algorithm, which combined contrast enhancement with morphological evaluation in order to increase our confidence in diagnosis. DCE-MRM with our diagnostic algorithm identified 179 malignant and 41 benign lesions; the final outcome identified 178 malignant and 42 benign lesions, with 3 false positives and 2 false negatives. Sensitivity of DCE-MRM was 98.3%; specificity, 95.1%; positive predictive value, 98.9%; negative predictive value, 92.8%; and accuracy, 97.7%.
Automated frame selection process for high-resolution microendoscopy
NASA Astrophysics Data System (ADS)
Ishijima, Ayumu; Schwarz, Richard A.; Shin, Dongsuk; Mondrik, Sharon; Vigneswaran, Nadarajah; Gillenwater, Ann M.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca
2015-04-01
We developed an automated frame selection algorithm for high-resolution microendoscopy video sequences. The algorithm rapidly selects a representative frame with minimal motion artifact from a short video sequence, enabling fully automated image analysis at the point-of-care. The algorithm was evaluated by quantitative comparison of diagnostically relevant image features and diagnostic classification results obtained using automated frame selection versus manual frame selection. A data set consisting of video sequences collected in vivo from 100 oral sites and 167 esophageal sites was used in the analysis. The area under the receiver operating characteristic curve was 0.78 (automated selection) versus 0.82 (manual selection) for oral sites, and 0.93 (automated selection) versus 0.92 (manual selection) for esophageal sites. The implementation of fully automated high-resolution microendoscopy at the point-of-care has the potential to reduce the number of biopsies needed for accurate diagnosis of precancer and cancer in low-resource settings where there may be limited infrastructure and personnel for standard histologic analysis.
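A hedged illustration of one way such a frame selector can work: each frame is scored by its mean absolute difference from its neighbours and the frame with the smallest score is kept. The published criterion may differ, and the video array below is synthetic.

import numpy as np

def select_frame(video):
    # video: (n_frames, h, w) array; return index of the frame with least inter-frame motion.
    diffs = np.abs(np.diff(video.astype(np.float64), axis=0)).mean(axis=(1, 2))
    motion = np.empty(len(video))
    motion[0], motion[-1] = diffs[0], diffs[-1]
    motion[1:-1] = 0.5 * (diffs[:-1] + diffs[1:])      # average of both neighbouring differences
    return int(np.argmin(motion))

rng = np.random.default_rng(0)
video = rng.normal(0, 5, (30, 64, 64)) + 100
video[10:20] += rng.normal(0, 30, (10, 64, 64))        # frames with heavy simulated motion artifact
print(select_frame(video))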
Overcoming limitations of model-based diagnostic reasoning systems
NASA Technical Reports Server (NTRS)
Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.
1989-01-01
The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.
A diagnostic approach to hemochromatosis
Tavill, Anthony S; Adams, Paul C
2006-01-01
In the present clinical review, a diagnostic approach to hemochromatosis is discussed from the perspective of two clinicians with extensive experience in this area. The introduction of genetic testing and large-scale population screening studies have broadened our understanding of the clinical expression of disease and the utility of biochemical iron tests for the detection of disease and for the assessment of disease severity. Liver biopsy has become more of a prognostic test than a diagnostic test. The authors offer a stepwise, diagnostic algorithm based on current evidence-based data, that they regard as most cost-effective. An early diagnosis can lead to phlebotomy therapy to prevent the development of cirrhosis. PMID:16955151
Robust Mokken Scale Analysis by Means of the Forward Search Algorithm for Outlier Detection
ERIC Educational Resources Information Center
Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas
2011-01-01
Exploratory Mokken scale analysis (MSA) is a popular method for identifying scales from larger sets of items. As with any statistical method, in MSA the presence of outliers in the data may result in biased results and wrong conclusions. The forward search algorithm is a robust diagnostic method for outlier detection, which we adapt here to…
Disk Crack Detection for Seeded Fault Engine Test
NASA Technical Reports Server (NTRS)
Luo, Huageng; Rodriguez, Hector; Hallman, Darren; Corbly, Dennis; Lewicki, David G. (Technical Monitor)
2004-01-01
Work was performed to develop and demonstrate vibration diagnostic techniques for the on-line detection of engine rotor disk cracks and other anomalies through a real engine test. An existing single-degree-of-freedom non-resonance-based vibration algorithm was extended to a multi-degree-of-freedom model. In addition, a resonance-based algorithm was also proposed for the case of one or more resonances. The algorithms were integrated into a diagnostic system using state-of-the-art commercial analysis equipment. The system required only non-rotating vibration signals, such as accelerometers and proximity probes, and the rotor shaft 1/rev signal to conduct the health monitoring. Before the engine test, the integrated system was tested in the laboratory by using a small rotor with controlled mass unbalances. The laboratory tests verified the system integration and both the non-resonance and the resonance-based algorithm implementations. In the engine test, the system concluded that after two weeks of cycling, the seeded fan disk flaw did not propagate to a large enough size to be detected by changes in the synchronous vibration. The unbalance induced by mass shifting during the start up and coast down was still the dominant response in the synchronous vibration.
Data Mining for Anomaly Detection
NASA Technical Reports Server (NTRS)
Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj
2013-01-01
The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis are split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in both supervised and unsupervised learning contexts. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and CiDM distance measure provides the best results for capturing relevant properties of time-series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
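The DZIP/CiDM combination itself is not specified here, so the snippet below only illustrates the general idea of a compression-based dissimilarity between data segments, using the well-known normalized compression distance with zlib; the function name and the choice of compressor are assumptions, not the report's implementation.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte sequences.

    Smaller values mean the sequences compress well together, i.e. share structure;
    anomalous segments tend to sit far from the 'nominal' cluster under this measure.
    """
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Example: compare two quantized sensor traces serialized as bytes.
a = bytes([1, 2, 3, 4] * 100)
b = bytes([1, 2, 3, 5] * 100)
print(ncd(a, b))
```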
A Wave Diagnostics in Geophysics: Algorithmic Extraction of Atmosphere Disturbance Modes
NASA Astrophysics Data System (ADS)
Leble, S.; Vereshchagin, S.
2018-04-01
The problem of diagnostics in geophysics is discussed and a proposal based on the dynamic projecting operators technique is formulated. The general exposition is demonstrated by an example of a symbolic algorithm for the wave and entropy modes in the exponentially stratified atmosphere. The novel technique is developed as a discrete version of the evolution operator and the corresponding projectors via the discrete Fourier transform. Its explicit realization for directed modes in an exponential one-dimensional atmosphere is presented via the corresponding projection operators in discrete form, as matrices with a prescribed action on arrays formed from observation tables. A simulation based on an oppositely directed (upward and downward) wave-train solution is performed, and the extraction of the modes from a mixture is illustrated.
A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging.
Jiang, J; Hall, T J
2007-07-07
Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast that provides new diagnostically valuable information. Previous works (Hall et al 2003 Ultrasound Med. Biol. 29 427, Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is a column-based algorithm (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to a row-based algorithm (a row is defined by a line of data perpendicular to the ultrasound beam direction). Then, displacement estimates from its adjacent columns provide good guidance for motion tracking in a significantly reduced search region to reduce computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separate tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized in a Windows system as a stand-alone ANSI C++ program. Results of preliminary tests, using numerical and tissue-mimicking phantoms, and in vivo tissue data, suggest that high contrast strain images can be consistently obtained with frame rates (10 frames per second) that exceed those of our previous methods.
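A minimal sketch of the column-based, guided-search idea (not the authors' ANSI C++ implementation): each block along an A-line is matched by normalized cross-correlation, with the search window centred on the displacement estimate from the adjacent, already-tracked column. Block and search sizes are illustrative assumptions.

```python
import numpy as np

def track_column(pre, post, guide=None, block=48, search=4):
    """Axial displacement per block along one A-line (integer-sample precision)."""
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    n_blocks = len(pre) // block
    disp = np.zeros(n_blocks, dtype=int)
    for k in range(n_blocks):
        start = k * block
        ref = pre[start:start + block]
        ref = ref - ref.mean()
        # Centre the search on the neighbouring column's estimate, if available.
        centre = 0 if guide is None else int(round(guide[k]))
        best_score = -np.inf
        for d in range(centre - search, centre + search + 1):
            lo, hi = start + d, start + d + block
            if lo < 0 or hi > len(post):
                continue
            cand = post[lo:hi] - post[lo:hi].mean()
            denom = np.linalg.norm(ref) * np.linalg.norm(cand)
            if denom == 0:
                continue
            score = float(ref @ cand) / denom   # normalized cross-correlation
            if score > best_score:
                best_score, disp[k] = score, d
    return disp
```

Because each column only needs its neighbour's estimates, columns on either side of the central seed line can be processed in parallel, which is the property the paper exploits.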
Role of third molars in orthodontics
Almpani, Konstantinia; Kolokitha, Olga-Elpis
2015-01-01
The role of third molars in the oral cavity has been extensively studied over the years. The literature includes numerous diagnostic and treatment alternatives regarding the third molars. However, an issue that has not been discussed at the same level is their involvement in orthodontic therapy. The aim of this study is to present a review of the contemporary literature regarding the most broadly discussed aspects of the multifactorial role of third molars in orthodontics, which are also of general dental interest. PMID:25685759
Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera
2003-11-01
Information explosion and globalization make great demands on keeping pace with the new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not satisfactory for efficient exploitation of modern information technology in diagnostics, therapy and nursing. The present contribution describes the application of two basic problem solving techniques (brainstorming, SWOT analysis) to develop a project aimed at information literacy enhancement.
An overview of contemporary nuclear cardiology.
Lewin, Howard C; Sciammarella, Maria G; Watters, Thomas A; Alexander, Herbert G
2004-01-01
Myocardial perfusion single photon emission computed tomography (SPECT) is a widely utilized noninvasive imaging modality for the diagnosis, prognosis, and risk stratification of coronary artery disease. It is clearly superior to the traditional planar technique in terms of imaging contrast and consequent diagnostic and prognostic yield. The strength of SPECT imaging is largely derived from the three-dimensional, volumetric nature of its images. Thus, this modality permits three-dimensional assessment and quantitation of the perfused myocardium and functional assessment through electrocardiographic gating of the perfusion images.
Reception of the stethoscope and Laënnec's book.
Bishop, P J
1981-01-01
A study of contemporary book reviews and other notices enables us to trace the reception of the stethoscope and Laënnec's book between 1816 and 1826. It is quite clear from these that the stethoscope was welcomed with enthusiasm by most people who saw it as the first major diagnostic tool medicine had ever had. Laënnec's book was recognised as being the most important, interesting, accurate, and complete work on diseases of the chest that had ever been published. PMID:7031968
CAD-Based Shielding Analysis for ITER Port Diagnostics
NASA Astrophysics Data System (ADS)
Serikov, Arkady; Fischer, Ulrich; Anthoine, David; Bertalot, Luciano; De Bock, Maartin; O'Connor, Richard; Juarez, Rafael; Krasilnikov, Vitaly
2017-09-01
Radiation shielding analysis conducted in support of design development of the contemporary diagnostic systems integrated inside the ITER ports relies on the use of CAD models. This paper presents the CAD-based MCNP Monte Carlo radiation transport and activation analyses for the Diagnostic Upper and Equatorial Port Plugs (UPP #3 and EPP #8, #17). The creation of the complicated 3D MCNP models of the diagnostic systems was substantially accelerated by application of the CAD-to-MCNP converter programs MCAM and McCad. High-performance computing resources of the Helios supercomputer made it possible to speed up the MCNP parallel transport calculations with the MPI/OpenMP interface. The shielding solutions found could be universal, reducing port R&D costs. The shield block behind the Tritium and Deposit Monitor (TDM) optical box was added to study its influence on the Shut-Down Dose Rate (SDDR) in the Port Interspace (PI) of EPP#17. The influence of neutron streaming along the Lost Alpha Monitor (LAM) on the neutron energy spectra calculated in the Tangential Neutron Spectrometer (TNS) of EPP#8 was also examined. For the UPP#3 with Charge eXchange Recombination Spectroscopy (CXRS-core), the analysis revealed excessive neutron streaming along the CXRS shutter, which should be prevented in a further design iteration.
Emerging Pathogens: Challenges and Successes of Molecular Diagnostics
Dong, Jianli; Olano, Juan P.; McBride, Jere W.; Walker, David H.
2008-01-01
More than 50 emerging and reemerging pathogens have been identified during the last 40 years. Until 1992 when the Institute of Medicine issued a report that defined emerging infectious diseases, medicine had been complacent about such infectious diseases despite the alarm bells of infections with human immunodeficiency virus. Molecular tools have proven useful in discovering and characterizing emerging viruses and bacteria such as Sin Nombre virus (hantaviral pulmonary syndrome), hepatitis C virus, Bartonella henselae (cat scratch disease, bacillary angiomatosis), and Anaplasma phagocytophilum (human granulocytotropic anaplasmosis). The feasibility of applying molecular diagnostics to dangerous, fastidious, and uncultivated agents for which conventional tests do not yield timely diagnoses has achieved proof of concept for many agents, but widespread use of cost-effective, validated commercial assays has yet to occur. This review presents representative emerging viral respiratory infections, hemorrhagic fevers, and hepatitides, as well as bacterial and parasitic zoonotic, gastrointestinal, and pulmonary infections. Agent characteristics, epidemiology, clinical manifestations, and diagnostic methods are tabulated for another 22 emerging viruses and five emerging bacteria. The ongoing challenge to the field of molecular diagnostics is to apply contemporary knowledge to facilitate agent diagnosis as well as to further discoveries of novel pathogens. PMID:18403608
To close or not to close: contemporary indications for patent foramen ovale closure.
Zier, Lucas S; Sievert, Horst; Mahadevan, Vaikom S
2016-11-01
Patent foramen ovale (PFO) is a common congenital cardiac abnormality that has been associated with several disease processes including transient ischemic attacks (TIA), stroke, migraine headaches with aura, decompression sickness, platypnea-orthodeoxia syndrome, and shunt-induced cyanosis. Controversy exists regarding closure of PFO as a therapeutic treatment modality for these disease processes. This review addresses the contemporary clinical indications for PFO closure. Areas covered: We conducted a comprehensive literature search of contemporary research studies focusing on randomized trials and meta-analyses comparing medical therapy and device closure of PFOs for the treatment of PFO-associated clinical syndromes. We synthesized this literature into a review addressing indications for PFO closure in stroke, TIA, migraine headaches with aura, decompression sickness, platypnea-orthodeoxia syndrome, and shunt-induced cyanosis. Expert commentary: Because in many PFO-associated conditions it can be difficult to determine the degree to which the PFO is a causative factor in the disease process, we recommend a comprehensive diagnostic evaluation to exclude other obvious etiologies of PFO-associated conditions before implicating the PFO and proceeding with closure. However, in the properly selected patient population, there is growing clinical experience and experimental evidence suggesting that closure of PFO is a safe and effective treatment modality.
ERIC Educational Resources Information Center
Maljaars, Jarymke; Noens, Ilse; Scholte, Evert; van Berckelaer-Onnes, Ina
2012-01-01
The Diagnostic Interview for Social and Communication Disorders (DISCO; Wing, 2006) is a standardized, semi-structured and interviewer-based schedule for diagnosis of autism spectrum disorder (ASD). The objective of this study was to evaluate the criterion and convergent validity of the DISCO-11 ICD-10 algorithm in young and low-functioning…
Rutstein, Sarah E; Ananworanich, Jintanat; Fidler, Sarah; Johnson, Cheryl; Sanders, Eduard J; Sued, Omar; Saez-Cirion, Asier; Pilcher, Christopher D; Fraser, Christophe; Cohen, Myron S; Vitoria, Marco; Doherty, Meg; Tucker, Joseph D
2017-06-28
The unchanged global HIV incidence may be related to ignoring acute HIV infection (AHI). This scoping review examines diagnostic, clinical, and public health implications of identifying and treating persons with AHI. We searched PubMed, in addition to hand-review of key journals identifying research pertaining to AHI detection and treatment. We focused on the relative contribution of AHI to transmission and the diagnostic, clinical, and public health implications. We prioritized research from low- and middle-income countries (LMICs) published in the last fifteen years. Extensive AHI research and limited routine AHI detection and treatment have begun in LMIC. Diagnostic challenges include ease-of-use, suitability for application and distribution in LMIC, and throughput for high-volume testing. Risk score algorithms have been used in LMIC to screen for AHI among individuals with behavioural and clinical characteristics more often associated with AHI. However, algorithms have not been implemented outside research settings. From a clinical perspective, there are substantial immunological and virological benefits to identifying and treating persons with AHI - evading the irreversible damage to host immune systems and seeding of viral reservoirs that occurs during untreated acute infection. The therapeutic benefits require rapid initiation of antiretrovirals, a logistical challenge in the absence of point-of-care testing. From a public health perspective, AHI diagnosis and treatment is critical to: decrease transmission via viral load reduction and behavioural interventions; improve pre-exposure prophylaxis outcomes by avoiding treatment initiation for HIV-seronegative persons with AHI; and, enhance partner services via notification for persons recently exposed or likely transmitting. There are undeniable clinical and public health benefits to AHI detection and treatment, but also substantial diagnostic and logistical barriers to implementation and scale-up. Effective early ART initiation may be critical for HIV eradication efforts, but widespread use in LMIC requires simple and accurate diagnostic tools. Implementation research is critical to facilitate sustainable integration of AHI detection and treatment into existing health systems and will be essential for prospective evaluation of testing algorithms, point-of-care diagnostics, and efficacious and effective first-line regimens.
The JET diagnostic fast central acquisition and trigger system (abstract)
NASA Astrophysics Data System (ADS)
Edwards, A. W.; Blackler, K.
1995-01-01
Most plasma physics diagnostics sample at a fixed frequency that is normally matched to available memory limits. This technique is not appropriate for long pulse machines such as JET where sampling frequencies of hundreds of kHz are required to diagnose very fast events. As a result of work using real-time event selection within the previous JET soft x-ray diagnostic, a single data acquisition and event triggering system for all suitable fast diagnostics, the fast central acquisition and trigger system (Fast CATS), has been developed for JET. The front-end analog-to-digital conversion (ADC) part samples all channels at 250 kHz, with a 100 kHz pass band and a stop band of 125 kHz. The back-end data collection system is based around Texas Instruments TMS320C40 microprocessors. Within this system, two levels of trigger algorithms are able to evaluate data. The first level typically analyzes data on a per diagnostic and individual channel basis. The second level looks at the data from one or more diagnostics in a window around the time of interest flagged by the first level system. Selection criteria defined by the diagnosticians are then imposed on the results from the second level to decide whether that data should be kept. The use of such a system involving intelligent real time trigger algorithms and fast data analysis will improve both the quantity and quality of JET diagnostic data, while providing valuable input to the design of data acquisition systems for very long pulse machines such as ITER. This paper will give an overview of the various elements of this new system. In addition, first results from this system following the restart of JET operation will be presented.
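The following sketch, under assumed thresholds and window sizes, shows the shape of such a two-level scheme: a per-channel level-1 test flags candidate times, and a level-2 test keeps an event only if enough channels coincide within a window around the flagged time. It is illustrative only, not the Fast CATS trigger code.

```python
import numpy as np

def level1_hits(channel, threshold):
    """Level-1: sample indices where a single channel's magnitude exceeds its threshold."""
    return np.flatnonzero(np.abs(np.asarray(channel)) > threshold)

def level2_accept(per_channel_hits, centre, half_window, min_channels):
    """Level-2: accept the candidate at 'centre' only if at least 'min_channels'
    channels have a level-1 hit inside the coincidence window."""
    n_coincident = sum(
        bool(np.any((hits >= centre - half_window) & (hits <= centre + half_window)))
        for hits in per_channel_hits
    )
    return n_coincident >= min_channels

# Example: three channels; keep the event at sample 1000 if two of them fire nearby.
channels = [np.random.randn(5000) for _ in range(3)]
hits = [level1_hits(c, threshold=3.0) for c in channels]
print(level2_accept(hits, centre=1000, half_window=50, min_channels=2))
```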
NASA Astrophysics Data System (ADS)
Uma Maheswari, R.; Umamaheswari, R.
2017-02-01
Condition monitoring systems (CMS) offer potential economic benefits and enable prognostic maintenance in wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive-train CMS, enabling the early detection of impending failure or damage. In variable-speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient at diagnosing machine faults under time-varying conditions. Current research in drive-train CMS therefore focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false-alarm rates. Stationary signal processing algorithms for vibration analysis have been reviewed extensively in the literature. In this paper, an attempt is made to review recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable-speed wind turbines.
KANTS: a stigmergic ant algorithm for cluster analysis and swarm art.
Fernandes, Carlos M; Mora, Antonio M; Merelo, Juan J; Rosa, Agostinho C
2014-06-01
KANTS is a swarm intelligence clustering algorithm inspired by the behavior of social insects. It uses stigmergy as a strategy for clustering large datasets and, as a result, displays a typical behavior of complex systems: self-organization and global patterns emerging from the local interaction of simple units. This paper introduces a simplified version of KANTS and describes recent experiments with the algorithm in the context of a contemporary artistic and scientific trend called swarm art, a type of generative art in which swarm intelligence systems are used to create artwork or ornamental objects. KANTS is used here for generating color drawings from the input data that represent real-world phenomena, such as electroencephalogram sleep data. However, the main proposal of this paper is an art project based on well-known abstract paintings, from which the chromatic values are extracted and used as input. Colors and shapes are therefore reorganized by KANTS, which generates its own interpretation of the original artworks. The project won the 2012 Evolutionary Art, Design, and Creativity Competition.
Multiple signal classification algorithm for super-resolution fluorescence microscopy
Agarwal, Krishna; Macháň, Radek
2016-01-01
Single-molecule localization techniques are restricted by long acquisition and computational times, or the need for special fluorophores or biologically toxic photochemical environments. Here we propose a statistical super-resolution technique of wide-field fluorescence microscopy we call the multiple signal classification algorithm, which has several advantages. It provides resolution down to at least 50 nm, requires fewer frames and lower excitation power and works even at high fluorophore concentrations. Further, it works with any fluorophore that exhibits blinking on the timescale of the recording. The multiple signal classification algorithm shows comparable or better performance in comparison with single-molecule localization techniques and four contemporary statistical super-resolution methods for experiments on in vitro actin filaments and other independently acquired experimental data sets. We also demonstrate super-resolution at timescales of 245 ms (using 49 frames acquired at 200 frames per second) in samples of live-cell microtubules and live-cell actin filaments imaged without imaging buffers. PMID:27934858
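For readers unfamiliar with the underlying principle, the sketch below shows classical narrowband MUSIC in a one-dimensional direction-of-arrival setting: eigendecompose the sample covariance, keep the noise subspace, and evaluate a pseudospectrum whose peaks mark source locations. The fluorescence variant replaces steering vectors with point-spread-function responses; this generic version is only an illustration, not the authors' imaging implementation.

```python
import numpy as np

def music_pseudospectrum(X, n_sources, angles_deg, spacing=0.5):
    """Classical MUSIC for a uniform linear array.

    X: complex data matrix of shape (n_sensors, n_snapshots).
    spacing: element spacing in wavelengths.
    """
    n_sensors = X.shape[0]
    R = X @ X.conj().T / X.shape[1]                  # sample covariance
    _, eigvecs = np.linalg.eigh(R)                   # eigenvalues in ascending order
    noise_subspace = eigvecs[:, : n_sensors - n_sources]
    spectrum = []
    for theta in np.deg2rad(angles_deg):
        steering = np.exp(-2j * np.pi * spacing * np.arange(n_sensors) * np.sin(theta))
        # True source directions are (nearly) orthogonal to the noise subspace,
        # so the denominator collapses there and the pseudospectrum peaks.
        spectrum.append(1.0 / np.linalg.norm(noise_subspace.conj().T @ steering) ** 2)
    return np.array(spectrum)
```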
NASA Astrophysics Data System (ADS)
Maass, Bolko
2016-12-01
This paper describes an efficient and easily implemented algorithmic approach to extracting an approximation to an image's dominant projected illumination direction, based on intermediary results from a segmentation-based crater detection algorithm (CDA), at a computational cost that is negligible in comparison to that of the prior stages of the CDA. Most contemporary CDAs built for spacecraft navigation use this illumination direction as a means of improving performance or even require it to function at all. Deducing the illumination vector from the image alone reduces the reliance on external information such as the accurate knowledge of the spacecraft inertial state, accurate time base and solar system ephemerides. Therefore, a method such as the one described in this paper is a prerequisite for true "Lost in Space" operation of a purely segmentation-based crater detecting and matching method for spacecraft navigation. The proposed method is verified using ray-traced lunar elevation model data, asteroid image data, and in a laboratory setting with a camera in the loop.
Ostreĭkov, I F; Podkopaev, V N; Moiseev, D B; Karpysheva, E V; Markova, L A; Sizov, S V
1997-01-01
Total mortality in the neonatal intensive care wards of the Tushino Pediatric Hospital decreased 2.5-fold in 1996 and is now 7.6%. Such results are due to a complex of measures, one of which was the development and introduction of an algorithm for the diagnosis and treatment of newborns hospitalized in intensive care wards. The algorithm facilitates the work of the staff, helps diagnose diseases earlier and, hence, allows timely, scientifically based therapy to be carried out.
Shanks, Leslie; Siddiqui, M Ruby; Abebe, Almaz; Piriou, Erwan; Pearce, Neil; Ariti, Cono; Masiga, Johnson; Muluneh, Libsework; Wazome, Joseph; Ritmeijer, Koert; Klarkowski, Derryck
2015-05-14
Current WHO testing guidelines for resource limited settings diagnose HIV on the basis of screening tests without a confirmation test due to cost constraints. This leads to a potential risk of false positive HIV diagnosis. In this paper, we evaluate the dilution test, a novel method for confirmation testing, which is simple, rapid, and low cost. The principle of the dilution test is to alter the sensitivity of a rapid diagnostic test (RDT) by dilution of the sample, in order to screen out the cross reacting antibodies responsible for falsely positive RDT results. Participants were recruited from two testing centres in Ethiopia where a tiebreaker algorithm using 3 different RDTs in series is used to diagnose HIV. All samples positive on the initial screening RDT and every 10th negative sample underwent testing with the gold standard and dilution test. Dilution testing was performed using Determine™ rapid diagnostic test at 6 different dilutions. Results were compared to the gold standard of Western Blot; where Western Blot was indeterminate, PCR testing determined the final result. 2895 samples were recruited to the study. 247 were positive for a prevalence of 8.5 % (247/2895). A total of 495 samples underwent dilution testing. The RDT diagnostic algorithm misclassified 18 samples as positive. Dilution at the level of 1/160 was able to correctly identify all these 18 false positives, but at a cost of a single false negative result (sensitivity 99.6 %, 95 % CI 97.8-100; specificity 100 %, 95 % CI: 98.5-100). Concordance between the gold standard and the 1/160 dilution strength was 99.8 %. This study provides proof of concept for a new, low cost method of confirming HIV diagnosis in resource-limited settings. It has potential for use as a supplementary test in a confirmatory algorithm, whereby double positive RDT results undergo dilution testing, with positive results confirming HIV infection. Negative results require nucleic acid testing to rule out false negative results due to seroconversion or misclassification by the lower sensitivity dilution test. Further research is needed to determine if these results can be replicated in other settings. ClinicalTrials.gov, NCT01716299 .
Emergency ultrasound-based algorithms for diagnosing blunt abdominal trauma.
Stengel, Dirk; Bauwens, Kai; Rademacher, Grit; Ekkernkamp, Axel; Güthoff, Claas
2013-07-31
Ultrasonography is regarded as the tool of choice for early diagnostic investigations in patients with suspected blunt abdominal trauma. Although its sensitivity is too low for definite exclusion of abdominal organ injury, proponents of ultrasound argue that ultrasound-based clinical pathways enhance the speed of primary trauma assessment, reduce the number of computed tomography scans and cut costs. The objective was to assess the effects of trauma algorithms that include ultrasound examinations in patients with suspected blunt abdominal trauma. We searched the Cochrane Injuries Group's Specialised Register, CENTRAL (The Cochrane Library), MEDLINE (OvidSP), EMBASE (OvidSP), CINAHL (EBSCO), publishers' databases, controlled trials registers and the Internet. Bibliographies of identified articles and conference abstracts were searched for further eligible studies. Trial authors were contacted for further information and individual patient data. The searches were updated in February 2013. We included randomised controlled trials (RCTs) and quasi-randomised trials (qRCTs) of patients with blunt torso, abdominal or multiple trauma undergoing diagnostic investigations for abdominal organ injury, comparing diagnostic algorithms comprising emergency ultrasonography (US) with diagnostic algorithms without ultrasound examinations (for example, primary computed tomography [CT] or diagnostic peritoneal lavage [DPL]). Outcomes were mortality, use of CT and DPL, cost-effectiveness, laparotomy and negative laparotomy rates, delayed diagnoses, and quality of life. Two authors independently selected trials for inclusion, assessed methodological quality and extracted data. Where possible, data were pooled and relative risks (RRs), risk differences (RDs) and weighted mean differences, each with 95% confidence intervals (CIs), were calculated by fixed- or random-effects modelling, as appropriate. We identified four studies meeting our inclusion criteria. Overall, trials were of moderate methodological quality. Few trial authors responded to our written inquiries seeking to resolve controversial issues and to obtain individual patient data. We pooled mortality data from three trials involving 1254 patients; the relative risk in favour of the US arm was 1.00 (95% CI 0.50 to 2.00). US-based pathways significantly reduced the number of CT scans (random-effects RD -0.52, 95% CI -0.83 to -0.21), but the meaning of this result is unclear. Given the low sensitivity of ultrasound, the reduction in CT scans may translate to either a number needed to treat or a number needed to harm of two. There is currently insufficient evidence from RCTs to justify promotion of ultrasound-based clinical pathways in diagnosing patients with suspected blunt abdominal trauma.
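As a reminder of how the pooled effect measures quoted above are built, the snippet below computes a relative risk and its 95% confidence interval from a single trial's 2x2 table using the standard log-RR standard error; pooling across trials then combines such estimates with fixed- or random-effects weights. The counts are illustrative, not data from the review.

```python
import math

def relative_risk(events_us, total_us, events_ctrl, total_ctrl):
    """Relative risk and 95% CI from one trial's 2x2 table (events and group totals)."""
    rr = (events_us / total_us) / (events_ctrl / total_ctrl)
    # Standard error of log(RR) for a single 2x2 table.
    se = math.sqrt(1 / events_us - 1 / total_us + 1 / events_ctrl - 1 / total_ctrl)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# Illustrative counts only.
print(relative_risk(20, 600, 20, 620))
```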
Harman, David J; Ryder, Stephen D; James, Martin W; Jelpke, Matthew; Ottey, Dominic S; Wilkes, Emilie A; Card, Timothy R; Aithal, Guruprasad P; Guha, Indra Neil
2015-05-03
To assess the feasibility of a novel diagnostic algorithm targeting patients with risk factors for chronic liver disease in a community setting. Prospective cross-sectional study. Two primary care practices (adult patient population 10,479) in Nottingham, UK. Adult patients (aged 18 years or over) fulfilling one or more selected risk factors for developing chronic liver disease: (1) hazardous alcohol use, (2) type 2 diabetes or (3) persistently elevated alanine aminotransferase (ALT) liver function enzyme with negative serology. A serial biomarker algorithm, using a simple blood-based marker (aspartate aminotransferase:ALT ratio for hazardous alcohol users, BARD score for other risk groups) and subsequently liver stiffness measurement using transient elastography (TE). Diagnosis of clinically significant liver disease (defined as liver stiffness ≥8 kPa); definitive diagnosis of liver cirrhosis. We identified 920 patients with the defined risk factors of whom 504 patients agreed to undergo investigation. A normal blood biomarker was found in 62 patients (12.3%) who required no further investigation. Subsequently, 378 patients agreed to undergo TE, of whom 98 (26.8% of valid scans) had elevated liver stiffness. Importantly, 71/98 (72.4%) patients with elevated liver stiffness had normal liver enzymes and would be missed by traditional investigation algorithms. We identified 11 new patients with definite cirrhosis, representing a 140% increase in the number of diagnosed cases in this population. A non-invasive liver investigation algorithm based in a community setting is feasible to implement. Targeting risk factors using a non-invasive biomarker approach identified a substantial number of patients with previously undetected cirrhosis. The diagnostic algorithm utilised for this study can be found on clinicaltrials.gov (NCT02037867), and is part of a continuing longitudinal cohort study.
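A hedged sketch of the serial logic described above: a blood-based first step (AST:ALT ratio for hazardous alcohol users, BARD score otherwise) gates referral to transient elastography, with stiffness of 8 kPa or more taken as clinically significant. The specific cut-offs for the blood markers are assumptions based on common usage, not values quoted in the abstract.

```python
def bard_score(bmi, ast, alt, has_type2_diabetes):
    # Commonly used BARD components (assumed here): BMI >= 28 scores 1,
    # AST/ALT ratio >= 0.8 scores 2, type 2 diabetes scores 1.
    score = 0
    if bmi >= 28:
        score += 1
    if ast / alt >= 0.8:
        score += 2
    if has_type2_diabetes:
        score += 1
    return score

def refer_for_elastography(risk_group, bmi, ast, alt, has_type2_diabetes):
    """Step 1: simple blood-based marker chosen by risk group (illustrative thresholds)."""
    if risk_group == "hazardous_alcohol":
        return ast / alt >= 0.8
    return bard_score(bmi, ast, alt, has_type2_diabetes) >= 2

def clinically_significant(stiffness_kpa):
    """Step 2: transient elastography; >= 8 kPa per the study's definition."""
    return stiffness_kpa >= 8.0
```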
Retrieved Vertical Profiles of Latent Heat Release Using TRMM Rainfall Products
NASA Technical Reports Server (NTRS)
Tao, W.-K.; Lang, S.; Olson, W. S.; Meneghini, R.; Yang, S.; Simpson, J.; Kummerow, C.; Smith, E.
2000-01-01
This paper represents the first attempt to use TRMM rainfall information to estimate the four-dimensional latent heating structure over the global tropics for February 1998. The mean latent heating profiles over six oceanic regions (TOGA COARE IFA, Central Pacific, S. Pacific Convergence Zone, East Pacific, Indian Ocean and Atlantic Ocean) and three continental regions (S. America, Central Africa and Australia) are estimated and studied. The heating profiles obtained from the results of diagnostic budget studies over a broad range of geographic locations are used to provide comparisons and indirect validation for the heating profiles estimated by the algorithms. Three different latent heating algorithms, the Goddard Convective-Stratiform (CSH) heating, the Goddard Profiling (GPROF) heating, and the Hydrometeor heating (HH), are used and their results are intercompared. The horizontal distributions or patterns of latent heat release from the three different heating retrieval methods are quite similar. They can all identify the areas of major convective activity (i.e., a well-defined ITCZ in the Pacific, a distinct SPCZ) in the global tropics. The magnitudes of the estimated latent heat release also agree reasonably well with each other and with those determined from diagnostic budget studies. However, the major difference among these three heating retrieval algorithms is the altitude of the maximum heating level. The heating profiles estimated by the CSH algorithm show only one maximum heating level, and that level varies with convective activity across geographic locations. These features are in good agreement with diagnostic budget studies. By contrast, two maximum heating levels were found using the GPROF heating and HH algorithms. The latent heating profiles estimated from all three methods cannot show cooling between active convective events. We also examined the impact of different TMI (Multi-channel Passive Microwave Sensor) and PR (Precipitation Radar) rainfall information on latent heating structures.
Region of interest processing for iterative reconstruction in x-ray computed tomography
NASA Astrophysics Data System (ADS)
Kopp, Felix K.; Nasirudin, Radin A.; Mei, Kai; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Noël, Peter B.
2015-03-01
Recent advancements in graphics card technology have raised the performance of parallel computing and contributed to the introduction of iterative reconstruction methods for x-ray computed tomography in clinical CT scanners. Iterative maximum likelihood (ML) based reconstruction methods are known to reduce image noise and to improve the diagnostic quality of low-dose CT. However, iterative reconstruction of a region of interest (ROI), especially ML based, is challenging, yet for some clinical procedures, such as cardiac CT, only a ROI is needed for diagnosis. A high-resolution reconstruction of the full field of view (FOV) consumes unnecessary computational effort and results in a reconstruction slower than clinically acceptable. In this work, we present an extension and evaluation of an existing ROI processing algorithm. In particular, improvements to the equalization between regions inside and outside a ROI are proposed. The evaluation was done on data collected from a clinical CT scanner. The performance of the different algorithms is qualitatively and quantitatively assessed. Our solution to the ROI problem provides an increase in signal-to-noise ratio and leads to visually less noise in the final reconstruction. The reconstruction speed of our technique was observed to be comparable with previously proposed techniques. The development of ROI processing algorithms in combination with iterative reconstruction will provide higher diagnostic quality in the near future.
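For context, iterative ML reconstructions of this kind are typically built on the classical ML-EM update written generically below (clinical implementations and the ROI extension differ in detail), where $a_{ij}$ is the system matrix element linking voxel $j$ to measurement $i$, $y_i$ the measured data and $x_j^{(k)}$ the current image estimate:

$$
x_j^{(k+1)} \;=\; \frac{x_j^{(k)}}{\sum_i a_{ij}} \sum_i a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'}\, x_{j'}^{(k)}}.
$$

Restricting the sums over $j$ to voxels in and near the ROI is what makes ROI-only iteration attractive, but it is also the source of the boundary-equalization issue the paper addresses.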
López-Sánchez, C; Sulleiro, E; Bocanegra, C; Romero, S; Codina, G; Sanz, I; Esperalba, J; Serra, J; Pigrau, C; Burgos, J; Almirante, B; Falcó, V
2017-04-01
In this study we attempt to assess the utility of a simplified step-wise diagnostic algorithm to determine the aetiology of encephalitis in daily clinical practice and to describe the main causes in our setting. This was a prospective cohort study of all consecutive cases of encephalitis in adult patients diagnosed between January 2010 and March 2015 at the University Hospital Vall d'Hebron in Barcelona, Spain. The aetiological study was carried out following the proposed step-wise algorithm. The proportion of aetiological diagnoses achieved in each step was analysed. Data from 97 patients with encephalitis were assessed. Following a simplified step-wise algorithm, a definite diagnosis was made in the first step in 53 patients (55 %) and in 12 additional cases (12 %) in the second step. Overall, a definite or probable aetiological diagnosis was achieved in 78 % of the cases. Herpes virus, L. monocytogenes and M. tuberculosis were the leading causative agents demonstrated, whereas less frequent aetiologies were observed, mainly in immunosuppressed patients. The overall related mortality was 13.4 %. According to our experience, the leading and treatable causes of encephalitis can be identified in a first diagnostic step with limited microbiological studies. L. monocytogenes treatment should be considered on arrival in some patients. Additional diagnostic effort should be made in immunosuppressed patients.
NASA Astrophysics Data System (ADS)
Georgiou, Harris
2009-10-01
Medical Informatics and the application of modern signal processing in the assistance of the diagnostic process in medical imaging is one of the more recent and active research areas today. This thesis addresses a variety of issues related to the general problem of medical image analysis, specifically in mammography, and presents a series of algorithms and design approaches for all the intermediate levels of a modern system for computer-aided diagnosis (CAD). The diagnostic problem is analyzed with a systematic approach, first defining the imaging characteristics and features that are relevant to probable pathology in mammograms. Next, these features are quantified and fused into new, integrated radiological systems that exhibit embedded digital signal processing, in order to improve the final result and minimize the radiological dose for the patient. At a higher level, special algorithms are designed for detecting and encoding these clinically interesting imaging features, in order to be used as input to advanced pattern classifiers and machine learning models. Finally, these approaches are extended to multi-classifier models under the scope of Game Theory and optimum collective decision, in order to produce efficient solutions for combining classifiers with minimum computational costs for advanced diagnostic systems. The material covered in this thesis is related to a total of 18 published papers, 6 in scientific journals and 12 in international conferences.
Naehle, Claas P; Hechelhammer, Lukas; Richter, Heiko; Ryffel, Fabian; Wildermuth, Simon; Weber, Johannes
To evaluate the effectiveness and clinical utility of a metal artifact reduction (MAR) image reconstruction algorithm for the reduction of high-attenuation object (HAO)-related image artifacts. Images were quantitatively evaluated for image noise (noiseSD and noiserange) and qualitatively for artifact severity, gray-white-matter delineation, and diagnostic confidence with conventional reconstruction and after applying a MAR algorithm. Metal artifact reduction reduces noiseSD and noiserange (median [interquartile range]) at the level of the HAO at a distance of 1 cm compared with conventional reconstruction (noiseSD: 60.0 [71.4] vs 12.8 [16.1] and noiserange: 262.0 [236.8] vs 72.0 [28.3]; P < 0.0001). Artifact severity (reader 1 [mean ± SD]: 1.1 ± 0.6 vs 2.4 ± 0.5, reader 2: 0.8 ± 0.6 vs 2.0 ± 0.4) at the level of the HAO and diagnostic confidence (reader 1: 1.6 ± 0.7 vs 2.6 ± 0.5, reader 2: 1.0 ± 0.6 vs 2.3 ± 0.7) significantly improved with MAR (P < 0.0001). Metal artifact reduction did not affect gray-white-matter delineation. Metal artifact reduction effectively reduces image artifacts caused by HAO and significantly improves diagnostic confidence without worsening gray-white-matter delineation.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2007-03-01
Multislice CT scanners have remarkably increased the speed at which chest CT images can be acquired for mass screening. Mass screening based on multislice CT images requires a considerable number of images to be read, and it is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. Moreover, we have provided diagnostic assistance by using a lung cancer screening algorithm built into a mobile helical CT scanner for lung cancer mass screening in regions without a hospital. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using a Virtual Private Network router together with biometric fingerprint and face authentication systems for the security of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system.
AI (artificial intelligence) in histopathology--from image analysis to automated diagnosis.
Kayser, Klaus; Görtler, Jürgen; Bogovac, Milica; Bogovac, Aleksandar; Goldmann, Torsten; Vollmer, Ekkehard; Kayser, Gian
2009-01-01
The technological progress in digitalization of complete histological glass slides has opened a new door in tissue-based diagnosis. The presentation of microscopic images as a whole in a digital matrix is called a virtual slide. A virtual slide allows calculation and related presentation of image information that otherwise can only be seen by individual human performance. The digital world permits the attachment of several (if not all) fields of view and their contemporary visualization on a screen. The presentation of all microscopic magnifications is possible if the basic pixel resolution is less than 0.25 microns. Introducing digital tissue-based diagnosis into the daily routine work of a surgical pathologist requires a new setup of workflow arrangements and procedures. The quality of digitized images is sufficient for diagnostic purposes; however, the time needed for viewing virtual slides exceeds that of viewing original glass slides by far. The reason lies in a slower and more difficult sampling procedure, i.e., the selection of information-containing fields of view. By application of artificial intelligence, tissue-based diagnosis in routine work can be managed automatically in the following steps: 1. The individual image quality has to be measured and, if necessary, corrected. 2. A diagnostic algorithm has to be applied; an algorithm has to be developed that includes both object-based (object features, structures) and pixel-based (texture) measures. 3. These measures serve for diagnosis classification and feedback to order additional information, for example in virtual immunohistochemical slides. 4. The measures can serve for automated image classification and detection of relevant image information by themselves, without any labeling. 5. The pathologist's duty will not be replaced by such a system; on the contrary, the pathologist will manage and supervise the system, i.e., work at a "higher level". Virtual slides are already in use for teaching and continuing education in anatomy and pathology. First attempts to introduce them into routine work have been reported. Application of AI has been established by automated immunohistochemical measurement systems (EAMUS, www.diagnomX.eu). The performance of automated diagnosis has been reported for a broad variety of organs at sensitivity and specificity levels above 85%. The implementation of a complete, connected, AI-supported system is in its infancy. Application of AI in digital tissue-based diagnosis will allow pathologists to work as supervisors and no longer as primary "water carriers". Its accurate use will give them the time needed to concentrate on difficult cases for the benefit of their patients.
Langley, Ivor; Lin, Hsien-Ho; Egwaga, Saidi; Doulla, Basra; Ku, Chu-Chang; Murray, Megan; Cohen, Ted; Squire, S Bertel
2014-10-01
Several promising new diagnostic methods and algorithms for tuberculosis have been endorsed by WHO. National tuberculosis programmes now face the decision on which methods to implement and where to place them in the diagnostic algorithm. We used an integrated model to assess the effects of different algorithms of Xpert MTB/RIF and light-emitting diode (LED) fluorescence microscopy in Tanzania. To understand the effects of new diagnostics from the patient, health system, and population perspective, the model incorporated and linked a detailed operational component and a transmission component. The model was designed to represent the operational and epidemiological context of Tanzania and was used to compare the effects and cost-effectiveness of different diagnostic options. Among the diagnostic options considered, we identified three strategies as cost effective in Tanzania. Full scale-up of Xpert would have the greatest population-level effect with the highest incremental cost: 346 000 disability-adjusted life-years (DALYs) averted with an additional cost of US$36·9 million over 10 years. The incremental cost-effectiveness ratio (ICER) of Xpert scale-up ($169 per DALY averted, 95% credible interval [CrI] 104-265) is below the willingness-to-pay threshold ($599) for Tanzania. Same-day LED fluorescence microscopy is the next most effective strategy with an ICER of $45 (95% CrI 25-74), followed by LED fluorescence microscopy with an ICER of $29 (6-59). Compared with same-day LED fluorescence microscopy and Xpert full rollout, targeted use of Xpert in presumptive tuberculosis cases with HIV infection, either as an initial diagnostic test or as a follow-on test to microscopy, would produce DALY gains at a higher incremental cost and therefore is dominated in the context of Tanzania. For Tanzania, this integrated modelling approach predicts that full rollout of Xpert is a cost-effective option for tuberculosis diagnosis and has the potential to substantially reduce the national tuberculosis burden. It also estimates the substantial level of funding that will need to be mobilised to translate this into clinical practice. This approach could be adapted and replicated in other developing countries to inform rational health policy formulation.
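For readers unfamiliar with the metric, the ICERs quoted above are incremental ratios against the next-best non-dominated strategy; a minimal sketch with illustrative numbers, not the paper's comparison set:

```python
def icer(cost, cost_comparator, effect, effect_comparator):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect
    (here, per DALY averted) relative to the next-best non-dominated comparator."""
    return (cost - cost_comparator) / (effect - effect_comparator)

# Illustrative: a strategy costing an extra US$10 million and averting an extra
# 50,000 DALYs versus its comparator yields US$200 per DALY averted.
print(icer(10_000_000, 0, 50_000, 0))
```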
Sharma, Manuj; Petersen, Irene; Nazareth, Irwin; Coton, Sonia J
2016-01-01
Background: Research into diabetes mellitus (DM) often requires a reproducible method for identifying and distinguishing individuals with type 1 DM (T1DM) and type 2 DM (T2DM). Objectives: To develop a method to identify individuals with T1DM and T2DM using UK primary care electronic health records. Methods: Using data from The Health Improvement Network primary care database, we developed a two-step algorithm. The first algorithm step identified individuals with potential T1DM or T2DM based on diagnostic records, treatment, and clinical test results. We excluded individuals with records for rarer DM subtypes only. For individuals to be considered diabetic, they needed to have at least two records indicative of DM; one of which was required to be a diagnostic record. We then classified individuals with T1DM and T2DM using the second algorithm step. A combination of diagnostic codes, medication prescribed, age at diagnosis, and whether the case was incident or prevalent were used in this process. We internally validated this classification algorithm through comparison against an independent clinical examination of The Health Improvement Network electronic health records for a random sample of 500 DM individuals. Results: Out of 9,161,866 individuals aged 0–99 years from 2000 to 2014, we classified 37,693 individuals with T1DM and 418,433 with T2DM, while 1,792 individuals remained unclassified. A small proportion were classified with some uncertainty (1,155 [3.1%] of all individuals with T1DM and 6,139 [1.5%] with T2DM) due to unclear health records. During validation, manual assignment of DM type based on clinical assessment of the entire electronic record and algorithmic assignment led to equivalent classification in all instances. Conclusion: The majority of individuals with T1DM and T2DM can be readily identified from UK primary care electronic health records. Our approach can be adapted for use in other health care settings. PMID:27785102
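A hedged sketch of the two-step logic described above, with illustrative field names; the real algorithm works from Read codes, prescriptions, test results, age at diagnosis and incident/prevalent status in The Health Improvement Network, and its exact rules are not reproduced here.

```python
def classify_dm(person):
    """Return 'T1DM', 'T2DM' or None for one person's summarized record (a dict)."""
    # Step 1: a candidate DM case needs at least two DM-indicative records,
    # one of which must be a diagnostic code; rarer subtypes alone are excluded.
    if person["n_dm_records"] < 2 or not person["has_diagnostic_code"]:
        return None
    if person.get("rare_subtype_only", False):
        return None
    # Step 2: split T1DM from T2DM using diagnostic codes, prescribed treatment and
    # age at diagnosis (the thresholds below are illustrative assumptions).
    if person.get("t1dm_specific_code", False):
        return "T1DM"
    if person.get("insulin_only", False) and person.get("age_at_diagnosis", 99) < 35:
        return "T1DM"
    return "T2DM"

print(classify_dm({"n_dm_records": 3, "has_diagnostic_code": True,
                   "insulin_only": True, "age_at_diagnosis": 14}))
```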
Recursive Bayesian recurrent neural networks for time-series modeling.
Mirikitani, Derrick T; Nikolaev, Nikolay
2010-02-01
This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.
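As a point of reference (not the paper's exact recursion), the Levenberg-Marquardt step that such recursive Bayesian training builds on updates the weights as

$$
\Delta\mathbf{w} \;=\; \left(\mathbf{J}^{\top}\mathbf{J} + \lambda \mathbf{I}\right)^{-1}\mathbf{J}^{\top}\mathbf{e},
$$

where $\mathbf{J}$ is the Jacobian of the network outputs with respect to the weights, $\mathbf{e}$ the error vector and $\lambda$ a damping term. Roughly speaking, the sequential Bayesian formulation described in the abstract supplements this second-order step with noise and weight-prior hyperparameters and propagates the covariance (Hessian) matrix from one time step to the next along with the weights.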
Thoracoabdominal Computed Tomography in Trauma Patients: A Cost-Consequences Analysis
van Vugt, Raoul; Kool, Digna R.; Brink, Monique; Dekker, Helena M.; Deunk, Jaap; Edwards, Michael J.
2014-01-01
Background: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. Objectives: This study was performed to evaluate cost-consequences of different diagnostic algorithms that use thoracoabdominal CT in primary evaluation of adult patients with high-energy blunt trauma. Materials and Methods: We compared three different algorithms in which CT was applied as an immediate diagnostic tool (rush CT), a diagnostic tool after limited conventional work-up (routine CT), and a selective tool (selective CT). Probabilities of detecting and missing clinically relevant injuries were retrospectively derived. We collected data on radiation exposure and performed a micro-cost analysis on a reference case-based approach. Results: Both rush and routine CT detected all thoracoabdominal injuries in 99.1% of the patients during primary evaluation (n = 1040). Selective CT missed one or more diagnoses in 11% of the patients, necessitating a change of treatment in 4.8%. The rush CT algorithm cost € 2676 (US$ 3660) per patient, with a mean radiation dose of 26.40 mSv per patient. Routine CT cost € 2815 (US$ 3850) and resulted in the same radiation exposure. Selective CT resulted in a lower radiation dose (23.23 mSv) and cost € 2771 (US$ 3790). Conclusions: Rush CT seems to result in the lowest costs and is comparable in terms of radiation dose exposure and diagnostic certainty with routine CT after a limited conventional work-up. However, selective CT results in a lower radiation dose exposure but a slightly higher cost and less certainty. PMID:25337521
Delcroix, Olivier; Robin, Philippe; Gouillou, Maelenn; Le Duc-Pennec, Alexandra; Alavi, Zarrin; Le Roux, Pierre-Yves; Abgral, Ronan; Salaun, Pierre-Yves; Bourhis, David; Querellou, Solène
2018-02-12
xSPECT Bone® (xB) is a new reconstruction algorithm developed by Siemens® for bone hybrid imaging (SPECT/CT). A CT-based tissue segmentation is incorporated into SPECT reconstruction to provide SPECT images with bone anatomy appearance. The objectives of this study were to assess xB/CT reconstruction diagnostic reliability and accuracy in comparison with Flash 3D® (F3D)/CT in clinical routine. Two hundred thirteen consecutive patients referred to the Brest Nuclear Medicine Department for non-oncological bone diseases were evaluated retrospectively. Two hundred seven SPECT/CT were included. All SPECT/CT were independently interpreted by two nuclear medicine physicians (a junior and a senior expert) with xB/CT and then with F3D/CT three months later. Inter-observer agreement (IOA) and diagnostic confidence were determined using the McNemar test and the unweighted kappa coefficient. The study objectives were then re-assessed for validation through > 18 months of clinical and paraclinical follow-up. No statistically significant differences between IOA xB and IOA F3D were found (p = 0.532). Agreement for xB after categorical classification of the diagnoses was high (κ xB = 0.89 [95% CI 0.84-0.93]) but without a statistically significant difference from F3D (κ F3D = 0.90 [95% CI 0.86-0.94]). Thirty-one (14.9%) inter-reconstruction diagnostic discrepancies were observed, of which 21 (10.1%) were classified as major. The follow-up confirmed the diagnosis of F3D in 10 cases, xB in 6 cases and was non-contributory in 5 cases. The xB reconstruction algorithm was found to be reliable, providing high inter-observer agreement and diagnostic confidence similar to F3D reconstruction in clinical routine.
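The agreement figures above are unweighted Cohen's kappa values; for reference, a minimal computation from two readers' categorical diagnoses looks like this (illustrative code, not the study's statistical software):

```python
from collections import Counter

def cohens_kappa(reader_a, reader_b):
    """Unweighted Cohen's kappa between two readers' categorical labels."""
    n = len(reader_a)
    observed = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    freq_a, freq_b = Counter(reader_a), Counter(reader_b)
    categories = set(reader_a) | set(reader_b)
    # Chance agreement expected from the two readers' marginal label frequencies.
    expected = sum(freq_a[c] * freq_b[c] for c in categories) / n ** 2
    return (observed - expected) / (1 - expected)

print(cohens_kappa(["fracture", "normal", "metastasis", "normal"],
                   ["fracture", "normal", "normal", "normal"]))
```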
Algorithm for Video Summarization of Bronchoscopy Procedures
2011-01-01
Background The duration of bronchoscopy examinations varies considerably depending on the diagnostic and therapeutic procedures used. It can last more than 20 minutes if a complex diagnostic work-up is included. With wide access to videobronchoscopy, the whole procedure can be recorded as a video sequence. Common practice relies on an active attitude of the bronchoscopist, who initiates the recording process and usually chooses to archive only selected views and sequences. However, it may be important to record the full bronchoscopy procedure as documentation when liability issues are at stake. Furthermore, an automatic recording of the whole procedure enables the bronchoscopist to focus solely on the performed procedures. Video recordings registered during bronchoscopies include a considerable number of frames of poor quality due to blurry or unfocused images. Such frames seem unavoidable due to the relatively tight endobronchial space, rapid movements of the respiratory tract caused by breathing or coughing, and secretions which occur commonly in the bronchi, especially in patients suffering from pulmonary disorders. Methods The use of recorded bronchoscopy video sequences for diagnostic, reference and educational purposes could be considerably extended with efficient, flexible summarization algorithms. Thus, the authors developed a prototype system to create shortcuts (called summaries or abstracts) of bronchoscopy video recordings. Such a system, based on models described in previously published papers, employs image analysis methods to exclude frames or sequences of limited diagnostic or educational value. Results The algorithm for the selection or exclusion of specific frames or shots from video sequences recorded during bronchoscopy procedures is based on several criteria, including automatic detection of "non-informative" frames, frames showing the branching of the airways, and frames containing pathological lesions. Conclusions The paper focuses on the challenge of generating summaries of bronchoscopy video recordings. PMID:22185344
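The abstract does not spell out how "non-informative" frames are detected, but a common proxy for blurry or unfocused endoscopic frames is a low variance of the image Laplacian. The following sketch illustrates that idea under this assumption; the threshold, function names and random stand-in frames are hypothetical and not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import laplace

def is_informative(frame_gray, sharpness_threshold=50.0):
    """Flag a frame as informative when its focus measure exceeds a threshold.

    The focus measure is the variance of the Laplacian; blurry or unfocused
    frames give low values. The threshold is a tunable assumption.
    """
    focus = laplace(frame_gray.astype(float)).var()
    return focus >= sharpness_threshold

def summarize(frames, step=5, sharpness_threshold=50.0):
    """Keep every `step`-th informative frame as a crude video abstract."""
    kept = [i for i, f in enumerate(frames)
            if is_informative(f, sharpness_threshold)]
    return kept[::step]

# Usage with random stand-in frames (a real pipeline would decode the video)
frames = [np.random.rand(240, 320) * 255.0 for _ in range(100)]
print(len(summarize(frames)))
```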
Alexander, C L; Currie, S; Pollock, K; Smith-Palmer, A; Jones, B L
2017-06-01
Giardia duodenalis and Cryptosporidium species are protozoan parasites capable of causing gastrointestinal disease in humans and animals through the ingestion of infective faeces. Whereas Cryptosporidium species can be acquired locally or through foreign travel, there is a misconception that giardiasis is largely travel-associated, which results in differences in laboratory testing algorithms. In order to determine the level of variation in testing criteria and detection methods between diagnostic laboratories for both pathogens across Scotland, an audit was performed. Twenty Scottish diagnostic microbiology laboratories were invited to participate, with questions on sample acceptance criteria, testing methods, testing rates and future plans for pathogen detection. Responses were received from 19 of the 20 laboratories, representing each of the 14 territorial Health Boards. Detection methods varied between laboratories, with the majority performing microscopy, one using a lateral flow immunochromatographic antigen assay, another using a manually washed plate-based enzyme immunoassay (EIA) and one laboratory trialling a plate-based EIA automated with an EIA plate washer. Whereas all laboratories except one screened every stool for Cryptosporidium species, an important finding was the significant variation in the testing algorithm for detecting Giardia, with only four laboratories testing all diagnostic stools. The most common criteria were 'travel history' (11 laboratories) and/or 'when requested' (14 laboratories). Although only a small proportion of stools was examined for Giardia in 15 laboratories (2%-18% of the total number of stools submitted), it is of interest that a higher positivity rate was observed for Giardia than for Cryptosporidium in 10 of these 15 laboratories. These findings highlight that Giardia is likely underreported in Scotland under current selection and testing algorithms.
Benz, Dominik C; Fuchs, Tobias A; Gräni, Christoph; Studer Bruengger, Annina A; Clerc, Olivier F; Mikulicic, Fran; Messerli, Michael; Stehli, Julia; Possner, Mathias; Pazhenkottil, Aju P; Gaemperli, Oliver; Kaufmann, Philipp A; Buechel, Ronny R
2018-02-01
Iterative reconstruction (IR) algorithms allow for a significant reduction in radiation dose of coronary computed tomography angiography (CCTA). We performed a head-to-head comparison of adaptive statistical IR (ASiR) and model-based IR (MBIR) algorithms to assess their impact on quantitative image parameters and diagnostic accuracy for submillisievert CCTA. CCTA datasets of 91 patients were reconstructed using filtered back projection (FBP), increasing contributions of ASiR (20, 40, 60, 80, and 100%), and MBIR. Signal and noise were measured in the aortic root to calculate signal-to-noise ratio (SNR). In a subgroup of 36 patients, diagnostic accuracy of ASiR 40%, ASiR 100%, and MBIR for diagnosis of coronary artery disease (CAD) was compared with invasive coronary angiography. Median radiation dose was 0.21 mSv for CCTA. While increasing levels of ASiR gradually reduced image noise compared with FBP (up to -48%, P < 0.001), MBIR provided the largest noise reduction (-79% compared with FBP), outperforming ASiR (-59% compared with ASiR 100%; P < 0.001). Increased noise and lower SNR with ASiR 40% and ASiR 100% resulted in substantially lower diagnostic accuracy for detecting CAD as diagnosed by invasive coronary angiography compared with MBIR: sensitivity and specificity were 100 and 37%, 100 and 57%, and 100 and 74% for ASiR 40%, ASiR 100%, and MBIR, respectively. MBIR offers substantial noise reduction with increased SNR, paving the way for implementation of submillisievert CCTA protocols in clinical routine. In contrast, inferior noise reduction by ASiR negatively affects diagnostic accuracy of submillisievert CCTA for CAD detection. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.
Implementation of an Integrated On-Board Aircraft Engine Diagnostic Architecture
NASA Technical Reports Server (NTRS)
Armstrong, Jeffrey B.; Simon, Donald L.
2012-01-01
An on-board diagnostic architecture for aircraft turbofan engine performance trending, parameter estimation, and gas-path fault detection and isolation has been developed and evaluated in a simulation environment. The architecture incorporates two independent models: a real-time self-tuning performance model providing parameter estimates and a performance baseline model for diagnostic purposes reflecting long-term engine degradation trends. This architecture was evaluated using flight profiles generated from a nonlinear model with realistic fleet engine health degradation distributions and sensor noise. The architecture was found to produce acceptable estimates of engine health and unmeasured parameters, and the integrated diagnostic algorithms were able to perform correct fault isolation in approximately 70 percent of the tested cases.
[Role of cytology in hematopathological diagnostics].
Bode, B; Tinguely, M
2012-07-01
The role of cytology has so far been underrecognized in the diagnostic work-up of hematopathological questions. This article presents an algorithm which allows a stepwise work-up of cytology specimens obtained by minimally invasive ultrasound-guided fine needle aspiration in patients with unexplained lymph node swelling. Moreover, it is shown how the selective separation of cytology specimens allows the application of immunophenotypic analysis, including flow cytometry and immunohistochemistry, as well as molecular analyses such as fluorescence in situ hybridization (FISH) and polymerase chain reaction (PCR) strategies. With the integrative procedure presented, cytology offers an excellent, cost-effective tool for the diagnostic work-up of patients with suspected hematological malignancies, allowing high diagnostic accuracy, ideal for initial diagnosis or follow-up.
Noël, Peter B; Engels, Stephan; Köhler, Thomas; Muenzel, Daniela; Franz, Daniela; Rasper, Michael; Rummeny, Ernst J; Dobritz, Martin; Fingerle, Alexander A
2018-01-01
Background The explosive growth of computed tomography (CT) has led to a growing public health concern about patient and population radiation dose. A recently introduced technique for dose reduction, which can be combined with tube-current modulation, over-beam reduction, and organ-specific dose reduction, is iterative reconstruction (IR). Purpose To evaluate the quality, at different radiation dose levels, of three reconstruction algorithms for diagnostics of patients with proven liver metastases under tumor follow-up. Material and Methods A total of 40 thorax-abdomen-pelvis CT examinations acquired from 20 patients in a tumor follow-up were included. All patients were imaged using the standard-dose and a specific low-dose CT protocol. Reconstructed slices were generated by using three different reconstruction algorithms: a classical filtered back projection (FBP); a first-generation iterative noise-reduction algorithm (iDose4); and a next-generation model-based IR algorithm (IMR). Results The overall detection of liver lesions tended to be higher with the IMR algorithm than with FBP or iDose4. The IMR dataset at standard dose yielded the highest overall detectability, while the low-dose FBP dataset showed the lowest detectability. For the low-dose protocols, significantly improved detectability of the liver lesions can be reported for IMR compared to FBP or iDose4 (P = 0.01). The radiation dose decreased by an approximate factor of 5 between the standard-dose and the low-dose protocol. Conclusion The latest generation of IR algorithms significantly improved the diagnostic image quality and provided virtually noise-free images for ultra-low-dose CT imaging.
Sola, J; Braun, F; Muntane, E; Verjus, C; Bertschi, M; Hugon, F; Manzano, S; Benissa, M; Gervaix, A
2016-08-01
Pneumonia remains the worldwide leading cause of mortality in children under the age of five, with 1.4 million deaths every year. Unfortunately, in low-resource settings, very limited diagnostic support aids are provided to point-of-care practitioners. The current UNICEF/WHO case management algorithm relies on the use of a chronometer to manually count breath rates in pediatric patients: there is thus a major need for more sophisticated tools to diagnose pneumonia that increase the sensitivity and specificity of breath-rate-based algorithms. These tools should be low cost and adapted to practitioners with limited training. In this work, a novel concept of an unsupervised tool for the diagnosis of childhood pneumonia is presented. The concept relies on the automated analysis of respiratory sounds as recorded by a point-of-care electronic stethoscope. By identifying the presence of auscultation sounds at different chest locations, this diagnostic tool is intended to estimate a pneumonia likelihood score. After presenting the overall architecture of an algorithm to estimate pneumonia scores, the importance of a robust unsupervised method to identify inspiratory and expiratory phases of a respiratory cycle is highlighted. Based on data from an ongoing study involving pediatric pneumonia patients, a first algorithm to segment respiratory sounds is suggested. The unsupervised algorithm relies on a Mel-frequency filter bank, a two-step Gaussian Mixture Model (GMM) description of the data, and a final Hidden Markov Model (HMM) interpretation of inspiratory-expiratory sequences. Finally, illustrative results on the first recruited patients are provided. The presented algorithm opens the door to a new family of unsupervised respiratory sound analyzers that could improve future versions of case management algorithms for the diagnosis of pneumonia in low-resource settings.
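A minimal sketch of the signal chain described above (Mel-frequency filter bank followed by a Gaussian Mixture Model over frames) is shown below. It assumes librosa and scikit-learn are available; the final HMM decoding of inspiratory-expiratory sequences is replaced here by simple median smoothing of the GMM labels, so this is an illustration of the idea rather than the authors' algorithm.

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture
from scipy.ndimage import median_filter

def segment_respiratory_phases(audio_path, sr=8000):
    """Crude two-class segmentation of a lung-sound recording.

    Frames are described by log-mel energies and clustered with a
    2-component GMM; median filtering stands in for the HMM decoding step.
    """
    y, sr = librosa.load(audio_path, sr=sr)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=1024,
                                         hop_length=256, n_mels=24)
    feats = np.log(mel + 1e-9).T                 # frames x mel bands
    gmm = GaussianMixture(n_components=2, covariance_type="diag",
                          random_state=0).fit(feats)
    labels = gmm.predict(feats)
    labels = median_filter(labels, size=15)      # enforce contiguous phases
    times = librosa.frames_to_time(np.arange(len(labels)), sr=sr,
                                   hop_length=256)
    return times, labels
```

Median smoothing is used here purely to keep the sketch short; it enforces label contiguity but, unlike an HMM, imposes no constraints on the ordering or duration of inspiratory and expiratory phases.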
Cox, Zachary L; Lewis, Connie M; Lai, Pikki; Lenihan, Daniel J
2017-01-01
We aim to validate the diagnostic performance of the first fully automatic, electronic heart failure (HF) identification algorithm and evaluate the implementation of an HF Dashboard system with 2 components: real-time identification of decompensated HF admissions and accurate characterization of disease characteristics and medical therapy. We constructed an HF identification algorithm requiring 3 of 4 identifiers: B-type natriuretic peptide >400 pg/mL; admitting HF diagnosis; history of HF International Classification of Disease, Ninth Revision, diagnosis codes; and intravenous diuretic administration. We validated the diagnostic accuracy of the components individually (n = 366) and combined in the HF algorithm (n = 150) compared with a blinded provider panel in 2 separate cohorts. We built an HF Dashboard within the electronic medical record characterizing the disease and medical therapies of HF admissions identified by the HF algorithm. We evaluated the HF Dashboard's performance over 26 months of clinical use. Individually, the algorithm components displayed variable sensitivity and specificity, respectively: B-type natriuretic peptide >400 pg/mL (89% and 87%); diuretic (80% and 92%); and International Classification of Disease, Ninth Revision, code (56% and 95%). The HF algorithm achieved high specificity (95%), positive predictive value (82%), and negative predictive value (85%) but only limited sensitivity (56%), secondary to missing provider-generated identification data. The HF Dashboard identified and characterized 3147 HF admissions over 26 months. Automated identification and characterization systems can be developed and used with a substantial degree of specificity for the diagnosis of decompensated HF, although sensitivity is limited by clinical data input. Copyright © 2016 Elsevier Inc. All rights reserved.
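The 3-of-4 rule described above maps directly onto a small amount of code. A sketch under the abstract's stated identifiers; the function name and example values are hypothetical.

```python
def flag_decompensated_hf(bnp_pg_ml, admit_dx_hf, icd9_history_hf, iv_diuretic_given):
    """Return True when at least 3 of the 4 identifiers are met."""
    identifiers = [
        bnp_pg_ml is not None and bnp_pg_ml > 400,  # B-type natriuretic peptide > 400 pg/mL
        bool(admit_dx_hf),                          # admitting HF diagnosis
        bool(icd9_history_hf),                      # HF ICD-9 diagnosis codes in history
        bool(iv_diuretic_given),                    # intravenous diuretic administered
    ]
    return sum(identifiers) >= 3

# Example admission: high BNP, ICD-9 history and IV diuretics, no admitting diagnosis yet
print(flag_decompensated_hf(bnp_pg_ml=812, admit_dx_hf=False,
                            icd9_history_hf=True, iv_diuretic_given=True))  # True
```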
NASA Astrophysics Data System (ADS)
Hong, Liu; Qu, Yongzhi; Dhupia, Jaspreet Singh; Sheng, Shuangwen; Tan, Yuegang; Zhou, Zude
2017-09-01
The localized failures of gears introduce cyclic-transient impulses in the measured gearbox vibration signals. These impulses are usually identified from the sidebands around gear-mesh harmonics through the spectral analysis of cyclo-stationary signals. However, in practice, several high-powered applications of gearboxes like wind turbines are intrinsically characterized by nonstationary processes that blur the measured vibration spectra of a gearbox and deteriorate the efficacy of spectral diagnostic methods. Although order-tracking techniques have been proposed to improve the performance of spectral diagnosis for nonstationary signals measured in such applications, the required hardware for the measurement of rotational speed of these machines is often unavailable in industrial settings. Moreover, existing tacho-less order-tracking approaches are usually limited by the high time-frequency resolution requirement, which is a prerequisite for the precise estimation of the instantaneous frequency. To address such issues, a novel fault-signature enhancement algorithm is proposed that can alleviate the spectral smearing without the need of rotational speed measurement. This proposed tacho-less diagnostic technique resamples the measured acceleration signal of the gearbox based on the optimal warping path evaluated from the fast dynamic time-warping algorithm, which aligns a filtered shaft rotational harmonic signal with respect to a reference signal assuming a constant shaft rotational speed estimated from the approximation of operational speed. The effectiveness of this method is validated using both simulated signals from a fixed-axis gear pair under nonstationary conditions and experimental measurements from a 750-kW planetary wind turbine gearbox on a dynamometer test rig. The results demonstrate that the proposed algorithm can identify fault information from typical gearbox vibration measurements carried out in a resource-constrained industrial environment.
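To illustrate the warp-and-resample idea behind the tacho-less technique, the sketch below aligns a filtered shaft-harmonic signal to a constant-speed reference with a textbook O(nm) dynamic time warping and then resamples the acceleration record along the resulting path. It is not the authors' fast-DTW implementation; it assumes all three signals share one sampling grid, and the function names are hypothetical.

```python
import numpy as np

def dtw_path(x, y):
    """Optimal warping path between 1-D sequences x and y (squared-error cost)."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from the end to recover the path
    i, j, path = n, m, []
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def order_track(acceleration, shaft_harmonic, reference_harmonic):
    """Resample an acceleration signal so the measured shaft harmonic lines up
    with a constant-speed reference harmonic, as in tacho-less order tracking."""
    path = dtw_path(shaft_harmonic, reference_harmonic)
    meas_idx = np.array([p[0] for p in path], dtype=float)
    ref_idx = np.array([p[1] for p in path], dtype=float)
    # For each reference sample, find the (fractional) measured index mapped to
    # it, then interpolate the acceleration signal at those indices.
    ref_u, first = np.unique(ref_idx, return_index=True)
    warp = np.interp(np.arange(len(reference_harmonic)), ref_u, meas_idx[first])
    return np.interp(warp, np.arange(len(acceleration)), acceleration)
```

In practice the quadratic DTW above would be replaced by the fast variant the paper relies on, since vibration records are long.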
Monahan, Mark; Jowett, Sue; Lovibond, Kate; Gill, Paramjit; Godwin, Marshall; Greenfield, Sheila; Hanley, Janet; Hobbs, F D Richard; Martin, Una; Mant, Jonathan; McKinstry, Brian; Williams, Bryan; Sheppard, James P; McManus, Richard J
2018-02-01
Clinical guidelines in the United States and United Kingdom recommend that individuals with suspected hypertension should have ambulatory blood pressure (BP) monitoring to confirm the diagnosis. This approach reduces misdiagnosis because of white coat hypertension but will not identify people with masked hypertension who may benefit from treatment. The Predicting Out-of-Office Blood Pressure (PROOF-BP) algorithm predicts masked and white coat hypertension based on patient characteristics and clinic BP, improving the accuracy of diagnosis while limiting subsequent ambulatory BP monitoring. This study assessed the cost-effectiveness of using this tool in diagnosing hypertension in primary care. A Markov cost-utility cohort model was developed to compare diagnostic strategies: the PROOF-BP approach, including those with clinic BP ≥130/80 mm Hg who receive ambulatory BP monitoring as guided by the algorithm, compared with current standard diagnostic strategies including those with clinic BP ≥140/90 mm Hg combined with further monitoring (ambulatory BP monitoring as reference, clinic, and home monitoring also assessed). The model adopted a lifetime horizon with a 3-month time cycle, taking a UK Health Service/Personal Social Services perspective. The PROOF-BP algorithm was cost-effective in screening all patients with clinic BP ≥130/80 mm Hg compared with current strategies that only screen those with clinic BP ≥140/90 mm Hg, provided healthcare providers were willing to pay up to £20 000 ($26 000)/quality-adjusted life year gained. Deterministic and probabilistic sensitivity analyses supported the base-case findings. The PROOF-BP algorithm seems to be cost-effective compared with the conventional BP diagnostic options in primary care. Its use in clinical practice is likely to lead to reduced cardiovascular disease, death, and disability. © 2017 American Heart Association, Inc.
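The study's economic model is a Markov cohort model with 3-month cycles. A deliberately tiny sketch of that style of model is given below; the three states, transition probabilities, per-cycle costs and utilities are illustrative placeholders, not the PROOF-BP model inputs.

```python
import numpy as np

def run_cohort(trans, cycle_cost, cycle_qaly, n_cycles=400, disc=0.035):
    """Discounted cost and QALY totals for one strategy.

    trans: per-cycle transition matrix over states (well, CVD, dead);
    cycle_cost / cycle_qaly: per-state values for one 3-month cycle.
    """
    state = np.array([1.0, 0.0, 0.0])          # whole cohort starts 'well'
    cost = qaly = 0.0
    for t in range(n_cycles):
        d = 1.0 / (1.0 + disc) ** (t / 4.0)    # four cycles per year
        cost += d * state @ cycle_cost
        qaly += d * state @ cycle_qaly
        state = state @ trans
    return cost, qaly

# Hypothetical inputs -- illustrative only, not the published model parameters
usual = run_cohort(np.array([[0.990, 0.007, 0.003],
                             [0.000, 0.970, 0.030],
                             [0.000, 0.000, 1.000]]),
                   np.array([60.0, 800.0, 0.0]),
                   np.array([0.22, 0.15, 0.0]))
proof = run_cohort(np.array([[0.992, 0.005, 0.003],
                             [0.000, 0.970, 0.030],
                             [0.000, 0.000, 1.000]]),
                   np.array([90.0, 800.0, 0.0]),
                   np.array([0.22, 0.15, 0.0]))
icer = (proof[0] - usual[0]) / (proof[1] - usual[1])
print(f"incremental cost per QALY gained (illustrative): {icer:.0f}")
```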
Design of the algorithm of photons migration in the multilayer skin structure
NASA Astrophysics Data System (ADS)
Bulykina, Anastasiia B.; Ryzhova, Victoria A.; Korotaev, Valery V.; Samokhin, Nikita Y.
2017-06-01
The design of approaches and methods for the diagnostics of oncological diseases has special significance, since it allows tumors of any kind to be detected at early stages. The development of optical and laser technologies has increased the number of methods available for diagnostic studies of oncological diseases. A promising area of biomedical diagnostics is the development of automated nondestructive testing systems for the study of the skin's polarizing properties based on backscattered radiation detection. Characterizing the polarizing properties of the examined tissue allows the study of structural changes caused by various pathologies. Consequently, measurement and analysis of the polarizing properties of the scattered optical radiation appear relevant for the development of methods for diagnosis and imaging of skin in vivo. The purpose of this research is to design an algorithm of photon migration in the multilayer skin structure. The designed algorithm is based on the Monte Carlo method, implemented as tracking of the paths of photons that experience random discrete direction changes until they are released from the analyzed area or their intensity decreases to a negligible level. The modeling algorithm consists of generating the medium and source characteristics; generating a photon with given polar and azimuthal angles; calculating the photon weight reduction due to specular and diffuse reflection; determining the photon mean free path; determining the photon's new direction of motion after random scattering with a Henyey-Greenstein phase function; and calculating the medium's absorption. Biological tissue is modeled as a homogeneous scattering sheet characterized by absorption, scattering and anisotropy coefficients.
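The steps listed above follow the standard weighted Monte Carlo photon transport scheme. The sketch below implements a stripped-down version for a single homogeneous layer: exponential free paths, weight deposition proportional to mu_a/mu_t, and Henyey-Greenstein sampling of the scattering angle. Boundary reflection, multiple layers and Russian roulette are omitted, and all optical coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def henyey_greenstein(g):
    """Sample the cosine of the scattering angle from the HG phase function."""
    if abs(g) < 1e-6:
        return 2.0 * rng.random() - 1.0
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * rng.random())
    return (1.0 + g * g - s * s) / (2.0 * g)

def simulate_photon(mu_a=0.1, mu_s=10.0, g=0.9, thickness=0.2):
    """Track one photon through a homogeneous slab; return absorbed weight.

    mu_a, mu_s in 1/mm, thickness in mm (illustrative values).
    """
    mu_t = mu_a + mu_s
    pos = np.array([0.0, 0.0, 0.0])
    direc = np.array([0.0, 0.0, 1.0])           # launched straight down
    weight, absorbed = 1.0, 0.0
    while weight > 1e-4:
        step = -np.log(rng.random()) / mu_t     # exponential free path
        pos = pos + step * direc
        if pos[2] < 0.0 or pos[2] > thickness:  # photon left the slab
            break
        absorbed += weight * mu_a / mu_t        # deposit the absorbed fraction
        weight *= mu_s / mu_t
        # New direction from the HG phase function and a uniform azimuth
        cos_t = henyey_greenstein(g)
        sin_t = np.sqrt(max(0.0, 1.0 - cos_t * cos_t))
        phi = 2.0 * np.pi * rng.random()
        ux, uy, uz = direc
        if abs(uz) > 0.99999:                   # near-vertical: simple update
            direc = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi),
                              np.sign(uz) * cos_t])
        else:                                   # rotate about the current direction
            den = np.sqrt(1.0 - uz * uz)
            direc = np.array([
                sin_t * (ux * uz * np.cos(phi) - uy * np.sin(phi)) / den + ux * cos_t,
                sin_t * (uy * uz * np.cos(phi) + ux * np.sin(phi)) / den + uy * cos_t,
                -sin_t * np.cos(phi) * den + uz * cos_t])
    return absorbed

# Average absorbed fraction over a small batch of photons
print(np.mean([simulate_photon() for _ in range(2000)]))
```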
Analysis of Brown camera distortion model
NASA Astrophysics Data System (ADS)
Nowakowski, Artur; Skarbek, Władysław
2013-10-01
Contemporary image acquisition devices introduce optical distortion into images. This results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze the orthogonality with regard to radius of its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of the distortion parameter estimates is evaluated.
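For reference, the Brown model combines a radial polynomial with a decentering (tangential) term; a small sketch applying it to normalized image coordinates is shown below, with illustrative coefficient values rather than those estimated in the paper.

```python
import numpy as np

def brown_distort(x, y, k1, k2, k3, p1, p2):
    """Apply the Brown model to normalized, centered image coordinates.

    k1..k3 are radial coefficients, p1/p2 the decentering (tangential) ones.
    """
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# A slightly barrel-distorted point (coefficients are illustrative)
print(brown_distort(0.3, -0.2, k1=-0.1, k2=0.01, k3=0.0, p1=1e-3, p2=-5e-4))
```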
A real-time spectral mapper as an emerging diagnostic technology in biomedical sciences.
Epitropou, George; Kavvadias, Vassilis; Iliou, Dimitris; Stathopoulos, Efstathios; Balas, Costas
2013-01-01
Real-time spectral imaging and mapping at video rates can have a tremendous impact not only on the diagnostic sciences but also on fundamental physiological problems. We report the first real-time spectral mapper based on the combination of snap-shot spectral imaging and spectral estimation algorithms. Performance evaluation revealed that six-band imaging combined with the Wiener algorithm provided high estimation accuracy, with error levels lying within the experimental noise. The high accuracy is accompanied by spectral mapping that is faster, by three orders of magnitude, than scanning spectral systems. This new technology is intended to enable spectral mapping at nearly video rates in all kinds of dynamic bio-optical effects, as well as in applications where the target-probe relative position is changing rapidly and randomly.
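The Wiener estimation step can be illustrated with a regularized least-squares training matrix that maps the six band responses to full spectra. The sketch below uses synthetic spectra and sensitivities purely as placeholders; the matrix dimensions, function names and regularization are assumptions, not the authors' calibration.

```python
import numpy as np

def train_wiener(train_spectra, train_responses, reg=1e-6):
    """Least-squares (Wiener-style) matrix mapping B band responses to spectra.

    train_spectra: N x L reflectance spectra; train_responses: N x B camera
    responses for the same samples. Returns the L x B estimation matrix W.
    """
    C, R = train_responses, train_spectra
    return R.T @ C @ np.linalg.inv(C.T @ C + reg * np.eye(C.shape[1]))

def estimate_spectrum(W, response):
    """Reconstruct a full spectrum from a single B-band measurement."""
    return W @ response

# Synthetic demo: 6 bands, 31 wavelengths
rng = np.random.default_rng(1)
R = rng.random((200, 31))
sensitivities = rng.random((31, 6))
C = R @ sensitivities                    # simulated 6-band responses
W = train_wiener(R, C)
print(estimate_spectrum(W, C[0]).shape)  # (31,)
```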
Neuromyelitis optica: Application of computer diagnostics to historical case reports.
Garcia Reitboeck, Pablo; Garrard, Peter; Peters, Timothy
2017-01-01
The retrospective diagnosis of illnesses by medical historians can often be difficult and prone to bias, although knowledge of the medical disorders of historical figures is key to the understanding of their behavior and reactions. The recent application of computer diagnostics to historical figures allows an objective differential diagnosis to be accomplished. Taking an example from clinical neurology, we analyzed the earliest reported cases of Devic's disease (neuromyelitis optica) that commonly affects the optic nerve and spinal cord and was previously often confused with multiple sclerosis. We conclude that in most identified cases the software concurred with the contemporary physicians' interpretation, but some claimed cases either had insufficient data to provide a diagnosis or other possible diagnoses were suggested that had not been considered. Computational methods may, therefore, help historians to diagnose the ailments of historical figures with greater objectivity.
NASA Technical Reports Server (NTRS)
Summons, R. E.; Jahnke, L. L.; Simoneit, B. R.
1996-01-01
This paper forms part of our long-term goal of using molecular structure and carbon isotopic signals preserved as hydrocarbons in ancient sediments to improve understanding of the early evolution of Earth's surface environment. We are particularly concerned with biomarkers which are informative about aerobiosis. Here, we combine bacterial biochemistry with the organic geochemistry of contemporary and ancient hydrothermal ecosystems to construct models for the nature, behaviour and preservation potential of primitive microbial communities. We use a combined molecular and isotopic approach to characterize lipids produced by cultured bacteria and test a variety of culture conditions which affect their biosynthesis. This information is then compared with lipid mixtures isolated from contemporary hot springs and evaluated for the kinds of chemical change that would accompany burial and incorporation into the sedimentary record. In this study we have shown that growth temperature does not appear to alter isotopic fractionation within the lipid classes produced by a methanotrophic bacterium. We also found that cultured cyanobacteria biosynthesize diagnostic methylalkanes and dimethylalkanes, with the latter only made when growing under low pCO2. In an examination of a microbial mat sample from Octopus Spring, Yellowstone National Park (USA), we could readily identify chemical structures with 13C contents which were diagnostic for the phototrophic organisms such as cyanobacteria and Chloroflexus. We could not, however, find molecular evidence for operation of a methane cycle in the particular mat samples we studied.
Machiulskiene, Vita; Carvalho, Joana Christina
2018-03-05
Classifications employed to measure dental caries should first of all reflect the dynamics of the disease, in order to provide a solid basis for subsequent treatment decisions and for further monitoring of dental health of individual patients and populations. The contemporary philosophy of dental caries management implies that nonoperative treatment of caries lesions should be implemented whenever possible, limiting operative interventions to the severe and irreversible cases. The ORCA Saturday Afternoon Symposium 2016, held back-to-back to the 63rd ORCA Congress in Athens, Greece, was intended to provide an update on general requirements for clinical caries diagnosis and to overview caries diagnostic classifications including their rationale, validation, advantages, and limitations. Clinical caries diagnostic criteria and caries management outcomes are interrelated, and any diagnostic classification disregarding this concept is outdated, according to the current understanding of oral health care. Choosing clinical caries diagnostic classifications that assess the activity status of detected lesions should be a priority for dental professionals since these classifications favor the best clinical practice directed towards nonoperative interventions. The choice of clinical caries diagnostic classifications in research, in clinical practice, and in public health services should be guided by the best available scientific evidence. The clinical caries diagnostic classifications should be universally applicable in all these fields. Policy making in oral health care and the underlying policy analyses should follow the same standards. Any clinical caries diagnostic classification disregarding the universality of its use is of limited or no interest in the context of the clinical caries diagnosis of today. © 2018 S. Karger AG, Basel.
Asher, Elad; Reuveni, Haim; Shlomo, Nir; Gerber, Yariv; Beigel, Roy; Narodetski, Michael; Eldar, Michael; Or, Jacob; Hod, Hanoch; Shamiss, Arie; Matetzky, Shlomi
2015-01-01
Aims The aim of this study was to compare, in patients presenting with acute chest pain, the clinical outcomes and cost-effectiveness of an accelerated diagnostic protocol utilizing contemporary technology in a chest pain unit versus routine care in an internal medicine department. Methods and Results Hospital and 90-day course were prospectively studied in 585 consecutive low-moderate risk acute chest pain patients, of whom 304 were investigated in a designated chest pain center using a pre-specified accelerated diagnostic protocol, while 281 underwent routine care in an internal medicine ward. Hospitalization was longer in the routine care compared with the accelerated diagnostic protocol group (p<0.001). During hospitalization, 298 accelerated diagnostic protocol patients (98%) vs. 57 (20%) routine care patients underwent non-invasive testing (p<0.001). Throughout the 90-day follow-up, diagnostic imaging testing was performed in 125 (44%) and 26 (9%) patients in the routine care and accelerated diagnostic protocol groups, respectively (p<0.001). Ultimately, most patients in both groups had non-invasive imaging testing. The accelerated diagnostic protocol, compared with routine care, was associated with a lower incidence of readmissions for chest pain [8 (3%) vs. 24 (9%), p<0.01] and acute coronary syndromes [1 (0.3%) vs. 9 (3.2%), p<0.01] during the follow-up period. The accelerated diagnostic protocol remained a predictor of lower acute coronary syndromes and readmissions after propensity score analysis [OR = 0.28 (CI 95% 0.14–0.59)]. Cost per patient was similar in both groups ($2510 vs. $2703 for the accelerated diagnostic protocol and routine care groups, respectively; p = 0.9). Conclusion An accelerated diagnostic protocol is clinically superior to and as cost-effective as routine care in acute chest pain patients, and may save time and resources. PMID:25622029
Diagnostic Approach to a Patient With Paraneoplastic Neurological Syndrome
Mahta, Ali; Vijayvergia, Namrata; Bhavsar, Tapan M.; Ward, Lawrence D.
2012-01-01
Herein, we discuss the case of an otherwise healthy man who presented with progressive gait imbalance and ataxia and was found to have small cell lung cancer. Based upon our clinical findings and laboratory data, a diagnosis of paraneoplastic cerebellar degeneration was made. Paraneoplastic neurological syndromes (PNS) are relatively rare but diverse and should always be considered in the differential diagnosis. A diagnostic algorithm along with the appropriate work-up is discussed here. PMID:29147315
Diagnostic potential of Raman spectroscopy in Barrett's esophagus
NASA Astrophysics Data System (ADS)
Wong Kee Song, Louis-Michel; Molckovsky, Andrea; Wang, Kenneth K.; Burgart, Lawrence J.; Dolenko, Brion; Somorjai, Rajmund L.; Wilson, Brian C.
2005-04-01
Patients with Barrett's esophagus (BE) undergo periodic endoscopic surveillance with random biopsies in an effort to detect dysplastic or early cancerous lesions. Surveillance may be enhanced by near-infrared Raman spectroscopy (NIRS), which has the potential to identify endoscopically-occult dysplastic lesions within the Barrett's segment and allow for targeted biopsies. The aim of this study was to assess the diagnostic performance of NIRS for identifying dysplastic lesions in BE in vivo. Raman spectra (Pexc=70 mW; t=5 s) were collected from Barrett's mucosa at endoscopy using a custom-built NIRS system (λexc=785 nm) equipped with a filtered fiber-optic probe. Each probed site was biopsied for matching histological diagnosis as assessed by an expert pathologist. Diagnostic algorithms were developed using genetic algorithm-based feature selection and linear discriminant analysis, and classification was performed on all spectra with a bootstrap-based cross-validation scheme. The analysis comprised 192 samples (112 non-dysplastic, 54 low-grade dysplasia and 26 high-grade dysplasia/early adenocarcinoma) from 65 patients. Compared with histology, NIRS differentiated dysplastic from non-dysplastic Barrett's samples with 86% sensitivity, 88% specificity and 87% accuracy. NIRS identified 'high-risk' lesions (high-grade dysplasia/early adenocarcinoma) with 88% sensitivity, 89% specificity and 89% accuracy. In the present study, NIRS classified Barrett's epithelia with high and clinically-useful diagnostic accuracy.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou
2006-03-01
Multi-helical CT scanners have advanced remarkably in the speed at which chest CT images can be acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using a Virtual Private Network router together with biometric fingerprint and face authentication systems to protect medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed and accuracy, and the safety of medical information.
Shah, Maunank; Dowdy, David; Joloba, Moses; Ssengooba, Willy; Manabe, Yukari C; Ellner, Jerrold; Dorman, Susan E
2013-11-28
Xpert MTB/RIF ('Xpert') and urinary lateral-flow lipoarabinomannan (LF-LAM) assays offer rapid tuberculosis (TB) diagnosis. This study evaluated the cost-effectiveness of novel diagnostic algorithms utilizing combinations of Xpert and LF-LAM for the detection of active TB among people living with HIV. Cost-effectiveness analysis used data from a comparative study of LF-LAM and Xpert, with a target population of HIV-infected individuals with signs/symptoms of TB in Uganda. A decision-analysis model compared multiple strategies for rapid TB diagnosis: sputum smear-microscopy; sputum Xpert; smear-microscopy combined with LF-LAM; and Xpert combined with LF-LAM. Primary outcomes were the costs and DALYs averted for each algorithm. Cost-effectiveness was represented using incremental cost-effectiveness ratios (ICERs). Compared with an algorithm of Xpert testing alone, the combination of Xpert with LF-LAM was considered highly cost-effective (ICER $57/DALY averted) at a willingness-to-pay threshold of the Ugandan GDP per capita. Addition of urine LF-LAM testing to smear-microscopy was a less effective strategy than Xpert replacement of smear-microscopy, but was less costly and also considered highly cost-effective (ICER $33 per DALY averted) compared with continued usage of smear-microscopy alone. Cost-effectiveness of the Xpert plus LF-LAM algorithm was most influenced by HIV/ART costs and the life expectancy of patients after TB treatment. The addition of urinary LF-LAM to TB diagnostic algorithms for HIV-infected individuals is highly cost-effective compared with usage of either sputum smear-microscopy or Xpert alone.
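As a reminder of how the headline numbers are formed, an ICER is simply the incremental cost divided by the incremental DALYs averted, compared against a willingness-to-pay threshold. A tiny sketch with illustrative, non-study numbers:

```python
def icer(cost_new, dalys_new, cost_ref, dalys_ref):
    """Incremental cost per DALY averted of the new algorithm vs. the reference.

    DALYs are burdens incurred, so fewer DALYs under the new algorithm means
    more DALYs averted.
    """
    return (cost_new - cost_ref) / (dalys_ref - dalys_new)

ratio = icer(cost_new=1200.0, dalys_new=95.0, cost_ref=1050.0, dalys_ref=98.0)
print(ratio, ratio <= 700)   # compare against a willingness-to-pay threshold
```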
A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging
NASA Astrophysics Data System (ADS)
Jiang, J.; Hall, T. J.
2007-07-01
Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast, which provides new diagnostically valuable information. Previous works (Hall et al 2003 Ultrasound Med. Biol. 29 427, Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and that real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is a column-based algorithm (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to a row-based algorithm (a row is defined by a line of data perpendicular to the ultrasound beam direction). Displacement estimates from adjacent columns then provide good guidance for motion tracking in a significantly reduced search region, which lowers the computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separate tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized in a Windows® system as a stand-alone ANSI C++ program. Results of preliminary tests, using numerical and tissue-mimicking phantoms and in vivo tissue data, suggest that high-contrast strain images can be consistently obtained at frame rates (10 frames s-1) that exceed those of our previous methods.
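The column-based idea can be sketched as one-dimensional block matching along an A-line with a guided search window. The kernel length, search range and normalized cross-correlation criterion below are illustrative assumptions, not the published implementation.

```python
import numpy as np

def track_column(pre_col, post_col, kernel=32, search=40, guide=None):
    """Estimate axial displacement along one A-line by block matching.

    For each kernel of samples in the pre-deformation column, search the
    post-deformation column for the best normalized cross-correlation match.
    When a `guide` displacement (e.g. from the neighbouring column) is given,
    the search is restricted to a narrow window around it.
    """
    n = len(pre_col)
    disp = np.zeros(n // kernel)
    for k in range(len(disp)):
        start = k * kernel
        ref = pre_col[start:start + kernel]
        ref = (ref - ref.mean()) / (ref.std() + 1e-12)
        center = 0 if guide is None else int(round(guide[k]))
        best, best_d = -np.inf, 0
        for d in range(center - search, center + search + 1):
            s = start + d
            if s < 0 or s + kernel > n:
                continue
            cand = post_col[s:s + kernel]
            cand = (cand - cand.mean()) / (cand.std() + 1e-12)
            ncc = float(ref @ cand) / kernel
            if ncc > best:
                best, best_d = ncc, d
        disp[k] = best_d
    return disp
```

Tracking proceeds column by column outward from the center of the ROI, passing each column's estimates as the guide for its neighbour; this is what keeps the search window small and lets the two halves of the ROI be processed in parallel.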
Living into the imagined body: how the diagnostic image confronts the lived body.
Stahl, Devan
2013-06-01
In this paper I will show how the medical image, presented to the patient by the physician, participates in medicine's cold culture of abstraction, objectification and mandated normativity. I begin by giving a brief account of the use of anatomical imaging since the Renaissance to show how images have historically functioned in contrast to how they are currently used in medical practice. Next, I examine how contemporary medical imaging techniques participate in a kind of knowledge production that objectifies the human body. Finally, I elucidate how physicians ought to place the medical image within the context of the lived body so as to create a healing relationship with the patient. In all this I hope to show that the medical image, far from a piece of objective data, testifies to the interplay of particular beliefs, practices and doctrines contemporary medicine holds dear. To best treat her patient, the physician must appreciate the influence of these images and appropriately place them within the context of the patient's lived experience.
The production of the psychiatric subject: power, knowledge and Michel Foucault.
Roberts, Marc
2005-01-01
The issue of power has become increasingly important within psychiatry, psychotherapy and mental health nursing generally. This paper will suggest that the work of Michel Foucault, the French philosopher and historian, has much to contribute to the discussion about the nature, existence and exercise of power within contemporary mental health care. As well as examining his original and challenging account of power, Foucault's emphasis on the intimate relationship between power and knowledge will be explored within the context of psychiatry and mental health nursing. This is to say that the paper will investigate Foucault's account of how power and knowledge are central to the process by which human beings are 'made subjects' and therefore how 'psychiatric identities' are produced. In doing so, it will be suggested that Foucault's work can not only make a valuable contribution to contemporary discussions about power and knowledge, but can also provide a significant critique and reconceptualization of the theoretical foundations and associated diagnostic and therapeutic practices of psychiatry and mental health nursing.
Understanding the role of psychopathology in bariatric surgery outcomes.
Marek, R J; Ben-Porath, Y S; Heinberg, L J
2016-02-01
Bariatric surgery is the most effective treatment for morbid obesity; however, a subset of patients who undergo this procedure regain weight or achieve suboptimal weight loss results. A large number of studies have examined whether psychological variables play a role in weight loss surgery outcome. Although presurgical psychopathology has been found to be associated with suboptimal results in some studies, this literature is equivocal. These inconsistent findings are reviewed and considered in the context of contemporary models of psychopathology. More specifically, the review focuses on the limitations of atheoretical, descriptive diagnostic systems and examines whether comorbidity within the mood/anxiety disorders, impulse control/substance use disorders and thought disorders can account for the inconsistent findings reported to date. Contemporary models of psychopathology are highlighted and linked to the Research Domain Criteria, which have been advanced by the National Institute of Health. Means for assessing psychological constructs congruent with these models are reviewed. Recommendations are made for standardizing approaches to investigating how psychopathology contributes to suboptimal bariatric surgery outcomes. © 2015 World Obesity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C; Adcock, A; Azevedo, S
2010-12-28
Some diagnostics at the National Ignition Facility (NIF), including the Gamma Reaction History (GRH) diagnostic, require multiple channels of data to achieve the required dynamic range. These channels need to be stitched together into a single time series, and they may have non-uniform and redundant time samples. We chose to apply the popular cubic smoothing spline technique to our stitching problem because we needed a general non-parametric method. We adapted one of the algorithms in the literature, by Hutchinson and deHoog, to our needs. The modified algorithm and the resulting code perform a cubic smoothing spline fit to multiple data channels with redundant time samples and missing data points. The data channels can have different, time-varying, zero-mean white noise characteristics. The method we employ automatically determines an optimal smoothing level by minimizing the Generalized Cross Validation (GCV) score. In order to automatically validate the smoothing level selection, the Weighted Sum-Squared Residual (WSSR) and zero-mean tests are performed on the residuals. Further, confidence intervals, both analytical and Monte Carlo, are also calculated. In this paper, we describe the derivation of our cubic smoothing spline algorithm. We outline the algorithm and test it with simulated and experimental data.
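A rough illustration of the stitching problem is given below using scipy's FITPACK smoothing spline. It is not the Hutchinson-deHoog algorithm adapted in the paper: scipy does not expose the spline influence matrix needed for true GCV, so an ordinary K-fold cross-validation score stands in for it, and duplicate time samples are merged by inverse-variance averaging beforehand. The channel format and the smoothing grid are assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def stitch_and_smooth(channels, s_grid=np.logspace(-3, 2, 20), k_folds=5):
    """Merge (time, value, sigma) channels and fit one cubic smoothing spline.

    Channels may overlap in time and have different noise levels. The
    smoothing level is picked by K-fold cross-validation over `s_grid`.
    """
    t = np.concatenate([np.asarray(c[0], float) for c in channels])
    y = np.concatenate([np.asarray(c[1], float) for c in channels])
    w = np.concatenate([1.0 / np.asarray(c[2], float) for c in channels])

    # Inverse-variance average of exact-duplicate time samples so the
    # abscissa is strictly increasing, as FITPACK prefers.
    order = np.argsort(t, kind="stable")
    t, y, w = t[order], y[order], w[order]
    t_u, inv = np.unique(t, return_inverse=True)
    wsq = np.bincount(inv, weights=w ** 2)
    y_u = np.bincount(inv, weights=w ** 2 * y) / wsq
    w_u = np.sqrt(wsq)

    rng = np.random.default_rng(0)
    fold = rng.integers(0, k_folds, size=len(t_u))
    best_s, best_err = s_grid[0], np.inf
    for s in s_grid:
        err = 0.0
        for f in range(k_folds):
            tr, te = fold != f, fold == f
            spl = UnivariateSpline(t_u[tr], y_u[tr], w=w_u[tr], k=3,
                                   s=s * tr.sum())
            err += np.sum((w_u[te] * (y_u[te] - spl(t_u[te]))) ** 2)
        if err < best_err:
            best_s, best_err = s, err
    return UnivariateSpline(t_u, y_u, w=w_u, k=3, s=best_s * len(t_u))
```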
Skopp, Nancy A; Smolenski, Derek J; Schwesinger, Daniel A; Johnson, Christopher J; Metzger-Abamukong, Melinda J; Reger, Mark A
2017-06-01
Accurate knowledge of the vital status of individuals is critical to the validity of mortality research. National Death Index (NDI) and NDI-Plus are comprehensive epidemiological resources for mortality ascertainment and cause of death data that require additional user validation. Currently, there is a gap in methods to guide validation of NDI search results rendered for active duty service members. The purpose of this research was to adapt and evaluate the CDC National Program of Cancer Registries (NPCR) algorithm for mortality ascertainment in a large military cohort. We adapted and applied the NPCR algorithm to a cohort of 7088 service members on active duty at the time of death at some point between 2001 and 2009. We evaluated NDI validity and NDI-Plus diagnostic agreement against the Department of Defense's Armed Forces Medical Examiner System (AFMES). The overall sensitivity of the NDI to AFMES records after the application of the NPCR algorithm was 97.1%. Diagnostic estimates of measurement agreement between the NDI-Plus and the AFMES cause of death groups were high. The NDI and NDI-Plus can be successfully used with the NPCR algorithm to identify mortality and cause of death among active duty military cohort members who die in the United States. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.; Kalantar, Daniel H.
2015-02-01
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high energy density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for the use in a production environment.
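Thin-plate-spline warping of this kind can be sketched with scipy's radial basis function interpolator, fitting one TPS per output coordinate from matched fiducial positions (for example, comb-pulse centroids and their ideal grid locations). The example below uses synthetic points; applying the fitted warp to an actual streak image would additionally require resampling, e.g. with scipy.ndimage.map_coordinates. It is a sketch of the technique, not the production algorithm.

```python
import numpy as np
from scipy.interpolate import Rbf

def fit_tps_warp(src_pts, dst_pts):
    """Fit a thin-plate-spline mapping from distorted to corrected coordinates.

    src_pts: N x 2 fiducial centroids measured on the streak image;
    dst_pts: N x 2 positions they should occupy on a uniform grid.
    Returns a function mapping (x, y) arrays to corrected (x, y).
    """
    fx = Rbf(src_pts[:, 0], src_pts[:, 1], dst_pts[:, 0], function="thin_plate")
    fy = Rbf(src_pts[:, 0], src_pts[:, 1], dst_pts[:, 1], function="thin_plate")
    return lambda x, y: (fx(x, y), fy(x, y))

# Toy example: a mild quadratic distortion recovered from 25 fiducials
gx, gy = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
dst = np.column_stack([gx.ravel(), gy.ravel()])
src = dst + 0.02 * np.column_stack([dst[:, 1] ** 2, dst[:, 0] ** 2])
warp = fit_tps_warp(src, dst)
print(np.allclose(np.column_stack(warp(src[:, 0], src[:, 1])), dst, atol=1e-6))
```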
NASA Astrophysics Data System (ADS)
Bostock, J.; Weller, P.; Cooklin, M.
2010-07-01
Automated diagnostic algorithms are used in implantable cardioverter-defibrillators (ICDs) to detect abnormal heart rhythms. Algorithms can misdiagnose, and improved specificity is needed to prevent inappropriate therapy. Knowledge engineering (KE) and artificial intelligence (AI) could improve this. A pilot study of KE was performed with an artificial neural network (ANN) as the AI system. A case note review analysed arrhythmic events stored in patients' ICD memories. 13.2% of patients received inappropriate therapy. The best ICD algorithm had sensitivity 1.00 and specificity 0.69 (p<0.001 versus the gold standard). A subset of data was used to train and test an ANN. A feed-forward, back-propagation network with 7 inputs, a 4-node hidden layer and 1 output had sensitivity 1.00 and specificity 0.71 (p<0.001). A prospective study was performed using KE to list arrhythmias, factors and indicators, for which measurable parameters were evaluated and the results reviewed by a domain expert. Waveforms from electrodes in the heart and thoracic bio-impedance, temperature and motion data were collected from 65 patients during cardiac electrophysiological studies. Five incomplete datasets were due to technical failures. We concluded that KE successfully guided parameter selection, that the ANN produced a usable system, and that complex data collection carries a greater risk of technical failure, leading to data loss.
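The 7-4-1 network described above is straightforward to reproduce in outline with scikit-learn. The sketch below trains such a network on synthetic stand-in features and labels (the real knowledge-engineered inputs and episode labels are not reproduced here), so the reported sensitivity and specificity are illustrative only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical feature matrix: 7 knowledge-engineered inputs per episode
rng = np.random.default_rng(0)
X = rng.random((200, 7))
y = (X[:, 0] + 0.5 * X[:, 3] > 0.9).astype(int)   # stand-in labels: 1 = treat

# One hidden layer of 4 nodes and a single output unit
net = MLPClassifier(hidden_layer_sizes=(4,), activation="logistic",
                    solver="lbfgs", max_iter=2000, random_state=0)
net.fit(X[:150], y[:150])

pred, truth = net.predict(X[150:]), y[150:]
sens = (pred[truth == 1] == 1).mean()
spec = (pred[truth == 0] == 0).mean()
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```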
TH-A-BRF-11: Image Intensity Non-Uniformities Between MRI Simulation and Diagnostic MRI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulson, E
2014-06-15
Purpose: MRI simulation for MRI-based radiotherapy demands that patients be set up in treatment position, which frequently involves use of alternative radiofrequency (RF) coil configurations to accommodate immobilized patients. However, alternative RF coil geometries may exacerbate image intensity non-uniformities (IINU) beyond those observed in diagnostic MRI, which may challenge image segmentation and registration accuracy as well as confound studies assessing radiotherapy response when MR simulation images are used as baselines for evaluation. The goal of this work was to determine whether differences in IINU exist between MR simulation and diagnostic MR images. Methods: ACR-MRI phantom images were acquired at 3T using a spin-echo sequence (TE/TR: 20/500 ms, rBW: 62.5 kHz, TH/skip: 5/5 mm). MR simulation images were obtained by wrapping two flexible phased-array RF coils around the phantom. Diagnostic MR images were obtained by placing the phantom into a commercial phased-array head coil. Pre-scan normalization was enabled in both cases. Images were transferred offline and corrected for IINU using the MNI N3 algorithm. Coefficients of variation (CV=σ/μ) were calculated for each slice. Wilcoxon matched-pairs and Mann-Whitney tests compared CV values between original and N3 images and between MR simulation and diagnostic MR images. Results: Significant differences in CV were detected between original and N3 images in both the MRI simulation and diagnostic MRI groups (p=0.010, p=0.010). In addition, significant differences in CV were detected between original MR simulation images and original and N3 diagnostic MR images (p=0.0256, p=0.0016). However, no significant differences in CV were detected between N3 MR simulation images and original or N3 diagnostic MR images, demonstrating the importance of correcting MR simulation images beyond pre-scan normalization prior to use in radiotherapy. Conclusions: Alternative RF coil configurations used in MRI simulation can result in significant IINU differences compared to diagnostic MR images. The MNI N3 algorithm reduced MR simulation IINU to levels observed in diagnostic MR images. Funding provided by Advancing a Healthier Wisconsin.
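The per-slice metric and the two tests used above are easy to reproduce. In the sketch below the CV values are made-up placeholders standing in for the phantom measurements; only the computations mirror the abstract (CV = sigma/mu per slice, Wilcoxon for paired original-vs-N3 comparisons, Mann-Whitney for unpaired setup comparisons).

```python
import numpy as np
from scipy import stats

def slice_cv(volume, mask=None):
    """Coefficient of variation (sigma/mu) for every slice of a 3-D image."""
    cvs = []
    for sl in volume:
        vals = sl[mask] if mask is not None else sl.ravel()
        cvs.append(vals.std() / vals.mean())
    return np.array(cvs)

# Hypothetical per-slice CV values before and after N3 correction
cv_sim_orig = np.array([0.12, 0.15, 0.14, 0.13, 0.16, 0.15, 0.14])
cv_sim_n3   = np.array([0.09, 0.10, 0.10, 0.09, 0.11, 0.10, 0.10])
cv_dx_orig  = np.array([0.08, 0.09, 0.10, 0.09, 0.10, 0.09, 0.09])

# Paired comparison: the same slices before vs. after correction
print(stats.wilcoxon(cv_sim_orig, cv_sim_n3))
# Unpaired comparison: MR-simulation coil setup vs. diagnostic head-coil setup
print(stats.mannwhitneyu(cv_sim_orig, cv_dx_orig, alternative="two-sided"))
```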
Intelligent approach to prognostic enhancements of diagnostic systems
NASA Astrophysics Data System (ADS)
Vachtsevanos, George; Wang, Peng; Khiripet, Noppadon; Thakker, Ash; Galie, Thomas R.
2001-07-01
This paper introduces a novel methodology to prognostics based on a dynamic wavelet neural network construct and notions from the virtual sensor area. This research has been motivated and supported by the U.S. Navy's active interest in integrating advanced diagnostic and prognostic algorithms in existing Naval digital control and monitoring systems. A rudimentary diagnostic platform is assumed to be available providing timely information about incipient or impending failure conditions. We focus on the development of a prognostic algorithm capable of predicting accurately and reliably the remaining useful lifetime of a failing machine or component. The prognostic module consists of a virtual sensor and a dynamic wavelet neural network as the predictor. The virtual sensor employs process data to map real measurements into difficult to monitor fault quantities. The prognosticator uses a dynamic wavelet neural network as a nonlinear predictor. Means to manage uncertainty and performance metrics are suggested for comparison purposes. An interface to an available shipboard Integrated Condition Assessment System is described and applications to shipboard equipment are discussed. Typical results from pump failures are presented to illustrate the effectiveness of the methodology.
Wilke, Russell A; Berg, Richard L; Peissig, Peggy; Kitchner, Terrie; Sijercic, Bozana; McCarty, Catherine A; McCarty, Daniel J
2007-03-01
Diabetes mellitus is a rapidly increasing and costly public health problem. Large studies are needed to understand the complex gene-environment interactions that lead to diabetes and its complications. The Marshfield Clinic Personalized Medicine Research Project (PMRP) represents one of the largest population-based DNA biobanks in the United States. As part of an effort to begin phenotyping common diseases within the PMRP, we now report on the construction of a diabetes case-finding algorithm using electronic medical record data from adult subjects aged ≥50 years living in one of the target PMRP ZIP codes. Based upon diabetic diagnostic codes alone, we observed a false positive case rate ranging from 3.0% (in subjects with the highest glycosylated hemoglobin values) to 44.4% (in subjects with the lowest glycosylated hemoglobin values). We therefore developed an improved case-finding algorithm that utilizes diabetic diagnostic codes in combination with clinical laboratory data and medication history. This algorithm yielded an estimated prevalence of 24.2% for diabetes mellitus in adult subjects aged ≥50 years.
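A hedged sketch of how a combined case-finding rule of this kind might look in code; the field names, thresholds, and medication list below are illustrative assumptions, not the PMRP algorithm's actual definitions.

```python
# Illustrative rule: flag a diabetes case only when diagnostic codes are
# corroborated by laboratory or medication evidence, in the spirit of the
# combined algorithm described above. All fields and cut-offs are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PatientRecord:
    diabetes_dx_codes: int = 0          # count of diabetes diagnostic codes
    max_hba1c: float = 0.0              # highest glycosylated hemoglobin (%)
    max_fasting_glucose: float = 0.0    # highest fasting glucose (mg/dL)
    meds: List[str] = field(default_factory=list)

DIABETES_MEDS = {"metformin", "glipizide", "insulin"}   # hypothetical list

def is_diabetes_case(p: PatientRecord) -> bool:
    code_evidence = p.diabetes_dx_codes >= 2
    lab_evidence = p.max_hba1c >= 6.5 or p.max_fasting_glucose >= 126
    med_evidence = any(m.lower() in DIABETES_MEDS for m in p.meds)
    # require a coded diagnosis plus at least one corroborating data source
    return code_evidence and (lab_evidence or med_evidence)

print(is_diabetes_case(PatientRecord(diabetes_dx_codes=3, max_hba1c=7.2)))  # True
print(is_diabetes_case(PatientRecord(diabetes_dx_codes=3)))                 # False
```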
Pilszyk, Anna; Silczuk, Andrzej; Habrat, Bogusław; Heitzman, Janusz
2018-02-28
Contemporary literature does not take a clear position on the issue of determining the civil and criminal liability of persons diagnosed with pathological gambling, especially in cases of possible comorbidity with, or interference from, other mental disorders. The diagnostic difficulties are demonstrated by the clinical picture of a patient with problem gambling who underwent forensic psychiatric assessments to evaluate her capacity to make informed (and independent) decisions in view of numerous concluded civil law (mainly financial) agreements. The patient had been examined 5 times by expert psychiatrists who, in 4 opinions, diagnosed her with bipolar affective disorder, including 1 diagnosis of rapid cycling of episodes. Based on the current state of scientific knowledge about the relationship between problem gambling and mood disorders, bipolar affective disorder was not confirmed. The diagnostic difficulties that emerged in the course of the case study, resulting both from diagnostic haziness and from unreliable information obtained during the patient interview, point to the need for a multi-dimensional clinical diagnosis of persons with suspected mood disorders and behavioral addictions.
Fischer, Bernard A
2012-12-01
The history of the Diagnostic and Statistical Manual of Mental Disorders (DSM) reflects the larger history of American psychiatry. As the field anticipates DSM-5, it is useful to take stock of this history and consider not only how diagnosis impacts our understanding of mental illness but also how contemporary thought influences diagnosis. Before the DSM, the field was disjointed. The publication of the first American diagnostic manual, the precursor of the DSM, mirrored society's interest in organized record keeping and prevention rather than treatment of mental illness. The first and second editions of DSM brought a common language to diagnosis and were largely the work of outpatient and academic psychiatrists rather than those based in large state hospitals. The third edition of the DSM saw the shift in American psychiatry's leadership from the eminent clinician to the researcher, whereas the fourth edition reflected the rise of "evidence-based medicine." DSM-5 will likewise represent the current status of the field-not only with regard to science but also reflecting the place of American psychiatry in medicine today.
McDonald, William C; Banerji, Nilanjana; McDonald, Kelsey N; Ho, Bridget; Macias, Virgilia; Kajdacsy-Balla, Andre
2017-01-01
Pituitary adenoma classification is complex, and diagnostic strategies vary greatly from laboratory to laboratory. No optimal diagnostic algorithm has been defined. The objective was to develop a panel of immunohistochemical (IHC) stains that provides the optimal combination of cost, accuracy, and ease of use. We examined 136 pituitary adenomas with stains for steroidogenic factor 1 (SF-1), Pit-1, anterior pituitary hormones, cytokeratin CAM5.2, and the α subunit of human chorionic gonadotropin. Immunohistochemical staining was scored using the Allred system. Adenomas were assigned to a gold-standard class based on IHC results and available clinical and serologic information. Correlation and cluster analyses were used to develop an algorithm for parsimoniously classifying adenomas. The algorithm entailed a 1- or 2-step process: (1) a screening step consisting of IHC stains for SF-1, Pit-1, and adrenocorticotropic hormone; and (2) when the screening IHC pattern and clinical history were not clearly gonadotrophic (SF-1 positive only), corticotrophic (adrenocorticotropic hormone positive only), or IHC null cell (negative screening IHC), we subsequently used IHC for prolactin, growth hormone, thyroid-stimulating hormone, and cytokeratin CAM5.2. Comparison between diagnoses generated by our algorithm and the gold-standard diagnoses showed excellent agreement. When compared with a commonly used panel using 6 IHC stains for anterior pituitary hormones plus IHC for a low-molecular-weight cytokeratin in certain tumors, our algorithm uses approximately one-third fewer IHC stains and detects gonadotroph adenomas with greater sensitivity.
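The two-step logic described above lends itself to a compact decision function. The sketch below is only a schematic reading of that workflow; the positivity criteria and output labels are assumptions rather than the authors' validated scoring rules.

```python
# Schematic two-step classification: screen with SF-1, Pit-1 and ACTH, and
# reflex to a second panel (PRL, GH, TSH, CAM5.2) only when the screen is
# ambiguous. Inputs are dicts of Allred-style scores; thresholds are invented.
def classify_adenoma(screen, reflex=None):
    sf1 = screen.get("SF1", 0) > 0
    pit1 = screen.get("PIT1", 0) > 0
    acth = screen.get("ACTH", 0) > 0

    if sf1 and not pit1 and not acth:
        return "gonadotroph adenoma"
    if acth and not sf1 and not pit1:
        return "corticotroph adenoma"
    if not (sf1 or pit1 or acth):
        return "IHC null-cell adenoma"

    # Step 2: ambiguous screening pattern, second panel required
    if reflex is None:
        return "ambiguous screen: run second-step panel (PRL, GH, TSH, CAM5.2)"
    if reflex.get("PRL", 0) > 0:
        return "lactotroph adenoma"
    if reflex.get("GH", 0) > 0:
        return "somatotroph adenoma"
    if reflex.get("TSH", 0) > 0:
        return "thyrotroph adenoma"
    return "plurihormonal / unclassified"

print(classify_adenoma({"SF1": 5, "PIT1": 0, "ACTH": 0}))             # gonadotroph
print(classify_adenoma({"SF1": 0, "PIT1": 6, "ACTH": 0}, {"GH": 4}))  # somatotroph
```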
NASA Astrophysics Data System (ADS)
Srinivasan, Yeshwanth; Hernes, Dana; Tulpule, Bhakti; Yang, Shuyu; Guo, Jiangling; Mitra, Sunanda; Yagneswaran, Sriraja; Nutter, Brian; Jeronimo, Jose; Phillips, Benny; Long, Rodney; Ferris, Daron
2005-04-01
Automated segmentation and classification of diagnostic markers in medical imagery are challenging tasks. Numerous algorithms for segmentation and classification based on statistical approaches of varying complexity are found in the literature. However, the design of an efficient and automated algorithm for precise classification of desired diagnostic markers is extremely image-specific. The National Library of Medicine (NLM), in collaboration with the National Cancer Institute (NCI), is creating an archive of 60,000 digitized color images of the uterine cervix. NLM is developing tools for the analysis and dissemination of these images over the Web for the study of visual features correlated with precancerous neoplasia and cancer. To enable indexing of images of the cervix, it is essential to develop algorithms for the segmentation of regions of interest, such as acetowhitened regions, and automatic identification and classification of regions exhibiting mosaicism and punctation. Success of such algorithms depends primarily on the selection of relevant features representing the region of interest. We present statistical classification and segmentation algorithms based on color and geometric features that yield excellent identification of the regions of interest. The distinct classification of the mosaic regions from the non-mosaic ones has been obtained by clustering multiple geometric and color features of the segmented sections using various morphological and statistical approaches. Such automated classification methodologies will facilitate content-based image retrieval from the digital archive of the uterine cervix and have the potential to support the development of an image-based screening tool for cervical cancer.
Automated System for Early Breast Cancer Detection in Mammograms
NASA Technical Reports Server (NTRS)
Bankman, Isaac N.; Kim, Dong W.; Christens-Barry, William A.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.
1993-01-01
The increasing demand on mammographic screening for early breast cancer detection, and the subtlety of early breast cancer signs on mammograms, suggest an automated image processing system that can serve as a diagnostic aid in radiology clinics. We present a fully automated algorithm for detecting clusters of microcalcifications that are the most common signs of early, potentially curable breast cancer. By using the contour map of the mammogram, the algorithm circumvents some of the difficulties encountered with standard image processing methods. The clinical implementation of an automated instrument based on this algorithm is also discussed.
Singal, Amit G.; Mukherjee, Ashin; Elmunzer, B. Joseph; Higgins, Peter DR; Lok, Anna S.; Zhu, Ji; Marrero, Jorge A; Waljee, Akbar K
2015-01-01
Background Predictive models for hepatocellular carcinoma (HCC) have been limited by modest accuracy and lack of validation. Machine learning algorithms offer a novel methodology, which may improve HCC risk prognostication among patients with cirrhosis. Our study's aim was to develop and compare predictive models for HCC development among cirrhotic patients, using conventional regression analysis and machine learning algorithms. Methods We enrolled 442 patients with Child A or B cirrhosis at the University of Michigan between January 2004 and September 2006 (UM cohort) and prospectively followed them until HCC development, liver transplantation, death, or study termination. Regression analysis and machine learning algorithms were used to construct predictive models for HCC development, which were tested on an independent validation cohort from the Hepatitis C Antiviral Long-term Treatment against Cirrhosis (HALT-C) Trial. Both models were also compared to the previously published HALT-C model. Discrimination was assessed using receiver operating characteristic curve analysis and diagnostic accuracy was assessed with net reclassification improvement and integrated discrimination improvement statistics. Results After a median follow-up of 3.5 years, 41 patients developed HCC. The UM regression model had a c-statistic of 0.61 (95%CI 0.56-0.67), whereas the machine learning algorithm had a c-statistic of 0.64 (95%CI 0.60–0.69) in the validation cohort. The machine learning algorithm had significantly better diagnostic accuracy as assessed by net reclassification improvement (p<0.001) and integrated discrimination improvement (p=0.04). The HALT-C model had a c-statistic of 0.60 (95%CI 0.50-0.70) in the validation cohort and was outperformed by the machine learning algorithm (p=0.047). Conclusion Machine learning algorithms improve the accuracy of risk stratifying patients with cirrhosis and can be used to accurately identify patients at high-risk for developing HCC. PMID:24169273
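For readers unfamiliar with the c-statistic comparison used here, the following sketch contrasts a conventional regression model with a machine learning model on synthetic, imbalanced data using ROC AUC; the dataset, feature count, and model choices are placeholders, not the UM or HALT-C cohorts or the authors' algorithms.

```python
# Hedged sketch: compare c-statistics (ROC AUC) of a logistic regression and a
# gradient-boosting model on a rare-event outcome, analogous to the comparison
# of regression vs. machine learning models described above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, weights=[0.9, 0.1],
                           random_state=0)          # rare outcome, like incident HCC
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
gb = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

print("regression c-statistic:      ", roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1]))
print("machine-learning c-statistic:", roc_auc_score(y_te, gb.predict_proba(X_te)[:, 1]))
```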
[The factors affecting the results of mechanical jaundice management].
Malkov, I S; Shaimardanov, R Sh; Korobkov, V N; Filippov, V A; Khisamiev, I G
To improve the results of obstructive jaundice management by rational diagnostic and treatment strategies. Outcomes of 820 patients with obstructive jaundice syndrome were analyzed. Diagnostic and tactical mistakes were made at the pre-hospital stage in 143 (17.4%) patients and at the hospital stage in 105 (12.8%). In 53 (6.5%) cases, errors were observed at both stages. Retrospective analysis of severe postoperative complications and lethal outcomes in patients with obstructive jaundice showed that in 23.8% of cases they were explained by diagnostic and tactical mistakes at various stages of examination and treatment. We developed an algorithm for obstructive jaundice management to reduce the frequency of diagnostic and tactical errors. Its application reduced the rate of postoperative complications to 16.5% and the mortality rate to 3.0%.
The thinking doctor: clinical decision making in contemporary medicine.
Trimble, Michael; Hamilton, Paul
2016-08-01
Diagnostic errors are responsible for a significant number of adverse events. Logical reasoning and good decision-making skills are key factors in reducing such errors, but little emphasis has traditionally been placed on how these thought processes occur, and how errors could be minimised. In this article, we explore key cognitive ideas that underpin clinical decision making and suggest that by employing some simple strategies, physicians might be better able to understand how they make decisions and how the process might be optimised. © 2016 Royal College of Physicians.
Trauma-induced dissociative amnesia in World War I combat soldiers.
van der Hart, O; Brown, P; Graafland, M
1999-02-01
This study relates trauma-induced dissociative amnesia reported in World War I (WW I) studies of war trauma to contemporary findings of dissociative amnesia in victims of childhood sexual abuse. Key diagnostic studies of post-traumatic amnesia in WW I combatants are surveyed. These cover phenomenology and the psychological dynamics of dissociation vis-à-vis repression. Descriptive evidence is cited for war trauma-induced dissociative amnesia. Posttraumatic amnesia extends beyond the experience of sexual and combat trauma and is a protean symptom, which reflects responses to the gamut of traumatic events.
Bigler, E D
1999-08-01
Contemporary neuroimaging techniques in child traumatic brain injury are reviewed, with an emphasis on computerized tomography (CT) and magnetic resonance (MR) imaging. A brief overview of MR spectroscopy (MRS), functional MR imaging (fMRI), single-photon emission computed tomography (SPECT), and magnetoencephalography (MEG) is also provided because these techniques will likely constitute important neuroimaging techniques of the future. Numerous figures are provided to illustrate the multifaceted manner in which traumatic deficits can be imaged and the role of neuroimaging information as it relates to TBI outcome.
Ideal Positions: 3D Sonography, Medical Visuality, Popular Culture.
Seiber, Tim
2016-03-01
As digital technologies are integrated into medical environments, they continue to transform the experience of contemporary health care. Importantly, medicine is increasingly visual. In the history of sonography, visibility has played an important role in accessing fetal bodies for diagnostic and entertainment purposes. With the advent of three-dimensional (3D) rendering, sonography presents the fetus visually as already a child. The aesthetics of this process and the resulting imagery, made possible in digital networks, discloses important changes in the relationship between technology and biology, reproductive health and political debates, and biotechnology and culture.
A Stochastic-Variational Model for Soft Mumford-Shah Segmentation
2006-01-01
In contemporary image and vision analysis, stochastic approaches demonstrate great flexibility in representing and modeling complex phenomena, while variational-PDE methods gain enormous computational advantages over Monte Carlo or other stochastic algorithms. In combination, the two can lead to much more powerful novel models and efficient algorithms. In the current work, we propose a stochastic-variational model for soft (or fuzzy) Mumford-Shah segmentation of mixture image patterns. Unlike the classical hard Mumford-Shah segmentation, the new model allows each pixel to belong to each image pattern with some probability. Soft segmentation could lead to hard segmentation, and hence is more general. The modeling procedure, mathematical analysis on the existence of optimal solutions, and computational implementation of the new model are explored in detail, and numerical examples of both synthetic and natural images are presented. PMID:23165059
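A toy illustration of soft segmentation in the same spirit (emphatically not the authors' stochastic-variational model): each pixel receives a membership probability for every class, the membership maps are smoothed for crude spatial regularity, and class means are re-estimated, which is the sense in which soft segmentation generalizes a hard labeling.

```python
# Toy soft (probabilistic) two-class segmentation loop on a synthetic image.
# The softmax temperature, smoothing sigma, and iteration count are arbitrary.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
img = np.full((64, 64), 0.2)
img[16:48, 16:48] = 0.8
img += rng.normal(0, 0.1, img.shape)          # noisy two-region test image

c = np.array([0.0, 1.0])                      # initial class means
temperature = 0.05                            # softness of the memberships
for _ in range(20):
    d = (img[..., None] - c) ** 2             # squared data-fit term per class
    p = np.exp(-d / temperature)
    p /= p.sum(axis=-1, keepdims=True)        # soft membership probabilities
    p = gaussian_filter(p, sigma=(1.5, 1.5, 0))   # crude spatial regularization
    p /= p.sum(axis=-1, keepdims=True)
    c = (p * img[..., None]).sum(axis=(0, 1)) / p.sum(axis=(0, 1))

print("recovered class means:", c)            # roughly [0.2, 0.8]
print("hard labels from soft memberships:", np.unique(p.argmax(axis=-1)))
```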
Flight experience with flight control redundancy management
NASA Technical Reports Server (NTRS)
Szalai, K. J.; Larson, R. R.; Glover, R. D.
1980-01-01
Flight experience with both current and advanced redundancy management schemes was gained in recent flight research programs using the F-8 digital fly-by-wire aircraft. The flight performance of fault detection, isolation, and reconfiguration (FDIR) methods for sensors, computers, and actuators is reviewed. Results of induced failures as well as of actual random failures are discussed. Deficiencies in modeling and implementation techniques are also discussed. The paper also presents a comparison of multisensor tracking in smooth air, in turbulence, during large maneuvers, and during maneuvers typical of those of large commercial transport aircraft. The results of flight tests of an advanced analytic redundancy management algorithm are compared with the performance of a contemporary algorithm in terms of time to detection, false alarms, and missed alarms. The performance of computer redundancy management in both iron bird and flight tests is also presented.
Spectral analysis of major heart tones
NASA Astrophysics Data System (ADS)
Lejkowski, W.; Dobrowolski, A. P.; Majka, K.; Olszewski, R.
2018-04-01
The World Health Organization (WHO) figures clearly indicate that cardiovascular disease is the most common cause of death and disability in the world. Early detection of cardiovascular pathologies may contribute to reducing such a high mortality rate. Auscultatory examination is one of the first and most important steps in cardiologic diagnostics. Unfortunately, proper diagnosis is closely related to long-term practice and medical experience. The article presents the authors' system for recording phonocardiograms and the way of saving the data, as well as an outline of the analysis algorithm, which will allow a case to be assigned, with high probability, to either a patient with heart failure or a healthy volunteer. The results of a pilot study of phonocardiographic signals are also presented as an introduction to further research aimed at the development of an efficient diagnostic algorithm based on spectral analysis of the heart tones.
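As a minimal example of the kind of spectral analysis referred to here, the sketch below estimates the power spectrum of a synthetic phonocardiogram-like signal with Welch's method; the signal model (a low-frequency tone gated once per beat plus noise) is invented purely for illustration.

```python
# Hedged sketch: Welch power-spectral-density estimate of a synthetic
# heart-sound-like recording; not the authors' recording system or algorithm.
import numpy as np
from scipy.signal import welch

fs = 2000                                      # sampling rate in Hz (assumed)
t = np.arange(0, 5, 1 / fs)
# crude S1-like tone: 40 Hz oscillation amplitude-modulated once per beat (~1.2 Hz)
envelope = np.exp(-((t % (1 / 1.2)) / 0.05) ** 2)
pcg = np.sin(2 * np.pi * 40 * t) * envelope
pcg += 0.05 * np.random.default_rng(0).normal(size=t.size)   # background noise

f, pxx = welch(pcg, fs=fs, nperseg=1024)
print("dominant frequency (Hz):", f[np.argmax(pxx)])          # should lie near 40 Hz
```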
Demirci, Oguz; Clark, Vincent P; Calhoun, Vince D
2008-02-15
Schizophrenia is diagnosed based largely upon behavioral symptoms. Currently, no quantitative, biologically based diagnostic technique has yet been developed to identify patients with schizophrenia. Classification of individuals into schizophrenia-patient and healthy-control groups based on quantitative, biologically based data is of great interest to support and refine psychiatric diagnoses. We applied a novel projection pursuit technique to various components obtained with independent component analysis (ICA) of 70 subjects' fMRI activation maps acquired during an auditory oddball task. The validity of the technique was tested with a leave-one-out method, and the detection performance varied between 80% and 90%. The findings suggest that the proposed data reduction algorithm is effective in classifying individuals into schizophrenia and healthy control groups and may eventually prove useful as a diagnostic tool.
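The leave-one-out validation scheme mentioned above can be sketched on synthetic data as follows, with ICA-based dimensionality reduction feeding a linear classifier; the classifier and data are stand-ins and do not reproduce the authors' projection pursuit technique.

```python
# Hedged sketch: ICA feature reduction followed by leave-one-out accuracy
# estimation, the validation idea described in the abstract. All data synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import FastICA
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=70, n_features=200, n_informative=10,
                           random_state=0)     # 70 "subjects", voxel-like features
clf = make_pipeline(FastICA(n_components=10, random_state=0, max_iter=1000),
                    SVC(kernel="linear"))      # linear classifier as a stand-in
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print("leave-one-out accuracy:", scores.mean())
```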
Mexican consensus on lysosomal acid lipase deficiency diagnosis.
Vázquez-Frias, R; García-Ortiz, J E; Valencia-Mayoral, P F; Castro-Narro, G E; Medina-Bravo, P G; Santillán-Hernández, Y; Flores-Calderón, J; Mehta, R; Arellano-Valdés, C A; Carbajal-Rodríguez, L; Navarrete-Martínez, J I; Urbán-Reyes, M L; Valadez-Reyes, M T; Zárate-Mondragón, F; Consuelo-Sánchez, A
Lysosomal acid lipase deficiency (LAL-D) causes progressive cholesteryl ester and triglyceride accumulation in the lysosomes of hepatocytes and monocyte-macrophage system cells, resulting in a systemic disease with various manifestations that may go unnoticed. It is indispensable to recognize the deficiency, which can present in patients at any age, so that specific treatment can be given. The aim of the present review was to offer a guide for physicians in understanding the fundamental diagnostic aspects of LAL-D, to successfully aid in its identification. The review was designed by a group of Mexican experts and is presented as an orienting algorithm for the pediatrician, internist, gastroenterologist, endocrinologist, geneticist, pathologist, radiologist, and other specialists that could come across this disease in their patients. An up-to-date review of the literature in relation to the clinical manifestations of LAL-D and its diagnosis was performed. The statements were formulated based on said review and were then voted upon. The structured quantitative method employed for reaching consensus was the nominal group technique. A practical algorithm of the diagnostic process in LAL-D patients was proposed, based on clinical and laboratory data indicative of the disease and in accordance with the consensus established for each recommendation. The algorithm provides a sequence of clinical actions from different studies for optimizing the diagnostic process of patients suspected of having LAL-D. Copyright © 2017 Asociación Mexicana de Gastroenterología. Publicado por Masson Doyma México S.A. All rights reserved.
Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview
Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee
2013-01-01
Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and the conduct of clinical studies. Several reliability studies have been conducted in Western medicine. The investigation of the reliability of traditional Chinese, Japanese, and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are at a preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implications for practice, education, and training. An introduction to reliability estimates and to different study designs and statistical analyses is given for future studies in Ayurveda. PMID:23930037
Gifford, Grace K; Gill, Anthony J; Stevenson, William S
2016-01-01
Molecular classification of diffuse large B-cell lymphoma (DLBCL) is critical. Numerous methodologies have demonstrated that DLBCL is biologically heterogeneous despite morphological similarities. This underlies the disparate outcomes of treatment response or failure in this common non-Hodgkin lymphoma. This review will summarise historical approaches to lymphoma classifications, current diagnosis of DLBCL, molecular techniques that have primarily been used in the research setting to distinguish and subclassify DLBCL, evaluate contemporary diagnostic methodologies that seek to translate lymphoma biology into clinical practice, and introduce novel diagnostic platforms that may overcome current issues. The review concludes with an overview of key molecular lesions currently identified in DLBCL, all of which are potential targets for drug treatments that may improve survival and cure. Copyright © 2015 The Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.
Proposed Morphologic Classification of Prostate Cancer With Neuroendocrine Differentiation
Epstein, Jonathan I.; Amin, Mahul B.; Beltran, Himisha; Lotan, Tamara L.; Mosquera, Juan-Miguel; Reuter, Victor E.; Robinson, Brian D.; Troncoso, Patricia; Rubin, Mark A.
2014-01-01
On July 31, 2013, the Prostate Cancer Foundation assembled a working committee on the molecular biology and pathologic classification of neuroendocrine differentiation in prostate cancer. The committee consisted of genitourinary oncologists, urologists, urological surgical pathologists, basic scientists, and translational researchers, with expertise in this field. It was concluded that the proceedings of the meeting should be reported in 2 manuscripts appealing to different target audiences, one to focus on surgical pathology and the other to review the molecular aspects of this disease. New clinical and molecular data emerging from prostate cancers treated by contemporary androgen deprivation therapies, as well as primary lesions, have highlighted the need for refinement of diagnostic terminology to encompass the full spectrum of neuroendocrine differentiation. It is envisioned that specific criteria associated with the refined diagnostic terminology will lead to clinically relevant pathologic diagnoses that will stimulate further clinical and molecular investigation and identification of appropriate targeted therapies. PMID:24705311
Trauma-induced dissociative amnesia in World War I combat soldiers. II. Treatment dimensions.
Brown, P; van der Hart, O; Graafland, M
1999-06-01
This is the second part of a study of posttraumatic amnesia in World War I (WW I) soldiers. It moves beyond the diagnostic validation of posttraumatic amnesia (PTA) to examine treatment findings and relates these to the contemporary treatment of dissociative amnesia, including the treatment of victims of civilian trauma (e.g. childhood sexual abuse). Key WW I studies are surveyed which focus on the treatment of PTA and traumatic memories. The dissociation-integration and repression-abreaction models are contrasted. Descriptive evidence is cited in support of preferring Myers' and McDougall's dissociation-integration treatment approach over Brown's repression-abreaction model. The therapeutic findings in this paper complement the diagnostic data from the first report. Although effective treatment includes elements of both the dissociative-integrative and abreactive treatment approaches, cognitive integration of dissociated traumatic memories and personality functions is primary, while emotional release is secondary.
Contemporary Management of Benign and Malignant Parotid Tumors.
Thielker, Jovanna; Grosheva, Maria; Ihrler, Stephan; Wittig, Andrea; Guntinas-Lichius, Orlando
2018-01-01
To report the standard of care, interesting new findings, and controversies in the treatment of parotid tumors. Relevant and current studies were searched in PubMed and reviewed for diagnostics, treatment, and outcome of both benign and malignant tumors. Prospective trials are lacking due to the rarity of the disease and the high variety of tumor subtypes. The establishment of reliable non-invasive diagnostic tools for the differentiation between benign and malignant tumors is desirable. Prospective studies clarifying the association between different surgical techniques for benign parotid tumors and morbidity are needed. The role of adjuvant or definitive radiotherapy in securing loco-regional control and improving survival in malignant disease is established. Prospective clinical trials addressing the role of chemotherapy/molecular targeted therapy for parotid cancer are needed. An international consensus on the classification of parotid surgery techniques would facilitate the comparison of different trials. Such efforts should lead to a clinical guideline.
Ramani, Sathish; Liu, Zhihao; Rosen, Jeffrey; Nielsen, Jon-Fredrik; Fessler, Jeffrey A.
2012-01-01
Regularized iterative reconstruction algorithms for imaging inverse problems require selection of appropriate regularization parameter values. We focus on the challenging problem of tuning regularization parameters for nonlinear algorithms for the case of additive (possibly complex) Gaussian noise. Generalized cross-validation (GCV) and (weighted) mean-squared error (MSE) approaches (based on Stein's Unbiased Risk Estimate— SURE) need the Jacobian matrix of the nonlinear reconstruction operator (representative of the iterative algorithm) with respect to the data. We derive the desired Jacobian matrix for two types of nonlinear iterative algorithms: a fast variant of the standard iterative reweighted least-squares method and the contemporary split-Bregman algorithm, both of which can accommodate a wide variety of analysis- and synthesis-type regularizers. The proposed approach iteratively computes two weighted SURE-type measures: Predicted-SURE and Projected-SURE (that require knowledge of noise variance σ2), and GCV (that does not need σ2) for these algorithms. We apply the methods to image restoration and to magnetic resonance image (MRI) reconstruction using total variation (TV) and an analysis-type ℓ1-regularization. We demonstrate through simulations and experiments with real data that minimizing Predicted-SURE and Projected-SURE consistently lead to near-MSE-optimal reconstructions. We also observed that minimizing GCV yields reconstruction results that are near-MSE-optimal for image restoration and slightly sub-optimal for MRI. Theoretical derivations in this work related to Jacobian matrix evaluations can be extended, in principle, to other types of regularizers and reconstruction algorithms. PMID:22531764
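The abstract addresses SURE and GCV selection for nonlinear iterative reconstruction algorithms. As a much simpler point of reference, the sketch below applies the same parameter-selection principle in the closed-form linear (Tikhonov/ridge) setting, where the hat matrix and the GCV criterion are explicit; the operator and data are random placeholders, and this is not the authors' Jacobian-based method.

```python
# Hedged sketch: GCV-based selection of a regularization parameter for ridge
# (Tikhonov) regression, GCV(l) = (||y - H_l y||^2 / n) / (1 - tr(H_l)/n)^2.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 50
A = rng.normal(size=(n, p))                 # placeholder forward operator
x_true = rng.normal(size=p)
y = A @ x_true + 0.5 * rng.normal(size=n)   # noisy measurements

def gcv(lmbda):
    # hat matrix H = A (A^T A + lambda I)^{-1} A^T for the ridge solution
    H = A @ np.linalg.solve(A.T @ A + lmbda * np.eye(p), A.T)
    resid = y - H @ y
    return (resid @ resid / n) / (1 - np.trace(H) / n) ** 2

lambdas = np.logspace(-3, 2, 30)
best = lambdas[np.argmin([gcv(l) for l in lambdas])]
print("GCV-selected regularization parameter:", best)
```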
Integrated Building Management System (IBMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anita Lewis
This project provides a combination of software and services that more easily and cost-effectively help to achieve optimized building performance and energy efficiency. Featuring an open-platform, cloud-hosted application suite and an intuitive user experience, this solution simplifies a traditionally very complex process by collecting data from disparate building systems and creating a single, integrated view of building and system performance. The Fault Detection and Diagnostics algorithms developed within the IBMS have been designed and tested as an integrated component of the control algorithms running the equipment being monitored. The algorithms identify the normal control behaviors of the equipment without interfering with the equipment control sequences. The algorithms also work without interfering with any cooperative control sequences operating between different pieces of equipment or building systems. In this manner the FDD algorithms create an integrated building management system.
Data Processing Algorithm for Diagnostics of Combustion Using Diode Laser Absorption Spectrometry.
Mironenko, Vladimir R; Kuritsyn, Yuril A; Liger, Vladimir V; Bolshov, Mikhail A
2018-02-01
A new algorithm for the evaluation of the integral line intensity, for inferring the correct value of the temperature of a hot zone in the diagnostics of combustion by absorption spectroscopy with diode lasers, is proposed. The algorithm is based not on fitting of the baseline (BL) but on the expansion of the experimental and simulated spectra in a series of orthogonal polynomials, subtraction of the first three components of the expansion from both the experimental and simulated spectra, and fitting of the spectra thus modified. The algorithm is tested in a numerical experiment by simulating the absorption spectra using a spectroscopic database, with the addition of white noise and a parabolic BL. The absorption spectra constructed in this way are treated as experimental data in further calculations. The theoretical absorption spectra were simulated with parameters (temperature, total pressure, concentration of water vapor) close to those used for simulation of the experimental data. Then, both spectra were expanded in a series of orthogonal polynomials and the first components were subtracted from both. The correct integral line intensities, and hence the correct temperature evaluation, were obtained by fitting the thus modified experimental and simulated spectra. The dependence of the mean and standard deviation of the integral line intensity estimate on the linewidth and on the number of subtracted components (the first two or three) was examined. The proposed algorithm provides a correct estimation of temperature with a standard deviation better than 60 K (for T = 1000 K) for line half-widths up to 0.6 cm-1. The proposed algorithm allows the parameters of a hot zone to be obtained without fitting the usually unknown BL.
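A schematic rendering of the core manipulation, under invented line-shape and baseline parameters: project the spectrum onto low-order orthogonal (Legendre) polynomials, subtract that projection from both the measured and the modeled spectrum, and compare what remains, so that a smooth unknown baseline largely cancels. This is a sketch of the idea, not the authors' implementation.

```python
# Hedged sketch: baseline-insensitive comparison by removing the low-order
# Legendre content from both a "measured" and a model spectrum. All parameters
# (line shape, baseline, noise) are invented for illustration.
import numpy as np
from numpy.polynomial import legendre

x = np.linspace(-1, 1, 400)                       # normalized frequency axis
line = np.exp(-0.5 * ((x - 0.1) / 0.08) ** 2)     # stand-in absorption line
baseline = 0.3 + 0.5 * x + 0.4 * x ** 2           # unknown parabolic baseline
measured = line + baseline + 0.01 * np.random.default_rng(0).normal(size=x.size)

def remove_low_order(y, n_terms=3):
    """Subtract the least-squares fit of the first n_terms Legendre polynomials."""
    coef = legendre.legfit(x, y, deg=n_terms - 1)
    return y - legendre.legval(x, coef)

resid_meas = remove_low_order(measured)
resid_model = remove_low_order(line)              # same operation on the model
scale = resid_meas @ resid_model / (resid_model @ resid_model)
print("recovered line-intensity scale factor:", scale)   # close to 1.0
```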
Performance of a Novel Algorithm Using Automated Digital Microscopy for Diagnosing Tuberculosis.
Ismail, Nazir A; Omar, Shaheed V; Lewis, James J; Dowdy, David W; Dreyer, Andries W; van der Meulen, Hermina; Nconjana, George; Clark, David A; Churchyard, Gavin J
2015-06-15
TBDx automated microscopy is a novel technology that processes digital microscopic images to identify acid-fast bacilli (AFB). Use of TBDx as part of a diagnostic algorithm could improve the diagnosis of tuberculosis (TB), but its performance characteristics have not yet been formally tested. To evaluate the performance of the TBDx automated microscopy system in algorithms for the diagnosis of TB, prospective samples from patients with presumed TB were processed in parallel with conventional smear microscopy, TBDx microscopy, and liquid culture. All TBDx-positive specimens were also tested with the Xpert MTB/RIF (GXP) assay. We evaluated the sensitivity and specificity of two algorithms, (1) TBDx-GXP (TBDx with positive specimens tested by Xpert MTB/RIF) and (2) TBDx alone, against the gold standard of liquid media culture. Of 1,210 samples, 1,009 were eligible for evaluation, of which 109 were culture positive for Mycobacterium tuberculosis. The TBDx system identified 70 specimens (68 culture positive) as having 10 or more putative AFB (high positive) and 207 (19 culture positive) as having 1-9 putative AFB (low positive). An algorithm in which "low-positive" results on TBDx were confirmed by GXP had 78% sensitivity (85 of 109) and 99.8% specificity (889 of 900), requiring 21% (207 of 1,009) of specimens to be processed by GXP. As a stand-alone test, a "high-positive" result on TBDx had 62% sensitivity and 99.7% specificity. TBDx used in diagnostic algorithms with GXP provided reasonable sensitivity and high specificity for active TB while dramatically reducing the number of GXP tests performed. As a stand-alone microscopy system, its performance was equivalent to that of a highly experienced TB microscopist.
Iyatomi, Hitoshi; Oka, Hiroshi; Saito, Masataka; Miyake, Ayako; Kimoto, Masayuki; Yamagami, Jun; Kobayashi, Seiichiro; Tanikawa, Akiko; Hagiwara, Masafumi; Ogawa, Koichi; Argenziano, Giuseppe; Soyer, H Peter; Tanaka, Masaru
2006-04-01
The aims of this study were to provide a quantitative assessment of the tumour area extracted by dermatologists and to evaluate computer-based methods from dermoscopy images for refining a computer-based melanoma diagnostic system. Dermoscopic images of 188 Clark naevi, 56 Reed naevi and 75 melanomas were examined. Five dermatologists manually drew the border of each lesion with a tablet computer. The inter-observer variability was evaluated and the standard tumour area (STA) for each dermoscopy image was defined. Manual extractions by 10 non-medical individuals and by two computer-based methods were evaluated with STA-based assessment criteria: precision and recall. Our new computer-based method introduced the region-growing approach in order to yield results close to those obtained by dermatologists. The effectiveness of our extraction method with regard to diagnostic accuracy was evaluated. Two linear classifiers were built using the results of conventional and new computer-based tumour area extraction methods. The final diagnostic accuracy was evaluated by drawing the receiver operating curve (ROC) of each classifier, and the area under each ROC was evaluated. The standard deviations of the tumour area extracted by five dermatologists and 10 non-medical individuals were 8.9% and 10.7%, respectively. After assessment of the extraction results by dermatologists, the STA was defined as the area that was selected by more than two dermatologists. Dermatologists selected the melanoma area with statistically smaller divergence than that of Clark naevus or Reed naevus (P = 0.05). By contrast, non-medical individuals did not show this difference. Our new computer-based extraction algorithm showed superior performance (precision, 94.1%; recall, 95.3%) to the conventional thresholding method (precision, 99.5%; recall, 87.6%). These results indicate that our new algorithm extracted a tumour area close to that obtained by dermatologists and, in particular, the border part of the tumour was adequately extracted. With this refinement, the area under the ROC increased from 0.795 to 0.875 and the diagnostic accuracy showed an increase of approximately 20% in specificity when the sensitivity was 80%. It can be concluded that our computer-based tumour extraction algorithm extracted almost the same area as that obtained by dermatologists and provided improved computer-based diagnostic accuracy.
Schellhaas, Barbara; Görtz, Ruediger S; Pfeifer, Lukas; Kielisch, Christian; Neurath, Markus F; Strobel, Deike
2017-09-01
A comparison is made of two contrast-enhanced ultrasound (CEUS) algorithms for the diagnosis of hepatocellular carcinoma (HCC) in high-risk patients: Erlanger Synopsis of Contrast-enhanced Ultrasound for Liver lesion Assessment in Patients at Risk (ESCULAP) and American College of Radiology Contrast-Enhanced Ultrasound-Liver Imaging Reporting and Data System (ACR-CEUS-LI-RADSv.2016). Focal liver lesions in 100 high-risk patients were assessed using both CEUS algorithms (ESCULAP and CEUS-LI-RADSv.2016) for a direct comparison. Lesions were categorized according to size and contrast enhancement in the arterial, portal venous and late phases. For the definite diagnosis of HCC, categories ESCULAP-4, ESCULAP-Tr and ESCULAP-V and CEUS-LI-RADS-LR-5, LR-Tr and LR-5-V were compared. In addition, CEUS-LI-RADS-category LR-M (definitely/probably malignant, but not specific for HCC) and ESCULAP-category C [intrahepatic cholangiocellular carcinoma (ICC)] were compared. Histology, CE-computed tomography and CE-MRI served as reference standards. The reference standard among 100 lesions included 87 HCCs, six ICCs and seven non-HCC-non-ICC lesions. For the diagnosis of HCC, the diagnostic accuracy of CEUS was significantly higher with ESCULAP versus CEUS-LI-RADS (94.3%/72.4%; p<0.01). Sensitivity, specificity, positive predictive value (PPV) and negative predictive value for ESCULAP/CEUS-LI-RADS were 94.3%/72.4%; 61.5%/69.2%; 94.3%/94%; and 61.5%/27.3%, respectively. The diagnostic accuracy for ICC (LR-M/ESCULAP-C) was identical with both algorithms (50%), with a higher PPV for ESCULAP-C versus LR-M (75 vs. 50%). CEUS-based algorithms contribute toward standardized assessment and reporting of HCC-suspect lesions in high-risk patients. ESCULAP shows significantly higher diagnostic accuracy, sensitivity and negative predictive value, with no loss of specificity, compared with CEUS-LI-RADS. Both algorithms have an excellent PPV. Arterial hyperenhancement is the key feature for the diagnosis of HCC with CEUS. Washout should not be a necessary prerequisite for the diagnosis of definite HCC. CEUS-LI-RADS in its current version is inferior to ESCULAP for the noninvasive diagnosis of HCC. There are two ways to improve CEUS-LI-RADS: firstly, combination of the categories LR-4 and LR-5 for the diagnosis of definite HCC, and secondly, use of subtotal infiltration of a liver lobe as an additional feature.
Lützen, Ulf; Naumann, Carsten Maik; Marx, Marlies; Zhao, Yi; Jüptner, Michael; Baumann, René; Papp, László; Zsótér, Norbert; Aksenov, Alexey; Jünemann, Klaus-Peter; Zuhayra, Maaz
2016-09-07
Because of the increasing importance of computer-assisted post-processing of image data in modern medical diagnostics, we studied the value of an algorithm for the assessment of single photon emission computed tomography/computed tomography (SPECT/CT) data, which was used for the first time for lymph node staging in penile cancer with non-palpable inguinal lymph nodes. In the guidelines of the relevant international expert societies, sentinel lymph node biopsy (SLNB) is recommended as the diagnostic method of choice. The aim of this study is to evaluate the value of the afore-mentioned algorithm and, in the clinical context, the reliability and the associated morbidity of this procedure. Between 2008 and 2015, 25 patients with invasive penile cancer and inconspicuous inguinal lymph node status underwent SLNB after application of the radiotracer Tc-99m-labelled nanocolloid. We recorded, in a prospective approach, the reliability and the complication rate of the procedure. In addition, we evaluated the results of an algorithm for SPECT/CT data assessment in these patients. SLNB was carried out in 44 groins of 25 patients. In three patients, inguinal lymph node metastases were detected via SLNB. In one patient, bilateral lymph node recurrence of the groins occurred after negative SLNB. The false-negative rate was 4% in relation to the number of patients (1/25) and 4.5% in relation to the number of groins (2/44). Morbidity was 4% in relation to the number of patients (1/25) and 2.3% in relation to the number of groins (1/44). The results of computer-assisted assessment of SPECT/CT data for sentinel lymph node (SLN) diagnostics demonstrated a high sensitivity of 88.8% and specificity of 86.7%. SLNB is a very reliable method associated with low morbidity. Computer-assisted assessment of SPECT/CT data for SLN diagnostics shows high sensitivity and specificity. While it cannot replace assessment by medical experts, it can still provide substantial supplement and assistance.
McInnes, Matthew D F; Moher, David; Thombs, Brett D; McGrath, Trevor A; Bossuyt, Patrick M; Clifford, Tammy; Cohen, Jérémie F; Deeks, Jonathan J; Gatsonis, Constantine; Hooft, Lotty; Hunt, Harriet A; Hyde, Christopher J; Korevaar, Daniël A; Leeflang, Mariska M G; Macaskill, Petra; Reitsma, Johannes B; Rodin, Rachel; Rutjes, Anne W S; Salameh, Jean-Paul; Stevens, Adrienne; Takwoingi, Yemisi; Tonelli, Marcello; Weeks, Laura; Whiting, Penny; Willis, Brian H
2018-01-23
Systematic reviews of diagnostic test accuracy synthesize data from primary diagnostic studies that have evaluated the accuracy of 1 or more index tests against a reference standard, provide estimates of test performance, allow comparisons of the accuracy of different tests, and facilitate the identification of sources of variability in test accuracy. To develop the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) diagnostic test accuracy guideline as a stand-alone extension of the PRISMA statement. Modifications to the PRISMA statement reflect the specific requirements for reporting of systematic reviews and meta-analyses of diagnostic test accuracy studies and the abstracts for these reviews. Established standards from the Enhancing the Quality and Transparency of Health Research (EQUATOR) Network were followed for the development of the guideline. The original PRISMA statement was used as a framework on which to modify and add items. A group of 24 multidisciplinary experts used a systematic review of articles on existing reporting guidelines and methods, a 3-round Delphi process, a consensus meeting, pilot testing, and iterative refinement to develop the PRISMA diagnostic test accuracy guideline. The final version of the PRISMA diagnostic test accuracy guideline checklist was approved by the group. The systematic review (produced 64 items) and the Delphi process (provided feedback on 7 proposed items; 1 item was later split into 2 items) identified 71 potentially relevant items for consideration. The Delphi process reduced these to 60 items that were discussed at the consensus meeting. Following the meeting, pilot testing and iterative feedback were used to generate the 27-item PRISMA diagnostic test accuracy checklist. To reflect specific or optimal contemporary systematic review methods for diagnostic test accuracy, 8 of the 27 original PRISMA items were left unchanged, 17 were modified, 2 were added, and 2 were omitted. The 27-item PRISMA diagnostic test accuracy checklist provides specific guidance for reporting of systematic reviews. The PRISMA diagnostic test accuracy guideline can facilitate the transparent reporting of reviews, and may assist in the evaluation of validity and applicability, enhance replicability of reviews, and make the results from systematic reviews of diagnostic test accuracy studies more useful.
NASA Astrophysics Data System (ADS)
Wang, Jianfeng; Lin, Kan; Zheng, Wei; Yu Ho, Khek; Teh, Ming; Guan Yeoh, Khay; Huang, Zhiwei
2015-08-01
This work aims to evaluate the clinical value of a fiber-optic Raman spectroscopy technique developed for in vivo diagnosis of esophageal squamous cell carcinoma (ESCC) during clinical endoscopy. We have developed a rapid fiber-optic Raman endoscopic system capable of simultaneously acquiring both fingerprint (FP, 800-1800 cm-1) and high-wavenumber (HW, 2800-3600 cm-1) Raman spectra from esophageal tissue in vivo. A total of 1172 in vivo FP/HW Raman spectra were acquired from 48 esophageal patients undergoing endoscopic examination. The total Raman dataset was split into two parts: 80% for training and 20% for testing. Partial least squares-discriminant analysis (PLS-DA) and leave-one-patient-out cross-validation (LOPCV) were implemented on the training dataset to develop diagnostic algorithms for tissue classification. PLS-DA-LOPCV shows that simultaneous FP/HW Raman spectroscopy on the training dataset provides a diagnostic sensitivity of 97.0% and specificity of 97.4% for ESCC classification. Further, the diagnostic algorithm applied to the independent testing dataset based on the simultaneous FP/HW Raman technique gives a predictive diagnostic sensitivity of 92.7% and specificity of 93.6% for ESCC identification, which is superior to either the FP or HW Raman technique alone. This work demonstrates that the simultaneous FP/HW fiber-optic Raman spectroscopy technique improves real-time in vivo diagnosis of esophageal neoplasia at endoscopy.
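A minimal PLS-DA sketch of the kind of classifier named above, run on synthetic "spectra"; the class structure, spectral dimensions, and number of latent components are arbitrary assumptions and do not reproduce the authors' FP/HW model.

```python
# Hedged sketch: PLS-DA implemented as PLS regression against a 0/1 class
# label, thresholded at 0.5; synthetic spectra stand in for Raman data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 200, 600                        # spectra x spectral channels (invented)
y = rng.integers(0, 2, n)              # 0 = normal tissue, 1 = neoplastic (stand-in)
X = rng.normal(size=(n, p)) + np.outer(y, np.linspace(0, 0.8, p))  # class offset

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = (pls.predict(X_te).ravel() >= 0.5).astype(int)   # discriminant threshold
print("test accuracy:", (pred == y_te).mean())
```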
[A new information technology for system diagnosis of functional activity of human organs].
Avshalumov, A Sh; Sudakov, K V; Filaretov, G F
2006-01-01
The goal of this work was to consider a new diagnostic technology based on analysis of objective information parameters of functional activity and interaction of normal and pathologically changed human organs. The technology is based on the use of very low power millimeter (EHF) radiation emitted by human body and other biological objects in the process of vital activity. The importance of consideration of the information aspect of vital activity from the standpoint of the theory of functional systems suggested by P. K. Anokhin is emphasized. The suggested information technology is theoretically substantiated. The capabilities of the suggested technology for diagnosis, as well as the difficulties of its practical implementation caused by very low power of electromagnetic fields generated by human body, are discussed. It is noted that only use of modern radiophysical equipment together with new software based on specially developed algorithms made it possible to construct a medical EHF diagnostic system for effective implementation of the suggested technology. The system structure, functions of its components, the examination procedure, and the form of representation of diagnostic information are described together with the specific features of applied software based on the principle of maximal objectivity of analysis and interpretation of the results of diagnosis on the basis of artificial intelligence algorithms. The diagnostic capabilities of the system are illustrated by several examples.
Motta, Irene; Filocamo, Mirella; Poggiali, Erika; Stroppiano, Marina; Dragani, Alfredo; Consonni, Dario; Barcellini, Wilma; Gaidano, Gianluca; Facchini, Luca; Specchia, Giorgina; Cappellini, Maria Domenica
2016-04-01
Gaucher disease (GD) is the most common lysosomal disorder, resulting from deficient activity of the β-glucosidase enzyme, which causes accumulation of glucosylceramide in the macrophage-monocyte system. Notably, because of non-specific symptoms and a lack of awareness, patients with GD experience long diagnostic delays. The aim of this study was to apply a diagnostic algorithm to identify GD type 1 among adult subjects referred to Italian haematology outpatient units because of splenomegaly and/or thrombocytopenia and, eventually, to estimate the prevalence of GD in this selected population. One hundred and ninety-six subjects (61 females, 135 males; mean age 47.8 ± 18.2 years) were enrolled in the study and tested for β-glucosidase enzyme activity on dried blood spot (DBS). Seven of the 196 patients were diagnosed with GD (5 females and 2 males; mean age 31.8 ± 8.2 years), corresponding to a prevalence of 3.6% (95% CI 1.4-7.2; 1/28 patients) in this population. These results show that the use of an appropriate diagnostic algorithm and a simple diagnostic method, such as DBS, are important tools to facilitate the diagnosis of a rare disease, even for physicians who are not experts in the disease. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Lee, Jeong Min; Park, Joong-Won; Choi, Byung Ihn
2014-01-01
Hepatocellular carcinoma (HCC) is the fifth most commonly occurring cancer in Korea and typically has a poor prognosis with a 5-year survival rate of only 28.6%. Therefore, it is of paramount importance to achieve the earliest possible diagnosis of HCC and to recommend the most up-to-date optimal treatment strategy in order to increase the survival rate of patients who develop this disease. After the establishment of the Korean Liver Cancer Study Group (KLCSG) and the National Cancer Center (NCC), Korea jointly produced for the first time the Clinical Practice Guidelines for HCC in 2003, revised them in 2009, and published the newest revision of the guidelines in 2014, including changes in the diagnostic criteria of HCC and incorporating the most recent medical advances over the past 5 years. In this review, we will address the noninvasive diagnostic criteria and diagnostic algorithm of HCC included in the newly established KLCSG-NCC guidelines in 2014, and review the differences in the criteria for a diagnosis of HCC between the KLCSG-NCC guidelines and the most recent imaging guidelines endorsed by the European Organisation for Research and Treatment of Cancer (EORTC), the Liver Imaging Reporting and Data System (LI-RADS), the Organ Procurement and Transplantation Network (OPTN) system, the Asian Pacific Association for the Study of the Liver (APASL) and the Japan Society of Hepatology (JSH).
Villa-Manríquez, J F; Castro-Ramos, J; Gutiérrez-Delgado, F; Lopéz-Pacheco, M A; Villanueva-Luna, A E
2017-08-01
In this study we identify and classify high and low levels of glycated hemoglobin (HbA1c) in healthy volunteers (HV) and diabetic patients (DP). Overall, 86 subjects were evaluated. The Raman spectrum was measured in three anatomical regions of the body: index fingertip, right ear lobe, and forehead. The measurements were performed to compare the difference between the HV and DP groups (22 well-controlled diabetic patients (WCDP) (HbA1c <6.5%) and 49 not-controlled diabetic patients (NCDP) (HbA1c ≥6.5%)). Multivariable methods such as principal components analysis (PCA) combined with support vector machines (SVM) were used to develop effective diagnostic algorithms for classification among these groups. The forehead of HV versus WCDP showed the highest sensitivity (100%) and specificity (100%). Sensitivity (100%) and specificity (60%) were highest in the forehead of WCDP versus NCDP. In HV versus NCDP, the fingertip had the highest sensitivity (100%) and specificity (80%). The efficacy of the diagnostic algorithm was confirmed by receiver operating characteristic (ROC) curve analysis. Overall, our study demonstrated that the combination of Raman spectroscopy and PCA-SVM is a feasible non-invasive diagnostic tool in diabetes for qualitatively classifying high and low levels of HbA1c in vivo. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Suh, Young Joo; Kim, Young Jin; Kim, Jin Young; Chang, Suyon; Im, Dong Jin; Hong, Yoo Jin; Choi, Byoung Wook
2017-11-01
We aimed to determine the effect of a whole-heart motion-correction algorithm (new-generation snapshot freeze, NG SSF) on the image quality of cardiac computed tomography (CT) images in patients with mechanical valve prostheses compared to standard images without motion correction and to compare the diagnostic accuracy of NG SSF and standard CT image sets for the detection of prosthetic valve abnormalities. A total of 20 patients with 32 mechanical valves who underwent wide-coverage detector cardiac CT with single-heartbeat acquisition were included. The CT image quality for subvalvular (below the prosthesis) and valvular regions (valve leaflets) of mechanical valves was assessed by two observers on a four-point scale (1 = poor, 2 = fair, 3 = good, and 4 = excellent). Paired t-tests or Wilcoxon signed rank tests were used to compare image quality scores and the number of diagnostic phases (image quality score≥3) between the standard image sets and NG SSF image sets. Diagnostic performance for detection of prosthetic valve abnormalities was compared between two image sets with the final diagnosis set by re-operation or clinical findings as the standard reference. NG SSF image sets had better image quality scores than standard image sets for both valvular and subvalvular regions (P < 0.05 for both). The number of phases that were of diagnostic image quality per patient was significantly greater in the NG SSF image set than standard image set for both valvular and subvalvular regions (P < 0.0001). Diagnostic performance of NG SSF image sets for the detection of prosthetic abnormalities (20 pannus and two paravalvular leaks) was greater than that of standard image sets (P < 0.05). Application of NG SSF can improve CT image quality and diagnostic accuracy in patients with mechanical valves compared to standard images. Copyright © 2017 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
The Evolving Role of Companion Diagnostics for Breast Cancer in an Era of Next-Generation Omics.
Rosenbaum, Jason N; Weisman, Paul
2017-10-01
A companion diagnostic is a test for a specific biomarker-approved by the United States Food and Drug Administration-qualifying a patient to receive a specific, associated therapy. As interest has grown in precision medicine over the past decade, the principle of companion diagnostics has gained increasing purchase among laboratory professionals, clinicians, regulators, and even patients. The evolution of the biomarkers used to stratify and treat breast cancer illustrates the history of companion diagnostics and provides a lens through which to examine potential challenges. As new targeted therapies and corresponding biomarkers accumulate, algorithms for diagnosis and treatment necessarily become lengthier and more complex. To accommodate future needs of breast cancer patients, the companion diagnostic model will continue to adapt and evolve. Copyright © 2017 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
Application of content-based image compression to telepathology
NASA Astrophysics Data System (ADS)
Varga, Margaret J.; Ducksbury, Paul G.; Callagy, Grace
2002-05-01
Telepathology is a means of practicing pathology at a distance, viewing images on a computer display rather than directly through a microscope. Without compression, images take too long to transmit to a remote location and are very expensive to store for future examination. However, to date the use of compressed images in pathology remains controversial. This is because commercial image compression algorithms such as JPEG achieve data compression without knowledge of the diagnostic content. Often images are lossily compressed at the expense of corrupting informative content. None of the currently available lossy compression techniques are concerned with what information has been preserved and what data has been discarded. Their sole objective is to compress and transmit the images as fast as possible. By contrast, this paper presents a novel image compression technique, which exploits knowledge of the slide diagnostic content. This 'content based' approach combines visually lossless and lossy compression techniques, judiciously applying each in the appropriate context across an image so as to maintain 'diagnostic' information while still maximising the possible compression. Standard compression algorithms, e.g. wavelets, can still be used, but their use in a context sensitive manner can offer high compression ratios and preservation of diagnostically important information. When compared with lossless compression the novel content-based approach can potentially provide the same degree of information with a smaller amount of data. When compared with lossy compression it can provide more information for a given amount of compression. The precise gain in the compression performance depends on the application (e.g. database archive or second opinion consultation) and the diagnostic content of the images.
[Hypertensive crisis: pathogenesis, clinic, treatment].
Vertkin, A L; Topolianskiĭ, A V; Abdullaeva, A U; Alekseev, M A; Shakhmanaev, Kh A
2013-01-01
Contemporary data on the mechanisms of development, types, and clinical picture of hypertensive crisis (HC) are presented. Algorithms of rational therapy of uncomplicated and complicated HC are considered. The appropriateness of using antihypertensive drugs with multifactorial action in HC is stressed. These drugs include urapidil, an antihypertensive agent with a complex mechanism of action. By blocking mainly postsynaptic α1-adrenoceptors, urapidil attenuates the vasoconstrictor effect of catecholamines and decreases total peripheral resistance. Stimulation of 5-HT1A receptors of the medullary vasomotor center promotes lowering of elevated vascular tone and prevents the development of reflex tachycardia.
Fiuzy, Mohammad; Haddadnia, Javad; Mollania, Nasrin; Hashemian, Maryam; Hassanpour, Kazem
2012-01-01
Accurate diagnosis of breast cancer is of prime importance. Fine needle aspiration (FNA), which has been used for several years in Europe, is a simple, inexpensive, noninvasive and accurate technique for detecting breast cancer. Selecting suitable features from the fine needle aspiration results is the most important diagnostic problem in the early stages of breast cancer. In this study, we introduced a new algorithm that detects breast cancer by combining an artificial intelligence system with fine needle aspiration (FNA). We studied the features of the Wisconsin Diagnostic Breast Cancer (WDBC) database, which contains 569 FNA test samples (212 malignant patient samples and 357 benign healthy samples). In this research, we combined artificial intelligence approaches, such as an evolutionary algorithm (EA) with a genetic algorithm (GA), and used an exact classifier system (here, fuzzy C-means (FCM)) to separate malignant from benign samples. Furthermore, we examined artificial neural networks (NN) to identify the model and structure. This research proposes a new algorithm for accurate diagnosis of breast cancer. According to the WDBC database, 62.75% of samples were benign and 37.25% were malignant. After applying the proposed algorithm, we achieved a high detection accuracy of about 96.579% on 205 patients who were diagnosed as having breast cancer. The method had 93% sensitivity, 73% specificity, 65% positive predictive value, and 95% negative predictive value. If done by experts, FNA can be a reliable replacement for open biopsy in palpable breast masses, and evaluation of FNA samples during aspiration can decrease the number of insufficient samples. FNA can be the first line of diagnosis in women with breast masses, at least in deprived regions, and may increase health standards and clinical supervision of patients. Such a smart, economical, non-invasive, rapid and accurate system can be introduced as a useful diagnostic system for comprehensive treatment of breast cancer. Another advantage of this method is the possibility of diagnosing breast abnormalities.
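The abstract does not give the authors' exact EA/GA/FCM pipeline, but the general pattern of evolutionary feature selection on the WDBC data can be sketched as below, using scikit-learn's bundled copy of the WDBC set and a plain genetic algorithm scored by cross-validated SVM accuracy. The population size, mutation rate and the choice of SVM are illustrative assumptions, not the published configuration.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # WDBC data: 569 samples, 30 FNA features

def fitness(mask):
    """Cross-validated accuracy of an SVM using only the selected features."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf", gamma="scale"),
                           X[:, mask.astype(bool)], y, cv=5).mean()

# plain generational GA over binary feature-selection masks
pop = rng.integers(0, 2, size=(20, X.shape[1]))
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]            # truncation selection
    kids = []
    for _ in range(10):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, X.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])     # one-point crossover
        flip = rng.random(X.shape[1]) < 0.05           # bit-flip mutation
        kids.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best), "cv accuracy:", fitness(best))
```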
Marchetti, Antonio; Pace, Maria Vittoria; Di Lorito, Alessia; Canarecci, Sara; Felicioni, Lara; D'Antuono, Tommaso; Liberatore, Marcella; Filice, Giampaolo; Guetti, Luigi; Mucilli, Felice; Buttitta, Fiamma
2016-09-01
Anaplastic lymphoma kinase (ALK) gene rearrangements have been described in 3-5% of lung adenocarcinomas (ADC), and their identification is essential to select patients for treatment with ALK tyrosine kinase inhibitors. For several years, fluorescent in situ hybridization (FISH) has been considered the only validated diagnostic assay. Currently, alternative methods are commercially available as diagnostic tests. A series of 217 ADC comprising 196 consecutive resected tumors and 21 ALK FISH-positive cases from an independent series of 702 ADC was investigated. All specimens were screened by IHC (ALK-D5F3-CDx-Ventana), FISH (Vysis ALK Break-Apart-Abbott) and RT-PCR (ALK RGQ RT-PCR-Qiagen). Results were compared and discordant cases were subjected to next-generation sequencing (NGS). Thirty-nine of 217 samples were positive by the ALK RGQ RT-PCR assay, using a threshold cycle (Ct) cut-off ≤35.9, as recommended. Of these positive samples, 14 were negative by IHC and 12 by FISH. ALK RGQ RT-PCR/FISH discordant cases were analyzed by the NGS assay, with results concordant with the FISH data. In order to obtain the maximum level of agreement between FISH and ALK RGQ RT-PCR data, we introduced a new scoring algorithm based on the ΔCt value. A ΔCt cut-off level ≤3.5 was used in a pilot series. The algorithm was then tested on a completely independent validation series. Using the new scoring algorithm and FISH as the reference standard, the sensitivity and specificity of the ALK RGQ RT-PCR(ΔCt) assay were 100% and 100%, respectively. Our results suggest that the ALK RGQ RT-PCR test could be useful in clinical practice as a complementary assay in multi-test diagnostic algorithms or even, if our data are confirmed in independent studies, as a standalone or screening test for the selection of patients to be treated with ALK inhibitors. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
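A minimal sketch of how the reported cut-offs could be applied in code is shown below. The definition of ΔCt as the difference between the ALK assay Ct and an endogenous control Ct is an assumption, since the abstract does not spell out how the score is formed.

```python
def alk_rtpcr_call(ct_alk, ct_control, ct_cutoff=35.9, delta_ct_cutoff=3.5):
    """Two-step call: first the recommended Ct cut-off, then the
    Delta-Ct score (ALK Ct minus control Ct, an assumed definition)."""
    if ct_alk is None or ct_alk > ct_cutoff:
        return "negative"
    delta_ct = ct_alk - ct_control
    return "positive" if delta_ct <= delta_ct_cutoff else "negative"

# example: a sample amplifying well relative to its control
print(alk_rtpcr_call(ct_alk=31.2, ct_control=29.0))  # -> positive
```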
Cairns, Andrew W; Bond, Raymond R; Finlay, Dewar D; Guldenring, Daniel; Badilini, Fabio; Libretti, Guido; Peace, Aaron J; Leslie, Stephen J
The 12-lead electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, due to the complex nature of 12-lead ECG interpretation, there is a significant cognitive workload required from the interpreter. This complexity in ECG interpretation often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process. This computerised decision support system has been named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG. We hypothesise that semi-automatic interpretation using a digital assistant can be an optimal man-machine model for ECG interpretation, improving interpretation accuracy and reducing missed co-abnormalities. The Differential Diagnoses Algorithm (DDA) was developed using web technologies where diagnostic ECG criteria are defined in an open storage format, JavaScript Object Notation (JSON), which is queried using a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed in which subjects interpreted ECGs using the conventional approach and using the IPI+DDA approach. A total of 375 interpretations were collected. The IPI+DDA approach was shown to improve diagnostic accuracy by 8.7% (although not statistically significant, p-value = 0.1852), and the IPI+DDA suggested the correct interpretation more often than the human interpreter in 7/10 cases (with varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated. Although the results were not found to be statistically significant, we found that: 1) our decision support tool increased the number of correct interpretations, 2) the DDA algorithm suggested the correct interpretation more often than humans, and 3) as many as seven computerised diagnostic suggestions augmented human decision making in ECG interpretation. Statistical significance may be achieved by expanding the sample size. Copyright © 2017 Elsevier Inc. All rights reserved.
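The following toy sketch illustrates the DDA pattern of JSON-encoded diagnostic criteria queried by a rule-based matcher against the interpreter's annotations. The criteria, annotation strings and matching rule are invented for illustration and are not the published criteria set.

```python
import json

# hypothetical diagnostic criteria, stored as JSON as in the DDA design
CRITERIA = json.loads("""
[
  {"diagnosis": "Anterior STEMI",
   "requires": ["ST elevation V2", "ST elevation V3"]},
  {"diagnosis": "First-degree AV block",
   "requires": ["PR interval > 200 ms"]},
  {"diagnosis": "Left bundle branch block",
   "requires": ["QRS > 120 ms", "broad notched R in V6"]}
]
""")

def suggest(annotations, criteria=CRITERIA):
    """Return diagnoses whose required findings are all present among
    the interpreter's annotations (simple rule-based matching)."""
    notes = set(annotations)
    return [c["diagnosis"] for c in criteria
            if all(req in notes for req in c["requires"])]

print(suggest(["ST elevation V2", "ST elevation V3", "sinus rhythm"]))
```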
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Chirag; Vicini, Frank A., E-mail: fvicini@beaumont.edu
As more women survive breast cancer, long-term toxicities affecting their quality of life, such as lymphedema (LE) of the arm, gain importance. Although numerous studies have attempted to determine incidence rates, identify optimal diagnostic tests, enumerate efficacious treatment strategies and outline risk reduction guidelines for breast cancer-related lymphedema (BCRL), few groups have consistently agreed on any of these issues. As a result, standardized recommendations are still lacking. This review will summarize the latest data addressing all of these concerns in order to provide patients and health care providers with optimal, contemporary recommendations. Published incidence rates for BCRL vary substantially with a range of 2-65% based on surgical technique, axillary sampling method, radiation therapy fields treated, and the use of chemotherapy. Newer clinical assessment tools can potentially identify BCRL in patients with subclinical disease with prospective data suggesting that early diagnosis and management with noninvasive therapy can lead to excellent outcomes. Multiple therapies exist with treatments defined by the severity of BCRL present. Currently, the standard of care for BCRL in patients with significant LE is complex decongestive physiotherapy (CDP). Contemporary data also suggest that a multidisciplinary approach to the management of BCRL should begin prior to definitive treatment for breast cancer employing patient-specific surgical, radiation therapy, and chemotherapy paradigms that limit risks. Further, prospective clinical assessments before and after treatment should be employed to diagnose subclinical disease. In those patients who require aggressive locoregional management, prophylactic therapies and the use of CDP can help reduce the long-term sequelae of BCRL.
Modification of the Integrated Sasang Constitutional Diagnostic Model
Nam, Jiho
2017-01-01
In 2012, the Korea Institute of Oriental Medicine proposed an objective and comprehensive physical diagnostic model to address quantification problems in the existing Sasang constitutional diagnostic method. However, certain issues have been raised regarding a revision of the proposed diagnostic model. In this paper, we propose various methodological approaches to address the problems of the previous diagnostic model. Firstly, more useful variables are selected in each component. Secondly, the least absolute shrinkage and selection operator is used to reduce multicollinearity without the modification of explanatory variables. Thirdly, proportions of SC types and age are considered to construct individual diagnostic models and classify the training set and the test set for reflecting the characteristics of the entire dataset. Finally, an integrated model is constructed with explanatory variables of individual diagnosis models. The proposed integrated diagnostic model significantly improves the sensitivities for both the male SY type (36.4% → 62.0%) and the female SE type (43.7% → 64.5%), which were areas of limitation of the previous integrated diagnostic model. The ideas of these new algorithms are expected to contribute not only to the scientific development of Sasang constitutional medicine in Korea but also to that of other diagnostic methods for traditional medicine. PMID:29317897
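A hedged sketch of the LASSO step mentioned above: an L1-penalised multinomial logistic model shrinks redundant, collinear body-measurement variables toward zero without manually recoding them. The feature matrix and labels here are random placeholders, not the study's data, and the regularisation strength is an arbitrary assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: hypothetical body-measurement features, y: constitution type labels (0-2)
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))
y = rng.integers(0, 3, size=300)

# the L1 penalty performs the LASSO-style shrinkage described in the paper,
# dropping collinear explanatory variables without modifying them
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="saga", C=0.5, max_iter=5000),
)
model.fit(X, y)
coef = model.named_steps["logisticregression"].coef_
print("variables retained per class:", (np.abs(coef) > 1e-6).sum(axis=1))
```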
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.
2015-01-12
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high-energy-density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for the use in a production environment.
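A minimal sketch of the TPS idea using SciPy's thin-plate-spline radial basis interpolator: fiducial comb positions measured on the distorted streak image are mapped onto their ideal positions, and the fitted mapping is then applied to arbitrary pixel coordinates. The fiducial coordinates are invented, and the production warp correction is more elaborate than this.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# hypothetical fiducials: where comb pulses appear on the streak image
# (distorted) and where they should land on an ideal, linear time axis
distorted = np.array([[12.0, 5.0], [118.5, 4.2], [230.7, 6.1],
                      [11.8, 250.3], [121.2, 249.0], [229.5, 251.7]])
ideal = np.array([[10.0, 5.0], [120.0, 5.0], [230.0, 5.0],
                  [10.0, 250.0], [120.0, 250.0], [230.0, 250.0]])

# thin-plate-spline mapping from distorted to ideal coordinates
tps = RBFInterpolator(distorted, ideal, kernel="thin_plate_spline")

def warp_points(points):
    """Apply the TPS correction to (x, y) pixel coordinates."""
    return tps(np.atleast_2d(points))

print(warp_points([[118.5, 4.2]]))   # maps back near the ideal [120, 5]
```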
Rodriguez-Diaz, Eladio; Castanon, David A; Singh, Satish K; Bigio, Irving J
2011-06-01
Optical spectroscopy has shown potential as a real-time, in vivo, diagnostic tool for identifying neoplasia during endoscopy. We present the development of a diagnostic algorithm to classify elastic-scattering spectroscopy (ESS) spectra as either neoplastic or non-neoplastic. The algorithm is based on pattern recognition methods, including ensemble classifiers, in which members of the ensemble are trained on different regions of the ESS spectrum, and misclassification-rejection, where the algorithm identifies and refrains from classifying samples that are at higher risk of being misclassified. These "rejected" samples can be reexamined by simply repositioning the probe to obtain additional optical readings or ultimately by sending the polyp for histopathological assessment, as per standard practice. Prospective validation using separate training and testing sets results in a baseline performance of sensitivity = 0.83, specificity = 0.79, using the standard framework of feature extraction (principal component analysis) followed by classification (with linear support vector machines). With the developed algorithm, performance improves to Se ∼ 0.90, Sp ∼ 0.90, at the cost of rejecting 20-33% of the samples. These results are on par with a panel of expert pathologists. For colonoscopic prevention of colorectal cancer, our system could reduce biopsy risk and cost, obviate retrieval of non-neoplastic polyps, decrease procedure time, and improve assessment of cancer risk.
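The baseline framework described above, PCA feature extraction, a linear SVM, and misclassification-rejection based on classifier confidence, can be sketched as follows. The spectra are random placeholders, and the rejection margin is an arbitrary assumption rather than the authors' tuned threshold.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# X: hypothetical ESS spectra (rows = measurements, columns = wavelengths),
# y: 1 = neoplastic, 0 = non-neoplastic
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 200))
y = rng.integers(0, 2, size=400)

clf = make_pipeline(StandardScaler(), PCA(n_components=10), LinearSVC(C=1.0, dual=False))
clf.fit(X, y)

def classify_with_rejection(spectra, margin=0.4):
    """Label spectra, but withhold ('reject') low-confidence calls so the
    probe can be repositioned or the polyp sent to histopathology."""
    scores = clf.decision_function(spectra)
    labels = np.where(scores > 0, "neoplastic", "non-neoplastic").astype(object)
    labels[np.abs(scores) < margin] = "rejected"
    return labels

print(classify_with_rejection(X[:5]))
```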
Ehteshami Bejnordi, Babak; Mullooly, Maeve; Pfeiffer, Ruth M; Fan, Shaoqi; Vacek, Pamela M; Weaver, Donald L; Herschorn, Sally; Brinton, Louise A; van Ginneken, Bram; Karssemeijer, Nico; Beck, Andrew H; Gierach, Gretchen L; van der Laak, Jeroen A W M; Sherman, Mark E
2018-06-13
The breast stromal microenvironment is a pivotal factor in breast cancer development, growth and metastases. Although pathologists often detect morphologic changes in stroma by light microscopy, visual classification of such changes is subjective and non-quantitative, limiting its diagnostic utility. To gain insights into stromal changes associated with breast cancer, we applied automated machine learning techniques to digital images of 2387 hematoxylin and eosin-stained tissue sections of benign and malignant image-guided breast biopsies performed to investigate mammographic abnormalities among 882 patients, ages 40-65 years, who were enrolled in the Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project. Using deep convolutional neural networks, we trained an algorithm to discriminate between stroma surrounding invasive cancer and stroma from benign biopsies. In test sets (928 whole-slide images from 330 patients), this algorithm could distinguish biopsies diagnosed as invasive cancer from benign biopsies solely on the basis of stromal characteristics (area under the receiver operating characteristic curve = 0.962). Furthermore, without being trained specifically using ductal carcinoma in situ as an outcome, the algorithm detected tumor-associated stroma in greater amounts and at larger distances from grade 3 versus grade 1 ductal carcinoma in situ. Collectively, these results suggest that algorithms based on deep convolutional neural networks that evaluate only stroma may prove useful to classify breast biopsies and aid in understanding and evaluating the biology of breast lesions.
Xiao, Li-Hong; Chen, Pei-Ran; Gou, Zhong-Ping; Li, Yong-Zhong; Li, Mei; Xiang, Liang-Cheng; Feng, Ping
2017-01-01
The aim of this study was to evaluate the ability of a random forest algorithm that combines transrectal ultrasound findings, age, and serum prostate-specific antigen levels to predict prostate carcinoma. Clinico-demographic data were analyzed for 941 patients with prostate diseases treated at our hospital, including age, serum prostate-specific antigen levels, transrectal ultrasound findings, and pathology diagnosis based on ultrasound-guided needle biopsy of the prostate. These data were compared between patients with and without prostate cancer using the Chi-square test and then entered into the random forest model to predict diagnosis. Patients with and without prostate cancer differed significantly in age and serum prostate-specific antigen levels (P < 0.001), as well as in all transrectal ultrasound characteristics (P < 0.05) except uneven echo (P = 0.609). The random forest model based on age, prostate-specific antigen and ultrasound predicted prostate cancer with an accuracy of 83.10%, sensitivity of 65.64%, and specificity of 93.83%. The positive predictive value was 86.72%, and the negative predictive value was 81.64%. By integrating age, prostate-specific antigen levels and transrectal ultrasound findings, the random forest algorithm shows better diagnostic performance for prostate cancer than any single indicator on its own. This algorithm may help improve diagnosis of the disease by identifying patients at high risk for biopsy.
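A hedged sketch of the modelling step: a random forest trained on age, PSA and binary ultrasound findings, evaluated by cross-validation against the biopsy result. The column names and the synthetic data are placeholders for the study's real variables.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

# hypothetical columns standing in for the study's predictors
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(45, 90, 941),
    "psa": rng.gamma(2.0, 6.0, 941),
    "hypoechoic_lesion": rng.integers(0, 2, 941),
    "capsule_irregular": rng.integers(0, 2, 941),
    "cancer": rng.integers(0, 2, 941),          # biopsy result (gold standard)
})
X, y = df.drop(columns="cancer"), df["cancer"]

rf = RandomForestClassifier(n_estimators=500, random_state=0)
pred = cross_val_predict(rf, X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print("sensitivity", tp / (tp + fn), "specificity", tn / (tn + fp))
```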
Hybrid Kalman Filter: A New Approach for Aircraft Engine In-Flight Diagnostics
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2006-01-01
In this paper, a uniquely structured Kalman filter is developed for its application to in-flight diagnostics of aircraft gas turbine engines. The Kalman filter is a hybrid of a nonlinear on-board engine model (OBEM) and piecewise linear models. The utilization of the nonlinear OBEM allows the reference health baseline of the in-flight diagnostic system to be updated to the degraded health condition of the engines through a relatively simple process. Through this health baseline update, the effectiveness of the in-flight diagnostic algorithm can be maintained as the health of the engine degrades over time. Another significant aspect of the hybrid Kalman filter methodology is its capability to take advantage of conventional linear and nonlinear Kalman filter approaches. Based on the hybrid Kalman filter, an in-flight fault detection system is developed, and its diagnostic capability is evaluated in a simulation environment. Through the evaluation, the suitability of the hybrid Kalman filter technique for aircraft engine in-flight diagnostics is demonstrated.
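The NASA hybrid filter couples a nonlinear on-board engine model with piecewise-linear Kalman filters; that structure is not reproduced here, but the underlying principle of residual (innovation) monitoring for fault detection can be illustrated with a generic scalar Kalman filter, as in the hedged sketch below. The noise parameters, gate and synthetic fault are arbitrary assumptions.

```python
import numpy as np

def kalman_residual_monitor(z, q=1e-4, r=0.05, gate=3.0):
    """Scalar random-walk Kalman filter; flag samples whose innovation
    exceeds `gate` standard deviations as possible sensor/engine faults."""
    x, p = z[0], 1.0
    flags = []
    for zk in z:
        p = p + q                      # predict (random-walk state model)
        s = p + r                      # innovation variance
        nu = zk - x                    # innovation (measurement residual)
        flags.append(abs(nu) / np.sqrt(s) > gate)
        k = p / s                      # Kalman gain
        x = x + k * nu                 # update state estimate
        p = (1 - k) * p                # update covariance
    return np.array(flags)

# synthetic sensor trace with an abrupt bias fault injected at sample 60
rng = np.random.default_rng(0)
z = 500 + rng.normal(0, 0.2, 100)
z[60:] += 3.0
print(np.flatnonzero(kalman_residual_monitor(z))[:3])   # first flagged samples
```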
Computer-aided diagnostic strategy selection.
Greenes, R A
1986-03-01
Determination of the optimal diagnostic work-up strategy for the patient is becoming a major concern for the practicing physician. Overlap of the indications for various diagnostic procedures, differences in their invasiveness or risk, and high costs have made physicians aware of the need to consider the choice of procedure carefully, as well as its relation to management actions available. In this article, the author discusses research approaches that aim toward development of formal decision analytic methods to allow the physician to determine optimal strategy; clinical algorithms or rules as guides to physician decisions; improved measures for characterizing the performance of diagnostic tests; educational tools for increasing the familiarity of physicians with the concepts underlying these measures and analytic procedures; and computer-based aids for facilitating the employment of these resources in actual clinical practice.
Langabeer, Stephen E
2016-01-01
The majority of patients with classical myeloproliferative neoplasms (MPN) of polycythemia vera, essential thrombocythemia, and primary myelofibrosis harbor distinct disease-driving mutations within the JAK2 , CALR , or MPL genes. The term triple-negative has been recently applied to those MPN without evidence of these consistent mutations, prompting whole or targeted exome sequencing approaches to determine the driver mutational status of this subgroup. These strategies have identified numerous novel mutations that occur in alternative exons of both JAK2 and MPL , the majority of which result in functional activation. Current molecular diagnostic approaches may possess insufficient coverage to detect these alternative mutations, prompting further consideration of targeted exon sequencing into routine diagnostic practice. How to incorporate these illuminating findings into the expanding molecular diagnostic algorithm for MPN requires continual attention.
Development of a South African integrated syndromic respiratory disease guideline for primary care.
English, René G; Bateman, Eric D; Zwarenstein, Merrick F; Fairall, Lara R; Bheekie, Angeni; Bachmann, Max O; Majara, Bosielo; Ottmani, Salah-Eddine; Scherpbier, Robert W
2008-09-01
The Practical Approach to Lung Health in South Africa (PALSA) initiative aimed to develop an integrated symptom- and sign-based (syndromic) respiratory disease guideline for nurse care practitioners working in primary care in a developing country. A multidisciplinary team developed the guideline after reviewing local barriers to respiratory health care provision, relevant health care policies, existing respiratory guidelines, and literature. Guideline drafts were evaluated by means of focus group discussions. Existing evidence-based guideline development methodologies were tailored for development of the guideline. A locally-applicable guideline based on syndromic diagnostic algorithms was developed for the management of patients 15 years and older who presented to primary care facilities with cough or difficulty breathing. PALSA has developed a guideline that integrates and presents diagnostic and management recommendations for priority respiratory diseases in adults using a symptom- and sign-based algorithmic guideline for nurses in developing countries.
Diagnostic emulation: Implementation and user's guide
NASA Technical Reports Server (NTRS)
Becher, Bernice
1987-01-01
The Diagnostic Emulation Technique was developed within the System Validation Methods Branch as a part of the development of methods for the analysis of the reliability of highly reliable, fault tolerant digital avionics systems. This is a general technique which allows for the emulation of a digital hardware system. The technique is general in the sense that it is completely independent of the particular target hardware which is being emulated. Parts of the system are described and emulated at the logic or gate level, while other parts of the system are described and emulated at the functional level. This algorithm allows for the insertion of faults into the system, and for the observation of the response of the system to these faults. This allows for controlled and accelerated testing of system reaction to hardware failures in the target machine. This document describes in detail how the algorithm was implemented at NASA Langley Research Center and gives instructions for using the system.
An inference engine for embedded diagnostic systems
NASA Technical Reports Server (NTRS)
Fox, Barry R.; Brewster, Larry T.
1987-01-01
The implementation of an inference engine for embedded diagnostic systems is described. The system consists of two distinct parts. The first is an off-line compiler which accepts a propositional logical statement of the relationship between facts and conclusions and produces the data structures required by the on-line inference engine. The second part consists of the inference engine and interface routines which accept assertions of fact and return the conclusions which necessarily follow. Given a set of assertions, it will generate exactly the conclusions which logically follow. At the same time, it will detect any inconsistencies which may propagate from an inconsistent set of assertions or a poorly formulated set of rules. The memory requirements are fixed and the worst-case execution times are bounded at compile time. The data structures and inference algorithms are very simple and well understood, and they are described in detail. The system has been implemented in Lisp, Pascal, and Modula-2.
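A toy sketch of the on-line half of such a system: forward-chaining over compiled propositional rules to a fixed point, returning every conclusion that necessarily follows and flagging inconsistencies. The rule base and fact names are invented, and the original compiles to fixed-size data structures rather than Python sets.

```python
# hypothetical rule base: each rule is (antecedent set, consequent literal);
# a literal is a (name, truth-value) pair, so contradictions are detectable
RULES = [
    ({("low_pressure", True), ("pump_on", True)}, ("leak", True)),
    ({("leak", True)}, ("isolate_line", True)),
    ({("pressure_ok", True)}, ("leak", False)),
]

def infer(assertions, rules=RULES):
    """Forward-chain to a fixed point; return (facts, inconsistent?)."""
    facts = set(assertions)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in rules:
            if antecedent <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    inconsistent = any((name, not val) in facts for name, val in facts)
    return facts, inconsistent

facts, bad = infer({("low_pressure", True), ("pump_on", True), ("pressure_ok", True)})
print(sorted(facts), "inconsistent:", bad)
```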
OSA severity assessment based on sleep breathing analysis using ambient microphone.
Dafna, E; Tarasiuk, A; Zigel, Y
2013-01-01
In this paper, an audio-based system for severity estimation of obstructive sleep apnea (OSA) is proposed. The system estimates the apnea-hypopnea index (AHI), which is the average number of apneic events per hour of sleep. The system is based on a Gaussian mixture regression algorithm that was trained and validated on full-night audio recordings. A feature selection process using a genetic algorithm was applied to select the best features extracted from the time and spectral domains. A total of 155 subjects, referred for in-laboratory polysomnography (PSG), were recruited. Using the PSG's AHI score as a gold standard, the performance of the proposed system was evaluated using Pearson correlation, AHI error, and diagnostic agreement methods. A correlation of R = 0.89, an AHI error of 7.35 events/hr, and a diagnostic agreement of 77.3% were achieved, showing encouraging performance and a reliable non-contact alternative method for OSA severity estimation.
Prince, Martin J; de Rodriguez, Juan Llibre; Noriega, L; Lopez, A; Acosta, Daisy; Albanese, Emiliano; Arizaga, Raul; Copeland, John RM; Dewey, Michael; Ferri, Cleusa P; Guerra, Mariella; Huang, Yueqin; Jacob, KS; Krishnamoorthy, ES; McKeigue, Paul; Sousa, Renata; Stewart, Robert J; Salas, Aquiles; Sosa, Ana Luisa; Uwakwa, Richard
2008-01-01
Background: The criterion for dementia implicit in DSM-IV is widely used in research but not fully operationalised. The 10/66 Dementia Research Group sought to do this using assessments from their one-phase dementia diagnostic research interview, and to validate the resulting algorithm in a population-based study in Cuba. Methods: The criterion was operationalised as a computerised algorithm, applying clinical principles, based upon the 10/66 cognitive tests, clinical interview and informant reports; the Community Screening Instrument for Dementia, the CERAD 10-word list learning and animal naming tests, the Geriatric Mental State, and the History and Aetiology Schedule – Dementia Diagnosis and Subtype. This was validated in Cuba against a local clinician DSM-IV diagnosis and the 10/66 dementia diagnosis (originally calibrated probabilistically against clinician DSM-IV diagnoses in the 10/66 pilot study). Results: The DSM-IV sub-criteria were plausibly distributed among clinically diagnosed dementia cases and controls. The clinician diagnoses agreed better with the 10/66 dementia diagnosis than with the more conservative computerised DSM-IV algorithm. The DSM-IV algorithm was particularly likely to miss less severe dementia cases. Those with a 10/66 dementia diagnosis who did not meet the DSM-IV criterion were less cognitively and functionally impaired compared with the DSM-IV-confirmed cases, but still grossly impaired compared with those free of dementia. Conclusion: The DSM-IV criterion, strictly applied, defines a narrow category of unambiguous dementia characterized by marked impairment. It may be specific but incompletely sensitive to clinically relevant cases. The 10/66 dementia diagnosis defines a broader category that may be more sensitive, identifying genuine cases beyond those defined by our DSM-IV algorithm, with relevance to the estimation of the population burden of this disorder. PMID:18577205
NASA Astrophysics Data System (ADS)
Shen, Fei; Chen, Chao; Yan, Ruqiang
2017-05-01
Classical bearing fault diagnosis methods, being designed for one specific task, focus on the effectiveness of the extracted features and the final diagnostic performance. However, most of these approaches suffer from inefficiency when multiple tasks exist, especially in a real-time diagnostic scenario. A fault diagnosis method based on Non-negative Matrix Factorization (NMF) and a co-clustering strategy is proposed to overcome this limitation. Firstly, high-dimensional matrices are constructed using Short-Time Fourier Transform (STFT) features, where the dimension of each matrix equals the number of target tasks. Then, the NMF algorithm is carried out to obtain different components in each dimension direction through optimized matching, such as Euclidean distance and divergence distance. Finally, a co-clustering technique based on information entropy is utilized to classify each component. To verify the effectiveness of the proposed approach, a series of bearing data sets were analysed. The tests indicated that although the diagnostic performance on a single task is comparable to traditional clustering methods such as the K-means algorithm and the Gaussian Mixture Model, the accuracy and computational efficiency in multi-task fault diagnosis are improved.
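A hedged sketch of the feature pipeline: STFT magnitude features per vibration segment, an NMF decomposition, and a clustering of the resulting activations. Ordinary K-means stands in for the entropy-based co-clustering step, and the synthetic signals are placeholders for real bearing data.

```python
import numpy as np
from scipy.signal import stft
from sklearn.cluster import KMeans
from sklearn.decomposition import NMF

# synthetic stand-in for bearing vibration segments (three frequency classes)
rng = np.random.default_rng(0)
t = np.arange(4096) / 12_000.0
segments = np.stack([
    np.sin(2 * np.pi * (90 + 40 * (i % 3)) * t) + 0.3 * rng.normal(size=t.size)
    for i in range(30)
])

# STFT magnitude features, flattened per segment (non-negative, as NMF needs)
feats = np.stack([np.abs(stft(s, fs=12_000, nperseg=256)[2]).ravel() for s in segments])

# NMF decomposes the spectra into a few parts; the activations are then
# grouped (plain K-means here, standing in for the co-clustering step)
W = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0).fit_transform(feats)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(W)
print(labels)
```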
NASA Astrophysics Data System (ADS)
Li, S. X.; Zhang, Y. J.; Zeng, Q. Y.; Li, L. F.; Guo, Z. Y.; Liu, Z. M.; Xiong, H. L.; Liu, S. H.
2014-06-01
Cancer is one of the most common diseases threatening human health. The ability to screen individuals with malignant tumours using only a blood sample would be greatly advantageous to early diagnosis and intervention. This study explores the possibility of discriminating between cancer patients and normal subjects with serum surface-enhanced Raman spectroscopy (SERS) and a support vector machine (SVM) applied to a peripheral blood sample. A total of 130 blood samples were obtained from patients with liver cancer, colonic cancer, esophageal cancer, nasopharyngeal cancer, and gastric cancer, as well as 113 blood samples from normal volunteers. Several diagnostic models were built with the serum SERS spectra using SVM and principal component analysis (PCA) techniques. The results show that a diagnostic accuracy of 85.5% is acquired with a PCA algorithm, while a diagnostic accuracy of 95.8% is obtained using radial basis function (RBF) PCA-SVM methods. The results prove that the RBF-kernel PCA-SVM technique is superior to PCA and conventional SVM (C-SVM) algorithms in classifying serum SERS spectra. The study demonstrates that serum SERS, in combination with SVM techniques, has great potential for screening patients with any solid malignant tumour through a peripheral blood sample.
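The PCA-SVM model with an RBF kernel can be sketched as a standard pipeline, as below. The spectra are random placeholders with the study's sample counts, and the number of principal components and the SVM hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# hypothetical serum SERS spectra: 243 samples x 1024 Raman-shift channels
rng = np.random.default_rng(0)
X = rng.normal(size=(243, 1024))
y = np.r_[np.ones(130, dtype=int), np.zeros(113, dtype=int)]  # 1 = cancer, 0 = normal

pca_svm = make_pipeline(StandardScaler(), PCA(n_components=20),
                        SVC(kernel="rbf", C=10.0, gamma="scale"))
acc = cross_val_score(pca_svm, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.3f}")
```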
Use of hyperspectral imaging technology to develop a diagnostic support system for gastric cancer
NASA Astrophysics Data System (ADS)
Goto, Atsushi; Nishikawa, Jun; Kiyotoki, Shu; Nakamura, Munetaka; Nishimura, Junichi; Okamoto, Takeshi; Ogihara, Hiroyuki; Fujita, Yusuke; Hamamoto, Yoshihiko; Sakaida, Isao
2015-01-01
Hyperspectral imaging (HSI) is a new technology that obtains spectroscopic information and renders it in image form. This study examined the difference in the spectral reflectance (SR) of gastric tumors and normal mucosa recorded with a hyperspectral camera equipped with HSI technology and attempted to determine the specific wavelength that is useful for the diagnosis of gastric cancer. A total of 104 gastric tumors removed by endoscopic submucosal dissection from 96 patients at Yamaguchi University Hospital were recorded using a hyperspectral camera. We determined the optimal wavelength and the cut-off value for differentiating tumors from normal mucosa to establish a diagnostic algorithm. We also attempted to highlight tumors by image processing using the hyperspectral camera's analysis software. A wavelength of 770 nm and a cut-off value of 1/4 the corrected SR were selected as the respective optimal wavelength and cut-off values. The rates of sensitivity, specificity, and accuracy of the algorithm's diagnostic capability were 71%, 98%, and 85%, respectively. It was possible to enhance tumors by image processing at the 770-nm wavelength. HSI can be used to measure the SR in gastric tumors and to differentiate between tumorous and normal mucosa.
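A minimal sketch of the thresholding step, interpreting "a cut-off value of 1/4 the corrected SR" as a 0.25 reflectance cut-off (an assumption about the paper's normalisation); the example values are invented.

```python
import numpy as np

def classify_pixels(sr_770nm, cutoff=0.25):
    """Flag pixels whose corrected spectral reflectance at 770 nm is at or
    below the cut-off as tumour-suspicious."""
    return np.asarray(sr_770nm) <= cutoff

def sensitivity_specificity(pred, truth):
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp, fn = (pred & truth).sum(), (~pred & truth).sum()
    tn, fp = (~pred & ~truth).sum(), (pred & ~truth).sum()
    return tp / (tp + fn), tn / (tn + fp)

# toy example: four lesion regions and four normal regions
pred = classify_pixels([0.18, 0.22, 0.31, 0.24, 0.40, 0.55, 0.47, 0.52])
truth = [True, True, True, True, False, False, False, False]
print(sensitivity_specificity(pred, truth))
```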
Clark, Alex M; Williams, Antony J; Ekins, Sean
2015-01-01
The current rise in the use of open lab notebook techniques means that there are an increasing number of scientists who make chemical information freely and openly available to the entire community as a series of micropublications that are released shortly after the conclusion of each experiment. We propose that this trend be accompanied by a thorough examination of data sharing priorities. We argue that the most significant immediate beneficiary of open data is in fact chemical algorithms, which are capable of absorbing vast quantities of data and using them to present concise insights to working chemists on a scale that could not be achieved by traditional publication methods. Making this goal practically achievable will require a paradigm shift in the way individual scientists translate their data into digital form, since most contemporary methods of data entry are designed for presentation to humans rather than consumption by machine learning algorithms. We discuss some of the complex issues involved in fixing current methods, as well as some of the immediate benefits that can be gained when open data is published correctly using unambiguous machine-readable formats. Graphical Abstract: Lab notebook entries must target both visualisation by scientists and use by machine learning algorithms.
Planetary Transmission Diagnostics
NASA Technical Reports Server (NTRS)
Lewicki, David G. (Technical Monitor); Samuel, Paul D.; Conroy, Joseph K.; Pines, Darryll J.
2004-01-01
This report presents a methodology for detecting and diagnosing gear faults in the planetary stage of a helicopter transmission. This diagnostic technique is based on the constrained adaptive lifting algorithm. The lifting scheme, developed by Wim Sweldens of Bell Labs, is a time domain, prediction-error realization of the wavelet transform that allows for greater flexibility in the construction of wavelet bases. Classic lifting analyzes a given signal using wavelets derived from a single fundamental basis function. A number of researchers have proposed techniques for adding adaptivity to the lifting scheme, allowing the transform to choose from a set of fundamental bases the basis that best fits the signal. This characteristic is desirable for gear diagnostics as it allows the technique to tailor itself to a specific transmission by selecting a set of wavelets that best represent vibration signals obtained while the gearbox is operating under healthy-state conditions. However, constraints on certain basis characteristics are necessary to enhance the detection of local wave-form changes caused by certain types of gear damage. The proposed methodology analyzes individual tooth-mesh waveforms from a healthy-state gearbox vibration signal that was generated using the vibration separation (synchronous signal-averaging) algorithm. Each waveform is separated into analysis domains using zeros of its slope and curvature. The bases selected in each analysis domain are chosen to minimize the prediction error, and constrained to have the same-sign local slope and curvature as the original signal. The resulting set of bases is used to analyze future-state vibration signals and the lifting prediction error is inspected. The constraints allow the transform to effectively adapt to global amplitude changes, yielding small prediction errors. However, local wave-form changes associated with certain types of gear damage are poorly adapted, causing a significant change in the prediction error. The constrained adaptive lifting diagnostic algorithm is validated using data collected from the University of Maryland Transmission Test Rig and the results are discussed.
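The constrained adaptive basis selection is beyond a short sketch, but the core lifting idea, predicting one polyphase component from the other and inspecting the prediction error, can be shown with a plain linear-prediction lifting step. The waveforms below are synthetic stand-ins for tooth-mesh signals.

```python
import numpy as np

def linear_lifting_detail(x):
    """One linear-prediction lifting step: predict each odd-indexed sample
    from the mean of its even-indexed neighbours; the prediction error
    (detail coefficients) grows where the local wave shape changes."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    neighbour = np.roll(even, -1)
    neighbour[-1] = even[-1]                       # simple edge handling
    prediction = 0.5 * (even[:odd.size] + neighbour[:odd.size])
    return odd - prediction                        # detail / prediction error

# a smooth 'healthy' waveform vs. one with a local distortion
t = np.linspace(0, 1, 128, endpoint=False)
healthy = np.sin(2 * np.pi * 4 * t)
damaged = healthy.copy()
damaged[60:68] += 0.4                              # local waveform change
print(np.abs(linear_lifting_detail(healthy)).max(),
      np.abs(linear_lifting_detail(damaged)).max())
```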
Tanis, Wilco; Habets, Jesse; van den Brink, Renee B A; Symersky, Petr; Budde, Ricardo P J; Chamuleau, Steven A J
2014-02-01
For acquired mechanical prosthetic heart valve (PHV) obstruction with suspicion of thrombosis, recently updated European Society of Cardiology guidelines advocate the confirmation of thrombus by transthoracic echocardiography, transesophageal echocardiography (TEE), and fluoroscopy. However, no evidence-based diagnostic algorithm is available for correct thrombus detection, although this is clinically important as fibrinolysis is contraindicated in non-thrombotic obstruction (isolated pannus). Here, we performed a review of the literature in order to propose a diagnostic algorithm. We performed a systematic search in Pubmed and Embase. Included publications were assessed for methodological quality based on the validated Quality Assessment of Diagnostic Accuracy Studies (QUADAS) II checklist. Studies were scarce (n = 15) and the majority were of moderate methodological quality. In total, 238 mechanical PHVs with acquired obstruction and a reliable reference standard were included for the evaluation of the role of fluoroscopy, echocardiography, or multidetector-row computed tomography (MDCT). In acquired PHV obstruction caused by thrombosis, mass detection by TEE and leaflet restriction detected by fluoroscopy were observed in the majority of cases (96 and 100%, respectively). In contrast, in acquired PHV obstruction free of thrombosis (pannus), leaflet restriction detected by fluoroscopy was absent in some cases (17%) and mass detection by TEE was absent in the majority of cases (66%). In case of mass detection by TEE, predictors of obstructive thrombus masses (compared with pannus masses) were leaflet restriction, soft echo density, and increased mass length. In situations of inconclusive echocardiography, MDCT may correctly detect pannus/thrombus based on morphological aspects and localization. In acquired mechanical PHV obstruction without leaflet restriction and without a mass on TEE, obstructive PHV thrombosis cannot be confirmed and consequently fibrinolysis is not advised. Based on the literature search and our opinion, a diagnostic algorithm is provided to correctly identify non-thrombotic PHV obstruction, which is highly relevant in daily clinical practice.
Chang, Ming; Wong, Audrey J S; Raugi, Dana N; Smith, Robert A; Seilie, Annette M; Ortega, Jose P; Bogusz, Kyle M; Sall, Fatima; Ba, Selly; Seydi, Moussa; Gottlieb, Geoffrey S; Coombs, Robert W
2017-01-01
The 2014 CDC 4th generation HIV screening algorithm includes an orthogonal immunoassay to confirm and discriminate HIV-1 and HIV-2 antibodies. Additional nucleic acid testing (NAT) is recommended to resolve indeterminate or undifferentiated HIV seroreactivity. HIV-2 NAT requires a second-line assay to detect HIV-2 total nucleic acid (TNA) in patients' blood cells, as a third of untreated patients have undetectable plasma HIV-2 RNA. To validate a qualitative HIV-2 TNA assay using peripheral blood mononuclear cells (PBMC) from HIV-2-infected Senegalese study participants. We evaluated the assay precision, sensitivity, specificity, and diagnostic performance of an HIV-2 TNA assay. Matched plasma and PBMC samples were collected from 25 HIV-1, 30 HIV-2, 8 HIV-1/-2 dual-seropositive and 25 HIV seronegative individuals. Diagnostic performance was evaluated by comparing the outcome of the TNA assay to the results obtained by the 4th generation HIV screening and confirmatory immunoassays. All PBMC from 30 HIV-2 seropositive participants tested positive for HIV-2 TNA including 23 patients with undetectable plasma RNA. Of the 30 matched plasma specimens, one was HIV non-reactive. Samples from 50 non-HIV-2 infected individuals were confirmed as non-reactive for HIV-2 Ab and negative for HIV-2 TNA. The agreement between HIV-2 TNA and the combined immunoassay results was 98.8% (79/80). Furthermore, HIV-2 TNA was detected in 7 of 8 PBMC specimens from HIV-1/HIV-2 dual-seropositive participants. Our TNA assay detected HIV-2 DNA/RNA in PBMC from serologically HIV-2 reactive, HIV indeterminate or HIV undifferentiated individuals with undetectable plasma RNA, and is suitable for confirming HIV-2 infection in the HIV testing algorithm. Copyright © 2016 Elsevier B.V. All rights reserved.
Willemsen, Robert T A; Buntinx, Frank; Winkens, Bjorn; Glatz, Jan F; Dinant, Geert Jan
2014-12-12
Chest complaints presented to a general practitioner (GP) are frequently caused by diseases with favourable outcomes. However, in some cases, acute coronary syndrome (ACS) is present (1.5-22% of cases). The patient's signs, symptoms and electrocardiography results are insufficient diagnostic tools to distinguish mild disease from ACS. Therefore, most patients presenting with chest complaints are referred to secondary care facilities, where ACS is then ruled out in the majority of patients (78%). Recently, a point-of-care test for heart-type fatty acid-binding protein (H-FABP), using a low cut-off value between positive and negative of 4 ng/ml, has become available. We aim to study the role of this point-of-care device in the triage of patients presenting with chest complaints possibly due to ACS in primary care. Our research protocol is presented in this article. Results are expected in 2015. Participating GPs will register signs and symptoms in all patients presenting with chest complaints possibly due to ACS. Point-of-care H-FABP testing will also be performed. Our study will be a derivation study to identify signs and symptoms that, combined with point-of-care H-FABP testing, can be part of an algorithm to either confirm or rule out ACS. The diagnostic value of this algorithm for ACS in general practice will be determined. A safe diagnostic elimination of ACS by application of the algorithm could be of significant clinical relevance. Improved triage, and thus a reduction of the number of patients with chest complaints without underlying ACS who are referred to secondary care facilities, could lead to a substantial cost reduction. ClinicalTrials.gov, NCT01826994, accepted April 8th 2013.
Electric machine differential for vehicle traction control and stability control
NASA Astrophysics Data System (ADS)
Kuruppu, Sandun Shivantha
Evolving requirements in energy efficiency and tightening regulations for reliable electric drivetrains drive the advancement of hybrid electric vehicle (HEV) and full electric vehicle (EV) technology. Different configurations of EV and HEV architectures are evaluated for their performance. Future technology is trending towards utilizing the distinctive properties of electric machines not only to improve efficiency but also to realize advanced road adhesion control and vehicle stability control. The electric machine differential (EMD) is such a concept under current investigation for applications in the near future. Reliability of a power train is critical; therefore, sophisticated fault detection schemes are essential in guaranteeing reliable operation of a complex system such as an EMD. The research presented here emphasizes the implementation of a 4 kW electric machine differential, a novel single open phase (SPO) fault diagnostic scheme, the implementation of a real-time slip optimization algorithm, and an electric machine differential based yaw stability improvement study. The proposed d-q current signature based SPO fault diagnostic algorithm detects the fault within one electrical cycle. The EMD-based extremum seeking slip optimization algorithm reduces stopping distance by 30% compared to hydraulic braking based ABS.
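A hedged sketch of the d-q current signature idea (not the thesis' specific detector): the Park transform maps balanced three-phase currents to a nearly constant d-q vector, while an open phase introduces a pronounced oscillation in the d-q magnitude that a simple detector can gate on.

```python
import numpy as np

def abc_to_dq(ia, ib, ic, theta):
    """Amplitude-invariant Park transform of three phase currents into the
    rotating d-q frame at electrical angle theta (radians)."""
    d = (2 / 3) * (ia * np.cos(theta)
                   + ib * np.cos(theta - 2 * np.pi / 3)
                   + ic * np.cos(theta + 2 * np.pi / 3))
    q = -(2 / 3) * (ia * np.sin(theta)
                    + ib * np.sin(theta - 2 * np.pi / 3)
                    + ic * np.sin(theta + 2 * np.pi / 3))
    return d, q

# balanced currents give a nearly constant (d, q); an open phase (ic = 0)
# superimposes an oscillation at twice the electrical frequency
theta = np.linspace(0, 2 * np.pi, 400)
ia, ib, ic = np.sin(theta), np.sin(theta - 2 * np.pi / 3), np.sin(theta + 2 * np.pi / 3)
d_ok, q_ok = abc_to_dq(ia, ib, ic, theta)
d_flt, q_flt = abc_to_dq(ia, ib, np.zeros_like(ic), theta)
print(np.ptp(np.hypot(d_ok, q_ok)), np.ptp(np.hypot(d_flt, q_flt)))
```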
NASA Technical Reports Server (NTRS)
Hunter, H. E.
1972-01-01
The Avco Data Analysis and Prediction Techniques (ADAPT) were employed to determine laws capable of detecting failures in a heat plant up to three days in advance of the occurrence of the failure. The projected performance of algorithms yielded a detection probability of 90% with false alarm rates of the order of 1 per year for a sample rate of 1 per day with each detection, followed by 3 hourly samplings. This performance was verified on 173 independent test cases. The program also demonstrated diagnostic algorithms and the ability to predict the time of failure to approximately plus or minus 8 hours up to three days in advance of the failure. The ADAPT programs produce simple algorithms which have a unique possibility of a relatively low cost updating procedure. The algorithms were implemented on general purpose computers at Kennedy Space Flight Center and tested against current data.
Zhu, Bohui; Ding, Yongsheng; Hao, Kuangrong
2013-01-01
This paper presents a novel maximum margin clustering method with immune evolution (IEMMC) for automatic diagnosis of electrocardiogram (ECG) arrhythmias. This diagnostic system consists of signal processing, feature extraction, and the IEMMC algorithm for clustering of ECG arrhythmias. First, the raw ECG signal is processed by an adaptive ECG filter based on wavelet transforms, and the waveform of the ECG signal is detected; then, features are extracted from the ECG signal to cluster different types of arrhythmias by the IEMMC algorithm. Three types of performance evaluation indicators are used to assess the effect of the IEMMC method for ECG arrhythmias, namely sensitivity, specificity, and accuracy. Compared with the K-means and iterSVR algorithms, the IEMMC algorithm shows better performance not only in clustering results but also in terms of global search ability and convergence, which demonstrates its effectiveness for the detection of ECG arrhythmias. PMID:23690875
Dynamic path planning for mobile robot based on particle swarm optimization
NASA Astrophysics Data System (ADS)
Wang, Yong; Cai, Feng; Wang, Ying
2017-08-01
Robots are now used in many fields, such as cleaning, medical treatment, space exploration and disaster relief. Collision-free dynamic path planning for mobile robots is attracting increasing attention. A new method of path planning is proposed in this paper. Firstly, the motion space model of the robot is established using the MAKLINK graph method. Then the A* algorithm is used to obtain the shortest path from the start point to the end point. Secondly, this paper proposes an effective method to detect and avoid obstacles: when an obstacle is detected on the shortest path, the robot chooses the nearest safe point to move to and then calculates the next point nearest to the target. Finally, the particle swarm optimization algorithm is used to optimize the path. The experimental results show that the proposed method is effective.
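The global-planning stage can be sketched with a compact A* search over an occupancy grid, as below. The MAKLINK free-space decomposition, the obstacle-avoidance replanning and the PSO refinement described above are not reproduced here; the grid is an invented example.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = obstacle),
    with a Manhattan-distance heuristic."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier, came, cost = [(h(start), start)], {start: None}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dx, cur[1] + dy)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0):
                g = cost[cur] + 1
                if g < cost.get(nxt, float("inf")):
                    cost[nxt], came[nxt] = g, cur
                    heapq.heappush(frontier, (g + h(nxt), nxt))
    return None  # no path found

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```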
A Novel and Simple Spike Sorting Implementation.
Petrantonakis, Panagiotis C; Poirazi, Panayiota
2017-04-01
Monitoring the activity of multiple individual neurons that fire spikes in the vicinity of an electrode, namely performing a spike sorting (SS) procedure, comprises one of the most important tools of contemporary neuroscience for reverse-engineering the brain. As recording electrode technology rapidly evolves by integrating thousands of electrodes in a confined spatial setting, the algorithms that are used to monitor individual neurons from recorded signals have to become even more reliable and computationally efficient. In this work, we propose a novel framework for the SS approach in which a single-step processing of the raw (unfiltered) extracellular signal is sufficient for both the detection and sorting of the activity of individual neurons. Despite its simplicity, the proposed approach exhibits performance comparable with state-of-the-art approaches, especially for spike detection in noisy signals, and paves the way for a new family of SS algorithms with the potential for multi-recording, fast, on-chip implementations.
[Diagnostic tactics in localized fibroadenomatosis of the breast].
LI, L A; Martyniuk, V V; Khudiakova, T G; Kondrat'ev, V B
2000-01-01
Results of the examination and treatment of 203 patients with localized fibroadenomatosis of the mammary gland are described. The authors assess the diagnostic algorithms used when the physical signs of the disease are detected. It was shown that the examination of patients with palpable signs of nodular mastopathy should necessarily include mammography, which can eliminate the unwarranted risk of underdiagnosing carcinoma of the mammary gland and identify the group of patients in whom emergency operation is not necessary.
Tenório, Josceli Maria; Hummel, Anderson Diniz; Cohrs, Frederico Molina; Sdepanian, Vera Lucia; Pisa, Ivan Torres; de Fátima Marin, Heimar
2013-01-01
Background: Celiac disease (CD) is a difficult-to-diagnose condition because of its multiple clinical presentations and symptoms shared with other diseases. Gold-standard diagnostic confirmation of suspected CD is achieved by biopsying the small intestine. Objective: To develop a clinical decision-support system (CDSS) integrated with an automated classifier to recognize CD cases, selecting from experimental models developed using artificial intelligence techniques. Methods: A web-based system was designed for constructing a retrospective database that included 178 clinical cases for training. Tests were run on 270 automated classifiers available in Weka 3.6.1 using five artificial intelligence techniques, namely decision trees, Bayesian inference, the k-nearest neighbor algorithm, support vector machines and artificial neural networks. The parameters evaluated were accuracy, sensitivity, specificity and area under the ROC curve (AUC). AUC was used as the criterion for selecting the CDSS algorithm. A testing database was constructed including 38 clinical CD cases for CDSS evaluation. The diagnoses suggested by the CDSS were compared with those made by physicians during patient consultations. Results: The most accurate method during the training phase was the averaged one-dependence estimator (AODE) algorithm (a Bayesian classifier), which showed accuracy 80.0%, sensitivity 0.78, specificity 0.80 and AUC 0.84. This classifier was integrated into the web-based decision-support system. The gold-standard validation of the CDSS achieved accuracy of 84.2% and k = 0.68 (p < 0.0001), with good agreement. The same accuracy was achieved in the comparison between the physician's diagnostic impression and the gold standard, k = 0.64 (p < 0.0001). There was moderate agreement between the physician's diagnostic impression and the CDSS, k = 0.46 (p = 0.0008). Conclusions: The study results suggest that the CDSS could be used to help in diagnosing CD, since the algorithm tested achieved excellent accuracy in differentiating possible positive from negative CD diagnoses. This study may contribute towards the development of a computer-assisted environment to support CD diagnosis. PMID:21917512
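The AODE classifier selected in the study is not available in scikit-learn, so the hedged sketch below uses a plain categorical naive Bayes model on invented binary clinical findings simply to show the training/validation pattern against the biopsy gold standard; it is a stand-in, not the published model.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import CategoricalNB

# hypothetical binary findings (e.g. diarrhea, weight loss, anemia, positive
# serology...); y = biopsy-confirmed celiac disease (gold standard)
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(178, 6))
y = rng.integers(0, 2, size=178)

nb = CategoricalNB()
print("cross-validated accuracy:", cross_val_score(nb, X, y, cv=5).mean())
```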
Measurement of fecal elastase improves performance of newborn screening for cystic fibrosis.
Barben, Juerg; Rueegg, Corina S; Jurca, Maja; Spalinger, Johannes; Kuehni, Claudia E
2016-05-01
The aim of newborn screening (NBS) for CF is to detect children with 'classic' CF, where early treatment is possible and improves prognosis. Children with an inconclusive CF diagnosis (CFSPID) should not be detected, as there is no evidence of improvement through early treatment. No algorithm in current NBS guidelines explains what to do when the sweat test (ST) fails. This study compares the performance of three different algorithms for further diagnostic evaluation when the first ST is unsuccessful, regarding the number of children detected with CF and CFSPID and the time until a definite diagnosis. In Switzerland, CF-NBS was introduced in January 2011 using an IRT-DNA-IRT algorithm followed by a ST. In children in whom ST was not possible (no or insufficient sweat), three different protocols were applied between 2011 and 2014: in 2011, ST was repeated until it was successful (protocol A); in 2012, we proceeded directly to diagnostic DNA testing (protocol B); and in 2013-2014, fecal elastase (FE) was measured in the stool in order to identify pancreatic insufficiency needing immediate treatment (protocol C). The ratio CF:CFSPID was 7:1 (27/4) with protocol A, 2:1 (22/10) with protocol B, and 14:1 (54/4) with protocol C. The mean time to definite diagnosis was significantly shorter with protocol C (33 days) compared to protocols A and B (42 and 40 days; p = 0.014 compared to A, and p = 0.036 compared to B). The algorithm used for the diagnostic part of newborn screening in the CF centers is important and affects the performance of a CF-NBS program with regard to the ratio CF:CFSPID and the time until definite diagnosis. Our results suggest including FE after initial sweat test failure in the CF-NBS guidelines to keep the proportion of CFSPID low and the time until definite diagnosis short. Copyright © 2016 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.
On-the-fly detection of images with gastritis aspects in magnetically guided capsule endoscopy
NASA Astrophysics Data System (ADS)
Mewes, P. W.; Neumann, D.; Juloski, A. L.; Angelopoulou, E.; Hornegger, J.
2011-03-01
Capsule Endoscopy (CE) was introduced in 2000 and has since become an established diagnostic procedure for the small bowel, colon and esophagus. For the CE examination the patient swallows the capsule, which then travels through the gastrointestinal tract under the influence of peristaltic movements. CE is not indicated for stomach examination, as the capsule's movements cannot be controlled from the outside and the entire surface of the stomach cannot be reliably covered. Magnetically-guided capsule endoscopy (MGCE) was introduced in 2010. For the MGCE procedure the stomach is filled with water and the capsule is navigated from the outside using an external magnetic field. During the examination the operator can control the motion of the capsule in order to obtain a sufficient number of stomach-surface images with diagnostic value. The quality of the examination depends on the skill of the operator and his ability to detect aspects of interest in real time. We present a novel computer-assisted diagnostic procedure (CADP) algorithm for indicating gastritis pathologies in the stomach during the examination. Our algorithm is based on pre-processing methods and feature vectors that are suitably chosen for the challenges of MGCE imaging (suspended particles, bubbles, lighting). An image is classified using an AdaBoost-trained classifier. For the classifier training, a number of possible features were investigated, and a statistical evaluation was conducted to identify relevant features with discriminative potential. The proposed algorithm was tested on 12 video sequences from 6 volunteers. A mean detection rate of 91.17% was achieved during leave-one-out cross-validation.
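The evaluation scheme, an AdaBoost classifier on per-image feature vectors with one video sequence held out at a time, can be sketched as below; the feature extraction itself is omitted and the variable names are assumptions.

    # Sketch of leave-one-video-out evaluation of an AdaBoost image classifier.
    # Feature extraction (pre-processing of MGCE frames) is omitted.
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import LeaveOneGroupOut

    def leave_one_video_out_detection_rate(features, labels, video_ids):
        """features: (n_images, n_features); labels: gastritis aspect yes/no;
        video_ids: index of the video sequence each image belongs to."""
        rates = []
        for train_idx, test_idx in LeaveOneGroupOut().split(features, labels, groups=video_ids):
            clf = AdaBoostClassifier(n_estimators=100)
            clf.fit(features[train_idx], labels[train_idx])
            rates.append(clf.score(features[test_idx], labels[test_idx]))
        return float(np.mean(rates))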
Minet, V; Baudar, J; Bailly, N; Douxfils, J; Laloy, J; Lessire, S; Gourdin, M; Devalet, B; Chatelain, B; Dogné, J M; Mullier, F
2014-06-01
Accurate diagnosis of heparin-induced thrombocytopenia (HIT) is essential but remains challenging. We have previously demonstrated, in a retrospective study, the usefulness of combining the 4Ts score, the AcuStar HIT assay and heparin-induced multiple electrode aggregometry (HIMEA) with optimized thresholds. We aimed to prospectively explore the performance of our optimized diagnostic algorithm in patients with suspected HIT. The secondary objective was to evaluate the performance of the AcuStar HIT-Ab (PF4-H) assay against the clinical outcome. 116 inpatients with clinically suspected immune HIT were included, and our optimized diagnostic algorithm was applied to each patient. The sensitivity, specificity, negative predictive value (NPV) and positive predictive value (PPV) of the overall diagnostic strategy, as well as of the AcuStar HIT-Ab (at the manufacturer's threshold and at our threshold), were calculated using the clinical diagnosis as the reference. Among the 116 patients, 2 had clinically diagnosed HIT; both were positive on AcuStar HIT-Ab, AcuStar HIT-IgG and HIMEA. Using our optimized algorithm, all patients were correctly diagnosed. The AcuStar HIT-Ab at our cut-off (>9.41 U/mL) and at the manufacturer's cut-off (>1.00 U/mL) both showed a sensitivity of 100.0%, with specificities of 99.1% and 90.4%, respectively. The combination of the 4Ts score, the HemosIL® AcuStar HIT and HIMEA with optimized thresholds may be useful for the rapid and accurate exclusion of the diagnosis of immune HIT. Copyright © 2014 Elsevier Ltd. All rights reserved.
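The abstract does not spell out the decision logic of the optimized algorithm; the sketch below shows one plausible sequential combination of the three elements (4Ts score, AcuStar HIT-Ab immunoassay, HIMEA). The ordering and the low-probability step are assumptions; only the immunoassay cut-offs come from the abstract.

    # Hypothetical sequential HIT work-up combining the 4Ts score, the AcuStar
    # HIT-Ab immunoassay and HIMEA. Only the cut-off values (>9.41 U/mL optimized,
    # >1.00 U/mL manufacturer) are taken from the abstract; the rest is assumed.
    ACUSTAR_OPTIMIZED_CUTOFF = 9.41     # U/mL
    ACUSTAR_MANUFACTURER_CUTOFF = 1.00  # U/mL

    def hit_workup(four_ts_score, acustar_hit_ab, himea_positive, cutoff=ACUSTAR_OPTIMIZED_CUTOFF):
        if four_ts_score <= 3:            # low pretest probability (assumed rule-out step)
            return "HIT unlikely; no further testing"
        if acustar_hit_ab <= cutoff:      # immunoassay result below the cut-off
            return "HIT excluded by immunoassay"
        return "HIT confirmed" if himea_positive else "HIT not confirmed by functional assay"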
Modern trends in the evolution of laser information technology in oncology
NASA Astrophysics Data System (ADS)
Mikov, A. A.; Svirin, V. N.
2008-04-01
Laser-optical information technologies and devices have been developed since the 1970s and are now broadly used for the diagnosis and treatment of oncological diseases. Although methods such as photodynamic therapy (PDT), laser-induced thermotherapy (LITT), fluorescence diagnostics and spectrophotometry have been used for the treatment and diagnosis of oncological diseases for more than 30 years, they remain relatively new and, as a rule, are employed only in large scientific centers and medical institutions. This is due, first of all, to a lack of information on modern methods of cancer treatment, to the absence of widely available laser procedures and corresponding devices in polyclinics and even district hospitals, and to an insufficient understanding of the application areas in which laser methods have an advantage over, for instance, radiation therapy or chemotherapy. At present, laser methods are a fast-developing direction in the treatment of oncological diseases. This is explained by progress in laser development, particularly diode lasers, by improvements in electronic and computing components, and by the broad introduction of software-algorithmic methods for controlling therapeutic and diagnostic procedures. This article considers new laser methods for diagnostic and therapeutic procedures and shows that the introduction of multiwavelength laser radiation for probing and acting on tissue, different methods for determining the functional state of tissues, on-line diagnostics during therapeutic procedures, automatic control of laser power depending on the state of the patient's tissue, and software-algorithmic management of therapeutic and diagnostic sessions greatly increase the efficiency of treating oncological diseases. Using the example of multipurpose laser therapeutic devices ("MLTA") and multipurpose laser diagnostic complexes ("MLDC") developed and introduced into clinical practice, which implement the proposed methods, the basic trends in the development of laser methods in oncology are shown, concrete technical solutions are presented, and experimental clinical material demonstrating increased efficiency of cancer treatment is reported. Implementation of the proposed methods and technologies opens new competitive advantages for laser technologies in comparison with radiation therapy and chemotherapy in the treatment of oncological diseases.
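One element mentioned above, automatic control of laser power depending on the state of the patient's tissue, is essentially a feedback loop between an on-line tissue measurement and the laser output. The toy sketch below illustrates that idea only; the sensor/actuator callables, gain and set-point are hypothetical and do not describe the MLTA/MLDC devices.

    # Toy proportional feedback loop: adjust laser power toward a target tissue
    # reading measured on-line. All parameters and callables are hypothetical.
    def regulate_laser_power(read_tissue_signal, set_laser_power,
                             target=43.0, gain=0.05,
                             power_w=1.0, min_w=0.0, max_w=5.0, steps=100):
        for _ in range(steps):
            error = target - read_tissue_signal()
            power_w = min(max_w, max(min_w, power_w + gain * error))
            set_laser_power(power_w)
        return power_w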
van der Linden, Noreen; Wildi, Karin; Twerenbold, Raphael; Pickering, John W; Than, Martin; Cullen, Louise; Greenslade, Jaimi; Parsonage, William; Nestelberger, Thomas; Boeddinghaus, Jasper; Badertscher, Patrick; Rubini Giménez, Maria; Klinkenberg, Lieke J J; Bekers, Otto; Schöni, Aline; Keller, Dagmar I; Sabti, Zaid; Puelacher, Christian; Cupa, Janosch; Schumacher, Lukas; Kozhuharov, Nikola; Grimm, Karin; Shrestha, Samyut; Flores, Dayana; Freese, Michael; Stelzig, Claudia; Strebel, Ivo; Miró, Òscar; Rentsch, Katharina; Morawiec, Beata; Kawecki, Damian; Kloos, Wanda; Lohrmann, Jens; Richards, A Mark; Troughton, Richard; Pemberton, Christopher; Osswald, Stefan; van Dieijen-Visser, Marja P; Mingels, Alma M; Reichlin, Tobias; Meex, Steven J R; Mueller, Christian
2018-04-24
Background: Combining two signals of cardiomyocyte injury, cardiac troponin I (cTnI) and T (cTnT), might overcome some individual pathophysiological and analytical limitations and thereby increase diagnostic accuracy for acute myocardial infarction (AMI) with a single blood draw. We aimed to evaluate the diagnostic performance of combinations of high-sensitivity (hs) cTnI and hs-cTnT for the early diagnosis of AMI. Methods: The diagnostic performance of combining hs-cTnI (Architect, Abbott) and hs-cTnT (Elecsys, Roche) concentrations (sum, product, ratio and a combination algorithm) obtained at the time of presentation was evaluated in a large multicenter diagnostic study of patients with suspected AMI. The optimal rule-out and rule-in thresholds were externally validated in a second large multicenter diagnostic study. The proportion of patients eligible for early rule-out was compared with the ESC 0/1 and 0/3 hour algorithms. Results: Combining hs-cTnI and hs-cTnT concentrations did not consistently increase overall diagnostic accuracy as compared with the individual isoforms. However, the combination improved the proportion of patients meeting criteria for very early rule-out. With the ESC 2015 guideline-recommended algorithms and cut-offs, the proportion meeting rule-out criteria after the baseline blood sampling was limited (6-24%) and assay dependent. Application of optimized cut-off values using the sum (9 ng/L) and product (18 ng²/L²) of hs-cTnI and hs-cTnT concentrations led to an increase in the proportion ruled out after a single blood draw to 34-41% in the original cohort (sum: negative predictive value (NPV) 100% (95% CI: 99.5-100%); product: NPV 100% (95% CI: 99.5-100%)) and in the validation cohort (sum: NPV 99.6% (95% CI: 99.0-99.9%); product: NPV 99.4% (95% CI: 98.8-99.8%)). The use of a combination algorithm (hs-cTnI <4 ng/L and hs-cTnT <9 ng/L) showed comparable results for rule-out (40-43% ruled out; NPV original cohort 99.9% (95% CI: 99.2-100%); NPV validation cohort 99.5% (95% CI: 98.9-99.8%)) and rule-in (PPV original cohort 74.4% (95% CI: 69.6-78.8%); PPV validation cohort 84.0% (95% CI: 79.7-87.6%)). Conclusions: New strategies combining hs-cTnI and hs-cTnT concentrations may significantly increase the number of patients eligible for very early and safe rule-out, but do not seem helpful for the rule-in of AMI. Clinical Trial Registration: APACE URL: www.clinicaltrial.gov, Unique Identifier: NCT00470587; ADAPT URL: www.anzctr.org.au, Unique Identifier: ACTRN12611001069943.
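The single-draw rule-out strategies compared above reduce to simple threshold checks. The cut-off values below are those reported in the abstract; the wrapper functions themselves are only an illustrative sketch, and the assumption that values below the sum/product cut-offs rule out AMI follows from the context.

    # Sketch of the single blood-draw rule-out rules evaluated above
    # (concentrations in ng/L). Cut-off values are from the abstract.
    def rule_out_sum(hs_ctni, hs_ctnt):
        """Sum rule: rule out AMI if hs-cTnI + hs-cTnT < 9 ng/L."""
        return (hs_ctni + hs_ctnt) < 9.0

    def rule_out_product(hs_ctni, hs_ctnt):
        """Product rule: rule out AMI if hs-cTnI * hs-cTnT < 18 ng^2/L^2."""
        return (hs_ctni * hs_ctnt) < 18.0

    def rule_out_combination(hs_ctni, hs_ctnt):
        """Combination algorithm: hs-cTnI < 4 ng/L and hs-cTnT < 9 ng/L."""
        return hs_ctni < 4.0 and hs_ctnt < 9.0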
Model-Based Diagnosis in a Power Distribution Test-Bed
NASA Technical Reports Server (NTRS)
Scarl, E.; McCall, K.
1998-01-01
The Rodon model-based diagnosis shell was applied to a breadboard test-bed, modeling an automated power distribution system. The constraint-based modeling paradigm and diagnostic algorithm were found to adequately represent the selected set of test scenarios.
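Rodon itself is a proprietary shell, so the sketch below does not reflect its API; it only illustrates the underlying consistency-based diagnosis idea on a toy two-component power-distribution model, where a diagnosis is a minimal set of components whose assumed failure makes the remaining healthy-component constraints consistent with the observations.

    # Toy consistency-based diagnosis (the general idea behind model-based shells
    # such as Rodon; not Rodon's API). Each healthy component contributes a
    # constraint; diagnoses are minimal fault sets that restore consistency.
    from itertools import combinations

    def breaker_ok(obs):  # a closed, healthy breaker passes power through
        return obs["bus_in"] == obs["bus_out"]

    def load_ok(obs):     # a healthy load turns on when its bus is powered
        return (not obs["bus_out"]) or obs["load_on"]

    COMPONENT_MODELS = {"breaker": breaker_ok, "load": load_ok}

    def diagnoses(observations, max_faults=1):
        names = list(COMPONENT_MODELS)
        for k in range(max_faults + 1):
            found = [set(faulty) for faulty in combinations(names, k)
                     if all(COMPONENT_MODELS[c](observations)
                            for c in names if c not in faulty)]
            if found:
                return found   # minimal-cardinality diagnoses
        return []

    # Power reaches the bus but the load stays off -> the load is the suspect.
    print(diagnoses({"bus_in": True, "bus_out": True, "load_on": False}))  # [{'load'}]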
Listening to hypochondriasis and hearing health anxiety.
Braddock, Autumn E; Abramowitz, Jonathan S
2006-09-01
Although hypochondriasis is categorized as a somatoform disorder in the Diagnostic and Statistical Manual of Mental Disorders Fourth Edition--Text Revision (DSM-IV-TR) due to excessive focus on bodily symptoms for at least 6 months, a contemporary conceptualization suggests that hypochondriasis represents an intense form of health anxiety. This article discusses the clinical presentation of hypochondriasis, etiological underpinnings and multiple maintaining factors, including physiological, cognitive and behavioral components. A cognitive-behavioral model of hypochondriasis as health anxiety and the empirically supported treatment based on the model are articulated. Future directions and informational resources are provided for both clinicians and patients.
The impact of psychopharmacology on contemporary clinical psychiatry.
Vázquez, Gustavo H
2014-08-01
Clinical psychiatric evaluations of patients have changed dramatically in recent decades. Both initial assessments and follow-up visits have become brief and superficial, focused on searching for categorical diagnostic criteria from checklists, with limited inquiry into patient-reported symptomatic status and tolerability of treatments. The virtually exclusive therapeutic task has become selecting a plausible psychotropic, usually based on expert consensus guidelines. These guidelines and practice patterns rest mainly on published monotherapy trials that may or may not be applicable to particular patients but are having a profound impact, not only on modern psychiatric practice but also on psychiatric education, research, and theory.