Eliseev, Platon; Balantcev, Grigory; Nikishova, Elena; Gaida, Anastasia; Bogdanova, Elena; Enarson, Donald; Ornstein, Tara; Detjen, Anne; Dacombe, Russell; Gospodarevskaya, Elena; Phillips, Patrick P J; Mann, Gillian; Squire, Stephen Bertel; Mariandyshev, Andrei
2016-01-01
In the Arkhangelsk region of Northern Russia, multidrug-resistant (MDR) tuberculosis (TB) rates in new cases are amongst the highest in the world. In 2014, MDR-TB rates reached 31.7% among new cases and 56.9% among retreatment cases. The development of new diagnostic tools allows for faster detection of both TB and MDR-TB and should lead to reduced transmission by earlier initiation of anti-TB therapy. The PROVE-IT (Policy Relevant Outcomes from Validating Evidence on Impact) Russia study aimed to assess the impact of implementing the line probe assay (LPA) as part of an LPA-based diagnostic algorithm for patients with presumptive MDR-TB, taking time from the first care-seeking visit to the initiation of MDR-TB treatment, rather than diagnostic accuracy, as the primary outcome, and to assess treatment outcomes. We hypothesized that the implementation of LPA would result in a faster time to treatment initiation and better treatment outcomes. A culture-based diagnostic algorithm used prior to LPA implementation was compared to an LPA-based algorithm that replaced BacTAlert and Löwenstein-Jensen (LJ) for drug sensitivity testing. A total of 295 MDR-TB patients were included in the study: 163 diagnosed with the culture-based algorithm and 132 with the LPA-based algorithm. Among smear-positive patients, the implementation of the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 50 and 66 days compared to the culture-based algorithm (BacTAlert and LJ respectively, p<0.001). In smear-negative patients, the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 78 days when compared to the culture-based algorithm (LJ, p<0.001). However, several weeks were still needed for treatment initiation under the LPA-based algorithm: 24 days in smear-positive and 62 days in smear-negative patients. Overall treatment outcomes were better with the LPA-based algorithm than with the culture-based algorithm (p = 0.003). Treatment success rates at 20 months of treatment were higher in patients diagnosed with the LPA-based algorithm (65.2%) than in those diagnosed with the culture-based algorithm (44.8%). Mortality was also lower in the LPA-based algorithm group (7.6%) compared to the culture-based algorithm group (15.9%). There was no statistically significant difference in smear and culture conversion rates between the two algorithms. The results of the study suggest that the introduction of LPA leads to a faster time to MDR-TB diagnosis, earlier treatment initiation, and better treatment outcomes for patients with MDR-TB. These findings also highlight the need for further improvements within the health system to reduce both patient and diagnostic delays to truly optimize the impact of new, rapid diagnostics.
Dunbar, R; Naidoo, P; Beyers, N; Langley, I
2017-04-01
Cape Town, South Africa. To compare the diagnostic yield for smear/culture and Xpert® MTB/RIF algorithms and to investigate the mechanisms influencing tuberculosis (TB) yield. We developed and validated an operational model of the TB diagnostic process, first with the smear/culture algorithm and then with the Xpert algorithm. We modelled scenarios by varying TB prevalence, adherence to diagnostic algorithms and human immunodeficiency virus (HIV) status. This enabled direct comparisons of diagnostic yield in the two algorithms to be made. Routine data showed that diagnostic yield had decreased over the period of the Xpert algorithm roll-out compared to the yield when the smear/culture algorithm was in place. However, modelling yield under identical conditions indicated a 13.3% increase in diagnostic yield from the Xpert algorithm compared to smear/culture. The model demonstrated that the extensive use of culture in the smear/culture algorithm and the decline in TB prevalence are the main factors contributing to not finding an increase in diagnostic yield in the routine data. We demonstrate the benefits of an operational model to determine the effect of scale-up of a new diagnostic algorithm, and recommend that policy makers use operational modelling to make appropriate decisions before new diagnostic algorithms are scaled up.
Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie
2009-01-01
Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
Diagnostic Utility of the ADI-R and DSM-5 in the Assessment of Latino Children and Adolescents.
Magaña, Sandy; Vanegas, Sandra B
2017-05-01
Latino children in the US are systematically underdiagnosed with Autism Spectrum Disorder (ASD); therefore, it is important that recent changes to the diagnostic process do not exacerbate this pattern of under-identification. Previous research has found that the Autism Diagnostic Interview-Revised (ADI-R) algorithm, based on the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR), has limitations with Latino children of Spanish-speaking parents. We evaluated whether an ADI-R algorithm based on the new DSM-5 classification for ASD would be more sensitive in identifying Latino children of Spanish-speaking parents who have a clinical diagnosis of ASD. Findings suggest that the DSM-5 algorithm shows better sensitivity than the DSM-IV-TR algorithm for Latino children.
Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhury, Indranil
2010-01-01
We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced and lessons learned from implementing the approach.
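To make the qualitative event-based isolation idea concrete, here is a minimal Python sketch (not the authors' implementation): each fault hypothesis predicts an ordered sequence of qualitative measurement deviations, and observed deviation events prune the candidate set. The fault names and signatures are invented for illustration.

```python
# Minimal sketch of qualitative event-based fault isolation: each fault
# predicts an ordered sequence of (measurement, direction) deviations, and the
# observed deviations must appear as an in-order subsequence of a fault's
# prediction for that fault to remain a candidate. Signatures are illustrative.

FAULT_SIGNATURES = {
    "relay_stuck_open": [("load_voltage", "-"), ("load_current", "-")],
    "sensor_bias_high": [("load_voltage", "+")],
    "battery_degraded": [("battery_voltage", "-"), ("load_voltage", "-")],
}

def isolate(observed_events):
    """Return the faults whose predicted deviation sequence is consistent
    with the observed events (in-order subsequence check)."""
    candidates = []
    for fault, predicted in FAULT_SIGNATURES.items():
        it = iter(predicted)                      # consumed as events are matched
        if all(event in it for event in observed_events):
            candidates.append(fault)
    return candidates

print(isolate([("load_voltage", "-")]))                         # two candidates remain
print(isolate([("load_voltage", "-"), ("load_current", "-")]))  # isolated to one fault
```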
SUMIE, HIROAKI; SUMIE, SHUJI; NAKAHARA, KEITA; WATANABE, YASUTOMO; MATSUO, KEN; MUKASA, MICHITA; SAKAI, TAKESHI; YOSHIDA, HIKARU; TSURUTA, OSAMU; SATA, MICHIO
2014-01-01
The usefulness of magnifying endoscopy with narrow-band imaging (ME-NBI) for the diagnosis of early gastric cancer is well known; however, there are no evaluation criteria. The aim of this study was to devise and evaluate a novel diagnostic algorithm for ME-NBI in depressed early gastric cancer. Between August 2007 and May 2011, 90 patients with a total of 110 depressed gastric lesions were enrolled in the study. A diagnostic algorithm was devised based on ME-NBI microvascular findings: microvascular irregularity and abnormal microvascular patterns (fine network, corkscrew and unclassified patterns). The diagnostic efficiency of the algorithm for gastric cancer and histological grade was assessed by measuring its mean sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy. Furthermore, inter- and intra-observer variation were measured. In the differential diagnosis of gastric cancer from non-cancerous lesions, the mean sensitivity, specificity, PPV, NPV, and accuracy of the diagnostic algorithm were 86.7, 48.0, 94.4, 26.7, and 83.2%, respectively. Furthermore, in the differential diagnosis of undifferentiated adenocarcinoma from differentiated adenocarcinoma, the mean sensitivity, specificity, PPV, NPV, and accuracy of the diagnostic algorithm were 61.6, 86.3, 69.0, 84.8, and 79.1%, respectively. For the ME-NBI final diagnosis using this algorithm, the mean κ values for inter- and intra-observer agreement were 0.50 and 0.77, respectively. In conclusion, the diagnostic algorithm based on ME-NBI microvascular findings was convenient and had high diagnostic accuracy, reliability and reproducibility in the differential diagnosis of depressed gastric lesions. PMID:24649321
A Hybrid Neural Network-Genetic Algorithm Technique for Aircraft Engine Performance Diagnostics
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2001-01-01
In this paper, a model-based diagnostic method, which utilizes Neural Networks and Genetic Algorithms, is investigated. Neural networks are applied to estimate the engine internal health, and Genetic Algorithms are applied for sensor bias detection and estimation. This hybrid approach takes advantage of the nonlinear estimation capability provided by neural networks while improving the robustness to measurement uncertainty through the application of Genetic Algorithms. The hybrid diagnostic technique also has the ability to rank multiple potential solutions for a given set of anomalous sensor measurements in order to reduce false alarms and missed detections. The performance of the hybrid diagnostic technique is evaluated through some case studies derived from a turbofan engine simulation. The results show this approach is promising for reliable diagnostics of aircraft engines.
Computer-aided US diagnosis of breast lesions by using cell-based contour grouping.
Cheng, Jie-Zhi; Chou, Yi-Hong; Huang, Chiun-Sheng; Chang, Yeun-Chung; Tiu, Chui-Mei; Chen, Kuei-Wu; Chen, Chung-Ming
2010-06-01
To develop a computer-aided diagnostic algorithm with automatic boundary delineation for differential diagnosis of benign and malignant breast lesions at ultrasonography (US) and investigate the effect of boundary quality on the performance of a computer-aided diagnostic algorithm. This was an institutional review board-approved retrospective study with waiver of informed consent. A cell-based contour grouping (CBCG) segmentation algorithm was used to delineate the lesion boundaries automatically. Seven morphologic features were extracted. The classifier was a logistic regression function. Five hundred twenty breast US scans were obtained from 520 subjects (age range, 15-89 years), including 275 benign (mean size, 15 mm; range, 5-35 mm) and 245 malignant (mean size, 18 mm; range, 8-29 mm) lesions. The newly developed computer-aided diagnostic algorithm was evaluated on the basis of boundary quality and differentiation performance. The segmentation algorithms and features in two conventional computer-aided diagnostic algorithms were used for comparative study. The CBCG-generated boundaries were shown to be comparable with the manually delineated boundaries. The area under the receiver operating characteristic curve (AUC) and differentiation accuracy were 0.968 +/- 0.010 and 93.1% +/- 0.7, respectively, for all 520 breast lesions. At the 5% significance level, the newly developed algorithm was shown to be superior to the use of the boundaries and features of the two conventional computer-aided diagnostic algorithms in terms of AUC (0.974 +/- 0.007 versus 0.890 +/- 0.008 and 0.788 +/- 0.024, respectively). The newly developed computer-aided diagnostic algorithm that used a CBCG segmentation method to measure boundaries achieved a high differentiation performance. Copyright RSNA, 2010
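The classification stage described above, a logistic regression over lesion morphologic features evaluated by ROC-AUC, can be sketched as follows; the segmentation step is omitted, the data are synthetic stand-ins, and the seven feature names are assumptions rather than the paper's.

```python
# Sketch of the classification stage only (CBCG segmentation omitted): a
# logistic regression over seven morphologic features, evaluated by
# cross-validated ROC-AUC on synthetic stand-in data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 520
X = rng.normal(size=(n, 7))                      # seven hypothetical morphologic features
w_true = np.array([1.5, -1.0, 0.8, 0.0, 0.5, -0.3, 0.2])
y = (X @ w_true + rng.normal(size=n) > 0).astype(int)   # 1 = malignant (toy label)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print("cross-validated AUC:", round(roc_auc_score(y, scores), 3))
```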
Kros, Johan M; Huizer, Karin; Hernández-Laín, Aurelio; Marucci, Gianluca; Michotte, Alex; Pollo, Bianca; Rushing, Elisabeth J; Ribalta, Teresa; French, Pim; Jaminé, David; Bekka, Nawal; Lacombe, Denis; van den Bent, Martin J; Gorlia, Thierry
2015-06-10
With the rapid discovery of prognostic and predictive molecular parameters for glioma, the status of histopathology in the diagnostic process should be scrutinized. Our project aimed to construct a diagnostic algorithm for gliomas based on molecular and histologic parameters with independent prognostic values. The pathology slides of 636 patients with gliomas who had been included in EORTC 26951 and 26882 trials were reviewed using virtual microscopy by a panel of six neuropathologists who independently scored 18 histologic features and provided an overall diagnosis. The molecular data for IDH1, 1p/19q loss, EGFR amplification, loss of chromosome 10 and chromosome arm 10q, gain of chromosome 7, and hypermethylation of the promoter of MGMT were available for some of the cases. The slides were divided into discovery (n = 426) and validation (n = 210) sets. The diagnostic algorithm resulting from analysis of the discovery set was validated in the latter. In 66% of cases, consensus of overall diagnosis was present. A diagnostic algorithm consisting of two molecular markers and one consensus histologic feature was created by conditional inference tree analysis. The order of prognostic significance was: 1p/19q loss, EGFR amplification, and astrocytic morphology, which resulted in the identification of four diagnostic nodes. Validation of the nodes in the validation set confirmed the prognostic value (P < .001). We succeeded in the creation of a timely diagnostic algorithm for anaplastic glioma based on multivariable analysis of consensus histopathology and molecular parameters. © 2015 by American Society of Clinical Oncology.
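A toy rendering of the reported marker ordering (1p/19q loss, then EGFR amplification, then astrocytic morphology) as a rule function with four output nodes is sketched below; the node labels are illustrative only, not the trial's actual prognostic groups.

```python
def diagnostic_node(loss_1p19q: bool, egfr_amplified: bool, astrocytic: bool) -> str:
    """Toy reconstruction of the reported marker ordering (1p/19q loss, then
    EGFR amplification, then astrocytic morphology) yielding four diagnostic
    nodes. Node labels are illustrative, not the study's groups."""
    if loss_1p19q:
        return "node 1: 1p/19q co-deleted"
    if egfr_amplified:
        return "node 2: EGFR-amplified, 1p/19q intact"
    if astrocytic:
        return "node 3: astrocytic morphology, no 1p/19q loss or EGFR amplification"
    return "node 4: none of the above markers"

print(diagnostic_node(loss_1p19q=False, egfr_amplified=True, astrocytic=True))
```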
Prosthetic joint infection development of an evidence-based diagnostic algorithm.
Mühlhofer, Heinrich M L; Pohlig, Florian; Kanz, Karl-Georg; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; Kelch, Sarah; Harrasser, Norbert; von Eisenhart-Rothe, Rüdiger; Schauwecker, Johannes
2017-03-09
Increasing rates of prosthetic joint infection (PJI) have presented challenges for general practitioners, orthopedic surgeons and the health care system in the recent years. The diagnosis of PJI is complex; multiple diagnostic tools are used in the attempt to correctly diagnose PJI. Evidence-based algorithms can help to identify PJI using standardized diagnostic steps. We reviewed relevant publications between 1990 and 2015 using a systematic literature search in MEDLINE and PUBMED. The selected search results were then classified into levels of evidence. The keywords were prosthetic joint infection, biofilm, diagnosis, sonication, antibiotic treatment, implant-associated infection, Staph. aureus, rifampicin, implant retention, pcr, maldi-tof, serology, synovial fluid, c-reactive protein level, total hip arthroplasty (THA), total knee arthroplasty (TKA) and combinations of these terms. From an initial 768 publications, 156 publications were stringently reviewed. Publications with class I-III recommendations (EAST) were considered. We developed an algorithm for the diagnostic approach to display the complex diagnosis of PJI in a clear and logically structured process according to ISO 5807. The evidence-based standardized algorithm combines modern clinical requirements and evidence-based treatment principles. The algorithm provides a detailed transparent standard operating procedure (SOP) for diagnosing PJI. Thus, consistently high, examiner-independent process quality is assured to meet the demands of modern quality management in PJI diagnosis.
Diagnostic work-up and loss of tuberculosis suspects in Jogjakarta, Indonesia.
Ahmad, Riris Andono; Matthys, Francine; Dwihardiani, Bintari; Rintiswati, Ning; de Vlas, Sake J; Mahendradhata, Yodi; van der Stuyft, Patrick
2012-02-15
Early and accurate diagnosis of pulmonary tuberculosis (TB) is critical for successful TB control. To assist in the diagnosis of smear-negative pulmonary TB, the World Health Organisation (WHO) recommends the use of a diagnostic algorithm. Our study evaluated the implementation of the national tuberculosis programme's diagnostic algorithm in routine health care settings in Jogjakarta, Indonesia. The diagnostic algorithm is based on the WHO TB diagnostic algorithm, which had already been implemented in the health facilities. We prospectively documented the diagnostic work-up of all new tuberculosis suspects until a diagnosis was reached. We used clinical audit forms to record each step chronologically. Data on the patient's gender, age, symptoms, examinations (types, dates, and results), and final diagnosis were collected. Information was recorded for 754 TB suspects; 43.5% of whom were lost during the diagnostic work-up in health centres, 0% in lung clinics. Among the TB suspects who completed diagnostic work-ups, 51.1% and 100.0% were diagnosed without following the national TB diagnostic algorithm in health centres and lung clinics, respectively. However, the work-up in the health centres and lung clinics generally conformed to international standards for tuberculosis care (ISTC). Diagnostic delays were significantly longer in health centres compared to lung clinics. The high rate of patients lost in health centres needs to be addressed through the implementation of TB suspect tracing and better programme supervision. The national TB algorithm needs to be revised and differentiated according to the level of care.
Cho-Vega, Jeong Hee
2016-07-01
Atypical spitzoid tumors are a morphologically diverse group of rare melanocytic lesions most frequently seen in children and young adults. As atypical spitzoid tumors bear striking resemblance to Spitz nevus and spitzoid melanomas clinically and histopathologically, it is crucial to determine their malignant potential and predict their clinical behavior. To date, many researchers have attempted to differentiate atypical spitzoid tumors from unequivocal melanomas based on morphological, immunohistochemical, and molecular diagnostic differences. A diagnostic algorithm is proposed here to assess the malignant potential of atypical spitzoid tumors by using a combination of immunohistochemical and cytogenetic/molecular tests. Together with classical morphological evaluation, this algorithm includes a set of immunohistochemistry assays (p16(Ink4a), a dual-color Ki67/MART-1, and HMB45), fluorescence in situ hybridization (FISH) with five probes (6p25, 8q24, 11q13, CEN9, and 9p21), and an array-based comparative genomic hybridization. This review discusses details of the algorithm, the rationale of each test used in the algorithm, and the utility of this algorithm in routine dermatopathology practice. This algorithmic approach will provide a comprehensive diagnostic tool that complements conventional histological criteria and will significantly contribute to improving the diagnosis and prediction of the clinical behavior of atypical spitzoid tumors.
Implementation of an Algorithm for Prosthetic Joint Infection: Deviations and Problems.
Mühlhofer, Heinrich M L; Kanz, Karl-Georg; Pohlig, Florian; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; von Eisenhart-Rothe, Ruediger; Schauwecker, Johannes
The outcome of revision surgery in arthroplasty is based on a precise diagnosis. In addition, the treatment varies based on whether the prosthetic failure is caused by aseptic or septic loosening. Algorithms can help to identify periprosthetic joint infections (PJI) and standardize diagnostic steps, however, algorithms tend to oversimplify the treatment of complex cases. We conducted a process analysis during the implementation of a PJI algorithm to determine problems and deviations associated with the implementation of this algorithm. Fifty patients who were treated after implementing a standardized algorithm were monitored retrospectively. Their treatment plans and diagnostic cascades were analyzed for deviations from the implemented algorithm. Each diagnostic procedure was recorded, compared with the algorithm, and evaluated statistically. We detected 52 deviations while treating 50 patients. In 25 cases, no discrepancy was observed. Synovial fluid aspiration was not performed in 31.8% of patients (95% confidence interval [CI], 18.1%-45.6%), while white blood cell counts (WBCs) and neutrophil differentiation were assessed in 54.5% of patients (95% CI, 39.8%-69.3%). We also observed that the prolonged incubation of cultures was not requested in 13.6% of patients (95% CI, 3.5%-23.8%). In seven of 13 cases (63.6%; 95% CI, 35.2%-92.1%), arthroscopic biopsy was performed; 6 arthroscopies were performed in discordance with the algorithm (12%; 95% CI, 3%-21%). Self-critical analysis of diagnostic processes and monitoring of deviations using algorithms are important and could increase the quality of treatment by revealing recurring faults.
NASA Astrophysics Data System (ADS)
Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.
2010-03-01
A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.
The PHQ-8 as a measure of current depression in the general population.
Kroenke, Kurt; Strine, Tara W; Spitzer, Robert L; Williams, Janet B W; Berry, Joyce T; Mokdad, Ali H
2009-04-01
The eight-item Patient Health Questionnaire depression scale (PHQ-8) is established as a valid diagnostic and severity measure for depressive disorders in large clinical studies. Our objectives were to assess the PHQ-8 as a depression measure in a large, epidemiological population-based study, and to determine the comparability of depression as defined by the PHQ-8 diagnostic algorithm vs. a PHQ-8 cutpoint ≥10. Random-digit-dialed telephone survey of 198,678 participants in the 2006 Behavioral Risk Factor Surveillance System (BRFSS), a population-based survey in the United States. Current depression as defined by either the DSM-IV-based diagnostic algorithm (i.e., major depressive or other depressive disorder) of the PHQ-8 or a PHQ-8 score ≥10; respondent sociodemographic characteristics; number of days of impairment in the past 30 days in multiple domains of health-related quality of life (HRQoL). The prevalence of current depression was similar whether defined by the diagnostic algorithm or a PHQ-8 score ≥10 (9.1% vs. 8.6%). Depressed patients had substantially more days of impairment across multiple domains of HRQoL, and the impairment was nearly identical in depressed groups defined by either method. Of the 17,040 respondents with a PHQ-8 score ≥10, major depressive disorder was present in 49.7%, other depressive disorder in 23.9%, depressed mood or anhedonia in another 22.8%, and no evidence of depressive disorder or depressive symptoms in only 3.5%. The PHQ-8 diagnostic algorithm rather than an independent structured psychiatric interview was used as the criterion standard. The PHQ-8 is a useful depression measure for population-based studies, and either its diagnostic algorithm or a cutpoint ≥10 can be used for defining current depression.
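A simplified rendering of the two definitions being compared, the total-score cutpoint versus the DSM-IV-based item-count algorithm requiring a cardinal symptom, is sketched below; the item-count rule is paraphrased from the PHQ family of instruments and should not be taken as the study's exact scoring code.

```python
# Simplified rendering of the two PHQ-8 definitions of current depression:
# a total-score cutpoint (>= 10) versus a DSM-IV-style item-count algorithm
# requiring a cardinal symptom. Items are scored 0-3; the algorithm rule below
# is a paraphrase for illustration, not the study's exact scoring code.
def phq8_cutpoint(items):                 # items: list of 8 integers, each 0-3
    return sum(items) >= 10

def phq8_algorithm(items):
    cardinal = items[0] >= 2 or items[1] >= 2        # depressed mood or anhedonia
    n_symptoms = sum(score >= 2 for score in items)  # "more than half the days"
    if cardinal and n_symptoms >= 5:
        return "major depressive disorder"
    if cardinal and 2 <= n_symptoms <= 4:
        return "other depressive disorder"
    return "no depressive disorder"

example = [2, 1, 3, 2, 0, 1, 2, 1]        # hypothetical respondent
print(phq8_cutpoint(example), "|", phq8_algorithm(example))
```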
Dey, Nilanjan; Bose, Soumyo; Das, Achintya; Chaudhuri, Sheli Sinha; Saba, Luca; Shafique, Shoaib; Nicolaides, Andrew; Suri, Jasjit S
2016-04-01
Embedding of diagnostic and health care information requires secure encryption and watermarking. This research paper presents a comprehensive study of the behavior of some well-established watermarking algorithms in the frequency domain for the preservation of stroke-based diagnostic parameters. Two different sets of watermarking algorithms, namely two correlation-based (binary logo hiding) and two singular value decomposition (SVD)-based (gray logo hiding) watermarking algorithms, are used for embedding an ownership logo. The diagnostic parameters in atherosclerotic plaque ultrasound video are: (a) bulb identification and recognition, which consists of identifying the bulb edge points in the far and near carotid walls; (b) carotid bulb diameter; and (c) carotid lumen thickness all along the carotid artery. The tested data set consists of carotid atherosclerotic movies taken under IRB protocol from a University of Indiana Hospital, USA-AtheroPoint™ (Roseville, CA, USA) joint pilot study. ROC (receiver operating characteristic) analysis performed on the bulb detection process showed an accuracy and sensitivity of 100% each. The diagnostic preservation (DPsystem) for the SVD-based approach was above 99% with a PSNR (peak signal-to-noise ratio) above 41, indicating that watermarking caused negligible degradation of the diagnostic parameters. Thus, the fully automated proposed system proved to be an efficient method for watermarking atherosclerotic ultrasound video for stroke application.
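The kind of scheme evaluated can be illustrated with a minimal SVD-domain embedding of a watermark into a single frame plus a PSNR check; this is a generic sketch, not the paper's specific correlation- or SVD-based algorithms, and the frame and logo are random stand-ins.

```python
# Minimal sketch of SVD-based watermark embedding in one frame plus PSNR,
# illustrating the class of scheme evaluated (not the paper's exact algorithm).
import numpy as np

def embed_svd_watermark(frame, logo, alpha=0.02):
    """Add a scaled watermark vector to the singular values of the host frame."""
    U, S, Vt = np.linalg.svd(frame, full_matrices=False)
    S_marked = S + alpha * logo
    return U @ np.diag(S_marked) @ Vt

def psnr(original, marked, peak=255.0):
    mse = np.mean((original - marked) ** 2)
    return np.inf if mse == 0 else 10 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(1)
frame = rng.uniform(0, 255, size=(256, 256))     # stand-in ultrasound frame
logo = rng.uniform(0, 255, size=256)             # gray logo resampled to the SV length
marked = embed_svd_watermark(frame, logo)
print("PSNR (dB):", round(psnr(frame, marked), 1))
```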
Cassani, Raymundo; Falk, Tiago H.; Fraga, Francisco J.; Kanda, Paulo A. M.; Anghinah, Renato
2014-01-01
Over the last decade, electroencephalography (EEG) has emerged as a reliable tool for the diagnosis of cortical disorders such as Alzheimer's disease (AD). EEG signals, however, are susceptible to several artifacts, such as ocular, muscular, movement, and environmental. To overcome this limitation, existing diagnostic systems commonly depend on experienced clinicians to manually select artifact-free epochs from the collected multi-channel EEG data. Manual selection, however, is a tedious and time-consuming process, rendering the diagnostic system “semi-automated.” Notwithstanding, a number of EEG artifact removal algorithms have been proposed in the literature. The (dis)advantages of using such algorithms in automated AD diagnostic systems, however, have not been documented; this paper aims to fill this gap. Here, we investigate the effects of three state-of-the-art automated artifact removal (AAR) algorithms (both alone and in combination with each other) on AD diagnostic systems based on four different classes of EEG features, namely, spectral, amplitude modulation rate of change, coherence, and phase. The three AAR algorithms tested are statistical artifact rejection (SAR), blind source separation based on second order blind identification and canonical correlation analysis (BSS-SOBI-CCA), and wavelet enhanced independent component analysis (wICA). Experimental results based on 20-channel resting-awake EEG data collected from 59 participants (20 patients with mild AD, 15 with moderate-to-severe AD, and 24 age-matched healthy controls) showed the wICA algorithm alone outperforming other enhancement algorithm combinations across three tasks: diagnosis (control vs. mild vs. moderate), early detection (control vs. mild), and disease progression (mild vs. moderate), thus opening the doors for fully-automated systems that can assist clinicians with early detection of AD, as well as disease severity progression assessment. PMID:24723886
Intelligent Diagnostic Assistant for Complicated Skin Diseases through C5's Algorithm.
Jeddi, Fatemeh Rangraz; Arabfard, Masoud; Kermany, Zahra Arab
2017-09-01
An intelligent diagnostic assistant can be used for the complicated diagnosis of skin diseases, which are among the most common causes of disability. The aim of this study was to design and implement a computerized intelligent diagnostic assistant for complicated skin diseases through C5's algorithm. An applied-developmental study was done in 2015. The knowledge base was developed based on interviews with dermatologists through questionnaires and checklists. Knowledge representation was obtained from the training data in the database using Microsoft Office Excel. Clementine software and the C5 algorithm were applied to draw the decision tree. Analysis of test accuracy was performed based on rules extracted using inference chains. The rules extracted from the decision tree were entered into the CLIPS programming environment, and the intelligent diagnostic assistant was then designed. The rules were defined using a forward-chaining inference technique and were entered into the CLIPS programming environment as rules. The accuracy and error rates obtained in the training phase from the decision tree were 99.56% and 0.44%, respectively. The accuracy of the decision tree was 98% and the error was 2% in the test phase. The intelligent diagnostic assistant can be used as a reliable system with high accuracy, sensitivity, specificity, and agreement.
Yakhelef, N; Audibert, M; Varaine, F; Chakaya, J; Sitienei, J; Huerga, H; Bonnet, M
2014-05-01
In 2007, the World Health Organization recommended introducing rapid Mycobacterium tuberculosis culture into the diagnostic algorithm of smear-negative pulmonary tuberculosis (TB). To assess the cost-effectiveness of introducing a rapid non-commercial culture method (thin-layer agar), together with Löwenstein-Jensen culture, to diagnose smear-negative TB at a district hospital in Kenya. Outcomes (number of true TB cases treated) were obtained from a prospective study evaluating the effectiveness of a clinical and radiological algorithm (conventional) against the alternative algorithm (conventional plus M. tuberculosis culture) in 380 smear-negative TB suspects. The costs of implementing each algorithm were calculated using a 'micro-costing' or 'ingredient-based' method. We then compared the cost and effectiveness of conventional vs. culture-based algorithms and estimated the incremental cost-effectiveness ratio. The costs of conventional and culture-based algorithms per smear-negative TB suspect were respectively €39.5 and €144. The costs per confirmed and treated TB case were respectively €452 and €913. The culture-based algorithm led to diagnosis and treatment of 27 more cases for an additional cost of €1477 per case. Despite the increase in patients started on treatment thanks to culture, the relatively high cost of a culture-based algorithm will make it difficult for resource-limited countries to afford it.
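The incremental cost-effectiveness arithmetic implied by the abstract is simply incremental cost divided by additional cases treated; a short sketch using the figures quoted above (small differences from the reported €1477 reflect rounding in the source):

```python
# Incremental cost-effectiveness ratio (ICER): extra cost of the culture-based
# algorithm per additional true TB case treated, using figures quoted in the
# abstract. Minor divergence from the reported EUR 1477 is due to rounding.
cost_conventional_per_suspect = 39.5    # EUR
cost_culture_per_suspect = 144.0        # EUR
n_suspects = 380
additional_cases_treated = 27

incremental_cost = (cost_culture_per_suspect - cost_conventional_per_suspect) * n_suspects
icer = incremental_cost / additional_cases_treated
print(f"ICER: EUR {icer:.0f} per additional TB case treated")
```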
FPGA based charge acquisition algorithm for soft x-ray diagnostics system
NASA Astrophysics Data System (ADS)
Wojenski, A.; Kasprowicz, G.; Pozniak, K. T.; Zabolotny, W.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Krawczyk, R. D.; Zienkiewicz, P.; Chernyshova, M.; Czarski, T.
2015-09-01
Soft X-ray (SXR) measurement systems working in tokamaks or with laser-generated plasma can expect high photon fluxes. It is therefore necessary to focus on data processing algorithms to achieve the best possible efficiency in terms of processed photon events per second. This paper describes a recently designed algorithm and data flow for the implementation of charge data acquisition in an FPGA. The algorithms are currently at the implementation stage for the soft X-ray diagnostics system. In addition to the charge processing algorithm, the paper also gives a general firmware overview and describes data storage methods and other key components of the measurement system. The simulation section presents algorithm performance and the expected maximum photon rate.
Dionne, Audrey; Meloche-Dumas, Léamarie; Desjardins, Laurent; Turgeon, Jean; Saint-Cyr, Claire; Autmizguine, Julie; Spigelblatt, Linda; Fournier, Anne; Dahdah, Nagib
2017-03-01
Diagnosis of Kawasaki disease (KD) can be challenging in the absence of a confirmatory test or pathognomonic finding, especially when clinical criteria are incomplete. We recently proposed serum N-terminal pro-B-type natriuretic peptide (NT-proBNP) as an adjunctive diagnostic test. We retrospectively tested a new algorithm to help KD diagnosis based on NT-proBNP, coronary artery dilation (CAD) at onset, and abnormal serum albumin or C-reactive protein (CRP). The goal was to assess the performance of the algorithm and compare its performance with that of the 2004 American Heart Association (AHA)/American Academy of Pediatrics (AAP) algorithm. The algorithm was tested on 124 KD patients with NT-proBNP measured on admission at the present institutions between 2007 and 2013. Age at diagnosis was 3.4 ± 3.0 years, with a median of five diagnostic criteria; and 55 of the 124 patients (44%) had incomplete KD. CA complications occurred in 64 (52%), with aneurysm in 14 (11%). Using this algorithm, 120/124 (97%) were to be treated, based on high NT-proBNP alone for 79 (64%); on onset CAD for 14 (11%); and on high CRP or low albumin for 27 (22%). Using the AHA/AAP algorithm, 22/47 (47%) of the eligible patients with incomplete KD would not have been referred for treatment, compared with 3/55 (5%) with the NT-proBNP algorithm (P < 0.001). This NT-proBNP-based algorithm is efficient to identify and treat patients with KD, including those with incomplete KD. This study paves the way for a prospective validation trial of the algorithm. © 2016 Japan Pediatric Society.
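The triage logic described, treat when NT-proBNP is high, when coronary artery dilation is present at onset, or when CRP/albumin are abnormal, can be written as a toy rule function; all cutoff values below are hypothetical placeholders, not the study's validated thresholds.

```python
def kd_treatment_flag(nt_probnp, cad_at_onset, crp, albumin,
                      nt_probnp_cutoff=200.0,   # pg/mL, hypothetical cutoff
                      crp_cutoff=100.0,         # mg/L, hypothetical cutoff
                      albumin_cutoff=30.0):     # g/L, hypothetical cutoff
    """Toy rendering of the triage logic described in the abstract: treat when
    NT-proBNP is high, coronary artery dilation is present at onset, or CRP is
    high / albumin is low. All cutoffs are illustrative placeholders."""
    if nt_probnp >= nt_probnp_cutoff:
        return "treat (high NT-proBNP)"
    if cad_at_onset:
        return "treat (coronary artery dilation at onset)"
    if crp >= crp_cutoff or albumin <= albumin_cutoff:
        return "treat (abnormal CRP or albumin)"
    return "observe / re-evaluate"

print(kd_treatment_flag(nt_probnp=350, cad_at_onset=False, crp=40, albumin=38))
```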
Unlu, Ezgi; Akay, Bengu N; Erdem, Cengizhan
2014-07-01
Dermatoscopic analysis of melanocytic lesions using the CASH algorithm has rarely been described in the literature. The purpose of this study was to compare the sensitivity, specificity, and diagnostic accuracy rates of the ABCD rule of dermatoscopy, the seven-point checklist, the three-point checklist, and the CASH algorithm in the diagnosis and dermatoscopic evaluation of melanocytic lesions on the hairy skin. One hundred and fifteen melanocytic lesions of 115 patients were examined retrospectively using dermatoscopic images and compared with the histopathologic diagnosis. Four dermatoscopic algorithms were applied to all lesions. The ABCD rule of dermatoscopy showed sensitivity of 91.6%, specificity of 60.4%, and diagnostic accuracy of 66.9%. The seven-point checklist showed sensitivity, specificity, and diagnostic accuracy of 87.5, 65.9, and 70.4%, respectively; the three-point checklist, 79.1, 62.6, and 66%; and the CASH algorithm, 91.6, 64.8, and 70.4%, respectively. To our knowledge, this is the first study that compares the sensitivity, specificity and diagnostic accuracy of the ABCD rule of dermatoscopy, the three-point checklist, the seven-point checklist, and the CASH algorithm for the diagnosis of melanocytic lesions on the hairy skin. In our study, the ABCD rule of dermatoscopy and the CASH algorithm showed the highest sensitivity for the diagnosis of melanoma. © 2014 Japanese Dermatological Association.
NASA Technical Reports Server (NTRS)
Ricks, Brian W.; Mengshoel, Ole J.
2009-01-01
Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.
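The probabilistic idea underlying such a diagnoser can be illustrated with a two-variable example computed by exact enumeration; this is not the ADAPT Bayesian network and does not show arithmetic-circuit compilation, and all numbers are illustrative.

```python
# Tiny illustration of the probabilistic idea behind ProDiagnose: the posterior
# probability of a component fault given a sensor observation, computed by
# exact enumeration over a hand-built two-variable model. Numbers illustrative.
P_FAULT = 0.01                                   # prior probability of fault
P_LOW_READING = {True: 0.95, False: 0.02}        # P(sensor reads low | fault state)

def posterior_fault_given_low_reading():
    joint_fault = P_FAULT * P_LOW_READING[True]
    joint_ok = (1 - P_FAULT) * P_LOW_READING[False]
    return joint_fault / (joint_fault + joint_ok)

print(round(posterior_fault_given_low_reading(), 3))   # ~0.324
```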
Conwell, Darwin L; Lee, Linda S; Yadav, Dhiraj; Longnecker, Daniel S; Miller, Frank H; Mortele, Koenraad J; Levy, Michael J; Kwon, Richard; Lieb, John G; Stevens, Tyler; Toskes, Phillip P; Gardner, Timothy B; Gelrud, Andres; Wu, Bechien U; Forsmark, Christopher E; Vege, Santhi S
2014-11-01
The diagnosis of chronic pancreatitis remains challenging in early stages of the disease. This report defines the diagnostic criteria useful in the assessment of patients with suspected and established chronic pancreatitis. All current diagnostic procedures are reviewed, and evidence-based statements are provided about their utility and limitations. Diagnostic criteria for chronic pancreatitis are classified as definitive, probable, or insufficient evidence. A diagnostic (STEP-wise; survey, tomography, endoscopy, and pancreas function testing) algorithm is proposed that proceeds from a noninvasive to a more invasive approach. This algorithm maximizes specificity (low false-positive rate) in subjects with chronic abdominal pain and equivocal imaging changes. Furthermore, a nomenclature is suggested to further characterize patients with established chronic pancreatitis based on TIGAR-O (toxic, idiopathic, genetic, autoimmune, recurrent, and obstructive) etiology, gland morphology (Cambridge criteria), and physiologic state (exocrine, endocrine function) for uniformity across future multicenter research collaborations. This guideline will serve as a baseline manuscript that will be modified as new evidence becomes available and our knowledge of chronic pancreatitis improves.
Portable Health Algorithms Test System
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Wong, Edmond; Fulton, Christopher E.; Sowers, Thomas S.; Maul, William A.
2010-01-01
A document discusses the Portable Health Algorithms Test (PHALT) System, which has been designed as a means for evolving the maturity and credibility of algorithms developed to assess the health of aerospace systems. Comprising an integrated hardware-software environment, the PHALT system allows systems health management algorithms to be developed in a graphical programming environment, to be tested and refined using system simulation or test data playback, and to be evaluated in a real-time hardware-in-the-loop mode with a live test article. The integrated hardware and software development environment provides a seamless transition from algorithm development to real-time implementation. The portability of the hardware makes it quick and easy to transport between test facilities. This hardware/software architecture is flexible enough to support a variety of diagnostic applications and test hardware, and the GUI-based rapid prototyping capability is sufficient to support development, execution, and testing of custom diagnostic algorithms. The PHALT operating system supports execution of diagnostic algorithms under real-time constraints. PHALT can perform real-time capture and playback of test rig data with the ability to augment/modify the data stream (e.g., inject simulated faults). It performs algorithm testing using a variety of data input sources, including real-time data acquisition, test data playback, and system simulations, and also provides system feedback to evaluate closed-loop diagnostic response and mitigation control.
Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam
2016-01-01
The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully-susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules-based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0 and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified. Novel variants of known resistance factors and incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family, rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because those genes were not seen or seen infrequently in the test set. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.
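The two genotype-to-phenotype strategies being compared can be sketched on a toy gene presence/absence matrix: a curated rules lookup versus a classifier trained on the same features. The gene names, the rule table, and the data below are illustrative, not the study's resistome databases.

```python
# Minimal sketch of the two genotype-to-phenotype approaches compared in the
# study, applied to a toy gene presence/absence matrix: (1) a curated
# rules-based lookup, (2) a machine-learning classifier. All names and data
# are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

GENES = ["blaCTX-M", "blaKPC", "aac(6')-Ib", "qnrS"]
RULES = {"ceftriaxone": {"blaCTX-M", "blaKPC"}}       # toy resistance rule for one drug

def rules_predict(genes_present, drug):
    return "R" if genes_present & RULES[drug] else "S"

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(78, len(GENES)))         # 78 toy isolates
y = (X[:, 0] | X[:, 1]).astype(int)                   # "phenotype" driven by the bla genes
ml = LogisticRegression(max_iter=1000).fit(X[:60], y[:60])   # train on 60, test on 18

rule_calls = [rules_predict({g for g, present in zip(GENES, x) if present}, "ceftriaxone")
              for x in X[60:]]
ml_calls = ["R" if p else "S" for p in ml.predict(X[60:])]
truth = ["R" if t else "S" for t in y[60:]]
agree = lambda calls: np.mean([c == t for c, t in zip(calls, truth)])
print("rules agreement:", agree(rule_calls), "| ML agreement:", agree(ml_calls))
```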
Anatomy-Based Algorithms for Detecting Oral Cancer Using Reflectance and Fluorescence Spectroscopy
McGee, Sasha; Mardirossian, Vartan; Elackattu, Alphi; Mirkovic, Jelena; Pistey, Robert; Gallagher, George; Kabani, Sadru; Yu, Chung-Chieh; Wang, Zimmern; Badizadegan, Kamran; Grillone, Gregory; Feld, Michael S.
2010-01-01
Objectives: We used reflectance and fluorescence spectroscopy to noninvasively and quantitatively distinguish benign from dysplastic/malignant oral lesions. We designed diagnostic algorithms to account for differences in the spectral properties among anatomic sites (gingiva, buccal mucosa, etc). Methods: In vivo reflectance and fluorescence spectra were collected from 71 patients with oral lesions. The tissue was then biopsied and the specimen evaluated by histopathology. Quantitative parameters related to tissue morphology and biochemistry were extracted from the spectra. Diagnostic algorithms specific for combinations of sites with similar spectral properties were developed. Results: Discrimination of benign from dysplastic/malignant lesions was most successful when algorithms were designed for individual sites (area under the receiver operating characteristic curve [ROC-AUC], 0.75 for the lateral surface of the tongue) and was least accurate when all sites were combined (ROC-AUC, 0.60). The combination of sites with similar spectral properties (floor of mouth and lateral surface of the tongue) yielded an ROC-AUC of 0.71. Conclusions: Accurate spectroscopic detection of oral disease must account for spectral variations among anatomic sites. Anatomy-based algorithms for single sites or combinations of sites demonstrated good diagnostic performance in distinguishing benign lesions from dysplastic/malignant lesions and consistently performed better than algorithms developed for all sites combined. PMID:19999369
Duraipandian, Shiyamala; Sylvest Bergholt, Mads; Zheng, Wei; Yu Ho, Khek; Teh, Ming; Guan Yeoh, Khay; Bok Yan So, Jimmy; Shabbir, Asim; Huang, Zhiwei
2012-08-01
Optical spectroscopic techniques including reflectance, fluorescence and Raman spectroscopy have shown promising potential for in vivo precancer and cancer diagnostics in a variety of organs. However, data-analysis has mostly been limited to post-processing and off-line algorithm development. In this work, we develop a fully automated on-line Raman spectral diagnostics framework integrated with a multimodal image-guided Raman technique for real-time in vivo cancer detection at endoscopy. A total of 2748 in vivo gastric tissue spectra (2465 normal and 283 cancer) were acquired from 305 patients recruited to construct a spectral database for diagnostic algorithms development. The novel diagnostic scheme developed implements on-line preprocessing, outlier detection based on principal component analysis statistics (i.e., Hotelling's T2 and Q-residuals) for tissue Raman spectra verification as well as for organ specific probabilistic diagnostics using different diagnostic algorithms. Free-running optical diagnosis and processing time of < 0.5 s can be achieved, which is critical to realizing real-time in vivo tissue diagnostics during clinical endoscopic examination. The optimized partial least squares-discriminant analysis (PLS-DA) models based on the randomly resampled training database (80% for learning and 20% for testing) provide the diagnostic accuracy of 85.6% [95% confidence interval (CI): 82.9% to 88.2%] [sensitivity of 80.5% (95% CI: 71.4% to 89.6%) and specificity of 86.2% (95% CI: 83.6% to 88.7%)] for the detection of gastric cancer. The PLS-DA algorithms are further applied prospectively on 10 gastric patients at gastroscopy, achieving the predictive accuracy of 80.0% (60/75) [sensitivity of 90.0% (27/30) and specificity of 73.3% (33/45)] for in vivo diagnosis of gastric cancer. The receiver operating characteristics curves further confirmed the efficacy of Raman endoscopy together with PLS-DA algorithms for in vivo prospective diagnosis of gastric cancer. This work successfully moves biomedical Raman spectroscopic technique into real-time, on-line clinical cancer diagnosis, especially in routine endoscopic diagnostic applications.
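The two computational steps described, a PCA-based outlier gate using Hotelling's T-squared and PLS-DA classification with a random 80/20 split, can be sketched on synthetic stand-in spectra as follows; the class labels and spectral band are invented.

```python
# Sketch of the two steps described: (1) a PCA-based outlier gate using
# Hotelling's T-squared in the training score space, (2) PLS-DA classification
# with a random 80/20 split. Data are synthetic stand-ins for Raman spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, n_wavenumbers = 600, 300
X = rng.normal(size=(n, n_wavenumbers))
y = rng.integers(0, 2, size=n)                      # 0 = normal, 1 = cancer (toy labels)
X[y == 1, :50] += 0.6                               # class difference in one spectral band

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Outlier gate: Hotelling's T2 of new spectra relative to the training PCA scores.
pca = PCA(n_components=5).fit(X_tr)
scores_tr = pca.transform(X_tr)
t2 = ((pca.transform(X_te) / scores_tr.std(axis=0)) ** 2).sum(axis=1)
t2_limit = np.quantile(((scores_tr / scores_tr.std(axis=0)) ** 2).sum(axis=1), 0.99)
keep = t2 < t2_limit

# PLS-DA: regress class labels on spectra, threshold the predicted score at 0.5.
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_hat = (pls.predict(X_te[keep]).ravel() > 0.5).astype(int)
print("kept spectra:", int(keep.sum()), "/", len(X_te),
      "| accuracy:", round(float((y_hat == y_te[keep]).mean()), 3))
```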
A Framework to Debug Diagnostic Matrices
NASA Technical Reports Server (NTRS)
Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann
2013-01-01
Diagnostics is an important concept in system health and monitoring of space operations. Many of the existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also popularly known as a diagnostic dictionary, fault signature matrix, or reachability matrix) gleaned from physical models. Sometimes, however, this matrix is not accurate enough to obtain high diagnostic performance. In such a case, it is important to modify this D-matrix based on knowledge obtained from other sources, such as time-series data streams (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure covers actions ranging from modifying 0s and 1s in the matrix to adding or removing rows (failure sources) and columns (tests). We evaluate this framework on datasets from the DX Challenge 2009.
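The least expensive class of D-matrix fix mentioned above, flipping individual 0/1 entries that disagree with observed test behavior, can be illustrated with a toy sketch; the matrix, observed firing rates, and threshold are invented.

```python
# Toy sketch of the cheapest D-matrix fix: compare observed test behavior under
# known injected faults against the D-matrix prediction and flip entries that
# disagree consistently (before resorting to adding/removing rows or columns).
import numpy as np

# Rows = failure sources, columns = tests; 1 means "test fires for this fault".
d_matrix = np.array([[1, 0, 1],
                     [0, 1, 0],
                     [1, 1, 0]])

# Observed firing rate of each test during repeated injections of each fault.
observed_rate = np.array([[0.95, 0.05, 0.10],    # fault 0: test 2 rarely fires
                          [0.02, 0.90, 0.85],    # fault 1: test 2 actually fires
                          [0.88, 0.92, 0.04]])

updated = (observed_rate > 0.5).astype(int)      # majority-vote update of each entry
flips = [(int(r), int(c)) for r, c in np.argwhere(updated != d_matrix)]
print("entries to flip (fault, test):", flips)
print(updated)
```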
NASA Astrophysics Data System (ADS)
Polverino, Pierpaolo; Esposito, Angelo; Pianese, Cesare; Ludwig, Bastian; Iwanschitz, Boris; Mai, Andreas
2016-02-01
In the current energetic scenario, Solid Oxide Fuel Cells (SOFCs) exhibit appealing features which make them suitable for environmental-friendly power production, especially for stationary applications. An example is represented by micro-combined heat and power (μ-CHP) generation units based on SOFC stacks, which are able to produce electric and thermal power with high efficiency and low pollutant and greenhouse gases emissions. However, the main limitations to their diffusion into the mass market consist in high maintenance and production costs and short lifetime. To improve these aspects, the current research activity focuses on the development of robust and generalizable diagnostic techniques, aimed at detecting and isolating faults within the entire system (i.e. SOFC stack and balance of plant). Coupled with appropriate recovery strategies, diagnosis can prevent undesired system shutdowns during faulty conditions, with consequent lifetime increase and maintenance costs reduction. This paper deals with the on-line experimental validation of a model-based diagnostic algorithm applied to a pre-commercial SOFC system. The proposed algorithm exploits a Fault Signature Matrix based on a Fault Tree Analysis and improved through fault simulations. The algorithm is characterized on the considered system and it is validated by means of experimental induction of faulty states in controlled conditions.
Code-based Diagnostic Algorithms for Idiopathic Pulmonary Fibrosis. Case Validation and Improvement.
Ley, Brett; Urbania, Thomas; Husson, Gail; Vittinghoff, Eric; Brush, David R; Eisner, Mark D; Iribarren, Carlos; Collard, Harold R
2017-06-01
Population-based studies of idiopathic pulmonary fibrosis (IPF) in the United States have been limited by reliance on diagnostic code-based algorithms that lack clinical validation. To validate a well-accepted International Classification of Diseases, Ninth Revision, code-based algorithm for IPF using patient-level information and to develop a modified algorithm for IPF with enhanced predictive value. The traditional IPF algorithm was used to identify potential cases of IPF in the Kaiser Permanente Northern California adult population from 2000 to 2014. Incidence and prevalence were determined overall and by age, sex, and race/ethnicity. A validation subset of cases (n = 150) underwent expert medical record and chest computed tomography review. A modified IPF algorithm was then derived and validated to optimize positive predictive value. From 2000 to 2014, the traditional IPF algorithm identified 2,608 cases among 5,389,627 at-risk adults in the Kaiser Permanente Northern California population. Annual incidence was 6.8/100,000 person-years (95% confidence interval [CI], 6.1-7.7) and was higher in patients with older age, male sex, and white race. The positive predictive value of the IPF algorithm was only 42.2% (95% CI, 30.6 to 54.6%); sensitivity was 55.6% (95% CI, 21.2 to 86.3%). The corrected incidence was estimated at 5.6/100,000 person-years (95% CI, 2.6-10.3). A modified IPF algorithm had improved positive predictive value but reduced sensitivity compared with the traditional algorithm. A well-accepted International Classification of Diseases, Ninth Revision, code-based IPF algorithm performs poorly, falsely classifying many non-IPF cases as IPF and missing a substantial proportion of IPF cases. A modification of the IPF algorithm may be useful for future population-based studies of IPF.
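The validation arithmetic behind the reported metrics is a 2x2 comparison of the code-based flag against chart/CT review; a short sketch with hypothetical counts (not the study's raw numbers):

```python
# Positive predictive value and sensitivity of a code-based IPF flag against
# chart/CT review in a validation subset. The 2x2 counts are hypothetical.
def ppv_and_sensitivity(tp, fp, fn):
    ppv = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    return ppv, sensitivity

tp, fp, fn = 63, 87, 50          # algorithm+/IPF+, algorithm+/IPF-, algorithm-/IPF+
ppv, sens = ppv_and_sensitivity(tp, fp, fn)
print(f"PPV = {ppv:.1%}, sensitivity = {sens:.1%}")
```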
Colonoscopy video quality assessment using hidden Markov random fields
NASA Astrophysics Data System (ADS)
Park, Sun Young; Sargent, Dusty; Spofford, Inbar; Vosburgh, Kirby
2011-03-01
With colonoscopy becoming a common procedure for individuals aged 50 or more who are at risk of developing colorectal cancer (CRC), colon video data is being accumulated at an ever-increasing rate. However, the clinically valuable information contained in these videos is not being maximally exploited to improve patient care and accelerate the development of new screening methods. One of the well-known difficulties in colonoscopy video analysis is the abundance of frames with no diagnostic information. Approximately 40% - 50% of the frames in a colonoscopy video are contaminated by noise, acquisition errors, glare, blur, and uneven illumination. Therefore, filtering out low-quality frames containing no diagnostic information can significantly improve the efficiency of colonoscopy video analysis. To address this challenge, we present a quality assessment algorithm to detect and remove low-quality, uninformative frames. The goal of our algorithm is to discard low-quality frames while retaining all diagnostically relevant information. Our algorithm is based on a hidden Markov model (HMM) in combination with two measures of data quality to filter out uninformative frames. Furthermore, we present a two-level framework based on an embedded hidden Markov model (EHMM) to incorporate the proposed quality assessment algorithm into a complete, automated diagnostic image analysis system for colonoscopy video.
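As an illustration of the frame-filtering idea described above, the following Python sketch decodes a two-state hidden Markov model over per-frame quality scores with the Viterbi algorithm and keeps frames assigned to the "informative" state. It is not the authors' implementation: the Gaussian emission parameters, transition matrix, and the single combined quality score are illustrative assumptions (the paper combines two quality measures and embeds the HMM in a larger EHMM framework).

```python
# Minimal sketch: two-state HMM over per-frame quality scores, decoded with
# Viterbi to flag uninformative frames. State 0 = "uninformative", state 1 =
# "informative"; all parameters are illustrative, not values from the paper.
import numpy as np

def viterbi_quality_filter(scores, mu=(0.2, 0.8), sigma=(0.15, 0.15),
                           trans=((0.9, 0.1), (0.1, 0.9)), prior=(0.5, 0.5)):
    """Return a boolean mask of frames decoded as informative (state 1)."""
    scores = np.asarray(scores, dtype=float)
    mu, sigma = np.asarray(mu), np.asarray(sigma)
    log_trans = np.log(np.asarray(trans))
    # Gaussian log-likelihood of each frame's quality score under each state
    log_emis = (-0.5 * ((scores[:, None] - mu) / sigma) ** 2
                - np.log(sigma * np.sqrt(2 * np.pi)))
    T, S = log_emis.shape
    delta = np.log(np.asarray(prior)) + log_emis[0]
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        cand = delta[:, None] + log_trans        # cand[i, j]: come from state i to j
        back[t] = np.argmax(cand, axis=0)
        delta = cand[back[t], np.arange(S)] + log_emis[t]
    # Backtrack the most likely state sequence
    states = np.zeros(T, dtype=int)
    states[-1] = int(np.argmax(delta))
    for t in range(T - 2, -1, -1):
        states[t] = back[t + 1, states[t + 1]]
    return states == 1

# Example: keep frames whose decoded state is "informative"
frame_quality = np.array([0.1, 0.15, 0.7, 0.8, 0.75, 0.2, 0.85])
mask = viterbi_quality_filter(frame_quality)
```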
NASA Astrophysics Data System (ADS)
Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey
2017-02-01
Background: Raman spectroscopy is a non-invasive optical technique which can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) has demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most of the previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, either in the fingerprint or high wavenumber regions. Our objective in this presentation is to explore wavenumber selection-based analysis in Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably-sized wavenumber windows, which were determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions was included. Lesion measurements were divided into a training cohort (n = 518) and a testing cohort (n = 127) according to the measurement time. Result: The area under the receiver operating characteristic (ROC) curve improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for sensitivity levels of 0.99-0.90 increased respectively from 0.17-0.65 to 0.20-0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber selection-based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.
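A minimal Python sketch of the wavenumber-window selection step described above, assuming fixed-width windows and synthetic spectra; the paper derives variably sized windows from inter-wavenumber correlations and also evaluates PC-GDA and PLS, which are omitted here. LassoCV and LinearDiscriminantAnalysis stand in for the LASSO selection and the discriminant classifier.

```python
# Hedged sketch of wavenumber-window selection with LASSO (not the authors' code).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 500))                 # 100 spectra x 500 wavenumber channels (synthetic)
y = (X[:, 40:50].mean(axis=1) > 0).astype(int)  # synthetic label tied to one spectral region

width = 10                                      # assumed fixed window width in channels
windows = X.reshape(X.shape[0], -1, width).mean(axis=2)   # mean intensity per window

lasso = LassoCV(cv=5).fit(windows, y)           # sparse selection of informative windows
selected = np.flatnonzero(lasso.coef_)          # indices of retained wavenumber windows

lda = LinearDiscriminantAnalysis().fit(windows[:, selected], y)
scores = lda.decision_function(windows[:, selected])      # inputs to an ROC analysis
print(selected, scores[:5])
```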
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2002-01-01
As part of the NASA Aviation Safety Program, a unique model-based diagnostics method that employs neural networks and genetic algorithms for aircraft engine performance diagnostics has been developed and demonstrated at the NASA Glenn Research Center against a nonlinear gas turbine engine model. Neural networks are applied to estimate the internal health condition of the engine, and genetic algorithms are used for sensor fault detection, isolation, and quantification. This hybrid architecture combines the excellent nonlinear estimation capabilities of neural networks with the capability to rank the likelihood of various faults given a specific sensor suite signature. The method requires a significantly smaller data training set than a neural network approach alone does, and it performs the combined engine health monitoring objectives of performance diagnostics and sensor fault detection and isolation in the presence of nominal and degraded engine health conditions.
Kang, Le; Carter, Randy; Darcy, Kathleen; Kauderer, James; Liao, Shu-Yuan
2013-01-01
In this article, we use a latent class model (LCM) with prevalence modeled as a function of covariates to assess diagnostic test accuracy in situations where the true disease status is not observed, but observations on three or more conditionally independent diagnostic tests are available. A fast Monte Carlo EM (MCEM) algorithm with binary (disease) diagnostic data is implemented to estimate parameters of interest; namely, sensitivity, specificity, and prevalence of the disease as a function of covariates. To obtain standard errors for confidence interval construction of estimated parameters, the missing information principle is applied to adjust information matrix estimates. We compare the adjusted information matrix-based standard error estimates with the bootstrap standard error estimates, both obtained using the fast MCEM algorithm, through an extensive Monte Carlo study. Simulation demonstrates that the adjusted information matrix approach estimates the standard error similarly to the bootstrap methods under certain scenarios. The bootstrap percentile intervals have satisfactory coverage probabilities. We then apply the LCM analysis to a real data set of 122 subjects from a Gynecologic Oncology Group (GOG) study of significant cervical lesion (S-CL) diagnosis in women with atypical glandular cells of undetermined significance (AGC) to compare the diagnostic accuracy of a histology-based evaluation, a CA-IX biomarker-based test and a human papillomavirus (HPV) DNA test. PMID:24163493
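The following Python sketch shows the core EM iteration for a two-class latent class model with three conditionally independent binary tests, estimating prevalence, sensitivity and specificity. It is a simplification of the paper's approach: covariate-dependent prevalence, the Monte Carlo E-step and the adjusted-information-matrix standard errors are omitted, and the data are synthetic.

```python
# Minimal EM sketch for a two-class latent class model with three tests.
import numpy as np

def lcm_em(tests, n_iter=200):
    """tests: (n, 3) binary array. Returns prevalence, per-test Se and Sp."""
    n, k = tests.shape
    prev, se, sp = 0.3, np.full(k, 0.8), np.full(k, 0.8)   # starting values
    for _ in range(n_iter):
        # E-step: posterior probability of disease given each test pattern
        p_pos = prev * np.prod(se**tests * (1 - se)**(1 - tests), axis=1)
        p_neg = (1 - prev) * np.prod((1 - sp)**tests * sp**(1 - tests), axis=1)
        w = p_pos / (p_pos + p_neg)
        # M-step: update prevalence, sensitivity and specificity
        prev = w.mean()
        se = (w[:, None] * tests).sum(axis=0) / w.sum()
        sp = ((1 - w)[:, None] * (1 - tests)).sum(axis=0) / (1 - w).sum()
    return prev, se, sp

# Synthetic example: 500 subjects, 25% prevalence, three imperfect tests
rng = np.random.default_rng(1)
truth = rng.random(500) < 0.25
tests = np.column_stack([
    np.where(truth, rng.random(500) < 0.9, rng.random(500) < 0.15) for _ in range(3)
]).astype(int)
print(lcm_em(tests))
```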
Vertigo in childhood: proposal for a diagnostic algorithm based upon clinical experience.
Casani, A P; Dallan, I; Navari, E; Sellari Franceschini, S; Cerchiai, N
2015-06-01
The aim of this paper is to analyse, after clinical experience with a series of patients with established diagnoses and review of the literature, all relevant anamnestic features in order to build a simple diagnostic algorithm for vertigo in childhood. This study is a retrospective chart review. A series of 37 children underwent complete clinical and instrumental vestibular examination. Only neurological disorders or genetic diseases represented exclusion criteria. All diagnoses were reviewed after applying the most recent diagnostic guidelines. In our experience, the most common aetiology for dizziness is vestibular migraine (38%), followed by acute labyrinthitis/neuritis (16%) and somatoform vertigo (16%). Benign paroxysmal vertigo was diagnosed in 4 patients (11%) and paroxysmal torticollis was diagnosed in a 1-year-old child. In 8% (3 patients) of cases, the dizziness had a post-traumatic origin: 1 canalolithiasis of the posterior semicircular canal and 2 labyrinthine concussions. Menière's disease was diagnosed in 2 cases. A bilateral vestibular failure of unknown origin caused chronic dizziness in 1 patient. In conclusion, this algorithm could represent a good tool for guiding clinical suspicion toward the correct diagnostic assessment in dizzy children in whom no neurological findings are detectable. The algorithm has just a few simple steps, based mainly on two aspects to be investigated early: temporal features of vertigo and presence of hearing impairment. A different algorithm has been proposed for cases in which a traumatic origin is suspected.
Conwell, Darwin L.; Lee, Linda S.; Yadav, Dhiraj; Longnecker, Daniel S.; Miller, Frank H.; Mortele, Koenraad J.; Levy, Michael J.; Kwon, Richard; Lieb, John G.; Stevens, Tyler; Toskes, Philip P.; Gardner, Timothy B.; Gelrud, Andres; Wu, Bechien U.; Forsmark, Christopher E.; Vege, Santhi S.
2016-01-01
The diagnosis of chronic pancreatitis remains challenging in early stages of the disease. This report defines the diagnostic criteria useful in the assessment of patients with suspected and established chronic pancreatitis. All current diagnostic procedures are reviewed and evidence-based statements are provided about their utility and limitations. Diagnostic criteria for chronic pancreatitis are classified as definitive, probable or insufficient evidence. A diagnostic (STEP-wise; S-survey, T-tomography, E-endoscopy and P-pancreas function testing) algorithm is proposed that proceeds from a non-invasive to a more invasive approach. This algorithm maximizes specificity (low false positive rate) in subjects with chronic abdominal pain and equivocal imaging changes. Furthermore, a nomenclature is suggested to further characterize patients with established chronic pancreatitis based on TIGAR-O (T-toxic, I-idiopathic, G-genetic, A-autoimmune, R-recurrent and O-obstructive) etiology, gland morphology (Cambridge criteria) and physiologic state (exocrine, endocrine function) for uniformity across future multi-center research collaborations. This guideline will serve as a baseline manuscript that will be modified as new evidence becomes available and our knowledge of chronic pancreatitis improves. PMID:25333398
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2008-03-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, using a helical CT scanner for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation together with these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system and a biometric face authentication system. Biometric face authentication used on site in telemedicine makes "Encryption of file" and "Success in login" effective. As a result, patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our film-less radiological information system using the computer-aided diagnosis workstation and our telemedicine network system can increase diagnostic speed, diagnostic accuracy and the security of medical information.
A cDNA microarray gene expression data classifier for clinical diagnostics based on graph theory.
Benso, Alfredo; Di Carlo, Stefano; Politano, Gianfranco
2011-01-01
Despite great advances in discovering cancer molecular profiles, the proper application of microarray technology to routine clinical diagnostics is still a challenge. Current practices in the classification of microarray data show two main limitations: the reliability of the training data sets used to build the classifiers, and the classifiers' performances, especially when the sample to be classified does not belong to any of the available classes. In this case, state-of-the-art algorithms usually produce a high rate of false positives that, in real diagnostic applications, are unacceptable. To address this problem, this paper presents a new cDNA microarray data classification algorithm based on graph theory that is able to overcome most of the limitations of known classification methodologies. The classifier works by analyzing gene expression data organized in an innovative data structure based on graphs, where vertices correspond to genes and edges to gene expression relationships. To demonstrate the novelty of the proposed approach, the authors present an experimental performance comparison between the proposed classifier and several state-of-the-art classification algorithms.
Mohamed, Abdallah S. R.; Ruangskul, Manee-Naad; Awan, Musaddiq J.; Baron, Charles A.; Kalpathy-Cramer, Jayashree; Castillo, Richard; Castillo, Edward; Guerrero, Thomas M.; Kocak-Uzel, Esengul; Yang, Jinzhong; Court, Laurence E.; Kantor, Michael E.; Gunn, G. Brandon; Colen, Rivka R.; Frank, Steven J.; Garden, Adam S.; Rosenthal, David I.
2015-01-01
Purpose To develop a quality assurance (QA) workflow by using a robust, curated, manually segmented anatomic region-of-interest (ROI) library as a benchmark for quantitative assessment of different image registration techniques used for head and neck radiation therapy–simulation computed tomography (CT) with diagnostic CT coregistration. Materials and Methods Radiation therapy–simulation CT images and diagnostic CT images in 20 patients with head and neck squamous cell carcinoma treated with curative-intent intensity-modulated radiation therapy between August 2011 and May 2012 were retrospectively retrieved with institutional review board approval. Sixty-eight reference anatomic ROIs with gross tumor and nodal targets were then manually contoured on images from each examination. Diagnostic CT images were registered with simulation CT images rigidly and by using four deformable image registration (DIR) algorithms: atlas based, B-spline, demons, and optical flow. The resultant deformed ROIs were compared with manually contoured reference ROIs by using similarity coefficient metrics (ie, Dice similarity coefficient) and surface distance metrics (ie, 95% maximum Hausdorff distance). The nonparametric Steel test with control was used to compare different DIR algorithms with rigid image registration (RIR) by using the post hoc Wilcoxon signed-rank test for stratified metric comparison. Results A total of 2720 anatomic and 50 tumor and nodal ROIs were delineated. All DIR algorithms showed improved performance over RIR for anatomic and target ROI conformance, as shown for most comparison metrics (Steel test, P < .008 after Bonferroni correction). The performance of different algorithms varied substantially with stratification by specific anatomic structures or category and simulation CT section thickness. Conclusion Development of a formal ROI-based QA workflow for registration assessment demonstrated improved performance with DIR techniques over RIR. After QA, DIR implementation should be the standard for head and neck diagnostic CT and simulation CT alignment, especially for target delineation. © RSNA, 2014 Online supplemental material is available for this article. PMID:25380454
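A small Python sketch of one of the conformance metrics used in this QA workflow, the Dice similarity coefficient between a deformed ROI and its manual reference, computed on binary masks (the surface distance metric, e.g. the 95% maximum Hausdorff distance, is not shown; the masks below are hypothetical).

```python
# Dice = 2 * |A and B| / (|A| + |B|) for boolean ROI masks of equal shape.
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary ROI masks."""
    a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Hypothetical reference (manual) and deformed ROIs on a 64 x 64 slice
reference = np.zeros((64, 64), bool); reference[20:40, 20:40] = True
deformed = np.zeros((64, 64), bool); deformed[22:42, 22:42] = True
print(dice_coefficient(reference, deformed))
```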
NASA Astrophysics Data System (ADS)
Huang, Shaohua; Wang, Lan; Chen, Weiwei; Lin, Duo; Huang, Lingling; Wu, Shanshan; Feng, Shangyuan; Chen, Rong
2014-09-01
A surface-enhanced Raman spectroscopy (SERS) approach was utilized for urine biochemical analysis with the aim of developing a label-free and non-invasive optical diagnostic method for esophagus cancer detection. SERS spectra were acquired from 31 normal urine samples and 47 malignant esophagus cancer (EC) urine samples. Tentative assignments of urine SERS bands demonstrated esophagus cancer-specific changes, including an increase in the relative amount of urea and a decrease in the percentage of uric acid in the urine of normal subjects compared with EC patients. An empirical algorithm integrated with linear discriminant analysis (LDA) was employed to identify important urine SERS bands for differentiation between healthy subjects and EC urine. The empirical diagnostic approach based on the ratios of the SERS peak intensities at 527 to 1002 cm-1 and 725 to 1002 cm-1, coupled with LDA, yielded a diagnostic sensitivity of 72.3% and a specificity of 96.8%. The area under the receiver operating characteristic (ROC) curve was 0.954, further confirming the performance of the diagnostic algorithm based on the SERS peak intensity ratios combined with LDA analysis. This work demonstrated that urine SERS spectra combined with the empirical algorithm have potential for noninvasive diagnosis of esophagus cancer.
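An illustrative Python sketch of the ratio-plus-LDA classification step described above: two peak-intensity ratios (I527/I1002 and I725/I1002) are fed to a linear discriminant classifier. The peak intensities are synthetic stand-ins, not measured SERS data, so the numbers have no diagnostic meaning.

```python
# Sketch of peak-ratio features with linear discriminant analysis (not the authors' code).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_normal, n_cancer = 31, 47
# Synthetic peak intensities at 527, 725 and 1002 cm^-1 for each sample
normal = rng.normal([1.0, 1.2, 2.0], 0.2, size=(n_normal, 3))
cancer = rng.normal([1.4, 1.6, 2.0], 0.2, size=(n_cancer, 3))
intensities = np.vstack([normal, cancer])
labels = np.r_[np.zeros(n_normal), np.ones(n_cancer)].astype(int)   # 0 normal, 1 EC

ratios = np.column_stack([intensities[:, 0] / intensities[:, 2],    # I527 / I1002
                          intensities[:, 1] / intensities[:, 2]])   # I725 / I1002
lda = LinearDiscriminantAnalysis()
print(cross_val_score(lda, ratios, labels, cv=5).mean())            # cross-validated accuracy
```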
Mental Health Risk Adjustment with Clinical Categories and Machine Learning.
Shrestha, Akritee; Bergquist, Savannah; Montz, Ellen; Rose, Sherri
2017-12-15
To propose nonparametric ensemble machine learning for mental health and substance use disorders (MHSUD) spending risk adjustment formulas, including considering Clinical Classification Software (CCS) categories as diagnostic covariates over the commonly used Hierarchical Condition Category (HCC) system. 2012-2013 Truven MarketScan database. We implement 21 algorithms to predict MHSUD spending, as well as a weighted combination of these algorithms called super learning. The algorithm collection included seven unique algorithms that were supplied with three differing sets of MHSUD-related predictors alongside demographic covariates: HCC, CCS, and HCC + CCS diagnostic variables. Performance was evaluated based on cross-validated R² and predictive ratios. Results show that super learning had the best performance based on both metrics. The top single algorithm was random forests, which improved on ordinary least squares regression by 10 percent with respect to relative efficiency. CCS categories-based formulas were generally more predictive of MHSUD spending compared to HCC-based formulas. Literature supports the potential benefit of implementing a separate MHSUD spending risk adjustment formula. Our results suggest there is an incentive to explore machine learning for MHSUD-specific risk adjustment, as well as considering CCS categories over HCCs. © Health Research and Educational Trust.
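A hedged Python sketch of the super-learning idea using scikit-learn's stacking regressor as a stand-in for the paper's 21-algorithm super learner; the covariates and spending outcome are synthetic, and the library choice and hyperparameters are assumptions for illustration.

```python
# Stacking-style ensemble for spending prediction (illustrative, not the study's formula).
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 20))                                       # demographic + diagnostic covariates
y = 50 * X[:, 0] + 30 * X[:, 1] + rng.normal(scale=25, size=1000)     # synthetic MHSUD spending

ensemble = StackingRegressor(
    estimators=[("ols", LinearRegression()),
                ("ridge", Ridge(alpha=1.0)),
                ("rf", RandomForestRegressor(n_estimators=100, random_state=0))],
    final_estimator=LinearRegression(),   # cross-validated weighting of the base learners
    cv=5)
print(cross_val_score(ensemble, X, y, cv=3, scoring="r2").mean())     # cross-validated R^2
```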
Guglielmi, Valeria; Bellia, Alfonso; Pecchioli, Serena; Medea, Gerardo; Parretti, Damiano; Lauro, Davide; Sbraccia, Paolo; Federici, Massimo; Cricelli, Iacopo; Cricelli, Claudio; Lapi, Francesco
2016-11-15
There are some inconsistencies in prevalence estimates of familial hypercholesterolemia (FH) in the general population across Europe due to variable application of its diagnostic criteria. We aimed to investigate FH epidemiology in Italy by applying the Dutch Lipid Clinic Network (DLCN) score and two alternative diagnostic algorithms to a primary care database. We performed a retrospective population-based study using the Health Search IMS Health Longitudinal Patient Database (HSD) and including active (alive and currently registered with their general practitioners (GPs)) patients on December 31, 2014. Cases of FH were identified by applying the DLCN score. Two further algorithms, based on either ICD9CM coding for FH or some clinical items adopted by the DLCN, were tested against the DLCN itself as the gold standard. We estimated a prevalence of 0.01% for "definite" and 0.18% for "definite" plus "probable" cases as per the DLCN. Algorithms 1 and 2 reported an FH prevalence of 0.9% and 0.13%, respectively. Both algorithms showed high specificity against the DLCN (Algorithm 1: 99.10%; Algorithm 2: 99.9%), but Algorithm 2 identified true positives considerably better (sensitivity = 85.90%) than Algorithm 1 (sensitivity = 10.10%). The application of the DLCN or valid diagnostic alternatives in the Italian primary care setting provides estimates of FH prevalence consistent with those reported in other screening studies in Caucasian populations. These diagnostic criteria should therefore be fostered among GPs. In view of new therapeutic options for FH, the epidemiological picture of FH is even more relevant for foreseeing the costs and planning affordable reimbursement programs in Italy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Karayiannis, N B
2000-01-01
This paper presents the development and investigates the properties of ordered weighted learning vector quantization (LVQ) and clustering algorithms. These algorithms are developed by using gradient descent to minimize reformulation functions based on aggregation operators. An axiomatic approach provides conditions for selecting aggregation operators that lead to admissible reformulation functions. Minimization of admissible reformulation functions based on ordered weighted aggregation operators produces a family of soft LVQ and clustering algorithms, which includes fuzzy LVQ and clustering algorithms as special cases. The proposed LVQ and clustering algorithms are used to perform segmentation of magnetic resonance (MR) images of the brain. The diagnostic value of the segmented MR images provides the basis for evaluating a variety of ordered weighted LVQ and clustering algorithms.
Overcoming limitations of model-based diagnostic reasoning systems
NASA Technical Reports Server (NTRS)
Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.
1989-01-01
The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.
Rajpara, S M; Botello, A P; Townend, J; Ormerod, A D
2009-09-01
Dermoscopy improves diagnostic accuracy of the unaided eye for melanoma, and digital dermoscopy with artificial intelligence or computer diagnosis has also been shown to be useful for the diagnosis of melanoma. At present there is no clear evidence regarding the diagnostic accuracy of dermoscopy compared with artificial intelligence. To evaluate the diagnostic accuracy of dermoscopy and digital dermoscopy/artificial intelligence for melanoma diagnosis and to compare the diagnostic accuracy of the different dermoscopic algorithms with each other and with digital dermoscopy/artificial intelligence for the detection of melanoma. A literature search on dermoscopy and digital dermoscopy/artificial intelligence for melanoma diagnosis was performed using several databases. Titles and abstracts of the retrieved articles were screened using a literature evaluation form. A quality assessment form was developed to assess the quality of the included studies. Heterogeneity among the studies was assessed. Pooled data were analysed using meta-analytical methods and comparisons between different algorithms were performed. Of 765 articles retrieved, 30 studies were eligible for meta-analysis. Pooled sensitivity for artificial intelligence was slightly higher than for dermoscopy (91% vs. 88%; P = 0.076). Pooled specificity for dermoscopy was significantly better than for artificial intelligence (86% vs. 79%; P < 0.001). Pooled diagnostic odds ratio was 51.5 for dermoscopy and 57.8 for artificial intelligence, which were not significantly different (P = 0.783). There were no significant differences in diagnostic odds ratio among the different dermoscopic diagnostic algorithms. Dermoscopy and artificial intelligence performed equally well for diagnosis of melanocytic skin lesions. There was no significant difference in the diagnostic performance of various dermoscopy algorithms. The three-point checklist, the seven-point checklist and Menzies score had better diagnostic odds ratios than the others; however, these results need to be confirmed by a large-scale high-quality population-based study.
A utility/cost analysis of breast cancer risk prediction algorithms
NASA Astrophysics Data System (ADS)
Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.
2016-03-01
Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit-rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.
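The expected-cost reasoning above can be made concrete with a short numeric sketch. The function below routes a fraction of the population to enhanced screening according to the risk-prediction rule's hit rate and false-alarm rate (using the independence assumption stated above) and sums scan costs with the costs of missed cancers and false alarms. All dollar values, sensitivities and specificities are illustrative assumptions, not figures from the paper.

```python
def expected_cost(hit_rate, false_alarm_rate, prevalence,
                  scan_std=80.0, scan_enh=400.0,       # per-exam scan costs (assumed)
                  se_std=0.75, sp_std=0.90,            # standard screening Se/Sp (assumed)
                  se_enh=0.90, sp_enh=0.80,            # enhanced screening Se/Sp (assumed)
                  cost_fn=20000.0, cost_fp=500.0):     # diagnostic outcome costs (assumed)
    """Expected per-woman cost when predicted-high-risk women get enhanced screening."""
    p_enh = hit_rate * prevalence + false_alarm_rate * (1 - prevalence)
    scan = p_enh * scan_enh + (1 - p_enh) * scan_std
    missed = prevalence * (hit_rate * (1 - se_enh) + (1 - hit_rate) * (1 - se_std))
    false_alarms = (1 - prevalence) * (false_alarm_rate * (1 - sp_enh)
                                       + (1 - false_alarm_rate) * (1 - sp_std))
    return scan + missed * cost_fn + false_alarms * cost_fp

# Two hypothetical operating points of a risk-prediction algorithm
print(expected_cost(hit_rate=0.6, false_alarm_rate=0.08, prevalence=0.005))
print(expected_cost(hit_rate=0.8, false_alarm_rate=0.28, prevalence=0.005))
```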
Xu, Jin; Xu, Zhao-Xia; Lu, Ping; Guo, Rui; Yan, Hai-Xia; Xu, Wen-Jie; Wang, Yi-Qin; Xia, Chun-Ming
2016-11-01
To develop an effective Chinese Medicine (CM) diagnostic model of coronary heart disease (CHD) and to confirm the scientific validity of the CM theoretical basis from an algorithmic viewpoint. Four types of objective diagnostic data were collected from 835 CHD patients by using a self-developed CM inquiry scale for the diagnosis of heart problems, a tongue diagnosis instrument, a ZBOX-I pulse digital collection instrument, and a sound acquisition system. These diagnostic data were analyzed and a CM diagnostic model was established using a multi-label learning algorithm (REAL). REAL was employed to establish a Xin (Heart) qi deficiency, Xin yang deficiency, Xin yin deficiency, blood stasis, and phlegm five-card CM diagnostic model, which had recognition rates of 80.32%, 89.77%, 84.93%, 85.37%, and 69.90%, respectively. The multi-label learning method established using four diagnostic models based on mutual information feature selection yielded good recognition results. The characteristic model parameters were selected by maximizing the mutual information for each card type. The four diagnostic methods used to obtain information in CM, i.e., observation, auscultation and olfaction, inquiry, and pulse diagnosis, can be characterized by these parameters, which is consistent with CM theory.
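As a rough illustration of mutual-information feature selection for one syndrome label, the Python sketch below scores synthetic four-diagnostic features with mutual information, keeps the top-ranked ones, and cross-validates a simple classifier. It is not the REAL multi-label algorithm used in the study; the feature count, classifier and data are assumptions.

```python
# Mutual-information feature selection for a single binary syndrome label (illustrative).
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
X = rng.normal(size=(835, 40))      # synthetic inquiry/tongue/pulse/sound features
y = (X[:, 3] + X[:, 17] + rng.normal(scale=0.5, size=835) > 0).astype(int)   # one syndrome label

model = make_pipeline(SelectKBest(mutual_info_classif, k=10),   # keep 10 most informative features
                      LogisticRegression(max_iter=1000))
print(cross_val_score(model, X, y, cv=5).mean())                # recognition-rate estimate
```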
Shrestha, Swastina; Dave, Amish J; Losina, Elena; Katz, Jeffrey N
2016-07-07
Administrative health care data are frequently used to study disease burden and treatment outcomes in many conditions including osteoarthritis (OA). OA is a chronic condition with significant disease burden affecting over 27 million adults in the US. There are few studies examining the performance of administrative data algorithms to diagnose OA. The purpose of this study is to perform a systematic review of administrative data algorithms for OA diagnosis and to evaluate the diagnostic characteristics of algorithms based on restrictiveness and reference standards. Two reviewers independently screened English-language articles published in Medline, Embase, PubMed, and Cochrane databases that used administrative data to identify OA cases. Each algorithm was classified as restrictive or less restrictive based on the number and type of administrative codes required to satisfy the case definition. We recorded sensitivity and specificity of algorithms and calculated positive likelihood ratio (LR+) and positive predictive value (PPV) based on assumed OA prevalences of 0.1, 0.25, and 0.50. The search identified 7 studies that used 13 algorithms. Of these 13 algorithms, 5 were classified as restrictive and 8 as less restrictive. Restrictive algorithms had lower median sensitivity and higher median specificity compared to less restrictive algorithms when reference standards were self-report and American College of Rheumatology (ACR) criteria. The algorithms compared to a reference standard of physician diagnosis had higher sensitivity and specificity than those compared to self-reported diagnosis or ACR criteria. Restrictive algorithms are more specific for OA diagnosis and can be used to identify cases when false positives have higher costs, e.g. interventional studies. Less restrictive algorithms are more sensitive and suited for studies that attempt to identify all cases, e.g. screening programs.
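The derived metrics used in this review follow directly from an algorithm's sensitivity and specificity together with an assumed prevalence; a short Python sketch with illustrative numbers:

```python
def lr_positive(sensitivity, specificity):
    """Positive likelihood ratio: LR+ = Se / (1 - Sp)."""
    return sensitivity / (1.0 - specificity)

def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value from Se, Sp and an assumed prevalence."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical restrictive algorithm (high specificity) at the review's assumed prevalences
for prevalence in (0.10, 0.25, 0.50):
    print(prevalence,
          round(lr_positive(0.60, 0.98), 1),
          round(ppv(0.60, 0.98, prevalence), 3))
```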
Janjua, Naveed Zafar; Islam, Nazrul; Kuo, Margot; Yu, Amanda; Wong, Stanley; Butt, Zahid A; Gilbert, Mark; Buxton, Jane; Chapinal, Nuria; Samji, Hasina; Chong, Mei; Alvarez, Maria; Wong, Jason; Tyndall, Mark W; Krajden, Mel
2018-05-01
Large linked healthcare administrative datasets could be used to monitor programs providing prevention and treatment services to people who inject drugs (PWID). However, diagnostic codes in administrative datasets do not differentiate non-injection from injection drug use (IDU). We validated algorithms based on diagnostic codes and prescription records representing IDU in administrative datasets against interview-based IDU data. The British Columbia Hepatitis Testers Cohort (BC-HTC) includes ∼1.7 million individuals tested for HCV/HIV or reported HBV/HCV/HIV/tuberculosis cases in BC from 1990 to 2015, linked to administrative datasets including physician visit, hospitalization and prescription drug records. IDU, assessed through interviews as part of enhanced surveillance at the time of HIV or HCV/HBV diagnosis from a subset of cases included in the BC-HTC (n = 6559), was used as the gold standard. ICD-9/ICD-10 codes for IDU and injecting-related infections (IRI) were grouped with records of opioid substitution therapy (OST) into multiple IDU algorithms in administrative datasets. We assessed the performance of IDU algorithms through calculation of sensitivity, specificity, positive predictive, and negative predictive values. Sensitivity was highest (90-94%), and specificity was lowest (42-73%) for algorithms based either on IDU or IRI and drug misuse codes. Algorithms requiring both drug misuse and IRI had lower sensitivity (57-60%) and higher specificity (90-92%). An optimal sensitivity and specificity combination was found with two medical visits or a single hospitalization for injectable drugs with OST (83%/82%) and without OST (78%/83%), respectively. Based on algorithms that included two medical visits, a single hospitalization or OST records, there were 41,358 (1.2% of individuals aged 11-65 years in BC) recent PWID in BC based on health encounters during the 3-year period (2013-2015). Algorithms for identifying PWID using diagnostic codes in linked administrative data could be used for tracking the progress of programming aimed at PWID. With population-based datasets, this tool can be used to inform much-needed estimates of PWID population size. Copyright © 2018 Elsevier B.V. All rights reserved.
Bone, Daniel; Bishop, Somer; Black, Matthew P.; Goodwin, Matthew S.; Lord, Catherine; Narayanan, Shrikanth S.
2016-01-01
Background Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely-used ASD screening and diagnostic tools. Methods The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders (DD), split at age 10. Algorithms were created via a robust ML classifier, support vector machine (SVM), while targeting best-estimate clinical diagnosis of ASD vs. non-ASD. Parameter settings were tuned in multiple levels of cross-validation. Results The created algorithms were more effective (higher performing) than current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. Conclusions ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. PMID:27090613
Bone, Daniel; Bishop, Somer L; Black, Matthew P; Goodwin, Matthew S; Lord, Catherine; Narayanan, Shrikanth S
2016-08-01
Machine learning (ML) provides novel opportunities for human behavior research and clinical translation, yet its application can have noted pitfalls (Bone et al., 2015). In this work, we fastidiously utilize ML to derive autism spectrum disorder (ASD) instrument algorithms in an attempt to improve upon widely used ASD screening and diagnostic tools. The data consisted of Autism Diagnostic Interview-Revised (ADI-R) and Social Responsiveness Scale (SRS) scores for 1,264 verbal individuals with ASD and 462 verbal individuals with non-ASD developmental or psychiatric disorders, split at age 10. Algorithms were created via a robust ML classifier, support vector machine, while targeting best-estimate clinical diagnosis of ASD versus non-ASD. Parameter settings were tuned in multiple levels of cross-validation. The created algorithms were more effective (higher performing) than the current algorithms, were tunable (sensitivity and specificity can be differentially weighted), and were more efficient (achieving near-peak performance with five or fewer codes). Results from ML-based fusion of ADI-R and SRS are reported. We present a screener algorithm for below (above) age 10 that reached 89.2% (86.7%) sensitivity and 59.0% (53.4%) specificity with only five behavioral codes. ML is useful for creating robust, customizable instrument algorithms. In a unique dataset comprised of controls with other difficulties, our findings highlight the limitations of current caregiver-report instruments and indicate possible avenues for improving ASD screening and diagnostic tools. © 2016 Association for Child and Adolescent Mental Health.
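A hedged Python sketch of the modelling step described above: a support vector machine trained on a handful of instrument codes with cross-validated parameter tuning inside an outer evaluation loop. The synthetic item scores stand in for ADI-R/SRS codes, and the parameter grid is an assumption, not the authors' settings.

```python
# Nested cross-validation of an SVM over a few behavioral codes (illustrative only).
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.integers(0, 4, size=(600, 5)).astype(float)                       # five behavioral codes (0-3)
y = (X.sum(axis=1) + rng.normal(scale=1.5, size=600) > 7.5).astype(int)   # ASD vs. non-ASD (synthetic)

svm = make_pipeline(StandardScaler(),
                    GridSearchCV(SVC(kernel="rbf"),
                                 {"C": [0.1, 1, 10], "gamma": ["scale", 0.1]},
                                 cv=5))                                    # inner CV tunes parameters
print(cross_val_score(svm, X, y, cv=5).mean())                             # outer CV estimates accuracy
```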
Current challenges in diagnostic imaging of venous thromboembolism.
Huisman, Menno V; Klok, Frederikus A
2015-01-01
Because the clinical diagnosis of deep-vein thrombosis and pulmonary embolism is nonspecific, integrated diagnostic approaches for patients with suspected venous thromboembolism have been developed over the years, involving both non-invasive bedside tools (clinical decision rules and D-dimer blood tests) for patients with low pretest probability and diagnostic techniques (compression ultrasound for deep-vein thrombosis and computed tomography pulmonary angiography for pulmonary embolism) for those with a high pretest probability. This combination has led to standardized diagnostic algorithms with proven safety for excluding venous thrombotic disease. At the same time, it has become apparent that, as a result of the natural history of venous thrombosis, there are special patient populations in which the current standard diagnostic algorithms are not sufficient. In this review, we present 3 evidence-based patient cases to underline recent developments in the imaging diagnosis of venous thromboembolism. © 2015 by The American Society of Hematology. All rights reserved.
[Chronic diarrhoea: Definition, classification and diagnosis].
Fernández-Bañares, Fernando; Accarino, Anna; Balboa, Agustín; Domènech, Eugeni; Esteve, Maria; Garcia-Planella, Esther; Guardiola, Jordi; Molero, Xavier; Rodríguez-Luna, Alba; Ruiz-Cerulla, Alexandra; Santos, Javier; Vaquero, Eva
2016-10-01
Chronic diarrhoea is a common presenting symptom in both primary care medicine and in specialized gastroenterology clinics. It is estimated that >5% of the population has chronic diarrhoea and nearly 40% of these patients are older than 60 years. Clinicians often need to select the best diagnostic approach to these patients and choose between the multiple diagnostic tests available. In 2014 the Catalan Society of Gastroenterology formed a working group with the main objective of creating diagnostic algorithms based on clinical practice and to evaluate diagnostic tests and the scientific evidence available for their use. The GRADE system was used to classify scientific evidence and strength of recommendations. The consensus document contains 28 recommendations and 6 diagnostic algorithms. The document also describes criteria for referral from primary to specialized care. Copyright © 2015 Elsevier España, S.L.U. y AEEH y AEG. All rights reserved.
A Diagnostic Approach for Electro-Mechanical Actuators in Aerospace Systems
NASA Technical Reports Server (NTRS)
Balaban, Edward; Saxena, Abhinav; Bansal, Prasun; Goebel, Kai Frank; Stoelting, Paul; Curran, Simon
2009-01-01
Electro-mechanical actuators (EMA) are finding increasing use in aerospace applications, especially with the trend towards all-electric aircraft and spacecraft designs. However, electro-mechanical actuators still lack the knowledge base accumulated for other fielded actuator types, particularly with regard to fault detection and characterization. This paper presents a thorough analysis of some of the critical failure modes documented for EMAs and describes experiments conducted on detecting and isolating a subset of them. The list of failures has been prepared through an extensive Failure Modes, Effects and Criticality Analysis (FMECA) reference, literature review, and accessible industry experience. Methods for data acquisition and validation of algorithms on EMA test stands are described. A variety of condition indicators were developed that enabled detection, identification, and isolation among the various fault modes. A diagnostic algorithm based on an artificial neural network is shown to operate successfully using these condition indicators; furthermore, the robustness of these diagnostic routines to sensor faults is demonstrated by showing their ability to distinguish between them and component failures. The paper concludes with a roadmap leading from this effort towards developing successful prognostic algorithms for electromechanical actuators.
Real-time plasma control based on the ISTTOK tomography diagnostic
NASA Astrophysics Data System (ADS)
Carvalho, P. J.; Carvalho, B. B.; Neto, A.; Coelho, R.; Fernandes, H.; Sousa, J.; Varandas, C.; Chávez-Alarcón, E.; Herrera-Velázquez, J. J. E.
2008-10-01
The presently available processing power in generic processing units (GPUs) combined with state-of-the-art programmable logic devices benefits the implementation of complex, real-time driven, data processing algorithms for plasma diagnostics. A tomographic reconstruction diagnostic has been developed for the ISTTOK tokamak, based on three linear pinhole cameras each with ten lines of sight. The plasma emissivity in a poloidal cross section is computed locally on a submillisecond time scale, using a Fourier-Bessel algorithm, allowing the use of the output signals for active plasma position control. The data acquisition and reconstruction (DAR) system is based on ATCA technology and consists of one acquisition board with integrated field programmable gate array (FPGA) capabilities and a dual-core Pentium module running real-time application interface (RTAI) Linux. In this paper, the DAR real-time firmware/software implementation is presented, based on (i) front-end digital processing in the FPGA; (ii) a device driver specially developed for the board which enables streaming data acquisition to the host GPU; and (iii) a fast reconstruction algorithm running in Linux RTAI. This system behaves as a module of the central ISTTOK control and data acquisition system (FIRESIGNAL). Preliminary results of the above experimental setup are presented and a performance benchmarking against the magnetic coil diagnostic is shown.
Seizures in the elderly: development and validation of a diagnostic algorithm.
Dupont, Sophie; Verny, Marc; Harston, Sandrine; Cartz-Piver, Leslie; Schück, Stéphane; Martin, Jennifer; Puisieux, François; Alecu, Cosmin; Vespignani, Hervé; Marchal, Cécile; Derambure, Philippe
2010-05-01
Seizures are frequent in the elderly, but their diagnosis can be challenging. The objective of this work was to develop and validate an expert-based algorithm for the diagnosis of seizures in elderly people. A multidisciplinary group of neurologists and geriatricians developed a diagnostic algorithm using a combination of selected clinical, electroencephalographical and radiological criteria. The algorithm was validated by multicentre retrospective analysis of data of patients referred for specific symptoms and classified by the experts as epileptic patients or not. The algorithm was applied to all the patients, and the diagnosis provided by the algorithm was compared to the clinical diagnosis of the experts. Twenty-nine clinical, electroencephalographical and radiological criteria were selected for the algorithm. According to the combination of criteria, seizures were classified into four levels of diagnosis: certain, highly probable, possible or improbable. To validate the algorithm, the medical records of 269 elderly patients were analyzed (138 with epileptic seizures, 131 with non-epileptic manifestations). Patients were mainly referred for a transient focal deficit (40%), confusion (38%) or unconsciousness (27%). The algorithm best classified certain and probable seizures versus possible and improbable seizures, with 86.2% sensitivity and 67.2% specificity. Using logistic regression, 2 simplified models were developed, the first with 13 criteria (Se 85.5%, Sp 90.1%), and the second with 7 criteria only (Se 84.8%, Sp 88.6%). In conclusion, the present study validated the use of a revised diagnostic algorithm to help diagnose epileptic seizures in the elderly. A prospective study is planned to further validate this algorithm. Copyright 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kakinuma, Ryutaru; Moriyama, Noriyuki
2009-02-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. Moreover, there is a shortage of doctors in Japan who can read such medical images. To overcome these problems, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, using a helical CT scanner for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation together with these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system and a biometric face authentication system. Biometric face authentication used on site in telemedicine makes "Encryption of file" and "Success in login" effective. As a result, patients' private information is protected. The screen of the Web medical image conference system can be shared by two or more web conference terminals at the same time. Opinions can be exchanged by using a camera and a microphone connected to the workstation. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our film-less radiological information system using the computer-aided diagnosis workstation and our telemedicine network system can increase diagnostic speed, diagnostic accuracy and the security of medical information.
Combined algorithmic and GPU acceleration for ultra-fast circular conebeam backprojection
NASA Astrophysics Data System (ADS)
Brokish, Jeffrey; Sack, Paul; Bresler, Yoram
2010-04-01
In this paper, we describe the first implementation and performance of a fast O(N³ log N) hierarchical backprojection algorithm for cone beam CT with a circular trajectory, developed on a modern graphics processing unit (GPU). The resulting tomographic backprojection system for 3D cone beam geometry combines speedup through algorithmic improvements provided by the hierarchical backprojection algorithm with speedup from a massively parallel hardware accelerator. For data parameters typical in diagnostic CT and using a mid-range GPU card, we report reconstruction speeds of up to 360 frames per second, and relative speedup of almost 6x compared to conventional backprojection on the same hardware. The significance of these results is twofold. First, they demonstrate that the reduction in operation counts demonstrated previously for the FHBP algorithm can be translated to a comparable run-time improvement in a massively parallel hardware implementation, while preserving stringent diagnostic image quality. Second, the dramatic speedup and throughput numbers achieved indicate the feasibility of systems based on this technology, which achieve real-time 3D reconstruction for state-of-the-art diagnostic CT scanners with small footprint, high reliability, and affordable cost.
Deep learning based syndrome diagnosis of chronic gastritis.
Liu, Guo-Ping; Yan, Jian-Jun; Wang, Yi-Qin; Zheng, Wu; Zhong, Tao; Lu, Xiong; Qian, Peng
2014-01-01
In Traditional Chinese Medicine (TCM), most of the algorithms used to solve problems of syndrome diagnosis are shallow-structure algorithms that do not consider the cognitive perspective of the brain. However, in clinical practice, there is a complex and nonlinear relationship between symptoms (signs) and syndrome. We therefore employed deep learning and multi-label learning to construct a syndrome diagnostic model for chronic gastritis (CG) in TCM. The results showed that deep learning could improve the accuracy of syndrome recognition. Moreover, this study will provide a reference for constructing syndrome diagnostic models and guiding clinical practice.
Deep Learning Based Syndrome Diagnosis of Chronic Gastritis
Liu, Guo-Ping; Wang, Yi-Qin; Zheng, Wu; Zhong, Tao; Lu, Xiong; Qian, Peng
2014-01-01
In Traditional Chinese Medicine (TCM), most of the algorithms used to solve problems of syndrome diagnosis are shallow-structure algorithms that do not consider the cognitive perspective of the brain. However, in clinical practice, there is a complex and nonlinear relationship between symptoms (signs) and syndrome. We therefore employed deep learning and multi-label learning to construct a syndrome diagnostic model for chronic gastritis (CG) in TCM. The results showed that deep learning could improve the accuracy of syndrome recognition. Moreover, this study will provide a reference for constructing syndrome diagnostic models and guiding clinical practice. PMID:24734118
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mlynar, J.; Weinzettl, V.; Imrisek, M.
2012-10-15
The contribution focuses on plasma tomography via the minimum Fisher regularisation (MFR) algorithm applied on data from the recently commissioned tomographic diagnostics on the COMPASS tokamak. The MFR expertise is based on previous applications at Joint European Torus (JET), as exemplified in a new case study of the plasma position analyses based on JET soft x-ray (SXR) tomographic reconstruction. Subsequent application of the MFR algorithm on COMPASS data from cameras with absolute extreme ultraviolet (AXUV) photodiodes disclosed a peaked radiating region near the limiter. Moreover, its time evolution indicates transient plasma edge cooling following a radial plasma shift. In the SXR data, MFR demonstrated that a high resolution plasma positioning independent of the magnetic diagnostics would be possible provided that a proper calibration of the cameras on an x-ray source is undertaken.
[Coagulation Monitoring and Bleeding Management in Cardiac Surgery].
Bein, Berthold; Schiewe, Robert
2018-05-01
The transfusion of allogeneic blood products is associated with increased morbidity and mortality. Impaired hemostasis is frequently found in patients undergoing cardiac surgery and may in turn cause bleeding and transfusion. Goal-directed coagulation management addressing the often complex coagulation disorders requires sophisticated diagnostics. This may improve patient outcomes and reduce costs. Recent data suggest that coagulation management based on a rational algorithm is more effective than traditional therapy based on conventional laboratory variables such as PT and INR. Platelet inhibitors, coumarins, direct oral anticoagulants and heparin require different diagnostic and therapeutic approaches. An algorithm specifically developed for use during cardiac surgery is presented. Georg Thieme Verlag KG Stuttgart · New York.
Walusimbi, Simon; Kwesiga, Brendan; Rodrigues, Rashmi; Haile, Melles; de Costa, Ayesha; Bogg, Lennart; Katamba, Achilles
2016-10-10
Microscopic Observation Drug Susceptibility (MODS) and Xpert MTB/Rif (Xpert) are highly sensitive tests for diagnosis of pulmonary tuberculosis (PTB). This study evaluated the cost-effectiveness of utilizing MODS versus Xpert for diagnosis of active pulmonary TB in HIV-infected patients in Uganda. A decision analysis model comparing MODS versus Xpert for TB diagnosis was used. Costs were estimated by measuring and valuing relevant resources required to perform the MODS and Xpert tests. Diagnostic accuracy data of the tests were obtained from systematic reviews involving HIV-infected patients. We calculated base values for unit costs and varied several assumptions to obtain the range estimates. Cost-effectiveness was expressed as cost per TB patient diagnosed for each of the two diagnostic strategies. Base case analysis was performed using the base estimates for unit cost and diagnostic accuracy of the tests. Sensitivity analysis was performed using a range of value estimates for resources, prevalence, number of tests and diagnostic accuracy. The unit cost of MODS was US$ 6.53 versus US$ 12.41 for Xpert. Consumables accounted for 59 % (US$ 3.84 of 6.53) of the unit cost for MODS and 84 % (US$ 10.37 of 12.41) of the unit cost for Xpert. The cost-effectiveness ratio of the algorithm using MODS was US$ 34 per TB patient diagnosed compared to US$ 71 for the algorithm using Xpert. The algorithm using MODS was more cost-effective compared to the algorithm using Xpert for a wide range of different values of accuracy, cost and TB prevalence. The threshold cost at which the algorithm using Xpert became optimal over the algorithm using MODS was US$ 5.92. MODS was more cost-effective than Xpert for the diagnosis of PTB among HIV patients in our setting. Efforts to scale up MODS therefore need to be explored. However, since other non-economic factors may still favour the use of Xpert, the current cost of the Xpert cartridge still needs to be reduced further by more than half, in order to make it economically competitive with MODS.
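The cost-effectiveness ratio used here reduces to a simple calculation: total testing cost divided by the expected number of TB patients diagnosed. The Python sketch below uses the unit costs reported in the abstract (US$ 6.53 for MODS, US$ 12.41 for Xpert) but assumed sensitivities, prevalence and cohort size, so the outputs are illustrative rather than the study's US$ 34 and US$ 71 figures.

```python
def cost_per_tb_diagnosed(unit_cost, sensitivity, prevalence, n_patients):
    """Cost-effectiveness ratio: total testing cost per true TB case detected."""
    total_cost = unit_cost * n_patients
    true_positives = sensitivity * prevalence * n_patients
    return total_cost / true_positives

n_patients, prevalence = 1000, 0.20           # assumed cohort size and TB prevalence
for name, unit_cost, sensitivity in [("MODS", 6.53, 0.92), ("Xpert", 12.41, 0.88)]:
    # Sensitivities here are assumptions for illustration, not study estimates
    print(name, round(cost_per_tb_diagnosed(unit_cost, sensitivity, prevalence, n_patients), 2))
```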
NASA Astrophysics Data System (ADS)
Chandra, Malavika; Scheiman, James; Simeone, Diane; McKenna, Barbara; Purdy, Julianne; Mycek, Mary-Ann
2010-01-01
Pancreatic adenocarcinoma is one of the leading causes of cancer death, in part because of the inability of current diagnostic methods to reliably detect early-stage disease. We present the first assessment of the diagnostic accuracy of algorithms developed for pancreatic tissue classification using data from fiber optic probe-based bimodal optical spectroscopy, a real-time approach that would be compatible with minimally invasive diagnostic procedures for early cancer detection in the pancreas. A total of 96 fluorescence and 96 reflectance spectra are considered from 50 freshly excised tissue sites, including human pancreatic adenocarcinoma, chronic pancreatitis (inflammation), and normal tissues, from nine patients. Classification algorithms using linear discriminant analysis are developed to distinguish among tissues, and leave-one-out cross-validation is employed to assess the classifiers' performance. The spectral areas and ratios classifier (SpARC) algorithm employs a combination of reflectance and fluorescence data and has the best performance, with sensitivity, specificity, negative predictive value, and positive predictive value for correctly identifying adenocarcinoma being 85, 89, 92, and 80%, respectively.
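An illustrative Python sketch of the classification strategy described above: linear discriminant analysis on combined spectral features, evaluated with leave-one-out cross-validation. The feature matrix and labels are synthetic, the derived sensitivity and specificity have no clinical meaning, and the SpARC feature definitions themselves are not reproduced.

```python
# LDA with leave-one-out cross-validation on synthetic spectral features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(6)
X = rng.normal(size=(50, 6))            # spectral areas and ratios per tissue site (synthetic)
y = rng.integers(0, 3, size=50)         # 0 normal, 1 pancreatitis, 2 adenocarcinoma
X[y == 2] += 0.8                        # inject a weak class signal so the example is non-trivial

pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
sens = ((pred == 2) & (y == 2)).sum() / (y == 2).sum()    # adenocarcinoma sensitivity
spec = ((pred != 2) & (y != 2)).sum() / (y != 2).sum()    # adenocarcinoma specificity
print(round(sens, 2), round(spec, 2))
```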
Integration of On-Line and Off-Line Diagnostic Algorithms for Aircraft Engine Health Management
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2007-01-01
This paper investigates the integration of on-line and off-line diagnostic algorithms for aircraft gas turbine engines. The on-line diagnostic algorithm is designed for in-flight fault detection. It continuously monitors engine outputs for anomalous signatures induced by faults. The off-line diagnostic algorithm is designed to track engine health degradation over the lifetime of an engine. It estimates engine health degradation periodically over the course of the engine's life. The estimate generated by the off-line algorithm is used to update the on-line algorithm. Through this integration, the on-line algorithm becomes aware of engine health degradation, and its effectiveness in detecting faults can be maintained while the engine continues to degrade. The benefit of this integration is investigated in a simulation environment using a nonlinear engine model.
Iyatomi, Hitoshi; Oka, Hiroshi; Saito, Masataka; Miyake, Ayako; Kimoto, Masayuki; Yamagami, Jun; Kobayashi, Seiichiro; Tanikawa, Akiko; Hagiwara, Masafumi; Ogawa, Koichi; Argenziano, Giuseppe; Soyer, H Peter; Tanaka, Masaru
2006-04-01
The aims of this study were to provide a quantitative assessment of the tumour area extracted by dermatologists and to evaluate computer-based methods from dermoscopy images for refining a computer-based melanoma diagnostic system. Dermoscopic images of 188 Clark naevi, 56 Reed naevi and 75 melanomas were examined. Five dermatologists manually drew the border of each lesion with a tablet computer. The inter-observer variability was evaluated and the standard tumour area (STA) for each dermoscopy image was defined. Manual extractions by 10 non-medical individuals and by two computer-based methods were evaluated with STA-based assessment criteria: precision and recall. Our new computer-based method introduced the region-growing approach in order to yield results close to those obtained by dermatologists. The effectiveness of our extraction method with regard to diagnostic accuracy was evaluated. Two linear classifiers were built using the results of conventional and new computer-based tumour area extraction methods. The final diagnostic accuracy was evaluated by drawing the receiver operating characteristic (ROC) curve of each classifier and evaluating the area under each ROC curve. The standard deviations of the tumour area extracted by five dermatologists and 10 non-medical individuals were 8.9% and 10.7%, respectively. After assessment of the extraction results by dermatologists, the STA was defined as the area that was selected by more than two dermatologists. Dermatologists selected the melanoma area with statistically smaller divergence than that of Clark naevus or Reed naevus (P = 0.05). By contrast, non-medical individuals did not show this difference. Our new computer-based extraction algorithm showed superior performance (precision, 94.1%; recall, 95.3%) to the conventional thresholding method (precision, 99.5%; recall, 87.6%). These results indicate that our new algorithm extracted a tumour area close to that obtained by dermatologists and, in particular, the border part of the tumour was adequately extracted. With this refinement, the area under the ROC curve increased from 0.795 to 0.875 and the diagnostic accuracy showed an increase of approximately 20% in specificity when the sensitivity was 80%. It can be concluded that our computer-based tumour extraction algorithm extracted almost the same area as that obtained by dermatologists and provided improved computer-based diagnostic accuracy.
Lee, Jae-Hong; Kim, Do-Hyung; Jeong, Seong-Nyum; Choi, Seong-Ho
2018-04-01
The aim of the current study was to develop a computer-assisted detection system based on a deep convolutional neural network (CNN) algorithm and to evaluate the potential usefulness and accuracy of this system for the diagnosis and prediction of periodontally compromised teeth (PCT). Combining pretrained deep CNN architecture and a self-trained network, periapical radiographic images were used to determine the optimal CNN algorithm and weights. The diagnostic and predictive accuracy, sensitivity, specificity, positive predictive value, negative predictive value, receiver operating characteristic (ROC) curve, area under the ROC curve, confusion matrix, and 95% confidence intervals (CIs) were calculated using our deep CNN algorithm, based on a Keras framework in Python. The periapical radiographic dataset was split into training (n=1,044), validation (n=348), and test (n=348) datasets. With the deep learning algorithm, the diagnostic accuracy for PCT was 81.0% for premolars and 76.7% for molars. Using 64 premolars and 64 molars that were clinically diagnosed as severe PCT, the accuracy of predicting extraction was 82.8% (95% CI, 70.1%-91.2%) for premolars and 73.4% (95% CI, 59.9%-84.0%) for molars. We demonstrated that the deep CNN algorithm was useful for assessing the diagnosis and predictability of PCT. Therefore, with further optimization of the PCT dataset and improvements in the algorithm, a computer-aided detection system can be expected to become an effective and efficient method of diagnosing and predicting PCT.
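As a rough illustration of the transfer-learning setup the abstract outlines (a pretrained deep CNN backbone combined with a self-trained classification head, implemented in Keras), a minimal sketch is given below; the choice of VGG16 as backbone, the input size and all hyperparameters are assumptions and are not taken from the study.

```python
# Minimal Keras sketch: pretrained backbone + self-trained head for a binary
# "periodontally compromised tooth" classifier. Architecture and settings are
# illustrative assumptions only.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_pct_classifier(input_shape=(224, 224, 3)):
    backbone = tf.keras.applications.VGG16(
        include_top=False, weights="imagenet", input_shape=input_shape)
    backbone.trainable = False                      # keep pretrained weights fixed
    model = models.Sequential([
        backbone,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),       # self-trained head
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),      # PCT vs. non-PCT
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
    return model

model = build_pct_classifier()
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # hypothetical datasets, e.g. a 1,044/348/348 split
```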
Diagnostic Algorithm Benchmarking
NASA Technical Reports Server (NTRS)
Poll, Scott
2011-01-01
A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking of diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.
Demirci, Oguz; Clark, Vincent P; Calhoun, Vince D
2008-02-15
Schizophrenia is diagnosed based largely upon behavioral symptoms. Currently, no quantitative, biologically based diagnostic technique has yet been developed to identify patients with schizophrenia. Classification of individuals into schizophrenia patient and healthy control groups based on quantitative, biologically based data is of great interest to support and refine psychiatric diagnoses. We applied a novel projection pursuit technique to various components obtained with independent component analysis (ICA) of 70 subjects' fMRI activation maps obtained during an auditory oddball task. The validity of the technique was tested with a leave-one-out method and the detection performance varied between 80% and 90%. The findings suggest that the proposed data reduction algorithm is effective in classifying individuals into schizophrenia and healthy control groups and may eventually prove useful as a diagnostic tool.
Nallikuzhy, Jiss J; Dandapat, S
2017-06-01
In this work, a new patient-specific approach to enhance the spatial resolution of ECG is proposed and evaluated. The proposed model transforms a three-lead ECG into a standard twelve-lead ECG, thereby enhancing its spatial resolution. The three leads used for prediction are obtained from the standard twelve-lead ECG. The proposed model takes advantage of the improved inter-lead correlation in the wavelet domain. Since the model is patient-specific, it also selects the optimal predictor leads for a given patient using a lead selection algorithm. The lead selection algorithm is based on a new diagnostic similarity score which computes the diagnostic closeness between the original and the spatially enhanced leads. Standard closeness measures are used to assess the performance of the model. The similarity in diagnostic information between the original and the spatially enhanced leads is evaluated using various diagnostic measures. Repeatability and diagnosability analyses are performed to quantify the applicability of the model. A comparison of the proposed model is performed with existing models that transform a subset of the standard twelve-lead ECG into the standard twelve-lead ECG. From the analysis of the results, it is evident that the proposed model preserves diagnostic information better compared to other models.
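As an illustrative sketch of the general idea (predicting a lead from predictor leads via inter-lead correlation in the wavelet domain), the snippet below fits per-subband least-squares weights. The 'db4' wavelet, the ordinary least-squares mapping and the synthetic signals are assumptions; the paper's actual patient-specific model and lead-selection score are more elaborate.

```python
import numpy as np
import pywt

def predict_lead(predictors, target, wavelet="db4", level=4):
    """Fit per-subband least-squares weights mapping predictor-lead wavelet
    coefficients to the target lead, and return the reconstructed estimate.
    (In practice the weights would be fitted on a training segment of the
    same patient and applied to new data.)"""
    coeffs_p = [pywt.wavedec(p, wavelet, level=level) for p in predictors]
    coeffs_t = pywt.wavedec(target, wavelet, level=level)
    est = []
    for band in range(level + 1):
        X = np.stack([c[band] for c in coeffs_p], axis=1)   # (n_coeffs, n_leads)
        w, *_ = np.linalg.lstsq(X, coeffs_t[band], rcond=None)
        est.append(X @ w)
    return pywt.waverec(est, wavelet)

# Synthetic demo: the "target" lead is a noisy mixture of three predictor leads
t = np.linspace(0, 4, 2048)
leads = [np.sin(2 * np.pi * f * t) for f in (1.0, 1.3, 1.7)]
target = 0.5 * leads[0] - 0.8 * leads[1] + 0.3 * leads[2] + 0.01 * np.random.randn(t.size)
estimate = predict_lead(leads, target)
print("correlation:", np.corrcoef(target, estimate[: target.size])[0, 1])
```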
NASA Technical Reports Server (NTRS)
Maul, William A.; Chicatelli, Amy; Fulton, Christopher E.; Balaban, Edward; Sweet, Adam; Hayden, Sandra Claire; Bajwa, Anupa
2005-01-01
The Propulsion IVHM Technology Experiment (PITEX) has been an on-going research effort conducted over several years. PITEX has developed and applied a model-based diagnostic system for the main propulsion system of the X-34 reusable launch vehicle, a space-launch technology demonstrator. The application was simulation-based using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, which is the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 Main Propulsion System (MPS) was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real-time on flight-like hardware. In an attempt to expose potential performance problems, these PITEX algorithms were subject to numerous real-world effects in the simulated data including noise, sensor resolution, command/valve talkback information, and nominal build variations. The current research has demonstrated the potential benefits of model-based diagnostics, defined the performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.
A diagnostic approach to hemochromatosis
Tavill, Anthony S; Adams, Paul C
2006-01-01
In the present clinical review, a diagnostic approach to hemochromatosis is discussed from the perspective of two clinicians with extensive experience in this area. The introduction of genetic testing and large-scale population screening studies have broadened our understanding of the clinical expression of disease and the utility of biochemical iron tests for the detection of disease and for the assessment of disease severity. Liver biopsy has become more of a prognostic test than a diagnostic test. The authors offer a stepwise diagnostic algorithm, based on current evidence-based data, that they regard as the most cost-effective approach. An early diagnosis can lead to phlebotomy therapy to prevent the development of cirrhosis.
Model-Based Diagnosis in a Power Distribution Test-Bed
NASA Technical Reports Server (NTRS)
Scarl, E.; McCall, K.
1998-01-01
The Rodon model-based diagnosis shell was applied to a breadboard test-bed, modeling an automated power distribution system. The constraint-based modeling paradigm and diagnostic algorithm were found to adequately represent the selected set of test scenarios.
PCA-based artifact removal algorithm for stroke detection using UWB radar imaging.
Ricci, Elisa; di Domenico, Simone; Cianca, Ernestina; Rossi, Tommaso; Diomedi, Marina
2017-06-01
Stroke patients should be dispatched to the highest level of care available in the shortest time. In this context, a transportable system in specialized ambulances, able to evaluate the presence of an acute brain lesion in a short time interval (i.e., a few minutes), could shorten the delay to treatment. UWB radar imaging is an emerging diagnostic branch that has great potential for the implementation of a transportable and low-cost device. Transportability, low cost and short response time pose challenges to the signal processing algorithms applied to the backscattered signals, as they should guarantee good performance with a reasonably low number of antennas and low computational complexity, tightly related to the response time of the device. The paper shows that a PCA-based preprocessing algorithm can: (1) achieve good performance already with a computationally simple beamforming algorithm; (2) outperform state-of-the-art preprocessing algorithms; (3) enable a further improvement in the performance (and/or decrease in the number of antennas) by using a multistatic approach with just a modest increase in computational complexity. This is an important result toward the implementation of such a diagnostic device that could play an important role in emergency scenarios.
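Below is a minimal sketch of PCA-based artifact removal of the kind described, assuming the multichannel backscatter is arranged as a channels-by-samples matrix and that the dominant principal component captures the reflection common to all channels (e.g., the skin/skull response). The number of removed components and the synthetic data are purely illustrative.

```python
import numpy as np

def remove_artifact_pca(signals, n_remove=1):
    """signals: array of shape (n_channels, n_samples). Returns the signals
    with the first n_remove principal components projected out."""
    u, s, vt = np.linalg.svd(signals, full_matrices=False)
    artifact = (u[:, :n_remove] * s[:n_remove]) @ vt[:n_remove]
    return signals - artifact

# Synthetic demo: a strong, nearly common early reflection plus a weak late echo
rng = np.random.default_rng(0)
t = np.arange(512)
artifact = 5.0 * np.exp(-((t - 60) / 10.0) ** 2)           # common early reflection
gains = 1.0 + 0.05 * rng.standard_normal(8)                 # slightly different gain per channel
channels = gains[:, None] * artifact + 0.05 * rng.standard_normal((8, 512))
channels[3] += 0.5 * np.exp(-((t - 300) / 8.0) ** 2)        # weak "lesion" echo in one channel

cleaned = remove_artifact_pca(channels)
print(np.abs(cleaned[3]).argmax())   # ~300: the weak echo survives, the common reflection is removed
```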
Disk Crack Detection for Seeded Fault Engine Test
NASA Technical Reports Server (NTRS)
Luo, Huageng; Rodriguez, Hector; Hallman, Darren; Corbly, Dennis; Lewicki, David G. (Technical Monitor)
2004-01-01
Work was performed to develop and demonstrate vibration diagnostic techniques for the on-line detection of engine rotor disk cracks and other anomalies through a real engine test. An existing single-degree-of-freedom non-resonance-based vibration algorithm was extended to a multi-degree-of-freedom model. In addition, a resonance-based algorithm was also proposed for the case of one or more resonances. The algorithms were integrated into a diagnostic system using state-of-the-art commercial analysis equipment. The system required only non-rotating vibration signals, such as accelerometers and proximity probes, and the rotor shaft 1/rev signal to conduct the health monitoring. Before the engine test, the integrated system was tested in the laboratory by using a small rotor with controlled mass unbalances. The laboratory tests verified the system integration and both the non-resonance and the resonance-based algorithm implementations. In the engine test, the system concluded that after two weeks of cycling, the seeded fan disk flaw did not propagate to a large enough size to be detected by changes in the synchronous vibration. The unbalance induced by mass shifting during the start up and coast down was still the dominant response in the synchronous vibration.
Hatzichristou, Dimitris; Kirana, Paraskevi-Sofia; Banner, Linda; Althof, Stanley E; Lonnee-Hoffmann, Risa A M; Dennerstein, Lorraine; Rosen, Raymond C
2016-08-01
A detailed sexual history is the cornerstone for all sexual problem assessments and sexual dysfunction diagnoses. Diagnostic evaluation is based on an in-depth sexual history, including sexual and gender identity and orientation, sexual activity and function, current level of sexual function, overall health and comorbidities, partner relationship and interpersonal factors, and the role of cultural and personal expectations and attitudes. The aim was to propose key steps in the diagnostic evaluation of sexual dysfunctions, with special focus on the use of symptom scales and questionnaires. The method was a critical assessment of the current literature by the International Consultation on Sexual Medicine committee. The outcomes were a revised algorithm for the management of sexual dysfunctions, levels of evidence, and recommendations for scales and questionnaires. The International Consultation on Sexual Medicine proposes an updated algorithm for diagnostic evaluation of sexual dysfunction in men and women, with specific recommendations for sexual history taking and diagnostic evaluation. Standardized scales, checklists, and validated questionnaires are additional adjuncts that should be used routinely in sexual problem evaluation. Scales developed for specific patient groups are included. Results of this evaluation are presented with recommendations for clinical and research uses. Defined principles, an algorithm and a range of scales may provide coherent and evidence-based management for sexual dysfunctions.
Retinex enhancement of infrared images.
Li, Ying; He, Renjie; Xu, Guizhi; Hou, Changzhi; Sun, Yunyan; Guo, Lei; Rao, Liyun; Yan, Weili
2008-01-01
With the ability to image the temperature distribution of the body, infrared imaging is promising for the diagnosis and prognosis of disease. However, the poor quality of raw infrared images has limited their application, and one of the essential problems is the low contrast of the imaged object. In this paper, image enhancement based on the Retinex theory, a process that automatically restores visual realism to images, is studied. The algorithms, including the Frankle-McCann algorithm, the McCann99 algorithm, the single-scale Retinex algorithm, the multi-scale Retinex algorithm and the multi-scale Retinex algorithm with color restoration (MSRCR), were applied to the enhancement of infrared images. Entropy measurements along with visual inspection were compared, and the results showed that algorithms based on Retinex theory are able to enhance infrared images. Of the algorithms compared, MSRCR demonstrated the best performance.
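For concreteness, a minimal sketch of the single-scale and multi-scale Retinex variants compared in the study is shown below; the Gaussian sigma values, the percentile stretch used for display and the synthetic frame are illustrative choices only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(image, sigma=30.0, eps=1e-6):
    """Log-ratio of the image to a Gaussian-blurred illumination estimate."""
    image = image.astype(np.float64) + eps
    illumination = gaussian_filter(image, sigma) + eps
    return np.log(image) - np.log(illumination)

def multi_scale_retinex(image, sigmas=(15.0, 80.0, 250.0)):
    """Equal-weight average of single-scale Retinex outputs at several scales."""
    return np.mean([single_scale_retinex(image, s) for s in sigmas], axis=0)

def stretch(r, eps=1e-6):
    """Robust percentile stretch to [0, 1] for display."""
    lo, hi = np.percentile(r, [1, 99])
    return np.clip((r - lo) / (hi - lo + eps), 0.0, 1.0)

# Synthetic low-contrast "infrared" frame: a faint warm spot on a smooth gradient
y, x = np.mgrid[0:256, 0:256]
frame = 0.4 + 0.2 * (x / 255.0) + 0.03 * np.exp(-((x - 90) ** 2 + (y - 140) ** 2) / 200.0)
enhanced = stretch(multi_scale_retinex(frame))
print(f"std before: {frame.std():.3f}, after: {enhanced.std():.3f}")   # contrast increases
```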
Case-Deletion Diagnostics for Nonlinear Structural Equation Models
ERIC Educational Resources Information Center
Lee, Sik-Yum; Lu, Bin
2003-01-01
In this article, a case-deletion procedure is proposed to detect influential observations in a nonlinear structural equation model. The key idea is to develop the diagnostic measures based on the conditional expectation of the complete-data log-likelihood function in the EM algorithm. A one-step pseudo approximation is proposed to reduce the…
ERIC Educational Resources Information Center
Hus, Vanessa; Lord, Catherine
2014-01-01
The recently published Autism Diagnostic Observation Schedule, 2nd edition (ADOS-2) includes revised diagnostic algorithms and standardized severity scores for modules used to assess younger children. A revised algorithm and severity scores are not yet available for Module 4, used with verbally fluent adults. The current study revises the Module 4…
ERIC Educational Resources Information Center
Oosterling, Iris; Roos, Sascha; de Bildt, Annelies; Rommelse, Nanda; de Jonge, Maretha; Visser, Janne; Lappenschaar, Martijn; Swinkels, Sophie; van der Gaag, Rutger Jan; Buitelaar, Jan
2010-01-01
Recently, Gotham et al. (2007) proposed revised algorithms for the Autism Diagnostic Observation Schedule (ADOS) with improved diagnostic validity. The aim of the current study was to replicate predictive validity, factor structure, and correlations with age and verbal and nonverbal IQ of the ADOS revised algorithms for Modules 1 and 2…
New web-based algorithm to improve rigid gas permeable contact lens fitting in keratoconus.
Ortiz-Toquero, Sara; Rodriguez, Guadalupe; de Juan, Victoria; Martin, Raul
2017-06-01
To calculate and validate a new web-based algorithm for selecting the back optic zone radius (BOZR) of spherical gas permeable (GP) lenses in keratoconus eyes. A retrospective calculation (n=35; multiple regression analysis) and a subsequent prospective validation (new sample of 50 keratoconus eyes) of a new algorithm to select the BOZR of spherical KAKC design GP lenses (Conoptica) in keratoconus were conducted. BOZR calculated with the new algorithm, manufacturer guidelines and APEX software were compared with the BOZR that was finally prescribed. Numbers of diagnostic lenses, ordered lenses and visits to achieve optimal fitting were recorded and compared with those obtained for a control group [50 healthy eyes fitted with spherical GP (BIAS design; Conoptica)]. The new algorithm highly correlated with the final BOZR fitted (r²=0.825, p<0.001). The BOZR of the first diagnostic lens using the new algorithm demonstrated a smaller difference from the final BOZR prescribed (-0.01±0.12mm, p=0.65; 58% difference≤0.05mm) than with the manufacturer guidelines (+0.12±0.22mm, p<0.001; 26% difference≤0.05mm) and APEX software (-0.14±0.16mm, p=0.001; 34% difference≤0.05mm). Similar numbers of diagnostic lenses (1.6±0.8 vs. 1.3±0.5; p=0.02), ordered lenses (1.4±0.6 vs. 1.1±0.3; p<0.001), and visits (3.4±0.7 vs. 3.2±0.4; p=0.08) were required to fit keratoconus and healthy eyes, respectively. This new algorithm (free access at www.calculens.com) improves spherical KAKC GP fitting in keratoconus and can reduce the practitioner and patient chair time to achieve a final acceptable fit in keratoconus. This algorithm reduces the differences between GP fitting in keratoconus (KAKC design) and standard GP (BIAS design) fitting in healthy eyes.
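The published algorithm itself is not reproduced here; the snippet below is only a generic sketch of how such a BOZR-prediction rule can be derived, i.e., a multiple linear regression of the finally prescribed BOZR on corneal measurements from a fitting sample, then applied to new eyes. The predictor variables, synthetic data and fitted coefficients are entirely illustrative and do not correspond to the calculens.com algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 35                                             # retrospective sample size
flat_k = rng.normal(7.0, 0.4, n)                   # flat keratometry radius (mm), hypothetical predictor
steep_k = flat_k - rng.uniform(0.3, 1.2, n)        # steep keratometry radius (mm), hypothetical predictor
bozr = 0.55 * flat_k + 0.30 * steep_k + 1.0 + rng.normal(0, 0.05, n)   # synthetic prescribed BOZR

# Multiple linear regression: prescribed BOZR ~ flat K + steep K + intercept
X = np.column_stack([flat_k, steep_k, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, bozr, rcond=None)

def predict_bozr(flat, steep):
    """First-trial BOZR suggestion from the fitted regression (mm)."""
    return coef[0] * flat + coef[1] * steep + coef[2]

print(np.round(coef, 3))
print(round(predict_bozr(7.2, 6.4), 2))
```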
Diagnostic Accuracy Comparison of Artificial Immune Algorithms for Primary Headaches.
Çelik, Ufuk; Yurtay, Nilüfer; Koç, Emine Rabia; Tepe, Nermin; Güllüoğlu, Halil; Ertaş, Mustafa
2015-01-01
The present study evaluated the diagnostic accuracy of immune system algorithms with the aim of classifying the primary types of headache, which are not related to any organic etiology and are divided into four types: migraine, tension, cluster, and other primary headaches. With this objective, three different neurologists entered the medical records of 850 patients into our web-based expert system hosted on the project web site. In the evaluation process, Artificial Immune Systems (AIS) were used as the classification algorithms. AIS are classification algorithms inspired by the biological immune system and its significant and distinct capabilities; they simulate features of the immune system such as discrimination, learning, and memory in order to be used for classification, optimization, or pattern recognition. According to the results, the accuracy of the classifiers used in this study ranged from 95% to 99%, except for one, which yielded 71% accuracy.
A Wave Diagnostics in Geophysics: Algorithmic Extraction of Atmosphere Disturbance Modes
NASA Astrophysics Data System (ADS)
Leble, S.; Vereshchagin, S.
2018-04-01
The problem of diagnostics in geophysics is discussed and a proposal based on the dynamic projection operators technique is formulated. The general exposition is demonstrated by an example of a symbolic algorithm for the wave and entropy modes in an exponentially stratified atmosphere. The novel technique is developed as a discrete version of the evolution operator and the corresponding projectors via the discrete Fourier transform. Its explicit realization for directed modes in an exponential one-dimensional atmosphere is presented via the corresponding projection operators in discrete form, as matrices with a prescribed action on arrays formed from observation tables. A simulation based on oppositely directed (upward and downward) wave train solutions is performed, and the extraction of the modes from a mixture is illustrated.
Development of a South African integrated syndromic respiratory disease guideline for primary care.
English, René G; Bateman, Eric D; Zwarenstein, Merrick F; Fairall, Lara R; Bheekie, Angeni; Bachmann, Max O; Majara, Bosielo; Ottmani, Salah-Eddine; Scherpbier, Robert W
2008-09-01
The Practical Approach to Lung Health in South Africa (PALSA) initiative aimed to develop an integrated symptom- and sign-based (syndromic) respiratory disease guideline for nurse care practitioners working in primary care in a developing country. A multidisciplinary team developed the guideline after reviewing local barriers to respiratory health care provision, relevant health care policies, existing respiratory guidelines, and literature. Guideline drafts were evaluated by means of focus group discussions. Existing evidence-based guideline development methodologies were tailored for development of the guideline. A locally-applicable guideline based on syndromic diagnostic algorithms was developed for the management of patients 15 years and older who presented to primary care facilities with cough or difficulty breathing. PALSA has developed a guideline that integrates and presents diagnostic and management recommendations for priority respiratory diseases in adults using a symptom- and sign-based algorithmic guideline for nurses in developing countries.
Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Jensen, David; Poll, Scott
2009-01-01
Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.
NASA Astrophysics Data System (ADS)
Li, Shaoxin; Li, Linfang; Zeng, Qiuyao; Zhang, Yanjiao; Guo, Zhouyi; Liu, Zhiming; Jin, Mei; Su, Chengkang; Lin, Lin; Xu, Junfa; Liu, Songhao
2015-05-01
This study aims to characterize and classify serum surface-enhanced Raman spectroscopy (SERS) spectra between bladder cancer patients and normal volunteers by genetic algorithms (GAs) combined with linear discriminant analysis (LDA). Two groups of serum SERS spectra excited with nanoparticles were collected from healthy volunteers (n = 36) and bladder cancer patients (n = 55). Six diagnostic Raman bands in the regions of 481-486, 682-687, 1018-1034, 1313-1323, 1450-1459 and 1582-1587 cm-1, related to proteins, nucleic acids and lipids, were picked out with the GAs and LDA. With the diagnostic models built from the six identified Raman bands, an improved diagnostic sensitivity of 90.9% and specificity of 100% were achieved for distinguishing bladder cancer patients from normal volunteers based on serum SERS spectra. The results are superior to the sensitivity of 74.6% and specificity of 97.2% obtained with principal component analysis on the same serum SERS spectral dataset. Receiver operating characteristic (ROC) curves further confirmed the efficiency of the diagnostic algorithm based on the GA-LDA technique. This exploratory work demonstrates that serum SERS combined with the GA-LDA technique has enormous potential to characterize and non-invasively detect bladder cancer through peripheral blood.
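A much-simplified sketch of the GA-LDA idea follows: a small genetic algorithm searches for a subset of spectral bands whose cross-validated LDA accuracy is highest. The GA settings, the fitness function and the synthetic spectra are illustrative assumptions; the study's actual implementation and the six reported bands are not reproduced.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_bands, n_normal, n_cancer = 60, 36, 55
X = rng.normal(0, 1, (n_normal + n_cancer, n_bands))
y = np.r_[np.zeros(n_normal), np.ones(n_cancer)]
X[y == 1, 10] += 1.2                      # two synthetic "diagnostic" bands
X[y == 1, 42] -= 1.0

def fitness(mask):
    """Cross-validated LDA accuracy using only the selected bands."""
    if mask.sum() == 0:
        return 0.0
    clf = LinearDiscriminantAnalysis()
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

pop = rng.integers(0, 2, (30, n_bands))   # population of band-selection bitmasks
for generation in range(25):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]      # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, n_bands)
        child = np.r_[a[:cut], b[cut:]]               # one-point crossover
        flip = rng.random(n_bands) < 0.02             # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected bands:", np.flatnonzero(best), "accuracy:", round(fitness(best), 3))
```

In practice a wrapper search of this kind can overfit the band selection, so a nested or held-out validation of the final model is advisable.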
A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging.
Jiang, J; Hall, T J
2007-07-07
Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast that provides new diagnostically valuable information. Previous works (Hall et al 2003 Ultrasound Med. Biol. 29 427, Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is a column-based algorithm (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to a row-based algorithm (a row is defined by a line of data perpendicular to the ultrasound beam direction). Then, displacement estimates from its adjacent columns provide good guidance for motion tracking in a significantly reduced search region to reduce computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separated tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized in a Windows system as a stand-alone ANSI C++ program. Results of preliminary tests, using numerical and tissue-mimicking phantoms, and in vivo tissue data, suggest that high contrast strain images can be consistently obtained with frame rates (10 frames s⁻¹) that exceed those of our previous methods.
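A toy sketch of the column-based, guided-search idea is given below: displacements are estimated A-line by A-line with normalized cross-correlation, and each column's search window is centred on its neighbour's estimate so that the search region stays small. Window and search sizes, and the synthetic RF data, are illustrative only.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-length windows."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def track_column(pre, post, guess, win=32, search=4):
    """Axial displacement per window along one A-line, searching only
    within +/- `search` samples of the guidance value."""
    disp = []
    for start in range(0, len(pre) - win, win):
        ref = pre[start:start + win]
        best, best_d = -2.0, guess
        for d in range(guess - search, guess + search + 1):
            s = start + d
            if 0 <= s and s + win <= len(post):
                c = ncc(ref, post[s:s + win])
                if c > best:
                    best, best_d = c, d
        disp.append(best_d)
        guess = best_d                       # guide the next window on this line
    return disp

# Synthetic demo: RF columns shifted by 3 samples between pre- and post-deformation
rng = np.random.default_rng(2)
pre = rng.standard_normal((5, 400))          # 5 A-lines x 400 samples
post = np.roll(pre, 3, axis=1)
guess = 0
for col in range(pre.shape[0]):              # each column's result guides the next column
    est = track_column(pre[col], post[col], guess)
    guess = est[0]
    print(f"column {col}: median displacement = {int(np.median(est))}")
```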
Teh, Seng Khoon; Zheng, Wei; Lau, David P; Huang, Zhiwei
2009-06-01
In this work, we evaluated the diagnostic ability of near-infrared (NIR) Raman spectroscopy associated with the ensemble recursive partitioning algorithm based on random forests for identifying cancer from normal tissue in the larynx. A rapid-acquisition NIR Raman system was utilized for tissue Raman measurements at 785 nm excitation, and 50 human laryngeal tissue specimens (20 normal; 30 malignant tumors) were used for NIR Raman studies. The random forests method was introduced to develop effective diagnostic algorithms for classification of Raman spectra of different laryngeal tissues. High-quality Raman spectra in the range of 800-1800 cm⁻¹ can be acquired from laryngeal tissue within 5 seconds. Raman spectra differed significantly between normal and malignant laryngeal tissues. Classification results obtained from the random forests algorithm on tissue Raman spectra yielded a diagnostic sensitivity of 88.0% and specificity of 91.4% for laryngeal malignancy identification. The random forests technique also provided variable importance measures that facilitate correlation of significant Raman spectral features with cancer transformation. This study shows that NIR Raman spectroscopy in conjunction with the random forests algorithm has great potential for the rapid diagnosis and detection of malignant tumors in the larynx.
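A minimal sketch of this workflow is shown below: a random forest classifies (synthetic) spectra into normal versus malignant classes, and its feature importances indicate which spectral features drive the classification. The data, dimensions and hyperparameters are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_features = 100                              # e.g., binned intensities over 800-1800 cm^-1
X = rng.normal(size=(50, n_features))         # 20 "normal" + 30 "malignant" synthetic spectra
y = np.r_[np.zeros(20), np.ones(30)]
X[y == 1, 25] += 1.5                          # synthetic tumour-associated bands
X[y == 1, 70] -= 1.2

forest = RandomForestClassifier(n_estimators=500, random_state=0)
print("CV accuracy:", cross_val_score(forest, X, y, cv=5).mean().round(3))

forest.fit(X, y)                              # fit on all data to inspect importances
top = np.argsort(forest.feature_importances_)[::-1][:5]
print("most important spectral features:", top)
```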
Naidoo, Pren; van Niekerk, Margaret; du Toit, Elizabeth; Beyers, Nulda; Leon, Natalie
2015-10-28
Although new molecular diagnostic tests such as GenoType MTBDRplus and Xpert® MTB/RIF have reduced multidrug-resistant tuberculosis (MDR-TB) treatment initiation times, patients' experiences of diagnosis and treatment initiation are not known. This study aimed to explore and compare MDR-TB patients' experiences of their diagnostic and treatment initiation pathway in GenoType MTBDRplus and Xpert® MTB/RIF-based diagnostic algorithms. The study was undertaken in Cape Town, South Africa where primary health-care services provided free TB diagnosis and treatment. A smear, culture and GenoType MTBDRplus diagnostic algorithm was used in 2010, with Xpert® MTB/RIF phased in from 2011-2013. Participants diagnosed in each algorithm at four facilities were purposively sampled, stratifying by age, gender and MDR-TB risk profiles. We conducted in-depth qualitative interviews using a semi-structured interview guide. Through constant comparative analysis we induced common and divergent themes related to symptom recognition, health-care access, testing for MDR-TB and treatment initiation within and between groups. Data were triangulated with clinical information and health visit data from a structured questionnaire. We identified both enablers and barriers to early MDR-TB diagnosis and treatment. Half the patients had previously been treated for TB; most recognised recurring symptoms and reported early health-seeking. Those who attributed symptoms to other causes delayed health-seeking. Perceptions of poor public sector services were prevalent and may have contributed both to deferred health-seeking and to patient's use of the private sector, contributing to delays. However, once on treatment, most patients expressed satisfaction with public sector care. Two patients in the Xpert® MTB/RIF-based algorithm exemplified its potential to reduce delays, commencing MDR-TB treatment within a week of their first health contact. However, most patients in both algorithms experienced substantial delays. Avoidable health system delays resulted from providers not testing for TB at initial health contact, non-adherence to testing algorithms, results not being available and failure to promptly recall patients with positive results. Whilst the introduction of rapid tests such as Xpert® MTB/RIF can expedite MDR-TB diagnosis and treatment initiation, the full benefits are unlikely to be realised without reducing delays in health-seeking and addressing the structural barriers present in the health-care system.
NASA Astrophysics Data System (ADS)
Srinivasan, Yeshwanth; Hernes, Dana; Tulpule, Bhakti; Yang, Shuyu; Guo, Jiangling; Mitra, Sunanda; Yagneswaran, Sriraja; Nutter, Brian; Jeronimo, Jose; Phillips, Benny; Long, Rodney; Ferris, Daron
2005-04-01
Automated segmentation and classification of diagnostic markers in medical imagery are challenging tasks. Numerous algorithms for segmentation and classification based on statistical approaches of varying complexity are found in the literature. However, the design of an efficient and automated algorithm for precise classification of desired diagnostic markers is extremely image-specific. The National Library of Medicine (NLM), in collaboration with the National Cancer Institute (NCI), is creating an archive of 60,000 digitized color images of the uterine cervix. NLM is developing tools for the analysis and dissemination of these images over the Web for the study of visual features correlated with precancerous neoplasia and cancer. To enable indexing of images of the cervix, it is essential to develop algorithms for the segmentation of regions of interest, such as acetowhitened regions, and automatic identification and classification of regions exhibiting mosaicism and punctation. Success of such algorithms depends, primarily, on the selection of relevant features representing the region of interest. We present color and geometric features based statistical classification and segmentation algorithms yielding excellent identification of the regions of interest. The distinct classification of the mosaic regions from the non-mosaic ones has been obtained by clustering multiple geometric and color features of the segmented sections using various morphological and statistical approaches. Such automated classification methodologies will facilitate content-based image retrieval from the digital archive of uterine cervix and have the potential of developing an image based screening tool for cervical cancer.
Wormanns, D
2016-09-01
Pulmonary nodules are the most frequent pathological finding in low-dose computed tomography (CT) scanning for early detection of lung cancer. Early stages of lung cancer are often manifested as pulmonary nodules; however, the very commonly occurring small nodules are predominantly benign. These benign nodules are responsible for the high percentage of false positive test results in screening studies. Appropriate diagnostic algorithms are necessary to reduce false positive screening results and to improve the specificity of lung cancer screening. Such algorithms are based on some of the basic principles comprehensively described in this article. Firstly, the diameter of nodules allows a differentiation between large (>8 mm) probably malignant and small (<8 mm) probably benign nodules. Secondly, some morphological features of pulmonary nodules in CT can prove their benign nature. Thirdly, growth of small nodules is the best non-invasive predictor of malignancy and is utilized as a trigger for further diagnostic work-up. Non-invasive testing using positron emission tomography (PET) and contrast enhancement as well as invasive diagnostic tests (e.g. various procedures for cytological and histological diagnostics) are briefly described in this article. Different nodule morphology using CT (e.g. solid and semisolid nodules) is associated with different biological behavior and different algorithms for follow-up are required. Currently, no obligatory algorithm is available in German-speaking countries for the management of pulmonary nodules, which reflects the current state of knowledge. The main features of some international and American recommendations are briefly presented in this article from which conclusions for the daily clinical use are derived.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2007-03-01
Multislice CT scanners have remarkably increased the speed at which chest CT images can be acquired for mass screening. Mass screening based on multislice CT images requires a considerable number of images to be read, and it is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. Moreover, we have provided diagnostic assistance to medical screening specialists by using a lung cancer screening algorithm built into a mobile helical CT scanner for lung cancer mass screening in regions without a hospital. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using a Virtual Private Network router together with biometric fingerprint and face authentication systems for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system.
ERIC Educational Resources Information Center
Zander, Eric; Sturm, Harald; Bölte, Sven
2015-01-01
The diagnostic validity of the new research algorithms of the Autism Diagnostic Interview-Revised and the revised algorithms of the Autism Diagnostic Observation Schedule was examined in a clinical sample of children aged 18-47 months. Validity was determined for each instrument separately and their combination against a clinical consensus…
Statistical physics of medical diagnostics: Study of a probabilistic model.
Mashaghi, Alireza; Ramezanpour, Abolfazl
2018-03-01
We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.
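To make the setting concrete, the toy snippet below computes the quantity being approximated in such work, namely the posterior disease probabilities given a subset of observed signs, by exact enumeration in a tiny noisy-OR sign-disease model. The parameters are invented, and exact enumeration is feasible only for toy models; the study addresses large models, where mean-field inference and Monte Carlo ordering of the observations become necessary.

```python
import itertools
import numpy as np

prior = np.array([0.05, 0.10])                 # P(disease_j = 1), hypothetical
leak = 0.02                                    # background probability of each sign
w = np.array([[0.8, 0.1],                      # w[s, j]: P(sign s | only disease j present)
              [0.2, 0.7],
              [0.5, 0.5]])

def p_sign_given_d(s, d):
    """Noisy-OR likelihood of sign s being present given disease vector d."""
    return 1.0 - (1.0 - leak) * np.prod((1.0 - w[s]) ** d)

def posterior(observed):
    """observed: dict {sign_index: 0 or 1}. Returns P(disease_j = 1 | observations)."""
    num = np.zeros(2)
    z = 0.0
    for d in itertools.product([0, 1], repeat=2):
        d = np.array(d)
        p = np.prod(np.where(d == 1, prior, 1 - prior))
        for s, v in observed.items():
            ps = p_sign_given_d(s, d)
            p *= ps if v == 1 else 1.0 - ps
        num += p * d
        z += p
    return num / z

print(np.round(posterior({0: 1}), 3))          # sign 0 present -> disease 0 becomes more likely
print(np.round(posterior({0: 1, 1: 1}), 3))    # both signs present -> both diseases implicated
```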
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.
2003-01-01
A diagnostic tool for detecting damage to gears was developed. Two different measurement technologies, oil debris analysis and vibration were integrated into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared to using individual measurement technologies. This diagnostic tool was developed and evaluated experimentally by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spur Gear Fatigue Rig. An oil debris sensor and the two vibration algorithms were adapted as the diagnostic tools. An inductance type oil debris sensor was selected for the oil analysis measurement technology. Gear damage data for this type of sensor was limited to data collected in the NASA Glenn test rigs. For this reason, this analysis included development of a parameter for detecting gear pitting damage using this type of sensor. The vibration data was used to calculate two previously available gear vibration diagnostic algorithms. The two vibration algorithms were selected based on their maturity and published success in detecting damage to gears. Oil debris and vibration features were then developed using fuzzy logic analysis techniques, then input into a multi sensor data fusion process. Results show combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spur gears. As a result of this research, this new diagnostic tool has significantly improved detection of gear damage in the NASA Glenn Spur Gear Fatigue Rigs. This research also resulted in several other findings that will improve the development of future health monitoring systems. Oil debris analysis was found to be more reliable than vibration analysis for detecting pitting fatigue failure of gears and is capable of indicating damage progression. Also, some vibration algorithms are as sensitive to operational effects as they are to damage. Another finding was that clear threshold limits must be established for diagnostic tools. Based on additional experimental data obtained from the NASA Glenn Spiral Bevel Gear Fatigue Rig, the methodology developed in this study can be successfully implemented on other geared systems.
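The following is an illustrative sketch, not NASA's implementation, of the kind of fuzzy-logic fusion described: an oil-debris feature (accumulated debris mass) and a gear vibration metric are each mapped to a [0, 1] damage membership value and then combined. The membership ranges, the example FM4-style vibration metric and the fusion rule are assumptions.

```python
import numpy as np

def ramp_membership(x, low, high):
    """0 below `low`, 1 above `high`, linear in between."""
    return float(np.clip((x - low) / (high - low), 0.0, 1.0))

def fused_damage_index(debris_mg, vib_metric):
    """Combine oil-debris and vibration evidence into a single [0, 1] index."""
    m_debris = ramp_membership(debris_mg, low=20.0, high=80.0)    # mg of accumulated debris (assumed range)
    m_vib = ramp_membership(vib_metric, low=3.0, high=6.0)        # e.g., an FM4-style gear metric (assumed range)
    # weighted average, discounted when only one indicator is elevated
    return 0.5 * (m_debris + m_vib) * min(1.0, 0.5 + min(m_debris, m_vib))

for debris, vib in [(5.0, 2.5), (60.0, 3.2), (95.0, 6.5)]:
    print(debris, vib, "->", round(fused_damage_index(debris, vib), 2))
```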
Hesar, Hamed Danandeh; Mohebbi, Maryam
2017-05-01
In this paper, a model-based Bayesian filtering framework called the "marginalized particle-extended Kalman filter (MP-EKF) algorithm" is proposed for electrocardiogram (ECG) denoising. This algorithm does not have the extended Kalman filter (EKF) shortcoming in handling non-Gaussian nonstationary situations because of its nonlinear framework. In addition, it has less computational complexity compared with the particle filter. This filter improves ECG denoising performance by implementing a marginalized particle filter framework while reducing its computational complexity using the EKF framework. An automatic particle weighting strategy is also proposed here that controls the reliance of our framework on the acquired measurements. We evaluated the proposed filter on several normal ECGs selected from the MIT-BIH normal sinus rhythm database. To do so, artificial white Gaussian and colored noises as well as nonstationary real muscle artifact (MA) noise over a range of low SNRs from 10 to -5 dB were added to these normal ECG segments. The benchmark methods were the EKF and extended Kalman smoother (EKS) algorithms, which are the first model-based Bayesian algorithms introduced in the field of ECG denoising. From an SNR viewpoint, the experiments showed that in the presence of Gaussian white noise, the proposed framework outperforms the EKF and EKS algorithms at lower input SNRs where the measurements and state model are not reliable. Owing to its nonlinear framework and particle weighting strategy, the proposed algorithm attained better results at all input SNRs in non-Gaussian nonstationary situations (such as the presence of pink noise, brown noise, and real MA). In addition, the impact of the proposed filtering method on the distortion of diagnostic features of the ECG was investigated and compared with EKF/EKS methods using an ECG diagnostic distortion measure called the "Multi-Scale Entropy Based Weighted Distortion Measure" or MSEWPRD. The results revealed that our proposed algorithm had the lowest MSEWPRD for all noise types at low input SNRs. Therefore, the morphology and diagnostic information of ECG signals were much better conserved compared with EKF/EKS frameworks, especially in non-Gaussian nonstationary situations.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou
2006-03-01
Multi-helical CT scanners have remarkably increased the speed at which chest CT images can be acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read, and it is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using a Virtual Private Network router together with biometric fingerprint and face authentication systems for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed, diagnostic accuracy and the safety of medical information.
[A new information technology for system diagnosis of functional activity of human organs].
Avshalumov, A Sh; Sudakov, K V; Filaretov, G F
2006-01-01
The goal of this work was to consider a new diagnostic technology based on analysis of objective information parameters of functional activity and interaction of normal and pathologically changed human organs. The technology is based on the use of very low power millimeter (EHF) radiation emitted by human body and other biological objects in the process of vital activity. The importance of consideration of the information aspect of vital activity from the standpoint of the theory of functional systems suggested by P. K. Anokhin is emphasized. The suggested information technology is theoretically substantiated. The capabilities of the suggested technology for diagnosis, as well as the difficulties of its practical implementation caused by very low power of electromagnetic fields generated by human body, are discussed. It is noted that only use of modern radiophysical equipment together with new software based on specially developed algorithms made it possible to construct a medical EHF diagnostic system for effective implementation of the suggested technology. The system structure, functions of its components, the examination procedure, and the form of representation of diagnostic information are described together with the specific features of applied software based on the principle of maximal objectivity of analysis and interpretation of the results of diagnosis on the basis of artificial intelligence algorithms. The diagnostic capabilities of the system are illustrated by several examples.
Emergency ultrasound-based algorithms for diagnosing blunt abdominal trauma.
Stengel, Dirk; Bauwens, Kai; Rademacher, Grit; Ekkernkamp, Axel; Güthoff, Claas
2013-07-31
Ultrasonography is regarded as the tool of choice for early diagnostic investigations in patients with suspected blunt abdominal trauma. Although its sensitivity is too low for definite exclusion of abdominal organ injury, proponents of ultrasound argue that ultrasound-based clinical pathways enhance the speed of primary trauma assessment, reduce the number of computed tomography scans and cut costs. The objective was to assess the effects of trauma algorithms that include ultrasound examinations in patients with suspected blunt abdominal trauma. We searched the Cochrane Injuries Group's Specialised Register, CENTRAL (The Cochrane Library), MEDLINE (OvidSP), EMBASE (OvidSP), CINAHL (EBSCO), publishers' databases, controlled trials registers and the Internet. Bibliographies of identified articles and conference abstracts were searched for further eligible studies. Trial authors were contacted for further information and individual patient data. The searches were updated in February 2013. We included randomised controlled trials (RCTs) and quasi-randomised trials (qRCTs) of patients with blunt torso, abdominal or multiple trauma undergoing diagnostic investigations for abdominal organ injury, comparing diagnostic algorithms comprising emergency ultrasonography (US) with diagnostic algorithms without ultrasound examinations (for example, primary computed tomography [CT] or diagnostic peritoneal lavage [DPL]). Outcomes were mortality, use of CT and DPL, cost-effectiveness, laparotomy and negative laparotomy rates, delayed diagnoses, and quality of life. Two authors independently selected trials for inclusion, assessed methodological quality and extracted data. Where possible, data were pooled and relative risks (RRs), risk differences (RDs) and weighted mean differences, each with 95% confidence intervals (CIs), were calculated by fixed- or random-effects modelling, as appropriate. We identified four studies meeting our inclusion criteria. Overall, trials were of moderate methodological quality. Few trial authors responded to our written inquiries seeking to resolve controversial issues and to obtain individual patient data. We pooled mortality data from three trials involving 1254 patients; the relative risk in favour of the US arm was 1.00 (95% CI 0.50 to 2.00). US-based pathways significantly reduced the number of CT scans (random-effects RD -0.52, 95% CI -0.83 to -0.21), but the meaning of this result is unclear. Given the low sensitivity of ultrasound, the reduction in CT scans may either translate to a number needed to treat or number needed to harm of two. There is currently insufficient evidence from RCTs to justify promotion of ultrasound-based clinical pathways in diagnosing patients with suspected blunt abdominal trauma.
ERIC Educational Resources Information Center
Kim, So Hyun; Lord, Catherine
2012-01-01
Autism Diagnostic Interview-Revised (Rutter et al. in "Autism diagnostic interview-revised." Western Psychological Services, Los Angeles, 2003) diagnostic algorithms specific to toddlers and young preschoolers were created using 829 assessments of children aged from 12 to 47 months with ASD, nonspectrum disorders, and typical development. The…
Rizzo, G; Capponi, A; Pietrolucci, M E; Capece, A; Aiello, E; Mammarella, S; Arduini, D
2011-08-01
The aim was to describe a novel algorithm, based on the new display technology 'OmniView', developed to visualize diagnostic sagittal and coronal planes of the fetal brain from volumes obtained by three-dimensional (3D) ultrasonography. We developed an algorithm to image standard neurosonographic planes by drawing dissecting lines through the axial transventricular view of 3D volume datasets acquired transabdominally. The algorithm was tested on 106 normal fetuses at 18-24 weeks of gestation and the visualization rates of brain diagnostic planes were evaluated by two independent reviewers. The algorithm was also applied to nine cases with proven brain defects. The two reviewers, using the algorithm on normal fetuses, found satisfactory images with visualization rates ranging between 71.7% and 96.2% for sagittal planes and between 76.4% and 90.6% for coronal planes. The agreement rate between the two reviewers, as expressed by Cohen's kappa coefficient, was > 0.93 for sagittal planes and > 0.89 for coronal planes. All nine abnormal volumes were identified by a single observer from among a series including normal brains, and eight of these nine cases were diagnosed correctly. This novel algorithm can be used to visualize standard sagittal and coronal planes in the fetal brain. This approach may simplify the examination of the fetal brain and reduce dependency of success on operator skill.
Electric machine differential for vehicle traction control and stability control
NASA Astrophysics Data System (ADS)
Kuruppu, Sandun Shivantha
Evolving requirements in energy efficiency and tightening regulations for reliable electric drivetrains drive the advancement of hybrid electric vehicle (HEV) and full electric vehicle (EV) technology. Different configurations of EV and HEV architectures are evaluated for their performance. The future technology is trending towards utilizing distinctive properties of electric machines not only to improve efficiency but also to realize advanced road adhesion controls and vehicle stability controls. The electric machine differential (EMD) is such a concept under current investigation for applications in the near future. Reliability of a power train is critical. Therefore, sophisticated fault detection schemes are essential in guaranteeing reliable operation of a complex system such as an EMD. The research presented here emphasizes the implementation of a 4 kW electric machine differential, a novel single open-phase (SPO) fault diagnostic scheme, a real-time slip optimization algorithm, and an electric machine differential based yaw stability improvement study. The proposed d-q current signature based SPO fault diagnostic algorithm detects the fault within one electrical cycle. The EMD based extremum seeking slip optimization algorithm reduces stopping distance by 30% compared to hydraulic braking based ABS.
Hultenmo, Maria; Caisander, Håkan; Mack, Karsten; Thilander-Klang, Anne
2016-06-01
The diagnostic image quality of 75 paediatric abdominal computed tomography (CT) examinations reconstructed with two different iterative reconstruction (IR) algorithms, adaptive statistical IR (ASiR™) and model-based IR (Veo™), was compared. Axial and coronal images were reconstructed with 70% ASiR with the Soft™ convolution kernel and with the Veo algorithm. The thickness of the reconstructed images was 2.5 or 5 mm depending on the scanning protocol used. Four radiologists graded the delineation of six abdominal structures and the diagnostic usefulness of the image quality. The Veo reconstruction significantly improved the visibility of most of the structures compared with ASiR in all subgroups of images. For coronal images, the Veo reconstruction resulted in significantly improved ratings of the diagnostic use of the image quality compared with the ASiR reconstruction. This was not seen for the axial images. The greatest improvement using Veo reconstruction was observed for the 2.5 mm coronal slices.
Arpaia, P; Cimmino, P; Girone, M; La Commara, G; Maisto, D; Manna, C; Pezzetti, M
2014-09-01
An evolutionary approach to centralized multiple-fault diagnostics is extended to distributed transducer networks monitoring large experimental systems. Given a set of anomalies detected by the transducers, each instance of the multiple-fault problem is formulated as several parallel communicating sub-tasks running on different transducers, and thus solved one-by-one on spatially separated parallel processes. A micro-genetic algorithm merges evaluation time efficiency, arising from a small-size population distributed on parallel-synchronized processors, with the effectiveness of centralized evolutionary techniques due to an optimal mix of exploitation and exploration. In this way, the holistic view and effectiveness advantages of evolutionary global diagnostics are combined with the reliability and efficiency benefits of distributed parallel architectures. The proposed approach was validated both (i) by simulation at CERN, on a case study of a cold box for enhancing the cryogenic diagnostics of the Large Hadron Collider, and (ii) by experiments, within the framework of the industrial research project MONDIEVOB (Building Remote Monitoring and Evolutionary Diagnostics), co-funded by the EU and the company Del Bo srl, Napoli, Italy.
OSA severity assessment based on sleep breathing analysis using ambient microphone.
Dafna, E; Tarasiuk, A; Zigel, Y
2013-01-01
In this paper, an audio-based system for severity estimation of obstructive sleep apnea (OSA) is proposed. The system estimates the apnea-hypopnea index (AHI), which is the average number of apneic events per hour of sleep. This system is based on a Gaussian mixture regression algorithm that was trained and validated on full-night audio recordings. A feature selection process using a genetic algorithm was applied to select the best features extracted from the time and spectral domains. A total of 155 subjects, referred for an in-laboratory polysomnography (PSG) study, were recruited. Using the PSG's AHI score as a gold standard, the performance of the proposed system was evaluated using Pearson correlation, AHI error, and diagnostic agreement. A correlation of R=0.89, an AHI error of 7.35 events/hr, and a diagnostic agreement of 77.3% were achieved, showing encouraging performance and suggesting a reliable non-contact alternative method for OSA severity estimation.
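A rough sketch of Gaussian mixture regression for this task is given below: a joint Gaussian mixture model is fitted over [audio features, AHI], and the AHI estimate for a new recording is the conditional expectation of AHI given its features. The synthetic features, the number of mixture components and the in-sample evaluation are illustrative only and do not reflect the study's feature set or validation protocol.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
n, d = 155, 4                                   # recordings x audio features (synthetic)
X = rng.normal(size=(n, d))
ahi = np.clip(15 + 10 * X[:, 0] - 6 * X[:, 2] + 3 * rng.normal(size=n), 0, None)

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(np.column_stack([X, ahi]))              # joint model over [features, AHI]

def gmr_predict(x):
    """E[AHI | features x] under the fitted joint GMM."""
    weights, means = [], []
    for k in range(gmm.n_components):
        mu, cov = gmm.means_[k], gmm.covariances_[k]
        mu_x, mu_y = mu[:d], mu[d]
        sxx, sxy = cov[:d, :d], cov[:d, d]
        weights.append(gmm.weights_[k] * multivariate_normal.pdf(x, mu_x, sxx))
        means.append(mu_y + sxy @ np.linalg.solve(sxx, x - mu_x))
    weights = np.array(weights) / (np.sum(weights) + 1e-300)
    return float(weights @ np.array(means))

preds = np.array([gmr_predict(x) for x in X])
print("in-sample Pearson r:", np.round(np.corrcoef(ahi, preds)[0, 1], 3))
```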
Andreini, Daniele; Lin, Fay Y; Rizvi, Asim; Cho, Iksung; Heo, Ran; Pontone, Gianluca; Bartorelli, Antonio L; Mushtaq, Saima; Villines, Todd C; Carrascosa, Patricia; Choi, Byoung Wook; Bloom, Stephen; Wei, Han; Xing, Yan; Gebow, Dan; Gransar, Heidi; Chang, Hyuk-Jae; Leipsic, Jonathon; Min, James K
2018-06-01
Motion artifact can reduce the diagnostic accuracy of coronary CT angiography (CCTA) for coronary artery disease (CAD). The purpose of this study was to compare the diagnostic performance of an algorithm dedicated to correcting coronary motion artifact with the performance of standard reconstruction methods in a prospective international multicenter study. Patients referred for clinically indicated invasive coronary angiography (ICA) for suspected CAD prospectively underwent an investigational CCTA examination free from heart rate-lowering medications before they underwent ICA. Blinded core laboratory interpretations of motion-corrected and standard reconstructions for obstructive CAD (≥ 50% stenosis) were compared with ICA findings. Segments unevaluable owing to artifact were considered obstructive. The primary endpoint was per-subject diagnostic accuracy of the intracycle motion correction algorithm for obstructive CAD found at ICA. Among 230 patients who underwent CCTA with the motion correction algorithm and standard reconstruction, 92 (40.0%) had obstructive CAD on the basis of ICA findings. At a mean heart rate of 68.0 ± 11.7 beats/min, the motion correction algorithm reduced the number of nondiagnostic scans compared with standard reconstruction (20.4% vs 34.8%; p < 0.001). Diagnostic accuracy for obstructive CAD with the motion correction algorithm (62%; 95% CI, 56-68%) was not significantly different from that of standard reconstruction on a per-subject basis (59%; 95% CI, 53-66%; p = 0.28) but was superior on a per-vessel basis: 77% (95% CI, 74-80%) versus 72% (95% CI, 69-75%) (p = 0.02). The motion correction algorithm was superior in subgroups of patients with severely obstructive (≥ 70%) stenosis, heart rate ≥ 70 beats/min, and vessels in the atrioventricular groove. The motion correction algorithm studied reduces artifacts and improves diagnostic performance for obstructive CAD on a per-vessel basis and in selected subgroups on a per-subject basis.
Vitte, Joana; Ranque, Stéphane; Carsin, Ania; Gomez, Carine; Romain, Thomas; Cassagne, Carole; Gouitaa, Marion; Baravalle-Einaudi, Mélisande; Bel, Nathalie Stremler-Le; Reynaud-Gaubert, Martine; Dubus, Jean-Christophe; Mège, Jean-Louis; Gaudart, Jean
2017-01-01
Molecular-based allergy diagnosis yields multiple biomarker datasets. The classical diagnostic score for allergic bronchopulmonary aspergillosis (ABPA), a severe disease usually occurring in asthmatic patients and people with cystic fibrosis, comprises succinct immunological criteria formulated in 1977: total IgE, anti-Aspergillus fumigatus (Af) IgE, anti-Af "precipitins," and anti-Af IgG. Progress achieved over the last four decades has made multiple IgE and IgG(4) Af biomarkers available with quantitative, standardized, molecular-level reports. These newly available biomarkers have not been included in the current diagnostic criteria, either individually or in algorithms, despite persistent underdiagnosis of ABPA. Large numbers of individual biomarkers may hinder their use in clinical practice. Conversely, multivariate analysis using new tools may offer a better chance of avoiding diagnostic errors. We report here a proof-of-concept work consisting of a three-step multivariate analysis of Af IgE, IgG, and IgG4 biomarkers through a combination of principal component analysis, hierarchical ascendant classification, and classification and regression tree (CART) analysis. The resulting diagnostic algorithms might show the way for novel criteria and improved diagnostic efficiency in Af-sensitized patients at risk for ABPA.
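The three-step pipeline (principal component analysis, then hierarchical ascendant classification, then a CART tree re-expressing the clusters as biomarker cut-offs) can be sketched with standard tools; the biomarker values below are simulated stand-ins, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical Af IgE/IgG/IgG4 panels: 40 "sensitized-only" and 20 "ABPA-like" patients.
X = np.vstack([rng.lognormal(0.0, 0.5, (40, 6)), rng.lognormal(1.2, 0.5, (20, 6))])

scores = PCA(n_components=2).fit_transform(np.log(X))                                 # step 1: PCA
clusters = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(scores)  # step 2: HAC
tree = DecisionTreeClassifier(max_depth=2).fit(X, clusters)                           # step 3: CART cut-offs
print(export_text(tree, feature_names=[f"marker_{i}" for i in range(6)]))
```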
Recurrent Pneumonia in Children: A Reasoned Diagnostic Approach and a Single Centre Experience.
Montella, Silvia; Corcione, Adele; Santamaria, Francesca
2017-01-29
Recurrent pneumonia (RP), i.e., at least two episodes of pneumonia in one year or three episodes ever with intercritical radiographic clearing of densities, occurs in 7.7%-9% of children with community-acquired pneumonia. In RP, the challenge is to discriminate between children with self-limiting or minor problems that do not require a diagnostic work-up and those with an underlying disease. The aim of the current review is to discuss a reasoned diagnostic approach to RP in childhood. Particular emphasis has been placed on which children should undergo a diagnostic work-up and which tests should be performed. A pediatric case series is also presented, in order to document a single centre experience of RP. A management algorithm for the approach to children with RP, based on the evidence from a literature review, is proposed. Like all algorithms, it is not meant to replace clinical judgment, but it should drive physicians to adopt a systematic approach to pediatric RP and provide a useful guide to the clinician.
Diagnosing breast cancer using Raman spectroscopy: prospective analysis
NASA Astrophysics Data System (ADS)
Haka, Abigail S.; Volynskaya, Zoya; Gardecki, Joseph A.; Nazemi, Jon; Shenk, Robert; Wang, Nancy; Dasari, Ramachandra R.; Fitzmaurice, Maryann; Feld, Michael S.
2009-09-01
We present the first prospective test of Raman spectroscopy in diagnosing normal, benign, and malignant human breast tissues. Prospective testing of spectral diagnostic algorithms allows clinicians to accurately assess the diagnostic information contained in, and any bias of, the spectroscopic measurement. In previous work, we developed an accurate, internally validated algorithm for breast cancer diagnosis based on analysis of Raman spectra acquired from fresh-frozen in vitro tissue samples. Here we evaluate the performance of this algorithm prospectively on a large ex vivo clinical data set that closely mimics the in vivo environment. Spectroscopic data were collected from freshly excised surgical specimens, and 129 tissue sites from 21 patients were examined. Prospective application of the algorithm to the clinical data set resulted in a sensitivity of 83%, a specificity of 93%, a positive predictive value of 36%, and a negative predictive value of 99% for distinguishing cancerous from normal and benign tissues. The performance of the algorithm in different patient populations is discussed. Sources of bias in the in vitro calibration and ex vivo prospective data sets, including disease prevalence and disease spectrum, are examined, and analytical methods for comparison are provided.
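For orientation, the snippet below computes the four reported metrics from one illustrative set of counts that is consistent with the quoted percentages (the abstract does not give the actual tallies); it also shows why the low ex vivo prevalence of malignant sites depresses the positive predictive value despite good sensitivity and specificity.

```python
# Illustrative counts only, chosen to be consistent with the reported 129-site results:
# 6 malignant sites (5 detected, 1 missed) and 123 non-malignant sites (9 false positives).
TP, FN, FP, TN = 5, 1, 9, 114

sens = TP / (TP + FN)   # ~0.83
spec = TN / (TN + FP)   # ~0.93
ppv  = TP / (TP + FP)   # ~0.36 -- low because malignant sites are rare in this set
npv  = TN / (TN + FN)   # ~0.99
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, PPV {ppv:.2f}, NPV {npv:.2f}")
```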
Alexander, C L; Currie, S; Pollock, K; Smith-Palmer, A; Jones, B L
2017-06-01
Giardia duodenalis and Cryptosporidium species are protozoan parasites capable of causing gastrointestinal disease in humans and animals through the ingestion of infective faeces. Whereas Cryptosporidium species can be acquired locally or through foreign travel, there is a misconception that giardiasis is largely travel-associated, which results in differences in laboratory testing algorithms. An audit was performed to determine the level of variation in testing criteria and detection methods for both pathogens between diagnostic laboratories across Scotland. Twenty Scottish diagnostic microbiology laboratories were invited to participate, with questions on sample acceptance criteria, testing methods, testing rates and future plans for pathogen detection. Responses were received from 19 of the 20 laboratories, representing each of the 14 territorial Health Boards. Detection methods varied between laboratories, with the majority performing microscopy, one using a lateral flow immunochromatographic antigen assay, another using a manually washed plate-based enzyme immunoassay (EIA) and one laboratory trialling a plate-based EIA automated with an EIA plate washer. Whereas all laboratories except one screened every stool for Cryptosporidium species, an important finding was the significant variation in the testing algorithm for detecting Giardia, with only four laboratories testing all diagnostic stools. The most common criteria were 'travel history' (11 laboratories) and/or 'when requested' (14 laboratories). Although only a small proportion of stools (2%-18% of those submitted) was examined for Giardia in 15 laboratories, a higher positivity rate was observed for Giardia than for Cryptosporidium in 10 of these 15 laboratories. These findings indicate that Giardia is likely underreported in Scotland under current selection and testing algorithms.
A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging
NASA Astrophysics Data System (ADS)
Jiang, J.; Hall, T. J.
2007-07-01
Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast that provides new diagnostically valuable information. Previous works (Hall et al 2003 Ultrasound Med. Biol. 29 427, Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and that real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is a column-based algorithm (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to a row-based algorithm (a row is defined by a line of data perpendicular to the ultrasound beam direction). Displacement estimates from adjacent columns then provide good guidance for motion tracking in a significantly reduced search region, lowering the computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separate tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized in a Windows® system as a stand-alone ANSI C++ program. Results of preliminary tests, using numerical and tissue-mimicking phantoms and in vivo tissue data, suggest that high contrast strain images can be consistently obtained with frame rates (10 frames s⁻¹) that exceed those of our previous methods.
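A toy sketch of the column-based, neighbour-guided search idea (not the authors' optimized ANSI C++ implementation): displacements estimated along one A-line seed a much smaller search window in the adjacent A-line, which is what lets the columns be processed as parallel tasks.

```python
import numpy as np

def track_column(pre, post, col, guess, block=16, search=3):
    """Estimate the axial (sample) shift of each block in one column (A-line),
    searching only a small window around a neighbouring column's estimates."""
    n_blocks = pre.shape[0] // block
    est = np.zeros(n_blocks, dtype=int)
    for b in range(n_blocks):
        ref = pre[b * block:(b + 1) * block, col]
        best_score, best_d = -np.inf, 0
        for d in range(guess[b] - search, guess[b] + search + 1):
            lo = b * block + d
            if lo < 0 or lo + block > post.shape[0]:
                continue                                          # crude boundary handling
            cand = post[lo:lo + block, col]
            score = np.dot(ref - ref.mean(), cand - cand.mean())  # correlation-style similarity
            if score > best_score:
                best_score, best_d = score, d
        est[b] = best_d
    return est

pre = np.random.rand(256, 8)                # toy RF frame (axial samples x A-lines)
post = np.roll(pre, 2, axis=0)              # "deformed" frame: 2-sample axial shift
seed = track_column(pre, post, col=0, guess=np.zeros(16, dtype=int), search=4)
print(track_column(pre, post, col=1, guess=seed, search=2))   # guided, smaller window
```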
Fiuzy, Mohammad; Haddadnia, Javad; Mollania, Nasrin; Hashemian, Maryam; Hassanpour, Kazem
2012-01-01
Accurate diagnosis of breast cancer is of prime importance. Fine needle aspiration (FNA), which has been used for several years in Europe, is a simple, inexpensive, noninvasive and accurate technique for detecting breast cancer. Selecting suitable features from FNA results is the most important diagnostic problem in the early stages of breast cancer. In this study, we introduce a new algorithm that can detect breast cancer by combining an artificial intelligence system with FNA. We studied the features of the Wisconsin Diagnostic Breast Cancer (WDBC) database, which contains 569 FNA samples (212 malignant and 357 benign). We combined artificial intelligence approaches, such as an evolutionary algorithm (EA) with a genetic algorithm (GA), and used an exact classifier system (fuzzy c-means, FCM) to separate malignant from benign samples; artificial neural networks (NN) were also examined to identify the model and structure. In the WDBC database, 62.75% of samples were benign and 37.25% malignant. After applying the proposed algorithm, we achieved a detection accuracy of 96.579% on 205 patients diagnosed as having breast cancer, with 93% sensitivity, 73% specificity, 65% positive predictive value, and 95% negative predictive value. If done by experts, FNA can be a reliable replacement for open biopsy in palpable breast masses, and evaluation of FNA samples during aspiration can decrease the number of insufficient samples. FNA can be the first line of diagnosis in women with breast masses, at least in deprived regions, and may raise health standards and improve the clinical supervision of patients. Such a smart, economical, non-invasive, rapid and accurate system can be a useful diagnostic aid in the comprehensive treatment of breast cancer. Another advantage of this method is the possibility of diagnosing breast abnormalities.
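As a hedged illustration of the fuzzy clustering component only, the sketch below runs a hand-rolled fuzzy c-means on the same 569-sample WDBC data (bundled with scikit-learn); it deliberately omits the evolutionary/genetic feature selection and neural-network stages of the proposed algorithm.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: alternate membership and centroid updates."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))               # memberships, rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)) * (1.0 / d ** (2 / (m - 1))).sum(axis=1, keepdims=True))
    return U

data = load_breast_cancer()                                  # the WDBC set: 357 benign, 212 malignant
X = StandardScaler().fit_transform(data.data)
labels = fuzzy_cmeans(X).argmax(axis=1)
# Cluster ids are arbitrary, so score agreement under both possible alignments.
agreement = max((labels == data.target).mean(), ((1 - labels) == data.target).mean())
print(f"unsupervised FCM agreement with the pathology labels: {agreement:.3f}")
```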
Using qualitative research to inform development of a diagnostic algorithm for UTI in children.
de Salis, Isabel; Whiting, Penny; Sterne, Jonathan A C; Hay, Alastair D
2013-06-01
Diagnostic and prognostic algorithms can help reduce clinical uncertainty. The selection of candidate symptoms and signs to be measured in case report forms (CRFs) for potential inclusion in diagnostic algorithms needs to be comprehensive, clearly formulated and relevant for end users. To investigate whether qualitative methods could assist in designing CRFs in research developing diagnostic algorithms. Specifically, the study sought to establish whether qualitative methods could have assisted in designing the CRF for the Health Technology Assessment-funded Diagnosis of Urinary Tract infection in Young children (DUTY) study, which will develop a diagnostic algorithm to improve recognition of urinary tract infection (UTI) in children aged <5 years presenting acutely unwell to primary care. Qualitative methods were applied using semi-structured interviews of 30 UK doctors and nurses working with young children in primary care and a Children's Emergency Department. We elicited features that clinicians believed to be useful in diagnosing UTI and compared these, for presence or absence and terminology, with the DUTY CRF. Despite much agreement between clinicians' accounts and the DUTY CRFs, we identified a small number of potentially important symptoms and signs not included in the CRF, and some included items that could have been reworded to improve understanding and final data analysis. This study uniquely demonstrates the role of qualitative methods in the design and content of CRFs used for developing diagnostic (and prognostic) algorithms. Research groups developing such algorithms should consider using qualitative methods to inform the selection and wording of candidate symptoms and signs.
Planetary Transmission Diagnostics
NASA Technical Reports Server (NTRS)
Lewicki, David G. (Technical Monitor); Samuel, Paul D.; Conroy, Joseph K.; Pines, Darryll J.
2004-01-01
This report presents a methodology for detecting and diagnosing gear faults in the planetary stage of a helicopter transmission. This diagnostic technique is based on the constrained adaptive lifting algorithm. The lifting scheme, developed by Wim Sweldens of Bell Labs, is a time domain, prediction-error realization of the wavelet transform that allows for greater flexibility in the construction of wavelet bases. Classic lifting analyzes a given signal using wavelets derived from a single fundamental basis function. A number of researchers have proposed techniques for adding adaptivity to the lifting scheme, allowing the transform to choose from a set of fundamental bases the basis that best fits the signal. This characteristic is desirable for gear diagnostics as it allows the technique to tailor itself to a specific transmission by selecting a set of wavelets that best represent vibration signals obtained while the gearbox is operating under healthy-state conditions. However, constraints on certain basis characteristics are necessary to enhance the detection of local wave-form changes caused by certain types of gear damage. The proposed methodology analyzes individual tooth-mesh waveforms from a healthy-state gearbox vibration signal that was generated using the vibration separation (synchronous signal-averaging) algorithm. Each waveform is separated into analysis domains using zeros of its slope and curvature. The bases selected in each analysis domain are chosen to minimize the prediction error, and constrained to have the same-sign local slope and curvature as the original signal. The resulting set of bases is used to analyze future-state vibration signals and the lifting prediction error is inspected. The constraints allow the transform to effectively adapt to global amplitude changes, yielding small prediction errors. However, local wave-form changes associated with certain types of gear damage are poorly adapted, causing a significant change in the prediction error. The constrained adaptive lifting diagnostic algorithm is validated using data collected from the University of Maryland Transmission Test Rig and the results are discussed.
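A minimal sketch of the lifting idea behind the method, not the constrained adaptive scheme itself: odd samples are predicted from their even neighbours and the prediction error (detail) is inspected. A local waveform change of the kind caused by certain gear damage is poorly predicted and inflates the detail energy.

```python
import numpy as np

def lifting_detail(x):
    """One lifting step with a linear predictor (circular boundary for simplicity):
    split into even/odd samples and return the odd-sample prediction error."""
    even, odd = x[0::2], x[1::2]
    pred = 0.5 * (even + np.roll(even, -1))
    return odd - pred

t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
healthy = np.sin(8 * t)                  # smooth tooth-mesh-like waveform
damaged = healthy.copy()
damaged[100:104] += 0.8                  # local waveform change, e.g. a damaged tooth

print("detail energy, healthy:", round(float(np.sum(lifting_detail(healthy) ** 2)), 4))
print("detail energy, damaged:", round(float(np.sum(lifting_detail(damaged) ** 2)), 4))
```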
Kolusheva, S; Yossef, R; Kugel, A; Katz, M; Volinsky, R; Welt, M; Hadad, U; Drory, V; Kliger, M; Rubin, E; Porgador, A; Jelinek, R
2012-07-17
We demonstrate a novel array-based diagnostic platform comprising lipid/polydiacetylene (PDA) vesicles embedded within a transparent silica-gel matrix. The diagnostic scheme is based upon the unique chromatic properties of PDA, which undergoes blue-red transformations induced by interactions with amphiphilic or membrane-active analytes. We show that constructing a gel-matrix array hosting PDA vesicles with different lipid compositions and applying it to blood plasma obtained from healthy individuals and from patients suffering from disease allows the disease conditions to be distinguished through a simple machine-learning algorithm, using the colorimetric response of the lipid/PDA/gel matrix as the input. Importantly, the new colorimetric diagnostic approach does not require a priori knowledge of the exact metabolite composition of the blood plasma, since the concept relies only on identifying statistically significant changes in the overall disease-induced chromatic response. The chromatic lipid/PDA/gel array-based "fingerprinting" concept is generic, easy to apply, and could be implemented for varied diagnostic and screening applications.
ERIC Educational Resources Information Center
Pugliese, Cara E.; Kenworthy, Lauren; Bal, Vanessa Hus; Wallace, Gregory L.; Yerys, Benjamin E.; Maddox, Brenna B.; White, Susan W.; Popal, Haroon; Armour, Anna Chelsea; Miller, Judith; Herrington, John D.; Schultz, Robert T.; Martin, Alex; Anthony, Laura Gutermuth
2015-01-01
Recent updates have been proposed to the Autism Diagnostic Observation Schedule-2 Module 4 diagnostic algorithm. This new algorithm, however, has not yet been validated in an independent sample without intellectual disability (ID). This multi-site study compared the original and revised algorithms in individuals with ASD without ID. The revised…
Investigating the Link Between Radiologists' Gaze, Diagnostic Decision, and Image Content
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tourassi, Georgia; Voisin, Sophie; Paquit, Vincent C
2013-01-01
Objective: To investigate machine learning for linking image content, human perception, cognition, and error in the diagnostic interpretation of mammograms. Methods: Gaze data and diagnostic decisions were collected from six radiologists who reviewed 20 screening mammograms while wearing a head-mounted eye-tracker. Texture analysis was performed in mammographic regions that attracted radiologists' attention and in all abnormal regions. Machine learning algorithms were investigated to develop predictive models that link: (i) image content with gaze, (ii) image content and gaze with cognition, and (iii) image content, gaze, and cognition with diagnostic error. Both group-based and individualized models were explored. Results: By pooling the data from all radiologists, machine learning produced highly accurate predictive models linking image content, gaze, cognition, and error. Merging radiologists' gaze metrics and cognitive opinions with computer-extracted image features identified 59% of the radiologists' diagnostic errors while confirming 96.2% of their correct diagnoses. The radiologists' individual errors could be adequately predicted by modeling the behavior of their peers. However, personalized tuning appears to be beneficial in many cases to capture individual behavior more accurately. Conclusions: Machine learning algorithms combining image features with radiologists' gaze data and diagnostic decisions can be effectively developed to recognize cognitive and perceptual errors associated with the diagnostic interpretation of mammograms.
NASA Astrophysics Data System (ADS)
Kovalev, I. A.; Rakovskii, V. G.; Isakov, N. Yu.; Sandovskii, A. V.
2016-03-01
Results are presented of work on the development and improvement of techniques, algorithms, and hardware-software for continuously operating systems that diagnose the state of rotating units and parts of turbine equipment. In particular, to provide fully remote servicing of the monitored turbine equipment using web technologies, a web version of the software of the automated vibration-based diagnostics system (ASVD VIDAS) was developed. Experience with the automated analysis of data obtained by ASVD VIDAS formed the basis of a new algorithm for the early detection of such dangerous defects as rotor deflection, cracks in the rotor, and strong misalignment of supports. A program-technical complex (PTC) for monitoring and measuring the deflection of the medium-pressure rotor implements this algorithm; it will alert power plant staff to a deflection and indicate its value, giving the opportunity to take timely measures to prevent further growth of the defect. Repeatedly recorded cases of full or partial destruction of the shrouded shelves of rotor blades in the last stages of low-pressure cylinders of steam turbines motivated the development of a version of the automated blade diagnostics system (ASBD SKALA) for shrouded stages. The processing, analysis, presentation, and backup of data characterizing the mechanical state of the blading are carried out with a newly developed diagnostics-system controller. As a result of this work, the set of diagnosed parameters determining the operational safety of rotating equipment elements was expanded and new tasks of monitoring the state of turbine units and parts were solved. All the algorithmic solutions and hardware-software implementations mentioned in the article were tested on test benches and applied at several power plants.
An architecture for the development of real-time fault diagnosis systems using model-based reasoning
NASA Technical Reports Server (NTRS)
Hall, Gardiner A.; Schuetzle, James; Lavallee, David; Gupta, Uday
1992-01-01
Presented here is an architecture for implementing real-time telemetry based diagnostic systems using model-based reasoning. First, we describe Paragon, a knowledge acquisition tool for offline entry and validation of physical system models. Paragon provides domain experts with a structured editing capability to capture the physical component's structure, behavior, and causal relationships. We next describe the architecture of the run time diagnostic system. The diagnostic system, written entirely in Ada, uses the behavioral model developed offline by Paragon to simulate expected component states as reflected in the telemetry stream. The diagnostic algorithm traces causal relationships contained within the model to isolate system faults. Since the diagnostic process relies exclusively on the behavioral model and is implemented without the use of heuristic rules, it can be used to isolate unpredicted faults in a wide variety of systems. Finally, we discuss the implementation of a prototype system constructed using this technique for diagnosing faults in a science instrument. The prototype demonstrates the use of model-based reasoning to develop maintainable systems with greater diagnostic capabilities at a lower cost.
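The run-time reasoning can be caricatured in a few lines of Python (the system described is written in Ada and driven by Paragon models; the component names and values below are hypothetical): simulate expected telemetry from a behavioural model, flag discrepancies, and trace causal inputs upstream to collect fault candidates.

```python
# Hypothetical behavioural model: each component computes its output from its inputs.
model = {
    "power_supply": {"inputs": [], "behaviour": lambda env: 28.0},
    "heater":       {"inputs": ["power_supply"], "behaviour": lambda env: 0.9 * env["power_supply"]},
    "temp_sensor":  {"inputs": ["heater"], "behaviour": lambda env: 15.0 + env["heater"]},
}

def simulate(model):
    env = {}
    for name, comp in model.items():        # insertion order doubles as topological order here
        env[name] = comp["behaviour"](env)
    return env

def isolate(model, telemetry, tol=1.0):
    expected = simulate(model)
    discrepant = [k for k in telemetry if abs(telemetry[k] - expected[k]) > tol]
    suspects, frontier = set(discrepant), list(discrepant)
    while frontier:                         # walk causal links upstream of each discrepancy
        for parent in model[frontier.pop()]["inputs"]:
            if parent not in suspects:
                suspects.add(parent)
                frontier.append(parent)
    return expected, suspects

expected, suspects = isolate(model, {"temp_sensor": 27.0})   # expected reading is ~40.2
print("expected:", expected, "fault candidates:", suspects)
```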
ERIC Educational Resources Information Center
Maljaars, Jarymke; Noens, Ilse; Scholte, Evert; van Berckelaer-Onnes, Ina
2012-01-01
The Diagnostic Interview for Social and Communication Disorders (DISCO; Wing, 2006) is a standardized, semi-structured and interviewer-based schedule for diagnosis of autism spectrum disorder (ASD). The objective of this study was to evaluate the criterion and convergent validity of the DISCO-11 ICD-10 algorithm in young and low-functioning…
Gelhorn, Heather; Hartman, Christie; Sakai, Joseph; Stallings, Michael; Young, Susan; Rhee, Soo Hyun; Corley, Robin; Hewitt, John; Hopfer, Christian; Crowley, Thomas
2008-11-01
Item response theory analyses were used to examine alcohol abuse and dependence symptoms and diagnoses in adolescents. Previous research suggests that the DSM-IV alcohol use disorder (AUD) symptoms in adolescents may be characterized by a single dimension. The present study extends prior research with a larger and more comprehensive sample and an examination of an alternative diagnostic algorithm for AUDs. A total of 5,587 adolescents between the ages of 12 and 18 years from adjudicated, clinical, and community samples were administered structured clinical interviews. Analyses were conducted to examine the severity of alcohol abuse and dependence symptoms and the severity of AUDs within the diagnostic categories created by the DSM-IV. Although the DSM-IV diagnostic categories differ in AUD severity, there is substantial overlap and inconsistency in the AUD severity of persons across these categories. Item response theory-based AUD severity estimates suggest that many persons diagnosed with abuse have AUD severity greater than persons with dependence. Similarly, many persons who endorse some symptoms but do not qualify for a diagnosis (i.e., diagnostic orphans) have more severe AUDs than persons with an abuse diagnosis. Additionally, two dependence items, "tolerance" and "larger/longer," show differences in severity between samples. The distinction between DSM-IV abuse and dependence based on severity can be improved using an alternative diagnostic algorithm that considers all of the alcohol abuse and dependence symptoms conjointly.
NASA Astrophysics Data System (ADS)
Wickersham, Andrew Joseph
There are two critical research needs for the study of hydrocarbon combustion in high speed flows: 1) combustion diagnostics with adequate temporal and spatial resolution, and 2) mathematical techniques that can extract key information from large datasets. The goal of this work is to address these needs, respectively, by the use of high speed and multi-perspective chemiluminescence and advanced mathematical algorithms. To obtain the measurements, this work explored the application of high speed chemiluminescence diagnostics and the use of fiber-based endoscopes (FBEs) for non-intrusive and multi-perspective chemiluminescence imaging up to 20 kHz. Non-intrusive and full-field imaging measurements provide a wealth of information for model validation and design optimization of propulsion systems. However, it is challenging to obtain such measurements due to various implementation difficulties such as optical access, thermal management, and equipment cost. This work therefore explores the application of FBEs for non-intrusive imaging in supersonic propulsion systems. The FBEs used in this work are demonstrated to overcome many of the aforementioned difficulties and provided datasets from multiple angular positions at up to 20 kHz in a supersonic combustor. The combustor operated on ethylene fuel at Mach 2 with an inlet stagnation temperature and pressure of approximately 640 degrees Fahrenheit and 70 psia, respectively. The imaging measurements were obtained from eight perspectives simultaneously, providing full-field datasets under such flow conditions for the first time and allowing the possibility of inferring multi-dimensional measurements. Due to their high speed and multi-perspective nature, such new diagnostic capabilities generate a large volume of data and call for analysis algorithms that can process the data and extract key physics effectively. To extract the key combustion dynamics from the measurements, three mathematical methods were investigated in this work: Fourier analysis, proper orthogonal decomposition (POD), and wavelet analysis (WA). These algorithms were first demonstrated and tested on imaging measurements obtained from one perspective in a sub-sonic combustor (up to Mach 0.2). The results show that these algorithms are effective in extracting the key physics from large datasets, including the characteristic frequencies of flow-flame interactions, especially during transient processes such as lean blow-off and ignition. After these relatively simple tests and demonstrations, the algorithms were applied to process the measurements obtained from multiple perspectives in the supersonic combustor. Compared to past analyses, which have been limited to data obtained from one perspective only, the availability of data from multiple perspectives provides further insight into the flame and flow structures in high speed flows. In summary, this work shows that high speed chemiluminescence is a simple yet powerful combustion diagnostic. Especially when combined with FBEs and the analysis algorithms described in this work, such diagnostics provide full-field imaging at high repetition rates in challenging flows. Based on such measurements, a wealth of information can be obtained from proper analysis algorithms, including characteristic frequencies, dominant flame modes, and even multi-dimensional flame and flow structures.
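A short sketch of snapshot POD computed via the singular value decomposition, the kind of decomposition referred to above; the frame stack here is random stand-in data, and in practice the temporal coefficients of each mode would be Fourier-analysed for characteristic frequencies.

```python
import numpy as np

frames = np.random.rand(500, 64, 64)                    # stand-in chemiluminescence frames
snapshots = frames.reshape(frames.shape[0], -1)         # one flattened frame per row
fluct = snapshots - snapshots.mean(axis=0)              # remove the mean flame image

U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)                            # fluctuation energy captured per mode
modes = Vt.reshape(-1, 64, 64)                          # spatial POD modes
coeffs = U * s                                          # temporal coefficients (FFT these for frequencies)
print("energy fraction of the first 5 modes:", np.round(energy[:5], 4))
```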
Multispectral autofluorescence diagnosis of non-melanoma cutaneous tumors
NASA Astrophysics Data System (ADS)
Borisova, Ekaterina; Dogandjiiska, Daniela; Bliznakova, Irina; Avramov, Latchezar; Pavlova, Elmira; Troyanova, Petranka
2009-07-01
Fluorescence analysis of basal cell carcinoma (BCC), squamous cell carcinoma (SCC), keratoacanthoma and benign cutaneous lesions is carried out in the initial phase of a clinical trial at the National Oncological Center - Sofia. Excitation sources with emission maxima at 365, 380, 405, 450 and 630 nm are applied for better differentiation of the fluorescence of nonmelanoma malignant cutaneous lesions and their spectral discrimination from benign pathologies. Major spectral features are addressed, and diagnostic discrimination algorithms based on the lesions' emission properties are proposed. The diagnostic algorithms and evaluation procedures developed will be applied to the development of a clinical optical biopsy system for skin cancer detection within the National Oncological Center and other university hospital dermatology departments in the country.
2017-07-07
Self-reported HIV-positive status but subsequent HIV-negative test result using rapid diagnostic testing algorithms among seven sub…
HIV rapid diagnostic tests (RDTs) combined in an algorithm are the current standard for HIV diagnosis … in many sub-Saharan African countries, and extensive laboratory testing has confirmed HIV RDTs have excellent sensitivity and specificity. However…
Wilke, Russell A; Berg, Richard L; Peissig, Peggy; Kitchner, Terrie; Sijercic, Bozana; McCarty, Catherine A; McCarty, Daniel J
2007-03-01
Diabetes mellitus is a rapidly increasing and costly public health problem. Large studies are needed to understand the complex gene-environment interactions that lead to diabetes and its complications. The Marshfield Clinic Personalized Medicine Research Project (PMRP) represents one of the largest population-based DNA biobanks in the United States. As part of an effort to begin phenotyping common diseases within the PMRP, we now report on the construction of a diabetes case-finding algorithm using electronic medical record data from adult subjects aged ≥50 years living in one of the target PMRP ZIP codes. Based upon diabetic diagnostic codes alone, we observed a false positive case rate ranging from 3.0% (in subjects with the highest glycosylated hemoglobin values) to 44.4% (in subjects with the lowest glycosylated hemoglobin values). We therefore developed an improved case finding algorithm that utilizes diabetic diagnostic codes in combination with clinical laboratory data and medication history. This algorithm yielded an estimated prevalence of 24.2% for diabetes mellitus in adult subjects aged ≥50 years.
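A hedged sketch of what such a combined case-finding rule can look like in code; the column names and code counts are invented, and the laboratory thresholds (HbA1c ≥ 6.5%, glucose ≥ 200 mg/dL) are standard clinical cut-offs rather than the exact rules validated in the PMRP.

```python
import pandas as pd

# Hypothetical per-subject extract from the electronic medical record.
df = pd.DataFrame({
    "subject_id": [1, 2, 3],
    "n_dm_dx_codes": [3, 1, 0],        # count of diabetes diagnostic codes
    "max_hba1c": [8.2, 6.1, 5.4],      # %
    "max_glucose": [190, 110, 95],     # mg/dL
    "on_dm_medication": [True, False, False],
})

# Codes alone over-call diabetes, so require supporting labs or diabetes-specific medication.
lab_support = (df["max_hba1c"] >= 6.5) | (df["max_glucose"] >= 200)
df["diabetes_case"] = (df["n_dm_dx_codes"] >= 2) & (lab_support | df["on_dm_medication"])
print(df[["subject_id", "diabetes_case"]])
```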
Cairns, Andrew W; Bond, Raymond R; Finlay, Dewar D; Guldenring, Daniel; Badilini, Fabio; Libretti, Guido; Peace, Aaron J; Leslie, Stephen J
The 12-lead electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, due to the complex nature of 12-lead ECG interpretation, there is a significant cognitive workload required from the interpreter. This complexity in ECG interpretation often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process. This computerised decision support system has been named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG. We hypothesise that semi-automatic interpretation using a digital assistant can be an optimal man-machine model for ECG interpretation, improving interpretation accuracy and reducing missed co-abnormalities. The Differential Diagnoses Algorithm (DDA) was developed using web technologies, where diagnostic ECG criteria are defined in an open storage format, JavaScript Object Notation (JSON), which is queried using a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed in which subjects interpreted ECGs using the conventional approach and using the IPI+DDA approach. A total of 375 interpretations were collected. The IPI+DDA approach was shown to improve diagnostic accuracy by 8.7% (although not statistically significant, p=0.1852), and the IPI+DDA suggested the correct interpretation more often than the human interpreter in 7/10 cases (varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated. Although the results were not statistically significant, we found that: 1) our decision support tool increased the number of correct interpretations, 2) the DDA algorithm suggested the correct interpretation more often than humans, and 3) as many as seven computerised diagnostic suggestions augmented human decision making in ECG interpretation. Statistical significance may be achieved by expanding the sample size.
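The abstract describes diagnostic criteria stored as JSON and queried by rule-based reasoning; the sketch below illustrates that pattern with invented criteria and findings (the real DDA schema, criteria, and thresholds are not given here).

```python
import json

# Illustrative criteria store in the spirit of the DDA (schema and content are hypothetical).
criteria = json.loads("""
[
  {"diagnosis": "Anterior STEMI",
   "requires": ["ST elevation V1-V4"],
   "supports": ["reciprocal ST depression", "Q waves V1-V4"]},
  {"diagnosis": "First-degree AV block",
   "requires": ["PR interval > 200 ms"],
   "supports": []}
]
""")

def suggest(annotations, criteria):
    """Rank diagnoses whose required findings all appear in the interpreter's annotations."""
    hits = []
    for c in criteria:
        if all(r in annotations for r in c["requires"]):
            score = 1 + sum(s in annotations for s in c["supports"])
            hits.append((c["diagnosis"], score))
    return sorted(hits, key=lambda t: -t[1])

print(suggest({"ST elevation V1-V4", "reciprocal ST depression"}, criteria))
```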
Tanis, Wilco; Habets, Jesse; van den Brink, Renee B A; Symersky, Petr; Budde, Ricardo P J; Chamuleau, Steven A J
2014-02-01
For acquired mechanical prosthetic heart valve (PHV) obstruction and suspicion of thrombosis, recently updated European Society of Cardiology guidelines advocate the confirmation of thrombus by transthoracic echocardiography, transesophageal echocardiography (TEE), and fluoroscopy. However, no evidence-based diagnostic algorithm is available for correct thrombus detection, although this is clinically important because fibrinolysis is contraindicated in non-thrombotic obstruction (isolated pannus). Here, we performed a review of the literature in order to propose a diagnostic algorithm. We performed a systematic search in PubMed and Embase. Included publications were assessed for methodological quality based on the validated Quality Assessment of Diagnostic Accuracy Studies (QUADAS) II checklist. Studies were scarce (n = 15) and the majority were of moderate methodological quality. In total, 238 mechanical PHVs with acquired obstruction and a reliable reference standard were included for the evaluation of the role of fluoroscopy, echocardiography, or multidetector-row computed tomography (MDCT). In acquired PHV obstruction caused by thrombosis, mass detection by TEE and leaflet restriction detected by fluoroscopy were observed in the majority of cases (96 and 100%, respectively). In contrast, in acquired PHV obstruction free of thrombosis (pannus), leaflet restriction detected by fluoroscopy was absent in some cases (17%) and mass detection by TEE was absent in the majority of cases (66%). In case of mass detection by TEE, predictors of obstructive thrombus masses (compared with pannus masses) were leaflet restriction, soft echo density, and increased mass length. When echocardiography is inconclusive, MDCT may correctly detect pannus/thrombus based on morphological aspects and localization. In acquired mechanical PHV obstruction without leaflet restriction and without a mass on TEE, obstructive PHV thrombosis cannot be confirmed and consequently fibrinolysis is not advised. Based on the literature search and our opinion, a diagnostic algorithm is provided to correctly identify non-thrombotic PHV obstruction, which is highly relevant in daily clinical practice.
Region of interest processing for iterative reconstruction in x-ray computed tomography
NASA Astrophysics Data System (ADS)
Kopp, Felix K.; Nasirudin, Radin A.; Mei, Kai; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Noël, Peter B.
2015-03-01
Recent advancements in graphics card technology have raised the performance of parallel computing and contributed to the introduction of iterative reconstruction methods for x-ray computed tomography in clinical CT scanners. Iterative maximum likelihood (ML) based reconstruction methods are known to reduce image noise and to improve the diagnostic quality of low-dose CT. However, iterative reconstruction of a region of interest (ROI), especially ML based, is challenging, yet for some clinical procedures, such as cardiac CT, only a ROI is needed for diagnosis. A high-resolution reconstruction of the full field of view (FOV) consumes unnecessary computational effort and results in reconstruction times slower than clinically acceptable. In this work, we present an extension and evaluation of an existing ROI processing algorithm. In particular, improvements to the equalization between regions inside and outside of the ROI are proposed. The evaluation was done on data collected from a clinical CT scanner. The performance of the different algorithms is qualitatively and quantitatively assessed. Our solution to the ROI problem provides an increase in signal-to-noise ratio and leads to visibly less noise in the final reconstruction. The reconstruction speed of our technique was observed to be comparable with that of previously proposed techniques. The development of ROI processing algorithms in combination with iterative reconstruction will provide higher diagnostic quality in the near future.
Accuracy of vaginal symptom self-diagnosis algorithms for deployed military women.
Ryan-Wenger, Nancy A; Neal, Jeremy L; Jones, Ashley S; Lowe, Nancy K
2010-01-01
Deployed military women have an increased risk for development of vaginitis due to extreme temperatures, primitive sanitation, hygiene and laundry facilities, and unavailable or unacceptable healthcare resources. The Women in the Military Self-Diagnosis (WMSD) and treatment kit was developed as a field-expedient solution to this problem. The primary study aims were to evaluate the accuracy of women's self-diagnosis of vaginal symptoms and eight diagnostic algorithms and to predict potential self-medication omission and commission error rates. Participants included 546 active duty, deployable Army (43.3%) and Navy (53.6%) women with vaginal symptoms who sought healthcare at troop medical clinics on base. In the clinic lavatory, women conducted a self-diagnosis using a sterile cotton swab to obtain vaginal fluid, a FemExam card to measure positive or negative pH and amines, and the investigator-developed WMSD Decision-Making Guide. Potential self-diagnoses were "bacterial infection" (bacterial vaginosis [BV] and/or trichomonas vaginitis [TV]), "yeast infection" (candida vaginitis [CV]), "no infection/normal," or "unclear." The Affirm VPIII laboratory reference standard was used to detect clinically significant amounts of vaginal fluid DNA for organisms associated with BV, TV, and CV. Women's self-diagnostic accuracy was 56% for BV/TV and 69.2% for CV. False-positives would have led to a self-medication commission error rate of 20.3% for BV/TV and 8% for CV. Potential self-medication omission error rates due to false-negatives were 23.7% for BV/TV and 24.8% for CV. The positive predictive value of diagnostic algorithms ranged from 0% to 78.1% for BV/TV and 41.7% for CV. The algorithms were based on clinical diagnostic standards. The nonspecific nature of vaginal symptoms, mixed infections, and a faulty device intended to measure vaginal pH and amines explain why none of the algorithms reached the goal of 95% accuracy. The next prototype of the WMSD kit will not include nonspecific vaginal signs and symptoms in favor of recently available point-of-care devices that identify antigens or enzymes of the causative BV, TV, and CV organisms.
Diagnostic Approach to a Patient With Paraneoplastic Neurological Syndrome.
Mahta, Ali; Vijayvergia, Namrata; Bhavsar, Tapan M; Ward, Lawrence D
2012-10-01
Herein, we discuss the case of an otherwise healthy man who presented with progressive gait imbalance and ataxia and was found to have small cell lung cancer. Based upon our clinical findings and laboratory data, a diagnosis of paraneoplastic cerebellar degeneration was made. Paraneoplastic neurological syndromes (PNS) are relatively rare but diverse and should always be considered in the differential diagnosis. A diagnostic algorithm, along with the appropriate work-up, is discussed here.
Diagnostic potential of Raman spectroscopy in Barrett's esophagus
NASA Astrophysics Data System (ADS)
Wong Kee Song, Louis-Michel; Molckovsky, Andrea; Wang, Kenneth K.; Burgart, Lawrence J.; Dolenko, Brion; Somorjai, Rajmund L.; Wilson, Brian C.
2005-04-01
Patients with Barrett's esophagus (BE) undergo periodic endoscopic surveillance with random biopsies in an effort to detect dysplastic or early cancerous lesions. Surveillance may be enhanced by near-infrared Raman spectroscopy (NIRS), which has the potential to identify endoscopically-occult dysplastic lesions within the Barrett's segment and allow for targeted biopsies. The aim of this study was to assess the diagnostic performance of NIRS for identifying dysplastic lesions in BE in vivo. Raman spectra (Pexc=70 mW; t=5 s) were collected from Barrett's mucosa at endoscopy using a custom-built NIRS system (λexc=785 nm) equipped with a filtered fiber-optic probe. Each probed site was biopsied for matching histological diagnosis as assessed by an expert pathologist. Diagnostic algorithms were developed using genetic algorithm-based feature selection and linear discriminant analysis, and classification was performed on all spectra with a bootstrap-based cross-validation scheme. The analysis comprised 192 samples (112 non-dysplastic, 54 low-grade dysplasia and 26 high-grade dysplasia/early adenocarcinoma) from 65 patients. Compared with histology, NIRS differentiated dysplastic from non-dysplastic Barrett's samples with 86% sensitivity, 88% specificity and 87% accuracy. NIRS identified 'high-risk' lesions (high-grade dysplasia/early adenocarcinoma) with 88% sensitivity, 89% specificity and 89% accuracy. In the present study, NIRS classified Barrett's epithelia with high and clinically-useful diagnostic accuracy.
Weiß, Jakob; Schabel, Christoph; Bongers, Malte; Raupach, Rainer; Clasen, Stephan; Notohamiprodjo, Mike; Nikolaou, Konstantin; Bamberg, Fabian
2017-03-01
Background Metal artifacts often impair diagnostic accuracy in computed tomography (CT) imaging. Therefore, effective metal artifact reduction algorithms implemented in the clinical workflow are crucial to obtaining higher diagnostic image quality in patients with metallic hardware. Purpose To assess the clinical performance of a novel iterative metal artifact reduction (iMAR) algorithm for CT in patients with dental fillings. Material and Methods Thirty consecutive patients scheduled for CT imaging and with dental fillings were included in the analysis. All patients underwent CT imaging using a second generation dual-source CT scanner (120 kV single-energy; 100/Sn140 kV dual-energy, 219 mAs, gantry rotation time 0.28-1 s, collimation 0.6 mm) as part of their clinical work-up. Post-processing included a standard kernel (B49) and the iterative MAR algorithm. Image quality and diagnostic value were assessed qualitatively (Likert scale) and quantitatively (HU ± SD) by two reviewers independently. Results All 30 patients were included in the analysis, with comparable reconstruction times for iMAR and standard reconstruction (17 s ± 0.5 vs. 19 s ± 0.5; P > 0.05). Visual image quality was significantly higher for iMAR as compared with standard reconstruction (3.8 ± 0.5 vs. 2.6 ± 0.5; P < 0.0001) and showed improved evaluation of adjacent anatomical structures. Similarly, HU-based measurements of the degree of artifacts were significantly lower in the iMAR reconstructions as compared with the standard reconstruction (0.9 ± 1.6 vs. -20 ± 47; P < 0.05). Conclusion The tested iterative, raw-data based MAR algorithm allows for a significant reduction of metal artifacts and improved evaluation of adjacent anatomical structures in the head and neck area in patients with dental hardware.
Automated Dermoscopy Image Analysis of Pigmented Skin Lesions
Baldi, Alfonso; Quartulli, Marco; Murace, Raffaele; Dragonetti, Emanuele; Manganaro, Mario; Guerra, Oscar; Bizzi, Stefano
2010-01-01
Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible by the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning-curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval systems (CBIR).
Application of content-based image compression to telepathology
NASA Astrophysics Data System (ADS)
Varga, Margaret J.; Ducksbury, Paul G.; Callagy, Grace
2002-05-01
Telepathology is a means of practicing pathology at a distance, viewing images on a computer display rather than directly through a microscope. Without compression, images take too long to transmit to a remote location and are very expensive to store for future examination. However, to date the use of compressed images in pathology remains controversial. This is because commercial image compression algorithms such as JPEG achieve data compression without knowledge of the diagnostic content. Often images are lossily compressed at the expense of corrupting informative content. None of the currently available lossy compression techniques are concerned with what information has been preserved and what data has been discarded. Their sole objective is to compress and transmit the images as fast as possible. By contrast, this paper presents a novel image compression technique, which exploits knowledge of the slide diagnostic content. This 'content based' approach combines visually lossless and lossy compression techniques, judiciously applying each in the appropriate context across an image so as to maintain 'diagnostic' information while still maximising the possible compression. Standard compression algorithms, e.g. wavelets, can still be used, but their use in a context sensitive manner can offer high compression ratios and preservation of diagnostically important information. When compared with lossless compression the novel content-based approach can potentially provide the same degree of information with a smaller amount of data. When compared with lossy compression it can provide more information for a given amount of compression. The precise gain in the compression performance depends on the application (e.g. database archive or second opinion consultation) and the diagnostic content of the images.
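One simple way to realise the content-based idea, sketched here under assumed inputs rather than as the authors' actual wavelet-based scheme, is to store the diagnostically important region losslessly and the rest of the slide with aggressive lossy compression, then recombine at the receiving end; the region coordinates and quality setting below are arbitrary.

```python
from io import BytesIO
import numpy as np
from PIL import Image

img = Image.fromarray((np.random.rand(512, 512, 3) * 255).astype("uint8"))  # stand-in slide tile
roi_box = (180, 180, 330, 330)                   # hypothetical diagnostically important region

bg_buf, roi_buf = BytesIO(), BytesIO()
img.save(bg_buf, format="JPEG", quality=20)      # aggressive lossy background
img.crop(roi_box).save(roi_buf, format="PNG")    # lossless ROI

# Receiver side: decode the lossy background and paste the lossless ROI back in place.
bg_buf.seek(0)
roi_buf.seek(0)
recon = Image.open(bg_buf).convert("RGB")
recon.paste(Image.open(roi_buf), roi_box[:2])
print("bytes: lossy background =", bg_buf.getbuffer().nbytes,
      "| lossless ROI =", roi_buf.getbuffer().nbytes)
```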
Lungu, Angela; Swift, Andrew J; Capener, David; Kiely, David; Hose, Rod; Wild, Jim M
2016-06-01
Accurately identifying patients with pulmonary hypertension (PH) using noninvasive methods is challenging, and right heart catheterization (RHC) is the gold standard. Magnetic resonance imaging (MRI) has been proposed as an alternative to echocardiography and RHC in the assessment of cardiac function and pulmonary hemodynamics in patients with suspected PH. The aim of this study was to assess whether machine learning using computational modeling techniques and image-based metrics of PH can improve the diagnostic accuracy of MRI in PH. Seventy-two patients with suspected PH attending a referral center underwent RHC and MRI within 48 hours. Fifty-seven patients were diagnosed with PH, and 15 had no PH. A number of functional and structural cardiac and cardiovascular markers derived from 2 mathematical models and also solely from MRI of the main pulmonary artery and heart were integrated into a classification algorithm to investigate the diagnostic utility of the combination of the individual markers. A physiological marker based on the quantification of wave reflection in the pulmonary artery was shown to perform best individually, but optimal diagnostic performance was found by the combination of several image-based markers. Classifier results, validated using leave-one-out cross validation, demonstrated that combining computation-derived metrics reflecting hemodynamic changes in the pulmonary vasculature with measurement of right ventricular morphology and function, in a decision support algorithm, provides a method to noninvasively diagnose PH with high accuracy (92%). The high diagnostic accuracy of these MRI-based model parameters may reduce the need for RHC in patients with suspected PH.
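A minimal sketch of the final classification step with leave-one-out cross-validation, using synthetic stand-ins for the image- and model-derived markers; the study's actual feature set and classifier are not specified in the abstract, so a regularised logistic regression is assumed purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# 72 subjects x 5 hypothetical markers (e.g. wave-reflection index, RV metrics).
X = rng.normal(size=(72, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=72) > 0).astype(int)  # 1 = PH at RHC

clf = make_pipeline(StandardScaler(), LogisticRegression())
pred = cross_val_predict(clf, X, y, cv=LeaveOneOut())        # leave-one-out validation
print("leave-one-out accuracy on the synthetic data:", round((pred == y).mean(), 3))
```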
Ostreĭkov, I F; Podkopaev, V N; Moiseev, D B; Karpysheva, E V; Markova, L A; Sizov, S V
1997-01-01
Total mortality in the neonatal intensive care wards of the Tushino Pediatric Hospital decreased 2.5-fold in 1996 and is now 7.6%. These results are due to a complex of measures, one of which was the development and introduction of an algorithm for the diagnosis and treatment of newborns admitted to intensive care wards. The algorithm facilitates the work of the staff, helps diagnose disease earlier, and hence allows timely, scientifically based therapy to be carried out.
Pugliese, Cara E; Kenworthy, Lauren; Bal, Vanessa Hus; Wallace, Gregory L; Yerys, Benjamin E; Maddox, Brenna B; White, Susan W; Popal, Haroon; Armour, Anna Chelsea; Miller, Judith; Herrington, John D; Schultz, Robert T; Martin, Alex; Anthony, Laura Gutermuth
2015-12-01
Recent updates have been proposed to the Autism Diagnostic Observation Schedule-2 Module 4 diagnostic algorithm. This new algorithm, however, has not yet been validated in an independent sample without intellectual disability (ID). This multi-site study compared the original and revised algorithms in individuals with ASD without ID. The revised algorithm demonstrated increased sensitivity, but lower specificity in the overall sample. Estimates were highest for females, individuals with a verbal IQ below 85 or above 115, and ages 16 and older. Best practice diagnostic procedures should include the Module 4 in conjunction with other assessment tools. Balancing needs for sensitivity and specificity depending on the purpose of assessment (e.g., clinical vs. research) and demographic characteristics mentioned above will enhance its utility.
Hus, Vanessa; Lord, Catherine
2014-08-01
The recently published Autism Diagnostic Observation Schedule, 2nd edition (ADOS-2) includes revised diagnostic algorithms and standardized severity scores for modules used to assess younger children. A revised algorithm and severity scores are not yet available for Module 4, used with verbally fluent adults. The current study revises the Module 4 algorithm and calibrates raw overall and domain totals to provide metrics of autism spectrum disorder (ASD) symptom severity. Sensitivity and specificity of the revised Module 4 algorithm exceeded 80 % in the overall sample. Module 4 calibrated severity scores provide quantitative estimates of ASD symptom severity that are relatively independent of participant characteristics. These efforts increase comparability of ADOS scores across modules and should facilitate efforts to examine symptom trajectories from toddler to adulthood.
Sequential Test Strategies for Multiple Fault Isolation
NASA Technical Reports Server (NTRS)
Shakeri, M.; Pattipati, Krishna R.; Raghavan, V.; Patterson-Hine, Ann; Kell, T.
1997-01-01
In this paper, we consider the problem of constructing near optimal test sequencing algorithms for diagnosing multiple faults in redundant (fault-tolerant) systems. The computational complexity of solving the optimal multiple-fault isolation problem is super-exponential, that is, it is much more difficult than the single-fault isolation problem, which, by itself, is NP-hard. By employing concepts from information theory and Lagrangian relaxation, we present several static and dynamic (on-line or interactive) test sequencing algorithms for the multiple fault isolation problem that provide a trade-off between the degree of suboptimality and computational complexity. Furthermore, we present novel diagnostic strategies that generate a static diagnostic directed graph (digraph), instead of a static diagnostic tree, for multiple fault diagnosis. Using this approach, the storage complexity of the overall diagnostic strategy reduces substantially. Computational results based on real-world systems indicate that the size of a static multiple fault strategy is strictly related to the structure of the system, and that the use of an on-line multiple fault strategy can diagnose faults in systems with as many as 10,000 failure sources.
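The information-theoretic ingredient can be illustrated with a one-step greedy test chooser, a far simpler device than the near-optimal static and dynamic strategies described above; the fault-state priors and test outcome models below are hypothetical.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def next_best_test(prior, p_pass_given_state):
    """Pick the test with maximum expected reduction in entropy over fault states.
    prior: P(state), shape (S,); p_pass_given_state: P(test passes | state), shape (T, S)."""
    gains = []
    for p_pass in p_pass_given_state:
        p_obs = np.array([prior @ p_pass, prior @ (1 - p_pass)])       # P(pass), P(fail)
        post_pass = prior * p_pass / max(p_obs[0], 1e-12)
        post_fail = prior * (1 - p_pass) / max(p_obs[1], 1e-12)
        expected_H = p_obs[0] * entropy(post_pass) + p_obs[1] * entropy(post_fail)
        gains.append(entropy(prior) - expected_H)
    return int(np.argmax(gains)), gains

prior = np.array([0.5, 0.3, 0.2])                       # belief over hypothetical fault states
tests = np.array([[1.0, 0.0, 1.0],                      # P(pass | state) for test 0
                  [1.0, 1.0, 0.0]])                     # P(pass | state) for test 1
best, gains = next_best_test(prior, tests)
print("choose test", best, "expected information gains:", np.round(gains, 3))
```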
Emergency ultrasound-based algorithms for diagnosing blunt abdominal trauma.
Stengel, Dirk; Rademacher, Grit; Ekkernkamp, Axel; Güthoff, Claas; Mutze, Sven
2015-09-14
Ultrasonography (performed by means of a four-quadrant focused assessment with sonography for trauma, FAST) is regarded as a key instrument for the initial assessment of patients with suspected blunt abdominal and thoraco-abdominal trauma in the emergency department setting. FAST has a high specificity but low sensitivity in detecting and excluding visceral injuries. Proponents of FAST argue that ultrasound-based clinical pathways enhance the speed of primary trauma assessment, reduce the number of unnecessary multi-detector computed tomography (MDCT) scans, and enable quicker triage to surgical and non-surgical care. Given the proven accuracy, increasing availability of, and indication for, MDCT among patients with blunt abdominal and multiple injuries, we aimed to compile the best available evidence on the use of FAST-based assessment compared with other primary trauma assessment protocols. To assess the effects of diagnostic algorithms using ultrasonography, including FAST examinations, in the emergency department in relation to the early, late, and overall mortality of patients with suspected blunt abdominal trauma. The most recent search was run on 30th June 2015. We searched the Cochrane Injuries Group Specialised Register, The Cochrane Library, MEDLINE (OvidSP), EMBASE (OvidSP), ISI Web of Science (SCI-EXPANDED, SSCI, CPCI-S, and CPSI-SSH), clinical trials registers, and screened reference lists. Trial authors were contacted for further information and individual patient data. We included randomised controlled trials (RCTs). Participants were patients with blunt torso, abdominal, or multiple trauma undergoing diagnostic investigations for abdominal organ injury. The intervention was diagnostic algorithms comprising emergency ultrasonography (US); the control was diagnostic algorithms without US examinations (for example, primary computed tomography (CT) or diagnostic peritoneal lavage (DPL)). Outcomes were mortality, use of CT or invasive procedures (DPL, laparoscopy, laparotomy), and cost-effectiveness. Two authors (DS and CG) independently selected trials for inclusion, assessed methodological quality, and extracted data. Methodological quality was assessed using the Cochrane Collaboration risk of bias tool. Where possible, data were pooled and relative risks (RRs), risk differences (RDs), and weighted mean differences, each with 95% confidence intervals (CIs), were calculated by fixed-effect or random-effects models as appropriate. We identified four studies meeting our inclusion criteria. Overall, trials were of poor to moderate methodological quality. Few trial authors responded to our written inquiries seeking to resolve controversial issues and to obtain individual patient data. Strong heterogeneity amongst the trials prompted discussion between the review authors as to whether the data should or should not be pooled; we decided in favour of a quantitative synthesis to provide a rough impression of the effect sizes achievable with US-based triage algorithms. We pooled mortality data from three trials involving 1254 patients; the RR in favour of the FAST arm was 1.00 (95% CI 0.50 to 2.00). FAST-based pathways reduced the number of CT scans (random-effects model RD -0.52, 95% CI -0.83 to -0.21), but the meaning of this result remained unclear. The experimental evidence justifying FAST-based clinical pathways in diagnosing patients with suspected abdominal or multiple blunt trauma remains poor.
Because of strong heterogeneity between the trial results, the quantitative information provided by this review may only be used in an exploratory fashion. It is unlikely that FAST will ever be investigated by means of a confirmatory, large-scale RCT in the future. Thus, this Cochrane Review may be regarded as providing the best available evidence for clinical practice guidelines and management recommendations. It can only be concluded from the few head-to-head studies that negative US scans are likely to reduce the incidence of MDCT scans, which, given the low sensitivity of FAST (and the limited reliability of negative results), may adversely affect the diagnostic yield of the trauma survey. At best, US has no negative impact on mortality or morbidity. Assuming that major blunt abdominal or multiple trauma is associated with 15% mortality and that a CT-based diagnostic work-up is considered the current standard of care, 874, 3,495, or 21,838 patients are needed per intervention group to demonstrate non-inferiority of FAST to CT-based algorithms with non-inferiority margins of 5%, 2.5%, and 1%, a power of 90%, and a type-I error alpha of 5%.
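Those per-group numbers are consistent with the standard normal-approximation sample-size formula for non-inferiority of two proportions assumed equal to 15%. The sketch below is an illustration of that formula, not the review authors' own calculation.

```python
# Reproduces (to within rounding) the per-group sizes quoted above, using the
# standard normal-approximation formula for non-inferiority of two proportions
# assumed equal to p = 0.15.  Illustrative only.
from math import ceil
from scipy.stats import norm

def n_per_group(p, margin, alpha=0.05, power=0.90):
    z = norm.ppf(1 - alpha) + norm.ppf(power)      # one-sided alpha, 90% power
    return ceil(z ** 2 * 2 * p * (1 - p) / margin ** 2)

for margin in (0.05, 0.025, 0.01):
    print(f"margin {margin:.3f}: {n_per_group(0.15, margin)} patients per group")
# -> 874, 3495, and about 21,839 per group, matching the quoted figures
```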
Continued Evaluation of Gear Condition Indicator Performance on Rotorcraft Fleet
NASA Technical Reports Server (NTRS)
Delgado, Irebert R.; Dempsey, Paula J.; Antolick, Lance J.; Wade, Daniel R.
2013-01-01
This paper details analyses of condition indicator performance for the helicopter nose gearbox within the U.S. Army's Condition-Based Maintenance Program. Ten nose gearbox data sets underwent two specific analyses. A mean condition indicator level analysis was performed where condition indicator performance was based on a 'batting average' measured before and after part replacement. Two specific condition indicators, Diagnostic Algorithm 1 and Sideband Index, were found to perform well for the data sets studied. A condition indicator versus gear wear analysis was also performed, where gear wear photographs and descriptions from Army tear-down analyses were categorized based on ANSI/AGMA 1010-E95 standards. Seven nose gearbox data sets were analyzed and correlated with condition indicators Diagnostic Algorithm 1 and Sideband Index. Both were found to be most responsive to gear wear cases of micropitting and spalling. Input pinion nose gear box condition indicators were found to be more responsive to part replacement during overhaul than their corresponding output gear nose gear box condition indicators.
Smeets, Miek; Degryse, Jan; Janssens, Stefan; Matheï, Catharina; Wallemacq, Pierre; Vanoverschelde, Jean-Louis; Aertgeerts, Bert; Vaes, Bert
2016-10-06
Different diagnostic algorithms for non-acute heart failure (HF) exist. Our aim was to compare the ability of these algorithms to identify HF in symptomatic patients aged 80 years and older and to identify those patients at highest risk for mortality. Diagnostic accuracy and validation study. General practice, Belgium. 365 patients with HF symptoms aged 80 years and older (BELFRAIL cohort). Participants underwent a full clinical assessment, including a detailed echocardiographic examination at home. The diagnostic accuracy of 4 different algorithms was compared using an intention-to-diagnose analysis. The European Society of Cardiology (ESC) definition of HF was used as the reference standard for HF diagnosis. Kaplan-Meier curves for 5-year all-cause mortality were plotted, and HRs and corresponding 95% CIs were calculated to compare the ability of the different algorithms to predict mortality risk. Net reclassification improvement (NRI) was calculated. The prevalence of HF was 20% (n=74). The 2012 ESC algorithm yielded the highest sensitivity (92%, 95% CI 83% to 97%) as well as the highest referral rate (71%, n=259), whereas the Oudejans algorithm yielded the highest specificity (73%, 95% CI 68% to 78%) and the lowest referral rate (36%, n=133). These differences could be ascribed to differences in N-terminal pro-brain natriuretic peptide cut-off values (125 vs 400 pg/mL). The Kelder and Oudejans algorithms exhibited NRIs of 12% (95% CI 0.7% to 22%, p=0.04) and 22% (95% CI 9% to 32%, p<0.001), respectively, compared with the ESC algorithm. All algorithms identified patients at high risk of mortality, with HRs ranging from 1.9 (95% CI 1.4 to 2.5; Kelder) to 2.3 (95% CI 1.7 to 3.1; Oudejans). No significant differences were observed among the algorithms with respect to their ability to predict mortality risk. Choosing a diagnostic algorithm for non-acute HF in elderly patients represents a trade-off between sensitivity and specificity, mainly depending on differences between cut-off values for natriuretic peptides. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Algorithm for Video Summarization of Bronchoscopy Procedures
2011-01-01
Background The duration of bronchoscopy examinations varies considerably depending on the diagnostic and therapeutic procedures used. It can last more than 20 minutes if a complex diagnostic work-up is included. With wide access to videobronchoscopy, the whole procedure can be recorded as a video sequence. Common practice relies on an active attitude of the bronchoscopist, who initiates the recording process and usually chooses to archive only selected views and sequences. However, it may be important to record the full bronchoscopy procedure as documentation when liability issues are at stake. Furthermore, an automatic recording of the whole procedure enables the bronchoscopist to focus solely on the performed procedures. Video recordings registered during bronchoscopies include a considerable number of frames of poor quality due to blurry or unfocused images. Such frames seem unavoidable due to the relatively tight endobronchial space, rapid movements of the respiratory tract due to breathing or coughing, and secretions which occur commonly in the bronchi, especially in patients suffering from pulmonary disorders. Methods The use of recorded bronchoscopy video sequences for diagnostic, reference and educational purposes could be considerably extended with efficient, flexible summarization algorithms. Thus, the authors developed a prototype system to create shortcuts (called summaries or abstracts) of bronchoscopy video recordings. Such a system, based on models described in previously published papers, employs image analysis methods to exclude frames or sequences of limited diagnostic or educational value. Results The algorithm for the selection or exclusion of specific frames or shots from video sequences recorded during bronchoscopy procedures is based on several criteria, including automatic detection of "non-informative" frames, frames showing the branching of the airways, and frames including pathological lesions. Conclusions The paper focuses on the challenge of generating summaries of bronchoscopy video recordings. PMID:22185344
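One common heuristic for flagging blurred, "non-informative" frames (not necessarily the criterion used by the authors) is a sharpness score such as the variance of the Laplacian; a minimal sketch with an assumed threshold follows.

```python
# Minimal sketch of a "non-informative" frame filter for a bronchoscopy
# recording: frames whose Laplacian variance (a simple sharpness measure) falls
# below a threshold are treated as blurred and dropped.  Both the measure and
# the threshold are illustrative assumptions, not the paper's criteria.
import numpy as np
from scipy.ndimage import laplace

def is_informative(gray_frame: np.ndarray, sharpness_threshold: float = 50.0) -> bool:
    """gray_frame: 2-D array of grey levels (0-255)."""
    sharpness = laplace(gray_frame.astype(float)).var()
    return sharpness >= sharpness_threshold

def summarize(frames):
    """Keep only frames judged informative; a full summarizer would also detect
    airway branchings and lesions before assembling the video abstract."""
    return [f for f in frames if is_informative(f)]
```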
Benz, Dominik C; Fuchs, Tobias A; Gräni, Christoph; Studer Bruengger, Annina A; Clerc, Olivier F; Mikulicic, Fran; Messerli, Michael; Stehli, Julia; Possner, Mathias; Pazhenkottil, Aju P; Gaemperli, Oliver; Kaufmann, Philipp A; Buechel, Ronny R
2018-02-01
Iterative reconstruction (IR) algorithms allow for a significant reduction in radiation dose of coronary computed tomography angiography (CCTA). We performed a head-to-head comparison of adaptive statistical IR (ASiR) and model-based IR (MBIR) algorithms to assess their impact on quantitative image parameters and diagnostic accuracy for submillisievert CCTA. CCTA datasets of 91 patients were reconstructed using filtered back projection (FBP), increasing contributions of ASiR (20, 40, 60, 80, and 100%), and MBIR. Signal and noise were measured in the aortic root to calculate signal-to-noise ratio (SNR). In a subgroup of 36 patients, diagnostic accuracy of ASiR 40%, ASiR 100%, and MBIR for diagnosis of coronary artery disease (CAD) was compared with invasive coronary angiography. Median radiation dose was 0.21 mSv for CCTA. While increasing levels of ASiR gradually reduced image noise compared with FBP (up to - 48%, P < 0.001), MBIR provided largest noise reduction (-79% compared with FBP) outperforming ASiR (-59% compared with ASiR 100%; P < 0.001). Increased noise and lower SNR with ASiR 40% and ASiR 100% resulted in substantially lower diagnostic accuracy to detect CAD as diagnosed by invasive coronary angiography compared with MBIR: sensitivity and specificity were 100 and 37%, 100 and 57%, and 100 and 74% for ASiR 40%, ASiR 100%, and MBIR, respectively. MBIR offers substantial noise reduction with increased SNR, paving the way for implementation of submillisievert CCTA protocols in clinical routine. In contrast, inferior noise reduction by ASiR negatively affects diagnostic accuracy of submillisievert CCTA for CAD detection. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.
Noël, Peter B; Engels, Stephan; Köhler, Thomas; Muenzel, Daniela; Franz, Daniela; Rasper, Michael; Rummeny, Ernst J; Dobritz, Martin; Fingerle, Alexander A
2018-01-01
Background The explosive growth of computed tomography (CT) has led to a growing public health concern about patient and population radiation dose. A recently introduced technique for dose reduction, which can be combined with tube-current modulation, over-beam reduction, and organ-specific dose reduction, is iterative reconstruction (IR). Purpose To evaluate the quality, at different radiation dose levels, of three reconstruction algorithms for diagnostics of patients with proven liver metastases under tumor follow-up. Material and Methods A total of 40 thorax-abdomen-pelvis CT examinations acquired from 20 patients in a tumor follow-up were included. All patients were imaged using the standard-dose and a specific low-dose CT protocol. Reconstructed slices were generated by using three different reconstruction algorithms: a classical filtered back projection (FBP); a first-generation iterative noise-reduction algorithm (iDose4); and a next-generation model-based IR algorithm (IMR). Results The overall detection of liver lesions tended to be higher with the IMR algorithm than with FBP or iDose4. The IMR dataset at standard dose yielded the highest overall detectability, while the low-dose FBP dataset showed the lowest detectability. For the low-dose protocol, IMR provided significantly improved detectability of liver lesions compared with FBP or iDose4 (P = 0.01). The radiation dose decreased by an approximate factor of 5 between the standard-dose and the low-dose protocol. Conclusion The latest generation of IR algorithms significantly improved the diagnostic image quality and provided virtually noise-free images for ultra-low-dose CT imaging.
Sola, J; Braun, F; Muntane, E; Verjus, C; Bertschi, M; Hugon, F; Manzano, S; Benissa, M; Gervaix, A
2016-08-01
Pneumonia remains the worldwide leading cause of mortality in children under the age of five, causing 1.4 million deaths every year. Unfortunately, in low-resource settings, very limited diagnostic support aids are provided to point-of-care practitioners. The current UNICEF/WHO case management algorithm relies on the use of a chronometer to manually count breathing rates in pediatric patients: there is thus a major need for more sophisticated tools to diagnose pneumonia that increase the sensitivity and specificity of breath-rate-based algorithms. These tools should be low cost and adapted to practitioners with limited training. In this work, a novel concept of an unsupervised tool for the diagnosis of childhood pneumonia is presented. The concept relies on the automated analysis of respiratory sounds as recorded by a point-of-care electronic stethoscope. By identifying the presence of auscultation sounds at different chest locations, this diagnostic tool is intended to estimate a pneumonia likelihood score. After presenting the overall architecture of an algorithm to estimate pneumonia scores, the importance of a robust unsupervised method to identify the inspiratory and expiratory phases of a respiratory cycle is highlighted. Based on data from an ongoing study involving pediatric pneumonia patients, a first algorithm to segment respiratory sounds is suggested. The unsupervised algorithm relies on a Mel-frequency filter bank, a two-step Gaussian Mixture Model (GMM) description of the data, and a final Hidden Markov Model (HMM) interpretation of inspiratory-expiratory sequences. Finally, illustrative results on the first recruited patients are provided. The presented algorithm opens the door to a new family of unsupervised respiratory sound analyzers that could improve future versions of case management algorithms for the diagnosis of pneumonia in low-resource settings.
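The front end described above can be sketched as mel-band features from a chest recording followed by an unsupervised two-component Gaussian mixture over frames; the filter-bank size, frame lengths, and file name below are assumptions, and the HMM stage is omitted.

```python
# Minimal sketch of the signal front end described above: mel-band features
# from a chest-sound recording, then an unsupervised 2-component Gaussian
# mixture to separate "respiratory activity" frames from quieter frames.
# Filter-bank size, frame length, the input file and the omitted HMM stage
# are assumptions, not the authors' exact configuration.
import librosa
from sklearn.mixture import GaussianMixture

y, sr = librosa.load("auscultation.wav", sr=4000)          # hypothetical recording
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=20,
                                     n_fft=256, hop_length=128)
log_mel = librosa.power_to_db(mel).T                        # frames x mel bands

gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0)
states = gmm.fit_predict(log_mel)                           # 0/1 label per frame
# A Hidden Markov Model over these frame labels would then enforce plausible
# inspiration/expiration durations before scoring pneumonia likelihood.
```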
Fiuzy, Mohammad; Haddadnia, Javad; Mollania, Nasrin; Hashemian, Maryam; Hassanpour, Kazem
2012-01-01
Background Accurate diagnosis of breast cancer is of prime importance. The Fine Needle Aspiration (FNA) test, which has been used for several years in Europe, is a simple, inexpensive, noninvasive and accurate technique for detecting breast cancer. Selecting suitable features from FNA results is the most important diagnostic problem in the early stages of breast cancer. In this study, we introduce a new algorithm that can detect breast cancer by combining an artificial intelligence system with FNA. Methods We studied the features of the Wisconsin Diagnostic Breast Cancer (WDBC) database, which contains 569 FNA test samples (212 malignant and 357 benign). In this research, we combined artificial intelligence approaches, namely an Evolutionary Algorithm (EA) with a Genetic Algorithm (GA), and used an exact classifier system (Fuzzy C-Means, FCM) to separate malignant from benign samples. Furthermore, we examined artificial Neural Networks (NN) to identify the model and structure. This research proposes a new algorithm for the accurate diagnosis of breast cancer. Results According to the WDBC database, 62.75% of samples were benign and 37.25% were malignant. After applying the proposed algorithm, we achieved a high detection accuracy of about 96.579% on 205 patients who were diagnosed as having breast cancer. The method had 93% sensitivity, 73% specificity, 65% positive predictive value, and 95% negative predictive value. FNA can be the first line of diagnosis in women with breast masses, at least in deprived regions, and may increase health standards and clinical supervision of patients. Conclusion Such a smart, economical, non-invasive, rapid and accurate system can be introduced as a useful diagnostic aid in the comprehensive management of breast cancer. Another advantage of this method is the possibility of diagnosing breast abnormalities. If done by experts, FNA can be a reliable replacement for open biopsy in palpable breast masses, and evaluation of FNA samples during aspiration can decrease the number of insufficient samples. PMID:25352966
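The 569-sample dataset the authors describe is the Wisconsin Diagnostic Breast Cancer set that ships with scikit-learn. As a point of reference only, a plain supervised baseline (not the paper's EA/GA plus fuzzy C-means plus neural-network pipeline) can be run as follows.

```python
# Plain supervised baseline on the 569-sample WDBC data (357 benign / 212
# malignant FNA records) described above.  This is NOT the paper's hybrid
# GA + fuzzy C-means + neural-network method; it only shows the dataset and a
# simple reference classifier.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)      # 569 x 30 FNA-derived features
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                                  random_state=0))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"5-fold accuracy: {scores.mean():.3f}")
```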
NASA Astrophysics Data System (ADS)
Hong, Liu; Qu, Yongzhi; Dhupia, Jaspreet Singh; Sheng, Shuangwen; Tan, Yuegang; Zhou, Zude
2017-09-01
The localized failures of gears introduce cyclic-transient impulses in the measured gearbox vibration signals. These impulses are usually identified from the sidebands around gear-mesh harmonics through the spectral analysis of cyclo-stationary signals. However, in practice, several high-powered applications of gearboxes like wind turbines are intrinsically characterized by nonstationary processes that blur the measured vibration spectra of a gearbox and deteriorate the efficacy of spectral diagnostic methods. Although order-tracking techniques have been proposed to improve the performance of spectral diagnosis for nonstationary signals measured in such applications, the required hardware for the measurement of rotational speed of these machines is often unavailable in industrial settings. Moreover, existing tacho-less order-tracking approaches are usually limited by the high time-frequency resolution requirement, which is a prerequisite for the precise estimation of the instantaneous frequency. To address such issues, a novel fault-signature enhancement algorithm is proposed that can alleviate the spectral smearing without the need of rotational speed measurement. This proposed tacho-less diagnostic technique resamples the measured acceleration signal of the gearbox based on the optimal warping path evaluated from the fast dynamic time-warping algorithm, which aligns a filtered shaft rotational harmonic signal with respect to a reference signal assuming a constant shaft rotational speed estimated from the approximation of operational speed. The effectiveness of this method is validated using both simulated signals from a fixed-axis gear pair under nonstationary conditions and experimental measurements from a 750-kW planetary wind turbine gearbox on a dynamometer test rig. The results demonstrate that the proposed algorithm can identify fault information from typical gearbox vibration measurements carried out in a resource-constrained industrial environment.
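The core resampling idea can be illustrated compactly: band-pass filter the shaft harmonic, align it to a constant-speed reference with dynamic time warping (a plain O(N^2) DTW here, where the paper uses fast DTW), and resample the raw signal along the warping path. The filter band and nominal shaft frequency are assumed inputs, not values from the paper.

```python
# Sketch of tacho-less resampling: align a band-pass-filtered shaft harmonic
# to a constant-speed reference with DTW, then use the warping path to place
# the raw vibration on a constant-speed grid.  The naive O(N^2) DTW, filter
# band and nominal speed are simplifying assumptions for illustration.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def dtw_path(a, b):
    """Plain dynamic time warping; returns the optimal index pairs (i, j)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = abs(a[i - 1] - b[j - 1]) + min(D[i - 1, j - 1],
                                                     D[i - 1, j], D[i, j - 1])
    path, (i, j) = [], (n, m)
    while (i, j) != (0, 0):
        path.append((i - 1, j - 1))
        k = int(np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]]))
        i, j = [(i - 1, j - 1), (i - 1, j), (i, j - 1)][k]
    return path[::-1]

def tacholess_resample(x, fs, f_nominal, half_band=5.0):
    """x: measured acceleration; f_nominal: approximate constant shaft speed (Hz)."""
    t = np.arange(len(x)) / fs
    b, a = butter(4, [(f_nominal - half_band) / (fs / 2),
                      (f_nominal + half_band) / (fs / 2)], btype="band")
    harmonic = filtfilt(b, a, x)                       # shaft rotational harmonic
    envelope = np.maximum(np.abs(hilbert(harmonic)), 1e-12)
    reference = np.cos(2 * np.pi * f_nominal * t)      # constant-speed reference
    path = dtw_path(harmonic / envelope, reference)
    warp = {}                                          # reference index -> measured index
    for i, j in path:
        warp.setdefault(j, i)
    return x[np.array([warp[j] for j in range(len(reference))])]
```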
Monahan, Mark; Jowett, Sue; Lovibond, Kate; Gill, Paramjit; Godwin, Marshall; Greenfield, Sheila; Hanley, Janet; Hobbs, F D Richard; Martin, Una; Mant, Jonathan; McKinstry, Brian; Williams, Bryan; Sheppard, James P; McManus, Richard J
2018-02-01
Clinical guidelines in the United States and United Kingdom recommend that individuals with suspected hypertension should have ambulatory blood pressure (BP) monitoring to confirm the diagnosis. This approach reduces misdiagnosis because of white coat hypertension but will not identify people with masked hypertension who may benefit from treatment. The Predicting Out-of-Office Blood Pressure (PROOF-BP) algorithm predicts masked and white coat hypertension based on patient characteristics and clinic BP, improving the accuracy of diagnosis while limiting subsequent ambulatory BP monitoring. This study assessed the cost-effectiveness of using this tool in diagnosing hypertension in primary care. A Markov cost-utility cohort model was developed to compare diagnostic strategies: the PROOF-BP approach, including those with clinic BP ≥130/80 mm Hg who receive ambulatory BP monitoring as guided by the algorithm, compared with current standard diagnostic strategies including those with clinic BP ≥140/90 mm Hg combined with further monitoring (ambulatory BP monitoring as reference, clinic, and home monitoring also assessed). The model adopted a lifetime horizon with a 3-month time cycle, taking a UK Health Service/Personal Social Services perspective. The PROOF-BP algorithm was cost-effective in screening all patients with clinic BP ≥130/80 mm Hg compared with current strategies that only screen those with clinic BP ≥140/90 mm Hg, provided healthcare providers were willing to pay up to £20 000 ($26 000)/quality-adjusted life year gained. Deterministic and probabilistic sensitivity analyses supported the base-case findings. The PROOF-BP algorithm seems to be cost-effective compared with the conventional BP diagnostic options in primary care. Its use in clinical practice is likely to lead to reduced cardiovascular disease, death, and disability. © 2017 American Heart Association, Inc.
Design of the algorithm of photons migration in the multilayer skin structure
NASA Astrophysics Data System (ADS)
Bulykina, Anastasiia B.; Ryzhova, Victoria A.; Korotaev, Valery V.; Samokhin, Nikita Y.
2017-06-01
The design of approaches and methods for diagnosing oncological diseases is of special significance, since it allows tumors of any kind to be detected at early stages. The development of optical and laser technologies has increased the number of methods available for the diagnostic study of oncological diseases. A promising area of biomedical diagnostics is the development of automated non-destructive testing systems for studying the polarizing properties of skin based on the detection of backscattered radiation. Characterizing the polarizing properties of the examined tissue makes it possible to study how structural properties change under various pathologies. Consequently, the measurement and analysis of the polarizing properties of scattered optical radiation are relevant to the development of methods for in vivo diagnosis and imaging of skin. The purpose of this research is to design an algorithm for photon migration in the multilayer skin structure. The designed algorithm is based on the Monte Carlo method: it tracks the paths of photons undergoing random discrete changes of direction until they leave the analyzed region or their intensity falls to a negligible level. The modeling algorithm consists of generating the medium and source characteristics; generating a photon with spatial coordinates and polar and azimuthal angles; calculating the photon weight reduction due to specular and diffuse reflection; determining the photon mean free path; determining the new direction of photon motion after random scattering with a Henyey-Greenstein phase function; and calculating the medium's absorption. Biological tissue is modeled as a homogeneous scattering sheet characterized by absorption, scattering, and anisotropy coefficients.
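A stripped-down version of such a photon random walk for a single homogeneous layer is sketched below; the optical coefficients are illustrative values, and refraction at boundaries, the multilayer geometry and roulette termination used in full implementations are omitted.

```python
# Simplified photon random walk of the kind described above, for one homogeneous
# tissue layer: exponentially distributed free paths, weight reduction by the
# single-scattering albedo, and Henyey-Greenstein sampling of the scattering
# angle.  Coefficients are illustrative; Fresnel reflection, layering and
# Russian roulette are omitted for brevity.
import numpy as np

rng = np.random.default_rng(0)
mu_a, mu_s, g = 0.1, 10.0, 0.9            # absorption, scattering (1/mm), anisotropy
mu_t = mu_a + mu_s

def hg_cos_theta(g):
    """Sample cos(theta) from the Henyey-Greenstein phase function."""
    xi = rng.random()
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0
    tmp = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - tmp * tmp) / (2.0 * g)

def scatter(d, cos_t, phi):
    """Rotate direction vector d by polar angle acos(cos_t) and azimuth phi."""
    ux, uy, uz = d
    sin_t = np.sqrt(max(0.0, 1.0 - cos_t * cos_t))
    if abs(uz) > 0.99999:                 # near-vertical travel: simple update
        return np.array([sin_t * np.cos(phi), sin_t * np.sin(phi),
                         np.copysign(cos_t, uz)])
    denom = np.sqrt(1.0 - uz * uz)
    return np.array([
        sin_t * (ux * uz * np.cos(phi) - uy * np.sin(phi)) / denom + ux * cos_t,
        sin_t * (uy * uz * np.cos(phi) + ux * np.sin(phi)) / denom + uy * cos_t,
        -sin_t * np.cos(phi) * denom + uz * cos_t])

def run_photon():
    pos, d, w = np.zeros(3), np.array([0.0, 0.0, 1.0]), 1.0   # launched into +z
    while w > 1e-4:
        pos = pos + (-np.log(rng.random()) / mu_t) * d        # free path length
        if pos[2] < 0.0:
            return "backscattered", pos[:2], w                # left the tissue
        w *= mu_s / mu_t                                      # deposit absorbed part
        d = scatter(d, hg_cos_theta(g), 2.0 * np.pi * rng.random())
    return "absorbed", pos[:2], w

detected = [run_photon() for _ in range(10_000)]
```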
Pugliese, Cara E.; Kenworthy, Lauren; Bal, Vanessa Hus; Wallace, Gregory L; Yerys, Benjamin E; Maddox, Brenna B.; White, Susan W.; Popal, Haroon; Armour, Anna Chelsea; Miller, Judith; Herrington, John D.; Schultz, Robert T.; Martin, Alex; Anthony, Laura Gutermuth
2015-01-01
Recent updates have been proposed to the Autism Diagnostic Observation Schedule-2 Module 4 diagnostic algorithm. This new algorithm, however, has not yet been validated in an independent sample without intellectual disability (ID). This multi-site study compared the original and revised algorithms in individuals with ASD without ID. The revised algorithm demonstrated increased sensitivity, but lower specificity in the overall sample. Estimates were highest for females, individuals with a verbal IQ below 85 or above 115, and ages 16 and older. Best practice diagnostic procedures should include the Module 4 in conjunction with other assessment tools. Balancing needs for sensitivity and specificity depending on the purpose of assessment (e.g., clinical vs. research) and demographic characteristics mentioned above will enhance its utility. PMID:26385796
Harman, David J; Ryder, Stephen D; James, Martin W; Jelpke, Matthew; Ottey, Dominic S; Wilkes, Emilie A; Card, Timothy R; Aithal, Guruprasad P; Guha, Indra Neil
2015-05-03
To assess the feasibility of a novel diagnostic algorithm targeting patients with risk factors for chronic liver disease in a community setting. Prospective cross-sectional study. Two primary care practices (adult patient population 10,479) in Nottingham, UK. Adult patients (aged 18 years or over) fulfilling one or more selected risk factors for developing chronic liver disease: (1) hazardous alcohol use, (2) type 2 diabetes or (3) persistently elevated alanine aminotransferase (ALT) liver function enzyme with negative serology. A serial biomarker algorithm, using a simple blood-based marker (aspartate aminotransferase:ALT ratio for hazardous alcohol users, BARD score for other risk groups) and subsequently liver stiffness measurement using transient elastography (TE). Diagnosis of clinically significant liver disease (defined as liver stiffness ≥8 kPa); definitive diagnosis of liver cirrhosis. We identified 920 patients with the defined risk factors of whom 504 patients agreed to undergo investigation. A normal blood biomarker was found in 62 patients (12.3%) who required no further investigation. Subsequently, 378 patients agreed to undergo TE, of whom 98 (26.8% of valid scans) had elevated liver stiffness. Importantly, 71/98 (72.4%) patients with elevated liver stiffness had normal liver enzymes and would be missed by traditional investigation algorithms. We identified 11 new patients with definite cirrhosis, representing a 140% increase in the number of diagnosed cases in this population. A non-invasive liver investigation algorithm based in a community setting is feasible to implement. Targeting risk factors using a non-invasive biomarker approach identified a substantial number of patients with previously undetected cirrhosis. The diagnostic algorithm utilised for this study can be found on clinicaltrials.gov (NCT02037867), and is part of a continuing longitudinal cohort study. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
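The two-step triage logic can be sketched as follows. The AST:ALT and BARD cut-offs shown are the commonly used ones and the 8 kPa stiffness threshold is taken from the abstract, but the study's exact rules may differ.

```python
# Sketch of the two-step community triage logic described above.  The AST:ALT
# cut-off of 0.8, the usual BARD components (BMI >= 28 -> 1 point,
# AST:ALT >= 0.8 -> 2 points, diabetes -> 1 point, score >= 2 positive) and the
# 8 kPa transient-elastography threshold are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Patient:
    hazardous_alcohol: bool
    diabetes: bool
    bmi: float
    ast: float
    alt: float
    liver_stiffness_kpa: Optional[float] = None   # step 2: transient elastography

def step1_biomarker_abnormal(p: Patient) -> bool:
    ratio = p.ast / p.alt
    if p.hazardous_alcohol:
        return ratio >= 0.8                       # assumed AST:ALT cut-off
    bard = (p.bmi >= 28) + 2 * (ratio >= 0.8) + p.diabetes
    return bard >= 2                              # usual BARD positivity threshold

def refer_to_hepatology(p: Patient) -> bool:
    if not step1_biomarker_abnormal(p):
        return False          # normal biomarker: no further investigation needed
    # step 2: liver stiffness >= 8 kPa taken as clinically significant disease
    return p.liver_stiffness_kpa is not None and p.liver_stiffness_kpa >= 8.0
```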
Image quality enhancement for skin cancer optical diagnostics
NASA Astrophysics Data System (ADS)
Bliznuks, Dmitrijs; Kuzmina, Ilona; Bolocko, Katrina; Lihachev, Alexey
2017-12-01
This research presents an analysis of image quality and proposals for image enhancement in the biophotonics area. The sources of image problems are reviewed and analyzed, and those with the greatest impact are examined in the context of a specific biophotonic task: skin cancer diagnostics. The results indicate that the main problem for skin cancer analysis is uneven skin illumination. Since illumination problems often cannot be prevented, the paper proposes a post-processing algorithm based on low-frequency filtering. Practical results show improved diagnostic results after applying the proposed filter, while the filter does not reduce diagnostic quality for images without illumination defects. The current filtering algorithm requires empirical tuning of its parameters; further work is needed to test the algorithm in other biophotonic applications and to enable automatic selection of the filter parameters.
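One simple realisation of such low-frequency correction is a flat-field style operation that estimates the illumination field with a wide Gaussian blur and divides it out; the kernel width is an assumed, empirically tuned parameter, echoing the paper's note that the filter currently needs manual tuning.

```python
# One possible low-frequency illumination correction of the kind discussed
# above: estimate the slowly varying illumination field with a wide Gaussian
# blur and divide it out (flat-field / homomorphic-style correction).  The
# kernel width is an assumed, empirically tuned parameter.
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_illumination(img: np.ndarray, sigma: float = 50.0) -> np.ndarray:
    img = img.astype(float)
    illumination = gaussian_filter(img, sigma=sigma)     # low-frequency field
    corrected = img / np.maximum(illumination, 1e-6)
    corrected *= img.mean() / corrected.mean()           # restore mean brightness
    return np.clip(corrected, 0, 255)
```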
Hus, Vanessa; Lord, Catherine
2014-01-01
The Autism Diagnostic Observation Schedule, 2nd Edition includes revised diagnostic algorithms and standardized severity scores for modules used to assess children and adolescents of varying language abilities. Comparable revisions have not yet been applied to the Module 4, used with verbally fluent adults. The current study revises the Module 4 algorithm and calibrates raw overall and domain totals to provide metrics of ASD symptom severity. Sensitivity and specificity of the revised Module 4 algorithm exceeded 80% in the overall sample. Module 4 calibrated severity scores provide quantitative estimates of ASD symptom severity that are relatively independent of participant characteristics. These efforts increase comparability of ADOS scores across modules and should facilitate efforts to increase understanding of adults with ASD. PMID:24590409
An ultra low power ECG signal processor design for cardiovascular disease detection.
Jain, Sanjeev Kumar; Bhaumik, Basabi
2015-08-01
This paper presents an ultra low power ASIC design based on a new cardiovascular disease diagnostic algorithm. This new algorithm based on forward search is designed for real time ECG signal processing. The algorithm is evaluated for Physionet PTB database from the point of view of cardiovascular disease diagnosis. The failed detection rate of QRS complex peak detection of our algorithm ranges from 0.07% to 0.26% for multi lead ECG signal. The ASIC is designed using 130-nm CMOS low leakage process technology. The area of ASIC is 1.21 mm(2). This ASIC consumes only 96 nW at an operating frequency of 1 kHz with a supply voltage of 0.9 V. Due to ultra low power consumption, our proposed ASIC design is most suitable for energy efficient wearable ECG monitoring devices.
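For context, a conventional software QRS detector (band-pass filtering, squaring, and refractory-limited peak picking) is sketched below; this is not the paper's forward-search algorithm, whose details are not given here, but it shows the kind of processing such an ASIC implements.

```python
# Conventional software QRS detector for comparison with the kind of processing
# an ECG ASIC implements.  NOT the paper's forward-search algorithm; band edges,
# threshold and refractory period are typical assumed values.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg: np.ndarray, fs: float):
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")  # QRS band
    energy = filtfilt(b, a, ecg) ** 2
    peaks, _ = find_peaks(energy,
                          distance=int(0.25 * fs),       # ~250 ms refractory period
                          height=0.3 * energy.max())     # crude amplitude threshold
    return peaks        # sample indices of detected QRS complexes
```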
Diagnosis and treatment of gastroesophageal reflux disease complicated by Barrett's esophagus.
Stasyshyn, Andriy
2017-08-31
The aim of the study was to evaluate the effectiveness of a diagnostic and therapeutic algorithm for gastroesophageal reflux disease (GERD) complicated by Barrett's esophagus in 46 patients. A diagnostic and therapeutic algorithm for complicated GERD was developed. The Los Angeles classification was used to describe the esophageal changes of reflux esophagitis. Intestinal metaplasia of the epithelium in the lower third of the esophagus was assessed using videoendoscopy, chromoscopy, and biopsy. Quality of life was assessed with the Gastro-Intestinal Quality of Life Index. The methods used were modeling, clinical, analytical, comparative, standardized, and questionnaire-based. Results and discussion: among the complications of GERD, Barrett's esophagus was diagnosed in 9 (19.6%) patients, peptic ulcer of the esophagus in 10 (21.7%), peptic stricture of the esophagus in 4 (8.7%), and esophageal-gastric bleeding in 23 (50.0%), including Mallory-Weiss syndrome in 18 and erosive ulcerous bleeding in 5 people. Hiatal hernia was diagnosed in 171 (87.7%) patients (sliding in 157 (91.8%), paraesophageal hernia in 2 (1.2%), and mixed hernia in 12 (7.0%) cases). One hundred ninety-five patients underwent laparoscopic surgery: Nissen fundoplication was performed in 176 (90.2%) patients, Toupet fundoplication in 14 (7.2%), and Dor fundoplication in 5 (2.6%). Argon coagulation and the use of PPIs for 8-12 weeks before surgery led to regeneration of the esophageal mucous membrane. The developed diagnostic and therapeutic algorithm promoted systematization and objectification of changes in complicated GERD, contributed to early diagnosis, helped in choosing treatment, and improved quality of life.
Mexican consensus on lysosomal acid lipase deficiency diagnosis.
Vázquez-Frias, R; García-Ortiz, J E; Valencia-Mayoral, P F; Castro-Narro, G E; Medina-Bravo, P G; Santillán-Hernández, Y; Flores-Calderón, J; Mehta, R; Arellano-Valdés, C A; Carbajal-Rodríguez, L; Navarrete-Martínez, J I; Urbán-Reyes, M L; Valadez-Reyes, M T; Zárate-Mondragón, F; Consuelo-Sánchez, A
Lysosomal acid lipase deficiency (LAL-D) causes progressive cholesteryl ester and triglyceride accumulation in the lysosomes of hepatocytes and monocyte-macrophage system cells, resulting in a systemic disease with various manifestations that may go unnoticed. It is indispensable to recognize the deficiency, which can present in patients at any age, so that specific treatment can be given. The aim of the present review was to offer a guide for physicians in understanding the fundamental diagnostic aspects of LAL-D, to successfully aid in its identification. The review was designed by a group of Mexican experts and is presented as an orienting algorithm for the pediatrician, internist, gastroenterologist, endocrinologist, geneticist, pathologist, radiologist, and other specialists that could come across this disease in their patients. An up-to-date review of the literature in relation to the clinical manifestations of LAL-D and its diagnosis was performed. The statements were formulated based on said review and were then voted upon. The structured quantitative method employed for reaching consensus was the nominal group technique. A practical algorithm of the diagnostic process in LAL-D patients was proposed, based on clinical and laboratory data indicative of the disease and in accordance with the consensus established for each recommendation. The algorithm provides a sequence of clinical actions from different studies for optimizing the diagnostic process of patients suspected of having LAL-D. Copyright © 2017 Asociación Mexicana de Gastroenterología. Publicado por Masson Doyma México S.A. All rights reserved.
ERIC Educational Resources Information Center
de Bildt, Annelies; Sytema, Sjoerd; Meffert, Harma; Bastiaansen, Jojanneke A. C. J.
2016-01-01
This study examined the discriminative ability of the revised Autism Diagnostic Observation Schedule module 4 algorithm (Hus and Lord in "J Autism Dev Disord" 44(8):1996-2012, 2014) in 93 Dutch males with Autism Spectrum Disorder (ASD), schizophrenia, psychopathy or controls. Discriminative ability of the revised algorithm ASD cut-off…
NASA Astrophysics Data System (ADS)
Shen, Fei; Chen, Chao; Yan, Ruqiang
2017-05-01
Classical bearing fault diagnosis methods, being designed for one specific task, focus on the effectiveness of the extracted features and the final diagnostic performance. However, most of these approaches become inefficient when multiple tasks exist, especially in a real-time diagnostic scenario. A fault diagnosis method based on Non-negative Matrix Factorization (NMF) and a co-clustering strategy is proposed to overcome this limitation. First, high-dimensional matrices are constructed from Short-Time Fourier Transform (STFT) features, where the dimension of each matrix equals the number of target tasks. Then, the NMF algorithm is applied to obtain different components along each dimension through optimized matching criteria such as Euclidean distance and divergence distance. Finally, a co-clustering technique based on information entropy is used to classify each component. To verify the effectiveness of the proposed approach, a series of bearing data sets were analysed. The tests indicated that, although single-task diagnostic performance is comparable to traditional clustering methods such as the K-means algorithm and the Gaussian Mixture Model, accuracy and computational efficiency in multi-task fault diagnosis are improved.
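A minimal sketch of this kind of pipeline on a single vibration record is shown below; K-means stands in for the entropy-based co-clustering step, and the STFT window and component count are assumptions rather than the paper's settings.

```python
# Minimal sketch of the feature pipeline described above: STFT magnitudes of a
# bearing vibration signal, non-negative matrix factorization into a small set
# of spectral components, then clustering of the activations.  K-means stands
# in for the paper's entropy-based co-clustering; window and component counts
# are assumed.
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans

def diagnose(vibration: np.ndarray, fs: float, n_components: int = 4):
    f, t, Z = stft(vibration, fs=fs, nperseg=1024)
    V = np.abs(Z)                                   # non-negative spectrogram
    nmf = NMF(n_components=n_components, init="nndsvd", max_iter=500)
    W = nmf.fit_transform(V)                        # spectral patterns
    H = nmf.components_                             # their time activations
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(H.T)
    return W, H, labels                             # labels: per-frame cluster
```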
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1996-01-01
NASA's advanced propulsion system, the Space Shuttle Main Engine/Advanced Technology Development (SSME/ATD), has been undergoing extensive flight certification and developmental testing, which involves large numbers of health monitoring measurements. To enhance engine safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess the engine's dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce the risk of catastrophic system failures, expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. During the development of the SSME, ASRI participated in the research and development of several advanced non-linear signal diagnostic methods for health monitoring and failure prediction in turbomachinery components. However, due to the intensive computational requirements associated with such advanced analysis tasks, current SSME dynamic data analysis and diagnostic evaluation is performed off-line following flight or ground test, with a typical diagnostic turnaround time of one to two days. The objective of MSFC's MPP Prototype System is to eliminate such 'diagnostic lag time' by achieving signal processing and analysis in real time. Such an on-line diagnostic system can provide sufficient lead time to initiate corrective action and also enables efficient scheduling of inspection, maintenance and repair activities. The major objective of this project was to convert and implement a number of advanced nonlinear diagnostic DSP algorithms in a format consistent with that required for integration into the Vanderbilt Multigraph Architecture (MGA) Model Based Programming environment. This effort allows the real-time execution of these algorithms using the MSFC MPP Prototype System. ASRI has completed the software conversion and integration of a sequence of nonlinear signal analysis techniques specified in the SOW for real-time execution on MSFC's MPP Prototype. This report documents and summarizes the results of the contract tasks and provides the complete computer source code, including all FORTRAN/C utilities and all other supporting software libraries that are required for operation.
Strategies for adding adaptive learning mechanisms to rule-based diagnostic expert systems
NASA Technical Reports Server (NTRS)
Stclair, D. C.; Sabharwal, C. L.; Bond, W. E.; Hacke, Keith
1988-01-01
Rule-based diagnostic expert systems can be used to perform many of the diagnostic chores necessary in today's complex space systems. These expert systems typically take a set of symptoms as input and produce diagnostic advice as output. The primary objective of such expert systems is to provide accurate and comprehensive advice which can be used to help return the space system in question to nominal operation. The development and maintenance of diagnostic expert systems is time and labor intensive since the services of both knowledge engineer(s) and domain expert(s) are required. The use of adaptive learning mechanisms to incrementally evaluate and refine rules promises to reduce both the time and labor costs associated with such systems. This paper describes the basic adaptive learning mechanisms of strengthening, weakening, generalization, discrimination, and discovery. Next, basic strategies are discussed for adding these learning mechanisms to rule-based diagnostic expert systems. These strategies support the incremental evaluation and refinement of rules in the knowledge base by comparing the set of advice given by the expert system (A) with the correct diagnosis (C). Techniques are described for selecting those rules in the knowledge base which should participate in adaptive learning. The strategies presented may be used with a wide variety of learning algorithms. Further, these strategies are applicable to a large number of rule-based diagnostic expert systems. They may be used to provide either immediate or deferred updating of the knowledge base.
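A toy illustration of the strengthening/weakening update follows; the rule format, symptom names and learning constants are assumptions chosen for clarity, not content from the paper.

```python
# Toy illustration of strengthening/weakening as described above: each rule
# carries a numeric strength that is raised when the rule contributed to advice
# matching the confirmed diagnosis (C) and lowered otherwise.  Rule format,
# symptom names and update constants are illustrative assumptions.
RULES = [
    {"if": {"low_pressure", "high_temp"}, "then": "pump_cavitation", "strength": 0.5},
    {"if": {"low_pressure"},              "then": "sensor_fault",    "strength": 0.5},
]

def advise(symptoms):
    fired = [r for r in RULES if r["if"] <= symptoms]
    return fired, {r["then"] for r in fired}          # A: the advice set

def learn(symptoms, correct_diagnosis, delta=0.1):
    fired, advice = advise(symptoms)
    for r in fired:
        if r["then"] == correct_diagnosis:
            r["strength"] = min(1.0, r["strength"] + delta)   # strengthen
        else:
            r["strength"] = max(0.0, r["strength"] - delta)   # weaken
    # Rules whose strength decays toward 0 become candidates for discrimination
    # (adding conditions) or removal; a missed diagnosis would trigger discovery.
    return advice

learn({"low_pressure", "high_temp"}, "pump_cavitation")
```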
Automated System for Early Breast Cancer Detection in Mammograms
NASA Technical Reports Server (NTRS)
Bankman, Isaac N.; Kim, Dong W.; Christens-Barry, William A.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.
1993-01-01
The increasing demand on mammographic screening for early breast cancer detection, and the subtlety of early breast cancer signs on mammograms, suggest an automated image processing system that can serve as a diagnostic aid in radiology clinics. We present a fully automated algorithm for detecting clusters of microcalcifications that are the most common signs of early, potentially curable breast cancer. By using the contour map of the mammogram, the algorithm circumvents some of the difficulties encountered with standard image processing methods. The clinical implementation of an automated instrument based on this algorithm is also discussed.
Potente, Giuseppe; Messineo, Daniela; Maggi, Claudia; Savelli, Sara
2009-03-01
The purpose of this article is to report our practical utilization of dynamic contrast-enhanced magnetic resonance mammography (DCE-MRM) in the diagnosis of breast lesions. Many European centers prefer a high-temporal-resolution acquisition of both breasts simultaneously in a large FOV. We preferred to scan single breasts, with the aim of combining the analysis of contrast uptake and washout with the morphological evaluation of breast lesions. We followed an interpretation model, based upon a diagnostic algorithm, which combined contrast enhancement with morphological evaluation in order to increase our confidence in diagnosis. DCE-MRM with our diagnostic algorithm identified 179 malignant and 41 benign lesions; the final outcome identified 178 malignant and 42 benign lesions, with 3 false positives and 2 false negatives. Sensitivity of DCE-MRM was 98.3%; specificity, 95.1%; positive predictive value, 98.9%; negative predictive value, 92.8%; and accuracy, 97.7%.
Sharma, Manuj; Petersen, Irene; Nazareth, Irwin; Coton, Sonia J
2016-01-01
Background Research into diabetes mellitus (DM) often requires a reproducible method for identifying and distinguishing individuals with type 1 DM (T1DM) and type 2 DM (T2DM). Objectives To develop a method to identify individuals with T1DM and T2DM using UK primary care electronic health records. Methods Using data from The Health Improvement Network primary care database, we developed a two-step algorithm. The first algorithm step identified individuals with potential T1DM or T2DM based on diagnostic records, treatment, and clinical test results. We excluded individuals with records for rarer DM subtypes only. For individuals to be considered diabetic, they needed to have at least two records indicative of DM; one of which was required to be a diagnostic record. We then classified individuals with T1DM and T2DM using the second algorithm step. A combination of diagnostic codes, medication prescribed, age at diagnosis, and whether the case was incident or prevalent were used in this process. We internally validated this classification algorithm through comparison against an independent clinical examination of The Health Improvement Network electronic health records for a random sample of 500 DM individuals. Results Out of 9,161,866 individuals aged 0–99 years from 2000 to 2014, we classified 37,693 individuals with T1DM and 418,433 with T2DM, while 1,792 individuals remained unclassified. A small proportion were classified with some uncertainty (1,155 [3.1%] of all individuals with T1DM and 6,139 [1.5%] with T2DM) due to unclear health records. During validation, manual assignment of DM type based on clinical assessment of the entire electronic record and algorithmic assignment led to equivalent classification in all instances. Conclusion The majority of individuals with T1DM and T2DM can be readily identified from UK primary care electronic health records. Our approach can be adapted for use in other health care settings. PMID:27785102
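The two-step logic can be sketched roughly as below. The concrete cut-offs (age at diagnosis, insulin-only prescribing) are illustrative assumptions and not the published rule set, which relies on detailed diagnostic codes, prescribing history, and incident/prevalent status.

```python
# Schematic version of the two-step logic described above.  The age and
# prescribing cut-offs are illustrative assumptions; the published algorithm
# uses diagnostic codes, medication history and incident/prevalent status in
# more detail than shown here.
from typing import Optional

def step1_has_diabetes(records) -> bool:
    """records: list of dicts with 'type' in {'diagnosis', 'treatment', 'test'}."""
    dm_records = [r for r in records if r.get("dm_related")]
    has_diagnostic = any(r["type"] == "diagnosis" for r in dm_records)
    return len(dm_records) >= 2 and has_diagnostic     # >= 2 records, one diagnostic

def step2_classify(age_at_diagnosis: float,
                   on_insulin_only: bool,
                   t1dm_code: bool,
                   t2dm_code: bool) -> Optional[str]:
    if t1dm_code and not t2dm_code:
        return "T1DM"
    if t2dm_code and not t1dm_code:
        return "T2DM"
    # codes absent or conflicting: fall back on treatment and age at diagnosis
    if on_insulin_only and age_at_diagnosis < 35:      # assumed illustrative cut-off
        return "T1DM"
    if not on_insulin_only:
        return "T2DM"
    return None          # unclassified, as for 1,792 individuals in the study
```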
Electronics and Algorithms for HOM Based Beam Diagnostics
NASA Astrophysics Data System (ADS)
Frisch, Josef; Baboi, Nicoleta; Eddy, Nathan; Nagaitsev, Sergei; Hensler, Olaf; McCormick, Douglas; May, Justin; Molloy, Stephen; Napoly, Olivier; Paparella, Rita; Petrosyan, Lyudvig; Ross, Marc; Simon, Claire; Smith, Tonee
2006-11-01
The signals from the Higher Order Mode (HOM) ports on superconducting cavities can be used as beam position monitors and to survey structure alignment. A HOM-based diagnostic system has been installed to instrument both couplers on each of the 40 cryogenic accelerating structures in the DESY TTF2 Linac. The electronics uses a single-stage down-conversion from the 1.7 GHz HOM spectral line to a 20 MHz IF, which is then digitized. The electronics is based on low-cost surface-mount components suitable for large-scale production. The analysis of the HOM data is based on Singular Value Decomposition. The response of the HOM modes is calibrated against conventional BPMs.
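The SVD analysis step might look roughly like the sketch below, in which each digitized HOM waveform forms one row of a matrix, the leading singular vectors serve as mode patterns, and their per-pulse amplitudes are regressed against a conventional BPM. Array shapes, the number of retained modes, and the linear calibration are assumptions.

```python
# Sketch of an SVD-based HOM-BPM calibration of the kind described above.
# X holds one digitized HOM waveform per beam pulse; the leading singular
# vectors capture the dominant (dipole-mode) patterns, and their amplitudes
# are fitted to a conventional BPM reading.  Shapes and mode count are assumed.
import numpy as np

def calibrate_hom_bpm(X: np.ndarray, bpm_x: np.ndarray, n_modes: int = 6):
    """X: (n_pulses, n_samples) waveforms; bpm_x: (n_pulses,) reference positions."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = Vt[:n_modes]                               # dominant waveform patterns
    A = (X - mean) @ modes.T                           # per-pulse mode amplitudes
    design = np.column_stack([A, np.ones(len(A))])
    coeffs, *_ = np.linalg.lstsq(design, bpm_x, rcond=None)
    return mean, modes, coeffs

def hom_position(waveform, mean, modes, coeffs):
    amps = modes @ (waveform - mean)
    return np.append(amps, 1.0) @ coeffs               # calibrated beam position
```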
[Diagnostic algorithm in chronic myeloproliferative diseases (CMPD)].
Haferlach, Torsten; Bacher, Ulrike; Kern, Wolfgang; Schnittger, Susanne; Haferlach, Claudia
2007-09-15
The Philadelphia-negative chronic myeloproliferative diseases (CMPD) are very complex and heterogeneous disorders. They are represented by polycythemia vera (PV), chronic idiopathic myelofibrosis (CIMF), essential thrombocythemia (ET), CMPD/unclassifiable (CMPD-U), chronic neutrophilic leukemia (CNL), and chronic eosinophilic leukemia/hypereosinophilic syndrome (CEL/HES) according to the WHO classification. Before, diagnostics were mainly focused on clinical and morphological aspects, but in recent years cytogenetics and fluorescence in situ hybridization (FISH) found entrance in routine schedules as chromosomal abnormalities are relevant for prognosis and classification. Recently, there is rapid progress in the field of molecular characterization: the JAK2V617F mutation which shows a high incidence in PV, CIMF, and ET already plays a central role and will probably soon be included in follow-up procedures. Due to the detection of mutations in exon 12 of the JAK2 gene or mutations in the MPL gene the variety of activating mutations in the CMPD is still increasing. In CEL/HES the detection of the FIP1L1-PDGFRA fusion gene and overexpression of PDGFRA and PDGFRB led to targeted therapy with tyrosine kinase inhibitors. Thus, diagnostics in the CMPD transform toward a multimodal diagnostic concept based on a combination of methods - cyto-/histomorphology, cytogenetics, and individual molecular methods which can be included in a diagnostic algorithm.
Cloud computing-based TagSNP selection algorithm for human genome data.
Hung, Che-Lun; Chen, Wen-Pei; Hua, Guan-Jie; Zheng, Huiru; Tsai, Suh-Jen Jane; Lin, Yaw-Ling
2015-01-05
Single nucleotide polymorphisms (SNPs) play a fundamental role in human genetic variation and are used in medical diagnostics, phylogeny construction, and drug design. They provide the highest-resolution genetic fingerprint for identifying disease associations and human features. Haplotypes are regions of linked genetic variants that are closely spaced on the genome and tend to be inherited together. Genetics research has revealed SNPs within certain haplotype blocks that introduce few distinct common haplotypes into most of the population. Haplotype block structures are used in association-based methods to map disease genes. In this paper, we propose an efficient algorithm for identifying haplotype blocks in the genome. In chromosomal haplotype data retrieved from the HapMap project website, the proposed algorithm identified longer haplotype blocks than an existing algorithm. To enhance its performance, we extended the proposed algorithm into a parallel algorithm that copies data in parallel via the Hadoop MapReduce framework. The proposed MapReduce-paralleled combinatorial algorithm performed well on real-world data obtained from the HapMap dataset; the improvement in computational efficiency was proportional to the number of processors used.
A real-time spectral mapper as an emerging diagnostic technology in biomedical sciences.
Epitropou, George; Kavvadias, Vassilis; Iliou, Dimitris; Stathopoulos, Efstathios; Balas, Costas
2013-01-01
Real time spectral imaging and mapping at video rates can have tremendous impact not only on diagnostic sciences but also on fundamental physiological problems. We report the first real-time spectral mapper based on the combination of snap-shot spectral imaging and spectral estimation algorithms. Performance evaluation revealed that six band imaging combined with the Wiener algorithm provided high estimation accuracy, with error levels lying within the experimental noise. High accuracy is accompanied with much faster, by 3 orders of magnitude, spectral mapping, as compared with scanning spectral systems. This new technology is intended to enable spectral mapping at nearly video rates in all kinds of dynamic bio-optical effects as well as in applications where the target-probe relative position is randomly and fast changing.
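A compact version of the Wiener estimation step named above is sketched below: a reconstruction matrix is learned from training reflectance spectra and the corresponding multiband responses, then applied per pixel. The number of bands, wavelength grid, and regularisation constant are assumptions.

```python
# Compact Wiener spectral-estimation sketch: learn a reconstruction matrix from
# training reflectance spectra and the matching multiband (e.g. six-band)
# camera responses, then estimate a full spectrum per pixel in one matrix
# multiply.  Band count, wavelength grid and noise regularisation are assumed.
import numpy as np

def wiener_matrix(train_spectra: np.ndarray,    # (n_samples, n_wavelengths)
                  train_responses: np.ndarray,  # (n_samples, n_bands)
                  noise: float = 1e-3) -> np.ndarray:
    Rc = train_spectra.T @ train_responses                  # cross-correlation
    Cc = train_responses.T @ train_responses                # band auto-correlation
    return Rc @ np.linalg.inv(Cc + noise * np.eye(Cc.shape[0]))

def estimate_spectra(W: np.ndarray, responses: np.ndarray) -> np.ndarray:
    """responses: (n_pixels, n_bands) snapshot data -> (n_pixels, n_wavelengths)."""
    return responses @ W.T
```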
Ehteshami Bejnordi, Babak; Mullooly, Maeve; Pfeiffer, Ruth M; Fan, Shaoqi; Vacek, Pamela M; Weaver, Donald L; Herschorn, Sally; Brinton, Louise A; van Ginneken, Bram; Karssemeijer, Nico; Beck, Andrew H; Gierach, Gretchen L; van der Laak, Jeroen A W M; Sherman, Mark E
2018-06-13
The breast stromal microenvironment is a pivotal factor in breast cancer development, growth and metastases. Although pathologists often detect morphologic changes in stroma by light microscopy, visual classification of such changes is subjective and non-quantitative, limiting its diagnostic utility. To gain insights into stromal changes associated with breast cancer, we applied automated machine learning techniques to digital images of 2387 hematoxylin and eosin stained tissue sections of benign and malignant image-guided breast biopsies performed to investigate mammographic abnormalities among 882 patients, ages 40-65 years, that were enrolled in the Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project. Using deep convolutional neural networks, we trained an algorithm to discriminate between stroma surrounding invasive cancer and stroma from benign biopsies. In test sets (928 whole-slide images from 330 patients), this algorithm could distinguish biopsies diagnosed as invasive cancer from benign biopsies solely based on the stromal characteristics (area under the receiver operator characteristics curve = 0.962). Furthermore, without being trained specifically using ductal carcinoma in situ as an outcome, the algorithm detected tumor-associated stroma in greater amounts and at larger distances from grade 3 versus grade 1 ductal carcinoma in situ. Collectively, these results suggest that algorithms based on deep convolutional neural networks that evaluate only stroma may prove useful to classify breast biopsies and aid in understanding and evaluating the biology of breast lesions.
Xiao, Li-Hong; Chen, Pei-Ran; Gou, Zhong-Ping; Li, Yong-Zhong; Li, Mei; Xiang, Liang-Cheng; Feng, Ping
2017-01-01
The aim of this study is to evaluate the ability of the random forest algorithm that combines data on transrectal ultrasound findings, age, and serum levels of prostate-specific antigen to predict prostate carcinoma. Clinico-demographic data were analyzed for 941 patients with prostate diseases treated at our hospital, including age, serum prostate-specific antigen levels, transrectal ultrasound findings, and pathology diagnosis based on ultrasound-guided needle biopsy of the prostate. These data were compared between patients with and without prostate cancer using the Chi-square test, and then entered into the random forest model to predict diagnosis. Patients with and without prostate cancer differed significantly in age and serum prostate-specific antigen levels (P < 0.001), as well as in all transrectal ultrasound characteristics (P < 0.05) except uneven echo (P = 0.609). The random forest model based on age, prostate-specific antigen and ultrasound predicted prostate cancer with an accuracy of 83.10%, sensitivity of 65.64%, and specificity of 93.83%. Positive predictive value was 86.72%, and negative predictive value was 81.64%. By integrating age, prostate-specific antigen levels and transrectal ultrasound findings, the random forest algorithm shows better diagnostic performance for prostate cancer than either diagnostic indicator on its own. This algorithm may help improve diagnosis of the disease by identifying patients at high risk for biopsy.
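A minimal sketch of this kind of model is shown below, assuming synthetic clinico-demographic data in place of the study's 941-patient dataset; the feature encoding and effect sizes are invented for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 941
age = rng.normal(68, 9, n)                    # years
psa = rng.lognormal(1.5, 1.0, n)              # ng/ml (synthetic)
us = rng.integers(0, 2, size=(n, 5))          # 5 binary ultrasound findings (assumed)
risk = 0.04 * (age - 60) + 0.3 * np.log(psa + 1) + 0.4 * us.sum(1) - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-risk))   # synthetic cancer labels

X = np.column_stack([age, psa, us])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("accuracy   :", (tp + tn) / (tp + tn + fp + fn))
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("PPV        :", tp / (tp + fp), " NPV:", tn / (tn + fn))
```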
Medial elbow injury in young throwing athletes
Gregory, Bonnie; Nyland, John
2013-01-01
This report reviews the anatomy, overhead throwing biomechanics, injury mechanisms and incidence, physical examination and diagnosis, diagnostic imaging, and conservative treatment of medial elbow injuries in young throwing athletes. Based on this information, a clinical management decision-making algorithm is presented. PMID:23888291
Sethi, Gaurav; Saini, B S
2015-12-01
This paper presents an abdomen disease diagnostic system based on the flexi-scale curvelet transform, which uses different optimal scales for extracting features from computed tomography (CT) images. To optimize the scale of the flexi-scale curvelet transform, we propose an improved genetic algorithm. The conventional genetic algorithm assumes that fit parents are most likely to produce the healthiest offspring; as a result, the least fit parents accumulate at the bottom of the population, reducing the fitness of subsequent populations and delaying the search for the optimal solution. In our improved genetic algorithm, combining the chromosomes of a low-fitness and a high-fitness individual increases the probability of producing high-fitness offspring. Accordingly, each of the least fit parent chromosomes is combined with a highly fit parent to produce offspring for the next population. In this way, the leftover weak chromosomes cannot degrade the fitness of subsequent populations. To further facilitate the search for the optimal solution, our improved genetic algorithm adopts modified elitism. The proposed method was applied to 120 CT abdominal images: 30 images each of normal subjects, cysts, tumors and stones. The features extracted by the flexi-scale curvelet transform were more discriminative than those from conventional methods, demonstrating the potential of our method as a diagnostic tool for abdomen diseases.
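The pairing scheme described above can be sketched as follows; the toy fitness function stands in for the curvelet-scale classification objective, and the elitism, crossover, and mutation settings are illustrative assumptions rather than the authors' exact parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(pop):
    # Toy objective standing in for classification quality of candidate scales.
    return -np.sum((pop - 0.7) ** 2, axis=1)

def next_generation(pop, elite=2, mut_rate=0.05):
    order = np.argsort(fitness(pop))[::-1]          # best first
    pop = pop[order]
    n = len(pop)
    children = [pop[:elite].copy()]                 # modified elitism: keep the best
    # Pair weakest with fittest: rank i mates with rank n-1-i, so every
    # low-fitness chromosome is crossed with a high-fitness one.
    for i in range((n - elite) // 2):
        a, b = pop[i], pop[n - 1 - i]
        mask = rng.random(pop.shape[1]) < 0.5       # uniform crossover
        children.append(np.stack([np.where(mask, a, b), np.where(mask, b, a)]))
    new = np.vstack(children)[:n]
    mut = rng.random(new.shape) < mut_rate          # random-reset mutation
    mut[:elite] = False                             # do not disturb the elites
    new[mut] = rng.random(mut.sum())
    return new

pop = rng.random((20, 4))                           # 4 genes, e.g. candidate scales
for _ in range(100):
    pop = next_generation(pop)
print("best fitness:", fitness(pop).max())
```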
NASA Astrophysics Data System (ADS)
Wu, Yu-Jie; Lin, Guan-Wei
2017-04-01
Since 1999, Taiwan has experienced a rapid rise in the number of landslides, which reached a peak after the 2009 Typhoon Morakot. Although it has been shown that ground-motion signals induced by slope processes can be recorded by seismographs, they are difficult to distinguish from continuous seismic records because they lack distinct P and S waves. In this study, we combine three common seismic detectors: the short-term average/long-term average (STA/LTA) approach and two diagnostic functions based on the moving average and the scintillation index. Based on these detectors, we have established an auto-detection algorithm for landslide-quakes, with detection thresholds defined to distinguish landslide-quakes from earthquakes and background noise. To further refine the proposed detection algorithm, we apply it to seismic archives recorded by the Broadband Array in Taiwan for Seismology (BATS) during the 2009 Typhoon Morakot and locate the discrete landslide-quakes detected by the automatic algorithm. The results show that the automatically detected landslide-quakes are consistent with those identified by visual inspection, so the algorithm can be used to monitor landslide-quakes automatically.
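A minimal STA/LTA detector of the kind combined in the algorithm above can be sketched in a few lines of NumPy; the synthetic emergent signal, window lengths, and trigger threshold are assumptions for illustration, and the moving-average and scintillation-index diagnostic functions would be computed and thresholded in a similar windowed fashion.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Ratio of short-term to long-term moving averages of the squared signal."""
    e = x.astype(float) ** 2
    sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same") + 1e-12
    return sta / lta

fs = 100.0                                        # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
sig = rng.normal(scale=1.0, size=int(600 * fs))   # 10 minutes of background noise
t = np.arange(int(20 * fs)) / fs
# Synthetic emergent, low-frequency transient standing in for a landslide-quake
# (no sharp P/S onsets).
sig[int(300 * fs):int(320 * fs)] += 4.0 * np.sin(2 * np.pi * 1.5 * t) * np.hanning(t.size)

ratio = sta_lta(sig, n_sta=int(2 * fs), n_lta=int(60 * fs))
triggers = np.where(ratio > 3.0)[0]               # assumed detection threshold
if triggers.size:
    print("first trigger at t = %.1f s" % (triggers[0] / fs))
else:
    print("no trigger")
```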
Koa-Wing, Michael; Nakagawa, Hiroshi; Luther, Vishal; Jamil-Copley, Shahnaz; Linton, Nick; Sandler, Belinda; Qureshi, Norman; Peters, Nicholas S; Davies, D Wyn; Francis, Darrel P; Jackman, Warren; Kanagaratnam, Prapa
2015-11-15
Ripple Mapping (RM) is designed to overcome the limitations of existing isochronal 3D mapping systems by representing the intracardiac electrogram as a dynamic bar on a surface bipolar voltage map that changes in height according to the electrogram voltage-time relationship, relative to a fiduciary point. We tested the hypothesis that standard approaches to atrial tachycardia CARTO™ activation maps were inadequate for RM creation and interpretation. From the results, we aimed to develop an algorithm to optimize RMs for future prospective testing on a clinical RM platform. CARTO-XP™ activation maps from atrial tachycardia ablations were reviewed by two blinded assessors on an off-line RM workstation. Ripple Maps were graded according to a diagnostic confidence scale (Grade I - high confidence with clear pattern of activation through to Grade IV - non-diagnostic). The RM-based diagnoses were corroborated against the clinical diagnoses. 43 RMs from 14 patients were classified as Grade I (5 [11.5%]); Grade II (17 [39.5%]); Grade III (9 [21%]) and Grade IV (12 [28%]). Causes of low gradings/errors included the following: insufficient chamber point density; window-of-interest<100% of cycle length (CL); <95% tachycardia CL mapped; variability of CL and/or unstable fiducial reference marker; and suboptimal bar height and scar settings. A data collection and map interpretation algorithm has been developed to optimize Ripple Maps in atrial tachycardias. This algorithm requires prospective testing on a real-time clinical platform. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Song, Xiaoying; Huang, Qijun; Chang, Sheng; He, Jin; Wang, Hao
2016-12-01
To address the low compression efficiency of lossless compression and the low image quality of general near-lossless compression, this paper proposes a novel near-lossless compression algorithm based on adaptive spatial prediction for medical sequence images intended for diagnostic use. The proposed method employs adaptive block-size spatial prediction to predict blocks directly in the spatial domain, and applies a Lossless Hadamard Transform before quantization to improve the quality of the reconstructed images. The block-based prediction breaks the pixel-neighborhood constraint and takes full advantage of the local spatial correlations found in medical images. The adaptive block size guarantees a more rational division of images and improved use of local structure. The results indicate that the proposed algorithm can efficiently compress medical images and produces a better peak signal-to-noise ratio (PSNR) under the same pre-defined distortion than other near-lossless methods.
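The interplay of block-wise spatial prediction, a Hadamard transform, and quantization can be sketched as below. The fixed 8x8 block size, the DC-only causal prediction, and the quantization step are simplifying assumptions; the paper's adaptive block sizing and prediction rules are more elaborate.

```python
import numpy as np
from scipy.linalg import hadamard

B = 8                                 # block size (fixed here; adaptive in the paper)
H = hadamard(B)                       # +/-1 matrix; H @ H = B * I
Q = 8                                 # quantization step (controls the PSNR/rate trade-off)

def encode_block(block, pred):
    resid = block - pred                          # spatial prediction residual
    coeff = H @ resid @ H / B                     # 2-D Hadamard transform
    return np.round(coeff / Q)                    # near-lossless quantization

def decode_block(qcoeff, pred):
    coeff = qcoeff * Q
    resid = H @ coeff @ H / B                     # inverse transform (H is symmetric)
    return resid + pred

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)   # synthetic stand-in image
recon = np.zeros_like(img)
for i in range(0, 64, B):
    for j in range(0, 64, B):
        # Predict the block from the causal neighborhood already decoded
        # (mean of left and top reconstructed blocks; DC prediction only).
        preds = []
        if j >= B: preds.append(recon[i:i+B, j-B:j])
        if i >= B: preds.append(recon[i-B:i, j:j+B])
        pred = np.mean([p.mean() for p in preds]) if preds else 128.0
        recon[i:i+B, j:j+B] = decode_block(encode_block(img[i:i+B, j:j+B], pred), pred)

mse = np.mean((img - recon) ** 2)
print("PSNR = %.2f dB" % (10 * np.log10(255 ** 2 / mse)))
```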
The Malaria System MicroApp: A New, Mobile Device-Based Tool for Malaria Diagnosis.
Oliveira, Allisson Dantas; Prats, Clara; Espasa, Mateu; Zarzuela Serrat, Francesc; Montañola Sales, Cristina; Silgado, Aroa; Codina, Daniel Lopez; Arruda, Mercia Eliane; I Prat, Jordi Gomez; Albuquerque, Jones
2017-04-25
Malaria is a public health problem that affects remote areas worldwide. Climate change has contributed to the problem by allowing the survival of Anopheles in previously uninhabited areas. As such, several groups have made developing new systems for the automated diagnosis of malaria a priority. The objective of this study was to develop a new, automated, mobile device-based diagnostic system for malaria. The system uses Giemsa-stained peripheral blood samples combined with light microscopy to identify the Plasmodium falciparum species in the ring stage of development. The system uses image processing and artificial intelligence techniques, as well as a known face detection algorithm, to identify Plasmodium parasites. The algorithm is based on the integral-image and Haar-like feature concepts, and makes use of weak classifiers with adaptive boosting (AdaBoost) learning. The search scope of the learning algorithm is reduced in the preprocessing step by removing the background around blood cells. As a proof-of-concept experiment, the tool was used on 555 malaria-positive and 777 malaria-negative previously made slides. The accuracy of the system was, on average, 91%, meaning that for every 100 parasite-infected samples, 91 were identified correctly. Accessibility barriers in low-resource countries can be addressed with low-cost diagnostic tools. Our system, developed for mobile devices (mobile phones and tablets), addresses this by enabling access to health centers in remote communities and, importantly, by not depending on extensive malaria expertise or expensive diagnostic detection equipment. ©Allisson Dantas Oliveira, Clara Prats, Mateu Espasa, Francesc Zarzuela Serrat, Cristina Montañola Sales, Aroa Silgado, Daniel Lopez Codina, Mercia Eliane Arruda, Jordi Gomez i Prat, Jones Albuquerque. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 25.04.2017.
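Detection with a boosted cascade of Haar-like features is the same machinery exposed by OpenCV's CascadeClassifier, so the inference side might look like the sketch below. The bundled face cascade is used only as a runnable stand-in for a cascade trained on parasite patches, the image is synthetic, and the Otsu-based background removal is merely a stand-in for the paper's preprocessing.

```python
import cv2
import numpy as np

# Stand-in cascade: in practice a cascade trained on Giemsa-stained ring-stage
# parasite patches (e.g. via opencv_traincascade) would replace the face model.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Synthetic stand-in for a thin-smear microscope field (random bright texture).
rng = np.random.default_rng(0)
field = rng.integers(180, 255, size=(480, 640), dtype=np.uint8)

# Rough background removal to shrink the search scope, mirroring the paper's
# preprocessing step: keep only pixels darker than the bright slide background.
_, mask = cv2.threshold(field, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
masked = cv2.bitwise_and(field, field, mask=mask)

# Multi-scale sliding-window detection with the boosted Haar cascade.
hits = cascade.detectMultiScale(masked, scaleFactor=1.1, minNeighbors=5, minSize=(24, 24))
print("candidate detections:", len(hits))
```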
Zou, Yi-Bo; Chen, Yi-Min; Gao, Ming-Ke; Liu, Quan; Jiang, Si-Yu; Lu, Jia-Hui; Huang, Chen; Li, Ze-Yu; Zhang, Dian-Hua
2017-08-01
Coronary heart disease preoperative diagnosis plays an important role in the treatment of vascular interventional surgery. In practice, most doctors locate the vascular stenosis and then empirically estimate its severity from selective coronary angiography images, rather than interacting with a mouse, keyboard, and computer during preoperative diagnosis. This diagnostic approach lacks intuitive, natural interaction, and its results are not accurate enough. To address these problems, a coronary heart disease preoperative gesture-interactive diagnostic system based on augmented reality is proposed. The system uses a Leap Motion Controller to capture hand gesture video sequences and extracts features consisting of the position and orientation vectors of the gesture motion trajectory and changes in hand shape. The training clusters are determined by the K-means algorithm, and the effect of gesture training is improved by using multiple features and multiple observation sequences. The reusability of gestures is improved by establishing a state transition model. Algorithm efficiency is improved by gesture pre-judgment, which applies threshold discrimination before recognition. The integrity of the trajectory is preserved and the gesture motion space is extended by employing a space rotation transformation of the gesture manipulation plane. Ultimately, gesture recognition based on SRT-HMM is realized. The diagnosis and measurement of vascular stenosis are realized intuitively and naturally by operating and measuring the coronary artery model with augmented reality and gesture interaction techniques. The gesture recognition experiments demonstrate the discriminative and generalization ability of the algorithm, and the gesture interaction experiments demonstrate the usability and reliability of the system.
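A rough stand-in for the recognition stage is sketched below using hmmlearn's GaussianHMM: one model per gesture trained on feature sequences, classification by maximum log-likelihood, and a simple threshold "pre-judgment" to reject non-gestures. The SRT-HMM details, the feature definitions, and all numeric settings here are assumptions.

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

def fake_sequences(offset, n_seq=20, length=40, dim=6):
    """Synthetic stand-in for (position, orientation, hand-shape) feature sequences."""
    return [offset + 0.1 * np.cumsum(rng.normal(size=(length, dim)), axis=0)
            for _ in range(n_seq)]

train = {"rotate": fake_sequences(0.0), "zoom": fake_sequences(2.0)}
models = {}
for name, seqs in train.items():
    X, lengths = np.vstack(seqs), [len(s) for s in seqs]
    m = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
    m.fit(X, lengths)                       # one HMM per gesture class
    models[name] = m

def classify(seq, reject_threshold=-1e4):
    scores = {name: m.score(seq) for name, m in models.items()}
    best = max(scores, key=scores.get)
    # Threshold-based pre-judgment: reject obvious non-gestures before recognition
    # (the threshold value here is purely illustrative).
    return best if scores[best] > reject_threshold else "none"

print(classify(fake_sequences(2.0, n_seq=1)[0]))   # expected: "zoom"
```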
Chan, Pak-Hei; Wong, Chun-Ka; Poh, Yukkee C; Pun, Louise; Leung, Wangie Wan-Chiu; Wong, Yu-Fai; Wong, Michelle Man-Ying; Poh, Ming-Zher; Chu, Daniel Wai-Sing; Siu, Chung-Wah
2016-07-21
Diagnosing atrial fibrillation (AF) before ischemic stroke occurs is a priority for stroke prevention in AF. Smartphone camera-based photoplethysmographic (PPG) pulse waveform measurement discriminates between different heart rhythms, but its ability to diagnose AF in real-world situations has not been adequately investigated. We sought to assess the diagnostic performance of a standalone smartphone PPG application, Cardiio Rhythm, for AF screening in a primary care setting. Patients with hypertension, with diabetes mellitus, and/or aged ≥65 years were recruited. A single-lead ECG was recorded by using the AliveCor heart monitor, with tracings reviewed subsequently by 2 cardiologists to provide the reference standard. PPG measurements were performed by using the Cardiio Rhythm smartphone application. AF was diagnosed in 28 (2.76%) of 1013 participants. The diagnostic sensitivity of the Cardiio Rhythm for AF detection was 92.9% (95% CI 77-99%) and was higher than that of the AliveCor automated algorithm (71.4% [95% CI 51-87%]). The specificities of the Cardiio Rhythm and the AliveCor automated algorithm were comparable (97.7% [95% CI 97-99%] versus 99.4% [95% CI 99-100%]). The positive predictive value of the Cardiio Rhythm was lower than that of the AliveCor automated algorithm (53.1% [95% CI 38-67%] versus 76.9% [95% CI 56-91%]); both had a very high negative predictive value (99.8% [95% CI 99-100%] versus 99.2% [95% CI 98-100%]). The Cardiio Rhythm smartphone PPG application provides an accurate and reliable means to detect AF in patients at risk of developing AF and has the potential to enable population-based screening for AF. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
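The screening metrics quoted above follow directly from a 2x2 table; the sketch below back-calculates approximate counts consistent with the reported Cardiio Rhythm rates (28 AF cases among 1013 participants) purely to show the arithmetic.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 screening table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# Approximate counts reconstructed from the reported rates (illustrative only):
# 28 AF cases and 985 non-AF participants screened.
print(screening_metrics(tp=26, fp=23, fn=2, tn=962))
```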
NASA Astrophysics Data System (ADS)
Kong, Changduk; Lim, Semyeong; Kim, Keunwoo
2013-03-01
Neural networks are widely used in engine fault diagnostic systems because of their good learning performance, but they suffer from low accuracy and the long learning time needed to build a learning database. This work inversely builds a base performance model of a turboprop engine for a high-altitude UAV from measured performance data, and proposes a fault diagnostic system using the base performance model and artificial intelligence methods such as fuzzy logic and neural networks. Each real engine's performance model, named the base performance model, which can simulate the performance of a new engine, is built inversely from its performance test data. Condition monitoring of each engine can therefore be carried out more precisely through comparison with measured performance data. The proposed diagnostic system first identifies the faulted components using fuzzy logic, and then quantifies the faults of the identified components using neural networks trained on a fault learning database obtained from the developed base performance model. Feed-forward back-propagation (FFBP) is used to learn the measured performance data of the faulted components. For user-friendliness, the proposed diagnostic program is implemented as a MATLAB GUI.
UWGSP6: a diagnostic radiology workstation of the future
NASA Astrophysics Data System (ADS)
Milton, Stuart W.; Han, Sang; Choi, Hyung-Sik; Kim, Yongmin
1993-06-01
The Univ. of Washington's Image Computing Systems Lab. (ICSL) has been involved in research into the development of a series of PACS workstations since the mid-1980s. The most recent research, a joint UW-IBM project, attempted to create a diagnostic radiology workstation using an IBM RISC System 6000 (RS6000) computer workstation and the X-Window system. While the results are encouraging, there are inherent limitations in the workstation hardware that prevent it from providing an acceptable level of functionality for diagnostic radiology. Recognizing the RS6000 workstation's limitations, a parallel effort was initiated to design a workstation, UWGSP6 (Univ. of Washington Graphics System Processor #6), that provides the required functionality. This paper documents the design of UWGSP6, which not only addresses the requirements for a diagnostic radiology workstation in terms of display resolution, response time, etc., but also includes the processing performance necessary to support key functions needed in the implementation of algorithms for computer-aided diagnosis. The paper includes a description of the workstation architecture, and specifically its image processing subsystem. Verification of the design through hardware simulation is then discussed, and finally, the performance of selected algorithms based on detailed simulation is provided.
Multiplex PCR Tests for Detection of Pathogens Associated with Gastroenteritis
Zhang, Hongwei; Morrison, Scott; Tang, Yi-Wei
2016-01-01
A wide range of enteric pathogens can cause infectious gastroenteritis. Conventional diagnostic algorithms, including culture, biochemical identification, immunoassay and microscopic examination, are time consuming and often lack sensitivity and specificity. Advances in molecular technology have allowed its use in clinical diagnostic tools. Multiplex PCR-based testing has made its way into the gastroenterology diagnostic arena in recent years. In this article we review recent laboratory-developed multiplex PCR tests and current commercial multiplex gastrointestinal pathogen tests. We focus on two FDA-cleared commercial syndromic multiplex tests: the Luminex xTAG GPP and the BioFire FilmArray GI test. These multiplex tests can detect and identify multiple enteric pathogens in one test and provide results within hours. Multiplex PCR tests have shown superior sensitivity to conventional methods for detection of most pathogens. The high negative predictive value of these multiplex tests has led to the suggestion that they be used as screening tools, especially in outbreaks. Although the clinical utility and benefit of multiplex PCR tests need further investigation, implementing these tests in the gastroenterology diagnostic algorithm has the potential to improve diagnosis of infectious gastroenteritis. PMID:26004652
Akberov, R F; Gorshkov, A N
1997-01-01
The X-ray endoscopic semiotics of precancerous gastric mucosal changes (epithelial dysplasia, intestinal epithelial rearrangement) was examined based on the results of 1574 gastric examinations. A diagnostic algorithm was developed for radiological studies in the diagnosis of this pathology.
2012 HIV Diagnostics Conference: the molecular diagnostics perspective.
Branson, Bernard M; Pandori, Mark
2013-04-01
2012 HIV Diagnostic Conference Atlanta, GA, USA, 12-14 December 2012. This report highlights the presentations and discussions from the 2012 National HIV Diagnostic Conference held in Atlanta (GA, USA), on 12-14 December 2012. Reflecting changes in the evolving field of HIV diagnostics, the conference provided a forum for evaluating developments in molecular diagnostics and their role in HIV diagnosis. In 2010, the HIV Diagnostics Conference concluded with the proposal of a new diagnostic algorithm which included nucleic acid testing to resolve discordant screening and supplemental antibody test results. The 2012 meeting, picking up where the 2010 meeting left off, focused on scientific presentations that assessed this new algorithm and the role played by RNA testing and new developments in molecular diagnostics, including detection of total and integrated HIV-1 DNA, detection and quantification of HIV-2 RNA, and rapid formats for detection of HIV-1 RNA.
Diagnostic challenges of childhood asthma.
Bakirtas, Arzu
2017-01-01
Diagnosis of asthma in childhood is challenging. Both underdiagnosis and overdiagnosis of asthma are important issues. The present review gives information about factors that challenge an accurate diagnosis of childhood asthma. Although underdiagnosis of asthma in childhood has always been the most important diagnostic problem, overdiagnosis of asthma has also been increasingly recognized. This is probably due to diagnosis of asthma based on symptoms and signs alone. Demonstration of variable airflow obstruction by lung function tests is the most common asthma diagnostic test used in practice and is therefore strongly recommended in children who can cooperate. Recently, an asthma guideline combining the clinical and economic evidence with the sensitivity and specificity of diagnostic procedures was developed to improve accuracy of diagnosis and to avoid overdiagnosis. This guideline provided an algorithmic, clinical and cost-effective approach and included fractional exhaled nitric oxide measurement as one of the diagnostic tests in addition to lung function. Diagnosis of asthma in children should be made by combining the relevant history with at least two confirmatory diagnostic tests whenever possible. Diagnosis based on short-period treatment trials should be limited to young children who are unable to cooperate with these tests.
Alfa, Michelle J; Sepehri, Shadi
2013-01-01
BACKGROUND: There has been a growing interest in developing an appropriate laboratory diagnostic algorithm for Clostridium difficile, mainly as a result of increases in both the number and severity of cases of C difficile infection in the past decade. A C difficile diagnostic algorithm is necessary because diagnostic kits, mostly for the detection of toxins A and B or glutamate dehydrogenase (GDH) antigen, are not sufficient as stand-alone assays for optimal diagnosis of C difficile infection. In addition, conventional reference methods for C difficile detection (eg, toxigenic culture and cytotoxin neutralization [CTN] assays) are not routinely practiced in diagnostic laboratory settings. OBJECTIVE: To review the four-step algorithm used at Diagnostic Services of Manitoba sites for the laboratory diagnosis of toxigenic C difficile. RESULT: One year of retrospective C difficile data using the proposed algorithm was reported. Of 5695 stool samples tested, 9.1% (n=517) had toxigenic C difficile. Sixty per cent (310 of 517) of toxigenic C difficile stools were detected following the first two steps of the algorithm. CTN confirmation of GDH-positive, toxin A- and B-negative assays resulted in detection of an additional 37.7% (198 of 517) of toxigenic C difficile. Culture of the third specimen, from patients who had two previous negative specimens, detected an additional 2.32% (12 of 517) of toxigenic C difficile samples. DISCUSSION: Using GDH antigen as the screening test and toxins A and B as the confirmatory test for C difficile, 85% of specimens were reported negative or positive within 4 h. Without CTN confirmation of GDH antigen and toxin A and B discordant results, 37% (195 of 517) of toxigenic C difficile stools would have been missed. Following the algorithm, culture was needed for only 2.72% of all specimens submitted for C difficile testing. CONCLUSION: The overview of the data illustrated the significance of each stage of this four-step C difficile algorithm and emphasized the value of using the CTN assay and culture as parts of an algorithm that ensures accurate diagnosis of toxigenic C difficile. PMID:24421808
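The four-step flow can be written down as a small decision function; the return wording and the handling of pending or repeat specimens below are assumptions, and the function is a schematic of the algorithm described above rather than the laboratory's actual reporting rules.

```python
def c_difficile_algorithm(gdh_pos, toxin_ab_pos, ctn_pos=None, culture_pos=None):
    """Schematic sketch of the four-step toxigenic C. difficile algorithm."""
    # Steps 1-2: GDH antigen screen with toxin A/B confirmation (same-day result).
    if not gdh_pos:
        return "negative (GDH screen)"
    if toxin_ab_pos:
        return "positive (GDH+ / toxin A-B+)"
    # Step 3: discordant GDH+/toxin- results go to cytotoxin neutralization (CTN).
    if ctn_pos is None:
        return "pending CTN"
    if ctn_pos:
        return "positive (CTN confirmed)"
    # Step 4: culture reserved for selected repeat specimens.
    if culture_pos is None:
        return "negative (consider culture on a third specimen)"
    return "positive (toxigenic culture)" if culture_pos else "negative"

print(c_difficile_algorithm(gdh_pos=True, toxin_ab_pos=False, ctn_pos=True))
```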
Reducing noise component on medical images
NASA Astrophysics Data System (ADS)
Semenishchev, Evgeny; Voronin, Viacheslav; Dub, Vladimir; Balabaeva, Oksana
2018-04-01
Medical visualization and analysis of medical data is an active research area. Medical images are used in microbiology, genetics, roentgenology, oncology, surgery, ophthalmology, etc. Initial data processing is a major step towards obtaining a good diagnostic result. This paper considers an approach that allows image filtering while preserving object borders. The algorithm proposed in this paper is based on sequential data processing. In the first stage, local areas are determined using threshold processing and the classical ICI algorithm. The second stage uses a method based on two criteria, namely the L2 norm and the first-order squared difference. To preserve object boundaries, the transition boundary and its local neighborhood are processed with a fixed-coefficient filtering algorithm. As examples, reconstructed images from CT, X-ray, and microbiological studies are shown. Results on test images show the effectiveness of the proposed algorithm and its applicability to many medical image analysis applications.
Chan, Jason; Mack, David R.; Manuel, Douglas G.; Mojaverian, Nassim; de Nanassy, Joseph
2017-01-01
Importance Celiac disease (CD) is a common pediatric illness, and awareness of gluten-related disorders including CD is growing. Health administrative data represent a unique opportunity to conduct population-based surveillance of this chronic condition and assess the impact of caring for children with CD on the health system. Objective The objective of the study was to validate an algorithm based on health administrative data diagnostic codes to accurately identify children with biopsy-proven CD. We also evaluated trends over time in the use of health services related to CD by children in Ontario, Canada. Study design and setting We conducted a retrospective cohort study and validation study of population-based health administrative data in Ontario, Canada. All cases of biopsy-proven CD diagnosed 2005–2011 in Ottawa were identified through chart review at a large pediatric health care center and linked to the Ontario health administrative data to serve as the positive reference standard. All other children living within Ottawa served as the negative reference standard. Case-identifying algorithms based on outpatient physician visits with an associated ICD-9 code for CD plus an endoscopy billing code were constructed and tested. Sensitivity, specificity, PPV and NPV were tested for each algorithm (with 95% CI). Poisson regression, adjusting for sex and age at diagnosis, was used to explore the trend in outpatient visits associated with a CD diagnostic code from 1995–2011. Results The best algorithm to identify CD consisted of an endoscopy billing claim followed by 1 or more adult or pediatric gastroenterologist encounters after the endoscopic procedure. The sensitivity, specificity, PPV, and NPV for the algorithm were 70.4% (95% CI 61.1–78.4%), >99.9% (95% CI >99.9->99.9%), 53.3% (95% CI 45.1–61.4%) and >99.9% (95% CI >99.9->99.9%), respectively. It identified 1289 suspected CD cases from Ontario-wide administrative data. There was a 9% annual increase in the use of this combination of CD-associated diagnostic codes in physician billing data (RR 1.09, 95% CI 1.07–1.10, P<0.001). Conclusions With its current structure and variables, Ontario health administrative data are not suitable for identifying incident pediatric CD cases. The tested algorithms suffer from poor sensitivity and/or poor PPV, which increases the risk of case misclassification and could lead to biased estimation of the CD incidence rate. This study reinforces the importance of validating the codes used to identify cohorts or outcomes when conducting research using health administrative data. PMID:28662204
Chaotic particle swarm optimization with mutation for classification.
Assarzadeh, Zahra; Naghsh-Nilchi, Ahmad Reza
2015-01-01
In this paper, a chaotic particle swarm optimization with a mutation-based classifier particle swarm optimization is proposed to classify patterns of different classes in the feature space. The introduced mutation operators and chaotic sequences allow us to overcome the problem of early convergence to a local minimum associated with particle swarm optimization algorithms. That is, the mutation operator sharpens the convergence and tunes the best possible solution. Furthermore, to remove irrelevant data and reduce the dimensionality of medical datasets, a feature selection approach using a binary version of the proposed particle swarm optimization is introduced. In order to demonstrate the effectiveness of the proposed classifier, mutation-based classifier particle swarm optimization, it is evaluated on three classification datasets, namely Wisconsin Diagnostic Breast Cancer, Wisconsin Breast Cancer, and Heart-Statlog, with different feature vector dimensions. The proposed algorithm is compared with different classifier algorithms including k-nearest neighbor, as a conventional classifier, and the particle swarm classifier, genetic algorithm, and imperialist competitive algorithm classifier, as more sophisticated ones. The performance of each classifier was evaluated by calculating the accuracy, sensitivity, specificity and Matthews correlation coefficient. The experimental results show that the mutation-based classifier particle swarm optimization unequivocally performs better than all the compared algorithms.
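A compact sketch of chaotic PSO with mutation applied to a classification task is given below: particles encode class centroids, a logistic map drives the inertia weight, and occasional mutation perturbs particles to escape local minima. The toy data, swarm settings, and centroid-based classifier are illustrative assumptions, not the paper's exact formulation or its binary feature-selection variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data; each particle encodes two class centroids (4 numbers)
# and is scored by nearest-centroid classification error.
X = np.vstack([rng.normal([0, 0], 0.6, (100, 2)), rng.normal([2, 2], 0.6, (100, 2))])
y = np.repeat([0, 1], 100)

def error(p):
    c = p.reshape(2, 2)
    pred = np.argmin(((X[:, None, :] - c) ** 2).sum(-1), axis=1)
    return np.mean(pred != y)

n, dim, iters = 20, 4, 100
pos = rng.uniform(-1, 3, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_err = pos.copy(), np.array([error(p) for p in pos])
gbest = pbest[pbest_err.argmin()].copy()
chaos = 0.7                                      # logistic-map seed

for _ in range(iters):
    chaos = 4.0 * chaos * (1.0 - chaos)          # chaotic inertia weight in (0, 1)
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = chaos * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
    pos = pos + vel
    # Mutation: randomly perturb a few particles to escape local minima.
    mutants = rng.random(n) < 0.1
    pos[mutants] += rng.normal(scale=0.5, size=(mutants.sum(), dim))
    err = np.array([error(p) for p in pos])
    improved = err < pbest_err
    pbest[improved], pbest_err[improved] = pos[improved], err[improved]
    gbest = pbest[pbest_err.argmin()].copy()

print("best training error:", pbest_err.min())
```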
ERIC Educational Resources Information Center
de Bildt, Annelies; Sytema, Sjoerd; Zander, Eric; Bölte, Sven; Sturm, Harald; Yirmiya, Nurit; Yaari, Maya; Charman, Tony; Salomone, Erica; LeCouteur, Ann; Green, Jonathan; Bedia, Ricardo Canal; Primo, Patricia García; van Daalen, Emma; de Jonge, Maretha V.; Guðmundsdóttir, Emilía; Jóhannsdóttir, Sigurrós; Raleva, Marija; Boskovska, Meri; Rogé, Bernadette; Baduel, Sophie; Moilanen, Irma; Yliherva, Anneli; Buitelaar, Jan; Oosterling, Iris J.
2015-01-01
The current study aimed to investigate the Autism Diagnostic Interview-Revised (ADI-R) algorithms for toddlers and young preschoolers (Kim and Lord, "J Autism Dev Disord" 42(1):82-93, 2012) in a non-US sample from ten sites in nine countries (n = 1,104). The construct validity indicated a good fit of the algorithms. The diagnostic…
The Weiss score and beyond--histopathology for adrenocortical carcinoma.
Papotti, Mauro; Libè, Rossella; Duregon, Eleonora; Volante, Marco; Bertherat, Jerome; Tissier, Frederique
2011-12-01
The pathological diagnosis of adrenocortical carcinoma (ACC) is still challenging because of its rarity and the presence of special variants (pediatric, oncocytic, myxoid, and sarcomatoid). It is based on the recognition at light microscopy of at least three among nine morphological parameters, according to the Weiss scoring system, which was introduced 27 years ago and is nowadays the most widely employed. Nevertheless, although the diagnostic performance of this system is very high, it does not reach a sensitivity and specificity of 100%, its diagnostic applicability is potentially low among non-expert pathologists, and there is a group of borderline cases, with only one or two criteria, of uncertain behavior. Moreover, it is scarcely reproducible in the ACC morphological variants. In fact, specifically for pure oncocytic neoplasms, which seem to have a better prognosis than conventional ACCs, a modified system (the Lin-Weiss-Bisceglia system) has been proposed. With the aim of simplifying ACC diagnosis, the "reticulin" diagnostic algorithm was proposed 2 years ago, based on the observation that the tumoral reticulin framework (highlighted by reticulin silver-based histochemical staining) is consistently disrupted in malignant cases but only in a small subset of benign cases. Following this algorithm, in the presence of reticulin alterations, malignancy is further defined through the identification of at least one of the following parameters: necrosis, high mitotic rate, and venous invasion. As a complement to the morphological approach, some immunohistochemical markers (such as steroidogenic factor 1) have been proposed as diagnostic and prognostic adjuncts, but these still lack wide clinical validation.
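The two scoring rules discussed above lend themselves to a schematic sketch; the abbreviated criterion wording, the "borderline" label for one or two criteria, and the return strings are simplifications, and the sketch is not a substitute for the original diagnostic definitions.

```python
def weiss_score(findings):
    """Weiss system sketch: malignancy is favored when >= 3 of 9 criteria are present.
    Criterion wording is abbreviated here; consult the original references."""
    criteria = ["high nuclear grade", "mitoses > 5 per 50 HPF", "atypical mitoses",
                "clear cells <= 25%", "diffuse architecture > 33%", "necrosis",
                "venous invasion", "sinusoidal invasion", "capsular invasion"]
    score = sum(bool(findings.get(c)) for c in criteria)
    verdict = ("favors carcinoma" if score >= 3
               else "borderline/uncertain" if score else "favors adenoma")
    return score, verdict

def reticulin_algorithm(reticulin_disrupted, necrosis=False,
                        high_mitotic_rate=False, venous_invasion=False):
    """Sketch of the two-step reticulin algorithm described above."""
    if not reticulin_disrupted:
        return "malignancy not supported"
    return ("malignant" if (necrosis or high_mitotic_rate or venous_invasion)
            else "reticulin altered, no second criterion")

print(weiss_score({"necrosis": True, "venous invasion": True, "atypical mitoses": True}))
print(reticulin_algorithm(reticulin_disrupted=True, necrosis=True))
```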
Skull base osteomyelitis: current microbiology and management.
Spielmann, P M; Yu, R; Neeff, M
2013-01-01
Skull base osteomyelitis typically presents in an immunocompromised patient with severe otalgia and otorrhoea. Pseudomonas aeruginosa is the commonest pathogenic micro-organism, and reports of resistance to fluoroquinolones are now emerging, complicating management. We reviewed our experience of this condition, and of the local pathogenic organisms. A retrospective review from 2004 to 2011 was performed. Patients were identified by their admission diagnostic code, and computerised records examined. Twenty patients were identified. A facial palsy was present in 12 patients (60 per cent). Blood cultures were uniformly negative, and culture of ear canal granulations was non-diagnostic in 71 per cent of cases. Pseudomonas aeruginosa was isolated in only 10 (50 per cent) cases; one strain was resistant to ciprofloxacin but all were sensitive to ceftazidime. Two cases of fungal skull base osteomyelitis were identified. The mortality rate was 15 per cent. The patients' treatment algorithm is presented. Our treatment algorithm reflects the need for multidisciplinary input, early microbial culture of specimens, appropriate imaging, and prolonged and systemic antimicrobial treatment. Resolution of infection must be confirmed by close follow up and imaging.
Ehteshami Bejnordi, Babak; Veta, Mitko; Johannes van Diest, Paul; van Ginneken, Bram; Karssemeijer, Nico; Litjens, Geert; van der Laak, Jeroen A W M; Hermsen, Meyke; Manson, Quirine F; Balkenhol, Maschenka; Geessink, Oscar; Stathonikos, Nikolaos; van Dijk, Marcory Crf; Bult, Peter; Beca, Francisco; Beck, Andrew H; Wang, Dayong; Khosla, Aditya; Gargeya, Rishab; Irshad, Humayun; Zhong, Aoxiao; Dou, Qi; Li, Quanzheng; Chen, Hao; Lin, Huang-Jing; Heng, Pheng-Ann; Haß, Christian; Bruni, Elia; Wong, Quincy; Halici, Ugur; Öner, Mustafa Ümit; Cetin-Atalay, Rengul; Berseth, Matt; Khvatkov, Vitali; Vylegzhanin, Alexei; Kraus, Oren; Shaban, Muhammad; Rajpoot, Nasir; Awan, Ruqayya; Sirinukunwattana, Korsuk; Qaiser, Talha; Tsang, Yee-Wah; Tellez, David; Annuscheit, Jonas; Hufnagl, Peter; Valkonen, Mira; Kartasalo, Kimmo; Latonen, Leena; Ruusuvuori, Pekka; Liimatainen, Kaisa; Albarqouni, Shadi; Mungal, Bharti; George, Ami; Demirci, Stefanie; Navab, Nassir; Watanabe, Seiryo; Seno, Shigeto; Takenaka, Yoichi; Matsuda, Hideo; Ahmady Phoulady, Hady; Kovalev, Vassili; Kalinovsky, Alexander; Liauchuk, Vitali; Bueno, Gloria; Fernandez-Carrobles, M Milagro; Serrano, Ismael; Deniz, Oscar; Racoceanu, Daniel; Venâncio, Rui
2017-12-12
Application of deep learning algorithms to whole-slide pathology images can potentially improve diagnostic accuracy and efficiency. Assess the performance of automated deep learning algorithms at detecting metastases in hematoxylin and eosin-stained tissue sections of lymph nodes of women with breast cancer and compare it with pathologists' diagnoses in a diagnostic setting. Researcher challenge competition (CAMELYON16) to develop automated solutions for detecting lymph node metastases (November 2015-November 2016). A training data set of whole-slide images from 2 centers in the Netherlands with (n = 110) and without (n = 160) nodal metastases verified by immunohistochemical staining were provided to challenge participants to build algorithms. Algorithm performance was evaluated in an independent test set of 129 whole-slide images (49 with and 80 without metastases). The same test set of corresponding glass slides was also evaluated by a panel of 11 pathologists with time constraint (WTC) from the Netherlands to ascertain likelihood of nodal metastases for each slide in a flexible 2-hour session, simulating routine pathology workflow, and by 1 pathologist without time constraint (WOTC). Deep learning algorithms submitted as part of a challenge competition or pathologist interpretation. The presence of specific metastatic foci and the absence vs presence of lymph node metastasis in a slide or image using receiver operating characteristic curve analysis. The 11 pathologists participating in the simulation exercise rated their diagnostic confidence as definitely normal, probably normal, equivocal, probably tumor, or definitely tumor. The area under the receiver operating characteristic curve (AUC) for the algorithms ranged from 0.556 to 0.994. The top-performing algorithm achieved a lesion-level, true-positive fraction comparable with that of the pathologist WOTC (72.4% [95% CI, 64.3%-80.4%]) at a mean of 0.0125 false-positives per normal whole-slide image. For the whole-slide image classification task, the best algorithm (AUC, 0.994 [95% CI, 0.983-0.999]) performed significantly better than the pathologists WTC in a diagnostic simulation (mean AUC, 0.810 [range, 0.738-0.884]; P < .001). The top 5 algorithms had a mean AUC that was comparable with the pathologist interpreting the slides in the absence of time constraints (mean AUC, 0.960 [range, 0.923-0.994] for the top 5 algorithms vs 0.966 [95% CI, 0.927-0.998] for the pathologist WOTC). In the setting of a challenge competition, some deep learning algorithms achieved better diagnostic performance than a panel of 11 pathologists participating in a simulation exercise designed to mimic routine pathology workflow; algorithm performance was comparable with an expert pathologist interpreting whole-slide images without time constraints. Whether this approach has clinical utility will require evaluation in a clinical setting.
Balouchestani, Mohammadreza; Krishnan, Sridhar
2014-01-01
Long-term recording of electrocardiogram (ECG) signals plays an important role in health care systems for diagnostic and treatment purposes in heart diseases. Clustering and classification of the collected data are essential for detecting concealed information in the P-QRS-T waves of long-term ECG recordings. Currently used algorithms have their share of drawbacks: 1) clustering and classification cannot be done in real time; and 2) they suffer from high energy consumption and heavy sampling loads. These drawbacks motivated us to develop a novel optimized clustering algorithm that can easily scan large ECG datasets to enable low-power, long-term ECG recording. In this paper, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory as a random sampling procedure. Two dimensionality reduction methods, Principal Component Analysis (PCA) and the Linear Correlation Coefficient (LCC), followed by classification of the data with K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers, are then applied to the proposed algorithm. We show that our algorithm based on PCA features in combination with the K-NN classifier performs better than the other methods. The proposed algorithm outperforms existing algorithms by increasing classification accuracy by 11%. In addition, the proposed algorithm achieves classification accuracies for the K-NN and PNN classifiers and a Receiver Operating Characteristic (ROC) area of 99.98%, 99.83%, and 99.75%, respectively.
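A minimal version of the PCA-plus-K-NN stage reported to perform best might look like the pipeline below, run here on synthetic stand-ins for ECG beat feature vectors; the feature dimensions, component count, and neighbor count are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for beat feature vectors extracted from long-term ECG
# (e.g., compressed-sensing measurements of P-QRS-T segments); two beat classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (500, 50)), rng.normal(0.8, 1.0, (500, 50))])
y = np.repeat([0, 1], 500)

# PCA for dimensionality reduction followed by a K-NN classifier, the
# combination reported above to perform best.
clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                    KNeighborsClassifier(n_neighbors=5))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```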
Manna, Raffaele; Cauda, Roberto; Feriozzi, Sandro; Gambaro, Giovanni; Gasbarrini, Antonio; Lacombe, Didier; Livneh, Avi; Martini, Alberto; Ozdogan, Huri; Pisani, Antonio; Riccio, Eleonora; Verrecchia, Elena; Dagna, Lorenzo
2017-10-01
Fever of unknown origin (FUO) is a rather rare clinical syndrome representing a major diagnostic challenge. The occurrence of more than three febrile attacks with fever-free intervals of variable duration during 6 months of observation has recently been proposed as a subcategory of FUO, Recurrent FUO (RFUO). A substantial number of patients with RFUO have auto-inflammatory genetic fevers, but many patients remain undiagnosed. We hypothesize that this undiagnosed subgroup may be comprised of, at least in part, a number of rare genetic febrile diseases such as Fabry disease. We aimed to identify key features or potential diagnostic clues for Fabry disease as a model of rare genetic febrile diseases causing RFUO, and to develop diagnostic guidelines for RFUO, using Fabry disease as an example of inserting other rare diseases in the existing FUO algorithms. An international panel of specialists in recurrent fevers and rare diseases, including internists, infectious disease specialists, rheumatologists, gastroenterologists, nephrologists, and medical geneticists convened to review the existing diagnostic algorithms, and to suggest recommendations for arriving at accurate diagnoses on the basis of available literature and clinical experience. By combining specific features of rare diseases with other diagnostic considerations, guidelines have been designed to raise awareness and identify rare diseases among other causes of FUO. The proposed guidelines may be useful for the inclusion of rare diseases in the diagnostic algorithms for FUO. A wide spectrum of patients will be needed to validate the algorithm in different clinical settings.
NASA Astrophysics Data System (ADS)
Hirabayashi, Miki; Ohashi, Hirotada; Kubo, Tai
We present an experimental analysis of the controllability of our transcription-based diagnostic biomolecular automata by programmed molecules. Focusing on noninvasive transcriptome diagnosis using salivary mRNAs, we previously proposed a novel concept for a diagnostic device using DNA computation. This system consists of a main computational element that has a stem-shaped promoter region and a pseudo-loop-shaped read-only memory region for transcription regulation through the conformation change caused by recognition of disease-related biomarkers. We utilize the transcription of a malachite green aptamer sequence, triggered by target recognition, to observe detection. This algorithm makes it possible to release RNA-aptamer drugs repeatedly for in vivo use, unlike the previously proposed digestion-based systems using restriction enzymes; however, the controllability of aptamer release was not sufficient at that earlier stage. In this paper, we verify the regulatory effect of programmed molecules on aptamer transcription under basic conditions, towards the development of therapeutic automata. These results bring us one step closer to the realization of new intelligent diagnostic and therapeutic automata based on molecular circuits.
Herscovici, Sarah; Pe'er, Avivit; Papyan, Surik; Lavie, Peretz
2007-02-01
Scoring of REM sleep based on polysomnographic recordings is a laborious and time-consuming process. The growing number of ambulatory devices designed for cost-effective home-based diagnostic sleep recordings necessitates the development of a reliable automatic REM sleep detection algorithm that is not based on the traditional trio of electroencephalographic, electrooculographic and electromyographic recordings. This paper presents an automatic REM detection algorithm based on the peripheral arterial tone (PAT) signal and actigraphy, which are recorded with an ambulatory wrist-worn device (Watch-PAT100). The PAT signal is a measure of the pulsatile volume changes at the fingertip reflecting sympathetic tone variations. The algorithm was developed using a training set of 30 patients recorded simultaneously with polysomnography and the Watch-PAT100. Sleep records were divided into 5 min intervals and two time series were constructed from the PAT amplitudes and PAT-derived inter-pulse periods in each interval. A prediction function that determines the likelihood of detecting a REM epoch, based on 16 features extracted from the above time series, was developed. The coefficients of the prediction function were determined using a genetic algorithm (GA) optimization process tuned to maximize a fitness function depending on the sensitivity, specificity and agreement of the algorithm in comparison with the gold standard of polysomnographic manual scoring. Based on a separate validation set of 30 patients, the overall sensitivity, specificity and agreement of the automatic algorithm in identifying standard 30 s epochs of REM sleep were 78%, 92% and 89%, respectively. Deploying this REM detection algorithm in a wrist-worn device could be very useful for unattended ambulatory sleep monitoring. The innovative method of optimization using a genetic algorithm has proven to yield robust results in the validation set.
Mechanick, Jeffrey I; Hurley, Daniel L; Garvey, W Timothy
2017-03-01
The American Association of Clinical Endocrinologists (AACE) and American College of Endocrinology (ACE) have created a chronic care model, advanced diagnostic framework, clinical practice guidelines, and clinical practice algorithm for the comprehensive management of obesity. This coordinated effort is not solely based on body mass index as in previous models, but emphasizes a complications-centric approach that primarily determines therapeutic decisions and desired outcomes. Adiposity-Based Chronic Disease (ABCD) is a new diagnostic term for obesity that explicitly identifies a chronic disease, alludes to a precise pathophysiologic basis, and avoids the stigmata and confusion related to the differential use and multiple meanings of the term "obesity." Key elements to further the care of patients using this new ABCD term are: (1) positioning lifestyle medicine in the promotion of overall health, not only as the first algorithmic step, but as the central, pervasive action; (2) standardizing protocols that comprehensively and durably address weight loss and management of adiposity-based complications; (3) approaching patient care through contextualization (e.g., primordial prevention to decrease obesogenic environmental risk factors and transculturalization to adapt evidence-based recommendations for different ethnicities, cultures, and socio-economics); and lastly, (4) developing evidence-based strategies for successful implementation, monitoring, and optimization of patient care over time. This AACE/ACE blueprint extends current work and aspires to meaningfully improve both individual and population health by presenting a new ABCD term for medical diagnostic purposes, use in a complications-centric management and staging strategy, and precise reference to the obesity chronic disease state, divested from counterproductive stigmata and ambiguities found in the general public sphere. AACE = American Association of Clinical Endocrinologists ABCD = Adiposity-Based Chronic Disease ACE = American College of Endocrinology BMI = body mass index CPG = clinical practice guidelines HCP = health care professionals.
Developing a modular architecture for creation of rule-based clinical diagnostic criteria.
Hong, Na; Pathak, Jyotishman; Chute, Christopher G; Jiang, Guoqian
2016-01-01
With recent advances in computerized patient record systems, there is an urgent need to produce computable and standards-based clinical diagnostic criteria. Notably, constructing rule-based clinical diagnostic criteria has become one of the goals of the International Classification of Diseases (ICD)-11 revision. However, few studies have been done on building a unified architecture to support the need for diagnostic criteria computerization. In this study, we present a modular architecture for enabling the creation of rule-based clinical diagnostic criteria leveraging Semantic Web technologies. The architecture consists of two modules: an authoring module that utilizes a standards-based information model and a translation module that leverages the Semantic Web Rule Language (SWRL). In a prototype implementation, we created a diagnostic criteria upper ontology (DCUO) that integrates the ICD-11 content model with the Quality Data Model (QDM). Using the DCUO, we developed a transformation tool that converts QDM-based diagnostic criteria into SWRL representation. We evaluated the domain coverage of the upper ontology model using randomly selected diagnostic criteria from broad domains (n = 20). We also tested the transformation algorithms using 6 QDM templates for ontology population and 15 QDM-based criteria for rule generation. As a result, the first draft of the DCUO contains 14 root classes, 21 subclasses, 6 object properties and 1 data property. Investigation Findings, and Signs and Symptoms are the two most commonly used element types. All 6 HQMF templates were successfully parsed and populated into their corresponding domain-specific ontologies, and 14 rules (93.3%) passed rule validation. Our efforts in developing and prototyping a modular architecture provide useful insight into how to build a scalable solution to support diagnostic criteria representation and computerization.
Lykiardopoulos, Byron; Hagström, Hannes; Fredrikson, Mats; Ignatova, Simone; Stål, Per; Hultcrantz, Rolf; Ekstedt, Mattias; Kechagias, Stergios
2016-01-01
Detection of advanced fibrosis (F3-F4) in nonalcoholic fatty liver disease (NAFLD) is important for ascertaining prognosis. Serum markers have been proposed as alternatives to biopsy. We attempted to develop a novel algorithm for detection of advanced fibrosis based on a more efficient combination of serological markers and to compare this with established algorithms. We included 158 patients with biopsy-proven NAFLD. Of these, 38 had advanced fibrosis. The following fibrosis algorithms were calculated: NAFLD fibrosis score, BARD, NIKEI, NASH-CRN regression score, APRI, FIB-4, King's score, GUCI, Lok index, Forns score, and ELF. The study population was randomly divided into a training and a validation group. A multiple logistic regression analysis using bootstrapping methods was applied to the training group. Among the many variables analyzed, age, fasting glucose, hyaluronic acid and AST were included, and a model (LINKI-1) for predicting advanced fibrosis was created. Moreover, these variables were combined with platelet count in a mathematical way that exaggerates their opposing effects, and alternative models (LINKI-2) were also created. Models were compared using the area under the receiver operating characteristic curve (AUROC). Of the established algorithms, FIB-4 and King's score had the best diagnostic accuracy, with AUROCs of 0.84 and 0.83, respectively. Higher accuracy was achieved with the novel LINKI algorithms: the AUROC in the total cohort was 0.91 for LINKI-1 and 0.89 for the LINKI-2 models. The LINKI algorithms for detection of advanced fibrosis in NAFLD showed better accuracy than established algorithms and should be validated in further studies including larger cohorts.
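A sketch of the modeling step, multiple logistic regression on the four LINKI-1 inputs with bootstrap AUROC estimation, is shown below on synthetic data; the variable distributions, effect sizes, and number of bootstrap replicates are assumptions, and the LINKI formulas themselves are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

# Synthetic stand-in for the four LINKI-1 inputs: age, fasting glucose,
# hyaluronic acid and AST (units and effect sizes are invented).
rng = np.random.default_rng(0)
n = 158
X = np.column_stack([rng.normal(55, 12, n), rng.normal(6, 1.5, n),
                     rng.lognormal(3.5, 0.8, n), rng.normal(45, 20, n)])
logit = (0.04 * (X[:, 0] - 55) + 0.3 * (X[:, 1] - 6)
         + 0.01 * (X[:, 2] - 33) + 0.02 * (X[:, 3] - 45) - 1.2)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Multiple logistic regression with bootstrap AUROC estimation.
model = LogisticRegression(max_iter=1000).fit(X, y)
aucs = []
for _ in range(200):
    Xb, yb = resample(X, y)
    if len(np.unique(yb)) < 2:
        continue
    m = LogisticRegression(max_iter=1000).fit(Xb, yb)
    aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))
print("apparent AUROC: %.2f, bootstrap mean: %.2f" %
      (roc_auc_score(y, model.predict_proba(X)[:, 1]), np.mean(aucs)))
```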
Rodgers, M; Nixon, J; Hempel, S; Aho, T; Kelly, J; Neal, D; Duffy, S; Ritchie, G; Kleijnen, J; Westwood, M
2006-06-01
To determine the most effective diagnostic strategy for the investigation of microscopic and macroscopic haematuria in adults. Electronic databases from inception to October 2003, updated in August 2004. A systematic review was undertaken according to published guidelines. Decision analytic modelling was undertaken, based on the findings of the review, expert opinion and additional information from the literature, to assess the relative cost-effectiveness of plausible alternative tests that are part of diagnostic algorithms for haematuria. A total of 118 studies met the inclusion criteria. No studies that evaluated the effectiveness of diagnostic algorithms for haematuria or the effectiveness of screening for haematuria or investigating its underlying cause were identified. Eighteen out of 19 identified studies evaluated dipstick tests and data from these suggested that these are moderately useful in establishing the presence of, but cannot be used to rule out, haematuria. Six studies using haematuria as a test for the presence of a disease indicated that the detection of microhaematuria cannot alone be considered a useful test either to rule in or rule out the presence of a significant underlying pathology (urinary calculi or bladder cancer). Forty-eight of 80 studies addressed methods to localise the source of bleeding (renal or lower urinary tract). The methods and thresholds described in these studies varied greatly, precluding any estimate of a 'best performance' threshold that could be applied across patient groups. However, studies of red blood cell morphology that used a cut-off value of 80% dysmorphic cells for glomerular disease reported consistently high specificities (potentially useful in ruling in a renal cause for haematuria). The reported sensitivities were generally low. Twenty-eight studies included data on the accuracy of laboratory tests (tumour markers, cytology) for the diagnosis of bladder cancer. The majority of tumour marker studies evaluated nuclear matrix protein 22 or bladder tumour antigen. The sensitivity and specificity ranges suggested that neither of these would be useful either for diagnosing bladder cancer or for ruling out patients for further investigation (cystoscopy). However, the evidence remains sparse and the diagnostic accuracy estimates varied widely between studies. Fifteen studies evaluating urine cytology as a test for urinary tract malignancies were heterogeneous and poorly reported. The calculated specificity values were generally high, suggesting some possible utility in confirming malignancy. However, the evidence suggests that urine cytology has no application in ruling out malignancy or excluding patients from further investigation. Fifteen studies evaluated imaging techniques [computed tomography (CT), intravenous urography (IVU) or ultrasound scanning (US)] to detect the underlying cause of haematuria. The target condition and the reference standard varied greatly between these studies. The diagnostic accuracy data for several individual studies appeared promising but meaningful comparison of the available imaging technologies was impossible. Eight studies met the inclusion criteria but addressed different parts of the diagnostic chain (e.g. screening programmes, laboratory investigations, full urological work-up). No single study addressed the complete diagnostic process. The review also highlighted a number of methodological limitations of these studies, including their lack of generalisability to the UK context. 
Separate decision analytic models were therefore developed to progress estimation of the optimal strategy for the diagnostic management of haematuria. The economic model for the detection of microhaematuria found that immediate microscopy following a positive dipstick test would improve diagnostic efficiency as it eliminates the high number of false positives produced by dipstick testing. Strategies that use routine microscopy may be associated with high numbers of false results, but evidence was lacking regarding the accuracy of routine microscopy and estimates were adopted for the model. The model for imaging the upper urinary tract showed that US detects more tumours than IVU at one-third of the cost, and is also associated with fewer false results. For any cause of haematuria, CT was shown to have a mean incremental cost-effectiveness ratio of pounds sterling 9939 in comparison with the next best option, US. When US is followed up with CT for negative results with persistent haematuria, it dominates the initial use of CT alone, with a saving of pounds sterling 235,000 for the evaluation of 1000 patients. The model for investigation of the lower urinary tract showed that for low-risk patients the use of immediate cystoscopy could be avoided if cystoscopy were used for follow-up patients with a negative initial test using tumour markers and/or cytology, resulting in a saving of pounds sterling 483,000 for the evaluation of 1000 patients. The clinical and economic impact on delayed detection of both upper and lower urinary tract tumours through the use of follow-up testing should be evaluated in future studies. There are insufficient data currently available to derive an evidence-based algorithm of the diagnostic pathway for haematuria. A hypothetical algorithm based on the opinion and practice of clinical experts in the review team, other published algorithms and the results of economic modelling is presented in this report. This algorithm is presented, for comparative purposes, alongside current US and UK guidelines. The ideas contained in these algorithms and the specific questions outlined should form the basis of future research. Quality assessment of the diagnostic accuracy studies included in this review highlighted several areas of deficiency.
Eosinophilic pustular folliculitis: A proposal of diagnostic and therapeutic algorithms.
Nomura, Takashi; Katoh, Mayumi; Yamamoto, Yosuke; Miyachi, Yoshiki; Kabashima, Kenji
2016-11-01
Eosinophilic pustular folliculitis (EPF) is a sterile inflammatory dermatosis of unknown etiology. In addition to classic EPF, which affects otherwise healthy individuals, an immunocompromised state can cause immunosuppression-associated EPF (IS-EPF), which may be referred to dermatologists in inpatient services for assessments. Infancy-associated EPF (I-EPF) is the least characterized subtype, being observed mainly in non-Japanese infants. Diagnosis of EPF is challenging because its lesions mimic those of other common diseases, such as acne and dermatomycosis. Furthermore, there is no consensus regarding the treatment for each subtype of EPF. Here, we created procedure algorithms that facilitate the diagnosis and selection of therapeutic options on the basis of published work available in the public domain. Our diagnostic algorithm comprised a simple flowchart to direct physicians toward proper diagnosis. Recommended regimens were summarized in an easy-to-comprehend therapeutic algorithm for each subtype of EPF. These algorithms would facilitate the diagnostic and therapeutic procedure of EPF. © 2016 Japanese Dermatological Association.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kaneko, Masahiro; Kakinuma, Ryutaro; Moriyama, Noriyuki
2010-03-01
Diagnostic MDCT imaging requires a considerable number of images to be read. Moreover, there is a shortage of doctors in Japan who can interpret these images. Against this background, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis. We have also developed a teleradiology network system based on a web medical image conferencing system. In such a network, the security of the information infrastructure is a critical concern. Our teleradiology network system allows web medical image conferences to be held between medical institutions at remote sites. We completed a basic proof-of-concept experiment of the web medical image conferencing system with an information security solution. The conference screen can be shared by two or more terminals at the same time, and opinions can be exchanged using a camera and microphone connected to the workstation that hosts the diagnostic assistance methods. Biometric face authentication at the teleradiology site enforces file encryption and successful login. The privacy and information security technology of this solution ensures compliance with Japanese regulations, so patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new teleradiology network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system, built on the computer-aided diagnosis workstation and the teleradiology network, can increase diagnostic speed and accuracy and improve the security of medical information.
Spectral analysis of major heart tones
NASA Astrophysics Data System (ADS)
Lejkowski, W.; Dobrowolski, A. P.; Majka, K.; Olszewski, R.
2018-04-01
The World Health Organization (WHO) figures clearly indicate that cardiovascular disease is the most common cause of death and disability in the world. Early detection of cardiovascular pathologies may contribute to reducing this high mortality rate. Auscultatory examination is one of the first and most important steps in cardiologic diagnostics. Unfortunately, proper diagnosis is closely tied to long-term practice and medical experience. The article presents the authors' system for recording phonocardiograms and the way the data are stored, as well as an outline of the analysis algorithm, which will allow a recording to be assigned, with high probability, to either a heart-failure patient or a healthy volunteer. The results of a pilot study of phonocardiographic signals are also presented as an introduction to further research aimed at the development of an efficient diagnostic algorithm based on spectral analysis of the heart tones.
Huerga, Helena; Ferlazzo, Gabriella; Bevilacqua, Paolo; Kirubi, Beatrice; Ardizzoni, Elisa; Wanjala, Stephen; Sitienei, Joseph; Bonnet, Maryline
2017-01-01
The Determine-TB LAM assay is a urine point-of-care test useful for TB diagnosis in HIV-positive patients. We assessed the incremental diagnostic yield of adding LAM to algorithms based on clinical signs, sputum smear-microscopy, chest X-ray and Xpert MTB/RIF in HIV-positive patients with symptoms of pulmonary TB (PTB). Prospective observational cohort of ambulatory (either severely ill, or CD4 <200 cells/μl, or with body mass index <17 kg/m²) and hospitalized symptomatic HIV-positive adults in Kenya. Incremental diagnostic yield of adding LAM was the difference in the proportion of confirmed TB patients (positive Xpert or MTB culture) diagnosed by the algorithm with LAM compared to the algorithm without LAM. The multivariable mortality model was adjusted for age, sex, clinical severity, BMI, CD4, ART initiation, LAM result and TB confirmation. Among 474 patients included, 44.1% were severely ill, 69.6% had CD4 <200 cells/μl, 59.9% had initiated ART, and 23.2% could not produce sputum. LAM, smear-microscopy, Xpert and culture in sputum were positive in 39.0% (185/474), 21.6% (76/352), 29.1% (102/350) and 39.7% (92/232) of the patients tested, respectively. Of 156 patients with confirmed TB, 65.4% were LAM positive. Of those classified as non-TB, 84.0% were LAM negative. Adding LAM increased the diagnostic yield of the algorithms by 36.6%, from 47.4% (95%CI: 39.4-55.6) to 84.0% (95%CI: 77.3-89.4%), when using clinical signs and X-ray; by 19.9%, from 62.2% (95%CI: 54.1-69.8) to 82.1% (95%CI: 75.1-87.7), when using clinical signs and microscopy; and by 13.4%, from 74.4% (95%CI: 66.8-81.0) to 87.8% (95%CI: 81.6-92.5), when using clinical signs and Xpert. LAM-positive patients had an increased risk of 2-month mortality (aOR: 2.7; 95%CI: 1.5-4.9). LAM should be included in TB diagnostic algorithms in parallel to microscopy or Xpert request for HIV-positive patients who are either ambulatory (severely ill or CD4 <200 cells/μl) or hospitalized. LAM allows same-day treatment initiation in patients at higher risk of death and in those not able to produce sputum.
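A minimal sketch of how the incremental yield reported above can be computed as a difference of proportions with normal-approximation confidence intervals. The counts below are assumed for illustration (chosen to roughly echo the clinical-signs-plus-X-ray arm) and are not the study data.

```python
import math

def proportion_ci(k, n, z=1.96):
    """Normal-approximation confidence interval for a proportion k/n."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# Illustrative counts: confirmed TB patients diagnosed by the algorithm
# without and with LAM (assumed numbers, not the published data).
n_confirmed = 156
diagnosed_without_lam = 74
diagnosed_with_lam = 131

p0, lo0, hi0 = proportion_ci(diagnosed_without_lam, n_confirmed)
p1, lo1, hi1 = proportion_ci(diagnosed_with_lam, n_confirmed)

print(f"yield without LAM: {p0:.1%} ({lo0:.1%}-{hi0:.1%})")
print(f"yield with LAM:    {p1:.1%} ({lo1:.1%}-{hi1:.1%})")
print(f"incremental yield: {p1 - p0:.1%}")
```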
Assessing an AI knowledge-base for asymptomatic liver diseases.
Babic, A; Mathiesen, U; Hedin, K; Bodemar, G; Wigertz, O
1998-01-01
Discovering previously unseen knowledge from clinical data is important in the field of asymptomatic liver diseases. Avoiding liver biopsy, which is used as the ultimate confirmation of diagnosis, by basing the decision on relevant laboratory findings alone would be an essential form of support. A system based on Quinlan's ID3 algorithm was simple and efficient in extracting the sought knowledge. The basic principles of applying such AI systems are therefore described and complemented with a medical evaluation. Some of the diagnostic rules were found to be useful as decision algorithms, i.e. they could be applied directly in clinical work and made part of the knowledge base of the Liver Guide, an automated decision support system.
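For readers unfamiliar with ID3, the following is a minimal, self-contained sketch of Quinlan-style information-gain tree induction on categorical data. The discretised lab findings and labels are invented for illustration; the study's actual attributes and rule base are not reproduced here.

```python
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, labels, attr):
    base = entropy(labels)
    total = len(labels)
    remainder = 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        remainder += len(subset) / total * entropy(subset)
    return base - remainder

def id3(rows, labels, attrs):
    """Recursively build a nested-dict decision tree by maximum information gain."""
    if len(set(labels)) == 1:
        return labels[0]
    if not attrs:
        return Counter(labels).most_common(1)[0][0]
    best = max(attrs, key=lambda a: information_gain(rows, labels, a))
    tree = {best: {}}
    for value in set(row[best] for row in rows):
        idx = [i for i, row in enumerate(rows) if row[best] == value]
        tree[best][value] = id3([rows[i] for i in idx],
                                [labels[i] for i in idx],
                                [a for a in attrs if a != best])
    return tree

# Hypothetical discretised lab findings (illustrative only).
rows = [
    {"ALT": "high", "GGT": "high"},
    {"ALT": "high", "GGT": "normal"},
    {"ALT": "normal", "GGT": "normal"},
    {"ALT": "normal", "GGT": "high"},
]
labels = ["disease", "disease", "healthy", "healthy"]
print(id3(rows, labels, ["ALT", "GGT"]))
```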
Noureldine, Salem I; Najafian, Alireza; Aragon Han, Patricia; Olson, Matthew T; Genther, Dane J; Schneider, Eric B; Prescott, Jason D; Agrawal, Nishant; Mathur, Aarti; Zeiger, Martha A; Tufano, Ralph P
2016-07-01
Diagnostic molecular testing is used in the workup of thyroid nodules. While these tests appear to be promising in more definitively assigning a risk of malignancy, their effect on surgical decision making has yet to be demonstrated. To investigate the effect of diagnostic molecular profiling of thyroid nodules on the surgical decision-making process. A surgical management algorithm was developed and published after peer review that incorporated individual Bethesda System for Reporting Thyroid Cytopathology classifications with clinical, laboratory, and radiological results. This algorithm was created to formalize the decision-making process selected herein in managing patients with thyroid nodules. Between April 1, 2014, and March 31, 2015, a prospective study of patients who had undergone diagnostic molecular testing of a thyroid nodule before being seen for surgical consultation was performed. The recommended management undertaken by the surgeon was then prospectively compared with the corresponding one in the algorithm. Patients with thyroid nodules who did not undergo molecular testing and were seen for surgical consultation during the same period served as a control group. All pertinent treatment options were presented to each patient, and any deviation from the algorithm was recorded prospectively. To evaluate the appropriateness of any change (deviation) in management, the surgical histopathology diagnosis was correlated with the surgery performed. The study cohort comprised 140 patients who underwent molecular testing. Their mean (SD) age was 50.3 (14.6) years, and 75.0% (105 of 140) were female. Over a 1-year period, 20.3% (140 of 688) had undergone diagnostic molecular testing before surgical consultation, and 79.7% (548 of 688) had not undergone molecular testing. The surgical management deviated from the treatment algorithm in 12.9% (18 of 140) with molecular testing and in 10.2% (56 of 548) without molecular testing (P = .37). In the group with molecular testing, the surgical management plan of only 7.9% (11 of 140) was altered as a result of the molecular test. All but 1 of those patients were found to be overtreated relative to the surgical histopathology analysis. Molecular testing did not significantly affect the surgical decision-making process in this study. Among patients whose treatment was altered based on these markers, there was evidence of overtreatment.
How to create a cardiac CT clinic.
Dowe, David A
2007-02-01
Coronary computed tomography (CT) angiography is taking an exponentially increasing role in the diagnostic algorithm for suspected coronary artery disease. It has the immediate potential to replace stress tests as the first study a patient receives when suspected of having coronary artery disease. In the near future, it will likely precede all elective, diagnostic cardiac catheterizations owing to its extraordinary negative predictive value. This paper discusses the 3 building blocks of a successful cardiac CT clinic: image quality, service, and marketing. It then discusses the significant differences in establishing a cardiac CT clinic depending on whether the radiologist is hospital based or private office based.
de Bildt, Annelies; Sytema, Sjoerd; Meffert, Harma; Bastiaansen, Jojanneke A C J
2016-01-01
This study examined the discriminative ability of the revised Autism Diagnostic Observation Schedule module 4 algorithm (Hus and Lord in J Autism Dev Disord 44(8):1996-2012, 2014) in 93 Dutch males with Autism Spectrum Disorder (ASD), schizophrenia, psychopathy or controls. The discriminative ability of the revised algorithm's ASD cut-off resembled that of the original algorithm's ASD cut-off: highly specific for psychopathy and controls, with lower sensitivity than Hus and Lord (2014; i.e. ASD .61, AD .53). The revised algorithm's AD cut-off improved sensitivity over the original algorithm. Discriminating ASD from schizophrenia was still challenging, but the better-balanced sensitivity (.53) and specificity (.78) of the revised algorithm's AD cut-off may aid clinicians' differential diagnosis. The findings support using the revised algorithm, which is conceptually consistent with the other modules, thus improving comparability across the lifespan.
Severson, Carl A; Pendharkar, Sachin R; Ronksley, Paul E; Tsai, Willis H
2015-01-01
To assess the ability of electronic health data and existing screening tools to identify clinically significant obstructive sleep apnea (OSA), as defined by symptomatic or severe OSA. The present retrospective cohort study of 1041 patients referred for sleep diagnostic testing was undertaken at a tertiary sleep centre in Calgary, Alberta. A diagnosis of clinically significant OSA or an alternative sleep diagnosis was assigned to each patient through blinded independent chart review by two sleep physicians. Predictive variables were identified from online questionnaire data, and diagnostic algorithms were developed. The performance of electronically derived algorithms for identifying patients with clinically significant OSA was determined. Diagnostic performance of these algorithms was compared with versions of the STOP-Bang questionnaire and adjusted neck circumference score (ANC) derived from electronic data. Electronic questionnaire data were highly sensitive (>95%) at identifying clinically significant OSA, but not specific. Sleep diagnostic testing-determined respiratory disturbance index was very specific (specificity ≥95%) for clinically relevant disease, but not sensitive (<35%). Derived algorithms had similar accuracy to the STOP-Bang or ANC, but required fewer questions and calculations. These data suggest that a two-step process using a small number of clinical variables (maximizing sensitivity) and objective diagnostic testing (maximizing specificity) is required to identify clinically significant OSA. When used in an online setting, simple algorithms can identify clinically relevant OSA with similar performance to existing decision rules such as the STOP-Bang or ANC.
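The two-step strategy argued for above (a highly sensitive screen followed by a highly specific confirmatory test, with disease called only when both are positive) can be reasoned about with the standard serial-testing arithmetic, under a conditional-independence assumption. The numeric inputs below are illustrative, not the study's estimates.

```python
def serial_two_step(sens_screen, spec_screen, sens_confirm, spec_confirm):
    """Sensitivity/specificity of a serial strategy in which only screen-positive
    patients receive the confirmatory test and disease is called only when both
    tests are positive. Assumes the two tests' errors are conditionally
    independent given true disease status."""
    sens = sens_screen * sens_confirm
    spec = 1.0 - (1.0 - spec_screen) * (1.0 - spec_confirm)
    return sens, spec

# Illustrative values: a very sensitive but non-specific questionnaire screen,
# followed by a sensitive and highly specific objective test.
print(serial_two_step(0.96, 0.50, 0.90, 0.95))
```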
Chapman, Brian E.; Lee, Sean; Kang, Hyunseok Peter; Chapman, Wendy W.
2011-01-01
In this paper we describe an application called peFinder for document-level classification of CT pulmonary angiography reports. peFinder is based on a generalized version of the ConText algorithm, a simple text processing algorithm for identifying features in clinical report documents. peFinder was used to answer questions about the disease state (pulmonary emboli present or absent), the certainty state of the diagnosis (uncertainty present or absent), the temporal state of an identified pulmonary embolus (acute or chronic), and the technical quality state of the exam (diagnostic or not diagnostic). Gold standard answers for each question were determined from the consensus classifications of three human annotators. peFinder results were compared to naive Bayes’ classifiers using unigrams and bigrams. The sensitivities (and positive predictive values) for peFinder were 0.98(0.83), 0.86(0.96), 0.94(0.93), and 0.60(0.90) for disease state, quality state, certainty state, and temporal state respectively, compared to 0.68(0.77), 0.67(0.87), 0.62(0.82), and 0.04(0.25) for the naive Bayes’ classifier using unigrams, and 0.75(0.79), 0.52(0.69), 0.59(0.84), and 0.04(0.25) for the naive Bayes’ classifier using bigrams. PMID:21459155
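As a hedged illustration of the kind of unigram/bigram naive Bayes baseline that peFinder was compared against, the sketch below trains a bag-of-words classifier on a few invented CT pulmonary angiography impressions. It is not the authors' pipeline; the reports and labels are toy data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy report impressions (invented for illustration) and document-level disease state.
reports = [
    "acute pulmonary embolism in the right lower lobe",
    "no evidence of pulmonary embolism",
    "chronic pulmonary emboli, unchanged from prior study",
    "study limited by motion, no definite embolus identified",
]
pe_present = [1, 0, 1, 0]

for ngrams in [(1, 1), (1, 2)]:  # unigrams, then unigrams + bigrams
    model = make_pipeline(CountVectorizer(ngram_range=ngrams), MultinomialNB())
    model.fit(reports, pe_present)
    print(ngrams, model.predict(["no pulmonary embolism seen"]))
```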
Prince, Martin J; de Rodriguez, Juan Llibre; Noriega, L; Lopez, A; Acosta, Daisy; Albanese, Emiliano; Arizaga, Raul; Copeland, John RM; Dewey, Michael; Ferri, Cleusa P; Guerra, Mariella; Huang, Yueqin; Jacob, KS; Krishnamoorthy, ES; McKeigue, Paul; Sousa, Renata; Stewart, Robert J; Salas, Aquiles; Sosa, Ana Luisa; Uwakwa, Richard
2008-01-01
Background The criterion for dementia implicit in DSM-IV is widely used in research but not fully operationalised. The 10/66 Dementia Research Group sought to do this using assessments from their one-phase dementia diagnostic research interview, and to validate the resulting algorithm in a population-based study in Cuba. Methods The criterion was operationalised as a computerised algorithm, applying clinical principles, based upon the 10/66 cognitive tests, clinical interview and informant reports; the Community Screening Instrument for Dementia, the CERAD 10-word list learning and animal naming tests, the Geriatric Mental State, and the History and Aetiology Schedule – Dementia Diagnosis and Subtype. This was validated in Cuba against a local clinician DSM-IV diagnosis and the 10/66 dementia diagnosis (originally calibrated probabilistically against clinician DSM-IV diagnoses in the 10/66 pilot study). Results The DSM-IV sub-criteria were plausibly distributed among clinically diagnosed dementia cases and controls. The clinician diagnoses agreed better with 10/66 dementia diagnosis than with the more conservative computerised DSM-IV algorithm. The DSM-IV algorithm was particularly likely to miss less severe dementia cases. Those with a 10/66 dementia diagnosis who did not meet the DSM-IV criterion were less cognitively and functionally impaired compared with the DSM-IV-confirmed cases, but still grossly impaired compared with those free of dementia. Conclusion The DSM-IV criterion, strictly applied, defines a narrow category of unambiguous dementia characterized by marked impairment. It may be specific but incompletely sensitive to clinically relevant cases. The 10/66 dementia diagnosis defines a broader category that may be more sensitive, identifying genuine cases beyond those defined by our DSM-IV algorithm, with relevance to the estimation of the population burden of this disorder. PMID:18577205
Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian
2016-10-01
Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. To develop and evaluate automated methods for converting textual clinical diagnostic criteria in a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, whereas we invoked a machine learning algorithm based on the Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes of Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was 0.84 of precision, 0.86 of recall, and 0.85 of f-measure; the performance of CRFs-based classification was 0.95 of precision, 0.88 of recall and 0.91 of f-measure. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicated that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution in developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
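A minimal sketch of what a rule-based QDM datatype assignment step could look like. The keyword patterns are hypothetical stand-ins, since the paper's actual rule set is not reproduced here.

```python
import re

# Hypothetical keyword rules mapping criterion text to QDM datatypes.
DATATYPE_RULES = [
    ("Laboratory Test", re.compile(r"\b(serum|plasma|blood|level|titer|mg/dl|mmol/l)\b", re.I)),
    ("Symptom", re.compile(r"\b(pain|fever|nausea|fatigue|cough|headache)\b", re.I)),
]

def assign_qdm_datatypes(criterion_text):
    """Return every QDM datatype whose keyword pattern matches the criterion."""
    matches = [name for name, pattern in DATATYPE_RULES if pattern.search(criterion_text)]
    return matches or ["Unclassified"]

print(assign_qdm_datatypes("Serum lipase level greater than 3 times the upper limit of normal"))
print(assign_qdm_datatypes("Abdominal pain radiating to the back"))
```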
The PHQ-PD as a Screening Tool for Panic Disorder in the Primary Care Setting in Spain
Wood, Cristina Mae; Ruíz-Rodríguez, Paloma; Tomás-Tomás, Patricia; Gracia-Gracia, Irene; Dongil-Collado, Esperanza; Iruarrizaga, M. Iciar
2016-01-01
Introduction Panic disorder is a common anxiety disorder and is highly prevalent in Spanish primary care centres. The use of validated tools can improve the detection of panic disorder in primary care populations, thus enabling referral for specialized treatment. The aim of this study is to determine the accuracy of the Patient Health Questionnaire-Panic Disorder (PHQ-PD) as a screening and diagnostic tool for panic disorder in Spanish primary care centres. Method We compared the psychometric properties of the PHQ-PD to the reference standard, the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I) interview. General practitioners referred 178 patients who completed the entire PHQ test, including the PHQ-PD, to undergo the SCID-I. The sensitivity, specificity, positive and negative predictive values and positive and negative likelihood ratios of the PHQ-PD were assessed. Results The operating characteristics of the PHQ-PD are moderate. The best cut-off score was 5 (sensitivity .77, specificity .72). Modifications to the questionnaire's algorithms improved test characteristics (sensitivity .77, specificity .72) compared to the original algorithm. The screening question alone yielded the highest sensitivity score (.83). Conclusion Although the modified algorithm of the PHQ-PD only yielded moderate results as a diagnostic test for panic disorder, it was better than the original. Using only the first question of the PHQ-PD showed the best psychometric properties (sensitivity). Based on these findings, we suggest the use of the screening questions for screening purposes and the modified algorithm for diagnostic purposes. PMID:27525977
Propulsion IVHM Technology Experiment
NASA Technical Reports Server (NTRS)
Chicatelli, Amy K.; Maul, William A.; Fulton, Christopher E.
2006-01-01
The Propulsion IVHM Technology Experiment (PITEX) successfully demonstrated real-time fault detection and isolation of a virtual reusable launch vehicle (RLV) main propulsion system (MPS). Specifically, the PITEX research project developed and applied a model-based diagnostic system for the MPS of the X-34 RLV, a space-launch technology demonstrator. The demonstration was simulation-based using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, which is the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 Main Propulsion System (MPS) was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real time on flight-like hardware. In an attempt to expose potential performance problems, the PITEX diagnostic system was subjected to numerous realistic effects in the simulated data including noise, sensor resolution, command/valve talkback information, and nominal build variations. In all cases, the PITEX system performed as required. The research demonstrated potential benefits of model-based diagnostics, defined performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.
Intelligent approach to prognostic enhancements of diagnostic systems
NASA Astrophysics Data System (ADS)
Vachtsevanos, George; Wang, Peng; Khiripet, Noppadon; Thakker, Ash; Galie, Thomas R.
2001-07-01
This paper introduces a novel methodology to prognostics based on a dynamic wavelet neural network construct and notions from the virtual sensor area. This research has been motivated and supported by the U.S. Navy's active interest in integrating advanced diagnostic and prognostic algorithms in existing Naval digital control and monitoring systems. A rudimentary diagnostic platform is assumed to be available providing timely information about incipient or impending failure conditions. We focus on the development of a prognostic algorithm capable of predicting accurately and reliably the remaining useful lifetime of a failing machine or component. The prognostic module consists of a virtual sensor and a dynamic wavelet neural network as the predictor. The virtual sensor employs process data to map real measurements into difficult to monitor fault quantities. The prognosticator uses a dynamic wavelet neural network as a nonlinear predictor. Means to manage uncertainty and performance metrics are suggested for comparison purposes. An interface to an available shipboard Integrated Condition Assessment System is described and applications to shipboard equipment are discussed. Typical results from pump failures are presented to illustrate the effectiveness of the methodology.
An Efficient Reachability Analysis Algorithm
NASA Technical Reports Server (NTRS)
Vatan, Farrokh; Fijany, Amir
2008-01-01
A document discusses a new algorithm for generating higher-order dependencies for diagnostic and sensor placement analysis when a system is described with a causal modeling framework. This innovation will be used in diagnostic and sensor optimization and analysis tools. Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in-situ platforms. This algorithm will serve as a powerful tool for technologies that satisfy a key requirement of autonomous spacecraft, including science instruments and in-situ missions.
Diagnostic and Remedial Learning Strategy Based on Conceptual Graphs
ERIC Educational Resources Information Center
Jong, BinShyan; Lin, TsongWuu; Wu, YuLung; Chan, Teyi
2004-01-01
Numerous scholars have applied conceptual graphs for explanatory purposes. This study devised the Remedial-Instruction Decisive path (RID path) algorithm for diagnosing individual students' learning situations. This study focuses on conceptual graphs. According to the concepts learned by students and the weight values of relations among these…
Bollestad, Marianne; Grude, Nils; Lindbaek, Morten
2015-06-01
To compare the clinical outcome of patients presenting with symptoms of uncomplicated cystitis who were seen by a doctor, with patients who were given treatment following a diagnostic algorithm. Randomized controlled trial. Out-of-hours service, Oslo, Norway. Women with typical symptoms of uncomplicated cystitis were included in the trial in the time period September 2010-November 2011. They were randomized into two groups. One group received standard treatment according to the diagnostic algorithm, the other group received treatment after a regular consultation by a doctor. Women (n = 441) aged 16-55 years. Mean age in both groups 27 years. Number of days until symptomatic resolution. No significant differences were found between the groups in the basic patient demographics, severity of symptoms, or percentage of urine samples with single culture growth. A median of three days until symptomatic resolution was found in both groups. By day four 79% in the algorithm group and 72% in the regular consultation group were free of symptoms (p = 0.09). The number of patients who contacted a doctor again in the follow-up period and received alternative antibiotic treatment was insignificantly higher (p = 0.08) after regular consultation than after treatment according to the diagnostic algorithm. There were no cases of severe pyelonephritis or hospital admissions during the follow-up period. Using a diagnostic algorithm is a safe and efficient method for treating women with symptoms of uncomplicated cystitis at an out-of-hours service. This simplification of treatment strategy can lead to a more rational use of consultation time and a stricter adherence to National Antibiotic Guidelines for a common disorder.
Marchetti, Antonio; Pace, Maria Vittoria; Di Lorito, Alessia; Canarecci, Sara; Felicioni, Lara; D'Antuono, Tommaso; Liberatore, Marcella; Filice, Giampaolo; Guetti, Luigi; Mucilli, Felice; Buttitta, Fiamma
2016-09-01
Anaplastic Lymphoma Kinase (ALK) gene rearrangements have been described in 3-5% of lung adenocarcinomas (ADC) and their identification is essential to select patients for treatment with ALK tyrosine kinase inhibitors. For several years, fluorescent in situ hybridization (FISH) has been considered as the only validated diagnostic assay. Currently, alternative methods are commercially available as diagnostic tests. A series of 217 ADC comprising 196 consecutive resected tumors and 21 ALK FISH-positive cases from an independent series of 702 ADC were investigated. All specimens were screened by IHC (ALK-D5F3-CDx-Ventana), FISH (Vysis ALK Break-Apart-Abbott) and RT-PCR (ALK RGQ RT-PCR-Qiagen). Results were compared and discordant cases subjected to Next Generation Sequencing. Thirty-nine of 217 samples were positive by the ALK RGQ RT-PCR assay, using a threshold cycle (Ct) cut-off ≤35.9, as recommended. Of these positive samples, 14 were negative by IHC and 12 by FISH. ALK RGQ RT-PCR/FISH discordant cases were analyzed by the NGS assay with results concordant with FISH data. In order to obtain the maximum level of agreement between FISH and ALK RGQ RT-PCR data, we introduced a new scoring algorithm based on the ΔCt value. A ΔCt cut-off level ≤3.5 was used in a pilot series. Then the algorithm was tested on a completely independent validation series. By using the new scoring algorithm and FISH as reference standard, the sensitivity and the specificity of the ALK RGQ RT-PCR(ΔCt) assay were 100% and 100%, respectively. Our results suggest that the ALK RGQ RT-PCR test could be useful in clinical practice as a complementary assay in multi-test diagnostic algorithms or even, if our data will be confirmed in independent studies, as a standalone or screening test for the selection of patients to be treated with ALK inhibitors. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
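The ΔCt-based scoring step can be illustrated with a small helper like the one below. The exact ΔCt definition used by the ALK RGQ RT-PCR kit is not specified here, so the sketch assumes ΔCt is the ALK fusion Ct minus an endogenous control Ct; the Ct values are invented.

```python
def alk_rtpcr_call(ct_target, ct_control, delta_ct_cutoff=3.5):
    """Classify a sample as ALK positive when the fusion transcript amplifies
    within `delta_ct_cutoff` cycles of an endogenous control (assumed ΔCt definition)."""
    delta_ct = ct_target - ct_control
    call = "ALK positive" if delta_ct <= delta_ct_cutoff else "ALK negative"
    return call, delta_ct

print(alk_rtpcr_call(ct_target=28.2, ct_control=25.4))  # ΔCt = 2.8 -> positive
print(alk_rtpcr_call(ct_target=33.0, ct_control=24.1))  # ΔCt = 8.9 -> negative
```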
Computer-aided diagnostic strategy selection.
Greenes, R A
1986-03-01
Determination of the optimal diagnostic work-up strategy for the patient is becoming a major concern for the practicing physician. Overlap of the indications for various diagnostic procedures, differences in their invasiveness or risk, and high costs have made physicians aware of the need to consider the choice of procedure carefully, as well as its relation to management actions available. In this article, the author discusses research approaches that aim toward development of formal decision analytic methods to allow the physician to determine optimal strategy; clinical algorithms or rules as guides to physician decisions; improved measures for characterizing the performance of diagnostic tests; educational tools for increasing the familiarity of physicians with the concepts underlying these measures and analytic procedures; and computer-based aids for facilitating the employment of these resources in actual clinical practice.
Tenório, Josceli Maria; Hummel, Anderson Diniz; Cohrs, Frederico Molina; Sdepanian, Vera Lucia; Pisa, Ivan Torres; de Fátima Marin, Heimar
2013-01-01
Background Celiac disease (CD) is a difficult-to-diagnose condition because of its multiple clinical presentations and symptoms shared with other diseases. Gold-standard diagnostic confirmation of suspected CD is achieved by biopsying the small intestine. Objective To develop a clinical decision–support system (CDSS) integrated with an automated classifier to recognize CD cases, by selecting from experimental models developed using intelligence artificial techniques. Methods A web-based system was designed for constructing a retrospective database that included 178 clinical cases for training. Tests were run on 270 automated classifiers available in Weka 3.6.1 using five artificial intelligence techniques, namely decision trees, Bayesian inference, k-nearest neighbor algorithm, support vector machines and artificial neural networks. The parameters evaluated were accuracy, sensitivity, specificity and area under the ROC curve (AUC). AUC was used as a criterion for selecting the CDSS algorithm. A testing database was constructed including 38 clinical CD cases for CDSS evaluation. The diagnoses suggested by CDSS were compared with those made by physicians during patient consultations. Results The most accurate method during the training phase was the averaged one-dependence estimator (AODE) algorithm (a Bayesian classifier), which showed accuracy 80.0%, sensitivity 0.78, specificity 0.80 and AUC 0.84. This classifier was integrated into the web-based decision–support system. The gold-standard validation of CDSS achieved accuracy of 84.2% and k = 0.68 (p < 0.0001) with good agreement. The same accuracy was achieved in the comparison between the physician’s diagnostic impression and the gold standard k = 0. 64 (p < 0.0001). There was moderate agreement between the physician’s diagnostic impression and CDSS k = 0.46 (p = 0.0008). Conclusions The study results suggest that CDSS could be used to help in diagnosing CD, since the algorithm tested achieved excellent accuracy in differentiating possible positive from negative CD diagnoses. This study may contribute towards developing of a computer-assisted environment to support CD diagnosis. PMID:21917512
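The AUC-driven model selection described above can be approximated with standard tooling, as in the sketch below. AODE (the classifier the authors ultimately chose) is not available in scikit-learn, so common stand-in classifiers and a synthetic dataset are used purely to show the selection criterion.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the retrospective training set (178 cases, binary CD label).
X, y = make_classification(n_samples=178, n_features=20, n_informative=6, random_state=0)

candidates = {
    "naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(),
    "decision tree": DecisionTreeClassifier(random_state=0),
}

# Rank candidate classifiers by cross-validated AUC, mirroring the selection criterion.
for name, clf in candidates.items():
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```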
Murungi, Moses; Fulton, Travis; Reyes, Raquel; Matte, Michael; Ntaro, Moses; Mulogo, Edgar; Nyehangane, Dan; Juliano, Jonathan J; Siedner, Mark J; Boum, Yap; Boyce, Ross M
2017-05-01
Poor specificity may negatively impact rapid diagnostic test (RDT)-based diagnostic strategies for malaria. We performed real-time PCR on a subset of subjects who had undergone diagnostic testing with a multiple-antigen (histidine-rich protein 2 and pan-lactate dehydrogenase [HRP2/pLDH]) RDT and microscopy. We determined the sensitivity and specificity of the RDT in comparison to results of PCR for the detection of Plasmodium falciparum malaria. We developed and evaluated a two-step algorithm utilizing the multiple-antigen RDT to screen patients, followed by confirmatory microscopy for those individuals with HRP2-positive (HRP2+)/pLDH-negative (pLDH-) results. In total, dried blood spots (DBS) were collected from 276 individuals. There were 124 (44.9%) individuals with an HRP2+/pLDH+ result, 94 (34.1%) with an HRP2+/pLDH- result, and 58 (21%) with a negative RDT result. The sensitivity and specificity of the RDT compared to results with real-time PCR were 99.4% (95% confidence interval [CI], 95.9 to 100.0%) and 46.7% (95% CI, 37.7 to 55.9%), respectively. Of the 94 HRP2+/pLDH- results, only 32 (34.0%) and 35 (37.2%) were positive by microscopy and PCR, respectively. The sensitivity and specificity of the two-step algorithm compared to results with real-time PCR were 95.5% (95% CI, 90.5 to 98.0%) and 91.0% (95% CI, 84.1 to 95.2%), respectively. HRP2 antigen bands demonstrated poor specificity for the diagnosis of malaria compared to that of real-time PCR in a high-transmission setting. The most likely explanation for this finding is the persistence of HRP2 antigenemia following treatment of an acute infection. The two-step diagnostic algorithm utilizing microscopy as a confirmatory test for indeterminate HRP2+/pLDH- results showed significantly improved specificity with little loss of sensitivity in a high-transmission setting. Copyright © 2017 American Society for Microbiology.
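A minimal sketch of the two-step decision rule and its evaluation against a PCR reference. The per-patient records below are invented; only the logic (treat concordant RDT positives, confirm indeterminate HRP2+/pLDH- results by microscopy, then compute sensitivity and specificity against PCR) follows the abstract.

```python
def two_step_result(hrp2, pldh, microscopy):
    """Final call of the two-step algorithm: RDT screens everyone; only
    indeterminate HRP2+/pLDH- results are confirmed by microscopy."""
    if hrp2 and pldh:
        return True            # concordant positive RDT: call malaria
    if hrp2 and not pldh:
        return microscopy      # indeterminate: defer to microscopy
    return False               # RDT negative

def sensitivity_specificity(predictions, reference):
    tp = sum(p and r for p, r in zip(predictions, reference))
    tn = sum((not p) and (not r) for p, r in zip(predictions, reference))
    fp = sum(p and (not r) for p, r in zip(predictions, reference))
    fn = sum((not p) and r for p, r in zip(predictions, reference))
    return tp / (tp + fn), tn / (tn + fp)

# Toy records: (HRP2 band, pLDH band, microscopy, PCR reference) - illustrative only.
records = [
    (True, True, True, True),
    (True, False, False, False),   # persistent HRP2 antigenemia, PCR negative
    (True, False, True, True),
    (False, False, False, False),
    (True, True, True, True),
]
calls = [two_step_result(h, p, m) for h, p, m, _ in records]
reference = [r for _, _, _, r in records]
print(sensitivity_specificity(calls, reference))
```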
Kress, Inge Ulrike; Paslakis, Georgios; Erim, Yesim
2018-03-01
The present review investigates the prevalence and medical causes of food-related gastrointestinal symptoms in eating disorder (ED) patients and recommends a diagnostic algorithm based on the current literature. A literature search was conducted, which included publications from January 2000 until January 2017. Results: Over 90% of ED patients suffer from food-related symptoms. There is no evidence for a higher prevalence of immunological or structural gastrointestinal disorders in ED patients compared to the healthy population. Most food-related symptoms in ED patients are likely to be functional. Diagnostic work-up of food-related symptoms in ED patients needs to be based on clinical history. Only if timing and quality of symptoms point towards a disorder independent from the ED is a comprehensive diagnostic work-up necessary.
Veta, Mitko; Johannes van Diest, Paul; van Ginneken, Bram; Karssemeijer, Nico; Litjens, Geert; van der Laak, Jeroen A. W. M.; Hermsen, Meyke; Manson, Quirine F; Balkenhol, Maschenka; Geessink, Oscar; Stathonikos, Nikolaos; van Dijk, Marcory CRF; Bult, Peter; Beca, Francisco; Beck, Andrew H; Wang, Dayong; Khosla, Aditya; Gargeya, Rishab; Irshad, Humayun; Zhong, Aoxiao; Dou, Qi; Li, Quanzheng; Chen, Hao; Lin, Huang-Jing; Heng, Pheng-Ann; Haß, Christian; Bruni, Elia; Wong, Quincy; Halici, Ugur; Öner, Mustafa Ümit; Cetin-Atalay, Rengul; Berseth, Matt; Khvatkov, Vitali; Vylegzhanin, Alexei; Kraus, Oren; Shaban, Muhammad; Rajpoot, Nasir; Awan, Ruqayya; Sirinukunwattana, Korsuk; Qaiser, Talha; Tsang, Yee-Wah; Tellez, David; Annuscheit, Jonas; Hufnagl, Peter; Valkonen, Mira; Kartasalo, Kimmo; Latonen, Leena; Ruusuvuori, Pekka; Liimatainen, Kaisa; Albarqouni, Shadi; Mungal, Bharti; George, Ami; Demirci, Stefanie; Navab, Nassir; Watanabe, Seiryo; Seno, Shigeto; Takenaka, Yoichi; Matsuda, Hideo; Ahmady Phoulady, Hady; Kovalev, Vassili; Kalinovsky, Alexander; Liauchuk, Vitali; Bueno, Gloria; Fernandez-Carrobles, M. Milagro; Serrano, Ismael; Deniz, Oscar; Racoceanu, Daniel; Venâncio, Rui
2017-01-01
Importance Application of deep learning algorithms to whole-slide pathology images can potentially improve diagnostic accuracy and efficiency. Objective Assess the performance of automated deep learning algorithms at detecting metastases in hematoxylin and eosin–stained tissue sections of lymph nodes of women with breast cancer and compare it with pathologists’ diagnoses in a diagnostic setting. Design, Setting, and Participants Researcher challenge competition (CAMELYON16) to develop automated solutions for detecting lymph node metastases (November 2015-November 2016). A training data set of whole-slide images from 2 centers in the Netherlands with (n = 110) and without (n = 160) nodal metastases verified by immunohistochemical staining were provided to challenge participants to build algorithms. Algorithm performance was evaluated in an independent test set of 129 whole-slide images (49 with and 80 without metastases). The same test set of corresponding glass slides was also evaluated by a panel of 11 pathologists with time constraint (WTC) from the Netherlands to ascertain likelihood of nodal metastases for each slide in a flexible 2-hour session, simulating routine pathology workflow, and by 1 pathologist without time constraint (WOTC). Exposures Deep learning algorithms submitted as part of a challenge competition or pathologist interpretation. Main Outcomes and Measures The presence of specific metastatic foci and the absence vs presence of lymph node metastasis in a slide or image using receiver operating characteristic curve analysis. The 11 pathologists participating in the simulation exercise rated their diagnostic confidence as definitely normal, probably normal, equivocal, probably tumor, or definitely tumor. Results The area under the receiver operating characteristic curve (AUC) for the algorithms ranged from 0.556 to 0.994. The top-performing algorithm achieved a lesion-level, true-positive fraction comparable with that of the pathologist WOTC (72.4% [95% CI, 64.3%-80.4%]) at a mean of 0.0125 false-positives per normal whole-slide image. For the whole-slide image classification task, the best algorithm (AUC, 0.994 [95% CI, 0.983-0.999]) performed significantly better than the pathologists WTC in a diagnostic simulation (mean AUC, 0.810 [range, 0.738-0.884]; P < .001). The top 5 algorithms had a mean AUC that was comparable with the pathologist interpreting the slides in the absence of time constraints (mean AUC, 0.960 [range, 0.923-0.994] for the top 5 algorithms vs 0.966 [95% CI, 0.927-0.998] for the pathologist WOTC). Conclusions and Relevance In the setting of a challenge competition, some deep learning algorithms achieved better diagnostic performance than a panel of 11 pathologists participating in a simulation exercise designed to mimic routine pathology workflow; algorithm performance was comparable with an expert pathologist interpreting whole-slide images without time constraints. Whether this approach has clinical utility will require evaluation in a clinical setting. PMID:29234806
NASA Technical Reports Server (NTRS)
Lindsey, Tony; Pecheur, Charles
2004-01-01
Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Roy, Surajit; Hirt, Evelyn H.
2014-09-12
This report describes research results to date in support of the integration and demonstration of diagnostics technologies for prototypical AdvSMR passive components (to establish condition indices for monitoring) with model-based prognostics methods. The focus of the PHM methodology and algorithm development in this study is at the localized scale. Multiple localized measurements of material condition (using advanced nondestructive measurement methods), along with available measurements of the stressor environment, enhance the performance of localized diagnostics and prognostics of passive AdvSMR components and systems.
Bar-Cohen, Yaniv; Khairy, Paul; Morwood, James; Alexander, Mark E; Cecchin, Frank; Berul, Charles I
2006-07-01
ECG algorithms used to localize accessory pathways (AP) in patients with Wolff-Parkinson-White (WPW) syndrome have been validated in adults, but less is known of their use in children, especially in patients with congenital heart disease (CHD). We hypothesize that these algorithms have low diagnostic accuracy in children and even lower in those with CHD. Pre-excited ECGs in 43 patients with WPW and CHD (median age 5.4 years [0.9-32 years]) were evaluated and compared to 43 consecutive WPW control patients without CHD (median age 14.5 years [1.8-18 years]). Two blinded observers predicted AP location using 2 adult and 1 pediatric WPW algorithms, and a third blinded observer served as a tiebreaker. Predicted locations were compared with ablation-verified AP location to identify (a) exact match for AP location and (b) match for laterality (left-sided vs right-sided AP). In control children, adult algorithms were accurate in only 56% and 60%, while the pediatric algorithm was correct in 77%. In 19 patients with Ebstein's anomaly, diagnostic accuracy was similar to controls with at times an even better ability to predict laterality. In non-Ebstein's CHD, however, the algorithms were markedly worse (29% for the adult algorithms and 42% for the pediatric algorithms). A relatively large degree of interobserver variability was seen (kappa values from 0.30 to 0.58). Adult localization algorithms have poor diagnostic accuracy in young patients with and without CHD. Both adult and pediatric algorithms are particularly misleading in non-Ebstein's CHD patients and should be interpreted with caution.
Moon, Hee-Won; Kim, Hyeong Nyeon; Hur, Mina; Shim, Hee Sook; Kim, Heejung; Yun, Yeo-Min
2016-01-01
Since every single test has limitations for detecting toxigenic Clostridium difficile, multistep algorithms are recommended. This study aimed to compare the current, representative diagnostic algorithms for detecting toxigenic C. difficile, using VIDAS C. difficile toxin A&B (toxin ELFA), VIDAS C. difficile GDH (GDH ELFA, bioMérieux, Marcy-l'Etoile, France), and Xpert C. difficile (Cepheid, Sunnyvale, California, USA). In 271 consecutive stool samples, toxigenic culture, toxin ELFA, GDH ELFA, and Xpert C. difficile were performed. We simulated two algorithms: screening by GDH ELFA and confirmation by Xpert C. difficile (GDH + Xpert), and a combined algorithm of GDH ELFA, toxin ELFA, and Xpert C. difficile (GDH + Toxin + Xpert). The performance of each assay and algorithm was assessed. The agreement of Xpert C. difficile and the two algorithms (GDH + Xpert and GDH + Toxin + Xpert) with toxigenic culture was strong (kappa: 0.848, 0.857, and 0.868, respectively). The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the algorithms (GDH + Xpert and GDH + Toxin + Xpert) were 96.7%, 95.8%, 85.0%, 98.1%, and 94.5%, 95.8%, 82.3%, 98.5%, respectively. There were no significant differences between Xpert C. difficile and the two algorithms in sensitivity, specificity, PPV and NPV. The performances of both algorithms for detecting toxigenic C. difficile were comparable to that of Xpert C. difficile. Either algorithm would be useful in clinical laboratories and can be optimized in the diagnostic workflow of C. difficile depending on costs, test volume, and clinical needs.
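Agreement with toxigenic culture is summarized above with Cohen's kappa; a minimal sketch of that calculation on invented per-sample calls:

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative per-sample calls (1 = toxigenic C. difficile detected), not study data.
toxigenic_culture = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
gdh_xpert_algo    = [1, 0, 0, 1, 1, 0, 1, 0, 1, 0]

# Chance-corrected agreement between the algorithm and the culture reference.
print(cohen_kappa_score(toxigenic_culture, gdh_xpert_algo))
```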
Chaotic Particle Swarm Optimization with Mutation for Classification
Assarzadeh, Zahra; Naghsh-Nilchi, Ahmad Reza
2015-01-01
In this paper, a chaotic, mutation-based classifier particle swarm optimization is proposed to classify patterns of different classes in the feature space. The introduced mutation operators and chaotic sequences allow us to overcome the problem of early convergence to a local minimum associated with particle swarm optimization algorithms. That is, the mutation operator sharpens the convergence and tunes the best possible solution. Furthermore, to remove irrelevant data and reduce the dimensionality of medical datasets, a feature selection approach using a binary version of the proposed particle swarm optimization is introduced. To demonstrate the effectiveness of the proposed classifier, mutation-based classifier particle swarm optimization, it is evaluated on three classification datasets, namely Wisconsin diagnostic breast cancer, Wisconsin breast cancer, and heart-statlog, which have different feature vector dimensions. The proposed algorithm is compared with different classifier algorithms including k-nearest neighbor, as a conventional classifier, and particle swarm-classifier, genetic algorithm, and Imperialist competitive algorithm-classifier, as more sophisticated ones. The performance of each classifier was evaluated by calculating the accuracy, sensitivity, specificity and Matthews correlation coefficient. The experimental results show that the mutation-based classifier particle swarm optimization unequivocally performs better than all the compared algorithms. PMID:25709937
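A compact sketch of the core mechanics named above: chaotic coefficient generation via a logistic map, standard particle-swarm updates, and a mutation step to escape local minima, applied to a toy objective. The inertia and acceleration constants are assumed values; a real classifier PSO would encode decision boundaries and use classification error as the fitness.

```python
import random

def sphere(x):
    """Toy objective; in the paper the fitness would be a classification error."""
    return sum(v * v for v in x)

def chaotic_pso(dim=5, n_particles=20, iters=200, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [sphere(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    chaos = 0.7  # logistic-map state replacing uniform random coefficients

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                chaos = 4.0 * chaos * (1.0 - chaos)   # chaotic sequence
                r1 = chaos
                chaos = 4.0 * chaos * (1.0 - chaos)
                r2 = chaos
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = sphere(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
        # Mutation: perturb one random particle to help escape local minima.
        m = rng.randrange(n_particles)
        pos[m] = [v + rng.gauss(0, 0.5) for v in pos[m]]
    return gbest_val

print(chaotic_pso())
```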
Kariuki, Jacob K; Gona, Philimon; Leveille, Suzanne G; Stuart-Shor, Eileen M; Hayman, Laura L; Cromwell, Jerry
2018-06-01
The non-lab Framingham algorithm, which substitutes body mass index for lipids in the laboratory-based (lab-based) Framingham algorithm, has been validated among African Americans (AAs). However, its cost-effectiveness and economic tradeoffs have not been evaluated. This study examines the incremental cost-effectiveness ratio (ICER) of two cardiovascular disease (CVD) prevention programs guided by the non-lab versus lab-based Framingham algorithm. We simulated the World Health Organization CVD prevention guidelines on a cohort of 2690 AA participants in the Atherosclerosis Risk in Communities (ARIC) cohort. Costs were estimated using Medicare fee schedules (diagnostic tests, drugs & visits), Bureau of Labor Statistics data (RN wages), and estimates for managing incident CVD events. Outcomes were assumed to be true positive cases detected at a data-driven treatment threshold. Both algorithms had the best balance of sensitivity/specificity at the moderate risk threshold (>10% risk). Over 12 years, 82% and 77% of 401 incident CVD events were accurately predicted via the non-lab and lab-based Framingham algorithms, respectively. There were 20 fewer false negative cases in the non-lab approach, translating into over $900,000 in savings over 12 years. The ICER was -$57,153 for every extra CVD event prevented when using the non-lab algorithm. The approach guided by the non-lab Framingham strategy dominated the lab-based approach with respect to both costs and predictive ability. Consequently, the non-lab Framingham algorithm could potentially provide a highly effective screening tool at lower cost to address the high burden of CVD, especially among AAs and in resource-constrained settings where lab tests are unavailable. Copyright © 2017 Elsevier Inc. All rights reserved.
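The headline figure is an incremental cost-effectiveness ratio; a minimal sketch of that calculation is below. The cost and event counts are invented to show the interpretation (a negative ICER when the new strategy is both cheaper and more effective, i.e. dominant); they are not the ARIC-based estimates.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect
    (here, per additional CVD event correctly predicted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Illustrative totals for a screened cohort (assumed numbers only): the new
# strategy costs less and detects more events, so the ICER is negative and
# the strategy dominates.
print(icer(cost_new=1_100_000, effect_new=329, cost_old=2_000_000, effect_old=309))
```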
Counting malaria parasites with a two-stage EM-based algorithm using crowdsourced data.
Cabrera-Bean, Margarita; Pages-Zamora, Alba; Diaz-Vilor, Carles; Postigo-Camps, Maria; Cuadrado-Sanchez, Daniel; Luengo-Oroz, Miguel Angel
2017-07-01
Worldwide malaria eradication is currently one of the WHO's main global goals. In this work, we focus on the use of human-machine interaction strategies for low-cost, fast, and reliable malaria diagnosis based on a crowdsourced approach. The technical problem addressed consists of detecting spots in images even under very harsh conditions in which positive objects closely resemble artifacts. The clicks or tags delivered by several annotators labeling an image are modeled as a robust finite mixture, and techniques based on the Expectation-Maximization (EM) algorithm are proposed for accurately counting malaria parasites on thick blood smears obtained by microscopic Giemsa staining. This approach outperforms other traditional methods, as shown through experimentation with real data.
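As a simplified stand-in for the robust-mixture EM described above, the sketch below pools synthetic annotator clicks and fits Gaussian mixtures by EM (scikit-learn), selecting the number of clusters by BIC. A real pipeline would add a robust outlier component and filter clusters by annotator support; all data here are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic crowd clicks around 3 true parasite locations, plus stray tags (artifacts).
true_spots = np.array([[20.0, 30.0], [55.0, 60.0], [80.0, 15.0]])
clicks = np.vstack(
    [spot + rng.normal(scale=1.5, size=(12, 2)) for spot in true_spots]
    + [rng.uniform(0, 100, size=(4, 2))]   # annotator noise / false tags
)

# Fit mixtures by EM for several candidate counts and keep the best BIC.
best_k, best_bic = None, np.inf
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(clicks)
    bic = gmm.bic(clicks)
    if bic < best_bic:
        best_k, best_bic = k, bic

print("candidate spot clusters selected by BIC:", best_k)
```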
Threshold Assessment of Gear Diagnostic Tools on Flight and Test Rig Data
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Mosher, Marianne; Huff, Edward M.
2003-01-01
A method for defining thresholds for vibration-based algorithms that provides the minimum number of false alarms while maintaining sensitivity to gear damage was developed. This analysis focused on two vibration-based gear damage detection algorithms, FM4 and MSA. The method was developed using vibration data collected during surface fatigue tests performed in a spur gearbox rig. The thresholds were defined based on damage progression during tests with damage. The thresholds' false alarm rates were then evaluated on spur gear tests without damage. Next, the same thresholds were applied to flight data from an OH-58 helicopter transmission. Results showed that thresholds defined in test rigs can be used to define thresholds in flight that correctly classify the transmission operation as normal.
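A minimal sketch of the thresholding idea: set the limit from the no-damage (baseline) distribution of a condition index so that the false-alarm rate stays below a target, then check sensitivity on damaged runs. The Gaussian stand-ins below are assumed values, not the FM4/MSA rig data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-ins for a per-record, vibration-based condition index.
healthy_runs = rng.normal(loc=3.0, scale=0.3, size=500)   # no-damage test rig data
damaged_runs = rng.normal(loc=4.2, scale=0.6, size=200)   # runs with seeded gear damage

# Choose the threshold as a high percentile of the healthy distribution so that
# the false-alarm rate on undamaged data stays below a target (here ~1%).
threshold = np.percentile(healthy_runs, 99)

false_alarm_rate = np.mean(healthy_runs > threshold)
detection_rate = np.mean(damaged_runs > threshold)
print(f"threshold={threshold:.2f}  false alarms={false_alarm_rate:.1%}  detections={detection_rate:.1%}")
```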
Schiffman, Eric; Ohrbach, Richard; Truelove, Edmond; Look, John; Anderson, Gary; Goulet, Jean-Paul; List, Thomas; Svensson, Peter; Gonzalez, Yoly; Lobbezoo, Frank; Michelotti, Ambra; Brooks, Sharon L.; Ceusters, Werner; Drangsholt, Mark; Ettlin, Dominik; Gaul, Charly; Goldberg, Louis J.; Haythornthwaite, Jennifer A.; Hollender, Lars; Jensen, Rigmor; John, Mike T.; De Laat, Antoon; de Leeuw, Reny; Maixner, William; van der Meulen, Marylee; Murray, Greg M.; Nixdorf, Donald R.; Palla, Sandro; Petersson, Arne; Pionchon, Paul; Smith, Barry; Visscher, Corine M.; Zakrzewska, Joanna; Dworkin, Samuel F.
2015-01-01
Aims The original Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Axis I diagnostic algorithms have been demonstrated to be reliable. However, the Validation Project determined that the RDC/TMD Axis I validity was below the target sensitivity of ≥ 0.70 and specificity of ≥ 0.95. Consequently, these empirical results supported the development of revised RDC/TMD Axis I diagnostic algorithms that were subsequently demonstrated to be valid for the most common pain-related TMD and for one temporomandibular joint (TMJ) intra-articular disorder. The original RDC/TMD Axis II instruments were shown to be both reliable and valid. Working from these findings and revisions, two international consensus workshops were convened, from which recommendations were obtained for the finalization of new Axis I diagnostic algorithms and new Axis II instruments. Methods Through a series of workshops and symposia, a panel of clinical and basic science pain experts modified the revised RDC/TMD Axis I algorithms by using comprehensive searches of published TMD diagnostic literature followed by review and consensus via a formal structured process. The panel's recommendations for further revision of the Axis I diagnostic algorithms were assessed for validity by using the Validation Project's data set, and for reliability by using newly collected data from the ongoing TMJ Impact Project—the follow-up study to the Validation Project. New Axis II instruments were identified through a comprehensive search of the literature providing valid instruments that, relative to the RDC/TMD, are shorter in length, are available in the public domain, and currently are being used in medical settings. Results The newly recommended Diagnostic Criteria for TMD (DC/TMD) Axis I protocol includes both a valid screener for detecting any pain-related TMD as well as valid diagnostic criteria for differentiating the most common pain-related TMD (sensitivity ≥ 0.86, specificity ≥ 0.98) and for one intra-articular disorder (sensitivity of 0.80 and specificity of 0.97). Diagnostic criteria for other common intra-articular disorders lack adequate validity for clinical diagnoses but can be used for screening purposes. Inter-examiner reliability for the clinical assessment associated with the validated DC/TMD criteria for pain-related TMD is excellent (kappa ≥ 0.85). Finally, a comprehensive classification system that includes both the common and less common TMD is also presented. The Axis II protocol retains selected original RDC/TMD screening instruments augmented with new instruments to assess jaw function as well as behavioral and additional psychosocial factors. The Axis II protocol is divided into screening and comprehensive self-report instrument sets. The screening instruments’ 41 questions assess pain intensity, pain-related disability, psychological distress, jaw functional limitations, and parafunctional behaviors, and a pain drawing is used to assess locations of pain. The comprehensive instruments, composed of 81 questions, assess in further detail jaw functional limitations and psychological distress as well as additional constructs of anxiety and presence of comorbid pain conditions. Conclusion The recommended evidence-based new DC/TMD protocol is appropriate for use in both clinical and research settings. More comprehensive instruments augment short and simple screening instruments for Axis I and Axis II. 
These validated instruments allow for identification of patients with a range of simple to complex TMD presentations. PMID:24482784
Zhu, Bohui; Ding, Yongsheng; Hao, Kuangrong
2013-01-01
This paper presents a novel maximum margin clustering method with immune evolution (IEMMC) for automatic diagnosis of electrocardiogram (ECG) arrhythmias. The diagnostic system consists of signal processing, feature extraction, and the IEMMC algorithm for clustering of ECG arrhythmias. First, the raw ECG signal is processed by an adaptive ECG filter based on wavelet transforms and the ECG waveform is detected; then, features are extracted from the ECG signal to cluster different types of arrhythmias with the IEMMC algorithm. Three performance indicators, namely sensitivity, specificity, and accuracy, are used to assess the effect of the IEMMC method on ECG arrhythmia detection. Compared with the K-means and iterSVR algorithms, the IEMMC algorithm shows better performance not only in clustering results but also in global search and convergence ability, which demonstrates its effectiveness for the detection of ECG arrhythmias. PMID:23690875
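A minimal sketch of this kind of pipeline is shown below: wavelet-based denoising followed by unsupervised clustering of simple beat features. It uses the K-means baseline the authors compare against rather than the IEMMC algorithm itself, and the beat data, feature choices, and parameters are illustrative assumptions rather than the paper's.

```python
# Sketch of a wavelet-denoise / feature-extract / cluster pipeline, using the
# K-means baseline mentioned above (not IEMMC). All data and parameters are
# synthetic placeholders for illustration only.
import numpy as np
import pywt
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def wavelet_denoise(signal, wavelet="db4", level=4):
    # soft-threshold denoising, a common stand-in for the adaptive ECG filter
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def beat_features(beat):
    # simple morphological features per beat window (illustrative choice)
    return [beat.max(), beat.min(), beat.std(), np.abs(beat).sum()]

rng = np.random.default_rng(0)
beats = rng.normal(size=(200, 256))          # placeholder windows around R-peaks
X = np.array([beat_features(wavelet_denoise(b)) for b in beats])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))
print(np.bincount(labels))
```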
Sie, Daoud; Snijders, Peter J F; Meijer, Gerrit A; Doeleman, Marije W; van Moorsel, Marinda I H; van Essen, Hendrik F; Eijk, Paul P; Grünberg, Katrien; van Grieken, Nicole C T; Thunnissen, Erik; Verheul, Henk M; Smit, Egbert F; Ylstra, Bauke; Heideman, Daniëlle A M
2014-10-01
Next generation DNA sequencing (NGS) holds promise for diagnostic applications, yet implementation in routine molecular pathology practice requires performance evaluation on DNA derived from routine formalin-fixed paraffin-embedded (FFPE) tissue specimens. The current study presents a comprehensive analysis of TruSeq Amplicon Cancer Panel-based NGS using a MiSeq Personal sequencer (TSACP-MiSeq-NGS) for somatic mutation profiling. TSACP-MiSeq-NGS (testing 212 hotspot mutation amplicons of 48 genes) and a data analysis pipeline were evaluated in a retrospective learning/test set approach (n = 58/n = 45 FFPE-tumor DNA samples) against 'gold standard' high-resolution-melting (HRM)-sequencing for the genes KRAS, EGFR, BRAF and PIK3CA. Next, the performance of the validated test algorithm was assessed in an independent, prospective cohort of FFPE-tumor DNA samples (n = 75). In the learning set, a set of minimum parameter settings was defined to decide whether an FFPE-DNA sample qualifies for TSACP-MiSeq-NGS and for calling mutations. In the test set, the resulting test algorithm showed 82% (37/45) compliance with the quality criteria and 95% (35/37) concordance with HRM-sequencing for KRAS, EGFR, BRAF and PIK3CA (kappa = 0.92; 95% CI = 0.81-1.03). Subsequent application of the validated test algorithm to the prospective cohort yielded a success rate of 84% (63/75) and high concordance with HRM-sequencing (95% (60/63); kappa = 0.92; 95% CI = 0.84-1.01). TSACP-MiSeq-NGS detected 77 mutations in 29 additional genes. TSACP-MiSeq-NGS is suitable for diagnostic gene mutation profiling in oncopathology.
Defining the Needs for Next Generation Assays for Tuberculosis
Denkinger, Claudia M.; Kik, Sandra V.; Cirillo, Daniela Maria; Casenghi, Martina; Shinnick, Thomas; Weyer, Karin; Gilpin, Chris; Boehme, Catharina C.; Schito, Marco; Kimerling, Michael; Pai, Madhukar
2015-01-01
To accelerate the fight against tuberculosis, major diagnostic challenges need to be addressed urgently. Post-2015 targets are unlikely to be met without the use of novel diagnostics that are more accurate and can be used closer to where patients first seek care in affordable diagnostic algorithms. This article describes the efforts by the stakeholder community that led to the identification of the high-priority diagnostic needs in tuberculosis. Subsequently, target product profiles for the high-priority diagnostic needs were developed and reviewed in a World Health Organization (WHO)-led consensus meeting. The high-priority diagnostic needs included (1) a sputum-based replacement test for smear microscopy; (2) a non-sputum-based biomarker test for all forms of tuberculosis, ideally suitable for use at levels below microscopy centers; (3) a simple, low-cost triage test for use by first-contact care providers as a rule-out test, ideally suitable for use by community health workers; and (4) a rapid drug susceptibility test for use at the microscopy center level. The developed target product profiles, along with complementary work presented in this supplement, will help to facilitate the interaction between the tuberculosis community and the diagnostics industry with the goal of leading the way toward the post-2015 global tuberculosis targets. PMID:25765104
HIV misdiagnosis in sub-Saharan Africa: performance of diagnostic algorithms at six testing sites
Kosack, Cara S.; Shanks, Leslie; Beelaert, Greet; Benson, Tumwesigye; Savane, Aboubacar; Ng’ang’a, Anne; Andre, Bita; Zahinda, Jean-Paul BN; Fransen, Katrien; Page, Anne-Laure
2017-01-01
Introduction: We evaluated the diagnostic accuracy of HIV testing algorithms at six programmes in five sub-Saharan African countries. Methods: In this prospective multisite diagnostic evaluation study (Conakry, Guinea; Kitgum, Uganda; Arua, Uganda; Homa Bay, Kenya; Douala, Cameroon; and Baraka, Democratic Republic of Congo), samples from clients (at least five years of age) testing for HIV were collected and compared to a state-of-the-art algorithm from the AIDS reference laboratory at the Institute of Tropical Medicine, Belgium. The reference algorithm consisted of an enzyme-linked immuno-sorbent assay, a line-immunoassay, a single antigen-enzyme immunoassay and a DNA polymerase chain reaction test. Results: Between August 2011 and January 2015, over 14,000 clients were tested for HIV at the six HIV counselling and testing sites. Of those, 2786 (median age: 30; 38.1% males) were included in the study. Sensitivity of the testing algorithms ranged from 89.5% in Arua to 100% in Douala and Conakry, while specificity ranged from 98.3% in Douala to 100% in Conakry. Overall, 24 (0.9%) clients, and as many as 8 per site (1.7%), were misdiagnosed, with 16 false-positive and 8 false-negative results. Six false-negative specimens were retested with the on-site algorithm on the same sample and were found to be positive. Conversely, 13 false-positive specimens were retested: 8 remained false-positive with the on-site algorithm. Conclusions: The performance of algorithms at several sites failed to meet expectations and thresholds set by the World Health Organization, with unacceptably high rates of false results. Alongside the careful selection of rapid diagnostic tests and the validation of algorithms, strictly observing correct procedures can reduce the risk of false results. In the meantime, to identify false-positive diagnoses at initial testing, patients should be retested upon initiating antiretroviral therapy. PMID:28691437
Hybrid Kalman Filter: A New Approach for Aircraft Engine In-Flight Diagnostics
NASA Technical Reports Server (NTRS)
Kobayashi, Takahisa; Simon, Donald L.
2006-01-01
In this paper, a uniquely structured Kalman filter is developed for its application to in-flight diagnostics of aircraft gas turbine engines. The Kalman filter is a hybrid of a nonlinear on-board engine model (OBEM) and piecewise linear models. The utilization of the nonlinear OBEM allows the reference health baseline of the in-flight diagnostic system to be updated to the degraded health condition of the engines through a relatively simple process. Through this health baseline update, the effectiveness of the in-flight diagnostic algorithm can be maintained as the health of the engine degrades over time. Another significant aspect of the hybrid Kalman filter methodology is its capability to take advantage of conventional linear and nonlinear Kalman filter approaches. Based on the hybrid Kalman filter, an in-flight fault detection system is developed, and its diagnostic capability is evaluated in a simulation environment. Through the evaluation, the suitability of the hybrid Kalman filter technique for aircraft engine in-flight diagnostics is demonstrated.
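As context for the filtering approach described above, the sketch below shows a plain linear Kalman filter used for innovation (residual) monitoring, which is the generic mechanism behind Kalman-filter-based fault detection. It is not the paper's hybrid OBEM/piecewise-linear filter; the system matrices, noise covariances, and detection window are illustrative assumptions.

```python
# Generic residual-based fault detection with a linear Kalman filter; a simplified
# stand-in for the hybrid filter described above, with made-up dynamics (A, H, Q, R).
import numpy as np

def kalman_innovations(z, A, H, Q, R, x0, P0):
    """Return the innovation (measurement residual) sequence for measurements z."""
    x, P = x0.copy(), P0.copy()
    innovations = []
    for zk in z:
        x = A @ x                          # predict state
        P = A @ P @ A.T + Q                # predict covariance
        r = zk - H @ x                     # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ r                      # update state
        P = (np.eye(len(x)) - K @ H) @ P   # update covariance
        innovations.append(r)
    return np.array(innovations)

# Toy 2-sensor system; a sustained bias in one channel's innovations flags a possible fault.
A = np.eye(2); H = np.eye(2)
Q = 1e-4 * np.eye(2); R = 0.1 * np.eye(2)
rng = np.random.default_rng(0)
z = rng.normal(scale=0.3, size=(100, 2))
z[50:, 1] += 1.0                                    # simulated sensor fault at step 50
res = kalman_innovations(z, A, H, Q, R, np.zeros(2), np.eye(2))
print(np.abs(res[45:55]).mean(axis=0))              # channel 1 residual jumps at the fault
```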
The JET diagnostic fast central acquisition and trigger system (abstract)
NASA Astrophysics Data System (ADS)
Edwards, A. W.; Blackler, K.
1995-01-01
Most plasma physics diagnostics sample at a fixed frequency that is normally matched to available memory limits. This technique is not appropriate for long pulse machines such as JET where sampling frequencies of hundreds of kHz are required to diagnose very fast events. As a result of work using real-time event selection within the previous JET soft x-ray diagnostic, a single data acquisition and event triggering system for all suitable fast diagnostics, the fast central acquisition and trigger system (Fast CATS), has been developed for JET. The front-end analog-to-digital conversion (ADC) part samples all channels at 250 kHz, with a 100 kHz pass band and a stop band of 125 kHz. The back-end data collection system is based around Texas Instruments TMS320C40 microprocessors. Within this system, two levels of trigger algorithms are able to evaluate data. The first level typically analyzes data on a per diagnostic and individual channel basis. The second level looks at the data from one or more diagnostics in a window around the time of interest flagged by the first level system. Selection criteria defined by the diagnosticians are then imposed on the results from the second level to decide whether that data should be kept. The use of such a system involving intelligent real time trigger algorithms and fast data analysis will improve both the quantity and quality of JET diagnostic data, while providing valuable input to the design of data acquisition systems for very long pulse machines such as ITER. This paper will give an overview of the various elements of this new system. In addition, first results from this system following the restart of JET operation will be presented.
Diagnostic methods for insect sting allergy.
Hamilton, Robert G
2004-08-01
This review surveys advances from mid-2002 to the present in the validation and performance methods used in the diagnosis of Hymenoptera venom-induced immediate-type hypersensitivity. The general diagnostic algorithm for insect sting allergy is initially discussed with an examination of the AAAAI's 2003 revised practice parameter guidelines. Changes resulting from greater recognition of skin-test-negative systemic reactors include repeat analysis of all testing and acceptance of serology as a complementary diagnostic test to the skin test. Original data examining concordance of venom-specific IgE results produced by the second-generation Pharmacia CAP System with the Johns Hopkins University radioallergosorbent test are presented. Diagnostic performance of honeybee venom-specific IgE assays used in clinical laboratories in North America is discussed using data from the Diagnostic Allergy Proficiency Survey conducted by the College of American Pathologists. Validity of venom-specific IgE antibody in postmortem blood specimens is demonstrated. The utility of alternative in-vivo (provocation) and in-vitro (basophil-based) diagnostic testing methods is critiqued. This overview supports the following conclusions. Improved practice parameter guidelines include serology and skin test as complementary in supporting a positive clinical history during the diagnostic process. Data are provided which support the analytical performance of commercially available venom-specific IgE antibody serology-based assays. Intentional sting challenge in-vivo provocation, in-vitro basophil flow cytometry (CD63, CD203c) based assays, and in-vitro basophil histamine and sulfidoleukotriene release assays have their utility in the study of difficult diagnostic cases, but their use will remain supplementary, as secondary diagnostic tests.
Labourier, Emmanuel; Shifrin, Alexander; Busseniers, Anne E; Lupo, Mark A; Manganelli, Monique L; Andruss, Bernard; Wylie, Dennis; Beaudenon-Huibregtse, Sylvie
2015-07-01
Molecular testing for oncogenic mutations or gene expression in fine-needle aspirations (FNAs) from thyroid nodules with indeterminate cytology identifies a subset of benign or malignant lesions with high predictive value. This study aimed to evaluate a novel diagnostic algorithm combining mutation detection and miRNA expression to improve the diagnostic yield of molecular cytology. Surgical specimens and preoperative FNAs (n = 638) were tested for 17 validated gene alterations using the miRInform Thyroid test and with a 10-miRNA gene expression classifier generating positive (malignant) or negative (benign) results. Cross-sectional sampling of thyroid nodules with atypia of undetermined significance/follicular lesion of undetermined significance (AUS/FLUS) or follicular neoplasm/suspicious for a follicular neoplasm (FN/SFN) cytology (n = 109) was conducted at 12 endocrinology centers across the United States. Qualitative molecular results were compared with surgical histopathology to determine diagnostic performance and model clinical effect. Mutations were detected in 69% of nodules with malignant outcome. Among mutation-negative specimens, miRNA testing correctly identified 64% of malignant cases and 98% of benign cases. The diagnostic sensitivity and specificity of the combined algorithm was 89% (95% confidence interval [CI], 73-97%) and 85% (95% CI, 75-92%), respectively. At 32% cancer prevalence, 61% of the molecular results were benign with a negative predictive value of 94% (95% CI, 85-98%). Independently of variations in cancer prevalence, the test increased the yield of true benign results by 65% relative to mRNA-based gene expression classification and decreased the rate of avoidable diagnostic surgeries by 69%. Multiplatform testing for DNA, mRNA, and miRNA can accurately classify benign and malignant thyroid nodules, increase the diagnostic yield of molecular cytology, and further improve the preoperative risk-based management of benign nodules with AUS/FLUS or FN/SFN cytology.
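The two-step reporting logic described above can be summarized in a few lines; the sketch below is a simplification under the assumption that any detected panel mutation is reported as malignant and that mutation-negative samples fall through to the miRNA classifier. The function and its inputs are hypothetical illustrations, not the vendor's API.

```python
# Simplified sketch of the combined mutation + miRNA reporting logic described
# above. The function name and inputs are hypothetical illustrations.
def combined_molecular_result(mutation_detected: bool, mirna_classifier_positive: bool) -> str:
    if mutation_detected:
        return "positive (malignant)"           # any of the 17 validated gene alterations
    # mutation-negative specimens are triaged by the 10-miRNA expression classifier
    return "positive (malignant)" if mirna_classifier_positive else "negative (benign)"

print(combined_molecular_result(False, True))    # -> positive (malignant)
```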
Sex-specific performance of pre-imaging diagnostic algorithms for pulmonary embolism.
van Mens, T E; van der Pol, L M; van Es, N; Bistervels, I M; Mairuhu, A T A; van der Hulle, T; Klok, F A; Huisman, M V; Middeldorp, S
2018-05-01
Essentials: Decision rules for pulmonary embolism are used indiscriminately despite possible sex differences. Various pre-imaging diagnostic algorithms have been investigated in several prospective studies. When analysed at the individual patient data level, the algorithms perform similarly in both sexes. Estrogen use and male sex were associated with a higher prevalence of pulmonary embolism among patients with suspected disease. Background: In patients suspected of pulmonary embolism (PE), clinical decision rules are combined with D-dimer testing to rule out PE, avoiding the need for imaging in those at low risk. Despite sex differences in several aspects of the disease, including its diagnosis, these algorithms are used indiscriminately in women and men. Objectives: To compare the performance, defined as efficiency and failure rate, of three pre-imaging diagnostic algorithms for PE between women and men: the Wells rule with a fixed or an age-adjusted D-dimer cut-off, and a recently validated algorithm (YEARS). A secondary aim was to determine the sex-specific prevalence of PE. Methods: Individual patient data were obtained from six studies using the Wells rule (fixed D-dimer, n = 5; age-adjusted, n = 1) and from one study using the YEARS algorithm. All studies prospectively enrolled consecutive patients with suspected PE. Main outcomes were efficiency (the proportion of patients in whom the algorithm ruled out PE without imaging) and failure rate (the proportion of patients with PE not detected by the algorithm). Outcomes were estimated using (multilevel) logistic regression models. Results: The main outcomes showed no sex differences for any of the separate algorithms. With all three, the prevalence of PE was lower in women (OR 0.66, 0.68 and 0.74). In women, estrogen use, adjusted for age, was associated with lower efficiency and higher prevalence and D-dimer levels. Conclusions: The investigated pre-imaging diagnostic algorithms for patients suspected of PE show no sex differences in performance. Male sex and estrogen use are both associated with a higher probability of having the disease. © 2018 International Society on Thrombosis and Haemostasis.
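For orientation, the rule-out logic behind the three algorithms compared above can be sketched as follows, using commonly published D-dimer thresholds (in µg/L FEU); the exact score items and cut-offs used in the pooled studies may differ, so treat the values as assumptions.

```python
# Sketch of the pre-imaging rule-out logic for the three algorithms compared
# above, with commonly published thresholds; values are illustrative assumptions.
def wells_rule_out(wells_score: float, d_dimer: float, age: int,
                   age_adjusted: bool = False) -> bool:
    """PE 'unlikely' (Wells score <= 4) plus a negative D-dimer rules out PE without imaging."""
    cutoff = age * 10.0 if (age_adjusted and age > 50) else 500.0
    return wells_score <= 4 and d_dimer < cutoff

def years_rule_out(n_years_items: int, d_dimer: float) -> bool:
    """YEARS items: clinical signs of DVT, hemoptysis, and PE as the most likely diagnosis."""
    cutoff = 1000.0 if n_years_items == 0 else 500.0
    return d_dimer < cutoff

print(wells_rule_out(wells_score=3, d_dimer=620, age=78, age_adjusted=True))  # True
print(years_rule_out(n_years_items=0, d_dimer=800))                           # True
```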
How Does Relaxing the Algorithm for Autism Affect DSM-V Prevalence Rates?
ERIC Educational Resources Information Center
Matson, Johnny L.; Hattier, Megan A.; Williams, Lindsey W.
2012-01-01
Although it is still unclear what causes autism spectrum disorders (ASDs), over time researchers and clinicians have become more precise with detecting and diagnosing ASD. Many diagnoses, however, are based on the criteria established within the "Diagnostic and Statistical Manual of Mental Disorders" ("DSM"); thus, any change in these diagnostic…
Algorithms development for the GEM-based detection system
NASA Astrophysics Data System (ADS)
Czarski, T.; Chernyshova, M.; Malinowski, K.; Pozniak, K. T.; Kasprowicz, G.; Kolasinski, P.; Krawczyk, R.; Wojenski, A.; Zabolotny, W.
2016-09-01
The measurement system based on a GEM (Gas Electron Multiplier) detector is developed for soft X-ray diagnostics of tokamak plasmas. The multi-channel setup is designed for estimation of the energy and position distribution of an X-ray source. The central measurement task is identification of each charge cluster and estimation of its value and position. A fast and accurate serial data acquisition mode is applied for dynamic plasma diagnostics. The charge clusters are counted in the space determined by 2D position, charge value and time interval. Radiation source characteristics are presented as histograms over selected ranges of position, time interval and cluster charge value, corresponding to the energy spectra.
Montagna, Fabio; Buiatti, Marco; Benatti, Simone; Rossi, Davide; Farella, Elisabetta; Benini, Luca
2017-10-01
EEG is a standard non-invasive technique used in neural disease diagnostics and the neurosciences. Frequency tagging is an increasingly popular experimental paradigm that efficiently tests brain function by measuring EEG responses to periodic stimulation. Recently, frequency-tagging paradigms have proven successful with low stimulation frequencies (0.5-6 Hz), but the EEG signal is intrinsically noisy in this frequency range, requiring heavy signal processing and significant human intervention for response estimation. This limits the possibility of processing the EEG on resource-constrained systems and of designing smart EEG-based devices for automated diagnostics. We propose an algorithm for artifact removal and automated detection of frequency-tagging responses over a wide range of stimulation frequencies, which we test on a visual stimulation protocol. The algorithm is rooted in machine learning-based pattern recognition techniques and is tailored for a new-generation parallel ultra-low-power processing platform (PULP), reaching more than 90% accuracy in frequency detection even for very low stimulation frequencies (<1 Hz) within a power budget of 56 mW. Copyright © 2017 Elsevier Inc. All rights reserved.
Kavitha, Muthu Subash; Asano, Akira; Taguchi, Akira; Heo, Min-Suk
2013-09-01
To prevent the consequences of low bone mineral density (BMD), that is, osteoporosis, in postmenopausal women, it is essential to diagnose osteoporosis more precisely. This study presented an automatic approach utilizing a histogram-based automatic clustering (HAC) algorithm with a support vector machine (SVM) to analyse dental panoramic radiographs (DPRs) and thus improve diagnostic accuracy by identifying postmenopausal women with low BMD or osteoporosis. We integrated our newly proposed HAC algorithm with our previously designed computer-aided diagnosis system. The extracted moment-based features (mean, variance, skewness, and kurtosis) of the mandibular cortical width were used with a radial basis function (RBF) SVM classifier. We also compared the diagnostic efficacy of the SVM model with that of a back propagation (BP) neural network model. In this study, DPRs and BMD measurements of 100 postmenopausal women (aged >50 years), with no previous record of osteoporosis, were randomly selected for inclusion. The accuracy, sensitivity, and specificity of the BMD measurements using our HAC-SVM model to identify women with low BMD were 93.0% (88.0%-98.0%), 95.8% (91.9%-99.7%) and 86.6% (79.9%-93.3%), respectively, at the lumbar spine; and 89.0% (82.9%-95.1%), 96.0% (92.2%-99.8%) and 84.0% (76.8%-91.2%), respectively, at the femoral neck. Our experimental results suggest that the proposed HAC-SVM model applied to DPRs could be useful for assisting dentists in early diagnosis and for helping to reduce the morbidity and mortality associated with low BMD and osteoporosis.
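A minimal sketch of the classification stage described above (moment-based cortical-width features fed to an RBF-kernel SVM) is given below; the HAC segmentation itself is not reproduced, and the data are synthetic placeholders rather than the study's DPR measurements.

```python
# Sketch of the feature + RBF-SVM classification stage described above; the HAC
# segmentation is not reproduced and the data are synthetic placeholders.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def moment_features(widths):
    # mean, variance, skewness, kurtosis of the mandibular cortical width profile
    return [np.mean(widths), np.var(widths), skew(widths), kurtosis(widths)]

rng = np.random.default_rng(0)
profiles = rng.normal(loc=3.0, scale=0.5, size=(100, 60))   # placeholder width profiles (mm)
labels = rng.integers(0, 2, size=100)                        # 1 = low BMD (placeholder labels)
X = np.array([moment_features(p) for p in profiles])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, labels, cv=5, scoring="accuracy"))
```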
Recommendations for the use of molecular diagnostics in the diagnosis of allergic diseases.
Villalta, D; Tonutti, E; Bizzaro, N; Brusca, I; Sargentini, V; Asero, R; Bilo, M B; Manzotti, G; Murzilli, F; Cecchi, L; Musarra, A
2018-03-01
The Study Group on Allergology of the Italian Society of Clinical Pathology and Laboratory Medicine (SIPMeL) and the Associazione Italiana degli Allergologi e Immunologi Territoriali e Ospedalieri (AAIITO) developed the present recommendations on the diagnosis of allergic diseases based on the use of molecular allergenic components; their purpose is to provide pathologists and clinicians with information and algorithms enabling proper use of this second-level diagnostics. Molecular diagnostics allows definition of the exact sensitization profile of the allergic patient. The methodology followed to develop these recommendations included an initial phase of discussion among all the contributors to integrate the knowledge derived from scientific evidence, a revision of the recommendations by Italian and foreign experts, and the subsequent production of this document to be disseminated to all those who deal with allergy diagnostics.
Zbroch, Tomasz; Knapp, Paweł Grzegorz; Knapp, Piotr Andrzej
2007-09-01
Increasing knowledge concerning carcinogenesis within the cervical epithelium has forced continuous modifications of the cytological classification of cervical smears. Eventually, new descriptions of submicroscopic cytomorphological abnormalities enabled the implementation of the Bethesda System, which was meant to take the place of the former Papanicolaou classification, although both are temporarily still used simultaneously in some settings. The aim of this study was to compare the results of these two classification systems with respect to diagnostic accuracy, verified by the further tests of the diagnostic algorithm for cervical lesion evaluation. The study was conducted in a group of women selected from the general population, the selection criteria being place of residence and cervical cancer age-risk group, during consecutive periods of mass screening in the Podlaski region. The diagnostic tests were based on the commonly used algorithm and on identical laboratory and methodological conditions. The assessment revealed comparable diagnostic accuracy for both classifications, verified by histological examination, although with markedly higher specificity for dysplastic lesions, a decreased number of HSIL results and an increased number of LSIL diagnoses. A higher number of colposcopies and biopsies was an additional consequence of TBS classification. Results based on the Bethesda System made it possible to identify the sources and causes of abnormalities with much greater precision, which enabled treatment of the causative agent. Although the two evaluated cytology classification systems did not differ greatly, the comparison showed the higher potential of TBS and better, more effective communication between the cytology laboratory and the gynaecologist, making implementation of the Bethesda System in daily cytology screening reasonable.
Bron, Esther E; Smits, Marion; van der Flier, Wiesje M; Vrenken, Hugo; Barkhof, Frederik; Scheltens, Philip; Papma, Janne M; Steketee, Rebecca M E; Méndez Orellana, Carolina; Meijboom, Rozanna; Pinto, Madalena; Meireles, Joana R; Garrett, Carolina; Bastos-Leite, António J; Abdulkadir, Ahmed; Ronneberger, Olaf; Amoroso, Nicola; Bellotti, Roberto; Cárdenas-Peña, David; Álvarez-Meza, Andrés M; Dolph, Chester V; Iftekharuddin, Khan M; Eskildsen, Simon F; Coupé, Pierrick; Fonov, Vladimir S; Franke, Katja; Gaser, Christian; Ledig, Christian; Guerrero, Ricardo; Tong, Tong; Gray, Katherine R; Moradi, Elaheh; Tohka, Jussi; Routier, Alexandre; Durrleman, Stanley; Sarica, Alessia; Di Fatta, Giuseppe; Sensi, Francesco; Chincarini, Andrea; Smith, Garry M; Stoyanov, Zhivko V; Sørensen, Lauge; Nielsen, Mads; Tangaro, Sabina; Inglese, Paolo; Wachinger, Christian; Reuter, Martin; van Swieten, John C; Niessen, Wiro J; Klein, Stefan
2015-05-01
Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n=30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org. Copyright © 2015 Elsevier Inc. All rights reserved.
Thoracoabdominal Computed Tomography in Trauma Patients: A Cost-Consequences Analysis
van Vugt, Raoul; Kool, Digna R.; Brink, Monique; Dekker, Helena M.; Deunk, Jaap; Edwards, Michael J.
2014-01-01
Background: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. Objectives: This study was performed to evaluate the cost-consequences of different diagnostic algorithms that use thoracoabdominal CT in the primary evaluation of adult patients with high-energy blunt trauma. Materials and Methods: We compared three different algorithms in which CT was applied as an immediate diagnostic tool (rush CT), a diagnostic tool after limited conventional work-up (routine CT), and a selective tool (selective CT). Probabilities of detecting and missing clinically relevant injuries were retrospectively derived. We collected data on radiation exposure and performed a micro-cost analysis on a reference case-based approach. Results: Both rush and routine CT detected all thoracoabdominal injuries in 99.1% of the patients during primary evaluation (n = 1040). Selective CT missed one or more diagnoses in 11% of the patients, in whom a change of treatment was necessary in 4.8%. The rush CT algorithm cost € 2676 (US$ 3660) per patient, with a mean radiation dose of 26.40 mSv per patient. Routine CT cost € 2815 (US$ 3850) and resulted in the same radiation exposure. Selective CT resulted in a lower radiation dose (23.23 mSv) and cost € 2771 (US$ 3790). Conclusions: Rush CT seems to result in the lowest costs and is comparable in terms of radiation dose exposure and diagnostic certainty to routine CT after a limited conventional work-up. However, selective CT results in lower radiation dose exposure but a slightly higher cost and less certainty. PMID:25337521
Delcroix, Olivier; Robin, Philippe; Gouillou, Maelenn; Le Duc-Pennec, Alexandra; Alavi, Zarrin; Le Roux, Pierre-Yves; Abgral, Ronan; Salaun, Pierre-Yves; Bourhis, David; Querellou, Solène
2018-02-12
xSPECT Bone® (xB) is a new reconstruction algorithm developed by Siemens® for bone hybrid imaging (SPECT/CT). A CT-based tissue segmentation is incorporated into the SPECT reconstruction to provide SPECT images with a bone anatomy appearance. The objectives of this study were to assess the diagnostic reliability and accuracy of xB/CT reconstruction in comparison with Flash 3D® (F3D)/CT in clinical routine. Two hundred thirteen consecutive patients referred to the Brest Nuclear Medicine Department for non-oncological bone diseases were evaluated retrospectively, and 207 SPECT/CT studies were included. All SPECT/CT studies were independently interpreted by two nuclear medicine physicians (a junior and a senior expert), first with xB/CT and then with F3D/CT three months later. Inter-observer agreement (IOA) and diagnostic confidence were determined using the McNemar test and the unweighted kappa coefficient. The study objectives were then re-assessed for validation through >18 months of clinical and paraclinical follow-up. No statistically significant differences between IOA for xB and IOA for F3D were found (p = 0.532). Agreement for xB after categorical classification of the diagnoses was high (κ for xB = 0.89 [95% CI 0.84-0.93]) but not statistically significantly different from F3D (κ for F3D = 0.90 [95% CI 0.86-0.94]). Thirty-one (14.9%) inter-reconstruction diagnostic discrepancies were observed, of which 21 (10.1%) were classified as major. The follow-up confirmed the F3D diagnosis in 10 cases and the xB diagnosis in 6 cases, and was non-contributory in 5 cases. The xB reconstruction algorithm was found to be reliable, providing high inter-observer agreement and diagnostic confidence similar to F3D reconstruction in clinical routine.
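For reference, the unweighted kappa statistic used above for inter-observer agreement can be computed directly; the diagnostic categories and readings below are invented placeholders, not study data.

```python
# Minimal example of the unweighted (Cohen's) kappa used above for inter-observer
# agreement; the categories and readings are invented placeholders.
from sklearn.metrics import cohen_kappa_score

reader_junior = ["fracture", "degenerative", "normal", "fracture", "degenerative", "normal"]
reader_senior = ["fracture", "degenerative", "normal", "degenerative", "degenerative", "normal"]
kappa = cohen_kappa_score(reader_junior, reader_senior)
print(round(kappa, 2))   # 1.0 = perfect agreement, 0 = chance-level agreement
```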
Side-locked headaches: an algorithm-based approach.
Prakash, Sanjay; Rathore, Chaturbhuj
2016-12-01
The differential diagnosis of strictly unilateral hemicranial pain includes a large number of primary and secondary headaches and cranial neuropathies. It may arise from both intracranial and extracranial structures such as the cranium, neck, vessels, eyes, ears, nose, sinuses, teeth, mouth, and other facial or cervical structures. Available data suggest that about two-thirds of patients with side-locked headache visiting neurology or headache clinics have primary headaches; the other one-third will have either secondary headaches or neuralgias. Many of these hemicranial pain syndromes have overlapping presentations. Primary headache disorders may spread to involve the face and/or neck, and various intracranial and extracranial pathologies may have similar overlapping presentations. Patients may present to a variety of clinicians, including headache experts, dentists, otolaryngologists, ophthalmologists, psychiatrists, and physiotherapists. Unfortunately, there is no uniform approach for such patients, and diagnostic ambiguity is frequently encountered in clinical practice. Herein, we review the differential diagnoses of side-locked headaches and provide an algorithm-based approach for patients presenting with side-locked headaches. Side-locked headache is itself a red flag, so the first priority should be to rule out secondary headaches. A comprehensive history and thorough examination will help to formulate an algorithm to rule out or confirm secondary side-locked headaches. The diagnosis of most secondary side-locked headaches depends largely on investigations; therefore, each suspected secondary headache should be subjected to appropriate investigations or referral. The diagnostic approach to primary side-locked headache starts once all possible secondary headaches have been ruled out. We discuss an algorithmic approach for both secondary and primary side-locked headaches.
Raman spectral feature selection using ant colony optimization for breast cancer diagnosis.
Fallahzadeh, Omid; Dehghani-Bidgoli, Zohreh; Assarian, Mohammad
2018-06-04
Pathology, as a common diagnostic test for cancer, is an invasive, time-consuming, and partially subjective method. Therefore, optical techniques, especially Raman spectroscopy, have attracted the attention of cancer diagnosis researchers. However, as Raman spectra contain numerous peaks associated with molecular bonds in the sample, finding the best features related to cancerous changes can improve the accuracy of diagnosis with this method. The present research attempted to improve the power of Raman-based cancer diagnosis by finding the best Raman features using the ant colony optimization (ACO) algorithm. In the present research, 49 spectra were measured from normal, benign, and cancerous breast tissue samples using a 785-nm micro-Raman system. After preprocessing for removal of noise and background fluorescence, the intensity of 12 important Raman bands of the biological samples was extracted as features of each spectrum. Then, the ACO algorithm was applied to find the optimum features for diagnosis. As the results demonstrated, by selecting five features, the classification accuracy of the normal, benign, and cancerous groups increased by 14% and reached 87.7%. ACO feature selection can improve the diagnostic accuracy of Raman-based diagnostic models. In the present study, features corresponding to ν(C-C) α-helix proline, valine (910-940), νs(C-C) skeletal lipids (1110-1130), and δ(CH2)/δ(CH3) proteins (1445-1460) were selected as the best features in cancer diagnosis.
AF-DHNN: Fuzzy Clustering and Inference-Based Node Fault Diagnosis Method for Fire Detection
Jin, Shan; Cui, Wen; Jin, Zhigang; Wang, Ying
2015-01-01
Wireless Sensor Networks (WSNs) have been utilized for node fault diagnosis in the fire detection field since the 1990s. However, traditional methods have several problems, including complicated system structures, intensive computation needs, unsteady data detection and local minima. In this paper, a new diagnosis mechanism for WSN nodes is proposed, based on fuzzy theory and an Adaptive Fuzzy Discrete Hopfield Neural Network (AF-DHNN). First, the original status of each sensor over time is obtained with two features: one is the root mean square of the filtered signal (FRMS); the other is the normalized summation of the positive amplitudes of the difference spectrum between the measured signal and the healthy one (NSDS). Secondly, distributed fuzzy inference is introduced, and clearly abnormal node statuses are pre-alarmed to save time. Thirdly, according to the dimensions of the diagnostic data, an adaptive diagnostic status system is established with a Fuzzy C-Means Algorithm (FCMA) and a sorting and classification algorithm to reduce the complexity of fault determination. Fourthly, an iterative Discrete Hopfield Neural Network (DHNN) is improved by optimizing the sensors' detected status information and standard diagnostic levels, with which associative memory is achieved and search efficiency is improved. The experimental results show that the AF-DHNN method can diagnose abnormal WSN node faults promptly and effectively, which improves WSN reliability. PMID:26193280
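The two node-status features named above can be computed directly from their definitions; the sketch below does so on synthetic signals, and the exact normalization used in the paper for NSDS may differ from this simple choice.

```python
# FRMS and NSDS features as defined above, computed on synthetic signals; the
# paper's exact NSDS normalization may differ from this simple choice.
import numpy as np

def frms(filtered_signal):
    # root mean square of the (already filtered) signal
    return np.sqrt(np.mean(np.square(filtered_signal)))

def nsds(measured_signal, healthy_signal):
    # normalized sum of positive amplitudes of the difference spectrum
    diff = np.abs(np.fft.rfft(measured_signal)) - np.abs(np.fft.rfft(healthy_signal))
    return diff[diff > 0].sum() / (np.abs(np.fft.rfft(healthy_signal)).sum() + 1e-12)

rng = np.random.default_rng(0)
healthy = np.sin(np.linspace(0.0, 20.0 * np.pi, 1024))
measured = healthy + 0.3 * rng.normal(size=1024)        # possibly faulty sensor node
print(frms(measured), nsds(measured, healthy))
```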
Shirahata, Mitsuaki; Iwao-Koizumi, Kyoko; Saito, Sakae; Ueno, Noriko; Oda, Masashi; Hashimoto, Nobuo; Takahashi, Jun A; Kato, Kikuya
2007-12-15
Current morphology-based glioma classification methods do not adequately reflect the complex biology of gliomas, thus limiting their prognostic ability. In this study, we focused on anaplastic oligodendroglioma and glioblastoma, which typically follow distinct clinical courses. Our goal was to construct a clinically useful molecular diagnostic system based on gene expression profiling. The expression of 3,456 genes in 32 patients, 12 and 20 of whom had prognostically distinct anaplastic oligodendroglioma and glioblastoma, respectively, was measured by PCR array. In addition to unsupervised methods, we performed supervised analysis using a weighted voting algorithm to construct a diagnostic system discriminating anaplastic oligodendroglioma from glioblastoma. The diagnostic accuracy of this system was evaluated by leave-one-out cross-validation. Clinical utility was tested on a microarray-based data set of 50 malignant gliomas from a previous study. Unsupervised analysis showed divergent global gene expression patterns between the two tumor classes. A supervised binary classification model showed 100% (95% confidence interval, 89.4-100%) diagnostic accuracy by leave-one-out cross-validation using 168 diagnostic genes. Applied to a gene expression data set from a previous study, our model correlated better with outcome than histologic diagnosis, and also displayed 96.6% (28 of 29) consistency with the molecular classification scheme used for these histologically controversial gliomas in the original article. Furthermore, we observed that histologically diagnosed glioblastoma samples that shared anaplastic oligodendroglioma molecular characteristics tended to be associated with longer survival. Our molecular diagnostic system showed reproducible clinical utility and prognostic ability superior to traditional histopathologic diagnosis for malignant glioma.
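The supervised step above uses a weighted voting algorithm; a minimal sketch in the Golub et al. signal-to-noise style is shown below as an assumption about the general form, with synthetic data in place of the PCR-array measurements.

```python
# Golub-style weighted voting classifier of the general kind used above for the
# two-class discrimination; the exact weighting in the paper may differ, and the
# expression matrix here is a synthetic placeholder.
import numpy as np

def train_weighted_voting(X, y):
    """X: samples x genes (expression values); y: 0/1 class labels."""
    mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    sd0, sd1 = X[y == 0].std(axis=0) + 1e-9, X[y == 1].std(axis=0) + 1e-9
    weights = (mu1 - mu0) / (sd1 + sd0)      # signal-to-noise weight per gene
    midpoints = (mu1 + mu0) / 2.0
    return weights, midpoints

def predict_weighted_voting(x, weights, midpoints):
    votes = weights * (x - midpoints)        # positive total vote favors class 1
    return int(votes.sum() > 0)

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 168))
y = (np.arange(32) >= 12).astype(int)        # e.g. 12 AO vs 20 GBM samples
w, b = train_weighted_voting(X, y)
print(predict_weighted_voting(X[0], w, b))
```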
Machine learning-based in-line holographic sensing of unstained malaria-infected red blood cells.
Go, Taesik; Kim, Jun H; Byeon, Hyeokjun; Lee, Sang J
2018-04-19
Accurate and immediate diagnosis of malaria is important for treatment of the infectious disease. Conventional methods for diagnosing malaria are time-consuming and rely on the skill of experts. Therefore, an automatic and simple diagnostic modality is essential for healthcare in developing countries that lack the expertise of trained microscopists. In the present study, a new automatic sensing method using digital in-line holographic microscopy (DIHM) combined with machine learning algorithms was proposed to sensitively detect unstained malaria-infected red blood cells (iRBCs). To identify the RBC characteristics, 13 descriptors were extracted from segmented holograms of individual RBCs. Among the 13 descriptors, 10 features differed significantly between healthy RBCs (hRBCs) and iRBCs. Six machine learning algorithms were applied to effectively combine the dominant features and to greatly improve the diagnostic capacity of the present method. Among the classification models trained by the 6 tested algorithms, the model trained by the support vector machine (SVM) showed the best accuracy in separating hRBCs and iRBCs for the training (n = 280, 96.78%) and testing sets (n = 120, 97.50%). This DIHM-based artificial intelligence methodology is simple and does not require blood staining. Thus, it will be beneficial and valuable in the diagnosis of malaria. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chapman, Brian E; Lee, Sean; Kang, Hyunseok Peter; Chapman, Wendy W
2011-10-01
In this paper we describe an application called peFinder for document-level classification of CT pulmonary angiography reports. peFinder is based on a generalized version of the ConText algorithm, a simple text processing algorithm for identifying features in clinical report documents. peFinder was used to answer questions about the disease state (pulmonary emboli present or absent), the certainty state of the diagnosis (uncertainty present or absent), the temporal state of an identified pulmonary embolus (acute or chronic), and the technical quality state of the exam (diagnostic or not diagnostic). Gold standard answers for each question were determined from the consensus classifications of three human annotators. peFinder results were compared to naive Bayes' classifiers using unigrams and bigrams. The sensitivities (and positive predictive values) for peFinder were 0.98(0.83), 0.86(0.96), 0.94(0.93), and 0.60(0.90) for disease state, quality state, certainty state, and temporal state respectively, compared to 0.68(0.77), 0.67(0.87), 0.62(0.82), and 0.04(0.25) for the naive Bayes' classifier using unigrams, and 0.75(0.79), 0.52(0.69), 0.59(0.84), and 0.04(0.25) for the naive Bayes' classifier using bigrams. Copyright © 2011 Elsevier Inc. All rights reserved.
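For comparison purposes, the naive Bayes unigram/bigram baseline mentioned above is easy to reproduce in outline; the rule-based ConText/peFinder logic itself is not shown here, and the example reports and labels are invented placeholders.

```python
# Sketch of the naive Bayes bag-of-words baseline that peFinder is compared
# against (the rule-based ConText/peFinder logic is not reproduced here); the
# reports and labels are invented placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reports = [
    "Filling defect in the right lower lobe consistent with acute pulmonary embolism.",
    "No evidence of pulmonary embolism. Study is diagnostic.",
    "Chronic-appearing emboli without acute pulmonary embolism.",
    "Limited, non-diagnostic study for pulmonary embolism.",
]
pe_present = [1, 0, 1, 0]        # document-level disease state labels

clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit(reports, pe_present)
print(clf.predict(["Acute pulmonary embolism in the left main pulmonary artery."]))
```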
McDonald, William C; Banerji, Nilanjana; McDonald, Kelsey N; Ho, Bridget; Macias, Virgilia; Kajdacsy-Balla, Andre
2017-01-01
Pituitary adenoma classification is complex, and diagnostic strategies vary greatly from laboratory to laboratory. No optimal diagnostic algorithm has been defined. The objective was to develop a panel of immunohistochemical (IHC) stains that provides the optimal combination of cost, accuracy, and ease of use. We examined 136 pituitary adenomas with stains of steroidogenic factor 1 (SF-1), Pit-1, anterior pituitary hormones, cytokeratin CAM5.2, and the α subunit of human chorionic gonadotropin. Immunohistochemical staining was scored using the Allred system. Adenomas were assigned to a gold standard class based on IHC results and available clinical and serologic information. Correlation and cluster analyses were used to develop an algorithm for parsimoniously classifying adenomas. The algorithm entailed a 1- or 2-step process: (1) a screening step consisting of IHC stains for SF-1, Pit-1, and adrenocorticotropic hormone; and (2) when the screening IHC pattern and clinical history were not clearly gonadotrophic (SF-1 positive only), corticotrophic (adrenocorticotropic hormone positive only), or IHC null cell (negative screening IHC), we subsequently used IHC for prolactin, growth hormone, thyroid-stimulating hormone, and cytokeratin CAM5.2. Comparison between diagnoses generated by our algorithm and the gold standard diagnoses showed excellent agreement. When compared with a commonly used panel using 6 IHC stains for anterior pituitary hormones plus IHC for a low-molecular-weight cytokeratin in certain tumors, our algorithm uses approximately one-third fewer IHC stains and detects gonadotroph adenomas with greater sensitivity.
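The two-step logic described above lends itself to a short decision function; the sketch below simplifies the published algorithm (it ignores Allred scores and clinical/serologic context) and is illustrative only.

```python
# Simplified sketch of the two-step IHC screening logic described above; Allred
# scoring and clinical/serologic context are omitted, so this is illustrative only.
def pituitary_ihc_step(sf1_pos: bool, pit1_pos: bool, acth_pos: bool) -> str:
    if sf1_pos and not pit1_pos and not acth_pos:
        return "gonadotroph adenoma"
    if acth_pos and not sf1_pos and not pit1_pos:
        return "corticotroph adenoma"
    if not (sf1_pos or pit1_pos or acth_pos):
        return "null-cell adenoma (negative screening IHC)"
    # otherwise proceed to step 2: prolactin, GH, TSH and cytokeratin CAM5.2 stains
    return "step 2 panel required (PRL, GH, TSH, CAM5.2)"

print(pituitary_ihc_step(sf1_pos=True, pit1_pos=False, acth_pos=False))
```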
NASA Astrophysics Data System (ADS)
Jardin, A.; Mazon, D.; Malard, P.; O'Mullane, M.; Chernyshova, M.; Czarski, T.; Malinowski, K.; Kasprowicz, G.; Wojenski, A.; Pozniak, K.
2017-08-01
The tokamak WEST aims at testing ITER divertor high heat flux component technology in long pulse operation. Unfortunately, heavy impurities like tungsten (W) sputtered from the plasma facing components can pollute the plasma core by radiation cooling in the soft x-ray (SXR) range, which is detrimental for the energy confinement and plasma stability. SXR diagnostics give valuable information to monitor impurities and study their transport. The WEST SXR diagnostic is composed of two new cameras based on the Gas Electron Multiplier (GEM) technology. The WEST GEM cameras will be used for impurity transport studies by performing 2D tomographic reconstructions with spectral resolution in tunable energy bands. In this paper, we characterize the GEM spectral response and investigate W density reconstruction thanks to a synthetic diagnostic recently developed and coupled with a tomography algorithm based on the minimum Fisher information (MFI) inversion method. The synthetic diagnostic includes the SXR source from a given plasma scenario, the photoionization, electron cloud transport and avalanche in the detection volume using Magboltz, and tomographic reconstruction of the radiation from the GEM signal. Preliminary studies of the effect of transport on the W ionization equilibrium and on the reconstruction capabilities are also presented.
Gearbox vibration diagnostic analyzer
NASA Technical Reports Server (NTRS)
1992-01-01
This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.
An effective non-rigid registration approach for ultrasound image based on "demons" algorithm.
Liu, Yan; Cheng, H D; Huang, Jianhua; Zhang, Yingtao; Tang, Xianglong; Tian, Jiawei
2013-06-01
Medical image registration is an important component of computer-aided diagnosis systems in diagnostics, therapy planning, and guidance of surgery. Because of its low signal-to-noise ratio (SNR), ultrasound (US) image registration is a difficult task. In this paper, a fully automatic non-rigid image registration algorithm based on the demons algorithm is proposed for registration of ultrasound images. In the proposed method, an "inertia force" derived from the local motion trend of pixels in a Moore neighborhood system is produced and integrated into the optical flow equation to estimate the demons force, which helps to handle speckle noise and preserve the geometric continuity of US images. In the experiments, a series of US images and several similarity metrics are used to evaluate performance. The experimental results demonstrate that the proposed method can register ultrasound images efficiently, quickly, automatically, and robustly to noise.
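As background, the classic demons registration that the method above extends is available off the shelf; the sketch below uses SimpleITK's baseline demons filter (not the proposed inertia-force variant) on synthetic images standing in for ultrasound frames.

```python
# Baseline (classic) demons registration with SimpleITK, for context on the method
# the paper modifies; the "inertia force" variant is not implemented here, and the
# Gaussian blobs below are synthetic stand-ins for ultrasound frames.
import SimpleITK as sitk

size = [128, 128]
fixed = sitk.GaussianSource(sitk.sitkFloat32, size, sigma=[20, 20], mean=[64, 64])
moving = sitk.GaussianSource(sitk.sitkFloat32, size, sigma=[20, 20], mean=[70, 58])

demons = sitk.DemonsRegistrationFilter()
demons.SetNumberOfIterations(100)
demons.SetStandardDeviations(2.0)                 # Gaussian regularization of the field
displacement = demons.Execute(fixed, moving)

transform = sitk.DisplacementFieldTransform(displacement)
registered = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
print(sitk.GetArrayFromImage(registered).max())
```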
NASA Astrophysics Data System (ADS)
Laguda, Edcer Jerecho
Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient's medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast acquisition imaging device with higher spatial resolution and higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented through a gray scale of independent values in Hounsfield units (HU). High HU-valued materials represent higher density. High density materials, such as metal, tend to erroneously increase the HU values around it due to reconstruction software limitations. This problem of increased HU values due to metal presence is referred to as metal artefacts. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects that are of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts for better image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This research study evaluates the dosimetric effect of metal artefact reduction algorithms on severe artefacts on CT images. This study uses Gemstone Spectral Imaging (GSI)-based MAR algorithm, projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method. Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), and the Dual-Energy Imaging Method was developed at Duke University. All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc plans and single-arc plans, using the Volumetric Modulated Arc therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used projection-based MAR Algorithm and the Dual-Energy Method. Calculated Doses (Mean, Minimum, and Maximum Doses) to the planning treatment volume (PTV) were compared and homogeneity index (HI) calculated. Results: (1) Without the GSI-based MAR application, a percent error between mean dose and the absolute dose ranging from 3.4-5.7% per fraction was observed. In contrast, the error was decreased to a range of 0.09-2.3% per fraction with the GSI-based MAR algorithm. There was a percent difference ranging from 1.7-4.2% per fraction between with and without using the GSI-based MAR algorithm. (2) A range of 0.1-3.2% difference was observed for the maximum dose values, 1.5-10.4% for minimum dose difference, and 1.4-1.7% difference on the mean doses. Homogeneity indexes (HI) ranging from 0.068-0.065 for dual-energy method and 0.063-0.141 with projection-based MAR algorithm were also calculated. Conclusion: (1) Percent error without using the GSI-based MAR algorithm may deviate as high as 5.7%. This error invalidates the goal of Radiation Therapy to provide a more precise treatment. 
Thus, the GSI-based MAR algorithm was desirable due to its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of the different techniques, but deviation was evident in the maximum and minimum doses. The HI for the dual-energy method almost achieved the desirable null value. In conclusion, the Dual-Energy method gave better dose calculation accuracy to the planning treatment volume (PTV) for images with metal artefacts than with or without the GE MAR algorithm.
Misawa, Masashi; Kudo, Shin-Ei; Mori, Yuichi; Takeda, Kenichi; Maeda, Yasuharu; Kataoka, Shinichi; Nakamura, Hiroki; Kudo, Toyoki; Wakamura, Kunihiko; Hayashi, Takemasa; Katagiri, Atsushi; Baba, Toshiyuki; Ishida, Fumio; Inoue, Haruhiro; Nimura, Yukitaka; Oda, Msahiro; Mori, Kensaku
2017-05-01
Real-time characterization of colorectal lesions during colonoscopy is important for reducing medical costs, given that the need for a pathological diagnosis can be omitted if the accuracy of the diagnostic modality is sufficiently high. However, it is sometimes difficult for community-based gastroenterologists to achieve the required level of diagnostic accuracy. In this regard, we developed a computer-aided diagnosis (CAD) system based on endocytoscopy (EC) to evaluate cellular, glandular, and vessel structure atypia in vivo. The purpose of this study was to compare the diagnostic ability and efficacy of this CAD system with the performances of human expert and trainee endoscopists. We developed a CAD system based on EC with narrow-band imaging that allowed microvascular evaluation without dye (ECV-CAD). The CAD algorithm was programmed based on texture analysis and provided a two-class diagnosis of neoplastic or non-neoplastic, with probabilities. We validated the diagnostic ability of the ECV-CAD system using 173 randomly selected EC images (49 non-neoplasms, 124 neoplasms). The images were evaluated by the CAD and by four expert endoscopists and three trainees. The diagnostic accuracies for distinguishing between neoplasms and non-neoplasms were calculated. ECV-CAD had higher overall diagnostic accuracy than trainees (87.8 vs 63.4%; [Formula: see text]), but similar to experts (87.8 vs 84.2%; [Formula: see text]). With regard to high-confidence cases, the overall accuracy of ECV-CAD was also higher than trainees (93.5 vs 71.7%; [Formula: see text]) and comparable to experts (93.5 vs 90.8%; [Formula: see text]). ECV-CAD showed better diagnostic accuracy than trainee endoscopists and was comparable to that of experts. ECV-CAD could thus be a powerful decision-making tool for less-experienced endoscopists.
First International Diagnosis Competition - DXC'09
NASA Technical Reports Server (NTRS)
Kurtoglu, tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Kuhn, Lukas; deKleer, Johan; vanGemund, Arjan; Feldman, Alexander
2009-01-01
A framework to compare and evaluate diagnosis algorithms (DAs) has been created jointly by NASA Ames Research Center and PARC. In this paper, we present the first concrete implementation of this framework as a competition called DXC 09. The goal of this competition was to evaluate and compare DAs in a common platform and to determine a winner based on diagnosis results. 12 DAs (model-based and otherwise) competed in this first year of the competition in 3 tracks that included industrial and synthetic systems. Specifically, the participants provided algorithms that communicated with the run-time architecture to receive scenario data and return diagnostic results. These algorithms were run on extended scenario data sets (different from sample set) to compute a set of pre-defined metrics. A ranking scheme based on weighted metrics was used to declare winners. This paper presents the systems used in DXC 09, description of faults and data sets, a listing of participating DAs, the metrics and results computed from running the DAs, and a superficial analysis of the results.
NASA Astrophysics Data System (ADS)
Rysavy, Steven; Flores, Arturo; Enciso, Reyes; Okada, Kazunori
2008-03-01
This paper presents an experimental study assessing the applicability of general-purpose 3D segmentation algorithms for analyzing dental periapical lesions in cone-beam computed tomography (CBCT) scans. In the field of Endodontics, clinical studies have been unable to determine whether a periapical granuloma can heal with non-surgical methods. Addressing this issue, Simon et al. recently proposed a diagnostic technique which non-invasively classifies target lesions using CBCT. The manual segmentation used in their study, however, is too time-consuming and unreliable for real-world adoption. On the other hand, many technically advanced algorithms have been proposed to address segmentation problems in various biomedical and non-biomedical contexts, but they have not yet been applied to the field of dentistry. Presented in this paper is a novel application of such segmentation algorithms to this clinically significant dental problem. This study evaluates three state-of-the-art graph-based algorithms: a normalized cut algorithm based on a generalized eigenvalue problem, a graph cut algorithm implementing energy minimization techniques, and a random walks algorithm derived from discrete electrical potential theory. In this paper, we extend the original 2D formulation of the above algorithms to segment 3D images directly and apply the resulting algorithms to the dental CBCT images. We experimentally evaluate the quality of the segmentation results for 3D CBCT images, as well as their 2D cross sections. The benefits and pitfalls of each algorithm are highlighted.
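Of the three graph-based methods evaluated above, the random walks algorithm has a readily available implementation; the sketch below applies scikit-image's random_walker to a synthetic 3D volume with two seed labels, standing in for the CBCT data.

```python
# Illustrative 3D use of the random walks algorithm evaluated above, via
# scikit-image; the volume and seed placement are synthetic placeholders rather
# than the dental CBCT data from the study.
import numpy as np
from skimage.segmentation import random_walker

rng = np.random.default_rng(0)
volume = rng.normal(0.0, 0.1, size=(30, 40, 40))
volume[10:20, 15:30, 15:30] += 1.0                 # bright "lesion" region

seeds = np.zeros(volume.shape, dtype=np.uint8)     # 0 = unlabeled voxels
seeds[15, 22, 22] = 1                              # lesion seed
seeds[2, 2, 2] = 2                                 # background seed

labels = random_walker(volume, seeds, beta=130, mode="bf")
print((labels == 1).sum(), "voxels assigned to the lesion label")
```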
Ronald, L A; Ling, D I; FitzGerald, J M; Schwartzman, K; Bartlett-Esquilant, G; Boivin, J-F; Benedetti, A; Menzies, D
2017-05-01
An increasing number of studies are using health administrative databases for tuberculosis (TB) research. However, there are limitations to using such databases for identifying patients with TB. To summarise validated methods for identifying TB in health administrative databases. We conducted a systematic literature search in two databases (Ovid Medline and Embase, January 1980-January 2016). We limited the search to diagnostic accuracy studies assessing algorithms derived from drug prescription, International Classification of Diseases (ICD) diagnostic code and/or laboratory data for identifying patients with TB in health administrative databases. The search identified 2413 unique citations. Of the 40 full-text articles reviewed, we included 14 in our review. Algorithms and diagnostic accuracy outcomes to identify TB varied widely across studies, with positive predictive value ranging from 1.3% to 100% and sensitivity ranging from 20% to 100%. Diagnostic accuracy measures of algorithms using out-patient, in-patient and/or laboratory data to identify patients with TB in health administrative databases vary widely across studies. Use solely of ICD diagnostic codes to identify TB, particularly when using out-patient records, is likely to lead to incorrect estimates of case numbers, given the current limitations of ICD systems in coding TB.
Thyroid Storm: A Japanese Perspective.
Akamizu, Takashi
2018-01-01
Thyroid storm (TS) is life threatening. In the mid-2000s, its incidence was poorly defined, peer-reviewed diagnostic criteria were not available, and management and treatment did not seem to be verified based upon evidence and latest advances in medicine. First, diagnostic criteria were developed based on 99 patients in the literature and seven patients in this study. Then, initial and follow-up surveys were conducted from 2004 through 2008, targeting all hospitals in Japan to obtain and verify information on patients who met diagnostic criteria for TS. Based on these data, the diagnostic criteria were revised, and management and treatment guidelines were created. The incidence of TS in hospitalized patients in Japan was estimated to be 0.20 per 100,000 per year and 0.22% of all thyrotoxic patients. The mortality rate was 10.7%. Multiple organ failure was the most common cause of death, followed by congestive heart failure, respiratory failure, and arrhythmia. In the final diagnostic criteria for TS, the definition of jaundice as serum bilirubin concentration >3 mg/dL was added. Based upon nationwide surveys and the latest information, guidelines for the management and treatment for TS were extensively revised and algorithms were developed. TS remains a life-threatening disorder, with >10% mortality in Japan. New peer-reviewed diagnostic criteria for TS are presented and its clinical features, prognosis, and incidence are clarified based on nationwide surveys. Furthermore, this information helped to establish detailed guidelines for the management and treatment of TS. A prospective prognostic study to validate the guidelines is eagerly anticipated.
Resonance Raman of BCC and normal skin
NASA Astrophysics Data System (ADS)
Liu, Cheng-hui; Sriramoju, Vidyasagar; Boydston-White, Susie; Wu, Binlin; Zhang, Chunyuan; Pei, Zhe; Sordillo, Laura; Beckman, Hugh; Alfano, Robert R.
2017-02-01
The resonance Raman (RR) spectra of basal cell carcinoma (BCC) and normal human skin tissues were analyzed using 532 nm laser excitation. RR spectral differences in the vibrational fingerprints distinguished normal from cancerous skin tissue. Diagnostic criteria for BCC tissue were established from native RR biomarkers and changes in their peak intensities. Diagnostic algorithms for classifying BCC versus normal tissue were generated based on a support vector machine (SVM) classifier and principal component analysis (PCA). These statistical methods were used to analyze the RR spectral data collected from skin tissues, yielding a diagnostic sensitivity of 98.7% and specificity of 79% compared with pathological reports.
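A PCA-plus-SVM pipeline of the kind mentioned can be sketched in a few lines. The example below uses synthetic stand-in "spectra" and arbitrary settings; it illustrates the classification step only, not the authors' spectra or tuning.

```python
# Illustrative sketch of a PCA + SVM classification step on synthetic spectra
# (not the authors' resonance Raman data or exact settings).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, n_wavenumbers = 80, 600
X = rng.normal(size=(n, n_wavenumbers))          # stand-in Raman spectra
y = rng.integers(0, 2, size=n)                   # 0 = normal, 1 = BCC (hypothetical labels)
X[y == 1, 200:220] += 0.8                        # fake band-intensity difference

model = make_pipeline(PCA(n_components=10), SVC(kernel='linear'))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```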
[Tinnitus: algorithm of diagnostics and clinical management].
Boiko, N V
The perception of sound in the absence of an external source, or tinnitus, can be a symptom of different diseases. The differential diagnosis should be based on the identification of subgroups with confirmed causes of the disease. Subjective and objective tinnitus should be distinguished; objective tinnitus can be vascular or muscular. In making a diagnosis of tinnitus, it is important to establish its characteristics, laterality, circumstances of onset, duration, and comorbidity with other symptoms: headache, hearing decline, dizziness, depression, etc. Urgent diagnostic and treatment measures are needed after the identification of 'red flags': acute pulsatile tinnitus, particularly after brain injury, or a combination of tinnitus with acute hearing loss and depression.
Dual energy computed tomography for the head.
Naruto, Norihito; Itoh, Toshihide; Noguchi, Kyo
2018-02-01
Dual energy CT (DECT) is a promising technology that provides better diagnostic accuracy in several brain diseases. DECT can generate various types of CT images from a single acquisition data set at high kV and low kV based on material decomposition algorithms. The two-material decomposition algorithm can separate bone/calcification from iodine accurately. The three-material decomposition algorithm can generate a virtual non-contrast image, which helps to identify conditions such as brain hemorrhage. A virtual monochromatic image has the potential to eliminate metal artifacts by reducing beam-hardening effects. DECT also enables exploration of advanced imaging to make diagnosis easier. One such novel application of DECT is the X-Map, which helps to visualize ischemic stroke in the brain without using iodine contrast medium.
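At its core, two-material decomposition treats each voxel's low- and high-kV attenuation as a linear combination of two basis materials and solves a small linear system per voxel. The sketch below shows that idea with made-up basis coefficients and voxel values; real DECT decomposition uses calibrated, scanner-specific coefficients and more sophisticated models.

```python
# Illustrative sketch of two-material decomposition: per-voxel solve of a
# 2x2 linear system mu(E) = a_mat1(E)*x1 + a_mat2(E)*x2 at low/high kV.
# The basis coefficients below are hypothetical, not calibrated DECT values.
import numpy as np

# Hypothetical effective attenuation coefficients (rows: 80 kV, 140 kV;
# columns: material 1 = iodine-like, material 2 = calcium-like).
A = np.array([[2.0, 1.0],
              [1.2, 0.9]])

mu_low = np.array([1.8, 0.6, 2.5])    # toy voxel values at low kV
mu_high = np.array([1.1, 0.5, 1.7])   # same voxels at high kV
mu = np.stack([mu_low, mu_high])      # shape (2, n_voxels)

x = np.linalg.solve(A, mu)            # shape (2, n_voxels): material amounts
print("material-1 map:", x[0].round(3))
print("material-2 map:", x[1].round(3))

# A virtual non-contrast image could then be formed by zeroing the
# iodine-like component and re-synthesising attenuation at a chosen energy.
```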
NASA Astrophysics Data System (ADS)
Wang, Jianfeng; Lin, Kan; Zheng, Wei; Yu Ho, Khek; Teh, Ming; Guan Yeoh, Khay; Huang, Zhiwei
2015-08-01
This work aims to evaluate the clinical value of a fiber-optic Raman spectroscopy technique developed for in vivo diagnosis of esophageal squamous cell carcinoma (ESCC) during clinical endoscopy. We have developed a rapid fiber-optic Raman endoscopic system capable of simultaneously acquiring both fingerprint (FP, 800-1800 cm-1) and high-wavenumber (HW, 2800-3600 cm-1) Raman spectra from esophageal tissue in vivo. A total of 1172 in vivo FP/HW Raman spectra were acquired from 48 esophageal patients undergoing endoscopic examination. The total Raman dataset was split into two parts: 80% for training and 20% for testing. Partial least squares-discriminant analysis (PLS-DA) and leave-one-patient-out cross-validation (LOPCV) were implemented on the training dataset to develop diagnostic algorithms for tissue classification. PLS-DA-LOPCV shows that simultaneous FP/HW Raman spectroscopy on the training dataset provides a diagnostic sensitivity of 97.0% and specificity of 97.4% for ESCC classification. Further, the diagnostic algorithm applied to the independent testing dataset based on the simultaneous FP/HW Raman technique gives a predictive diagnostic sensitivity of 92.7% and specificity of 93.6% for ESCC identification, which is superior to either the FP or HW Raman technique alone. This work demonstrates that the simultaneous FP/HW fiber-optic Raman spectroscopy technique improves real-time in vivo diagnosis of esophageal neoplasia at endoscopy.
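PLS-DA with leave-one-patient-out cross-validation can be approximated with standard tools: a PLS regression against a binary label, thresholded at 0.5, evaluated with grouped folds. The following is a hedged sketch on synthetic data (hypothetical patient IDs, labels, and component count), not the authors' pipeline.

```python
# Illustrative sketch of PLS-DA with leave-one-patient-out cross-validation
# on synthetic spectra (not the authors' fiber-optic Raman data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_spectra, n_bins = 120, 500
X = rng.normal(size=(n_spectra, n_bins))            # stand-in FP/HW spectra
patient = rng.integers(0, 20, size=n_spectra)       # hypothetical patient IDs
y = (patient % 2).astype(float)                     # toy labels: 1 = ESCC, 0 = normal
X[y == 1, 100:120] += 0.7                           # fake spectral difference

correct = 0
for train, test in LeaveOneGroupOut().split(X, y, groups=patient):
    pls = PLSRegression(n_components=5).fit(X[train], y[train])
    pred = (pls.predict(X[test]).ravel() >= 0.5).astype(float)  # PLS-DA decision rule
    correct += (pred == y[test]).sum()
print("LOPCV accuracy:", correct / n_spectra)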
A feature-preserving hair removal algorithm for dermoscopy images.
Abbas, Qaisar; Garcia, Irene Fondón; Emre Celebi, M; Ahmad, Waqar
2013-02-01
Accurate segmentation and repair of hair-occluded information in dermoscopy images are challenging tasks for computer-aided detection (CAD) of melanoma. Many hair-restoration algorithms have been developed, but most of them fail to identify hairs accurately, and their removal techniques are slow and disturb the lesion's pattern. In this article, a novel hair-restoration algorithm is presented, which is capable of preserving skin lesion features such as color and texture and of segmenting both dark and light hairs. Our algorithm consists of three major steps: (i) rough hair segmentation using matched filtering with the first derivative of a Gaussian (MF-FDOG) and thresholding, which generates strong responses for both dark and light hairs; (ii) refinement of the hair mask using morphological, edge-based techniques; and (iii) repair of the hair-occluded regions through a fast marching inpainting method. Diagnostic accuracy (DA) and texture-quality measure (TQM) metrics were computed against dermatologist-drawn manual hair masks, which served as the ground truth for evaluating the performance of the system. The hair-restoration algorithm was tested on 100 dermoscopy images. Comparisons were made with (i) linear interpolation, (ii) inpainting by a non-linear partial differential equation (PDE), and (iii) exemplar-based repairing techniques. Among the different hair detection and removal techniques, our proposed algorithm obtained the highest values, with a DA of 93.3% and a TQM of 90%. The experimental results indicate that the proposed algorithm is highly accurate, robust and able to restore hair pixels without damaging the lesion texture. This method is fully automatic and can be easily integrated into a CAD system. © 2011 John Wiley & Sons A/S.
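Two of the building blocks named above (a derivative-of-Gaussian response for thin hair-like structures and fast-marching inpainting of the detected mask) have common off-the-shelf analogues. The sketch below is a simplified stand-in on a synthetic image, not the published MF-FDOG pipeline or its parameters.

```python
# Illustrative sketch: first-derivative-of-Gaussian response for a thin dark
# stripe, thresholding to a mask, then fast-marching inpainting (OpenCV TELEA).
import numpy as np
import cv2
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
img = rng.integers(120, 160, size=(128, 128)).astype(np.uint8)
img[:, 60:63] = 30                                   # dark "hair" stripe

g = img.astype(float)
# Derivative of Gaussian across columns highlights thin vertical structures.
resp = np.abs(gaussian_filter(g, sigma=2, order=(0, 1)))
mask = (resp > resp.mean() + 2 * resp.std()).astype(np.uint8)

# Dilate slightly so the mask covers the full hair width, then inpaint.
mask = cv2.dilate(mask, np.ones((3, 3), np.uint8), iterations=2)
repaired = cv2.inpaint(img, mask, 3, cv2.INPAINT_TELEA)   # fast marching method
print("repaired dtype/shape:", repaired.dtype, repaired.shape)
```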
Validation of an algorithm-based definition of treatment resistance in patients with schizophrenia.
Ajnakina, Olesya; Horsdal, Henriette Thisted; Lally, John; MacCabe, James H; Murray, Robin M; Gasse, Christiane; Wimberley, Theresa
2018-02-19
Large-scale pharmacoepidemiological research on treatment resistance relies on accurate identification of people with treatment-resistant schizophrenia (TRS) based on data that are retrievable from administrative registers. This is usually approached by operationalising clinical treatment guidelines using prescription and hospital admission information. We examined the accuracy of an algorithm-based definition of TRS based on clozapine prescription and/or meeting algorithm-based eligibility criteria for clozapine against a gold standard definition using case notes. We additionally validated a definition based entirely on clozapine prescription. 139 schizophrenia patients aged 18-65 years were followed for a mean of 5 years after first presentation to psychiatric services in South London, UK. The diagnostic accuracy of the algorithm-based measure against the gold standard was measured with sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). A total of 45 (32.4%) schizophrenia patients met the criteria for the gold standard definition of TRS; applying the algorithm-based definition to the same cohort led to 44 (31.7%) patients fulfilling criteria for TRS, with sensitivity, specificity, PPV and NPV of 62.2%, 83.0%, 63.6% and 82.1%, respectively. The definition based on lifetime clozapine prescription had sensitivity, specificity, PPV and NPV of 40.0%, 94.7%, 78.3% and 76.7%, respectively. Although a perfect definition of TRS cannot be derived from available prescription and hospital registers, these results indicate that researchers can confidently use registries to identify individuals with TRS for research and clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.
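The four accuracy measures reported here come straight from a 2x2 table. The helper below recomputes them from counts chosen to be roughly consistent with the reported totals and percentages; the exact study cell counts are not given in the abstract, so treat these numbers as illustrative.

```python
# Illustrative sketch: sensitivity, specificity, PPV and NPV from 2x2 counts.
# The counts below are approximate reconstructions, not the study's table.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# e.g. 45 gold-standard TRS cases, 44 algorithm-positive, 139 patients total
print(diagnostic_metrics(tp=28, fp=16, fn=17, tn=78))
```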
Robust In-Flight Sensor Fault Diagnostics for Aircraft Engine Based on Sliding Mode Observers
Chang, Xiaodong; Huang, Jinquan; Lu, Feng
2017-01-01
For a sensor fault diagnostic system of aircraft engines, the health performance degradation is an inevitable interference that cannot be neglected. To address this issue, this paper investigates an integrated on-line sensor fault diagnostic scheme for a commercial aircraft engine based on a sliding mode observer (SMO). In this approach, one sliding mode observer is designed for engine health performance tracking, and another for sensor fault reconstruction. Both observers are employed in in-flight applications. The results of the former SMO are analyzed for post-flight updating the baseline model of the latter. This idea is practical and feasible since the updating process does not require the algorithm to be regulated or redesigned, so that ground-based intervention is avoided, and the update process is implemented in an economical and efficient way. With this setup, the robustness of the proposed scheme to the health degradation is much enhanced and the latter SMO is able to fulfill sensor fault reconstruction over the course of the engine life. The proposed sensor fault diagnostic system is applied to a nonlinear simulation of a commercial aircraft engine, and its effectiveness is evaluated in several fault scenarios. PMID:28398255
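The abstract does not give the observer equations, so the following is only a conceptual toy: a first-order sliding mode observer for a double integrator that reconstructs an unmeasured velocity from a measured position via a discontinuous injection term. The gains and model are hand-picked for illustration and have nothing to do with the paper's aircraft-engine design.

```python
# Conceptual toy (not the paper's engine SMO): sliding mode observer for a
# double integrator, estimating the unmeasured velocity from position only.
import numpy as np

dt, T = 1e-3, 5.0
n = int(T / dt)
l1, l2 = 5.0, 50.0                      # observer injection gains (hand-tuned)

x = np.array([0.0, 1.0])                # true state: [position, velocity]
xhat = np.array([0.5, 0.0])             # observer state, deliberately wrong

err_hist = []
for k in range(n):
    u = np.sin(2 * np.pi * 0.5 * k * dt)         # known input (acceleration command)
    # True plant: x1' = x2, x2' = u  (explicit Euler step)
    x = x + dt * np.array([x[1], u])
    # Observer: inject the sign of the measured-output error into both states
    e1 = x[0] - xhat[0]                          # only position is measured
    xhat = xhat + dt * np.array([xhat[1] + l1 * np.sign(e1),
                                 u + l2 * np.sign(e1)])
    err_hist.append(abs(x[1] - xhat[1]))

print("mean velocity estimation error over the last second:",
      round(float(np.mean(err_hist[-1000:])), 3))
```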
Ahn, Hye Shin; Kim, Sun Mi; Jang, Mijung; Yun, Bo La; Kim, Bohyoung; Ko, Eun Sook; Han, Boo-Kyung; Chang, Jung Min; Yi, Ann; Cho, Nariya; Moon, Woo Kyung; Choi, Hye Young
2014-01-01
To compare new full-field digital mammography (FFDM) with and without use of an advanced post-processing algorithm in terms of image quality, lesion detection, diagnostic performance, and priority rank. During a 22-month period, we prospectively enrolled 100 cases of specimen FFDM (Brestige®), performed alone or in combination with a post-processing algorithm developed by the manufacturer: group A (SMA), specimen mammography without application of "Mammogram enhancement ver. 2.0"; group B (SMB), specimen mammography with application of "Mammogram enhancement ver. 2.0". The two sets of specimen mammograms were randomly reviewed by five experienced radiologists. Image quality, lesion detection, diagnostic performance, and priority rank with regard to image preference were evaluated. Three aspects of image quality (overall quality, contrast, and noise) of SMB were significantly superior to those of SMA (p < 0.05). SMB was significantly superior to SMA for visualizing calcifications (p < 0.05). Diagnostic performance, as evaluated by cancer score, was similar between SMA and SMB. SMB was preferred to SMA by four of the five reviewers. The post-processing algorithm may improve image quality in FFDM, with greater image preference than without use of the software.
New method for detection of gastric cancer by hyperspectral imaging: a pilot study
NASA Astrophysics Data System (ADS)
Kiyotoki, Shu; Nishikawa, Jun; Okamoto, Takeshi; Hamabe, Kouichi; Saito, Mari; Goto, Atsushi; Fujita, Yusuke; Hamamoto, Yoshihiko; Takeuchi, Yusuke; Satori, Shin; Sakaida, Isao
2013-02-01
We developed a new, easy, and objective method to detect gastric cancer using hyperspectral imaging (HSI) technology combining spectroscopy and imaging. A total of 16 gastroduodenal tumors removed by endoscopic resection or surgery from 14 patients at Yamaguchi University Hospital, Japan, were recorded using a hyperspectral camera (HSC) equipped with HSI technology. Corrected spectral reflectance was obtained from 10 samples of normal mucosa and 10 samples of tumors for each case. The 16 cases were divided into eight training cases (160 training samples) and eight test cases (160 test samples). We established a diagnostic algorithm with training samples and evaluated it with test samples. Diagnostic capability of the algorithm for each tumor was validated, and enhancement of tumors by image processing using the HSC was evaluated. The diagnostic algorithm used the 726-nm wavelength, with a cutoff point established from training samples. The sensitivity, specificity, and accuracy rates of the algorithm's diagnostic capability in the test samples were 78.8% (63/80), 92.5% (74/80), and 85.6% (137/160), respectively. Tumors in HSC images of 13 (81.3%) cases were well enhanced by image processing. Differences in spectral reflectance between tumors and normal mucosa suggested that tumors can be clearly distinguished from background mucosa with HSI technology.
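A single-wavelength cutoff classifier of the kind described (reflectance at about 726 nm compared to a training-derived threshold) reduces to one comparison per sample. The sketch below uses made-up reflectance values and cutoff; it only illustrates how sensitivity and specificity follow from such a rule.

```python
# Illustrative sketch of a single-wavelength cutoff classifier on synthetic
# samples; the reflectance values and cutoff are made up, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
r_normal = rng.normal(0.55, 0.05, size=80)    # toy reflectance of normal mucosa
r_tumor = rng.normal(0.42, 0.06, size=80)     # toy reflectance of tumors
cutoff = 0.48                                  # hypothetical cutoff from "training" cases

pred_tumor = np.concatenate([r_normal, r_tumor]) < cutoff
truth_tumor = np.concatenate([np.zeros(80, bool), np.ones(80, bool)])

sens = (pred_tumor & truth_tumor).sum() / truth_tumor.sum()
spec = (~pred_tumor & ~truth_tumor).sum() / (~truth_tumor).sum()
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```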
Benchmarking Procedures for High-Throughput Context Specific Reconstruction Algorithms
Pacheco, Maria P.; Pfau, Thomas; Sauter, Thomas
2016-01-01
Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate collected information. Context specific reconstruction based on generic genome scale models like ReconX or HMR has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context specific reconstruction algorithms were published in the last 10 years, only a fraction of them are suitable for model building based on human high-throughput data. Besides other reasons, this might be due to problems arising from the limitation to only one metabolic target function or arbitrary thresholding. This review describes and analyses common validation methods used for testing model building algorithms. Two major methods can be distinguished: consistency testing and comparison based testing. The first is concerned with robustness against noise, e.g., missing data due to the impossibility of distinguishing between the signal and the background of non-specific binding of probes in a microarray experiment, and with whether distinct sets of input expressed genes corresponding to, e.g., different tissues yield distinct models. The latter covers methods comparing sets of functionalities, comparison with existing networks or additional databases. We test those methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms. PMID:26834640
Rodriguez-Diaz, Eladio; Castanon, David A; Singh, Satish K; Bigio, Irving J
2011-06-01
Optical spectroscopy has shown potential as a real-time, in vivo, diagnostic tool for identifying neoplasia during endoscopy. We present the development of a diagnostic algorithm to classify elastic-scattering spectroscopy (ESS) spectra as either neoplastic or non-neoplastic. The algorithm is based on pattern recognition methods, including ensemble classifiers, in which members of the ensemble are trained on different regions of the ESS spectrum, and misclassification-rejection, where the algorithm identifies and refrains from classifying samples that are at higher risk of being misclassified. These "rejected" samples can be reexamined by simply repositioning the probe to obtain additional optical readings or ultimately by sending the polyp for histopathological assessment, as per standard practice. Prospective validation using separate training and testing sets results in a baseline performance of sensitivity = .83, specificity = .79, using the standard framework of feature extraction (principal component analysis) followed by classification (with linear support vector machines). With the developed algorithm, performance improves to Se ∼ 0.90, Sp ∼ 0.90, at a cost of rejecting 20-33% of the samples. These results are on par with a panel of expert pathologists. For colonoscopic prevention of colorectal cancer, our system could reduce biopsy risk and cost, obviate retrieval of non-neoplastic polyps, decrease procedure time, and improve assessment of cancer risk.
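The misclassification-rejection idea can be illustrated with a reject band around the decision boundary: samples whose decision value is too close to zero are withheld rather than classified. The sketch below uses synthetic data, a single PCA-plus-linear-SVM classifier, and an arbitrary rejection threshold; the authors' ensemble and threshold selection are not reproduced.

```python
# Illustrative sketch of classification with a reject option: after PCA and a
# linear SVM, samples inside a "reject band" are withheld. Synthetic data only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 200))                 # stand-in ESS spectra
y = rng.integers(0, 2, size=300)                # 0 = non-neoplastic, 1 = neoplastic
X[y == 1, :20] += 0.5

model = make_pipeline(PCA(n_components=10), SVC(kernel='linear')).fit(X[:200], y[:200])

scores = model.decision_function(X[200:])
reject_band = 0.3                               # hypothetical rejection threshold
accepted = np.abs(scores) >= reject_band
pred = (scores >= 0).astype(int)

acc = (pred[accepted] == y[200:][accepted]).mean()
print(f"rejected {100 * (~accepted).mean():.0f}% of samples, "
      f"accuracy on accepted: {acc:.2f}")
```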
Floros, Nikolaos; Papadakis, Marios; Schelzig, Hubert; Oberhuber, Alexander
2018-03-10
Over the last three decades, the development of systematic, protocol-based algorithms and advances in available diagnostic tests have become indispensable parts of practising medicine. Naturally, despite the implementation of meticulous protocols involving diagnostic tests or even trials of empirical therapies, the cause of a patient's symptoms may still not be obvious. We herein report a case of chronic back pain that took about five years to diagnose accurately. The case challenges diagnostic assumptions and sets the ground for a discussion of the diagnostic reasoning pitfalls and heuristic biases that misled the caring physicians and cost our patient years of low quality of life. This case serves as an example of how anchoring heuristics can interfere with the diagnostic process of a complex and rare entity when combined with a concurrent, potentially life-threatening condition. © BMJ Publishing Group Ltd (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Computer Vision Malaria Diagnostic Systems-Progress and Prospects.
Pollak, Joseph Joel; Houri-Yafin, Arnon; Salpeter, Seth J
2017-01-01
Accurate malaria diagnosis is critical to prevent malaria fatalities, curb overuse of antimalarial drugs, and promote appropriate management of other causes of fever. While several diagnostic tests exist, the need for a rapid and highly accurate malaria assay remains. Microscopy and rapid diagnostic tests are the main diagnostic modalities available, yet they can demonstrate poor performance and accuracy. Automated microscopy platforms have the potential to significantly improve and standardize malaria diagnosis. Based on image recognition and machine learning algorithms, these systems maintain the benefits of light microscopy and provide improvements such as quicker scanning time, greater scanning area, and increased consistency brought by automation. While these applications have been in development for over a decade, recently several commercial platforms have emerged. In this review, we discuss the most advanced computer vision malaria diagnostic technologies and investigate several of their features which are central to field use. Additionally, we discuss the technological and policy barriers to implementing these technologies in low-resource settings world-wide.
Breast tumor malignancy modelling using evolutionary neural logic networks.
Tsakonas, Athanasios; Dounias, Georgios; Panagi, Georgia; Panourgias, Evangelia
2006-01-01
The present work proposes a computer-assisted methodology for the effective modelling of the diagnostic decision for breast tumor malignancy. The suggested approach is based on innovative hybrid computational intelligence algorithms applied to related cytological data contained in past medical records. The experimental data used in this study were gathered in the early 1990s at the University of Wisconsin, based on post-diagnostic cytological observations performed by expert medical staff. Data were properly encoded in a computer database and, accordingly, various alternative modelling techniques were applied to them in an attempt to form diagnostic models. Previous methods included standard optimisation techniques as well as artificial intelligence approaches, and a variety of related publications exists in the modern literature on the subject. In this report, a hybrid computational intelligence approach is suggested, which effectively combines modern mathematical logic principles, neural computation and genetic programming. The approach proves promising both in terms of diagnostic accuracy and generalization capabilities and in terms of comprehensibility and practical importance for the medical staff involved.
Assessing operating characteristics of CAD algorithms in the absence of a gold standard
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy Choudhury, Kingshuk; Paik, David S.; Yi, Chin A.
2010-04-15
Purpose: The authors examine potential bias when using a reference reader panel as "gold standard" for estimating operating characteristics of CAD algorithms for detecting lesions. As an alternative, the authors propose latent class analysis (LCA), which does not require an external gold standard to evaluate diagnostic accuracy. Methods: A binomial model for multiple reader detections using different diagnostic protocols was constructed, assuming conditional independence of readings given true lesion status. Operating characteristics of all protocols were estimated by maximum likelihood LCA. Reader panel and LCA based estimates were compared using data simulated from the binomial model for a range of operating characteristics. LCA was applied to 36 thin section thoracic computed tomography data sets from the Lung Image Database Consortium (LIDC): free search markings of four radiologists were compared to markings from four different CAD assisted radiologists. For real data, bootstrap-based resampling methods, which accommodate dependence in reader detections, are proposed to test hypotheses of differences between detection protocols. Results: In simulation studies, reader panel based sensitivity estimates had an average relative bias (ARB) of -23% to -27%, significantly higher (p-value <0.0001) than LCA (ARB -2% to -6%). Specificity was well estimated by both reader panel (ARB -0.6% to -0.5%) and LCA (ARB 1.4%-0.5%). Among the 1145 lesion candidates considered by the LIDC, the LCA estimated sensitivity of reference readers (55%) was significantly lower (p-value 0.006) than that of CAD assisted readers (68%). Average false positives per patient for reference readers (0.95) was not significantly lower (p-value 0.28) than for CAD assisted readers (1.27). Conclusions: Whereas a gold standard based on a consensus of readers may substantially bias sensitivity estimates, LCA may be a significantly more accurate and consistent means for evaluating diagnostic accuracy.
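A two-class latent class model with conditionally independent binary readers can be fit with a short EM loop: the E-step computes each candidate's posterior probability of being a true lesion, and the M-step re-estimates prevalence and per-reader sensitivity and specificity. The sketch below runs on simulated detections and is only a minimal illustration of the technique, not the authors' model or the LIDC data.

```python
# Illustrative sketch of two-class latent class analysis (EM) for multiple
# binary "readers" without a gold standard, assuming conditional independence.
import numpy as np

rng = np.random.default_rng(0)
N, K = 1000, 4                                   # candidates, readers
true = rng.random(N) < 0.3                       # hidden lesion status
se_true, sp_true = 0.8, 0.9
x = np.where(true[:, None],
             rng.random((N, K)) < se_true,       # detections on true lesions
             rng.random((N, K)) < 1 - sp_true)   # false positives otherwise
x = x.astype(float)

# EM for prevalence pi, per-reader sensitivity se[k], specificity sp[k]
pi, se, sp = 0.5, np.full(K, 0.7), np.full(K, 0.7)
for _ in range(200):
    # E-step: posterior probability each candidate is a true lesion
    l1 = pi * np.prod(se**x * (1 - se)**(1 - x), axis=1)
    l0 = (1 - pi) * np.prod((1 - sp)**x * sp**(1 - x), axis=1)
    w = l1 / (l1 + l0)
    # M-step: weighted re-estimates
    pi = w.mean()
    se = (w[:, None] * x).sum(axis=0) / w.sum()
    sp = ((1 - w)[:, None] * (1 - x)).sum(axis=0) / (1 - w).sum()

print("estimated prevalence:", round(float(pi), 3))
print("estimated sensitivities:", se.round(3))
print("estimated specificities:", sp.round(3))
```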
In vitro osteosarcoma biosensing using THz time domain spectroscopy
NASA Astrophysics Data System (ADS)
Ferguson, Bradley S.; Liu, Haibo; Hay, Shelley; Findlay, David; Zhang, Xi-Cheng; Abbott, Derek
2004-03-01
Terahertz time domain spectroscopy (THz-TDS) has a wide range of applications, from semiconductor diagnostics to biosensing. Recent attention has focused on bio-applications, and several groups have noted the ability of THz-TDS to differentiate basal cell carcinoma tissue from healthy dermal tissue ex vivo. The contrast mechanism is unclear but has been attributed to increased interstitial water in cancerous tissue. In this work we investigate the THz response of human osteosarcoma cells and normal human bone cells grown in culture, to isolate the cells' responses from other effects. A classification algorithm based on frequency selection by a genetic algorithm is used to attempt to differentiate between the cell types based on their THz spectra. Encouraging preliminary results have been obtained.
The diagnostic management of upper extremity deep vein thrombosis: A review of the literature.
Kraaijpoel, Noémie; van Es, Nick; Porreca, Ettore; Büller, Harry R; Di Nisio, Marcello
2017-08-01
Upper extremity deep vein thrombosis (UEDVT) accounts for 4% to 10% of all cases of deep vein thrombosis. UEDVT may present with localized pain, erythema, and swelling of the arm, but may also be detected incidentally by diagnostic imaging tests performed for other reasons. Prompt and accurate diagnosis is crucial to prevent pulmonary embolism and long-term complications such as post-thrombotic syndrome of the arm. Unlike the diagnostic management of deep vein thrombosis (DVT) of the lower extremities, which is well established, the work-up of patients with clinically suspected UEDVT remains uncertain, with limited evidence from studies of small size and poor methodological quality. Currently, only one prospective study has evaluated the use of an algorithm, similar to the one used for DVT of the lower extremities, for the diagnostic workup of clinically suspected UEDVT. The algorithm combined clinical probability assessment, D-dimer testing and ultrasonography and appeared to safely and effectively exclude UEDVT. However, before recommending its use in routine clinical practice, external validation of this strategy and improvements in its efficiency are needed, especially in high-risk subgroups in whom the performance of the algorithm appeared to be suboptimal, such as hospitalized or cancer patients. In this review, we critically assess the accuracy and efficacy of current diagnostic tools and provide clinical guidance for the diagnostic management of clinically suspected UEDVT. Copyright © 2017 Elsevier Ltd. All rights reserved.
Boehnke, Mitchell; Patel, Nayana; McKinney, Kristin; Clark, Toshimasa
The Society of Radiologists in Ultrasound (SRU 2005) and American Thyroid Association (ATA 2009 and ATA 2015) have published algorithms regarding thyroid nodule management. Kwak et al. and other groups have described models that estimate thyroid nodules' malignancy risk. The aim of our study is to use Kwak's model to evaluate the tradeoffs in sensitivity and specificity of the SRU 2005, ATA 2009 and ATA 2015 management algorithms. A total of 1,000,000 thyroid nodules were modeled in MATLAB. Ultrasound characteristics were modeled after published data. Malignancy risk was estimated per Kwak's model and assigned as a binary variable. All nodules were then assessed using the published management algorithms. With the malignancy variable as condition positivity and the algorithms' recommendation for FNA as test positivity, diagnostic performance was calculated. Modeled nodule characteristics mimic those of Kwak et al. Overall, 12.8% of nodules were assigned as malignant (malignancy risk range of 2.0-98%). FNA was recommended for 41% of nodules by SRU 2005, 66% by ATA 2009, and 82% by ATA 2015. Sensitivity and specificity differed significantly (p < 0.0001): 49% and 60% for SRU; 81% and 36% for ATA 2009; and 95% and 20% for ATA 2015. SRU 2005, ATA 2009 and ATA 2015 algorithms are used routinely in clinical practice to determine whether thyroid nodule biopsy is indicated. We demonstrate significant differences in these algorithms' diagnostic performance, which result in a compromise between sensitivity and specificity. Copyright © 2017 Elsevier Inc. All rights reserved.
Data Processing Algorithm for Diagnostics of Combustion Using Diode Laser Absorption Spectrometry.
Mironenko, Vladimir R; Kuritsyn, Yuril A; Liger, Vladimir V; Bolshov, Mikhail A
2018-02-01
A new algorithm for the evaluation of the integral line intensity, for inferring the correct temperature of a hot zone in the diagnostics of combustion by absorption spectroscopy with diode lasers, is proposed. The algorithm is based not on fitting the baseline (BL) but on the expansion of the experimental and simulated spectra in a series of orthogonal polynomials, subtraction of the first three components of the expansion from both the experimental and simulated spectra, and fitting of the spectra thus modified. The algorithm is tested in a numerical experiment by simulating the absorption spectra using a spectroscopic database and adding white noise and a parabolic BL. The absorption spectra constructed in this way are treated as experimental data in further calculations. The theoretical absorption spectra were simulated with parameters (temperature, total pressure, concentration of water vapor) close to the parameters used for simulation of the experimental data. The spectra were then expanded in the series of orthogonal polynomials and the first components were subtracted from both spectra. The correct integral line intensities, and hence the correct temperature evaluation, were obtained by fitting the thus modified experimental and simulated spectra. The dependence of the mean and standard deviation of the integral line intensity estimate on the linewidth and the number of subtracted components (first two or three) was examined. The proposed algorithm provides a correct estimation of temperature with a standard deviation better than 60 K (for T = 1000 K) for line half-widths up to 0.6 cm-1. The proposed algorithm allows the parameters of a hot zone to be obtained without fitting the usually unknown BL.
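The core idea (removing the same low-order polynomial components from both the experimental and the model spectrum so the baseline never has to be fitted explicitly) can be shown on a toy example. The line shape, baseline, and Legendre basis below are illustrative assumptions, not the authors' spectroscopic database or exact procedure.

```python
# Illustrative sketch: strip the first three Legendre components from both the
# "experimental" and simulated spectra, then fit the modified spectra to
# recover the line intensity without modelling the baseline explicitly.
import numpy as np
from numpy.polynomial import legendre

x = np.linspace(-1, 1, 400)                      # normalised frequency axis
line = np.exp(-0.5 * (x / 0.08)**2)              # toy absorption line shape

true_intensity = 1.7
baseline = 0.4 + 0.3 * x + 0.5 * x**2            # unknown parabolic baseline
rng = np.random.default_rng(0)
experimental = true_intensity * line + baseline + rng.normal(0, 0.01, x.size)
simulated = line                                  # unit-intensity model spectrum

def strip_low_order(y, n_components=3):
    """Remove the least-squares projection onto the first n Legendre polynomials."""
    coeffs = legendre.legfit(x, y, deg=n_components - 1)
    return y - legendre.legval(x, coeffs)

exp_mod = strip_low_order(experimental)
sim_mod = strip_low_order(simulated)

# Least-squares scale factor between the modified spectra = line intensity
intensity = np.dot(sim_mod, exp_mod) / np.dot(sim_mod, sim_mod)
print("recovered intensity:", round(float(intensity), 3), "(true 1.7)")
```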
TH-A-BRF-11: Image Intensity Non-Uniformities Between MRI Simulation and Diagnostic MRI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulson, E
2014-06-15
Purpose: MRI simulation for MRI-based radiotherapy demands that patients be set up in treatment position, which frequently involves use of alternative radiofrequency (RF) coil configurations to accommodate immobilized patients. However, alternative RF coil geometries may exacerbate image intensity non-uniformities (IINU) beyond those observed in diagnostic MRI, which may challenge image segmentation and registration accuracy as well as confound studies assessing radiotherapy response when MR simulation images are used as baselines for evaluation. The goal of this work was to determine whether differences in IINU exist between MR simulation and diagnostic MR images. Methods: ACR-MRI phantom images were acquired at 3T using a spin-echo sequence (TE/TR:20/500ms, rBW:62.5kHz, TH/skip:5/5mm). MR simulation images were obtained by wrapping two flexible phased-array RF coils around the phantom. Diagnostic MR images were obtained by placing the phantom into a commercial phased-array head coil. Pre-scan normalization was enabled in both cases. Images were transferred offline and corrected for IINU using the MNI N3 algorithm. Coefficients of variation (CV=σ/μ) were calculated for each slice. Wilcoxon matched-pairs and Mann-Whitney tests compared CV values between original and N3 images and between MR simulation and diagnostic MR images. Results: Significant differences in CV were detected between original and N3 images in both MRI simulation and diagnostic MRI groups (p=0.010, p=0.010). In addition, significant differences in CV were detected between original MR simulation and original and N3 diagnostic MR images (p=0.0256, p=0.0016). However, no significant differences in CV were detected between N3 MR simulation images and original or N3 diagnostic MR images, demonstrating the importance of correcting MR simulation images beyond pre-scan normalization prior to use in radiotherapy. Conclusions: Alternative RF coil configurations used in MRI simulation can result in significant IINU differences compared to diagnostic MR images. The MNI N3 algorithm reduced MR simulation IINU to levels observed in diagnostic MR images. Funding provided by Advancing a Healthier Wisconsin.
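The statistical comparison described (per-slice coefficients of variation, then paired and unpaired nonparametric tests) is straightforward to reproduce in outline. The snippet below uses random arrays standing in for phantom slices; it illustrates the analysis, not the study's images or p-values.

```python
# Illustrative sketch: per-slice CV (sigma/mu), Wilcoxon matched-pairs test for
# original vs "N3-corrected" slices, Mann-Whitney test for unpaired groups.
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu

def cv(img):
    """Coefficient of variation of one slice."""
    return img.std() / img.mean()

rng = np.random.default_rng(0)
n_slices = 11
orig_slices = [rng.normal(100, 9, (64, 64)) for _ in range(n_slices)]
n3_slices = [rng.normal(100, 6, (64, 64)) for _ in range(n_slices)]   # toy "corrected"
diag_slices = [rng.normal(100, 5, (64, 64)) for _ in range(n_slices)] # toy diagnostic coil

cv_orig = [cv(s) for s in orig_slices]
cv_n3 = [cv(s) for s in n3_slices]
cv_diag = [cv(s) for s in diag_slices]

print("Wilcoxon matched-pairs p =", round(wilcoxon(cv_orig, cv_n3).pvalue, 4))
print("Mann-Whitney p =", round(mannwhitneyu(cv_orig, cv_diag).pvalue, 4))
```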
Strategies for concurrent processing of complex algorithms in data driven architectures
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.
1988-01-01
The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and which communicates with a common global data memory. A new graph theoretic model called ATAMM which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
Toward detecting deception in intelligent systems
NASA Astrophysics Data System (ADS)
Santos, Eugene, Jr.; Johnson, Gregory, Jr.
2004-08-01
Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources including electronic sources such as knowledge-based diagnostic or decision support systems or through data mining techniques. As the decision maker becomes more dependent on these electronic information sources, detecting deceptive information from these sources becomes vital to making a correct, or at least more informed, decision. This applies to unintentional disinformation as well as intentional misinformation. Our ongoing research focuses on employing models of deception and deception detection from the fields of psychology and cognitive science to these systems as well as implementing deception detection algorithms for probabilistic intelligent systems. The deception detection algorithms are used to detect, classify and correct attempts at deception. Algorithms for detecting unexpected information rely upon a prediction algorithm from the collaborative filtering domain to predict agent responses in a multi-agent system.
Major developments in the 2016 european guidelines for heart failure.
Trullàs, J C; González-Franco, Á
2017-10-01
The European Society of Cardiology has recently published new guidelines on the diagnosis and treatment of acute and chronic heart failure (HF). This article aims to review these recommendations and their level of scientific evidence and to present the most innovative aspects. The most significant changes from the 2012 edition are: 1) the introduction of the concept of HF with midrange LVEF (40-49%); 2) a new diagnostic algorithm for chronic HF, initially considering the clinical probability; 3) recommendations on preventing or delaying the onset of HF; 4) indications for the use of the new compound sacubitril-valsartan, the first angiotensin receptor-neprilysin inhibitor; 5) modification of the indications for cardiac resynchronisation therapy; and 6) a new algorithm for a combined diagnostic and treatment strategy for acute HF based on the presence or absence of congestion and hypoperfusion. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.
Diagnosis and Management of Functional Heartburn.
Hachem, Christine; Shaheen, Nicholas J
2016-01-01
Heartburn is among the most common gastrointestinal symptoms presenting to both generalist physicians and gastroenterologists. Heartburn that does not respond to traditional acid suppression is a diagnostic and therapeutic dilemma. In the era of high utilization of proton pump inhibitors, a substantial proportion of patients presenting to the gastroenterologist with chronic symptoms of heartburn do not have a reflux-mediated disease. Subjects without objective evidence of reflux as a cause of their symptoms have "functional heartburn". The diagnostic role of endoscopy, reflux and motility testing in functional heartburn (FH) patients is discussed. Lifestyle modifications, pharmacological interventions, and alternative therapies for FH are also presented. Recognition of patients with FH allows earlier assignment of these patients to different treatment algorithms, which may allow greater likelihood of success of treatment, diminished resource utilization and improved quality of life. Further data on this large and understudied group of patients is necessary to allow improvement in treatment algorithms and a more evidence-based approach to care of these patients.
Architecting Integrated System Health Management for Airworthiness
2013-09-01
This report discusses improving aircraft safety and reliability through condition-based maintenance [Miller et al., 1991] and, with the same motivation, Integrated System Health Management (ISHM) diagnostics and prognostics algorithms, including Health and Usage Monitoring Systems (HUMS) in helicopters, driven by the demand for improved operational safety for offshore shuttle helicopters serving the petroleum installations in the North Sea.
Tan, Bruce K; Lu, Guanning; Kwasny, Mary J; Hsueh, Wayne D; Shintani-Smith, Stephanie; Conley, David B; Chandra, Rakesh K; Kern, Robert C; Leung, Randy
2013-11-01
Current symptom criteria poorly predict a diagnosis of chronic rhinosinusitis (CRS), resulting in excessive treatment of patients with presumed CRS. The objective of this study was to analyze the positive predictive value of individual symptoms, or symptoms in combination, in patients with CRS symptoms and to examine the costs of the subsequent diagnostic algorithm using a decision tree-based cost analysis. We analyzed previously collected patient-reported symptoms from a cross-sectional study of patients who had received a computed tomography (CT) scan of their sinuses at a tertiary care otolaryngology clinic for evaluation of CRS symptoms to calculate the positive predictive value of individual symptoms. Classification and regression tree (CART) analysis then optimized combinations of symptoms and thresholds to identify CRS patients. The calculated positive predictive values were applied to a previously developed decision tree that compared an upfront CT (uCT) algorithm against an empiric medical therapy (EMT) algorithm, with further analysis that considered the availability of point of care (POC) imaging. The positive predictive value of individual symptoms ranged from 0.21 for patients reporting forehead pain to 0.69 for patients reporting hyposmia. The CART analysis constructed a dichotomous model based on forehead pain, maxillary pain, hyposmia, nasal discharge, and facial pain (C-statistic 0.83). If POC CT were available, median costs ($64-$415) favored uCT for all individual symptoms. If POC CT was unavailable, median costs favored uCT for most symptoms except intercanthal pain (-$15), hyposmia (-$100), and discolored nasal discharge (-$24), although these symptoms became equivocal on cost sensitivity analysis. The three-tiered CART model could subcategorize patients into tiers where uCT was always favorable (median costs: $332-$504) and others for which EMT was always favorable (median costs -$121 to -$275). The uCT algorithm was always more costly if the nasal endoscopy was positive. Among patients with classic CRS symptoms, individual symptoms changed the likelihood of a CRS diagnosis only marginally. Only hyposmia, the absence of facial pain, and discolored discharge sufficiently increased the likelihood of diagnosis to potentially make EMT less costly. The development of an evidence-based, multisymptom-based risk stratification model could substantially affect the management costs of the subsequent diagnostic algorithm. © 2013 ARS-AAOA, LLC.
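A CART-style model on binary symptom indicators can be sketched with a standard decision-tree learner. The data, outcome model, and tree below are entirely synthetic; only the symptom names follow the abstract, and the fitted tree has no relation to the study's model.

```python
# Illustrative sketch of a CART-style model on binary symptom indicators
# predicting a CT-confirmed CRS diagnosis (synthetic data, hypothetical rule).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
symptoms = ["forehead_pain", "maxillary_pain", "hyposmia",
            "nasal_discharge", "facial_pain"]
X = rng.integers(0, 2, size=(400, len(symptoms)))
# Toy outcome: hyposmia raises and facial pain lowers the odds of CRS on CT
logit = -0.5 + 1.5 * X[:, 2] - 1.0 * X[:, 4]
y = rng.random(400) < 1 / (1 + np.exp(-logit))

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
print(export_text(tree, feature_names=symptoms))
```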
Beam Stability R&D for the APS MBA Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sereno, Nicholas S.; Arnold, Ned D.; Bui, Hanh D.
2015-01-01
Beam diagnostics required for the APS multi-bend achromat (MBA) upgrade are driven by ambitious beam stability requirements. The major AC stability challenge is to correct rms beam motion to 10% of the rms beam size at the insertion device source points from 0.01 to 1000 Hz. The vertical plane represents the biggest challenge for AC stability, which is required to be 400 nm rms for a 4-micron vertical beam size. In addition to AC stability, long-term drift over a period of seven days is required to be 1 micron or less. Major diagnostics R&D components include improved rf beam position processing using commercially available FPGA-based BPM processors, new X-ray beam position monitors based on hard X-ray fluorescence from copper and Compton scattering off diamond, mechanical motion sensing to detect and correct long-term vacuum chamber drift, a new feedback system featuring a tenfold increase in sampling rate, and a several-fold increase in the number of fast correctors and BPMs in the feedback algorithm. Feedback system development represents a major effort, and we are pursuing development of a novel algorithm that integrates orbit correction for both slow and fast correctors down to DC simultaneously. Finally, a new data acquisition system (DAQ) is being developed to simultaneously acquire streaming data from all diagnostics as well as the feedback processors for commissioning and fault diagnosis. Results of studies and the design effort are reported.
NASA Astrophysics Data System (ADS)
Hidalgo-Aguirre, Maribel; Gitelman, Julian; Lesk, Mark Richard; Costantino, Santiago
2015-11-01
Optical coherence tomography (OCT) imaging has become a standard diagnostic tool in ophthalmology, providing essential information associated with various eye diseases. In order to investigate the dynamics of the ocular fundus, we present a simple and accurate automated algorithm to segment the inner limiting membrane in video-rate optic nerve head spectral domain (SD) OCT images. The method is based on morphological operations including a two-step contrast enhancement technique, proving to be very robust when dealing with low signal-to-noise ratio images and pathological eyes. An analysis algorithm was also developed to measure neuroretinal tissue deformation from the segmented retinal profiles. The performance of the algorithm is demonstrated, and deformation results are presented for healthy and glaucomatous eyes.
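The general approach described (contrast enhancement and morphological operations followed by extraction of the inner limiting membrane profile) can be illustrated with a column-wise first-boundary search on a synthetic B-scan. This is a hedged stand-in, not the authors' algorithm, enhancement steps, or data.

```python
# Illustrative sketch: contrast enhancement, smoothing/morphology, then a
# column-wise search for the first bright boundary as the ILM (synthetic B-scan).
import numpy as np
from skimage import exposure, filters, morphology

rng = np.random.default_rng(0)
h, w = 200, 300
bscan = rng.normal(0.1, 0.03, (h, w))
surface = (80 + 15 * np.sin(np.linspace(0, 3, w))).astype(int)  # toy ILM depth
for col, row in enumerate(surface):
    bscan[row:, col] += 0.4                      # retina below the ILM is brighter
bscan = np.clip(bscan, 0, 1)

enhanced = exposure.equalize_adapthist(bscan)    # contrast enhancement step
smoothed = filters.gaussian(enhanced, sigma=2)
mask = morphology.binary_closing(smoothed > filters.threshold_otsu(smoothed),
                                 morphology.disk(3))

ilm = mask.argmax(axis=0)                        # first True row per A-scan column
print("mean absolute error vs toy truth (pixels):",
      round(float(np.abs(ilm - surface).mean()), 2))
```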
Cost-effective Diagnostic Checklists for Meningitis in Resource Limited Settings
Durski, Kara N.; Kuntz, Karen M.; Yasukawa, Kosuke; Virnig, Beth A.; Meya, David B.; Boulware, David R.
2013-01-01
Background Checklists can standardize patient care, reduce errors, and improve health outcomes. For meningitis in resource-limited settings, with high patient loads and limited financial resources, CNS diagnostic algorithms may be useful to guide diagnosis and treatment. However, the cost-effectiveness of such algorithms is unknown. Methods We used decision analysis methodology to evaluate the costs, diagnostic yield, and cost-effectiveness of diagnostic strategies for adults with suspected meningitis in resource limited settings with moderate/high HIV prevalence. We considered three strategies: 1) comprehensive “shotgun” approach of utilizing all routine tests; 2) “stepwise” strategy with tests performed in a specific order with additional TB diagnostics; 3) “minimalist” strategy of sequential ordering of high-yield tests only. Each strategy resulted in one of four meningitis diagnoses: bacterial (4%), cryptococcal (59%), TB (8%), or other (aseptic) meningitis (29%). In model development, we utilized prevalence data from two Ugandan sites and published data on test performance. We validated the strategies with data from Malawi, South Africa, and Zimbabwe. Results The current comprehensive testing strategy resulted in 93.3% correct meningitis diagnoses costing $32.00/patient. A stepwise strategy had 93.8% correct diagnoses costing an average of $9.72/patient, and a minimalist strategy had 91.1% correct diagnoses costing an average of $6.17/patient. The incremental cost effectiveness ratio was $133 per additional correct diagnosis for the stepwise over minimalist strategy. Conclusions Through strategically choosing the order and type of testing coupled with disease prevalence rates, algorithms can deliver more care more efficiently. The algorithms presented herein are generalizable to East Africa and Southern Africa. PMID:23466647
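The incremental cost-effectiveness ratio quoted above follows directly from the reported per-patient costs and proportions of correct diagnoses. The helper below recomputes it; the small difference from the reported $133 reflects rounding of the figures in the abstract.

```python
# Illustrative recomputation of the incremental cost-effectiveness ratio (ICER)
# from the figures quoted in the abstract (values approximate).
def icer(cost_a, eff_a, cost_b, eff_b):
    """Incremental cost per additional unit of effect, strategy A vs B."""
    return (cost_a - cost_b) / (eff_a - eff_b)

stepwise = {"cost": 9.72, "correct": 0.938}
minimalist = {"cost": 6.17, "correct": 0.911}

print("ICER (stepwise vs minimalist): $%.0f per additional correct diagnosis"
      % icer(stepwise["cost"], stepwise["correct"],
             minimalist["cost"], minimalist["correct"]))
```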
Shin, Ha Kyung; Grahame, George; McCandless, Shawn E; Kerr, Douglas S; Bedoyan, Jirair K
2017-11-01
Pyruvate dehydrogenase complex (PDC) deficiency is a major cause of primary lactic acidemia in children. Prompt and correct diagnosis of PDC deficiency and differentiating between specific and generalized or secondary deficiencies have important implications for clinical management and therapeutic interventions. Both genetic and enzymatic testing approaches are being used in the diagnosis of PDC deficiency. However, the diagnostic efficacy of such testing approaches for individuals affected with PDC deficiency has not been systematically investigated in this disorder. We sought to evaluate the diagnostic sensitivity and variability of the various PDC enzyme assays in females and males at the Center for Inherited Disorders of Energy Metabolism (CIDEM). CIDEM data were filtered by lactic acidosis and functional PDC deficiency in at least one cell/tissue type (blood lymphocytes, cultured fibroblasts or skeletal muscle), identifying 186 subjects (51% male and 49% female); about half were genetically resolved, with 78% of those determined to have a pathogenic PDHA1 mutation. Assaying PDC in cultured fibroblasts in cases where the underlying genetic etiology is PDHA1 was highly sensitive irrespective of gender: 97% (95% confidence interval [CI]: 90%-100%) and 91% (95% CI: 82%-100%) in females and males, respectively. In contrast to the fibroblast-based testing, the lymphocyte- and muscle-based testing were not sensitive (36% [95% CI: 11%-61%, p=0.0003] and 58% [95% CI: 30%-86%, p=0.014], respectively) for identifying known PDC-deficient females with pathogenic PDHA1 mutations. In males with a known PDHA1 mutation, the sensitivities of the various cell/tissue assays (75% lymphocyte, 91% fibroblast and 88% muscle) were not statistically different, and the discordance frequency due to the specific cell/tissue used for assaying PDC was 0.15±0.11. Based on these data, a practical diagnostic algorithm is proposed accounting for current molecular approaches, enzyme testing sensitivity, and variability due to gender, cell/tissue type used for testing, and successive repeat testing. Copyright © 2017 Elsevier Inc. All rights reserved.
Kalmar, Alain F; Absalom, Anthony; Rombouts, Pieter; Roets, Jelle; Dewaele, Frank; Verdonck, Pascal; Stemerdink, Arjanne; Zijlstra, Jan G; Monsieurs, Koenraad G
2016-08-01
Unrecognised endotracheal tube misplacement in emergency intubations has a reported incidence of up to 17%. Current detection methods have many limitations restricting their reliability and availability in these circumstances. There is therefore a clinical need for a device that is small enough to be practical in emergency situations and that can detect oesophageal intubation within seconds. In a first reported evaluation, we demonstrated an algorithm based on pressure waveform analysis, able to determine tube location with high reliability in healthy patients. The aim of this study was to validate the specificity of the algorithm in patients with abnormal pulmonary compliance, and to demonstrate the reliability of a newly developed small device that incorporates the technology. Intubated patients with mild to moderate lung injury, admitted to intensive care were included in the study. The device was connected to the endotracheal tube, and three test ventilations were performed in each patient. All diagnostic data were recorded on PC for subsequent specificity/sensitivity analysis. A total of 105 ventilations in 35 patients with lung injury were analysed. With the threshold D-value of 0.1, the system showed a 100% sensitivity and specificity to diagnose tube location. The algorithm retained its specificity in patients with decreased pulmonary compliance. We also demonstrated the feasibility to integrate sensors and diagnostic hardware in a small, portable hand-held device for convenient use in emergency situations. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
[SCAN system--semi-structured interview based on diagnostic criteria].
Adamowski, Tomasz; Kiejna, Andrzej; Hadryś, Tomasz
2006-01-01
This paper presents the main features of contemporary diagnostic systems that are implemented in SCAN, a modern, semi-structured diagnostic interview. The focus is on the concepts behind the further development of the classifications and on the rationale for operationalized diagnostic criteria and for the divisional approach to mental diagnoses. The structure and components of SCAN ver. 2.1 (WHO), i.e., the Present State Examination (10th edition), the Item Group Checklist, the Clinical History Schedule, the Glossary of Definitions and the I-Shell computer software with the diagnostic algorithm, as well as rules for the reliable use of diagnostic rating scales, are discussed within the scope of this paper. The materials and training sets necessary for learning the proper use of SCAN, especially the training sets for SCAN Training Centers and the Reference Manual, a guidebook for SCAN, are introduced. Finally, the paper presents evidence that SCAN is an instrument feasible in different cultural settings. Reliability and validity data for SCAN are also discussed, indicating that SCAN can be widely used in research studies as well as in everyday clinical practice, facilitating a more detailed diagnostic approach to the patient.
Schüpbach, Jörg; Gebhardt, Martin D.; Scherrer, Alexandra U.; Bisset, Leslie R.; Niederhauser, Christoph; Regenass, Stephan; Yerly, Sabine; Aubert, Vincent; Suter, Franziska; Pfister, Stefan; Martinetti, Gladys; Andreutti, Corinne; Klimkait, Thomas; Brandenberger, Marcel; Günthard, Huldrych F.
2013-01-01
Background: Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods: We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection of ≤12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship ‘Prevalence = Incidence x Duration’ in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship ‘incident = true incident + false incident’, and also to the IIR derived from the BED incidence assay. Results: Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R2 = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based IIR and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. Conclusions: IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts. PMID:23990968
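To make the window-based calculation described above concrete, the sketch below (illustrative numbers and function names only, not the authors' code) fits a line to the proportion of samples an algorithm rules incident versus time since infection, takes the x-intercept as the window period, and converts the fraction of notifications classified recent into an incidence estimate via Prevalence = Incidence x Duration.

import numpy as np

def estimate_window_days(bin_midpoints_days, proportion_ruled_incident):
    # Linear regression of the proportion ruled incident on time since infection;
    # the window is the time at which the fitted proportion reaches zero.
    slope, intercept = np.polyfit(bin_midpoints_days, proportion_ruled_incident, 1)
    return -intercept / slope

def window_based_iir(n_classified_recent, n_notifications, window_days):
    # Prevalence = Incidence x Duration  =>  Incidence = Prevalence / Duration.
    prevalence_of_recent = n_classified_recent / n_notifications
    return prevalence_of_recent / (window_days / 365.25)   # per person-year

# Illustrative values only:
window = estimate_window_days([45, 135, 225, 315], [0.85, 0.55, 0.20, 0.02])
print(round(window, 1), round(window_based_iir(120, 748, window), 3))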
On-the-fly detection of images with gastritis aspects in magnetically guided capsule endoscopy
NASA Astrophysics Data System (ADS)
Mewes, P. W.; Neumann, D.; Juloski, A. L.; Angelopoulou, E.; Hornegger, J.
2011-03-01
Capsule Endoscopy (CE) was introduced in 2000 and has since become an established diagnostic procedure for the small bowel, colon and esophagus. For the CE examination the patient swallows the capsule, which then travels through the gastrointestinal tract under the influence of peristaltic movements. CE is not indicated for stomach examination, as the capsule movements cannot be controlled from the outside and the entire surface of the stomach cannot be reliably covered. Magnetically-guided capsule endoscopy (MGCE) was introduced in 2010. For the MGCE procedure the stomach is filled with water and the capsule is navigated from the outside using an external magnetic field. During the examination the operator can control the motion of the capsule in order to obtain a sufficient number of stomach-surface images with diagnostic value. The quality of the examination depends on the skill of the operator and the ability to detect aspects of interest in real time. We present a novel computer-assisted diagnostic procedure (CADP) algorithm for indicating gastritis pathologies in the stomach during the examination. Our algorithm is based on pre-processing methods and feature vectors that are suitably chosen for the challenges of MGCE imaging (suspended particles, bubbles, lighting). An image is classified using an AdaBoost-trained classifier. For the classifier training, a number of possible features were investigated. Statistical evaluation was conducted to identify relevant features with discriminative potential. The proposed algorithm was tested on 12 video sequences stemming from 6 volunteers. A mean detection rate of 91.17% was achieved during leave-one-out cross-validation.
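As a rough illustration of the classification stage described above, the sketch below trains an AdaBoost classifier on per-frame feature vectors and scores it with leave-one-volunteer-out cross-validation; the feature dimensions, synthetic labels and the scikit-learn estimator are assumptions, not the authors' implementation.

import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))          # e.g. colour/texture features per frame
y = (X[:, 0] + X[:, 1] > 0).astype(int) # toy "gastritis aspect present" label
groups = np.repeat(np.arange(6), 100)   # one group per volunteer

clf = AdaBoostClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print("mean leave-one-volunteer-out accuracy:", scores.mean())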
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived, each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
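For readers unfamiliar with the evidence-fusion step, here is a minimal, generic implementation of Dempster's rule of combination; the failure-mode labels and mass values are invented for illustration, and the model-based assignment of basic probability measures described in the abstract is not reproduced.

def dempster_combine(m1, m2):
    # Mass functions are dicts mapping frozensets of hypotheses to basic
    # probability masses. Conflicting mass is discarded and the rest renormalized.
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

# Two diagnostic models, each assigning mass over failure hypotheses f1, f2:
m1 = {frozenset({"f1"}): 0.6, frozenset({"f1", "f2"}): 0.4}
m2 = {frozenset({"f2"}): 0.3, frozenset({"f1", "f2"}): 0.7}
print(dempster_combine(m1, m2))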
EAACI position paper on occupational rhinitis
Moscato, Gianna; Vandenplas, Olivier; Van Wijk, Roy Gerth; Malo, Jean-Luc; Perfetti, Luca; Quirce, Santiago; Walusiak, Jolanta; Castano, Roberto; Pala, Gianni; Gautrin, Denyse; De Groot, Hans; Folletti, Ilenia; Yacoub, Mona Rita; Siracusa, Andrea
2009-01-01
The present document is the result of a consensus reached by a panel of experts from European and non-European countries on Occupational Rhinitis (OR), a disease of emerging relevance which has received little attention in comparison to occupational asthma. The document covers the main items of OR including epidemiology, diagnosis, management, socio-economic impact, preventive strategies and medicolegal issues. An operational definition and classification of OR tailored on that of occupational asthma, as well as a diagnostic algorithm based on steps allowing for different levels of diagnostic evidence are proposed. The needs for future research are pointed out. Key messages are issued for each item. PMID:19257881
Iron deficiency anemia: diagnosis and management.
Clark, Susan F
2009-03-01
Iron deficiency anemia (IDA) still remains universally problematic worldwide. The primary focus of this review is to critique articles published over the past 18 months that describe strategies for the diagnosis and management of this prevalent condition. The medical community continues to lack consensus when identifying the optimal approach for the diagnosis and management of IDA. Current diagnostic recommendations revolve around the validity and practicality of current biomarkers such as soluble transferrin-receptor concentrations and others, and cause-based diagnostics that potentially include endoscopy. Management of IDA is based on supplementation combined with effective etiological treatment. Advances in oral and parenteral low-molecular-weight iron preparations has expanded and improved treatment modalities for IDA. Since the introduction of low versus high-molecular-weight intravenous iron administration, there have been fewer serious adverse events associated with parenteral iron preparations. Best practice guidelines for diagnosing and managing IDA should include the design of an algorithm that is inclusive of multiple biomarkers and cause-based diagnostics, which will provide direction in managing IDA, and distinguish between IDA from the anemia of chronic disease.
Translational bioinformatics: linking the molecular world to the clinical world.
Altman, R B
2012-06-01
Translational bioinformatics represents the union of translational medicine and bioinformatics. Translational medicine moves basic biological discoveries from the research bench into the patient-care setting and uses clinical observations to inform basic biology. It focuses on patient care, including the creation of new diagnostics, prognostics, prevention strategies, and therapies based on biological discoveries. Bioinformatics involves algorithms to represent, store, and analyze basic biological data, including DNA sequence, RNA expression, and protein and small-molecule abundance within cells. Translational bioinformatics spans these two fields; it involves the development of algorithms to analyze basic molecular and cellular data with an explicit goal of affecting clinical care.
Dealing with low-incidence serious diseases in general practice
Buntinx, Frank; Mant, David; Van den Bruel, Ann; Donner-Banzhof, Norbert; Dinant, Geert-Jan
2011-01-01
Cost-effective health care depends on high-quality triage. The most challenging aspect of triage, which GPs confront on a regular basis, is diagnosing rare but serious disease. Failure to shoulder any risk in this situation overloads the health system and subjects patients to unnecessary investigation. Adopting too high a risk threshold leads to missed cases, late diagnosis, and sometimes avoidable death. It also undermines the credibility of primary care practitioners. Quantification of diagnostic risk suggests there is a potential risk gap between the maximum certainty with which GPs can assess the risk of serious disease at presentation and the minimum certainty required by many health systems for further investigation or hospital referral. Physician gut-feeling and diagnostic safety netting are often employed to fill the gap. Neither strategy is well defined or well supported by evidence. It should be possible to reduce the diagnostic risk gap cost-effectively by adopting more explicit diagnostic algorithms and providing better GP access to new diagnostic technologies. It is also essential, given the decreasing experience of triage clinicians employed in a number of countries, that a teachable evidence base is constructed for gut feeling and diagnostic safety netting. However, this construction of an evidence base requires very large-scale studies, and the global primary care research community remains small. The challenge therefore needs to be met by urgent and effective international collaboration. PMID:21401991
NASA Astrophysics Data System (ADS)
Lauritzen, P. H.; Ullrich, P. A.; Jablonowski, C.; Bosler, P. A.; Calhoun, D.; Conley, A. J.; Enomoto, T.; Dong, L.; Dubey, S.; Guba, O.; Hansen, A. B.; Kaas, E.; Kent, J.; Lamarque, J.-F.; Prather, M. J.; Reinert, D.; Shashkin, V. V.; Skamarock, W. C.; Sørensen, B.; Taylor, M. A.; Tolstykh, M. A.
2013-09-01
Recently, a standard test case suite for 2-D linear transport on the sphere was proposed to assess important aspects of accuracy in geophysical fluid dynamics with a "minimal" set of idealized model configurations/runs/diagnostics. Here we present results from 19 state-of-the-art transport scheme formulations based on finite-difference/finite-volume methods as well as emerging (in the context of atmospheric/oceanographic sciences) Galerkin methods. Discretization grids range from traditional regular latitude-longitude grids to more isotropic domain discretizations such as icosahedral and cubed-sphere tessellations of the sphere. The schemes are evaluated using a wide range of diagnostics in idealized flow environments. Accuracy is assessed in single- and two-tracer configurations using conventional error norms as well as novel diagnostics designed for climate and climate-chemistry applications. In addition, algorithmic considerations that may be important for computational efficiency are reported on. The latter is inevitably computing platform dependent. The ensemble of results from a wide variety of schemes presented here helps shed light on the ability of the test case suite diagnostics and flow settings to discriminate between algorithms and provide insights into accuracy in the context of global atmospheric/ocean modeling. A library of benchmark results is provided to facilitate scheme intercomparison and model development. Simple software and data-sets are made available to facilitate the process of model evaluation and scheme intercomparison.
NASA Astrophysics Data System (ADS)
Lauritzen, P. H.; Ullrich, P. A.; Jablonowski, C.; Bosler, P. A.; Calhoun, D.; Conley, A. J.; Enomoto, T.; Dong, L.; Dubey, S.; Guba, O.; Hansen, A. B.; Kaas, E.; Kent, J.; Lamarque, J.-F.; Prather, M. J.; Reinert, D.; Shashkin, V. V.; Skamarock, W. C.; Sørensen, B.; Taylor, M. A.; Tolstykh, M. A.
2014-01-01
Recently, a standard test case suite for 2-D linear transport on the sphere was proposed to assess important aspects of accuracy in geophysical fluid dynamics with a "minimal" set of idealized model configurations/runs/diagnostics. Here we present results from 19 state-of-the-art transport scheme formulations based on finite-difference/finite-volume methods as well as emerging (in the context of atmospheric/oceanographic sciences) Galerkin methods. Discretization grids range from traditional regular latitude-longitude grids to more isotropic domain discretizations such as icosahedral and cubed-sphere tessellations of the sphere. The schemes are evaluated using a wide range of diagnostics in idealized flow environments. Accuracy is assessed in single- and two-tracer configurations using conventional error norms as well as novel diagnostics designed for climate and climate-chemistry applications. In addition, algorithmic considerations that may be important for computational efficiency are reported on. The latter is inevitably computing platform dependent. The ensemble of results from a wide variety of schemes presented here helps shed light on the ability of the test case suite diagnostics and flow settings to discriminate between algorithms and provide insights into accuracy in the context of global atmospheric/ocean modeling. A library of benchmark results is provided to facilitate scheme intercomparison and model development. Simple software and data sets are made available to facilitate the process of model evaluation and scheme intercomparison.
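As a concrete example of the conventional error norms used as accuracy diagnostics in such intercomparisons, the sketch below computes weighted l1, l2 and l-infinity norms of a numerical tracer field against a reference field; the placeholder fields and uniform cell weights are assumptions and are not part of the test case suite software.

import numpy as np

def error_norms(q_numerical, q_reference, cell_weights):
    # Weighted l1, l2 and l_inf norms of the solution error, normalized by the
    # corresponding norms of the reference field.
    w = cell_weights / cell_weights.sum()
    diff = q_numerical - q_reference
    l1 = np.sum(w * np.abs(diff)) / np.sum(w * np.abs(q_reference))
    l2 = np.sqrt(np.sum(w * diff**2) / np.sum(w * q_reference**2))
    linf = np.max(np.abs(diff)) / np.max(np.abs(q_reference))
    return l1, l2, linf

q_ref = np.ones((90, 180))                    # placeholder reference tracer field
q_num = q_ref + 1e-3 * np.random.default_rng(0).normal(size=q_ref.shape)
weights = np.ones_like(q_ref)                 # in practice, spherical cell areas
print(error_norms(q_num, q_ref, weights))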
CSE database: extended annotations and new recommendations for ECG software testing.
Smíšek, Radovan; Maršánová, Lucie; Němcová, Andrea; Vítek, Martin; Kozumplík, Jiří; Nováková, Marie
2017-08-01
Nowadays, cardiovascular diseases represent the most common cause of death in western countries. Among various examination techniques, electrocardiography (ECG) is still a highly valuable tool used for the diagnosis of many cardiovascular disorders. In order to diagnose a person based on ECG, cardiologists can use automatic diagnostic algorithms. Research in this area is still necessary. In order to compare various algorithms correctly, it is necessary to test them on standard annotated databases, such as the Common Standards for Quantitative Electrocardiography (CSE) database. According to Scopus, the CSE database is the second most cited standard database. There were two main objectives in this work. First, new diagnoses were added to the CSE database, which extended its original annotations. Second, new recommendations for diagnostic software quality estimation were established. The ECG recordings were diagnosed by five new cardiologists independently, and in total, 59 different diagnoses were found. Such a large number of diagnoses is unique, even in terms of standard databases. Based on the cardiologists' diagnoses, a four-round consensus (4R consensus) was established. Such a 4R consensus means a correct final diagnosis, which should ideally be the output of any tested classification software. The accuracy of the cardiologists' diagnoses compared with the 4R consensus was the basis for the establishment of accuracy recommendations. The accuracy was determined in terms of sensitivity = 79.20-86.81%, positive predictive value = 79.10-87.11%, and the Jaccard coefficient = 72.21-81.14%, respectively. Within these ranges, the accuracy of the software is comparable with the accuracy of cardiologists. The accuracy quantification of the correct classification is unique. Diagnostic software developers can objectively evaluate the success of their algorithm and promote its further development. The annotations and recommendations proposed in this work will allow for faster development and testing of classification software. As a result, this might facilitate cardiologists' work and lead to faster diagnoses and earlier treatment.
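To illustrate how agreement with the 4R consensus can be quantified per recording in the terms reported above (sensitivity, positive predictive value, Jaccard coefficient), here is a small set-based sketch; the diagnosis labels are invented and do not come from the CSE database.

def set_agreement(consensus, proposed):
    # Treat each recording's diagnoses as a set and compare against the consensus.
    consensus, proposed = set(consensus), set(proposed)
    tp = len(consensus & proposed)
    sensitivity = tp / len(consensus) if consensus else 1.0
    ppv = tp / len(proposed) if proposed else 1.0
    jaccard = tp / len(consensus | proposed) if consensus | proposed else 1.0
    return sensitivity, ppv, jaccard

consensus = {"anterior MI", "LVH"}          # 4R consensus for one recording
software = {"anterior MI", "RBBB"}          # output of a tested classifier
print(set_agreement(consensus, software))   # (0.5, 0.5, 0.333...)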
Augmenting epidemiological models with point-of-care diagnostics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.
Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients and our calibration algorithm is sufficiently capable of predicting peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading, therefore further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.
Bhaumik, Basabi
2016-01-01
A novel algorithm based on forward search is developed for real-time electrocardiogram (ECG) signal processing and implemented in application specific integrated circuit (ASIC) for QRS complex related cardiovascular disease diagnosis. The authors have evaluated their algorithm using MIT-BIH database and achieve sensitivity of 99.86% and specificity of 99.93% for QRS complex peak detection. In this Letter, Physionet PTB diagnostic ECG database is used for QRS complex related disease detection. An ASIC for cardiovascular disease detection is fabricated using 130-nm CMOS high-speed process technology. The area of the ASIC is 0.5 mm2. The power dissipation is 1.73 μW at the operating frequency of 1 kHz with a supply voltage of 0.6 V. The output from the ASIC is fed to their Android application that generates diagnostic report and can be sent to a cardiologist through email. Their ASIC result shows average failed detection rate of 0.16% for six leads data of 290 patients in PTB diagnostic ECG database. They also have implemented a low-leakage version of their ASIC. The ASIC dissipates only 45 pJ with a supply voltage of 0.9 V. Their proposed ASIC is most suitable for energy efficient telemetry cardiovascular disease detection system. PMID:27284458
Augmenting epidemiological models with point-of-care diagnostics data
Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.; ...
2016-04-20
Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients and our calibration algorithm is sufficiently capable of predicting peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading, therefore further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.
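A hedged sketch of the calibration idea follows: a parsimonious SIR model whose transmission and recovery rates are fitted to an observed infected-count series by simulated annealing. SciPy's dual_annealing optimizer, the synthetic data and the parameter bounds are stand-ins; this is not the authors' calibration code.

import numpy as np
from scipy.integrate import odeint
from scipy.optimize import dual_annealing

def sir(y, t, beta, gamma):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

def simulate_infected(beta, gamma, weeks, i0=1e-4):
    # Fraction of the population infected at each weekly time step.
    y0 = [1.0 - i0, i0, 0.0]
    t = np.arange(weeks)
    return odeint(sir, y0, t, args=(beta, gamma))[:, 1]

rng = np.random.default_rng(0)
observed = simulate_infected(1.6, 0.9, 30) + rng.normal(0, 1e-4, 30)  # stand-in for POC counts

def cost(params):
    beta, gamma = params
    return np.sum((simulate_infected(beta, gamma, len(observed)) - observed) ** 2)

result = dual_annealing(cost, bounds=[(0.1, 3.0), (0.1, 2.0)], maxiter=200, seed=0)
print("calibrated (beta, gamma):", result.x)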
De Carolis, Elena; Paoletti, Silvia; Nagel, Domenico; Vella, Antonietta; Mello, Enrica; Palucci, Ivana; De Angelis, Giulia; D'Inzeo, Tiziana; Sanguinetti, Maurizio; Posteraro, Brunella; Spanu, Teresa
2017-01-01
Nowadays, the global spread of resistance to oxyimino-cephalosporins in Enterobacteriaceae implies the need for novel diagnostics that can rapidly target resistant organisms from these bacterial species. In this study, we developed and evaluated a Direct Mass Spectrometry assay for Beta-Lactamase (D-MSBL) that allows direct identification of (oxyimino)cephalosporin-resistant Escherichia coli or Klebsiella pneumoniae from positive blood cultures (BCs), by using the matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) technology. The D-MSBL assay was performed on 93 E. coli or K. pneumoniae growing BC samples that were shortly co-incubated with cefotaxime (CTX) as the indicator cephalosporin. Susceptibility and resistance defining peaks from the samples' mass spectra were analyzed by a novel algorithm for bacterial organism classification. The D-MSBL assay allowed discrimination between E. coli and K. pneumoniae that were resistant or susceptible to CTX with a sensitivity of 86.8% and a specificity of 98.2%. The proposed algorithm-based D-MSBL assay, if integrated in the routine laboratory diagnostic workflow, may be useful to enhance the establishment of appropriate antibiotic therapy and to control the threat of oxyimino-cephalosporin resistance in hospital.
Jain, Sanjeev Kumar; Bhaumik, Basabi
2016-03-01
A novel algorithm based on forward search is developed for real-time electrocardiogram (ECG) signal processing and implemented in application specific integrated circuit (ASIC) for QRS complex related cardiovascular disease diagnosis. The authors have evaluated their algorithm using MIT-BIH database and achieve sensitivity of 99.86% and specificity of 99.93% for QRS complex peak detection. In this Letter, Physionet PTB diagnostic ECG database is used for QRS complex related disease detection. An ASIC for cardiovascular disease detection is fabricated using 130-nm CMOS high-speed process technology. The area of the ASIC is 0.5 mm(2). The power dissipation is 1.73 μW at the operating frequency of 1 kHz with a supply voltage of 0.6 V. The output from the ASIC is fed to their Android application that generates diagnostic report and can be sent to a cardiologist through email. Their ASIC result shows average failed detection rate of 0.16% for six leads data of 290 patients in PTB diagnostic ECG database. They also have implemented a low-leakage version of their ASIC. The ASIC dissipates only 45 pJ with a supply voltage of 0.9 V. Their proposed ASIC is most suitable for energy efficient telemetry cardiovascular disease detection system.
Zvyagin, V N; Rakitin, V A; Fomina, E E
The objective of the present study was to develop a point-digital model for the scaleless interpretation of dermatoglyphic papillary patterns on human fingers that would allow the main characteristics of the traits to be comprehensively described in digital terms and the frequency of their inheritance to be quantitatively assessed. A specially developed computer program, D.glyphic. 7-14, was used to mark the dermatoglyphic patterns on fingerprints obtained from 30 familial triplets (father + mother + child). The values of all the studied traits for kinship diagnostics were found by calculating the ratios of the sums of differences between the traits in the parent-parent pairs to those in the respective parent-child pairs. Algorithms for the point marking of the traits and for reading out the digital information about them were developed. Traditional dermatoglyphic patterns were selected, and novel ones were added, for use within the point-digital interpretation model for the diagnostics of consanguineous relationship. The present experimental study demonstrated a high level of inheritance of the selected traits and the possibility of developing algorithms and computation techniques for calculating consanguineous relationship coefficients based on these traits.
van't Hoog, Anna H; Cobelens, Frank; Vassall, Anna; van Kampen, Sanne; Dorman, Susan E; Alland, David; Ellner, Jerrold
2013-01-01
High costs are a limitation to scaling up the Xpert MTB/RIF assay (Xpert) for the diagnosis of tuberculosis in resource-constrained settings. A triaging strategy in which a sensitive but not necessarily highly specific rapid test is used to select patients for Xpert may result in a more affordable diagnostic algorithm. To inform the selection and development of particular diagnostics as a triage test we explored combinations of sensitivity, specificity and cost at which a hypothetical triage test will improve affordability of the Xpert assay. In a decision analytical model parameterized for Uganda, India and South Africa, we compared a diagnostic algorithm in which a cohort of patients with presumptive TB received Xpert to a triage algorithm whereby only those with a positive triage test were tested by Xpert. A triage test with sensitivity equal to Xpert, 75% specificity, and costs of US$5 per patient tested reduced total diagnostic costs by 42% in the Uganda setting, and by 34% and 39% respectively in the India and South Africa settings. When exploring triage algorithms with lower sensitivity, the use of an example triage test with 95% sensitivity relative to Xpert, 75% specificity and test costs $5 resulted in similar cost reduction, and was cost-effective by the WHO willingness-to-pay threshold compared to Xpert for all in Uganda, but not in India and South Africa. The gain in affordability of the examined triage algorithms increased with decreasing prevalence of tuberculosis among the cohort. A triage test strategy could potentially improve the affordability of Xpert for TB diagnosis, particularly in low-income countries and with enhanced case-finding. Tests and markers with lower accuracy than desired of a diagnostic test may fall within the ranges of sensitivity, specificity and cost required for triage tests and be developed as such.
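The core affordability arithmetic of such a triage strategy can be sketched as follows; the cohort size, unit costs and prevalence are illustrative placeholders rather than the study's country-specific inputs.

def diagnostic_costs(n, prevalence, xpert_cost, triage_cost,
                     triage_sensitivity, triage_specificity):
    # Everyone-Xpert strategy versus triage-then-Xpert strategy for a cohort of
    # n people with presumptive TB.
    everyone_xpert = n * xpert_cost
    tb, no_tb = n * prevalence, n * (1 - prevalence)
    triage_positive = tb * triage_sensitivity + no_tb * (1 - triage_specificity)
    triage_then_xpert = n * triage_cost + triage_positive * xpert_cost
    return everyone_xpert, triage_then_xpert

full, triaged = diagnostic_costs(
    n=1000, prevalence=0.15, xpert_cost=15.0, triage_cost=5.0,
    triage_sensitivity=1.0, triage_specificity=0.75)
print(full, triaged, 1 - triaged / full)   # fractional saving from triaging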
NASA Astrophysics Data System (ADS)
Kong, Changduk; Lim, Semyeong
2011-12-01
Recently, health monitoring systems for the major gas path components of gas turbines have mostly used model-based methods such as Gas Path Analysis (GPA). This method finds quantitative changes in component performance characteristic parameters, such as isentropic efficiency and mass flow parameter, by comparing measured engine performance parameters (temperatures, pressures, rotational speeds, fuel consumption, etc.) with clean, fault-free engine performance parameters calculated by the base engine performance model. Currently, expert engine diagnostic systems using artificial intelligence methods such as Neural Networks (NNs), Fuzzy Logic and Genetic Algorithms (GAs) are being studied to improve on the model-based method. Among them, NNs are most often used in engine fault diagnostic systems because of their good learning performance, but they suffer from low accuracy and long training times when the learning database must be built from large amounts of learning data. In addition, they require a very complex structure to find single-type or multiple-type faults of gas path components effectively. This work inversely builds a base performance model of a turboprop engine intended for a high-altitude UAV using measured performance data, and proposes a fault diagnostic system using the base engine performance model together with artificial intelligence methods such as Fuzzy Logic and Neural Networks. The proposed diagnostic system first isolates the faulted components using Fuzzy Logic, then quantifies the faults of the identified components using an NN trained on a fault learning database obtained from the developed base performance model. In training the NN, the Feed Forward Back Propagation (FFBP) method is used. Finally, it is verified through several test examples that component faults implanted arbitrarily in the engine are well isolated and quantified by the proposed diagnostic system.
A novel algorithm for Bluetooth ECG.
Pandya, Utpal T; Desai, Uday B
2012-11-01
In wireless transmission of ECG, data latency becomes significant when battery power level and data transmission distance are not maintained. In applications such as home monitoring or personalized care, a novel filtering strategy is required to overcome the combined effect of these wireless transmission issues and other ECG measurement noise. Here, a novel algorithm, identified as the peak rejection adaptive sampling modified moving average (PRASMMA) algorithm for wireless ECG, is introduced. This algorithm first removes errors in the bit pattern of the received data, if they occurred during wireless transmission, and then removes baseline drift. Afterward, a modified moving average is applied everywhere except in the region of each QRS complex. The algorithm also sets its filtering parameters according to the sampling rate selected for signal acquisition. To demonstrate the work, a prototyped Bluetooth-based ECG module is used to capture ECG at different sampling rates and with the patient in different positions. This module transmits the ECG wirelessly to Bluetooth-enabled devices, where the PRASMMA algorithm is applied to the captured ECG. The performance of the PRASMMA algorithm is compared with moving average and S-Golay algorithms both visually and numerically. The results show that the PRASMMA algorithm can significantly improve ECG reconstruction by efficiently removing noise, and its use can be extended to any signal where peaks are important for diagnostic purposes.
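The central idea, smoothing everywhere except around QRS peaks, can be sketched as below; the simple peak detector, window widths and toy signal are assumptions, and the bit-error correction and baseline-drift stages of PRASMMA are omitted.

import numpy as np
from scipy.signal import find_peaks

def peak_sparing_moving_average(ecg, fs, window_ms=80, guard_ms=120):
    ecg = np.asarray(ecg, dtype=float)
    win = max(1, int(fs * window_ms / 1000))
    smoothed = np.convolve(ecg, np.ones(win) / win, mode="same")
    # Crude R-peak detection; PRASMMA uses its own peak rejection logic.
    peaks, _ = find_peaks(ecg, distance=int(0.3 * fs),
                          height=np.mean(ecg) + 2 * np.std(ecg))
    guard = int(fs * guard_ms / 1000)
    out = smoothed.copy()
    for p in peaks:                      # keep original samples around each QRS
        out[max(0, p - guard):p + guard] = ecg[max(0, p - guard):p + guard]
    return out

fs = 360
t = np.arange(0, 5, 1 / fs)
ecg = 0.1 * np.sin(2 * np.pi * 1.0 * t) + (np.mod(t, 1.0) < 0.02) * 1.0  # toy ECG
print(peak_sparing_moving_average(ecg, fs).shape)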
Digital Pathology: Data-Intensive Frontier in Medical Imaging
Cooper, Lee A. D.; Carter, Alexis B.; Farris, Alton B.; Wang, Fusheng; Kong, Jun; Gutman, David A.; Widener, Patrick; Pan, Tony C.; Cholleti, Sharath R.; Sharma, Ashish; Kurc, Tahsin M.; Brat, Daniel J.; Saltz, Joel H.
2013-01-01
Pathology is a medical subspecialty that practices the diagnosis of disease. Microscopic examination of tissue reveals information enabling the pathologist to render accurate diagnoses and to guide therapy. The basic process by which anatomic pathologists render diagnoses has remained relatively unchanged over the last century, yet advances in information technology now offer significant opportunities in image-based diagnostic and research applications. Pathology has lagged behind other healthcare practices such as radiology where digital adoption is widespread. As devices that generate whole slide images become more practical and affordable, practices will increasingly adopt this technology and eventually produce an explosion of data that will quickly eclipse the already vast quantities of radiology imaging data. These advances are accompanied by significant challenges for data management and storage, but they also introduce new opportunities to improve patient care by streamlining and standardizing diagnostic approaches and uncovering disease mechanisms. Computer-based image analysis is already available in commercial diagnostic systems, but further advances in image analysis algorithms are warranted in order to fully realize the benefits of digital pathology in medical discovery and patient care. In coming decades, pathology image analysis will extend beyond the streamlining of diagnostic workflows and minimizing interobserver variability and will begin to provide diagnostic assistance, identify therapeutic targets, and predict patient outcomes and therapeutic responses. PMID:25328166
An Artificial Intelligence Approach for Gears Diagnostics in AUVs
Marichal, Graciliano Nicolás; Del Castillo, María Lourdes; López, Jesús; Padrón, Isidro; Artés, Mariano
2016-01-01
In this paper, an intelligent scheme for detecting incipient defects in spur gears is presented. In fact, the study has been undertaken to determine these defects in a single propeller system of a small-sized unmanned helicopter. It is important to remark that although the study focused on this particular system, the obtained results could be extended to other systems known as AUVs (Autonomous Unmanned Vehicles), where the usage of polymer gears in the vehicle transmission is frequent. Few studies have been carried out on these kinds of gears. In this paper, an experimental platform has been adapted for the study and several samples have been prepared. Moreover, several vibration signals have been measured and their time-frequency characteristics have been taken as inputs to the diagnostic system. In fact, a diagnostic system based on an artificial intelligence strategy has been devised. Furthermore, techniques based on several paradigms of the Artificial Intelligence (Neural Networks, Fuzzy systems and Genetic Algorithms) have been applied altogether in order to design an efficient fault diagnostic system. A hybrid Genetic Neuro-Fuzzy system has been developed, where it is possible, at the final stage of the learning process, to express the fault diagnostic system as a set of fuzzy rules. Several trials have been carried out and satisfactory results have been achieved. PMID:27077868
An Artificial Intelligence Approach for Gears Diagnostics in AUVs.
Marichal, Graciliano Nicolás; Del Castillo, María Lourdes; López, Jesús; Padrón, Isidro; Artés, Mariano
2016-04-12
In this paper, an intelligent scheme for detecting incipient defects in spur gears is presented. In fact, the study has been undertaken to determine these defects in a single propeller system of a small-sized unmanned helicopter. It is important to remark that although the study focused on this particular system, the obtained results could be extended to other systems known as AUVs (Autonomous Unmanned Vehicles), where the usage of polymer gears in the vehicle transmission is frequent. Few studies have been carried out on these kinds of gears. In this paper, an experimental platform has been adapted for the study and several samples have been prepared. Moreover, several vibration signals have been measured and their time-frequency characteristics have been taken as inputs to the diagnostic system. In fact, a diagnostic system based on an artificial intelligence strategy has been devised. Furthermore, techniques based on several paradigms of the Artificial Intelligence (Neural Networks, Fuzzy systems and Genetic Algorithms) have been applied altogether in order to design an efficient fault diagnostic system. A hybrid Genetic Neuro-Fuzzy system has been developed, where it is possible, at the final stage of the learning process, to express the fault diagnostic system as a set of fuzzy rules. Several trials have been carried out and satisfactory results have been achieved.
Bio-Optics Based Sensation Imaging for Breast Tumor Detection Using Tissue Characterization
Lee, Jong-Ha; Kim, Yoon Nyun; Park, Hee-Jun
2015-01-01
The tissue inclusion parameter estimation method is proposed to measure the stiffness as well as geometric parameters. The estimation is performed based on the tactile data obtained at the surface of the tissue using an optical tactile sensation imaging system (TSIS). A forward algorithm is designed to comprehensively predict the tactile data based on the mechanical properties of tissue inclusion using finite element modeling (FEM). This forward information is used to develop an inversion algorithm that will be used to extract the size, depth, and Young's modulus of a tissue inclusion from the tactile data. We utilize the artificial neural network (ANN) for the inversion algorithm. The proposed estimation method was validated by a realistic tissue phantom with stiff inclusions. The experimental results showed that the proposed estimation method can measure the size, depth, and Young's modulus of a tissue inclusion with 0.58%, 3.82%, and 2.51% relative errors, respectively. The obtained results prove that the proposed method has potential to become a useful screening and diagnostic method for breast cancer. PMID:25785306
Defining the needs for next generation assays for tuberculosis.
Denkinger, Claudia M; Kik, Sandra V; Cirillo, Daniela Maria; Casenghi, Martina; Shinnick, Thomas; Weyer, Karin; Gilpin, Chris; Boehme, Catharina C; Schito, Marco; Kimerling, Michael; Pai, Madhukar
2015-04-01
To accelerate the fight against tuberculosis, major diagnostic challenges need to be addressed urgently. Post-2015 targets are unlikely to be met without the use of novel diagnostics that are more accurate and can be used closer to where patients first seek care in affordable diagnostic algorithms. This article describes the efforts by the stakeholder community that led to the identification of the high-priority diagnostic needs in tuberculosis. Subsequently target product profiles for the high-priority diagnostic needs were developed and reviewed in a World Health Organization (WHO)-led consensus meeting. The high-priority diagnostic needs included (1) a sputum-based replacement test for smear-microscopy; (2) a non-sputum-based biomarker test for all forms of tuberculosis, ideally suitable for use at levels below microscopy centers; (3) a simple, low cost triage test for use by first-contact care providers as a rule-out test, ideally suitable for use by community health workers; and (4) a rapid drug susceptibility test for use at the microscopy center level. The developed target product profiles, along with complimentary work presented in this supplement, will help to facilitate the interaction between the tuberculosis community and the diagnostics industry with the goal to lead the way toward the post-2015 global tuberculosis targets. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Lange, M; Siemen, H; Blome, S; Thulke, H-H
2014-11-15
African swine fever (ASF) is a highly lethal viral disease of domestic pigs and wild boar. ASF was introduced into the southern Russian Federation in 2007 and is now reported to be spreading in populations of wild and domestic suids. An endemic situation in the local wild boar population would significantly complicate management of the disease in the livestock population. To date no sound method exists for identifying the characteristic pattern of an endemic situation, which describes infection persisting from generation to generation in the same population. To support urgent management decisions at the wildlife-livestock interface, a new algorithm was constructed to test the hypothesis of an endemic disease situation in wildlife on the basis of case reports. The approach described here uses spatial and temporal associations between observed diagnostic data to discriminate between endemic and non-endemic patterns of case occurrence. The algorithm was validated with data from an epidemiological simulation model and applied to ASF case data from southern Russia. Based on the algorithm and the diagnostic data available, the null hypothesis of an endemic situation of ASF in wild boar of the region was rejected. Copyright © 2014 Elsevier B.V. All rights reserved.
Development of an imaging method for quantifying a large digital PCR droplet
NASA Astrophysics Data System (ADS)
Huang, Jen-Yu; Lee, Shu-Sheng; Hsu, Yu-Hsiang
2017-02-01
Portable devices have been recognized as the future linkage between end-users and lab-on-a-chip devices. They offer a user-friendly interface and provide apps to interface with headphones, cameras, communication channels, etc. In particular, the cameras installed in smartphones or tablets already offer high imaging resolution with a large number of pixels. This unique feature has triggered research into integrating optical fixtures with smartphones to provide microscopic imaging capabilities. In this paper, we report our study on developing a portable diagnostic tool based on the imaging system of a smartphone and a digital PCR biochip. A computational algorithm is developed to process optical images of a digital PCR biochip taken with a smartphone in a black box. Each reaction droplet is recorded in pixels and analyzed in the sRGB (red, green, and blue) color space. A multistep filtering algorithm and an auto-threshold algorithm are adopted to minimize background noise contributed by the CCD camera and to rule out false-positive droplets, respectively. Finally, a size-filtering method is applied to identify the number of positive droplets and quantify the target's concentration. Statistical analysis is then performed for diagnostic purposes. This process can be integrated into an app and can provide a user-friendly interface that requires no professional training.
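A hedged sketch of the droplet-quantification steps described above: smooth the image, threshold it automatically, label connected components and keep only blobs within a plausible droplet-size range. Otsu thresholding, the Gaussian pre-filter and the size bounds are assumptions standing in for the paper's multistep filtering, auto-threshold and size-filtering stages.

import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu, gaussian

def count_positive_droplets(channel, min_area=20, max_area=500):
    smoothed = gaussian(channel.astype(float), sigma=1.0)
    binary = smoothed > threshold_otsu(smoothed)      # auto-threshold stand-in
    labels, n = ndimage.label(binary)
    areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    keep = (areas >= min_area) & (areas <= max_area)  # size filtering
    return int(keep.sum())

img = np.zeros((200, 200))
img[50:60, 50:60] = 1.0        # one synthetic fluorescent droplet
img[120:128, 90:98] = 0.9      # another
print(count_positive_droplets(img))   # -> 2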
Multari, Rosalie A.; Cremers, David A.; Bostian, Melissa L.; Dupre, Joanne M.
2013-01-01
Laser-Induced Breakdown Spectroscopy (LIBS) is a rapid, in situ, diagnostic technique in which light emissions from a laser plasma formed on the sample are used for analysis allowing automated analysis results to be available in seconds to minutes. This speed of analysis coupled with little or no sample preparation makes LIBS an attractive detection tool. In this study, it is demonstrated that LIBS can be utilized to discriminate both the bacterial species and strains of bacterial colonies grown on blood agar. A discrimination algorithm was created based on multivariate regression analysis of spectral data. The algorithm was deployed on a simulated LIBS instrument system to demonstrate discrimination capability using 6 species. Genetically altered Staphylococcus aureus strains grown on BA, including isogenic sets that differed only by the acquisition of mutations that increase fusidic acid or vancomycin resistance, were also discriminated. The algorithm successfully identified all thirteen cultures used in this study in a time period of 2 minutes. This work provides proof of principle for a LIBS instrumentation system that could be developed for the rapid discrimination of bacterial species and strains demonstrating relatively minor genomic alterations using data collected directly from pathogen isolation media. PMID:24109513
ERIC Educational Resources Information Center
Kamp-Becker, Inge; Ghahreman, Mardjan; Heinzel-Gutenbrunner, Monika; Peters, Mira; Remschmidt, Helmut; Becker, Katja
2013-01-01
The Autism Diagnostic Observation Schedule (ADOS) is a semi-structured, standardized assessment designed for use in diagnostic evaluation of individuals with suspected autism spectrum disorder (ASD). The ADOS has been effective in categorizing children who definitely have autism or not, but has lower specificity and sometimes sensitivity for…
Alici, Ibrahim Onur; Yılmaz Demirci, Nilgün; Yılmaz, Aydın; Karakaya, Jale; Özaydın, Esra
2016-09-01
Several papers describe the sonographic features of mediastinal lymph nodes affected by various diseases, but none establishes the importance and clinical utility of those features. To determine which lymph node should be sampled in a particular nodal station during endobronchial ultrasound, we investigated the diagnostic performance of specific sonographic features and proposed an algorithmic approach. We retrospectively analyzed 1051 lymph nodes and randomly assigned them to a preliminary experimental group and a secondary study group. The diagnostic performance of the sonographic features (gray scale, echogenicity, shape, size, margin, presence of necrosis, presence of calcification and absence of a central hilar structure) was calculated, and an algorithm for lymph node sampling was obtained with decision tree analysis in the experimental group. A modified algorithm was then applied to the patients in the study group to determine its accuracy. Demographic characteristics did not differ significantly between the primary and secondary groups. All of the features discriminated between malignant and benign disease. The modified algorithm's sensitivity, specificity, positive and negative predictive values, and diagnostic accuracy for detecting metastatic lymph nodes were 100%, 51.2%, 50.6%, 100% and 67.5%, respectively. In this retrospective analysis, the standardized sonographic classification system and the proposed algorithm performed well in choosing the node that should be sampled in a particular station during endobronchial ultrasound. © 2015 John Wiley & Sons Ltd.
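A hedged sketch of deriving a sampling rule from sonographic features with a decision tree, in the spirit of the analysis above; the feature encoding, synthetic data and scikit-learn estimator are illustrative assumptions, not the authors' statistical workflow.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
# columns: short-axis size (mm), round shape, distinct margin, absent hilum, necrosis
X = np.column_stack([
    rng.normal(9, 4, 800),
    rng.integers(0, 2, 800),
    rng.integers(0, 2, 800),
    rng.integers(0, 2, 800),
    rng.integers(0, 2, 800),
])
y = ((X[:, 0] > 10) & (X[:, 3] == 1)).astype(int)   # toy "malignant" rule

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=[
    "size_mm", "round", "distinct_margin", "absent_hilum", "necrosis"]))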
Abraham, N S; Cohen, D C; Rivers, B; Richardson, P
2006-07-15
To validate Veterans Affairs (VA) administrative data for the diagnosis of nonsteroidal anti-inflammatory drug (NSAID)-related upper gastrointestinal events (UGIE) and to develop a diagnostic algorithm. A retrospective study of veterans prescribed an NSAID as identified from the national pharmacy database merged with in-patient and out-patient data, followed by primary chart abstraction. Contingency tables were constructed to allow comparison with a random sample of patients prescribed an NSAID, but without UGIE. Multivariable logistic regression analysis was used to derive a predictive algorithm. Once derived, the algorithm was validated in a separate cohort of veterans. Of 906 patients, 606 had a diagnostic code for UGIE; 300 were a random subsample of 11 744 patients (control). Only 161 had a confirmed UGIE. The positive predictive value (PPV) of diagnostic codes was poor, but improved from 27% to 51% with the addition of endoscopic procedural codes. The strongest predictors of UGIE were an in-patient ICD-9 code for gastric ulcer, duodenal ulcer and haemorrhage combined with upper endoscopy. This algorithm had a PPV of 73% when limited to patients ≥65 years (c-statistic 0.79). Validation of the algorithm revealed a PPV of 80% among patients with an overlapping NSAID prescription. NSAID-related UGIE can be assessed using VA administrative data. The optimal algorithm includes an in-patient ICD-9 code for gastric or duodenal ulcer and gastrointestinal bleeding combined with a procedural code for upper endoscopy.
Feigl, Guenther C; Hiergeist, Wolfgang; Fellner, Claudia; Schebesch, Karl-Michael M; Doenitz, Christian; Finkenzeller, Thomas; Brawanski, Alexander; Schlaier, Juergen
2014-01-01
Diffusion tensor imaging (DTI)-based tractography has become an integral part of preoperative diagnostic imaging in many neurosurgical centers, and other nonsurgical specialties depend increasingly on DTI tractography as a diagnostic tool. The aim of this study was to analyze the anatomic accuracy of visualized white matter fiber pathways using different, readily available DTI tractography software programs. Magnetic resonance imaging scans of the head of 20 healthy volunteers were acquired using a Siemens Symphony TIM 1.5T scanner and a 12-channel head array coil. The standard settings of the scans in this study were 12 diffusion directions and 5-mm slices. The fornices were chosen as an anatomic structure for the comparative fiber tracking. Identical data sets were loaded into nine different fiber tracking packages that used different algorithms. The nine software packages and algorithms used were NeuroQLab (modified tensor deflection [TEND] algorithm), Sörensen DTI task card (modified streamline tracking technique algorithm), Siemens DTI module (modified fourth-order Runge-Kutta algorithm), six different software packages from Trackvis (interpolated streamline algorithm, modified FACT algorithm, second-order Runge-Kutta algorithm, Q-ball [FACT algorithm], tensorline algorithm, Q-ball [second-order Runge-Kutta algorithm]), DTI Query (modified streamline tracking technique algorithm), Medinria (modified TEND algorithm), Brainvoyager (modified TEND algorithm), DTI Studio modified FACT algorithm, and the BrainLab DTI module based on the modified Runge-Kutta algorithm. Three examiners (a neuroradiologist, a magnetic resonance imaging physicist, and a neurosurgeon) served as examiners. They were double-blinded with respect to the test subject and the fiber tracking software used in the presented images. Each examiner evaluated 301 images. The examiners were instructed to evaluate screenshots from the different programs based on two main criteria: (i) anatomic accuracy of the course of the displayed fibers and (ii) number of fibers displayed outside the anatomic boundaries. The mean overall grade for anatomic accuracy was 2.2 (range, 1.1-3.6) with a standard deviation (SD) of 0.9. The mean overall grade for incorrectly displayed fibers was 2.5 (range, 1.6-3.5) with a SD of 0.6. The mean grade of the overall program ranking was 2.3 with a SD of 0.6. The overall mean grade of the program ranked number one (NeuroQLab) was 1.7 (range, 1.5-2.8). The mean overall grade of the program ranked last (BrainLab iPlan Cranial 2.6 DTI Module) was 3.3 (range, 1.7-4). The difference between the mean grades of these two programs was statistically highly significant (P < 0.0001). There was no statistically significant difference between the programs ranked 1-3: NeuroQLab, Sörensen DTI Task Card, and Siemens DTI module. The results of this study show that there is a statistically significant difference in the anatomic accuracy of the tested DTI fiber tracking programs. Although incorrectly displayed fibers could lead to wrong conclusions in the neurosciences field, which relies heavily on this noninvasive imaging technique, incorrectly displayed fibers in neurosurgery could lead to surgical decisions potentially harmful for the patient if used without intraoperative cortical stimulation. DTI fiber tracking presents a valuable noninvasive preoperative imaging tool, which requires further validation after important standardization of the acquisition and processing techniques currently available. 
Copyright © 2014 Elsevier Inc. All rights reserved.
Chen, Hongda; Zucknick, Manuela; Werner, Simone; Knebel, Phillip; Brenner, Hermann
2015-07-15
Novel noninvasive blood-based screening tests are strongly desirable for early detection of colorectal cancer. We aimed to conduct a head-to-head comparison of the diagnostic performance of 92 plasma-based tumor-associated protein biomarkers for early detection of colorectal cancer in a true screening setting. Among all available 35 carriers of colorectal cancer and a representative sample of 54 men and women free of colorectal neoplasms recruited in a cohort of screening colonoscopy participants in 2005-2012 (N = 5,516), the plasma levels of 92 protein biomarkers were measured. ROC analyses were conducted to evaluate the diagnostic performance. A multimarker algorithm was developed through the Lasso logistic regression model and validated in an independent validation set. The .632+ bootstrap method was used to adjust for the potential overestimation of diagnostic performance. Seventeen protein markers were identified to show statistically significant differences in plasma levels between colorectal cancer cases and controls. The adjusted area under the ROC curves (AUC) of these 17 individual markers ranged from 0.55 to 0.70. An eight-marker classifier was constructed that increased the adjusted AUC to 0.77 [95% confidence interval (CI), 0.59-0.91]. When validating this algorithm in an independent validation set, the AUC was 0.76 (95% CI, 0.65-0.85), and sensitivities at cutoff levels yielding 80% and 90% specificities were 65% (95% CI, 41-80%) and 44% (95% CI, 24-72%), respectively. The identified profile of protein biomarkers could contribute to the development of a powerful multimarker blood-based test for early detection of colorectal cancer. ©2015 American Association for Cancer Research.
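A hedged sketch of the multimarker modelling step: L1-penalised (Lasso) logistic regression over many protein markers, selecting a sparse panel and reporting ROC AUC on held-out data. Synthetic data stand in for the 92 plasma markers, and a simple train/test split stands in for the study's .632+ bootstrap procedure.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 92))                     # 92 marker levels per subject
y = (X[:, :8].sum(axis=1) + rng.normal(size=300) > 0).astype(int)  # 8 informative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_tr, y_tr)
selected = np.flatnonzero(lasso.coef_[0])
print("markers retained:", selected.size,
      "validation AUC:", roc_auc_score(y_te, lasso.decision_function(X_te)))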
Bollepalli, S Chandra; Challa, S Sastry; Anumandla, Laxminarayana; Jana, Soumya
2018-04-25
While cardiovascular diseases (CVDs) are prevalent across economic strata, the economically disadvantaged population is disproportionately affected due to the high cost of traditional CVD management, involving consultations, testing and monitoring at medical facilities. Accordingly, developing an ultra-low-cost alternative, affordable even to groups at the bottom of the economic pyramid, has emerged as a societal imperative. Against this backdrop, we propose an inexpensive yet accurate home-based electrocardiogram (ECG) monitoring service. Specifically, we seek to provide point-of-care monitoring of premature ventricular contractions (PVCs), high frequency of which could indicate the onset of potentially fatal arrhythmia. Note that the first-generation telecardiology system acquires the ECG, transmits it to a professional diagnostic center without processing, and nearly achieves the diagnostic accuracy of a bedside setup. In the process, such a system incurs high bandwidth cost and requires the physicians to process the entire record for diagnosis. To reduce cost, current telecardiology systems compress data before transmitting. However, the burden on physicians remains undiminished. In this context, we develop a dictionary-based algorithm that reduces not only the overall bandwidth requirement, but also the physicians workload by localizing anomalous beats. Specifically, we detect anomalous beats with high sensitivity and only those beats are then transmitted. In fact, we further compress those beats using class-specific dictionaries subject to suitable reconstruction/diagnostic fidelity. Finally, using Monte Carlo cross validation on MIT/BIH arrhythmia database, we evaluate the performance of the proposed system. In particular, with a sensitivity target of at most one undetected PVC in one hundred beats, and a percentage root mean squared difference less than 9% (a clinically acceptable level of fidelity), we achieved about 99.15% reduction in bandwidth cost, equivalent to 118-fold savings over first-generation telecardiology. In the process, the professional workload is reduced by at least 85.9% for noncritical cases. Our algorithm also outperforms known algorithms under certain measures in the telecardiological context. Copyright © 2018 Elsevier B.V. All rights reserved.
Application of Dynamic Logic Algorithm to Inverse Scattering Problems Related to Plasma Diagnostics
NASA Astrophysics Data System (ADS)
Perlovsky, L.; Deming, R. W.; Sotnikov, V.
2010-11-01
In plasma diagnostics scattering of electromagnetic waves is widely used for identification of density and wave field perturbations. In the present work we use a powerful mathematical approach, dynamic logic (DL), to identify the spectra of scattered electromagnetic (EM) waves produced by the interaction of the incident EM wave with a Langmuir soliton in the presence of noise. The problem is especially difficult since the spectral amplitudes of the noise pattern are comparable with the amplitudes of the scattered waves. In the past DL has been applied to a number of complex problems in artificial intelligence, pattern recognition, and signal processing, resulting in revolutionary improvements. Here we demonstrate its application to plasma diagnostic problems. Perlovsky, L.I., 2001. Neural Networks and Intellect: using model-based concepts. Oxford University Press, New York, NY.
Optical diagnosis of cervical cancer by higher order spectra and boosting
NASA Astrophysics Data System (ADS)
Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.
2017-03-01
In this contribution, we report the application of higher order statistical moments, combined with decision tree and ensemble-based learning, for the development of diagnostic algorithms for the optical diagnosis of cancer. The classification results were compared to those obtained with independent feature extractors such as linear discriminant analysis (LDA). This methodology, which uses higher order statistics with a boosted classifier, achieves higher specificity and sensitivity while being much faster than other time-frequency domain based methods.
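A hedged sketch of the feature/classifier pairing described above: higher order statistical moments (mean, variance, skewness, kurtosis) computed per spectrum and fed to a boosted ensemble. The synthetic "spectra" and the gradient-boosting classifier are stand-ins for the authors' data and booster.

import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

def moment_features(spectra):
    # One feature row per spectrum: first four statistical moments.
    return np.column_stack([
        spectra.mean(axis=1), spectra.var(axis=1),
        skew(spectra, axis=1), kurtosis(spectra, axis=1),
    ])

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(100, 256))
cancer = rng.gamma(2.0, 1.0, size=(100, 256))      # more skewed synthetic class
X = moment_features(np.vstack([normal, cancer]))
y = np.r_[np.zeros(100), np.ones(100)]
print(cross_val_score(GradientBoostingClassifier(random_state=0), X, y, cv=5).mean())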
Graph-based real-time fault diagnostics
NASA Technical Reports Server (NTRS)
Padalkar, S.; Karsai, G.; Sztipanovits, J.
1988-01-01
A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques like expert systems and qualitative modelling are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis, structured knowledge representation and acquisition, and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of the hierarchy, there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for on-line speedy identification of failure source components.
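To make the graph idea concrete, the following is a toy sketch of a fault propagation digraph whose edges carry propagation-time intervals, with a simple backward search for failure sources consistent with the observed alarms. The graph structure, component names and search rule are invented for illustration and are not the original Hierarchical Fault Model implementation.

```python
# Hedged sketch of a fault propagation digraph and a source-identification query.
FAULT_GRAPH = {                      # edge: cause -> (observable effect, (t_min, t_max) seconds)
    "pump_cavitation": [("low_flow", (1, 5))],
    "valve_stuck":     [("low_flow", (0, 2)), ("high_pressure", (2, 8))],
    "sensor_bias":     [("high_pressure", (0, 1))],
}

def candidate_sources(observed_alarms):
    """Return failure modes whose downstream effects cover every observed alarm."""
    candidates = []
    for source, edges in FAULT_GRAPH.items():
        effects = {effect for effect, _ in edges}
        if set(observed_alarms) <= effects:
            candidates.append(source)
    return candidates

print(candidate_sources(["low_flow", "high_pressure"]))   # -> ['valve_stuck']
```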
Guerrero-Ramos, Alvaro; Patel, Mauli; Kadakia, Kinjal; Haque, Tanzina
2014-06-01
The Architect EBV antibody panel is a new chemiluminescence immunoassay system used to determine the stage of Epstein-Barr virus (EBV) infection based on the detection of IgM and IgG antibodies to viral capsid antigen (VCA) and IgG antibodies against Epstein-Barr nuclear antigen 1 (EBNA-1). We evaluated its diagnostic accuracy in immunocompetent adolescents and young adults with clinical suspicion of infectious mononucleosis (IM) using the RecomLine EBV IgM and IgG immunoblots as the reference standard. In addition, the use of the antibody panel in a sequential testing algorithm based on initial EBNA-1 IgG analysis was assessed for cost-effectiveness. Finally, we investigated the degree of cross-reactivity of the VCA IgM marker during other primary viral infections that may present with an EBV IM-like picture. High sensitivity (98.3% [95% confidence interval {CI}, 90.7 to 99.7%]) and specificity (94.2% [95% CI, 87.9 to 97.8%]) were found after testing 162 precharacterized archived serum samples. There was perfect agreement between the use of the antibody panel in sequential and parallel testing algorithms, but substantial cost savings (23%) were obtained with the sequential strategy. A high rate of reactive VCA IgM results was found in primary cytomegalovirus (CMV) infections (60.7%). In summary, the Architect EBV antibody panel performs satisfactorily in the investigation of EBV IM in immunocompetent adolescents and young adults, and the application of an EBNA-1 IgG-based sequential testing algorithm is cost-effective in this diagnostic setting. Concomitant testing for CMV is strongly recommended to aid in the interpretation of EBV serological patterns. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Cho, M; Lee, D-H; Doh, E J; Kim, Y; Chung, J H; Kim, H C; Kim, S
2017-08-01
Erythema is the most common presenting sign in patients with skin diseases, and various methods to treat erythema symptoms have become common. To evaluate changes in erythema, a reliable device that can support objective diagnosis is required. We developed a novel photography-based system for erythema diagnosis that provides a high-resolution three-view photograph taken in a consistent photography environment with a curved surface light source and can be integrated with optimized image processing algorithms. A new diagnostic algorithm was applied to photographs from 32 patients to determine areas of erythema automatically. To assess the performance in comparison to dermatologists' evaluations, five dermatologists independently evaluated the areas of erythema, and we defined an area called the clinical consensus area of erythema (CCAE), based on the majority opinion of the dermatologists during evaluation. The CCAE values obtained were compared with the erythema areas determined by the system's diagnostic algorithm. Forty-one photographs with areas of erythema were evaluated by the proposed system and by the dermatologists. The results obtained with the proposed system had a mean accuracy of 93.18% with a standard deviation of 3.52% when compared with the CCAE results. The results also showed that the proposed system could detect erythema areas without any pigmentation. In contrast to assessments by individual dermatologists, use of the CCAE reduced the amount of error that occurred owing to bias or subjectivity. A new erythema evaluation system was developed and validated through CCAE, suggesting that the system can support dermatologists' objective diagnoses of erythema. © 2017 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
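The paper's optimized algorithm is not reproduced here, but a common baseline for automatic erythema segmentation is thresholding the a* (green-red) channel of the CIELAB representation. The threshold value and library choice below are assumptions made for illustration.

```python
# Illustrative baseline only: flag reddish pixels via the CIELAB a* channel.
from skimage import io, color
from skimage.util import img_as_float

def erythema_mask(image_path, a_threshold=18.0):
    rgb = img_as_float(io.imread(image_path)[..., :3])   # drop alpha channel if present
    lab = color.rgb2lab(rgb)
    return lab[..., 1] > a_threshold                      # a* above threshold -> reddish pixel

def erythema_area_fraction(image_path):
    mask = erythema_mask(image_path)
    return mask.mean()                                    # fraction of the image flagged as erythema
```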
Evaluation of Gear Condition Indicator Performance on Rotorcraft Fleet
NASA Technical Reports Server (NTRS)
Antolick, Lance J.; Branning, Jeremy S.; Wade, Daniel R.; Dempsey, Paula J.
2010-01-01
The U.S. Army is currently expanding its fleet of Health Usage Monitoring Systems (HUMS) equipped aircraft at significant rates, to now include over 1,000 rotorcraft. Two different on-board HUMS, the Honeywell Modern Signal Processing Unit (MSPU) and the Goodrich Integrated Vehicle Health Management System (IVHMS), are collecting vibration health data on aircraft that include the Apache, Blackhawk, Chinook, and Kiowa Warrior. The objective of this paper is to recommend the most effective gear condition indicators for fleet use based on both a theoretical foundation and field data from in-fleet use. In order to evaluate gear condition indicator performance on rotorcraft fleets, results of more than five years of health monitoring for gear faults in the entire HUMS-equipped Army helicopter fleet will be presented. More than ten examples of gear faults indicated by the gear CI have been compiled and each reviewed for accuracy. False alarm indications will also be discussed. Performance data from test rigs and seeded fault tests will also be presented. The results of the fleet analysis will be discussed, and a performance metric assigned to each of the competing algorithms. Gear fault diagnostic algorithms that are compliant with ADS-79A will be recommended for future use and development. The performance of gear algorithms used in the commercial units and the effectiveness of the gear CI as a fault identifier will be assessed using the criteria outlined in ADS-79A-HDBK, an Army handbook that outlines the conversion from Reliability Centered Maintenance to the On-Condition status of Condition Based Maintenance.
Development of PET projection data correction algorithm
NASA Astrophysics Data System (ADS)
Bazhanov, P. V.; Kotina, E. D.
2017-12-01
Positron emission tomography (PET) is a modern nuclear medicine method used to examine metabolism and organ function. It allows diseases to be diagnosed at an early stage. Mathematical algorithms are widely used not only for image reconstruction but also for PET data correction. In this paper, the implementation of random-coincidence and scatter correction algorithms is considered, as well as an algorithm for modeling PET projection data acquisition to verify the corrections.
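As a rough illustration of one of the corrections mentioned, randoms are commonly estimated from a delayed coincidence window and subtracted from the prompt sinogram. The smoothing step and function names below are assumptions; this is a generic sketch, not the paper's implementation.

```python
# Minimal sketch (assumed approach): randoms correction by delayed-window subtraction.
import numpy as np
from scipy.ndimage import uniform_filter

def correct_randoms(prompt_sinogram, delayed_sinogram, smooth=True):
    """Estimate randoms from the delayed coincidence window and subtract them."""
    randoms = delayed_sinogram.astype(float)
    if smooth:                                   # the delayed window is noisy; smooth it
        randoms = uniform_filter(randoms, size=3)
    corrected = prompt_sinogram - randoms
    return np.clip(corrected, 0, None)           # avoid negative counts after subtraction
```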
Warth, A
2015-11-01
Tumor diagnostics are based on histomorphology, immunohistochemistry and molecular pathological analysis of mutations, translocations and amplifications which are of diagnostic, prognostic and/or predictive value. In recent decades only histomorphology was used to classify lung cancer as either small cell (SCLC) or non-small cell lung cancer (NSCLC), although NSCLC was further subdivided into different entities; however, as no specific therapy options were available, classification of specific subtypes was not clinically meaningful. This fundamentally changed with the discovery of specific molecular alterations in adenocarcinoma (ADC), e.g. mutations in KRAS, EGFR and BRAF or translocations of the ALK and ROS1 gene loci, which now form the basis of targeted therapies and have led to a significantly improved patient outcome. The diagnostic, prognostic and predictive value of imaging, morphological, immunohistochemical and molecular characteristics as well as their interaction were systematically assessed in a large cohort with available clinical data including patient survival. Specific and sensitive diagnostic markers and marker panels were defined and diagnostic test algorithms for predictive biomarker assessment were optimized. It was demonstrated that the semi-quantitative assessment of ADC growth patterns is a stage-independent predictor of survival and is reproducibly applicable in the routine setting. Specific histomorphological characteristics correlated with computed tomography (CT) imaging features and thus allowed an improved interdisciplinary classification, especially in the preoperative or palliative setting. Moreover, specific molecular characteristics, for example BRAF mutations and the proliferation index (Ki-67), were identified as clinically relevant prognosticators. Comprehensive clinical, morphological, immunohistochemical and molecular assessment of NSCLCs allows an optimized patient stratification. Respective algorithms now form the backbone of the 2015 lung cancer World Health Organization (WHO) classification.
van der Zwaan, G Lennart; van Dijk, Susan E M; Adriaanse, Marcel C; van Marwijk, Harm W J; van Tulder, Maurits W; Pols, Alide D; Bosmans, Judith E
2016-01-15
Depression is common among type 2 diabetes mellitus (DM2)/coronary heart disease (CHD) patients and is associated with adverse health effects. A promising strategy to reduce the burden of disease is to identify patients at risk for depression in order to offer indicated prevention. This study aims to assess the diagnostic accuracy of the Patient Health Questionnaire-9 (PHQ-9) to be used as a tool to identify high-risk patients. In this cross-sectional study, 586 consecutive DM2/CHD patients aged >18 were recruited through 23 general practices. PHQ-9 outcomes were compared to the Mini International Neuropsychiatric Interview (MINI), which was considered the reference standard. Diagnostic accuracy was evaluated for minor and major depression, comparing both sum-score- and algorithm-based PHQ-9 scores. For minor depression, the optimal cut-off score was 8 (sensitivity 71%, specificity 71% and an AUC of 0.74). For major depression, the optimal cut-off score was 10, resulting in a sensitivity of 84%, a specificity of 82%, and an AUC of 0.88. The positive predictive value of the PHQ-9 algorithm for diagnosing minor and major depression was 25% and 33%, respectively. Two main limitations apply. MINI interviewers were not blinded to PHQ-9 scores, and less than 10% of all invited patients could be included in the analyses. This could have resulted in biased outcomes. The PHQ-9 sum score performs well in identifying patients at high risk of minor and major depression. However, the PHQ-9 showed suboptimal results for diagnostic purposes. Therefore, it is recommended to combine the use of the PHQ-9 with further diagnostics to identify depression. Copyright © 2015 Elsevier B.V. All rights reserved.
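The cut-off analysis can be illustrated with a few lines of code: classify by a PHQ-9 sum-score threshold and compute sensitivity and specificity against the reference diagnosis. The scores and labels below are placeholders, not study data.

```python
# Hedged illustration of cut-off evaluation for a screening score.
import numpy as np

def sens_spec(phq9_sum, reference_dx, cutoff):
    pred = phq9_sum >= cutoff
    tp = np.sum(pred & reference_dx)
    tn = np.sum(~pred & ~reference_dx)
    return tp / reference_dx.sum(), tn / (~reference_dx).sum()

scores = np.array([3, 8, 12, 5, 15, 9, 2, 11])          # example PHQ-9 sum scores
mini_dx = np.array([0, 0, 1, 0, 1, 1, 0, 1], bool)      # example MINI major depression labels
print(sens_spec(scores, mini_dx, cutoff=10))            # cut-off of 10 as in the study
```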
Raschke, R A; Gallo, T; Curry, S C; Whiting, T; Padilla-Jones, A; Warkentin, T E; Puri, A
2017-08-01
Essentials: We previously published a diagnostic algorithm for heparin-induced thrombocytopenia (HIT). In this study, we validated the algorithm in an independent large healthcare system. The accuracy was 98%, sensitivity 82% and specificity 99%. The algorithm has potential to improve accuracy and efficiency in the diagnosis of HIT. Background: Heparin-induced thrombocytopenia (HIT) is a life-threatening drug reaction caused by antiplatelet factor 4/heparin (anti-PF4/H) antibodies. Commercial tests to detect these antibodies have suboptimal operating characteristics. We previously developed a diagnostic algorithm for HIT that incorporated 'four Ts' (4Ts) scoring and a stratified interpretation of an anti-PF4/H enzyme-linked immunosorbent assay (ELISA) and yielded a discriminant accuracy of 0.97 (95% confidence interval [CI], 0.93-1.00). Objectives: The purpose of this study was to validate the algorithm in an independent patient population and quantitate effects that algorithm adherence could have on clinical care. Methods: A retrospective cohort comprised patients who had undergone anti-PF4/H ELISA and serotonin release assay (SRA) testing in our healthcare system from 2010 to 2014. We determined the algorithm recommendation for each patient, compared recommendations with the clinical care received, and enumerated consequences of discrepancies. Operating characteristics were calculated for algorithm recommendations using SRA as the reference standard. Results: Analysis was performed on 181 patients, 10 of whom were ruled in for HIT. The algorithm accurately stratified 98% of patients (95% CI, 95-99%), ruling out HIT in 158, ruling in HIT in 10 and recommending an SRA in 13 patients. Algorithm adherence would have obviated 165 SRAs and prevented 30 courses of unnecessary antithrombotic therapy for HIT. Diagnostic sensitivity was 0.82 (95% CI, 0.48-0.98), specificity 0.99 (95% CI, 0.97-1.00), PPV 0.90 (95% CI, 0.56-0.99) and NPV 0.99 (95% CI, 0.96-1.00). Conclusions: An algorithm incorporating 4Ts scoring and a stratified interpretation of the anti-PF4/H ELISA has good operating characteristics and the potential to improve management of suspected HIT patients. © 2017 International Society on Thrombosis and Haemostasis.
Radiochromic film diagnostics for laser-driven ion beams
NASA Astrophysics Data System (ADS)
Kaufman, J.; Margarone, Daniele; Candiano, Giacomo; Kim, I. Jong; Jeong, Tae Moon; Pšikal, Jan; Romano, F.; Cirrone, P.; Scuderi, V.; Korn, Georg
2015-05-01
Radiochromic film (RCF) based multichannel diagnostics utilizes the concept of a stack detector comprised of alternating layers of RCFs and shielding aluminium layers. An algorithm based on SRIM simulations is used to correct the accumulated dose. Among the standard information that can be obtained are the maximum ion energy and, to some extent, the beam energy spectrum. The main strength of this detector, however, is the geometrical characterization of the beam. Whereas other detectors such as a Thomson parabola spectrometer or Faraday cups detect only a fraction of the outburst cone, the RCF stack placed right behind the target absorbs the whole beam. A complete 2D and, to some extent, 3D imprint of the ion beam allows us to determine parameters such as divergence or beam center shift with respect to the target normal. The obvious drawback of such diagnostics is its invasive character. However, considering that only a few successful shots (2-3) are needed per kind of target to perform the analysis, the drawbacks are acceptable. In this work, we present results obtained with the RCF diagnostics using both conventional accelerators and laser-driven ion beams during two experimental campaigns.
GUI Type Fault Diagnostic Program for a Turboshaft Engine Using Fuzzy and Neural Networks
NASA Astrophysics Data System (ADS)
Kong, Changduk; Koo, Youngju
2011-04-01
The helicopter to be operated in a severe flight environment must have a very reliable propulsion system. On-line condition monitoring and fault detection of the engine can promote reliability and availability of the helicopter propulsion system. A hybrid health monitoring program using fuzzy logic and neural network algorithms is proposed. In this hybrid method, the fuzzy logic easily identifies the faulted components from changes in engine measurement parameters, and the neural networks accurately quantify the identified faults. In order to use the fault diagnostic system effectively, a GUI (Graphical User Interface) type program is newly proposed. This program is composed of the real-time monitoring part, the engine condition monitoring part and the fault diagnostic part. The real-time monitoring part can display measured parameters of the study turboshaft engine such as power turbine inlet temperature, exhaust gas temperature, fuel flow, torque and gas generator speed. The engine condition monitoring part can evaluate the engine condition through comparison between the monitored performance parameters and the baseline performance parameters analyzed by the base performance analysis program using look-up tables. The fault diagnostic part can identify and quantify single faults and multiple faults from the monitored parameters using the hybrid method.
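A much-simplified sketch of the fuzzy identification stage is shown below: relative deviations of monitored parameters from the baseline model are mapped through membership functions and combined into per-component fault degrees. The membership functions, parameters and rules are invented here for illustration and are not the program's actual rule base.

```python
# Hedged, toy fuzzy-logic stage only (component identification from measurement deltas).
def membership_decrease(delta, full=-0.03, none=0.0):
    """Degree to which a relative parameter change counts as a 'decrease'."""
    if delta <= full:
        return 1.0
    if delta >= none:
        return 0.0
    return (none - delta) / (none - full)

def identify_fault(deltas):
    """deltas: relative changes of monitored parameters vs. the baseline performance model."""
    compressor = min(membership_decrease(deltas["gas_generator_speed"]),
                     membership_decrease(-deltas["exhaust_gas_temperature"]))
    turbine = membership_decrease(deltas["torque"])
    return {"compressor_fault": compressor, "power_turbine_fault": turbine}

print(identify_fault({"gas_generator_speed": -0.02,
                      "exhaust_gas_temperature": 0.04, "torque": -0.01}))
```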
Validation of Living Donor Nephrectomy Codes
Lam, Ngan N.; Lentine, Krista L.; Klarenbach, Scott; Sood, Manish M.; Kuwornu, Paul J.; Naylor, Kyla L.; Knoll, Gregory A.; Kim, S. Joseph; Young, Ann; Garg, Amit X.
2018-01-01
Background: Use of administrative data for outcomes assessment in living kidney donors is increasing given the rarity of complications and challenges with loss to follow-up. Objective: To assess the validity of living donor nephrectomy in health care administrative databases compared with the reference standard of manual chart review. Design: Retrospective cohort study. Setting: 5 major transplant centers in Ontario, Canada. Patients: Living kidney donors between 2003 and 2010. Measurements: Sensitivity and positive predictive value (PPV). Methods: Using administrative databases, we conducted a retrospective study to determine the validity of diagnostic and procedural codes for living donor nephrectomies. The reference standard was living donor nephrectomies identified through the province’s tissue and organ procurement agency, with verification by manual chart review. Operating characteristics (sensitivity and PPV) of various algorithms using diagnostic, procedural, and physician billing codes were calculated. Results: During the study period, there were a total of 1199 living donor nephrectomies. Overall, the best algorithm for identifying living kidney donors was the presence of 1 diagnostic code for kidney donor (ICD-10 Z52.4) and 1 procedural code for kidney procurement/excision (1PC58, 1PC89, 1PC91). Compared with the reference standard, this algorithm had a sensitivity of 97% and a PPV of 90%. The diagnostic and procedural codes performed better than the physician billing codes (sensitivity 60%, PPV 78%). Limitations: The donor chart review and validation study was performed in Ontario and may not be generalizable to other regions. Conclusions: An algorithm consisting of 1 diagnostic and 1 procedural code can be reliably used to conduct health services research that requires the accurate determination of living kidney donors at the population level. PMID:29662679
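The best-performing rule reported above (one kidney-donor diagnostic code plus one kidney procurement/excision procedure code) lends itself to a very small case-finding function. The data layout assumed below is illustrative; actual administrative databases differ.

```python
# Hedged sketch of the case-finding rule: ICD-10 Z52.4 AND one of 1PC58/1PC89/1PC91.
import pandas as pd

DONOR_DX = {"Z52.4"}                          # ICD-10 kidney donor
PROCUREMENT_PX = {"1PC58", "1PC89", "1PC91"}  # kidney procurement/excision procedure codes

def flag_living_donors(records: pd.DataFrame) -> pd.Series:
    """records: one row per hospitalization with 'dx_codes' and 'px_codes' as code lists."""
    has_dx = records["dx_codes"].apply(lambda codes: bool(DONOR_DX & set(codes)))
    has_px = records["px_codes"].apply(lambda codes: bool(PROCUREMENT_PX & set(codes)))
    return has_dx & has_px

df = pd.DataFrame({"dx_codes": [["Z52.4", "N18.3"], ["I10"]],
                   "px_codes": [["1PC58"], ["1PC89"]]})
print(flag_living_donors(df).tolist())        # -> [True, False]
```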
Cooper, Robert F.; Lombardo, Marco; Carroll, Joseph; Sloan, Kenneth R.; Lombardo, Giuseppe
2016-01-01
The ability to non-invasively image the cone photoreceptor mosaic holds significant potential as a diagnostic for retinal disease. Central to the realization of this potential is the development of sensitive metrics for characterizing the organization of the mosaic. Here we evaluated previously-described (Pum et al., 1990) and newly-developed (Fourier- and Radon-based) methods of measuring cone orientation in both simulated and real images of the parafoveal cone mosaic. The proposed algorithms correlated well across both simulated and real mosaics, suggesting that each algorithm would provide an accurate description of individual photoreceptor orientation. Despite the high agreement between algorithms, each performed differently in response to image intensity variation and cone coordinate jitter. The integration property of the Fourier transform allowed the Fourier-based method to be resistant to cone coordinate jitter and perform the most robustly of all three algorithms. Conversely, when there is good image quality but unreliable cone identification, the Radon algorithm performed best. Finally, in cases where both the image and cone coordinate reliability was excellent, the method of Pum et al. (1990) performed best. These descriptors are complementary to conventional descriptive metrics of the cone mosaic, such as cell density and spacing, and have the potential to aid in the detection of photoreceptor pathology. PMID:27484961
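The general flavor of a Fourier-based orientation estimate can be sketched as follows: the dominant direction of an image patch is inferred from the second moments of its centered FFT magnitude. This mirrors the idea only; it is not the published algorithm, and the moment-based formulation is an assumption.

```python
# Illustrative Fourier-based orientation estimate for a photoreceptor-mosaic patch.
import numpy as np

def fft_orientation(patch):
    """Return the dominant orientation (radians) of a 2-D image patch."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(patch - patch.mean())))
    ny, nx = mag.shape
    y, x = np.mgrid[-ny // 2:ny - ny // 2, -nx // 2:nx - nx // 2]
    mxx = np.sum(mag * x * x)
    myy = np.sum(mag * y * y)
    mxy = np.sum(mag * x * y)
    # principal axis of the spectral energy; image structure lies perpendicular to it
    theta_spectrum = 0.5 * np.arctan2(2 * mxy, mxx - myy)
    return theta_spectrum + np.pi / 2
```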
NASA Astrophysics Data System (ADS)
Li, Shaoxin; Zhang, Yanjiao; Xu, Junfa; Li, Linfang; Zeng, Qiuyao; Lin, Lin; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Liu, Songhao
2014-09-01
This study aims to present a noninvasive prostate cancer screening method using serum surface-enhanced Raman scattering (SERS) and support vector machine (SVM) techniques on peripheral blood samples. SERS measurements are performed using serum samples from 93 prostate cancer patients and 68 healthy volunteers with silver nanoparticles. Three types of kernel functions, including linear, polynomial, and Gaussian radial basis function (RBF), are employed to build SVM diagnostic models for classifying the measured SERS spectra. To comparably evaluate the performance of the SVM classification models, the standard multivariate statistical analysis method of principal component analysis (PCA) is also applied to classify the same datasets. The study results show that for the RBF kernel SVM diagnostic model, a diagnostic accuracy of 98.1% is achieved, which is superior to the 91.3% obtained with the PCA method. The receiver operating characteristic curves of the diagnostic models further confirm these results. This study demonstrates that label-free serum SERS analysis combined with an SVM diagnostic algorithm has great potential for noninvasive prostate cancer screening.
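The classification step described above is easy to sketch with standard tooling: an RBF-kernel SVM on the spectra, cross-validated, alongside a PCA-reduced baseline. Spectra, labels and hyperparameters below are placeholders, not the study's data or settings.

```python
# Hedged sketch of SERS spectrum classification: RBF-SVM vs. a PCA-based baseline.
import numpy as np
from sklearn.svm import SVC
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

X = np.random.rand(161, 1024)              # 161 serum SERS spectra (placeholder values)
y = np.r_[np.ones(93), np.zeros(68)]       # 93 patients, 68 healthy volunteers

svm_rbf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
pca_lr = make_pipeline(StandardScaler(), PCA(n_components=10),
                       LogisticRegression(max_iter=1000))

print("SVM (RBF):", cross_val_score(svm_rbf, X, y, cv=5).mean())
print("PCA baseline:", cross_val_score(pca_lr, X, y, cv=5).mean())
```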
Kuhnigk, H; Steinhübel, B; Keil, T; Roewer, N
2004-07-01
Anaesthesia management, radiological diagnostics and the concept of damage control surgery should be combined in the resuscitation room. The defined clinical targets are a complete CT scan and damage control surgery in the shock room. Furthermore, minimised patient transfer and positioning with continuous access to the head, the upper body and the anaesthesia machine should be ensured during diagnostic procedures. Based on a carbon slide fixed on a turntable and an innovative alignment of diagnostic devices, a three-phase treatment algorithm has been established. Phase A includes the primary survey, anaesthetic management and ultrasound examination. Following a turn of the table, conventional X-ray diagnostics are performed in phase B. Tracks for the slide enable immediate transfer to a spiral CT scanner without additional patient positioning (phase C). After the complete CT scan, rearrangement of the table to phase A facilitates immediate damage control surgery. To accelerate device operation and treatment, the integrated anaesthesia workstation is ceiling-mounted and manoeuvres close to the patient. This concept realizes complete diagnostic procedures and damage control surgery without time-consuming patient transfer or rearrangement.
Full range line-field parallel swept source imaging utilizing digital refocusing
NASA Astrophysics Data System (ADS)
Fechtig, Daniel J.; Kumar, Abhishek; Drexler, Wolfgang; Leitgeb, Rainer A.
2015-12-01
We present geometric optics-based refocusing applied to a novel off-axis line-field parallel swept source imaging (LPSI) system. LPSI is an imaging modality based on line-field swept source optical coherence tomography, which permits 3-D imaging at acquisition speeds of up to 1 MHz. The digital refocusing algorithm applies a defocus-correcting phase term to the Fourier representation of complex-valued interferometric image data, which is based on the geometrical optics information of the LPSI system. We introduce the off-axis LPSI system configuration, the digital refocusing algorithm and demonstrate the effectiveness of our method for refocusing volumetric images of technical and biological samples. An increase of effective in-focus depth range from 255 μm to 4.7 mm is achieved. The recovery of the full in-focus depth range might be especially valuable for future high-speed and high-resolution diagnostic applications of LPSI in ophthalmology.
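A minimal numerical sketch of the defocus-correction idea is given below: the spectrum of the complex-valued en-face field is multiplied by a quadratic phase term parameterized by the defocus distance. The Fresnel-style parameterization is an assumption for illustration, not the LPSI system's exact algorithm.

```python
# Illustrative digital refocusing in the Fourier domain (assumed paraxial defocus term).
import numpy as np

def refocus(field, dx, wavelength, defocus):
    """field: 2-D complex en-face image; dx: pixel pitch [m]; defocus: distance [m]."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # paraxial defocus phase: exp(-i * pi * lambda * z * (fx^2 + fy^2))
    phase = np.exp(-1j * np.pi * wavelength * defocus * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * phase)
```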
A simple algorithm for beam profile diagnostics using a thermographic camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katagiri, Ken; Hojo, Satoru; Honma, Toshihiro
2014-03-15
A new algorithm for digital image processing apparatuses is developed to evaluate profiles of high-intensity DC beams from temperature images of irradiated thin foils. Numerical analyses are performed to examine the reliability of the algorithm. To simulate the temperature images acquired by a thermographic camera, temperature distributions are numerically calculated for 20 MeV proton beams with different parameters. Noise in the temperature images which is added by the camera sensor is also simulated to account for its effect. Using the algorithm, beam profiles are evaluated from the simulated temperature images and compared with exact solutions. We find that niobium is an appropriate material for the thin foil used in the diagnostic system. We also confirm that the algorithm is adaptable over a wide beam current range of 0.11–214 μA, even when employing a general-purpose thermographic camera with rather high noise (ΔT_NETD ≃ 0.3 K; NETD: noise equivalent temperature difference).
Genetic Evaluation of Short Stature
Rosenfeld, Ron G.
2014-01-01
Context: Genetics plays a major role in determining an individual's height. Although there are many monogenic disorders that lead to perturbations in growth and result in short stature, there is still no consensus as to the role that genetic diagnostics should play in the evaluation of a child with short stature. Evidence Acquisition: A search of PubMed was performed, focusing on the genetic diagnosis of short stature as well as on specific diagnostic subgroups included in this article. Consensus guidelines were reviewed. Evidence Synthesis: There are a multitude of rare genetic causes of severe short stature. There is no high-quality evidence to define the optimal approach to the genetic evaluation of short stature. We review genetic etiologies of a number of diagnostic subgroups and propose an algorithm for genetic testing based on these subgroups. Conclusion: Advances in genomic technologies are revolutionizing the diagnostic approach to short stature. Endocrinologists must become facile with the use of genetic testing in order to identify the various monogenic disorders that present with short stature. PMID:24915122
New clinical grading scales and objective measurement for conjunctival injection.
Park, In Ki; Chun, Yeoun Sook; Kim, Kwang Gi; Yang, Hee Kyung; Hwang, Jeong-Min
2013-08-05
To establish a new clinical grading scale and objective measurement method to evaluate conjunctival injection. Photographs of conjunctival injection in 429 eyes with various ocular diseases were reviewed. Seventy-three images with concordance by three ophthalmologists were classified into a 4-step and a 10-step subjective grading scale, and used as standard photographs. Each image was quantified in four ways: the relative magnitude of the redness component of each red-green-blue (RGB) pixel; two different algorithms based on the area occupied by blood vessels (K-means clustering with the LAB color model and the contrast-limited adaptive histogram equalization [CLAHE] algorithm); and the presence of blood vessel edges, based on the Canny edge-detection algorithm. Areas under the receiver operating characteristic curves (AUCs) were calculated to summarize the diagnostic accuracies of the four algorithms. The RGB color model, K-means clustering with the LAB color model, and the CLAHE algorithm showed good correlation with the clinical 10-step grading scale (R = 0.741, 0.784, 0.919, respectively) and with the clinical 4-step grading scale (R = 0.645, 0.702, 0.838, respectively). The CLAHE method showed the largest AUC, the best distinction power (P < 0.001, ANOVA, Bonferroni multiple comparison test), and high reproducibility (R = 0.996). The CLAHE algorithm showed the best correlation with the 10-step and 4-step subjective clinical grading scales together with high distinction power and reproducibility. The CLAHE algorithm can therefore be a useful method for the assessment of conjunctival injection.
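For readers unfamiliar with CLAHE, the following is a short sketch of how it is typically used to enhance vessels before estimating the area they occupy. The channel choice and Otsu threshold are illustrative assumptions, not the paper's calibrated pipeline.

```python
# Hedged sketch: CLAHE enhancement followed by a simple occupied-area estimate.
from skimage import io, color, exposure, filters
from skimage.util import img_as_float

def vessel_area_fraction(image_path):
    rgb = img_as_float(io.imread(image_path)[..., :3])
    gray = color.rgb2gray(rgb)
    enhanced = exposure.equalize_adapthist(gray, clip_limit=0.03)   # CLAHE
    vessels = enhanced < filters.threshold_otsu(enhanced)           # dark vessels vs. sclera
    return vessels.mean()                                           # fraction of area occupied
```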
NASA Astrophysics Data System (ADS)
Smarda, M.; Alexopoulou, E.; Mazioti, A.; Kordolaimi, S.; Ploussi, A.; Priftis, K.; Efstathopoulos, E.
2015-09-01
The purpose of the study is to determine the appropriate iterative reconstruction (IR) algorithm level that combines image quality and diagnostic confidence for pediatric patients undergoing high-resolution computed tomography (HRCT). During the last 2 years, a total of 20 children up to 10 years old with a clinical presentation of chronic bronchitis underwent HRCT in our department's 64-detector row CT scanner using the iDose IR algorithm, with similar image settings (80 kVp, 40-50 mAs). CT images were reconstructed with all iDose levels (level 1 to 7) as well as with the filtered-back projection (FBP) algorithm. Subjective image quality was evaluated by 2 experienced radiologists in terms of image noise, sharpness, contrast and diagnostic acceptability using a 5-point scale (1 = excellent image, 5 = non-acceptable image). The presence of artifacts was also noted. All mean scores from both radiologists corresponded to satisfactory image quality (score ≤3), even with the use of the FBP algorithm. Almost excellent (score <2) overall image quality was achieved with iDose levels 5 to 7, but oversmoothing artifacts appearing with iDose levels 6 and 7 affected the diagnostic confidence. In conclusion, the use of iDose level 5 enables almost excellent image quality without considerable artifacts affecting the diagnosis. Further evaluation is needed in order to draw more precise conclusions.
Schellhaas, Barbara; Görtz, Ruediger S; Pfeifer, Lukas; Kielisch, Christian; Neurath, Markus F; Strobel, Deike
2017-09-01
A comparison is made of two contrast-enhanced ultrasound (CEUS) algorithms for the diagnosis of hepatocellular carcinoma (HCC) in high-risk patients: Erlanger Synopsis of Contrast-enhanced Ultrasound for Liver lesion Assessment in Patients at Risk (ESCULAP) and American College of Radiology Contrast-Enhanced Ultrasound-Liver Imaging Reporting and Data System (ACR-CEUS-LI-RADSv.2016). Focal liver lesions in 100 high-risk patients were assessed using both CEUS algorithms (ESCULAP and CEUS-LI-RADSv.2016) for a direct comparison. Lesions were categorized according to size and contrast enhancement in the arterial, portal venous and late phases. For the definite diagnosis of HCC, categories ESCULAP-4, ESCULAP-Tr and ESCULAP-V and CEUS-LI-RADS-LR-5, LR-Tr and LR-5-V were compared. In addition, CEUS-LI-RADS-category LR-M (definitely/probably malignant, but not specific for HCC) and ESCULAP-category C [intrahepatic cholangiocellular carcinoma (ICC)] were compared. Histology, CE-computed tomography and CE-MRI served as reference standards. The reference standard among 100 lesions included 87 HCCs, six ICCs and seven non-HCC-non-ICC-lesions. For the diagnosis of HCC, the diagnostic accuracy of CEUS was significantly higher with ESCULAP versus CEUS-LI-RADS (94.3%/72.4%; p<0.01). Sensitivity, specificity and positive predictive value (PPV) and negative predictive value for ESCULAP/CEUS-LI-RADS were 94.3%/72.4%; 61.5%/69.2%; 94.3%/94%; and 61.5%/27.3%, respectively. The diagnostic accuracy for ICC (LR-M/ESCULAP-C) was identical with both algorithms (50%), with higher PPV for ESCULAP-C versus LR-M (75 vs. 50%). CEUS-based algorithms contribute toward standardized assessment and reporting of HCC-suspect lesions in high-risk patients. ESCULAP shows significantly higher diagnostic accuracy, sensitivity and negative predictive value with no loss of specificity compared with CEUS-LI-RADS. Both algorithms have an excellent PPV. Arterial hyperenhancement is the key feature for the diagnosis of HCC with CEUS. Washout should not be a necessary prerequisite for the diagnosis of definite HCC. CEUS-LI-RADS in its current version is inferior to ESCULAP for the noninvasive diagnosis of HCC. There are two ways to improve CEUS-LI-RADS: firstly, combination of the categories LR-4 and LR-5 for the diagnosis of definite HCC, and secondly, use of subtotal infiltration of a liver lobe as an additional feature.
Bokov, Plamen; Mahut, Bruno; Flaud, Patrice; Delclaux, Christophe
2016-03-01
Respiratory diseases in children are a common reason for physician visits. A diagnostic difficulty arises when parents hear wheezing that is no longer present during the medical consultation. Thus, an outpatient objective tool for recognition of wheezing is of clinical value. We developed a wheezing recognition algorithm from recorded respiratory sounds with a Smartphone placed near the mouth. A total of 186 recordings were obtained in a pediatric emergency department, mostly in toddlers (mean age 20 months). After exclusion of recordings with artefacts and those with a single clinical operator auscultation, 95 recordings with the agreement of two operators on auscultation diagnosis (27 with wheezing and 68 without) were subjected to a two phase algorithm (signal analysis and pattern classifier using machine learning algorithms) to classify records. The best performance (71.4% sensitivity and 88.9% specificity) was observed with a Support Vector Machine-based algorithm. We further tested the algorithm over a set of 39 recordings having a single operator and found a fair agreement (kappa=0.28, CI95% [0.12, 0.45]) between the algorithm and the operator. The main advantage of such an algorithm is its use in contact-free sound recording, thus valuable in the pediatric population. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zeng, Chen; Rosengard, Sarah Z.; Burt, William; Peña, M. Angelica; Nemcek, Nina; Zeng, Tao; Arrigo, Kevin R.; Tortell, Philippe D.
2018-06-01
We evaluate several algorithms for the estimation of phytoplankton size class (PSC) and functional type (PFT) biomass from ship-based optical measurements in the Subarctic Northeast Pacific Ocean. Using underway measurements of particulate absorption and backscatter in surface waters, we derived estimates of PSC/PFT based on chlorophyll-a concentrations (Chl-a), particulate absorption spectra and the wavelength dependence of particulate backscatter. Optically-derived [Chl-a] and phytoplankton absorption measurements were validated against discrete calibration samples, while the derived PSC/PFT estimates were validated using size-fractionated Chl-a measurements and HPLC analysis of diagnostic photosynthetic pigments (DPA). Our results show that PSC/PFT algorithms based on [Chl-a] and particulate absorption spectra performed significantly better than the backscatter slope approach. These two more successful algorithms yielded estimates of phytoplankton size classes that agreed well with HPLC-derived DPA estimates (RMSE = 12.9%, and 16.6%, respectively) across a range of hydrographic and productivity regimes. Moreover, the [Chl-a] algorithm produced PSC estimates that agreed well with size-fractionated [Chl-a] measurements, and estimates of the biomass of specific phytoplankton groups that were consistent with values derived from HPLC. Based on these results, we suggest that simple [Chl-a] measurements should be more fully exploited to improve the classification of phytoplankton assemblages in the Northeast Pacific Ocean.
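A [Chl-a]-based size-class partition in the spirit of three-component abundance models (e.g. Brewin et al.) can be sketched as below. The coefficient values are placeholders chosen only to produce plausible fractions; they are not the tuned parameters used in the study.

```python
# Hedged sketch of a three-component [Chl-a] -> size-class model (placeholder coefficients).
import numpy as np

def size_class_fractions(chl, cpn=1.0, spn=0.9, cp=0.15, sp=6.0):
    """Return (micro, nano, pico) chlorophyll fractions for total [Chl-a] `chl` (mg m^-3)."""
    chl = np.asarray(chl, dtype=float)
    chl_pn = cpn * (1.0 - np.exp(-spn * chl))    # combined pico+nano contribution
    chl_p = cp * (1.0 - np.exp(-sp * chl))       # picoplankton contribution
    f_micro = (chl - chl_pn) / chl
    f_pico = chl_p / chl
    f_nano = 1.0 - f_micro - f_pico
    return f_micro, f_nano, f_pico

print(size_class_fractions([0.2, 1.0, 5.0]))     # micro fraction grows with total biomass
```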
Improved Temperature Diagnostic for Non-Neutral Plasmas with Single-Electron Resolution
NASA Astrophysics Data System (ADS)
Shanman, Sabrina; Evans, Lenny; Fajans, Joel; Hunter, Eric; Nelson, Cheyenne; Sierra, Carlos; Wurtele, Jonathan
2016-10-01
Plasma temperature diagnostics in a Penning-Malmberg trap are essential for reliably obtaining cold, non-neutral plasmas. We have developed a setup for detecting the initial electrons that escape from a trapped pure electron plasma as the confining electrode potential is slowly reduced. The setup minimizes external noise by using a silicon photomultiplier to capture light emitted from an MCP-amplified phosphor screen. To take advantage of this enhanced resolution, we have developed a new plasma temperature diagnostic analysis procedure which takes discrete electron arrival times as input. We have run extensive simulations comparing this new discrete algorithm to our existing exponential fitting algorithm. These simulations are used to explore the behavior of these two temperature diagnostic procedures at low N and at high electronic noise. This work was supported by the DOE DE-FG02-06ER54904, and the NSF 1500538-PHY.
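For context, the conventional exponential-fit diagnostic that the new discrete-arrival analysis builds on can be sketched as follows: as the confining potential is lowered, the number of escaped electrons grows roughly as exp(-e*phi/kT), so the slope of log(count) versus well depth yields the temperature. The synthetic data and fit window below are illustrative; this is the standard fit, not the new single-electron procedure.

```python
# Hedged sketch of the classic exponential temperature fit for a dumped electron plasma.
import numpy as np

E_CHARGE, K_BOLTZ = 1.602e-19, 1.381e-23

def temperature_from_dump(well_depth_volts, escaped_counts, fit_range=slice(0, 15)):
    """Fit T [K] from the initial (low-count) part of the escape curve."""
    v = np.asarray(well_depth_volts)[fit_range]
    n = np.asarray(escaped_counts, dtype=float)[fit_range]
    slope, _ = np.polyfit(v, np.log(n), 1)          # log N = -(e/kT) * phi + const
    return -E_CHARGE / (K_BOLTZ * slope)

phi = np.linspace(10.0, 9.0, 40)                    # well depth being reduced [V]
counts = 3.0 * np.exp(-(phi - 9.0) / 0.0862)        # synthetic curve for kT/e ~ 0.086 V
print(temperature_from_dump(phi, counts))           # ~1000 K for this synthetic curve
```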
Algorithmic Classification of Five Characteristic Types of Paraphasias.
Fergadiotis, Gerasimos; Gorman, Kyle; Bedrick, Steven
2016-12-01
This study was intended to evaluate a series of algorithms developed to perform automatic classification of paraphasic errors (formal, semantic, mixed, neologistic, and unrelated errors). We analyzed 7,111 paraphasias from the Moss Aphasia Psycholinguistics Project Database (Mirman et al., 2010) and evaluated the classification accuracy of 3 automated tools. First, we used frequency norms from the SUBTLEXus database (Brysbaert & New, 2009) to differentiate nonword errors and real-word productions. Then we implemented a phonological-similarity algorithm to identify phonologically related real-word errors. Last, we assessed the performance of a semantic-similarity criterion that was based on word2vec (Mikolov, Yih, & Zweig, 2013). Overall, the algorithmic classification replicated human scoring for the major categories of paraphasias studied with high accuracy. The tool that was based on the SUBTLEXus frequency norms was more than 97% accurate in making lexicality judgments. The phonological-similarity criterion was approximately 91% accurate, and the overall classification accuracy of the semantic classifier ranged from 86% to 90%. Overall, the results highlight the potential of tools from the field of natural language processing for the development of highly reliable, cost-effective diagnostic tools suitable for collecting high-quality measurement data for research and clinical purposes.
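The cascade described above (lexicality check, phonological check, semantic check) can be illustrated with a toy classifier. The tiny frequency table and word vectors stand in for SUBTLEXus and word2vec; the thresholds and helper names are invented for illustration and are not the published tool.

```python
# Toy sketch of the paraphasia classification cascade (placeholders throughout).
import numpy as np

FREQ = {"cat": 60.1, "hat": 22.3, "dog": 55.4}                 # per-million counts (toy)
VEC = {"cat": np.array([0.9, 0.1]), "dog": np.array([0.8, 0.2]),
       "hat": np.array([0.1, 0.9])}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify_paraphasia(target, response, phon_overlap):
    if response not in FREQ:                                    # lexicality judgment
        return "neologistic"
    semantic = cosine(VEC[target], VEC[response]) > 0.7
    phonological = phon_overlap >= 0.5                          # e.g. shared onset/vowel fraction
    if semantic and phonological:
        return "mixed"
    if semantic:
        return "semantic"
    if phonological:
        return "formal"
    return "unrelated"

print(classify_paraphasia("cat", "dog", phon_overlap=0.2))      # -> 'semantic'
```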
Towards a Framework for Evaluating and Comparing Diagnosis Algorithms
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia,David; Kuhn, Lukas; deKleer, Johan; vanGemund, Arjan; Feldman, Alexander
2009-01-01
Diagnostic inference involves the detection of anomalous system behavior and the identification of its cause, possibly down to a failed unit or to a parameter of a failed unit. Traditional approaches to solving this problem include expert/rule-based, model-based, and data-driven methods. Each approach (and various techniques within each approach) use different representations of the knowledge required to perform the diagnosis. The sensor data is expected to be combined with these internal representations to produce the diagnosis result. In spite of the availability of various diagnosis technologies, there have been only minimal efforts to develop a standardized software framework to run, evaluate, and compare different diagnosis technologies on the same system. This paper presents a framework that defines a standardized representation of the system knowledge, the sensor data, and the form of the diagnosis results and provides a run-time architecture that can execute diagnosis algorithms, send sensor data to the algorithms at appropriate time steps from a variety of sources (including the actual physical system), and collect resulting diagnoses. We also define a set of metrics that can be used to evaluate and compare the performance of the algorithms, and provide software to calculate the metrics.
Algorithm based on the short-term Rényi entropy and IF estimation for noisy EEG signals analysis.
Lerga, Jonatan; Saulig, Nicoletta; Mozetič, Vladimir
2017-01-01
Stochastic electroencephalogram (EEG) signals are known to be nonstationary and often multicomponential. Detecting and extracting their components may help clinicians to localize brain neurological dysfunctionalities for patients with motor control disorders due to the fact that movement-related cortical activities are reflected in spectral EEG changes. A new algorithm for EEG signal components detection from its time-frequency distribution (TFD) has been proposed in this paper. The algorithm utilizes the modification of the Rényi entropy-based technique for number of components estimation, called short-term Rényi entropy (STRE), and upgraded by an iterative algorithm which was shown to enhance existing approaches. Combined with instantaneous frequency (IF) estimation, the proposed method was applied to EEG signal analysis both in noise-free and noisy environments for limb movements EEG signals, and was shown to be an efficient technique providing spectral description of brain activities at each electrode location up to moderate additive noise levels. Furthermore, the obtained information concerning the number of EEG signal components and their IFs show potentials to enhance diagnostics and treatment of neurological disorders for patients with motor control illnesses. Copyright © 2016 Elsevier Ltd. All rights reserved.
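The short-term Rényi entropy idea can be sketched in a few lines: compute a time-frequency representation, take the order-alpha Rényi entropy of each normalized time slice, and express the local number of components relative to a single-component reference slice. The spectrogram parameters and the minimum-entropy reference are simplifying assumptions, not the paper's exact iterative procedure.

```python
# Hedged sketch of short-term Rényi entropy (STRE) component counting on a spectrogram.
import numpy as np
from scipy.signal import spectrogram

def short_term_renyi(signal, fs, alpha=3):
    f, t, S = spectrogram(signal, fs=fs, nperseg=256, noverlap=192)
    P = S / S.sum(axis=0, keepdims=True)                 # normalize each time slice
    H = np.log2(np.sum(P**alpha, axis=0)) / (1 - alpha)  # Rényi entropy per slice
    return t, H

fs = 256.0
t = np.arange(0, 4, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t) + (t > 2) * np.sin(2 * np.pi * 35 * t)
times, H = short_term_renyi(sig, fs)
n_components = 2 ** (H - H.min())     # ~1 before 2 s, ~2 after (single-component reference)
```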
Embedded Reasoning Supporting Aerospace IVHM
2007-01-01
The monitoring diagnostics rely on input information from built-in test (BIT) or health assessment algorithms. The system provides viewing of the current health state of all monitored subsystems, while also providing a means to probe deeper in the event anomalous operation is detected, and seeks to integrate detection, diagnostic, and prognostic capabilities with a hierarchical diagnostic reasoning architecture into a single framework.
An international consensus algorithm for management of chronic postoperative inguinal pain.
Lange, J F M; Kaufmann, R; Wijsmuller, A R; Pierie, J P E N; Ploeg, R J; Chen, D C; Amid, P K
2015-02-01
Tension-free mesh repair of inguinal hernia has led to uniformly low recurrence rates. Morbidity associated with this operation is mainly related to chronic pain. No consensus guidelines exist for the management of this condition. The goal of this study is to design an expert-based algorithm for diagnostic and therapeutic management of chronic inguinal postoperative pain (CPIP). A group of surgeons considered experts on inguinal hernia surgery was solicited to develop the algorithm. Consensus regarding each step of an algorithm proposed by the authors was sought by means of the Delphi method, leading to a revised expert-based algorithm. With the input of 28 international experts, an algorithm for a stepwise approach to the management of CPIP was created. Twenty-six participants accepted the final algorithm as a consensus model. One participant could not agree with the final concept. One expert did not respond during the final phase. There is a need for guidelines with regard to the management of CPIP. This algorithm can serve as a guide with regard to the diagnosis, management, and treatment of these patients and improve clinical outcomes. If an expectant (watchful waiting) phase of a few months has passed without any improvement of CPIP, a multidisciplinary approach is indicated and a pain management team should be consulted. Pharmacologic, behavioral, and interventional modalities including nerve blocks are essential. If conservative measures fail and surgery is considered, triple neurectomy, correction for recurrence with or without neurectomy, and meshoma removal if indicated should be performed. Surgeons less experienced with remedial operations for CPIP should not hesitate to refer their patients to dedicated hernia surgeons.
Idowu, Rachel T; Carnahan, Ryan; Sathe, Nila A; McPheeters, Melissa L
2013-12-30
To identify algorithms that can capture incident cases of myocarditis and pericarditis in administrative and claims databases; these algorithms can eventually be used to identify cardiac inflammatory adverse events following vaccine administration. We searched MEDLINE from 1991 to September 2012 using controlled vocabulary and key terms related to myocarditis. We also searched the reference lists of included studies. Two investigators independently assessed the full text of studies against pre-determined inclusion criteria. Two reviewers independently extracted data regarding participant and algorithm characteristics as well as study conduct. Nine publications (including one study reported in two publications) met criteria for inclusion. Two studies performed medical record review in order to confirm that these coding algorithms actually captured patients with the disease of interest. One of these studies identified five potential cases, none of which were confirmed as acute myocarditis upon review. The other study, which employed a search algorithm based on diagnostic surveillance (using ICD-9 codes 420.90, 420.99, 422.90, 422.91 and 429.0) and sentinel reporting, identified 59 clinically confirmed cases of myopericarditis among 492,671 United States military service personnel who received smallpox vaccine between 2002 and 2003. Neither study provided algorithm validation statistics (positive predictive value, sensitivity, or specificity). A validated search algorithm is currently unavailable for identifying incident cases of pericarditis or myocarditis. Several authors have published unvalidated ICD-9-based search algorithms that appear to capture myocarditis events occurring in the context of other underlying cardiac or autoimmune conditions. Copyright © 2013. Published by Elsevier Ltd.
Hall, Gunnsteinn; Liang, Wenxuan; Li, Xingde
2017-10-01
Collagen fiber alignment derived from second harmonic generation (SHG) microscopy images can be important for disease diagnostics. Image processing algorithms are needed to robustly quantify the alignment in images with high sensitivity and reliability. Fourier transform (FT) magnitude, 2D power spectrum, and image autocorrelation have previously been used to extract fiber information from images by assuming a certain mathematical model (e.g. Gaussian distribution of the fiber-related parameters) and fitting. The fitting process is slow and fails to converge when the data is not Gaussian. Herein we present an efficient constant-time deterministic algorithm which characterizes the symmetricity of the FT magnitude image in terms of a single parameter, named the fiber alignment anisotropy R ranging from 0 (randomized fibers) to 1 (perfect alignment). This represents an important improvement of the technology and may bring us one step closer to utilizing the technology for various applications in real time. In addition, we present a digital image phantom-based framework for characterizing and validating the algorithm, as well as assessing the robustness of the algorithm against different perturbations.
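An anisotropy parameter in [0, 1] derived from the FT magnitude can be illustrated with the eigenvalues of its second-moment (inertia) matrix. The exact definition of R in the paper may differ; the formulation below is an assumed analogue for illustration only.

```python
# Illustrative fiber-alignment anisotropy from the FT magnitude (assumed analogue of R).
import numpy as np

def alignment_anisotropy(image):
    mag = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean())))
    ny, nx = mag.shape
    y, x = np.mgrid[-ny // 2:ny - ny // 2, -nx // 2:nx - nx // 2]
    w = mag / mag.sum()
    mxx, myy, mxy = (w * x * x).sum(), (w * y * y).sum(), (w * x * y).sum()
    lam_min, lam_max = np.linalg.eigvalsh([[mxx, mxy], [mxy, myy]])   # ascending eigenvalues
    return 1.0 - lam_min / lam_max          # 0 = isotropic (random), 1 = perfectly aligned

rng = np.random.default_rng(0)
fibers = (np.sin(2 * np.pi * np.arange(128)[None, :] / 8)
          + 0.1 * rng.standard_normal((128, 128)))
print(alignment_anisotropy(fibers))         # close to 1 for this strongly oriented pattern
```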
NASA Astrophysics Data System (ADS)
Potlov, A. Yu.; Frolov, S. V.; Proskurin, S. G.
2018-04-01
A high-quality OCT structural image reconstruction algorithm for endoscopic optical coherence tomography of biological tissue is described. The key features of the presented algorithm are: (1) raster scanning and averaging of adjacent A-scans and pixels; (2) speckle level minimization. The described algorithm can be used in gastroenterology, urology, gynecology and otorhinolaryngology for diagnostics of mucous membranes and skin in vivo and in situ.
An algorithmic approach to the brain biopsy--part I.
Kleinschmidt-DeMasters, B K; Prayson, Richard A
2006-11-01
The formulation of appropriate differential diagnoses for a slide is essential to the practice of surgical pathology but can be particularly challenging for residents and fellows. Algorithmic flow charts can help the less experienced pathologist to systematically consider all possible choices and eliminate incorrect diagnoses. They can assist pathologists-in-training in developing orderly, sequential, and logical thinking skills when confronting difficult cases. To present an algorithmic flow chart as an approach to formulating differential diagnoses for lesions seen in surgical neuropathology. An algorithmic flow chart to be used in teaching residents. Algorithms are not intended to be final diagnostic answers on any given case. Algorithms do not substitute for training received from experienced mentors nor do they substitute for comprehensive reading by trainees of reference textbooks. Algorithmic flow diagrams can, however, direct the viewer to the correct spot in reference texts for further in-depth reading once they hone down their diagnostic choices to a smaller number of entities. The best feature of algorithms is that they remind the user to consider all possibilities on each case, even if they can be quickly eliminated from further consideration. In Part I, we assist the resident in learning how to handle brain biopsies in general and how to distinguish nonneoplastic lesions that mimic tumors from true neoplasms.
An algorithmic approach to the brain biopsy--part II.
Prayson, Richard A; Kleinschmidt-DeMasters, B K
2006-11-01
The formulation of appropriate differential diagnoses for a slide is essential to the practice of surgical pathology but can be particularly challenging for residents and fellows. Algorithmic flow charts can help the less experienced pathologist to systematically consider all possible choices and eliminate incorrect diagnoses. They can assist pathologists-in-training in developing orderly, sequential, and logical thinking skills when confronting difficult cases. To present an algorithmic flow chart as an approach to formulating differential diagnoses for lesions seen in surgical neuropathology. An algorithmic flow chart to be used in teaching residents. Algorithms are not intended to be final diagnostic answers on any given case. Algorithms do not substitute for training received from experienced mentors nor do they substitute for comprehensive reading by trainees of reference textbooks. Algorithmic flow diagrams can, however, direct the viewer to the correct spot in reference texts for further in-depth reading once they hone down their diagnostic choices to a smaller number of entities. The best feature of algorithms is that they remind the user to consider all possibilities on each case, even if they can be quickly eliminated from further consideration. In Part II, we assist the resident in arriving at the correct diagnosis for neuropathologic lesions containing granulomatous inflammation, macrophages, or abnormal blood vessels.
Ambavane, Apoorva; Lindahl, Bertil; Giannitsis, Evangelos; Roiz, Julie; Mendivil, Joan; Frankenstein, Lutz; Body, Richard; Christ, Michael; Bingisser, Roland; Alquezar, Aitor; Mueller, Christian
2017-01-01
The 1-hour (h) algorithm triages patients presenting with suspected acute myocardial infarction (AMI) to the emergency department (ED) towards "rule-out," "rule-in," or "observation," depending on baseline and 1-h levels of high-sensitivity cardiac troponin (hs-cTn). The economic consequences of applying the accelerated 1-h algorithm are unknown. We performed a post-hoc economic analysis in a large, diagnostic, multicenter study of hs-cTnT using central adjudication of the final diagnosis by two independent cardiologists. Length of stay (LoS), resource utilization (RU), and predicted diagnostic accuracy of the 1-h algorithm compared to standard of care (SoC) in the ED were estimated. The ED LoS, RU, and accuracy of the 1-h algorithm were compared to those achieved by the SoC at ED discharge. Expert opinion was sought to characterize clinical implementation of the 1-h algorithm, which required blood draws at ED presentation and 1 h, after which "rule-in" patients were transferred for coronary angiography, "rule-out" patients underwent outpatient stress testing, and "observation" patients received SoC. Unit costs were for the United Kingdom, Switzerland, and Germany. The sensitivity and specificity of the 1-h algorithm were 87% and 96%, respectively, compared to 69% and 98% for SoC. The mean ED LoS for the 1-h algorithm was 4.3 h, compared with 6.5 h for SoC, a reduction of 33%. The 1-h algorithm was associated with reductions in RU, driven largely by the shorter LoS in the ED for patients with a diagnosis other than AMI. The estimated total costs per patient were £2,480 for the 1-h algorithm compared to £4,561 for SoC, a reduction of up to 46%. The analysis shows that the use of the 1-h algorithm is associated with a reduction in overall AMI diagnostic costs, provided it is carefully implemented in clinical practice. These results need to be prospectively validated in the future.
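The triage logic of a 0/1-h rule lends itself to a very small decision function. The cut-offs below are illustrative placeholders; in practice the assay-specific thresholds from the applicable guideline must be used, and this sketch is not the study's implementation.

```python
# Hedged sketch of a 0/1-h hs-cTnT triage rule (illustrative thresholds only).
def triage_0_1h(troponin_0h, troponin_1h):
    """Return 'rule-out', 'rule-in' or 'observe' from baseline and 1-h hs-cTnT [ng/L]."""
    delta = abs(troponin_1h - troponin_0h)
    if troponin_0h < 12 and delta < 3:          # illustrative rule-out thresholds
        return "rule-out"
    if troponin_0h >= 52 or delta >= 5:         # illustrative rule-in thresholds
        return "rule-in"
    return "observe"

print(triage_0_1h(8, 9))      # -> 'rule-out'
print(triage_0_1h(60, 75))    # -> 'rule-in'
```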
Kosack, Cara S.; Shanks, Leslie; Beelaert, Greet; Benson, Tumwesigye; Savane, Aboubacar; Ng'ang'a, Anne; Bita, André; Zahinda, Jean-Paul B. N.; Fransen, Katrien
2017-01-01
Our objective was to evaluate the performance of HIV testing algorithms based on WHO recommendations, using data from specimens collected at six HIV testing and counseling sites in sub-Saharan Africa (Conakry, Guinea; Kitgum and Arua, Uganda; Homa Bay, Kenya; Douala, Cameroon; Baraka, Democratic Republic of Congo). A total of 2,780 samples, including 1,306 HIV-positive samples, were included in the analysis. HIV testing algorithms were designed using Determine as a first test. Second and third rapid diagnostic tests (RDTs) were selected based on site-specific performance, adhering where possible to the WHO-recommended minimum requirements of ≥99% sensitivity and specificity. The threshold for specificity was reduced to 98% or 96% if necessary. We also simulated algorithms consisting of one RDT followed by a simple confirmatory assay. The positive predictive values (PPV) of the simulated algorithms ranged from 75.8% to 100% using strategies recommended for high-prevalence settings, 98.7% to 100% using strategies recommended for low-prevalence settings, and 98.1% to 100% using a rapid test followed by a simple confirmatory assay. Although we were able to design algorithms that met the recommended PPV of ≥99% in five of six sites using the applicable high-prevalence strategy, options were often very limited due to suboptimal performance of individual RDTs and to shared falsely reactive results. These results underscore the impact of the sequence of HIV tests and of shared false-reactivity data on algorithm performance. Where it is not possible to identify tests that meet WHO-recommended specifications, the low-prevalence strategy may be more suitable. PMID:28747371
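The dependence of algorithm PPV on prevalence and assay characteristics is easy to illustrate. The calculation below assumes independent errors between the two tests, which is exactly the assumption that shared false reactivity violates, as the abstract points out; values are placeholders.

```python
# Simple sketch: PPV of a serial two-test algorithm under an independence assumption.
def serial_ppv(prevalence, sens1, spec1, sens2, spec2):
    true_pos = prevalence * sens1 * sens2
    false_pos = (1 - prevalence) * (1 - spec1) * (1 - spec2)
    return true_pos / (true_pos + false_pos)

# Two RDTs, each with 99% sensitivity and 98% specificity:
print(serial_ppv(0.10, 0.99, 0.98, 0.99, 0.98))   # high-prevalence setting
print(serial_ppv(0.01, 0.99, 0.98, 0.99, 0.98))   # low-prevalence setting, lower PPV
```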
Jorge-Botana, Guillermo; Olmos, Ricardo; León, José Antonio
2009-11-01
There is currently widespread interest in indexing and extracting taxonomic information from large text collections. An example is the automatic categorization of informally written medical or psychological diagnoses, followed by the extraction of epidemiological information or even terms and structures needed to formulate guiding questions as a heuristic tool for helping doctors. Vector space models have been successfully used to this end (Lee, Cimino, Zhu, Sable, Shanker, Ely & Yu, 2006; Pakhomov, Buntrock & Chute, 2006). In this study we use a computational model known as Latent Semantic Analysis (LSA) on a diagnostic corpus with the aim of retrieving definitions (in the form of lists of semantic neighbors) of common structures it contains (e.g. "storm phobia", "dog phobia") or less common structures that might be formed by logical combinations of categories and diagnostic symptoms (e.g. "gun personality" or "germ personality"). In the quest to bring definitions into line with the meaning of structures and make them in some way representative, various problems commonly arise while recovering content using vector space models. We propose some approaches which bypass these problems, such as Kintsch's (2001) predication algorithm and some corrections to the way lists of neighbors are obtained, which have already been tested on semantic spaces in a non-specific domain (Jorge-Botana, León, Olmos & Hassan-Montero, under review). The results support the idea that the predication algorithm may also be useful for extracting more precise meanings of certain structures from scientific corpora, and that the introduction of some corrections based on vector length may increase its efficiency on non-representative terms.
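A basic LSA pipeline of the kind described above can be sketched with standard tooling: build a term-document matrix, reduce it with a truncated SVD, and list semantic neighbors of a query expression by cosine similarity. The toy corpus and parameters are placeholders, and this sketch does not include the predication algorithm or vector-length corrections discussed in the abstract.

```python
# Hedged sketch of an LSA "semantic neighbors" query on a toy diagnostic corpus.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = ["patient reports intense fear of storms and avoids going outside",
          "fear of dogs after a bite, with avoidance of parks",
          "obsessive checking of locks and contamination worries",
          "panic attacks with palpitations and fear of dying"]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(corpus)
svd = TruncatedSVD(n_components=3, random_state=0)
X_lsa = svd.fit_transform(X)

query = svd.transform(vec.transform(["fear of storms"]))
sims = cosine_similarity(query, X_lsa).ravel()
print(np.argsort(sims)[::-1])        # document indices ordered as 'semantic neighbors'
```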
Model-Based Diagnostics for Propellant Loading Systems
NASA Technical Reports Server (NTRS)
Daigle, Matthew John; Foygel, Michael; Smelyanskiy, Vadim N.
2011-01-01
The loading of spacecraft propellants is a complex, risky operation. Therefore, diagnostic solutions are necessary to quickly identify when a fault occurs, so that recovery actions can be taken or an abort procedure can be initiated. Model-based diagnosis solutions, established using an in-depth analysis and understanding of the underlying physical processes, offer the advanced capability to quickly detect and isolate faults, identify their severity, and predict their effects on system performance. We develop a physics-based model of a cryogenic propellant loading system, which describes the complex dynamics of liquid hydrogen filling from a storage tank to an external vehicle tank, as well as the influence of different faults on this process. The model takes into account the main physical processes such as highly nonequilibrium condensation and evaporation of the hydrogen vapor, pressurization, and also the dynamics of liquid hydrogen and vapor flows inside the system in the presence of helium gas. Since the model incorporates multiple faults in the system, it provides a suitable framework for model-based diagnostics and prognostics algorithms. Using this model, we analyze the effects of faults on the system, derive symbolic fault signatures for the purposes of fault isolation, and perform fault identification using a particle filter approach. We demonstrate the detection, isolation, and identification of a number of faults using simulation-based experiments.
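For readers unfamiliar with the particle-filter step used for fault identification, the sketch below is a generic bootstrap particle filter over a single hypothetical fault-severity parameter with an assumed random-walk process model and Gaussian measurement noise; it does not reproduce the paper's physics-based model of the cryogenic loading system.

```python
import numpy as np

# Minimal bootstrap particle filter sketch (illustrative only).
rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500,
                    process_std=0.05, obs_std=0.1):
    particles = rng.normal(0.0, 1.0, n_particles)        # hypothetical fault severity
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z in observations:
        # Predict: random-walk process model (assumption).
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Update: Gaussian measurement likelihood (assumption).
        weights *= np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # Resample (multinomial) and reset weights.
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
        weights = np.full(n_particles, 1.0 / n_particles)
        estimates.append(particles.mean())
    return np.array(estimates)

print(particle_filter([0.1, 0.3, 0.5, 0.7]))   # estimated fault-severity trace
```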
Computer Aided Diagnostic Support System for Skin Cancer: A Review of Techniques and Algorithms
Masood, Ammara; Al-Jumaily, Adel Ali
2013-01-01
Image-based computer aided diagnosis systems have significant potential for screening and early detection of malignant melanoma. We review the state of the art in these systems and examine current practices, problems, and prospects of image acquisition, pre-processing, segmentation, feature extraction and selection, and classification of dermoscopic images. This paper reports statistics and results from the most important implementations reported to date. We compared the performance of several classifiers specifically developed for skin lesion diagnosis and discussed the corresponding findings. Whenever available, indication of various conditions that affect the technique's performance is reported. We suggest a framework for comparative assessment of skin cancer diagnostic models and review the results based on these models. The deficiencies in some of the existing studies are highlighted and suggestions for future research are provided. PMID:24575126
A One-Versus-All Class Binarization Strategy for Bearing Diagnostics of Concurrent Defects
Ng, Selina S. Y.; Tse, Peter W.; Tsui, Kwok L.
2014-01-01
In bearing diagnostics using a data-driven modeling approach, a concern is the need for data from all possible scenarios to build a practical model for all operating conditions. This paper is a study on bearing diagnostics with the concurrent occurrence of multiple defect types. The authors are not aware of any work in the literature that studies this practical problem. A strategy based on one-versus-all (OVA) class binarization is proposed to improve fault diagnostics accuracy while reducing the number of scenarios for data collection, by predicting concurrent defects from training data of normal and single defects. The proposed OVA diagnostic approach is evaluated with empirical analysis using support vector machine (SVM) and C4.5 decision tree, two popular classification algorithms frequently applied to system health diagnostics and prognostics. Statistical features are extracted from the time domain and the frequency domain. Prediction performance of the proposed strategy is compared with that of a simple multi-class classification, as well as that of random guess and worst-case classification. We have verified the potential of the proposed OVA diagnostic strategy in performance improvements for single-defect diagnosis and predictions of BPFO plus BPFI concurrent defects using two laboratory-collected vibration data sets. PMID:24419162
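A sketch of the one-versus-all strategy with an SVM base learner on synthetic features; the feature values, class labels, and the thresholding rule used to flag concurrent defects are all assumptions for illustration, not the authors' data or decision rule.

```python
# Sketch of one-versus-all (OVA) binarization with an SVM base learner.
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))            # stand-in statistical features
y = rng.integers(0, 3, size=300)         # 0 = normal, 1 = defect A, 2 = defect B (toy)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ova = OneVsRestClassifier(SVC(kernel="rbf", probability=True)).fit(X_tr, y_tr)
print(ova.score(X_te, y_te))

# Concurrent-defect scoring (assumption): flag a sample as having both defects
# when both one-vs-all scores exceed a chosen threshold.
scores = ova.predict_proba(X_te)
concurrent = (scores[:, 1] > 0.4) & (scores[:, 2] > 0.4)
print(concurrent.sum(), "samples flagged as possible concurrent defects")
```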
A one-versus-all class binarization strategy for bearing diagnostics of concurrent defects.
Ng, Selina S Y; Tse, Peter W; Tsui, Kwok L
2014-01-13
In bearing diagnostics using a data-driven modeling approach, a concern is the need for data from all possible scenarios to build a practical model for all operating conditions. This paper is a study on bearing diagnostics with the concurrent occurrence of multiple defect types. The authors are not aware of any work in the literature that studies this practical problem. A strategy based on one-versus-all (OVA) class binarization is proposed to improve fault diagnostics accuracy while reducing the number of scenarios for data collection, by predicting concurrent defects from training data of normal and single defects. The proposed OVA diagnostic approach is evaluated with empirical analysis using support vector machine (SVM) and C4.5 decision tree, two popular classification algorithms frequently applied to system health diagnostics and prognostics. Statistical features are extracted from the time domain and the frequency domain. Prediction performance of the proposed strategy is compared with that of a simple multi-class classification, as well as that of random guess and worst-case classification. We have verified the potential of the proposed OVA diagnostic strategy in performance improvements for single-defect diagnosis and predictions of BPFO plus BPFI concurrent defects using two laboratory-collected vibration data sets.
Cost-effectiveness of WHO-Recommended Algorithms for TB Case Finding at Ethiopian HIV Clinics.
Adelman, Max W; McFarland, Deborah A; Tsegaye, Mulugeta; Aseffa, Abraham; Kempker, Russell R; Blumberg, Henry M
2018-01-01
The World Health Organization (WHO) recommends active tuberculosis (TB) case finding and a rapid molecular diagnostic test (Xpert MTB/RIF) to detect TB among people living with HIV (PLHIV) in high-burden settings. Information on the cost-effectiveness of these recommended strategies is crucial for their implementation. We conducted a model-based cost-effectiveness analysis comparing 2 algorithms for TB screening and diagnosis at Ethiopian HIV clinics: (1) WHO-recommended symptom screen combined with Xpert for PLHIV with a positive symptom screen and (2) current recommended practice algorithm (CRPA; based on symptom screening, smear microscopy, and clinical TB diagnosis). Our primary outcome was US$ per disability-adjusted life-year (DALY) averted. Secondary outcomes were additional true-positive diagnoses, and false-negative and false-positive diagnoses averted. Compared with CRPA, combining a WHO-recommended symptom screen with Xpert was highly cost-effective (incremental cost of $5 per DALY averted). Among a cohort of 15 000 PLHIV with a TB prevalence of 6% (900 TB cases), this algorithm detected 8 more true-positive cases than CRPA, and averted 2045 false-positive and 8 false-negative diagnoses compared with CRPA. The WHO-recommended algorithm was marginally costlier ($240 000) than CRPA ($239 000). In sensitivity analysis, the symptom screen/Xpert algorithm was dominated at low Xpert sensitivity (66%). In this model-based analysis, combining a WHO-recommended symptom screen with Xpert for TB diagnosis among PLHIV was highly cost-effective ($5 per DALY averted) and more sensitive than CRPA in a high-burden, resource-limited setting.
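The headline figure is an incremental cost-effectiveness ratio. A minimal sketch of that calculation is below; the 200 DALYs averted is back-calculated from the rounded costs and the reported $5 per DALY, so it is an assumption rather than a figure reported in the abstract.

```python
# Sketch of the incremental cost-effectiveness ratio (ICER) calculation.
def icer(cost_new, cost_old, dalys_averted):
    return (cost_new - cost_old) / dalys_averted

# $240,000 vs $239,000; ~200 DALYs averted is back-calculated (assumption).
print(icer(240_000, 239_000, 200))   # -> 5.0 USD per DALY averted
```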
A Comparative Analysis of the ADOS-G and ADOS-2 Algorithms: Preliminary Findings.
Dorlack, Taylor P; Myers, Orrin B; Kodituwakku, Piyadasa W
2018-06-01
The Autism Diagnostic Observation Schedule (ADOS) is a widely utilized observational assessment tool for diagnosis of autism spectrum disorders. The original ADOS was succeeded by the ADOS-G with noted improvements. More recently, the ADOS-2 was introduced to further increase its diagnostic accuracy. Studies examining the validity of the ADOS have produced mixed findings, and pooled relationship trends between the algorithm versions are yet to be analyzed. The current review seeks to compare the relative merits of the ADOS-G and ADOS-2 algorithms, Modules 1-3. Eight studies met inclusion criteria for the review, and six were selected for paired comparisons of the sensitivity and specificity of the ADOS. Results indicate several contradictory findings, underscoring the importance of further study.
Automatic analysis and classification of surface electromyography.
Abou-Chadi, F E; Nashar, A; Saad, M
2001-01-01
In this paper, parametric modeling algorithms for surface electromyography (EMG), which facilitate automatic SEMG feature extraction, are combined with artificial neural networks (ANNs) to provide an integrated system for the automatic analysis and diagnosis of myopathic disorders. Three ANN paradigms were investigated: the multilayer backpropagation algorithm, the self-organizing feature map algorithm and a probabilistic neural network model. The performance of the three classifiers was compared with that of a conventional Fisher linear discriminant (FLD) classifier. The results show that the three ANN models give higher performance, with the percentage of correct classification reaching 90%. Poorer diagnostic performance was obtained from the FLD classifier. The system presented here indicates that surface EMG, when properly processed, can be used to provide the physician with a diagnostic assist device.
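As a sketch of the comparison described, the snippet below cross-validates a small neural network against a Fisher linear discriminant on synthetic stand-in features; the feature dimensions, labels, and network size are assumptions, not the paper's data or architecture.

```python
# Sketch: neural-network classifier vs. Fisher linear discriminant on toy features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))        # stand-in parametric-model coefficients
y = rng.integers(0, 2, size=200)     # normal vs. myopathic (toy labels)

mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
lda = LinearDiscriminantAnalysis()
print("ANN:", cross_val_score(mlp, X, y, cv=5).mean())
print("FLD:", cross_val_score(lda, X, y, cv=5).mean())
```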
[Diagnosis of diaphragmatic hernia].
Alecu, L
2002-01-01
Diaphragmatic hernias (congenital and traumatic) belong to thoracoabdominal surgery, a borderline field between the two specialties. In terms of frequency, they rank second in diaphragmatic pathology, after hiatal hernias. Based on the published literature, the author presents the criteria for clinical examination and the imaging investigations used to identify diaphragmatic hernias, excluding oesophageal hiatus hernias. Some particular features of the diagnostic algorithm are also discussed.
Chest CT window settings with multiscale adaptive histogram equalization: pilot study.
Fayad, Laura M; Jin, Yinpeng; Laine, Andrew F; Berkmen, Yahya M; Pearson, Gregory D; Freedman, Benjamin; Van Heertum, Ronald
2002-06-01
Multiscale adaptive histogram equalization (MAHE), a wavelet-based algorithm, was investigated as a method of automatic simultaneous display of the full dynamic contrast range of a computed tomographic image. Interpretation times were significantly lower for MAHE-enhanced images compared with those for conventionally displayed images. Diagnostic accuracy, however, was insufficient in this pilot study to allow recommendation of MAHE as a replacement for conventional window display.
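For orientation, the snippet below applies contrast-limited adaptive histogram equalization (CLAHE) from scikit-image to a stand-in image; this is a readily available relative of adaptive histogram equalization, not the wavelet-based MAHE algorithm evaluated in the study.

```python
# Hedged sketch: CLAHE as an off-the-shelf adaptive histogram equalization.
import numpy as np
from skimage import exposure

ct_slice = np.random.rand(512, 512)             # stand-in for a CT image in [0, 1]
enhanced = exposure.equalize_adapthist(ct_slice, clip_limit=0.02)
print(enhanced.min(), enhanced.max())           # output rescaled to [0, 1]
```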
Pediatric Benign Soft Tissue Oral and Maxillofacial Pathology.
Glickman, Alexandra; Karlis, Vasiliki
2016-02-01
Despite the many types of oral pathologic lesions found in infants and children, the most commonly encountered are benign soft tissue lesions. The clinical features, diagnostic criteria, and treatment algorithms of pathologies in the age group from birth to 18 years of age are summarized based on their prevalence in each given age distribution. Treatment modalities include both medical and surgical management. Copyright © 2016 Elsevier Inc. All rights reserved.
Agent-based station for on-line diagnostics by self-adaptive laser Doppler vibrometry
NASA Astrophysics Data System (ADS)
Serafini, S.; Paone, N.; Castellini, P.
2013-12-01
A self-adaptive diagnostic system based on laser vibrometry is proposed for quality control of mechanical defects by vibration testing; it is developed for appliances at the end of an assembly line, but its characteristics are generally suited for testing most types of electromechanical products. It consists of a laser Doppler vibrometer, equipped with scanning mirrors and a camera, which implements self-adaptive behaviour for optimizing the measurement. The system is conceived as a Quality Control Agent (QCA) and is part of a Multi Agent System that supervises the whole production line. The QCA behaviour is defined so as to minimize measurement uncertainty during the on-line tests and to compensate for target mis-positioning under guidance of a vision system. Best measurement conditions are reached by maximizing the amplitude of the optical Doppler beat signal (signal quality) and consequently minimizing uncertainty. In this paper, the optimization strategy for measurement enhancement achieved by the downhill simplex (Nelder-Mead) algorithm and its effect on signal quality improvement are discussed. Tests on a washing machine in controlled operating conditions allow the efficacy of the method to be evaluated; a significant reduction of noise in the vibration velocity spectra is observed. Results from on-line tests are presented, which demonstrate the potential of the system for industrial quality control.
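A minimal sketch of signal-quality maximization with the Nelder-Mead simplex via SciPy; the objective below is a toy stand-in for the Doppler beat signal level, and the optimum location is invented for illustration.

```python
# Sketch: maximize a hypothetical signal-quality surface with Nelder-Mead.
import numpy as np
from scipy.optimize import minimize

def negative_signal_quality(xy):
    # Toy smooth quality surface peaking at mirror angles (1.0, -0.5) (assumption).
    x, y = xy
    return -np.exp(-((x - 1.0) ** 2 + (y + 0.5) ** 2))

res = minimize(negative_signal_quality, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x)   # mirror angles (toy units) that maximize signal quality
```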
A novel diagnostic tool reveals mitochondrial pathology in human diseases and aging.
Scheibye-Knudsen, Morten; Scheibye-Alsing, Karsten; Canugovi, Chandrika; Croteau, Deborah L; Bohr, Vilhelm A
2013-03-01
The inherent complex and pleiotropic phenotype of mitochondrial diseases poses a significant diagnostic challenge for clinicians as well as an analytical barrier for scientists. To overcome these obstacles we compiled a novel database, www.mitodb.com, containing the clinical features of primary mitochondrial diseases. Based on this we developed a number of qualitative and quantitative measures, enabling us to determine whether a disorder can be characterized as mitochondrial. These included a clustering algorithm, a disease network, a mitochondrial barcode and two scoring algorithms. Using these tools we detected mitochondrial involvement in a number of diseases not previously recorded as mitochondrial. As a proof of principle, Cockayne syndrome, ataxia with oculomotor apraxia 1 (AOA1), spinocerebellar ataxia with axonal neuropathy 1 (SCAN1) and ataxia-telangiectasia have recently been shown to involve mitochondrial dysfunction, and these diseases showed strong association with the mitochondrial disorders. We next evaluated mitochondrial involvement in aging and detected two distinct categories of accelerated aging disorders, one of them associated with mitochondrial dysfunction. Normal aging appeared to associate more strongly with the mitochondrial diseases than with the non-mitochondrial ones, partially supporting a mitochondrial theory of aging.
[Iron-deficiency anaemia in everyday gynaecological practice].
Lukanova, M; Popov, I
2004-01-01
Iron-deficiency anaemia (IDA) is of utmost significance in clinical practice. Chronic haemorrhage from the genital tract is the major aetiological factor, accounting for 60-70% of patients. Abnormal genital bleeding for the obstetrician-gynaecologist and IDA for the haematologist are problems frequently encountered in everyday practice; they require detailed examination and good collaboration and synchronization between the two specialists. The aim was to diagnose and provide aetiological treatment of IDA of gynaecological origin through timely and adequate co-operation between gynaecologist and haematologist. A clinical survey was carried out based on the algorithm developed. Its routine application began in July-August 2001, and up to 30.04.2003, 253 cases of IDA were enrolled in the Department of Gynaecology. A record was kept for every patient, which facilitated the further diagnostic and therapeutic activity and the processing of the corresponding data. The data and results obtained confirm that the algorithm enables final diagnostic specification of IDA and serves as a stepping-stone to its aetiological treatment and to complete and durable correction of iron deficiency.
Open Energy Information System version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
OpenEIS was created to provide standard methods for authoring, sharing, testing, using, and improving algorithms for operational building energy efficiency with building managers and building owners. OpenEIS is designed as a no-cost/low-cost solution that will propagate the fault detection and diagnostic (FDD) solutions into the marketplace by providing state-of-the-art analytical and diagnostic algorithms. As OpenEIS penetrates the market, demand by control system manufacturers and integrators serving small and medium commercial customers will help push these types of commercial software tool offerings into the broader marketplace.
Circulating Tumor Cells: What Is in It for the Patient? A Vision towards the Future
van de Stolpe, Anja; den Toonder, Jaap M. J.
2014-01-01
Knowledge on cellular signal transduction pathways as drivers of cancer growth and metastasis has fuelled the development of “targeted therapy”, which “targets” aberrant oncogenic signal transduction pathways. These drugs nearly invariably require companion diagnostic tests to identify the tumor-driving pathway and the cause of the abnormal pathway activity in a tumor sample, both for therapy response prediction as well as for monitoring of therapy response and emerging secondary drug resistance. Obtaining sufficient tumor material for this analysis in the metastatic setting is a challenge, and circulating tumor cells (CTCs) may provide an attractive alternative to biopsy on the premise that they can be captured from blood and the companion diagnostic test results are correctly interpreted. We discuss novel companion diagnostic directions, including the challenges, to identify the tumor-driving pathway in CTCs, which in combination with a digital pathology platform and algorithms to quantitatively interpret complex CTC diagnostic results may enable optimized therapy response prediction and monitoring. In contrast to CTC-based companion diagnostics, CTC enumeration is envisioned to be largely replaced by cell-free tumor DNA measurements in blood for therapy response and recurrence monitoring. The recent emergence of novel in vitro human model systems in the form of cancer-on-a-chip may enable elucidation of some of the so far elusive characteristics of CTCs, and is expected to contribute to more efficient CTC capture and CTC-based diagnostics. PMID:24879438
On-Chip Imaging of Schistosoma haematobium Eggs in Urine for Diagnosis by Computer Vision
Linder, Ewert; Grote, Anne; Varjo, Sami; Linder, Nina; Lebbad, Marianne; Lundin, Mikael; Diwan, Vinod; Hannuksela, Jari; Lundin, Johan
2013-01-01
Background: Microscopy, being relatively easy to perform at low cost, is the universal diagnostic method for detection of most globally important parasitic infections. As quality control is hard to maintain, misdiagnosis is common, which affects both estimates of parasite burdens and patient care. Novel techniques for high-resolution imaging and image transfer over data networks may offer solutions to these problems through provision of education, quality assurance and diagnostics. Imaging can be done directly on image sensor chips, a technique possible to exploit commercially for the development of inexpensive “mini-microscopes”. Images can be transferred for analysis both visually and by computer vision, both at point-of-care and at remote locations. Methods/Principal Findings: Here we describe imaging of helminth eggs using mini-microscopes constructed from webcams and mobile phone cameras. The results show that an inexpensive webcam, stripped of its optics to allow direct application of the test sample on the exposed surface of the sensor, yields images of Schistosoma haematobium eggs, which can be identified visually. Using a highly specific image pattern recognition algorithm, 4 out of 5 eggs observed visually could be identified. Conclusions/Significance: As proof of concept we show that an inexpensive imaging device, such as a webcam, may be easily modified into a microscope for the detection of helminth eggs based on on-chip imaging. Furthermore, algorithms for helminth egg detection by machine vision can be generated for automated diagnostics. The results can be exploited for constructing simple imaging devices for low-cost diagnostics of urogenital schistosomiasis and other neglected tropical infectious diseases. PMID:24340107
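As a generic illustration of the kind of machine-vision detection step involved (not the authors' pattern-recognition algorithm), the sketch below detects a synthetic bright, egg-sized object with a Laplacian-of-Gaussian blob detector from scikit-image.

```python
# Generic blob-detection sketch on a synthetic image (illustration only).
import numpy as np
from skimage.feature import blob_log
from skimage.draw import disk

image = np.zeros((200, 200))
rr, cc = disk((100, 100), 8)     # synthetic egg-sized bright object
image[rr, cc] = 1.0

blobs = blob_log(image, min_sigma=3, max_sigma=10, threshold=0.1)
print(blobs)                      # one row (row, col, sigma) per detection
```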
Aguilar, Suzette M.; Shea, Jacob D.; Al-Joumayly, Mudar A.; Van Veen, Barry D.; Behdad, Nader; Hagness, Susan C.
2011-01-01
We propose the use of a polycaprolactone (PCL)-based thermoplastic mesh as a tissue-immobilization interface for microwave imaging and microwave hyperthermia treatment. An investigation of the dielectric properties of two PCL-based thermoplastic materials in the frequency range of 0.5 – 3.5 GHz is presented. The frequency-dependent dielectric constant and effective conductivity of the PCL-based thermoplastics are characterized using measurements of microstrip transmission lines fabricated on substrates comprised of the thermoplastic meshes. We also examine the impact of the presence of a PCL-based thermoplastic mesh on microwave breast imaging. We use a numerical test bed comprised of a previously reported three-dimensional anatomically realistic breast phantom and a multi-frequency microwave inverse scattering algorithm. We demonstrate that the PCL-based thermoplastic material and the assumed biocompatible medium of vegetable oil are sufficiently well matched such that the PCL layer may be neglected by the imaging solution without sacrificing imaging quality. Our results suggest that PCL-based thermoplastics are promising materials as tissue immobilization structures for microwave diagnostic and therapeutic applications. PMID:21622068
Strategies to improve the efficiency of celiac disease diagnosis in the laboratory.
González, Delia Almeida; de Armas, Laura García; Rodríguez, Itahisa Marcelino; Almeida, Ana Arencibia; García, Miriam García; Gannar, Fadoua; de León, Antonio Cabrera
2017-10-01
The demand for testing to detect celiac disease (CD) autoantibodies has increased, together with the cost per case diagnosed, resulting in the adoption of measures to restrict laboratory testing. We designed this study to determine whether opportunistic screening to detect CD-associated autoantibodies had advantages compared to efforts to restrict testing, and to identify the most cost-effective diagnostic strategy. We compared a group of 1678 patients in which autoantibody testing was restricted to cases in which the test referral was considered appropriate (G1) to a group of 2140 patients in which test referrals were not reviewed or restricted (G2). Two algorithms A (quantifying IgA and Tissue transglutaminase IgA [TG-IgA] in all patients), and B (quantifying only TG-IgA in all patients) were used in each group, and the cost-effectiveness of each strategy was calculated. TG-IgA autoantibodies were positive in 62 G1 patients and 69 G2 patients. Among those positive for tissue transglutaminase IgA and endomysial IgA autoantibodies, the proportion of patients with de novo autoantibodies was lower (p=0.028) in G1 (11/62) than in G2 (24/69). Algorithm B required fewer determinations than algorithm A in both G1 (2310 vs 3493; p<0.001) and G2 (2196 vs 4435; p<0.001). With algorithm B the proportion of patients in whom IgA was tested was lower (p<0.001) in G2 (29/2140) than in G1 (617/1678). The lowest cost per case diagnosed (4.63 euros/patient) was found with algorithm B in G2. We conclude that opportunistic screening has advantages compared to efforts in the laboratory to restrict CD diagnostic testing. The most cost-effective strategy was based on the use of an appropriate algorithm. Copyright © 2017. Published by Elsevier B.V.
A Testbed for Data Fusion for Helicopter Diagnostics and Prognostics
2003-03-01
…algorithm design and tuning in order to develop advanced diagnostic and prognostic techniques for aircraft health monitoring… development of models for diagnostics, prognostics, and anomaly detection… [Figure 5: VMEP Server Browser Interface] …detections, and prognostic prediction time horizons. The VMEP system, and in particular the web component, are ideal for performing data collection…
ERIC Educational Resources Information Center
Wiggins, Lisa D.; Reynolds, Ann; Rice, Catherine E.; Moody, Eric J.; Bernal, Pilar; Blaskey, Lisa; Rosenberg, Steven A.; Lee, Li-Ching; Levy, Susan E.
2015-01-01
The Study to Explore Early Development (SEED) is a multi-site case-control study designed to explore the relationship between autism spectrum disorder (ASD) phenotypes and etiologies. The goals of this paper are to (1) describe the SEED algorithm that uses the Autism Diagnostic Interview-Revised (ADI-R) and Autism Diagnostic Observation Schedule…
ERIC Educational Resources Information Center
Kim, So Hyun; Thurm, Audrey; Shumway, Stacy; Lord, Catherine
2013-01-01
Using two independent datasets provided by National Institute of Health funded consortia, the Collaborative Programs for Excellence in Autism and Studies to Advance Autism Research and Treatment (n = 641) and the National Institute of Mental Health (n = 167), diagnostic validity and factor structure of the new Autism Diagnostic Interview (ADI-R)…
NASA Astrophysics Data System (ADS)
Nam, Kyoung Won; Kim, In Young; Kang, Ho Chul; Yang, Hee Kyung; Yoon, Chang Ki; Hwang, Jeong Min; Kim, Young Jae; Kim, Tae Yun; Kim, Kwang Gi
2012-10-01
Accurate measurement of binocular misalignment between both eyes is important for proper preoperative management, surgical planning, and postoperative evaluation of patients with strabismus. In this study, we proposed a new computerized diagnostic algorithm that can calculate the angle of binocular eye misalignment photographically by using a dedicated three-dimensional eye model mimicking the structure of the natural human eye. To evaluate the performance of the proposed algorithm, eight healthy volunteers and eight individuals with strabismus were recruited; the horizontal deviation angle, vertical deviation angle, and angle of eye misalignment were calculated, and the angular differences between the healthy and strabismus groups were evaluated using the nonparametric Mann-Whitney test and the Pearson correlation test. The experimental results demonstrated a statistically significant difference between the healthy and strabismus groups (p = 0.015 < 0.05), but no statistically significant difference between the proposed method and the Krimsky test (p = 0.912 > 0.05). The measurements of the two methods were highly correlated (r = 0.969, p < 0.05). From the experimental results, we believe that the proposed method has the potential to be a diagnostic tool that measures the physical disorder of the human eye to diagnose the severity of strabismus non-invasively.
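A sketch of the two statistical comparisons described, using SciPy on toy angle data (all values below are invented for illustration).

```python
# Sketch: group comparison (Mann-Whitney U) and method agreement (Pearson r).
from scipy.stats import mannwhitneyu, pearsonr

healthy_angles    = [0.5, 1.0, 0.8, 1.2, 0.6, 0.9, 1.1, 0.7]     # degrees (toy)
strabismus_angles = [8.0, 12.5, 15.0, 9.5, 20.0, 11.0, 14.0, 18.0]

u_stat, p_group = mannwhitneyu(healthy_angles, strabismus_angles)
print("group difference p =", p_group)

proposed = [8.2, 12.0, 15.5, 9.0, 19.5, 11.4, 13.8, 18.3]         # proposed method (toy)
krimsky  = strabismus_angles                                      # reference test (toy)
r, p_corr = pearsonr(proposed, krimsky)
print("correlation r =", r, "p =", p_corr)
```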
Autism in the Faroe Islands: Diagnostic Stability from Childhood to Early Adult Life
Kočovská, Eva; Billstedt, Eva; Ellefsen, Asa; Kampmann, Hanna; Gillberg, I. Carina; Biskupstø, Rannvá; Andorsdóttir, Guðrið; Stóra, Tormóður; Minnis, Helen; Gillberg, Christopher
2013-01-01
Childhood autism or autism spectrum disorder (ASD) has been regarded as one of the most stable diagnostic categories applied to young children with psychiatric/developmental disorders. The stability over time of a diagnosis of ASD is theoretically interesting and important for various diagnostic and clinical reasons. We studied the diagnostic stability of ASD from childhood to early adulthood in the Faroe Islands: a total school age population sample (8–17-year-olds) was screened and diagnostically assessed for AD in 2002 and 2009. This paper compares both independent clinical diagnosis and Diagnostic Interview for Social and Communication Disorders (DISCO) algorithm diagnosis at two time points, separated by seven years. The stability of clinical ASD diagnosis was perfect for AD, good for “atypical autism”/PDD-NOS, and less than perfect for Asperger syndrome (AS). Stability of the DISCO algorithm subcategory diagnoses was more variable but still good for AD. Both systems showed excellent stability over the seven-year period for “any ASD” diagnosis, although a number of clear cases had been missed at the original screening in 2002. The findings support the notion that subcategories of ASD should be collapsed into one overarching diagnostic entity with subgrouping achieved on other “non-autism” variables, such as IQ and language levels and overall adaptive functioning. PMID:23476144
Efficient fault diagnosis of helicopter gearboxes
NASA Technical Reports Server (NTRS)
Chin, H.; Danai, K.; Lewicki, D. G.
1993-01-01
Application of a diagnostic system to a helicopter gearbox is presented. The diagnostic system is a nonparametric pattern classifier that uses a multi-valued influence matrix (MVIM) as its diagnostic model and benefits from a fast learning algorithm that enables it to estimate its diagnostic model from a small number of measurement-fault data. To test this diagnostic system, vibration measurements were collected from a helicopter gearbox test stand during accelerated fatigue tests and at various fault instances. The diagnostic results indicate that the MVIM system can accurately detect and diagnose various gearbox faults so long as they are included in training.
NASA Astrophysics Data System (ADS)
Li, Shao-Xin; Zeng, Qiu-Yao; Li, Lin-Fang; Zhang, Yan-Jiao; Wan, Ming-Ming; Liu, Zhi-Ming; Xiong, Hong-Lian; Guo, Zhou-Yi; Liu, Song-Hao
2013-02-01
The ability of combining serum surface-enhanced Raman spectroscopy (SERS) with a support vector machine (SVM) to improve the classification of esophageal cancer patients versus normal volunteers is investigated. Two groups of serum SERS spectra based on silver nanoparticles (AgNPs) are obtained: one group from patients with pathologically confirmed esophageal cancer (n=30) and the other group from healthy volunteers (n=31). Principal components analysis (PCA), conventional SVM (C-SVM) and conventional SVM combined with PCA (PCA-SVM) methods are applied to the same spectral dataset. Results show that a diagnostic accuracy of 77.0% is acquired for the PCA technique, while diagnostic accuracies of 83.6% and 85.2% are obtained for the C-SVM and PCA-SVM methods based on radial basis function (RBF) models. The results prove that RBF SVM models are superior to the PCA algorithm in classifying serum SERS spectra. The study demonstrates that serum SERS in combination with the SVM technique has great potential to provide an effective and accurate diagnostic scheme for noninvasive detection of esophageal cancer.
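A sketch of the PCA-SVM (RBF) scheme as a scikit-learn pipeline on synthetic spectra; the spectra, number of retained components, and SVM hyperparameters are assumptions for illustration only.

```python
# Sketch: PCA followed by an RBF SVM, cross-validated on synthetic spectra.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
spectra = rng.normal(size=(61, 1024))        # stand-in SERS spectra (61 samples)
labels = np.array([0] * 31 + [1] * 30)       # 31 normal, 30 cancer (as in the group sizes)

pca_svm = make_pipeline(StandardScaler(), PCA(n_components=10),
                        SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(pca_svm, spectra, labels, cv=5).mean())
```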
Wei, Ting-Yen; Yen, Tzung-Hai; Cheng, Chao-Min
2018-01-01
Acute pesticide intoxication is a common method of suicide globally. This article reviews current diagnostic methods and makes suggestions for future development. In the case of paraquat intoxication, it is characterized by multi-organ failure, causing substantial mortality and morbidity. Early diagnosis may save the life of a paraquat intoxication patient. Conventional paraquat intoxication diagnostic methods, such as symptom review and urine sodium dithionite assay, are time-consuming and impractical in resource-scarce areas where most intoxication cases occur. Several experimental and clinical studies have shown the potential of portable Surface Enhanced Raman Scattering (SERS), paper-based devices, and machine learning for paraquat intoxication diagnosis. Portable SERS and new SERS substrates maintain the sensitivity of SERS while being less costly and more convenient than conventional SERS. Paper-based devices provide the advantages of price and portability. Machine learning algorithms can be implemented as a mobile phone application and facilitate diagnosis in resource-limited areas. Although these methods have not yet met all features of an ideal diagnostic method, the combination and development of these methods offer much promise.
Sun, Xiang; Allison, Carrie; Auyeung, Bonnie; Zhang, Zhixiang; Matthews, Fiona E; Baron-Cohen, Simon; Brayne, Carol
2015-11-01
Research to date in mainland China has mainly focused on children with autistic disorder rather than Autism Spectrum Conditions and the diagnosis largely depended on clinical judgment without the use of diagnostic instruments. Whether children who have been diagnosed in China before meet the diagnostic criteria of Autism Spectrum Conditions is not known nor how many such children would meet these criteria. The aim of this study was to evaluate children with a known diagnosis of autism in mainland China using the Autism Diagnostic Observation Schedule and the Autism Diagnostic Interview-Revised to verify that children who were given a diagnosis of autism made by Chinese clinicians in China were mostly children with severe autism. Of 50 children with an existing diagnosis of autism made by Chinese clinicians, 47 children met the diagnosis of autism on the Autism Diagnostic Observation Schedule algorithm and 44 children met the diagnosis of autism on the Autism Diagnostic Interview-Revised algorithm. Using the Gwet's alternative chance-corrected statistic, the agreement between the Chinese diagnosis and the Autism Diagnostic Observation Schedule diagnosis was very good (AC1 = 0.94, p < 0.005, 95% confidence interval (0.86, 1.00)), so was the agreement between the Chinese diagnosis and the Autism Diagnostic Interview-Revised (AC1 = 0.91, p < 0.005, 95% confidence interval (0.81, 1.00)). The agreement between the Autism Diagnostic Observation Schedule and the Autism Diagnostic Interview-Revised was lower but still very good (AC1 = 0.83, p < 0.005). © The Author(s) 2015.
Sun, Xiang; Allison, Carrie; Auyeung, Bonnie; Zhang, Zhixiang; Matthews, Fiona E; Baron-Cohen, Simon; Brayne, Carol
2016-01-01
Research to date in mainland China has mainly focused on children with autistic disorder rather than Autism Spectrum Conditions and the diagnosis largely depended on clinical judgment without the use of diagnostic instruments. Whether children who have been diagnosed in China before meet the diagnostic criteria of Autism Spectrum Conditions is not known nor how many such children would meet these criteria. The aim of this study was to evaluate children with a known diagnosis of autism in mainland China using the Autism Diagnostic Observation Schedule and the Autism Diagnostic Interview–Revised to verify that children who were given a diagnosis of autism made by Chinese clinicians in China were mostly children with severe autism. Of 50 children with an existing diagnosis of autism made by Chinese clinicians, 47 children met the diagnosis of autism on the Autism Diagnostic Observation Schedule algorithm and 44 children met the diagnosis of autism on the Autism Diagnostic Interview–Revised algorithm. Using the Gwet’s alternative chance-corrected statistic, the agreement between the Chinese diagnosis and the Autism Diagnostic Observation Schedule diagnosis was very good (AC1 = 0.94, p < 0.005, 95% confidence interval (0.86, 1.00)), so was the agreement between the Chinese diagnosis and the Autism Diagnostic Interview–Revised (AC1 = 0.91, p < 0.005, 95% confidence interval (0.81, 1.00)). The agreement between the Autism Diagnostic Observation Schedule and the Autism Diagnostic Interview–Revised was lower but still very good (AC1 = 0.83, p < 0.005). PMID:25757721
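For readers unfamiliar with Gwet's AC1, the sketch below implements the standard two-category formula for two raters; the toy data assume all 50 children carried a positive Chinese diagnosis and 47 met the ADOS algorithm, which roughly reproduces the reported AC1 of 0.94, but it is an illustration rather than the study's actual rating data.

```python
# Sketch: Gwet's AC1 agreement coefficient for two raters, binary categories.
def gwet_ac1(rater_a, rater_b):
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pi = (sum(rater_a) + sum(rater_b)) / (2 * n)   # mean proportion rated "positive"
    p_chance = 2 * pi * (1 - pi)
    return (p_observed - p_chance) / (1 - p_chance)

# Toy data: 1 = autism, 0 = not autism.
chinese_dx = [1] * 50
ados_dx    = [1] * 47 + [0] * 3
print(round(gwet_ac1(chinese_dx, ados_dx), 2))     # ~0.94
```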
NASA Technical Reports Server (NTRS)
Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla
1987-01-01
Traditional expert systems, such as diagnostic and training systems, interact with users only through a keyboard and screen, and are usually symbolic in nature. Expert systems that require access to data bases, complex simulations and real-time instrumentation have both symbolic as well as algorithmic computing needs. These needs could both be met using a general purpose workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by NASA Ames Research Center in conjunction with Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. This paper will explore the integration options, and present several possible solutions.
NASA Astrophysics Data System (ADS)
Chang, Bingguo; Chen, Xiaofei
2018-05-01
Ultrasonography is an important examination for the diagnosis of chronic liver disease. The physician reports liver indicators and assesses the patient's condition according to the description in the ultrasound report. With the rapid increase in the volume of ultrasound report data, the workload for professional physicians who must manually interpret ultrasound results increases significantly. In this paper, we use the spectral clustering method to perform cluster analysis of the ultrasound report descriptions and automatically generate the ultrasonic diagnosis by machine learning. 110 ultrasound examination reports of chronic liver disease were selected as test samples in this experiment, and the results obtained with spectral clustering were validated and compared with the k-means clustering algorithm. The results show that the accuracy of spectral clustering is 92.73%, which is higher than that of the k-means clustering algorithm, and which provides powerful ultrasound-assisted diagnosis for patients with chronic liver disease.
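A sketch of spectral clustering over report text using a precomputed cosine-similarity affinity matrix; the toy report sentences and the choice of two clusters are assumptions, not the study's corpus or settings.

```python
# Sketch: spectral clustering of ultrasound report descriptions (toy corpus).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.cluster import SpectralClustering

reports = [
    "liver parenchyma coarse, surface nodular, suggestive of cirrhosis",
    "mild fatty infiltration of the liver, normal outline",
    "coarse echotexture with nodular surface and splenomegaly",
    "diffusely increased echogenicity consistent with fatty liver",
    "nodular liver surface with ascites, cirrhotic appearance",
    "bright liver echo pattern, smooth contour, fatty change",
]
X = TfidfVectorizer().fit_transform(reports)
affinity = cosine_similarity(X)

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(affinity)
print(labels)   # cluster assignment per report description
```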
Chan, Wing Cheuk; Papaconstantinou, Dean; Lee, Mildred; Telfer, Kendra; Jo, Emmanuel; Drury, Paul L; Tobias, Martin
2018-05-01
To validate the New Zealand Ministry of Health (MoH) Virtual Diabetes Register (VDR) using longitudinal laboratory results and to develop an improved algorithm for estimating diabetes prevalence at a population level. The assigned diabetes status of individuals based on the 2014 version of the MoH VDR is compared to the diabetes status based on the laboratory results stored in the Auckland regional laboratory result repository (TestSafe) using the New Zealand diabetes diagnostic criteria. The existing VDR algorithm is refined by reviewing the sensitivity and positive predictive value of each of the VDR algorithm rules individually and as a combination. The diabetes prevalence estimate based on the original 2014 MoH VDR was 17% higher (n = 108,505) than the corresponding TestSafe prevalence estimate (n = 92,707). Compared to the diabetes prevalence based on TestSafe, the original VDR has a sensitivity of 89%, specificity of 96%, positive predictive value of 76% and negative predictive value of 98%. The modified VDR algorithm has improved the positive predictive value by 6.1% and the specificity by 1.4% with modest reductions in sensitivity of 2.2% and negative predictive value of 0.3%. At an aggregated level the overall diabetes prevalence estimated by the modified VDR is 5.7% higher than the corresponding estimate based on TestSafe. The Ministry of Health Virtual Diabetes Register algorithm has been refined to provide a more accurate diabetes prevalence estimate at a population level. The comparison highlights the potential value of a national population long term condition register constructed from both laboratory results and administrative data. Copyright © 2018 Elsevier B.V. All rights reserved.
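The validation metrics quoted above come from a simple 2x2 cross-tabulation of register flag against laboratory-confirmed status. A minimal sketch is below; the cell counts are invented purely to show the arithmetic and are not the study's table.

```python
# Sketch: sensitivity, specificity, PPV, NPV from a 2x2 validation table.
def validation_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only (the abstract reports rates, not the full table):
print(validation_metrics(tp=82_500, fp=26_000, fn=10_200, tn=700_000))
```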
Periprosthetic joint infections: a clinical practice algorithm.
Volpe, Luigi; Indelli, Pier Francesco; Latella, Leonardo; Poli, Paolo; Yakupoglu, Jale; Marcucci, Massimiliano
2014-01-01
Periprosthetic joint infection (PJI) accounts for 25% of failed total knee arthroplasties (TKAs) and 15% of failed total hip arthroplasties (THAs). The purpose of the present study was to design a multidisciplinary diagnostic algorithm to detect a PJI as the cause of a painful TKA or THA. From April 2010 to October 2012, 111 patients with suspected PJI were evaluated. The study group comprised 75 females and 36 males with an average age of 71 years (range, 48 to 94 years). Eighty-four patients had a painful THA, while 27 reported a painful TKA. The stepwise diagnostic algorithm, applied in all the patients, included: measurement of serum C-reactive protein (CRP) and erythrocyte sedimentation rate (ESR) levels; imaging studies, including standard radiological examination and standard technetium-99m-methylene diphosphonate (MDP) bone scan (if positive, confirmation by LeukoScan was obtained); and joint aspiration with analysis of synovial fluid. Following application of the stepwise diagnostic algorithm, 24 of our 111 screened patients were classified as having a suspected PJI (21.7%). CRP and ESR levels were negative in 84 and positive in 17 cases; 93.7% of the patients had a positive technetium-labeled bone scan, and 23% a positive LeukoScan. Preoperative synovial fluid analysis was positive in 13.5%; analysis of synovial fluid obtained by preoperative aspiration showed a leucocyte count of > 3000 cells/μl in 52% of the patients. The present study showed that the diagnosis of PJI requires the application of a multimodal diagnostic protocol in order to avoid complications related to surgical revision of a misdiagnosed "silent" PJI. Level IV, therapeutic case series.
Sanjuán, Pilar; Rodríguez-Núñez, Nuria; Rábade, Carlos; Lama, Adriana; Ferreiro, Lucía; González-Barcala, Francisco Javier; Alvarez-Dobaño, José Manuel; Toubes, María Elena; Golpe, Antonio; Valdés, Luis
2014-05-01
Clinical probability scores (CPS) determine the pre-test probability of pulmonary embolism (PE) and assess the need for the tests required in these patients. Our objective is to investigate whether PE is diagnosed according to clinical practice guidelines. Retrospective study of clinically suspected PE in the emergency department between January 2010 and December 2012. A D-dimer value ≥ 500 ng/ml was considered positive. PE was diagnosed on the basis of multislice computed tomography angiography and, to a lesser extent, other imaging techniques. The CPS used was the revised Geneva scoring system. There were 3,924 cases of suspected PE (56% female). The diagnosis was confirmed in 360 patients (9.2%) and the incidence was 30.6 cases per 100,000 inhabitants/year. The sensitivity and negative predictive value of the D-dimer test were 98.7% and 99.2%, respectively. CPS was calculated in only 24 cases (0.6%) and diagnostic algorithms were not followed in 2,125 patients (54.2%): in 682 (17.4%) because clinical probability could not be estimated, and in 482 (37.6%), 852 (46.4%) and 109 (87.9%) with low, intermediate and high clinical probability, respectively, because the diagnostic algorithms for these probabilities were not applied. CPS are rarely calculated in the diagnosis of PE and the diagnostic algorithm is rarely used in clinical practice. This may result in procedures with potentially significant side effects being performed unnecessarily, or in a high risk of underdiagnosis. Copyright © 2013 SEPAR. Published by Elsevier Espana. All rights reserved.
Medical Image Compression Based on Vector Quantization with Variable Block Sizes in Wavelet Domain
Jiang, Huiyan; Ma, Zhiyuan; Hu, Yang; Yang, Benqiang; Zhang, Libo
2012-01-01
An optimized medical image compression algorithm based on wavelet transform and improved vector quantization is introduced. The goal of the proposed method is to maintain the diagnostic-related information of the medical image at a high compression ratio. Wavelet transformation was first applied to the image. For the lowest-frequency subband of wavelet coefficients, a lossless compression method was exploited; for each of the high-frequency subbands, an optimized vector quantization with variable block size was implemented. In the novel vector quantization method, local fractal dimension (LFD) was used to analyze the local complexity of each wavelet coefficient subband. Then an optimal quadtree method was employed to partition each wavelet coefficient subband into several sizes of subblocks. After that, a modified K-means approach based on an energy function was used in the codebook training phase. At last, vector quantization coding was implemented in the different types of sub-blocks. In order to verify the effectiveness of the proposed algorithm, JPEG, JPEG2000, and a fractal coding approach were chosen as contrast algorithms. Experimental results show that the proposed method can improve the compression performance and can achieve a balance between the compression ratio and the image visual quality. PMID:23049544
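A minimal sketch of the codebook training and coding step of vector quantization for one subband, using standard K-means (the paper's LFD-guided block sizing and energy-based K-means modification are not reproduced); block size and codebook size below are assumptions.

```python
# Sketch: K-means vector-quantization codebook for one (stand-in) wavelet subband.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
subband = rng.normal(size=(128, 128))                 # stand-in wavelet subband

# Tile into 4x4 spatial blocks and flatten each block to a 16-vector.
blocks = (subband.reshape(32, 4, 32, 4)
                 .transpose(0, 2, 1, 3)
                 .reshape(-1, 16))

kmeans = KMeans(n_clusters=64, n_init=10, random_state=0).fit(blocks)
codebook = kmeans.cluster_centers_                    # 64 codewords
indices = kmeans.predict(blocks)                      # indices to store/transmit

# Reconstruct the subband from codewords and measure quantization distortion.
rec_blocks = codebook[indices].reshape(32, 32, 4, 4)
reconstructed = rec_blocks.transpose(0, 2, 1, 3).reshape(128, 128)
print(np.mean((subband - reconstructed) ** 2))
```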
Medical image compression based on vector quantization with variable block sizes in wavelet domain.
Jiang, Huiyan; Ma, Zhiyuan; Hu, Yang; Yang, Benqiang; Zhang, Libo
2012-01-01
An optimized medical image compression algorithm based on wavelet transform and improved vector quantization is introduced. The goal of the proposed method is to maintain the diagnostic-related information of the medical image at a high compression ratio. Wavelet transformation was first applied to the image. For the lowest-frequency subband of wavelet coefficients, a lossless compression method was exploited; for each of the high-frequency subbands, an optimized vector quantization with variable block size was implemented. In the novel vector quantization method, local fractal dimension (LFD) was used to analyze the local complexity of each wavelet coefficient subband. Then an optimal quadtree method was employed to partition each wavelet coefficient subband into several sizes of subblocks. After that, a modified K-means approach based on an energy function was used in the codebook training phase. At last, vector quantization coding was implemented in the different types of sub-blocks. In order to verify the effectiveness of the proposed algorithm, JPEG, JPEG2000, and a fractal coding approach were chosen as contrast algorithms. Experimental results show that the proposed method can improve the compression performance and can achieve a balance between the compression ratio and the image visual quality.
Improving Remote Health Monitoring: A Low-Complexity ECG Compression Approach
Al-Ali, Abdulla; Mohamed, Amr; Ward, Rabab
2018-01-01
Recent advances in mobile technology have created a shift towards using battery-driven devices in remote monitoring settings and smart homes. Clinicians are carrying out diagnostic and screening procedures based on the electrocardiogram (ECG) signals collected remotely for outpatients who need continuous monitoring. High-speed transmission and analysis of large recorded ECG signals are essential, especially with the increased use of battery-powered devices. Exploring low-power alternative compression methodologies that have high efficiency and that enable ECG signal collection, transmission, and analysis in a smart home or remote location is required. Compression algorithms based on adaptive linear predictors and decimation by a factor B/K are evaluated based on compression ratio (CR), percentage root-mean-square difference (PRD), and heartbeat detection accuracy of the reconstructed ECG signal. With two databases (153 subjects), the new algorithm demonstrates the highest compression performance (CR=6 and PRD=1.88) and overall detection accuracy (99.90% sensitivity, 99.56% positive predictivity) over both databases. The proposed algorithm presents an advantage for the real-time transmission of ECG signals using a faster and more efficient method, which meets the growing demand for more efficient remote health monitoring. PMID:29337892
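A sketch of the two reported evaluation metrics, compression ratio and PRD; note that several PRD variants exist (with or without mean subtraction), and the one below is a common form chosen for illustration rather than necessarily the authors' exact definition. The toy ECG-like signal is an assumption.

```python
# Sketch: compression ratio (CR) and percentage RMS difference (PRD).
import numpy as np

def compression_ratio(n_original_bits, n_compressed_bits):
    return n_original_bits / n_compressed_bits

def prd(original, reconstructed):
    original = np.asarray(original, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                           / np.sum(original ** 2))

ecg = np.sin(np.linspace(0, 20 * np.pi, 2000))                 # toy ECG-like signal
recon = ecg + np.random.default_rng(0).normal(0, 0.01, ecg.size)
print(compression_ratio(2000 * 11, 2000 * 11 / 6))              # CR = 6 (illustrative)
print(prd(ecg, recon))                                          # percent RMS difference
```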
Improving Remote Health Monitoring: A Low-Complexity ECG Compression Approach.
Elgendi, Mohamed; Al-Ali, Abdulla; Mohamed, Amr; Ward, Rabab
2018-01-16
Recent advances in mobile technology have created a shift towards using battery-driven devices in remote monitoring settings and smart homes. Clinicians are carrying out diagnostic and screening procedures based on the electrocardiogram (ECG) signals collected remotely for outpatients who need continuous monitoring. High-speed transmission and analysis of large recorded ECG signals are essential, especially with the increased use of battery-powered devices. Exploring low-power alternative compression methodologies that have high efficiency and that enable ECG signal collection, transmission, and analysis in a smart home or remote location is required. Compression algorithms based on adaptive linear predictors and decimation by a factor B/K are evaluated based on compression ratio (CR), percentage root-mean-square difference (PRD), and heartbeat detection accuracy of the reconstructed ECG signal. With two databases (153 subjects), the new algorithm demonstrates the highest compression performance (CR = 6 and PRD = 1.88) and overall detection accuracy (99.90% sensitivity, 99.56% positive predictivity) over both databases. The proposed algorithm presents an advantage for the real-time transmission of ECG signals using a faster and more efficient method, which meets the growing demand for more efficient remote health monitoring.
Microwave-based medical diagnosis using particle swarm optimization algorithm
NASA Astrophysics Data System (ADS)
Modiri, Arezoo
This dissertation proposes and investigates a novel architecture intended for microwave-based medical diagnosis (MBMD). Furthermore, this investigation proposes novel modifications of the particle swarm optimization algorithm for achieving enhanced convergence performance. MBMD has been investigated through a variety of innovative techniques in the literature since the 1990s and has shown significant promise in early detection of some specific health threats. In comparison to the X-ray- and gamma-ray-based diagnostic tools, MBMD does not expose patients to ionizing radiation; and due to the maturity of microwave technology, it lends itself to miniaturization of the supporting systems. This modality has been shown to be effective in detecting breast malignancy, and hence, this study focuses on the same modality. A novel radiator device and detection technique is proposed and investigated in this dissertation. As expected, hardware design and implementation are of paramount importance in such a study, and a good deal of research, analysis, and evaluation has been done in this regard, which will be reported in the ensuing chapters of this dissertation. It is noteworthy that an important element of any detection system is the algorithm used for extracting signatures. Herein, the strong intrinsic potential of swarm-intelligence-based algorithms in solving complicated electromagnetic problems is brought to bear. This task is accomplished through addressing both mathematical and electromagnetic problems. These problems are called benchmark problems throughout this dissertation, since they have known answers. After evaluating the performance of the algorithm for the chosen benchmark problems, the algorithm is applied to the MBMD tumor detection problem. The chosen benchmark problems have already been tackled by solution techniques other than the particle swarm optimization (PSO) algorithm, the results of which can be found in the literature. However, due to the relatively high level of complexity and randomness inherent in the selection of electromagnetic benchmark problems, a trend toward oversimplification in order to arrive at reasonable solutions has been taken in the literature when utilizing analytical techniques. Here, an attempt has been made to avoid oversimplification when using the proposed swarm-based optimization algorithms.
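For reference, the sketch below is a standard global-best PSO minimizing a simple benchmark function; it is not the dissertation's modified variant, and the swarm size, inertia, and acceleration coefficients are conventional default choices, not values taken from the work.

```python
# Minimal global-best particle swarm optimization sketch.
import numpy as np

def pso(objective, dim=2, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

sphere = lambda p: float(np.sum(p ** 2))               # simple benchmark with known optimum
print(pso(sphere))                                     # converges toward the origin
```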
Anatomical brain images alone can accurately diagnose chronic neuropsychiatric illnesses.
Bansal, Ravi; Staib, Lawrence H; Laine, Andrew F; Hao, Xuejun; Xu, Dongrong; Liu, Jun; Weissman, Myrna; Peterson, Bradley S
2012-01-01
Diagnoses using imaging-based measures alone offer the hope of improving the accuracy of clinical diagnosis, thereby reducing the costs associated with incorrect treatments. Previous attempts to use brain imaging for diagnosis, however, have had only limited success in diagnosing patients who are independent of the samples used to derive the diagnostic algorithms. We aimed to develop a classification algorithm that can accurately diagnose chronic, well-characterized neuropsychiatric illness in single individuals, given the availability of sufficiently precise delineations of brain regions across several neural systems in anatomical MR images of the brain. We have developed an automated method to diagnose individuals as having one of various neuropsychiatric illnesses using only anatomical MRI scans. The method employs a semi-supervised learning algorithm that discovers natural groupings of brains based on the spatial patterns of variation in the morphology of the cerebral cortex and other brain regions. We used split-half and leave-one-out cross-validation analyses in large MRI datasets to assess the reproducibility and diagnostic accuracy of those groupings. In MRI datasets from persons with Attention-Deficit/Hyperactivity Disorder, Schizophrenia, Tourette Syndrome, Bipolar Disorder, or persons at high or low familial risk for Major Depressive Disorder, our method discriminated with high specificity and nearly perfect sensitivity the brains of persons who had one specific neuropsychiatric disorder from the brains of healthy participants and the brains of persons who had a different neuropsychiatric disorder. Although the classification algorithm presupposes the availability of precisely delineated brain regions, our findings suggest that patterns of morphological variation across brain surfaces, extracted from MRI scans alone, can successfully diagnose the presence of chronic neuropsychiatric disorders. Extensions of these methods are likely to provide biomarkers that will aid in identifying biological subtypes of those disorders, predicting disease course, and individualizing treatments for a wide range of neuropsychiatric illnesses.
Eramudugolla, Ranmalee; Mortby, Moyra E; Sachdev, Perminder; Meslin, Chantal; Kumar, Rajeev; Anstey, Kaarin J
2017-03-04
There is little information on the application and impact of revised criteria for diagnosing dementia and mild cognitive impairment (MCI), now termed major and mild neurocognitive disorders (NCDs) in the DSM-5. We evaluate a psychometric algorithm for diagnosing DSM-5 NCDs in a community-dwelling sample, and characterize the neuropsychological and functional profile of expert-diagnosed DSM-5 NCDs relative to DSM-IV dementia and International Working Group criteria for MCI. A population-based sample of 1644 adults aged 72-78 years was assessed. Algorithmic diagnostic criteria used detailed neuropsychological data, medical history, longitudinal cognitive performance, and informant interview. Those meeting all criteria for at least one diagnosis had data reviewed by a neurologist (expert diagnosis) who achieved consensus with a psychiatrist for complex cases. The algorithm accurately classified DSM-5 major NCD (area under the curve (AUC) = 0.95, 95% confidence interval (CI) 0.92-0.97), DSM-IV dementia (AUC = 0.91, 95% CI 0.85-0.97), DSM-5 mild NCD (AUC = 0.75, 95% CI 0.70-0.80), and MCI (AUC = 0.76, 95% CI 0.72-0.81) when compared to expert diagnosis. Expert diagnosis of dementia using DSM-5 criteria overlapped with 90% of DSM-IV dementia cases, but resulted in a 127% increase in diagnosis relative to DSM-IV. Additional cases had less severe memory, language impairment, and instrumental activities of daily living (IADL) impairments compared to cases meeting DSM-IV criteria for dementia. DSM-5 mild NCD overlapped with 83% of MCI cases and resulted in a 19% increase in diagnosis. These additional cases had a subtly different neurocognitive profile to MCI cases, including poorer social cognition. DSM-5 NCD criteria can be operationalized in a psychometric algorithm in a population setting. Expert diagnosis using DSM-5 NCD criteria captured most cases with DSM-IV dementia and MCI in our sample, but included many additional cases suggesting that DSM-5 criteria are broader in their categorization.
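A minimal sketch of how agreement between an algorithmic score and an expert diagnosis is summarized as an area under the ROC curve; the labels and scores below are toy values, not the study's data.

```python
# Sketch: AUC of an algorithmic score against expert diagnosis (toy data).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
expert_dx = rng.integers(0, 2, size=500)                   # expert diagnosis (toy)
algorithm_score = expert_dx * 0.8 + rng.random(500)        # algorithm output (toy)
print(roc_auc_score(expert_dx, algorithm_score))           # area under the ROC curve
```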
Schellhaas, Barbara; Pfeifer, Lukas; Kielisch, Christian; Goertz, Ruediger Stephan; Neurath, Markus F; Strobel, Deike
2018-06-07
This pilot study aimed at assessing interobserver agreement with two contrast-enhanced ultrasound (CEUS) algorithms for the diagnosis of hepatocellular carcinoma (HCC) in high-risk patients. Focal liver lesions in 55 high-risk patients were assessed independently by three blinded observers with two standardized CEUS algorithms: ESCULAP (Erlanger Synopsis of Contrast-Enhanced Ultrasound for Liver Lesion Assessment in Patients at risk) and ACR CEUS LI-RADS v.2016 (American College of Radiology CEUS Liver Imaging Reporting and Data System). Lesions were categorized according to size and ultrasound contrast enhancement in the arterial, portal-venous and late phase. Interobserver agreement for assessment of enhancement pattern and categorization was compared between both CEUS algorithms. Additionally, diagnostic accuracy for the definitive diagnosis of HCC was compared. Histology and/or CE-MRI and follow-up served as reference standards. 55 patients were included in the study (male/female, 44/11; mean age: 65.9 years). 90.9 % had cirrhosis. Histological findings were available in 39/55 lesions (70.9 %). Reference standard of the 55 lesions revealed 48 HCCs, 2 intrahepatic cholangiocellular carcinomas (ICCs), and 5 non-HCC-non-ICC lesions. Interobserver agreement was moderate to substantial for arterial phase hyperenhancement (κ = 0.53-0.67), and fair to moderate for contrast washout in the portal-venous or late phase (κ = 0.33-0.53). Concerning the CEUS-based algorithms, the interreader agreement was substantial for the ESCULAP category (κ = 0.64-0.68) and fair for the CEUS LI-RADS® category (κ = 0.3-0.39). Disagreement between observers was mostly due to different perception of washout. Interobserver agreement is better for ESCULAP than for CEUS LI-RADS®. This is mostly due to the fact that perception of contrast washout varies between different observers. However, interobserver agreement is good for arterial phase hyperenhancement, which is the key diagnostic feature for the diagnosis of HCC with CEUS in the cirrhotic liver. © Georg Thieme Verlag KG Stuttgart · New York.
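As an illustration of the pairwise agreement statistics underlying such comparisons, the sketch below computes Cohen's kappa for two hypothetical observers' categorical CEUS ratings with scikit-learn; the ratings are invented, and any multi-rater handling used in the study is not reproduced.

```python
# Sketch: pairwise interobserver agreement (Cohen's kappa) on toy CEUS categories.
from sklearn.metrics import cohen_kappa_score

observer_1 = ["LR-5", "LR-5", "LR-4", "LR-M", "LR-5", "LR-3", "LR-5", "LR-4"]
observer_2 = ["LR-5", "LR-4", "LR-4", "LR-M", "LR-5", "LR-4", "LR-5", "LR-4"]
print(cohen_kappa_score(observer_1, observer_2))
```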
Distributed Prognostics and Health Management with a Wireless Network Architecture
NASA Technical Reports Server (NTRS)
Goebel, Kai; Saha, Sankalita; Sha, Bhaskar
2013-01-01
A particle-filtering (PF) framework has been developed for a heterogeneous set of system components monitored by a varied suite of sensors, with the power and flexibility to adapt to different diagnostic and prognostic needs. Both the diagnostic and prognostic tasks are formulated as particle-filtering problems in order to explicitly represent and manage uncertainties in state estimation and remaining-life estimation. Current state-of-the-art prognostic health management (PHM) systems are mostly centralized in nature, with all processing reliant on a single processor. This can lead to a loss of functionality if the central processor or monitor crashes. Furthermore, as the volume of sensor data and the complexity of algorithms increase, traditional centralized systems become unwieldy to deploy successfully, and efficient distributed architectures can be more beneficial. The distributed health management architecture comprises a network of smart sensor devices. These devices monitor the health of various subsystems or modules; they perform diagnostic operations and trigger prognostic operations based on user-defined thresholds and rules. The sensor devices, called computing elements (CEs), consist of a sensor, or set of sensors, and a communication device (i.e., a wireless transceiver alongside an embedded processing element). A CE runs in either a diagnostic or a prognostic operating mode. The diagnostic mode is the default mode, in which a CE monitors a given subsystem or component with a lightweight diagnostic algorithm. If a CE detects a critical condition during monitoring, it raises a flag. Depending on the availability of resources, a networked local cluster of CEs is then formed that carries out prognostics and fault mitigation by efficiently distributing the tasks. Notably, the CEs are not expected to suspend their previous tasks while in prognostic mode. When the prognostic task is over, and after appropriate actions have been taken, all CEs return to their original default configuration. A wireless implementation ensures more flexibility in sensor placement and allows more sensors to be deployed, because the weight overhead of wired systems is absent. Distributed architectures are also generally robust with regard to recovery from node failures.
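The particle-filtering formulation of diagnostics (state estimation) and prognostics (remaining-useful-life estimation) can be illustrated with a minimal sketch. The code below is a generic illustration, not the authors' flight implementation: it assumes a simple one-dimensional exponential degradation model, a hypothetical failure threshold, and noisy health measurements, all of which are stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical degradation model: health h_k = h_{k-1} * exp(-rate) + process noise.
# The rate is unknown, so each particle carries its own (state, rate) hypothesis.
N = 2000                      # number of particles
FAIL_THRESHOLD = 0.2          # assumed failure level (illustrative)

particles = np.column_stack([
    np.full(N, 1.0),                          # initial health state
    rng.uniform(0.005, 0.05, size=N),         # degradation-rate hypotheses
])
weights = np.full(N, 1.0 / N)

def step(particles, weights, measurement, meas_sigma=0.02, proc_sigma=0.005):
    """One predict/update/resample cycle of the particle filter (diagnostic mode)."""
    state, rate = particles[:, 0], particles[:, 1]
    # Predict: propagate each particle through the degradation model.
    state = state * np.exp(-rate) + rng.normal(0.0, proc_sigma, size=state.size)
    # Update: weight particles by the likelihood of the noisy health measurement.
    weights = weights * np.exp(-0.5 * ((measurement - state) / meas_sigma) ** 2)
    weights += 1e-300
    weights /= weights.sum()
    # Systematic resampling to avoid weight degeneracy.
    idx = np.searchsorted(np.cumsum(weights),
                          (rng.random() + np.arange(state.size)) / state.size)
    idx = np.minimum(idx, state.size - 1)
    particles = np.column_stack([state, rate])[idx]
    return particles, np.full(state.size, 1.0 / state.size)

def remaining_useful_life(particles, horizon=500):
    """Prognostic mode: simulate each particle forward until it crosses the threshold."""
    state, rate = particles[:, 0].copy(), particles[:, 1]
    rul = np.full(state.size, horizon, dtype=float)
    for k in range(horizon):
        state = state * np.exp(-rate)
        rul[(state < FAIL_THRESHOLD) & (rul == horizon)] = k
    return np.percentile(rul, [5, 50, 95])   # uncertainty bounds on RUL

# Simulated noisy health measurements for demonstration.
true = 1.0 * np.exp(-0.02 * np.arange(60))
for z in true + rng.normal(0, 0.02, size=true.size):
    particles, weights = step(particles, weights, z)

print("estimated health:", particles[:, 0].mean())
print("RUL percentiles (5/50/95):", remaining_useful_life(particles))
```

In a distributed setting, each CE would run something like `step` locally and only trigger the heavier `remaining_useful_life` computation, or hand it to a cluster of CEs, once a flag is raised.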
Rehm, K; Seeley, G W; Dallas, W J; Ovitt, T W; Seeger, J F
1990-01-01
One of the goals of our research in the field of digital radiography has been to develop contrast-enhancement algorithms for eventual use in the display of chest images on video devices with the aim of preserving the diagnostic information presently available with film, some of which would normally be lost because of the smaller dynamic range of video monitors. The ASAHE algorithm discussed in this article has been tested by investigating observer performance in a difficult detection task involving phantoms and simulated lung nodules, using film as the output medium. The results of the experiment showed that the algorithm is successful in providing contrast-enhanced, natural-looking chest images while maintaining diagnostic information. The algorithm did not effect an increase in nodule detectability, but this was not unexpected because film is a medium capable of displaying a wide range of gray levels. It is sufficient at this stage to show that there is no degradation in observer performance. Future tests will evaluate the performance of the ASAHE algorithm in preparing chest images for video display.
NASA Technical Reports Server (NTRS)
Russell, B. Don
1989-01-01
This research concentrated on the application of advanced signal processing, expert system, and digital technologies for the detection and control of low grade, incipient faults on spaceborne power systems. The researchers have considerable experience in the application of advanced digital technologies and the protection of terrestrial power systems. This experience was used in the current contracts to develop new approaches for protecting the electrical distribution system in spaceborne applications. The project was divided into three distinct areas: (1) investigate the applicability of fault detection algorithms developed for terrestrial power systems to the detection of faults in spaceborne systems; (2) investigate the digital hardware and architectures required to monitor and control spaceborne power systems with full capability to implement new detection and diagnostic algorithms; and (3) develop a real-time expert operating system for implementing diagnostic and protection algorithms. Significant progress has been made in each of the above areas. Several terrestrial fault detection algorithms were modified to better adapt to spaceborne power system environments. Several digital architectures were developed and evaluated in light of the fault detection algorithms.
Advances in Patch-Based Adaptive Mesh Refinement Scalability
Gunney, Brian T.N.; Anderson, Robert W.
2015-12-18
Patch-based structured adaptive mesh refinement (SAMR) is widely used for high-resolution simulations. Combined with modern supercomputers, it could provide simulations of unprecedented size and resolution. A persistent challenge for this combination has been managing dynamically adaptive meshes on more and more MPI tasks. The distributed mesh management scheme in SAMRAI has made some progress in SAMR scalability, but early algorithms still had trouble scaling past the regime of 10⁵ MPI tasks. This work provides two critical SAMR regridding algorithms, which are integrated into that scheme to ensure efficiency of the whole. The clustering algorithm is an extension of the tile-clustering approach, making it more flexible and efficient in both clustering and parallelism. The partitioner is a new algorithm designed to prevent the network congestion experienced by its predecessor. We evaluated performance using weak- and strong-scaling benchmarks designed to be difficult for dynamic adaptivity. Results show good scaling on up to 1.5M cores and 2M MPI tasks. Detailed timing diagnostics suggest scaling would continue well past that.
NASA Astrophysics Data System (ADS)
Losik, L.
A predictive medicine program allows disease and illness, including mental illness, to be predicted using tools created to identify the presence of accelerated aging (a.k.a. disease) in electrical and mechanical equipment. When illness and disease can be predicted, actions can be taken so that the illness and disease can be prevented and eliminated. A predictive medicine program uses the same tools and practices as a prognostics and health management program to process biological and engineering diagnostic data provided in analog telemetry during prelaunch readiness and space exploration missions. The biological and engineering diagnostic data necessary to predict illness and disease are collected during pre-launch spaceflight readiness activities and during spaceflight, allowing the ground crew to perform a prognostic analysis on the results of a diagnostic analysis. The diagnostic biological data provided in telemetry are converted to prognostic (predictive) data using predictive algorithms, which demodulate telemetry behavior and reveal the presence of accelerated aging/disease in normal-appearing systems that function normally. Mental illness can be predicted using biological diagnostic measurements provided in CCSDS telemetry from a spacecraft such as the ISS or from a manned spacecraft in deep space. The measurements used to predict mental illness include biological and engineering data from an astronaut's circadian and ultradian rhythms. These data originate deep in the brain, in structures that are also damaged by long-term exposure to cortisol and adrenaline whenever the body's fight-or-flight response (FOFR) is activated. This paper defines the brain's FOFR, identifies the diagnostic biological and engineering measurements needed to predict mental illness, and identifies the predictive algorithms necessary to process the behavior in CCSDS analog telemetry so that mental illness can be predicted, and thus prevented, on human spaceflight missions.
Long-term surface EMG monitoring using K-means clustering and compressive sensing
NASA Astrophysics Data System (ADS)
Balouchestani, Mohammadreza; Krishnan, Sridhar
2015-05-01
In this work, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory in combination with the K-Singular Value Decomposition (K-SVD) method for clustering long-term recordings of surface electromyography (sEMG) signals. Long-term monitoring of sEMG signals aims at recording the electrical activity produced by muscles, which is a very useful procedure for treatment and diagnostic purposes as well as for the detection of various pathologies. The proposed algorithm is examined for three scenarios of sEMG signals: a healthy person (sEMG-Healthy), a patient with myopathy (sEMG-Myopathy), and a patient with neuropathy (sEMG-Neuropathy). The proposed algorithm can easily scan large datasets of long-term sEMG recordings. We test the proposed algorithm with Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC) dimensionality reduction methods. The output of the proposed algorithm is then fed to K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers in order to calculate the clustering performance. The proposed algorithm achieves a classification accuracy of 99.22%. This corresponds to reductions of 17% in Average Classification Error (ACE), 9% in Training Error (TE), and 18% in Root Mean Square Error (RMSE). The proposed algorithm also reduces clustering energy consumption by 14% compared to the existing K-means clustering algorithm.
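A minimal sketch of this type of pipeline, using generic scikit-learn components rather than the authors' CS/K-SVD implementation: window the sEMG signal, extract simple time-domain features, reduce dimensionality with PCA, cluster with K-means, and score a K-NN classifier. The feature set, window length and synthetic traces below are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def windowed_features(semg, fs=1000, win_s=0.25):
    """Split an sEMG trace into windows and compute simple time-domain features."""
    win = int(fs * win_s)
    n = len(semg) // win
    segs = semg[: n * win].reshape(n, win)
    rms = np.sqrt((segs ** 2).mean(axis=1))                  # root mean square
    mav = np.abs(segs).mean(axis=1)                          # mean absolute value
    zc = (np.diff(np.sign(segs), axis=1) != 0).sum(axis=1)   # zero crossings
    return np.column_stack([rms, mav, zc])

# Synthetic stand-ins for the three recording scenarios (healthy/myopathy/neuropathy).
traces = [rng.normal(0, s, size=60_000) for s in (1.0, 0.5, 2.0)]
X = np.vstack([windowed_features(t) for t in traces])
y = np.repeat([0, 1, 2], X.shape[0] // 3)                    # scenario labels

Xr = PCA(n_components=2).fit_transform(X)                    # dimensionality reduction
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xr)
print("cluster sizes:", np.bincount(clusters))

# Evaluate how well a K-NN classifier recovers the scenario labels from the reduced features.
acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), Xr, y, cv=5).mean()
print(f"K-NN accuracy on PCA-reduced features: {acc:.3f}")
```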
European Respiratory Society guidelines for the diagnosis of primary ciliary dyskinesia.
Lucas, Jane S; Barbato, Angelo; Collins, Samuel A; Goutaki, Myrofora; Behan, Laura; Caudri, Daan; Dell, Sharon; Eber, Ernst; Escudier, Estelle; Hirst, Robert A; Hogg, Claire; Jorissen, Mark; Latzin, Philipp; Legendre, Marie; Leigh, Margaret W; Midulla, Fabio; Nielsen, Kim G; Omran, Heymut; Papon, Jean-Francois; Pohunek, Petr; Redfern, Beatrice; Rigau, David; Rindlisbacher, Bernhard; Santamaria, Francesca; Shoemark, Amelia; Snijders, Deborah; Tonia, Thomy; Titieni, Andrea; Walker, Woolf T; Werner, Claudius; Bush, Andrew; Kuehni, Claudia E
2017-01-01
The diagnosis of primary ciliary dyskinesia is often confirmed with standard, albeit complex and expensive, tests. In many cases, however, the diagnosis remains difficult despite the array of sophisticated diagnostic tests. There is no "gold standard" reference test. Hence, a Task Force supported by the European Respiratory Society has developed this guideline to provide evidence-based recommendations on diagnostic testing, especially in light of new developments in such tests, and the need for robust diagnoses of patients who might enter randomised controlled trials of treatments. The guideline is based on pre-defined questions relevant for clinical care, a systematic review of the literature, and assessment of the evidence using the GRADE (Grading of Recommendations, Assessment, Development and Evaluation) approach. It focuses on clinical presentation, nasal nitric oxide, analysis of ciliary beat frequency and pattern by high-speed video-microscopy analysis, transmission electron microscopy, genotyping and immunofluorescence. It then used a modified Delphi survey to develop an algorithm for the use of diagnostic tests to definitively confirm and exclude the diagnosis of primary ciliary dyskinesia; and to provide advice when the diagnosis was not conclusive. Finally, this guideline proposes a set of quality criteria for future research on the validity of diagnostic methods for primary ciliary dyskinesia. Copyright ©ERS 2017.
Breast Cancer Diagnostic System Final Report CRADA No. TC02098.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubenchik, A. M.; DaSilva, L. B.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and BioTelligent, Inc., together with a Russian institution (BioFil, Ltd.), to develop a new system (diagnostic device, operating procedures, algorithms and software) to accurately distinguish between benign and malignant breast tissue (Breast Cancer Diagnostic System, BCDS).
[EBOLA HEMORRHAGIC FEVER: DIAGNOSTICS, ETIOTROPIC AND PATHOGENETIC THERAPY, PREVENTION].
Zhdanov, K V; Zakharenko, S M; Kovalenko, A N; Semenov, A V; Fisun, A Ya
2015-01-01
The data on diagnostics, etiotropic and pathogenetic therapy, prevention of Ebola hemorrhagic fever are presented including diagnostic algorithms for different clinical situations. Fundamentals of pathogenetic therapy are described. Various groups of medications used for antiviral therapy of conditions caused by Ebola virus are characterized. Experimental drugs at different stages of clinical studies are considered along with candidate vaccines being developed for the prevention of the disease.
ERIC Educational Resources Information Center
Gelhorn, Heather; Hartman, Christie; Sakai, Joseph; Stallings, Michael; Young, Susan; Rhee, So Hyun; Corley, Robin; Hewitt, John; Hopger, Christian; Crowley, Thomas D.
2008-01-01
Clinical interviews of approximately 5,587 adolescents revealed that DSM-IV diagnostic categories were found to be different in terms of the severity of alcohol use disorders (AUDs). However, a substantial inconsistency and overlap was found in severity of AUDs across categories. The need for an alternative diagnostic algorithm which considers all…
Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad
2018-02-01
Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm² and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm²). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated with each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
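As an illustration of this style of stain quantitation (not the authors' ImageJ macros or the Aperio algorithms), the sketch below separates the DAB channel of an IHC image with a standard colour-deconvolution routine, thresholds it, and reports a positive-pixel percentage plus a crude connected-component count per mm². The DAB threshold, the microns-per-pixel value and the use of connected components as a cell-count proxy are all assumptions.

```python
import numpy as np
from skimage.color import rgb2hed
from skimage.measure import label

def cd3_quantitation(rgb_image, dab_threshold=0.03, microns_per_pixel=0.5):
    """Quantify DAB-positive (e.g., CD3-stained) signal in an RGB IHC image.

    Returns (% positive pixels, rough positive-object density per mm^2).
    Threshold and pixel size are illustrative and would need calibration.
    """
    hed = rgb2hed(rgb_image)              # haematoxylin / eosin / DAB separation
    dab = hed[:, :, 2]
    positive = dab > dab_threshold
    percent_positive = 100.0 * positive.mean()

    pixel_area_mm2 = (microns_per_pixel / 1000.0) ** 2
    image_area_mm2 = rgb_image.shape[0] * rgb_image.shape[1] * pixel_area_mm2
    n_objects = label(positive).max()     # crude proxy for a nuclear-detection algorithm
    return percent_positive, n_objects / image_area_mm2

# Synthetic example image (random noise) just to show the call signature.
demo = np.random.default_rng(0).random((512, 512, 3))
print(cd3_quantitation(demo))
```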
Chatlapalli, S; Nazeran, H; Melarkod, V; Krishnam, R; Estrada, E; Pamula, Y; Cabrera, S
2004-01-01
The electrocardiogram (ECG) signal is used extensively as a low-cost diagnostic tool to provide information concerning the heart's state of health. Accurate determination of the QRS complex, in particular reliable detection of the R wave peak, is essential in computer-based ECG analysis. ECG data from Physionet's Sleep-Apnea database were used to develop, test, and validate a robust heart rate variability (HRV) signal derivation algorithm. The HRV signal was derived from pre-processed ECG signals by developing an enhanced Hilbert transform (EHT) algorithm with built-in missing beat detection capability for reliable QRS detection. The performance of the EHT algorithm was then compared against that of a popular Hilbert transform-based (HT) QRS detection algorithm. Autoregressive (AR) modeling of the HRV power spectrum for both EHT- and HT-derived HRV signals was achieved, and different parameters from their power spectra as well as approximate entropy were derived for comparison. Poincare plots were then used as a visualization tool to highlight the detection of the missing beats in the EHT method. After validation of the EHT algorithm on the Physionet ECG data, the algorithm was further tested and validated on a dataset obtained from children undergoing polysomnography for detection of sleep disordered breathing (SDB). Sensitive measures of accurate HRV signals were then derived to be used in detecting and diagnosing SDB in children. All signal processing algorithms were implemented in MATLAB. We present a description of the EHT algorithm and analyze pilot data for eight children undergoing nocturnal polysomnography. The pilot data demonstrated that the EHT method provides an accurate way of deriving the HRV signal and plays an important role in extraction of reliable measures to distinguish between periods of normal breathing and SDB in children.
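The Hilbert-transform stage of such a QRS detector can be sketched as follows. The original work was implemented in MATLAB; this is a generic Python envelope-based detector, not the enhanced (EHT) algorithm with missing-beat handling, and the band-pass corners, threshold factor and refractory period are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def detect_r_peaks(ecg, fs=250.0):
    """Locate R peaks via band-pass filtering, differentiation and the Hilbert envelope."""
    # Band-pass to emphasise the QRS complex (illustrative 8-20 Hz passband).
    b, a = butter(3, [8.0 / (fs / 2), 20.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # Differentiate to accentuate the steep QRS slopes, then take the analytic envelope.
    envelope = np.abs(hilbert(np.gradient(filtered)))
    # Simple adaptive threshold and a 250 ms refractory period between beats.
    threshold = envelope.mean() + 0.5 * envelope.std()
    peaks, _ = find_peaks(envelope, height=threshold, distance=int(0.25 * fs))
    return peaks

def rr_intervals(peaks, fs=250.0):
    """Derive the HRV (RR-interval) series in seconds from detected R-peak indices."""
    return np.diff(peaks) / fs

# Synthetic demonstration: a 1 Hz train of sharp pulses plus noise.
fs = 250.0
t = np.arange(0, 30, 1 / fs)
ecg = np.where((t % 1.0) < 0.02, 1.0, 0.0) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
print("detected beats:", len(detect_r_peaks(ecg, fs)), "(expected ~30)")
```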
Stationary intraoral tomosynthesis for dental imaging
NASA Astrophysics Data System (ADS)
Inscoe, Christina R.; Wu, Gongting; Soulioti, Danai E.; Platin, Enrique; Mol, Andre; Gaalaas, Laurence R.; Anderson, Michael R.; Tucker, Andrew W.; Boyce, Sarah; Shan, Jing; Gonzales, Brian; Lu, Jianping; Zhou, Otto
2017-03-01
Despite recent advances in dental radiography, the diagnostic accuracies for some of the most common dental diseases have not improved significantly, and in some cases remain low. Intraoral x-ray is the most commonly used x-ray diagnostic tool in dental clinics. It however suffers from the typical limitations of a 2D imaging modality including structure overlap. Cone-beam computed tomography (CBCT) uses high radiation dose and suffers from image artifacts and relatively low resolution. The purpose of this study is to investigate the feasibility of developing a stationary intraoral tomosynthesis (s-IOT) using spatially distributed carbon nanotube (CNT) x-ray array technology, and to evaluate its diagnostic accuracy compared to conventional 2D intraoral x-ray. A bench-top s-IOT device was constructed using a linear CNT based X-ray source array and a digital intraoral detector. Image reconstruction was performed using an iterative reconstruction algorithm. Studies were performed to optimize the imaging configuration. For evaluation of s-IOT's diagnostic accuracy, images of a dental quality assurance phantom, and extracted human tooth specimens were acquired. Results show s-IOT increases the diagnostic sensitivity for caries compared to intraoral x-ray at a comparable dose level.
Tomographic capabilities of the new GEM based SXR diagnostic of WEST
NASA Astrophysics Data System (ADS)
Jardin, A.; Mazon, D.; O'Mullane, M.; Mlynar, J.; Loffelmann, V.; Imrisek, M.; Chernyshova, M.; Czarski, T.; Kasprowicz, G.; Wojenski, A.; Bourdelle, C.; Malard, P.
2016-07-01
The tokamak WEST (Tungsten Environment in Steady-State Tokamak) will start operating by the end of 2016 as a test bed for the ITER divertor components in long pulse operation. In this context, radiative cooling of heavy impurities like tungsten (W) in the Soft X-ray (SXR) range [0.1 keV; 20 keV] is a critical issue for the plasma core performances. Thus reliable tools are required to monitor the local impurity density and avoid W accumulation. The WEST SXR diagnostic will be equipped with two new GEM (Gas Electron Multiplier) based poloidal cameras allowing to perform 2D tomographic reconstructions in tunable energy bands. In this paper tomographic capabilities of the Minimum Fisher Information (MFI) algorithm developed for Tore Supra and upgraded for WEST are investigated, in particular through a set of emissivity phantoms and the standard WEST scenario including reconstruction errors, influence of noise as well as computational time.
Sideris, Costas; Alshurafa, Nabil; Pourhomayoun, Mohammad; Shahmohammadi, Farhad; Samy, Lauren; Sarrafzadeh, Majid
2015-01-01
In this paper, we propose a novel methodology for utilizing disease diagnostic information to predict severity of condition for Congestive Heart Failure (CHF) patients. Our methodology relies on a novel, clustering-based, feature extraction framework using disease diagnostic information. To reduce dimensionality, we identify disease clusters using co-occurrence frequencies. We then utilize these clusters as features to predict patient severity of condition. We build our clustering and feature extraction algorithm using the 2012 National Inpatient Sample (NIS) of the Healthcare Cost and Utilization Project (HCUP), which contains 7 million discharge records with ICD-9-CM codes. The proposed framework is tested on Ronald Reagan UCLA Medical Center Electronic Health Records (EHR) from 3041 patients. We compare our cluster-based feature set with another that incorporates the Charlson comorbidity score as a feature and demonstrate an accuracy improvement of up to 14% in predicting severity of condition.
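A minimal sketch of the co-occurrence-based feature construction, on a toy set of illustrative ICD-9-CM-style codes rather than the NIS/HCUP data: build a code-by-code co-occurrence matrix from per-patient code lists, cluster the codes, and represent each patient by counts of codes falling in each cluster. The specific codes, cluster count and clustering method here are assumptions.

```python
import numpy as np
from itertools import combinations
from sklearn.cluster import KMeans

# Toy per-patient diagnosis-code lists (example ICD-9-CM-style codes, for illustration only).
patients = [
    ["428.0", "401.9", "250.00"],            # CHF, hypertension, diabetes
    ["428.0", "585.9", "401.9"],             # CHF, CKD, hypertension
    ["250.00", "585.9"],
    ["428.0", "250.00", "585.9", "401.9"],
]

codes = sorted({c for p in patients for c in p})
idx = {c: i for i, c in enumerate(codes)}

# Code-by-code co-occurrence matrix: how often two codes appear in the same record.
cooc = np.zeros((len(codes), len(codes)))
for p in patients:
    for a, b in combinations(set(p), 2):
        cooc[idx[a], idx[b]] += 1
        cooc[idx[b], idx[a]] += 1

# Cluster codes by their co-occurrence profiles (2 clusters here, purely illustrative).
code_cluster = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cooc)

# Patient-level features: number of diagnoses falling in each code cluster.
features = np.zeros((len(patients), 2))
for i, p in enumerate(patients):
    for c in p:
        features[i, code_cluster[idx[c]]] += 1

print(dict(zip(codes, code_cluster)))
print(features)   # these cluster-count features would feed a severity classifier
```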
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voisin, Sophie; Pinto, Frank M; Morin-Ducote, Garnetta
2013-01-01
Purpose: The primary aim of the present study was to test the feasibility of predicting diagnostic errors in mammography by merging radiologists' gaze behavior and image characteristics. A secondary aim was to investigate group-based and personalized predictive models for radiologists of variable experience levels. Methods: The study was performed for the clinical task of assessing the likelihood of malignancy of mammographic masses. Eye-tracking data and diagnostic decisions for 40 cases were acquired from 4 radiology residents and 2 breast imaging experts as part of an IRB-approved pilot study. Gaze behavior features were extracted from the eye-tracking data. Computer-generated and BI-RADS image features were extracted from the images. Finally, machine learning algorithms were used to merge gaze and image features for predicting human error. Feature selection was thoroughly explored to determine the relative contribution of the various features. Group-based and personalized user modeling was also investigated. Results: Diagnostic error can be predicted reliably by merging gaze behavior characteristics from the radiologist and textural characteristics from the image under review. Leveraging data collected from multiple readers produced a reasonable group model (AUC = 0.79). Personalized user modeling was far more accurate for the more experienced readers (average AUC of 0.837 ± 0.029) than for the less experienced ones (average AUC of 0.667 ± 0.099). The best performing group-based and personalized predictive models involved combinations of both gaze and image features. Conclusions: Diagnostic errors in mammography can be predicted reliably by leveraging the radiologists' gaze behavior and image content.
Enhanced CT images by the wavelet transform improving diagnostic accuracy of chest nodules.
Guo, Xiuhua; Liu, Xiangye; Wang, Huan; Liang, Zhigang; Wu, Wei; He, Qian; Li, Kuncheng; Wang, Wei
2011-02-01
The objective of this study was to compare diagnostic accuracy in the interpretation of chest nodules using original CT images versus enhanced CT images based on the wavelet transform. CT images of 118 patients with cancers and 60 with benign nodules were used in this study. All images were enhanced through an algorithm based on the wavelet transform. Two experienced radiologists interpreted all the images in two reading sessions. The reading sessions were separated by a minimum of 1 month in order to minimize the effect of observers' recall. The Mann-Whitney U nonparametric test was used to analyze the interpretation results between original and enhanced images. The Kruskal-Wallis H nonparametric test of K independent samples was used to investigate the related factors that could affect the diagnostic accuracy of observers. The area under the ROC curve for the original and enhanced images was 0.681 and 0.736, respectively. There was a significant difference in diagnosing the malignant nodules between the original and enhanced images (z = 7.122, P < 0.001), whereas there was no significant difference in diagnosing the benign nodules (z = 0.894, P = 0.371). The results also showed a significant difference between original and enhanced images when the size of the nodules was larger than 2 cm (z = -2.509, P = 0.012), indicating that nodule size is a critical factor in observers' diagnostic accuracy. This study indicated that image enhancement based on the wavelet transform could improve the diagnostic accuracy of radiologists for malignant chest nodules.
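A generic wavelet-based enhancement of this kind can be sketched with PyWavelets: decompose the image, amplify the detail (high-frequency) coefficients, and reconstruct. The wavelet family, decomposition depth and gain below are assumptions, not the parameters used in the study.

```python
import numpy as np
import pywt

def wavelet_enhance(image, wavelet="db4", level=3, gain=1.8):
    """Boost high-frequency detail coefficients to increase local contrast."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    enhanced = [coeffs[0]]                       # keep the approximation band unchanged
    for detail_level in coeffs[1:]:
        enhanced.append(tuple(gain * band for band in detail_level))
    out = pywt.waverec2(enhanced, wavelet)
    # Note: for display, the result would still need rescaling to the output device range.
    return out[: image.shape[0], : image.shape[1]]

# Demonstration on a synthetic low-contrast "nodule" in a noisy background.
rng = np.random.default_rng(0)
img = rng.normal(0.5, 0.02, size=(256, 256))
yy, xx = np.mgrid[:256, :256]
img += 0.05 * np.exp(-((yy - 128) ** 2 + (xx - 128) ** 2) / (2 * 8.0 ** 2))

enh = wavelet_enhance(img)
print("contrast before:", img.max() - img.mean())
print("contrast after: ", enh.max() - enh.mean())
```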
Diagnosis of Posttraumatic Stress Disorder in Preschool Children
ERIC Educational Resources Information Center
De Young, Alexandra C.; Kenardy, Justin A.; Cobham, Vanessa E.
2011-01-01
This study investigated the existing diagnostic algorithms for posttraumatic stress disorder (PTSD) to determine the most developmentally sensitive and valid approach for diagnosing this disorder in preschoolers. Participants were 130 parents of unintentionally burned children (1-6 years). Diagnostic interviews were conducted with parents to…
Approximation algorithms for a genetic diagnostics problem.
Kosaraju, S R; Schäffer, A A; Biesecker, L G
1998-01-01
We define and study a combinatorial problem called WEIGHTED DIAGNOSTIC COVER (WDC) that models the use of a laboratory technique called genotyping in the diagnosis of an important class of chromosomal aberrations. An optimal solution to WDC would enable us to define a genetic assay that maximizes the diagnostic power for a specified cost of laboratory work. We develop approximation algorithms for WDC by making use of the well-known problem SET COVER for which the greedy heuristic has been extensively studied. We prove worst-case performance bounds on the greedy heuristic for WDC and for another heuristic we call directional greedy. We implemented both heuristics. We also implemented a local search heuristic that takes the solutions obtained by greedy and dir-greedy and applies swaps until they are locally optimal. We report their performance on a real data set that is representative of the options that a clinical geneticist faces for the real diagnostic problem. Many open problems related to WDC remain, both of theoretical interest and practical importance.
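For intuition, the classic greedy heuristic for weighted SET COVER, which the WDC heuristics build on, picks at each step the set with the lowest cost per newly covered element. The sketch below is that textbook heuristic on toy data; it is not the paper's WDC-specific greedy or directional-greedy variants, and the marker/region names and costs are invented for illustration.

```python
def greedy_weighted_set_cover(universe, sets, costs):
    """Greedy heuristic: repeatedly pick the set minimising cost per newly covered element."""
    uncovered = set(universe)
    chosen, total_cost = [], 0.0
    while uncovered:
        best, best_ratio = None, float("inf")
        for name, elements in sets.items():
            new = uncovered & elements
            if new:
                ratio = costs[name] / len(new)   # cost per newly covered element
                if ratio < best_ratio:
                    best, best_ratio = name, ratio
        if best is None:                          # some elements cannot be covered at all
            break
        chosen.append(best)
        total_cost += costs[best]
        uncovered -= sets[best]
    return chosen, total_cost, uncovered

# Toy instance: markers (sets) covering chromosomal regions (elements), with lab costs.
universe = {"r1", "r2", "r3", "r4", "r5"}
sets = {"m1": {"r1", "r2"}, "m2": {"r2", "r3", "r4"}, "m3": {"r4", "r5"}, "m4": {"r1", "r5"}}
costs = {"m1": 2.0, "m2": 3.0, "m3": 2.0, "m4": 1.0}
print(greedy_weighted_set_cover(universe, sets, costs))
```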
Zhan, Liang; Zhou, Jiayu; Wang, Yalin; Jin, Yan; Jahanshad, Neda; Prasad, Gautam; Nir, Talia M.; Leonardo, Cassandra D.; Ye, Jieping; Thompson, Paul M.; for the Alzheimer’s Disease Neuroimaging Initiative
2015-01-01
Alzheimer’s disease (AD) involves a gradual breakdown of brain connectivity, and network analyses offer a promising new approach to track and understand disease progression. Even so, our ability to detect degenerative changes in brain networks depends on the methods used. Here we compared several tractography and feature extraction methods to see which ones gave best diagnostic classification for 202 people with AD, mild cognitive impairment or normal cognition, scanned with 41-gradient diffusion-weighted magnetic resonance imaging as part of the Alzheimer’s Disease Neuroimaging Initiative (ADNI) project. We computed brain networks based on whole brain tractography with nine different methods – four of them tensor-based deterministic (FACT, RK2, SL, and TL), two orientation distribution function (ODF)-based deterministic (FACT, RK2), two ODF-based probabilistic approaches (Hough and PICo), and one “ball-and-stick” approach (Probtrackx). Brain networks derived from different tractography algorithms did not differ in terms of classification performance on ADNI, but performing principal components analysis on networks helped classification in some cases. Small differences may still be detectable in a truly vast cohort, but these experiments help assess the relative advantages of different tractography algorithms, and different post-processing choices, when used for classification. PMID:25926791
Clinical approaches to infertility in the bitch.
Wilborn, Robyn R; Maxwell, Herris S
2012-05-01
When presented with the apparently infertile bitch, the practitioner must sort through a myriad of facts, historical events, and diagnostic tests to uncover the etiology of the problem. Many bitches that present for infertility are reproductively normal and are able to conceive with appropriate intervention and breeding management. An algorithmic approach is helpful in cases of infertility, where simple questions lead to the next appropriate step. Most bitches can be categorized as either cyclic or acyclic, and then further classified based on historical data and diagnostic testing. Each female has a unique set of circumstances that can affect her reproductive potential. By utilizing all available information and a logical approach, the clinician can narrow the list of differentials and reach a diagnosis more quickly.
Mashin, V A; Mashina, M N
2004-12-01
This paper presents the results of research devoted to factor analysis of heart rate variability parameters, to identifying the most informative parameters for diagnosing functional states, and to evaluating the level of stability under mental load. A factor structure of the parameters is substantiated, comprising (1) the integral level of heart rate variability, (2) the balance between vagal activity and cortical-limbic brain systems, and (3) the integrated level of cardiovascular system functioning. The factor analysis results have been used to construct a classification of functional states, for their differential diagnostics, and to develop and verify an algorithm for evaluating the level of stability under mental load.
Mena, Luis J.; Orozco, Eber E.; Felix, Vanessa G.; Ostos, Rodolfo; Melgarejo, Jesus; Maestre, Gladys E.
2012-01-01
Machine learning has become a powerful tool for analysing medical domains, assessing the importance of clinical parameters, and extracting medical knowledge for outcomes research. In this paper, we present a machine learning method for extracting diagnostic and prognostic thresholds, based on a symbolic classification algorithm called REMED. We evaluated the performance of our method by determining new prognostic thresholds for well-known and potential cardiovascular risk factors that are used to support medical decisions in the prognosis of fatal cardiovascular diseases. Our approach predicted 36% of cardiovascular deaths with 80% specificity and 75% general accuracy. The new method provides an innovative approach that might be useful to support decisions about medical diagnoses and prognoses. PMID:22924062
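REMED itself is a symbolic classifier and is not reproduced here. As a simpler illustration of extracting a diagnostic or prognostic threshold for a single risk factor, the sketch below chooses the cutoff that maximises Youden's J (sensitivity + specificity - 1) along an ROC curve. The data and the risk factor are synthetic assumptions.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)

# Synthetic single risk factor (e.g., a blood-pressure-like measure) and outcome labels.
n = 1000
outcome = rng.random(n) < 0.1                           # ~10% events
risk_factor = rng.normal(130, 15, n) + 20 * outcome     # events shifted upwards

fpr, tpr, thresholds = roc_curve(outcome, risk_factor)
youden_j = tpr - fpr
best = np.argmax(youden_j)

print(f"suggested threshold: {thresholds[best]:.1f}")
print(f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```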
Russian guidelines for the management of COPD: algorithm of pharmacologic treatment
Aisanov, Zaurbek; Avdeev, Sergey; Arkhipov, Vladimir; Belevskiy, Andrey; Chuchalin, Alexander; Leshchenko, Igor; Ovcharenko, Svetlana; Shmelev, Evgeny; Miravitlles, Marc
2018-01-01
The high prevalence of COPD together with its high level of misdiagnosis and late diagnosis dictate the necessity for the development and implementation of clinical practice guidelines (CPGs) in order to improve the management of this disease. High-quality, evidence-based international CPGs need to be adapted to the particular situation of each country or region. A new version of the Russian Respiratory Society guidelines released at the end of 2016 was based on the proposal by Global Initiative for Obstructive Lung Disease but adapted to the characteristics of the Russian health system and included an algorithm of pharmacologic treatment of COPD. The proposed algorithm had to comply with the requirements of the Russian Ministry of Health to be included into the unified electronic rubricator, which required a balance between the level of information and the simplicity of the graphic design. This was achieved by: exclusion of the initial diagnostic process, grouping together the common pharmacologic and nonpharmacologic measures for all patients, and the decision not to use the letters A–D for simplicity and clarity. At all stages of the treatment algorithm, efficacy and safety have to be carefully assessed. Escalation and de-escalation is possible in the case of lack of or insufficient efficacy or safety issues. Bronchodilators should not be discontinued except in the case of significant side effects. At the same time, inhaled corticosteroid (ICS) withdrawal is not represented in the algorithm, because it was agreed that there is insufficient evidence to establish clear criteria for ICSs discontinuation. Finally, based on the Global Initiative for Obstructive Lung Disease statement, the proposed algorithm reflects and summarizes different approaches to the pharmacological treatment of COPD taking into account the reality of health care in the Russian Federation. PMID:29386887
Kamali, Tahereh; Stashuk, Daniel
2016-10-01
Robust and accurate segmentation of brain white matter (WM) fiber bundles assists in diagnosing and assessing progression or remission of neuropsychiatric diseases such as schizophrenia, autism and depression. Supervised segmentation methods are infeasible in most applications since generating gold standards is too costly. Hence, there is a growing interest in designing unsupervised methods. However, most conventional unsupervised methods require the number of clusters be known in advance which is not possible in most applications. The purpose of this study is to design an unsupervised segmentation algorithm for brain white matter fiber bundles which can automatically segment fiber bundles using intrinsic diffusion tensor imaging data information without considering any prior information or assumption about data distributions. Here, a new density based clustering algorithm called neighborhood distance entropy consistency (NDEC), is proposed which discovers natural clusters within data by simultaneously utilizing both local and global density information. The performance of NDEC is compared with other state of the art clustering algorithms including chameleon, spectral clustering, DBSCAN and k-means using Johns Hopkins University publicly available diffusion tensor imaging data. The performance of NDEC and other employed clustering algorithms were evaluated using dice ratio as an external evaluation criteria and density based clustering validation (DBCV) index as an internal evaluation metric. Across all employed clustering algorithms, NDEC obtained the highest average dice ratio (0.94) and DBCV value (0.71). NDEC can find clusters with arbitrary shapes and densities and consequently can be used for WM fiber bundle segmentation where there is no distinct boundary between various bundles. NDEC may also be used as an effective tool in other pattern recognition and medical diagnostic systems in which discovering natural clusters within data is a necessity. Copyright © 2016 Elsevier B.V. All rights reserved.
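NDEC itself is not publicly packaged, but one of the comparison methods named above, DBSCAN, illustrates the density-based clustering idea: points (here, simple per-streamline descriptors) in dense neighbourhoods are grouped without specifying the number of clusters in advance. The streamline descriptor, the synthetic bundles and the eps/min_samples settings below are illustrative assumptions, not the study's pipeline.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def streamline_descriptor(streamline):
    """Very simple per-streamline feature: start point, midpoint and end point (9 values)."""
    return np.concatenate([streamline[0], streamline[len(streamline) // 2], streamline[-1]])

def make_bundle(offset, n=100):
    """Synthetic bundle: noisy straight streamlines of 20 points in 3-D."""
    t = np.linspace(0, 1, 20)[:, None]
    return [offset + t * np.array([50.0, 0.0, 0.0]) + rng.normal(0, 1.0, (20, 3))
            for _ in range(n)]

streamlines = make_bundle(np.array([0.0, 0.0, 0.0])) + make_bundle(np.array([0.0, 30.0, 0.0]))
X = StandardScaler().fit_transform(np.array([streamline_descriptor(s) for s in streamlines]))

labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(X)   # -1 marks noise/outliers
print("clusters found:", sorted(set(labels)))
```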
Heller, Monika D; Roots, Kurt; Srivastava, Sanjana; Schumann, Jennifer; Srivastava, Jaideep; Hale, T Sigi
2013-10-01
Attention deficit hyperactivity disorder (ADHD) is found in 9.5 percent of the U.S. population and poses lifelong challenges. Current diagnostic approaches rely on evaluation forms completed by teachers and/or parents, although they are not specifically trained to recognize cognitive disorders. The most accurate diagnosis is by a psychiatrist, often only available to children with severe symptoms. Development of a tool that is engaging and objective and aids medical providers is needed in the diagnosis of ADHD. The goal of this research is to work toward the development of such a tool. The proposed approach takes advantage of two trends: The rapid adoption of tangible user interface devices and the popularity of interactive videogames. CogCubed Inc. (Minneapolis, MN) has created "Groundskeeper," a game on the Sifteo Cubes (Sifteo, Inc., San Francisco, CA) game system with elements that exercise skills affected by ADHD. "Groundskeeper" was evaluated for 52 patients, with and without ADHD. Gameplay data were mathematically transformed into ADHD-indicative feature variables and subjected to machine learning algorithms to develop diagnostic models to aid psychiatric clinical assessments of ADHD. The effectiveness of the developed model was evaluated against the diagnostic impressions of two licensed child/adolescent psychiatrists using semistructured interviews. Our predictive algorithms were highly accurate in correctly predicting diagnoses based on gameplay of "Groundskeeper." The F-measure, a measure of diagnosis accuracy, from the predictive models gave values as follows: ADHD, inattentive type, 78 percent (P>0.05); ADHD, combined type, 75 percent (P<0.05); anxiety disorders, 71%; and depressive disorders, 76%. This represents a promising new approach to screening tools for ADHD.
voomDDA: discovery of diagnostic biomarkers and classification of RNA-seq data.
Zararsiz, Gokmen; Goksuluk, Dincer; Klaus, Bernd; Korkmaz, Selcuk; Eldem, Vahap; Karabulut, Erdem; Ozturk, Ahmet
2017-01-01
RNA-Seq is a recent and efficient technique that uses the capabilities of next-generation sequencing technology for characterizing and quantifying transcriptomes. One important task using gene-expression data is to identify a small subset of genes that can be used to build diagnostic classifiers particularly for cancer diseases. Microarray based classifiers are not directly applicable to RNA-Seq data due to its discrete nature. Overdispersion is another problem that requires careful modeling of mean and variance relationship of the RNA-Seq data. In this study, we present voomDDA classifiers: variance modeling at the observational level (voom) extensions of the nearest shrunken centroids (NSC) and the diagonal discriminant classifiers. VoomNSC is one of these classifiers and brings voom and NSC approaches together for the purpose of gene-expression based classification. For this purpose, we propose weighted statistics and put these weighted statistics into the NSC algorithm. The VoomNSC is a sparse classifier that models the mean-variance relationship using the voom method and incorporates voom's precision weights into the NSC classifier via weighted statistics. A comprehensive simulation study was designed and four real datasets are used for performance assessment. The overall results indicate that voomNSC performs as the sparsest classifier. It also provides the most accurate results together with power-transformed Poisson linear discriminant analysis, rlog transformed support vector machines and random forests algorithms. In addition to prediction purposes, the voomNSC classifier can be used to identify the potential diagnostic biomarkers for a condition of interest. Through this work, statistical learning methods proposed for microarrays can be reused for RNA-Seq data. An interactive web application is freely available at http://www.biosoft.hacettepe.edu.tr/voomDDA/.
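The voom weighting is an R/Bioconductor method and is not reproduced here; as a rough Python illustration of the nearest-shrunken-centroids (NSC) component applied to log-transformed counts, scikit-learn's NearestCentroid with a shrink_threshold can be used. The synthetic count matrix, the log2(count+1) transform and the shrinkage value are assumptions standing in for the voom-weighted statistics.

```python
import numpy as np
from sklearn.neighbors import NearestCentroid
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic RNA-seq-like count matrix: 200 samples x 500 genes, 20 genes differential.
n_samples, n_genes = 200, 500
y = rng.integers(0, 2, n_samples)                        # two conditions
base = rng.poisson(20, size=(n_samples, n_genes)).astype(float)
base[y == 1, :20] *= 3                                   # class-1 up-regulated genes
X = np.log2(base + 1)                                    # simple variance-stabilising transform

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Nearest shrunken centroids: shrink_threshold drives sparse gene selection.
clf = NearestCentroid(shrink_threshold=0.5).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```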
NASA Astrophysics Data System (ADS)
Abernethy, Jennifer A.
Pilots' ability to avoid clear-air turbulence (CAT) during flight affects the safety of the millions of people who fly commercial airlines and other aircraft, and turbulence costs millions in injuries and aircraft maintenance every year. Forecasting CAT is not straightforward, however; microscale features like the turbulence eddies that affect aircraft (~100 m) are below the current resolution of operational numerical weather prediction (NWP) models, and the only evidence of CAT episodes, until recently, has been sparse, subjective reports from pilots known as PIREPs. To forecast CAT, researchers use a simple weighted sum of top-performing turbulence indicators derived from NWP model outputs (termed diagnostics), based on their agreement with current PIREPs. However, a new, quantitative source of observation data (high-density measurements made by sensor equipment and software on aircraft, called in-situ measurements) is now available. The main goal of this thesis is to develop new data analysis and processing techniques to apply to the model and new observation data, in order to improve CAT forecasting accuracy. This thesis shows that using in-situ data improves forecasting accuracy and that automated machine learning algorithms such as support vector machines (SVM), logistic regression, and random forests can match current performance while eliminating almost all hand-tuning. Feature subset selection is paired with the new algorithms to choose diagnostics that predict well as a group rather than individually. Specializing forecasts and choice of diagnostics by geographic region further improves accuracy because of the geographic variation in turbulence sources. This work uses random forests to find climatologically-relevant regions based on these variations and implements a forecasting system testbed which brings these techniques together to rapidly prototype new, regionalized versions of operational CAT forecasting systems.
NASA Astrophysics Data System (ADS)
Liu, Xingchen; Hu, Zhiyong; He, Qingbo; Zhang, Shangbin; Zhu, Jun
2017-10-01
Doppler distortion and background noise can reduce the effectiveness of wayside acoustic train bearing monitoring and fault diagnosis. This paper proposes a method combining a microphone array and a matching pursuit algorithm to overcome these difficulties. First, a dictionary is constructed based on the characteristics and mechanism of the far-field assumption. Then, the angle of arrival of the train bearing is acquired by applying matching pursuit to analyze the acoustic array signals. Finally, after obtaining the resampled time series, the Doppler distortion can be corrected, which is convenient for further diagnostic work. Compared with traditional single-microphone Doppler correction methods, the advantages of the presented array method are its robustness to background noise and the fact that it requires almost no pre-measured parameters. Simulation and experimental studies show that the proposed method is effective in performing wayside acoustic bearing fault diagnosis.
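Matching pursuit itself is straightforward to sketch: greedily pick the dictionary atom with the largest correlation to the residual, subtract its contribution, and repeat. The sketch below uses a generic Gabor-like dictionary on a synthetic signal; it is not the paper's Doppler-parameterised dictionary or array processing chain, and the atom parameters are assumptions.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=5):
    """Greedy sparse decomposition: signal ≈ sum of selected unit-norm atoms times coefficients."""
    residual = signal.astype(float).copy()
    selection = []
    for _ in range(n_atoms):
        correlations = dictionary @ residual          # atoms are rows, unit-normalised
        k = int(np.argmax(np.abs(correlations)))
        coeff = correlations[k]
        residual -= coeff * dictionary[k]
        selection.append((k, coeff))
    return selection, residual

# Build a small dictionary of unit-norm Gabor-like atoms (assumed form, for illustration).
n = 512
t = np.arange(n)
atoms = []
for f in (0.02, 0.05, 0.08, 0.12):
    for center in (128, 256, 384):
        atom = np.exp(-0.5 * ((t - center) / 40.0) ** 2) * np.cos(2 * np.pi * f * t)
        atoms.append(atom / np.linalg.norm(atom))
D = np.array(atoms)

# Synthetic signal: two atoms plus noise.
sig = 3.0 * D[1] + 1.5 * D[8] + 0.1 * np.random.default_rng(0).standard_normal(n)
picked, res = matching_pursuit(sig, D, n_atoms=3)
print("selected atoms:", [(k, round(float(c), 2)) for k, c in picked])
print("residual energy:", float(np.sum(res ** 2)))
```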
Evaluation of the WHO criteria for the classification of patients with mastocytosis.
Sánchez-Muñoz, Laura; Alvarez-Twose, Ivan; García-Montero, Andrés C; Teodosio, Cristina; Jara-Acevedo, María; Pedreira, Carlos E; Matito, Almudena; Morgado, Jose Mario T; Sánchez, Maria Luz; Mollejo, Manuela; Gonzalez-de-Olano, David; Orfao, Alberto; Escribano, Luis
2011-09-01
Diagnosis and classification of mastocytosis is currently based on the World Health Organization (WHO) criteria. Here, we evaluate the utility of the WHO criteria for the diagnosis and classification of a large series of mastocytosis patients (n=133), and propose a new algorithm that could be routinely applied for refined diagnosis and classification of the disease. Our results confirm the utility of the WHO criteria and provide evidence for the need of additional information for (1) a more precise diagnosis of mastocytosis, (2) specific identification of new forms of the disease, (3) the differential diagnosis between cutaneous mastocytosis vs systemic mastocytosis, and (4) improved distinction between indolent systemic mastocytosis and aggressive systemic mastocytosis. Based on our results, a new algorithm is proposed for a better diagnostic definition and prognostic classification of mastocytosis, as confirmed prospectively in an independent validation series of 117 mastocytosis patients.
NASA Astrophysics Data System (ADS)
Huang, Shaohua; Wang, Lan; Chen, Weisheng; Feng, Shangyuan; Lin, Juqiang; Huang, Zufang; Chen, Guannan; Li, Buhong; Chen, Rong
2014-11-01
Non-invasive esophagus cancer detection based on urine surface-enhanced Raman spectroscopy (SERS) analysis was presented. Urine SERS spectra were measured on esophagus cancer patients (n = 56) and healthy volunteers (n = 36) for control analysis. Tentative assignments of the urine SERS spectra indicated some interesting esophagus cancer-specific biomolecular changes, including a decrease in the relative content of urea and an increase in the percentage of uric acid in the urine of esophagus cancer patients compared to that of healthy subjects. Principal component analysis (PCA) combined with linear discriminant analysis (LDA) was employed to analyze and differentiate the SERS spectra between normal and esophagus cancer urine. The diagnostic algorithms utilizing a multivariate analysis method achieved a diagnostic sensitivity of 89.3% and specificity of 83.3% for separating esophagus cancer samples from normal urine samples. These results from the explorative work suggested that silver nanoparticle-based urine SERS analysis coupled with PCA-LDA multivariate analysis has potential for non-invasive detection of esophagus cancer.
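The PCA-LDA step of such a spectral classifier can be sketched generically as below: reduce each spectrum to a few principal components, fit linear discriminant analysis, and report cross-validated sensitivity and specificity. The spectra here are synthetic and the number of components is an assumption, not the study's model.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Synthetic "spectra": 92 samples x 800 wavenumber bins, with a small class difference.
n_cancer, n_normal, n_bins = 56, 36, 800
X = rng.normal(0, 1, size=(n_cancer + n_normal, n_bins))
X[:n_cancer, 300:320] += 0.6          # e.g., an elevated band in the cancer group
y = np.array([1] * n_cancer + [0] * n_normal)

model = make_pipeline(StandardScaler(), PCA(n_components=10), LinearDiscriminantAnalysis())
pred = cross_val_predict(model, X, y, cv=5)

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.3f}, specificity = {tn / (tn + fp):.3f}")
```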
A new statistical PCA-ICA algorithm for location of R-peaks in ECG.
Chawla, M P S; Verma, H K; Kumar, Vinod
2008-09-16
The success of independent component analysis (ICA) in separating the independent components from the mixture depends on the properties of the electrocardiogram (ECG) recordings. This paper discusses some of the conditions of ICA that could affect the reliability of the separation and evaluates issues related to the properties of the signals and the number of sources. Principal component analysis (PCA) scatter plots are used to indicate the diagnostic features in the presence and absence of baseline wander when interpreting the ECG signals. In this analysis, a statistical algorithm newly developed by the authors, based on combined PCA-ICA for two correlated channels of 12-channel ECG data, is proposed. The ICA technique has been successfully implemented in identifying and removing noise and artifacts from ECG signals. Cleaned ECG signals are obtained using statistical measures such as kurtosis and variance of variance after ICA processing. This paper also deals with the detection of QRS complexes in electrocardiograms using the combined PCA-ICA algorithm. The efficacy of the combined PCA-ICA algorithm lies in the fact that the location of the R-peaks is bounded from above and below by the location of the cross-over points; hence none of the peaks are ignored or missed.
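A generic two-channel PCA-ICA separation can be sketched with scikit-learn's FastICA. This is not the authors' exact statistical algorithm: the kurtosis/variance-of-variance selection rules are simplified here to picking the most kurtotic component, and the synthetic "ECG" mixtures are only for illustration.

```python
import numpy as np
from scipy.stats import kurtosis
from scipy.signal import find_peaks
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
fs, dur = 360, 10
t = np.arange(0, dur, 1 / fs)

# Synthetic sources: a spiky "ECG-like" train and a smooth baseline-wander source.
ecg_like = np.where((t % 0.8) < 0.03, 1.0, 0.0)
baseline = 0.5 * np.sin(2 * np.pi * 0.3 * t)
S = np.column_stack([ecg_like, baseline])

# Two correlated observed channels = mixtures of the sources plus noise.
A = np.array([[1.0, 0.8], [0.6, 1.0]])
X = S @ A.T + 0.02 * rng.standard_normal((t.size, 2))

# PCA whitening followed by ICA unmixing.
X_white = PCA(n_components=2, whiten=True).fit_transform(X)
components = FastICA(n_components=2, random_state=0).fit_transform(X_white)

# Pick the most kurtotic (spikiest) component as the QRS-bearing one and locate R peaks.
qrs = components[:, np.argmax(kurtosis(components, axis=0))]
qrs *= np.sign(np.abs(qrs.max()) - np.abs(qrs.min()))     # fix ICA sign ambiguity
peaks, _ = find_peaks(qrs, height=0.5 * qrs.max(), distance=int(0.4 * fs))
print("R peaks found:", len(peaks), "(expected ~13)")
```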
Misdiagnosed HIV infection in pregnant women initiating universal ART in South Africa.
Hsiao, Nei-Yuan; Zerbe, Allison; Phillips, Tamsin K; Myer, Landon; Abrams, Elaine J
2017-08-29
Rapid diagnostic tests (RDTs) are the primary diagnostic tools for HIV used in resource-constrained settings. Without a proper confirmation algorithm, there is concern that false-positive (FP) RDTs could result in misdiagnosis of HIV infection and inappropriate antiretroviral treatment (ART) initiation, but programmatic data on FP are few. We examined the accuracy of RDT diagnosis among HIV-infected pregnant women attending public sector antenatal services in Cape Town, South Africa. We describe the proportion of women found to have started on ART erroneously due to FP RDT results based on pre-ART viral load (VL) testing and enzyme-linked immunosorbent assay (ELISA). We analysed 952 consecutively enrolled pregnant women diagnosed as HIV infected based on two RDTs per local guideline and found 4.5% (43/952) of pre-ART VL results to be <50 copies/ml. After excluding 6 women who had detectable virus on subsequent VL measurements, ELISA was performed on the 37 remaining women. Of these, 3/952 (0.3%) HIV RDT diagnoses were found to be FP. We estimate that using ELISA to confirm all positive RDTs would cost $1110 (uncertainty interval $381-$5382) to identify one patient erroneously initiated on ART, while it costs $3912 for a lifetime of antiretrovirals with VL monitoring for one person. Compared to the cost of confirming the RDT-based diagnoses, the cost of HIV misdiagnosis is high. While testing programmes based on RDT should strive for constant quality improvement, where resources permit, laboratory confirmation algorithms can play an important role in strengthening the quality of HIV diagnosis in the era of universal ART.
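The cost comparison reduces to simple arithmetic: (number of positive RDTs confirmed × unit cost of ELISA) divided by the number of erroneous ART initiations detected, set against the lifetime ART cost. The unit ELISA cost below is back-calculated from the abstract's figures and is an assumption, not a reported price.

```python
# Hedged back-of-the-envelope reproduction of the cost reasoning (unit cost is assumed).
n_rdt_positive = 952          # women whose positive RDTs would be confirmed with ELISA
n_erroneous_art = 3           # false-positive RDT diagnoses found in this cohort
elisa_unit_cost = 3.50        # assumed USD per ELISA, chosen to roughly match the $1110 figure

cost_per_misdiagnosis_found = n_rdt_positive * elisa_unit_cost / n_erroneous_art
lifetime_art_cost = 3912.0    # USD, as quoted in the abstract

print(f"confirmation cost per misdiagnosis identified: ${cost_per_misdiagnosis_found:,.0f}")
print(f"lifetime ART + VL monitoring cost avoided:     ${lifetime_art_cost:,.0f}")
```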
Neuroanatomical features in soldiers with post-traumatic stress disorder.
Sussman, D; Pang, E W; Jetly, R; Dunkley, B T; Taylor, M J
2016-03-31
Posttraumatic stress disorder (PTSD), an anxiety disorder that can develop after exposure to psychological trauma, impacts up to 20 % of soldiers returning from combat-related deployment. Advanced neuroimaging holds diagnostic and prognostic potential for furthering our understanding of its etiology. Previous imaging studies on combat-related PTSD have focused on selected structures, such as the hippocampi and cortex, but none conducted a comprehensive examination of both the cerebrum and cerebellum. The present study provides a complete analysis of cortical, subcortical, and cerebellar anatomy in a single cohort. Forty-seven magnetic resonance images (MRIs) were collected from 24 soldiers with PTSD and 23 Control soldiers. Each image was segmented into 78 cortical brain regions and 81,924 vertices using the corticometric iterative vertex based estimation of thickness algorithm, allowing for both a region-based and a vertex-based cortical analysis, respectively. Subcortical volumetric analyses of the hippocampi, cerebellum, thalamus, globus pallidus, caudate, putamen, and many sub-regions were conducted following their segmentation using Multiple Automatically Generated Templates Brain algorithm. Participants with PTSD were found to have reduced cortical thickness, primarily in the frontal and temporal lobes, with no preference for laterality. The region-based analyses further revealed localized thinning as well as thickening in several sub-regions. These results were accompanied by decreased volumes of the caudate and right hippocampus, as computed relative to total cerebral volume. Enlargement in several cerebellar lobules (relative to total cerebellar volume) was also observed in the PTSD group. These data highlight the distributed structural differences between soldiers with and without PTSD, and emphasize the diagnostic potential of high-resolution MRI.
School-Based Screening for Suicide Risk: Balancing Costs and Benefits
Wilcox, Holly; Huo, Yanling; Turner, J. Blake; Fisher, Prudence; Shaffer, David
2010-01-01
Objectives. We examined the effects of a scoring algorithm change on the burden and sensitivity of a screen for adolescent suicide risk. Methods. The Columbia Suicide Screen was used to screen 641 high school students for high suicide risk (recent ideation or lifetime attempt and depression, or anxiety, or substance use), determined by subsequent blind assessment with the Diagnostic Interview Schedule for Children. We compared the accuracy of different screen algorithms in identifying high-risk cases. Results. A screen algorithm comprising recent ideation or lifetime attempt or depression, anxiety, or substance-use problems set at moderate-severity level classed 35% of students as positive and identified 96% of high-risk students. Increasing the algorithm's threshold reduced the proportion identified to 24% and identified 92% of high-risk cases. Asking only about recent suicidal ideation or lifetime suicide attempt identified 17% of the students and 89% of high-risk cases. The proportion of nonsuicidal diagnosis–bearing students found with the 3 algorithms was 62%, 34%, and 12%, respectively. Conclusions. The Columbia Suicide Screen threshold can be altered to reduce the screen-positive population, saving costs and time while identifying almost all students at high risk for suicide. PMID:20634467
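The trade-off described here, screen-positive burden versus sensitivity for high-risk cases, can be computed directly once each algorithm is expressed as a boolean rule over the screen items. The sketch below does this for three illustrative rules on synthetic data; the item names, prevalences and the "true" high-risk definition are assumptions, not the Columbia Suicide Screen items or the study's data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 641

# Synthetic screen responses (illustrative prevalences only).
df = pd.DataFrame({
    "recent_ideation":  rng.random(n) < 0.10,
    "lifetime_attempt": rng.random(n) < 0.06,
    "depression":       rng.integers(0, 4, n),   # 0-3 severity score
    "anxiety":          rng.integers(0, 4, n),
    "substance_use":    rng.integers(0, 4, n),
})
# Synthetic "true" high-risk status standing in for the blinded diagnostic interview.
high_risk = ((df["recent_ideation"] | df["lifetime_attempt"]) & (rng.random(n) < 0.7)) \
            | ((df["depression"] >= 3) & (rng.random(n) < 0.3))

algorithms = {
    "broad (any item, moderate)": (df["recent_ideation"] | df["lifetime_attempt"]
                                   | (df[["depression", "anxiety", "substance_use"]] >= 2).any(axis=1)),
    "stricter threshold":         (df["recent_ideation"] | df["lifetime_attempt"]
                                   | (df[["depression", "anxiety", "substance_use"]] >= 3).any(axis=1)),
    "ideation/attempt only":      (df["recent_ideation"] | df["lifetime_attempt"]),
}

for name, positive in algorithms.items():
    burden = positive.mean()                                   # proportion screened positive
    sensitivity = (positive & high_risk).sum() / high_risk.sum()
    print(f"{name:28s} screen-positive: {burden:5.1%}  sensitivity: {sensitivity:5.1%}")
```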
Chatzistamatiou, Kimon; Moysiadis, Theodoros; Moschaki, Viktoria; Panteleris, Nikolaos; Agorastos, Theodoros
2016-07-01
The objective of the present study was to identify the most effective cervical cancer screening algorithm incorporating different combinations of cytology, HPV testing and genotyping. Women 25-55 years old recruited for the "HERMES" (HEllenic Real life Multicentric cErvical Screening) study were screened in terms of cytology and high-risk (hr) HPV testing with HPV 16/18 genotyping. Women positive for cytology and/or hrHPV were referred for colposcopy, biopsy and treatment. Ten screening algorithms based on different combinations of cytology, HPV testing and HPV 16/18 genotyping were investigated in terms of diagnostic accuracy. Three clusters of algorithms were formed according to the balance between effectiveness and harm caused by screening. The cluster showing the best balance included two algorithms based on co-testing and two based on HPV primary screening with HPV 16/18 genotyping. Among these, hrHPV testing with HPV 16/18 genotyping and reflex cytology (atypical squamous cells of undetermined significance - ASCUS threshold) presented the optimal combination of sensitivity (82.9%) and specificity relative to cytology alone (0.99), with a false-positive rate of 1.26 relative to cytology alone. HPV testing with HPV 16/18 genotyping, referring HPV 16/18 positive women directly to colposcopy, and hrHPV (non 16/18) positive women to reflex cytology (ASCUS threshold) as a triage method to colposcopy, reflects the best equilibrium between screening effectiveness and harm. Algorithms based on cytology as the initial screening method, on co-testing or HPV primary screening without genotyping, and on HPV primary screening with genotyping but without cytology triage are not supported according to the present analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
Cremers, Charlotte H P; Dankbaar, Jan Willem; Vergouwen, Mervyn D I; Vos, Pieter C; Bennink, Edwin; Rinkel, Gabriel J E; Velthuis, Birgitta K; van der Schaaf, Irene C
2015-05-01
Tracer delay-sensitive perfusion algorithms in CT perfusion (CTP) result in an overestimation of the extent of ischemia in thromboembolic stroke. In diagnosing delayed cerebral ischemia (DCI) after aneurysmal subarachnoid hemorrhage (aSAH), delayed arrival of contrast due to vasospasm may also overestimate the extent of ischemia. We investigated the diagnostic accuracy of tracer delay-sensitive and tracer delay-insensitive algorithms for detecting DCI. From a prospectively collected series of aSAH patients admitted between 2007-2011, we included patients with any clinical deterioration other than rebleeding within 21 days after SAH who underwent NCCT/CTP/CTA imaging. Causes of clinical deterioration were categorized into DCI and no DCI. CTP maps were calculated with tracer delay-sensitive and tracer delay-insensitive algorithms and were visually assessed for the presence of perfusion deficits by two independent observers with different levels of experience. The diagnostic value of both algorithms was calculated for both observers. Seventy-one patients were included. For the experienced observer, the positive predictive values (PPVs) were 0.67 for the delay-sensitive and 0.66 for the delay-insensitive algorithm, and the negative predictive values (NPVs) were 0.73 and 0.74. For the less experienced observer, PPVs were 0.60 for both algorithms, and NPVs were 0.66 for the delay-sensitive and 0.63 for the delay-insensitive algorithm. Test characteristics are comparable for tracer delay-sensitive and tracer delay-insensitive algorithms for the visual assessment of CTP in diagnosing DCI. This indicates that both algorithms can be used for this purpose.
Rothermundt, Christian; Bailey, Alexandra; Cerbone, Linda; Eisen, Tim; Escudier, Bernard; Gillessen, Silke; Grünwald, Viktor; Larkin, James; McDermott, David; Oldenburg, Jan; Porta, Camillo; Rini, Brian; Schmidinger, Manuela; Sternberg, Cora; Putora, Paul M
2015-09-01
With the advent of targeted therapies, many treatment options in the first-line setting of metastatic clear cell renal cell carcinoma (mccRCC) have emerged. Guidelines and randomized trial reports usually do not elucidate the decision criteria for the different treatment options. In order to extract the decision criteria for the optimal therapy for patients, we performed an analysis of treatment algorithms from experts in the field. Treatment algorithms for the treatment of mccRCC from experts of 11 institutions were obtained, and decision trees were deduced. Treatment options were identified and a list of unified decision criteria determined. The final decision trees were analyzed with a methodology based on diagnostic nodes, which allows for an automated cross-comparison of decision trees. The most common treatment recommendations were determined, and areas of discordance were identified. The analysis revealed heterogeneity in most clinical scenarios. The recommendations selected for first-line treatment of mccRCC included sunitinib, pazopanib, temsirolimus, interferon-α combined with bevacizumab, high-dose interleukin-2, sorafenib, axitinib, everolimus, and best supportive care. The criteria relevant for treatment decisions were performance status, Memorial Sloan Kettering Cancer Center risk group, only or mainly lung metastases, cardiac insufficiency, hepatic insufficiency, age, and "zugzwang" (composite of multiple, related criteria). In the present study, we used diagnostic nodes to compare treatment algorithms in the first-line treatment of mccRCC. The results illustrate the heterogeneity of the decision criteria and treatment strategies for mccRCC and how available data are interpreted and implemented differently among experts. The data provided in the present report should not be considered to serve as treatment recommendations for the management of treatment-naïve patients with multiple metastases from metastatic clear cell renal cell carcinoma outside a clinical trial; however, the data highlight the different treatment options and the criteria used to select them. The diversity in decision making and how results from phase III trials can be interpreted and implemented differently in daily practice are demonstrated. ©AlphaMed Press.
ERIC Educational Resources Information Center
Hus, Vanessa; Lord, Catherine
2013-01-01
The Autism Diagnostic Interview-Revised (ADI-R) is commonly used to inform diagnoses of autism spectrum disorders (ASD). Considering the time dedicated to using the ADI-R, it is of interest to expand the ways in which information obtained from this interview is used. The current study examines how algorithm totals reflecting past (ADI-Diagnostic)…
Development of a novel diagnostic algorithm to predict NASH in HCV-positive patients.
Gallotta, Andrea; Paneghetti, Laura; Mrázová, Viera; Bednárová, Adriana; Kružlicová, Dáša; Frecer, Vladimir; Miertus, Stanislav; Biasiolo, Alessandra; Martini, Andrea; Pontisso, Patrizia; Fassina, Giorgio
2018-05-01
Non-alcoholic steatohepatitis (NASH) is a severe disease characterised by liver inflammation and progressive hepatic fibrosis, which may progress to cirrhosis and hepatocellular carcinoma. Clinical evidence suggests that in hepatitis C virus patients, steatosis and NASH are associated with faster fibrosis progression and hepatocellular carcinoma. A safe and reliable non-invasive diagnostic method to detect NASH at its early stages is still needed to prevent progression of the disease. We prospectively enrolled 91 hepatitis C virus-positive patients with histologically proven chronic liver disease: 77 patients were included in our study; of these, 10 had NASH. For each patient, various clinical and serological variables were collected. Different algorithms combining squamous cell carcinoma antigen-immunoglobulin-M (SCCA-IgM) levels with other common clinical data were created to provide the probability of having NASH. Our analysis revealed a statistically significant correlation between the histological presence of NASH and SCCA-IgM, insulin, homeostasis model assessment, haemoglobin, high-density lipoprotein and ferritin levels, and smoking. Compared to the use of a single marker, algorithms that combined four, six or seven variables identified NASH with higher accuracy. The best diagnostic performance was obtained with the logistic regression combination, which included all seven variables correlated with NASH. The combination of SCCA-IgM with common clinical data shows promising diagnostic performance for the detection of NASH in hepatitis C virus patients.
Grigoryan, Artyom M; Dougherty, Edward R; Kononen, Juha; Bubendorf, Lukas; Hostetter, Galen; Kallioniemi, Olli
2002-01-01
Fluorescence in situ hybridization (FISH) is a molecular diagnostic technique in which a fluorescent labeled probe hybridizes to a target nucleotide sequence of deoxyribose nucleic acid. Upon excitation, each chromosome containing the target sequence produces a fluorescent signal (spot). Because fluorescent spot counting is tedious and often subjective, automated digital algorithms to count spots are desirable. New technology provides a stack of images on multiple focal planes throughout a tissue sample. Multiple-focal-plane imaging helps overcome the biases and imprecision inherent in single-focal-plane methods. This paper proposes an algorithm for global spot counting in stacked three-dimensional slice FISH images without the necessity of nuclei segmentation. It is designed to work in complex backgrounds, when there are agglomerated nuclei, and in the presence of illumination gradients. It is based on the morphological top-hat transform, which locates intensity spikes on irregular backgrounds. After finding signals in the slice images, the algorithm groups these together to form three-dimensional spots. Filters are employed to separate legitimate spots from fluorescent noise. The algorithm is set in a comprehensive toolbox that provides visualization and analytic facilities. It includes simulation software that allows examination of algorithm performance for various image and algorithm parameter settings, including signal size, signal density, and the number of slices.
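The spot-detection step described above can be illustrated with a minimal sketch (not the authors' toolbox): a morphological white top-hat isolates small bright spikes from an irregular background in a single focal-plane slice, after which connected components are taken as candidate signals. The library calls are scikit-image; the spot radius and threshold below are illustrative assumptions.

```python
# Minimal sketch of top-hat based spot detection on a single FISH slice.
# Assumes a grayscale slice as a 2-D float array; parameters are illustrative.
import numpy as np
from skimage.morphology import white_tophat, disk
from skimage.measure import label, regionprops

def detect_spots(slice_img, spot_radius=3, rel_threshold=0.3):
    """Return centroids of candidate spots in one focal-plane slice."""
    # White top-hat keeps structures smaller than the structuring element,
    # suppressing the irregular background (illumination gradients, nuclei).
    tophat = white_tophat(slice_img, footprint=disk(spot_radius))
    # Simple relative threshold on the residual image (illustrative choice).
    mask = tophat > rel_threshold * tophat.max()
    labels = label(mask)
    return [r.centroid for r in regionprops(labels) if r.area >= 2]
```

Per-slice detections would then be linked across neighbouring focal planes to form the three-dimensional spots described in the abstract.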
Stothard, J Russell; Adams, Emily
2014-12-01
There are many reasons why detection of parasites of medical and veterinary importance is vital and where novel diagnostic and surveillance tools are required. From a medical perspective alone, these originate from a desire for better clinical management and rational use of medications. Diagnosis can be at the individual-level, at close to patient settings in testing a clinical suspicion or at the community-level, perhaps in front of a computer screen, in classification of endemic areas and devising appropriate control interventions. Thus diagnostics for parasitic diseases has a broad remit as parasites are not only tied with their definitive hosts but also in some cases with their vectors/intermediate hosts. Application of current diagnostic tools and decision algorithms in sustaining control programmes, or in elimination settings, can be problematic and even ill-fitting. For example in resource-limited settings, are current diagnostic tools sufficiently robust for operational use at scale or are they confounded by on-the-ground realities; are the diagnostic algorithms underlying public health interventions always understood and well-received within communities which are targeted for control? Within this Special Issue (SI) covering a variety of diseases and diagnostic settings some answers are forthcoming. An important theme, however, throughout the SI is to acknowledge that cross-talk and continuous feedback between development and application of diagnostic tests is crucial if they are to be used effectively and appropriately.
New system for digital to analog transformation and reconstruction of 12-lead ECGs.
Kothadia, Roshni; Kulecz, Walter B; Kofman, Igor S; Black, Adam J; Grier, James W; Schlegel, Todd T
2013-01-01
We describe initial validation of a new system for digital to analog conversion (DAC) and reconstruction of 12-lead ECGs. The system utilizes an open and optimized software format with a commensurately optimized DAC hardware configuration to accurately reproduce, from digital files, the original analog electrocardiographic signals of previously instrumented patients. By doing so, the system also ultimately allows for transmission of data collected on one manufacturer's 12-lead ECG hardware/software into that of any other. To initially validate the system, we compared original and post-DAC re-digitized 12-lead ECG data files (∼5-minutes long) in two types of validation studies in 10 patients. The first type quantitatively compared the total waveform voltage differences between the original and re-digitized data while the second type qualitatively compared the automated electrocardiographic diagnostic statements generated by the original versus re-digitized data. The grand-averaged difference in root mean squared voltage between the original and re-digitized data was 20.8 µV per channel when re-digitization involved the same manufacturer's analog to digital converter (ADC) as the original digitization, and 28.4 µV per channel when it involved a different manufacturer's ADC. Automated diagnostic statements generated by the original versus reconstructed data did not differ when using the diagnostic algorithm from the same manufacturer on whose device the original data were collected, and differed only slightly for just 1 of 10 patients when using a third-party diagnostic algorithm throughout. Original analog 12-lead ECG signals can be reconstructed from digital data files with accuracy sufficient for clinical use. Such reconstructions can readily enable automated second opinions for difficult-to-interpret 12-lead ECGs, either locally or remotely through the use of dedicated or cloud-based servers.
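As a small illustration of the comparison metric quoted above, the per-channel root-mean-squared voltage difference between the original and re-digitized recordings can be computed as below; the array names, units, and the assumption of time-aligned, equally sampled signals are ours, not from the paper.

```python
import numpy as np

def rms_difference_per_channel(original, redigitized):
    """Per-channel RMS voltage difference between two aligned 12-lead records.

    Both inputs are arrays of shape (n_samples, 12) in the same units (e.g. microvolts)
    and are assumed to be time-aligned and equally sampled.
    """
    diff = np.asarray(original, dtype=float) - np.asarray(redigitized, dtype=float)
    return np.sqrt(np.mean(diff ** 2, axis=0))  # one value per lead
```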
Goldkorn, Ronen; Goitein, Orly; Ben-Zekery, Sagit; Shlomo, Nir; Narodetsky, Michael; Livne, Moran; Sabbag, Avi; Asher, Elad; Matetzky, Shlomi
2016-01-01
An accelerated diagnostic protocol for evaluating low-risk patients with acute chest pain in a cardiologist-based chest pain unit (CPU) is widely employed today. However, limited data exist regarding the feasibility of such an algorithm for patients with a history of prior coronary artery disease (CAD). The aim of the current study was to assess the feasibility and safety of evaluating patients with a history of prior CAD using an accelerated diagnostic protocol. We evaluated 1,220 consecutive patients presenting with acute chest pain and hospitalized in our CPU. Patients were stratified according to whether they had a history of prior CAD or not. The primary composite outcome was defined as a composite of readmission due to chest pain, acute coronary syndrome, coronary revascularization, or death during a 60-day follow-up period. Overall, 268 (22%) patients had a history of prior CAD. Non-invasive evaluation was performed in 1,112 (91%) patients. While patients with a history of prior CAD had more comorbidities, the two study groups were similar regarding hospitalization rates (9% vs. 13%, p = 0.08), coronary angiography (13% vs. 11%, p = 0.41), and revascularization (6.5% vs. 5.7%, p = 0.8) performed during CPU evaluation. At 60 days, the primary endpoint was observed in 12 (1.6%) and 6 (3.2%) patients without and with a history of prior CAD, respectively (p = 0.836). No deaths were recorded. To conclude, patients with a history of prior CAD can be expeditiously and safely evaluated using an accelerated diagnostic protocol in a CPU, with outcomes not differing from patients without such a history. PMID:27669521
Maximum likelihood phase-retrieval algorithm: applications.
Nahrstedt, D A; Southwell, W H
1984-12-01
The maximum likelihood estimator approach is shown to be effective in determining the wave front aberration in systems involving laser and flow field diagnostics and optical testing. The robustness of the algorithm enables convergence even in cases of severe wave front error and real, nonsymmetrical, obscured amplitude distributions.
Quint, Jennifer K; Müllerova, Hana; DiSantostefano, Rachael L; Forbes, Harriet; Eaton, Susan; Hurst, John R; Davis, Kourtney; Smeeth, Liam
2014-01-01
Objectives The optimal method of identifying people with chronic obstructive pulmonary disease (COPD) from electronic primary care records is not known. We assessed the accuracy of different approaches using the Clinical Practice Research Datalink (CPRD), a UK electronic health record database. Setting 951 participants registered with a CPRD practice in the UK between 1 January 2004 and 31 December 2012. Individuals were selected by ≥1 of 8 algorithms designed to identify people with COPD. General practitioners were sent a brief questionnaire and additional evidence to support a COPD diagnosis was requested. All information received was reviewed independently by two respiratory physicians whose opinion was taken as the gold standard. Primary outcome measure The primary measure of accuracy was the positive predictive value (PPV), the proportion of people identified by each algorithm for whom COPD was confirmed. Results 951 questionnaires were sent and 738 (78%) were returned. After quality control, 696 (73.2%) patients were included in the final analysis. All four algorithms including a specific COPD diagnostic code performed well. Using a diagnostic code alone, the PPV was 86.5% (77.5–92.3%), while requiring a diagnosis plus spirometry plus specific medication gave a slightly higher PPV of 89.4% (80.7–94.5%) but reduced case numbers by 10%. Algorithms without specific diagnostic codes had low PPVs (range 12.2–44.4%). Conclusions Patients with COPD can be accurately identified from UK primary care records using specific diagnostic codes. Requiring spirometry or COPD medications only marginally improved accuracy. The high accuracy applies since the introduction of an incentivised disease register for COPD as part of the Quality and Outcomes Framework in 2004. PMID:25056980
Baltzer, Pascal A T; Dietzel, Matthias; Kaiser, Werner A
2013-08-01
In the face of multiple available diagnostic criteria in MR-mammography (MRM), a practical algorithm for lesion classification is needed. Such an algorithm should be as simple as possible and include only important independent lesion features to differentiate benign from malignant lesions. This investigation aimed to develop a simple classification tree for differential diagnosis in MRM. A total of 1,084 lesions in standardised MRM with subsequent histological verification (648 malignant, 436 benign) were investigated. Seventeen lesion criteria were assessed by 2 readers in consensus. Classification analysis was performed using the chi-squared automatic interaction detection (CHAID) method. Results include the probability for malignancy for every descriptor combination in the classification tree. A classification tree incorporating 5 lesion descriptors with a depth of 3 ramifications (1, root sign; 2, delayed enhancement pattern; 3, border, internal enhancement and oedema) was calculated. Of all 1,084 lesions, 262 (40.4 %) and 106 (24.3 %) could be classified as malignant and benign with an accuracy above 95 %, respectively. Overall diagnostic accuracy was 88.4 %. The classification algorithm reduced the number of categorical descriptors from 17 to 5 (29.4 %), resulting in a high classification accuracy. More than one third of all lesions could be classified with accuracy above 95 %. • A practical algorithm has been developed to classify lesions found in MR-mammography. • A simple decision tree consisting of five criteria reaches high accuracy of 88.4 %. • Unique to this approach, each classification is associated with a diagnostic certainty. • Diagnostic certainty of greater than 95 % is achieved in 34 % of all cases.
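The study built its tree with the CHAID method, which is not available in common Python libraries; as a rough, hypothetical stand-in, a depth-limited CART tree from scikit-learn illustrates the same general idea of a shallow, rule-like classifier over a handful of categorical lesion descriptors. The feature names and data below are invented for illustration only.

```python
# Illustrative stand-in only: the study used CHAID, which is not part of
# scikit-learn; a depth-limited CART tree shows the same general idea of a
# shallow, rule-like classifier built from a few categorical lesion descriptors.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical encoded descriptors (one column per descriptor), 0/1 labels.
X = pd.DataFrame({
    "root_sign": [1, 0, 1, 0, 1, 0, 0, 1],
    "washout_enhancement": [1, 0, 1, 1, 0, 0, 1, 0],
    "irregular_border": [1, 0, 0, 1, 1, 0, 1, 0],
})
y = [1, 0, 1, 1, 1, 0, 1, 0]  # 1 = malignant, 0 = benign (toy labels)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
# Leaf class fractions play the role of the per-branch probability of malignancy.
```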
Sideroudi, Haris; Labiris, Georgios; Georgantzoglou, Kimon; Ntonti, Panagiota; Siganos, Charalambos; Kozobolis, Vassilios
2017-07-01
To develop an algorithm for the Fourier analysis of posterior corneal videokeratographic data and to evaluate the derived parameters in the diagnosis of subclinical keratoconus (SKC) and keratoconus (KC). This was a cross-sectional, observational study that took place at the Eye Institute of Thrace, Democritus University, Greece. Eighty eyes formed the KC group, 55 eyes formed the SKC group and 50 normal eyes populated the control group. A self-developed algorithm in Visual Basic for Microsoft Excel performed a Fourier series harmonic analysis of the posterior corneal sagittal curvature data. The algorithm decomposed the obtained curvatures into a spherical component, regular astigmatism, asymmetry and higher order irregularities for the averaged central 4 mm and for each individual ring separately (1, 2, 3 and 4 mm). The obtained values were evaluated for their diagnostic capacity using receiver operating characteristic (ROC) curves. Logistic regression was attempted for the identification of a combined diagnostic model. Significant differences were detected in regular astigmatism, asymmetry and higher order irregularities among groups. For the SKC group, the parameters with high diagnostic ability (AUC > 90%) were the higher order irregularities, the asymmetry and the regular astigmatism, mainly in the corneal periphery. Higher predictive accuracy was identified using diagnostic models that combined the asymmetry, regular astigmatism and higher order irregularities in the averaged 3 and 4 mm area (AUC: 98.4%, Sensitivity: 91.7% and Specificity: 100%). Fourier decomposition of posterior keratometric data provides parameters with high accuracy in differentiating SKC from normal corneas and should be included in the prompt diagnosis of KC. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.
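A minimal sketch of the underlying harmonic decomposition (not the authors' Excel/Visual Basic implementation): curvature sampled at equal angles around one ring is Fourier-analysed, with the zeroth harmonic read as the spherical component, the first as asymmetry, the second as regular astigmatism, and the remaining harmonics as higher order irregularities. The sampling and normalization choices here are assumptions.

```python
import numpy as np

def ring_fourier_components(curvature):
    """Decompose sagittal curvature sampled at equal angles around one ring.

    Returns (spherical, asymmetry, regular_astigmatism, higher_order) amplitudes,
    following the usual harmonic interpretation of keratometric Fourier analysis.
    """
    k = np.asarray(curvature, dtype=float)
    n = k.size
    c = np.fft.rfft(k) / n                     # one-sided spectrum, normalized
    spherical = c[0].real                      # mean curvature (0th harmonic)
    asymmetry = 2 * np.abs(c[1])               # 1st harmonic: tilt/decentration
    regular_astigmatism = 2 * np.abs(c[2])     # 2nd harmonic
    higher_order = 2 * np.sum(np.abs(c[3:]))   # 3rd and above, summed here
    return spherical, asymmetry, regular_astigmatism, higher_order
```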
Diagnostic algorithm for relapsing acquired demyelinating syndromes in children.
Hacohen, Yael; Mankad, Kshitij; Chong, W K; Barkhof, Frederik; Vincent, Angela; Lim, Ming; Wassmer, Evangeline; Ciccarelli, Olga; Hemingway, Cheryl
2017-07-18
To establish whether children with relapsing acquired demyelinating syndromes (RDS) and myelin oligodendrocyte glycoprotein antibodies (MOG-Ab) show distinctive clinical and radiologic features and to generate a diagnostic algorithm for the main RDS for clinical use. A panel reviewed the clinical characteristics, MOG-Ab and aquaporin-4 (AQP4) Ab, intrathecal oligoclonal bands, and Epstein-Barr virus serology results of 110 children with RDS. A neuroradiologist blinded to the diagnosis scored the MRI scans. Clinical, radiologic, and serologic tests results were compared. The findings showed that 56.4% of children were diagnosed with multiple sclerosis (MS), 25.4% with neuromyelitis optica spectrum disorder (NMOSD), 12.7% with multiphasic disseminated encephalomyelitis (MDEM), and 5.5% with relapsing optic neuritis (RON). Blinded analysis defined baseline MRI as typical of MS in 93.5% of children with MS. Acute disseminated encephalomyelitis presentation was seen only in the non-MS group. Of NMOSD cases, 30.7% were AQP4-Ab positive. MOG-Ab were found in 83.3% of AQP4-Ab-negative NMOSD, 100% of MDEM, and 33.3% of RON. Children with MOG-Ab were younger, were less likely to present with area postrema syndrome, and had lower disability, longer time to relapse, and more cerebellar peduncle lesions than children with AQP4-Ab NMOSD. A diagnostic algorithm applicable to any episode of CNS demyelination leads to 4 main phenotypes: MS, AQP4-Ab NMOSD, MOG-Ab-associated disease, and antibody-negative RDS. Children with MS and AQP4-Ab NMOSD showed features typical of adult cases. Because MOG-Ab-positive children showed notable and distinctive clinical and MRI features, they were grouped into a unified phenotype (MOG-Ab-associated disease), included in a new diagnostic algorithm. © 2017 American Academy of Neurology.
The influence of parental concern on the utility of autism diagnostic instruments.
Havdahl, Karoline Alexandra; Bishop, Somer L; Surén, Pål; Øyen, Anne-Siri; Lord, Catherine; Pickles, Andrew; von Tetzchner, Stephen; Schjølberg, Synnve; Gunnes, Nina; Hornig, Mady; Lipkin, W Ian; Susser, Ezra; Bresnahan, Michaeline; Magnus, Per; Stenberg, Nina; Reichborn-Kjennerud, Ted; Stoltenberg, Camilla
2017-10-01
The parental report-based Autism Diagnostic Interview-Revised (ADI-R) and the clinician observation-based Autism Diagnostic Observation Schedule (ADOS) have been validated primarily in U.S. clinics specialized in autism spectrum disorder (ASD), in which most children are referred by their parents because of ASD concern. This study assessed diagnostic agreement of the ADOS-2 and ADI-R toddler algorithms in a more broadly based sample of 679 toddlers (age 35-47 months) from the Norwegian Mother and Child Cohort. We also examined whether parental concern about ASD influenced instrument performance, comparing toddlers identified based on parental ASD concern (n = 48) and parent-reported signs of developmental problems (screening) without a specific concern about ASD (n = 400). The ADOS cutoffs showed consistently well-balanced sensitivity and specificity. The ADI-R cutoffs demonstrated good specificity, but reduced sensitivity, missing 43% of toddlers whose parents were not specifically concerned about ASD. The ADI-R and ADOS dimensional scores agreed well with clinical diagnoses (area under the curve ≥ 0.85), contributing additively to their prediction. On the ADI-R, different cutoffs were needed according to presence or absence of parental ASD concern, in order to achieve comparable balance of sensitivity and specificity. These results highlight the importance of taking parental concern about ASD into account when interpreting scores from parental report-based instruments such as the ADI-R. While the ADOS cutoffs performed consistently well, the additive contributions of ADI-R and ADOS scores to the prediction of ASD diagnosis underscore the value of combining instruments based on parent accounts and clinician observation in evaluation of ASD. Autism Res 2017, 10: 1672-1686. © 2017 International Society for Autism Research, Wiley Periodicals, Inc. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.
Yadav, Ravi K; Begum, Viquar U; Addepalli, Uday K; Senthil, Sirisha; Garudadri, Chandra S; Rao, Harsha L
2016-02-01
To compare the abilities of retinal nerve fiber layer (RNFL) parameters of variable corneal compensation (VCC) and enhanced corneal compensation (ECC) algorithms of scanning laser polarimetry (GDx) in detecting various severities of glaucoma. Two hundred and eighty-five eyes of 194 subjects from the Longitudinal Glaucoma Evaluation Study who underwent GDx VCC and ECC imaging were evaluated. Abilities of RNFL parameters of GDx VCC and ECC to diagnose glaucoma were compared using area under receiver operating characteristic curves (AUC), sensitivities at fixed specificities, and likelihood ratios. After excluding 5 eyes that failed to satisfy manufacturer-recommended quality parameters with ECC and 68 with VCC, 56 eyes of 41 normal subjects and 161 eyes of 121 glaucoma patients [36 eyes with preperimetric glaucoma, 52 eyes with early (MD>-6 dB), 34 with moderate (MD between -6 and -12 dB), and 39 with severe glaucoma (MD<-12 dB)] were included for the analysis. Inferior RNFL, average RNFL, and nerve fiber indicator parameters showed the best AUCs and sensitivities both with GDx VCC and ECC in diagnosing all severities of glaucoma. AUCs and sensitivities of all RNFL parameters were comparable between the VCC and ECC algorithms (P>0.20 for all comparisons). Likelihood ratios associated with the diagnostic categorization of RNFL parameters were comparable between the VCC and ECC algorithms. In scans satisfying the manufacturer-recommended quality parameters, which were significantly greater with ECC than VCC algorithm, diagnostic abilities of GDx ECC and VCC in glaucoma were similar.
Twenty-four cases of imported zika virus infections diagnosed by molecular methods.
Alejo-Cancho, Izaskun; Torner, Nuria; Oliveira, Inés; Martínez, Ana; Muñoz, José; Jane, Mireia; Gascón, Joaquim; Requena-Méndez, Ana; Vilella, Anna; Marcos, M Ángeles; Pinazo, María Jesús; Gonzalo, Verónica; Rodriguez, Natalia; Martínez, Miguel J
2016-10-01
Zika virus is an emerging flavivirus widely spreading through Latin America. Molecular diagnosis of the infection can be performed using serum, urine and saliva samples, although a well-defined diagnostic algorithm is not yet established. We describe a series of 24 cases of imported zika virus infection into Catalonia (northeastern Spain). Based on our findings, testing of paired serum and urine samples is recommended. Copyright © 2016 Elsevier Inc. All rights reserved.
Diagnosis and sensor validation through knowledge of structure and function
NASA Technical Reports Server (NTRS)
Scarl, Ethan A.; Jamieson, John R.; Delaune, Carl I.
1987-01-01
The liquid oxygen expert system 'LES' is proposed as the first capable of diagnostic reasoning from sensor data, using model-based knowledge of structure and function to find the expected state of all system objects, including sensors. The approach is generally algorithmic rather than heuristic, and represents uncertainties as sets of possibilities. Functional relationships are inverted to determine hypothetical values for potentially faulty objects, and may include conditional functions not normally considered to have inverses.
Experimental Validation of a Resilient Monitoring and Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wen-Chiao Lin; Kris R. E. Villez; Humberto E. Garcia
2014-05-01
Complex, high performance, engineering systems have to be closely monitored and controlled to ensure safe operation and protect the public from potential hazards. One of the main challenges in designing monitoring and control algorithms for these systems is that sensors and actuators may be malfunctioning due to malicious or natural causes. To address this challenge, this paper presents a resilient monitoring and control (ReMAC) system built by expanding previously developed resilient condition assessment monitoring systems and Kalman filter-based diagnostic methods and integrating them with a supervisory controller developed here. While the monitoring and diagnostic algorithms assess plant cyber and physical health conditions, the supervisory controller selects, from a set of candidates, the best controller based on the current plant health assessments. To experimentally demonstrate its enhanced performance, the developed ReMAC system is then used for monitoring and control of a chemical reactor with a water cooling system in a hardware-in-the-loop setting, where the reactor is computer simulated and the water cooling system is implemented by a machine condition monitoring testbed at Idaho National Laboratory. Results show that the ReMAC system is able to make correct plant health assessments despite sensor malfunctioning due to cyber attacks and make decisions that achieve the best control actions despite possible actuator malfunctioning. Monitoring challenges caused by mismatches between assumed system component models and actual measurements are also identified for future work.
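The Kalman filter-based diagnostic methods mentioned above are not specified in the abstract; a generic, minimal sketch of the usual idea is residual (innovation) gating, flagging a sensor sample when its normalized innovation exceeds a threshold. The scalar model and all parameter values below are illustrative assumptions, not the ReMAC implementation.

```python
import numpy as np

def innovation_fault_flags(measurements, a=1.0, c=1.0, q=1e-3, r=1e-2, gate=3.0):
    """Scalar Kalman filter; flags samples whose normalized innovation exceeds `gate`.

    Model: x_{k+1} = a*x_k + w (var q),  z_k = c*x_k + v (var r).  Toy parameter values.
    """
    x, p = 0.0, 1.0
    flags = []
    for z in measurements:
        # Predict
        x, p = a * x, a * p * a + q
        # Innovation and its variance
        nu = z - c * x
        s = c * p * c + r
        flags.append(abs(nu) / np.sqrt(s) > gate)  # chi-like gating test
        # Update
        k_gain = p * c / s
        x = x + k_gain * nu
        p = (1 - k_gain * c) * p
    return np.array(flags)
```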
NASA Astrophysics Data System (ADS)
Ivanov, Arkady P.; Barun, Vladimir V.
2007-05-01
A calculation scheme and an algorithm to simultaneously diagnose several structural and biophysical parameters of skin by reflected light are constructed in the paper. The procedure is based on the fact that, after absorption and scattering, light reflected by tissue contains information on its optically active chromophores and structure. The problem of isolating the desired parameters is a spectroscopic one under multiple scattering conditions. The latter considerably complicates the solution of the problem and requires the elaboration of an approach that is specific to the object studied. The procedure presented in the paper is based on spectral tissue model properties proposed earlier and engineering methods for solving the radiative transfer equation. The desired parameters are melanin and blood volume fractions, f and c, epidermis thickness d, mean diameter D of capillaries, and blood oxygenation degree S. Spectral diffuse reflectance R(λ) of skin over the range of 400 to 850 nm was calculated as a first stage. Then the sensitivity of R(λ) to the above parameters was studied to optimize the algorithm by wavelengths and to propose an experimental scheme for diagnostics. It is shown that the blood volume fraction and the f*d product can be determined rather reliably from the reflected green to red light. One can find f and d separately, as well as D, from the blue reflectance. The last stage is the derivation of S at about 600 nm.
Implementation of several mathematical algorithms to breast tissue density classification
NASA Astrophysics Data System (ADS)
Quintana, C.; Redondo, M.; Tirao, G.
2014-02-01
The accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, where a dense breast tissue can hide lesions, causing cancer to be detected at later stages. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. This paper presents the implementation and the performance of different mathematical algorithms designed to standardize the categorization of mammographic images according to the American College of Radiology classifications. These mathematical techniques are based on calculations of intrinsic image properties and on comparison with an ideal homogeneous image (joint entropy, mutual information, normalized cross-correlation and index Q) as categorization parameters. The evaluation of the algorithms was performed on 100 cases of the mammographic data sets provided by the Ministerio de Salud de la Provincia de Córdoba, Argentina—Programa de Prevención del Cáncer de Mama (Department of Public Health, Córdoba, Argentina, Breast Cancer Prevention Program). The obtained breast classifications were compared with the expert medical diagnostics, showing good performance. The implemented algorithms revealed a high potential to classify breasts into tissue density categories.
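Two of the categorization parameters named above can be sketched directly from their definitions; the construction of the paper's ideal homogeneous reference image and of index Q is not reproduced here, so the reference image passed to these helpers is a placeholder assumption.

```python
import numpy as np

def joint_entropy_and_mi(img_a, img_b, bins=64):
    """Joint entropy H(A,B) and mutual information I(A;B) from a joint histogram."""
    h, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = h / h.sum()
    nz = p > 0
    h_joint = -np.sum(p[nz] * np.log2(p[nz]))
    pa, pb = p.sum(axis=1), p.sum(axis=0)
    h_a = -np.sum(pa[pa > 0] * np.log2(pa[pa > 0]))
    h_b = -np.sum(pb[pb > 0] * np.log2(pb[pb > 0]))
    return h_joint, h_a + h_b - h_joint

def normalized_cross_correlation(img_a, img_b):
    """Zero-mean normalized cross-correlation; assumes non-constant images."""
    a = img_a.ravel() - img_a.mean()
    b = img_b.ravel() - img_b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```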
Benefits and challenges of molecular diagnostics for childhood tuberculosis.
Gutierrez, Cristina
2016-12-01
Expanding tuberculosis (TB)-diagnostic services, including access to rapid tests, is a World Health Organization (WHO) strategy to accelerate progress toward ending TB. Faster and more sensitive molecular tests capable of diagnosing TB and drug-resistant TB have the technical capacity to address limitations associated with smears and cultures by increasing accuracy and shortening turnaround times as compared with those of these conventional laboratory methods. Nucleic acid amplification assays used to detect and analyze Mycobacterium tuberculosis (MTB)-complex nucleic acids can be used directly on specimens from patients suspected of having TB. Recently, several commercial molecular tests were developed to detect MTB and determine the drug resistance (DR) based on detection of specific genetic mutations conferring resistance. The first to be endorsed by the WHO was molecular line-probe assay technology. This test uses polymerase chain reaction (PCR) and reverse-hybridization methods to rapidly identify MTB and DR-related mutations simultaneously. More recently, the WHO endorsed Xpert MTB/RIF, Cepheid Inc, CA, USA, a fully automated assay used for TB diagnosis that relies upon PCR techniques for detection of TB and rifampicin resistance-related mutations. Other promising molecular TB assays for simplifying PCR-based testing protocols and increasing their accuracy are under development and evaluation. Although we lack a practical gold standard for the diagnosis of childhood TB, its bacteriological confirmation is always recommended to be sought whenever possible prior to a diagnostic decision being made. Conventional diagnostic laboratory TB tests are less efficient for children as compared with adults, because sufficient sputum samples are more difficult to collect from infants and young children, and their disease is often paucibacillary, resulting in smear-negative disease. These inherent challenges associated with childhood TB are due to immunological- and pathophysiological-response differences relative to those observed in adults. Several recent meta-analyses showed low sensitivity estimates of PCR-based TB assays for paucibacillary forms of TB (extrapulmonary TB and smear-negative pulmonary disease), which represent the vast majority of childhood TB cases. Despite the lack of evidence regarding use of the rapid molecular assays to identify TB and detect DR in children, and due to the clinical nature of childhood TB, TB-expert groups recommend including rapid methods for TB identification and DR detection in diagnostic algorithms for children suspected of both smear-positive and -negative pulmonary or extrapulmonary TB, both with or without human immunodeficiency virus (HIV)-coinfection, when combined with standard methods (including clinical, microbiological, and radiological assessment) for diagnosing active TB and conventional DR. Since 2011, the WHO has specifically recommended use of the Xpert MTB/RIF test as an initial diagnostic tool for children with suspected HIV-associated TB or multidrug-resistant TB based on successful treatment data related to adults. Implementation of the rapid molecular assays for rapid detection of TB and DR should occur in laboratories with proven capability to run molecular tests and where quality control systems are implemented. Molecular approaches should be more largely tested in children, given their status as the group in whom the diagnostic dilemma is most pronounced. 
These tests should also be included in specific childhood TB diagnostic algorithms adapted to the local/national context in combination with other strategies for improving diagnostics, including more effective specimen collection. Copyright © 2016.
Feeding Disorders in Children with Developmental Disabilities.
ERIC Educational Resources Information Center
Schwarz, Steven M.
2003-01-01
This article describes an approach to evaluating and managing feeding disorders in children with developmental disabilities and examines effects of these management strategies on growth and clinical outcomes. A structured approach is stressed and a diagnostic and treatment algorithm is presented. Use with 79 children found that diagnostic-specific…
Danforth, Kim N; Early, Megan I; Ngan, Sharon; Kosco, Anne E; Zheng, Chengyi; Gould, Michael K
2012-08-01
Lung nodules are commonly encountered in clinical practice, yet little is known about their management in community settings. An automated method for identifying patients with lung nodules would greatly facilitate research in this area. Using members of a large, community-based health plan from 2006 to 2010, we developed a method to identify patients with lung nodules, by combining five diagnostic codes, four procedural codes, and a natural language processing algorithm that performed free text searches of radiology transcripts. An experienced pulmonologist reviewed a random sample of 116 radiology transcripts, providing a reference standard for the natural language processing algorithm. With the use of an automated method, we identified 7112 unique members as having one or more incident lung nodules. The mean age of the patients was 65 years (standard deviation 14 years). There were slightly more women (54%) than men, and Hispanics and non-whites comprised 45% of the lung nodule cohort. Thirty-six percent were never smokers whereas 11% were current smokers. Fourteen percent of the patients were subsequently diagnosed with lung cancer. The sensitivity and specificity of the natural language processing algorithm for identifying the presence of lung nodules were 96% and 86%, respectively, compared with clinician review. Among the true positive transcripts in the validation sample, only 35% were solitary and unaccompanied by one or more associated findings, and 56% measured 8 to 30 mm in diameter. A combination of diagnostic codes, procedural codes, and a natural language processing algorithm for free text searching of radiology reports can accurately and efficiently identify patients with incident lung nodules, many of whom are subsequently diagnosed with lung cancer.
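The abstract does not publish the natural language processing algorithm itself; a minimal, hypothetical sketch of rule-based free-text search with crude negation handling conveys the general approach. The term list and negation cues below are illustrative assumptions.

```python
import re

# Hypothetical term and negation patterns, for illustration only.
NODULE_TERMS = re.compile(r"\b(pulmonary |lung )?nodules?\b", re.IGNORECASE)
NEGATION_CUES = re.compile(r"\b(no|without|negative for|free of)\b[^.]{0,40}$", re.IGNORECASE)

def transcript_mentions_nodule(text):
    """True if any sentence mentions a nodule without a nearby preceding negation cue."""
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for match in NODULE_TERMS.finditer(sentence):
            prefix = sentence[:match.start()]
            if not NEGATION_CUES.search(prefix):
                return True
    return False

# Example: transcript_mentions_nodule("No suspicious nodules. Stable granuloma.") -> False
```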
Kitchen, Levi; Lawrence, Matthew; Speicher, Matthew; Frumkin, Kenneth
2016-01-01
Introduction Unilateral leg swelling with suspicion of deep venous thrombosis (DVT) is a common emergency department (ED) presentation. Proximal DVT (thrombus in the popliteal or femoral veins) can usually be diagnosed and treated at the initial ED encounter. When proximal DVT has been ruled out, isolated calf-vein deep venous thrombosis (IC-DVT) often remains a consideration. The current standard for the diagnosis of IC-DVT is whole-leg vascular duplex ultrasonography (WLUS), a test that is unavailable in many hospitals outside normal business hours. When WLUS is not available from the ED, recommendations for managing suspected IC-DVT vary. The objectives of the study are to use current evidence and recommendations to (1) propose a diagnostic algorithm for IC-DVT when definitive testing (WLUS) is unavailable; and (2) summarize the controversy surrounding IC-DVT treatment. Discussion The Figure combines D-dimer testing with serial CUS or a single deferred FLUS for the diagnosis of IC-DVT. Such an algorithm has the potential to safely direct the management of suspected IC-DVT when definitive testing is unavailable. Whether or not to treat diagnosed IC-DVT remains widely debated and awaiting further evidence. Conclusion When IC-DVT is not ruled out in the ED, the suggested algorithm, although not prospectively validated by a controlled study, offers an approach to diagnosis that is consistent with current data and recommendations. When IC-DVT is diagnosed, current references suggest that a decision between anticoagulation and continued follow-up outpatient testing can be based on shared decision-making. The risks of proximal progression and life-threatening embolization should be balanced against the generally more benign natural history of such thrombi, and an individual patient’s risk factors for both thrombus propagation and complications of anticoagulation. PMID:27429688
NASA Astrophysics Data System (ADS)
Kodali, Anuradha
In this thesis, we develop dynamic multiple fault diagnosis (DMFD) algorithms to diagnose faults that are sporadic and coupled. Firstly, we formulate a coupled factorial hidden Markov model-based (CFHMM) framework to diagnose dependent faults occurring over time (dynamic case). Here, we implement a mixed memory Markov coupling model to determine the most likely sequence of (dependent) fault states, the one that best explains the observed test outcomes over time. An iterative Gauss-Seidel coordinate ascent optimization method is proposed for solving the problem. A soft Viterbi algorithm is also implemented within the framework for decoding dependent fault states over time. We demonstrate the algorithm on simulated and real-world systems with coupled faults; the results show that this approach improves the correct isolation rate as compared to the formulation where independent fault states are assumed. Secondly, we formulate a generalization of set-covering, termed dynamic set-covering (DSC), which involves a series of coupled set-covering problems over time. The objective of the DSC problem is to infer the most probable time sequence of a parsimonious set of failure sources that explains the observed test outcomes over time. The DSC problem is NP-hard and intractable due to the fault-test dependency matrix that couples the failed tests and faults via the constraint matrix, and the temporal dependence of failure sources over time. Here, the DSC problem is motivated from the viewpoint of a dynamic multiple fault diagnosis problem, but it has wide applications in operations research, for e.g., facility location problem. Thus, we also formulated the DSC problem in the context of a dynamically evolving facility location problem. Here, a facility can be opened, closed, or can be temporarily unavailable at any time for a given requirement of demand points. These activities are associated with costs or penalties, viz., phase-in or phase-out for the opening or closing of a facility, respectively. The set-covering matrix encapsulates the relationship among the rows (tests or demand points) and columns (faults or locations) of the system at each time. By relaxing the coupling constraints using Lagrange multipliers, the DSC problem can be decoupled into independent subproblems, one for each column. Each subproblem is solved using the Viterbi decoding algorithm, and a primal feasible solution is constructed by modifying the Viterbi solutions via a heuristic. The proposed Viterbi-Lagrangian relaxation algorithm (VLRA) provides a measure of suboptimality via an approximate duality gap. As a major practical extension of the above problem, we also consider the problem of diagnosing faults with delayed test outcomes, termed delay-dynamic set-covering (DDSC), and experiment with real-world problems that exhibit masking faults. Also, we present simulation results on OR-library datasets (set-covering formulations are predominantly validated on these matrices in the literature), posed as facility location problems. Finally, we implement these algorithms to solve problems in aerospace and automotive applications. Firstly, we address the diagnostic ambiguity problem in aerospace and automotive applications by developing a dynamic fusion framework that includes dynamic multiple fault diagnosis algorithms. 
This improves the correct fault isolation rate, while minimizing the false alarm rates, by considering multiple faults instead of the traditional data-driven techniques based on single fault (class)-single epoch (static) assumption. The dynamic fusion problem is formulated as a maximum a posteriori decision problem of inferring the fault sequence based on uncertain outcomes of multiple binary classifiers over time. The fusion process involves three steps: the first step transforms the multi-class problem into dichotomies using error correcting output codes (ECOC), thereby solving the concomitant binary classification problems; the second step fuses the outcomes of multiple binary classifiers over time using a sliding window or block dynamic fusion method that exploits temporal data correlations over time. We solve this NP-hard optimization problem via a Lagrangian relaxation (variational) technique. The third step optimizes the classifier parameters, viz., probabilities of detection and false alarm, using a genetic algorithm. The proposed algorithm is demonstrated by computing the diagnostic performance metrics on a twin-spool commercial jet engine, an automotive engine, and UCI datasets (problems with high classification error are specifically chosen for experimentation). We show that the primal-dual optimization framework performed consistently better than any traditional fusion technique, even when it is forced to give a single fault decision across a range of classification problems. Secondly, we implement the inference algorithms to diagnose faults in vehicle systems that are controlled by a network of electronic control units (ECUs). The faults, originating from various interactions and especially between hardware and software, are particularly challenging to address. Our basic strategy is to divide the fault universe of such cyber-physical systems in a hierarchical manner, and monitor the critical variables/signals that have impact at different levels of interactions. The proposed diagnostic strategy is validated on an electrical power generation and storage system (EPGS) controlled by two ECUs in an environment with CANoe/MATLAB co-simulation. Eleven faults are injected with the failures originating in actuator hardware, sensor, controller hardware and software components. Diagnostic matrix is established to represent the relationship between the faults and the test outcomes (also known as fault signatures) via simulations. The results show that the proposed diagnostic strategy is effective in addressing the interaction-caused faults.
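Several of the formulations above reduce to Viterbi decoding of a per-fault hidden Markov chain; a minimal, generic log-domain Viterbi sketch for one binary fault state (healthy/faulty) observed through pass/fail test outcomes is shown below. The transition, emission, and prior values are toy numbers, not taken from the thesis.

```python
import numpy as np

def viterbi(obs, trans, emit, init):
    """Most likely hidden state sequence for a discrete HMM (log-domain Viterbi).

    obs:   sequence of observation indices
    trans: (S, S) transition matrix, emit: (S, O) emission matrix, init: (S,) prior.
    """
    trans, emit, init = map(np.log, (np.asarray(trans), np.asarray(emit), np.asarray(init)))
    delta = init + emit[:, obs[0]]
    back = []
    for o in obs[1:]:
        scores = delta[:, None] + trans          # scores[i, j]: end in j coming from i
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + emit[:, o]
    path = [int(delta.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]

# Toy example: states 0=healthy, 1=faulty; observations 0=test pass, 1=test fail.
# Parameters are illustrative, not taken from the thesis.
print(viterbi([0, 1, 1, 0, 1],
              trans=[[0.95, 0.05], [0.10, 0.90]],
              emit=[[0.9, 0.1], [0.2, 0.8]],
              init=[0.99, 0.01]))
```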
Spreco, A; Eriksson, O; Dahlström, Ö; Timpka, T
2017-07-01
Methods for the detection of influenza epidemics and prediction of their progress have seldom been comparatively evaluated using prospective designs. This study aimed to perform a prospective comparative trial of algorithms for the detection and prediction of increased local influenza activity. Data on clinical influenza diagnoses recorded by physicians and syndromic data from a telenursing service were used. Five detection and three prediction algorithms previously evaluated in public health settings were calibrated and then evaluated over 3 years. When applied to diagnostic data, only detection using the Serfling regression method and prediction using the non-adaptive log-linear regression method showed acceptable performance during winter influenza seasons. For the syndromic data, none of the detection algorithms displayed a satisfactory performance, while non-adaptive log-linear regression was the best performing prediction method. We conclude that evidence was found that available algorithms for influenza detection and prediction display satisfactory performance when applied to local diagnostic data during winter influenza seasons. When applied to local syndromic data, the evaluated algorithms did not display consistent performance. Further evaluations and research on combining these types of methods in public health information infrastructures for 'nowcasting' (integrated detection and prediction) of influenza activity are warranted.
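For readers unfamiliar with the Serfling regression method cited above, a minimal sketch of the classical idea follows: fit a cyclical (trend plus annual sine/cosine) baseline to weekly counts and flag weeks that exceed the baseline by a margin. In practice the baseline is fitted to non-epidemic weeks only; this simplified sketch fits all weeks, and the period, threshold, and data layout are illustrative assumptions.

```python
import numpy as np

def serfling_baseline(counts, period=52):
    """Fit a Serfling-type cyclical regression: trend + annual sine/cosine terms."""
    t = np.arange(len(counts), dtype=float)
    X = np.column_stack([
        np.ones_like(t), t,
        np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period),
    ])
    beta, *_ = np.linalg.lstsq(X, np.asarray(counts, dtype=float), rcond=None)
    fitted = X @ beta
    resid_sd = np.std(np.asarray(counts) - fitted, ddof=X.shape[1])
    return fitted, resid_sd

def epidemic_weeks(counts, z=1.96, period=52):
    """Flag weeks whose observed count exceeds the baseline plus z residual SDs."""
    fitted, sd = serfling_baseline(counts, period)
    return np.asarray(counts) > fitted + z * sd
```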
[Monogenic and syndromic symptoms of morbid obesity. Rare but important].
Wiegand, S; Krude, H
2015-02-01
Monogenic and syndromic obesity are rare diseases with variable manifestations. Diagnosis is therefore difficult and often delayed. The purpose of this work was to develop a clinical diagnostic algorithm for earlier diagnosis. Available publications on the clinical symptoms and molecular defects of monogenic and syndromic obesity cases were evaluated. Monogenic and syndromic obesity can be expected in cases with early manifestation before the age of 5 years and a BMI above 40 or above the 99th percentile. Syndromic cases are mostly associated with a low IQ and dwarfism. Monogenic cases are associated with additional endocrine defects. Measurement of serum leptin identifies treatable leptin deficiency. Sequencing of the melanocortin-4 receptor gene (MC4R) allows diagnosis of the most frequent monogenic form of obesity. Treatment with a melanocyte-stimulating hormone (MSH) analog can be expected in the future. Early treatment of children with Prader-Willi syndrome can prevent severe obesity. Because treatment is available in some cases, monogenic and syndromic obesity should be diagnosed early. Based on the disease symptoms, serum leptin, and MC4R sequencing, a diagnostic algorithm is proposed that can be used to diagnose cases of morbid obesity.
Digestive disease management in Japan: a report on the 6th diagnostic pathology summer fest in 2012.
Ichikawa, Kazuhito; Fujimori, Takahiro; Moriya, Takuya; Ochiai, Atsushi; Yoshinaga, Shigetaka; Kushima, Ryouji; Nagahama, Ryuji; Ohkura, Yasuo; Tanaka, Shinji; Ajioka, Yoichi; Hirata, Ichiro; Tanaka, Masanori; Hoshihara, Yoshio; Kinoshita, Yoshikazu; Sasano, Hironobu; Iwashita, Akinori; Tomita, Shigeki; Hirota, Seiichi; Yao, Takashi; Fujii, Shigehiko; Matsuda, Takahisa; Ueno, Hideki; Ishikawa, Yuichi; Takubo, Kaiyo; Fukushima, Noriyoshi; Sugai, Tamotsu; Iwafuchi, Mitsuya; Imura, Jhoji; Manabe, Toshiaki; Fukayama, Masahisa
2013-01-01
The 6th Diagnostic Pathology Summer Fest, held in Tokyo on August 25-26, 2012, opened its gates to everyone in the medical profession. Basic pathology training can contribute to the improvement of algorithms for diagnosis and treatment. The 6th Summer Fest, with the theme 'Pathology and Clinical Treatment of Gastrointestinal Diseases', was held at the Ito International Research Center, The University of Tokyo. On August 25, 'Treatment of Early Gastrointestinal Cancer and New Guidelines' was discussed in the first session, followed by 'Biopsy Diagnosis of Digestive Tract: Key Points of Pathological Diagnosis for Inflammation and Their Clinical Significance' in the second session. On August 26, cases were discussed in the third session, and issues in the pathological diagnosis and classification of neuroendocrine tumors in the fourth session. Summaries of the speeches and discussions are presented along with the statements of each speaker. This meeting was not a formal evidence-based consensus conference, and 20 experts gave talks on their areas of specialty. Discussion focused on how the management strategy should be standardized in the algorithm of patient care. © 2013 S. Karger AG, Basel.
Modeling paradigms for medical diagnostic decision support: a survey and future directions.
Wagholikar, Kavishwar B; Sundararajan, Vijayraghavan; Deshpande, Ashok W
2012-10-01
Use of computer-based decision tools to aid clinical decision making has been a primary goal of research in biomedical informatics. Research in the last five decades has led to the development of Medical Decision Support (MDS) applications using a variety of modeling techniques, for a diverse range of medical decision problems. This paper surveys the literature on modeling techniques for diagnostic decision support, with a focus on decision accuracy. Trends and shortcomings of research in this area are discussed and future directions are provided. The authors suggest that: (i) improvement in the accuracy of MDS applications may be possible through modeling of vague and temporal data, research on inference algorithms, integration of patient information from diverse sources, and improvement in gene profiling algorithms; (ii) MDS research would be facilitated by public release of de-identified medical datasets and the development of open-source data-mining toolkits; (iii) comparative evaluations of different modeling techniques are required to understand the characteristics of the techniques, which can guide developers in the choice of technique for a particular medical decision problem; and (iv) evaluations of MDS applications in clinical settings are necessary to foster physicians' utilization of these decision aids.
Comparison of Traditional and Reverse Syphilis Screening Algorithms in Medical Health Checkups.
Nah, Eun Hee; Cho, Seon; Kim, Suyoung; Cho, Han Ik; Chai, Jong Yil
2017-11-01
The syphilis diagnostic algorithms applied in different countries vary significantly depending on the local syphilis epidemiology and other considerations, including the expected workload, the need for automation in the laboratory and budget factors. This study was performed to investigate the efficacy of traditional and reverse syphilis diagnostic algorithms during general health checkups. In total, 1,000 blood specimens were obtained from 908 men and 92 women during their regular health checkups. Traditional screening and reverse screening were applied to the same specimens using automated rapid plasma reagin (RPR) and Treponema pallidum latex agglutination (TPLA) tests, respectively. Specimens that were reverse algorithm (TPLA) reactive were subjected to a second treponemal test performed using the chemiluminescent microparticle immunoassay (CMIA). Of the 1,000 specimens tested, 68 (6.8%) were reactive by reverse screening (TPLA) compared with 11 (1.1%) by traditional screening (RPR). The traditional algorithm failed to detect 48 specimens [TPLA(+)/RPR(-)/CMIA(+)]. The median TPLA cutoff index (COI) was higher in CMIA-reactive cases than in CMIA-nonreactive cases (90.5 vs 12.5 U). The reverse screening algorithm could detect subjects with possible latent syphilis who were not detected by the traditional algorithm. Those individuals could be provided with opportunities for syphilis evaluation during their health checkups. The COI values of the initial TPLA test may be helpful in excluding false-positive TPLA results in the reverse algorithm. © The Korean Society for Laboratory Medicine
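A minimal sketch of the reverse-screening decision flow as described in this abstract (treponemal TPLA screen first, with TPLA-reactive specimens confirmed by a second treponemal test, CMIA); the result labels are illustrative and not a clinical recommendation.

```python
def reverse_screen(tpla_reactive: bool, cmia_reactive: bool) -> str:
    """Classify one specimen under the reverse algorithm described in the abstract.

    Screen with TPLA; confirm TPLA-reactive specimens with a second treponemal test (CMIA).
    Labels are illustrative only.
    """
    if not tpla_reactive:
        return "non-reactive screen"
    if cmia_reactive:
        return "treponemal antibodies confirmed (possible current or past/latent syphilis)"
    return "discordant (TPLA+/CMIA-): consider repeat or alternative treponemal testing"
```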
NASA Astrophysics Data System (ADS)
Li, Yuanbo; Cui, Xiaoqian; Wang, Hongbei; Zhao, Mengge; Ding, Hongbin
2017-10-01
Digital speckle pattern interferometry (DSPI) can diagnose topography evolution in a real-time, continuous and non-destructive manner, and has been considered one of the most promising techniques for Plasma-Facing Component (PFC) topography diagnostics in the complicated environment of a tokamak. It is important for digital speckle pattern interferometry to enhance speckle patterns and to recover the real topography of the ablated crater. In this paper, two kinds of numerical model based on the flood-fill algorithm have been developed to obtain the real profile by unwrapping the wrapped phase of the speckle interference pattern, which can be calculated from four intensity images by means of the 4-step phase-shifting technique. During phase unwrapping with the flood-fill algorithm, noise pollution and other inevitable factors lead to poor-quality reconstruction results, which affects the authenticity of the restored topography. The calculation of quality parameters was therefore introduced to obtain a quality map from the wrapped phase map, and this work presents two different methods to calculate the quality parameters. The quality parameters are then used to guide the path of the flood-fill algorithm, and pixels with good quality parameters are given priority in the calculation, so that the quality of the speckle interference pattern reconstruction results is improved. A comparison between the flood-fill algorithm suited to speckle pattern interferometry and the quality-guided flood-fill algorithm (with the two different calculation approaches) shows that the errors caused by noise pollution and fringe discontinuities were successfully reduced.
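The wrapped-phase step referenced above follows the standard 4-step phase-shifting formula; a minimal numpy sketch is given below (the frame order and 90° shifts are assumed), while the quality-guided flood-fill unwrapping itself is not reproduced.

```python
import numpy as np

def wrapped_phase_four_step(i1, i2, i3, i4):
    """Wrapped phase (radians, in (-pi, pi]) from four frames with 0, 90, 180, 270 degree shifts."""
    return np.arctan2(np.asarray(i4, dtype=float) - np.asarray(i2, dtype=float),
                      np.asarray(i1, dtype=float) - np.asarray(i3, dtype=float))
```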
Model-based diagnostics of gas turbine engine lubrication systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byington, C.S.
1998-09-01
The objective of the current research was to develop improved methodology for diagnosing anomalies and maintaining oil lubrication systems for gas turbine engines. The effort focused on the development of reasoning modules that utilize the existing, inexpensive sensors and are applicable to on-line monitoring within the full-authority digital engine controller (FADEC) of the engine. The target application is the Enhanced TF-40B gas turbine engine that powers the Landing Craft Air Cushion (LCAC) platform. To accomplish the development of the requisite data fusion algorithms and automated reasoning for the diagnostic modules, Penn State ARL produced a generic Turbine Engine Lubrication System Simulator (TELSS) and Data Fusion Workbench (DFW). TELSS is a portable simulator code that calculates lubrication system parameters based upon one-dimensional fluid flow resistance network equations. Validation of the TF-40B modules was performed using engineering and limited test data. The simulation model was used to analyze operational data from the LCAC fleet. The TELSS, as an integral portion of the DFW, provides the capability to experiment with combinations of variables and feature vectors that characterize normal and abnormal operation of the engine lubrication system. The model-based diagnostics approach is applicable to all gas turbine engines and mechanical transmissions with similar pressure-fed lubrication systems.
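As a rough sketch of the one-dimensional flow-resistance network idea the abstract attributes to TELSS (this is not the TELSS code: the circuit topology, resistances, and pressures below are invented purely for illustration), each line is treated like a resistor with flow Q = dP / R and flows balancing at every internal node:

import numpy as np

def solve_pressures(nodes, edges, fixed):
    """nodes: node names; edges: (a, b, R) hydraulic resistances;
    fixed: {node: known pressure}. Returns the pressure at every node."""
    idx = {n: i for i, n in enumerate(nodes)}
    n = len(nodes)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for a, c, R in edges:              # assemble flow-balance equations
        g = 1.0 / R                    # hydraulic conductance
        for u, v in ((a, c), (c, a)):
            A[idx[u], idx[u]] += g
            A[idx[u], idx[v]] -= g
    for node, p in fixed.items():      # impose the known boundary pressures
        A[idx[node], :] = 0.0
        A[idx[node], idx[node]] = 1.0
        b[idx[node]] = p
    return dict(zip(nodes, np.linalg.solve(A, b)))

# Hypothetical circuit: pump gallery feeding two bearings that drain to the sump.
p = solve_pressures(
    nodes=["pump", "gallery", "bearing1", "bearing2", "sump"],
    edges=[("pump", "gallery", 0.5),
           ("gallery", "bearing1", 2.0),
           ("gallery", "bearing2", 3.0),
           ("bearing1", "sump", 1.0),
           ("bearing2", "sump", 1.0)],
    fixed={"pump": 60.0, "sump": 0.0},   # illustrative supply/return pressures
)
print({k: round(v, 1) for k, v in p.items()})

Diagnostics in such a model amount to comparing the measured pressures and flows against the values the network predicts for a healthy system, and flagging branches whose effective resistance has drifted.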
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandoval, D; Mlady, G; Selwyn, R
Purpose: To bring together radiologists, technologists, and physicists to utilize post-processing techniques in digital radiography (DR) in order to optimize image acquisition and improve image quality. Methods: Sub-optimal images acquired on a new General Electric (GE) DR system were flagged for follow-up by radiologists and reviewed by technologists and medical physicists. Various exam types from adult musculoskeletal (n=35), adult chest (n=4), and pediatric (n=7) were chosen for review; 673 total images were reviewed. These images were processed using five customized algorithms provided by GE. An image score sheet was created allowing the radiologist to assign a numeric score to each of the processed images, which allowed objective comparison with the original images. Each image was scored on seven properties: 1) overall image look, 2) soft tissue contrast, 3) high contrast, 4) latitude, 5) tissue equalization, 6) edge enhancement, 7) visualization of structures. Additional space allowed for comments not captured in the scoring categories. Radiologists scored the images from 1 to 10, with 1 being non-diagnostic quality and 10 being superior diagnostic quality. Scores for each custom algorithm for each image set were summed, and the algorithm with the highest score for each image set was then set as the default processing. Results: Images placed into the PACS "QC folder" for image-processing reasons decreased, and overall feedback from radiologists was that image quality for these studies had improved. All default processing for these image types was changed to the new algorithm. Conclusion: This work is an example of the collaboration between radiologists, technologists, and physicists at the University of New Mexico to add value to the radiology department. The significant amount of work required to prepare the processing algorithms and to reprocess and score the images was eagerly taken on by all team members in order to produce better quality images and improve patient care.
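A minimal sketch of the scoring-and-selection step described above (the seven properties and the 1-10 scale come from the abstract; the data layout, algorithm labels, and example scores are invented for illustration):

from collections import defaultdict

PROPERTIES = ["overall image look", "soft tissue contrast", "high contrast",
              "latitude", "tissue equalization", "edge enhancement",
              "visualization of structures"]

def pick_default_processing(score_rows):
    # score_rows: (exam_type, algorithm, [seven 1-10 scores]) per reviewed image.
    totals = defaultdict(float)
    for exam_type, algorithm, scores in score_rows:
        assert len(scores) == len(PROPERTIES)
        totals[(exam_type, algorithm)] += sum(scores)
    # For each exam type, keep the algorithm with the highest summed score.
    defaults = {}
    for (exam_type, algorithm), total in totals.items():
        best = defaults.get(exam_type)
        if best is None or total > best[1]:
            defaults[exam_type] = (algorithm, total)
    return {exam: alg for exam, (alg, _) in defaults.items()}

# Hypothetical scores for two candidate processing algorithms:
rows = [
    ("adult chest", "algo_A", [7, 6, 8, 7, 7, 6, 8]),
    ("adult chest", "algo_B", [8, 8, 7, 8, 8, 7, 9]),
]
print(pick_default_processing(rows))   # {'adult chest': 'algo_B'}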