Eliseev, Platon; Balantcev, Grigory; Nikishova, Elena; Gaida, Anastasia; Bogdanova, Elena; Enarson, Donald; Ornstein, Tara; Detjen, Anne; Dacombe, Russell; Gospodarevskaya, Elena; Phillips, Patrick P J; Mann, Gillian; Squire, Stephen Bertel; Mariandyshev, Andrei
2016-01-01
In the Arkhangelsk region of Northern Russia, multidrug-resistant (MDR) tuberculosis (TB) rates in new cases are among the highest in the world. In 2014, MDR-TB rates reached 31.7% among new cases and 56.9% among retreatment cases. The development of new diagnostic tools allows faster detection of both TB and MDR-TB and should reduce transmission through earlier initiation of anti-TB therapy. The PROVE-IT (Policy Relevant Outcomes from Validating Evidence on Impact) Russia study aimed to assess the impact of implementing a line probe assay (LPA)-based diagnostic algorithm for patients with presumptive MDR-TB, and to assess treatment outcomes. The primary outcome was time to treatment initiation, measured from the first care-seeking visit to initiation of MDR-TB treatment, rather than diagnostic accuracy. We hypothesized that the implementation of LPA would result in faster time to treatment initiation and better treatment outcomes. A culture-based diagnostic algorithm used prior to LPA implementation was compared to an LPA-based algorithm in which LPA replaced BacTAlert and Löwenstein-Jensen (LJ) culture for drug susceptibility testing. A total of 295 MDR-TB patients were included in the study: 163 diagnosed with the culture-based algorithm and 132 with the LPA-based algorithm. Among smear-positive patients, implementation of the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 50 and 66 days compared to the culture-based algorithm (BacTAlert and LJ, respectively; p<0.001). In smear-negative patients, the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 78 days compared to the culture-based algorithm (LJ; p<0.001). However, several weeks were still needed for treatment initiation in the LPA-based algorithm: 24 days in smear-positive and 62 days in smear-negative patients. Overall treatment outcomes were better with the LPA-based algorithm than with the culture-based algorithm (p = 0.003). Treatment success rates at 20 months of treatment were higher in patients diagnosed with the LPA-based algorithm (65.2%) than in those diagnosed with the culture-based algorithm (44.8%). Mortality was also lower in the LPA-based algorithm group (7.6%) than in the culture-based algorithm group (15.9%). There was no statistically significant difference in smear and culture conversion rates between the two algorithms. The results suggest that the introduction of LPA leads to faster MDR-TB diagnosis, earlier treatment initiation, and better treatment outcomes for patients with MDR-TB. These findings also highlight the need for further improvements within the health system to reduce both patient and diagnostic delays in order to fully realize the impact of new, rapid diagnostics.
Treatment Algorithms Based on Tumor Molecular Profiling: The Essence of Precision Medicine Trials.
Le Tourneau, Christophe; Kamal, Maud; Tsimberidou, Apostolia-Maria; Bedard, Philippe; Pierron, Gaëlle; Callens, Céline; Rouleau, Etienne; Vincent-Salomon, Anne; Servant, Nicolas; Alt, Marie; Rouzier, Roman; Paoletti, Xavier; Delattre, Olivier; Bièche, Ivan
2016-04-01
With the advent of high-throughput molecular technologies, several precision medicine (PM) studies are currently ongoing, including molecular screening programs and PM clinical trials. Molecular profiling programs establish the molecular profile of patients' tumors with the aim of guiding therapy based on identified molecular alterations. The aim of prospective PM clinical trials is to assess the clinical utility of tumor molecular profiling and to determine whether treatment selection based on molecular alterations produces superior outcomes compared with unselected treatment. These trials use treatment algorithms to assign patients to specific targeted therapies based on tumor molecular alterations. Such algorithms should be governed by fixed rules to ensure standardization and reproducibility. Here, we summarize key molecular, biological, and technical criteria that, in our view, should be addressed when establishing treatment algorithms based on tumor molecular profiling for PM trials. © The Author 2015. Published by Oxford University Press.
SU-E-T-538: Evaluation of IMRT Dose Calculation Based on Pencil-Beam and AAA Algorithms.
Yuan, Y; Duan, J; Popple, R; Brezovich, I
2012-06-01
To evaluate the accuracy of dose calculation for intensity-modulated radiation therapy (IMRT) based on the Pencil Beam (PB) and Analytical Anisotropic Algorithm (AAA) computation algorithms. IMRT plans of twelve patients with different treatment sites, including head/neck, lung, and pelvis, were investigated. For each patient, dose calculations with the PB and AAA algorithms using dose grid sizes of 5 mm, 2.5 mm, and 1.25 mm were compared with composite-beam ion chamber and film measurements in patient-specific QA. Discrepancies between calculation and measurement were evaluated by percentage error for ion chamber dose and by the γ>1 failure rate in gamma analysis (3%/3 mm) for film dosimetry. For 9 patients, the ion chamber dose calculated with the AAA algorithm was closer to the ion chamber measurement than that calculated with the PB algorithm at a grid size of 2.5 mm, though all calculated ion chamber doses were within 3% of the measurements. For head/neck patients and other patients with large treatment volumes, the γ>1 failure rate was significantly reduced (to within 5%) with AAA-based treatment planning, compared to generally more than 10% with PB-based treatment planning (grid size = 2.5 mm). For lung and brain cancer patients with medium and small treatment volumes, γ>1 failure rates were typically within 5% for both AAA- and PB-based treatment planning (grid size = 2.5 mm). For both PB- and AAA-based treatment planning, improvements in dose calculation accuracy with finer dose grids were observed in film dosimetry for 11 patients and in ion chamber measurements for 3 patients. AAA-based treatment planning provides more accurate dose calculation for head/neck patients and other patients with large treatment volumes. Compared with film dosimetry, a γ>1 failure rate within 5% can be achieved with AAA-based treatment planning. © 2012 American Association of Physicists in Medicine.
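The γ>1 failure rate above comes from gamma analysis, which combines a dose-difference criterion (3%) with a distance-to-agreement criterion (3 mm). Below is a minimal 1-D numpy sketch of the standard gamma index with invented toy profiles; clinical tools operate on 2-D film planes, but the metric is the same.

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dd=0.03, dta=3.0):
    """Gamma index per reference point: min over evaluated points of the
    combined dose-difference / distance-to-agreement metric."""
    gammas = np.empty(len(ref_dose))
    norm = dd * ref_dose.max()                 # global 3% dose criterion
    for i, (r, x) in enumerate(zip(ref_dose, positions)):
        dose_term = (eval_dose - r) / norm
        dist_term = (positions - x) / dta      # 3 mm DTA criterion
        gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gammas

x = np.arange(0.0, 100.0, 1.0)                 # detector positions (mm)
ref = np.exp(-((x - 50) / 20) ** 2)            # toy reference profile
ev = 1.02 * np.exp(-((x - 51) / 20) ** 2)      # toy evaluated profile
print("gamma>1 failure rate:", (gamma_1d(ref, ev, x) > 1).mean())
```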
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study was to develop an algorithm for independent MU/treatment time (TT) verification of non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm was designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and was coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and were also calculated by the VA; point measurements were collected in a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
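As a rough illustration of the kind of TG-114-style point-dose check such a spreadsheet performs, here is a minimal Python sketch for a simple SSD photon calculation. The factor names and values are illustrative assumptions, not the authors' beam data or their exact formula.

```python
# Independent MU estimate for an SSD plan: MU = D / (D'_ref * Sc,p * PDD/100).
# All inputs are illustrative placeholders; a real check looks these factors
# up in the clinic's commissioned beam data tables.
def mu_check_ssd(prescribed_dose_cGy: float, dose_per_mu_ref: float,
                 output_factor: float, pdd_at_depth: float) -> float:
    return prescribed_dose_cGy / (dose_per_mu_ref * output_factor
                                  * pdd_at_depth / 100.0)

mu = mu_check_ssd(prescribed_dose_cGy=200.0,  # dose per fraction at depth
                  dose_per_mu_ref=1.0,        # cGy/MU at reference conditions
                  output_factor=0.98,         # combined Sc,p for field size
                  pdd_at_depth=67.0)          # percent depth dose
print(f"independent MU estimate: {mu:.1f}")   # compare against the TPS MU
```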
McKinney, Mark C; Riley, Jeffrey B
2007-12-01
The incidence of heparin resistance during adult cardiac surgery with cardiopulmonary bypass has been reported at 15%-20%. The consistent use of a clinical decision-making algorithm may increase the consistency of patient care and likely reduce the total required heparin dose and other problems associated with heparin dosing. After a directed survey of practicing perfusionists regarding treatment of heparin resistance and a literature search for high-level evidence regarding the diagnosis and treatment of heparin resistance, an evidence-based decision-making algorithm was constructed. The face validity of the algorithm's decisive steps and logic was confirmed by a second survey of practicing perfusionists. The algorithm begins with a review of the patient history to identify predictors of heparin resistance. The definition of heparin resistance contained in the algorithm is an activated clotting time < 450 seconds despite a heparin loading dose > 450 IU/kg. Based on the literature, the treatment for heparin resistance used in the algorithm is antithrombin III supplementation. The algorithm appears to be valid and is supported by high-level evidence and clinician opinion. The next step is a randomized clinical trial to test the clinical procedure guideline algorithm against current standard clinical practice.
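The algorithm's threshold definition translates directly into code. The following is a sketch of the published rule only, not a clinical tool.

```python
# Working definition from the algorithm above: heparin resistance is an
# ACT still < 450 s despite a loading dose > 450 IU/kg.
def heparin_resistant(act_seconds: float, heparin_iu_per_kg: float) -> bool:
    return act_seconds < 450 and heparin_iu_per_kg > 450

if heparin_resistant(act_seconds=410, heparin_iu_per_kg=480):
    print("Treat per algorithm: antithrombin III supplementation")
```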
Sequential drug treatment algorithm for agitation and aggression in Alzheimer's and mixed dementia.
Davies, Simon Jc; Burhan, Amer M; Kim, Donna; Gerretsen, Philip; Graff-Guerrero, Ariel; Woo, Vincent L; Kumar, Sanjeev; Colman, Sarah; Pollock, Bruce G; Mulsant, Benoit H; Rajji, Tarek K
2018-05-01
Behavioural and psychological symptoms of dementia (BPSD) include agitation and aggression in people with dementia. BPSD are common on inpatient psychogeriatric units and may prevent individuals from living at home or in residential/nursing home settings. Several drugs and non-pharmacological treatments have been shown to be effective in reducing BPSD. Algorithmic treatment may address the challenge of synthesizing this evidence-based knowledge. A multidisciplinary team created evidence-based algorithms for the treatment of BPSD. We present drug treatment algorithms for agitation and aggression associated with Alzheimer's and mixed Alzheimer's/vascular dementia. Drugs were appraised by psychiatrists based on strength of evidence of efficacy, time to onset of clinical effect, tolerability, ease of use, and efficacy for indications other than BPSD. After baseline assessment and discontinuation of potentially exacerbating medications, sequential trials are recommended with risperidone, aripiprazole or quetiapine, carbamazepine, citalopram, gabapentin, and prazosin. Titration schedules are proposed, with adjustments for frailty. Additional guidance is given on the use of electroconvulsive therapy, optimization of existing cholinesterase inhibitors/memantine, and use of pro re nata medications. This algorithm-based approach to drug treatment of agitation/aggression in Alzheimer's/mixed dementia has been implemented in several Canadian hospital inpatient units. Its impact should be assessed in future research.
Aiello, Lloyd Paul; Beck, Roy W; Bressler, Neil M; Browning, David J; Chalam, K V; Davis, Matthew; Ferris, Frederick L; Glassman, Adam R; Maturi, Raj K; Stockdale, Cynthia R; Topping, Trexler M
2011-12-01
Objective: To describe the underlying principles used to develop a web-based algorithm that incorporated intravitreal anti-vascular endothelial growth factor (anti-VEGF) treatment for diabetic macular edema (DME) in a Diabetic Retinopathy Clinical Research Network (DRCR.net) randomized clinical trial. Design: Discussion of a treatment protocol for DME. Participants: Subjects with vision loss resulting from DME involving the center of the macula. Methods: The DRCR.net created an algorithm incorporating anti-VEGF injections in a comparative-effectiveness randomized clinical trial evaluating intravitreal ranibizumab with prompt or deferred (≥24 weeks) focal/grid laser treatment in eyes with vision loss resulting from center-involved DME. Results confirmed that intravitreal ranibizumab with prompt or deferred laser provides superior visual acuity outcomes compared with prompt laser alone through at least 2 years. Duplication of this algorithm may not be practical for clinical practice. To share their opinion on how ophthalmologists might emulate the study protocol, participating DRCR.net investigators developed guidelines based on the algorithm's underlying rationale. Main Outcome Measures: Clinical guidelines based on a DRCR.net protocol. Results: The treatment protocol required real-time feedback from a web-based data entry system for intravitreal injections, focal/grid laser treatment, and follow-up intervals. Guidance from this system indicated whether treatment was required or given at investigator discretion, and when follow-up should be scheduled. Clinical treatment guidelines, based on the underlying clinical rationale of the DRCR.net protocol, include repeating treatment monthly as long as there is improvement in edema compared with the previous month, or until the retina is no longer thickened. If thickening recurs or worsens after discontinuing treatment, treatment is resumed. Conclusions: Duplication of the approach used in the DRCR.net randomized clinical trial to treat DME involving the center of the macula with intravitreal ranibizumab may not be practical in clinical practice, but it likely can be emulated based on an understanding of the underlying rationale for the study protocol. Inherent differences between a web-based treatment algorithm and a clinical approach may lead to differences in outcomes that are impossible to predict. The closer the clinical approach is to the algorithm used in the study, the more likely the outcomes will be similar to those published. Proprietary or commercial disclosure may be found after the references. Copyright © 2011 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Bjoerke-Bertheussen, Jeanette; Schoeyen, Helle; Andreassen, Ole A; Malt, Ulrik F; Oedegaard, Ketil J; Morken, Gunnar; Sundet, Kjetil; Vaaler, Arne E; Auestad, Bjoern; Kessler, Ute
2017-12-21
Electroconvulsive therapy is an effective treatment for bipolar depression, but there are concerns about whether it causes long-term neurocognitive impairment. In this multicenter randomized controlled trial, in-patients with treatment-resistant bipolar depression were randomized to either algorithm-based pharmacologic treatment or right unilateral electroconvulsive therapy. After the 6-week treatment period, all of the patients received maintenance pharmacotherapy as recommended by their clinician guided by a relevant treatment algorithm. Patients were assessed at baseline and at 6 months. Neurocognitive functions were assessed using the Measurement and Treatment Research to Improve Cognition in Schizophrenia (MATRICS) Consensus Cognitive Battery, and autobiographical memory consistency was assessed using the Autobiographical Memory Interview-Short Form. Seventy-three patients entered the trial, of whom 51 and 26 completed neurocognitive assessments at baseline and 6 months, respectively. The MATRICS Consensus Cognitive Battery composite score improved by 4.1 points in both groups (P = .042) from baseline to 6 months (from 40.8 to 44.9 and from 41.9 to 46.0 in the algorithm-based pharmacologic treatment and electroconvulsive therapy groups, respectively). The Autobiographical Memory Interview-Short Form consistency scores were reduced in both groups (72.3% vs 64.3% in the algorithm-based pharmacologic treatment and electroconvulsive therapy groups, respectively; P = .085). This study did not find that right unilateral electroconvulsive therapy caused long-term impairment in neurocognitive functions compared to algorithm-based pharmacologic treatment in bipolar depression as measured using standard neuropsychological tests, but due to the low number of patients in the study the results should be interpreted with caution. ClinicalTrials.gov: NCT00664976. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Dunbar, Rory; Caldwell, Judy; Lombard, Carl; Beyers, Nulda
2017-01-01
Setting: Primary health services in Cape Town, South Africa, where the introduction of Xpert® MTB/RIF (Xpert) enabled simultaneous screening for tuberculosis (TB) and drug susceptibility in all presumptive cases. Study aim: To compare the proportion of TB cases with drug susceptibility tests (DST) undertaken and multidrug-resistant tuberculosis (MDR-TB) diagnosed pre-treatment and during the course of first-line treatment under the previous smear/culture-based algorithm and the newly introduced Xpert-based algorithm. Methods: TB cases identified in a previous stepped-wedge study of TB yield in five sub-districts over seven one-month time points prior to, during, and after the introduction of the Xpert-based algorithm were analysed. We used a combination of patient identifiers to identify all drug susceptibility tests undertaken from electronic laboratory records. Differences in the proportions of DST undertaken and MDR-TB cases diagnosed between algorithms were estimated using a binomial regression model. Results: Pre-treatment, the probability of having a DST undertaken (RR = 1.82, p<0.001) and of being diagnosed with MDR-TB (RR = 1.42, p<0.001) was higher in the Xpert-based algorithm than in the smear/culture-based algorithm. For cases evaluated during the course of first-line TB treatment, there was no significant difference between algorithms in the proportion with DST undertaken (RR = 1.02, p = 0.848) or MDR-TB diagnosed (RR = 1.12, p = 0.678). Conclusion: Universal screening for drug susceptibility in all presumptive TB cases in the Xpert-based algorithm resulted in a higher overall proportion of MDR-TB cases being diagnosed and is an important strategy for reducing transmission. The previous strategy of only screening new TB cases when first-line treatment failed did not compensate for cases missed pre-treatment. PMID:28199375
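The risk ratios above were estimated with a binomial regression model. Below is a minimal sketch of one common choice, a log-binomial GLM whose exponentiated coefficient is a risk ratio, fitted on simulated stand-in data with statsmodels; the study's exact model specification is not given in the abstract, so this is an assumption about the approach.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
xpert = rng.integers(0, 2, n)            # 0 = smear/culture era, 1 = Xpert era
p = np.where(xpert == 1, 0.60, 0.33)     # illustrative DST probabilities
dst = rng.binomial(1, p)                 # DST undertaken yes/no

X = sm.add_constant(xpert.astype(float))
# Log link => exp(coefficient) is a risk ratio, not an odds ratio.
# Log-binomial fits can fail to converge, hence the explicit start values.
fit = sm.GLM(dst, X,
             family=sm.families.Binomial(link=sm.families.links.Log())
             ).fit(start_params=[np.log(0.3), 0.0])
print("RR =", np.exp(fit.params[1]), " p =", fit.pvalues[1])  # RR ≈ 1.8
```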
Yakhelef, N; Audibert, M; Varaine, F; Chakaya, J; Sitienei, J; Huerga, H; Bonnet, M
2014-05-01
In 2007, the World Health Organization recommended introducing rapid Mycobacterium tuberculosis culture into the diagnostic algorithm for smear-negative pulmonary tuberculosis (TB). To assess the cost-effectiveness of introducing a rapid non-commercial culture method (thin-layer agar), together with Löwenstein-Jensen culture, to diagnose smear-negative TB at a district hospital in Kenya. Outcomes (number of true TB cases treated) were obtained from a prospective study evaluating the effectiveness of a clinical and radiological algorithm (conventional) against the alternative algorithm (conventional plus M. tuberculosis culture) in 380 smear-negative TB suspects. The costs of implementing each algorithm were calculated using a 'micro-costing' or 'ingredient-based' method. We then compared the cost and effectiveness of the conventional vs. culture-based algorithms and estimated the incremental cost-effectiveness ratio. The costs of the conventional and culture-based algorithms per smear-negative TB suspect were €39.5 and €144, respectively. The costs per confirmed and treated TB case were €452 and €913, respectively. The culture-based algorithm led to the diagnosis and treatment of 27 more cases at an additional cost of €1477 per case. Despite the increase in the number of patients started on treatment as a result of culture, the relatively high cost of a culture-based algorithm will make it difficult for resource-limited countries to afford.
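The €1477-per-case figure is an incremental cost-effectiveness ratio (ICER). It can be recomputed from the rounded numbers in the abstract; the small gap from the published €1477 reflects rounding of the inputs.

```python
# ICER = (cost of alternative - cost of comparator) / (extra cases treated),
# using the per-suspect costs and case counts reported above.
n_suspects = 380
cost_conventional = 39.5 * n_suspects    # € per suspect x suspects
cost_culture = 144.0 * n_suspects
extra_cases = 27                         # additional true TB cases treated

icer = (cost_culture - cost_conventional) / extra_cases
print(f"ICER ≈ €{icer:.0f} per additional TB case treated")  # ≈ €1471
```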
Fokkema, M; Smits, N; Zeileis, A; Hothorn, T; Kelderman, H
2017-10-25
Identification of subgroups of patients for whom treatment A is more effective than treatment B, and vice versa, is of key importance to the development of personalized medicine. Tree-based algorithms are helpful tools for the detection of such interactions, but none of the available algorithms allow for taking into account clustered or nested dataset structures, which are particularly common in psychological research. Therefore, we propose the generalized linear mixed-effects model tree (GLMM tree) algorithm, which allows for the detection of treatment-subgroup interactions, while accounting for the clustered structure of a dataset. The algorithm uses model-based recursive partitioning to detect treatment-subgroup interactions, and a GLMM to estimate the random-effects parameters. In a simulation study, GLMM trees show higher accuracy in recovering treatment-subgroup interactions, higher predictive accuracy, and lower type II error rates than linear-model-based recursive partitioning and mixed-effects regression trees. Also, GLMM trees show somewhat higher predictive accuracy than linear mixed-effects models with pre-specified interaction effects, on average. We illustrate the application of GLMM trees on an individual patient-level data meta-analysis on treatments for depression. We conclude that GLMM trees are a promising exploratory tool for the detection of treatment-subgroup interactions in clustered datasets.
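As a rough, self-contained illustration of the alternating estimation described above (partition the fixed-effects part, then re-estimate cluster random effects, and repeat), here is a toy Python stand-in using an ordinary regression tree and a linear mixed model. This is a crude approximation, not the authors' model-based recursive partitioning; their actual implementation is available as the R package glmertree.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n, n_clusters = 600, 30
cluster = rng.integers(0, n_clusters, n)
x = rng.normal(size=n)                      # potential moderator
treat = rng.integers(0, 2, n)
u = rng.normal(scale=1.0, size=n_clusters)  # true random cluster intercepts
# Treatment-subgroup interaction: treatment helps only when x > 0.
y = 0.5 * treat * (x > 0) + u[cluster] + rng.normal(scale=0.5, size=n)

ranef = np.zeros(n)
for _ in range(10):
    # Step 1: partition, holding the current random-effect estimates fixed
    # (the tree sees the data with the cluster adjustment removed).
    features = np.column_stack([x, treat])
    tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(
        features, y - ranef)
    subgroup = tree.apply(features)
    # Step 2: re-estimate cluster random effects given the partition.
    df = pd.DataFrame({"y": y, "g": subgroup.astype(str), "cl": cluster})
    lmm = smf.mixedlm("y ~ C(g)", df, groups=df["cl"]).fit()
    ranef = np.array([lmm.random_effects[c].iloc[0] for c in cluster])
print(lmm.summary())
```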
Ricken, Roland; Wiethoff, Katja; Reinhold, Thomas; Schietsch, Kathrin; Stamm, Thomas; Kiermeir, Julia; Neu, Peter; Heinz, Andreas; Bauer, Michael; Adli, Mazda
2011-11-01
The German Algorithm Project, Phase 2 (GAP2) revealed that a standardized stepwise treatment regimen (SSTR) results in better treatment outcomes than treatment as usual (TAU) in depressed inpatients. The objective of this study was a health-economic evaluation of SSTR based on a cost-effectiveness analysis (CEA). GAP2 was a randomized controlled study with 148 patients. In an intention-to-treat (ITT) analysis, direct treatment costs for the study duration (SD) and the total time in hospital (TTH; enrolment to discharge) were calculated based on daily hospital charges, followed by a CEA to calculate cost expenditure per remitted patient. Treatment costs in SSTR compared to TAU were significantly lower for SD (SSTR: 10 830 € ± 8 632 €, TAU: 15 202 € ± 12 483 €; p = 0.026) and did not differ significantly for TTH (SSTR: 21 561 € ± 16 162 €; TAU: 18 248 € ± 13 454 €; p = 0.208). CEA revealed that the costs per remission in SSTR were significantly lower for SD (SSTR: 20 035 € ± 15 970 €; TAU: 38 793 € ± 31 853 €; p<0.0001) and TTH (SSTR: 31 285 € ± 23 451 €; TAU: 38 581 € ± 28 449 €; p = 0.041). Indirect costs were not assessed. Different dropout rates in TAU and SSTR complicated interpretation of the data. An SSTR-based algorithm results in superior cost effectiveness at no significant extra cost. Implementation of treatment algorithms in inpatient care may help reduce treatment costs. Copyright © 2011 Elsevier B.V. All rights reserved.
Validation of an algorithm-based definition of treatment resistance in patients with schizophrenia.
Ajnakina, Olesya; Horsdal, Henriette Thisted; Lally, John; MacCabe, James H; Murray, Robin M; Gasse, Christiane; Wimberley, Theresa
2018-02-19
Large-scale pharmacoepidemiological research on treatment resistance relies on accurate identification of people with treatment-resistant schizophrenia (TRS) based on data that are retrievable from administrative registers. This is usually approached by operationalising clinical treatment guidelines using prescription and hospital admission information. We examined the accuracy of an algorithm-based definition of TRS, based on clozapine prescription and/or meeting algorithm-based eligibility criteria for clozapine, against a gold-standard definition using case notes. We additionally validated a definition based entirely on clozapine prescription. 139 schizophrenia patients aged 18-65 years were followed for a mean of 5 years after first presentation to psychiatric services in south London, UK. The diagnostic accuracy of the algorithm-based measure against the gold standard was measured with sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV). A total of 45 (32.4%) schizophrenia patients met the criteria for the gold-standard definition of TRS; applying the algorithm-based definition to the same cohort led to 44 (31.7%) patients fulfilling criteria for TRS, with sensitivity, specificity, PPV, and NPV of 62.2%, 83.0%, 63.6%, and 82.1%, respectively. The definition based on lifetime clozapine prescription had sensitivity, specificity, PPV, and NPV of 40.0%, 94.7%, 78.3%, and 76.7%, respectively. Although a perfect definition of TRS cannot be derived from available prescription and hospital registers, these results indicate that researchers can confidently use registries to identify individuals with TRS for research and clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.
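The reported accuracy figures imply a 2×2 table that can be reconstructed and checked. The cell counts below are back-calculated from the published percentages (an assumption, rounded to whole patients; n = 139 with 45 gold-standard TRS cases and 44 algorithm-positive cases).

```python
# TP/FP/FN/TN back-calculated from the abstract's percentages.
TP, FP, FN, TN = 28, 16, 17, 78

sensitivity = TP / (TP + FN)   # 28/45 ≈ 62.2%
specificity = TN / (TN + FP)   # 78/94 ≈ 83.0%
ppv         = TP / (TP + FP)   # 28/44 ≈ 63.6%
npv         = TN / (TN + FN)   # 78/95 ≈ 82.1%
print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```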
Implementation of an Algorithm for Prosthetic Joint Infection: Deviations and Problems.
Mühlhofer, Heinrich M L; Kanz, Karl-Georg; Pohlig, Florian; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; von Eisenhart-Rothe, Ruediger; Schauwecker, Johannes
The outcome of revision surgery in arthroplasty is based on a precise diagnosis. In addition, the treatment varies based on whether the prosthetic failure is caused by aseptic or septic loosening. Algorithms can help to identify periprosthetic joint infections (PJI) and standardize diagnostic steps; however, algorithms tend to oversimplify the treatment of complex cases. We conducted a process analysis during the implementation of a PJI algorithm to determine problems and deviations associated with its implementation. Fifty patients who were treated after implementation of a standardized algorithm were monitored retrospectively. Their treatment plans and diagnostic cascades were analyzed for deviations from the implemented algorithm. Each diagnostic procedure was recorded, compared with the algorithm, and evaluated statistically. We detected 52 deviations while treating 50 patients. In 25 cases, no discrepancy was observed. Synovial fluid aspiration was not performed in 31.8% of patients (95% confidence interval [CI], 18.1%-45.6%), while white blood cell counts (WBCs) and neutrophil differentiation were assessed in 54.5% of patients (95% CI, 39.8%-69.3%). We also observed that prolonged incubation of cultures was not requested in 13.6% of patients (95% CI, 3.5%-23.8%). In seven of 13 cases (63.6%; 95% CI, 35.2%-92.1%), arthroscopic biopsy was performed; 6 arthroscopies were performed in discordance with the algorithm (12%; 95% CI, 3%-21%). Self-critical analysis of diagnostic processes and monitoring of deviations using algorithms are important and could increase the quality of treatment by revealing recurring faults.
Hirano, Jinichi; Watanabe, Koichiro; Suzuki, Takefumi; Uchida, Hiroyuki; Den, Ryosuke; Kishimoto, Taishiro; Nagasawa, Takashi; Tomita, Yusuke; Hara, Koichiro; Ochi, Hiromi; Kobayashi, Yoshimi; Ishii, Mutsuko; Fujita, Akane; Kanai, Yoshihiko; Goto, Megumi; Hayashi, Hiromi; Inamura, Kanako; Ooshima, Fumiko; Sumida, Mariko; Ozawa, Tomoko; Sekigawa, Kayoko; Nagaoka, Maki; Yoshimura, Kae; Konishi, Mika; Inagaki, Ataru; Saito, Takuya; Motohashi, Nobutaka; Mimura, Masaru; Okubo, Yoshiro; Kato, Motoichiro
2013-01-01
Objective: The use of an algorithm may facilitate measurement-based treatment and result in more rational therapy. We conducted a 1-year, open-label study to compare various outcomes of algorithm-based treatment (ALGO) for schizophrenia versus treatment-as-usual (TAU), for which evidence has been very scarce. Methods: In ALGO, patients with schizophrenia (Diagnostic and Statistical Manual of Mental Disorders, fourth edition) were treated with an algorithm consisting of a series of antipsychotic monotherapies that was guided by the total scores on the Positive and Negative Syndrome Scale (PANSS). When post-treatment PANSS total scores were above 70% of those at baseline in the first and second stages, or above 80% in the third stage, patients proceeded to the next treatment stage with a different antipsychotic. In contrast, TAU represented the best clinical judgment of the treating psychiatrists. Results: Forty-two patients (21 females, 39.0 ± 10.9 years old) participated in this study. The baseline PANSS total score indicated the presence of severe psychopathology and was significantly higher in the ALGO group (n = 25; 106.9 ± 20.0) than in the TAU group (n = 17; 92.2 ± 18.3) (P = 0.021). As a result of treatment, there were no significant differences between the groups in the PANSS reduction rates, premature attrition rates, or a variety of other clinical measures. Despite an effort to make each group unique in pharmacologic treatment, pharmacotherapy in the TAU group eventually became similar in quality to that of the ALGO group. Conclusion: While the results need to be carefully interpreted in light of the hard-to-distinguish treatment manner between the two groups, and more studies are necessary, algorithm-based antipsychotic treatment for schizophrenia compared well to treatment-as-usual in this study. PMID:24143104
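The stage-progression rule stated above translates directly into code. This is a sketch of the published thresholds only, not the study's decision software.

```python
# A patient proceeds to the next antipsychotic monotherapy stage when the
# post-treatment PANSS total remains above a stage-specific fraction of
# baseline: >70% for stages 1-2, >80% for stage 3.
def proceed_to_next_stage(stage: int, baseline_panss: float,
                          post_panss: float) -> bool:
    threshold = 0.70 if stage in (1, 2) else 0.80
    return post_panss > threshold * baseline_panss

print(proceed_to_next_stage(1, 100.0, 75.0))  # True: still >70% of baseline
print(proceed_to_next_stage(3, 100.0, 75.0))  # False: at or below 80%
```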
Devpura, S.; Siddiqui, M. S.; Chen, D.; Liu, D.; Li, H.; Kumar, S.; Gordon, J.; Ajlouni, M.; Movsas, B.; Chetty, I. J.
2014-03-01
The purpose of this study was to systematically evaluate dose distributions computed with 5 different dose algorithms for patients with lung cancers treated using stereotactic ablative body radiotherapy (SABR). Treatment plans for 133 lung cancer patients, initially computed with a 1D pencil-beam (equivalent-path-length, EPL-1D) algorithm, were recalculated with 4 other algorithms commissioned for treatment planning: 3D pencil beam (EPL-3D), anisotropic analytical algorithm (AAA), collapsed cone convolution superposition (CCC), and Monte Carlo (MC). The plan prescription dose was 48 Gy in 4 fractions, normalized to the 95% isodose line. Tumors were classified according to location: peripheral tumors surrounded by lung (lung-island, N=39), peripheral tumors attached to the rib cage or chest wall (lung-wall, N=44), and centrally located tumors (lung-central, N=50). Relative to the EPL-1D algorithm, PTV D95 and mean dose values computed with the other 4 algorithms were lowest for lung-island tumors with the smallest field sizes (3-5 cm). On the other hand, the smallest differences were noted for lung-central tumors treated with the largest field widths (7-10 cm). Among all locations, dose distribution differences were most strongly correlated with tumor size for lung-island tumors. For most cases, the convolution/superposition and MC algorithms were in good agreement. Mean lung dose (MLD) values computed with the EPL-1D algorithm were highly correlated with those of the other algorithms (correlation coefficient = 0.99). MLD values were found to be ~10% lower for small lung-island tumors with the model-based (convolution/superposition and MC) versus the correction-based (pencil-beam) algorithms, with the model-based algorithms predicting greater low-dose spread within the lungs. This study suggests that pencil-beam algorithms should be avoided for lung SABR planning. For the most challenging cases, small tumors surrounded entirely by lung tissue (lung-island type), a Monte Carlo-based algorithm may be warranted.
Pokharel, Shyam; Rana, Suresh; Blikenstaff, Joseph; Sadeghi, Amir; Prestidge, Bradley
2013-07-08
The purpose of this study is to investigate the effectiveness of the HIPO planning and optimization algorithm for real-time prostate HDR brachytherapy. This study consists of 20 patients who underwent ultrasound-based real-time HDR brachytherapy of the prostate using the Oncentra Prostate treatment planning system (SWIFT version 3.0). The treatment plans for all patients were optimized using inverse dose-volume histogram-based optimization followed by graphical optimization (GRO) in real time. GRO is manual manipulation of isodose lines slice by slice, and the quality of the plan depends heavily on planner expertise and experience. The data for all patients were retrieved later, and treatment plans were created and optimized using the HIPO algorithm with the same set of dose constraints, number of catheters, and set of contours as in the real-time optimization. The HIPO algorithm is a hybrid because it combines stochastic and deterministic algorithms. The stochastic algorithm, simulated annealing, searches for optimal catheter distributions for a given set of dose objectives. The deterministic algorithm, dose-volume histogram-based optimization (DVHO), quickly optimizes the three-dimensional dose distribution by moving straight downhill once it is in the advantageous region of the search space found by the stochastic algorithm. The PTV receiving 100% of the prescription dose (V100) was 97.56% with GRO and 95.38% with HIPO. The mean dose (D(mean)) and minimum dose to 10% volume (D10) for the urethra, rectum, and bladder were all statistically lower with HIPO than with GRO (paired Student's t-test at the 5% significance level). HIPO can provide treatment plans with target coverage comparable to that of GRO with a reduction in dose to the critical structures.
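To make the hybrid structure concrete, here is a self-contained toy simulated-annealing loop in the spirit of HIPO's stochastic outer search, with a stand-in cost function in place of the deterministic DVHO inner step. Everything here (grid, "ideal" catheter set, cooling schedule) is invented for illustration and is not the Oncentra implementation.

```python
import math, random

random.seed(0)
grid = [(i, j) for i in range(5) for j in range(5)]  # candidate positions
ideal = {(1, 1), (1, 2), (2, 1), (2, 2), (3, 3)}     # toy optimum

def plan_cost(cats):
    # Stand-in for the DVHO inner step, which in the real system scores a
    # catheter set by optimizing dwell times against dose-volume objectives.
    return len(set(cats) ^ ideal)

def neighbor(cats):
    new = list(cats)
    new[random.randrange(len(new))] = random.choice(grid)
    return new

current = random.sample(grid, 5)
cost = plan_cost(current)
best, best_cost, t = current, cost, 1.0
for _ in range(2000):
    cand = neighbor(current)
    c = plan_cost(cand)
    # Accept improvements always; accept worse plans with probability
    # exp(-dc/t) so the search can escape local minima while t cools.
    if c < cost or random.random() < math.exp((cost - c) / t):
        current, cost = cand, c
        if c < best_cost:
            best, best_cost = cand, c
    t *= 0.999
print(best_cost, sorted(set(best)))
```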
Towards an optimal treatment algorithm for metastatic pancreatic ductal adenocarcinoma (PDA)
Uccello, M.; Moschetta, M.; Mak, G.; Alam, T.; Henriquez, C. Murias; Arkenau, H.-T.
2018-01-01
Chemotherapy remains the mainstay of treatment for advanced pancreatic ductal adenocarcinoma (PDA). Two randomized trials have demonstrated superiority of the combination regimens FOLFIRINOX (5-fluorouracil, leucovorin, oxaliplatin, and irinotecan) and gemcitabine plus nab-paclitaxel over gemcitabine monotherapy as first-line treatment in adequately fit subjects. Selected PDA patients progressing on first-line therapy can receive second-line treatment with moderate clinical benefit. Nevertheless, the optimal algorithm and the role of combination therapy in second line are still unclear. Published second-line PDA clinical trials enrolled patients progressing on gemcitabine-based therapies in use before the approval of nab-paclitaxel and FOLFIRINOX. The evolving scenario in second line may affect the choice of first-line treatment. For example, nanoliposomal irinotecan plus 5-fluorouracil and leucovorin is a novel second-line option that will be suitable only for patients progressing on gemcitabine-based therapy. Therefore, clinical judgement and appropriate patient selection remain key elements in treatment decisions. In this review, we aim to illustrate currently available options and define a possible algorithm to guide treatment choice. Future clinical trials taking into account sequential treatment as a new paradigm in PDA will help define a standard algorithm. PMID:29507500
Yoshimura, Jumpei; Kinoshita, Takahiro; Yamakawa, Kazuma; Matsushima, Asako; Nakamoto, Naoki; Hamasaki, Toshimitsu; Fujimi, Satoshi
2017-06-19
Ventilator-associated pneumonia (VAP) is a common and serious problem in intensive care units (ICUs). Several studies have suggested that Gram staining of endotracheal aspirates is a useful method for accurately diagnosing VAP. However, the usefulness of the Gram stain in predicting which microorganisms cause VAP has not been established. The purpose of this study was to evaluate whether a Gram stain of endotracheal aspirates could be used to determine appropriate initial antimicrobial therapy for VAP. Data on consecutive episodes of microbiologically confirmed VAP were collected from February 2013 to February 2016 in the ICU of a tertiary care hospital in Japan. We constructed two hypothetical empirical antimicrobial treatment algorithms for VAP: a guidelines-based algorithm (GLBA) based on the recommendations of the American Thoracic Society-Infectious Diseases Society of America (ATS-IDSA) guidelines, and a Gram stain-based algorithm (GSBA), which limited the choice of initial antimicrobials according to the results of bedside Gram stains. The GLBA and the GSBA were retrospectively reviewed for each VAP episode. The initial coverage rates and the selection of broad-spectrum antimicrobial agents were compared between the two algorithms. During the study period, 219 suspected VAP episodes were observed and 131 episodes were assessed for analysis. Appropriate antimicrobial coverage rates were not significantly different between the two algorithms (GLBA 95.4% versus GSBA 92.4%; p = 0.134). The number of episodes for which anti-methicillin-resistant Staphylococcus aureus agents were selected as initial treatment was larger in the GLBA than in the GSBA (71.0% versus 31.3%; p < 0.001), as was the number of episodes for which antipseudomonal agents were recommended as initial treatment (70.2% versus 51.9%; p < 0.001). Antimicrobial treatment based on Gram stain results may restrict the administration of broad-spectrum antimicrobial agents without increasing the risk of treatment failure. UMIN-CTR: UMIN000026457. Registered 8 March 2017 (retrospectively registered).
Approaches to drug therapy for COPD in Russia: a proposed therapeutic algorithm.
Zykov, Kirill A; Ovcharenko, Svetlana I
2017-01-01
Until recently, there have been few clinical algorithms for the management of patients with COPD. Current evidence-based clinical management guidelines can appear to be complex, and they lack clear step-by-step instructions. For these reasons, we chose to create a simple and practical clinical algorithm for the management of patients with COPD, which would be applicable to real-world clinical practice, and which was based on clinical symptoms and spirometric parameters that would take into account the pathophysiological heterogeneity of COPD. This optimized algorithm has two main fields, one for nonspecialist treatment by primary care and general physicians and the other for treatment by specialized pulmonologists. Patients with COPD are treated with long-acting bronchodilators and short-acting drugs on a demand basis. If the forced expiratory volume in one second (FEV1) is ≥50% of predicted and symptoms are mild, treatment with a single long-acting muscarinic antagonist or long-acting beta-agonist is proposed. When FEV1 is <50% of predicted and/or the COPD assessment test (CAT) score is ≥10, the use of combined bronchodilators is advised. If there is no response to treatment after three months, referral to a pulmonary specialist is recommended for pathophysiological endotyping: 1) eosinophilic endotype with peripheral blood or sputum eosinophilia >3%; 2) neutrophilic endotype with peripheral blood neutrophilia >60% or green sputum; or 3) pauci-granulocytic endotype. It is hoped that this simple, optimized, step-by-step algorithm will help to individualize the treatment of COPD in real-world clinical practice. This algorithm has yet to be evaluated prospectively or by comparison with other COPD management algorithms, including its effects on patient treatment outcomes. However, it is hoped that this algorithm may be useful in daily clinical practice for physicians treating patients with COPD in Russia.
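The nonspecialist branch of the proposed algorithm can be written as a short decision function. The following is a simplified sketch using the thresholds stated above; the combination of conditions in the first branch is an interpretation of the text, not the authors' exact flowchart.

```python
def copd_initial_treatment(fev1_pct_pred: float, cat_score: int,
                           symptoms_mild: bool) -> str:
    # FEV1 >= 50% predicted with mild symptoms: single long-acting agent.
    if fev1_pct_pred >= 50 and symptoms_mild and cat_score < 10:
        return "single long-acting bronchodilator (LAMA or LABA)"
    # FEV1 < 50% and/or CAT >= 10: combined long-acting bronchodilators.
    if fev1_pct_pred < 50 or cat_score >= 10:
        return "combined long-acting bronchodilators"
    return "reassess symptoms and CAT score"

# No response after 3 months -> refer to a pulmonologist for endotyping
# (eosinophilic: blood/sputum eosinophils >3%; neutrophilic: blood
# neutrophils >60% or green sputum; otherwise pauci-granulocytic).
print(copd_initial_treatment(62, 8, symptoms_mild=True))
```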
Cropsey, Karen L.; Jardin, Bianca; Burkholder, Greer; Clark, C. Brendan; Raper, James L.; Saag, Michael
2015-01-01
Background: Smoking now represents one of the biggest modifiable risk factors for disease and mortality in people living with HIV (PLHIV). To produce significant changes in smoking rates among this population, treatments will need to be both acceptable to the larger segment of PLHIV smokers and feasible to implement in busy HIV clinics. The purpose of this study was to evaluate the feasibility and effects of a novel proactive algorithm-based intervention in an HIV/AIDS clinic. Methods: PLHIV smokers (N = 100) were proactively identified via their electronic medical records and were randomized at baseline to receive a 12-week pharmacotherapy-based algorithm treatment or treatment as usual. Participants were tracked in person for 12 weeks and provided information on smoking behaviors and associated constructs of cessation at each follow-up session. Results: The findings revealed that many smokers utilized prescribed medications when provided with a supply of cessation medication as determined by the algorithm. Compared to smokers receiving treatment as usual, PLHIV smokers prescribed these medications reported more quit attempts and greater reductions in smoking. Proxy measures of cessation readiness (e.g., motivation, self-efficacy) also favored participants receiving algorithm treatment. Conclusions: This algorithm-derived treatment produced positive changes across a number of important clinical markers associated with smoking cessation. Given these promising findings coupled with the brief nature of this treatment, the overall pattern of results suggests strong potential for dissemination into clinical settings as well as significant promise for further advancing clinical health outcomes in this population. PMID:26181705
A pragmatic evidence-based clinical management algorithm for burning mouth syndrome.
Kim, Yohanan; Yoo, Timothy; Han, Peter; Liu, Yuan; Inman, Jared C
2018-04-01
Burning mouth syndrome is a poorly understood disease process with no current standard of treatment. The goal of this article is to provide an evidence-based, practical, clinical algorithm as a guideline for the treatment of burning mouth syndrome. Using available evidence and clinical experience, a multi-step management algorithm was developed. A retrospective cohort study was then performed, following STROBE statement guidelines, comparing outcomes of patients who were managed using the algorithm and those who were managed without. Forty-seven patients were included in the study, with 21 (45%) managed using the algorithm and 26 (55%) managed without. The mean age overall was 60.4 ± 16.5 years, and most patients (39, 83%) were female. Cohorts showed no statistical difference in age, sex, overall follow-up time, dysgeusia, geographic tongue, or psychiatric disorder; xerostomia, however, was significantly different, skewed toward the algorithm group. Significantly more non-algorithm patients did not continue care (69% vs. 29%, p = 0.001). The odds ratio of not continuing care for the non-algorithm group compared to the algorithm group was 5.6 [1.6, 19.8]. Improvement in pain was significantly more likely in the algorithm group (p = 0.001), with an odds ratio of 27.5 [3.1, 242.0]. We present a basic clinical management algorithm for burning mouth syndrome which may increase the likelihood of pain improvement and patient follow-up. Key words: Burning mouth syndrome, burning tongue, glossodynia, oral pain, oral burning, therapy, treatment.
Estimating the chance of success in IVF treatment using a ranking algorithm.
Güvenir, H Altay; Misirli, Gizem; Dilbaz, Serdar; Ozdegirmenci, Ozlem; Demir, Berfu; Dilbaz, Berna
2015-09-01
In medicine, estimating the chance of success of a treatment is important in deciding whether to begin it. This paper focuses on the domain of in vitro fertilization (IVF), where estimating the outcome of a treatment is crucial in the decision to proceed, both for clinicians and for infertile couples. IVF treatment is a stressful and costly process, particularly for couples who want to have a baby. If an initial evaluation indicates a low pregnancy rate, the couple may decide not to start IVF treatment. The aim of this study is twofold: first, to develop a technique that can be used to estimate the chance of success for a couple who wants to have a baby, and second, to determine the attributes and their particular values affecting the outcome of IVF treatment. We propose a new technique, called success estimation using a ranking algorithm (SERA), for estimating the success of a treatment using a ranking-based algorithm. The particular ranking algorithm used here is RIMARC. The performance of the new algorithm is compared with two well-known algorithms that assign class probabilities to query instances: the Naïve Bayes classifier and Random Forest. The comparison is done in terms of area under the ROC curve (AUC), accuracy, and execution time, using tenfold stratified cross-validation. The results indicate that the proposed SERA algorithm has the potential to be used successfully to estimate the probability of success in medical treatment.
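For context, the two comparison baselines and the evaluation protocol described above are easy to reproduce with scikit-learn on stand-in data; the SERA/RIMARC algorithm itself is not publicly packaged, so only the baselines are shown here, and the dataset is synthetic rather than the study's IVF data.

```python
# Tenfold stratified cross-validated AUC for the two comparison classifiers.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Random Forest", RandomForestClassifier(random_state=0))]:
    auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: mean AUC = {auc.mean():.3f}")
```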
Khosravi, H R; Nodehi, Mr Golrokh; Asnaashari, Kh; Mahdavi, S R; Shirazi, A R; Gholami, S
2012-07-01
The aim of this study was to evaluate and analytically compare the different calculation algorithms applied in radiotherapy centers in our country, based on the methodology developed by the IAEA for treatment planning system (TPS) commissioning (IAEA-TECDOC-1583). A thorax anthropomorphic phantom (CIRS 002LFC) was used to perform 7 tests that simulate the whole chain of external beam treatment planning. Doses were measured with ion chambers, and the deviation between the measured and TPS-calculated doses was reported. This methodology, which employs the same phantom and the same setup test cases, was applied in 4 different hospitals using 5 different algorithms/inhomogeneity correction methods implemented in different TPSs. The algorithms in this study were divided into two groups: correction-based and model-based algorithms. A total of 84 clinical test case datasets for different energies and calculation algorithms were produced; the magnitude of the differences at inhomogeneity points of low density (lung) and high density (bone) decreased meaningfully with more advanced algorithms. The number of deviations outside the agreement criteria increased with beam energy and decreased with the sophistication of the TPS calculation algorithm. Large deviations were seen with some correction-based algorithms, so sophisticated algorithms would be preferred in clinical practice, especially for calculations in inhomogeneous media. Use of model-based algorithms with lateral transport calculation is recommended. Some systematic errors revealed during this study show the necessity of performing periodic audits of TPSs in radiotherapy centers. © 2012 American Association of Physicists in Medicine.
Beltran, C; Kamal, H
Purpose: To provide a multicriteria optimization algorithm for intensity-modulated radiation therapy using pencil proton beam scanning. Methods: Intensity-modulated radiation therapy using pencil proton beam scanning requires efficient optimization algorithms to overcome the uncertainties in Bragg peak locations. This work is focused on optimization algorithms that are based on Monte Carlo simulation of the treatment planning and use the weights and the dose-volume histogram (DVH) control points to steer toward desired plans. The proton beam treatment planning process based on single-objective optimization (representing a weighted sum of multiple objectives) usually leads to time-consuming iterations involving treatment planning team members. We provide a time-efficient multicriteria optimization algorithm developed to run on an NVIDIA GPU (graphics processing unit) cluster. The running time of the multicriteria optimization algorithm benefits from up-sampling of the CT voxel size of the calculations without loss of fidelity. Results: We will present preliminary results of multicriteria optimization for intensity-modulated proton therapy based on DVH control points. The results will show optimization results for a phantom case and a brain tumor case. Conclusion: Multicriteria optimization of intensity-modulated radiation therapy using pencil proton beam scanning provides a novel tool for treatment planning. Work supported by a grant from Varian Inc.
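The DVH control points that steer the optimizer are read off a cumulative dose-volume histogram. Below is a minimal numpy sketch with invented toy dose values; real systems compute this per structure from the 3-D dose grid.

```python
import numpy as np

def cumulative_dvh(dose: np.ndarray, levels: np.ndarray) -> np.ndarray:
    # Fraction of the structure's voxels receiving at least each dose level.
    return np.array([(dose >= d).mean() for d in levels])

dose = np.random.default_rng(0).gamma(shape=9.0, scale=6.0, size=10_000)  # toy voxel doses (Gy)
levels = np.linspace(0, dose.max(), 100)
dvh = cumulative_dvh(dose, levels)
# A "DVH control point" such as V50 (fractional volume receiving >= 50 Gy):
print("V50 =", (dose >= 50).mean())
```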
Evaluation of a treatment-based classification algorithm for low back pain: a cross-sectional study.
Stanton, Tasha R; Fritz, Julie M; Hancock, Mark J; Latimer, Jane; Maher, Christopher G; Wand, Benedict M; Parent, Eric C
2011-04-01
Several studies have investigated criteria for classifying patients with low back pain (LBP) into treatment-based subgroups. A comprehensive algorithm was created to translate these criteria into a clinical decision-making guide. This study investigated the translation of the individual subgroup criteria into a comprehensive algorithm by studying the prevalence of patients meeting the criteria for each treatment subgroup and the reliability of the classification. This was a cross-sectional, observational study. Two hundred fifty patients with acute or subacute LBP were recruited from the United States and Australia to participate in the study. Trained physical therapists performed standardized assessments on all participants. The researchers used these findings to classify participants into subgroups. Thirty-one participants were reassessed to determine interrater reliability of the algorithm decision. Based on individual subgroup criteria, 25.2% (95% confidence interval [CI]=19.8%-30.6%) of the participants did not meet the criteria for any subgroup, 49.6% (95% CI=43.4%-55.8%) of the participants met the criteria for only one subgroup, and 25.2% (95% CI=19.8%-30.6%) of the participants met the criteria for more than one subgroup. The most common combination of subgroups was manipulation + specific exercise (68.4% of the participants who met the criteria for 2 subgroups). Reliability of the algorithm decision was moderate (kappa=0.52, 95% CI=0.27-0.77, percentage of agreement=67%). Due to a relatively small patient sample, reliability estimates are somewhat imprecise. These findings provide important clinical data to guide future research and revisions to the algorithm. The finding that 25% of the participants met the criteria for more than one subgroup has important implications for the sequencing of treatments in the algorithm. Likewise, the finding that 25% of the participants did not meet the criteria for any subgroup provides important information regarding potential revisions to the algorithm's bottom table (which guides unclear classifications). Reliability of the algorithm is sufficient for clinical use.
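The reliability statistics reported above (kappa = 0.52, 67% agreement) are standard interrater measures. The following is a small self-contained illustration with invented ratings for two raters over 31 cases; the subgroup labels are placeholders, not the study's data.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n     # raw agreement
    c1, c2 = Counter(r1), Counter(r2)
    labels = set(r1) | set(r2)
    p_exp = sum(c1[l] * c2[l] for l in labels) / n ** 2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp), p_obs

rater1 = ["manip", "stab", "exercise", "manip", "traction"] * 6 + ["stab"]
rater2 = ["manip", "stab", "exercise", "stab", "traction"] * 6 + ["stab"]
kappa, agreement = cohens_kappa(rater1, rater2)
print(f"kappa = {kappa:.2f}, agreement = {agreement:.0%}")
```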
Penfold, Scott; Zalas, Rafał; Casiraghi, Margherita; Brooke, Mark; Censor, Yair; Schulte, Reinhard
2017-05-01
A split feasibility formulation for the inverse problem of intensity-modulated radiation therapy treatment planning with dose-volume constraints included in the planning algorithm is presented. It involves a new type of sparsity constraint that enables the inclusion of a percentage-violation constraint in the model problem and its handling by continuous (as opposed to integer) methods. We propose an iterative algorithmic framework for solving such a problem by applying the feasibility-seeking CQ-algorithm of Byrne combined with the automatic relaxation method that uses cyclic projections. Detailed implementation instructions are furnished. Functionality of the algorithm was demonstrated through the creation of an intensity-modulated proton therapy plan for a simple 2D C-shaped geometry and also for a realistic base-of-skull chordoma treatment site. Monte Carlo simulations of proton pencil beams of varying energy were conducted to obtain dose distributions for the 2D test case. A research release of the Pinnacle 3 proton treatment planning system was used to extract pencil beam doses for a clinical base-of-skull chordoma case. In both cases the beamlet doses were calculated to satisfy dose-volume constraints according to our new algorithm. Examination of the dose-volume histograms following inverse planning with our algorithm demonstrated that it performed as intended. The application of our proposed algorithm to dose-volume constraint inverse planning was successfully demonstrated. Comparison with optimized dose distributions from the research release of the Pinnacle 3 treatment planning system showed the algorithm could achieve equivalent or superior results.
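The feasibility-seeking core named above, Byrne's CQ algorithm, has a compact form: find x in C with Ax in Q via the iteration x_{k+1} = P_C(x_k + γAᵀ(P_Q(Ax_k) − Ax_k)), with 0 < γ < 2/‖A‖². Below is a minimal numpy sketch on a toy beamlet/voxel system; the matrix, bounds, and sizes are invented, and the full algorithm in the paper additionally handles dose-volume (percentage-violation) constraints not shown here.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.uniform(0, 1, (40, 10))           # dose-influence matrix (voxels x beamlets)
lo, hi = 0.8, 1.2                          # per-voxel dose window defining Q

P_C = lambda x: np.maximum(x, 0.0)         # projection onto nonnegative weights
P_Q = lambda d: np.clip(d, lo, hi)         # projection onto the dose window

gamma = 1.0 / np.linalg.norm(A, 2) ** 2    # safe step size (< 2/||A||^2)
x = np.zeros(A.shape[1])
for _ in range(2000):
    d = A @ x
    x = P_C(x + gamma * A.T @ (P_Q(d) - d))

print("residual infeasibility:", np.linalg.norm(A @ x - P_Q(A @ x)))
```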
Wang, Wendy T J; Olson, Sharon L; Campbell, Anne H; Hanten, William P; Gleeson, Peggy B
2003-03-01
The purpose of this study was to determine the effectiveness of an individualized physical therapy intervention in treating neck pain based on a clinical reasoning algorithm. Treatment effectiveness was examined by assessing changes in impairment, physical performance, and disability in response to intervention. One treatment group of 30 patients with neck pain completed physical therapy treatment. The control group of convenience was formed by a cohort of 27 subjects who also had neck pain but did not receive treatment for various reasons. There were no significant differences between groups in demographic data or in the initial test scores of the outcome measures. A quasi-experimental, nonequivalent, pretest-posttest control group design was used. A physical therapist rendered an eclectic intervention to the treatment group based on a clinical decision-making algorithm. Treatment outcome measures included the following five dependent variables: cervical range of motion, numeric pain rating, timed weighted overhead endurance, the supine capital flexion endurance test, and the Patient-Specific Functional Scale. Both the treatment and control groups completed the initial and follow-up examinations, with an average duration of 4 wk between tests. Five mixed analyses of variance with follow-up tests showed a significant difference for all outcome measures in the treatment group compared with the control group. After an average of 4 wk of physical therapy intervention, patients in the treatment group demonstrated statistically significant increases in cervical range of motion, decreases in pain, increases in physical performance measures, and decreases in the level of disability. The control group showed no differences in any of the five outcome variables between the initial and follow-up test scores. This study delineated algorithm-based clinical reasoning strategies for evaluating and treating patients with cervical pain. The algorithm can help clinicians classify patients with cervical pain into clinical patterns and provides pattern-specific guidelines for physical therapy interventions. An organized and specific physical therapy program was effective in improving the status of patients with neck pain.
Zhao, Li; Chen, Chunxia; Li, Bei; Dong, Li; Guo, Yingqiang; Xiao, Xijun; Zhang, Eryong; Qin, Li
2014-01-01
Objective To study the performance of pharmacogenetics-based warfarin dosing algorithms in the initial and the stable warfarin treatment phases in a cohort of Han-Chinese patients undergoing mechanical heart valve replacement. Methods We searched the PubMed, Chinese National Knowledge Infrastructure and Wanfang databases to select pharmacogenetics-based warfarin dosing models. Patients with mechanical heart valve replacement were consecutively recruited between March 2012 and July 2012. The predicted warfarin dose of each patient was calculated and compared with the observed initial and stable warfarin doses. The percentage of patients whose predicted dose fell within 20% of their actual therapeutic dose (percentage within 20%) and the mean absolute error (MAE) were used to evaluate the predictive accuracy of all the selected algorithms. Results A total of 8 algorithms, including the Du, Huang, Miao, Wei, Zhang, Lou, Gage, and International Warfarin Pharmacogenetics Consortium (IWPC) models, were tested in 181 patients. The MAE of the Gage, IWPC and 6 Han-Chinese pharmacogenetics-based warfarin dosing algorithms was less than 0.6 mg/day, and the percentage within 20% exceeded 45% for all of the selected models in both the initial and the stable treatment stages. When patients were stratified according to the warfarin dose range, all of the equations demonstrated better performance in the ideal-dose range (1.88–4.38 mg/day) than in the low-dose range (<1.88 mg/day). Among the 8 algorithms compared, the algorithms of Wei, Huang, and Miao showed a lower MAE and a higher percentage within 20% in both the initial and the stable warfarin dose prediction and in both the low-dose and the ideal-dose ranges. Conclusions All of the selected pharmacogenetics-based warfarin dosing regimens performed similarly in our cohort. However, the algorithms of Wei, Huang, and Miao showed a better potential for warfarin dose prediction in the initial and the stable treatment phases in Han-Chinese patients undergoing mechanical heart valve replacement. PMID:24728385
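The two accuracy metrics used in this study are easy to compute. A minimal illustration in Python (with made-up doses, not study data):

```python
import numpy as np

def dosing_accuracy(predicted, observed):
    """Evaluate a warfarin dosing algorithm against observed doses (mg/day).

    Returns the mean absolute error and the fraction of patients whose
    predicted dose falls within 20% of their actual therapeutic dose.
    """
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    mae = np.mean(np.abs(predicted - observed))
    within_20 = np.mean(np.abs(predicted - observed) <= 0.2 * observed)
    return mae, within_20

mae, pct20 = dosing_accuracy([2.8, 3.5, 1.9], [3.0, 4.5, 2.0])
print(f"MAE = {mae:.2f} mg/day, within 20% = {pct20:.0%}")
```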
A study of optimization techniques in HDR brachytherapy for the prostate
NASA Astrophysics Data System (ADS)
Pokharel, Ghana Shyam
Several studies carried out thus far are in favor of dose escalation to the prostate gland to achieve better local control of the disease. However, the optimal way of delivering higher doses of radiation to the prostate without harming neighboring critical structures is still debated. In this study, we proposed that real-time high dose rate (HDR) brachytherapy with highly efficient and effective optimization could be an alternative means of precise delivery of such higher doses. This delivery approach eliminates critical issues, such as treatment setup uncertainties and target localization, that arise in external beam radiation therapy. Likewise, dosimetry in HDR brachytherapy is not influenced by organ edema and potential source migration as in permanent interstitial implants. Moreover, recent reports of radiobiological parameters further strengthen the argument for using hypofractionated HDR brachytherapy in the management of prostate cancer. Firstly, we studied the essential features and requirements of a real-time HDR brachytherapy treatment planning system. Automated catheter reconstruction with fast editing tools, a fast yet accurate dose engine, and robust and fast optimization and evaluation engines are some of the essential requirements for such procedures. Moreover, in most of the cases we performed, treatment plan optimization took a significant amount of the overall procedure time. So, making treatment plan optimization automatic or semi-automatic, with sufficient speed and accuracy, was the goal of the remaining part of the project. Secondly, we studied the role of the optimization function and constraints in the overall quality of the optimized plan. We studied a gradient-based deterministic algorithm with dose-volume histogram (DVH)-based and more conventional variance-based objective functions for optimization. In this optimization strategy, the relative weight of a particular objective in the aggregate objective function signifies its importance with respect to the other objectives. Based on our study, the DVH-based objective function performed better than the traditional variance-based objective function in creating a clinically acceptable plan when executed under identical conditions. Thirdly, we studied a multiobjective optimization strategy using both DVH-based and variance-based objective functions. The optimization strategy was to create several Pareto-optimal solutions by scanning the clinically relevant part of the Pareto front. This strategy was adopted to decouple optimization from decision making, such that the user could select the final solution from a pool of alternative solutions based on his or her clinical goals. The overall quality of the treatment plan improved using this approach compared to the traditional class-solution approach. In fact, the final optimized plan selected using the decision engine with the DVH-based objective was comparable to a typical clinical plan created by an experienced physicist. Next, we studied a hybrid technique comprising both stochastic and deterministic algorithms to optimize both dwell positions and dwell times. The simulated annealing algorithm was used to find an optimal catheter distribution, and the DVH-based algorithm was used to optimize the 3D dose distribution for a given catheter distribution. This unique treatment planning and optimization tool was capable of producing clinically acceptable, highly reproducible treatment plans in a clinically reasonable time.
Because this algorithm was able to create clinically acceptable plans automatically within a clinically reasonable time, it is appealing for real-time procedures. Next, we studied the feasibility of multiobjective optimization using an evolutionary algorithm for real-time HDR brachytherapy for the prostate. The algorithm, with properly tuned algorithm-specific parameters, was able to create clinically acceptable plans within a clinically reasonable time. However, the algorithm was allowed to run for only a limited number of generations, fewer than is generally considered optimal for such algorithms, in order to keep the time window suitable for real-time procedures. Therefore, further study under improved conditions is required to realize the full potential of the algorithm.
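A DVH-type objective of the kind compared above penalizes violation of a dose-volume point such as "at most a fraction V of the structure may exceed dose D". One common illustrative form (not necessarily the dissertation's exact implementation) penalizes only the cheapest offending voxels:

```python
import numpy as np

def dvh_penalty(dose, d_limit, v_limit):
    """Quadratic penalty for one DVH constraint of the form
    'at most a fraction v_limit of voxels may exceed d_limit'.

    Only voxels violating the constraint beyond the allowed volume
    fraction contribute, so a plan satisfying the DVH point incurs
    zero penalty. Illustrative sketch, not the dissertation's form.
    """
    over = np.sort(dose[dose > d_limit])             # offending voxel doses, ascending
    n_allowed = int(v_limit * dose.size)             # voxels permitted above d_limit
    excess = over[: max(over.size - n_allowed, 0)]   # cheapest-to-fix voxels first
    return np.sum((excess - d_limit) ** 2) / dose.size

# Example: a DVH point "V50Gy <= 30%" evaluated on a random dose sample.
rng = np.random.default_rng(0)
dose = rng.uniform(20, 70, size=1000)
print(dvh_penalty(dose, d_limit=50.0, v_limit=0.30))
```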
[New calculation algorithms in brachytherapy for iridium 192 treatments].
Robert, C; Dumas, I; Martinetti, F; Chargari, C; Haie-Meder, C; Lefkopoulos, D
2018-05-18
Since 1995, brachytherapy dosimetry protocols have followed the methodology recommended by Task Group 43. This methodology, which has the advantage of being fast, is based on several approximations that are not always valid under clinical conditions. Model-based dose calculation algorithms have recently emerged in treatment planning stations and are considered a major evolution because they allow for consideration of the patient's finite dimensions, tissue heterogeneities and the presence of high atomic number materials in applicators. In 2012, a report from the American Association of Physicists in Medicine Radiation Therapy Task Group 186 reviewed these models and made recommendations for their clinical implementation. This review focuses on the use of model-based dose calculation algorithms in the context of iridium 192 treatments. After a description of these algorithms and their clinical implementation, the main questions raised by these new methods are summarized. Considerations regarding the choice of the medium used for dose specification and the recommended methodology for assigning material characteristics are described in particular. In the last part, recent concrete examples from the literature illustrate the capabilities of these new algorithms on clinical cases. Copyright © 2018 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
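For context, the TG-43 formalism that these model-based algorithms refine expresses the dose rate around a sealed source, in its usual 2D line-source form, as:

```latex
\dot{D}(r,\theta) = S_K \,\Lambda\,
  \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta)
```

where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L the radial dose function, F the 2D anisotropy function, and (r_0, θ_0) = (1 cm, π/2) the reference point. The approximations criticized above enter because g_L and F are tabulated for a single source in unbounded water, ignoring patient boundaries, heterogeneities and applicator materials.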
An algorithm for the evaluation and treatment of sacroiliac joint dysfunction.
Carlson, Samuel W; Magee, Sean; Carlson, Walter O
2014-11-01
Approximately 90 percent of adults experience an episode of low back pain (LBP) in their lifetime. Sacroiliac joint (SIJ) dysfunction has been shown to cause approximately 13-30 percent of LBP in the adult population. SIJ fusion is becoming an increasingly popular treatment alternative for SIJ dysfunction. This paper presents a literature-based algorithm to assist the clinician in the evaluation and treatment of patients with suspected SIJ dysfunction.
Evaluation of mathematical algorithms for automatic patient alignment in radiosurgery.
Williams, Kenneth M; Schulte, Reinhard W; Schubert, Keith E; Wroe, Andrew J
2015-06-01
Image registration techniques based on anatomical features can serve to automate patient alignment for intracranial radiosurgery procedures in an effort to improve the accuracy and efficiency of the alignment process as well as potentially eliminate the need for implanted fiducial markers. To explore this option, four two-dimensional (2D) image registration algorithms were analyzed: the phase correlation technique, mutual information (MI) maximization, enhanced correlation coefficient (ECC) maximization, and the iterative closest point (ICP) algorithm. Digitally reconstructed radiographs from the treatment planning computed tomography scan of a human skull were used as the reference images, while orthogonal digital x-ray images taken in the treatment room were used as the captured images to be aligned. The accuracy of aligning the skull with each algorithm was compared to the alignment of the currently practiced procedure, which is based on a manual process of selecting common landmarks, including implanted fiducials and anatomical skull features. Of the four algorithms, three (phase correlation, MI maximization, and ECC maximization) demonstrated clinically adequate (ie, comparable to the standard alignment technique) translational accuracy and improvements in speed compared to the interactive, user-guided technique; however, the ICP algorithm failed to give clinically acceptable results. The results of this work suggest that a combination of different algorithms may provide the best registration results. This research serves as the initial groundwork for the translation of automated, anatomy-based 2D algorithms into a real-world system for 2D-to-2D image registration and alignment for intracranial radiosurgery. This may obviate the need for invasive implantation of fiducial markers into the skull and may improve treatment room efficiency and accuracy. © The Author(s) 2014.
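Of the techniques compared above, phase correlation is the simplest to sketch: the normalized cross-power spectrum of two images has a sharp peak at their relative translation. A minimal Python illustration (toy data, not the study's DRR/x-ray pipeline):

```python
import numpy as np

def phase_correlation_shift(reference, moving):
    """Estimate the integer (dy, dx) translation by which `moving`
    is displaced relative to `reference`, via phase correlation:
    the normalized cross-power spectrum peaks at the shift.
    """
    F1 = np.fft.fft2(reference)
    F2 = np.fft.fft2(moving)
    cross_power = np.conj(F1) * F2
    cross_power /= np.abs(cross_power) + 1e-12   # keep phase only
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Wrap peaks past the midpoint around to negative shifts.
    shifts = [p if p <= s // 2 else p - s
              for p, s in zip(peak, correlation.shape)]
    return tuple(shifts)

# An image shifted by (3, -5) pixels should be recovered exactly.
img = np.random.rand(64, 64)
moved = np.roll(img, shift=(3, -5), axis=(0, 1))
print(phase_correlation_shift(img, moved))  # expected (3, -5)
```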
ERIC Educational Resources Information Center
Hughes, Carroll W.; Emslie, Graham J.; Crismon, M. Lynn; Posner, Kelly; Birmaher, Boris; Ryan, Neal; Jensen, Peter; Curry, John; Vitiello, Benedetto; Lopez, Molly; Shon, Steve P.; Pliszka, Steven R.; Trivedi, Madhukar H.
2007-01-01
Objective: To revise and update consensus guidelines for medication treatment algorithms for childhood major depressive disorder based on new scientific evidence and expert clinical consensus when evidence is lacking. Method: A consensus conference was held January 13-14, 2005, that included academic clinicians and researchers, practicing…
A fast optimization algorithm for multicriteria intensity modulated proton therapy planning.
Chen, Wei; Craft, David; Madden, Thomas M; Zhang, Kewu; Kooy, Hanne M; Herman, Gabor T
2010-09-01
To describe a fast projection algorithm for optimizing intensity modulated proton therapy (IMPT) plans and to describe and demonstrate the use of this algorithm in multicriteria IMPT planning. The authors develop a projection-based solver for a class of convex optimization problems and apply it to IMPT treatment planning. The speed of the solver permits its use in multicriteria optimization, where several optimizations are performed that span the space of possible treatment plans. The authors describe a plan database generation procedure that is customized to the requirements of the solver. The optimality precision of the solver can be specified by the user. The authors apply the algorithm to three clinical cases: a pancreas case, an esophagus case, and a case with a tumor along the rib cage. Detailed analysis of the pancreas case shows that the algorithm is orders of magnitude faster than industry-standard general purpose algorithms (MOSEK's interior point optimizer, primal simplex optimizer, and dual simplex optimizer). Additionally, the projection solver has almost no memory overhead. The speed and guaranteed accuracy of the algorithm make it suitable for use in multicriteria treatment planning, which requires the computation of several diverse treatment plans. Additionally, given the low memory overhead of the algorithm, the method can be extended to include multiple geometric instances and proton range possibilities, for robust optimization.
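The multicriteria idea is to precompute a database of plans spanning the trade-off between objectives and let the planner navigate it afterward. A toy sketch of such a database-generation loop, using a weighted-sum scan and nonnegative least squares in place of the paper's projection solver (all data illustrative):

```python
import numpy as np
from scipy.optimize import nnls

def plan_database(A_target, d_target, A_oar, weights):
    """Generate a database of plans spanning the trade-off between
    target coverage and OAR sparing, one plan per weight.

    Each plan minimizes  w * ||A_t x - d_t||^2 + (1 - w) * ||A_o x||^2
    over nonnegative beamlet intensities x, solved as a stacked
    nonnegative least-squares problem. Illustrative only; the paper's
    projection solver handles a more general convex class.
    """
    plans = []
    for w in weights:
        A = np.vstack([np.sqrt(w) * A_target, np.sqrt(1 - w) * A_oar])
        b = np.concatenate([np.sqrt(w) * d_target,
                            np.zeros(A_oar.shape[0])])
        x, _ = nnls(A, b)
        plans.append(x)
    return plans

rng = np.random.default_rng(1)
A_t, A_o = rng.random((40, 8)), rng.random((30, 8))
db = plan_database(A_t, np.full(40, 60.0), A_o,
                   weights=[0.5, 0.7, 0.9, 0.99])
```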
Fosse-Edorh, S; Rigou, A; Morin, S; Fezeu, L; Mandereau-Bruno, L; Fagot-Campagna, A
2017-10-01
Medico-administrative databases represent a very interesting source of information in the field of endocrine, nutritional and metabolic diseases. The objective of this article is to describe the early work of the Redsiam working group in this field. Algorithms developed in France in the fields of diabetes, the treatment of dyslipidemia, precocious puberty, and bariatric surgery, based on data from the National Inter-scheme Information System on Health Insurance (SNIIRAM), were identified and described. Three algorithms for identifying people with diabetes are available in France. These algorithms are based either on full insurance coverage for diabetes, or on claims for diabetes treatments, or on the combination of these two methods together with hospitalizations related to diabetes. Each of these algorithms has a different purpose, and the choice should depend on the goal of the study. Algorithms for identifying people treated for dyslipidemia or precocious puberty or who underwent bariatric surgery are also available. This early work of the Redsiam working group in the field of endocrine, nutritional and metabolic diseases produced an inventory of existing algorithms in France, linked with their goals, together with a presentation of their limitations and advantages, providing useful information for the scientific community. This work will continue with discussions of algorithms on the incidence of diabetes in children, thyroidectomy for thyroid nodules, hypothyroidism, hypoparathyroidism, and amyloidosis. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
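As a flavor of how such claims-based case-finding algorithms combine criteria, here is a hypothetical rule in Python; the field names and the three-claim threshold are illustrative placeholders, not the published SNIIRAM definitions:

```python
def has_diabetes(record):
    """Hypothetical claims-based case-finding rule in the spirit of the
    algorithms described above. Field names and the 3-claim threshold
    are illustrative, not the published SNIIRAM definitions."""
    full_coverage = "diabetes" in record.get("full_coverage_conditions", [])
    repeated_claims = record.get("antidiabetic_claims", 0) >= 3
    related_stay = record.get("diabetes_hospitalization", False)
    return full_coverage or repeated_claims or related_stay

print(has_diabetes({"antidiabetic_claims": 4}))  # True
```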
Marčan, Marija; Pavliha, Denis; Kos, Bor; Forjanič, Tadeja; Miklavčič, Damijan
2015-01-01
Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and is best ensured by patient-specific treatment planning. The treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of these algorithms and their application in the clinic. The web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for the different components of the tool. Both the user reports and the performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating the expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for patients. An additional value of the tool is the possibility of easy upgrade and the integration of modules with new functionalities as they are developed.
Russian guidelines for the management of COPD: algorithm of pharmacologic treatment
Aisanov, Zaurbek; Avdeev, Sergey; Arkhipov, Vladimir; Belevskiy, Andrey; Chuchalin, Alexander; Leshchenko, Igor; Ovcharenko, Svetlana; Shmelev, Evgeny; Miravitlles, Marc
2018-01-01
The high prevalence of COPD, together with its high level of misdiagnosis and late diagnosis, dictates the necessity for the development and implementation of clinical practice guidelines (CPGs) in order to improve the management of this disease. High-quality, evidence-based international CPGs need to be adapted to the particular situation of each country or region. A new version of the Russian Respiratory Society guidelines released at the end of 2016 was based on the proposal by the Global Initiative for Obstructive Lung Disease but adapted to the characteristics of the Russian health system, and it included an algorithm of pharmacologic treatment of COPD. The proposed algorithm had to comply with the requirements of the Russian Ministry of Health for inclusion in the unified electronic rubricator, which required a balance between the level of information and the simplicity of the graphic design. This was achieved by excluding the initial diagnostic process, grouping together the common pharmacologic and nonpharmacologic measures for all patients, and deciding not to use the letters A–D, for simplicity and clarity. At all stages of the treatment algorithm, efficacy and safety have to be carefully assessed. Escalation and de-escalation are possible in cases of absent or insufficient efficacy or of safety issues. Bronchodilators should not be discontinued except in the case of significant side effects. At the same time, inhaled corticosteroid (ICS) withdrawal is not represented in the algorithm, because it was agreed that there is insufficient evidence to establish clear criteria for ICS discontinuation. Finally, based on the Global Initiative for Obstructive Lung Disease statement, the proposed algorithm reflects and summarizes different approaches to the pharmacological treatment of COPD, taking into account the reality of health care in the Russian Federation. PMID:29386887
Scott, Frank I; Shah, Yash; Lasch, Karen; Luo, Michelle; Lewis, James D
2018-01-18
Vedolizumab, an α4β7 integrin monoclonal antibody inhibiting gut lymphocyte trafficking, is an effective treatment for ulcerative colitis (UC). We evaluated the optimal position of vedolizumab in the UC treatment paradigm. Using Markov modeling, we assessed multiple algorithms for the treatment of UC. The base case was a 35-year-old male with steroid-dependent, moderately to severely active UC without previous immunomodulator or biologic use. The model included 4 different algorithms over 1 year, with vedolizumab use prior to: initiating azathioprine (Algorithm 1), combination therapy with infliximab and azathioprine (Algorithm 2), combination therapy with an alternative anti-tumor necrosis factor (anti-TNF) and azathioprine (Algorithm 3), and colectomy (Algorithm 4). Transition probabilities and quality-adjusted life-year (QALY) estimates were derived from the published literature. Primary analyses simulated 100 trials of 100,000 individuals each, assessing clinical outcomes and QALYs. Sensitivity analyses employed longer time horizons and ranges for all variables. Algorithm 1 (vedolizumab use prior to all other therapies) was the preferred strategy, resulting in 8981 additional individuals in remission, 18 fewer cases of lymphoma, and 1087 fewer serious infections per 100,000 patients compared with last-line use (Algorithm 4). Algorithm 1 also resulted in 0.0197 to 0.0205 more QALYs compared with the other algorithms. This benefit increased with longer time horizons. Algorithm 1 was preferred in all sensitivity analyses. The model suggests that treatment algorithms positioning vedolizumab prior to other therapies should be considered for individuals with moderately to severely active steroid-dependent UC. Further prospective research is needed to confirm these simulated results. © 2018 Crohn's & Colitis Foundation of America. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
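The Markov cohort machinery behind such comparisons is compact. A minimal sketch in Python, with hypothetical states, transition probabilities and utility weights standing in for the study's published inputs:

```python
import numpy as np

def markov_cohort(P, utilities, start, n_cycles):
    """Markov cohort simulation: P is the per-cycle transition matrix
    over health states, `utilities` the per-cycle QALY weight of each
    state, `start` the initial state distribution. Returns accumulated
    QALYs per person. All numbers below are illustrative placeholders."""
    dist, qalys = np.asarray(start, float), 0.0
    for _ in range(n_cycles):
        qalys += dist @ utilities
        dist = dist @ P
    return qalys

# States: remission, active disease, post-colectomy (hypothetical values).
P = np.array([[0.85, 0.13, 0.02],
              [0.30, 0.65, 0.05],
              [0.00, 0.00, 1.00]])
u = np.array([0.88, 0.64, 0.72]) / 12.0   # monthly QALY weights
print(markov_cohort(P, u, start=[0, 1, 0], n_cycles=12))  # one year
```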
Khan, Muhammad Burhan; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Lai, Koon Chun
2017-12-01
Image processing and analysis is an effective tool for monitoring and fault diagnosis of activated sludge (AS) wastewater treatment plants. AS images comprise flocs (microbial aggregates) and filamentous bacteria. In this paper, nine different approaches are proposed for image segmentation of phase-contrast microscopic (PCM) images of AS samples. The proposed strategies are assessed for their effectiveness from the perspective of the microscopic artifacts associated with PCM. The first approach uses an algorithm based on the idea that color space representations of images other than red-green-blue may have better contrast. The second uses an edge detection approach. The third strategy employs a clustering algorithm for the segmentation, and the fourth applies local adaptive thresholding. The fifth technique is based on texture-based segmentation, and the sixth uses the watershed algorithm. The seventh adopts a split-and-merge approach. The eighth employs Kittler's thresholding. Finally, the ninth uses a top-hat and bottom-hat filtering-based technique. The approaches are assessed and analyzed critically with reference to the artifacts of PCM. Gold-standard approximations of ground-truth images are prepared to assess the segmentations. Overall, the edge detection-based approach exhibits the best results in terms of accuracy, and the texture-based algorithm in terms of false negative ratio. The respective scenarios in which the edge detection and texture-based algorithms are suitable are explained.
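A minimal edge-detection segmentation in the spirit of the best-performing approach above: Sobel gradient magnitude, thresholding of strong edges, then morphological closing and hole filling to obtain candidate floc regions. A sketch with illustrative parameters, not the paper's tuned pipeline:

```python
import numpy as np
from scipy import ndimage

def edge_based_segmentation(image, edge_quantile=0.90):
    """Segment bright objects by detecting strong edges and filling
    the enclosed regions. `edge_quantile` sets the edge threshold."""
    gy = ndimage.sobel(image, axis=0, mode="reflect")
    gx = ndimage.sobel(image, axis=1, mode="reflect")
    magnitude = np.hypot(gx, gy)
    edges = magnitude > np.quantile(magnitude, edge_quantile)
    # Close small gaps in edge contours, then fill enclosed holes.
    mask = ndimage.binary_fill_holes(ndimage.binary_closing(edges))
    return mask

rng = np.random.default_rng(2)
img = rng.random((128, 128))
img[40:80, 40:90] += 1.0          # a bright synthetic "floc"
print(edge_based_segmentation(img).sum())
```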
Phan, Philippe; Mezghani, Neila; Aubin, Carl-Éric; de Guise, Jacques A; Labelle, Hubert
2011-07-01
Adolescent idiopathic scoliosis (AIS) is a complex spinal deformity whose assessment and treatment present many challenges. Computer applications have been developed to assist clinicians. A literature review of computer applications used in AIS evaluation and treatment was undertaken. The algorithms used, their accuracy and their clinical usability were analyzed. Computer applications have been used to create new classifications for AIS based on 2D and 3D features, assess scoliosis severity or risk of progression, and assist bracing and surgical treatment. It was found that classification accuracy could be improved using computer algorithms, that AIS patient follow-up and screening could be done using surface topography, thereby limiting radiation exposure, and that bracing and surgical treatment could be optimized using simulations. Yet few computer applications are routinely used in clinics. With the development of 3D imaging and databases, huge amounts of clinical and geometrical data need to be taken into consideration when researching and managing AIS. Computer applications based on advanced algorithms will be able to handle tasks that could otherwise not be done, which could improve the management of AIS patients. Clinically oriented applications, and evidence that they can improve current care, will be required for their integration in the clinical setting.
NASA Astrophysics Data System (ADS)
Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Ajlouni, Munther; Chetty, Indrin J.
2010-08-01
Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc.) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features of the iPlan system, such as options to select dose to water or dose to medium and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC-based SBRT treatment planning in the routine clinical setting.
Ostreĭkov, I F; Podkopaev, V N; Moiseev, D B; Karpysheva, E V; Markova, L A; Sizov, S V
1997-01-01
Total mortality in the neonatal intensive care wards of the Tushino Pediatric Hospital decreased 2.5-fold in 1996 and is now 7.6%. These results are due to a complex of measures, one of which was the development and introduction of an algorithm for the diagnosis and treatment of newborns hospitalized in intensive care wards. The algorithm facilitates the work of the staff, helps diagnose disease earlier, and hence allows timely, scientifically based therapy to be carried out.
Formal verification of an oral messages algorithm for interactive consistency
NASA Technical Reports Server (NTRS)
Rushby, John
1992-01-01
The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
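For readers unfamiliar with the underlying protocol, the recursive Oral Messages algorithm OM(m) of Lamport, Shostak and Pease is compact enough to sketch. The toy below models traitors simply as inverting what they relay (a real traitor may send arbitrary, inconsistent values), which suffices to show the majority-vote recursion; it is an illustration of the classic algorithm, not Rushby's formal treatment:

```python
from collections import Counter

def om(m, commander, lieutenants, value, faulty):
    """Oral Messages algorithm OM(m): returns the value each
    lieutenant finally decides on. Traitors invert what they send."""
    def send(src, v, dst):
        return (1 - v) if src in faulty else v   # traitors lie

    if m == 0:
        return {lt: send(commander, value, lt) for lt in lieutenants}

    # Each lieutenant relays what it heard to the others via OM(m - 1).
    reports = {
        lt: om(m - 1, lt, [o for o in lieutenants if o != lt],
               send(commander, value, lt), faulty)
        for lt in lieutenants
    }
    decided = {}
    for lt in lieutenants:
        heard = [send(commander, value, lt)] + \
                [reports[o][lt] for o in lieutenants if o != lt]
        decided[lt] = Counter(heard).most_common(1)[0][0]   # majority vote
    return decided

# 4 generals, 1 traitor (lieutenant 3): OM(1) lets the loyal
# lieutenants 1 and 2 still agree on the commander's value.
print(om(1, commander=0, lieutenants=[1, 2, 3], value=1, faulty={3}))
```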
Korean Medication Algorithm for Depressive Disorder: Comparisons with Other Treatment Guidelines
Wang, Hee Ryung; Bahk, Won-Myong; Seo, Jeong Seok; Woo, Young Sup; Park, Young-Min; Jeong, Jong-Hyun; Kim, Won; Shim, Se-Hoon; Lee, Jung Goo; Jon, Duk-In; Min, Kyung Joon
2017-01-01
In this review, we compared recommendations from the Korean Medication Algorithm Project for Depressive Disorder 2017 (KMAP-DD 2017) to other global treatment guidelines for depression. Six global treatment guidelines were reviewed; among the six, 4 were evidence-based guidelines, 1 was an expert consensus-based guideline, and 1 was an amalgamation of both evidence and expert consensus-based recommendations. The recommendations in the KMAP-DD 2017 were generally similar to those in other global treatment guidelines, although there were some differences between the guidelines. The KMAP-DD 2017 appeared to reflect current changes in the psychopharmacology of depression quite well, like other recently published evidence-based guidelines. As an expert consensus-based guideline, the KMAP-DD 2017 had some limitations. However, considering there are situations in which clinical evidence cannot be drawn from planned clinical trials, the KMAP-DD 2017 may be helpful for Korean psychiatrists making decisions in the clinical settings by complementing previously published evidence-based guidelines. PMID:28783928
Apostolidis, Apostolos; Averbeck, Marcio Augusto; Sahai, Arun; Rahnama'i, Mohhamad Sajjad; Anding, Ralf; Robinson, Dudley; Gravas, Stavros; Dmochowski, Roger
2017-04-01
To review and assess the definitions of drug resistance and the evidence supporting treatment for drug-resistant overactive bladder/detrusor overactivity (OAB/DO). A review of the extant literature and consensus of opinion were used to derive the summary recommendations. Drug resistance or drug-refractory status has been inconsistently defined and reported in current evidence sources. Recent publications use some combination of lack of efficacy and/or experienced side effects to define drug resistance. Algorithms based upon these definitions largely relate to the appropriate use of neuromodulation or botulinum neurotoxin, based upon patient selection and patient choice. Current treatment pathways are hampered by the inability to consistently profile patients to optimize management, particularly after failure of initial pragmatic treatment. Further research is recommended to better identify patient phenotypes for the purpose of directing optimized therapy for OAB/DO. Current treatment algorithms are influenced by extensive data generated from recent neuromodulation and botulinum neurotoxin trials. © 2017 Wiley Periodicals, Inc.
Sorensen, Lene; Nielsen, Bent; Stage, Kurt B; Brøsen, Kim; Damkier, Per
2008-01-01
The objective of the study was to develop, implement and evaluate two treatment algorithms, for schizophrenia and depression, at a psychiatric hospital department. The treatment algorithms were based on the available literature and developed in collaboration between psychiatrists, clinical pharmacologists and a clinical pharmacist. The treatment algorithms were introduced at a meeting for all psychiatrists, reinforced by the project psychiatrists in the daily routine, and used for the education of young doctors and medical students. A quantitative pre-post evaluation was conducted using data from medical charts, and qualitative information was collected by interviews. In general, no significant differences were found when comparing outcomes from 104 charts from the baseline period with 96 charts from the post-intervention period. Most of the patients (65% in the post-intervention period) admitted during the data collection periods did not receive any medication changes. Of the patients undergoing medication changes in the post-intervention period, 56% followed the algorithms, and 70% of the patients admitted to the psychiatric hospital department for the first time had their medications changed according to the algorithms. All of the 10 interviewed doctors found the algorithms useful. The treatment algorithms were successfully implemented, with a high degree of satisfaction among the interviewed doctors. The majority of patients admitted to the psychiatric hospital department for the first time had their medications changed according to the algorithms.
2013-01-01
Background The high burden and rising incidence of cardiovascular disease (CVD) in resource-constrained countries necessitate implementation of robust and pragmatic primary and secondary prevention strategies. Many current CVD management guidelines recommend absolute cardiovascular (CV) risk assessment as a clinically sound guide to preventive and treatment strategies. The development of non-laboratory based cardiovascular risk assessment algorithms enables absolute risk assessment in resource-constrained countries. The objective of this review is to evaluate the performance of existing non-laboratory based CV risk assessment algorithms using the benchmarks for clinically useful CV risk assessment algorithms outlined by Cooney and colleagues. Methods A literature search to identify non-laboratory based risk prediction algorithms was performed in the MEDLINE, CINAHL, Ovid Premier Nursing Journals Plus, and PubMed databases. The identified algorithms were evaluated using the benchmarks for clinically useful cardiovascular risk assessment algorithms outlined by Cooney and colleagues. Results Five non-laboratory based CV risk assessment algorithms were identified. The Gaziano and Framingham algorithms met the criteria for appropriateness of the statistical methods used to derive the algorithms and endpoints. The Swedish Consultation, Framingham and Gaziano algorithms demonstrated good discrimination in derivation datasets. Only the Gaziano algorithm was externally validated, where it had optimal discrimination. The Gaziano and WHO algorithms had chart formats, which made them simple and user-friendly for clinical application. Conclusion Both the Gaziano and Framingham non-laboratory based algorithms met most of the criteria outlined by Cooney and colleagues. External validation of the algorithms in diverse samples is needed to ascertain their performance and applicability to different populations and to enhance clinicians' confidence in them. PMID:24373202
Influence of different dose calculation algorithms on the estimate of NTCP for lung complications.
Hedin, Emma; Bäck, Anna
2013-09-06
Due to limitations and uncertainties in dose calculation algorithms, different algorithms can predict different dose distributions and dose-volume histograms for the same treatment. This can be a problem when estimating the normal tissue complication probability (NTCP) for patient-specific dose distributions. Published NTCP model parameters are often derived for a different dose calculation algorithm than the one used to calculate the actual dose distribution. The use of algorithm-specific NTCP model parameters can prevent errors caused by differences in dose calculation algorithms. The objective of this work was to determine how to change the NTCP model parameters for lung complications derived for a simple correction-based pencil beam dose calculation algorithm, in order to make them valid for three other common dose calculation algorithms. NTCP was calculated with the relative seriality (RS) and Lyman-Kutcher-Burman (LKB) models. The four dose calculation algorithms used were the pencil beam (PB) and collapsed cone (CC) algorithms employed by Oncentra, and the pencil beam convolution (PBC) and anisotropic analytical algorithm (AAA) employed by Eclipse. Original model parameters for lung complications were taken from four published studies on different grades of pneumonitis, and new algorithm-specific NTCP model parameters were determined. The difference between original and new model parameters was presented in relation to the reported model parameter uncertainties. Three different types of treatments were considered in the study: tangential and locoregional breast cancer treatment and lung cancer treatment. Changing the algorithm without the derivation of new model parameters caused changes in the NTCP value of up to 10 percentage points for the cases studied. Furthermore, the error introduced could be of the same magnitude as the confidence intervals of the calculated NTCP values. The new NTCP model parameters were tabulated as the algorithm was varied from PB to PBC, AAA, or CC. Moving from the PB to the PBC algorithm did not require new model parameters; however, moving from PB to AAA or CC did require a change in the NTCP model parameters, with CC requiring the largest change. It was shown that the new model parameters for a given algorithm are different for the different treatment types.
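For reference, the two NTCP models named above take their usual textbook forms, with (D_i, v_i) the dose-volume histogram bins:

```latex
% Lyman-Kutcher-Burman (LKB)
\mathrm{NTCP} = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{t} e^{-x^{2}/2}\,dx,
\quad t = \frac{D_{\mathrm{eff}} - TD_{50}}{m\,TD_{50}},
\quad D_{\mathrm{eff}} = \Bigl(\sum_{i} v_{i}\, D_{i}^{1/n}\Bigr)^{n}

% Relative seriality (RS), with P(D) the uniform-irradiation response
\mathrm{NTCP} = \Bigl[\,1 - \prod_{i}\bigl(1 - P(D_{i})^{s}\bigr)^{v_{i}}\Bigr]^{1/s}
```

Here TD_50, m and n (LKB) and s (RS) are exactly the fitted tissue parameters that this study re-derives for each dose calculation algorithm.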
A Pharmacotherapy Algorithm for Stabilization and Maintenance of Pediatric Bipolar Disorder
ERIC Educational Resources Information Center
Pavuluri, Mani N.; Henry, David B.; Devineni, Bhargavi; Carbray, Julie A.; Naylor, Michael W.; Janicak, Philip G.
2004-01-01
Objective: To assess the feasibility and effectiveness of an evidence-based pharmacotherapy algorithm in the treatment of pediatric bipolar disorder. Method: The study reports the results of a study of 64 bipolar type I subjects who were treated according to an algorithm developed in our specialty clinic. All subjects had been diagnosed using the…
Nasal Base Retraction: A Treatment Algorithm.
Tas, Süleyman; Colakoglu, Salih; Lee, Bernard Travis
2017-06-01
Nasal base retraction results from cephalic malposition of the alar base in the vertical plane, which causes disharmony of the alar base with the rest of the nasal structures. Correcting nasal base retraction is very important for improved aesthetic outcomes; however, there is a limited body of literature about this deformity and its treatment. The aim was to create a nasal base retraction treatment algorithm based on a severity classification system. This is a retrospective case review of 53 patients who underwent rhinoplasty with correction of alar base retraction by the senior author (S.T.). The minimum follow-up time was 6 months. Levator labii alaeque nasi muscle dissection or alar base release, with or without a rim graft on the affected side, was performed based on the severity of the alar base retraction. Aesthetic results were assessed with objective grading of preoperative and postoperative patient photographs by two independent plastic surgeons. Functional improvement was assessed with patient self-evaluations of nasal patency. A rhinoplasty outcomes evaluation (ROE) questionnaire was also distributed to patients. Comparison of preoperative and postoperative photographs demonstrated that nasal base asymmetry was significantly improved in all cases, and 85% of the patients had complete symmetry. Nasal obstruction was also significantly reduced after surgery (P < 0.001). The majority of patients reported satisfaction (92.5%), with an ROE total score greater than or equal to 20. We present new techniques and a treatment algorithm for correcting nasal base retraction deformities that will help rhinoplasty surgeons obtain aesthetically and functionally pleasing outcomes for their patients. © 2017 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com
Zhou, Lu; Zhen, Xin; Lu, Wenting; Dou, Jianhong; Zhou, Linghong
2012-01-01
To validate the efficiency of an improved Demons deformable registration algorithm and to evaluate its application to registration of the treatment image with the planning image in image-guided radiotherapy (IGRT). Based on Brox's gradient constancy assumption and Malis's efficient second-order minimization algorithm, a grey-value gradient similarity term was added to the original energy function, and a formula was derived to calculate the update of the transformation field. The limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) algorithm was used to optimize the energy function, allowing automatic determination of the iteration number. The proposed algorithm was validated using mathematically deformed images, physically deformed phantom images and clinical tumor images. Compared with the original additive Demons algorithm, the improved Demons algorithm achieved higher precision and a faster convergence speed. Because scanning conditions may differ across fractions in fractionated radiotherapy, the density ranges of the treatment image and the planning image may differ as well. The improved Demons algorithm can achieve faster and more accurate registration under these conditions.
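As background, the classic additive Demons update that this work improves upon displaces each pixel along the fixed-image gradient, scaled by the intensity mismatch, and then smooths the displacement field. A minimal sketch of one iteration (the paper's variant adds a gradient-similarity term and L-BFGS optimization on top; signs follow the backward-warping convention of the helper below):

```python
import numpy as np
from scipy import ndimage

def warp(image, field):
    """Resample `image` at positions displaced by `field` (2, H, W)."""
    yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]].astype(float)
    return ndimage.map_coordinates(image, [yy + field[0], xx + field[1]],
                                   order=1, mode="nearest")

def demons_step(fixed, moving, field, sigma=1.0):
    """One basic additive Demons iteration (Thirion's passive force)."""
    diff = warp(moving, field) - fixed
    gy, gx = np.gradient(fixed)
    denom = gx**2 + gy**2 + diff**2
    denom[denom == 0] = np.inf            # avoid division by zero
    field[0] -= diff * gy / denom         # demons force, y component
    field[1] -= diff * gx / denom         # demons force, x component
    # Gaussian regularization keeps the deformation smooth.
    field[0] = ndimage.gaussian_filter(field[0], sigma)
    field[1] = ndimage.gaussian_filter(field[1], sigma)
    return field

# Toy usage: recover a small rigid shift of a square.
fixed = np.zeros((64, 64)); fixed[20:40, 20:40] = 1.0
moving = np.roll(fixed, 3, axis=0)
field = np.zeros((2, 64, 64))
for _ in range(50):
    field = demons_step(fixed, moving, field)
```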
Chetty, Indrin J; Curran, Bruce; Cygler, Joanna E; DeMarco, John J; Ezzell, Gary; Faddegon, Bruce A; Kawrakow, Iwan; Keall, Paul J; Liu, Helen; Ma, C M Charlie; Rogers, D W O; Seuntjens, Jan; Sheikh-Bagheri, Daryoush; Siebers, Jeffrey V
2007-12-01
The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques. The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
NASA Astrophysics Data System (ADS)
Laguda, Edcer Jerecho
Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient's medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast-acquisition imaging device with higher spatial resolution and a higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented through a gray scale of independent values in Hounsfield units (HU). High HU-valued materials represent higher density. High-density materials, such as metal, tend to erroneously increase the HU values around them due to reconstruction software limitations. This problem of increased HU values due to the presence of metal is referred to as metal artefacts. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects that are of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts for better image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This research study evaluates the dosimetric effect of metal artefact reduction algorithms on CT images with severe artefacts. The study uses the Gemstone Spectral Imaging (GSI)-based MAR algorithm, the projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method. Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), and the Dual-Energy Imaging Method was developed at Duke University. All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc plans and single-arc plans, using the Volumetric Modulated Arc Therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used the projection-based MAR algorithm and the Dual-Energy method. Calculated doses (mean, minimum, and maximum doses) to the planning treatment volume (PTV) were compared and the homogeneity index (HI) calculated. Results: (1) Without the GSI-based MAR application, a percent error between the mean dose and the absolute dose ranging from 3.4-5.7% per fraction was observed. In contrast, the error was decreased to a range of 0.09-2.3% per fraction with the GSI-based MAR algorithm. There was a percent difference ranging from 1.7-4.2% per fraction between plans with and without the GSI-based MAR algorithm. (2) A range of 0.1-3.2% difference was observed for the maximum dose values, 1.5-10.4% for the minimum dose difference, and 1.4-1.7% difference in the mean doses. Homogeneity indexes (HI) ranging from 0.068-0.065 for the dual-energy method and 0.063-0.141 with the projection-based MAR algorithm were also calculated. Conclusion: (1) The percent error without the GSI-based MAR algorithm may deviate by as much as 5.7%. This error undermines the goal of radiation therapy to provide a more precise treatment.
Thus, the GSI-based MAR algorithm is desirable due to its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of the different techniques, but deviation was evident in the maximum and minimum doses. The HI for the dual-energy method almost achieved the desirable null value. In conclusion, the Dual-Energy method gave better dose calculation accuracy to the planning treatment volume (PTV) for images with metal artefacts than either with or without the GE MAR algorithm.
Finkelsztejn, Alessandro; Gabbai, Alberto Alain; Fragoso, Yara Dadalti; Carrá, Adriana; Macías-Islas, Miguel Angel; Arcega-Revilla, Raul; García-Bonitto, Juan; Oehninger-Gatti, Carlos Luis; Orozco-Escobar, Geraldine; Tarulla, Adriana; Vergara, Fernando; Vizcarra, Darwin
2012-10-01
It is estimated that approximately 50,000 individuals have relapsing-remitting multiple sclerosis in Latin America. European and North American algorithms for the treatment of multiple sclerosis do not take into account our regional difficulties or patients' access to treatment. The Latin American Multiple Sclerosis Forum is an independent and supra-institutional group of experts that has assessed the latest scientific evidence regarding the efficacy and safety of disease-modifying treatments. Access to treatment and pharmacovigilance programs in each of the eight countries represented at the Forum were also analyzed. A specific set of guidelines based upon evidence-based recommendations was designed for Latin America. Future perspectives on multiple sclerosis treatment were also discussed. The present paper reflects the effort of representatives from eight countries to address a matter that cannot be handled in our region by directly adopting purely European and North American treatment guidelines.
Prosthetic joint infection: development of an evidence-based diagnostic algorithm.
Mühlhofer, Heinrich M L; Pohlig, Florian; Kanz, Karl-Georg; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; Kelch, Sarah; Harrasser, Norbert; von Eisenhart-Rothe, Rüdiger; Schauwecker, Johannes
2017-03-09
Increasing rates of prosthetic joint infection (PJI) have presented challenges for general practitioners, orthopedic surgeons and the health care system in recent years. The diagnosis of PJI is complex; multiple diagnostic tools are used in the attempt to correctly diagnose PJI. Evidence-based algorithms can help to identify PJI using standardized diagnostic steps. We reviewed relevant publications between 1990 and 2015 using a systematic literature search in MEDLINE and PUBMED. The selected search results were then classified into levels of evidence. The keywords were prosthetic joint infection, biofilm, diagnosis, sonication, antibiotic treatment, implant-associated infection, Staph. aureus, rifampicin, implant retention, pcr, maldi-tof, serology, synovial fluid, c-reactive protein level, total hip arthroplasty (THA), total knee arthroplasty (TKA) and combinations of these terms. From an initial 768 publications, 156 publications were stringently reviewed. Publications with class I-III recommendations (EAST) were considered. We developed an algorithm for the diagnostic approach that displays the complex diagnosis of PJI as a clear and logically structured process according to ISO 5807. The evidence-based standardized algorithm combines modern clinical requirements and evidence-based treatment principles. The algorithm provides a detailed, transparent standard operating procedure (SOP) for diagnosing PJI. Thus, consistently high, examiner-independent process quality is assured to meet the demands of modern quality management in PJI diagnosis.
Bhidayasiri, Roongroj; Jitkritsadakul, Onanong; Friedman, Joseph H; Fahn, Stanley
2018-06-15
Management of tardive syndromes (TS) is challenging, with only a few evidence-based therapeutic algorithms reported in the American Academy of Neurology (AAN) guideline in 2013. To update the evidence-based recommendations and provide a practical treatment algorithm for management of TS by addressing 5 questions: 1) Is withdrawal of dopamine receptor blocking agents (DRBAs) an effective TS treatment? 2) Does switching from typical to atypical DRBAs reduce TS symptoms? 3) What is the efficacy of pharmacologic agents in treating TS? 4) Do patients with TS benefit from chemodenervation with botulinum toxin? 5) Do patients with TS benefit from surgical therapy? Systematic reviews were conducted by searching PsycINFO, Ovid MEDLINE, PubMed, EMBASE, Web of Science and Cochrane for articles published between 2012 and 2017 to identify new evidence published after the 2013 AAN guidelines. Articles were classified according to an AAN 4-tiered evidence-rating scheme. To the extent possible, for each study we attempted to categorize results based on the description of the population enrolled (tardive dyskinesia [TD], tardive dystonia, tardive tremor, etc.). Recommendations were based on the evidence. New evidence was combined with the existing guideline evidence to inform our recommendations. Deutetrabenazine and valbenazine are established as effective treatments of TD (Level A) and must be recommended as treatment. Clonazepam and Ginkgo biloba probably improve TD (Level B) and should be considered as treatment. Amantadine and tetrabenazine might be considered as TD treatment (Level C). Pallidal deep brain stimulation possibly improves TD and might be considered as a treatment for intractable TD (Level C). There is insufficient evidence to support or refute TS treatment by withdrawing causative agents or switching from typical to atypical DRBA (Level U). Copyright © 2018 Elsevier B.V. All rights reserved.
Development of antibiotic regimens using graph based evolutionary algorithms.
Corns, Steven M; Ashlock, Daniel A; Bryden, Kenneth M
2013-12-01
This paper examines the use of evolutionary algorithms in the development of antibiotic regimens given to production animals. A model is constructed that combines the lifespan of the animal and the bacteria living in the animal's gastro-intestinal tract from the early finishing stage until the animal reaches market weight. This model is used as the fitness evaluation for a set of graph based evolutionary algorithms to assess the impact of diversity control on the evolving antibiotic regimens. The graph based evolutionary algorithms have two objectives: to find an antibiotic treatment regimen that maintains the weight gain and health benefits of antibiotic use and to reduce the risk of spreading antibiotic resistant bacteria. This study examines different regimens of tylosin phosphate use on bacteria populations divided into Gram positive and Gram negative types, with a focus on Campylobacter spp. Treatment regimens were found that provided decreased antibiotic resistance relative to conventional methods while providing nearly the same benefits as conventional antibiotic regimes. By using a graph to control the information flow in the evolutionary algorithm, a variety of solutions along the Pareto front can be found automatically for this and other multi-objective problems. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
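A minimal sketch of the graph-constrained mating idea described above, in Python. The ring-neighbour topology, the toy fitness function, and all dosing parameters are illustrative assumptions; the paper's animal/bacteria lifespan model and its Pareto analysis are not reproduced.

```python
# Graph-based evolutionary algorithm sketch: mating is restricted to ring
# neighbours, which throttles the flow of genetic information and preserves
# diversity. Fitness is a toy stand-in, not the paper's simulation model.
import random

N_DAYS = 120          # hypothetical length of the finishing period
POP = 64              # individuals placed on a ring graph

def fitness(regimen):
    # Toy model: reward antibiotic benefit, penalize prolonged use that
    # would select for resistant bacteria.
    use_fraction = sum(regimen) / N_DAYS
    resistance_risk = max(0.0, use_fraction - 0.3)
    return use_fraction - 2.0 * resistance_risk

def crossover(a, b):
    cut = random.randrange(1, N_DAYS)
    return a[:cut] + b[cut:]

def mutate(r, rate=0.01):
    return [g ^ 1 if random.random() < rate else g for g in r]

population = [[random.randint(0, 1) for _ in range(N_DAYS)] for _ in range(POP)]
for generation in range(200):
    i = random.randrange(POP)
    j = (i + random.choice([-1, 1])) % POP   # mate only with a ring neighbour
    child = mutate(crossover(population[i], population[j]))
    # local elitist replacement: the weaker parent is replaced if beaten
    weaker = i if fitness(population[i]) < fitness(population[j]) else j
    if fitness(child) > fitness(population[weaker]):
        population[weaker] = child

best = max(population, key=fitness)
print("best fitness:", round(fitness(best), 3))
```

Restricting recombination to graph neighbours is what lets such algorithms hold several distinct trade-off solutions in one population.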
Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chiou, Jin-Chern
1990-01-01
Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. To minimize constraint violations during the time integration process, penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference scheme to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained with an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm were efficiently implemented in parallel. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
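A minimal sketch of solving a Schur-complement system with a preconditioned conjugate gradient, assuming a small dense arrowhead system for illustration; the thesis's DAE assembly and parallel decomposition are not reproduced.

```python
# Solve S x = rhs where S = C - B^T A^{-1} B is the Schur complement of an
# arrowhead system [[A, B], [B^T, C]]. S is applied matrix-free; a Jacobi
# preconditioner stands in for the thesis's parallel preconditioning.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, m = 50, 10
A = np.diag(rng.uniform(1.0, 2.0, n))      # cheap-to-invert diagonal block
B = rng.standard_normal((n, m)) * 0.1
C = np.eye(m) * 3.0                         # constraint block

Ainv_diag = 1.0 / np.diag(A)
def apply_S(x):
    return C @ x - B.T @ (Ainv_diag * (B @ x))

S_op = LinearOperator((m, m), matvec=apply_S)
M = LinearOperator((m, m), matvec=lambda x: x / np.diag(C))  # Jacobi

rhs = rng.standard_normal(m)
x, info = cg(S_op, rhs, M=M, atol=1e-10)
print("converged:", info == 0, "residual:", np.linalg.norm(apply_S(x) - rhs))
```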
Clinical algorithms to aid osteoarthritis guideline dissemination.
Meneses, S R F; Goode, A P; Nelson, A E; Lin, J; Jordan, J M; Allen, K D; Bennell, K L; Lohmander, L S; Fernandes, L; Hochberg, M C; Underwood, M; Conaghan, P G; Liu, S; McAlindon, T E; Golightly, Y M; Hunter, D J
2016-09-01
Numerous scientific organisations have developed evidence-based recommendations aiming to optimise the management of osteoarthritis (OA). Uptake, however, has been suboptimal. The purpose of this exercise was to harmonize the recent recommendations and develop a user-friendly treatment algorithm to facilitate translation of evidence into practice. We updated a previous systematic review on clinical practice guidelines (CPGs) for OA management. The guidelines were assessed using the Appraisal of Guidelines for Research and Evaluation for quality and the standards for developing trustworthy CPGs as established by the National Academy of Medicine (NAM). Four case scenarios and algorithms were developed by consensus of a multidisciplinary panel. Sixteen guidelines were included in the systematic review. Most recommendations were directed toward physicians and allied health professionals, and most had multi-disciplinary input. Analysis for trustworthiness suggests that many guidelines still present a lack of transparency. A treatment algorithm was developed for each case scenario advised by recommendations from guidelines and based on panel consensus. Strategies to facilitate the implementation of guidelines in clinical practice are necessary. The algorithms proposed are examples of how to apply recommendations in the clinical context, helping the clinician to visualise the patient flow and timing of different treatment modalities. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Yorio, Jeff; Viswanathan, Sundeep; See, Raphael; Uchal, Linda; McWhorter, Jo Ann; Spencer, Nali; Murphy, Sabina; Khera, Amit; de Lemos, James A; McGuire, Darren K
2008-01-01
The application of disease management algorithms by physician extenders has been shown to improve therapeutic adherence in selected populations. It is unknown whether this strategy would improve adherence to secondary prevention goals after acute coronary syndromes (ACSs) in a largely indigent county hospital setting. Patients admitted for ACS were randomized at the time of discharge to usual follow-up care versus the same care with the addition of a physician extender visit. Physician extender visits were conducted according to a treatment algorithm based on contemporary practice guidelines. Groups were compared using the primary end point of achievement of low-density lipoprotein treatment goals at 3 months after discharge and achievement of additional evidence-based practice goals. One hundred forty consecutive patients were randomized. A similar proportion of patients returned for study follow-up in both groups at 3 months (54 [79%]/68 in the usual care group vs 57 [79%]/72 in the intervention group; P = 0.97). Among those completing the 3-month visit, a low-density lipoprotein cholesterol level less than 100 mg/dL was achieved in 37 (69%) of the usual care patients compared with 35 (57%) of those in the intervention group (P = 0.43). There was no statistical difference in implementation of therapeutic lifestyle changes (smoking cessation, cardiac rehabilitation, or exercise) between groups. Prescription rates of evidence-based therapeutics at 3 months were similar in both groups. The implementation of a post-ACS clinic run by a physician extender applying a disease management algorithm did not measurably improve adherence to evidence-based secondary prevention treatment goals. Despite initially high rates of evidence-based treatment at discharge, adherence with follow-up appointments and sustained implementation of evidence-based therapies remains a significant challenge in this high-risk cohort.
Franke, Konstantin H; Krumkamp, Ralf; Mohammed, Aliyu; Sarpong, Nimako; Owusu-Dabo, Ellis; Brinkel, Johanna; Fobil, Julius N; Marinovic, Axel Bonacic; Asihene, Philip; Boots, Mark; May, Jürgen; Kreuels, Benno
2018-03-27
The aim of this study was the development and evaluation of an algorithm-based diagnosis tool, applicable on mobile phones, to support guardians in providing appropriate care to sick children. The algorithm was developed on the basis of the Integrated Management of Childhood Illness (IMCI) guidelines and evaluated at a hospital in Ghana. Two hundred and thirty-seven guardians applied the tool to assess their child's symptoms. Data recorded by the tool and health records completed by a physician were compared in terms of symptom detection, disease assessment and treatment recommendation. To compare both assessments, kappa statistics and predictive values were calculated. The tool detected the symptoms of cough, fever, diarrhoea and vomiting with good agreement with the physicians' findings (kappa = 0.64, 0.59, 0.57 and 0.42, respectively). The disease assessment barely coincided with the physicians' findings. The tool's treatment recommendation matched the physicians' assessments in 93 out of 237 cases (39.2% agreement, kappa = 0.11), but underestimated a child's condition in only seven cases (3.0%). The algorithm-based tool achieved reliable symptom detection, and its treatment recommendations largely conformed to the physicians' assessments. Testing in a domestic environment is envisaged.
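For readers unfamiliar with the agreement statistic used above, the sketch below computes Cohen's kappa and raw agreement for a binary symptom coding; the vectors are hypothetical, not the study's data.

```python
# Cohen's kappa corrects raw agreement for the agreement expected by chance,
# which is why 39.2% raw agreement can correspond to a kappa of only 0.11.
from sklearn.metrics import cohen_kappa_score

# 1 = symptom detected, 0 = not detected, one entry per child (hypothetical)
tool_says      = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
physician_says = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

kappa = cohen_kappa_score(tool_says, physician_says)
agreement = sum(t == p for t, p in zip(tool_says, physician_says)) / len(tool_says)
print(f"raw agreement = {agreement:.2f}, kappa = {kappa:.2f}")
```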
Friedli, Natalie; Stanga, Zeno; Culkin, Alison; Crook, Martin; Laviano, Alessandro; Sobotka, Lubos; Kressig, Reto W; Kondrup, Jens; Mueller, Beat; Schuetz, Philipp
2018-03-01
Refeeding syndrome (RFS) can be a life-threatening metabolic condition after nutritional replenishment if not recognized early and treated adequately. There is a lack of evidence-based treatment and monitoring algorithms for daily clinical practice. The aim of the study was to propose an expert consensus guideline for RFS for the medical inpatient (not including anorexic patients) regarding risk factors, diagnostic criteria, and preventive and therapeutic measures based on a previous systematic literature search. Based on a recent qualitative systematic review on the topic, we developed clinically relevant recommendations as well as a treatment and monitoring algorithm for the clinical management of inpatients regarding RFS. These recommendations were discussed with international experts, and agreement with each recommendation was rated. Upon hospital admission, we recommend the use of specific screening criteria (i.e., low body mass index, large unintentional weight loss, little or no nutritional intake, history of alcohol or drug abuse) for risk assessment regarding the occurrence of RFS. According to the patient's individual risk for RFS, a careful start of nutritional therapy with a stepwise increase in energy and fluid goals and supplementation of electrolytes and vitamins, as well as close clinical monitoring, is recommended. We also propose criteria for the diagnosis of imminent and manifest RFS, with practical treatment recommendations including adaptation of the nutritional therapy. Based on the available evidence, we developed a practical algorithm for risk assessment, treatment, and monitoring of RFS in medical inpatients. In daily routine clinical care, this may help to optimize and standardize the management of this vulnerable patient population. We encourage future quality studies to further refine these recommendations. Copyright © 2017 Elsevier Inc. All rights reserved.
A multicore based parallel image registration method.
Yang, Lin; Gong, Leiguang; Zhang, Hong; Nosher, John L.; Foran, David J.
2012-01-01
Image registration is a crucial step for many image-assisted clinical applications such as surgery planning and treatment evaluation. In this paper we proposed a landmark-based nonlinear image registration algorithm for matching 2D image pairs. The algorithm was shown to be effective and robust under conditions of large deformations. In landmark-based registration, the most important step is establishing the correspondence among the selected landmark points. This usually requires an extensive search, which is often computationally expensive. We introduced a nonregular data partition algorithm that uses K-means clustering to group the landmarks based on the number of available processing cores. This step optimizes memory usage and data transfer. We have tested our method on the IBM Cell Broadband Engine (Cell/B.E.) platform. PMID:19964921
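A minimal sketch of the landmark-partitioning step, assuming 2D landmark coordinates and a hypothetical core count; the Cell/B.E. registration pipeline itself is not reproduced.

```python
# One K-means cluster per processing core: spatially coherent landmark groups
# keep the correspondence search local and cut data transfer between cores.
import numpy as np
from sklearn.cluster import KMeans

n_cores = 4
landmarks = np.random.default_rng(1).uniform(0, 512, size=(200, 2))  # (x, y) px

labels = KMeans(n_clusters=n_cores, n_init=10, random_state=0).fit_predict(landmarks)
per_core = [landmarks[labels == c] for c in range(n_cores)]
for c, pts in enumerate(per_core):
    print(f"core {c}: {len(pts)} landmarks")
```

Note the partition is nonregular by design: cluster sizes follow the landmark density rather than a fixed grid.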
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, S; Zhang, H; Zhang, B
2015-06-15
Purpose: To clinically evaluate the differences in volumetric modulated arc therapy (VMAT) treatment plan and delivery between two commercial treatment planning systems. Methods: Two commercial VMAT treatment planning systems with different VMAT optimization algorithms and delivery approaches were evaluated. This study included 16 clinical VMAT plans performed with the first system: 2 spine, 4 head and neck (HN), 2 brain, 4 pancreas, and 4 pelvis plans. These 16 plans were then re-optimized with the same number of arcs using the second treatment planning system. Planning goals were invariant between the two systems. Gantry speed, dose rate modulation, MLC modulation, plan quality, number of monitor units (MUs), VMAT quality assurance (QA) results, and treatment delivery time were compared between the 2 systems. VMAT QA results were performed using Mapcheck2 and analyzed with gamma analysis (3mm/3% and 2mm/2%). Results: Similar plan quality was achieved with each VMAT optimization algorithm, and the difference in delivery time was minimal. Algorithm 1 achieved planning goals by highly modulating the MLC (total distance traveled by leaves (TL) = 193 cm average over control points per plan), while maintaining a relatively constant dose rate (dose-rate change <100 MU/min). Algorithm 2 involved less MLC modulation (TL = 143 cm per plan), but greater dose-rate modulation (range = 0-600 MU/min). The average number of MUs was 20% less for algorithm 2 (ratio of MUs for algorithms 2 and 1 ranged from 0.5-1). VMAT QA results were similar for all disease sites except HN plans. For HN plans, the average gamma passing rates were 88.5% (2mm/2%) and 96.9% (3mm/3%) for algorithm 1 and 97.9% (2mm/2%) and 99.6% (3mm/3%) for algorithm 2. Conclusion: Both VMAT optimization algorithms achieved comparable plan quality; however, fewer MUs were needed and QA results were more robust for Algorithm 2, which more highly modulated dose rate.
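The gamma criteria cited above combine a dose-difference and a distance-to-agreement test. Below is a minimal brute-force sketch of a global 2D gamma pass-rate calculation under a 3%/3 mm criterion, assuming uniform 1 mm pixel spacing and toy dose planes; clinical QA software uses interpolation and optimized search.

```python
# For each evaluated point, gamma^2 = (dose diff / DD)^2 + (distance / DTA)^2,
# minimized over nearby reference points; the point passes if gamma <= 1.
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm=1.0, dd=0.03, dta_mm=3.0, cutoff=0.1):
    dmax = ref.max()
    search = int(np.ceil(dta_mm / spacing_mm))
    ny, nx = ref.shape
    passed, total = 0, 0
    for y in range(ny):
        for x in range(nx):
            if eval_[y, x] < cutoff * dmax:      # skip the low-dose region
                continue
            best = np.inf
            for j in range(max(0, y - search), min(ny, y + search + 1)):
                for i in range(max(0, x - search), min(nx, x + search + 1)):
                    dist2 = ((j - y) ** 2 + (i - x) ** 2) * spacing_mm ** 2
                    dose2 = ((eval_[y, x] - ref[j, i]) / (dd * dmax)) ** 2
                    best = min(best, dose2 + dist2 / dta_mm ** 2)
            total += 1
            passed += best <= 1.0
    return 100.0 * passed / total

ref = np.fromfunction(lambda y, x: np.exp(-((x - 32)**2 + (y - 32)**2) / 400),
                      (64, 64))
meas = ref * 1.02                                # 2% global scaling error
print(f"gamma pass rate: {gamma_pass_rate(ref, meas):.1f}%")
```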
Terris-Prestholt, Fern; Vickerman, Peter; Torres-Rueda, Sergio; Santesso, Nancy; Sweeney, Sedona; Mallma, Patricia; Shelley, Katharine D; Garcia, Patricia J; Bronzan, Rachel; Gill, Michelle M; Broutet, Nathalie; Wi, Teodora; Watts, Charlotte; Mabey, David; Peeling, Rosanna W; Newman, Lori
2015-06-01
Rapid plasma reagin (RPR) is frequently used to test women for maternal syphilis. Rapid syphilis immunochromatographic strip tests detecting only Treponema pallidum antibodies (single RSTs) or both treponemal and non-treponemal antibodies (dual RSTs) are now available. This study assessed the cost-effectiveness of algorithms using these tests to screen pregnant women. Observed costs of maternal syphilis screening and treatment using clinic-based RPR and single RSTs in 20 clinics across Peru, Tanzania, and Zambia were used to model the cost-effectiveness of algorithms using combinations of RPR, single, and dual RSTs, and no and mass treatment. Sensitivity analyses determined drivers of key results. Although this analysis found screening using RPR to be relatively cheap, most (>70%) true cases went untreated. Algorithms using single RSTs were the most cost-effective in all observed settings, followed by dual RSTs, which became the most cost-effective if dual RST costs were halved. Single test algorithms dominated most sequential testing algorithms, although sequential algorithms reduced overtreatment. Mass treatment was relatively cheap and effective in the absence of screening supplies, though treated many uninfected women. This analysis highlights the advantages of introducing RSTs in three diverse settings. The results should be applicable to other similar settings. Copyright © 2015 International Federation of Gynecology and Obstetrics. All rights reserved.
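A minimal sketch of the kind of decision-model arithmetic behind such comparisons; every number below (prevalence, sensitivities, specificities, unit costs) is an illustrative assumption, not a value from the study.

```python
# Compare screening algorithms by cost per true case treated and by the share
# of true cases reached; overtreatment enters through imperfect specificity.
def cost_per_true_case_treated(n, prev, sens, spec, c_test, c_treat):
    infected = n * prev
    treated_true = infected * sens                  # cases detected and treated
    treated_false = (n - infected) * (1.0 - spec)   # overtreatment
    total = n * c_test + (treated_true + treated_false) * c_treat
    return total / treated_true, treated_true / infected

for name, sens, spec, c_test in [("RPR", 0.60, 0.95, 0.50),
                                 ("single RST", 0.85, 0.95, 1.00)]:
    cost, coverage = cost_per_true_case_treated(
        n=10000, prev=0.03, sens=sens, spec=spec, c_test=c_test, c_treat=2.0)
    print(f"{name}: ${cost:.2f} per true case treated, "
          f"{coverage:.0%} of true cases treated")
```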
Algorithms for optimizing the treatment of depression: making the right decision at the right time.
Adli, M; Rush, A J; Möller, H-J; Bauer, M
2003-11-01
Medication algorithms for the treatment of depression are designed to optimize both treatment implementation and the appropriateness of treatment strategies. Thus, they are essential tools for treating and avoiding refractory depression. Treatment algorithms are explicit treatment protocols that provide specific therapeutic pathways and decision-making tools at critical decision points throughout the treatment process. The present article provides an overview of major projects of algorithm research in the field of antidepressant therapy. The Berlin Algorithm Project and the Texas Medication Algorithm Project (TMAP) compare algorithm-guided treatments with treatment as usual. The Sequenced Treatment Alternatives to Relieve Depression Project (STAR*D) compares different treatment strategies in treatment-resistant patients.
Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson
2008-01-01
We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...
Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei
2015-04-11
To use a graphic processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to those of Monte Carlo simulations. The planned and EPID-based reconstructed dose distributions in overridden-to-water phantoms and the original patients were compared for 6 MV and 10 MV photon beams in intensity-modulated radiation therapy (IMRT) treatment plans based on dose differences and gamma analysis. The total single-field dose computation time was less than 8 s, and the gamma evaluation for a 0.1-cm grid resolution was completed in approximately 1 s. The results of the GPU-based CCCS algorithm exhibited good agreement with those of the Monte Carlo simulations. The gamma analysis indicated good agreement between the planned and reconstructed dose distributions for the treatment plans. For the target volume, the differences in the mean dose were less than 1.8%, and the differences in the maximum dose were less than 2.5%. For the critical organs, minor differences were observed between the reconstructed and planned doses. The GPU calculation engine was used to boost the speed of 3D dose and gamma evaluation calculations, thus offering the possibility of true real-time 3D dosimetric verification.
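The fluence-to-dose step above rests on convolution/superposition. The sketch below shows the simpler spatially invariant case, convolving a fluence map with a Gaussian scatter kernel via FFT; the paper's GPU CCCS algorithm additionally handles tissue heterogeneity with collapsed cones and is far more involved.

```python
# FFT-based convolution of a fluence map with a centred, normalised kernel;
# the Gaussian kernel is a stand-in for a measured dose-deposition kernel.
import numpy as np

def convolve_fft(fluence, kernel):
    # ifftshift moves the kernel centre to the origin for circular convolution
    F = np.fft.rfft2(fluence)
    K = np.fft.rfft2(np.fft.ifftshift(kernel))
    return np.fft.irfft2(F * K, s=fluence.shape)

n = 128
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
kernel = np.exp(-(x**2 + y**2) / (2 * 4.0**2))
kernel /= kernel.sum()

fluence = np.zeros((n, n))
fluence[44:84, 44:84] = 1.0                  # open 40 x 40 pixel field

dose = convolve_fft(fluence, kernel)
print("max dose:", round(dose.max(), 3),
      "penumbra sample:", round(dose[64, 42], 3))
```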
A Computerized Decision Support System for Depression in Primary Care
Kurian, Benji T.; Trivedi, Madhukar H.; Grannemann, Bruce D.; Claassen, Cynthia A.; Daly, Ella J.; Sunderajan, Prabha
2009-01-01
Objective: In 2004, results from The Texas Medication Algorithm Project (TMAP) showed better clinical outcomes for patients whose physicians adhered to a paper-and-pencil algorithm compared to patients who received standard clinical treatment for major depressive disorder (MDD). However, implementation of and fidelity to the treatment algorithm among various providers was observed to be inadequate. A computerized decision support system (CDSS) for the implementation of the TMAP algorithm for depression has since been developed to improve fidelity and adherence to the algorithm. Method: This was a 2-group, parallel design, clinical trial (one patient group receiving MDD treatment from physicians using the CDSS and the other patient group receiving usual care) conducted at 2 separate primary care clinics in Texas from March 2005 through June 2006. Fifty-five patients with MDD (DSM-IV criteria) with no significant difference in disease characteristics were enrolled, 32 of whom were treated by physicians using CDSS and 23 were treated by physicians using usual care. The study's objective was to evaluate the feasibility and efficacy of implementing a CDSS to assist physicians acutely treating patients with MDD compared to usual care in primary care. Primary efficacy outcomes for depression symptom severity were based on the 17-item Hamilton Depression Rating Scale (HDRS17) evaluated by an independent rater. Results: Patients treated by physicians employing CDSS had significantly greater symptom reduction, based on the HDRS17, than patients treated with usual care (P < .001). Conclusions: The CDSS algorithm, utilizing measurement-based care, was superior to usual care for patients with MDD in primary care settings. Larger randomized controlled trials are needed to confirm these findings. Trial Registration: clinicaltrials.gov Identifier: NCT00551083 PMID:19750065
Adli, Mazda; Wiethoff, Katja; Baghai, Thomas C; Fisher, Robert; Seemüller, Florian; Laakmann, Gregor; Brieger, Peter; Cordes, Joachim; Malevani, Jaroslav; Laux, Gerd; Hauth, Iris; Möller, Hans-Jürgen; Kronmüller, Klaus-Thomas; Smolka, Michael N; Schlattmann, Peter; Berger, Maximilian; Ricken, Roland; Stamm, Thomas J; Heinz, Andreas; Bauer, Michael
2017-09-01
Treatment algorithms are considered as key to improve outcomes by enhancing the quality of care. This is the first randomized controlled study to evaluate the clinical effect of algorithm-guided treatment in inpatients with major depressive disorder. Inpatients, aged 18 to 70 years with major depressive disorder from 10 German psychiatric departments were randomized to 5 different treatment arms (from 2000 to 2005), 3 of which were standardized stepwise drug treatment algorithms (ALGO). The fourth arm proposed medications and provided less specific recommendations based on a computerized documentation and expert system (CDES), the fifth arm received treatment as usual (TAU). ALGO included 3 different second-step strategies: lithium augmentation (ALGO LA), antidepressant dose-escalation (ALGO DE), and switch to a different antidepressant (ALGO SW). Time to remission (21-item Hamilton Depression Rating Scale ≤9) was the primary outcome. Time to remission was significantly shorter for ALGO DE (n=91) compared with both TAU (n=84) (HR=1.67; P=.014) and CDES (n=79) (HR=1.59; P=.031) and ALGO SW (n=89) compared with both TAU (HR=1.64; P=.018) and CDES (HR=1.56; P=.038). For both ALGO LA (n=86) and ALGO DE, fewer antidepressant medications were needed to achieve remission than for CDES or TAU (P<.001). Remission rates at discharge differed across groups; ALGO DE had the highest (89.2%) and TAU the lowest rates (66.2%). A highly structured algorithm-guided treatment is associated with shorter times and fewer medication changes to achieve remission with depressed inpatients than treatment as usual or computerized medication choice guidance. © The Author 2017. Published by Oxford University Press on behalf of CINP.
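The hazard ratios above come from time-to-event comparisons between arms. A minimal sketch with the lifelines package, using simulated remission times rather than the trial data:

```python
# Fit a Cox proportional-hazards model comparing an algorithm-guided arm to
# treatment as usual; exp(coef) is the hazard ratio for remission.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 90
df = pd.DataFrame({
    "algo_guided": np.repeat([1, 0], n),                 # ALGO vs TAU arm
    "weeks": np.concatenate([rng.exponential(6, n),      # faster remission
                             rng.exponential(10, n)]),
    "remitted": 1,                                       # no censoring here
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="remitted")
print(cph.summary[["exp(coef)", "p"]])                   # hazard ratio, p-value
```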
Naidoo, Pren; van Niekerk, Margaret; du Toit, Elizabeth; Beyers, Nulda; Leon, Natalie
2015-10-28
Although new molecular diagnostic tests such as GenoType MTBDRplus and Xpert® MTB/RIF have reduced multidrug-resistant tuberculosis (MDR-TB) treatment initiation times, patients' experiences of diagnosis and treatment initiation are not known. This study aimed to explore and compare MDR-TB patients' experiences of their diagnostic and treatment initiation pathway in GenoType MTBDRplus and Xpert® MTB/RIF-based diagnostic algorithms. The study was undertaken in Cape Town, South Africa where primary health-care services provided free TB diagnosis and treatment. A smear, culture and GenoType MTBDRplus diagnostic algorithm was used in 2010, with Xpert® MTB/RIF phased in from 2011-2013. Participants diagnosed in each algorithm at four facilities were purposively sampled, stratifying by age, gender and MDR-TB risk profiles. We conducted in-depth qualitative interviews using a semi-structured interview guide. Through constant comparative analysis we induced common and divergent themes related to symptom recognition, health-care access, testing for MDR-TB and treatment initiation within and between groups. Data were triangulated with clinical information and health visit data from a structured questionnaire. We identified both enablers and barriers to early MDR-TB diagnosis and treatment. Half the patients had previously been treated for TB; most recognised recurring symptoms and reported early health-seeking. Those who attributed symptoms to other causes delayed health-seeking. Perceptions of poor public sector services were prevalent and may have contributed both to deferred health-seeking and to patient's use of the private sector, contributing to delays. However, once on treatment, most patients expressed satisfaction with public sector care. Two patients in the Xpert® MTB/RIF-based algorithm exemplified its potential to reduce delays, commencing MDR-TB treatment within a week of their first health contact. However, most patients in both algorithms experienced substantial delays. Avoidable health system delays resulted from providers not testing for TB at initial health contact, non-adherence to testing algorithms, results not being available and failure to promptly recall patients with positive results. Whilst the introduction of rapid tests such as Xpert® MTB/RIF can expedite MDR-TB diagnosis and treatment initiation, the full benefits are unlikely to be realised without reducing delays in health-seeking and addressing the structural barriers present in the health-care system.
Algorithm for treatment of patients with mesial occlusion using proprietary orthodontic device.
Flis, P; Filonenko, V; Doroshenko, N
2017-10-01
Early elimination of orthodontic disorders of the dentoalveolar apparatus is the dominant concept in treatment technique. The aim was to present a treatment algorithm for sagittal anomalies, particularly Class III, in the transitional bite period using the proposed design of an individual orthodontic device with a movable ramp. The treatment algorithm consisted of several blocks: motivation, establishment of the etiological factors, creation of the plan and treatment tactics based on careful diagnosis, the stages of the active period of treatment, and patient management in the retention period. Anthropometric measurements of maxilla and mandible models were performed to determine the degree of dental arch development. The length of the dental arches was determined on the models by the Nance method in combination with the Huckaba method, and the sagittal dimensions by Mirgasizov's method. Lateral cephalogram analysis using the Sassouni Plus method played the leading role in patient examination. The proposed construction of the orthodontic appliance consists of a plastic base, a vestibular arc, retaining clasps, and a ramp connected to the base with two torsion springs. To demonstrate the effectiveness of the proposed construction, the treatment of patient Y., aged 6 years 9 months, is presented. After the treatment, positive morphological, functional, and aesthetic changes were established. The use of the proposed orthodontic appliance with a movable ramp allows orthodontic treatment to start at an early age, increases its effectiveness, and reduces the number of complications. The expediency of stage-by-stage treatment is supported by the positive results of this method. To achieve stable results, it is important to individualize the prognosis at the planning stage of orthodontic treatment.
Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.
Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin
2013-08-01
The old treatment planning systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been used as the main dose calculation formalism in TPSs. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on the old formalisms. The purpose of this study is therefore to design and establish a treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism and applying the effects of the applicator and dummy spacers. In this planning system, the dosimetry parameters of each pellet at different positions inside the applicators were obtained with the MCNP4C code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for specific combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalisms was investigated by comparing the results of the STPS and PLATO software with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially toward the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
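A minimal sketch of a TG-43 point-source dose-rate calculation of the kind such a TPS performs, with illustrative (not clinical) radial dose and anisotropy tables; a real system would use the full 2D line-source formalism and published Cs-137 consensus data.

```python
# TG-43 point-source form: D(r) = S_k * Lambda * (r0/r)^2 * g(r) * phi_an(r).
# All table values below are illustrative assumptions.
import numpy as np

S_k    = 30000.0   # air-kerma strength, U (hypothetical)
Lambda = 1.0       # dose-rate constant, cGy h^-1 U^-1 (hypothetical)
r0     = 1.0       # reference distance, cm

r_tab   = np.array([0.5, 1.0, 2.0, 3.0, 5.0])       # cm
g_tab   = np.array([1.00, 1.00, 0.98, 0.95, 0.88])  # radial dose function
phi_tab = np.array([0.97, 0.97, 0.96, 0.95, 0.94])  # 1D anisotropy factor

def dose_rate(r_cm):
    geometry = (r0 / r_cm) ** 2                 # point-source G(r)/G(r0)
    g   = np.interp(r_cm, r_tab, g_tab)
    phi = np.interp(r_cm, r_tab, phi_tab)
    return S_k * Lambda * geometry * g * phi    # cGy/h

for pellet_r in [1.0, 2.0, 3.0]:
    print(f"r = {pellet_r} cm -> {dose_rate(pellet_r):.0f} cGy/h")

# The dose from a pellet train is the sum of single-pellet contributions,
# matching the summation step described in the abstract above.
```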
Olive Crown Porosity Measurement Based on Radiation Transmittance: An Assessment of Pruning Effect.
Castillo-Ruiz, Francisco J; Castro-Garcia, Sergio; Blanco-Roldan, Gregorio L; Sola-Guirado, Rafael R; Gil-Ribes, Jesus A
2016-05-19
Crown porosity influences radiation interception, air movement through the fruit orchard, spray penetration, and harvesting operation in fruit crops. The aim of the present study was to develop an accurate and reliable methodology based on transmitted radiation measurements to assess the porosity of traditional olive trees under different pruning treatments. Transmitted radiation was employed as an indirect method to measure crown porosity in two olive orchards of the Picual and Hojiblanca cultivars. Additionally, three different pruning treatments were considered to determine if the pruning system influences crown porosity. This study evaluated the accuracy and repeatability of four algorithms in measuring crown porosity under different solar zenith angles. From a 14° to 30° solar zenith angle, the selected algorithm produced an absolute error of less than 5% and a repeatability higher than 0.9. The described method and selected algorithm proved satisfactory in field results, making it possible to measure crown porosity at different solar zenith angles. However, pruning fresh weight did not show any relationship with crown porosity due to the great differences between removed branches. A robust and accurate algorithm was selected for crown porosity measurements in traditional olive trees, making it possible to discern between different pruning treatments.
GTV-based prescription in SBRT for lung lesions using advanced dose calculation algorithms.
Lacornerie, Thomas; Lisbona, Albert; Mirabel, Xavier; Lartigau, Eric; Reynaert, Nick
2014-10-16
The aim of the current study was to investigate the way dose is prescribed to lung lesions during SBRT when advanced dose calculation algorithms that take into account electron transport (type B algorithms) are used. As type A algorithms do not take into account secondary electron transport, they overestimate the dose to lung lesions. Type B algorithms are more accurate, but no consensus has yet been reached regarding dose prescription. The positive clinical results obtained using type A algorithms should be used as a starting point. In the current work a dose-calculation experiment is performed, comparing different prescription methods. Three cases with three different sizes of peripheral lung lesions were planned using three different treatment platforms. For each individual case 60 Gy to the PTV was prescribed using a type A algorithm, and the dose distribution was recalculated using a type B algorithm in order to evaluate the impact of the secondary electron transport. Secondly, for each case a type B algorithm was used to prescribe 48 Gy to the PTV, and the resulting doses to the GTV were analyzed. Finally, prescriptions based on specific GTV dose volumes were evaluated. When using a type A algorithm to prescribe the same dose to the PTV, the differences regarding median GTV doses among platforms and cases were always less than 10% of the prescription dose. Prescription to the PTV based on type B algorithms leads to greater variability of the median GTV dose among cases and among platforms (24% and 28%, respectively). However, when 54 Gy was prescribed as the median GTV dose using a type B algorithm, the variability observed was minimal. Normalizing the prescription dose to the median GTV dose for lung lesions avoids variability among different cases and treatment platforms of SBRT when type B algorithms are used to calculate the dose. The combination of using a type A algorithm to optimize a homogeneous dose in the PTV and using a type B algorithm to prescribe the median GTV dose provides a very robust method for treating lung lesions.
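A minimal sketch of the recommended normalization: rescale a type B dose distribution so the median GTV dose equals 54 Gy. The dose grid and GTV mask are toy data.

```python
# Scaling the whole dose grid is equivalent to scaling the monitor units, so
# the plan's relative dose distribution is preserved.
import numpy as np

rng = np.random.default_rng(3)
dose = rng.normal(50.0, 3.0, size=(40, 40, 40))   # Gy, hypothetical type B dose
gtv_mask = np.zeros_like(dose, dtype=bool)
gtv_mask[15:25, 15:25, 15:25] = True

prescription = 54.0                                # Gy to the median GTV dose
scale = prescription / np.median(dose[gtv_mask])
dose *= scale

print("median GTV dose:", round(float(np.median(dose[gtv_mask])), 2), "Gy")
```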
TOMS UV Algorithm: Problems and Enhancements. 2
NASA Technical Reports Server (NTRS)
Krotkov, Nickolay; Herman, Jay; Bhartia, P. K.; Seftor, Colin; Arola, Antti; Kaurola, Jussi; Kroskinen, Lasse; Kalliskota, S.; Taalas, Petteri; Geogdzhaev, I.
2002-01-01
Satellite instruments provide global maps of surface ultraviolet (UV) irradiance by combining backscattered radiance measurements with radiative transfer models. The models are limited by uncertainties in input parameters of the atmosphere and the surface. We evaluate the effects of possible enhancements of the current Total Ozone Mapping Spectrometer (TOMS) surface UV irradiance algorithm focusing on effects of diurnal variation of cloudiness and improved treatment of snow/ice. The emphasis is on comparison between the results of the current (version 1) TOMS UV algorithm and each of the changes proposed. We evaluate different approaches for improved treatment of pixel average cloud attenuation, with and without snow/ice on the ground. In addition to treating clouds based only on the measurements at the local time of the TOMS observations, the results from other satellites and weather assimilation models can be used to estimate attenuation of the incident UV irradiance throughout the day. A new method is proposed to obtain a more realistic treatment of snow covered terrain. The method is based on a statistical relation between UV reflectivity and snow depth. The new method reduced the bias between the TOMS UV estimations and ground-based UV measurements for snow periods. The improved (version 2) algorithm will be applied to re-process the existing TOMS UV data record (since 1978) and to the future satellite sensors (e.g., Quik/TOMS, GOME, OMI on EOS/Aura and Triana/EPIC).
[Preclinical treatment of multiple trauma: what is important?].
Schweigkofler, U; Hoffmann, R
2013-09-01
Multiple trauma is still the most common cause of death in the age group below 40 years but rarely occurs in prehospital emergencies in Germany. Therefore, personal experience of emergency physicians in prehospital treatment of multiple trauma is often limited. Priority-based therapy according to standardized algorithms and advances in clinical and intensive care have reduced hospital mortality down to 13 %. Time factors, treatment and transport by Helicopter Emergency Medical Services seem to have had a significant impact on the outcome. The current German multiple trauma S3 guidelines provide algorithms for preclinical treatment. The underlying scientific evidence in this respect is, however, low.
A Toolbox to Improve Algorithms for Insulin-Dosing Decision Support
Donsa, K.; Plank, J.; Schaupp, L.; Mader, J. K.; Truskaller, T.; Tschapeller, B.; Höll, B.; Spat, S.; Pieber, T. R.
2014-01-01
Background: Standardized insulin order sets for subcutaneous basal-bolus insulin therapy are recommended by clinical guidelines for the inpatient management of diabetes. The algorithm-based GlucoTab system electronically assists health care personnel by supporting clinical workflow and providing insulin-dose suggestions. Objective: To develop a toolbox for improving clinical decision-support algorithms. Methods: The toolbox has three main components. 1) Data preparation: data from several heterogeneous sources are extracted, cleaned and stored in a uniform data format. 2) Simulation: the effects of algorithm modifications are estimated by simulating treatment workflows based on real data from clinical trials. 3) Analysis: algorithm performance is measured, analyzed and simulated using data from three clinical trials with a total of 166 patients. Results: Use of the toolbox led to algorithm improvements as well as the detection of potential individualized subgroup-specific algorithms. Conclusion: These results are a first step towards individualized algorithm modifications for specific patient subgroups. PMID:25024768
Alagar, Ananda Giri Babu; Mani, Ganesh Kadirampatti; Karunakaran, Kaviarasu
2016-01-08
Small fields smaller than 4 × 4 cm2 are used in stereotactic and conformal treatments, where heterogeneity is normally present. Since dose calculation accuracy often suffers both in small fields and in the presence of heterogeneity, algorithms used by treatment planning systems (TPS) should be evaluated to achieve better treatment results. This report aims at evaluating the accuracy of four model-based algorithms against measurement: X-ray Voxel Monte Carlo (XVMC) from Monaco, Superposition (SP) from CMS-XiO, and AcurosXB (AXB) and the analytical anisotropic algorithm (AAA) from Eclipse. Measurements are done using an Exradin W1 plastic scintillator in a Solid Water phantom with heterogeneities such as air, lung, bone, and aluminum, irradiated with 6 and 15 MV photons of square field sizes ranging from 1 to 4 cm2. Each heterogeneity is introduced individually at two different depths from the depth of dose maximum (Dmax), one setup nearer to and another farther from Dmax. The central axis percentage depth-dose (CADD) curve for each setup is measured separately and compared with the curve calculated by each TPS algorithm for the same setup. The percentage normalized root mean squared deviation (%NRMSD), which represents the deviation of the whole CADD curve from the measurement, is calculated. It is found that for air and lung heterogeneities, for both 6 and 15 MV, all algorithms show maximum deviation for the 1 × 1 cm2 field size, and the deviation gradually reduces as field size increases, except for AAA. For aluminum and bone, all algorithms' deviations are smaller for 15 MV irrespective of setup. In all heterogeneity setups the 1 × 1 cm2 field showed maximum deviation, except in the 6 MV bone setup. For all algorithms in the study, irrespective of energy and field size, the dose deviation is higher when a heterogeneity lies nearer to Dmax than when the same heterogeneity lies farther from it. All algorithms also show maximum deviation in lower-density materials compared with high-density materials.
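A minimal sketch of the %NRMSD figure of merit, assuming normalization to the measured curve maximum; the authors' exact definition may differ.

```python
# Root-mean-square deviation between calculated and measured depth-dose
# curves, expressed as a percentage of the measured maximum.
import numpy as np

def percent_nrmsd(measured, calculated):
    measured, calculated = np.asarray(measured), np.asarray(calculated)
    rmsd = np.sqrt(np.mean((calculated - measured) ** 2))
    return 100.0 * rmsd / measured.max()

depth_dose_meas = [100.0, 92.0, 80.0, 65.0, 50.0]   # % depth dose, illustrative
depth_dose_calc = [100.0, 90.5, 78.0, 66.0, 51.0]
print(f"%NRMSD = {percent_nrmsd(depth_dose_meas, depth_dose_calc):.2f}")
```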
Stoykov, Nikolay S; Kuiken, Todd A; Lowery, Madeleine M; Taflove, Allen
2003-09-01
We present what we believe to be the first algorithms that use a simple scalar-potential formulation to model linear Debye and Lorentz dielectric dispersions at low frequencies in the context of finite-element time-domain (FETD) numerical solutions of electric potential. The new algorithms, which permit treatment of multiple-pole dielectric relaxations, are based on the auxiliary differential equation method and are unconditionally stable. We validate the algorithms by comparison with the results of a previously reported method based on the Fourier transform. The new algorithms should be useful in calculating the transient response of biological materials subject to impulsive excitation. Potential applications include FETD modeling of electromyography, functional electrical stimulation, defibrillation, and effects of lightning and impulsive electric shock.
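A minimal sketch of an auxiliary-differential-equation update for a single-pole Debye medium, assuming a prescribed E(t) waveform instead of a full FETD potential solve; each additional pole would carry its own such equation, and all material constants are illustrative.

```python
# ADE for a Debye pole: tau * dP/dt + P = eps0 * deps * E, discretized with a
# Crank-Nicolson step, which is unconditionally stable as the paper notes.
import numpy as np

eps0, deps, tau = 8.854e-12, 50.0, 10e-6   # illustrative Debye parameters
dt = 1e-7
t = np.arange(0, 200e-6, dt)
E = np.where(t < 50e-6, 1.0, 0.0)          # step-like excitation, V/m

P = np.zeros_like(t)                       # auxiliary polarization variable
a = tau / dt
for n in range(len(t) - 1):
    P[n + 1] = ((a - 0.5) * P[n]
                + eps0 * deps * 0.5 * (E[n + 1] + E[n])) / (a + 0.5)

# After several relaxation times, P approaches the static value eps0*deps*E
print("P at 4*tau:", P[400], "vs static limit:", eps0 * deps)
```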
Development of a simple algorithm to guide the effective management of traumatic cardiac arrest.
Lockey, David J; Lyon, Richard M; Davies, Gareth E
2013-06-01
Major trauma is the leading worldwide cause of death in young adults. The mortality from traumatic cardiac arrest remains high, but survival with good neurological outcome from cardiopulmonary arrest following major trauma has been regularly reported. Rapid, effective intervention is required to address potentially reversible causes of traumatic cardiac arrest if the victim is to survive. Current ILCOR guidelines do not contain a standard algorithm for management of traumatic cardiac arrest. We present a simple algorithm to manage the major trauma patient in actual or imminent cardiac arrest. We reviewed the published English language literature on traumatic cardiac arrest and major trauma management. A treatment algorithm was developed based on this and the experience of treatment of more than a thousand traumatic cardiac arrests by a physician-paramedic pre-hospital trauma service. The algorithm addresses the need to treat potentially reversible causes of traumatic cardiac arrest. This includes immediate resuscitative thoracotomy in cases of penetrating chest trauma, airway management, optimising oxygenation, correction of hypovolaemia, and chest decompression to exclude tension pneumothorax. The requirement to rapidly address a number of potentially reversible pathologies in a short time period lends the management of traumatic cardiac arrest to a simple treatment algorithm. A standardised approach may prevent delay in diagnosis and treatment and improve current poor survival rates. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Young, Allan; Blier, Pierre; Kasper, Siegfried; Moeller, Hans Jurgen
2017-01-01
Background: The current paper includes a systematic search of the literature, a detailed presentation of the results, and a grading of treatment options in terms of efficacy and tolerability/safety. Material and Methods: The PRISMA method was used in the literature search with the combination of the words ‘bipolar,’ ‘manic,’ ‘mania,’ ‘manic depression,’ and ‘manic depressive’ with ‘randomized,’ and ‘algorithms’ with ‘mania,’ ‘manic,’ ‘bipolar,’ ‘manic-depressive,’ or ‘manic depression.’ Relevant web pages and review articles were also reviewed. Results: The current report is based on the analysis of 57 guideline papers and 531 published papers related to RCTs, reviews, post hoc, or meta-analysis papers to March 25, 2016. The specific treatment options for acute mania, mixed episodes, acute bipolar depression, maintenance phase, psychotic and mixed features, anxiety, and rapid cycling were evaluated with regards to efficacy. Existing treatment guidelines were also reviewed. Finally, tables reflecting efficacy and recommendation levels were created that led to the development of a precise algorithm that still has to prove its feasibility in everyday clinical practice. Conclusions: A systematic literature search was conducted on the pharmacological treatment of bipolar disorder to identify all relevant random controlled trials pertaining to all aspects of bipolar disorder and graded the data according to a predetermined method to develop a precise treatment algorithm for management of various phases of bipolar disorder. It is important to note that some of the recommendations in the treatment algorithm were based on secondary outcome data from post hoc analyses. PMID:27816941
Influence of different dose calculation algorithms on the estimate of NTCP for lung complications
Bäck, Anna
2013-01-01
Due to limitations and uncertainties in dose calculation algorithms, different algorithms can predict different dose distributions and dose‐volume histograms for the same treatment. This can be a problem when estimating the normal tissue complication probability (NTCP) for patient‐specific dose distributions. Published NTCP model parameters are often derived for a different dose calculation algorithm than the one used to calculate the actual dose distribution. The use of algorithm‐specific NTCP model parameters can prevent errors caused by differences in dose calculation algorithms. The objective of this work was to determine how to change the NTCP model parameters for lung complications derived for a simple correction‐based pencil beam dose calculation algorithm, in order to make them valid for three other common dose calculation algorithms. NTCP was calculated with the relative seriality (RS) and Lyman‐Kutcher‐Burman (LKB) models. The four dose calculation algorithms used were the pencil beam (PB) and collapsed cone (CC) algorithms employed by Oncentra, and the pencil beam convolution (PBC) and anisotropic analytical algorithm (AAA) employed by Eclipse. Original model parameters for lung complications were taken from four published studies on different grades of pneumonitis, and new algorithm‐specific NTCP model parameters were determined. The difference between original and new model parameters was presented in relation to the reported model parameter uncertainties. Three different types of treatments were considered in the study: tangential and locoregional breast cancer treatment and lung cancer treatment. Changing the algorithm without the derivation of new model parameters caused changes in the NTCP value of up to 10 percentage points for the cases studied. Furthermore, the error introduced could be of the same magnitude as the confidence intervals of the calculated NTCP values. The new NTCP model parameters were tabulated as the algorithm was varied from PB to PBC, AAA, or CC. Moving from the PB to the PBC algorithm did not require new model parameters; however, moving from PB to AAA or CC did require a change in the NTCP model parameters, with CC requiring the largest change. It was shown that the new model parameters for a given algorithm are different for the different treatment types. PACS numbers: 87.53.‐j, 87.53.Kn, 87.55.‐x, 87.55.dh, 87.55.kd PMID:24036865
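A minimal sketch of the LKB NTCP calculation named above, from a differential DVH through the generalized EUD to the probit response; the parameters (n, m, TD50) are illustrative, and as the paper stresses, algorithm-specific published values should be used in practice.

```python
# LKB model: gEUD = (sum_i v_i * d_i^(1/n))^n reduces the DVH to one dose;
# NTCP = Phi((gEUD - TD50) / (m * TD50)) with Phi the standard normal CDF.
import numpy as np
from scipy.stats import norm

def lkb_ntcp(dose_bins_gy, rel_volumes, n=1.0, m=0.35, td50=30.0):
    rel_volumes = np.asarray(rel_volumes) / np.sum(rel_volumes)
    eud = np.sum(rel_volumes * np.asarray(dose_bins_gy) ** (1.0 / n)) ** n
    t = (eud - td50) / (m * td50)
    return norm.cdf(t)

doses   = [5.0, 10.0, 20.0, 40.0]   # bin centres, Gy (illustrative lung DVH)
volumes = [0.4, 0.3, 0.2, 0.1]      # fractional organ volume per bin
print(f"NTCP = {lkb_ntcp(doses, volumes):.3f}")
```

With n = 1 the gEUD equals the mean dose, which is why mean lung dose and NTCP track each other closely in that limit.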
Nielsen, Tine B; Wieslander, Elinore; Fogliata, Antonella; Nielsen, Morten; Hansen, Olfred; Brink, Carsten
2011-05-01
To investigate differences in calculated doses and normal tissue complication probability (NTCP) values between different dose algorithms. Six dose algorithms from four different treatment planning systems were investigated: Eclipse AAA, Oncentra MasterPlan Collapsed Cone and Pencil Beam, Pinnacle Collapsed Cone and XiO Multigrid Superposition, and Fast Fourier Transform Convolution. Twenty NSCLC patients treated in the period 2001-2006 at the same accelerator were included, and the accelerator used for the treatments was modeled in the different systems. The treatment plans were recalculated with the same number of monitor units and beam arrangements across the dose algorithms. Dose volume histograms of the GTV, PTV, combined lungs (excluding the GTV), and heart were exported and evaluated. NTCP values for heart and lungs were calculated using the relative seriality model and the LKB model, respectively. Furthermore, NTCP for the lungs was calculated from two different model parameter sets. Calculations and evaluations were performed both including and excluding density corrections. Statistically significant differences are found between the calculated doses to the heart, lungs, and targets across the algorithms. Mean lung dose and V20 are not very sensitive to a change between the investigated dose calculation algorithms. However, the dose levels for the PTV averaged over the patient population vary by up to 11% across the algorithms. The predicted NTCP values for pneumonitis vary between 0.20 and 0.24 or 0.35 and 0.48 across the investigated dose algorithms, depending on the chosen model parameter set. The influence of the use of density correction in the dose calculation on the predicted NTCP values depends on the specific dose calculation algorithm and the model parameter set. For fixed values of these, the changes in NTCP can be up to 45%. Calculated NTCP values for pneumonitis are more sensitive to the choice of algorithm than mean lung dose and V20, which are also commonly used for plan evaluation. The NTCP values for heart complication are, in this study, not very sensitive to the choice of algorithm. Dose calculations based on density corrections result in quite different NTCP values than calculations without density corrections. It is therefore important when working with NTCP planning to use NTCP parameter values based on calculations and treatments similar to those for which the NTCP is of interest.
SU-E-T-629: Prediction of the ViewRay Radiotherapy Treatment Time for Clinical Logistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, S; Wooten, H; Wu, Y
Purpose: An algorithm was developed in our clinic to predict, for a given new treatment plan, the treatment delivery time for radiation therapy (RT) treatments of patients on the ViewRay magnetic resonance-image guided radiation therapy (MR-IGRT) delivery system. This algorithm is necessary for managing patient treatment appointments, and is useful as an indicator of treatment plan complexity. Methods: A patient's total treatment delivery time, not including time required for localization, may be described as the sum of four components: (1) the treatment initialization time; (2) the total beam-on time; (3) the gantry rotation time; and (4) the multileaf collimator (MLC) motion time. Each of the four components is predicted separately. The total beam-on time can be calculated using both the planned beam-on time and the decay-corrected delivery dose rate. To predict the remaining components, we quantitatively analyzed the patient treatment delivery record files. The initialization time is effectively random, since it depends on the final gantry angle and MLC leaf positions of the previous treatment. Based on modeling the relationships between the gantry rotation angles and the corresponding rotation time, and between the furthest MLC leaf moving distance and the corresponding MLC motion time, the total delivery time is predicted using linear regression. Results: The proposed algorithm has demonstrated the feasibility of predicting the ViewRay treatment delivery time for any treatment plan of any patient. The average prediction error is 0.89 minutes or 5.34%, and the maximal prediction error is 2.09 minutes or 13.87%. Conclusion: We have developed a treatment delivery time prediction algorithm based on the analysis of previous patients' treatment delivery records. The accuracy of our prediction is sufficient for guiding and arranging patient treatment appointments on a daily basis. The predicted delivery time could also be used as an indicator of treatment plan complexity. This work was supported by a research grant from ViewRay Inc.
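The four-component decomposition lends itself to a short sketch: fit linear models for the rotation and MLC components from record-file data, then sum the components. All numbers below, including the fixed initialization constant, are hypothetical stand-ins rather than ViewRay values.

```python
import numpy as np

# Hypothetical pairs extracted from delivery record files:
# furthest MLC leaf travel (cm) vs. observed MLC motion time (s),
# and gantry rotation angle (deg) vs. observed rotation time (s).
leaf_travel = np.array([1.0, 2.5, 4.0, 6.0, 8.0])
mlc_time = np.array([2.1, 3.4, 4.9, 7.2, 9.0])
coef_mlc = np.polyfit(leaf_travel, mlc_time, 1)   # linear regression

gantry_angle = np.array([30.0, 60.0, 90.0, 120.0])
rot_time = np.array([8.0, 14.5, 21.0, 27.5])
coef_rot = np.polyfit(gantry_angle, rot_time, 1)

def predict_delivery_time(planned_beam_on_s, decay_factor, angles, travels,
                          init_time_s=60.0):
    """Sum the four components described in the abstract (all inputs hypothetical)."""
    beam_on = planned_beam_on_s / decay_factor          # decay-corrected dose rate
    rotation = sum(np.polyval(coef_rot, a) for a in angles)
    mlc = sum(np.polyval(coef_mlc, t) for t in travels)
    return init_time_s + beam_on + rotation + mlc

print(predict_delivery_time(300.0, 0.9, angles=[60, 60, 60], travels=[3.0, 5.0]))
```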
[The guideline for the treatment of mood disorders in USA and Japan].
Higuchi, T
2001-08-01
Recently, the number of available antidepressants has increased dramatically and psychopharmacological treatment is becoming complex. It is important to provide guidelines to support clinical decision making. Three different kinds of guideline for the treatment of mood disorders, namely the APA-style guideline, the algorithm, and the consensus guideline, have been developed in Japan. The APA-style guideline and the algorithm are basically evidence-based, while the consensus guideline was developed through a consensus panel format. These guidelines should be used as 'a starting point' for specifying decisions, which may be modified as needed.
Gandolla, Marta; Molteni, Franco; Ward, Nick S; Guanziroli, Eleonora; Ferrigno, Giancarlo; Pedrocchi, Alessandra
2015-11-01
The foreseen outcome of a rehabilitation treatment is a stable improvement in functional outcomes, which can be longitudinally assessed through multiple measures to help clinicians in functional evaluation. In this study, we propose an automatic, comprehensive method of combining multiple measures in order to assess functional improvement. As a test-bed, a functional electrical stimulation based treatment for foot drop correction performed with chronic post-stroke participants is presented. Patients were assessed on five relevant outcome measures before the intervention, after it, and at a follow-up time point. A novel algorithm based on each variable's minimum detectable change is proposed and implemented in custom-made software, combining the outcome measures to obtain a single parameter: the capacity score. The difference between capacity scores at different time points is thresholded to obtain an improvement evaluation. Ten clinicians evaluated patients on the Improvement Clinical Global Impression scale. Eleven patients underwent the treatment, and five were found to have achieved a stable functional improvement, as assessed by the proposed algorithm. A statistically significant agreement between intra-clinician and algorithm-clinician evaluations was demonstrated. The proposed method evaluates functional improvement on a single-subject yes/no basis by merging different measures (e.g., kinematic, muscular), and it is validated against clinical evaluation.
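The core of the proposed algorithm, comparing each measure's change against its minimum detectable change (MDC) and thresholding the resulting capacity score, can be sketched in a few lines. The measures, MDC values, and the three-measure threshold below are assumptions for illustration, not the study's values.

```python
import numpy as np

def capacity_score(pre, post, mdc):
    """Count outcome measures whose improvement exceeds that measure's
    minimum detectable change (all measures assumed higher-is-better)."""
    pre, post, mdc = map(np.asarray, (pre, post, mdc))
    return int(np.sum((post - pre) > mdc))

def improved(pre, post, mdc, min_measures=3):
    """Yes/no improvement: capacity score thresholded (threshold assumed)."""
    return capacity_score(pre, post, mdc) >= min_measures

# Five hypothetical outcome measures
pre  = [10.0, 0.55, 30.0, 4.0, 12.0]
post = [14.0, 0.70, 33.0, 4.5, 18.0]
mdc  = [ 3.0, 0.10,  5.0, 1.0,  4.0]
print(capacity_score(pre, post, mdc), improved(pre, post, mdc))
```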
Harlaar, Jaap; Brehm, Merel; Becher, Jules G; Bregman, Daan J J; Buurke, Jaap; Holtkamp, Fred; De Groot, Vincent; Nollet, Frans
2010-09-01
Ankle Foot Orthoses (AFOs) to promote walking ability are a common treatment in patients with neurological or muscular diseases. However, guidelines on the prescription of AFOs are currently based on a low level of evidence regarding their efficacy. Recent studies aiming to demonstrate the efficacy of wearing an AFO with respect to walking ability are not always conclusive. This paper argues for recognizing two levels of evidence related to the ICF levels. Activity-level evidence expresses the gain in walking ability for the patient, while mechanical evidence expresses the correct functioning of the AFO. Used in combination for the purpose of evaluating the efficacy of orthotic treatment, a conjunct improvement at both levels reinforces the treatment algorithm that is used. Conversely, conflicting outcomes will challenge current treatment algorithms and the supposed working mechanism of the AFO. A treatment algorithm must use relevant information as an input, derived from measurements with a high precision. Its result will be a specific AFO that matches the patient's needs, specified by the mechanical characterization of the AFO-footwear combination. It is concluded that research on the efficacy of AFOs should use parameters from two levels of evidence to prove the efficacy of a treatment algorithm, i.e., how to prescribe a well-matched AFO.
Development and evaluation of an articulated registration algorithm for human skeleton registration
NASA Astrophysics Data System (ADS)
Yip, Stephen; Perk, Timothy; Jeraj, Robert
2014-03-01
Accurate registration over multiple scans is necessary to assess treatment response of bone diseases (e.g. metastatic bone lesions). This study aimed to develop and evaluate an articulated registration algorithm for whole-body skeleton registration in human patients. In articulated registration, whole-body skeletons are registered by auto-segmenting them into individual bones using atlas-based segmentation, and then rigidly aligning the bones. Sixteen patients (weight = 80-117 kg, height = 168-191 cm) with advanced prostate cancer underwent pre- and mid-treatment PET/CT scans over a course of cancer therapy. Skeletons were extracted from the CT images by thresholding (HU > 150). Skeletons were registered using the articulated, rigid, and deformable registration algorithms to account for position and postural variability between scans. The inter-observer agreement in the atlas creation, the agreement between the manually and atlas-based segmented bones, and the registration performances of all three registration algorithms were all assessed using the Dice similarity index: DSIobserver, DSIatlas, and DSIregister. The Hausdorff distance (dHausdorff) of the registered skeletons was also used for registration evaluation. Nearly negligible inter-observer variability was found in the bone atlas creation, with DSIobserver = 96 ± 2%. Atlas-based and manually segmented bones were in excellent agreement, with DSIatlas of 90 ± 3%. The articulated (DSIregister = 75 ± 2%, dHausdorff = 0.37 ± 0.08 cm) and deformable registration algorithms (DSIregister = 77 ± 3%, dHausdorff = 0.34 ± 0.08 cm) considerably outperformed the rigid registration algorithm (DSIregister = 59 ± 9%, dHausdorff = 0.69 ± 0.20 cm) in the skeleton registration, as the rigid registration algorithm failed to capture the skeletal flexibility in the joints. Despite superior skeleton registration performance, the deformable registration algorithm failed to preserve the local rigidity of bones, as over 60% of the skeletons were deformed. Articulated registration is superior to rigid and deformable registration by capturing global flexibility while preserving the local rigidity inherent in skeleton registration. Therefore, articulated registration can be employed to accurately register whole-body human skeletons, enabling treatment response assessment of various bone diseases.
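The Dice similarity index used throughout this evaluation is twice the overlap of two binary masks divided by the sum of their sizes. A minimal sketch on synthetic masks:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity index between two binary masks (1 = perfect overlap)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two overlapping synthetic segmentations
a = np.zeros((50, 50), bool); a[10:30, 10:30] = True
b = np.zeros((50, 50), bool); b[12:32, 12:32] = True
print(f"DSI = {100 * dice(a, b):.1f}%")
```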
Dosimetric verification of radiotherapy treatment planning systems in Serbia: national audit.
Rutonjski, Laza; Petrović, Borislava; Baucal, Milutin; Teodorović, Milan; Cudić, Ozren; Gershkevitsh, Eduard; Izewska, Joanna
2012-09-12
Independent external audits play an important role in quality assurance programmes in radiation oncology. The audit supported by the IAEA in Serbia was designed to review the whole chain of activities in the 3D conformal radiotherapy (3D-CRT) workflow, from patient data acquisition to treatment planning and dose delivery. The audit was based on the IAEA recommendations and focused on the dosimetry part of the treatment planning and delivery processes. The audit was conducted in three radiotherapy departments of Serbia. An anthropomorphic phantom was scanned with a computed tomography (CT) unit, and treatment plans for eight different test cases involving various beam configurations suggested by the IAEA were prepared on the local treatment planning systems (TPSs). The phantom was irradiated following the treatment plans for these test cases, and doses at specific points were measured with an ionization chamber. The differences between the measured and calculated doses were reported. The measurements were conducted for different photon beam energies and TPS calculation algorithms. The deviations between the measured and calculated values for all test cases made with advanced algorithms were within the agreement criteria, while larger deviations were observed for simpler algorithms. The number of measurements with results outside the agreement criteria increased with the beam energy and decreased with TPS calculation algorithm sophistication. In addition, a few errors in the basic dosimetry data in the TPSs were detected and corrected. The audit helped the users to better understand the operational features and limitations of their TPSs and resulted in increased confidence in dose calculation accuracy using TPSs. The audit results indicated the shortcomings of simpler algorithms for the test cases performed, and therefore the transition to more advanced algorithms is highly desirable.
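The audit's core computation, the percentage deviation of a TPS-calculated point dose from the chamber-measured dose checked against an agreement criterion, is simple to express. The doses and the 3% tolerance below are hypothetical; the actual IAEA criteria vary by test case.

```python
def check_agreement(measured_gy, calculated_gy, tolerance_pct=3.0):
    """Percentage deviation of a TPS-calculated point dose from the
    ionization-chamber measurement, and whether it meets the criterion."""
    dev = 100.0 * (calculated_gy - measured_gy) / measured_gy
    return dev, abs(dev) <= tolerance_pct

# Hypothetical point doses for two test cases
for meas, calc in [(2.00, 2.04), (1.50, 1.61)]:
    dev, ok = check_agreement(meas, calc)
    print(f"deviation {dev:+.1f}% -> {'within' if ok else 'outside'} criterion")
```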
Model-based clustering for RNA-seq data.
Si, Yaqing; Liu, Peng; Li, Pinghua; Brutnell, Thomas P
2014-01-15
RNA-seq technology has been widely adopted as an attractive alternative to microarray-based methods to study global gene expression. However, robust statistical tools to analyze these complex datasets are still lacking. By grouping genes with similar expression profiles across treatments, cluster analysis provides insight into gene functions and networks, and hence is an important technique for RNA-seq data analysis. In this manuscript, we derive clustering algorithms based on appropriate probability models for RNA-seq data. An expectation-maximization (EM) algorithm and two stochastic versions of the EM algorithm are described. In addition, a strategy for initialization based on likelihood is proposed to improve the clustering algorithms. Moreover, we present a model-based hybrid-hierarchical clustering method to generate a tree structure that allows visualization of relationships among clusters as well as flexibility in choosing the number of clusters. Results from both simulation studies and analysis of a maize RNA-seq dataset show that our proposed methods provide better clustering results than alternative methods such as the K-means algorithm and hierarchical clustering methods that are not based on probability models. An R package, MBCluster.Seq, has been developed to implement our proposed algorithms. This R package provides fast computation and is publicly available at http://www.r-project.org
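As a rough illustration of model-based clustering for count data (not the MBCluster.Seq implementation, which is an R package), here is a toy expectation-maximization loop for a Poisson mixture over gene-by-treatment counts:

```python
import numpy as np

def poisson_mixture_em(counts, k, iters=100, seed=0):
    """Minimal EM for a Poisson mixture over gene-by-treatment counts.

    A toy stand-in for the model-based methods in the paper. Each cluster
    has one Poisson mean per treatment; genes are assigned to the most
    probable cluster."""
    rng = np.random.default_rng(seed)
    g, _ = counts.shape
    means = counts[rng.choice(g, k, replace=False)].astype(float) + 0.5
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: per-gene log-likelihood under each cluster (constants dropped)
        loglik = (counts[:, None, :] * np.log(means[None]) - means[None]).sum(-1)
        loglik += np.log(weights)
        resp = np.exp(loglik - loglik.max(axis=1, keepdims=True))
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights and cluster means
        weights = resp.mean(axis=0)
        means = (resp.T @ counts) / resp.sum(axis=0)[:, None] + 1e-9
    return resp.argmax(axis=1)

# Two synthetic expression patterns across three treatments
rng = np.random.default_rng(1)
counts = np.vstack([rng.poisson([5, 5, 50], (50, 3)),
                    rng.poisson([50, 5, 5], (50, 3))])
print(np.bincount(poisson_mixture_em(counts, k=2)))
```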
Merkx, Maarten J M; Schippers, Gerard M; Koeter, Maarten J W; Vuijk, Pieter Jelle; Oudejans, Suzan; de Vries, Carlijn C Q; van den Brink, Wim
2007-03-01
To examine the feasibility of implementing evidence-based guidelines for patient-treatment-matching to levels of care in two Dutch substance abuse treatment centres. Multi-centre observational follow-up study. Two large substance abuse treatment centres (SATCs). All 4394 referrals to the two SATCs in 2003. Baseline patient characteristics needed for treatment allocation according to protocol, treatment allocation according to matching protocol, treatment allocation according to actual level of care (LOC) entered. Comparison of recommended and actual LOC entered. Evaluation of reasons for observed differences between recommended and actual LOC entered. Data needed for treatment allocation according to protocol were available for 2269 (51.6%) patients. Data needed for evaluation of actual LOC entered were available for 1765 (40.2%) patients. Of these patients, 1089 (60.8%) were allocated according to protocol: 48.4% based on the guideline algorithm and 12.4% based on clinically justified deviations from this algorithm. The main reason for deviation was a different appraisal of addiction severity, made by the intake counsellor compared to the protocol. The feasibility of guideline-based treatment allocation is seriously limited due to inadequate data collection of patient characteristics and suboptimal guideline-based treatment allocation. As a consequence, only 24.4% of the patients could be evaluated as being matched properly to the treatment planned. The results indicate several barriers which limit the adequate implementation of patient-treatment-matching guidelines: problems in the infrastructure of data collection and storage and the inertia of intake staff who did not adhere to the guidelines for assessment and matching.
Ohura, Takehiko; Sanada, Hiromi; Mino, Yoshio
2004-01-01
In recent years, the concept of cost-effectiveness, including medical delivery and health service fee systems, has become widespread in Japanese health care. In the field of pressure ulcer management, the recent introduction of penalty subtraction in the care fee system emphasizes the need for prevention and cost-effective care of pressure ulcers. Previous cost-effectiveness research on pressure ulcer management tended to focus only on "hardware" costs such as those for pharmaceuticals and medical supplies, while neglecting other cost aspects, particularly the cost of labor. Thus, cost-effectiveness in pressure ulcer care has not yet been fully established. To provide true cost-effectiveness data, a comparative prospective study was initiated in patients with stage II and III pressure ulcers. Considering the potential impact of the pressure reduction mattress on clinical outcome, the same type of pressure reduction mattress was utilized in all cases in the study. The cost analysis method used was Activity-Based Costing, which measures material and labor cost aspects on a daily basis. A reduction in the Pressure Sore Status Tool (PSST) score was used to measure clinical effectiveness. Patients were divided into three groups based on the treatment method and on the use of a consistent algorithm of wound care: 1. MC/A group, modern dressings with a treatment algorithm (control cohort). 2. TC/A group, traditional care (ointment and gauze) with a treatment algorithm. 3. TC/NA group, traditional care (ointment and gauze) without a treatment algorithm. The results revealed that MC/A is more cost-effective than both TC/A and TC/NA. This suggests that appropriate utilization of modern dressing materials and a pressure ulcer care algorithm would contribute to reduced health care costs, improved clinical results, and, ultimately, greater cost-effectiveness.
Basu, Partha; Meheus, Filip; Chami, Youssef; Hariprasad, Roopa; Zhao, Fanghui; Sankaranarayanan, Rengaswamy
2017-07-01
Management algorithms for screen-positive women in cervical cancer prevention programs have undergone substantial changes in recent years. The WHO strongly recommends human papillomavirus (HPV) testing for primary screening, if affordable, or if not, then visual inspection with acetic acid (VIA), and promotes treatment directly following screening through the screen-and-treat approach (one or two clinic visits). While VIA-positive women can be offered immediate ablative treatment based on certain eligibility criteria, HPV-positive women need to undergo subsequent VIA to determine their eligibility. Simpler ablative methods of treatment such as cryotherapy and thermal coagulation have been demonstrated to be effective and to have excellent safety profiles, and these have become integral parts of new management algorithms. The challenges faced by low-resource countries are many and include, from the management perspective, identifying an affordable point-of-care HPV detection test, minimizing over-treatment, and installing an effective information system to ensure high compliance to treatment and follow-up. © 2017 The Authors. International Journal of Gynecology & Obstetrics published by John Wiley & Sons Ltd on behalf of International Federation of Gynecology and Obstetrics.
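The management logic described above can be caricatured as a small decision function. The real eligibility criteria (lesion extent, transformation zone type, suspicion of invasion, etc.) are clinical judgments collapsed here into a single boolean, and the messages are illustrative only.

```python
def manage_screen_positive(hpv_positive=False, via_positive=None,
                           eligible_for_ablation=False):
    """Toy sketch of the screen-and-treat flow described above.

    via_positive=None means VIA has not yet been performed; eligibility
    for ablation is collapsed into one assumed boolean."""
    if hpv_positive and via_positive is None:
        return "HPV-positive: perform VIA to determine treatment eligibility"
    if via_positive:
        if eligible_for_ablation:
            return "offer ablative treatment (e.g. cryotherapy or thermal coagulation)"
        return "refer for colposcopy / further management"
    if hpv_positive:
        return "HPV-positive, VIA-negative: follow up per programme protocol"
    return "screen-negative: routine rescreening interval"

print(manage_screen_positive(hpv_positive=True, via_positive=True,
                             eligible_for_ablation=True))
```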
Harrison, Michelle; Collins, Curtis D
2015-03-01
Procalcitonin has emerged as a promising biomarker of bacterial infection. Published literature demonstrates that use of procalcitonin testing and an associated treatment pathway reduces duration of antibiotic therapy without impacting mortality. The objective of this study was to determine the financial impact of utilizing a procalcitonin-guided treatment algorithm in hospitalized patients with sepsis. Cost-minimization and cost-utility analysis. Hypothetical cohort of adult ICU patients with suspected bacterial infection and sepsis. Utilizing published clinical and economic data, a decision analytic model was developed from the U.S. hospital perspective. Effectiveness and utility measures were defined using cost-per-clinical episode and cost per quality-adjusted life years (QALYs). Upper and lower sensitivity ranges were determined for all inputs. Univariate and probabilistic sensitivity analyses assessed the robustness of our model and variables. Incremental cost-effectiveness ratios (ICERs) were calculated and compared to predetermined willingness-to-pay thresholds. Base-case results predicted the use of a procalcitonin-guided treatment algorithm dominated standard care with improved quality (0.0002 QALYs) and decreased overall treatment costs ($65). The model was sensitive to a number of key variables that had the potential to impact results, including algorithm adherence (<42.3%), number and cost of procalcitonin tests ordered (≥9 and >$46), days of antimicrobial reduction (<1.6 d), incidence of nephrotoxicity and rate of nephrotoxicity reduction. The combination of procalcitonin testing with an evidence-based treatment algorithm may improve patients' quality of life while decreasing costs in ICU patients with suspected bacterial infection and sepsis; however, results were highly dependent on a number of variables and assumptions.
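The decision-analytic arithmetic behind "dominance" is the incremental cost-effectiveness ratio. Plugging in the deltas reported above (a $65 saving and a 0.0002 QALY gain) yields a negative ICER, i.e., the algorithm arm is both cheaper and more effective; the baseline cost and QALY values below are placeholders chosen only to reproduce those deltas.

```python
def icer(cost_new, qaly_new, cost_std, qaly_std):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_std) / (qaly_new - qaly_std)

# Placeholder baselines; only the differences ($65, 0.0002 QALY) come from
# the abstract. A negative ICER with lower cost and higher QALYs = dominant.
print(icer(cost_new=9_935.0, qaly_new=0.5002,
           cost_std=10_000.0, qaly_std=0.5000))
```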
2013-01-02
Intensity data from the SNP array were normalized using the Affymetrix GeneChip Targeted Genotyping Analysis Software (GTGS). To assess robustness of SNP calls, genotypes were called using three algorithms: (i) GTGS, (ii) illuminus (27), and (iii) a heuristic algorithm based on discrete cutoffs of
An algorithmic approach for the treatment of severe uncontrolled asthma
Zervas, Eleftherios; Samitas, Konstantinos; Papaioannou, Andriana I.; Bakakos, Petros; Loukides, Stelios; Gaga, Mina
2018-01-01
A small subgroup of patients with asthma suffers from severe disease that is either partially controlled or uncontrolled despite intensive, guideline-based treatment. These patients have significantly impaired quality of life and although they constitute <5% of all asthma patients, they are responsible for more than half of asthma-related healthcare costs. Here, we review a definition for severe asthma and present all therapeutic options currently available for these severe asthma patients. Moreover, we suggest a specific algorithmic treatment approach for the management of severe, difficult-to-treat asthma based on specific phenotype characteristics and biomarkers. The diagnosis and management of severe asthma requires specialised experience, time and effort to comprehend the needs and expectations of each individual patient and incorporate those as well as his/her specific phenotype characteristics into the management planning. Although some new treatment options are currently available for these patients, there is still a need for further research into severe asthma and yet more treatment options. PMID:29531957
Three-Dimensional Electron Beam Dose Calculations.
NASA Astrophysics Data System (ADS)
Shiu, Almon Sowchee
The MDAH pencil-beam algorithm developed by Hogstrom et al (1981) has been widely used in clinics for electron beam dose calculations in radiotherapy treatment planning. The primary objective of this research was to address several deficiencies of that algorithm and to develop an enhanced version. Two enhancements have been incorporated into the pencil-beam algorithm; one models fluence rather than planar fluence, and the other models the bremsstrahlung dose using measured beam data. Comparisons of the resulting calculated dose distributions with measured dose distributions for several test phantoms have been made. From these results it is concluded (1) that the fluence-based algorithm is more accurate for the dose calculation in an inhomogeneous slab phantom, and (2) that the fluence-based calculation provides only a limited improvement to the accuracy of the calculated dose in the region just downstream of the lateral edge of an inhomogeneity. The latter inaccuracy is believed to be primarily due to assumptions made in the pencil beam's modeling of the complex phantom or patient geometry. A pencil-beam redefinition model was developed for the calculation of electron beam dose distributions in three dimensions. The primary aim of this redefinition model was to solve the dosimetry problem presented by deep inhomogeneities, which was the major deficiency of the enhanced version of the MDAH pencil-beam algorithm. The pencil-beam redefinition model is based on the theory of electron transport, redefining the pencil beams at each layer of the medium. The unique approach of this model is that all the physical parameters of a given pencil beam are characterized for multiple energy bins. Comparisons of the calculated dose distributions with measured dose distributions for a homogeneous water phantom and for phantoms with deep inhomogeneities have been made. From these results it is concluded that the redefinition algorithm is superior to the conventional, fluence-based, pencil-beam algorithm, especially in predicting the dose distribution downstream of a local inhomogeneity. The accuracy of this algorithm appears sufficient for clinical use, and the algorithm is structured for future expansion of the physical model if required for site-specific treatment planning problems.
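The pencil-beam family of algorithms amounts to convolving the incident fluence with a lateral-spread kernel at each depth. The toy below uses a Gaussian kernel at a single depth plane as a generic stand-in for the measured kernels of a clinical algorithm; it is a sketch of the idea, not the MDAH or redefinition algorithm.

```python
import numpy as np
from scipy.signal import fftconvolve

def pencil_beam_dose(fluence, sigma_cm, grid_cm):
    """Toy pencil-beam dose at one depth: incident fluence convolved with a
    Gaussian lateral-spread kernel (sigma is an assumed spread at that depth)."""
    x = np.arange(-3 * sigma_cm, 3 * sigma_cm + grid_cm, grid_cm)
    xx, yy = np.meshgrid(x, x)
    kernel = np.exp(-(xx**2 + yy**2) / (2 * sigma_cm**2))
    kernel /= kernel.sum()
    return fftconvolve(fluence, kernel, mode="same")

# Open square field on a 0.1 cm grid
fluence = np.zeros((101, 101)); fluence[30:70, 30:70] = 1.0
dose = pencil_beam_dose(fluence, sigma_cm=0.8, grid_cm=0.1)
print(f"central-axis dose {dose[50, 50]:.3f}, field-edge dose {dose[50, 30]:.3f}")
```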
Bejan, Cosmin Adrian; Wei, Wei-Qi; Denny, Joshua C
2015-01-01
Objective To evaluate the contribution of the MEDication Indication (MEDI) resource and SemRep for identifying treatment relations in clinical text. Materials and methods We first processed clinical documents with SemRep to extract the Unified Medical Language System (UMLS) concepts and the treatment relations between them. Then, we incorporated MEDI into a simple algorithm that identifies treatment relations between two concepts if they match a medication-indication pair in this resource. For better coverage, we expanded MEDI using ontology relationships from RxNorm and the UMLS Metathesaurus. We also developed two ensemble methods, which combined the predictions of SemRep and the MEDI algorithm. We evaluated our selected methods on two datasets, a Vanderbilt corpus of 6864 discharge summaries and the 2010 Informatics for Integrating Biology and the Bedside (i2b2)/Veterans Affairs (VA) challenge dataset. Results The Vanderbilt dataset included 958 manually annotated treatment relations. A double annotation was performed on 25% of relations with high agreement (Cohen's κ = 0.86). The evaluation consisted of comparing the manually annotated relations with the relations identified by SemRep, the MEDI algorithm, and the two ensemble methods. On the first dataset, the best F1-measure results achieved by the MEDI algorithm and the union of the two resources (78.7 and 80, respectively) were significantly higher than the SemRep results (72.3). On the second dataset, the MEDI algorithm achieved better precision and significantly lower recall values than the best system in the i2b2 challenge. The two systems obtained comparable F1-measure values on the subset of i2b2 relations with both arguments in MEDI. Conclusions Both SemRep and MEDI can be used to extract treatment relations from clinical text. Knowledge-based extraction with MEDI outperformed use of SemRep alone, but superior performance was achieved by integrating both systems. The integration of knowledge-based resources such as MEDI into information extraction systems such as SemRep and the i2b2 relation extractors may improve treatment relation extraction from clinical text. PMID:25336593
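The MEDI-based algorithm described above is essentially a lookup: flag a treatment relation whenever a drug/problem concept pair appears in the resource. A minimal sketch with a hypothetical three-pair stand-in for MEDI:

```python
# Hypothetical MEDI-style resource: (medication concept, indication concept) pairs.
MEDI_PAIRS = {
    ("metformin", "type 2 diabetes mellitus"),
    ("lisinopril", "hypertension"),
    ("albuterol", "asthma"),
}

def treatment_relations(drug_concepts, problem_concepts):
    """Flag a treatment relation when a drug/problem pair appears in the
    resource, mirroring the simple matching algorithm described above."""
    return [(d, p) for d in drug_concepts for p in problem_concepts
            if (d, p) in MEDI_PAIRS]

print(treatment_relations(["metformin", "aspirin"],
                          ["type 2 diabetes mellitus", "hyperlipidemia"]))
```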
Weigel, Ralf; Schlickum, Linda; Weisser, Gerald; Krauss, Joachim K
2015-01-01
Surgical treatment for chronic subdural haematoma (CSH) has been analysed by applying evidence-based medicine (EBM) criteria earlier. Whether implementation of EBM-derived key factors into an optimised treatment algorithm would improve outcome, however, needs to be clarified. Symptomatic patients with CSH who fulfilled the inclusion criteria were either assigned to an optimised treatment algorithm (OA-EBM group) or to a control group treated by the standard departmental surgical technique (SDST group) in a prospective design. For the OA-EBM algorithm only one burr hole, extensive intraoperative irrigation and a closed system drainage with meticulous avoidance of entry of air was mandatory. A two-catheter technique was used to reduce intracavital air. Final endpoints were neurological outcome (Markwalder Score), recurrence and the amount of intracranial air. A total of 93 out of 117 patients were evaluated accounting for 113 cases because 20 patients had bilateral haematomas. Demographic data of 68 cases in the SDST group did not differ from 45 cases in the OA-EBM group. The Markwalder Score showed greater improvement in the OA-EBM group (0.5 ± 0.6 vs. 1.0 ± 1.0, p = 0.003). The recurrence rate was 18% (12 patients) in the SDST group versus 2% (1 patient) in the OA-EBM group (p < 0.05). The amount of intracranial air was significantly lower in the OA-EBM group (3.3 ± 5.0 cm(3) vs. 5.2 ± 7.7 cm(3)) with p = 0.04. In the standard group computerised tomography scanning was performed slightly earlier (3 ± 1.7 days vs. 3.6 ± 1.4 days). When comparing only non-recurrent cases in both groups no significant difference was apparent. Implementation of EBM key factors into a treatment algorithm for CSH can improve neurological outcome in a typical neurosurgical department, reduce recurrence and minimise the amount of postoperative air within the haematoma cavity.
A novel clinical decision support algorithm for constructing complete medication histories.
Long, Ju; Yuan, Michael Juntao
2017-07-01
A patient's complete medication history is a crucial element for physicians to develop a full understanding of the patient's medical conditions and treatment options. However, due to the fragmented nature of medical data, this process can be very time-consuming and often impossible for physicians to construct a complete medication history for complex patients. In this paper, we describe an accurate, computationally efficient and scalable algorithm to construct a medication history timeline. The algorithm is developed and validated based on 1 million random prescription records from a large national prescription data aggregator. Our evaluation shows that the algorithm can be scaled horizontally on-demand, making it suitable for future delivery in a cloud-computing environment. We also propose that this cloud-based medication history computation algorithm could be integrated into Electronic Medical Records, enabling informed clinical decision-making at the point of care. Copyright © 2017 Elsevier B.V. All rights reserved.
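A common way to build such a timeline, assumed here since the paper's exact rules are not reproduced, is to merge prescription fills into continuous treatment intervals per drug, allowing a grace gap between fills:

```python
from collections import defaultdict
from datetime import date, timedelta

def medication_timeline(prescriptions):
    """Merge fill records into continuous treatment intervals per drug.

    prescriptions: (drug, fill_date, days_supply) tuples. A fill starting
    before the previous interval ends (plus a grace gap) extends it; the
    30-day grace period is an assumption, not from the paper."""
    grace = timedelta(days=30)
    by_drug = defaultdict(list)
    for drug, start, days in sorted(prescriptions, key=lambda r: (r[0], r[1])):
        end = start + timedelta(days=days)
        spans = by_drug[drug]
        if spans and start <= spans[-1][1] + grace:
            spans[-1] = (spans[-1][0], max(spans[-1][1], end))  # extend interval
        else:
            spans.append((start, end))                          # new interval
    return dict(by_drug)

rx = [("metformin", date(2016, 1, 1), 30),
      ("metformin", date(2016, 2, 5), 30),
      ("metformin", date(2016, 9, 1), 30)]
print(medication_timeline(rx))   # two intervals: a merged one, then a gap
```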
A Risk-based Model Predictive Control Approach to Adaptive Interventions in Behavioral Health
Zafra-Cabeza, Ascensión; Rivera, Daniel E.; Collins, Linda M.; Ridao, Miguel A.; Camacho, Eduardo F.
2010-01-01
This paper examines how control engineering and risk management techniques can be applied in the field of behavioral health through their use in the design and implementation of adaptive behavioral interventions. Adaptive interventions are gaining increasing acceptance as a means to improve prevention and treatment of chronic, relapsing disorders, such as abuse of alcohol, tobacco, and other drugs, mental illness, and obesity. A risk-based Model Predictive Control (MPC) algorithm is developed for a hypothetical intervention inspired by Fast Track, a real-life program whose long-term goal is the prevention of conduct disorders in at-risk children. The MPC-based algorithm decides on the appropriate frequency of counselor home visits, mentoring sessions, and the availability of after-school recreation activities by relying on a model that includes identifiable risks, their costs, and the cost/benefit assessment of mitigating actions. MPC is particularly suited for the problem because of its constraint-handling capabilities, and its ability to scale to interventions involving multiple tailoring variables. By systematically accounting for risks and adapting treatment components over time, an MPC approach as described in this paper can increase intervention effectiveness and adherence while reducing waste, resulting in advantages over conventional fixed treatment. A series of simulations are conducted under varying conditions to demonstrate the effectiveness of the algorithm. PMID:21643450
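A receding-horizon controller of the kind described can be sketched with a grid search over intervention sequences, applying only the first move each cycle. The risk dynamics, costs, and weights below are invented for illustration; a real MPC formulation would use an identified model and constrained optimization rather than enumeration.

```python
import numpy as np
from itertools import product

def mpc_step(risk, horizon=3, levels=(0, 1, 2), effect=0.15,
             cost_per_level=1.0, risk_weight=10.0):
    """One receding-horizon step of a toy risk-based controller: grid-search
    dosage sequences over the horizon, return the first move of the best one.
    The 5% upward risk drift and all weights are assumptions."""
    best, best_cost = None, np.inf
    for seq in product(levels, repeat=horizon):
        r, total = risk, 0.0
        for u in seq:
            r = max(0.0, r * 1.05 - effect * u)      # assumed risk dynamics
            total += risk_weight * r + cost_per_level * u
        if total < best_cost:
            best, best_cost = seq, total
    return best[0]

risk = 0.6
for week in range(5):
    u = mpc_step(risk)
    risk = max(0.0, risk * 1.05 - 0.15 * u)          # apply only the first move
    print(f"week {week}: dosage level {u}, risk {risk:.2f}")
```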
Mani, Ganesh Kadirampatti; Karunakaran, Kaviarasu
2016-01-01
Fields smaller than 4×4 cm2 are used in stereotactic and conformal treatments, where heterogeneity is normally present. Since dose calculation in small fields and in heterogeneous media is prone to larger discrepancies, the algorithms used by treatment planning systems (TPS) should be evaluated to achieve better treatment results. This report aims at evaluating the accuracy of four model-based algorithms: X-ray Voxel Monte Carlo (XVMC) from Monaco, Superposition (SP) from CMS-XiO, and AcurosXB (AXB) and the analytical anisotropic algorithm (AAA) from Eclipse, tested against measurement. Measurements were done using an Exradin W1 plastic scintillator in a Solid Water phantom with heterogeneities such as air, lung, bone, and aluminum, irradiated with 6 and 15 MV photons with square field sizes ranging from 1×1 to 4×4 cm2. Each heterogeneity was introduced individually at two different depths from the depth of dose maximum (Dmax), one setup being nearer to and the other farther from Dmax. The central axis percentage depth-dose (CADD) curve for each setup was measured separately and compared with the TPS algorithm calculation for the same setup. The percentage normalized root mean squared deviation (%NRMSD), which represents the whole CADD curve's deviation from the measured curve, was calculated. It was found that for air and lung heterogeneities, for both 6 and 15 MV, all algorithms show maximum deviation for the 1×1 cm2 field size, with the deviation gradually reducing as field size increases, except for AAA. For aluminum and bone, all algorithms' deviations are smaller for 15 MV irrespective of setup. In all heterogeneity setups, the 1×1 cm2 field showed maximum deviation, except in the 6 MV bone setup. For all algorithms in the study, irrespective of energy and field size, the dose deviation is higher when a heterogeneity is nearer to Dmax than when the same heterogeneity is farther from Dmax. Also, all algorithms show maximum deviation in lower-density materials compared with high-density materials. PACS numbers: 87.53.Bn, 87.53.kn, 87.56.bd, 87.55.Kd, 87.56.jf PMID:26894345
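One plausible reading of %NRMSD, the RMS deviation between the calculated and measured depth-dose curves normalized to the measured maximum, is sketched below; the paper's exact normalization is not reproduced, and the curves are synthetic.

```python
import numpy as np

def pct_nrmsd(measured, calculated):
    """Percentage normalized RMS deviation over a depth-dose curve
    (RMS deviation normalized to the measured maximum; one plausible
    definition, assumed rather than taken from the paper)."""
    measured, calculated = np.asarray(measured), np.asarray(calculated)
    rms = np.sqrt(np.mean((calculated - measured) ** 2))
    return 100.0 * rms / measured.max()

depth = np.linspace(0, 10, 50)
measured = 100 * np.exp(-0.08 * depth)              # toy CADD curve
calculated = measured * (1 + 0.02 * np.sin(depth))  # toy TPS calculation
print(f"%NRMSD = {pct_nrmsd(measured, calculated):.2f}")
```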
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazur, T; Wang, Y; Fischer-Valuck, B
2015-06-15
Purpose: To develop a novel and rapid SIFT-based algorithm for assessing feature motion on cine MR images acquired during MRI-guided radiotherapy treatments. In particular, we apply SIFT descriptors toward both partitioning cine images into respiratory states and tracking regions across frames. Methods: Among a training set of images acquired during a fraction, we densely assign SIFT descriptors to pixels within the images. We cluster these descriptors across all frames in order to produce a dictionary of trackable features. Associating the best-matching descriptors at every frame among the training images to these features, we construct motion traces for the features. We use these traces to define respiratory bins for sorting images in order to facilitate robust pixel-by-pixel tracking. Instead of applying conventional methods for identifying pixel correspondences across frames, we utilize a recently developed algorithm that derives correspondences via a matching objective for SIFT descriptors. Results: We apply these methods to a collection of lung, abdominal, and breast patients. We evaluate the procedure for respiratory binning using target sites exhibiting high-amplitude motion among 20 lung and abdominal patients. In particular, we investigate whether these methods yield minimal variation between images within a bin by perturbing the resulting image distributions among bins. Moreover, we compare the motion between averaged images across respiratory states to 4DCT data for these patients. We evaluate the algorithm for obtaining pixel correspondences between frames by tracking contours among a set of breast patients. As an initial case, we track easily identifiable edges of lumpectomy cavities that show minimal motion over treatment. Conclusions: These SIFT-based methods reliably extract motion information from cine MR images acquired during patient treatments. While we performed our analysis retrospectively, the algorithm lends itself to prospective motion assessment. Applications of these methods include motion assessment, identifying treatment windows for gating, and determining optimal margins for treatment.
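SIFT keypoint detection and descriptor matching of the sort this method builds on are available off the shelf, for example in OpenCV. The snippet below matches descriptors between two synthetic "frames" as a generic stand-in for the dense-descriptor correspondence step; it is not the authors' pipeline.

```python
import cv2
import numpy as np

# Two synthetic "cine frames": bright structures that shift between frames.
def frame(cx, cy):
    img = np.zeros((128, 128), np.uint8)
    cv2.circle(img, (cx, cy), 12, 255, -1)
    cv2.rectangle(img, (cx - 30, cy + 20), (cx - 10, cy + 35), 180, -1)
    return cv2.GaussianBlur(img, (5, 5), 1.5)

f0, f1 = frame(50, 60), frame(55, 63)

sift = cv2.SIFT_create()
kp0, des0 = sift.detectAndCompute(f0, None)
kp1, des1 = sift.detectAndCompute(f1, None)

# Brute-force matching of SIFT descriptors across frames
if des0 is not None and des1 is not None:
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des0, des1)
    for m in sorted(matches, key=lambda m: m.distance)[:3]:
        (x0, y0), (x1, y1) = kp0[m.queryIdx].pt, kp1[m.trainIdx].pt
        print(f"({x0:.0f},{y0:.0f}) -> ({x1:.0f},{y1:.0f})")
```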
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Vincent W.C., E-mail: htvinwu@polyu.edu.hk; Tse, Teddy K.H.; Ho, Cola L.M.
2013-07-01
Monte Carlo (MC) simulation is currently the most accurate dose calculation algorithm in radiotherapy planning but requires relatively long processing time. Faster model-based algorithms such as the anisotropic analytical algorithm (AAA) of the Eclipse treatment planning system and multigrid superposition (MGS) of the XiO treatment planning system are two commonly used algorithms. This study compared AAA and MGS against MC, as the gold standard, on brain, nasopharynx, lung, and prostate cancer patients. Computed tomography scans of six patients of each cancer type were used. The same hypothetical treatment plan using the same machine and treatment prescription was computed for each case by each planning system using its respective dose calculation algorithm. The doses at reference points including (1) soft tissues only, (2) bones only, (3) air cavities only, (4) the soft tissue-bone boundary (Soft/Bone), (5) the soft tissue-air boundary (Soft/Air), and (6) the bone-air boundary (Bone/Air) were measured and compared using the mean absolute percentage error (MAPE), a function of the percentage dose deviations from MC. In addition, the computation time of each treatment plan was recorded and compared. The MAPEs of MGS were significantly lower than those of AAA in all types of cancers (p<0.001). With regard to body density combinations, the MAPE of AAA ranged from 1.8% (soft tissue) to 4.9% (Bone/Air), whereas that of MGS ranged from 1.6% (air cavities) to 2.9% (Soft/Bone). The MAPEs of MGS (2.6%±2.1) were significantly lower than those of AAA (3.7%±2.5) across all tissue density combinations (p<0.001). The mean computation time of AAA for all treatment plans was significantly lower than that of MGS (p<0.001). Both the AAA and MGS algorithms demonstrated dose deviations of less than 4.0% in most clinical cases, and their performance was better in homogeneous tissues than at tissue boundaries. In general, MGS demonstrated relatively smaller dose deviations than AAA but required longer computation time.
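MAPE as used here is the mean of the absolute percentage dose deviations from the Monte Carlo reference at the evaluated points; the dose values below are hypothetical.

```python
import numpy as np

def mape(reference_doses, test_doses):
    """Mean absolute percentage error of an algorithm's point doses
    relative to the Monte Carlo reference."""
    ref, test = np.asarray(reference_doses), np.asarray(test_doses)
    return 100.0 * np.mean(np.abs((test - ref) / ref))

mc  = np.array([2.00, 1.80, 0.95, 1.40])   # hypothetical MC reference points (Gy)
aaa = np.array([2.05, 1.72, 0.99, 1.45])   # hypothetical algorithm doses (Gy)
print(f"MAPE = {mape(mc, aaa):.1f}%")
```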
Balouchestani, Mohammadreza; Krishnan, Sridhar
2014-01-01
Long-term recording of electrocardiogram (ECG) signals plays an important role in health care systems for diagnostic and treatment purposes in heart disease. Clustering and classification of the collected data are essential for detecting concealed information in the P-QRS-T waves of long-term ECG recordings. Currently used algorithms have two drawbacks: (1) clustering and classification cannot be done in real time; and (2) they suffer from high energy consumption and sampling load. These drawbacks motivated us to develop a novel optimized clustering algorithm that can easily scan large ECG datasets and thereby enable low-power, long-term ECG recording. In this paper, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory as a random sampling procedure. Two dimensionality reduction methods, Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC), followed by classification of the data using the K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers, are then applied to the proposed algorithm. We show that our algorithm based on PCA features in combination with the K-NN classifier performs better than the other methods. The proposed algorithm outperforms existing algorithms by increasing classification accuracy by 11%. In addition, the proposed algorithm achieves classification accuracies for the K-NN and PNN classifiers and a Receiver Operating Characteristic (ROC) area of 99.98%, 99.83%, and 99.75%, respectively.
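The pipeline, a compressed-sensing-style random projection for sampling, PCA for features, then a K-NN classifier, can be sketched end to end with scikit-learn. The signals and dimensions below are invented for illustration and stand in for real ECG beats.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Toy "ECG beats": two classes of 256-sample windows (synthetic)
def beats(freq, n):
    t = np.linspace(0, 1, 256)
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal((n, 256))

X = np.vstack([beats(5, 100), beats(9, 100)])
y = np.repeat([0, 1], 100)

# CS-style random projection as the sampling step (64 random measurements)
phi = rng.standard_normal((64, 256)) / np.sqrt(64)
X_cs = X @ phi.T

# PCA features, then K-NN classification
features = PCA(n_components=10).fit_transform(X_cs)
idx = rng.permutation(200)
train, test = idx[:150], idx[150:]
clf = KNeighborsClassifier(n_neighbors=5).fit(features[train], y[train])
print(f"accuracy = {clf.score(features[test], y[test]):.2f}")
```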
A point kernel algorithm for microbeam radiation therapy
NASA Astrophysics Data System (ADS)
Debus, Charlotte; Oelfke, Uwe; Bartzsch, Stefan
2017-11-01
Microbeam radiation therapy (MRT) is a treatment approach in radiation therapy where the treatment field is spatially fractionated into arrays of a few tens of micrometre wide planar beams of unusually high peak doses, separated by low dose regions several hundred micrometres wide. In preclinical studies, this treatment approach has proven to spare normal tissue more effectively than conventional radiation therapy, while being equally efficient in tumour control. So far, dose calculations in MRT, a prerequisite for future clinical applications, are based on Monte Carlo simulations. However, they are computationally expensive, since scoring volumes have to be small. In this article, a kernel-based dose calculation algorithm is presented that splits the calculation into photon- and electron-mediated energy transport, and performs the calculation of peak and valley doses in typical MRT treatment fields within a few minutes. Kernels are analytically calculated depending on the energy spectrum and material composition. In various homogeneous materials, peak doses, valley doses, and microbeam profiles are calculated and compared to Monte Carlo simulations. For a microbeam exposure of an anthropomorphic head phantom, calculated dose values are compared to measurements and Monte Carlo calculations. Except for regions close to material interfaces, calculated peak dose values match Monte Carlo results within 4% and valley dose values within 8% deviation. No significant differences are observed between profiles calculated by the kernel algorithm and Monte Carlo simulations. Measurements in the head phantom agree within 4% in the peak and within 10% in the valley region. The presented algorithm is attached to the treatment planning platform VIRTUOS. It was and is used for dose calculations in preclinical and pet-clinical trials at the biomedical beamline ID17 of the European Synchrotron Radiation Facility in Grenoble, France.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pacaci, P; Cebe, M; Mabhouti, H
Purpose: In this study, a dosimetric comparison of the field-in-field (FIF) and intensity-modulated radiation therapy (IMRT) techniques used for whole breast radiotherapy (WBRT) was made. The dosimetric accuracy of the treatment planning system (TPS) in predicting PTV and OAR doses was also investigated for the Anisotropic Analytical Algorithm (AAA) and Acuros XB (AXB) algorithms. Methods: Treatment plans for left-sided breast cancer using the two techniques were generated for a Rando phantom. FIF and IMRT plans were compared for doses in PTV and OAR volumes including the ipsilateral lung, heart, left anterior descending coronary artery, contralateral lung, and contralateral breast. PTV and OAR doses and homogeneity and conformality indexes were compared between the two techniques. The accuracy of the TPS dose calculation algorithms was tested by comparing PTV and OAR doses measured by thermoluminescent dosimetry with the doses calculated by the TPS using AAA and AXB for both techniques. Results: The IMRT plans had better conformality and homogeneity indexes than the FIF technique, and IMRT spared the OARs better than FIF. While both algorithms overestimated PTV doses, they underestimated all OAR doses. For the IMRT plan's PTV doses, overestimation of up to 2.5% was seen with the AAA algorithm, decreasing to 1.8% when the AXB algorithm was used. Based on the results of the anthropomorphic measurements of OAR doses, underestimation greater than 7% is possible with the AAA. The results from the AXB are much better than those of the AAA algorithm; however, underestimations of 4.8% were found at some points even for AXB. For the FIF plan, a similar trend was seen for PTV and OAR doses with both algorithms. Conclusion: When using the Eclipse TPS for breast cancer, the AXB algorithm should be used instead of the AAA algorithm, bearing in mind that the AXB may still underestimate all OAR doses.
2013-01-01
Background Connectivity map (cMap) is a recently developed dataset and algorithm for uncovering and understanding the treatment effects of small molecules on different cancer cell lines. It is widely used, but challenges remain for accurate prediction. Method Here, we propose BRCA-MoNet, a network of drug mode of action (MoA) specific to breast cancer, which is constructed based on the cMap dataset. A drug signature selection algorithm fitting the characteristics of cMap data, a quality control scheme, and a novel query algorithm based on BRCA-MoNet are developed for more effective prediction of drug effects. Result BRCA-MoNet was applied to three independent datasets obtained from the GEO database: an estradiol-treated MCF7 cell line, a BMS-754807-treated MCF7 cell line, and a breast cancer patient microarray dataset. In the first case, BRCA-MoNet could identify drug MoAs likely to share the same or the reverse treatment effect. In the second case, the results demonstrated the potential of BRCA-MoNet to reposition drugs and predict treatment effects for drugs not in the cMap data. In the third case, a possible procedure for personalized drug selection is showcased. Conclusions The results clearly demonstrate that the proposed BRCA-MoNet approach can provide increased prediction power to cMap and thus will be useful for the identification of new therapeutic candidates. Website: The web-based application can be accessed through the following link: http://compgenomics.utsa.edu/BRCAMoNet/ PMID:24564956
French national consensus clinical guidelines for the management of ulcerative colitis.
Peyrin-Biroulet, Laurent; Bouhnik, Yoram; Roblin, Xavier; Bonnaud, Guillaume; Hagège, Hervé; Hébuterne, Xavier
2016-07-01
Ulcerative colitis (UC) is a chronic inflammatory bowel disease of multifactorial etiology that primarily affects the colonic mucosa. The disease progresses over time, and clinical management guidelines should reflect its dynamic nature. There is limited evidence supporting UC management in specific clinical situations, thus precluding an evidence-based approach. To use a formal consensus method - the nominal group technique (NGT) - to develop a clinical practice expert opinion to outline simple algorithms and practices, optimize UC management, and assist clinicians in making treatment decisions. The consensus was developed by an expert panel of 37 gastroenterologists from various professional organizations with experience in UC management using the qualitative and iterative NGT, incorporating deliberations based on the European Crohn's and Colitis Organisation recommendations, recent reviews of scientific literature, and pertinent discussion topics developed by a steering committee. Examples of clinical cases for which there are limited evidence-based data from clinical trials were used. Two working groups proposed and voted on treatment algorithms that were then discussed and voted for by the nominal group as a whole, in order to reach a consensus. A clinical practice guideline covering management of the following clinical situations was developed: (i) moderate and severe UC; (ii) acute severe UC; (iii) pouchitis; (iv) refractory proctitis, in the form of treatment algorithms. Given the limited available evidence-based data, a formal consensus methodology was used to develop simple treatment guidelines for UC management in different clinical situations that is now accessible via an online application. Copyright © 2016 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.
Outcomes Measurement in Voice Disorders: Application of an Acoustic Index of Dysphonia Severity
ERIC Educational Resources Information Center
Awan, Shaheen N.; Roy, Nelson
2009-01-01
Purpose: The purpose of this experiment was to assess the ability of an acoustic model composed of both time-based and spectral-based measures to track change following voice disorder treatment and to serve as a possible treatment outcomes measure. Method: A weighted, four-factor acoustic algorithm consisting of shimmer, pitch sigma, the ratio of…
Putting health status guided COPD management to the test: protocol of the MARCH study.
Kocks, Janwillem; de Jong, Corina; Berger, Marjolein Y; Kerstjens, Huib A M; van der Molen, Thys
2013-07-04
Chronic Obstructive Pulmonary Disease (COPD) is a disease state characterized by airflow limitation that is not fully reversible and is usually progressive. Current guidelines, among them the Dutch guideline, have so far based their management strategy mainly on lung function impairment as measured by FEV1, while it is well known that FEV1 correlates poorly with almost all features of COPD that matter to patients. Based on this discrepancy, the GOLD 2011 update included symptoms and impact in its proposed treatment algorithm. Health status measures capture both symptoms and impact and could therefore be used as a standardized way to capture the information a doctor could otherwise only collect by careful history taking and recording. We hypothesize that a treatment algorithm based on a simple validated 10-item health status questionnaire, the Clinical COPD Questionnaire (CCQ), improves health status (as measured by the SGRQ) and classical COPD outcomes such as exacerbation frequency, patient satisfaction, and health care utilization compared to usual care based on guidelines. This hypothesis will be tested in a randomized controlled trial (RCT) following 330 patients for two years. During this period, general practitioners will receive treatment advice every four months based on the patient's health status (half of the patients, the intervention group) or on lung function (the remaining half, the usual care group). During the design process, the selection of outcomes and the development of the treatment algorithm were challenging. This is discussed in detail in the manuscript to facilitate researchers in designing future studies in this changing field of implementation research. Netherlands Trial Register, NTR2643.
Management of Central Venous Access Device-Associated Skin Impairment: An Evidence-Based Algorithm.
Broadhurst, Daphne; Moureau, Nancy; Ullman, Amanda J
Patients relying on central venous access devices (CVADs) for treatment are frequently complex. Many have multiple comorbid conditions, including renal impairment, nutritional deficiencies, hematologic disorders, or cancer. These conditions can impair the skin surrounding the CVAD insertion site, resulting in an increased likelihood of skin damage when standard CVAD management practices are employed. Supported by the World Congress of Vascular Access (WoCoVA), we developed an evidence- and consensus-based algorithm to improve CVAD-associated skin impairment (CASI) identification and diagnosis, guide clinical decision-making, and improve clinician confidence in managing CASI. A scoping review of relevant literature on CASI management was undertaken in March 2014, and the results were distributed to an international advisory panel. The CASI algorithm was developed by an international advisory panel of clinicians with expertise in wounds, vascular access, pediatrics, geriatric care, home care, intensive care, infection control, and acute care, using a 2-phase, modified Delphi technique. The algorithm focuses on identification and treatment of skin injury, exit site infection, noninfectious exudate, and skin irritation/contact dermatitis. It comprises 3 domains: assessment, skin protection, and patient comfort. External validation of the algorithm was achieved by a prospective pre- and posttest design, using clinical scenarios and self-reported clinician confidence (Likert scale), and incorporating algorithm feasibility and face validity endpoints. The CASI algorithm was found to significantly increase participants' confidence in the assessment and management of skin injury (P = .002), skin irritation/contact dermatitis (P = .001), and noninfectious exudate (P < .01). A majority of participants reported the algorithm as easy to understand (24/25; 96%) and as containing all necessary information (24/25; 96%). Twenty-four of 25 (96%) stated that they would recommend the tool to guide management of CASI.
A controlled variation scheme for convection treatment in pressure-based algorithm
NASA Technical Reports Server (NTRS)
Shyy, Wei; Thakur, Siddharth; Tucker, Kevin
1993-01-01
Convection effect and source terms are two primary sources of difficulties in computing turbulent reacting flows typically encountered in propulsion devices. The present work intends to elucidate the individual as well as the collective roles of convection and source terms in the fluid flow equations, and to devise appropriate treatments and implementations to improve our current capability of predicting such flows. A controlled variation scheme (CVS) has been under development in the context of a pressure-based algorithm, which has the characteristics of adaptively regulating the amount of numerical diffusivity, relative to central difference scheme, according to the variation in local flow field. Both the basic concepts and a pragmatic assessment will be presented to highlight the status of this work.
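A generic sketch of adaptively regulated numerical diffusivity (not the CVS formulation itself, which is specific to the authors' pressure-based algorithm) is a convective face flux that blends central differencing with first-order upwinding through a parameter beta:

```python
import numpy as np

def hybrid_convective_flux(u, velocity, beta):
    """1D convective face fluxes blending central differencing with
    first-order upwind: beta = 0 is pure central (no added diffusivity),
    beta = 1 is pure upwind (maximum added diffusivity). A generic sketch
    of adaptively regulated numerical diffusion, with beta assumed given;
    a scheme like CVS would set it from the local flow-field variation."""
    central = 0.5 * (u[:-1] + u[1:])
    upwind = np.where(velocity > 0, u[:-1], u[1:])
    return velocity * ((1 - beta) * central + beta * upwind)

u = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])   # a sharp front
for beta in (0.0, 0.5, 1.0):
    print(beta, hybrid_convective_flux(u, velocity=1.0, beta=beta))
```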
Sensitivity study of voxel-based PET image comparison to image registration algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yip, Stephen, E-mail: syip@lroc.harvard.edu; Chen, Aileen B.; Berbeco, Ross
2014-11-01
Purpose: Accurate deformable registration is essential for voxel-based comparison of sequential positron emission tomography (PET) images for proper adaptation of the treatment plan and treatment response assessment. The comparison may be sensitive to the method of deformable registration, as the optimal algorithm is unknown. This study investigated the impact of registration algorithm choice on therapy response evaluation. Methods: Sixteen patients with 20 lung tumors underwent pre- and post-treatment computed tomography (CT) and 4D FDG-PET scans before and after chemoradiotherapy. All CT images were coregistered using a rigid and ten deformable registration algorithms. The resulting transformations were then applied to the respective PET images. The tumor region defined by a physician on the registered PET images was classified into progressor, stable-disease, and responder subvolumes. In particular, voxels with standardized uptake value (SUV) decreases >30% were classified as responder, while voxels with SUV increases >30% were classified as progressor. All other voxels were considered stable-disease. The agreement of the subvolumes resulting from the different registration algorithms was assessed by the Dice similarity index (DSI). The coefficient of variation (CV) was computed to assess the variability of DSI between individual tumors. The root mean square difference (RMSrigid) of the rigidly registered CT images was used to measure the degree of tumor deformation. RMSrigid and DSI were correlated by the Spearman correlation coefficient (R) to investigate the effect of tumor deformation on DSI. Results: Median DSIrigid was found to be 72%, 66%, and 80% for the progressor, stable-disease, and responder subvolumes, respectively. Median DSIdeformable was 63%-84%, 65%-81%, and 82%-89%. The variability of DSI was substantial and similar for both rigid and deformable algorithms, with CV > 10% for all subvolumes. Tumor deformation had a moderate to significant impact on DSI for the progressor subvolume, with Rrigid = -0.60 (p = 0.01) and Rdeformable = -0.46 (p = 0.01-0.20) averaging over all deformable algorithms. For stable-disease subvolumes, the correlations were significant (p < 0.001) for all registration algorithms, with Rrigid = -0.71 and Rdeformable = -0.72. Progressor and stable-disease subvolumes resulting from rigid registration were in excellent agreement (DSI > 70%) for RMSrigid < 150 HU. However, tumor deformation was observed to have a negligible effect on DSI for responder subvolumes, with insignificant |R| < 0.26, p > 0.27. Conclusions: This study demonstrated that deformable algorithms cannot be arbitrarily chosen; different deformable algorithms can result in large differences in voxel-based PET image comparison. For low tumor deformation (RMSrigid < 150 HU), rigid and deformable algorithms yield similar results, suggesting deformable registration is not required for these cases.
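The voxel-labeling rule described in the methods, a ±30% SUV change on registered pre/post images, is straightforward to express; the SUV arrays below are synthetic.

```python
import numpy as np

def classify_response(suv_pre, suv_post, threshold=0.30):
    """Voxel-wise response labels from registered pre/post PET, as above:
    SUV decrease > 30% -> responder, increase > 30% -> progressor,
    everything else -> stable-disease."""
    change = (suv_post - suv_pre) / suv_pre
    labels = np.full(suv_pre.shape, "stable", dtype=object)
    labels[change <= -threshold] = "responder"
    labels[change >= threshold] = "progressor"
    return labels

pre = np.array([[8.0, 6.0], [4.0, 5.0]])    # synthetic pre-treatment SUVs
post = np.array([[4.5, 6.2], [6.0, 5.1]])   # synthetic post-treatment SUVs
print(classify_response(pre, post))
```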
A methodology for automatic intensity-modulated radiation treatment planning for lung cancer
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Li, Xiaoqiang; Quan, Enzhuo M.; Pan, Xiaoning; Li, Yupeng
2011-07-01
In intensity-modulated radiotherapy (IMRT), the quality of the treatment plan, which is highly dependent upon the treatment planner's level of experience, greatly affects the potential benefits of radiotherapy (RT). Furthermore, the planning process is complicated, requires a great deal of iteration, and is often the most time-consuming aspect of the RT process. In this paper, we describe a methodology to automate the IMRT planning process for lung cancer cases, the goal being to improve the quality and consistency of treatment planning. This methodology (1) automatically sets beam angles based on a beam-angle automation algorithm, (2) judiciously designs the planning structures, which were shown to be effective for all the lung cancer cases we studied, and (3) automatically adjusts the objectives of the objective function based on a parameter automation algorithm. We compared treatment plans created in this system (mdaccAutoPlan) based on the overall methodology with plans from a clinical trial of IMRT for lung cancer run at our institution. The 'autoplans' were consistently better than, or no worse than, the plans produced by experienced medical dosimetrists in terms of tumor coverage and normal tissue sparing. We conclude that the mdaccAutoPlan system can potentially improve the quality and consistency of treatment planning for lung cancer.
Kariuki, Jacob K; Gona, Philimon; Leveille, Suzanne G; Stuart-Shor, Eileen M; Hayman, Laura L; Cromwell, Jerry
2018-06-01
The non-lab Framingham algorithm, which substitutes body mass index for lipids in the laboratory-based (lab-based) Framingham algorithm, has been validated among African Americans (AAs). However, its cost-effectiveness and economic tradeoffs have not been evaluated. This study examines the incremental cost-effectiveness ratio (ICER) of two cardiovascular disease (CVD) prevention programs guided by the non-lab versus lab-based Framingham algorithm. We simulated the World Health Organization CVD prevention guidelines on a cohort of 2690 AA participants in the Atherosclerosis Risk in Communities (ARIC) cohort. Costs were estimated using Medicare fee schedules (diagnostic tests, drugs, and visits), Bureau of Labor Statistics data (RN wages), and estimates for managing incident CVD events. Outcomes were assumed to be true positive cases detected at a data-driven treatment threshold. Both algorithms had the best balance of sensitivity/specificity at the moderate risk threshold (>10% risk). Over 12 years, 82% and 77% of 401 incident CVD events were accurately predicted via the non-lab and lab-based Framingham algorithms, respectively. There were 20 fewer false negative cases with the non-lab approach, translating into over $900,000 in savings over 12 years. The ICER was -$57,153 for every extra CVD event prevented when using the non-lab algorithm. The approach guided by the non-lab Framingham strategy dominated the lab-based approach with respect to both costs and predictive ability. Consequently, the non-lab Framingham algorithm could potentially provide a highly effective screening tool at lower cost to address the high burden of CVD, especially among AAs and in resource-constrained settings where lab tests are unavailable. Copyright © 2017 Elsevier Inc. All rights reserved.
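The ICER arithmetic in this abstract can be reproduced directly. A small sketch, with the cost difference back-calculated from the reported savings and event counts (these derived inputs are assumptions, not figures quoted verbatim in the paper):

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: additional cost per additional
    CVD event accurately predicted."""
    return delta_cost / delta_effect

# 82% vs 77% of 401 incident events gives ~20 extra true positives; the
# reported ICER of -$57,153/event implies roughly $1.14M lower costs.
extra_events = round(0.82 * 401) - round(0.77 * 401)   # ~20
delta_cost = -1_143_060.0                              # assumed from savings
print(f"ICER = ${icer(delta_cost, extra_events):,.0f} per extra event")
```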
Shrestha, Swastina; Dave, Amish J; Losina, Elena; Katz, Jeffrey N
2016-07-07
Administrative health care data are frequently used to study disease burden and treatment outcomes in many conditions, including osteoarthritis (OA). OA is a chronic condition with significant disease burden, affecting over 27 million adults in the US. There are few studies examining the performance of administrative data algorithms to diagnose OA. The purpose of this study was to perform a systematic review of administrative data algorithms for OA diagnosis, and to evaluate the diagnostic characteristics of algorithms based on their restrictiveness and reference standards. Two reviewers independently screened English-language articles published in the Medline, Embase, PubMed, and Cochrane databases that used administrative data to identify OA cases. Each algorithm was classified as restrictive or less restrictive based on the number and type of administrative codes required to satisfy the case definition. We recorded the sensitivity and specificity of the algorithms and calculated the positive likelihood ratio (LR+) and positive predictive value (PPV) based on assumed OA prevalences of 0.1, 0.25, and 0.50. The search identified 7 studies that used 13 algorithms. Of these 13 algorithms, 5 were classified as restrictive and 8 as less restrictive. Restrictive algorithms had lower median sensitivity and higher median specificity compared to less restrictive algorithms when the reference standards were self-report and American College of Rheumatology (ACR) criteria. Algorithms compared to a reference standard of physician diagnosis had higher sensitivity and specificity than those compared to self-reported diagnosis or ACR criteria. Restrictive algorithms are more specific for OA diagnosis and can be used to identify cases when false positives have higher costs, e.g., interventional studies. Less restrictive algorithms are more sensitive and suited for studies that attempt to identify all cases, e.g., screening programs.
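The LR+ and prevalence-dependent PPV used in this review follow from standard definitions. A minimal sketch (the sensitivity/specificity values below are hypothetical, chosen to mimic a restrictive algorithm; the three prevalences are those stated in the abstract):

```python
def lr_positive(sens, spec):
    """Positive likelihood ratio: sensitivity / (1 - specificity)."""
    return sens / (1.0 - spec)

def ppv(sens, spec, prevalence):
    """Positive predictive value from sensitivity, specificity, prevalence."""
    tp = sens * prevalence
    fp = (1.0 - spec) * (1.0 - prevalence)
    return tp / (tp + fp)

sens, spec = 0.45, 0.97          # hypothetical restrictive algorithm
print(f"LR+ = {lr_positive(sens, spec):.1f}")
for prev in (0.10, 0.25, 0.50):  # assumed prevalences from the review
    print(f"prevalence {prev:.2f}: PPV = {ppv(sens, spec, prev):.2f}")
```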
Hoffman, Sarah R; Vines, Anissa I; Halladay, Jacqueline R; Pfaff, Emily; Schiff, Lauren; Westreich, Daniel; Sundaresan, Aditi; Johnson, La-Shell; Nicholson, Wanda K
2018-06-01
Women with symptomatic uterine fibroids can report a myriad of symptoms, including pain, bleeding, infertility, and psychosocial sequelae. Optimizing fibroid research requires the ability to enroll populations of women with image-confirmed symptomatic uterine fibroids. Our objective was to develop an electronic health record-based algorithm to identify women with symptomatic uterine fibroids for a comparative effectiveness study of medical or surgical treatments on quality-of-life measures. Using an iterative process and text-mining techniques, an effective computable phenotype algorithm, composed of demographic, clinical, and laboratory characteristics, was developed with reasonable performance. Such algorithms provide a feasible, efficient way to identify populations of women with symptomatic uterine fibroids for the conduct of large traditional or pragmatic trials and observational comparative effectiveness studies. Symptomatic uterine fibroids, due to menorrhagia, pelvic pain, bulk symptoms, or infertility, are a source of substantial morbidity for reproductive-age women. Comparing Treatment Options for Uterine Fibroids is a multisite registry study to compare the effectiveness of hormonal or surgical fibroid treatments on women's perceptions of their quality of life. Electronic health record-based algorithms are able to identify large numbers of women with fibroids, but additional work is needed to develop electronic health record algorithms that can identify women with symptomatic fibroids to optimize fibroid research. We sought to develop an efficient electronic health record-based algorithm that can identify women with symptomatic uterine fibroids in a large health care system for recruitment into large-scale observational and interventional research in fibroid management. We developed and assessed the accuracy of 3 algorithms to identify patients with symptomatic fibroids using an iterative approach. The data source was the Carolina Data Warehouse for Health, a repository for the health system's electronic health record data. In addition to International Classification of Diseases, Ninth Revision diagnosis and procedure codes and clinical characteristics, text data-mining software was used to derive information from imaging reports to confirm the presence of uterine fibroids. Results of each algorithm were compared with expert manual review to calculate the positive predictive values for each algorithm. Algorithm 1 was composed of the following criteria: (1) age 18-54 years; (2) either ≥1 International Classification of Diseases, Ninth Revision diagnosis codes for uterine fibroids or mention of fibroids using text-mined key words in imaging records or documents; and (3) no International Classification of Diseases, Ninth Revision or Current Procedural Terminology codes for hysterectomy and no reported history of hysterectomy. The positive predictive value was 47% (95% confidence interval 39-56%). Algorithm 2 required ≥2 International Classification of Diseases, Ninth Revision diagnosis codes for fibroids and positive text-mined key words and had a positive predictive value of 65% (95% confidence interval 50-79%). In algorithm 3, further refinements included ≥2 International Classification of Diseases, Ninth Revision diagnosis codes for fibroids on separate outpatient visit dates, the exclusion of women who had a positive pregnancy test within 3 months of their fibroid-related visit, and exclusion of incidentally detected fibroids during prenatal or emergency department visits. Algorithm 3 achieved a positive predictive value of 76% (95% confidence interval 71-81%). An electronic health record-based algorithm is capable of identifying cases of symptomatic uterine fibroids with moderate positive predictive value and may be an efficient approach for large-scale study recruitment. Copyright © 2018 Elsevier Inc. All rights reserved.
Chaswal, V; Thomadsen, B R; Henderson, D L
2012-02-21
The development and application of an automated 3D greedy heuristic (GH) optimization algorithm utilizing adjoint sensitivity fields for treatment planning to assess the advantage of directional interstitial prostate brachytherapy is presented. Directional and isotropic dose kernels generated using Monte Carlo simulations based on the Best Industries model 2301 I-125 source are utilized for treatment planning. The newly developed GH algorithm is employed to optimize the treatment plans for seven interstitial prostate brachytherapy cases using mixed sources (directional brachytherapy) and using only isotropic sources (conventional brachytherapy). All treatment plans resulted in V100 > 98% and D90 > 45 Gy for the target prostate region. For the urethra region the D10(Ur), D90(Ur) and V150(Ur), and for the rectum region the V100cc, D2cc, D90(Re) and V90(Re), are all reduced significantly when mixed-source brachytherapy employing directional sources is used. The simulations demonstrated that the use of directional sources in low dose-rate (LDR) brachytherapy of the prostate clearly benefits the sparing of the sensitive urethra and rectum structures from overdose. The time taken for a conventional treatment plan is less than three seconds, while the time taken for a mixed-source treatment plan is less than nine seconds, as tested on an Intel Core2 Duo 2.2 GHz processor with 1 GB RAM. The new 3D GH algorithm succeeds in generating a feasible LDR brachytherapy treatment planning solution with an extra degree of freedom, i.e. directionality, in very little time.
NASA Astrophysics Data System (ADS)
Chaswal, V.; Thomadsen, B. R.; Henderson, D. L.
2012-02-01
The development and application of an automated 3D greedy heuristic (GH) optimization algorithm utilizing adjoint sensitivity fields for treatment planning to assess the advantage of directional interstitial prostate brachytherapy is presented. Directional and isotropic dose kernels generated using Monte Carlo simulations based on the Best Industries model 2301 I-125 source are utilized for treatment planning. The newly developed GH algorithm is employed to optimize the treatment plans for seven interstitial prostate brachytherapy cases using mixed sources (directional brachytherapy) and using only isotropic sources (conventional brachytherapy). All treatment plans resulted in V100 > 98% and D90 > 45 Gy for the target prostate region. For the urethra region the D10Ur, D90Ur and V150Ur, and for the rectum region the V100cc, D2cc, D90Re and V90Re, are all reduced significantly when mixed-source brachytherapy employing directional sources is used. The simulations demonstrated that the use of directional sources in low dose-rate (LDR) brachytherapy of the prostate clearly benefits the sparing of the sensitive urethra and rectum structures from overdose. The time taken for a conventional treatment plan is less than three seconds, while the time taken for a mixed-source treatment plan is less than nine seconds, as tested on an Intel Core2 Duo 2.2 GHz processor with 1 GB RAM. The new 3D GH algorithm succeeds in generating a feasible LDR brachytherapy treatment planning solution with an extra degree of freedom, i.e. directionality, in very little time.
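The core of a greedy seed-placement heuristic is easy to illustrate. The sketch below is a generic coverage-driven greedy loop on precomputed per-seed dose kernels, not the published adjoint-based GH algorithm; the array sizes, prescription dose, and gain measure are assumptions for the demo:

```python
import numpy as np

def greedy_plan(candidate_doses, target_mask, rx_dose, max_seeds=80):
    """Greedily add the candidate seed whose dose kernel best reduces the
    remaining target underdose. candidate_doses: (n_candidates, n_voxels)."""
    total = np.zeros(candidate_doses.shape[1])
    chosen = []
    for _ in range(max_seeds):
        deficit = np.clip(rx_dose - total, 0, None) * target_mask
        gains = candidate_doses @ deficit      # adjoint-like importance score
        gains[chosen] = -np.inf                # each seed position used once
        best = int(np.argmax(gains))
        if gains[best] <= 0:                   # target fully covered
            break
        chosen.append(best)
        total += candidate_doses[best]
    return chosen, total

rng = np.random.default_rng(0)
doses = rng.random((200, 500)) * 0.5           # synthetic per-seed kernels
target = (rng.random(500) < 0.3).astype(float) # synthetic target mask
seeds, dose = greedy_plan(doses, target, rx_dose=5.0)
print(len(seeds), "seeds selected")
```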
Density-based clustering analyses to identify heterogeneous cellular sub-populations
NASA Astrophysics Data System (ADS)
Heaster, Tiffany M.; Walsh, Alex J.; Landman, Bennett A.; Skala, Melissa C.
2017-02-01
Autofluorescence microscopy of NAD(P)H and FAD provides functional metabolic measurements at the single-cell level. Here, density-based clustering algorithms were applied to metabolic autofluorescence measurements to identify cell-level heterogeneity in tumor cell cultures. The performance of the density-based clustering algorithm, DENCLUE, was tested in samples with known heterogeneity (co-cultures of breast carcinoma lines). DENCLUE was found to better represent the distribution of cell clusters compared to Gaussian mixture modeling. Overall, DENCLUE is a promising approach to quantify cell-level heterogeneity, and could be used to understand single cell population dynamics in cancer progression and treatment.
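DENCLUE itself is not part of the common Python scientific stack, but the density-versus-mixture comparison made in this abstract can be sketched with DBSCAN, a representative density-based method, against a Gaussian mixture; all data below are simulated sub-populations, and the feature names are illustrative:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Two simulated cell sub-populations in a 2D metabolic feature space
# (e.g., a redox-ratio-like feature vs. a lifetime-like feature)
pop_a = rng.normal([0.4, 1.2], 0.05, size=(150, 2))
pop_b = rng.normal([0.7, 0.8], 0.05, size=(100, 2))
cells = np.vstack([pop_a, pop_b])

db_labels = DBSCAN(eps=0.08, min_samples=10).fit_predict(cells)
gmm_labels = GaussianMixture(n_components=2, random_state=0).fit_predict(cells)
print("density-based clusters found:", len(set(db_labels) - {-1}))
print("mixture-model cluster sizes:", np.bincount(gmm_labels))
```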
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baeza, J.A.; Ureba, A.; Jimenez-Ortega, E.
Purpose: Although several radiotherapy research platforms exist, such as CERR (the most widely used and referenced), SlicerRT (which allows treatment plan comparison from various sources), and MMCTP (a full Monte Carlo treatment planning (MCTP) system), a full MCTP toolset is still needed that gives users complete control of calculation grids, interpolation methods, and filters in order to "fairly" compare results from different TPSs, with support for verification against experimental measurements. Methods: This work presents CARMEN, a MatLab-based platform including multicore and GPGPU-accelerated functions for loading RT data, designing treatment plans, and evaluating dose matrices and experimental data. CARMEN supports anatomic and functional imaging in DICOM format, as well as RTSTRUCT, RTPLAN and RTDOSE. It also contains numerous tools to accomplish the MCTP process, managing egs4phant and phase-space files. CARMEN's planning mode assists in designing IMRT, VMAT and MERT treatments via both inverse and direct optimization. The evaluation mode contains a comprehensive toolset (e.g. 2D/3D gamma evaluation, difference matrices, profiles, DVHs) to compare datasets from commercial TPSs, MC simulations (i.e. 3ddose files) and radiochromic film in a user-controlled manner. Results: CARMEN has been validated against commercial TPSs and well-established evaluation tools, showing coherent behavior of its multiple algorithms. Furthermore, the CARMEN platform has been used to generate competitive complex treatments that have been published in comparative studies. Conclusion: A new research-oriented MCTP platform with a customized validation toolset has been presented. Despite being coded in a high-level programming language, CARMEN is agile due to the use of parallel algorithms. The widespread use of MatLab provides straightforward access to CARMEN's algorithms for most researchers. Similarly, the platform can benefit from scientific developments of the MatLab community, such as filters and registration algorithms. Finally, CARMEN highlights the importance of grid and filtering control in treatment plan comparison.
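The 2D/3D gamma evaluation mentioned among CARMEN's tools is a standard dose-comparison metric. A naive (brute-force, global-normalization) 2D sketch, with toy dose planes and a 3%/3 mm criterion chosen for illustration:

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing, dd=0.03, dta=3.0):
    """Naive global 2D gamma: for each reference point, search evaluated
    points within the DTA reach for the minimum gamma value."""
    ny, nx = ref.shape
    dose_crit = dd * ref.max()                # global dose-difference criterion
    reach = int(np.ceil(dta / spacing))
    gamma = np.full(ref.shape, np.inf)
    for i in range(ny):
        for j in range(nx):
            for di in range(-reach, reach + 1):
                for dj in range(-reach, reach + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < ny and 0 <= jj < nx):
                        continue
                    dist2 = (di * spacing) ** 2 + (dj * spacing) ** 2
                    diff2 = (ev[ii, jj] - ref[i, j]) ** 2
                    g = np.sqrt(dist2 / dta ** 2 + diff2 / dose_crit ** 2)
                    gamma[i, j] = min(gamma[i, j], g)
    return (gamma <= 1).mean()

ref = np.outer(np.hanning(20), np.hanning(20)) * 2.0  # toy dose plane (Gy)
ev = ref * 1.02                                       # 2% scaled comparison
print(f"gamma pass rate: {gamma_pass_rate(ref, ev, spacing=2.0):.1%}")
```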
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vikraman, S; Ramu, M; Karrthick, Kp
Purpose: The purpose of this study was to validate COMPASS 3D dosimetry as a routine pretreatment verification tool against the commercially available CMS Monaco and Oncentra MasterPlan planning systems. Methods: Twenty esophagus patients were selected for this study. All patients underwent radical VMAT treatment on an Elekta linac, and plans were generated in Monaco v5.0 with the Monte Carlo (MC) dose calculation algorithm. COMPASS 3D dosimetry comprises an advanced dose calculation algorithm of collapsed cone convolution (CCC). To validate the CCC algorithm in COMPASS, the DICOM RT plans generated using the Monaco MC algorithm were transferred to the Oncentra MasterPlan v4.3 TPS. Only final dose calculations were performed using the CCC algorithm, without optimization, in the MasterPlan planning system. It is established that MC is an accurate algorithm, and some difference between the MC and CCC algorithms is expected; hence, CCC in COMPASS should be validated against another commercially available CCC algorithm. To use CCC as a pretreatment verification tool with reference to MC-generated treatment plans, CCC in OMP and CCC in COMPASS were validated using dose-volume-based indices such as D98 and D95 for target volumes and OAR doses. Results: The point doses for open beams were within 1% with reference to the Monaco MC algorithm. Comparison of CCC (OMP) vs CCC (COMPASS) showed mean differences of 1.82% ± 1.12 SD and 1.65% ± 0.67 SD for D98 and D95, respectively, for target coverage. A maximum point-dose difference of −2.15% ± 0.60 SD was observed in the target volume. A mean lung dose difference of −2.68% ± 1.67 SD was noticed between OMP and COMPASS. The maximum point-dose difference for the spinal cord was −1.82% ± 0.287 SD. Conclusion: In this study, the accuracy of the CCC algorithm in COMPASS 3D dosimetry was validated by comparison with the CCC algorithm in the OMP TPS. Dose calculation in COMPASS is feasible within < 2% in comparison with commercially available TPS algorithms.
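The D98/D95 indices used here are read off the dose-volume histogram: Dx is the dose received by at least x% of the structure volume, i.e., the (100 − x)th percentile of the voxel-dose distribution. A short sketch with synthetic voxel doses (the 1.7% offset is illustrative, not a measured value):

```python
import numpy as np

def dvh_dx(voxel_doses, x):
    """Dose received by at least x% of the structure volume (D98, D95, ...)."""
    return np.percentile(voxel_doses, 100 - x)

rng = np.random.default_rng(0)
d_mc = rng.normal(60.0, 1.5, 10_000)              # per-voxel doses, algorithm A
d_ccc = d_mc * rng.normal(1.017, 0.01, 10_000)    # algorithm B, ~1.7% hotter

for x in (98, 95):
    a, b = dvh_dx(d_mc, x), dvh_dx(d_ccc, x)
    print(f"D{x}: {a:.2f} vs {b:.2f} Gy, diff = {(b - a) / a * 100:+.2f}%")
```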
SU-E-T-605: Performance Evaluation of MLC Leaf-Sequencing Algorithms in Head-And-Neck IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jing, J; Lin, H; Chow, J
2015-06-15
Purpose: To investigate the efficiency of three multileaf collimator (MLC) leaf-sequencing algorithms, proposed by Galvin et al, Chen et al and Siochi et al, using external beam treatment plans for head-and-neck intensity-modulated radiation therapy (IMRT). Methods: IMRT plans for head-and-neck cases were created using the CORVUS treatment planning system. The plans were optimized and the fluence maps for all photon beams determined. The three MLC leaf-sequencing algorithms were used to calculate the final photon segmental fields and their monitor units in delivery. For comparison purposes, the maximum intensity of the fluence map was kept constant across plans. The number of beam segments and the total number of monitor units were calculated for the three algorithms. Results: From the numbers of beam segments and total monitor units, we found that the algorithm of Galvin et al had the largest total number of monitor units, about 70% larger than the other two algorithms. Moreover, both the algorithms of Galvin et al and Siochi et al produced relatively fewer beam segments than that of Chen et al. Although the numbers of beam segments and total monitor units calculated by the different algorithms varied with the head-and-neck plan, the algorithms of Galvin et al and Siochi et al performed well with a lower number of beam segments, though the algorithm of Galvin et al required a larger total number of monitor units than that of Siochi et al. Conclusion: Although the performance of a leaf-sequencing algorithm varies across IMRT plans with different fluence maps, an evaluation is possible based on the calculated numbers of beam segments and monitor units. In this study, the algorithm by Siochi et al was found to be more efficient for head-and-neck IMRT. The project was sponsored by the Fundamental Research Funds for the Central Universities (J2014HGXJ0094) and the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry.
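To make the segment/monitor-unit trade-off concrete, here is a deliberately simple step-and-shoot decomposition of an integer fluence map: repeatedly deliver the mask of all still-positive bixels at the minimum remaining intensity. This is a generic textbook-style scheme, not one of the three published algorithms, and it ignores MLC deliverability constraints (one open interval per leaf row):

```python
import numpy as np

def decompose(fluence):
    """Peel off uniform-intensity segments until the map is exhausted."""
    remaining = fluence.copy()
    segments, mus = [], []
    while remaining.any():
        mask = remaining > 0
        mu = int(remaining[mask].min())   # segment weight in monitor units
        segments.append(mask)
        mus.append(mu)
        remaining = remaining - mu * mask
    return segments, mus

f = np.array([[0, 2, 3, 1],
              [1, 3, 4, 2]])
segs, mus = decompose(f)
print(len(segs), "segments, total MU =", sum(mus))
```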
Ab initio molecular simulations with numeric atom-centered orbitals
NASA Astrophysics Data System (ADS)
Blum, Volker; Gehrke, Ralf; Hanke, Felix; Havu, Paula; Havu, Ville; Ren, Xinguo; Reuter, Karsten; Scheffler, Matthias
2009-11-01
We describe a complete set of algorithms for ab initio molecular simulations based on numerically tabulated atom-centered orbitals (NAOs) to capture a wide range of molecular and materials properties from quantum-mechanical first principles. The full algorithmic framework described here is embodied in the Fritz Haber Institute "ab initio molecular simulations" (FHI-aims) computer program package. Its comprehensive description should be relevant to any other first-principles implementation based on NAOs. The focus here is on density-functional theory (DFT) in the local and semilocal (generalized gradient) approximations, but an extension to hybrid functionals, Hartree-Fock theory, and MP2/GW electron self-energies for total energies and excited states is possible within the same underlying algorithms. An all-electron/full-potential treatment that is both computationally efficient and accurate is achieved for periodic and cluster geometries on equal footing, including relaxation and ab initio molecular dynamics. We demonstrate the construction of transferable, hierarchical basis sets, allowing the calculation to range from qualitative tight-binding like accuracy to meV-level total energy convergence with the basis set. Since all basis functions are strictly localized, the otherwise computationally dominant grid-based operations scale as O(N) with system size N. Together with a scalar-relativistic treatment, the basis sets provide access to all elements from light to heavy. Both low-communication parallelization of all real-space grid based algorithms and a ScaLapack-based, customized handling of the linear algebra for all matrix operations are possible, guaranteeing efficient scaling (CPU time and memory) up to massively parallel computer systems with thousands of CPUs.
Ma, Chifeng; Chen, Hung-I; Flores, Mario; Huang, Yufei; Chen, Yidong
2013-01-01
Connectivity map (cMap) is a recently developed dataset and algorithm for uncovering and understanding the treatment effects of small molecules on different cancer cell lines. It is widely used, but challenges remain for accurate prediction. Here, we propose BRCA-MoNet, a network of drug mode of action (MoA) specific to breast cancer, constructed based on the cMap dataset. A drug signature selection algorithm fitted to the characteristics of cMap data, a quality control scheme, and a novel query algorithm based on BRCA-MoNet are developed for more effective prediction of drug effects. BRCA-MoNet was applied to three independent data sets obtained from the GEO database: an estradiol-treated MCF7 cell line, a BMS-754807-treated MCF7 cell line, and a breast cancer patient microarray dataset. In the first case, BRCA-MoNet could identify drug MoAs likely to share the same or the reverse treatment effect. In the second case, the results demonstrated the potential of BRCA-MoNet to reposition drugs and predict treatment effects for drugs not in the cMap data. In the third case, a possible procedure for personalized drug selection was showcased. The results clearly demonstrated that the proposed BRCA-MoNet approach can provide increased prediction power to cMap and thus will be useful for the identification of new therapeutic candidates.
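Signature-based querying of the kind cMap popularized can be caricatured in a few lines. This toy score (mean normalized rank of "up" genes minus "down" genes) is only a stand-in for the actual cMap enrichment statistic and the BRCA-MoNet query algorithm; the gene names and ranks are invented:

```python
import numpy as np

def signature_score(expr_ranks, up_genes, down_genes):
    """Toy connectivity-style score: positive when the 'up' signature genes
    rank high and the 'down' genes rank low in a treatment profile
    (ranks normalized to [0, 1], 1 = most up-regulated)."""
    up = np.mean([expr_ranks[g] for g in up_genes])
    down = np.mean([expr_ranks[g] for g in down_genes])
    return up - down

profile = {"ESR1": 0.95, "PGR": 0.90, "MKI67": 0.10, "TP53": 0.55}
print(signature_score(profile, up_genes=["ESR1", "PGR"], down_genes=["MKI67"]))
```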
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rottmann, J; Berbeco, R; Keall, P
Purpose: To maximize normal tissue sparing for treatments requiring motion-encompassing margins. Motion mitigation techniques, including DMLC or couch tracking, can freeze tumor motion within the treatment aperture, potentially allowing smaller treatment margins and thus better sparing of normal tissue. To enable safe application of this concept in the clinic, we propose adapting margins dynamically in real time during radiotherapy delivery based on personalized tumor localization confidence. To demonstrate technical feasibility, we present a phantom study. Methods: We utilize a realistic anthropomorphic dynamic thorax phantom with a lung tumor model embedded close to the spine. The tumor, a 3D printout of a patient's GTV, is moved 15 mm peak-to-peak by diaphragm compression and monitored by continuous EPID imaging in real time. Two treatment apertures are created for each beam, one representing ITV-based and the other GTV-based margin expansion. A soft-tissue localization (STiL) algorithm utilizing the continuous EPID images is employed to freeze tumor motion within the treatment aperture by means of DMLC tracking. Depending on a tracking confidence measure (TCM), the treatment aperture is adjusted between the ITV and the GTV leaf positions. Results: We successfully demonstrate real-time personalized margin adjustment in a phantom study. We measured a system latency of about 250 ms, which we compensated by utilizing a respiratory motion prediction algorithm (ridge regression). With prediction in place, we observe tracking accuracies better than 1 mm. For TCM = 0 (as during startup) an ITV-based treatment aperture is chosen, for TCM = 1 a GTV-based aperture, and for 0 < TCM < 1 …
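The latency compensation described here (ridge regression predicting the tumor position ~250 ms ahead) is straightforward to sketch. The lag, frame rate, and sinusoidal trace below are assumptions for the demo, not the study's measured parameters:

```python
import numpy as np
from sklearn.linear_model import Ridge

def predict_ahead(trace, lag=10, horizon=6, alpha=1.0):
    """Predict the position 'horizon' samples ahead (~250 ms at ~24 fps)
    from the last 'lag' samples, using ridge regression."""
    n = len(trace) - lag - horizon
    X = np.array([trace[i:i + lag] for i in range(n)])
    y = np.array([trace[i + lag + horizon] for i in range(n)])
    model = Ridge(alpha=alpha).fit(X, y)
    return model.predict(trace[-lag:][np.newaxis, :])[0]

t = np.arange(600) / 24.0                   # ~24 Hz imaging frames
pos = 7.5 * np.sin(2 * np.pi * t / 4.0)     # 15 mm peak-to-peak, 4 s period
print(f"predicted position: {predict_ahead(pos):.2f} mm")
```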
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopal, A; Xu, H; Chen, S
Purpose: To compare the contour propagation accuracy of two deformable image registration (DIR) algorithms in the RayStation treatment planning system: the "Hybrid" algorithm, based on image intensities and anatomical information, and the "Biomechanical" algorithm, based on linear anatomical elasticity and finite element modeling. Methods: Both DIR algorithms were used for CT-to-CT deformation for 20 lung radiation therapy patients who underwent treatment plan revisions. Deformation accuracy was evaluated using landmark tracking to measure the target registration error (TRE) and inverse consistency error (ICE). The deformed contours were also evaluated against physician-drawn contours using Dice similarity coefficients (DSC). Contour propagation was qualitatively assessed using a visual quality score (VQS) assigned by physicians, and a refinement quality score (RQS) (… > 0.9 for lungs, > 0.85 for heart, > 0.8 for liver) and similar qualitative assessments (VQS < 0.35, RQS > 0.75 for lungs). When anatomical structures were used to control the deformation, the DSC improved more significantly for the biomechanical DIR than for the hybrid DIR, while the VQS and RQS improved only for the controlling structures. However, while the inclusion of controlling structures improved the TRE for the hybrid DIR, it increased the TRE for the biomechanical DIR. Conclusion: The hybrid DIR was found to perform slightly better than the biomechanical DIR based on lower TRE, while the DSC, VQS, and RQS studies yielded comparable results for both. The use of controlling structures showed considerable improvement in the hybrid DIR results and is recommended for clinical use in contour propagation.
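The landmark-based TRE used in this comparison is just the Euclidean distance between expert landmarks and their registered counterparts. A minimal sketch with synthetic landmark coordinates (the 1.5 mm error scale is illustrative):

```python
import numpy as np

def tre(landmarks_ref, landmarks_mapped):
    """Target registration error: per-landmark Euclidean distances (mm)."""
    return np.linalg.norm(landmarks_ref - landmarks_mapped, axis=1)

rng = np.random.default_rng(0)
ref = rng.uniform(0, 300, size=(40, 3))           # 40 landmark positions (mm)
mapped = ref + rng.normal(0, 1.5, size=(40, 3))   # after a hypothetical DIR
errors = tre(ref, mapped)
print(f"mean TRE = {errors.mean():.2f} mm, max = {errors.max():.2f} mm")
```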
ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.
Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L
2012-06-07
Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from those of water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy.
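For contrast with the Monte Carlo approach, the water-based TG43 1D (point-source) formalism this paper improves upon is a simple product of tabulated factors: D(r) = S_k Λ (r0/r)² g(r) φan(r) with r0 = 1 cm. A sketch with illustrative (not consensus) table values for an I-125 seed:

```python
import numpy as np

def tg43_point_dose_rate(r_cm, S_k, Lam, g, phi_an):
    """TG-43 1D dose rate: S_k * Lambda * (r0/r)^2 * g(r) * phi_an(r)."""
    r0 = 1.0
    return S_k * Lam * (r0 / r_cm) ** 2 * g(r_cm) * phi_an(r_cm)

# Illustrative radial dose and anisotropy tables (placeholder numbers)
r_tab   = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
g_tab   = np.array([1.05, 1.00, 0.85, 0.70, 0.45])
phi_tab = np.array([0.97, 0.95, 0.93, 0.92, 0.90])
g      = lambda r: np.interp(r, r_tab, g_tab)
phi_an = lambda r: np.interp(r, r_tab, phi_tab)

# S_k in U, Lambda in cGy/(h*U); both values assumed for the demo
print(f"{tg43_point_dose_rate(2.0, S_k=0.5, Lam=0.965, g=g, phi_an=phi_an):.4f} cGy/h")
```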
Herbal compatibility of traditional Chinese medical formulas for acquired immunodeficiency syndrome.
Cui, Meng; Li, Jinghua; Li, Haiyan; Song, Chunxin
2012-09-01
Because herbal compatibility is one of the most important reasons why Traditional Chinese Medicine (TCM) formulas are effective for acquired immunodeficiency syndrome (AIDS), our study aimed to determine the compatibility of herbs based on published AIDS clinical research in Chinese periodicals. To achieve this aim, we designed a new data-mining algorithm according to TCM data characteristics. We found 25 clinical AIDS studies, all using Chinese herbs for treatment, in the Traditional Chinese Medicine Database System, and information on diagnosis and treatment was extracted. To determine herbal compatibility, especially the formulae for herbal combinations, we proposed an improved association rule algorithm based on the frequency of combinations. In this algorithm, all compatibility relationships are displayed in a tree structure, from which the relationships between formulas and their derivation can be clearly inferred. Data analysis showed that approximately 100 herbs have been used for treating AIDS. Based on the whole herb compatibility tree, we derived a basic formula for AIDS: Huang Qi combined with Ren Shen, Fu Ling, Bai Zhu, Dang Gui, and Bai Shao. This formula, derived from most of the clinical prescriptions, was chosen by most clinicians for AIDS treatment. From the data mining we found that Qi replenishment and detoxification were the main treatment principles, which coincides with the AIDS pathological mechanism in which immune function is destroyed by the human immunodeficiency virus (HIV). Our data-mining results suggest that the core TCM treatment of AIDS is replenishing Qi and detoxification, by which AIDS patients' immune systems may be enhanced. The compatibility of Huang Qi with some frequently used herbs has shown real efficacy in clinical practice, which warrants pharmacological research in the future.
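The frequency-of-combination counting behind such an association rule analysis can be sketched simply; the prescriptions below are invented examples, and real mining would add support/confidence thresholds and the tree construction:

```python
from collections import Counter
from itertools import combinations

# Toy prescriptions (sets of herbs); real data would come from the 25 studies
prescriptions = [
    {"Huang Qi", "Ren Shen", "Fu Ling", "Bai Zhu"},
    {"Huang Qi", "Dang Gui", "Bai Shao"},
    {"Huang Qi", "Ren Shen", "Dang Gui"},
]

pair_counts = Counter()
for p in prescriptions:
    pair_counts.update(combinations(sorted(p), 2))

# The most frequent pairs suggest core compatibility relationships
for pair, n in pair_counts.most_common(3):
    print(pair, n)
```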
Multi-objective optimization of radiotherapy: distributed Q-learning and agent-based simulation
NASA Astrophysics Data System (ADS)
Jalalimanesh, Ammar; Haghighi, Hamidreza Shahabi; Ahmadi, Abbas; Hejazian, Hossein; Soltani, Madjid
2017-09-01
Radiotherapy (RT) is among the standard techniques for the treatment of cancerous tumours, and many cancer patients are treated in this manner. Treatment planning is the most important phase in RT and plays a key role in achieving therapy quality. As the goal of RT is to irradiate the tumour with adequately high levels of radiation while sparing neighbouring healthy tissues as much as possible, it is naturally a multi-objective problem. In this study, we propose an agent-based model of vascular tumour growth and of the effects of RT. We then use a multi-objective distributed Q-learning algorithm to find Pareto-optimal solutions for calculating the RT dynamic dose. We consider multiple objectives, and each group of optimizer agents attempts to optimise one of them, iteratively. At the end of each iteration, the agents combine their solutions to shape the Pareto front of the multi-objective problem. We propose a new approach by defining three treatment planning schemes based on different combinations of our objectives, namely invasive, conservative and moderate. In the invasive scheme, we emphasise killing cancer cells and pay less attention to irradiation effects on normal cells. In the conservative scheme, we take more care of normal cells and try to destroy cancer cells in a less aggressive manner. The moderate scheme stands in between. For implementation, each of these schemes is handled by one agent in the MDQ-learning algorithm, and the Pareto-optimal solutions are discovered through the collaboration of agents. By applying this methodology, we could reach Pareto treatment plans across different scenarios of tumour growth and RT. The proposed multi-objective optimisation algorithm generates robust solutions and finds the best treatment plan for different conditions.
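A single-objective tabular Q-learning loop conveys the core update used by each agent group; the state, dose actions, dynamics, and reward below are toy assumptions, and the paper's MDQ-learning additionally distributes objectives across agent groups and merges their solutions into a Pareto front:

```python
import random
from collections import defaultdict

ACTIONS = [0.0, 1.0, 2.0]        # dose per step (Gy), illustrative
Q = defaultdict(float)
alpha, gamma_, eps = 0.1, 0.95, 0.1

def choose(state):
    """Epsilon-greedy action selection."""
    if random.random() < eps:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard Q-learning temporal-difference update."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += alpha * (reward + gamma_ * best_next - Q[(state, action)])

# Toy episode under assumed dynamics: state is a coarse tumour-size bin
state = 5
for _ in range(100):
    a = choose(state)
    next_state = max(0, state - int(a) + random.choice([0, 1]))  # assumed growth/kill
    reward = -next_state - 0.1 * a       # assumed tumour-vs-toxicity trade-off
    update(state, a, reward, next_state)
    state = next_state
```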
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roper, J; Bradshaw, B; Godette, K
Purpose: To create a knowledge-based algorithm for prostate LDR brachytherapy treatment planning that standardizes plan quality using seed arrangements tailored to individual physician preferences while being fast enough for real-time planning. Methods: A dataset of 130 prior cases was compiled for a physician with an active prostate seed implant practice. Ten cases were randomly selected to test the algorithm. Contours from the 120 library cases were registered to a common reference frame. Contour variations were characterized on a point-by-point basis using principal component analysis (PCA). A test case was converted to PCA vectors using the same process and then compared with each library case using a Mahalanobis distance to evaluate similarity. Rank-order PCA scores were used to select the best-matched library case. The seed arrangement was extracted from the best-matched case and used as a starting point for planning the test case. Computational time was recorded, as were any subsequent modifications that required input from a treatment planner to achieve an acceptable plan. Results: The computational time required to register contours from a test case and evaluate PCA similarity across the library was approximately 10 s. Five of the ten test cases did not require any seed additions, deletions, or moves to obtain an acceptable plan. The remaining five test cases required on average 4.2 seed modifications. The time to complete manual plan modifications was less than 30 s in all cases. Conclusion: A knowledge-based treatment planning algorithm was developed for prostate LDR brachytherapy based on principal component analysis. Initial results suggest that this approach can be used to quickly create treatment plans that require few if any modifications by the treatment planner. In general, test-case plans have seed arrangements that are very similar to prior cases, and thus are inherently tailored to physician preferences.
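The PCA-plus-Mahalanobis matching step is compact to sketch. In PCA coordinates the components are uncorrelated, so the Mahalanobis distance reduces to variance-weighted squared component differences; the library size and feature dimensionality below are placeholders:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
library = rng.normal(size=(120, 300))   # flattened registered contour points
test = rng.normal(size=300)             # new case in the same reference frame

pca = PCA(n_components=10).fit(library)
lib_scores = pca.transform(library)
test_score = pca.transform(test[np.newaxis, :])[0]

# Mahalanobis distance in PCA space: diagonal covariance = explained variances
inv_var = 1.0 / pca.explained_variance_
d2 = ((lib_scores - test_score) ** 2 * inv_var).sum(axis=1)
best_match = int(np.argmin(d2))
print("best-matched library case:", best_match)
```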
A botulinum toxin A treatment algorithm for de novo management of torticollis and laterocollis
Kupsch, Andreas; Müngersdorf, Martina; Paus, Sebastian; Stenner, Andrea; Jost, Wolfgang
2011-01-01
Objectives Few studies have investigated the injection patterns of botulinum toxin type A for the treatment of heterogeneous forms of cervical dystonia (CD). This large, prospective, open-label, multicentre study aimed to evaluate the effectiveness and safety of 500 U botulinum toxin A, administered according to a standardised algorithm, for the initial treatment of the two most frequent forms of CD, predominantly torticollis and laterocollis. Design Patients (aged ≥18 years) with CD not previously treated with botulinum neurotoxin therapy were given one treatment with 500 U Dysport, according to a defined intramuscular injection algorithm based on clinical assessment of the direction of head deviation, occurrence of shoulder elevation, occurrence of tremor (all evaluated using the Tsui rating scale) and hypertrophy of the sternocleidomastoid muscle. Results In this study, 516 patients were enrolled, the majority of whom (95.0%) completed treatment. Most patients had torticollis (78.1%). At week 4, mean Tsui scores had significantly decreased by −4.01, −3.76 and −4.09 points in the total, torticollis and laterocollis populations, respectively. Symptom improvement was equally effective between groups. Tsui scores remained significantly below baseline at week 12 in both groups. Treatment was well tolerated; the most frequent adverse events were muscular weakness (13.8%), dysphagia (9.9%) and neck pain (6.6%). Conclusions Dysport 500 U is effective and well tolerated for the de novo management of a range of heterogeneous forms of CD, when using a standardised regimen that allows tailored dosing based on individual symptom assessment. Clinical trials information (NCT00447772; clinicaltrials.gov) PMID:22021883
Syndromic Algorithms for Detection of Gambiense Human African Trypanosomiasis in South Sudan
Palmer, Jennifer J.; Surur, Elizeous I.; Goch, Garang W.; Mayen, Mangar A.; Lindner, Andreas K.; Pittet, Anne; Kasparian, Serena; Checchi, Francesco; Whitty, Christopher J. M.
2013-01-01
Background Active screening by mobile teams is considered the best method for detecting human African trypanosomiasis (HAT) caused by Trypanosoma brucei gambiense, but the current funding context in many post-conflict countries limits this approach. As an alternative, non-specialist health care workers (HCWs) in peripheral health facilities could be trained to identify potential cases who need testing based on their symptoms. We explored the predictive value of syndromic referral algorithms to identify symptomatic cases of HAT among a treatment-seeking population in Nimule, South Sudan. Methodology/Principal Findings Symptom data from 462 patients (27 cases) presenting for a HAT test via passive screening over a 7-month period were collected to construct and evaluate over 14,000 four-item syndromic algorithms considered simple enough to be used by peripheral HCWs. For comparison, algorithms developed in other settings were also tested on our data, and a panel of expert HAT clinicians were asked to make referral decisions based on the symptom dataset. The best-performing algorithms consisted of three core symptoms (sleep problems, neurological problems and weight loss), with or without a history of oedema, cervical adenopathy or proximity to livestock. They had a sensitivity of 88.9–92.6%, a negative predictive value of up to 98.8% and a positive predictive value in this context of 8.4–8.7%. In terms of sensitivity, these outperformed more complex algorithms identified in other studies, as well as the expert panel. The best-performing algorithm is predicted to identify about 9/10 treatment-seeking HAT cases, though only 1/10 patients referred would test positive. Conclusions/Significance In the absence of regular active screening, improving referrals of HAT patients through other means is essential. Systematic use of syndromic algorithms by peripheral HCWs has the potential to increase case detection and would increase their participation in HAT programmes. The algorithms proposed here, though promising, should be validated elsewhere. PMID:23350005
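Exhaustively scoring four-item symptom algorithms, as this study did, is a small brute-force search. The sketch below enumerates four-symptom subsets on synthetic data; the "refer if any symptom present" rule, symptom count, and prevalence are illustrative assumptions (the study evaluated more than 14,000 rule variants, so its rule space was richer than plain subsets):

```python
from itertools import combinations
import numpy as np

def evaluate(cols, X, y):
    """Sensitivity, PPV, NPV of 'refer if any listed symptom is present'."""
    referred = X[:, list(cols)].any(axis=1)
    sens = referred[y == 1].mean()
    ppv = (y[referred] == 1).mean() if referred.any() else 0.0
    npv = (y[~referred] == 0).mean() if (~referred).any() else 1.0
    return sens, ppv, npv

rng = np.random.default_rng(0)
X = rng.random((462, 20)) < 0.2                 # 20 candidate symptoms
y = (rng.random(462) < 27 / 462).astype(int)    # ~27 cases among 462 patients

results = [(c, *evaluate(c, X, y)) for c in combinations(range(20), 4)]
best = max(results, key=lambda r: r[1])         # rank by sensitivity
print("best 4-item algorithm:", best[0], "sens =", round(best[1], 3))
```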
Conservative treatment of boundary interfaces for overlaid grids and multi-level grid adaptations
NASA Technical Reports Server (NTRS)
Moon, Young J.; Liou, Meng-Sing
1989-01-01
Conservative algorithms for boundary interfaces of overlaid grids are presented. The basic method is zeroth order, and is extended to a higher order method using interpolation and subcell decomposition. The present method, strictly based on a conservative constraint, is tested with overlaid grids for various applications of unsteady and steady supersonic inviscid flows with strong shock waves. The algorithm is also applied to a multi-level grid adaptation in which the next level finer grid is overlaid on the coarse base grid with an arbitrary orientation.
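The zeroth-order conservative transfer idea is easiest to see in one dimension: each receiving cell takes the overlap-weighted average of the donor cells it intersects, so the integral of the conserved quantity is preserved exactly. This 1D sketch is only an illustration of the constraint, not the paper's multidimensional overlaid-grid interface treatment:

```python
import numpy as np

def conservative_remap(edges_src, q_src, edges_dst):
    """Zeroth-order conservative transfer of cell averages between 1D grids."""
    q_dst = np.zeros(len(edges_dst) - 1)
    for j in range(len(edges_dst) - 1):
        lo, hi = edges_dst[j], edges_dst[j + 1]
        acc = 0.0
        for i in range(len(edges_src) - 1):
            overlap = max(0.0, min(hi, edges_src[i + 1]) - max(lo, edges_src[i]))
            acc += q_src[i] * overlap
        q_dst[j] = acc / (hi - lo)
    return q_dst

src = np.linspace(0, 1, 11)
dst = np.linspace(0, 1, 7)                       # overlaid, non-matching grid
q = np.sin(np.pi * 0.5 * (src[:-1] + src[1:]))   # donor cell averages
# The two integrals agree, demonstrating conservation across the interface
print(np.dot(q, np.diff(src)), np.dot(conservative_remap(src, q, dst), np.diff(dst)))
```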
NASA Astrophysics Data System (ADS)
Mahnam, Mehdi; Gendreau, Michel; Lahrichi, Nadia; Rousseau, Louis-Martin
2017-07-01
In this paper, we propose a novel heuristic algorithm for the volumetric-modulated arc therapy treatment planning problem, optimizing the trade-off between delivery time and treatment quality. We present a new mixed integer programming model in which the multi-leaf collimator leaf positions, gantry speed, and dose rate are determined simultaneously. Our heuristic is based on column generation; the aperture configuration is modeled in the columns and the dose distribution and time restriction in the rows. To reduce the number of voxels and increase the efficiency of the master model, we aggregate similar voxels using a clustering technique. The efficiency of the algorithm and the treatment quality are evaluated on a benchmark clinical prostate cancer case. The computational results show that a high-quality treatment is achievable using a four-thread CPU. Finally, we analyze the effects of the various parameters and two leaf-motion strategies.
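The voxel-aggregation step mentioned above (clustering similar voxels to shrink the master model) can be sketched with k-means on the rows of a dose-deposition matrix; the matrix sizes and cluster count are placeholders, and the paper's specific clustering technique is not claimed here:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
D = rng.random((2000, 60))      # voxel-by-beamlet dose-deposition rows

k = 100                          # aggregated "voxel clusters"
labels = KMeans(n_clusters=k, n_init=3, random_state=0).fit_predict(D)
D_small = np.vstack([D[labels == c].mean(axis=0) for c in range(k)])
weights = np.bincount(labels, minlength=k)   # cluster sizes for the master model
print(D.shape, "->", D_small.shape)
```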
Kang, J E; Yu, J M; Choi, J H; Chung, I-M; Pyun, W B; Kim, S A; Lee, E K; Han, N Y; Yoon, J-H; Oh, J M; Rhie, S J
2018-06-01
Drug therapies are critical for preventing secondary complications in acute coronary syndrome (ACS). The purpose of this study was to develop and apply a pharmaceutical care service (PCS) algorithm for ACS and to confirm its applicability through a prospective clinical trial. The ACS-PCS algorithm was developed according to extant evidence-based treatment and pharmaceutical care guidelines. Quality assurance was conducted through two methods: literature comparison and expert panel evaluation. The literature comparison was used to compare the content of the algorithm with the referenced guidelines. Expert evaluations were conducted by nine experts across 75 questionnaire items. A trial was conducted to confirm the algorithm's effectiveness. Seventy-nine patients were assigned to either the pharmacist-included multidisciplinary team care (MTC) group or the usual care (UC) group. The endpoints of the trial were the prescription rates of two important drugs, readmission, emergency room (ER) visits and mortality. The main frame of the algorithm was structured around three tasks: medication reconciliation, medication optimization and transition of care. The contents and context of the algorithm were compliant with class I recommendations and the main service items from the evidence-based guidelines. Opinions from the expert panel were mostly positive. There were significant differences in beta-blocker prescription rates over the overall period (P = .013) and in ER visits (four cases, 9.76%, P = .016) for the MTC group compared to the UC group. We developed a PCS algorithm for ACS based on the contents of evidence-based drug therapy and the core concept of pharmacist services. © 2018 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O’Connor, D; Nguyen, D; Voronenko, Y
Purpose: Integrated beam orientation and fluence map optimization is expected to be the foundation of robust automated planning, but existing heuristic methods do not promise global optimality. We aim to develop a new method for beam angle selection in 4π non-coplanar IMRT systems based on solving (globally) a single convex optimization problem, and to demonstrate the effectiveness of the method by comparison with a state-of-the-art column generation method for 4π beam angle selection. Methods: The beam angle selection problem is formulated as a large-scale convex fluence map optimization problem with an additional group sparsity term that encourages most candidate beams to be inactive. The optimization problem is solved using an accelerated first-order method, the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). The beam angle selection and fluence map optimization algorithm is used to create non-coplanar 4π treatment plans for several cases (including head and neck, lung, and prostate cases), and the resulting treatment plans are compared with 4π treatment plans created using the column generation algorithm. Results: In our experiments, the treatment plans created using the group sparsity method meet or exceed the dosimetric quality of plans created using the column generation algorithm, which was shown to be superior to clinical plans. Moreover, the group sparsity approach converges in about 3 minutes in these cases, compared with runtimes of a few hours for the column generation method. Conclusion: This work demonstrates the first non-greedy approach to non-coplanar beam angle selection, based on convex optimization, for 4π IMRT systems. The method given here improves both treatment plan quality and runtime compared with a state-of-the-art column generation algorithm. When the group sparsity term is set to zero, we obtain an excellent method for fluence map optimization, useful when beam angles have already been selected. NIH R43CA183390, NIH R01CA188300, Varian Medical Systems; part of this research took place while D. O'Connor was a summer intern at RefleXion Medical.
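The group-sparsity mechanism is worth seeing in code: the proximal operator of the group L2 penalty block-soft-thresholds each beam's fluence block, driving whole beams exactly to zero. A minimal FISTA sketch on a least-squares surrogate (the problem sizes are toy values, and the nonnegativity constraint is handled approximately by clipping after the prox rather than by an exact joint prox):

```python
import numpy as np

def prox_group_l2(x, groups, t):
    """Block soft-thresholding; zeroed blocks = deselected beam angles."""
    out = x.copy()
    for g in groups:
        nrm = np.linalg.norm(x[g])
        out[g] = 0.0 if nrm <= t else (1 - t / nrm) * x[g]
    return out

def fista(A, d, groups, lam, L, iters=300):
    """min_x 0.5*||A x - d||^2 + lam * sum_g ||x_g||_2, x >= 0 (approx.)."""
    x = np.zeros(A.shape[1])
    z = x.copy()
    tk = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ z - d)
        x_new = np.clip(prox_group_l2(z - grad / L, groups, lam / L), 0, None)
        tk_new = (1 + np.sqrt(1 + 4 * tk ** 2)) / 2
        z = x_new + (tk - 1) / tk_new * (x_new - x)
        x, tk = x_new, tk_new
    return x

rng = np.random.default_rng(0)
A, d = rng.random((40, 30)), rng.random(40)           # toy dose matrix, target
groups = [np.arange(i, i + 5) for i in range(0, 30, 5)]  # 6 candidate beams
L = np.linalg.norm(A, 2) ** 2                          # Lipschitz constant
x = fista(A, d, groups, lam=1.0, L=L)
print("active beams:", [i for i, g in enumerate(groups) if np.linalg.norm(x[g]) > 1e-8])
```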
Korean Medication Algorithm for Depressive Disorder: Comparisons with Other Treatment Guidelines
Wang, Hee Ryung; Park, Young-Min; Lee, Hwang Bin; Song, Hoo Rim; Jeong, Jong-Hyun; Seo, Jeong Seok; Lim, Eun-Sung; Hong, Jeong-Wan; Kim, Won; Jon, Duk-In; Hong, Jin-Pyo; Woo, Young Sup; Min, Kyung Joon
2014-01-01
We aimed to compare the recommendations of the Korean Medication Algorithm Project for Depressive Disorder 2012 (KMAP-DD 2012) with those of other recently published treatment guidelines for depressive disorder. We reviewed a total of five recently published global treatment guidelines and compared each treatment recommendation of the KMAP-DD 2012 with those in the other guidelines. For initial treatment recommendations, there were no significant major differences across guidelines. However, in the case of nonresponse or incomplete response to initial treatment, the second recommended treatment step varied across guidelines. For maintenance therapy, medication dose and duration differed among treatment guidelines. Further, there were several discrepancies in the recommendations for each subtype of depressive disorder across guidelines. For treatment in special populations, there were no significant differences in overall recommendations. This comparison shows that, by and large, the treatment recommendations of the KMAP-DD 2012 are similar to those of other treatment guidelines and reflect current changes in prescription patterns for depression based on accumulated research data. Further studies will be needed to address several issues identified in our review. PMID:24605117
Tian, Xin; Xin, Mingyuan; Luo, Jian; Liu, Mingyao; Jiang, Zhenran
2017-02-01
The selection of relevant genes for breast cancer metastasis is critical for the treatment and prognosis of cancer patients. Although much effort has been devoted to gene selection procedures using different statistical analysis methods or computational techniques, the interpretation of the variables in the resulting survival models has so far been limited. This article proposes a new Random Forest (RF)-based algorithm to identify important variables highly related to breast cancer metastasis, based on the importance scores of two variable selection algorithms: the mean decrease Gini (MDG) criterion of Random Forest and the GeneRank algorithm with protein-protein interaction (PPI) information. The new gene selection algorithm is called PPIRF. The improved prediction accuracy fully illustrates the reliability and high interpretability of the gene list selected by the PPIRF approach.
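Combining an impurity-based RF importance with a network-derived score can be sketched as follows; the data are synthetic, the GeneRank scores are replaced by random stand-ins, and the equal-weight combination is an assumption rather than the published PPIRF weighting:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                         # expression of 50 genes
y = (X[:, 3] + X[:, 17] + 0.5 * rng.normal(size=200) > 0).astype(int)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
mdg = rf.feature_importances_          # Gini-impurity-based importance

generank = rng.random(50)              # stand-in for PPI-network GeneRank scores
combined = 0.5 * mdg / mdg.max() + 0.5 * generank / generank.max()
print("top-ranked genes:", np.argsort(combined)[::-1][:5])
```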
Schramm, Catherine; Vial, Céline; Bachoud-Lévi, Anne-Catherine; Katsahian, Sandrine
2018-01-01
Heterogeneity in treatment efficacy is a major concern in clinical trials. Clustering may help to identify the treatment responders and the non-responders. In the context of longitudinal cluster analyses, sample size and variability of the times of measurement are the main issues with current methods. Here, we propose a new two-step method for the clustering of longitudinal data using an extended baseline. The first step relies on a piecewise linear mixed model for repeated measurements with a treatment-time interaction. The second step clusters the random predictions and considers several parametric (model-based) and non-parametric (partitioning, ascendant hierarchical clustering) algorithms. A simulation study compared all options of the method with the latent-class mixed model. The method with the two model-based algorithms was the most robust. The method with the non-parametric algorithms failed when there were unequal variances of the treatment effect between clusters or when the subgroups had unbalanced sample sizes. The latent-class mixed model failed when the between-patient slope variability was high. Two real data sets, on a neurodegenerative disease and on obesity, illustrate the method and show how clustering may help to identify the marker(s) of the treatment response. Applying the method in exploratory analyses, as a first stage before setting up stratified designs, can provide a better estimation of treatment effect in future clinical trials.
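A heavily simplified two-step sketch: estimate a per-subject post-baseline slope, then cluster the slopes. The mixed-model machinery of the actual method is replaced here by ordinary least squares per subject, and all data are simulated:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 60
t_obs = np.array([0.0, 0.0, 1.0, 3.0, 6.0])    # extended baseline: two pre-treatment visits
true_slope = np.r_[rng.normal(0.0, 0.2, 30), rng.normal(-1.0, 0.2, 30)]
Y = 10 + true_slope[:, None] * t_obs + rng.normal(0, 0.5, (n, 5))

# Step 1 (simplified): per-subject slope via least squares
slopes = np.array([np.polyfit(t_obs, y, 1)[0] for y in Y])
# Step 2: cluster the slopes into responders / non-responders
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(slopes.reshape(-1, 1))
print("cluster sizes:", np.bincount(labels))
```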
Detection theory applied to high intensity focused ultrasound (HIFU) treatment evaluation
NASA Astrophysics Data System (ADS)
Sanghvi, Narendra; Wunderlich, Adam; Seip, Ralf; Tavakkoli, Jahangir; Dines, Kris; Baily, Michael; Crum, Lawrence
2003-04-01
The aim of this work is to develop a HIFU treatment evaluation algorithm based on 1-D pulse/echo (P/E) ultrasound data taken during HIFU exposures. The algorithm is applicable to large treatment volumes resulting from several overlapping elementary exposures. Treatments consisted of multiple HIFU exposures with an on-time of 3 seconds each, spaced 3 mm apart, with an off-time of 6 seconds between HIFU exposures. The HIFU was paused for approximately 70 milliseconds every 0.5 seconds, while P/E data were acquired along the beam axis using a confocal imaging transducer. Data were collected from multiple in vitro and in vivo tissue treatments, including shams. The cumulative energy change in the P/E data was found for every HIFU exposure, as a function of depth. Subsequently, a likelihood ratio test with a fixed false alarm rate was used to derive a positive or negative lesion-creation decision for that position. For false alarm rates less than 5%, positive treatment outcomes were consistently detected for better than 90% of the HIFU exposures. In addition, the algorithm outcome correlated with the applied HIFU intensity level. Lesion formation was therefore successfully detected as a function of dosage. [Work supported by NIH SBIR Grant 2 R 44 CA 83244-02.]
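A fixed-false-alarm detector of this kind reduces, for a scalar test statistic, to thresholding at a quantile of the no-lesion (sham) distribution; for Gaussian shifts this is the Neyman-Pearson form of the likelihood ratio test. A sketch with simulated energy-change statistics (the distributions are assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
sham = rng.normal(0.0, 1.0, 500)       # cumulative energy change, sham exposures
treated = rng.normal(3.0, 1.0, 200)    # exposures with lesion formation

alpha = 0.05                            # fixed false-alarm rate (<5% in the paper)
threshold = np.quantile(sham, 1 - alpha)
detection_rate = (treated > threshold).mean()
print(f"threshold = {threshold:.2f}, detection rate = {detection_rate:.1%}")
```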
NASA Astrophysics Data System (ADS)
Tang, Xiaoli; Lin, Tong; Jiang, Steve
2009-09-01
We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
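The PCA-plus-ANN pipeline described here maps cleanly to a few lines of scikit-learn; the image size, component count, network width, and injected class signal below are all placeholder assumptions for the demo:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Simulated DRR-style training images: tumor inside (1) vs outside (0) aperture
X = rng.random((400, 64 * 64))
y = rng.integers(0, 2, 400)
X[y == 1, :100] += 0.5                 # crude class signal for the demo

clf = make_pipeline(
    PCA(n_components=20),              # dimensionality reduction, as in the paper
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```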
Multispectra CWT-based algorithm (MCWT) in mass spectra for peak extraction.
Hsueh, Huey-Miin; Kuo, Hsun-Chih; Tsai, Chen-An
2008-01-01
An important objective in mass spectrometry (MS) is to identify a set of biomarkers that can be used to distinguish patients between distinct treatments (or conditions) from tens or hundreds of spectra. A common two-step approach involving peak extraction and quantification is employed to identify the features of scientific interest. The selected features are then used for further investigation, to understand the underlying biological mechanism of an individual protein or to develop genomic biomarkers for early diagnosis. However, the use of inadequate or ineffective peak detection and peak alignment algorithms in the peak extraction step may lead to a high rate of false positives. It is also crucial to reduce the false positive rate when detecting biomarkers from tens or hundreds of spectra. Here a new procedure is introduced for feature extraction in mass spectrometry data that extends the continuous wavelet transform-based (CWT-based) algorithm to multiple spectra. The proposed multispectra CWT-based algorithm (MCWT) not only performs peak detection for multiple spectra but also carries out peak alignment at the same time. The authors' MCWT algorithm constructs a reference, which integrates information from multiple raw spectra, for feature extraction. The algorithm is applied to a SELDI-TOF mass spectra data set provided by CAMDA 2006 with known polypeptide m/z positions. This new approach is easy to implement, and it outperforms the existing peak extraction method from the Bioconductor PROcess package.
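The single-spectrum CWT peak detection that MCWT extends is available in SciPy. A sketch on a synthetic two-peak spectrum (the peak positions, widths, and noise level are invented; MCWT additionally builds a multi-spectra reference and aligns peaks, which this does not show):

```python
import numpy as np
from scipy.signal import find_peaks_cwt

x = np.linspace(0, 100, 2000)                       # pseudo m/z axis
spectrum = (np.exp(-(x - 30) ** 2 / 2)              # narrow peak at 30
            + 0.6 * np.exp(-(x - 70) ** 2 / 8)      # broader peak at 70
            + 0.05 * np.random.default_rng(0).normal(size=x.size))

peak_idx = find_peaks_cwt(spectrum, widths=np.arange(5, 60))
print("detected peaks near m/z:", np.round(x[peak_idx], 1))
```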
Dionne, Audrey; Meloche-Dumas, Léamarie; Desjardins, Laurent; Turgeon, Jean; Saint-Cyr, Claire; Autmizguine, Julie; Spigelblatt, Linda; Fournier, Anne; Dahdah, Nagib
2017-03-01
Diagnosis of Kawasaki disease (KD) can be challenging in the absence of a confirmatory test or pathognomonic finding, especially when the clinical criteria are incomplete. We recently proposed serum N-terminal pro-B-type natriuretic peptide (NT-proBNP) as an adjunctive diagnostic test. We retrospectively tested a new algorithm to help KD diagnosis based on NT-proBNP, coronary artery dilation (CAD) at onset, and abnormal serum albumin or C-reactive protein (CRP). The goal was to assess the performance of the algorithm and compare it with that of the 2004 American Heart Association (AHA)/American Academy of Pediatrics (AAP) algorithm. The algorithm was tested on 124 KD patients with NT-proBNP measured on admission at the present institutions between 2007 and 2013. Age at diagnosis was 3.4 ± 3.0 years, with a median of five diagnostic criteria; 55 of the 124 patients (44%) had incomplete KD. Coronary artery complications occurred in 64 (52%), with aneurysm in 14 (11%). Using this algorithm, 120/124 (97%) were to be treated: based on high NT-proBNP alone for 79 (64%), on onset CAD for 14 (11%), and on high CRP or low albumin for 27 (22%). Using the AHA/AAP algorithm, 22/47 (47%) of the eligible patients with incomplete KD would not have been referred for treatment, compared with 3/55 (5%) with the NT-proBNP algorithm (P < 0.001). This NT-proBNP-based algorithm is efficient for identifying and treating patients with KD, including those with incomplete KD. This study paves the way for a prospective validation trial of the algorithm. © 2016 Japan Pediatric Society.
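The decision flow in this abstract (treat on high NT-proBNP, else on onset CAD, else on high CRP or low albumin) reads naturally as a short rule cascade. A hedged sketch; the abstract does not give the cut-off values, so the thresholds below are placeholders, not the published criteria:

```python
def refer_for_treatment(nt_probnp, cad_at_onset, crp, albumin,
                        nt_cutoff=200.0, crp_cutoff=30.0, alb_cutoff=35.0):
    """Sketch of the three-branch referral flow; all cutoffs are assumed."""
    if nt_probnp >= nt_cutoff:      # high NT-proBNP alone
        return True
    if cad_at_onset:                # coronary artery dilation at onset
        return True
    if crp >= crp_cutoff or albumin <= alb_cutoff:
        return True
    return False

print(refer_for_treatment(350.0, False, 12.0, 40.0))   # -> True
```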
Can, Anil; Orgill, Dennis P; Dietmar Ulrich, J O; Mureau, Marc A M
2014-12-01
Because the vascular anatomy of the trapezius flap is highly variable, choosing the most appropriate flap type and design is essential to optimize outcomes and minimize postoperative complications. The aim of this study was to develop a surgical treatment algorithm for trapezius flap transfers. The medical files of all consecutive patients with a myocutaneous trapezius flap reconstruction of the head, neck, and upper back area treated at three different university medical centers between July 2001 and November 2012 were reviewed. There were 43 consecutive flaps performed in 38 patients with a mean follow-up time of 15 months (range, 1-48 months). Eleven patients had a mentosternal burn scar contracture (12 flaps), 12 patients (13 flaps) presented with cancer, and 15 patients (18 flaps) were suffering from chronic wounds due to failed previous reconstruction (n = 6), osteoradionecrosis (n = 1), chronic infection (n = 3), bronchopleural fistula (n = 3), and pressure sores (n = 2). The mean defect size was 152 cm². Sixteen flaps were based on the superficial cervical artery (SCA; type 2), 16 were based on the dorsal scapular artery (DSA; type 3), one was based on the intercostal arteries (type 4), and 10 flaps were based on both the DSA and SCA. Recipient-site complications requiring reoperation occurred in 16.3%, including one total flap failure (2.6%). The trapezius myocutaneous flap is a valuable option to reconstruct various head and neck and upper back defects. Based on our data, a surgical treatment algorithm was developed in an attempt to reduce variation in care and improve clinical outcomes. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
An Algorithm-Based Approach for Behavior and Disease Management in Children.
Meyer, Beau D; Lee, Jessica Y; Thikkurissy, S; Casamassimo, Paul S; Vann, William F
2018-03-15
Pharmacologic behavior management for dental treatment is an approach to provide invasive yet compassionate care for young children; it can facilitate the treatment of children who otherwise may not cooperate for traditional in-office care. Some recent highly publicized procedural sedation-related tragedies have drawn attention to risks associated with pharmacologic management. However, it remains widely accepted that, by adhering to proper guidelines, procedural sedation can assist in the provision of high-quality dental care while minimizing morbidity and mortality from the procedure. The purpose of this paper was to propose an algorithm for clinicians to consider when selecting a behavior and disease management strategy for early childhood caries. This algorithm will not ensure a positive outcome but can assist clinicians when counseling caregivers about risks, benefits, and alternatives. It also underscores best-safety practices.
An Algorithm for Creating Virtual Controls Using Integrated and Harmonized Longitudinal Data.
Hansen, William B; Chen, Shyh-Huei; Saldana, Santiago; Ip, Edward H
2018-06-01
We introduce a strategy for creating virtual control groups: cases generated through computer algorithms that, when aggregated, may serve as experimental comparators where live controls are difficult to recruit, such as when programs are widely disseminated and randomization is not feasible. We integrated and harmonized data from eight archived longitudinal adolescent-focused data sets spanning the decades from 1980 to 2010. Collectively, these studies examined numerous psychosocial variables and assessed past 30-day alcohol, cigarette, and marijuana use. Additional treatment and control group data from two archived randomized control trials were used to test the virtual control algorithm. Both randomized controlled trials (RCTs) assessed intentions, normative beliefs, and values as well as past 30-day alcohol, cigarette, and marijuana use. We developed an algorithm that used percentile scores from the integrated data set to create age- and gender-specific latent psychosocial scores. The algorithm matched treatment case observed psychosocial scores at pretest to create a virtual control case that figuratively "matured" based on age-related changes, holding the virtual case's percentile constant. Virtual controls matched treatment case occurrence, eliminating differential attrition as a threat to validity. Virtual case substance use was estimated from the virtual case's latent psychosocial score using logistic regression coefficients derived from analyzing the treatment group. Averaging across virtual cases created group estimates of prevalence. Two criteria were established to evaluate the adequacy of virtual control cases: (1) virtual control group pretest drug prevalence rates should match those of the treatment group and (2) virtual control group patterns of drug prevalence over time should match live controls. The algorithm successfully matched pretest prevalence for both RCTs. Increases in prevalence were observed, although there were discrepancies between live and virtual control outcomes. This study provides an initial framework for creating virtual controls using a step-by-step procedure that can now be revised and validated using other prevention trial data.
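The percentile-holding step described above is compact enough to sketch. The following is a minimal illustration of the idea, assuming hypothetical age-specific reference distributions and logistic coefficients (`ref_scores`, `b0`, and `b1` are invented stand-ins, not the study's harmonized data or fitted values):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical harmonized reference distributions of a latent psychosocial
# score, one per age group; stand-ins for the integrated archival data sets.
ref_scores = {age: rng.normal(loc=0.05 * age, scale=1.0, size=500)
              for age in range(11, 18)}
b0, b1 = -2.0, 0.8  # assumed logistic coefficients fitted on the treatment group

def virtual_control(pretest_score, pretest_age, followup_ages):
    """Hold the case's percentile constant and let the score 'mature' with age."""
    pct = (ref_scores[pretest_age] < pretest_score).mean()  # percentile at pretest
    latent = [np.quantile(ref_scores[a], pct) for a in followup_ages]
    # Convert each latent score into a past-30-day substance-use probability.
    return [1.0 / (1.0 + np.exp(-(b0 + b1 * z))) for z in latent]

# One treatment case observed at age 12, projected to ages 13-15.
print(virtual_control(pretest_score=0.3, pretest_age=12, followup_ages=[13, 14, 15]))
```

Averaging these per-case probabilities across all treatment cases would give the virtual control group's prevalence estimate.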
NASA Astrophysics Data System (ADS)
Yip, Stephen S. F.; Coroller, Thibaud P.; Sanford, Nina N.; Huynh, Elizabeth; Mamon, Harvey; Aerts, Hugo J. W. L.; Berbeco, Ross I.
2016-01-01
Change in PET-based textural features has shown promise in predicting cancer response to treatment. However, contouring tumour volumes on longitudinal scans is time-consuming. This study investigated the usefulness of contour propagation in texture analysis for the purpose of pathologic response prediction in esophageal cancer. Forty-five esophageal cancer patients underwent PET/CT scans before and after chemo-radiotherapy. Patients were classified into responders and non-responders after the surgery. Physician-defined tumour ROIs on pre-treatment PET were propagated onto the post-treatment PET using rigid and ten deformable registration algorithms. PET images were converted into 256 discrete values. Co-occurrence, run-length, and size zone matrix textures were computed within all ROIs. The relative difference of each texture at different treatment time-points was used to predict the pathologic responders. Their predictive value was assessed using the area under the receiver-operating-characteristic curve (AUC). Propagated ROIs from different algorithms were compared using the Dice similarity index (DSI). Contours propagated by the fast-demons, fast-free-form and rigid algorithms did not fully capture the high FDG uptake regions of tumours. Fast-demons propagated ROIs had the least agreement with other contours (DSI = 58%). Moderate to substantial overlap was found in the ROIs propagated by all other algorithms (DSI = 69%-79%). Rigidly propagated ROIs with co-occurrence texture failed to significantly differentiate between responders and non-responders (AUC = 0.58, q-value = 0.33), while the differentiation was significant with other textures (AUC = 0.71‒0.73, p < 0.009). Among the deformable algorithms, fast-demons (AUC = 0.68‒0.70, q-value < 0.03) and fast-free-form (AUC = 0.69‒0.74, q-value < 0.04) were the least predictive. ROIs propagated by all other deformable algorithms with any texture significantly predicted pathologic responders (AUC = 0.72‒0.78, q-value < 0.01). Propagated ROIs using deformable registration for all textures can lead to accurate prediction of pathologic response, potentially expediting the temporal texture analysis process. However, fast-demons, fast-free-form, and rigid algorithms should be applied with care due to their inferior performance compared to other algorithms.
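For readers unfamiliar with the Dice similarity index (DSI) used above to compare propagated ROIs, it reduces to a few lines on binary masks; the toy volumes below are illustrative only:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity index between two binary ROI masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Toy 3D ROIs: two partially overlapping boxes on a small grid.
a = np.zeros((20, 20, 20), bool); a[5:15, 5:15, 5:15] = True
b = np.zeros((20, 20, 20), bool); b[8:18, 5:15, 5:15] = True
print(f"DSI = {dice(a, b):.0%}")  # overlap of propagated vs. reference contour
```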
EEG-based "serious" games and monitoring tools for pain management.
Sourina, Olga; Wang, Qiang; Nguyen, Minh Khoa
2011-01-01
EEG-based "serious games" for medical applications attracted recently more attention from the research community and industry as wireless EEG reading devices became easily available on the market. EEG-based technology has been applied in anesthesiology, psychology, etc. In this paper, we proposed and developed EEG-based "serious" games and doctor's monitoring tools that could be used for pain management. As EEG signal is considered to have a fractal nature, we proposed and develop a novel spatio-temporal fractal based algorithm for brain state quantification. The algorithm is implemented with blobby visualization tools for patient monitoring and in EEG-based "serious" games. Such games could be used by patient even at home convenience for pain management as an alternative to traditional drug treatment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicolae, A; Department of Physics, Ryerson University, Toronto, ON; Lu, L
Purpose: A novel, automated algorithm for permanent prostate brachytherapy (PPB) treatment planning has been developed. The novel approach uses machine learning (ML), a form of artificial intelligence, to substantially decrease planning time while simultaneously retaining the clinical intuition of plans created by radiation oncologists. This study seeks to compare the ML algorithm against expert-planned PPB plans to evaluate the equivalency of dosimetric and clinical plan quality. Methods: Plan features were computed from historical high-quality PPB treatments (N = 100) and stored in a relational database (RDB). The ML algorithm matched new PPB features to a highly similar case in the RDB; this initial plan configuration was then further optimized using a stochastic search algorithm. PPB pre-plans (N = 30) generated using the ML algorithm were compared to plan variants created by an expert dosimetrist (RT) and radiation oncologist (MD). Planning time and pre-plan dosimetry were evaluated using a one-way Student's t-test and ANOVA, respectively (significance level = 0.05). Clinical implant quality was evaluated by expert PPB radiation oncologists as part of a qualitative study. Results: Average planning time was 0.44 ± 0.42 min compared to 17.88 ± 8.76 min for the ML algorithm and RT, respectively, a significant advantage [t(9), p = 0.01]. A post-hoc ANOVA [F(2,87) = 6.59, p = 0.002] using Tukey-Kramer criteria showed a significantly lower mean prostate V150% for the ML plans (52.9%) compared to the RT (57.3%) and MD (56.2%) plans. Preliminary qualitative study results indicate comparable clinical implant quality between RT and ML plans, with a trend towards preference for ML plans. Conclusion: PPB pre-treatment plans highly comparable to those of an expert radiation oncologist can be created using a novel ML planning model. The use of an ML-based planning approach is expected to translate into improved PPB accessibility and plan uniformity.
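The two-stage structure described (nearest-case matching in an RDB, then stochastic refinement) can be sketched generically. Everything below, including the feature vectors, the `score` objective, and the step size, is a hypothetical stand-in for the vendor's actual planner:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical relational database of 100 historical plans: a feature vector
# per case plus a stored seed configuration (all values illustrative).
rdb_features = rng.normal(size=(100, 6))
rdb_configs = rng.uniform(size=(100, 20))

def score(config):
    """Stand-in plan-quality objective; a real system would score dosimetry."""
    return np.abs(config - 0.5).sum()

def ml_preplan(new_features, iters=200, step=0.05):
    """Match the most similar historical case, then refine by random search."""
    i = np.argmin(np.linalg.norm(rdb_features - new_features, axis=1))
    best = rdb_configs[i].copy()
    best_val = score(best)
    for _ in range(iters):  # simple stochastic search around the matched plan
        cand = np.clip(best + rng.normal(scale=step, size=best.shape), 0, 1)
        if (v := score(cand)) < best_val:
            best, best_val = cand, v
    return best, best_val

plan, val = ml_preplan(rng.normal(size=6))
print(round(val, 3))
```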
Behavioral-Based Predictors of Workplace Violence in the Army STARRS
2014-10-01
This report describes the development of an actuarial risk algorithm predicting suicide in the 12 months after US Army soldier inpatient treatment of a psychiatric disorder, to target post-hospitalization interventions. Previous research has revealed that actuarial suicide prediction is substantially more accurate than unaided clinical judgment (Dawes RM, Faust D, Meehl PE. Clinical versus actuarial judgment. Science. 1989;243(4899):1668-1674; Grove WM, Zald DH, Lebow BS, Snitz BE, Nelson et al.).
Skull base osteomyelitis: current microbiology and management.
Spielmann, P M; Yu, R; Neeff, M
2013-01-01
Skull base osteomyelitis typically presents in an immunocompromised patient with severe otalgia and otorrhoea. Pseudomonas aeruginosa is the commonest pathogenic micro-organism, and reports of resistance to fluoroquinolones are now emerging, complicating management. We reviewed our experience of this condition, and of the local pathogenic organisms. A retrospective review from 2004 to 2011 was performed. Patients were identified by their admission diagnostic code, and computerised records examined. Twenty patients were identified. A facial palsy was present in 12 patients (60 per cent). Blood cultures were uniformly negative, and culture of ear canal granulations was non-diagnostic in 71 per cent of cases. Pseudomonas aeruginosa was isolated in only 10 (50 per cent) cases; one strain was resistant to ciprofloxacin but all were sensitive to ceftazidime. Two cases of fungal skull base osteomyelitis were identified. The mortality rate was 15 per cent. The patients' treatment algorithm is presented. Our treatment algorithm reflects the need for multidisciplinary input, early microbial culture of specimens, appropriate imaging, and prolonged and systemic antimicrobial treatment. Resolution of infection must be confirmed by close follow up and imaging.
Automatic burst detection for the EEG of the preterm infant.
Jennekens, Ward; Ruijs, Loes S; Lommen, Charlotte M L; Niemarkt, Hendrik J; Pasman, Jaco W; van Kranen-Mastenbroek, Vivianne H J M; Wijn, Pieter F F; van Pul, Carola; Andriessen, Peter
2011-10-01
To aid with prognosis and stratification of clinical treatment for preterm infants, a method for automated detection of bursts, interburst-intervals (IBIs) and continuous patterns in the electroencephalogram (EEG) is developed. Results are evaluated for preterm infants with normal neurological follow-up at 2 years. The detection algorithm (MATLAB®) for burst, IBI and continuous pattern is based on selection by amplitude, time span, number of channels and numbers of active electrodes. Annotations of two neurophysiologists were used to determine threshold values. The training set consisted of EEG recordings of four preterm infants with postmenstrual age (PMA, gestational age + postnatal age) of 29-34 weeks. Optimal threshold values were based on overall highest sensitivity. For evaluation, both observers verified detections in an independent dataset of four EEG recordings with comparable PMA. Algorithm performance was assessed by calculation of sensitivity and positive predictive value. The results of algorithm evaluation are as follows: sensitivity values of 90% ± 6%, 80% ± 9% and 97% ± 5% for burst, IBI and continuous patterns, respectively. Corresponding positive predictive values were 88% ± 8%, 96% ± 3% and 85% ± 15%, respectively. In conclusion, the algorithm showed high sensitivity and positive predictive values for bursts, IBIs and continuous patterns in preterm EEG. Computer-assisted analysis of EEG may allow objective and reproducible analysis for clinical treatment.
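The detection principle (selection by amplitude, duration, and number of active channels) lends itself to a short sketch. The thresholds below are invented placeholders; the study derived its actual values from expert annotations of training EEGs:

```python
import numpy as np

def detect_bursts(eeg, fs, amp_thr=30.0, min_dur=1.0, min_channels=2):
    """Flag intervals where at least `min_channels` channels exceed an
    amplitude threshold for at least `min_dur` seconds.
    `eeg` is a (channels x samples) array in microvolts."""
    active = (np.abs(eeg) > amp_thr).sum(axis=0) >= min_channels
    bursts, start = [], None
    for t, on in enumerate(active):
        if on and start is None:
            start = t
        elif not on and start is not None:
            if (t - start) / fs >= min_dur:
                bursts.append((start / fs, t / fs))
            start = None
    if start is not None and (len(active) - start) / fs >= min_dur:
        bursts.append((start / fs, len(active) / fs))
    return bursts  # list of (onset_s, offset_s)

rng = np.random.default_rng(0)
sig = rng.normal(0, 5, (4, 2560))                   # 4 channels, 10 s at 256 Hz
sig[:, 512:1024] += rng.normal(0, 60, (4, 512))     # injected high-amplitude burst
print(detect_bursts(sig, fs=256))
```

Interburst intervals then follow as the gaps between detected bursts, with continuous patterns corresponding to long stretches of sustained activity.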
Gerety, Gregg; Bebakar, Wan Mohamad Wan; Chaykin, Louis; Ozkaya, Mesut; Macura, Stanislava; Hersløv, Malene Lundgren; Behnke, Thomas
2016-05-01
This 26-week, multicenter, randomized, open-label, parallel-group, treat-to-target trial in adults with type 2 diabetes compared the efficacy and safety of treatment intensification algorithms with twice-daily (BID) insulin degludec/insulin aspart (IDegAsp). Patients randomized 1:1 to IDegAsp BID used either a 'Simple' algorithm (twice-weekly dose adjustments based on a single prebreakfast and pre-evening meal self-monitored plasma glucose [SMPG] measurement; IDegAsp[BIDSimple], n = 136) or a 'Stepwise' algorithm (once-weekly dose adjustments based on the lowest of 3 pre-breakfast and 3 pre-evening meal SMPG values; IDegAsp[BIDStep-wise], n = 136). After 26 weeks, mean change from baseline in glycated hemoglobin (HbA1c) with IDegAsp[BIDSimple] was noninferior to IDegAsp[BIDStep-wise] (-15 mmol/mol versus -14 mmol/mol; 95% confidence interval [CI] upper limit, <4 mmol/mol) (baseline HbA1c: 66.3 mmol/mol IDegAsp[BIDSimple] and 66.6 mmol/mol IDegAsp[BIDStep-wise]). The proportion of patients who achieved HbA1c <7.0% (<53 mmol/mol) at the end of the trial was 66.9% with IDegAsp[BIDSimple] and 62.5% with IDegAsp[BIDStep-wise]. Fasting plasma glucose levels were reduced with each titration algorithm (-1.51 mmol/L IDegAsp[BIDSimple] versus -1.95 mmol/L IDegAsp[BIDStep-wise]). Weight gain was 3.8 kg IDegAsp[BIDSimple] versus 2.6 kg IDegAsp[BIDStep-wise], and rates of overall confirmed hypoglycemia (5.16 episodes per patient-year of exposure [PYE] versus 8.93 PYE) and nocturnal confirmed hypoglycemia (0.78 PYE versus 1.33 PYE) were significantly lower with IDegAsp[BIDStep-wise] versus IDegAsp[BIDSimple]. There were no significant differences in insulin dose increments between groups. Treatment intensification with IDegAsp[BIDSimple] was noninferior to IDegAsp[BIDStep-wise]. Both titration algorithms were well tolerated; however, the more conservative step-wise algorithm led to less weight gain and fewer hypoglycemic episodes. Clinicaltrials.gov: NCT01680341.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yaparpalvi, R; Mynampati, D; Kuo, H
Purpose: To study the influence of the superposition beam model (AAA) and deterministic photon transport solver (Acuros XB) dose calculation algorithms on treatment plan quality metrics and on normal lung dose in lung SBRT. Methods: Treatment plans of 10 lung SBRT patients were randomly selected. Patients were prescribed a total dose of 50-54 Gy in 3-5 fractions (10 Gy × 5 or 18 Gy × 3). Plans were optimized for 6-MV beams using two arcs (VMAT). Doses were calculated using the AAA algorithm with heterogeneity correction. For each plan, plan quality metrics in the categories coverage, homogeneity, conformity, and gradient were quantified. Repeat dosimetry for these AAA treatment plans was performed using the AXB algorithm with heterogeneity correction for the same beam and MU parameters. Plan quality metrics were again evaluated and compared with the AAA plan metrics. For normal lung dose, V20 and V5 of (total lung - GTV) were evaluated. Results: The results are summarized in Supplemental Table 1. PTV volume was mean 11.4 (±3.3) cm³. Comparing RTOG 0813 protocol criteria for conformality, AXB plans yielded, on average, a similar PITV ratio (individual PITV ratio differences varied from -9 to +15%), reduced target coverage (-1.6%), and increased R50% (+2.6%). Comparing normal lung doses, the lung V20 (+3.1%) and V5 (+1.5%) were slightly higher for AXB plans compared to AAA plans. High-dose spillage ((V105%PD - PTV)/PTV) was slightly lower for AXB plans, but the low-dose spillage (D2cm) was similar between the two calculation algorithms. Conclusion: The AAA algorithm overestimates lung target dose. Routinely adopting AXB for dose calculations in lung SBRT planning may improve dose calculation accuracy, as AXB-based calculations have been shown to be closer to Monte Carlo based dose predictions in accuracy and with relatively faster computational time. For clinical practice, revisiting dose-fractionation in lung SBRT to correct for dose overestimates attributable to the algorithm may very well be warranted.
Sharma, Subhash; Ott, Joseph; Williams, Jamone; Dickow, Danny
2011-01-01
Monte Carlo dose calculation algorithms have the potential for greater accuracy than traditional model-based algorithms. This enhanced accuracy is particularly evident in regions of lateral scatter disequilibrium, which can develop during treatments incorporating small field sizes and low-density tissue. A heterogeneous slab phantom was used to evaluate the accuracy of several commercially available dose calculation algorithms, including Monte Carlo dose calculation for CyberKnife, Analytical Anisotropic Algorithm and Pencil Beam convolution for the Eclipse planning system, and convolution-superposition for the Xio planning system. The phantom accommodated slabs of varying density; comparisons between planned and measured dose distributions were accomplished with radiochromic film. The Monte Carlo algorithm provided the most accurate comparison between planned and measured dose distributions. In each phantom irradiation, the Monte Carlo predictions resulted in gamma analysis comparisons >97%, using acceptance criteria of 3% dose and 3-mm distance to agreement. In general, the gamma analysis comparisons for the other algorithms were <95%. The Monte Carlo dose calculation algorithm for CyberKnife provides more accurate dose distribution calculations in regions of lateral electron disequilibrium than commercially available model-based algorithms. This is primarily because Monte Carlo algorithms implicitly account for tissue heterogeneities; density scaling functions and/or effective depth correction factors are not required. Copyright © 2011 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
Yock, Adam D; Kim, Gwe-Ya
2017-09-01
To present the k-means clustering algorithm as a tool to address treatment planning considerations characteristic of stereotactic radiosurgery using a single isocenter for multiple targets. For 30 patients treated with stereotactic radiosurgery for multiple brain metastases, the geometric centroids and radii of each metastasis were determined from the treatment planning system. In-house software used these data with weighted and unweighted versions of the k-means clustering algorithm to group the targets to be treated with a single isocenter, and to position each isocenter. The algorithm results were evaluated using the within-cluster sum of squares as well as a minimum target coverage metric that considered the effect of target size. Both versions of the algorithm were applied to an example patient to demonstrate the prospective determination of the appropriate number and location of isocenters. Both weighted and unweighted versions of the k-means algorithm were applied successfully to determine the number and position of isocenters. Comparing the two, both the within-cluster sum of squares metric and the minimum target coverage metric resulting from the unweighted version were less than those from the weighted version. The average magnitudes of the differences were small (-0.2 cm² and 0.1% for the within-cluster sum of squares and minimum target coverage, respectively) but statistically significant (Wilcoxon signed-rank test, P < 0.01). The differences between the versions of the k-means clustering algorithm represented an advantage of the unweighted version for the within-cluster sum of squares metric, and an advantage of the weighted version for the minimum target coverage metric. While additional treatment planning considerations have a large influence on the final treatment plan quality, both versions of the k-means algorithm provide automatic, consistent, quantitative, and objective solutions to the tasks associated with SRS treatment planning using a single isocenter for multiple targets. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
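A minimal sketch of the grouping step follows, assuming target centroids and radius-derived weights (the exact weighting used in-house is not given in the abstract, so the weight definition here is an assumption); equal weights recover the unweighted version:

```python
import numpy as np

def weighted_kmeans(points, weights, k, iters=100, seed=0):
    """Weighted k-means over target centroids. In the paper's setting,
    `points` would be metastasis centroids and `weights` could derive
    from target radii (our assumption)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)].astype(float)
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)          # assign each target to an isocenter
        for j in range(k):
            m = labels == j
            if m.any():
                centers[j] = np.average(points[m], axis=0, weights=weights[m])
    return centers, labels

rng = np.random.default_rng(2)
mets = rng.uniform(-6, 6, (8, 3))          # 8 metastasis centroids (cm)
radii = rng.uniform(0.2, 1.0, 8)
iso, grouping = weighted_kmeans(mets, radii, k=2)
print(iso, grouping)
```

Sweeping `k` and comparing the within-cluster sum of squares against the coverage metric is one way to choose the number of isocenters prospectively.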
The Use of Information from Wrong Responses in Measuring Students’ Achievement.
1980-02-01
with an appropriate treatment. We may therefore consider treatment for purposes such as: changing one's self image; changing a person's attitude toward... Response time was shown to be a useful tool in helping to identify the underlying algorithm. Based on these results it seems necessary in measuring
[Scar prophylaxis and treatment].
Hammer-Hansen, Niels; Damsgaard, Tine Engberg; Rødgaard, Jes Christian
2015-10-12
Scarring is an expected result of trauma to the skin. Scars form a heterogeneous group, varying from small, white, non-elevated scars to hypertrophic scars and keloids. Many different algorithms for scar prophylaxis and treatment have been presented in the literature. We discuss different types of scar formation and recently published evidence-based guidelines regarding prophylaxis and treatment of scars, written by 24 experts on scar management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Uytven, Eric; Van Beek, Timothy; McCowan, Peter M.
2015-12-15
Purpose: Radiation treatments are trending toward delivering higher doses per fraction under stereotactic radiosurgery and hypofractionated treatment regimens. There is a need for accurate 3D in vivo patient dose verification using electronic portal imaging device (EPID) measurements. This work presents a model-based technique to compute full three-dimensional patient dose reconstructed from on-treatment EPID portal images (i.e., transmission images). Methods: EPID dose is converted to incident fluence entering the patient using a series of steps which include converting measured EPID dose to fluence at the detector plane and then back-projecting the primary source component of the EPID fluence upstream of the patient. Incident fluence is then recombined with predicted extra-focal fluence and used to calculate 3D patient dose via a collapsed-cone convolution method. This method is implemented in an iterative manner, although in practice it provides accurate results in a single iteration. The robustness of the dose reconstruction technique is demonstrated with several simple slab phantom and nine anthropomorphic phantom cases. Prostate, head and neck, and lung treatments are all included as well as a range of delivery techniques including VMAT and dynamic intensity modulated radiation therapy (IMRT). Results: Results indicate that the patient dose reconstruction algorithm compares well with treatment planning system computed doses for controlled test situations. For simple phantom and square field tests, agreement was excellent with a 2%/2 mm 3D chi pass rate ≥98.9%. On anthropomorphic phantoms, the 2%/2 mm 3D chi pass rates ranged from 79.9% to 99.9% in the planning target volume (PTV) region and 96.5% to 100% in the low dose region (>20% of prescription, excluding PTV and skin build-up region). Conclusions: An algorithm to reconstruct delivered patient 3D doses from EPID exit dosimetry measurements was presented. The method was applied to phantom and patient data sets, as well as for dynamic IMRT and VMAT delivery techniques. Results indicate that the EPID dose reconstruction algorithm presented in this work is suitable for clinical implementation.
Speeding up local correlation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kats, Daniel
2014-12-28
We present two techniques that can substantially speed up local correlation methods. The first avoids the expensive transformation of the electron-repulsion integrals from atomic orbitals to the virtual space. The second introduces an algorithm for the residual equations in the local perturbative treatment that, in contrast to the standard scheme, does not require holding the amplitudes or residuals in memory. It is shown that even an interpreter-based implementation of the proposed algorithm in the context of the local MP2 method is faster and requires less memory than the highly optimized variants of conventional algorithms.
Morriss, Richard; Marttunnen, Sarah; Garland, Anne; Nixon, Neil; McDonald, Ruth; Sweeney, Tim; Flambert, Heather; Fox, Richard; Kaylor-Hughes, Catherine; James, Marilyn; Yang, Min
2010-11-29
Around 40 per cent of patients with unipolar depressive disorder who are treated in secondary care mental health services do not respond to first- or second-line treatments for depression. Such patients have 20 times the suicide rate of the general population, and treatment response becomes harder to achieve and sustain the longer they remain depressed. Despite this, there are no randomised controlled trials of community-based service delivery interventions delivering both algorithm-based pharmacotherapy and psychotherapy for patients with chronic depressive disorder in secondary care mental health services who remain moderately or severely depressed after six months of treatment. Without such trials, evidence-based guidelines on services for such patients cannot be derived. Single-blind, individually randomised controlled trial of a specialist depression disorder team (psychiatrist and psychotherapist jointly assessing and providing algorithm-based drug and psychological treatment) versus usual secondary care treatment. We will recruit 174 patients with unipolar depressive disorder in secondary mental health services with a Hamilton Depression Rating Scale (HDRS) score ≥ 16 and global assessment of function (GAF) ≤ 60 after ≥ 6 months of treatment. The primary outcome measures will be the HDRS and GAF, supplemented by economic analysis including the EQ-5D and analysis of barriers to care, implementation and the process of care. Audits to benchmark both treatment arms against national standards of care will aid the interpretation of the results of the study. This trial will be the first to assess the effectiveness and implementation of a community-based specialist depression disorder team. The study has been specially designed as part of the CLAHRC Nottinghamshire, Derbyshire and Lincolnshire joint collaboration between university, health and social care organisations to provide information of direct relevance to decisions on commissioning, service provision and implementation.
Cavity control as a new quantum algorithms implementation treatment
NASA Astrophysics Data System (ADS)
AbuGhanem, M.; Homid, A. H.; Abdel-Aty, M.
2018-02-01
Based on recent experiments [Nature 449, 438 (2007) and Nature Physics 6, 777 (2010)], a new approach for realizing quantum gates for the design of quantum algorithms was developed. Accordingly, the operation times of such gates while functioning in algorithm applications depend on the number of photons present in their resonant cavities. Multi-qubit algorithms can be realized in systems in which the photon number is increased slightly over the qubit number. In addition, the time required for operation is considerably less than the dephasing and relaxation times of the systems. The contextual use of the photon number as a main control in the realization of any algorithm was demonstrated. The results indicate the possibility of a full integration into the realization of multi-qubit multiphoton states and its application in algorithm designs. Furthermore, this approach will lead to a successful implementation of these designs in future experiments.
Rothermundt, Christian; Bailey, Alexandra; Cerbone, Linda; Eisen, Tim; Escudier, Bernard; Gillessen, Silke; Grünwald, Viktor; Larkin, James; McDermott, David; Oldenburg, Jan; Porta, Camillo; Rini, Brian; Schmidinger, Manuela; Sternberg, Cora; Putora, Paul M
2015-09-01
With the advent of targeted therapies, many treatment options in the first-line setting of metastatic clear cell renal cell carcinoma (mccRCC) have emerged. Guidelines and randomized trial reports usually do not elucidate the decision criteria for the different treatment options. In order to extract the decision criteria for the optimal therapy for patients, we performed an analysis of treatment algorithms from experts in the field. Treatment algorithms for the treatment of mccRCC from experts of 11 institutions were obtained, and decision trees were deduced. Treatment options were identified and a list of unified decision criteria determined. The final decision trees were analyzed with a methodology based on diagnostic nodes, which allows for an automated cross-comparison of decision trees. The most common treatment recommendations were determined, and areas of discordance were identified. The analysis revealed heterogeneity in most clinical scenarios. The recommendations selected for first-line treatment of mccRCC included sunitinib, pazopanib, temsirolimus, interferon-α combined with bevacizumab, high-dose interleukin-2, sorafenib, axitinib, everolimus, and best supportive care. The criteria relevant for treatment decisions were performance status, Memorial Sloan Kettering Cancer Center risk group, only or mainly lung metastases, cardiac insufficiency, hepatic insufficiency, age, and "zugzwang" (composite of multiple, related criteria). In the present study, we used diagnostic nodes to compare treatment algorithms in the first-line treatment of mccRCC. The results illustrate the heterogeneity of the decision criteria and treatment strategies for mccRCC and how available data are interpreted and implemented differently among experts. The data provided in the present report should not be considered to serve as treatment recommendations for the management of treatment-naïve patients with multiple metastases from metastatic clear cell renal cell carcinoma outside a clinical trial; however, the data highlight the different treatment options and the criteria used to select them. The diversity in decision making and how results from phase III trials can be interpreted and implemented differently in daily practice are demonstrated. ©AlphaMed Press.
Kapanen, Mika K.; Hyödynmaa, Simo J.; Wigren, Tuija K.; Pitkänen, Maunu A.
2014-01-01
The accuracy of dose calculation is a key challenge in stereotactic body radiotherapy (SBRT) of the lung. We have benchmarked three photon beam dose calculation algorithms implemented in a commercial treatment planning system (TPS), Varian Eclipse: pencil beam convolution (PBC), anisotropic analytical algorithm (AAA), and Acuros XB (AXB). Dose distributions from full Monte Carlo (MC) simulations were regarded as a reference. In the first stage, for four patients with central lung tumors, treatment plans using 3D conformal radiotherapy (CRT) technique applying 6 MV photon beams were made using the AXB algorithm, with planning criteria according to the Nordic SBRT study group. The plans were recalculated (with same number of monitor units (MUs) and identical field settings) using BEAMnrc and DOSXYZnrc MC codes. The MC-calculated dose distributions were compared to corresponding AXB-calculated dose distributions to assess the accuracy of the AXB algorithm, to which then other TPS algorithms were compared. In the second stage, treatment plans were made for ten patients with 3D CRT technique using both the PBC algorithm and the AAA. The plans were recalculated (with same number of MUs and identical field settings) with the AXB algorithm, then compared to original plans. Throughout the study, the comparisons were made as a function of the size of the planning target volume (PTV), using various dose-volume histogram (DVH) and other parameters to quantitatively assess the plan quality. In the first stage also, 3D gamma analyses with threshold criteria 3%/3 mm and 2%/2 mm were applied. The AXB-calculated dose distributions showed a relatively high level of agreement in the light of 3D gamma analysis and DVH comparison against the full MC simulation, especially with large PTVs, but, with smaller PTVs, larger discrepancies were found. Gamma agreement index (GAI) values between 95.5% and 99.6% for all the plans with the threshold criteria 3%/3 mm were achieved, but 2%/2 mm threshold criteria showed larger discrepancies. The TPS algorithm comparison results showed large dose discrepancies in the PTV mean dose (D50%), nearly 60%, for the PBC algorithm, and differences of nearly 20% for the AAA, occurring also in the small PTV size range. This work suggests the application of independent plan verification when the AAA or the AXB algorithm are utilized in lung SBRT having PTVs smaller than 20-25 cc. The calculated data from this study can be used in converting the SBRT protocols based on type 'a' and/or type 'b' algorithms for the most recent generation type 'c' algorithms, such as the AXB algorithm.
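The gamma agreement index reported above counts the fraction of voxels whose gamma value is ≤1 for a given dose-difference/distance-to-agreement criterion. A plain brute-force sketch of a global 3D gamma pass rate on isotropic grids follows (a textbook implementation for illustration, not the study's analysis tool):

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing, dd=0.03, dta=3.0, cutoff=0.1):
    """Global 3D gamma (e.g. 3%/3 mm) between reference and evaluated dose
    grids with isotropic voxel `spacing` in mm; brute-force neighborhood search."""
    dmax = ref.max()
    reach = int(np.ceil(2 * dta / spacing))          # search radius of 2x DTA
    offsets = [(i, j, k)
               for i in range(-reach, reach + 1)
               for j in range(-reach, reach + 1)
               for k in range(-reach, reach + 1)
               if np.sqrt(i * i + j * j + k * k) * spacing <= 2 * dta]
    passed = total = 0
    for idx in np.ndindex(ref.shape):
        if ref[idx] < cutoff * dmax:                 # skip low-dose voxels
            continue
        total += 1
        best = np.inf                                # min squared gamma
        for o in offsets:
            p = (idx[0] + o[0], idx[1] + o[1], idx[2] + o[2])
            if any(c < 0 or c >= s for c, s in zip(p, ref.shape)):
                continue
            r2 = sum((c * spacing) ** 2 for c in o) / dta ** 2
            d2 = ((ev[p] - ref[idx]) / (dd * dmax)) ** 2
            best = min(best, r2 + d2)
        passed += best <= 1.0                        # gamma <= 1 passes
    return passed / total if total else float("nan")

ref = np.random.default_rng(0).uniform(0.5, 1.0, (12, 12, 12))
ev = ref * 1.01                                      # 1% global difference
print(f"GAI = {gamma_pass_rate(ref, ev, spacing=2.0):.1%}")
```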
Paudel, Moti R; Kim, Anthony; Sarfehnia, Arman; Ahmad, Sayed B; Beachey, David J; Sahgal, Arjun; Keller, Brian M
2016-11-08
A new GPU-based Monte Carlo dose calculation algorithm (GPUMCD), developed by the vendor Elekta for the Monaco treatment planning system (TPS), is capable of modeling dose for both a standard linear accelerator and an Elekta MRI linear accelerator. We have experimentally evaluated this algorithm for a standard Elekta Agility linear accelerator. A beam model was developed in the Monaco TPS (research version 5.09.06) using the commissioned beam data for a 6 MV Agility linac. A heterogeneous phantom representing several scenarios (tumor-in-lung, lung, and bone-in-tissue) was designed and built. Dose calculations in Monaco were done using both the current clinical Monte Carlo algorithm, XVMC, and the new GPUMCD algorithm. Dose calculations in a Pinnacle TPS were also produced using the collapsed cone convolution (CCC) algorithm with heterogeneity correction. Calculations were compared with the measured doses using an ionization chamber (A1SL) and Gafchromic EBT3 films for 2 × 2 cm², 5 × 5 cm², and 10 × 10 cm² field sizes. The percentage depth doses (PDDs) calculated by XVMC and GPUMCD in a homogeneous solid water phantom were within 2%/2 mm of film measurements and within 1% of ion chamber measurements. For the tumor-in-lung phantom, the calculated doses were within 2.5%/2.5 mm of film measurements for GPUMCD. For the lung phantom, doses calculated by all of the algorithms were within 3%/3 mm of film measurements, except for the 2 × 2 cm² field size, where the CCC algorithm underestimated the depth dose by ~5% in a larger extent of the lung region. For the bone phantom, all of the algorithms were equivalent and calculated dose to within 2%/2 mm of film measurements, except at the interfaces. Both GPUMCD and XVMC showed interface effects, which were more pronounced for GPUMCD and were comparable to film measurements, whereas the CCC algorithm modeled these effects poorly. © 2016 The Authors.
Upper cervical injuries: Clinical results using a new treatment algorithm
Joaquim, Andrei F.; Ghizoni, Enrico; Tedeschi, Helder; Yacoub, Alexandre R. D.; Brodke, Darrel S.; Vaccaro, Alexander R.; Patel, Alpesh A.
2015-01-01
Introduction: Upper cervical injuries (UCI) have a wide range of radiological and clinical presentations due to the unique, complex bony, ligamentous and vascular anatomy. We recently proposed a rational approach in an attempt to unify prior classification systems and guide treatment. In this paper, we evaluate the clinical results of our algorithm for UCI treatment. Materials and Methods: A prospective cohort series of patients with UCI was performed. The primary outcome was the AIS. Surgical treatment was proposed based on our protocol: ligamentous injuries (abnormal misalignment, facet perched or locked, increased atlanto-dens interval) were treated surgically. Bone fractures without ligamentous injuries were treated with a rigid cervical orthosis, with the exception of fractures in the dens base with risk factors for non-union. Results: Twenty-three patients treated initially conservatively had some follow-up (mean of 171 days, range from 60 to 436 days). All of them were neurologically intact. None of the patients developed a new neurological deficit. Fifteen patients were initially surgically treated (mean of 140 days of follow-up, ranging from 60 to 270 days). In the surgical group, preoperatively, 11 (73.3%) patients were AIS E, 2 (13.3%) AIS C and 2 (13.3%) AIS D. At the final follow-up, the American Spinal Injury Association (ASIA) score was: 13 (86.6%) AIS E and 2 (13.3%) AIS D. None of the patients had neurological worsening during the follow-up. Conclusions: This prospective cohort suggests that our UCI treatment algorithm can be safely used. Further prospective studies with longer follow-up are necessary to further establish its clinical validity and safety.
Thengumpallil, Sheeba; Germond, Jean-François; Bourhis, Jean; Bochud, François; Moeckli, Raphaël
2016-06-01
To investigate the impact of the Toshiba phase- and amplitude-sorting algorithms on margin strategies for free-breathing lung radiotherapy treatments in the presence of breathing variations. A 4D CT of a sphere inside a dynamic thorax phantom was acquired. The 4D CT was reconstructed according to the phase- and amplitude-sorting algorithms. The phantom was moved by reproducing amplitude, frequency, and mixed amplitude and frequency variations. Artefact analysis was performed for Mid-Ventilation and ITV-based strategies on the images reconstructed by the phase- and amplitude-sorting algorithms. The target volume deviation was assessed by comparing the target volume acquired during irregular motion to the volume acquired during regular motion. The amplitude-sorting algorithm shows reduced artefacts for amplitude-only variations, while the phase-sorting algorithm does so for frequency-only variations. For combined amplitude and frequency variations, both algorithms perform similarly. Most of the artefacts are blurring and incomplete structures. We found larger artefacts and volume differences for the Mid-Ventilation strategy with respect to the ITV strategy, resulting in a higher relative difference of the surface distortion value, which ranges between a maximum of 14.6% and a minimum of 4.1%. The amplitude-sorting algorithm is superior to the phase-sorting algorithm in reducing motion artefacts for amplitude variations, while phase sorting is superior for frequency variations. A proper choice of 4D CT sorting algorithm is important in order to reduce motion artefacts, especially if the Mid-Ventilation strategy is used. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shan, Bonan; Wang, Jiang; Deng, Bin; Wei, Xile; Yu, Haitao; Zhang, Zhen; Li, Huiyan
2016-07-01
This paper proposes an epilepsy detection and closed-loop control strategy based on the Particle Swarm Optimization (PSO) algorithm. The proposed strategy can effectively suppress the epileptic spikes in neural mass models, where the epileptiform spikes are recognized as the biomarkers of transitions from the normal (interictal) activity to the seizure (ictal) activity. In addition, the PSO algorithm shows capabilities of accurate estimation of the time evolution of key model parameters and practical detection of all the epileptic spikes. The estimation of unmeasurable parameters is improved significantly compared with the unscented Kalman filter. When the estimated excitatory-inhibitory ratio exceeds a threshold value, the epileptiform spikes can be inhibited immediately by adopting a proportional-integral controller. In addition, numerical simulations are carried out to illustrate the effectiveness of the proposed method as well as its potential value for model-based early seizure detection and closed-loop control treatment design.
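A generic PSO minimizer is easy to state; in the paper's setting the loss would compare neural-mass-model output with the recorded signal as a function of the unmeasurable parameters. The sketch below uses a toy quadratic identification problem and standard PSO constants (all values are assumptions):

```python
import numpy as np

def pso_minimize(loss, dim, n=30, iters=100, lo=-5.0, hi=5.0,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain particle swarm minimizer with inertia and two attraction terms."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.array([loss(p) for p in x])
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([loss(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest

# Toy identification problem: recover two 'model parameters' from data.
true = np.array([1.8, -0.6])
print(pso_minimize(lambda p: np.sum((p - true) ** 2), dim=2))
```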
Evaluation of GMI and PMI diffeomorphic-based demons algorithms for aligning PET and CT Images.
Yang, Juan; Wang, Hongjun; Zhang, You; Yin, Yong
2015-07-08
Fusion of anatomic information in computed tomography (CT) and functional information in 18F-FDG positron emission tomography (PET) is crucial for accurate differentiation of tumor from benign masses, designing radiotherapy treatment plans and staging of cancer. Although current PET and CT images can be acquired from a combined 18F-FDG PET/CT scanner, the two acquisitions are scanned separately and take a long time, which may induce potential global and local positional errors caused by respiratory motion or organ peristalsis. So registration (alignment) of whole-body PET and CT images is a prerequisite for their meaningful fusion. The purpose of this study was to assess the performance of two multimodal registration algorithms for aligning PET and CT images. The proposed gradient of mutual information (GMI)-based demons algorithm, which incorporated the GMI between two images as an external force to facilitate the alignment, was compared with the point-wise mutual information (PMI) diffeomorphic-based demons algorithm whose external force was modified by replacing the image intensity difference in the diffeomorphic demons algorithm with the PMI to make it appropriate for multimodal image registration. Eight patients with esophageal cancer(s) were enrolled in this IRB-approved study. Whole-body PET and CT images were acquired from a combined 18F-FDG PET/CT scanner for each patient. The modified Hausdorff distance (dMH) was used to evaluate the registration accuracy of the two algorithms. Of all patients, the mean values and standard deviations (SDs) of dMH were 6.65 (± 1.90) voxels and 6.01 (± 1.90) after the GMI-based demons and the PMI diffeomorphic-based demons registration algorithms respectively. Preliminary results on oncological patients showed that the respiratory motion and organ peristalsis in PET/CT esophageal images could not be neglected, although a combined 18F-FDG PET/CT scanner was used for image acquisition. The PMI diffeomorphic-based demons algorithm was more accurate than the GMI-based demons algorithm in registering PET/CT esophageal images.
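The modified Hausdorff distance used as the accuracy metric above is the larger of the two mean nearest-neighbour distances between point sets; a minimal sketch on hypothetical landmark sets:

```python
import numpy as np

def modified_hausdorff(A, B):
    """Modified Hausdorff distance between two point sets (rows are points)."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

# Toy example: landmark sets before/after registration (voxel coordinates).
rng = np.random.default_rng(0)
fixed = rng.uniform(0, 100, (50, 3))
moving = fixed + rng.normal(0, 2.0, (50, 3))   # residual misalignment
print(modified_hausdorff(fixed, moving))
```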
Schüpbach, Jörg; Gebhardt, Martin D.; Scherrer, Alexandra U.; Bisset, Leslie R.; Niederhauser, Christoph; Regenass, Stephan; Yerly, Sabine; Aubert, Vincent; Suter, Franziska; Pfister, Stefan; Martinetti, Gladys; Andreutti, Corinne; Klimkait, Thomas; Brandenberger, Marcel; Günthard, Huldrych F.
2013-01-01
Background: Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods: We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection of ≤12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident in dependence of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship 'Prevalence = Incidence x Duration' in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship 'incident = true incident + false incident', and also to the IIR derived from the BED incidence assay. Results: Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R² = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based IIR and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. Conclusions: IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts.
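The window derivation and the 'Prevalence = Incidence x Duration' step can be illustrated in a few lines. All numbers below are invented placeholders, not the published proportions or cohort counts:

```python
import numpy as np

# Illustrative data: months since infection vs. fraction of cases that a
# hypothetical algorithm still rules "incident".
t = np.arange(1, 13, dtype=float)                       # months
p = np.array([.95, .88, .80, .70, .60, .50, .40, .30, .20, .12, .05, .01])

slope, intercept = np.polyfit(t, p, 1)                  # linear fit of p(t)
window_days = (-intercept / slope) * 30.44              # x-intercept in days

# 'Prevalence = Incidence x Duration': with the window as duration, the
# incident infection rate follows from the fraction found recent in a cohort.
n_recent, n_notified = 120, 748                         # hypothetical cohort
iir = (n_recent / n_notified) / (window_days / 365.25)  # per person-year
print(round(window_days, 1), round(iir, 3))
```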
[Automatic Sleep Stage Classification Based on an Improved K-means Clustering Algorithm].
Xiao, Shuyuan; Wang, Bei; Zhang, Jian; Zhang, Qunfeng; Zou, Junzhong
2016-10-01
Sleep stage scoring is a hotspot in the fields of medicine and neuroscience. Visual inspection of sleep is laborious and the results may vary between clinicians. Automatic sleep stage classification algorithms can be used to reduce the manual workload. However, there are still limitations when they encounter complicated and changeable clinical cases. The purpose of this paper is to develop an automatic sleep staging algorithm based on the characteristics of actual sleep data. In the proposed improved K-means clustering algorithm, points were selected as the initial centers by using a concept of density, to avoid the randomness of the original K-means algorithm. Meanwhile, the cluster centers were updated according to the 'Three-Sigma Rule' during the iteration, to abate the influence of outliers. The proposed method was tested and analyzed on overnight sleep data of healthy persons and patients with sleep disorders after continuous positive airway pressure (CPAP) treatment. The automatic sleep stage classification results were compared with visual inspection by qualified clinicians, and the averaged accuracy reached 76%. With the analysis of the morphological diversity of sleep data, it was proved that the proposed improved K-means algorithm is feasible and valid for clinical practice.
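One plausible reading of the two modifications (density-based initialization and a Three-Sigma update rule) is sketched below; the radius parameter, the density-times-distance seeding, and the per-dimension outlier test are our interpretation, not the paper's published code:

```python
import numpy as np

def density_init(X, k, radius):
    """Pick initial centers at high-density points that are far apart,
    avoiding the randomness of standard k-means initialization."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    density = (d < radius).sum(axis=1)
    centers = [X[density.argmax()]]
    for _ in range(k - 1):
        dist = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[(density * dist).argmax()])   # dense and distant
    return np.array(centers, dtype=float)

def robust_kmeans(X, k, radius=1.0, iters=50):
    centers = density_init(X, k, radius)
    labels = np.zeros(len(X), int)
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        for j in range(k):
            pts = X[labels == j]
            if not len(pts):
                continue
            mu, sd = pts.mean(axis=0), pts.std(axis=0) + 1e-9
            inlier = (np.abs(pts - mu) <= 3 * sd).all(axis=1)  # Three-Sigma Rule
            centers[j] = pts[inlier].mean(axis=0) if inlier.any() else mu
    return centers, labels

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, .5, (100, 2)), rng.normal(4, .5, (100, 2)),
               rng.uniform(-2, 6, (5, 2))])            # two clusters + outliers
print(robust_kmeans(X, k=2)[0])
```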
Automatic delineation of tumor volumes by co-segmentation of combined PET/MR data
NASA Astrophysics Data System (ADS)
Leibfarth, S.; Eckert, F.; Welz, S.; Siegel, C.; Schmidt, H.; Schwenzer, N.; Zips, D.; Thorwarth, D.
2015-07-01
Combined PET/MRI may be highly beneficial for radiotherapy treatment planning in terms of tumor delineation and characterization. To standardize tumor volume delineation, an automatic algorithm for the co-segmentation of head and neck (HN) tumors based on PET/MR data was developed. Ten HN patient datasets acquired in a combined PET/MR system were available for this study. The proposed algorithm uses both the anatomical T2-weighted MR and FDG-PET data. For both imaging modalities tumor probability maps were derived, assigning each voxel a probability of being cancerous based on its signal intensity. A combination of these maps was subsequently segmented using a threshold level set algorithm. To validate the method, tumor delineations from three radiation oncologists were available. Inter-observer variabilities and variabilities between the algorithm and each observer were quantified by means of the Dice similarity index and a distance measure. Inter-observer variabilities and variabilities between observers and algorithm were found to be comparable, suggesting that the proposed algorithm is adequate for PET/MR co-segmentation. Moreover, taking into account combined PET/MR data resulted in more consistent tumor delineations compared to MR information only.
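A minimal sketch of the co-segmentation idea follows, assuming Gaussian intensity models for the tumour probability maps and a plain threshold in place of the paper's threshold level-set step (all intensity parameters and weights are invented):

```python
import numpy as np

def probability_map(img, mu, sigma):
    """Assign each voxel a tumour probability from its intensity; a Gaussian
    intensity model is assumed here purely for illustration."""
    return np.exp(-0.5 * ((img - mu) / sigma) ** 2)

def cosegment(pet, t2w, w_pet=0.6, thr=0.5):
    """Combine PET and T2w probability maps, then segment the combination."""
    p_pet = probability_map(pet, mu=8.0, sigma=2.0)     # SUV-like units
    p_mr = probability_map(t2w, mu=300.0, sigma=80.0)   # arbitrary units
    return w_pet * p_pet + (1.0 - w_pet) * p_mr >= thr

rng = np.random.default_rng(0)
pet = rng.normal(2.0, 1.0, (32, 32, 16)); pet[10:20, 10:20, 5:10] = 8.0
t2w = rng.normal(150., 30., (32, 32, 16)); t2w[10:20, 10:20, 5:10] = 300.0
print(cosegment(pet, t2w).sum(), "voxels labelled tumour")
```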
[Hand Therapy in the Treatment of Patients with CRPS].
Körbler, C; Pfau, M; Becker, F; Koester, U; Werdin, F
2015-06-01
In the modern treatment of CRPS, a multidisciplinary concept is firmly established (MMPT, multimodal pain therapy). Besides medical therapy and psychotherapy, physiotherapy and occupational therapy count as basic treatment options. Although physiotherapy and occupational therapy (referred to below as hand therapy) are the most important basic treatments, the therapy is hardly standardised and there are few scientific investigations concerning their application. The purpose of this paper is therefore to present the applied hand-therapeutic techniques with regard to function/performance, application and effectiveness, and to derive a suitable treatment algorithm. The techniques used in hand therapy are presented and reviewed with regard to their effectiveness by means of a literature search. It turns out that exercise therapy, manual therapy, graded motor imagery, CO2 baths and occupational therapy have a proven benefit for patients. Although reliable evidence-based data are lacking for many of the treatments, a treatment algorithm was established; there remains a strong need for further investigation of therapeutic effectiveness in the treatment of CRPS. © Georg Thieme Verlag KG Stuttgart · New York.
Bellmunt, Joaquim; Calvo, Emiliano; Castellano, Daniel; Climent, Miguel Angel; Esteban, Emilio; García del Muro, Xavier; González-Larriba, José Luis; Maroto, Pablo; Trigo, José Manuel
2009-03-01
For almost the last two decades, interleukin-2 and interferon-alpha have been the only systemic treatment options available for metastatic renal cell carcinoma. However, in recent years, five new targeted therapies namely sunitinib, sorafenib, temsirolimus, everolimus and bevacizumab have demonstrated clinical activity in these patients. With the availability of new targeted agents that are active in this disease, there is a need to continuously update the treatment algorithm of the disease. Due to the important advances obtained, the Spanish Oncology Genitourinary Group (SOGUG) has considered it would be useful to review the current status of the disease, including the genetic and molecular biology factors involved, the current predicting models for development of metastases as well as the role of surgery, radiotherapy and systemic therapies in the early- or late management of the disease. Based on this previous work, a treatment algorithm was developed.
Shinnar, Shlomo; Gloss, David; Alldredge, Brian; Arya, Ravindra; Bainbridge, Jacquelyn; Bare, Mary; Bleck, Thomas; Dodson, W. Edwin; Garrity, Lisa; Jagoda, Andy; Lowenstein, Daniel; Pellock, John; Riviello, James; Sloan, Edward; Treiman, David M.
2016-01-01
CONTEXT: The optimal pharmacologic treatment for early convulsive status epilepticus is unclear. OBJECTIVE: To analyze efficacy, tolerability and safety data for anticonvulsant treatment of children and adults with convulsive status epilepticus and use this analysis to develop an evidence-based treatment algorithm. DATA SOURCES: Structured literature review using MEDLINE, Embase, Current Contents, and Cochrane library supplemented with article reference lists. STUDY SELECTION: Randomized controlled trials of anticonvulsant treatment for seizures lasting longer than 5 minutes. DATA EXTRACTION: Individual studies were rated using predefined criteria and these results were used to form recommendations, conclusions, and an evidence-based treatment algorithm. RESULTS: A total of 38 randomized controlled trials were identified, rated and contributed to the assessment. Only four trials were considered to have class I evidence of efficacy. Two studies were rated as class II and the remaining 32 were judged to have class III evidence. In adults with convulsive status epilepticus, intramuscular midazolam, intravenous lorazepam, intravenous diazepam and intravenous phenobarbital are established as efficacious as initial therapy (Level A). Intramuscular midazolam has superior effectiveness compared to intravenous lorazepam in adults with convulsive status epilepticus without established intravenous access (Level A). In children, intravenous lorazepam and intravenous diazepam are established as efficacious at stopping seizures lasting at least 5 minutes (Level A) while rectal diazepam, intramuscular midazolam, intranasal midazolam, and buccal midazolam are probably effective (Level B). No significant difference in effectiveness has been demonstrated between intravenous lorazepam and intravenous diazepam in adults or children with convulsive status epilepticus (Level A). Respiratory and cardiac symptoms are the most commonly encountered treatment-emergent adverse events associated with intravenous anticonvulsant drug administration in adults with convulsive status epilepticus (Level A). The rate of respiratory depression in patients with convulsive status epilepticus treated with benzodiazepines is lower than in patients with convulsive status epilepticus treated with placebo, indicating that respiratory problems are an important consequence of untreated convulsive status epilepticus (Level A). When both are available, fosphenytoin is preferred over phenytoin based on tolerability, but phenytoin is an acceptable alternative (Level A). In adults, compared to the first therapy, the second therapy is less effective while the third therapy is substantially less effective (Level A). In children, the second therapy appears less effective and there are no data about third therapy efficacy (Level C). The evidence was synthesized into a treatment algorithm. CONCLUSIONS: Despite the paucity of well-designed randomized controlled trials, practical conclusions and an integrated treatment algorithm for the treatment of convulsive status epilepticus across the age spectrum (infants through adults) can be constructed. Multicenter, multinational efforts are needed to design, conduct and analyze additional randomized controlled trials that can answer the many outstanding clinically relevant questions identified in this guideline.
Accelerated probabilistic inference of RNA structure evolution
Holmes, Ian
2005-01-01
Background: Pairwise stochastic context-free grammars (Pair SCFGs) are powerful tools for evolutionary analysis of RNA, including simultaneous RNA sequence alignment and secondary structure prediction, but the associated algorithms are intensive in both CPU and memory usage. The same problem is faced by other RNA alignment-and-folding algorithms based on Sankoff's 1985 algorithm. It is therefore desirable to constrain such algorithms, by pre-processing the sequences and using this first pass to limit the range of structures and/or alignments that can be considered. Results: We demonstrate how flexible classes of constraint can be imposed, greatly reducing the computational costs while maintaining a high quality of structural homology prediction. Any score-attributed context-free grammar (e.g. energy-based scoring schemes, or conditionally normalized Pair SCFGs) is amenable to this treatment. It is now possible to combine independent structural and alignment constraints of unprecedented general flexibility in Pair SCFG alignment algorithms. We outline several applications to the bioinformatics of RNA sequence and structure, including Waterman-Eggert N-best alignments and progressive multiple alignment. We evaluate the performance of the algorithm on test examples from the RFAM database. Conclusion: A program, Stemloc, that implements these algorithms for efficient RNA sequence alignment and structure prediction is available under the GNU General Public License.
Utility of an Algorithm to Increase the Accuracy of Medication History in an Obstetrical Setting.
Corbel, Aline; Baud, David; Chaouch, Aziz; Beney, Johnny; Csajka, Chantal; Panchaud, Alice
2016-01-01
In an obstetrical setting, inaccurate medication histories at hospital admission may result in failure to identify potentially harmful treatments for patients and/or their fetus(es). This prospective study was conducted to assess average concordance rates between (1) a medication list obtained with a one-page structured medication history algorithm developed for the obstetrical setting and (2) the medication list reported in medical records and obtained by open-ended questions based on standard procedures. Each list was scored as a concordance rate against a best possible medication history used as the reference (information obtained from interviews with patients, prescribers, and community pharmacists). The algorithm-based method achieved a higher average concordance rate than the standard method: 90.2% [CI95% 85.8-94.3] versus 24.6% [CI95% 15.3-34.4] (p<0.01). Our algorithm-based method strongly enhanced the accuracy of the medication history in our obstetric population without using substantial resources. Its implementation is an effective first step in the medication reconciliation process, which has been recognized as a very important component of patients' drug safety.
A novel time-domain signal processing algorithm for real time ventricular fibrillation detection
NASA Astrophysics Data System (ADS)
Monte, G. E.; Scarone, N. C.; Liscovsky, P. O.; Rotter S/N, P.
2011-12-01
This paper presents an application of a novel algorithm for real-time detection of ECG pathologies, especially ventricular fibrillation. It is based on a segmentation and labeling process applied to an oversampled signal. After this treatment, the sequence of segments is analyzed to obtain global signal behaviors, much as a human reader does. The entire process can be seen as morphological filtering after a smart data-sampling step. The algorithm does not require any pre-processing of the digital ECG signal, and its computational cost is low, so it can be embedded into sensors for wearable and permanent applications. The proposed algorithm could also supply the signal description used as input to expert systems or artificial intelligence software in order to detect other pathologies.
Yiannakou, Marinos; Trimikliniotis, Michael; Yiallouras, Christos; Damianou, Christakis
2016-02-01
Due to heating in the pre-focal field, the delay between successive movements in high-intensity focused ultrasound (HIFU) is sometimes as long as 60 s, resulting in treatment times on the order of 2-3 h. Because there is generally a requirement to reduce treatment time, we were motivated to explore alternative transducer motion algorithms in order to reduce pre-focal heating and treatment time. A 1 MHz single-element transducer with 4 cm diameter and 10 cm focal length was used. A simulation model was developed that estimates the temperature, thermal dose, and lesion development in the pre-focal field. The simulated temperature history combined with the motion algorithms produced thermal maps in the pre-focal region. A polyacrylamide gel phantom was used to evaluate the pre-focal heating induced by each motion algorithm and to assess the accuracy of the simulation model. Three of the six algorithms, those with successive steps close to each other, exhibited severe heating in the pre-focal field. Minimal heating was produced by the algorithms whose successive steps were far apart (square, square spiral and random). The latter three algorithms were improved further (at a small cost in time), thus completely eliminating pre-focal heating and substantially reducing treatment time compared to traditional algorithms. Because these three algorithms required no delay between successive movements (except in the last part of the motion), the treatment time was reduced by 93%. Therefore, it will be possible in the future to achieve treatment times for focused ultrasound therapies shorter than 30 min. The rate of ablated volume achieved with one of the proposed algorithms was 71 cm(3)/h. The intention of this pilot study was to demonstrate that the navigation algorithms play the most important role in reducing pre-focal heating. By evaluating all commercially available geometries in the future, it will be possible to reduce the treatment time for thermal ablation protocols intended for oncological targets. Copyright © 2015 Elsevier B.V. All rights reserved.
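To make the traversal-order idea above concrete, the following Python sketch contrasts a raster ("square") visiting order with a randomized order over the same grid of sonication points, scoring each by the mean distance between successive sonications; larger successive steps spread pre-focal heating. The grid size, spacing, and scoring are illustrative assumptions, not parameters from the study.

```python
import random
import math

def grid_points(n=8, step=5.0):
    """n x n grid of focal positions (mm), row-major ('square' raster order)."""
    return [(i * step, j * step) for j in range(n) for i in range(n)]

def random_order(points, seed=0):
    """Visit the same focal positions in random order, which tends to keep
    successive sonications far apart and spreads pre-focal heating."""
    pts = points[:]
    random.Random(seed).shuffle(pts)
    return pts

def mean_step(path):
    """Mean distance between successive sonication points along a path."""
    d = [math.dist(a, b) for a, b in zip(path, path[1:])]
    return sum(d) / len(d)

raster = grid_points()
shuffled = random_order(raster)
print(f"raster mean step: {mean_step(raster):.1f} mm")
print(f"random mean step: {mean_step(shuffled):.1f} mm")
```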
Benchmarking Procedures for High-Throughput Context Specific Reconstruction Algorithms
Pacheco, Maria P.; Pfau, Thomas; Sauter, Thomas
2016-01-01
Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate collected information. Context-specific reconstruction based on generic genome-scale models like ReconX or HMR has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context-specific reconstruction algorithms have been published in the last 10 years, only a fraction of them are suitable for model building based on human high-throughput data. Among other reasons, this might be due to problems arising from the limitation to only one metabolic target function or arbitrary thresholding. This review describes and analyses common validation methods used for testing model-building algorithms. Two major methods can be distinguished: consistency testing and comparison-based testing. The first is concerned with robustness against noise, e.g., missing data due to the impossibility of distinguishing between the signal and the background of non-specific probe binding in a microarray experiment, and with whether distinct sets of input expressed genes corresponding to, e.g., different tissues yield distinct models. The latter covers methods comparing sets of functionalities, comparison with existing networks, or comparison with additional databases. We test those methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms. PMID:26834640
Riordan, Stephen M.; Bopage, Rohan; Lloyd, Andrew R.
2018-01-01
Introduction: Achievement of the 2030 World Health Organisation (WHO) global hepatitis C virus (HCV) elimination targets will be underpinned by scale-up of testing and use of direct-acting antiviral treatments. In Australia, despite publicly funded testing and treatment, less than 15% of patients were treated in the first year of treatment access, highlighting the need for greater efficiency of health service delivery. To this end, non-invasive fibrosis algorithms were examined to reduce reliance on transient elastography (TE), which is currently utilised for the assessment of cirrhosis in most Australian clinical settings. Materials and methods: This retrospective and prospective study, with derivation and validation cohorts, examined consecutive patients in a tertiary referral centre, a sexual health clinic, and a prison-based hepatitis program. The negative predictive value (NPV) of seven non-invasive algorithms was measured using published and newly derived cut-offs. The number of TEs avoided by each algorithm, or combination of algorithms, was determined. Results: The 850 patients included 780 (92%) with HCV mono-infection, and 70 (8%) co-infected with HIV or hepatitis B. The mono-infected cohort included 612 men (79%), with an overall cirrhosis prevalence of 16% (125/780). An ‘APRI’ algorithm cut-off of 1.0 had a 94% NPV (95%CI: 91–96%). Newly derived cut-offs for the ‘APRI’ (0.49), ‘FIB-4’ (0.93) and ‘GUCI’ (0.5) algorithms each had NPVs of 99% (95%CI: 97–100%), allowing avoidance of TE in 40% (315/780), 40% (310/780) and 40% (298/749) of patients, respectively. When used in combination, NPV was retained and TE avoidance reached 54% (405/749), regardless of gender or co-infection. Conclusions: Non-invasive algorithms can reliably exclude cirrhosis in many patients, allowing improved efficiency of HCV assessment services in Australia and worldwide. PMID:29438397
Kelly, Melissa Louise; Riordan, Stephen M; Bopage, Rohan; Lloyd, Andrew R; Post, Jeffrey John
2018-01-01
Achievement of the 2030 World Health Organisation (WHO) global hepatitis C virus (HCV) elimination targets will be underpinned by scale-up of testing and use of direct-acting antiviral treatments. In Australia, despite publicly funded testing and treatment, less than 15% of patients were treated in the first year of treatment access, highlighting the need for greater efficiency of health service delivery. To this end, non-invasive fibrosis algorithms were examined to reduce reliance on transient elastography (TE), which is currently utilised for the assessment of cirrhosis in most Australian clinical settings. This retrospective and prospective study, with derivation and validation cohorts, examined consecutive patients in a tertiary referral centre, a sexual health clinic, and a prison-based hepatitis program. The negative predictive value (NPV) of seven non-invasive algorithms was measured using published and newly derived cut-offs. The number of TEs avoided by each algorithm, or combination of algorithms, was determined. The 850 patients included 780 (92%) with HCV mono-infection, and 70 (8%) co-infected with HIV or hepatitis B. The mono-infected cohort included 612 men (79%), with an overall cirrhosis prevalence of 16% (125/780). An 'APRI' algorithm cut-off of 1.0 had a 94% NPV (95%CI: 91-96%). Newly derived cut-offs for the 'APRI' (0.49), 'FIB-4' (0.93) and 'GUCI' (0.5) algorithms each had NPVs of 99% (95%CI: 97-100%), allowing avoidance of TE in 40% (315/780), 40% (310/780) and 40% (298/749) of patients, respectively. When used in combination, NPV was retained and TE avoidance reached 54% (405/749), regardless of gender or co-infection. Non-invasive algorithms can reliably exclude cirrhosis in many patients, allowing improved efficiency of HCV assessment services in Australia and worldwide.
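The fibrosis indices named above have standard published definitions, sketched below in Python; the AST upper limit of normal (40 IU/L) and the rule that cirrhosis is ruled out only when all three scores fall below the study's derived cut-offs are assumptions for illustration, not details confirmed by the abstract.

```python
import math

def apri(ast, platelets, ast_uln=40.0):
    """AST-to-Platelet Ratio Index; AST in IU/L, platelets in 10^9/L."""
    return (ast / ast_uln) / platelets * 100.0

def fib4(age, ast, alt, platelets):
    """FIB-4 index."""
    return (age * ast) / (platelets * math.sqrt(alt))

def guci(ast, inr, platelets, ast_uln=40.0):
    """Goteborg University Cirrhosis Index (AST normalized to its ULN)."""
    return (ast / ast_uln) * inr * 100.0 / platelets

def te_avoidable(age, ast, alt, inr, platelets):
    """One plausible combination rule using the study's newly derived
    rule-out cut-offs (APRI 0.49, FIB-4 0.93, GUCI 0.5): below all three,
    cirrhosis is considered excluded and TE can be avoided."""
    return (apri(ast, platelets) < 0.49
            and fib4(age, ast, alt, platelets) < 0.93
            and guci(ast, inr, platelets) < 0.5)

print(te_avoidable(age=35, ast=22, alt=30, inr=1.0, platelets=250))
```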
Harati, Vida; Khayati, Rasoul; Farzan, Abdolreza
2011-07-01
Uncontrollable and unlimited cell growth leads to tumor genesis in the brain. If brain tumors are not diagnosed early and treated properly, they can cause permanent brain damage or even death. As in all treatment methods, information about tumor position and size is important for successful treatment; hence, an accurate, fully automated method for providing this information to physicians is needed. A fully automatic and accurate method for tumor region detection and segmentation in brain magnetic resonance (MR) images is suggested. The presented approach is an improved scale-based fuzzy connectedness (FC) algorithm in which the seed point is selected automatically. This algorithm is independent of tumor type in terms of pixel intensity. Tumor segmentation evaluation results based on similarity criteria (similarity index (SI) 92.89%, overlap fraction (OF) 91.75%, and extra fraction (EF) 3.95%) indicate higher performance of the proposed approach compared to conventional methods, especially in MR images of tumor regions with low contrast. Thus, the suggested method is useful for increasing the ability to automatically estimate tumor size and position in brain tissues, which provides more accurate investigation of the required surgery, chemotherapy, and radiotherapy procedures. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lestari, A. W.; Rustam, Z.
2017-07-01
In the last decade, breast cancer has become a focus of world attention, as this disease is one of the leading causes of death for women. Therefore, correct precautions and treatment are necessary. In previous studies, the Fuzzy Kernel K-Medoid algorithm has been used for multi-class data. This paper proposes an algorithm to classify high-dimensional breast cancer data using Fuzzy Possibilistic C-Means (FPCM) and a new method based on clustering analysis, Normed Kernel Function-Based Fuzzy Possibilistic C-Means (NKFPCM). The objective of this paper is to obtain the best accuracy in classification of breast cancer data. To improve the accuracy of the two methods, candidate features are evaluated using feature selection based on the Laplacian Score. The results compare the accuracy and running time of FPCM and NKFPCM with and without feature selection.
Individual treatment selection for patients with posttraumatic stress disorder.
Deisenhofer, Anne-Katharina; Delgadillo, Jaime; Rubel, Julian A; Böhnke, Jan R; Zimmermann, Dirk; Schwartz, Brian; Lutz, Wolfgang
2018-04-16
Trauma-focused cognitive behavioral therapy (Tf-CBT) and eye movement desensitization and reprocessing (EMDR) are two highly effective treatment options for posttraumatic stress disorder (PTSD). Yet, on an individual level, PTSD patients vary substantially in treatment response. The aim of the paper is to test the application of a treatment selection method based on a personalized advantage index (PAI). The study used clinical data for patients accessing treatment for PTSD in a primary care mental health service in the north of England. PTSD patients received either EMDR (N = 75) or Tf-CBT (N = 242). The Patient Health Questionnaire (PHQ-9) was used as an outcome measure for depressive symptoms associated with PTSD. Variables predicting differential treatment response were identified using an automated variable selection approach (a genetic algorithm) and afterwards included in regression models, allowing the calculation of each patient's PAI. Age, employment status, gender, and functional impairment were identified as relevant variables for Tf-CBT. For EMDR, baseline depressive symptoms as well as prescribed antidepressant medication were selected as predictor variables. Fifty-six percent of the patients (n = 125) had a PAI equal to or greater than one standard deviation. Of those patients, 62 (50%) did not receive their model-predicted treatment and could have benefited from a treatment assignment based on the PAI. Using a PAI-based algorithm has the potential to improve clinical decision making and to enhance individual patient outcomes, although further replication is necessary before such an approach can be implemented in prospective studies. © 2018 Wiley Periodicals, Inc.
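A minimal sketch of the PAI idea, assuming synthetic data: fit one outcome model per treatment arm, predict each patient's outcome under both treatments, and take the difference. The linear models and variable roles below are simplifications; the study applied genetic-algorithm variable selection first.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Illustrative data: rows = patients, columns = baseline predictors
# (e.g., age, baseline PHQ-9); y = post-treatment PHQ-9 (lower is better).
X = rng.normal(size=(300, 4))
treatment = rng.integers(0, 2, size=300)          # 0 = Tf-CBT, 1 = EMDR
y = X @ [1.0, -0.5, 0.3, 0.0] + treatment * X[:, 1] + rng.normal(size=300)

# One outcome model per treatment arm, as in PAI-style analyses.
m_cbt = LinearRegression().fit(X[treatment == 0], y[treatment == 0])
m_emdr = LinearRegression().fit(X[treatment == 1], y[treatment == 1])

# PAI = predicted outcome difference between the two treatments.
pai = m_cbt.predict(X) - m_emdr.predict(X)

# Positive PAI -> lower predicted symptoms under EMDR (model-predicted choice).
recommended = np.where(pai > 0, "EMDR", "Tf-CBT")
print(recommended[:10], np.round(pai[:3], 2))
```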
Ricken, Roland; Wiethoff, Katja; Reinhold, Thomas; Stamm, Thomas J; Baghai, Thomas C; Fisher, Robert; Seemüller, Florian; Brieger, Peter; Cordes, Joachim; Laux, Gerd; Hauth, Iris; Möller, Hans-Jürgen; Heinz, Andreas; Bauer, Michael; Adli, Mazda
2018-03-01
In a previous single-center study we found that a standardized drug treatment algorithm (ALGO) was more cost-effective than treatment as usual (TAU) for inpatients with major depression. This report aimed to determine whether this promising initial finding could be replicated in a multicenter study. Treatment costs were calculated for two time periods, the study period (from enrolment to exit from study) and time in hospital (from enrolment to hospital discharge), based on daily hospital charges. Cost per remitted patient during the study period was the primary outcome. 266 patients received ALGO and 84 received TAU. For the study period, ALGO costs were significantly lower than TAU (ALGO: 7 848 ± 6 065 €; TAU: 10 033 ± 7 696 €; p = 0.04). For time in hospital, costs did not differ (ALGO: 14 734 ± 8 329 €; TAU: 14 244 ± 8 419 €; p = 0.617). Remission rates did not differ for the study period (ALGO: 57.9%, TAU: 50.0%; p = 0.201). Remission rates were greater in ALGO (83.3%) than TAU (66.2%) for time in hospital (p = 0.002). Cost per remission was lower in ALGO (13 554 ± 10 476 €) than TAU (20 066 ± 15 391 €) for the study period (p < 0.001) and for time in hospital (ALGO: 17 582 ± 9 939 €; TAU: 21 516 ± 12 718 €; p = 0.036). Indirect costs were not assessed. Different dropout rates in TAU and ALGO complicated interpretation. Treatment algorithms enhance the cost-effectiveness of the care of depressed inpatients, which replicates our prior results in an independent sample. Copyright © 2017. Published by Elsevier B.V.
Implementation of Monte Carlo Dose calculation for CyberKnife treatment planning
NASA Astrophysics Data System (ADS)
Ma, C.-M.; Li, J. S.; Deng, J.; Fan, J.
2008-02-01
Accurate dose calculation is essential to advanced stereotactic radiosurgery (SRS) and stereotactic radiotherapy (SRT), especially for treatment planning involving heterogeneous patient anatomy. This paper describes the implementation of a fast Monte Carlo dose calculation algorithm in SRS/SRT treatment planning for the CyberKnife® SRS/SRT system. A superposition Monte Carlo algorithm is developed for this application. Photon mean free paths and interaction types for different materials and energies, as well as the tracks of secondary electrons, are pre-simulated using the MCSIM system. Photon interaction forcing and splitting are applied to the source photons in the patient calculation, and the pre-simulated electron tracks are repeated with proper corrections based on the tissue density and electron stopping powers. Electron energy is deposited along the tracks and accumulated in the simulation geometry. Scattered and bremsstrahlung photons are transported, after applying the Russian roulette technique, in the same way as the primary photons. Dose calculations are compared with full Monte Carlo simulations performed using EGS4/MCSIM and with the CyberKnife treatment planning system (TPS) for lung, head and neck, and liver treatments. Comparisons with full Monte Carlo simulations show excellent agreement (within 0.5%). Differences of more than 10% in the target dose are found between Monte Carlo simulations and the CyberKnife TPS for SRS/SRT lung treatment, while negligible differences are shown in head and neck and liver for the cases investigated. The calculation time using our superposition Monte Carlo algorithm is reduced by a factor of up to 62 (46 on average for 10 typical clinical cases) compared to full Monte Carlo simulations. SRS/SRT dose distributions calculated by simple dose algorithms may be significantly overestimated for small lung target volumes, which can be improved by accurate Monte Carlo dose calculations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Hardy, D.; Favennec, Y., E-mail: yann.favennec@univ-nantes.fr; Rousseau, B.
The contribution of this paper lies in the development of numerical algorithms for the mathematical treatment of specular reflection on borders when dealing with the numerical solution of radiative transfer problems. The radiative transfer equation being integro-differential, the discrete ordinates method allows one to write down a set of semi-discrete equations in which weights are to be calculated. The calculation of these weights is well known to be based on either a quadrature or an angular discretization, making the use of such a method straightforward for the state equation. Also, the diffuse contribution of reflection on borders is usually well taken into account. However, the calculation of accurate partition ratio coefficients is much trickier for the specular condition applied on arbitrary geometrical borders. This paper presents algorithms that analytically calculate the partition ratio coefficients needed in numerical treatments. The developed algorithms, combined with a decentered finite element scheme, are validated by comparisons with analytical solutions before being applied to complex geometries.
Pliszka, S R; Greenhill, L L; Crismon, M L; Sedillo, A; Carlson, C; Conners, C K; McCracken, J T; Swanson, J M; Hughes, C W; Llana, M E; Lopez, M; Toprac, M G
2000-07-01
Expert consensus methodology was used to develop a medication treatment algorithm for attention-deficit/hyperactivity disorder (ADHD). The algorithm broadly outlined the choice of medication for ADHD and some of its most common comorbid conditions. Specific tactical recommendations were developed with regard to medication dosage, assessment of drug response, management of side effects, and long-term medication management. The consensus conference of academic clinicians and researchers, practicing clinicians, administrators, consumers, and families developed evidence-based tactics for the pharmacotherapy of childhood ADHD and its common comorbid disorders. The panel discussed specifics of treatment of ADHD and its comorbid conditions with stimulants, antidepressants, mood stabilizers, alpha-agonists, and (when appropriate) antipsychotics. Specific tactics for the use of each of the above agents are outlined. The tactics are designed to be practical for implementation in the public mental health sector, but they may have utility in many practice settings, including the private practice environment. Tactics for psychopharmacological management of ADHD can be developed with consensus.
MACVIA clinical decision algorithm in adolescents and adults with allergic rhinitis.
Bousquet, Jean; Schünemann, Holger J; Hellings, Peter W; Arnavielhe, Sylvie; Bachert, Claus; Bedbrook, Anna; Bergmann, Karl-Christian; Bosnic-Anticevich, Sinthia; Brozek, Jan; Calderon, Moises; Canonica, G Walter; Casale, Thomas B; Chavannes, Niels H; Cox, Linda; Chrystyn, Henry; Cruz, Alvaro A; Dahl, Ronald; De Carlo, Giuseppe; Demoly, Pascal; Devillier, Phillipe; Dray, Gérard; Fletcher, Monica; Fokkens, Wytske J; Fonseca, Joao; Gonzalez-Diaz, Sandra N; Grouse, Lawrence; Keil, Thomas; Kuna, Piotr; Larenas-Linnemann, Désirée; Lodrup Carlsen, Karin C; Meltzer, Eli O; Mullol, Jaoquim; Muraro, Antonella; Naclerio, Robert N; Palkonen, Susanna; Papadopoulos, Nikolaos G; Passalacqua, Giovanni; Price, David; Ryan, Dermot; Samolinski, Boleslaw; Scadding, Glenis K; Sheikh, Aziz; Spertini, François; Valiulis, Arunas; Valovirta, Erkka; Walker, Samantha; Wickman, Magnus; Yorgancioglu, Arzu; Haahtela, Tari; Zuberbier, Torsten
2016-08-01
The selection of pharmacotherapy for patients with allergic rhinitis (AR) depends on several factors, including age, prominent symptoms, symptom severity, control of AR, patient preferences, and cost. Allergen exposure and the resulting symptoms vary, and treatment adjustment is required. Clinical decision support systems (CDSSs) might be beneficial for the assessment of disease control. CDSSs should be based on the best evidence and algorithms to aid patients and health care professionals to jointly determine treatment and its step-up or step-down strategy depending on AR control. Contre les MAladies Chroniques pour un VIeillissement Actif en Languedoc-Roussillon (MACVIA-LR [fighting chronic diseases for active and healthy ageing]), one of the reference sites of the European Innovation Partnership on Active and Healthy Ageing, has initiated an allergy sentinel network (the MACVIA-ARIA Sentinel Network). A CDSS is currently being developed to optimize AR control. An algorithm developed by consensus is presented in this article. This algorithm should be confirmed by appropriate trials. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tung, Chuan-Jong; Department of Biomedical Engineering and Environmental Sciences, National Tsing Hua University, Hsinchu, Taiwan; Yu, Pei-Chieh
2010-01-01
During radiotherapy treatments, quality assurance/control is essential, particularly dose delivery to patients. This study was designed to verify midline doses with diode in vivo dosimetry. Dosimetry was studied for 6-MV bilateral fields in head and neck cancer treatments and 10-MV bilateral and anteroposterior/posteroanterior (AP/PA) fields in pelvic cancer treatments. Calibrations with corrections of diodes were performed using plastic water phantoms; 190 and 100 portals were studied for head and neck and pelvis treatments, respectively. Calculations of midline doses were made using the midline transmission, arithmetic mean, and geometric mean algorithms. These midline doses were compared with the treatment planning system target doses for lateral or AP (PA) portals and paired opposed portals. For head and neck treatments, all 3 algorithms were satisfactory, although the geometric mean algorithm was less accurate and more uncertain. For pelvis treatments, the arithmetic mean algorithm seemed unacceptable, whereas the other algorithms were satisfactory. The random error was reduced by using averaged midline doses of paired opposed portals because the asymmetric effect was averaged out. Considering the simplicity of in vivo dosimetry, the arithmetic mean and geometric mean algorithm should be adopted for head/neck and pelvis treatments, respectively.
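In their simplest textbook form, the arithmetic and geometric mean algorithms named above reduce to the Python sketch below; the readings and the exact study definitions (which include diode calibration and correction factors) are assumed here for illustration.

```python
import math

def midline_arithmetic(d_entrance, d_exit):
    """Arithmetic-mean estimate of the midline dose from entrance and
    exit diode readings of one portal of an opposed pair."""
    return 0.5 * (d_entrance + d_exit)

def midline_geometric(d_entrance, d_exit):
    """Geometric-mean estimate; often preferred for thicker sites such
    as the pelvis, where attenuation is strongly exponential."""
    return math.sqrt(d_entrance * d_exit)

# Illustrative readings (cGy) for one lateral portal.
d_in, d_out = 110.0, 62.0
print(f"arithmetic mean: {midline_arithmetic(d_in, d_out):.1f} cGy")
print(f"geometric mean:  {midline_geometric(d_in, d_out):.1f} cGy")
```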
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tajaldeen, A; Ramachandran, P; Geso, M
2015-06-15
Purpose: The purpose of this study was to investigate and quantify the variation in dose distributions in small-field lung cancer radiotherapy using seven different dose calculation algorithms. Methods: The study was performed in 21 lung cancer patients who underwent Stereotactic Ablative Body Radiotherapy (SABR). Two different methods, (i) the same dose coverage to the target volume (the same-dose method) and (ii) the same monitor units in all algorithms (the same-monitor-units method), were used to study the performance of seven different dose calculation algorithms in the XiO and Eclipse treatment planning systems. The seven dose calculation algorithms include Superposition, Fast Superposition, Fast Fourier Transform (FFT) Convolution, Clarkson, Anisotropic Analytical Algorithm (AAA), Acuros XB, and pencil beam (PB). Prior to this, a phantom study was performed to assess the accuracy of these algorithms. The Superposition algorithm was used as the reference algorithm in this study. The treatment plans were compared using different dosimetric parameters including conformity, heterogeneity, and dose fall-off index. In addition, the doses to critical structures such as the lungs, heart, oesophagus, and spinal cord were also studied. Statistical analysis was performed using Prism software. Results: The mean ± SD conformity index for the Superposition, Fast Superposition, Clarkson and FFT Convolution algorithms was 1.29 ± 0.13, 1.31 ± 0.16, 2.2 ± 0.7 and 2.17 ± 0.59, respectively, whereas for AAA, pencil beam and Acuros XB it was 1.4 ± 0.27, 1.66 ± 0.27 and 1.35 ± 0.24, respectively. Conclusion: Our study showed significant variations among the seven different algorithms. The Superposition and Acuros XB algorithms showed similar values for most of the dosimetric parameters. The Clarkson, FFT Convolution and pencil beam algorithms showed large differences compared to the Superposition algorithm. Based on our study, we recommend the Superposition and Acuros XB algorithms as the first choice of algorithms for lung cancer radiotherapy involving small fields. However, further investigation by Monte Carlo simulation is required to confirm our results.
Shoemaker, W C; Patil, R; Appel, P L; Kram, H B
1992-11-01
A generalized decision tree or clinical algorithm for treatment of high-risk elective surgical patients was developed from a physiologic model based on empirical data. First, a large data bank was used to: (1) describe temporal hemodynamic and oxygen transport patterns that interrelate cardiac, pulmonary, and tissue perfusion functions in survivors and nonsurvivors; (2) define optimal therapeutic goals based on the supranormal oxygen transport values of high-risk postoperative survivors; (3) compare the relative effectiveness of alternative therapies in a wide variety of clinical and physiologic conditions; and (4) develop criteria for titration of therapy to the endpoints of the supranormal optimal goals using cardiac index (CI), oxygen delivery (DO2), and oxygen consumption (VO2) as proxy outcome measures. Second, a general-purpose algorithm was generated from these data and tested in preoperatively randomized clinical trials of high-risk surgical patients. Improved outcome was demonstrated with this generalized algorithm. The concept that the supranormal values represent compensations that have survival value has been corroborated by several other groups. We now propose a unique approach to refine the generalized algorithm to develop customized algorithms and individualized decision analysis for each patient's unique problems. The present article describes a preliminary evaluation of the feasibility of artificial intelligence techniques to accomplish individualized algorithms that may further improve patient care and outcome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klüter, Sebastian, E-mail: sebastian.klueter@med.uni-heidelberg.de; Schubert, Kai; Lissner, Steffen
Purpose: The dosimetric verification of treatment plans in helical tomotherapy usually is carried out via verification measurements. In this study, a method for independent dose calculation of tomotherapy treatment plans is presented that uses a conventional treatment planning system with a pencil kernel dose calculation algorithm for generation of verification dose distributions based on patient CT data. Methods: A pencil beam algorithm that directly uses measured beam data was configured for dose calculation for a tomotherapy machine. Tomotherapy treatment plans were converted into a format readable by an in-house treatment planning system by assigning each projection to one static treatment field and shifting the calculation isocenter for each field in order to account for the couch movement. The modulation of the fluence for each projection is read out of the delivery sinogram, and with the kernel-based dose calculation, this information can directly be used for dose calculation without the need for decomposition of the sinogram. The sinogram values are only corrected for leaf output and leaf latency. Using the converted treatment plans, dose was recalculated with the independent treatment planning system. Multiple treatment plans ranging from simple static fields to real patient treatment plans were calculated using the new approach and either compared to actual measurements or the 3D dose distribution calculated by the tomotherapy treatment planning system. In addition, dose–volume histograms were calculated for the patient plans. Results: Except for minor deviations at the maximum field size, the pencil beam dose calculation for static beams agreed with measurements in a water tank within 2%/2 mm. A mean deviation to point dose measurements in the cheese phantom of 0.89% ± 0.81% was found for unmodulated helical plans. A mean voxel-based deviation of −0.67% ± 1.11% for all voxels in the respective high dose region (dose values >80%), and a mean local voxel-based deviation of −2.41% ± 0.75% for all voxels with dose values >20% were found for 11 modulated plans in the cheese phantom. Averaged over nine patient plans, the deviations amounted to −0.14% ± 1.97% (voxels >80%) and −0.95% ± 2.27% (>20%, local deviations). For a lung case, mean voxel-based deviations of more than 4% were found, while for all other patient plans, all mean voxel-based deviations were within ±2.4%. Conclusions: The presented method is suitable for independent dose calculation for helical tomotherapy within the known limitations of the pencil beam algorithm. It can serve as verification of the primary dose calculation and thereby reduce the need for time-consuming measurements. By using the patient anatomy and generating full 3D dose data, and combined with measurements of additional machine parameters, it can substantially contribute to overall patient safety.
NASA Astrophysics Data System (ADS)
Bukhari, W.; Hong, S.-M.
2015-01-01
Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR+, implements a gating function without pre-specifying a particular region of the patient’s breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR+ algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR+ implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR+ in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR+. The experimental results show that the EKF-GPR+ algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. As an example, EKF-GPR+ reduces the patient-wise RMS error to 37%, 39% and 42% in percent ratios relative to no prediction for a duty cycle of 80% at lookahead lengths of 192 ms, 384 ms and 576 ms, respectively. The experiments also confirm that EKF-GPR+ controls the duty cycle with reasonable accuracy.
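As a rough sketch of variance-based gating, the following Python fragment trains a Gaussian process regressor on a toy breathing trace and holds the beam whenever the predictive standard deviation is high. The one-sample input state, kernel choice, and percentile threshold are stand-in assumptions; EKF-GPR+ itself feeds the GPR with LCM-EKF prediction residuals.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Toy respiratory trace: position sampled at 25 Hz, predicted 400 ms ahead.
t = np.arange(0, 30, 0.04)
pos = np.sin(2 * np.pi * t / 4.0) + 0.05 * rng.normal(size=t.size)

lookahead = 10                       # 10 samples = 400 ms
X = pos[:-lookahead].reshape(-1, 1)  # crude one-sample state (stand-in for
y = pos[lookahead:]                  # the paper's LCM-EKF residual input)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X[:500], y[:500])
mean, std = gpr.predict(X[500:600], return_std=True)

# Gate: hold the beam whenever predictive uncertainty is high.
threshold = np.percentile(std, 80)   # tuned to a target duty cycle
beam_on = std < threshold
print(f"duty cycle: {beam_on.mean():.0%}")
```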
Cost and effectiveness of biologics for rheumatoid arthritis in a commercially insured population.
Curtis, Jeffrey R; Chastek, Benjamin; Becker, Laura; Quach, Caroleen; Harrison, David J; Yun, Huifeng; Joseph, George J; Collier, David H
2015-04-01
Administrative claims contain detailed medication, diagnosis, and procedure data, but the lack of clinical outcomes for rheumatoid arthritis (RA) historically has limited their use in comparative effectiveness research. A claims-based algorithm was developed and validated to estimate effectiveness for RA from data for adherence, dosing, and treatment modifications. The objective was to implement the claims-based algorithm in a U.S. managed care database to estimate biologic cost per effectively treated patient. The cohort included patients with RA aged 18-63 years in the Optum Research Database who initiated biologic treatment between January 2007 and December 2010 and were continuously enrolled 6 months before through 12 months after the first claim for the biologic (the index date). Patients were categorized as effectively treated by the claims-based algorithm if they met all of the following 6 criteria in the 12-month post-index period: (1) a medication possession ratio ≥ 80% for subcutaneous biologics, or at least as many infusions as specified in U.S. labeling for intravenous biologics; (2) no increase in biologic dose; (3) no switch in biologics; (4) no new nonbiologic disease-modifying antirheumatic drug; (5) no new or increased oral glucocorticoid treatment; and (6) no more than 1 glucocorticoid injection. Drug costs (all biologics) and administration costs (intravenous biologics) were obtained from allowed amounts on claims. Biologic cost per effectively treated patient was defined as total 1-year biologic cost divided by the number of patients categorized by the algorithm as effectively treated with that index biologic. Sensitivity analysis was conducted to examine the total health care costs per effectively treated patient during the first year of biologic therapy. A total of 5,474 individuals were included in the analysis. The index biologic was categorized as effective by the algorithm for 28.9% of patients overall, including 30.6% for subcutaneous biologics and 22.1% for intravenous biologics. The index biologic was categorized as effective in the first year for 32.7% of etanercept (794/2,425), 32.3% of golimumab (40/124), 30.2% of abatacept (89/295), 27.7% of adalimumab (514/1,857), and 19.0% of infliximab (147/773) patients. Mean 1-year biologic cost per effectively treated patient, as defined in the algorithm, was lowest for etanercept ($43,935), followed by golimumab ($49,589), adalimumab ($52,752), abatacept ($62,300), and infliximab ($101,402). The rank order in the sensitivity analysis was the same, except for golimumab and etanercept. Using a claims-based algorithm in a large commercial claims database, etanercept was the most effective and had the lowest biologic cost per effectively treated patient with RA.
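The six effectiveness criteria translate naturally into a boolean rule per patient-year. A minimal sketch follows, with illustrative field names and cost figures (the real algorithm also handles the infusion-count variant for intravenous biologics):

```python
from dataclasses import dataclass

@dataclass
class BiologicYear:
    """One patient's 12-month post-index claims summary (illustrative)."""
    mpr: float                 # medication possession ratio (SC biologics)
    dose_increased: bool
    switched_biologic: bool
    new_nonbiologic_dmard: bool
    new_or_increased_glucocorticoid: bool
    glucocorticoid_injections: int

def effectively_treated(p: BiologicYear) -> bool:
    """All six algorithm criteria must hold."""
    return (p.mpr >= 0.80
            and not p.dose_increased
            and not p.switched_biologic
            and not p.new_nonbiologic_dmard
            and not p.new_or_increased_glucocorticoid
            and p.glucocorticoid_injections <= 1)

cohort = [BiologicYear(0.92, False, False, False, False, 0),
          BiologicYear(0.85, True, False, False, False, 1)]
n_eff = sum(effectively_treated(p) for p in cohort)
cost_per_effective = 100_000.0 / n_eff   # total biologic cost / n effective
print(n_eff, cost_per_effective)
```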
Liu, Han; Sintay, Benjamin; Pearman, Keith; Shang, Qingyang; Hayes, Lane; Maurer, Jacqueline; Vanderstraeten, Caroline; Wiant, David
2018-05-20
The photon optimization (PO) algorithm was recently released by Varian Medical Systems to improve volumetric modulated arc therapy (VMAT) optimization within Eclipse (Version 13.5). The purpose of this study is to compare the PO algorithm with its predecessor, the progressive resolution optimizer (PRO), for lung SBRT and brain SRS treatments. A total of 30 patients were selected retrospectively. Previously, all the plans had been generated with the PRO algorithm within Eclipse Version 13.6. In the new version of the PO algorithm (Version 15), dynamic conformal arcs (DCA) were first conformed to the target, and then VMAT inverse planning was performed to achieve the desired dose distributions. PTV coverages were forced to be identical for the same patient to ensure a fair comparison. SBRT plan quality was assessed based on selected dose-volume parameters, including the conformity index, V20 for lung, V30Gy for chest wall, and D0.035cc for other critical organs. SRS plan quality was evaluated based on the conformity index and the normal tissue volumes encompassed by the 12 and 6 Gy isodose lines (V12 and V6). The modulation complexity score (MCS) was used to compare the plan complexity of the two algorithms. No statistically significant differences between the PRO and PO algorithms were found for any of the dosimetric parameters studied, which indicates both algorithms produce comparable plan quality. Significant improvements in the gamma passing rate (increased from 97.0% to 99.2% for SBRT and 96.1% to 98.4% for SRS), MCS (average increase of 0.15 for SBRT and 0.10 for SRS), and delivery efficiency (MU reduction of 29.8% for SBRT and 28.3% for SRS) were found for the PO algorithm. MCS showed a strong correlation with the gamma passing rate and an inverse correlation with total MUs used. The PO algorithm offers plan quality comparable to the PRO while minimizing MLC complexity, thereby improving delivery efficiency and accuracy. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
Point-of-Care Coagulation Monitoring in Trauma Patients.
Stein, Philipp; Kaserer, Alexander; Spahn, Gabriela H; Spahn, Donat R
2017-06-01
Trauma remains one of the major causes of death and disability all over the world. Uncontrolled blood loss and trauma-induced coagulopathy represent preventable causes of trauma-related morbidity and mortality. Treatment may consist of allogeneic blood product transfusion at a fixed ratio or in an individualized goal-directed way based on point-of-care (POC) and routine laboratory measurements. Viscoelastic POC measurement of the developing clot in whole blood and POC platelet function testing allow rapid and tailored coagulation and transfusion treatment based on goal-directed, factor concentrate-based algorithms. The first studies have been published showing that this concept reduces the need for allogeneic blood transfusion and improves outcome. This review highlights the concept of goal-directed POC coagulation management in trauma patients, introduces a selection of POC devices, and presents algorithms which allow a reduction in allogeneic blood product transfusion and an improvement of trauma patient outcome.
Chapuy, Claudia I; Sahai, Inderneel; Sharma, Rohit; Zhu, Andrew X; Kozyreva, Olga N
2016-04-01
We report a case of a 31-year-old man with metastatic fibrolamellar hepatocellular carcinoma (FLHCC) treated with gemcitabine and oxaliplatin, complicated by hyperammonemic encephalopathy biochemically consistent with acquired ornithine transcarbamylase deficiency. Awareness of FLHCC-associated hyperammonemic encephalopathy and a pathophysiology-based management approach can optimize patient outcome and prevent serious complications. A discussion of the management, a literature review, and a proposed treatment algorithm for this rare metabolic complication are presented. Pathophysiology-guided management of cancer-associated hyperammonemic encephalopathy can improve patient outcome and prevent life-threatening complications. Community and academic oncologists should be aware of this serious metabolic complication of cancer and be familiar with its management. ©AlphaMed Press.
Butler, Stephen F.; Black, Ryan A.; McCaffrey, Stacey A.; Ainscough, Jessica; Doucette, Ann M.
2017-01-01
The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV®), the Addiction Severity CAT. This goal was accomplished in four steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large non-clinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent/discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT’s time of administration was found to be significantly less than the average time of administration for the ASI-MV composite scores. This study represents the initial validation of an IRT-based Addiction Severity CAT, and further exploration of the Addiction Severity CAT is needed. PMID:28230387
Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M
2017-05-01
The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
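A CAT of this kind typically pairs an item response theory model with maximum-information item selection and an uncertainty-based stopping rule. The sketch below uses a two-parameter logistic (2PL) model with simulated items and responses; the specific model, item bank, and termination criteria of the Addiction Severity CAT are not described in the abstracts, so everything here is a generic illustration.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at severity theta."""
    p = p_correct(theta, a, b)
    return a**2 * p * (1 - p)

rng = np.random.default_rng(2)
a = rng.uniform(0.8, 2.0, 50)        # discrimination of a 50-item bank
b = rng.normal(0, 1, 50)             # item severity locations
true_theta, theta = 0.7, 0.0
administered, se = [], np.inf

while se > 0.35 and len(administered) < 20:    # termination criteria
    info = item_information(theta, a, b)
    info[administered] = -np.inf               # no item reuse
    j = int(np.argmax(info))                   # maximum-information selection
    administered.append(j)
    u = rng.random() < p_correct(true_theta, a[j], b[j])  # simulated response
    # Crude Newton-style update: newest item's score over total information.
    resid = (u - p_correct(theta, a[j], b[j])) * a[j]
    total_info = sum(item_information(theta, a[k], b[k]) for k in administered)
    theta = float(np.clip(theta + resid / total_info, -4.0, 4.0))
    se = 1.0 / np.sqrt(total_info)

print(f"items used: {len(administered)}, theta = {theta:.2f}, SE = {se:.2f}")
```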
GPU implementation of prior image constrained compressed sensing (PICCS)
NASA Astrophysics Data System (ADS)
Nett, Brian E.; Tang, Jie; Chen, Guang-Hong
2010-04-01
The Prior Image Constrained Compressed Sensing (PICCS) algorithm (Med. Phys. 35, p. 660, 2008) has been applied to several computed tomography applications with both standard CT systems and flat-panel based systems designed for guiding interventional procedures and radiation therapy treatment delivery. The PICCS algorithm typically utilizes a prior image which is reconstructed via the standard Filtered Backprojection (FBP) reconstruction algorithm. The algorithm then iteratively solves for the image volume that matches the measured data, while simultaneously assuring the image is similar to the prior image. The PICCS algorithm has demonstrated utility in several applications, including improved temporal resolution reconstruction, 4D respiratory-phase-specific reconstructions for radiation therapy, and cardiac reconstruction from data acquired on an interventional C-arm. One disadvantage of the PICCS algorithm, as with other iterative algorithms, is the long computation time typically associated with reconstruction. In order for an algorithm to gain clinical acceptance, reconstruction must be achievable in minutes rather than hours. In this work, the PICCS algorithm has been implemented on the GPU in order to significantly reduce its reconstruction time. The Compute Unified Device Architecture (CUDA) was used in this implementation.
TinyOS-based quality of service management in wireless sensor networks
Peterson, N.; Anusuya-Rangappa, L.; Shirazi, B.A.; Huang, R.; Song, W.-Z.; Miceli, M.; McBride, D.; Hurson, A.; LaHusen, R.
2009-01-01
Previously, the cost and extremely limited capabilities of sensors prohibited Quality of Service (QoS) implementations in wireless sensor networks. With advances in technology, sensors are becoming significantly less expensive, and the increases in computational and storage capabilities are opening the door for new, sophisticated algorithms to be implemented. Newer sensor network applications require higher data rates with more stringent priority requirements. We introduce a dynamic scheduling algorithm to improve bandwidth for high-priority data in sensor networks, called Tiny-DWFQ. Our Tiny-Dynamic Weighted Fair Queuing scheduling algorithm allows for dynamic QoS for prioritized communications by continually adjusting the treatment of communication packets according to their priorities and the current level of network congestion. For performance evaluation, we tested Tiny-DWFQ, Tiny-WFQ (the traditional WFQ algorithm implemented in TinyOS), and FIFO queues on an Imote2-based wireless sensor network and report their throughput and packet loss. Our results show that Tiny-DWFQ performs better in all test cases. © 2009 IEEE.
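The following toy Python class illustrates the general mechanism of a dynamic weighted fair queue, scaling per-priority service weights with backlog as a congestion proxy. Tiny-DWFQ itself runs in TinyOS (nesC) on motes; the class, weight law, and drop policy here are illustrative assumptions, not the published design.

```python
import collections

class ToyDWFQ:
    """Toy dynamic weighted fair queue: per-priority FIFO queues whose
    service weights are re-scaled as congestion (total backlog) grows,
    so high-priority traffic keeps bandwidth under load."""

    def __init__(self, base_weights=None, capacity=32):
        self.base = base_weights or {0: 1, 1: 2, 2: 4}
        self.queues = {p: collections.deque() for p in self.base}
        self.capacity = capacity

    def enqueue(self, priority, packet):
        q = self.queues[priority]
        if len(q) < self.capacity:
            q.append(packet)          # else drop-tail under overflow

    def weights(self):
        backlog = sum(len(q) for q in self.queues.values())
        congestion = backlog / (self.capacity * len(self.queues))
        # Skew weights toward high priority as congestion rises.
        return {p: w * (1 + congestion * p) for p, w in self.base.items()}

    def dequeue_round(self):
        """Serve each queue proportionally to its current weight."""
        out, w = [], self.weights()
        for p in sorted(self.queues, reverse=True):
            for _ in range(round(w[p])):
                if self.queues[p]:
                    out.append(self.queues[p].popleft())
        return out

q = ToyDWFQ()
for i in range(20):
    q.enqueue(i % 3, f"pkt{i}")
print(q.dequeue_round())
```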
NASA Astrophysics Data System (ADS)
Wang, Lilie; Ding, George X.
2014-07-01
The out-of-field dose can be clinically important as it relates to the dose to organs at risk, although the accuracy of its calculation in commercial radiotherapy treatment planning systems (TPSs) receives less attention. This study evaluates the uncertainties of out-of-field dose calculated with a model-based dose calculation algorithm, the anisotropic analytical algorithm (AAA), implemented in a commercial radiotherapy TPS, Varian Eclipse V10, by using Monte Carlo (MC) simulations in which the entire accelerator head is modeled, including the multi-leaf collimators. The MC calculated out-of-field doses were validated by experimental measurements. The dose calculations were performed in a water phantom as well as CT-based patient geometries, and both static and highly modulated intensity-modulated radiation therapy (IMRT) fields were evaluated. We compared the calculated out-of-field doses, defined as lower than 5% of the prescription dose, in four H&N cancer patients and two lung cancer patients treated with volumetric modulated arc therapy (VMAT) and IMRT techniques. The results show that the discrepancy between the AAA- and MC-calculated out-of-field dose profiles depends on depth and is generally less than 1% for in-water phantom comparisons and for CT-based patient dose calculations with static fields and IMRT. For VMAT plans, the difference between AAA and MC is <0.5%. The clinical impact of the error on the calculated organ doses was analyzed using dose-volume histograms. Although the AAA algorithm significantly underestimated the out-of-field doses, the clinical impact on the calculated organ doses in out-of-field regions may not be significant in practice due to the very low out-of-field doses relative to the target dose.
Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Graves, Yan Jiang; Gautier, Quentin; Mell, Loren; Zhou, Linghong; Jia, Xun; Jiang, Steve
2013-12-21
Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. In the re-planning process, a manual trial-and-error approach to fine-tune planning parameters is time-consuming and usually considered impractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal intervention. In ART, prior information from the original plan is available, such as the dose-volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates the following two loops. The inner loop is the traditional fluence map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose, with a set of fixed voxel weighting factors. In the outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those of the original plan. The process is repeated until the DVH curves are acceptable or the maximum number of iterations is reached. The whole algorithm is implemented on GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of the treatment course. Compared with the DVH curves of the original plan, the DVH curves of the resulting plan generated by our algorithm with 30 iterations are better for almost all structures. The re-optimization process takes about 30 s using our in-house optimization engine.
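A compact numerical sketch of the two-loop scheme, under strong simplifications: the DVH deviation is replaced by a per-voxel deviation from a reference dose, and the dose-influence matrix, step size, and weight-update factor are arbitrary illustrative choices rather than the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)
n_voxels, n_beamlets = 200, 40
D = rng.random((n_voxels, n_beamlets)) * 0.1      # dose-influence matrix
prescribed = np.full(n_voxels, 2.0)               # per-voxel target dose (Gy)
ref_dose = 2.0 + 0.1 * rng.normal(size=n_voxels)  # stand-in for original-plan dose

w = np.ones(n_voxels)                             # voxel weighting factors
x = np.ones(n_beamlets)                           # fluence map

for outer in range(10):
    # Inner loop: quadratic fluence-map optimization with fixed weights,
    # projected gradient descent keeping the fluence non-negative.
    step = 1.0 / (np.linalg.norm(D, 2) ** 2 * w.max())
    for _ in range(200):
        grad = D.T @ (w * (D @ x - prescribed))
        x = np.maximum(x - step * grad, 0.0)
    # Outer loop: raise the weights of voxels whose dose deviates from
    # the prescription more than the reference (original-plan) dose did.
    dose = D @ x
    deficit = np.abs(dose - prescribed) - np.abs(ref_dose - prescribed)
    w *= np.where(deficit > 0, 1.5, 1.0)

print(f"mean |dose - prescription|: {np.abs(dose - prescribed).mean():.3f} Gy")
```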
Chen, Xudong; Xu, Zhongwen; Yao, Liming; Ma, Ning
2018-03-05
This study considers the two factors of environmental protection and economic benefit in addressing municipal sewage treatment. Based on considerations regarding the sewage treatment plant construction site, processing technology, capital investment, operating costs, water pollutant emissions, water quality, and other indicators, we establish a general multi-objective decision model for optimizing municipal sewage treatment plant construction. Using the construction of a sewage treatment plant in a suburb of Chengdu as an example, this paper tests the general multi-objective decision model for sewage treatment plant construction by implementing a genetic algorithm. The results show the applicability and effectiveness of the multi-objective decision model. This paper provides decision and technical support for the optimization of municipal sewage treatment.
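As an illustration of the genetic-algorithm machinery, the sketch below scalarizes two objectives (cost and effluent quality) over hypothetical discrete site/technology choices; all figures are invented, and a Pareto-based GA such as NSGA-II would be a natural alternative to the weighted sum used here.

```python
import random

random.seed(4)

# Hypothetical discrete choices (all figures illustrative, not from the study).
SITES = [(12.0, 45), (9.5, 62), (14.0, 38)]   # (capital cost, siting score)
TECHS = [(1.8, 40), (2.2, 55), (1.5, 35)]     # (annual opex, effluent COD mg/L)

def objectives(chrom):
    site, tech = chrom
    cost = SITES[site][0] + 10 * TECHS[tech][0]   # 10-year economic objective
    env = TECHS[tech][1] + 0.2 * SITES[site][1]   # pollutant + siting penalty
    return cost, env

def fitness(chrom, w=(0.5, 0.5)):
    """Weighted-sum scalarization with crude normalization."""
    cost, env = objectives(chrom)
    return w[0] * cost / 36.0 + w[1] * env / 67.0

def ga(pop_size=20, generations=40, p_mut=0.15):
    pop = [(random.randrange(3), random.randrange(3)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                  # one-point crossover
            if random.random() < p_mut:           # gene mutation
                child = (random.randrange(3), child[1])
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = ga()
print("site/tech:", best, "objectives:", objectives(best))
```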
Modeling of skin cancer dermatoscopy images
NASA Astrophysics Data System (ADS)
Iralieva, Malica B.; Myakinin, Oleg O.; Bratchenko, Ivan A.; Zakharov, Valery P.
2018-04-01
A cancer identified early is more likely to respond effectively to treatment and is less expensive to treat. Dermatoscopy is one of the general diagnostic techniques for early skin cancer detection that allows in vivo evaluation of the colors and microstructures of skin lesions. Digital phantoms with known properties are required when developing a new instrument, to compare a sample's features with data from the instrument. An algorithm for modeling skin cancer images is proposed in this paper. The steps of the algorithm are shape setting, texture generation, texture application, and normal-skin background setting. A Gaussian represents the shape, texture generation based on a fractal noise algorithm is responsible for the spatial chromophore distributions, and the colormap applied to the values corresponds to the spectral properties. Finally, a normal skin image simulated by a mixed Monte Carlo method using a special online tool is added as the background. Varying the Asymmetry, Borders, Colors and Diameter settings is shown to match the ABCD clinical recognition algorithm fully. Asymmetry is specified by setting different standard deviation values of the Gaussian in different parts of the image. The noise amplitude is increased to set the irregular-borders score. The standard deviation is changed to determine the size of the lesion. Colors are set by changing the colormap. An algorithm for simulating different structural elements would be required to match other recognition algorithms.
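The shape-plus-texture recipe can be sketched in a few lines of NumPy: an anisotropic Gaussian for the lesion shape, octave-summed value noise as a stand-in for the fractal texture, and a linear blend into a flat skin tint in place of the Monte Carlo background. Sizes, sigmas, and colors are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 256
y, x = np.mgrid[0:N, 0:N]

# Lesion shape: anisotropic Gaussian; unequal sigmas model asymmetry
# (the 'A' of the ABCD rule).
sx, sy = 40.0, 25.0
shape = np.exp(-(((x - N / 2) / sx) ** 2 + ((y - N / 2) / sy) ** 2))

def fractal_noise(n, octaves=5):
    """Octave-summed 'value noise': up-sampled random grids with amplitude
    halving per octave; its amplitude controls border irregularity."""
    noise = np.zeros((n, n))
    for o in range(octaves):
        k = 2 ** (o + 2)
        coarse = rng.random((k, k))
        idx = np.arange(n) * k // n
        noise += coarse[np.ix_(idx, idx)] / 2 ** o   # nearest-neighbour upsample
    return noise / noise.max()

lesion = np.clip(shape + 0.3 * (fractal_noise(N) - 0.5), 0, 1)

# Map lesion intensity to browns over a pinkish flat tint standing in for
# the Monte Carlo simulated normal-skin background.
skin = np.array([0.95, 0.80, 0.75])
brown = np.array([0.35, 0.20, 0.10])
rgb = skin * (1 - lesion[..., None]) + brown * lesion[..., None]
print(rgb.shape, rgb.min(), rgb.max())
```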
SU-F-T-20: Novel Catheter Lumen Recognition Algorithm for Rapid Digitization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dise, J; McDonald, D; Ashenafi, M
Purpose: Manual catheter recognition remains a time-consuming aspect of high-dose-rate brachytherapy (HDR) treatment planning. In this work, a novel catheter lumen recognition algorithm was created for accurate and rapid digitization. Methods: MatLab v8.5 was used to create the catheter recognition algorithm. Initially, the algorithm searches the patient CT dataset using an intensity-based k-means filter designed to locate catheters. Once the catheters have been located, seed points are manually selected to initialize digitization of each catheter. From each seed point, the algorithm searches locally in order to automatically digitize the remaining catheter. This digitization is accomplished by finding pixels with similar image curvature and divergence parameters compared to the seed pixel. Newly digitized pixels are treated as new seed positions, and Hessian image analysis is used to direct the algorithm toward neighboring catheter pixels and to make the algorithm insensitive to adjacent catheters that are unresolvable on CT, air pockets, and high-Z artifacts. The algorithm was tested using 11 HDR treatment plans, including the Syed template, tandem and ovoid applicator, and multi-catheter lung brachytherapy. Digitization error was calculated by comparing manually determined catheter positions to those determined by the algorithm. Results: The digitization error was 0.23 mm ± 0.14 mm axially and 0.62 mm ± 0.13 mm longitudinally at the tip. The time of digitization, following initial seed placement, was less than 1 second per catheter. The maximum total time required to digitize all tested applicators was 4 minutes (Syed template with 15 needles). Conclusion: This algorithm successfully digitizes HDR catheters for a variety of applicators with or without CT markers. The minimal axial error demonstrates the accuracy of the algorithm and its insensitivity to image artifacts and challenging catheter positioning. Future work to automatically place initial seed positions would improve the algorithm's speed.
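A highly simplified stand-in for the local search is plain region growing from the manual seed, accepting neighbors by intensity similarity only; the curvature, divergence, and Hessian tests described above are omitted in this sketch.

```python
import numpy as np
from collections import deque

def grow_catheter(volume, seed, tol=100.0):
    """Greedy region growing from a manually placed seed voxel: accept
    6-connected neighbours whose HU value is close to the seed's."""
    seed_val = volume[seed]
    accepted = {seed}
    frontier = deque([seed])
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while frontier:
        z, y, x = frontier.popleft()
        for dz, dy, dx in offsets:
            nb = (z + dz, y + dy, x + dx)
            if (all(0 <= c < s for c, s in zip(nb, volume.shape))
                    and nb not in accepted
                    and abs(volume[nb] - seed_val) < tol):
                accepted.add(nb)
                frontier.append(nb)
    return accepted

# Toy CT: a bright catheter track in a uniform background.
vol = np.full((20, 20, 20), -50.0)
for k in range(20):
    vol[k, 10, 10] = 900.0          # high-contrast lumen voxels
voxels = grow_catheter(vol, seed=(0, 10, 10))
print(f"digitized {len(voxels)} voxels")
```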
López-Guerrero, José Antonio; Romero, Ignacio; Poveda, Andrés
2015-01-01
Epithelial ovarian cancer (OC) is a common gynecologic malignancy in women. The standard treatment for OC is maximal cytoreductive surgical debulking followed by platinum-based chemotherapy. Despite the high response rate to primary therapy, approximately 85% of patients will develop recurrent ovarian cancer (ROC). This review identifies the clinical use of trabectedin in the treatment algorithm for ROC, with specific emphasis on platinum-sensitive ROC, for which trabectedin in combination with pegylated liposomal doxorubicin has been approved as a treatment protocol. The main mechanisms of action of trabectedin at the cellular level and in the tumor microenvironment are also discussed as bases for identifying biomarkers for selecting patients who may benefit most from trabectedin-based therapies.
NASA Astrophysics Data System (ADS)
Chvetsov, Alevei V.; Sandison, George A.; Schwartz, Jeffrey L.; Rengan, Ramesh
2015-11-01
The main objective of this article is to improve the stability of reconstruction algorithms for the estimation of radiobiological parameters from serial tumor imaging data acquired during radiation therapy. Serial images of tumor response to radiation therapy represent a complex summation of several exponential processes, such as treatment-induced cell inactivation, tumor growth, and cell loss. Accurate assessment of treatment response requires separation of these processes because they define the radiobiological determinants of treatment response and, correspondingly, tumor control probability. However, the estimation of radiobiological parameters from imaging data is an ill-posed inverse problem, because a sum of several exponentials produces a Fredholm integral equation of the first kind, which is ill posed. Therefore, the stability of the parameter reconstruction is a problem even for the simplest models of tumor response. To study the stability of the parameter reconstruction problem, we used a set of serial CT imaging data for head and neck cancer and the simplest case of a two-level cell population model of tumor response. Inverse reconstruction was performed using a simulated annealing algorithm to minimize a least-squares objective function. Results show that the reconstructed values of cell surviving fractions and cell doubling time exhibit significant nonphysical fluctuations if no stabilization is applied. However, after applying a stabilization algorithm based on variational regularization, the reconstruction produces statistical distributions for surviving fractions and doubling time that are comparable to published in vitro data. This algorithm is an advance over our previous work, in which only cell surviving fractions were reconstructed. We conclude that variational regularization allows for an increase in the number of free parameters in our model, which enables the development of more advanced parameter reconstruction algorithms.
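A hedged sketch of the kind of regularized fit involved: a two-level response model v(t) = (1-S)e^(-at) + S·2^(t/Td) fitted to serial volume measurements, with a Tikhonov-type penalty appended to the residuals as a stand-in for the paper's variational regularization. The model form, toy data, and penalty weight are illustrative, and the paper minimizes with simulated annealing rather than a local least-squares solver.

```python
import numpy as np
from scipy.optimize import least_squares

t_obs = np.array([0., 7., 14., 21., 28.])           # days between serial scans (toy)
v_obs = np.array([1.0, 0.72, 0.55, 0.46, 0.41])     # normalized tumor volume (toy)

def residuals(p, lam=0.1, p0=(0.5, 0.1, 60.0)):
    S, a, Td = p                                    # surviving fraction, clearance rate, doubling time
    model = (1 - S) * np.exp(-a * t_obs) + S * np.exp(np.log(2) * t_obs / Td)
    penalty = np.sqrt(lam) * (np.asarray(p) - p0)   # stabilizing (Tikhonov-type) term
    return np.concatenate([model - v_obs, penalty])

fit = least_squares(residuals, x0=[0.4, 0.05, 50.0],
                    bounds=([0, 0, 1], [1, 1, 500]))
S, a, Td = fit.x
```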
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, L; Pi, Y; Chen, Z
2016-06-15
Purpose: To evaluate differences in ROI contours and accumulated dose arising from different deformable image registration (DIR) algorithms in head and neck (H&N) adaptive radiotherapy. Methods: Eight H&N cancer patients were randomly selected from the affiliated hospital. During the treatment course, patients were rescanned every week, with ROIs delineated by a radiation oncologist on each weekly CT. New weekly treatment plans were re-designed with a consistent dose prescription on the rescanned CT and executed for one week on a Siemens CT-on-rails accelerator. In total, six weekly CT scans (CT1 to CT6), including six weekly treatment plans, were obtained for each patient. The primary CT1 was set as the reference CT for DIR with the remaining five weekly CTs, using the ANACONDA and MORFEUS algorithms separately in RayStation; the external skin ROI was set as the controlling ROI in both. All calculated weekly doses were deformed and accumulated on the reference CT1 according to the deformation vector fields (DVFs) generated by the two DIR algorithms, yielding both ANACONDA-based and MORFEUS-based accumulated total doses on CT1 for each patient. At the same time, the ROIs on CT1 were mapped to generate corresponding ROIs on CT6 using the ANACONDA and MORFEUS DIR algorithms. DICE coefficients between the DIR-deformed and oncologist-delineated ROIs on CT6 were calculated. Results: For DIR-accumulated dose, PTV D95 and left-eyeball Dmax showed significant differences of 67.13 cGy and 109.29 cGy, respectively (Table 1). For DIR-mapped ROIs, PTV, spinal cord, and left optic nerve showed DICE differences of −0.025, −0.127, and −0.124 (Table 2). Conclusion: Even two excellent DIR algorithms can give divergent results for ROI deformation and dose accumulation. As more treatment planning systems integrate DIR modules, there is an urgent need to recognize the potential risks of using DIR clinically.
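The DICE comparison used above reduces to a one-line computation on boolean ROI masks defined on the same grid; a minimal version:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two boolean masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    total = mask_a.sum() + mask_b.sum()
    return 2.0 * inter / total if total else 1.0   # two empty masks count as identical
```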
Current Treatment Algorithms for Patients with Metastatic Non-Small Cell, Non-Squamous Lung Cancer
Melosky, Barbara
2017-01-01
The treatment paradigm for metastatic non-small cell, non-squamous lung cancer is continuously evolving due to new treatment options and our increasing knowledge of molecular signal pathways. As a result of treatments becoming more efficacious and more personalized, survival for selected groups of non-small cell lung cancer (NSCLC) patients is increasing. In this paper, three algorithms will be presented for treating patients with metastatic non-squamous, NSCLC. These include treatment algorithms for NSCLC patients whose tumors have EGFR mutations, ALK rearrangements, or wild-type/wild-type tumors. As the world of immunotherapy continues to evolve quickly, a future algorithm will also be presented. PMID:28373963
Laireiter, Anton Rupert
2017-01-01
Background In recent years, the assessment of mental disorders has become more and more personalized. Modern advancements such as Internet-enabled mobile phones and increased computing capacity make it possible to tap sources of information that have long been unavailable to mental health practitioners. Objective Software packages that combine algorithm-based treatment planning, process monitoring, and outcome monitoring are scarce. The objective of this study was to assess whether the DynAMo Web application can fill this gap by providing a software solution that can be used by both researchers to conduct state-of-the-art psychotherapy process research and clinicians to plan treatments and monitor psychotherapeutic processes. Methods In this paper, we report on the current state of a Web application that can be used for assessing the temporal structure of mental disorders using information on their temporal and synchronous associations. A treatment planning algorithm automatically interprets the data and delivers priority scores of symptoms to practitioners. The application is also capable of monitoring psychotherapeutic processes during therapy and of monitoring treatment outcomes. This application was developed using the R programming language (R Core Team, Vienna) and the Shiny Web application framework (RStudio, Inc, Boston). It is made entirely from open-source software packages and thus is easily extensible. Results The capabilities of the proposed application are demonstrated. Case illustrations are provided to exemplify its usefulness in clinical practice. Conclusions With the broad availability of Internet-enabled mobile phones and similar devices, collecting data on psychopathology and psychotherapeutic processes has become easier than ever. The proposed application is a valuable tool for capturing, processing, and visualizing these data. The combination of dynamic assessment and process- and outcome monitoring has the potential to improve the efficacy and effectiveness of psychotherapy. PMID:28729233
Identification of chronic rhinosinusitis phenotypes using cluster analysis.
Soler, Zachary M; Hyer, J Madison; Ramakrishnan, Viswanathan; Smith, Timothy L; Mace, Jess; Rudmik, Luke; Schlosser, Rodney J
2015-05-01
Current clinical classifications of chronic rhinosinusitis (CRS) have been largely defined based upon preconceived notions of factors thought to be important, such as polyp or eosinophil status. Unfortunately, these classification systems have little correlation with symptom severity or treatment outcomes. Unsupervised clustering can be used to identify phenotypic subgroups of CRS patients, describe clinical differences between these clusters, and define simple algorithms for classification. In a multi-institutional, prospective study, 382 patients with CRS who had failed initial medical therapy completed the Sino-Nasal Outcome Test (SNOT-22), Rhinosinusitis Disability Index (RSDI), Medical Outcomes Study Short Form-12 (SF-12), Pittsburgh Sleep Quality Index (PSQI), and Patient Health Questionnaire (PHQ-2). Objective measures of CRS severity included the Brief Smell Identification Test (B-SIT), CT, and endoscopy scoring. All variables were reduced and unsupervised hierarchical clustering was performed. After clusters were defined, variations in medication usage were analyzed. Discriminant analysis was performed to develop a simplified, clinically useful algorithm for clustering. Clustering was largely determined by age, severity of patient-reported outcome measures, depression, and fibromyalgia. CT and endoscopy scores varied somewhat among clusters. Traditional clinical measures, including polyp/atopic status, prior surgery, B-SIT, and asthma, did not vary among clusters. A simplified algorithm based upon productivity loss, SNOT-22 score, and age predicted clustering with 89% accuracy. Medication usage did vary significantly among clusters. A simplified algorithm based upon hierarchical clustering is able to classify CRS patients and predict medication usage. Further studies are warranted to determine whether such clustering predicts treatment outcomes. © 2015 ARS-AAOA, LLC.
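A minimal SciPy sketch of the unsupervised step: standardize the measures, run Ward hierarchical clustering, and cut the dendrogram. The synthetic data and the choice of four clusters are placeholders, not values from the study.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(0)
X = rng.normal(size=(382, 5))       # stand-in columns: age, SNOT-22, RSDI, PSQI, PHQ-2
Z = linkage(zscore(X, axis=0), method="ward")
labels = fcluster(Z, t=4, criterion="maxclust")   # one cluster label per patient
```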
Algorithm of first-aid management of dental trauma for medics and corpsmen.
Zadik, Yehuda
2008-12-01
To bridge the gap between the need to provide prompt and proper treatment to dental trauma patients and the inadequate knowledge among medics and corpsmen, as well as the lack of instructions in first-aid textbooks and manuals, a simple algorithm for non-professional first-aid management of various injuries to hard (teeth) and soft oral tissues is presented, based on a review of the dental literature. The recommended management of tooth avulsion, subluxation and luxation, crown fracture, and lip, tongue, or gingival laceration is included in the algorithm. Along with a list of after-hours dental clinics, this symptom- and clinical-appearance-based algorithm is designed to tuck easily into a pocket for quick use by medics/corpsmen in an emergency situation. Although the algorithm was developed for use by military non-dental health-care providers, it could be adapted for the civilian environment as well.
An improved stochastic fractal search algorithm for 3D protein structure prediction.
Zhou, Changjun; Sun, Chuan; Wang, Bin; Wang, Xiaojun
2018-05-03
Protein structure prediction (PSP) is a significant area for biological information research, disease treatment, and drug development. In this paper, three-dimensional structures of proteins are predicted from known amino acid sequences, with the structure prediction problem transformed into a typical NP-hard problem via the AB off-lattice model. This work applies a novel improved Stochastic Fractal Search algorithm (ISFS) to solve the problem. The Stochastic Fractal Search algorithm (SFS) is an effective evolutionary algorithm that performs well in exploring the search space but sometimes falls into local minima. To avoid this weakness, Lévy flight and internal feedback information are introduced in ISFS. In the experiments, simulations are conducted with the ISFS algorithm on Fibonacci sequences and real peptide sequences. Experimental results show that ISFS performs more efficiently and robustly in terms of finding the global minimum and avoiding local minima.
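The Lévy-flight perturbation can be sketched with Mantegna's algorithm; beta = 1.5 and the 0.01 step scale are conventional choices in Lévy-flight metaheuristics, not necessarily the paper's settings.

```python
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=np.random.default_rng()):
    """Heavy-tailed step per coordinate via Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

current = np.zeros(3)
best = np.ones(3)                      # current global best in the search
candidate = current + 0.01 * levy_step(3) * (best - current)
```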
TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuemann, J; Grassberger, C; Paganetti, H
2014-06-15
Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions, and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), average range difference (ARD), and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80-R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations considering total range uncertainties, and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate, and whole-brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for the more homogeneous patient sites (liver, prostate, whole brain). However, we recommend treatment plan verification using Monte Carlo simulations for patients with complex geometries.
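The distal range metrics used above (R90, R50, R80-R20) can be extracted from a depth-dose curve by interpolating along the distal falloff; a toy sketch, with a Gaussian standing in for a real Bragg-peak curve:

```python
import numpy as np

def distal_depth(depth, dose, level):
    """Depth where the dose falls through `level` (fraction of max) beyond the peak."""
    d = dose / dose.max()
    i = np.argmax(d)                               # distal edge starts at the peak
    return np.interp(level, d[i:][::-1], depth[i:][::-1])

depth = np.linspace(0, 30, 301)                    # cm
dose = np.exp(-0.5 * ((depth - 20) / 1.2) ** 2)    # toy peaked curve, not a real Bragg peak
r90, r50 = distal_depth(depth, dose, 0.9), distal_depth(depth, dose, 0.5)
falloff = distal_depth(depth, dose, 0.2) - distal_depth(depth, dose, 0.8)   # R80-R20 distance
```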
Use of treatment algorithms for depression.
Trivedi, Madhukar H; Fava, Maurizio; Marangell, Lauren B; Osser, David N; Shelton, Richard C
2006-01-01
Depression continues to be a treatment challenge for many physicians-psychiatrists and primary care physicians alike-in part because of the nature of the disorder, but also because of the wide variety of medications and other treatments available, each with a distinct efficacy and safety profile. One way of negotiating treatment decisions is to use treatment guidelines and algorithms. This Commentary, which appears in the September 2006 issue of The Journal of Clinical Psychiatry (2006;67:1458-1465), provides the primary care clinician with insight into the pros and cons of using treatment algorithms to guide the treatment of depression. -Larry Culpepper, M.D.
Koh, Victor; Swamidoss, Issac Niwas; Aquino, Maria Cecilia D; Chew, Paul T; Sng, Chelvin
2018-04-27
To develop an algorithm to predict the success of laser peripheral iridotomy (LPI) in primary angle closure suspects (PACS), using pre-treatment anterior segment optical coherence tomography (ASOCT) scans. A total of 116 eyes with PACS underwent LPI, and time-domain ASOCT scans (temporal and nasal cuts) were performed before and 1 month after LPI. All post-treatment scans were classified into one of the following categories: (a) both angles open, (b) one of two angles open, and (c) both angles closed. After LPI, success was defined as one or more angles changing from closed to open. In the proposed method, the pre- and post-LPI ASOCT scans were registered at the corresponding angles based on similarities between the respective local descriptor features, and the random sample consensus technique was used to identify the largest consensus set of correspondences between the pre- and post-LPI ASOCT scans. Subsequently, features such as the correlation coefficient (CC) and structural similarity index (SSIM) were extracted and correlated with the success of LPI. Of the 116 eyes included, 91 (78.4%) fulfilled the criteria for success after LPI. Using the CC and SSIM index scores from this training set of ASOCT images, our algorithm predicted the success of LPI in eyes with narrow angles with 89.7% accuracy, 95.2% specificity, and 36.4% sensitivity based on pre-LPI ASOCT scans only. Using pre-LPI ASOCT scans, the proposed algorithm showed good accuracy in predicting the success of LPI for PACS eyes. This fully automated algorithm could aid decision making in offering LPI as a prophylactic treatment for PACS.
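A hedged sketch of the feature-extraction step, using scikit-image's SSIM as a stand-in for the paper's implementation; the angle patches are assumed to be already registered float arrays.

```python
import numpy as np
from skimage.metrics import structural_similarity

def angle_features(pre_patch, post_patch):
    """CC and SSIM between registered pre-/post-LPI angle patches."""
    cc = np.corrcoef(pre_patch.ravel(), post_patch.ravel())[0, 1]
    ssim = structural_similarity(
        pre_patch, post_patch,
        data_range=float(post_patch.max() - post_patch.min()))
    return cc, ssim     # inputs to the downstream success classifier
```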
Performance of the "CCS Algorithm" in real world patients.
LaHaye, Stephen A; Olesen, Jonas B; Lacombe, Shawn P
2015-06-01
With the publication of the 2014 Focused Update of the Canadian Cardiovascular Society Guidelines for the Management of Atrial Fibrillation, the Canadian Cardiovascular Society Atrial Fibrillation Guidelines Committee introduced a new triage and management algorithm, the so-called "CCS Algorithm". The CCS Algorithm is based upon expert opinion of the best available evidence; however, it has not yet been validated. Accordingly, the purpose of this study was to evaluate the performance of the CCS Algorithm in a cohort of real-world patients. We compared the CCS Algorithm with the European Society of Cardiology (ESC) Algorithm in 172 hospital inpatients at risk of stroke due to non-valvular atrial fibrillation in whom anticoagulant therapy was being considered. The CCS Algorithm and the ESC Algorithm were concordant in 170/172 patients (99% of the time). There were two patients (1%) with vascular disease, but no other thromboembolic risk factors, who were classified as requiring oral anticoagulant therapy by the ESC Algorithm, but for whom ASA was recommended by the CCS Algorithm. The CCS Algorithm appears to be unnecessarily complicated, insofar as it does not provide any additional discriminatory value beyond the ESC Algorithm, and its use could result in undertreatment of patients, specifically female patients with vascular disease, whose real risk of stroke has been understated by the Guidelines.
Bladder segmentation in MR images with watershed segmentation and graph cut algorithm
NASA Astrophysics Data System (ADS)
Blaffert, Thomas; Renisch, Steffen; Schadewaldt, Nicole; Schulz, Heinrich; Wiemker, Rafael
2014-03-01
Prostate and cervix cancer diagnosis and treatment planning that is based on MR images benefit from superior soft tissue contrast compared to CT images. For these images an automatic delineation of the prostate or cervix and the organs at risk such as the bladder is highly desirable. This paper describes a method for bladder segmentation that is based on a watershed transform on high image gradient values and gray value valleys together with the classification of watershed regions into bladder contents and tissue by a graph cut algorithm. The obtained results are superior if compared to a simple region-after-region classification.
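A minimal scikit-image sketch of the first stage: watershed on gradient magnitude with intensity-derived markers. The percentile thresholds are illustrative, and the paper's graph-cut classification of the resulting regions is not reproduced.

```python
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

def bladder_regions(image):
    """Label map of watershed regions seeded from intensity extremes."""
    gradient = sobel(image)
    markers = np.zeros(image.shape, dtype=int)
    markers[image > np.percentile(image, 95)] = 1   # bright bladder contents (T2-weighted)
    markers[image < np.percentile(image, 30)] = 2   # surrounding tissue
    return watershed(gradient, markers)
```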
A multi-block adaptive solving technique based on lattice Boltzmann method
NASA Astrophysics Data System (ADS)
Zhang, Yang; Xie, Jiahua; Li, Xiaoyue; Ma, Zhenghai; Zou, Jianfeng; Zheng, Yao
2018-05-01
In this paper, a parallel adaptive CFD algorithm is developed in-house by combining the multi-block Lattice Boltzmann Method (LBM) with Adaptive Mesh Refinement (AMR). The mesh refinement criterion of this algorithm is based on the density, velocity, and vorticity of the flow field. The refined grid boundary is obtained by extending outward half a ghost cell from the coarse grid boundary, which makes the adaptive mesh more compact and the boundary treatment more convenient. Two numerical examples, backward-facing step flow separation and unsteady flow around a circular cylinder, demonstrate that the method captures the vortex structure of the cold flow field accurately.
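The refinement criterion can be sketched as a simple flagging pass over the coarse grid; the thresholds and 2D field shapes here are illustrative assumptions, not the paper's values.

```python
import numpy as np

def refine_flags(rho, u, v, dx, thresholds=(0.05, 0.05, 0.5)):
    """Flag 2D cells where density/velocity gradients or vorticity exceed thresholds."""
    grad_rho = np.hypot(*np.gradient(rho, dx))
    grad_vel = np.hypot(*np.gradient(u, dx)) + np.hypot(*np.gradient(v, dx))
    vorticity = np.abs(np.gradient(v, dx, axis=1) - np.gradient(u, dx, axis=0))
    return ((grad_rho > thresholds[0]) | (grad_vel > thresholds[1])
            | (vorticity > thresholds[2]))      # True => refine this cell
```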
Bruyère, Olivier; Cooper, Cyrus; Pelletier, Jean-Pierre; Branco, Jaime; Luisa Brandi, Maria; Guillemin, Francis; Hochberg, Marc C; Kanis, John A; Kvien, Tore K; Martel-Pelletier, Johanne; Rizzoli, René; Silverman, Stuart; Reginster, Jean-Yves
2014-12-01
Existing practice guidelines for osteoarthritis (OA) analyze the evidence behind each proposed treatment but do not prioritize the interventions in a given sequence. The objective was to develop a treatment algorithm recommendation that is easier to interpret for the prescribing physician based on the available evidence and that is applicable in Europe and internationally. The knee was used as the model OA joint. ESCEO assembled a task force of 13 international experts (rheumatologists, clinical epidemiologists, and clinical scientists). Existing guidelines were reviewed; all interventions listed and recent evidence were retrieved using established databases. A first schematic flow chart with treatment prioritization was discussed in a 1-day meeting and shaped to the treatment algorithm. Fine-tuning occurred by electronic communication and three consultation rounds until consensus. Basic principles consist of the need for a combined pharmacological and non-pharmacological treatment with a core set of initial measures, including information access/education, weight loss if overweight, and an appropriate exercise program. Four multimodal steps are then established. Step 1 consists of background therapy, either non-pharmacological (referral to a physical therapist for re-alignment treatment if needed and sequential introduction of further physical interventions initially and at any time thereafter) or pharmacological. The latter consists of chronic Symptomatic Slow-Acting Drugs for OA (e.g., prescription glucosamine sulfate and/or chondroitin sulfate) with paracetamol at-need; topical NSAIDs are added in the still symptomatic patient. Step 2 consists of the advanced pharmacological management in the persistent symptomatic patient and is centered on the use of oral COX-2 selective or non-selective NSAIDs, chosen based on concomitant risk factors, with intra-articular corticosteroids or hyaluronate for further symptom relief if insufficient. In Step 3, the last pharmacological attempts before surgery are represented by weak opioids and other central analgesics. Finally, Step 4 consists of end-stage disease management and surgery, with classical opioids as a difficult-to-manage alternative when surgery is contraindicated. The proposed treatment algorithm may represent a new framework for the development of future guidelines for the management of OA, more easily accessible to physicians. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khalvati, Farzad, E-mail: farzad.khalvati@uwaterloo.ca; Tizhoosh, Hamid R.; Salmanpour, Aryan
2013-12-15
Purpose: Accurate segmentation and volume estimation of the prostate gland in magnetic resonance (MR) and computed tomography (CT) images are necessary steps in diagnosis, treatment, and monitoring of prostate cancer. This paper presents an algorithm for the prostate gland volume estimation based on the semiautomated segmentation of individual slices in T2-weighted MR and CT image sequences. Methods: The proposed Inter-Slice Bidirectional Registration-based Segmentation (iBRS) algorithm relies on interslice image registration of volume data to segment the prostate gland without the use of an anatomical atlas. It requires the user to mark only three slices in a given volume dataset, i.e., the first, middle, and last slices. Next, the proposed algorithm uses a registration algorithm to autosegment the remaining slices. We conducted comprehensive experiments to measure the performance of the proposed algorithm using three registration methods (i.e., rigid, affine, and nonrigid techniques). Results: The results with the proposed technique were compared with manual marking using prostate MR and CT images from 117 patients. Manual marking was performed by an expert user for all 117 patients. The median accuracies for individual slices measured using the Dice similarity coefficient (DSC) were 92% and 91% for MR and CT images, respectively. The iBRS algorithm was also evaluated regarding user variability, which confirmed that the algorithm was robust to interuser variability when marking the prostate gland. Conclusions: The proposed algorithm exploits the interslice data redundancy of the images in a volume dataset of MR and CT images and eliminates the need for an atlas, minimizing the computational cost while producing highly accurate results which are robust to interuser variability.
Network-based machine learning and graph theory algorithms for precision oncology.
Zhang, Wei; Chien, Jeremy; Yong, Jeongsik; Kuang, Rui
2017-01-01
Network-based analytics plays an increasingly important role in precision oncology. Growing evidence in recent studies suggests that cancer can be better understood through mutated or dysregulated pathways or networks rather than individual mutations and that the efficacy of repositioned drugs can be inferred from disease modules in molecular networks. This article reviews network-based machine learning and graph theory algorithms for integrative analysis of personal genomic data and biomedical knowledge bases to identify tumor-specific molecular mechanisms, candidate targets and repositioned drugs for personalized treatment. The review focuses on the algorithmic design and mathematical formulation of these methods to facilitate applications and implementations of network-based analysis in the practice of precision oncology. We review the methods applied in three scenarios to integrate genomic data and network models in different analysis pipelines, and we examine three categories of network-based approaches for repositioning drugs in drug-disease-gene networks. In addition, we perform a comprehensive subnetwork/pathway analysis of mutations in 31 cancer genome projects in the Cancer Genome Atlas and present a detailed case study on ovarian cancer. Finally, we discuss interesting observations, potential pitfalls and future directions in network-based precision oncology.
Development, Comparisons and Evaluation of Aerosol Retrieval Algorithms
NASA Astrophysics Data System (ADS)
de Leeuw, G.; Holzer-Popp, T.; Aerosol-cci Team
2011-12-01
The Climate Change Initiative (cci) of the European Space Agency (ESA) has brought together a team of European aerosol retrieval groups working on the development and improvement of aerosol retrieval algorithms. The goal of this cooperation is the development of methods to provide the best possible information on climate and climate change based on satellite observations. To achieve this, algorithms are characterized in detail as regards the retrieval approaches, the aerosol models used in each algorithm, cloud detection, and surface treatment. A round-robin intercomparison of results from the various participating algorithms serves to identify the best modules or combinations of modules for each sensor. Annual global datasets including their uncertainties will then be produced and validated. The project builds on 9 existing algorithms to produce spectral aerosol optical depth (AOD and Ångström exponent) as well as other aerosol information; two instruments are included to provide the absorbing aerosol index (AAI) and stratospheric aerosol information. The algorithms included are: - 3 for ATSR (ORAC developed by RAL/Oxford University, ADV developed by FMI, and the SU algorithm developed by Swansea University) - 2 for MERIS (BAER by Bremen University and the ESA standard handled by HYGEOS) - 1 for POLDER over ocean (LOA) - 1 for synergetic retrieval (SYNAER by DLR) - 1 for OMI retrieval of the absorbing aerosol index with averaging kernel information (KNMI) - 1 for GOMOS stratospheric extinction profile retrieval (BIRA) The first seven algorithms aim at the retrieval of the AOD. However, the algorithms differ in their approach, even those working with the same instrument, such as ATSR or MERIS. To analyse the strengths and weaknesses of each algorithm, several tests are made. The starting point for comparison and measurement of improvements is a retrieval run for 1 month, September 2008. The data from the same month are subsequently used for several runs with a prescribed set of aerosol models and an a priori data set derived from the median of AEROCOM model runs. The aerosol models and a priori data can be used in several ways, i.e., fully prescribed or with some freedom to choose a combination of aerosol models, based on the a priori or not. Another test gives insight into the effect of the cloud masks used: retrievals using the same cloud mask (the AATSR APOLLO cloud mask for collocated instruments) are compared with runs using the standard cloud masks. Tests to determine the influence of surface treatment are planned as well. The results of all these tests are evaluated by an independent team, which compares the retrieval results with ground-based remote sensing (in particular AERONET) and in-situ data, and by a scoring method. Results are compared with other satellites such as MODIS and MISR. Blind tests using synthetic data are part of the algorithm characterization. The presentation will summarize results of the ongoing Phase 1 intercomparison and evaluation work within the Aerosol_cci project.
Xu, Q; Yang, D; Tan, J; Anastasio, M
2012-06-01
To improve image quality and reduce imaging dose in CBCT for radiation therapy applications, and to realize near real-time image reconstruction based on a fast-convergence iterative algorithm accelerated by multiple GPUs. An iterative image reconstruction that minimizes a weighted least-squares cost function with total variation (TV) regularization was employed to mitigate projection data incompleteness and noise. To achieve rapid 3D image reconstruction (<1 min), a highly optimized multiple-GPU implementation of the algorithm was developed. The convergence rate and reconstruction accuracy were evaluated using a modified 3D Shepp-Logan digital phantom and a Catphan-600 physical phantom. The reconstructed images were compared with clinical FDK reconstruction results. Digital phantom studies showed that only 15 iterations and 60 iterations are needed to achieve algorithm convergence for the 360-view and 60-view cases, respectively. The RMSE was reduced to 10(-4) and 10(-2), respectively, by using 15 iterations for each case. Our algorithm required 5.4 s to complete one iteration for the 60-view case using one Tesla C2075 GPU. The few-view study indicated that our iterative algorithm has great potential to reduce the imaging dose while preserving good image quality. For the physical Catphan studies, the images obtained from the iterative algorithm possessed better spatial resolution and higher SNRs than those obtained by use of a clinical FDK reconstruction algorithm. We have developed a fast-convergence iterative algorithm for CBCT image reconstruction. The developed algorithm yielded images with better spatial resolution and higher SNR than those produced by a commercial FDK tool. In addition, the few-view study showed great potential for significantly reducing imaging dose. We expect that the developed reconstruction approach will facilitate applications including IGART and patient daily CBCT-based treatment localization. © 2012 American Association of Physicists in Medicine.
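A CPU-bound, dense-matrix caricature of one iteration of the weighted least-squares + TV objective: here A, b, and W are a toy system matrix, data vector, and weights, and the GPU-optimized forward/back-projectors of the paper are not reproduced.

```python
import numpy as np

def tv_grad(img, eps=1e-8):
    """Gradient of a smoothed isotropic TV term for a 2D image."""
    dx = np.diff(img, axis=1, append=img[:, -1:])        # forward differences
    dy = np.diff(img, axis=0, append=img[-1:, :])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    px, py = dx / mag, dy / mag
    return -(np.diff(px, axis=1, prepend=0) + np.diff(py, axis=0, prepend=0))

def wls_tv_step(x, A, b, W, lam=0.02, step=5e-4):
    """One gradient-descent update on 0.5*||Ax - b||_W^2 + lam*TV(x)."""
    r = A @ x.ravel() - b
    data_grad = (A.T @ (W * r)).reshape(x.shape)
    return x - step * (data_grad + lam * tv_grad(x))

rng = np.random.default_rng(0)
x_true = rng.random((8, 8))
A = rng.random((100, 64)); W = np.ones(100); b = A @ x_true.ravel()
x = np.zeros((8, 8))
for _ in range(200):
    x = wls_tv_step(x, A, b, W)
```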
NASA Astrophysics Data System (ADS)
Sanchez-Parcerisa, D.; Cortés-Giraldo, M. A.; Dolney, D.; Kondrla, M.; Fager, M.; Carabe, A.
2016-02-01
In order to integrate radiobiological modelling with clinical treatment planning for proton radiotherapy, we extended our in-house treatment planning system FoCa with a 3D analytical algorithm to calculate linear energy transfer (LET) in voxelized patient geometries. Both active scanning and passive scattering delivery modalities are supported. The analytical calculation is much faster than the Monte-Carlo (MC) method and it can be implemented in the inverse treatment planning optimization suite, allowing us to create LET-based objectives in inverse planning. The LET was calculated by combining a 1D analytical approach including a novel correction for secondary protons with pencil-beam type LET-kernels. Then, these LET kernels were inserted into the proton-convolution-superposition algorithm in FoCa. The analytical LET distributions were benchmarked against MC simulations carried out in Geant4. A cohort of simple phantom and patient plans representing a wide variety of sites (prostate, lung, brain, head and neck) was selected. The calculation algorithm was able to reproduce the MC LET to within 6% (1 standard deviation) for low-LET areas (under 1.7 keV μm-1) and within 22% for the high-LET areas above that threshold. The dose and LET distributions can be further extended, using radiobiological models, to include radiobiological effectiveness (RBE) calculations in the treatment planning system. This implementation also allows for radiobiological optimization of treatments by including RBE-weighted dose constraints in the inverse treatment planning process.
Wright, Gavin; Harrold, Natalie; Bownes, Peter
2018-01-01
Aims: To compare the accuracy of the convolution and TMR10 Gamma Knife treatment planning algorithms and assess the impact upon clinical practice of implementing convolution-based treatment planning. Methods: Doses calculated by both algorithms were compared against ionisation chamber measurements in homogeneous and heterogeneous phantoms. Relative dose distributions calculated by both algorithms were compared against film-derived 2D isodose plots in a heterogeneous phantom, with distance-to-agreement (DTA) measured at the 80%, 50% and 20% isodose levels. A retrospective planning study compared 19 clinically acceptable metastasis convolution plans against TMR10 plans with matched shot times, allowing a novel comparison of true dosimetric parameters rather than total beam-on time. Gamma analysis and dose-difference analysis were performed on each pair of dose distributions. Results: Both algorithms matched point dose measurements within ±1.1% in homogeneous conditions. Convolution provided superior point-dose accuracy in the heterogeneous phantom (-1.1% vs 4.0%), with no discernible differences in relative dose distribution accuracy. In our study, convolution-calculated plans yielded a D99% that was 6.4% (95% CI: 5.5%-7.3%, p<0.001) lower than that of shot-matched TMR10 plans. For gamma passing criteria of 1%/1 mm, 16% of targets had passing rates >95%. The range of dose differences in the targets was 0.2-4.6 Gy. Conclusions: Convolution provides superior accuracy versus TMR10 in heterogeneous conditions. Implementing convolution would result in increased target doses; therefore, its implementation may require a re-evaluation of prescription doses. PMID:29657896
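A brute-force 1D version of the gamma computation conveys the idea; clinical tools interpolate and work in 3D, so this is only a sketch, with the paper's 1%/1 mm criteria as defaults.

```python
import numpy as np

def gamma_1d(pos, ref, eval_dose, dd=0.01, dta=1.0):
    """Per-point global gamma for dose-difference dd (fraction of max) and DTA (mm)."""
    norm = dd * ref.max()
    g = np.empty_like(ref)
    for i, (p, d) in enumerate(zip(pos, ref)):
        dose_term = (eval_dose - d) / norm
        dist_term = (pos - p) / dta
        g[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return g   # pass rate = np.mean(gamma_1d(...) <= 1)
```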
Hu, Chen; Steingrimsson, Jon Arni
2018-01-01
A crucial component of making individualized treatment decisions is to accurately predict each patient's disease risk. In clinical oncology, disease risks are often measured through time-to-event data, such as overall survival and progression/recurrence-free survival, and are often subject to censoring. Risk prediction models based on recursive partitioning methods are becoming increasingly popular largely due to their ability to handle nonlinear relationships, higher-order interactions, and/or high-dimensional covariates. The most popular recursive partitioning methods are versions of the Classification and Regression Tree (CART) algorithm, which builds a simple interpretable tree structured model. With the aim of increasing prediction accuracy, the random forest algorithm averages multiple CART trees, creating a flexible risk prediction model. Risk prediction models used in clinical oncology commonly use both traditional demographic and tumor pathological factors as well as high-dimensional genetic markers and treatment parameters from multimodality treatments. In this article, we describe the most commonly used extensions of the CART and random forest algorithms to right-censored outcomes. We focus on how they differ from the methods for noncensored outcomes, and how the different splitting rules and methods for cost-complexity pruning impact these algorithms. We demonstrate these algorithms by analyzing a randomized Phase III clinical trial of breast cancer. We also conduct Monte Carlo simulations to compare the prediction accuracy of survival forests with more commonly used regression models under various scenarios. These simulation studies aim to evaluate how sensitive the prediction accuracy is to the underlying model specifications, the choice of tuning parameters, and the degrees of missing covariates.
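The log-rank splitting rule common to survival-tree extensions of CART can be written compactly; a pure-NumPy sketch of the statistic for one candidate split, where a larger |z| means the split better separates the children's survival experience:

```python
import numpy as np

def logrank_stat(time, event, in_left):
    """time: follow-up times; event: 1=event, 0=censored; in_left: boolean split."""
    num, den = 0.0, 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n, n1 = at_risk.sum(), (at_risk & in_left).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & in_left).sum()
        num += d1 - d * n1 / n                               # observed - expected in left child
        if n > 1:
            den += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return num / np.sqrt(den) if den > 0 else 0.0
```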
Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.
Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi
2013-01-01
The abundance of gene expression microarray data has led to the development of machine learning algorithms applicable to disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces the fuzzy support vector machine, a learning algorithm based on the combination of fuzzy classifiers and kernel machines, for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that the fuzzy support vector machine, applied in combination with filter or wrapper feature selection methods, develops a robust model with higher accuracy than conventional microarray classification models such as the support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule base inferred from the fuzzy support vector machine helps extract biological knowledge from microarray data. The fuzzy support vector machine, as a new classification model with high generalization power, robustness, and good interpretability, seems to be a promising tool for gene expression microarray classification.
Optimization-based reconstruction for reduction of CBCT artifact in IGRT
NASA Astrophysics Data System (ADS)
Xia, Dan; Zhang, Zheng; Paysan, Pascal; Seghers, Dieter; Brehm, Marcus; Munro, Peter; Sidky, Emil Y.; Pelizzari, Charles; Pan, Xiaochuan
2016-04-01
Kilo-voltage cone-beam computed tomography (CBCT) plays an important role in image-guided radiation therapy (IGRT) by providing 3D spatial information about the tumor that is potentially useful for optimizing treatment planning. In current IGRT CBCT systems, reconstructed images obtained with analytic algorithms, such as the FDK algorithm and its variants, may contain artifacts. In an attempt to compensate for these artifacts, we investigate optimization-based reconstruction algorithms such as the ASD-POCS algorithm for potentially reducing artifacts in IGRT CBCT images. In this study, using data acquired with a physical phantom and a patient subject, we demonstrate that ASD-POCS reconstruction can significantly reduce artifacts observed in clinical reconstructions. Moreover, patient images reconstructed by use of the ASD-POCS algorithm indicate a soft-tissue contrast level improved over that of the clinical reconstruction. We have also performed reconstructions from sparse-view data and observe that, for current clinical imaging conditions, ASD-POCS reconstructions from data collected at one half of the current clinical projection views appear to show image quality, in terms of spatial and soft-tissue-contrast resolution, higher than that of the corresponding clinical reconstructions.
PCA-based artifact removal algorithm for stroke detection using UWB radar imaging.
Ricci, Elisa; di Domenico, Simone; Cianca, Ernestina; Rossi, Tommaso; Diomedi, Marina
2017-06-01
Stroke patients should be dispatched at the highest level of care available in the shortest time. In this context, a transportable system in specialized ambulances, able to evaluate the presence of an acute brain lesion in a short time interval (i.e., a few minutes), could shorten the delay of treatment. UWB radar imaging is an emerging diagnostic branch with great potential for the implementation of a transportable and low-cost device. Transportability, low cost, and short response time pose challenges to the signal processing algorithms for the backscattered signals, as they should guarantee good performance with a reasonably low number of antennas and low computational complexity, which is tightly related to the response time of the device. The paper shows that a PCA-based preprocessing algorithm can: (1) achieve good performance already with a computationally simple beamforming algorithm; (2) outperform state-of-the-art preprocessing algorithms; (3) enable a further improvement in performance (and/or a decrease in the number of antennas) by using a multistatic approach with just a modest increase in computational complexity. This is an important result toward the implementation of such a diagnostic device, which could play an important role in emergency scenarios.
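The core of such a PCA-based preprocessing step can be sketched in a few lines: the backscattered signals are stacked per antenna and the leading principal components, which are dominated by the coupling/skin artifact common to all channels, are removed. The number of removed components is an assumption of this sketch.

```python
import numpy as np

def remove_artifact(signals, n_components=1):
    """signals: (n_antennas, n_samples) array; returns artifact-reduced copy."""
    mean = signals.mean(axis=0, keepdims=True)
    u, s, vt = np.linalg.svd(signals - mean, full_matrices=False)
    s[:n_components] = 0.0                      # drop the strongest common modes
    return u @ np.diag(s) @ vt + mean
```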
A link prediction approach to cancer drug sensitivity prediction.
Turki, Turki; Wei, Zhi
2017-10-03
Predicting the response to a drug for cancer patients based on genomic information is an important problem in modern clinical oncology. This problem arises in part because many available drug sensitivity prediction algorithms do not consider better-quality cancer cell lines or the adoption of new feature representations, both of which lead to more accurate prediction of drug responses. By accurately predicting drug responses, oncologists gain a more complete understanding of the effective treatments for each patient, which is a core goal of precision medicine. In this paper, we model cancer drug sensitivity as a link prediction problem, which is shown to be an effective technique. We evaluate our proposed link prediction algorithms and compare them with an existing drug sensitivity prediction approach based on clinical trial data. The experimental results based on the clinical trial data show the stability of our link prediction algorithms, which yield the highest area under the ROC curve (AUC) and are statistically significant. We propose a link prediction approach to obtain a new feature representation. Compared with the existing approach, the results show that incorporating the new feature representation into the link prediction algorithms significantly improves performance.
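One hedged way to cast the problem as link prediction on a bipartite drug/cell-line graph is to score an unobserved pair by the similarity of the drug to drugs already linked to that line; the toy graph and Jaccard score below are illustrative, not the paper's feature construction.

```python
import networkx as nx

b = nx.Graph()
b.add_edges_from([("drugA", "line1"), ("drugA", "line2"),
                  ("drugB", "line2"), ("drugB", "line3")])

def jaccard(u, v):
    """Jaccard similarity of two drugs' sensitive-line neighborhoods."""
    nu, nv = set(b[u]), set(b[v])
    return len(nu & nv) / len(nu | nv) if (nu | nv) else 0.0

def link_score(drug, line):
    """Score an unobserved (drug, line) pair via drugs already linked to the line."""
    return sum(jaccard(drug, other) for other in b[line] if other != drug)

link_score("drugA", "line3")   # -> 1/3, transferred via drugB's shared line2
```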
Social network fragmentation and community health.
Chami, Goylette F; Ahnert, Sebastian E; Kabatereine, Narcis B; Tukahebwa, Edridah M
2017-09-05
Community health interventions often seek to intentionally destroy paths between individuals to prevent the spread of infectious diseases. Immunizing individuals through direct vaccination or the provision of health education prevents pathogen transmission and the propagation of misinformation concerning medical treatments. However, it remains an open question whether network-based strategies should be used in place of conventional field approaches to target individuals for medical treatment in low-income countries. We collected complete friendship and health advice networks in 17 rural villages of Mayuge District, Uganda. Here we show that acquaintance algorithms, i.e., selecting neighbors of randomly selected nodes, were systematically more efficient in fragmenting all networks than targeting well-established community roles, i.e., health workers, village government members, and schoolteachers. Additionally, community roles were not good proxy indicators of physical proximity to other households or connections to many sick people. We also show that acquaintance algorithms were effective in offsetting potential noncompliance with deworming treatments for 16,357 individuals during mass drug administration (MDA). Health advice networks were destroyed more easily than friendship networks. Only an average of 32% of nodes were removed from health advice networks to reduce the percentage of nodes at risk for refusing treatment in MDA to below 25%. Treatment compliance of at least 75% is needed in MDA to control human morbidity attributable to parasitic worms and progress toward elimination. Our findings point toward the potential use of network-based approaches as an alternative to role-based strategies for targeting individuals in rural health interventions.
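The acquaintance strategy is straightforward to sketch with networkx: pick a random node, remove one of its random neighbors, repeat, and track the largest connected component. The removal fraction is an illustrative parameter, not the study's.

```python
import random
import networkx as nx

def acquaintance_fragmentation(g, fraction=0.3, seed=0):
    """Remove random neighbors of random nodes; return largest-component size."""
    rng = random.Random(seed)
    g = g.copy()
    target = int(fraction * g.number_of_nodes())
    removed = 0
    while removed < target and g.number_of_edges() > 0:
        node = rng.choice(list(g.nodes))
        nbrs = list(g.neighbors(node))
        if nbrs:
            g.remove_node(rng.choice(nbrs))   # random neighbors are high-degree on average
            removed += 1
    return max(len(c) for c in nx.connected_components(g))

acquaintance_fragmentation(nx.erdos_renyi_graph(200, 0.05))
```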
NASA Astrophysics Data System (ADS)
Woon, Y. L.; Heng, S. P.; Wong, J. H. D.; Ung, N. M.
2016-03-01
Inhomogeneity correction is recommended for accurate dose calculation in radiotherapy treatment planning, since the human body is highly inhomogeneous due to the presence of bones and air cavities. However, each dose calculation algorithm has its own limitations. This study assesses the accuracy of five algorithms currently implemented for treatment planning: pencil beam convolution (PBC), superposition (SP), anisotropic analytical algorithm (AAA), Monte Carlo (MC), and Acuros XB (AXB). The calculated dose was compared with the dose measured using radiochromic film (Gafchromic EBT2) in inhomogeneous phantoms. In addition, the dosimetric impact of the different algorithms on intensity modulated radiotherapy (IMRT) was studied for the head and neck region. MC had the best agreement with the measured percentage depth dose (PDD) within the inhomogeneous region, followed by AXB, AAA, SP, and PBC. For IMRT planning, the MC algorithm is recommended for treatment planning in preference to PBC and SP. The MC and AXB algorithms were found to have better accuracy in terms of inhomogeneity correction and should be used for tumour volumes in the proximity of inhomogeneous structures.
Mohammad, Othman; Osser, David N
2014-01-01
This new algorithm for the pharmacotherapy of acute mania was developed by the Psychopharmacology Algorithm Project at the Harvard South Shore Program. The authors conducted a literature search in PubMed and reviewed key studies, other algorithms and guidelines, and their references. Treatments were prioritized considering three main considerations: (1) effectiveness in treating the current episode, (2) preventing potential relapses to depression, and (3) minimizing side effects over the short and long term. The algorithm presupposes that clinicians have made an accurate diagnosis, decided how to manage contributing medical causes (including substance misuse), discontinued antidepressants, and considered the patient's childbearing potential. We propose different algorithms for mixed and nonmixed mania. Patients with mixed mania may be treated first with a second-generation antipsychotic, of which the first choice is quetiapine because of its greater efficacy for depressive symptoms and episodes in bipolar disorder. Valproate and then either lithium or carbamazepine may be added. For nonmixed mania, lithium is the first-line recommendation. A second-generation antipsychotic can be added. Again, quetiapine is favored, but if quetiapine is unacceptable, risperidone is the next choice. Olanzapine is not considered a first-line treatment due to its long-term side effects, but it could be second-line. If the patient, whether mixed or nonmixed, is still refractory to the above medications, then depending on what has already been tried, consider carbamazepine, haloperidol, olanzapine, risperidone, and valproate first tier; aripiprazole, asenapine, and ziprasidone second tier; and clozapine third tier (because of its weaker evidence base and greater side effects). Electroconvulsive therapy may be considered at any point in the algorithm if the patient has a history of positive response or is intolerant of medications.
TH-E-BRE-04: An Online Replanning Algorithm for VMAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahunbay, E; Li, X; Moreau, M
2014-06-15
Purpose: To develop a fast replanning algorithm based on segment aperture morphing (SAM) for online replanning of volumetric modulated arc therapy (VMAT) with flattening-filtered (FF) and flattening-filter-free (FFF) beams. Methods: A software tool was developed to interface with a VMAT planning system (Monaco, Elekta), enabling the output of detailed beam/machine parameters of original VMAT plans generated from planning CTs for FF or FFF beams. A SAM algorithm, previously developed for fixed-beam IMRT, was modified to correct for interfractional variations (e.g., setup error, organ motion, and deformation) by morphing apertures based on the geometric relationship between the beam's eye view of the anatomy from the planning CT and that from the daily CT for each control point. The algorithm was tested using daily CTs acquired with an in-room CT during daily IGRT for representative prostate cancer cases, along with their planning CTs. The algorithm allows the MLC leaf travel distance between control points of the VMAT delivery to be restricted, to prevent SAM from increasing leaf travel and therefore treatment delivery time. Results: The VMAT plans adapted to the daily CT by SAM were found to improve the dosimetry relative to the IGRT repositioning plans for both FF and FFF beams. For the adaptive plans, the changes in leaf travel distance between control points were <1 cm for 80% of the control points with no restriction. When restricted to the original plans' maximum travel distance, the dosimetric effect was minimal. The adaptive plans were delivered successfully with delivery times similar to those of the original plans. The execution of the SAM algorithm took <10 seconds. Conclusion: The SAM algorithm can quickly generate deliverable online-adaptive VMAT plans based on the anatomy of the day for both FF and FFF beams.
Automatic Tracking Algorithm in Coaxial Near-Infrared Laser Ablation Endoscope for Fetus Surgery
NASA Astrophysics Data System (ADS)
Hu, Yan; Yamanaka, Noriaki; Masamune, Ken
2014-07-01
This article reports a stable vessel-tracking method for the treatment of twin-to-twin transfusion syndrome based on our previous 2-DOF endoscope. During laser coagulation treatment, it is necessary to focus on the exact position of the target object; however, the target moves with the mother's respiratory motion, and obtaining and tracking its position precisely remains a challenge. In this article, an algorithm that uses features from accelerated segment test (FAST) for feature extraction and optical flow for object tracking is proposed to deal with this problem. Further, we experimentally simulate the movement due to the mother's respiration, and the results for position error and similarity verify the effectiveness of the proposed tracking algorithm for laser ablation endoscopy in vitro and under water, considering two influential factors. On average, errors of about 10 pixels and similarities over 0.92 were obtained in the experiments.
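A hedged OpenCV sketch of the described tracker: FAST keypoints on a reference frame, then pyramidal Lucas-Kanade optical flow to follow them into the next frame. The FAST threshold is illustrative, and the inputs are assumed to be grayscale uint8 frames.

```python
import cv2
import numpy as np

def track(prev_gray, next_gray):
    """Return matched point pairs (p0, p1) giving the vessel's frame-to-frame motion."""
    fast = cv2.FastFeatureDetector_create(threshold=25)
    kps = fast.detect(prev_gray, None)
    p0 = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None)
    good = status.ravel() == 1
    return p0[good], p1[good]
```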
Strzelczyk, Adam; Ansorge, Sonja; Hapfelmeier, Jana; Bonthapally, Vijayveer; Erder, M Haim; Rosenow, Felix
2017-09-01
Super-refractory status epilepticus (SRSE) is a severe condition in which a patient in status epilepticus (SE) for ≥24 h does not respond to first-, second-, or third-line therapy. The economic impact of SRSE treatment remains unclear. A health insurance research database was used for a population-based estimation of SRSE-associated inpatient costs, length of stay, and mortality in Germany. An algorithm using International Classification of Diseases, 10th Edition coding and treatment parameters identified and classified patients in a German statutory health insurance database covering admissions from 2008 to 2013 as having refractory SE (RSE) or SRSE. Admissions data in our study refer to these classifications. Associated patient data included costs, procedures, and demographics. The algorithm identified 2,585 (all type) SE admissions, classified as 1,655 nonrefractory SE (64%), 592 (22.9%) RSE, and 338 (13.1%) SRSE, producing database incidence rates of 15.0 in 100,000, 5.2 in 100,000, and 3.0 in 100,000 per year, respectively. Median cost per admission was €4,063 for nonrefractory SE, €4,581 (p < 0.001) for RSE, and €32,706 (p < 0.001) for SRSE. Median length of stay varied significantly between 8 days (mean = 13.6) in nonrefractory SE, 14 days in RSE, and up to 37 days in SRSE. Discharge mortality increased from 9.6% in nonrefractory SE to 15.0% (p < 0.001) in RSE and 39.9% (p < 0.001) in SRSE. This study evaluated the hospital treatment costs associated with admissions classified by the algorithm as SRSE in Germany. SRSE represented 13% of all SE admissions, but resulted in 56% of all SE-related costs. The lack of approved treatments and limited number of evidence-based treatment guidelines highlight the need for further evaluations of the SRSE burden of illness and the potential for further optimization of treatments for SRSE. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
Gupta, Sumit; Nathan, Paul C; Baxter, Nancy N; Lau, Cindy; Daly, Corinne; Pole, Jason D
2018-06-01
Despite the importance of estimating population level cancer outcomes, most registries do not collect critical events such as relapse. Attempts to use health administrative data to identify these events have focused on older adults and have been mostly unsuccessful. We developed and tested administrative data-based algorithms in a population-based cohort of adolescents and young adults with cancer. We identified all Ontario adolescents and young adults 15-21 years old diagnosed with leukemia, lymphoma, sarcoma, or testicular cancer between 1992-2012. Chart abstraction determined the end of initial treatment (EOIT) date and subsequent cancer-related events (progression, relapse, second cancer). Linkage to population-based administrative databases identified fee and procedure codes indicating cancer treatment or palliative care. Algorithms determining EOIT based on a time interval free of treatment-associated codes, and new cancer-related events based on billing codes, were compared with chart-abstracted data. The cohort comprised 1404 patients. Time periods free of treatment-associated codes did not validly identify EOIT dates; using subsequent codes to identify new cancer events was thus associated with low sensitivity (56.2%). However, using administrative data codes that occurred after the EOIT date based on chart abstraction, the first cancer-related event was identified with excellent validity (sensitivity, 87.0%; specificity, 93.3%; positive predictive value, 81.5%; negative predictive value, 95.5%). Although administrative data alone did not validly identify cancer-related events, administrative data in combination with chart collected EOIT dates was associated with excellent validity. The collection of EOIT dates by cancer registries would significantly expand the potential of administrative data linkage to assess cancer outcomes.
Prepatellar and olecranon bursitis: literature review and development of a treatment algorithm.
Baumbach, Sebastian F; Lobo, Christopher M; Badyine, Ilias; Mutschler, Wolf; Kanz, Karl-Georg
2014-03-01
Olecranon bursitis and prepatellar bursitis are common entities, with a minimum annual incidence of 10/100,000, predominantly affecting male patients (80 %) aged 40-60 years. Approximately 1/3 of cases are septic (SB) and 2/3 of cases are non-septic (NSB), with substantial variations in treatment regimens internationally. The aim of the study was the development of a literature review-based treatment algorithm for prepatellar and olecranon bursitis. Following a systematic review of PubMed, the Cochrane Library, textbooks of emergency medicine and surgery, and a manual reference search, 52 relevant papers were identified. The initial differentiation between SB and NSB was based on clinical presentation, bursal aspirate, and blood sampling analysis. Physical findings suggesting SB were fever >37.8 °C, prebursal temperature difference greater than 2.2 °C, and skin lesions. Relevant findings for bursal aspirate were purulent aspirate, fluid-to-serum glucose ratio <50 %, white cell count >3,000 cells/μl, polymorphonuclear cells >50 %, positive Gram staining, and positive culture. General treatment measures for SB and NSB consist of bursal aspiration, NSAIDs, and PRICE. For patients with confirmed NSB and high athletic or occupational demands, intrabursal steroid injection may be performed. In the case of SB, antibiotic therapy should be initiated. Surgical treatment, i.e., incision, drainage, or bursectomy, should be restricted to severe, refractory, or chronic/recurrent cases. The available evidence did not support the central European concept of immediate bursectomy in cases of SB. A conservative treatment regimen should be pursued, following bursal aspirate-based differentiation between SB and NSB.
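The septic/non-septic differentiation criteria above lend themselves to a simple rule check. The sketch below encodes the quoted thresholds with hypothetical field names; it is illustrative only, since real use requires the full clinical work-up.

```python
# Sketch of the aspirate/clinical findings suggestive of septic bursitis,
# using the thresholds quoted above. Field names are hypothetical.
def septic_bursitis_findings(fever_c, prebursal_delta_c, purulent,
                             glucose_ratio, wbc_per_ul, pmn_fraction,
                             gram_positive, culture_positive):
    findings = {
        "fever >37.8 C": fever_c > 37.8,
        "prebursal temperature difference >2.2 C": prebursal_delta_c > 2.2,
        "purulent aspirate": purulent,
        "fluid-to-serum glucose ratio <50%": glucose_ratio < 0.5,
        "white cell count >3,000/uL": wbc_per_ul > 3000,
        "polymorphonuclear cells >50%": pmn_fraction > 0.5,
        "positive Gram staining": gram_positive,
        "positive culture": culture_positive,
    }
    # Return the list of findings that point towards septic bursitis (SB).
    return [name for name, present in findings.items() if present]
```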
Algorithm for the treatment of type 2 diabetes: a position statement of Brazilian Diabetes Society.
Lerario, Antonio C; Chacra, Antonio R; Pimazoni-Netto, Augusto; Malerbi, Domingos; Gross, Jorge L; Oliveira, José Ep; Gomes, Marilia B; Santos, Raul D; Fonseca, Reine Mc; Betti, Roberto; Raduan, Roberto
2010-06-08
The Brazilian Diabetes Society is starting an innovative project of quantitative assessment of medical arguments and implementing a new way of elaborating SBD position statements. The final aim of this particular project is to propose a new Brazilian algorithm for the treatment of type 2 diabetes, based on the opinions of endocrinologists surveyed in a poll conducted on the Brazilian Diabetes Society website regarding the latest algorithm proposed by the American Diabetes Association/European Association for the Study of Diabetes, published in January 2009. An additional source used as a basis for the new algorithm was an assessment of the acceptability of controversial arguments published in the international literature, through a panel of renowned Brazilian specialists. Thirty controversial arguments in diabetes were selected with their respective references, and each argument was assessed and scored according to its acceptability level and the personal conviction of each member of the evaluation panel. This methodology was adapted using an approach similar to the one adopted in the recent position statement by the American College of Cardiology on coronary revascularization, in which not only cardiologists took part but also specialists of other related areas.
MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmad, M; Nourzadeh, H; Neal, B
2016-06-15
Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them against EPID frames (1024×768 resolution) acquired at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) whether radiation exists outside of the desired treatment field; (2) whether radiation is lacking inside the treatment field; (3) whether translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and using images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software package, with an EPID image prediction algorithm, was developed. The system is capable of performing verifications between frame acquisitions and identifying the source(s) of any out-of-tolerance variations. This work was supported in part by Varian Medical Systems.
SU-F-T-261: Reconstruction of Initial Photon Fluence Based On EPID Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seliger, T; Engenhart-Cabillic, R; Czarnecki, D
2016-06-15
Purpose: To verify an algorithm that reconstructs the relative initial photon fluence for clinical use. Clinical EPID and CT images were acquired to reconstruct an external photon radiation treatment field. The reconstructed initial photon fluence could be used to verify the treatment or to calculate the dose applied to the patient. Methods: The acquired EPID images were corrected for scatter caused by the patient and the EPID with an iterative reconstruction algorithm. The transmitted photon fluence behind the patient was calculated subsequently. Based on the transmitted fluence, the initial photon fluence was calculated using a back-projection algorithm which takes the patient geometry and its energy-dependent linear attenuation into account. This attenuation was obtained from the acquired cone-beam CT or the planning CT by calculating a water-equivalent radiological thickness for each irradiation direction. To verify the algorithm, an inhomogeneous phantom consisting of three inhomogeneities was irradiated by a static 6 MV photon field and compared to a reference flood-field image. Results: The mean deviation between the reconstructed relative photon fluence for the inhomogeneous phantom and the flood-field EPID image was 3%, rising up to 7% for off-axis fluence. This was probably caused by the clinical EPID calibration used, which flattens the inhomogeneous fluence profile of the beam. Conclusion: In this clinical experiment the algorithm achieved good results in the center of the field while showing high deviation of the lateral fluence. This could be reduced by optimizing the EPID calibration, considering the off-axis differential energy response. In further work, this and other aspects of the EPID, e.g. field size dependency, CT and dose calibration, have to be studied to achieve a clinically acceptable accuracy of 2%.
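The back-projection step can be illustrated as a Beer-Lambert correction. The sketch below assumes a scatter-corrected transmitted fluence map and a precomputed water-equivalent radiological thickness map; the attenuation coefficient is a rough effective value for a 6 MV spectrum in water, not the paper's calibrated model.

```python
# Sketch: recover the relative initial fluence from the scatter-corrected
# transmitted fluence by dividing out the modeled patient transmission
# along each ray (Beer-Lambert law). mu_eff is a rough effective 6 MV
# attenuation coefficient for water (1/cm), an assumption for illustration.
import numpy as np

def initial_fluence(transmitted, radiological_thickness_cm, mu_eff=0.05):
    """transmitted, radiological_thickness_cm: 2D arrays in the EPID plane;
    the radiological thickness is water-equivalent, in cm."""
    return transmitted * np.exp(mu_eff * radiological_thickness_cm)
```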
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kieselmann, J; Bartzsch, S; Oelfke, U
Purpose: Microbeam Radiation Therapy is a preclinical method in radiation oncology that modulates radiation fields on a micrometre scale. Dose calculation is challenging due to the arising dose gradients and therapeutically important dose ranges. Monte Carlo (MC) simulations, often used as the gold standard, are computationally expensive and hence too slow for the optimisation of treatment parameters in future clinical applications. On the other hand, conventional kernel-based dose calculation leads to inaccurate results close to material interfaces. The purpose of this work is to overcome these inaccuracies while keeping computation times low. Methods: A point kernel superposition algorithm is modified to account for tissue inhomogeneities. Instead of conventional ray tracing approaches, methods from differential geometry are applied and the space around the primary photon interaction is locally warped. The performance of this approach is compared to MC simulations and a simple convolution algorithm (CA) for two different phantoms and photon spectra. Results: While peak doses of all dose calculation methods agreed within less than 4% deviation, the proposed approach surpassed the simple convolution algorithm in accuracy by a factor of up to 3 in the scatter dose. In a treatment geometry similar to possible future clinical situations, differences between Monte Carlo and the differential geometry algorithm were less than 3%. At the same time the calculation time did not exceed 15 minutes. Conclusion: With the developed method it was possible to improve dose calculation accuracy relative to the CA method, especially at sharp tissue boundaries. While the calculation is more extensive than for the CA method and depends on field size, the typical calculation time for a 20×20 mm² field on a 3.4 GHz processor with 8 GB RAM remained below 15 minutes. Parallelisation and optimisation of the algorithm could lead to further significant reductions in calculation time.
Decision-making for complex scapula and ipsilateral clavicle fractures: a review.
Hess, Florian; Zettl, Ralph; Smolen, Daniel; Knoth, Christoph
2018-03-23
Complex scapula fractures with ipsilateral clavicle fractures remain a challenge, and treatment recommendations are still lacking. This review provides an overview of the evolution of the definition, classification, and treatment strategies for complex scapula and ipsilateral clavicle fractures. As with other rare conditions, consensus has not been reached on the most suitable management strategies for these patients. The aim of this review is twofold: to compile and summarize the currently available literature on this topic, and to recommend treatment approaches. Included in the review are the following topics: biomechanics of scapula and ipsilateral clavicle fractures, preoperative radiological evaluation, surgical treatment of the clavicle only, surgical treatment of both the clavicle and scapula, and nonsurgical treatment options. A decision-making algorithm is proposed for different treatment strategies based on preoperative parameters, and an example of a case treated at our institution is presented to illustrate use of the algorithm. The role of instability in complex scapula fractures with ipsilateral clavicle fractures remains unclear. The question of stability is preoperatively less relevant than the question of whether the dislocated fragments lead to compromised shoulder function.
Brass, Eric P; Vassil, Theodore; Replogle, Amy; Hwang, Peggy; Rusche, Steven; Shiffman, Saul; Levine, Jeffrey G
2008-05-15
Access to over-the-counter (OTC) statins has the potential to improve public health by reducing cardiovascular events. The Self Evaluation of Lovastatin to Enhance Cholesterol Treatment (SELECT) Study was designed to assess consumers' ability to self-select for treatment with lovastatin in an unsupervised setting. Subjects examined proposed OTC lovastatin cartons with labels that detailed an algorithm for self-selection based on age, lipid profile, and cardiovascular risk factors. Subjects viewed a carton with either a low-density lipoprotein cholesterol-based self-selection algorithm or one based on total cholesterol. Labels also contained warnings against use based on health conditions that might increase the risk of adverse events. Subjects were asked if the drug was appropriate for their use (self-assessment) and whether they would like to purchase the drug (purchase decision). A total of 1,326 consumers provided self-assessment decisions. After viewing the low-density lipoprotein cholesterol-based label, 82%, 36%, and 82% of those who self-assessed that the drug was appropriate for their use were correct with respect to the age, lipid, and risk-factor criteria, respectively. Corresponding numbers for the total cholesterol algorithm were 85%, 50% and 75%. Almost 90% of women aged <55 years who evaluated the drug indicated the drug was not right for them, and women in this age group made up only 9% of the total group of subjects who believed the drug was appropriate for their use. The label was also effective in discouraging use by women who were or may become pregnant, consumers with liver disease, and those with potential drug interactions. In conclusion, SELECT showed that consumers could use an OTC drug label in an unsupervised setting to appropriately self-select for self-management of their cholesterol with lovastatin.
Terra, Ricardo Mingarini; Waisberg, Daniel Reis; de Almeida, José Luiz Jesus; Devido, Marcela Santana; Pêgo-Fernandes, Paulo Manuel; Jatene, Fabio Biscegli
2012-01-01
OBJECTIVE: We aimed to evaluate whether the inclusion of videothoracoscopy in a pleural empyema treatment algorithm would change the clinical outcome of such patients. METHODS: This study performed quality-improvement research. We conducted a retrospective review of patients who underwent pleural decortication for pleural empyema at our institution from 2002 to 2008. With the old algorithm (January 2002 to September 2005), open decortication was the procedure of choice, and videothoracoscopy was only performed in certain sporadic mid-stage cases. With the new algorithm (October 2005 to December 2008), videothoracoscopy became the first-line treatment option, whereas open decortication was only performed in patients with a thick pleural peel (>2 cm) observed by chest scan. The patients were divided into an old algorithm (n = 93) and new algorithm (n = 113) group and compared. The main outcome variables assessed included treatment failure (pleural space reintervention or death up to 60 days after medical discharge) and the occurrence of complications. RESULTS: Videothoracoscopy and open decortication were performed in 13 and 80 patients from the old algorithm group and in 81 and 32 patients from the new algorithm group, respectively (p<0.01). The patients in the new algorithm group were older (41±1 vs. 46.3±16.7 years, p = 0.014) and had higher Charlson Comorbidity Index scores [0(0-3) vs. 2(0-4), p = 0.032]. The occurrence of treatment failure was similar in both groups (19.35% vs. 24.77%, p = 0.35), although the complication rate was lower in the new algorithm group (48.3% vs. 33.6%, p = 0.04). CONCLUSIONS: The wider use of videothoracoscopy in pleural empyema treatment was associated with fewer complications and unaltered rates of mortality and reoperation even though more severely ill patients were subjected to videothoracoscopic surgery.
Medial elbow injury in young throwing athletes
Gregory, Bonnie; Nyland, John
2013-01-01
This report reviews the anatomy, overhead throwing biomechanics, injury mechanism and incidence, physical examination and diagnosis, diagnostic imaging, and conservative treatment of medial elbow injuries in young throwing athletes. Based on this information, a clinical management decision-making algorithm is presented.
Li, Haisen S; Zhong, Hualiang; Kim, Jinkoo; Glide-Hurst, Carri; Gulam, Misbah; Nurushev, Teamour S; Chetty, Indrin J
2014-01-06
The direct dose mapping (DDM) and energy/mass transfer (EMT) mapping are two essential algorithms for accumulating the dose from different anatomic phases to the reference phase when there is organ motion or tumor/tissue deformation during the delivery of radiation therapy. DDM is based on interpolation of the dose values from one dose grid to another and thus lacks rigor in defining the dose when there are multiple dose values mapped to one dose voxel in the reference phase due to tissue/tumor deformation. On the other hand, EMT counts the total energy and mass transferred to each voxel in the reference phase and calculates the dose by dividing the energy by mass. Therefore it is based on fundamentally sound physics principles. In this study, we implemented the two algorithms and integrated them within the Eclipse treatment planning system. We then compared the clinical dosimetric difference between the two algorithms for ten lung cancer patients receiving stereotactic radiosurgery treatment, by accumulating the delivered dose to the end-of-exhale (EE) phase. Specifically, the respiratory period was divided into ten phases and the dose to each phase was calculated and mapped to the EE phase and then accumulated. The displacement vector field generated by Demons-based registration of the source and reference images was used to transfer the dose and energy. The DDM and EMT algorithms produced noticeably different cumulative dose in the regions with sharp mass density variations and/or high dose gradients. For the planning target volume (PTV) and internal target volume (ITV) minimum dose, the difference was up to 11% and 4% respectively. This suggests that DDM might not be adequate for obtaining an accurate dose distribution of the cumulative plan, instead, EMT should be considered.
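The difference between the two mappings can be sketched with a many-to-one voxel correspondence. The arrays and the mapping itself below are illustrative, not the Eclipse implementation.

```python
# Sketch contrasting DDM and EMT for one respiratory phase, assuming a
# precomputed voxel mapping from the phase grid to the reference grid
# (many-to-one allowed, e.g. from a Demons displacement vector field).
import numpy as np

def accumulate_emt(dose_phase, mass_phase, ref_index, n_ref):
    """EMT: transfer energy (dose*mass) and mass separately, then divide.

    dose_phase, mass_phase : flat arrays over the phase grid
    ref_index : for each phase voxel, index of its reference-phase voxel
    """
    energy = np.bincount(ref_index, weights=dose_phase * mass_phase,
                         minlength=n_ref)
    mass = np.bincount(ref_index, weights=mass_phase, minlength=n_ref)
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(mass > 0, energy / mass, 0.0)  # dose = energy/mass

def accumulate_ddm(dose_phase, ref_index, n_ref):
    """DDM (simplified): copy/interpolate dose values onto the reference
    grid; when several phase voxels land on one reference voxel the result
    is ill-defined (here resolved as an arbitrary average)."""
    summed = np.bincount(ref_index, weights=dose_phase, minlength=n_ref)
    counts = np.bincount(ref_index, minlength=n_ref)
    return np.where(counts > 0, summed / np.maximum(counts, 1), 0.0)
```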
Patch-based Adaptive Mesh Refinement for Multimaterial Hydrodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lomov, I; Pember, R; Greenough, J
2005-10-18
We present a patch-based direct Eulerian adaptive mesh refinement (AMR) algorithm for modeling real equation-of-state, multimaterial compressible flow with strength. Our approach to AMR uses the hierarchical, structured grid approach first developed by Berger and Oliger (1984). The grid structure is dynamic in time and is composed of nested uniform rectangular grids of varying resolution. The integration scheme on the grid hierarchy is a recursive procedure in which the coarse grids are advanced, then the fine grids are advanced multiple steps to reach the same time, and finally the coarse and fine grids are synchronized to remove conservation errors during the separate advances. The methodology presented here is based on a single-grid algorithm developed for multimaterial gas dynamics by Colella et al. (1993), refined by Greenough et al. (1995), and extended to the solution of solid mechanics problems with significant strength by Lomov and Rubin (2003). The single-grid algorithm uses a second-order Godunov scheme with an approximate single-fluid Riemann solver and a volume-of-fluid treatment of material interfaces. The method also uses a non-conservative treatment of the deformation tensor and an acoustic approximation for shear waves in the Riemann solver. This departure from a strict application of the higher-order Godunov methodology to the equations of solid mechanics is justified by the fact that highly nonlinear behavior of shear stresses is rare. This algorithm is implemented in two codes, Geodyn and Raptor, the latter of which is a coupled rad-hydro code. The present discussion is solely concerned with hydrodynamics modeling. Results from a number of simulations for flows with and without strength will be presented.
Diagnosis and treatment of acute ankle injuries: development of an evidence-based algorithm
Polzer, Hans; Kanz, Karl Georg; Prall, Wolf Christian; Haasters, Florian; Ockert, Ben; Mutschler, Wolf; Grote, Stefan
2011-01-01
Acute ankle injuries are among the most common injuries in emergency departments. However, there are still no standardized examination procedures or evidence-based treatment. Therefore, the aim of this study was to systematically search the current literature, classify the evidence, and develop an algorithm for the diagnosis and treatment of acute ankle injuries. We systematically searched PubMed and the Cochrane Database for randomized controlled trials, meta-analyses, systematic reviews or, if applicable, observational studies and classified them according to their level of evidence. According to the currently available literature, the following recommendations have been formulated: i) the Ottawa Ankle/Foot Rule should be applied in order to rule out fractures; ii) physical examination is sufficient for diagnosing injuries to the lateral ligament complex; iii) classification into stable and unstable injuries is applicable and of clinical importance; iv) the squeeze, crossed-leg, and external rotation tests are indicative of injuries of the syndesmosis; v) magnetic resonance imaging is recommended to verify injuries of the syndesmosis; vi) stable ankle sprains have a good prognosis, while for unstable ankle sprains conservative treatment is at least as effective as operative treatment without the related possible complications; vii) early functional treatment leads to the fastest recovery and the lowest rate of reinjury; viii) supervised rehabilitation reduces residual symptoms and re-injuries. Taking these recommendations into account, we present an applicable and evidence-based, step-by-step decision pathway for the diagnosis and treatment of acute ankle injuries, which can be implemented in any emergency department or doctor's practice. It provides quality assurance for the patient and promotes confidence in the attending physician.
Combs, Stephanie E; Debus, Jürgen; Feick, Günter; Hadaschik, Boris; Hohenfellner, Markus; Schüle, Roland; Zacharias, Jens-Peter; Schwardt, Malte
2014-11-04
A brainstorming and consensus meeting organized by the German Cancer Aid focused on modern treatment of prostate cancer and promising innovative techniques and research areas. Besides optimization of screening algorithms, molecular-based stratification and individually tailored treatment regimens will be the future of multimodal prostate cancer management. Effective interdisciplinary structures, including biobanking and data collection mechanisms are the basis for such developments.
2011-01-01
Background Envenomation by crotaline snakes (rattlesnake, cottonmouth, copperhead) is a complex, potentially lethal condition affecting thousands of people in the United States each year. Treatment of crotaline envenomation is not standardized, and significant variation in practice exists. Methods A geographically diverse panel of experts was convened for the purpose of deriving an evidence-informed unified treatment algorithm. Research staff analyzed the extant medical literature and performed targeted analyses of existing databases to inform specific clinical decisions. A trained external facilitator used modified Delphi and structured consensus methodology to achieve consensus on the final treatment algorithm. Results A unified treatment algorithm was produced and endorsed by all nine expert panel members. This algorithm provides guidance about clinical and laboratory observations, indications for and dosing of antivenom, adjunctive therapies, post-stabilization care, and management of complications from envenomation and therapy. Conclusions Clinical manifestations and ideal treatment of crotaline snakebite differ greatly, and can result in severe complications. Using a modified Delphi method, we provide evidence-informed treatment guidelines in an attempt to reduce variation in care and possibly improve clinical outcomes.
Han, Miaomiao; Guo, Zhirong; Liu, Haifeng; Li, Qinghua
2018-05-01
Tomographic Gamma Scanning (TGS) is a method used for the nondestructive assay of radioactive wastes. In TGS, the actual irregular edge voxels are treated as regular cubic voxels in the traditional method. In this study, in order to improve the performance of TGS, a novel edge treatment method is proposed that considers the actual shapes of these voxels. The two edge voxel treatment methods were compared by computing the pixel-level relative errors and normalized mean square errors (NMSEs) between the reconstructed transmission images and the ideal images. Both methods were coupled with two different iterative algorithms: the Algebraic Reconstruction Technique (ART) with a non-negativity constraint and Maximum Likelihood Expectation Maximization (MLEM). The results demonstrated that the traditional edge voxel treatment can introduce significant error and that the real irregular edge voxel treatment method can improve the performance of TGS by producing better transmission reconstruction images. With the real irregular edge voxel treatment method, the MLEM and ART algorithms are comparable when assaying homogeneous matrices, but MLEM is superior to ART when assaying heterogeneous matrices. Copyright © 2018 Elsevier Ltd. All rights reserved.
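For reference, a generic MLEM-style multiplicative update of the kind compared above is sketched below. It assumes the transmission measurements have been linearized into ray sums y = Ax; the TGS-specific system matrix construction, where the proposed method would assign edge voxels their true partial chord lengths rather than full cubic-voxel lengths, is omitted.

```python
# Generic MLEM update for a linearized transmission problem y = A x,
# where y holds ray sums (e.g. -ln(I/I0)) and A's entries are the chord
# lengths of each ray through each voxel. A sketch, not the paper's code.
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    x = np.ones(A.shape[1])            # non-negative initial image
    sens = A.sum(axis=0) + eps         # sensitivity: column sums of A
    for _ in range(n_iter):
        forward = A @ x + eps          # forward projection
        x *= (A.T @ (y / forward)) / sens  # multiplicative MLEM update
    return x
```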
Algorithm for lung cancer detection based on PET/CT images
NASA Astrophysics Data System (ADS)
Saita, Shinsuke; Ishimatsu, Keita; Kubo, Mitsuru; Kawata, Yoshiki; Niki, Noboru; Ohtsuka, Hideki; Nishitani, Hiromu; Ohmatsu, Hironobu; Eguchi, Kenji; Kaneko, Masahiro; Moriyama, Noriyuki
2009-02-01
The five-year survival rate of lung cancer is low, at about twenty-five percent; three out of four patients die within five years of diagnosis. Early detection and treatment of lung cancer are therefore important. Recently developed PET/CT devices can acquire CT and PET images at the same time. PET/CT enables highly accurate cancer diagnosis because quantitative shape information can be analyzed from the CT image and FDG distribution from the PET image. However, neither benign-malignant classification nor staging of lung cancer based on PET/CT images has yet been sufficiently established. In this study, we detect lung nodules based on internal organ structures extracted from the CT image, and we develop an algorithm which classifies lung cancers as benign or malignant and as metastatic or non-metastatic using lung structure and FDG distribution (one and two hours after administering FDG). We apply the algorithm to 59 PET/CT images (malignant 43 cases [Ad:31, Sq:9, sm:3], benign 16 cases) and show its effectiveness.
A Comparison of Three PML Treatments for CAA (and CFD)
NASA Technical Reports Server (NTRS)
Goodrich, John W.
2008-01-01
In this paper we compare three Perfectly Matched Layer (PML) treatments by means of a series of numerical experiments, using common numerical algorithms, computational grids, and code implementations. These comparisons are made with the Linearized Euler Equations for uniform base flow. We find that there are two very good PML candidates that can both control the introduced error. Furthermore, we show that corners can be handled with essentially no increase in the introduced error, and that with a good PML, the outer boundary is the most significant source of error.
Investigation of Convection and Pressure Treatment with Splitting Techniques
NASA Technical Reports Server (NTRS)
Thakur, Siddharth; Shyy, Wei; Liou, Meng-Sing
1995-01-01
Treatment of convective and pressure fluxes in the Euler and Navier-Stokes equations using splitting formulas for convective velocity and pressure is investigated. Two schemes - controlled variation scheme (CVS) and advection upstream splitting method (AUSM) - are explored for their accuracy in resolving sharp gradients in flows involving moving or reflecting shock waves as well as a one-dimensional combusting flow with a strong heat release source term. For two-dimensional compressible flow computations, these two schemes are implemented in one of the pressure-based algorithms, whose very basis is the separate treatment of convective and pressure fluxes. For the convective fluxes in the momentum equations as well as the estimation of mass fluxes in the pressure correction equation (which is derived from the momentum and continuity equations) of the present algorithm, both first- and second-order (with minmod limiter) flux estimations are employed. Some issues resulting from the conventional use in pressure-based methods of a staggered grid, for the location of velocity components and pressure, are also addressed. Using the second-order fluxes, both CVS and AUSM type schemes exhibit sharp resolution. Overall, the combination of upwinding and splitting for the convective and pressure fluxes separately exhibits robust performance for a variety of flows and is particularly amenable for adoption in pressure-based methods.
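For orientation, the sketch below implements the classic first-order AUSM interface flux (Liou-Steffen splitting) for the 1D Euler equations with a gamma-law gas. This is the generic published scheme, illustrating the separate splitting of convective and pressure fluxes discussed above; it is not necessarily the exact variant or implementation used in this study.

```python
# 1D AUSM flux sketch: Mach-number splitting for the convective part,
# pressure splitting for the pressure part. States are (rho, u, p).
import numpy as np

GAMMA = 1.4  # assumed ideal-gas ratio of specific heats

def ausm_flux(rho_l, u_l, p_l, rho_r, u_r, p_r):
    a_l = np.sqrt(GAMMA * p_l / rho_l)
    a_r = np.sqrt(GAMMA * p_r / rho_r)
    m_l, m_r = u_l / a_l, u_r / a_r

    def m_plus(m):   # split Mach number, positive part
        return 0.25 * (m + 1.0) ** 2 if abs(m) <= 1.0 else max(m, 0.0)

    def m_minus(m):  # split Mach number, negative part
        return -0.25 * (m - 1.0) ** 2 if abs(m) <= 1.0 else min(m, 0.0)

    def p_plus(m, p):   # split pressure, positive part
        return 0.25 * p * (m + 1.0) ** 2 * (2.0 - m) if abs(m) <= 1.0 \
            else p * float(m > 0.0)

    def p_minus(m, p):  # split pressure, negative part
        return 0.25 * p * (m - 1.0) ** 2 * (2.0 + m) if abs(m) <= 1.0 \
            else p * float(m < 0.0)

    m_half = m_plus(m_l) + m_minus(m_r)            # interface Mach number
    p_half = p_plus(m_l, p_l) + p_minus(m_r, p_r)  # interface pressure

    def phi(rho, u, p, a):  # convective vector a*(rho, rho*u, rho*H)
        h = GAMMA / (GAMMA - 1.0) * p / rho + 0.5 * u * u  # total enthalpy
        return a * np.array([rho, rho * u, rho * h])

    conv = phi(rho_l, u_l, p_l, a_l) if m_half >= 0.0 else phi(rho_r, u_r, p_r, a_r)
    return m_half * conv + np.array([0.0, p_half, 0.0])
```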
Sub-second pencil beam dose calculation on GPU for adaptive proton therapy.
da Silva, Joakim; Ansorge, Richard; Jena, Rajesh
2015-06-21
Although proton therapy delivered using scanned pencil beams has the potential to produce better dose conformity than conventional radiotherapy, the created dose distributions are more sensitive to anatomical changes and patient motion. Therefore, the introduction of adaptive treatment techniques where the dose can be monitored as it is being delivered is highly desirable. We present a GPU-based dose calculation engine relying on the widely used pencil beam algorithm, developed for on-line dose calculation. The calculation engine was implemented from scratch, with each step of the algorithm parallelized and adapted to run efficiently on the GPU architecture. To ensure fast calculation, it employs several application-specific modifications and simplifications, and a fast scatter-based implementation of the computationally expensive kernel superposition step. The calculation time for a skull base treatment plan using two beam directions was 0.22 s on an Nvidia Tesla K40 GPU, whereas a test case of a cubic target in water from the literature took 0.14 s to calculate. The accuracy of the patient dose distributions was assessed by calculating the γ-index with respect to a gold standard Monte Carlo simulation. The passing rates were 99.2% and 96.7%, respectively, for the 3%/3 mm and 2%/2 mm criteria, matching those produced by a clinical treatment planning system.
Altomare, Cristina; Guglielmann, Raffaella; Riboldi, Marco; Bellazzi, Riccardo; Baroni, Guido
2015-02-01
In high-precision photon radiotherapy and in hadrontherapy, it is crucial to minimize the occurrence of geometrical deviations with respect to the treatment plan in each treatment session. To this end, point-based infrared (IR) optical tracking for patient set-up quality assessment is performed. Such tracking depends on external fiducial point placement. The main purpose of our work is to propose a new algorithm based on simulated annealing and augmented Lagrangian pattern search (SAPS), which is able to take into account prior knowledge, such as spatial constraints, during the optimization process. The SAPS algorithm was tested on data from head and neck and pelvic cancer patients who were fitted with external surface markers for IR optical tracking applied for preliminary patient set-up correction. The integrated algorithm was tested considering optimality measures obtained with Computed Tomography (CT) images (i.e. the ratio between the so-called target registration error and fiducial registration error, TRE/FRE) and assessing the marker spatial distribution. Comparison was performed with randomly selected marker configurations and with the GETS algorithm (Genetic Evolutionary Taboo Search), also taking into account the presence of organs at risk. The results obtained with SAPS highlight improvements with respect to the other approaches: (i) the TRE/FRE ratio decreases; (ii) the marker distribution satisfies both marker visibility and spatial constraints. We also investigated how the TRE/FRE ratio is influenced by the number of markers, obtaining significant TRE/FRE reduction with respect to the random configurations when a high number of markers is used. The SAPS algorithm is a valuable strategy for fiducial configuration optimization in IR optical tracking applied for patient set-up error detection and correction in radiation therapy, showing that taking prior knowledge into account is valuable in this optimization process. Further work will focus on the computational optimization of the SAPS algorithm toward fast point-of-care applications. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Nanayakkara, Nuwan D.; Samarabandu, Jagath; Fenster, Aaron
2006-04-01
Estimation of prostate location and volume is essential in determining a dose plan for ultrasound-guided brachytherapy, a common prostate cancer treatment. However, manual segmentation is difficult, time-consuming, and prone to variability. In this paper, we present a semi-automatic discrete dynamic contour (DDC) model-based image segmentation algorithm, which effectively combines a multi-resolution model refinement procedure with domain knowledge of the image class. The segmentation begins on a low-resolution image with a closed DDC model defined by the user. This contour model is then deformed progressively towards higher-resolution images. We use a combination of a domain-knowledge-based fuzzy inference system (FIS) and a set of adaptive region-based operators to enhance the edges of interest and to govern the model refinement using a DDC model. The automatic vertex relocation process, embedded into the algorithm, relocates deviated contour points back onto the actual prostate boundary, eliminating the need for user interaction after initialization. The accuracy of the prostate boundary produced by the proposed algorithm was evaluated by comparing it with a manual outline by an expert observer. We used this algorithm to segment the prostate boundary in 114 2D transrectal ultrasound (TRUS) images of six patients scheduled for brachytherapy. The mean distance between the contours produced by the proposed algorithm and the manual outlines was 2.70 ± 0.51 pixels (0.54 ± 0.10 mm). We also showed that the algorithm is insensitive to variations of the initial model and parameter values, thus increasing the accuracy and reproducibility of the resulting boundaries in the presence of noise and artefacts.
Berthon, Beatrice; Marshall, Christopher; Evans, Mererid; Spezi, Emiliano
2016-07-07
Accurate and reliable tumour delineation on positron emission tomography (PET) is crucial for radiotherapy treatment planning. PET automatic segmentation (PET-AS) eliminates intra- and interobserver variability, but there is currently no consensus on the optimal method to use, as different algorithms appear to perform better for different types of tumours. This work aimed to develop a predictive segmentation model, trained to automatically select and apply the best PET-AS method according to the tumour characteristics. ATLAAS, the automatic decision tree-based learning algorithm for advanced segmentation, is based on supervised machine learning using decision trees. The model includes nine PET-AS methods and was trained on 100 PET scans with known true contour. A decision tree was built for each PET-AS algorithm to predict its accuracy, quantified using the Dice similarity coefficient (DSC), according to the tumour volume, tumour peak-to-background SUV ratio, and a regional texture metric. The performance of ATLAAS was evaluated on 85 PET scans obtained from fillable and printed subresolution sandwich phantoms. ATLAAS showed excellent accuracy across a wide range of phantom data and predicted the best or near-best segmentation algorithm in 93% of cases. ATLAAS outperformed all single PET-AS methods on fillable phantom data with a DSC of 0.881, while the DSC for H&N phantom data was 0.819. DSCs higher than 0.650 were achieved in all cases. ATLAAS is an advanced automatic image segmentation algorithm based on decision tree predictive modelling, which can be trained on images with known true contour to predict the best PET-AS method when the true contour is unknown. ATLAAS provides robust and accurate image segmentation with potential applications to radiation oncology.
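The core idea can be sketched with scikit-learn: one regression tree per segmentation method predicts its expected DSC from the three image features, and the method with the highest predicted DSC is selected at run time. The method names and the random training data below are placeholders, not the ATLAAS training set.

```python
# Sketch of decision-tree-based method selection: per-method trees predict
# the DSC from [tumour volume, peak-to-background SUV ratio, texture].
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
methods = ["threshold40", "adaptive_thresh", "region_grow"]  # placeholder subset
models = {}
for name in methods:
    X_train = rng.random((100, 3))  # features per training scan (placeholder)
    y_train = rng.random(100)       # known DSC against the true contour
    models[name] = DecisionTreeRegressor(max_depth=4).fit(X_train, y_train)

def select_method(volume, suv_ratio, texture):
    feats = np.array([[volume, suv_ratio, texture]])
    scores = {n: float(m.predict(feats)[0]) for n, m in models.items()}
    return max(scores, key=scores.get)  # method with highest predicted DSC
```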
A return-to-sport algorithm for acute hamstring injuries.
Mendiguchia, Jurdan; Brughelli, Matt
2011-02-01
Acute hamstring injuries are the most prevalent muscle injuries reported in sport. Despite a thorough and concentrated effort to prevent and rehabilitate hamstring injuries, injury occurrence and re-injury rates have not improved over the past 28 years. This failure is most likely due to the following: 1) an over-reliance on treating the symptoms of injury, such as subjective measures of "pain", with drugs and interventions; 2) the risk factors investigated for hamstring injuries have not been related to the actual movements that cause hamstring injuries (i.e. they are not functional); and 3) a multi-factorial approach to assessment and treatment has not been utilized. The purpose of this clinical commentary is to introduce a model for progression through return-to-sport rehabilitation following an acute hamstring injury. This model is developed from objective and quantifiable tests (i.e. clinical and functional tests) that are structured into a step-by-step algorithm. In addition, each step in the algorithm includes a treatment protocol. These protocols are meant to help the athlete improve through each phase safely so that they can achieve the desired goals and progress through the algorithm and back to their chosen sport. We hope that this algorithm can serve as a foundation for future evidence-based research and aid in the development of new objective and quantifiable testing methods. Copyright © 2010 Elsevier Ltd. All rights reserved.
Stitzel, Joel D; Weaver, Ashley A; Talton, Jennifer W; Barnard, Ryan T; Schoell, Samantha L; Doud, Andrea N; Martin, R Shayn; Meredith, J Wayne
2016-06-01
Advanced Automatic Crash Notification algorithms use vehicle telemetry measurements to predict risk of serious motor vehicle crash injury. The objective of the study was to develop an Advanced Automatic Crash Notification algorithm to reduce response time, increase triage efficiency, and improve patient outcomes by minimizing undertriage (<5%) and overtriage (<50%), as recommended by the American College of Surgeons. A list of injuries associated with a patient's need for Level I/II trauma center treatment known as the Target Injury List was determined using an approach based on 3 facets of injury: severity, time sensitivity, and predictability. Multivariable logistic regression was used to predict an occupant's risk of sustaining an injury on the Target Injury List based on crash severity and restraint factors for occupants in the National Automotive Sampling System - Crashworthiness Data System 2000-2011. The Advanced Automatic Crash Notification algorithm was optimized and evaluated to minimize triage rates, per American College of Surgeons recommendations. The following rates were achieved: <50% overtriage and <5% undertriage in side impacts and 6% to 16% undertriage in other crash modes. Nationwide implementation of our algorithm is estimated to improve triage decisions for 44% of undertriaged and 38% of overtriaged occupants. Annually, this translates to more appropriate care for >2,700 seriously injured occupants and reduces unnecessary use of trauma center resources for >162,000 minimally injured occupants. The algorithm could be incorporated into vehicles to inform emergency personnel of recommended motor vehicle crash triage decisions. Lower under- and overtriage was achieved, and nationwide implementation of the algorithm would yield improved triage decision making for an estimated 165,000 occupants annually. Copyright © 2016. Published by Elsevier Inc.
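An illustrative version of such a risk model is sketched below: logistic regression on crash severity and restraint covariates, with the decision threshold tuned to trade undertriage against overtriage. The features, data, and threshold value are invented for illustration and are not the published model.

```python
# Sketch: logistic regression predicting risk of a Target-Injury-List
# injury from crash telemetry, then thresholding the risk for triage.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical columns: delta-V (km/h), belt use (0/1), multiple impacts (0/1)
X = np.array([[55, 0, 1], [20, 1, 0], [70, 0, 1], [15, 1, 0],
              [40, 1, 1], [65, 0, 0], [10, 1, 0], [50, 0, 1]])
y = np.array([1, 0, 1, 0, 0, 1, 0, 1])  # serious injury present (toy labels)

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]

# The threshold would be tuned on large crash databases so that
# undertriage stays <5% and overtriage <50%; 0.20 is a placeholder.
threshold = 0.20
recommend_trauma_center = risk >= threshold
```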
Xu, Hang; Su, Shi; Tang, Wuji; Wei, Meng; Wang, Tao; Wang, Dongjin; Ge, Weihong
2015-09-01
A large number of warfarin pharmacogenetics algorithms have been published. Our research aimed to evaluate the performance of selected pharmacogenetic algorithms in patients undergoing heart valve replacement or heart valvuloplasty during the initial and stable phases of anticoagulation treatment. 10 pharmacogenetic algorithms were selected by searching PubMed. We compared the performance of the selected algorithms in a cohort of 193 patients during the initial and stable phases of anticoagulation therapy. Predicted dose was compared to therapeutic dose using the percentage of predicted doses that fall within 20% of the actual dose (percentage within 20%) and the mean absolute error (MAE). The average warfarin dose was 3.05 ± 1.23 mg/day for initial treatment and 3.45 ± 1.18 mg/day for stable treatment. The percentages of the predicted dose within 20% of the therapeutic dose were 44.0 ± 8.8% and 44.6 ± 9.7% for the initial and stable phases, respectively. The MAEs of the selected algorithms were 0.85 ± 0.18 mg/day and 0.93 ± 0.19 mg/day, respectively. All algorithms had better performance in the ideal group than in the low-dose and high-dose groups. The only exception was the Wadelius et al. algorithm, which had better performance in the high-dose group. The algorithms had similar performance except for the Wadelius et al. and Miao et al. algorithms, which had poor accuracy in our study cohort. The Gage et al. algorithm had better performance in both the initial and stable phases of treatment. Algorithms had relatively higher accuracy in the >50 years group of patients in the stable phase. Copyright © 2015 Elsevier Ltd. All rights reserved.
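The two evaluation metrics used above are straightforward to compute; a small sketch follows, with illustrative dose arrays in mg/day.

```python
# Sketch of the two metrics: fraction of predictions within 20% of the
# actual therapeutic dose, and the mean absolute error (MAE).
import numpy as np

def percentage_within_20(predicted, actual):
    predicted, actual = np.asarray(predicted), np.asarray(actual)
    return 100.0 * np.mean(np.abs(predicted - actual) <= 0.2 * actual)

def mean_absolute_error(predicted, actual):
    return float(np.mean(np.abs(np.asarray(predicted) - np.asarray(actual))))

# Illustrative values only (mg/day), not study data.
pred = [2.8, 4.1, 3.0, 5.6]
act = [3.0, 3.5, 3.1, 4.0]
print(percentage_within_20(pred, act), mean_absolute_error(pred, act))
```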
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Kok Foong; Patterson, Robert I.A.; Wagner, Wolfgang
2015-12-15
This paper introduces stochastic weighted particle algorithms for the solution of multi-compartment population balance equations. In particular, it presents a class of fragmentation weight transfer functions which are constructed such that the number of computational particles stays constant during fragmentation events. The weight transfer functions are constructed based on systems of weighted computational particles, and each of them leads to a stochastic particle algorithm for the numerical treatment of population balance equations. Besides fragmentation, the algorithms also consider physical processes such as coagulation and the exchange of mass with the surroundings. The numerical properties of the algorithms are compared to the direct simulation algorithm and an existing method for the fragmentation of weighted particles; the numerical errors of the stochastic solutions are assessed as a function of the fragmentation rate, and the algorithms are applied to a multi-dimensional granulation model. It is found that the new algorithms show better numerical performance than the two existing methods, especially for systems with a significant amount of large particles and high fragmentation rates.
Troeller, Almut; Garny, Sylvia; Pachmann, Sophia; Kantz, Steffi; Gerum, Sabine; Manapov, Farkhad; Ganswindt, Ute; Belka, Claus; Söhn, Matthias
2015-02-22
High-accuracy dose calculation algorithms, such as Monte Carlo (MC) and Collapsed Cone (CC), determine dose in inhomogeneous tissue more accurately than pencil beam (PB) algorithms. However, prescription protocols based on clinical experience with PB are often used for treatment plans calculated with CC. This may lead to treatment plans with changes in field size (FS) and changes in dose to organs at risk (OAR), especially for small tumor volumes in lung tissue treated with SABR. We re-evaluated 17 3D-conformal treatment plans for small intrapulmonary lesions with a prescription of 60 Gy in fractions of 7.5 Gy to the 80% isodose. All treatment plans were initially calculated in Oncentra MasterPlan® using a PB algorithm and recalculated with CC (CCre-calc). Furthermore, a CC-based plan with coverage similar to the PB plan (CCcov) and a CC plan with relaxed coverage criteria (CCclin) were created. The plans were analyzed in terms of Dmean, Dmin, Dmax, and coverage for GTV, PTV, and ITV. Changes in mean lung dose (MLD), V10Gy, and V20Gy were evaluated for the lungs. The re-planned CC plans were compared to the original PB plans regarding changes in total monitor units (MU) and average FS. When PB plans were recalculated with CC, the average V60Gy of GTV, ITV, and PTV decreased by 13.2%, 19.9%, and 41.4%, respectively. Average Dmean decreased by 9% (GTV), 11.6% (ITV), and 14.2% (PTV). Dmin decreased by 18.5% (GTV), 21.3% (ITV), and 17.5% (PTV). Dmax declined by 7.5%. PTV coverage correlated with PTV volume (p < 0.001). MLD, V10Gy, and V20Gy were significantly reduced in the CC plans. Both CCcov and CCclin had significantly increased MU and FS compared to PB. Recalculation of PB plans for small lung lesions with CC showed a strong decline in dose and coverage for GTV, ITV, and PTV, and reduced dose in the lung. Thus, switching from a PB algorithm to CC while aiming to obtain similar target coverage can be associated with the application of more MU and extension of radiotherapy fields, causing greater OAR exposure.
Yang, Jie; Zhang, Pengcheng; Zhang, Liyuan; Shu, Huazhong; Li, Baosheng; Gui, Zhiguo
2017-01-01
In inverse treatment planning of intensity-modulated radiation therapy (IMRT), the objective function is typically the sum of weighted sub-scores, where the weights indicate the importance of the sub-scores. To obtain a high-quality treatment plan, the planner manually adjusts the objective weights using a trial-and-error procedure until an acceptable plan is reached. In this work, a new particle swarm optimization (PSO) method which can adjust the weighting factors automatically was investigated to overcome the requirement of manual adjustment, thereby reducing the workload of the human planner and contributing to the development of a fully automated planning process. The proposed optimization method consists of three steps. (i) First, a swarm of weighting factors (i.e., particles) is initialized randomly in the search space, where each particle corresponds to a global objective function. (ii) Then, a plan optimization solver is employed to obtain the optimal solution for each particle, and the values of the evaluation functions used to determine the particle's location and the population global location for the PSO are calculated based on these results. (iii) Next, the weighting factors are updated based on the particle's location and the population global location. Step (ii) is performed alternately with step (iii) until the termination condition is reached. In this method, the evaluation function is a combination of several key points on the dose-volume histograms. Furthermore, a perturbation strategy, a crossover and mutation operator hybrid approach, is employed to enhance the population diversity, and two arguments are applied to the evaluation function to improve the flexibility of the algorithm. In this study, the proposed method was used to develop IMRT treatment plans involving five unequally spaced 6 MV photon beams for 10 prostate cancer cases. The proposed optimization algorithm yielded high-quality plans for all of the cases, without human planner intervention. A comparison of the results with the optimized solution obtained using a similar optimization model but with human planner intervention revealed that the proposed algorithm produced optimized plans superior to those developed manually. The proposed algorithm can generate admissible solutions within reasonable computational times and can be used to develop fully automated IMRT treatment planning methods, thus reducing human planners' workloads during iterative processes. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
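A bare-bones version of the outer PSO loop over objective weights, following steps (i)-(iii), is sketched below. Here plan_quality(weights) stands in for the full inner loop (running the plan optimizer with those weights and scoring the DVH key points, higher is better); it is a placeholder, not the authors' solver, and the perturbation (crossover/mutation) strategy is omitted.

```python
# Minimal PSO over the vector of objective-function weights.
import numpy as np

def pso_weights(plan_quality, dim, n_particles=20, n_iter=30,
                w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=10.0):
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))  # (i) random weight vectors
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([plan_quality(p) for p in x])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # (iii) update velocities/positions from personal and global bests
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        # (ii) inner plan optimization and evaluation for each particle
        vals = np.array([plan_quality(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest
```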
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumarasiri, Akila, E-mail: akumara1@hfhs.org; Siddiqui, Farzan; Liu, Chang
2014-12-15
Purpose: To evaluate the clinical potential of deformable image registration (DIR)-based automatic propagation of physician-drawn contours from a planning CT to midtreatment CT images for head and neck (H and N) adaptive radiotherapy. Methods: Ten H and N patients, each with a planning CT (CT1) and a subsequent CT (CT2) taken approximately 3–4 weeks into treatment, were considered retrospectively. Clinically relevant organs and targets were manually delineated by a radiation oncologist on both sets of images. Four commercial DIR algorithms, two B-spline-based and two Demons-based, were used to deform CT1 and the relevant contour sets onto corresponding CT2 images. Agreement of the propagated contours with manually drawn contours on CT2 was visually rated by four radiation oncologists on a scale from 1 to 5, the volume overlap was quantified using Dice coefficients, and a distance analysis was done using center of mass (CoM) displacements and Hausdorff distances (HDs). Performance of these four commercial algorithms was validated using a parameter-optimized Elastix DIR algorithm. Results: All algorithms attained Dice coefficients of >0.85 for organs with clear boundaries and those with volumes >9 cm³. Organs with volumes <3 cm³ and/or those with poorly defined boundaries showed Dice coefficients of ∼0.5–0.6. For the propagation of small organs (<3 cm³), the B-spline-based algorithms showed higher mean Dice values (Dice = 0.60) than the Demons-based algorithms (Dice = 0.54). For the gross and planning target volumes, the respective mean Dice coefficients were 0.8 and 0.9. There was no statistically significant difference in the Dice coefficients, CoM, or HD among the investigated DIR algorithms. The mean radiation oncologist visual scores of the four algorithms ranged from 3.2 to 3.8, which indicated that the quality of transferred contours was "clinically acceptable with minor modification or major modification in a small number of contours." Conclusions: Use of DIR-based contour propagation in the routine clinical setting is expected to increase the efficiency of H and N replanning, reducing the amount of time needed for manual target and organ delineations.
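The overlap and distance measures used in this comparison can be computed for binary masks as sketched below with NumPy/SciPy; a and b are boolean 3D arrays of the voxelized structures, and voxel spacing handling is omitted for brevity.

```python
# Sketch of the three comparison metrics: Dice coefficient, Hausdorff
# distance, and center-of-mass (CoM) displacement, on binary masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(a, b):
    pa, pb = np.argwhere(a), np.argwhere(b)  # voxel coordinates
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

def com_displacement(a, b):
    return np.linalg.norm(np.argwhere(a).mean(axis=0)
                          - np.argwhere(b).mean(axis=0))
```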
Madan, Jason; Khan, Kamran A; Petrou, Stavros; Lamb, Sarah E
2017-05-01
Mapping algorithms are increasingly being used to predict health-utility values based on responses or scores from non-preference-based measures, thereby informing economic evaluations. We explored whether predictions of health-utility gains on the EuroQol 5-dimension 3-level instrument (EQ-5D-3L) from mapping algorithms might differ if estimated using differenced versus raw scores, using the Roland-Morris Disability Questionnaire (RMQ), a widely used health status measure for low back pain, as an example. We estimated algorithms mapping within-person changes in RMQ scores to changes in EQ-5D-3L health utilities using data from two clinical trials with repeated observations. We also used logistic regression models to estimate response mapping algorithms from these data to predict within-person changes in responses to each EQ-5D-3L dimension from changes in RMQ scores. Predicted health-utility gains from these mappings were compared with predictions based on raw RMQ data. Using differenced scores reduced the predicted health-utility gain from a unit decrease in RMQ score from 0.037 (standard error [SE] 0.001) to 0.020 (SE 0.002). Analysis of response mapping data suggests that the use of differenced data reduces the predicted impact of reducing RMQ scores across EQ-5D-3L dimensions and that patients can experience health-utility gains on the EQ-5D-3L 'usual activity' dimension independent from improvements captured by the RMQ. Mappings based on raw RMQ data overestimate the EQ-5D-3L health-utility gains from interventions that reduce RMQ scores. Where possible, mapping algorithms should reflect within-person changes in health outcome and be estimated from datasets containing repeated observations if they are to be used to estimate incremental health-utility gains.
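A hedged sketch of the contrast at the heart of the study: regressing raw utilities on raw RMQ scores versus regressing within-person changes on changes, here with ordinary least squares. Variable names are illustrative, not from the trial datasets.

```python
import numpy as np

def ols_slope(x, y):
    """Slope of y on x by ordinary least squares (utility per unit RMQ score)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def compare_mappings(rmq_t0, rmq_t1, utility_t0, utility_t1):
    """rmq_t0/t1, utility_t0/t1: repeated observations on the same patients."""
    raw = ols_slope(np.concatenate([rmq_t0, rmq_t1]),
                    np.concatenate([utility_t0, utility_t1]))
    differenced = ols_slope(rmq_t1 - rmq_t0, utility_t1 - utility_t0)
    # per the abstract, the differenced estimate tends to be smaller in magnitude
    return raw, differenced
```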
Pediatric Benign Soft Tissue Oral and Maxillofacial Pathology.
Glickman, Alexandra; Karlis, Vasiliki
2016-02-01
Despite the many types of oral pathologic lesions found in infants and children, the most commonly encountered are benign soft tissue lesions. The clinical features, diagnostic criteria, and treatment algorithms of pathologies in patients from birth to 18 years of age are summarized based on their prevalence in each age group. Treatment modalities include both medical and surgical management. Copyright © 2016 Elsevier Inc. All rights reserved.
[Algorithms for treatment of complex hand injuries].
Pillukat, T; Prommersberger, K-J
2011-07-01
The primary treatment strongly influences the course and prognosis of hand injuries. Complex injuries which compromise functional recovery are especially challenging. Despite an apparently unlimited number of injury patterns, it is possible to develop strategies which facilitate a standardized approach to operative treatment. In this situation, algorithms can be important guidelines for a rational approach. The following algorithms have proven effective in the treatment of complex injuries of the hand in our own experience. They were modified according to the current literature and refer to prehospital care, emergency room management, basic strategy in general, and reconstruction of bone and joints, vessels, nerves, tendons and soft tissue coverage in detail. Algorithms facilitate the treatment of severe hand injuries. By applying simple yes/no decisions, complex injury patterns are split into distinct partial problems which can be managed step by step.
Campos, Nicole G; Maza, Mauricio; Alfaro, Karla; Gage, Julia C; Castle, Philip E; Felix, Juan C; Cremer, Miriam L; Kim, Jane J
2015-08-15
Cervical cancer is the leading cause of cancer death among women in El Salvador. Utilizing data from the Cervical Cancer Prevention in El Salvador (CAPE) demonstration project, we assessed the health and economic impact of HPV-based screening and two different algorithms for the management of women who test HPV-positive, relative to existing Pap-based screening. We calibrated a mathematical model of cervical cancer to epidemiologic data from El Salvador and compared three screening algorithms for women aged 30-65 years: (i) HPV screening every 5 years followed by referral to colposcopy for HPV-positive women (Colposcopy Management [CM]); (ii) HPV screening every 5 years followed by treatment with cryotherapy for eligible HPV-positive women (Screen and Treat [ST]); and (iii) Pap screening every 2 years followed by referral to colposcopy for Pap-positive women (Pap). Potential harms and complications associated with overtreatment were not assessed. Under base case assumptions of 65% screening coverage, HPV-based screening was more effective than Pap, reducing cancer risk by ∼ 60% (Pap: 50%). ST was the least costly strategy, and cost $2,040 per year of life saved. ST remained the most attractive strategy as visit compliance, costs, coverage, and test performance were varied. We conclude that a screen-and-treat algorithm within an HPV-based screening program is very cost-effective in El Salvador, with a cost-effectiveness ratio below per capita GDP. © 2015 UICC.
Kurz, Christopher; Bauer, Julia; Conti, Maurizio; Guérin, Laura; Eriksson, Lars; Parodi, Katia
2015-07-01
External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β(+)-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the used PET/CT device and identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts. The authors have acquired 30 subsequent PET scans of a cylindrical phantom emulating a patient-like activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner have been investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated list-mode data have been utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and to additionally investigate the impact of scatter coincidences. Eventually, the recommended reconstruction scheme has been applied to exemplary postirradiation patient data-sets. Among the investigated reconstruction options, the overall best results in terms of image noise, activity quantification, and accurate geometrical recovery were achieved using the ordered subset expectation maximization reconstruction algorithm with time-of-flight (TOF) and point-spread function (PSF) information. For this algorithm, reasonably accurate (better than 5%) and precise (uncertainty of the mean activity below 10%) imaging can be provided down to 80,000 true coincidences at 96% RF. Image noise and geometrical fidelity are generally improved for fewer iterations. The main limitation for PET-based treatment monitoring has been identified in the small number of true coincidences, rather than the high intrinsic random background. Application of the optimized reconstruction scheme to patient data-sets results in a 25-50% reduction in image noise at a comparable activity quantification accuracy and an improved geometrical performance with respect to the formerly used reconstruction scheme at HIT, adopted from nuclear medicine applications. Under the poor statistical conditions of PET-based treatment monitoring, improved results can be achieved by considering PSF and TOF information during image reconstruction and by applying fewer iterations than in conventional nuclear medicine imaging. Geometrical fidelity and image noise are mainly limited by the low number of true coincidences, not the high LSO-related random background. These results might also benefit other emerging PET applications at low counting statistics.
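A minimal sketch of the accuracy/precision/noise figures of merit described above, assuming `recons` is a stack of replicate reconstructions of the same phantom, `roi` a boolean mask, and `a_true` the known activity concentration in the ROI.

```python
import numpy as np

def quantification_metrics(recons, roi, a_true):
    roi_means = np.array([img[roi].mean() for img in recons])
    # accuracy: % bias of the mean ROI activity across replicates
    accuracy = 100.0 * (roi_means.mean() - a_true) / a_true
    # precision: % uncertainty of the mean activity across replicates
    precision = 100.0 * roi_means.std(ddof=1) / roi_means.mean()
    # image noise: mean within-image coefficient of variation inside the ROI
    noise = 100.0 * np.mean([img[roi].std(ddof=1) / img[roi].mean()
                             for img in recons])
    return accuracy, precision, noise
```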
NASA Astrophysics Data System (ADS)
Maguen, Ezra I.; Salz, James J.; McDonald, Marguerite B.; Pettit, George H.; Papaioannou, Thanassis; Grundfest, Warren S.
2002-06-01
A study was undertaken to assess whether results of laser vision correction with the LADARVision 193-nm excimer laser (Alcon/Autonomous Technologies) can be improved with the use of wavefront analysis generated by a proprietary system including a Hartmann-Shack sensor and expressed using Zernike polynomials. A total of 82 eyes underwent LASIK in several centers with an improved algorithm, using the CustomCornea system. A subgroup of 48 eyes of 24 patients was randomized so that one eye underwent conventional treatment and the fellow eye underwent treatment based on wavefront analysis. Treatment parameters were equal for each type of refractive error. 83% of all eyes had uncorrected vision of 20/20 or better and 95% were 20/25 or better. In all groups, uncorrected visual acuities did not improve significantly in eyes treated with wavefront analysis compared to conventional treatments. Higher-order aberrations were consistently better corrected in eyes undergoing wavefront-based LASIK at 6 months postop. In addition, the number of eyes with reduced RMS was significantly higher in the subset of eyes treated with a wavefront algorithm (38% vs. 5%). Wavefront technology may improve the outcomes of laser vision correction with the LADARVision excimer laser. Further refinements of the technology and clinical trials will contribute to this goal.
An international consensus algorithm for management of chronic postoperative inguinal pain.
Lange, J F M; Kaufmann, R; Wijsmuller, A R; Pierie, J P E N; Ploeg, R J; Chen, D C; Amid, P K
2015-02-01
Tension-free mesh repair of inguinal hernia has led to uniformly low recurrence rates. Morbidity associated with this operation is mainly related to chronic pain. No consensus guidelines exist for the management of this condition. The goal of this study was to design an expert-based algorithm for the diagnostic and therapeutic management of chronic inguinal postoperative pain (CPIP). A group of surgeons considered experts on inguinal hernia surgery was solicited to develop the algorithm. Consensus regarding each step of an algorithm proposed by the authors was sought by means of the Delphi method, leading to a revised expert-based algorithm. With the input of 28 international experts, an algorithm for a stepwise approach to the management of CPIP was created. Twenty-six participants accepted the final algorithm as a consensus model; one participant could not agree with the final concept, and one expert did not respond during the final phase. There is a need for guidelines with regard to the management of CPIP. This algorithm can serve as a guide with regard to the diagnosis, management, and treatment of these patients and improve clinical outcomes. If an expectant (watch-and-wait) phase of a few months passes without any amelioration of CPIP, a multidisciplinary approach is indicated and a pain management team should be consulted. Pharmacologic, behavioral, and interventional modalities including nerve blocks are essential. If conservative measures fail and surgery is considered, triple neurectomy, correction for recurrence with or without neurectomy, and meshoma removal if indicated should be performed. Surgeons less experienced with remedial operations for CPIP should not hesitate to refer their patients to dedicated hernia surgeons.
Automatic treatment plan re-optimization for adaptive radiotherapy guided with the initial plan DVHs
NASA Astrophysics Data System (ADS)
Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Jiang Graves, Yan; Gautier, Quentin; Mell, Loren; Zhou, Linghong; Jia, Xun; Jiang, Steve
2013-12-01
Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. For the re-planning process, a manual trial-and-error approach to fine-tune planning parameters is time-consuming and is usually considered impractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal interventions. In ART, prior information in the original plan is available, such as the dose-volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates the following two loops. The inner loop is the traditional fluence map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose with a set of fixed voxel weighting factors. In the outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those in the original plan. The process is repeated until the DVH curves are acceptable or the maximum number of iterations is reached. The whole algorithm is implemented on GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of the treatment course. Compared with the DVH curves in the original plan, the DVH curves in the resulting plan using our algorithm with 30 iterations are better for almost all structures. The re-optimization process takes about 30 s using our in-house optimization engine. This work was originally presented at the 54th AAPM annual meeting in Charlotte, NC, July 29-August 2, 2012.
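A simplified sketch of the two-loop scheme described above: an inner quadratic fluence-map optimization with fixed voxel weights, and an outer loop that raises the weights of voxels falling short of the reference dose. The outer update here is a per-voxel proxy for the paper's DVH-curve-based adjustment, and `D` is a hypothetical dose-deposition matrix (voxels x beamlets), all in NumPy rather than on a GPU.

```python
import numpy as np

def inner_fmo(D, w, d_ref, x0, iters=200, step=1e-3):
    """Gradient descent on f(x) = 0.5 * sum_j w_j (dose_j - d_ref_j)^2."""
    x = x0.copy()
    for _ in range(iters):
        grad = D.T @ (w * (D @ x - d_ref))
        x = np.maximum(x - step * grad, 0.0)   # fluence stays non-negative
    return x

def reoptimize(D, d_ref, n_outer=30):
    w = np.ones(D.shape[0])                    # fixed voxel weights per inner loop
    x = np.zeros(D.shape[1])
    for _ in range(n_outer):
        x = inner_fmo(D, w, d_ref, x)
        dose = D @ x
        # outer loop: boost weights where the current dose deviates from reference
        w *= 1.0 + np.maximum(d_ref - dose, 0.0) / (d_ref + 1e-6)
    return x
```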
Suppes, Trisha; Rush, A John; Dennehy, Ellen B; Crismon, M Lynn; Kashner, T Michael; Toprac, Marcia G; Carmody, Thomas J; Brown, E Sherwood; Biggs, Melanie M; Shores-Wilson, Kathy; Witte, Bradley P; Trivedi, Madhukar H; Miller, Alexander L; Altshuler, Kenneth Z; Shon, Steven P
2003-04-01
The Texas Medication Algorithm Project (TMAP) assessed the clinical and economic impact of algorithm-driven treatment (ALGO) as compared with treatment-as-usual (TAU) in patients served in public mental health centers. This report presents clinical outcomes in patients with a history of mania (BD), including bipolar I and schizoaffective disorder, bipolar type, during 12 months of treatment beginning March 1998 and ending with the final active patient visit in April 2000. Patients were diagnosed with bipolar I disorder or schizoaffective disorder, bipolar type, according to DSM-IV criteria. ALGO comprised a medication algorithm and manual to guide treatment decisions. Physicians and clinical coordinators received training and expert consultation throughout the project. ALGO also provided a disorder-specific patient and family education package. TAU clinics had no exposure to the medication algorithms. Quarterly outcome evaluations were obtained by independent raters. Hierarchical linear modeling, based on a declining effects model, was used to assess clinical outcome of ALGO versus TAU. ALGO and TAU patients showed significant initial decreases in symptoms (p = .03 and p < .001, respectively) measured by the 24-item Brief Psychiatric Rating Scale (BPRS-24) at the 3-month assessment interval, with significantly greater effects for the ALGO group. Limited catch-up by TAU was observed over the remaining 3 quarters. Differences were also observed in measures of mania and psychosis but not in depression, side-effect burden, or functioning. For patients with a history of mania, relative to TAU, the ALGO intervention package was associated with greater initial and sustained improvement on the primary clinical outcome measure, the BPRS-24, and the secondary outcome measure, the Clinician-Administered Rating Scale for Mania (CARS-M). Further research is planned to clarify which elements of the ALGO package contributed to this between-group difference.
Liu, Yan; Stojadinovic, Strahinja; Hrycushko, Brian; Wardak, Zabi; Lau, Steven; Lu, Weiguo; Yan, Yulong; Jiang, Steve B; Zhen, Xin; Timmerman, Robert; Nedzi, Lucien; Gu, Xuejun
2017-01-01
Accurate and automatic brain metastases target delineation is a key step for efficient and effective stereotactic radiosurgery (SRS) treatment planning. In this work, we developed a deep learning convolutional neural network (CNN) algorithm for segmenting brain metastases on contrast-enhanced T1-weighted magnetic resonance imaging (MRI) datasets. We integrated the CNN-based algorithm into an automatic brain metastases segmentation workflow and validated it on both Multimodal Brain Tumor Image Segmentation challenge (BRATS) data and clinical patients' data. Validation on BRATS data yielded average DICE coefficients (DCs) of 0.75±0.07 in the tumor core and 0.81±0.04 in the enhancing tumor, which outperformed most techniques in the 2015 BRATS challenge. Segmentation results on the patient cases showed an average DC of 0.67±0.03 and achieved an area under the receiver operating characteristic curve of 0.98±0.01. The developed automatic segmentation strategy surpasses current benchmark levels and offers a promising tool for SRS treatment planning for multiple brain metastases.
Neural network explanation using inversion.
Saad, Emad W; Wunsch, Donald C
2007-01-01
An important drawback of many artificial neural networks (ANN) is their lack of explanation capability [Andrews, R., Diederich, J., & Tickle, A. B. (1996). A survey and critique of techniques for extracting rules from trained artificial neural networks. Knowledge-Based Systems, 8, 373-389]. This paper starts with a survey of algorithms which attempt to explain the ANN output. We then present HYPINV, a new explanation algorithm which relies on network inversion, i.e., calculating the ANN input which produces a desired output. HYPINV is a pedagogical algorithm that extracts rules in the form of hyperplanes. It is able to generate rules with arbitrarily desired fidelity, maintaining a fidelity-complexity tradeoff. To our knowledge, HYPINV is the only pedagogical rule extraction method which extracts hyperplane rules from continuous or binary attribute neural networks. Different network inversion techniques, involving gradient descent as well as an evolutionary algorithm, are presented. An information theoretic treatment of rule extraction is presented. HYPINV is applied to example synthetic problems and to a real aerospace problem, and compared with similar algorithms using benchmark problems.
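A minimal sketch of network inversion by gradient descent, the core operation HYPINV relies on: starting from a random input, descend on the input (not the weights) until the network output matches a target. The tiny two-layer network and all its weights are illustrative stand-ins for a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)   # toy "trained" weights
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def invert(y_target, steps=2000, lr=0.5):
    x = rng.normal(size=2)                     # random starting input
    for _ in range(steps):
        h = sigmoid(W1 @ x + b1)
        y = sigmoid(W2 @ h + b2)
        # backpropagate loss 0.5*(y - y_target)^2 down to the input
        dy = (y - y_target) * y * (1 - y)
        dh = (W2.T @ dy) * h * (1 - h)
        x -= lr * (W1.T @ dh)                  # gradient step on the input itself
    return x

x_boundary = invert(np.array([0.5]))           # input near the decision boundary
```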
Jinno, Shunta; Tachibana, Hidenobu; Moriya, Shunsuke; Mizuno, Norifumi; Takahashi, Ryo; Kamima, Tatsuya; Ishibashi, Satoru; Sato, Masanori
2018-05-21
In inhomogeneous media, there is often a large systematic difference in the dose between the conventional Clarkson algorithm (C-Clarkson) for independent calculation verification and the superposition-based algorithms of treatment planning systems (TPSs). These treatment site-dependent differences increase the complexity of the radiotherapy planning secondary check. We developed a simple and effective method of heterogeneity correction integrated with the Clarkson algorithm (L-Clarkson) to account for the effects of heterogeneity in the lateral dimension, and performed a multi-institutional study to evaluate the effectiveness of the method. In the method, a 2D image reconstructed from computed tomography (CT) images is divided according to lines extending from the reference point to the edge of the multileaf collimator (MLC) or jaw collimator for each pie sector, and the radiological path length (RPL) of each line is calculated on the 2D image to obtain a tissue maximum ratio and phantom scatter factor, allowing the dose to be calculated. A total of 261 plans (1237 beams) for conventional breast and lung treatments and lung stereotactic body radiotherapy were collected from four institutions. Disagreements in dose between the on-site TPSs and a verification program using the C-Clarkson and L-Clarkson algorithms were compared. Systematic differences with the L-Clarkson method were within 1% for all sites, while the C-Clarkson method resulted in systematic differences of 1-5%. The L-Clarkson method showed smaller variations. This heterogeneity correction integrated with the Clarkson algorithm would provide a simple evaluation within the range of -5% to +5% for a radiotherapy plan secondary check.
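A hedged sketch of the lateral heterogeneity idea described above: for each pie sector, density is sampled along a line from the reference point to the field edge and accumulated into a radiological path length (RPL). `density_2d` is a hypothetical relative-electron-density image; lengths are in pixel units (multiply by the pixel spacing for mm).

```python
import numpy as np

def radiological_path_length(density_2d, start, end, n_samples=200):
    """Integrate density along the segment start -> end (pixel coordinates)."""
    rows = np.linspace(start[0], end[0], n_samples)
    cols = np.linspace(start[1], end[1], n_samples)
    samples = density_2d[rows.astype(int), cols.astype(int)]
    step_len = np.hypot(end[0] - start[0], end[1] - start[1]) / (n_samples - 1)
    return samples.sum() * step_len   # water-equivalent length, pixel units

def sector_rpls(density_2d, ref_point, field_edge_points):
    """One RPL per pie sector, each sector bounded by a point on the MLC/jaw edge."""
    return [radiological_path_length(density_2d, ref_point, p)
            for p in field_edge_points]
```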
The Chandra Source Catalog: Algorithms
NASA Astrophysics Data System (ADS)
McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-09-01
Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties, we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.
Fast Adapting Ensemble: A New Algorithm for Mining Data Streams with Concept Drift
Ortíz Díaz, Agustín; Ramos-Jiménez, Gonzalo; Frías Blanco, Isvani; Caballero Mota, Yailé; Morales-Bueno, Rafael
2015-01-01
The treatment of large data streams in the presence of concept drifts is one of the main challenges in the field of data mining, particularly when the algorithms have to deal with concepts that disappear and then reappear. This paper presents a new algorithm, called Fast Adapting Ensemble (FAE), which adapts very quickly to both abrupt and gradual concept drifts, and has been specifically designed to deal with recurring concepts. FAE processes the learning examples in blocks of the same size, but it does not have to wait for the batch to be complete in order to adapt its base classification mechanism. FAE incorporates a drift detector to improve the handling of abrupt concept drifts and stores a set of inactive classifiers that represent old concepts, which are activated very quickly when these concepts reappear. We compare our new algorithm with various well-known learning algorithms using common benchmark datasets. The experiments show promising results for the proposed algorithm in terms of accuracy and runtime when handling different types of concept drift.
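A simplified sketch of the recurring-concept mechanism described above: a pool of base classifiers is kept, members that score poorly on the current block become inactive but stay stored, and stored members are reactivated quickly when an old concept reappears. This illustrates the idea only; the published FAE additionally uses an explicit drift detector and block-wise incremental learning. The base learner and threshold are illustrative choices.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB   # illustrative base learner

class RecurringConceptEnsemble:
    def __init__(self, activation_threshold=0.7):
        self.pool = []                         # entries: {"clf": ..., "active": bool}
        self.threshold = activation_threshold

    def process_block(self, X, y):
        # score every stored classifier; reactivate those that fit the block
        for entry in self.pool:
            entry["active"] = entry["clf"].score(X, y) >= self.threshold
        # if no stored concept matches, assume drift and learn a new member
        if not any(e["active"] for e in self.pool):
            self.pool.append({"clf": GaussianNB().fit(X, y), "active": True})

    def predict(self, X):
        votes = np.array([e["clf"].predict(X) for e in self.pool if e["active"]])
        # majority vote over active members (labels assumed non-negative ints)
        return np.apply_along_axis(
            lambda v: np.bincount(v.astype(int)).argmax(), 0, votes)
```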
The analysis of isotherms of radionuclides sorption by inorganic sorbents
NASA Astrophysics Data System (ADS)
Bykova, E. P.; Nedobukh, T. A.
2017-09-01
The isotherm of cesium sorption by an inorganic sorbent based on granulated glauconite, obtained over a wide range of cesium concentrations, was mathematically treated using the Langmuir, Freundlich and Redlich-Peterson sorption models. The algorithms of the mathematical treatment of the experimental data using these models were described; parameters of all isotherms were determined. It was shown that assessing the correctness of various sorption models relies not only on the correlation coefficient values but also on the closeness of the calculated and experimental data. Various types of sorption sites were found as a result of the mathematical treatment of the cesium sorption isotherm. The algorithm was described and the calculation of the isotherm parameters was performed under the assumption that sorption occurs simultaneously on all three types of sorption sites in accordance with the Langmuir isotherm.
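A minimal sketch of fitting the three isotherm models named above by non-linear least squares, where C is the equilibrium cesium concentration and q the sorbed amount; the data points are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    return q_max * K * C / (1.0 + K * C)

def freundlich(C, Kf, n):
    return Kf * C ** (1.0 / n)

def redlich_peterson(C, A, B, g):
    return A * C / (1.0 + B * C ** g)

C = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0])       # synthetic example
q = np.array([0.8, 3.2, 5.5, 12.0, 15.5, 19.0, 19.8])      # synthetic example

for model, p0 in [(langmuir, (20, 1)), (freundlich, (5, 2)),
                  (redlich_peterson, (20, 1, 1))]:
    params, _ = curve_fit(model, C, q, p0=p0, maxfev=10000)
    residuals = q - model(C, *params)
    # compare models by residual sum of squares, not correlation alone
    print(model.__name__, params, "RSS =", residuals @ residuals)
```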
Tse, Yvonne; Armstrong, David; Andrews, Christopher N; Bitton, Alain; Bressler, Brian; Marshall, John; Liu, Louis W C
2017-01-01
Background. Chronic idiopathic constipation (CIC) and constipation-predominant irritable bowel syndrome (IBS-C) are common functional lower gastrointestinal disorders that impair patients' quality of life. In a national survey, we aimed to evaluate (1) Canadian physician practice patterns in the utilization of therapeutic agents listed in the new ACG and AGA guidelines; (2) physicians' satisfaction with these agents for their CIC and IBS-C patients; and (3) the usefulness of these new guidelines in their clinical practice. Methods. A 9-item questionnaire was sent to 350 Canadian specialists to evaluate their clinical practice for the management of CIC and IBS-C. Results. The response rate to the survey was 16% (n = 55). Almost all (96%) respondents followed a standard, stepwise approach for management while they believed that only 24% of referring physicians followed the same approach. Respondents found guanylyl cyclase C (GCC) agonists most satisfying when treating their patients. Among the 69% of respondents who were aware of published guidelines, only 50% found them helpful in prioritizing treatment choices, and 69% of respondents indicated that a treatment algorithm, applicable to Canadian practice, would be valuable. Conclusion. Based on this needs assessment, a treatment algorithm was developed to provide clinical guidance in the management of IBS-C and CIC in Canada.
Janjua, Naveed Zafar; Islam, Nazrul; Kuo, Margot; Yu, Amanda; Wong, Stanley; Butt, Zahid A; Gilbert, Mark; Buxton, Jane; Chapinal, Nuria; Samji, Hasina; Chong, Mei; Alvarez, Maria; Wong, Jason; Tyndall, Mark W; Krajden, Mel
2018-05-01
Large linked healthcare administrative datasets could be used to monitor programs providing prevention and treatment services to people who inject drugs (PWID). However, diagnostic codes in administrative datasets do not differentiate non-injection from injection drug use (IDU). We validated algorithms based on diagnostic codes and prescription records representing IDU in administrative datasets against interview-based IDU data. The British Columbia Hepatitis Testers Cohort (BC-HTC) includes ∼1.7 million individuals tested for HCV/HIV or reported HBV/HCV/HIV/tuberculosis cases in BC from 1990 to 2015, linked to administrative datasets including physician visit, hospitalization and prescription drug records. IDU, assessed through interviews as part of enhanced surveillance at the time of HIV or HCV/HBV diagnosis from a subset of cases included in the BC-HTC (n = 6559), was used as the gold standard. ICD-9/ICD-10 codes for IDU and injecting-related infections (IRI) were grouped with records of opioid substitution therapy (OST) into multiple IDU algorithms in administrative datasets. We assessed the performance of IDU algorithms through calculation of sensitivity, specificity, positive predictive, and negative predictive values. Sensitivity was highest (90-94%) and specificity was lowest (42-73%) for algorithms based either on IDU or IRI and drug misuse codes. Algorithms requiring both drug misuse and IRI had lower sensitivity (57-60%) and higher specificity (90-92%). An optimal sensitivity and specificity combination was found with two medical visits or a single hospitalization for injectable drugs, with (83%/82%) and without OST (78%/83%), respectively. Based on algorithms that included two medical visits, a single hospitalization or OST records, there were 41,358 recent PWID in BC (1.2% of individuals aged 11-65 years in BC) based on health encounters during the 3-year period (2013-2015). Algorithms for identifying PWID using diagnostic codes in linked administrative data could be used for tracking the progress of programs aimed at PWID. With population-based datasets, this tool can be used to inform much-needed estimates of PWID population size. Copyright © 2018 Elsevier B.V. All rights reserved.
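A minimal sketch of the validation arithmetic used above: comparing a case-finding algorithm's binary flags against the interview-based gold standard.

```python
import numpy as np

def validation_metrics(flagged, gold):
    """flagged: algorithm's IDU flags; gold: interview-based IDU status."""
    flagged, gold = np.asarray(flagged, bool), np.asarray(gold, bool)
    tp = np.sum(flagged & gold)
    tn = np.sum(~flagged & ~gold)
    fp = np.sum(flagged & ~gold)
    fn = np.sum(~flagged & gold)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn)}
```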
Tehrani, Joubin Nasehi; O'Brien, Ricky T; Poulsen, Per Rugaard; Keall, Paul
2013-12-07
Previous studies have shown that during cancer radiotherapy a small translation or rotation of the tumor can lead to errors in dose delivery. Current best practice in radiotherapy accounts for tumor translations, but is unable to address rotation due to a lack of a reliable real-time estimate. We have developed a method based on the iterative closest point (ICP) algorithm that can compute rotation from kilovoltage x-ray images acquired during radiation treatment delivery. A total of 11 748 kilovoltage (kV) images acquired from ten patients (one fraction for each patient) were used to evaluate our tumor rotation algorithm. For each kV image, the three dimensional coordinates of three fiducial markers inside the prostate were calculated. The three dimensional coordinates were used as input to the ICP algorithm to calculate the real-time tumor rotation and translation around three axes. The results show that the root mean square error was improved for real-time calculation of tumor displacement from a mean of 0.97 mm with stand-alone translation to a mean of 0.16 mm by adding real-time rotation and translation displacement with the ICP algorithm. The standard deviation (SD) of rotation for the ten patients was 2.3°, 0.89° and 0.72° for rotation around the right-left (RL), anterior-posterior (AP) and superior-inferior (SI) directions respectively. The correlation between all six degrees of freedom showed that the highest correlation belonged to the AP and SI translation with a correlation of 0.67. The second highest correlation in our study was between the rotation around RL and rotation around AP, with a correlation of -0.33. Our real-time algorithm for calculation of rotation also confirms previous studies that have shown the maximum SD belongs to AP translation and rotation around RL. ICP is a reliable and fast algorithm for estimating real-time tumor rotation which could create a pathway to investigational clinical treatment studies requiring real-time measurement and adaptation to tumor rotation.
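A hedged sketch of the rigid-fit step at the core of ICP: with the three fiducial correspondences known, the least-squares rotation and translation follow from the Kabsch (SVD) method in a single step. The Euler-angle convention mapped to the RL/AP/SI axes below is an illustrative assumption.

```python
import numpy as np

def rigid_fit(P, Q):
    """Rotation R and translation t minimizing ||R @ P_i + t - Q_i|| over markers."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

def euler_angles_deg(R):
    """Rotations (deg) about x, y, z for a ZYX decomposition (convention assumed)."""
    rx = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    ry = np.degrees(np.arcsin(-R[2, 0]))
    rz = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return rx, ry, rz
```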
Li, Xia; Abramson, Richard G; Arlinghaus, Lori R; Chakravarthy, Anuradha Bapsi; Abramson, Vandana; Mayer, Ingrid; Farley, Jaime; Delbeke, Dominique; Yankeelov, Thomas E
2012-11-16
By providing estimates of tumor glucose metabolism, 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) can potentially characterize the response of breast tumors to treatment. To assess therapy response, serial measurements of FDG-PET parameters (derived from static and/or dynamic images) can be obtained at different time points during the course of treatment. However, most studies track the changes in average parameter values obtained from the whole tumor, thereby discarding all spatial information manifested in tumor heterogeneity. Here, we propose a method whereby serially acquired FDG-PET breast data sets can be spatially co-registered to enable the spatial comparison of parameter maps at the voxel level. The goal is to optimally register normal tissues while simultaneously preventing tumor distortion. In order to accomplish this, we constructed a PET support device to enable PET/CT imaging of the breasts of ten patients in the prone position and applied a mutual information-based rigid body registration followed by a non-rigid registration. The non-rigid registration algorithm extended the adaptive bases algorithm (ABA) by incorporating a tumor volume-preserving constraint, which computed the Jacobian determinant over the tumor regions as outlined on the PET/CT images, into the cost function. We tested this approach on ten breast cancer patients undergoing neoadjuvant chemotherapy. By both qualitative and quantitative evaluation, our constrained algorithm yielded significantly less tumor distortion than the unconstrained algorithm: considering the tumor volume determined from standard uptake value maps, the post-registration median tumor volume changes (25th, 75th quantiles) were 3.42% (0%, 13.39%) and 16.93% (9.21%, 49.93%) for the constrained and unconstrained algorithms, respectively (p = 0.002), while the bending energy (a measure of the smoothness of the deformation) was 0.0015 (0.0005, 0.012) and 0.017 (0.005, 0.044), respectively (p = 0.005). The results indicate that the constrained ABA algorithm can accurately align prone breast FDG-PET images acquired at different time points while keeping the tumor from being substantially compressed or distorted. NCT00474604.
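A simplified sketch of the volume-preserving idea described above: penalize deviation of the Jacobian determinant of the deformation from 1 inside the tumor mask, so the registration cost discourages tumor compression or expansion. This is a generic NumPy illustration, not the ABA implementation.

```python
import numpy as np

def jacobian_determinant(disp):
    """disp: displacement field of shape (3, Z, Y, X) in voxel units."""
    grads = [np.gradient(disp[i]) for i in range(3)]   # d u_i / d (z, y, x)
    J = np.empty(disp.shape[1:] + (3, 3))
    for i in range(3):
        for j in range(3):
            # Jacobian of the mapping x -> x + u(x): identity plus displacement grads
            J[..., i, j] = grads[i][j] + (1.0 if i == j else 0.0)
    return np.linalg.det(J)

def volume_penalty(disp, tumor_mask):
    """Mean squared deviation of det(J) from 1 inside the tumor region."""
    detJ = jacobian_determinant(disp)
    return np.mean((detJ[tumor_mask] - 1.0) ** 2)      # added to the registration cost
```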
Mian, Shahid; Ball, Graham; Hornbuckle, Jo; Holding, Finn; Carmichael, James; Ellis, Ian; Ali, Selman; Li, Geng; McArdle, Stephanie; Creaser, Colin; Rees, Robert
2003-09-01
An ability to predict the likelihood of cellular response towards particular chemotherapeutic agents based upon protein expression patterns could facilitate the identification of biological molecules with previously undefined roles in the process of chemoresistance/chemosensitivity, and if robust enough these patterns might also be exploited towards the development of novel predictive assays. To ascertain whether proteomics-based molecular profiling in conjunction with artificial neural network (ANN) algorithms could be applied towards the specific recognition of phenotypic patterns between either control or drug-treated and chemosensitive or chemoresistant cellular populations, a combined approach involving matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry, Ciphergen protein chip technology and ANN algorithms has been applied to specifically identify proteomic 'fingerprints' indicative of treatment regimen for chemosensitive (MCF-7, T47D) and chemoresistant (MCF-7/ADR) breast cancer cell lines following exposure to Doxorubicin or Paclitaxel. The results indicate that proteomic patterns can be identified by ANN algorithms to correctly assign 'class' for treatment regimen (e.g. control/drug-treated or chemosensitive/chemoresistant) with a high degree of accuracy using bootstrap statistical validation techniques, and that biomarker ion patterns indicative of response/non-response phenotypes are associated with MCF-7 and MCF-7/ADR cells exposed to Doxorubicin. We have also examined the predictive capability of this approach towards MCF-7 and T47D cells to ascertain whether prediction could be made based upon treatment regimen irrespective of cell lineage. Models were identified that could correctly assign class (control or Paclitaxel treatment) for 35/38 samples of an independent dataset. A similar level of predictive capability was also found (> 92%; n = 28) when proteomic patterns derived from the drug-resistant cell line MCF-7/ADR were compared against those derived from MCF-7 and T47D as a model system of drug-resistant and drug-sensitive phenotypes. This approach might offer a potential methodology for predicting the biological behaviour of cancer cells towards particular chemotherapeutics and, through protein isolation and sequence identification, could result in the identification of biological molecules associated with chemosensitive/chemoresistant tumour phenotypes.
Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam
2016-01-01
The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules-based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0% and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified. Novel variants of known resistance factors and incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family, rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because those genes were not seen or seen infrequently in the test set. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.
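A toy sketch of the rules-based idea described above: map detected resistance genes to predicted antibiotic resistance via a curated lookup table. The gene-to-drug entries below are illustrative simplifications, not the study's curated knowledge base.

```python
# curated rules: resistance gene -> antibiotics it is known to confer resistance to
RULES = {
    "blaKPC": {"meropenem", "ceftriaxone", "ampicillin"},   # carbapenemase
    "blaTEM-1": {"ampicillin"},                             # narrow-spectrum beta-lactamase
    "aac(6')-Ib": {"tobramycin"},                           # aminoglycoside-modifying enzyme
}

def predict_resistance(detected_genes, antibiotics):
    resistant = set().union(*(RULES.get(g, set()) for g in detected_genes))
    return {ab: ("R" if ab in resistant else "S") for ab in antibiotics}

print(predict_resistance({"blaTEM-1"}, ["ampicillin", "meropenem"]))
# {'ampicillin': 'R', 'meropenem': 'S'}
```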
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Z; Kennedy, A; Larsen, E
2015-06-15
Purpose: The aim of this study was to investigate the dosimetric impact of the combination of photon energy and treatment technique on radiotherapy of localized prostate cancer when knowledge-based planning was used. Methods: A total of 16 patients with localized prostate cancer were retrospectively retrieved from the database and used for this study. For each patient, four types of treatment plans with different combinations of photon energy (6X and 10X) and treatment techniques (7-field IMRT and 2-arc VMAT) were created using a prostate DVH estimation model in RapidPlan™ and the Eclipse treatment planning system (Varian Medical System). For any beam arrangement, DVH objectives and weighting priorities were generated based on the geometric relationship between the OAR and PTV. The photon optimization algorithm was used for plan optimization and the AAA algorithm was used for final dose calculation. Plans were evaluated in terms of the pre-defined dosimetric endpoints for PTV, rectum, bladder, penile bulb, and femur heads. A Student’s paired t-test was used for statistical analysis and p < 0.05 was considered statistically significant. Results: For PTV, V95 was statistically similar among all four types of plans, though the mean dose of 10X plans was higher than that of 6X plans. VMAT plans showed a higher heterogeneity index than IMRT plans. No statistically significant difference in dosimetry metrics was observed for rectum, bladder, and penile bulb among plan types. For the left and right femurs, VMAT plans had a higher mean dose than IMRT plans regardless of photon energy, whereas the maximum dose was similar. Conclusion: Overall, the dosimetric endpoints were similar regardless of photon energy and treatment technique when knowledge-based auto planning was used. Given the similarity in dosimetry metrics of rectum, bladder, and penile bulb, the genitourinary and gastrointestinal toxicities should be comparable among the selections of photon energy and treatment technique.
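A minimal sketch of the paired comparison described above, assuming one dosimetric endpoint value per patient for each pair of plan types; the values here are synthetic placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
endpoint_6x_imrt = rng.normal(70.0, 1.0, 16)                 # illustrative values
endpoint_10x_vmat = endpoint_6x_imrt + rng.normal(0.1, 0.3, 16)

t, p = stats.ttest_rel(endpoint_6x_imrt, endpoint_10x_vmat)  # paired t-test
print(f"paired t = {t:.2f}, p = {p:.3f}; significant at p < 0.05: {p < 0.05}")
```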
Zhou, Lu; Zhou, Linghong; Zhang, Shuxu; Zhen, Xin; Yu, Hui; Zhang, Guoqian; Wang, Ruihao
2014-01-01
Deformable image registration (DIR) is widely used in radiation therapy, such as in automatic contour generation, dose accumulation, and tumor growth or regression analysis. To achieve higher registration accuracy and faster convergence, an improved 'diffeomorphic demons' registration algorithm was proposed and validated. Based on Brox et al.'s gradient constancy assumption and Malis's efficient second-order minimization (ESM) algorithm, a grey value gradient similarity term and a transformation error term were added into the demons energy function, and a formula was derived to calculate the update of the transformation field. The limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) algorithm was used to optimize the energy function so that the iteration number could be determined automatically. The proposed algorithm was validated using mathematically deformed images and physically deformed phantom images. Compared with the original 'diffeomorphic demons' algorithm, the proposed registration method achieves higher precision and faster convergence. Because of different scanning conditions across treatment fractions, the density ranges of the treatment image and the planning image may differ; in such cases, the improved demons algorithm can still achieve fast and accurate registration for radiotherapy.
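A minimal 2D sketch of the classic demons update that the improved algorithm above extends: the per-pixel displacement update is driven by the intensity difference along the static image gradient (Thirion's passive force), with the standard normalization.

```python
import numpy as np

def demons_step(static, moving_warped, eps=1e-9):
    """One passive-force demons update for 2D float images of equal shape."""
    gy, gx = np.gradient(static)                    # static image gradient
    diff = moving_warped - static                   # intensity mismatch
    denom = gx**2 + gy**2 + diff**2 + eps           # standard demons normalization
    return diff * gx / denom, diff * gy / denom     # (ux, uy) displacement update
```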
Trivedi, Madhukar H; Daly, Ella J
2007-05-01
Despite years of antidepressant drug development and patient and provider education, suboptimal medication dosing and duration of exposure resulting in incomplete remission of symptoms remains the norm in the treatment of depression. Additionally, since no one treatment is effective for all patients, optimal implementation focusing on the measurement of symptoms, side effects, and function is essential to determine effective sequential treatment approaches. There is a need for a paradigm shift in how clinical decision making is incorporated into clinical practice and for a move away from the trial-and-error approach that currently determines the "next best" treatment. This paper describes how our experience with the Texas Medication Algorithm Project (TMAP) and the Sequenced Treatment Alternatives to Relieve Depression (STAR*D) trial has confirmed the need for easy-to-use clinical support systems to ensure fidelity to guidelines. To further enhance guideline fidelity, we have developed an electronic decision support system that provides critical feedback and guidance at the point of patient care. We believe that a measurement-based care (MBC) approach is essential to any decision support system, allowing physicians to individualize and adapt decisions about patient care based on symptom progress, tolerability of medication, and dose optimization. We also believe that successful integration of sequential algorithms with MBC into real-world clinics will facilitate change that will endure and improve patient outcomes. Although we use major depression to illustrate our approach, the issues addressed are applicable to other chronic psychiatric conditions including comorbid depression and substance use disorder as well as other medical illnesses.
Automated planning of ablation targets in atrial fibrillation treatment
NASA Astrophysics Data System (ADS)
Keustermans, Johannes; De Buck, Stijn; Heidbüchel, Hein; Suetens, Paul
2011-03-01
Catheter-based radio-frequency ablation is used as an invasive treatment of atrial fibrillation. This procedure is often guided by the use of 3D anatomical models obtained from CT, MRI or rotational angiography. During the intervention, the operator accurately guides the catheter to prespecified target ablation lines. The planning stage, however, can be time-consuming and operator-dependent, which is suboptimal both from a cost and health perspective. Therefore, we present a novel statistical model-based algorithm for locating ablation targets from 3D rotational angiography images. Based on a training data set of 20 patients, consisting of 3D rotational angiography images with 30 manually indicated ablation points, a statistical local appearance and shape model is built. The local appearance model is based on local image descriptors to capture the intensity patterns around each ablation point. The local shape model is constructed by embedding the ablation points in an undirected graph and imposing that each ablation point only interacts with its neighbors. Identifying the ablation points on a new 3D rotational angiography image is performed by proposing a set of possible candidate locations for each ablation point, as such converting the problem into a labeling problem. The algorithm is validated using a leave-one-out approach on the training data set, by computing the distance between the ablation lines obtained by the algorithm and the manually identified ablation points. The distance error is equal to 3.8 ± 2.9 mm. As ablation lesion size is around 5-7 mm, automated planning of ablation targets by the presented approach is sufficiently accurate.
Shimol, Eli Ben; Joskowicz, Leo; Eliahou, Ruth; Shoshan, Yigal
2018-02-01
Stereotactic radiosurgery (SRS) is a common treatment for intracranial meningiomas. SRS is planned on a pre-therapy gadolinium-enhanced T1-weighted MRI scan (Gd-T1w MRI) in which the meningioma contours have been delineated. Post-SRS therapy serial Gd-T1w MRI scans are then acquired for longitudinal treatment evaluation. Accurate tumor volume change quantification is required for treatment efficacy evaluation and for treatment continuation. We present a new algorithm for the automatic segmentation and volumetric assessment of meningioma in post-therapy Gd-T1w MRI scans. The inputs are the pre- and post-therapy Gd-T1w MRI scans and the meningioma delineation in the pre-therapy scan. The output is the meningioma delineations and volumes in the post-therapy scan. The algorithm uses the pre-therapy scan and its meningioma delineation to initialize an extended Chan-Vese active contour method and as a strong patient-specific intensity and shape prior for the post-therapy scan meningioma segmentation. The algorithm is automatic, obviates the need for independent tumor localization and segmentation initialization, and incorporates the same tumor delineation criteria in both the pre- and post-therapy scans. Our experimental results on retrospective pre- and post-therapy scans with a total of 32 meningiomas with volumes ranging from 0.4 to 26.5 cm³ yield a Dice coefficient of [Formula: see text]% with respect to ground-truth delineations in post-therapy scans created by two clinicians. These results indicate a high correspondence to the ground-truth delineations. Our algorithm yields more reliable and accurate tumor volume change measurements than other stand-alone segmentation methods. It may be a useful tool for quantitative meningioma prognosis evaluation after SRS.
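A hedged sketch of the initialization idea above, using scikit-image's generic Chan-Vese implementation on a single 2D slice: the pre-therapy delineation seeds the level set for the post-therapy segmentation. The paper's method extends Chan-Vese with patient-specific intensity and shape priors; this shows only the initialization step, and the parameter values are illustrative.

```python
import numpy as np
from skimage.segmentation import chan_vese

def segment_followup_slice(post_slice, prior_mask):
    """post_slice: 2D post-therapy image; prior_mask: pre-therapy contour mask."""
    # signed initial level set: positive inside the pre-therapy contour
    init = np.where(prior_mask, 1.0, -1.0)
    seg = chan_vese(post_slice.astype(float), mu=0.25, init_level_set=init)
    return seg   # boolean mask; seg.sum() * voxel volume gives the slice's tumor volume
```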
Diagnosis and treatment of gastroesophageal reflux disease complicated by Barrett's esophagus.
Stasyshyn, Andriy
2017-08-31
The aim of the study was to evaluate the effectiveness of a diagnostic and therapeutic algorithm for gastroesophageal reflux disease complicated by Barrett's esophagus in 46 patients. A diagnostic and therapeutic algorithm for complicated GERD was developed. To describe the changes in the esophagus with reflux esophagitis, the Los Angeles classification was used. Intestinal metaplasia of the epithelium in the lower third of the esophagus was assessed using videoendoscopy, chromoscopy, and biopsy. Quality of life was assessed with the Gastro-Intestinal Quality of Life Index. The methods used were modeling, clinical, analytical, comparative, standardized, and questionnaire-based. Results and discussion. Among the complications of GERD, Barrett's esophagus was diagnosed in 9 (19.6 %), peptic ulcer in the esophagus in 10 (21.7 %), peptic stricture of the esophagus in 4 (8.7 %), and esophageal-gastric bleeding in 23 (50.0 %), including Mallory-Weiss syndrome in 18 and erosive ulcerous bleeding in 5 people. Hiatal hernia was diagnosed in 171 (87.7 %) patients (sliding in 157 (91.8%), paraesophageal hernia in 2 (1.2%), and mixed hernia in 12 (7.0%) cases). One hundred ninety-five patients underwent laparoscopic surgery. Nissen fundoplication was conducted in 176 (90.2%) patients, Toupet fundoplication in 14 (7.2%), and Dor fundoplication in 5 (2.6%). It was established that the use of the diagnostic and treatment algorithm promoted systematization and objectification of changes in complicated GERD, contributed to early diagnosis, helped in choosing treatment, and improved quality of life. Argon coagulation and use of PPIs for 8-12 weeks before surgery led to the regeneration of the mucous membrane in the esophagus. The developed diagnostic and therapeutic algorithm facilitated systematization and objectification of changes in complicated GERD, contributed to early diagnosis, helped in choosing treatment, and improved quality of life.
Implementation and evaluation of various demons deformable image registration algorithms on a GPU.
Gu, Xuejun; Pan, Hubert; Liang, Yun; Castillo, Richard; Yang, Deshan; Choi, Dongju; Castillo, Edward; Majumdar, Amitava; Guerrero, Thomas; Jiang, Steve B
2010-01-07
Online adaptive radiation therapy (ART) promises the ability to deliver an optimal treatment in response to daily patient anatomic variation. A major technical barrier for the clinical implementation of online ART is the requirement of rapid image segmentation. Deformable image registration (DIR) has been used as an automated segmentation method to transfer tumor/organ contours from the planning image to daily images. However, the current computational time of DIR is insufficient for online ART. In this work, this issue is addressed by using computer graphics processing units (GPUs). A gray-scale-based DIR algorithm called demons and five of its variants were implemented on GPUs using the compute unified device architecture (CUDA) programming environment. The spatial accuracy of these algorithms was evaluated over five sets of pulmonary 4D CT images with an average size of 256 × 256 × 100 and more than 1100 expert-determined landmark point pairs each. For all the testing scenarios presented in this paper, the GPU-based DIR computation required around 7 to 11 s to yield an average 3D error ranging from 1.5 to 1.8 mm. Interestingly, the original passive force demons algorithm outperforms the subsequently proposed variants on the combination of accuracy, efficiency and ease of implementation.
NASA Astrophysics Data System (ADS)
Harmon, Stephanie A.; Tuite, Michael J.; Jeraj, Robert
2016-10-01
Site selection for image-guided biopsies in patients with multiple lesions is typically based on clinical feasibility and physician preference. This study outlines the development of a selection algorithm that, in addition to clinical requirements, incorporates quantitative imaging data for automatic identification of candidate lesions for biopsy. The algorithm is designed to rank potential targets by maximizing a lesion-specific score, incorporating various criteria separated into two categories: (1) a physician-feasibility category, including physician-preferred lesion location and absolute volume scores, and (2) an imaging-based category, including various modality- and application-specific metrics. This platform was benchmarked in two clinical scenarios, a pre-treatment setting and a response-based setting, using imaging from metastatic prostate cancer patients with high disease burden (multiple lesions) undergoing conventional treatment and receiving whole-body [18F]NaF PET/CT scans pre- and mid-treatment. Targeting of metastatic lesions was robust to different weighting ratios, and candidacy for biopsy was physician-confirmed. Lesions ranked as top targets for biopsy remained so for all patients in pre-treatment and post-treatment biopsy selection after sensitivity testing for physician-biased or imaging-biased scenarios. After identifying candidates, biopsy feasibility was evaluated by a physician and confirmed for 90% (32/36) of high-ranking lesions, of which all top choices were confirmed. The remaining cases represented lesions with high anatomical difficulty for targeting, such as proximity to the sciatic nerve. This newly developed selection method was successfully used to quantitatively identify candidate lesions for biopsies in patients with multiple lesions. In a prospective study, we were able to successfully plan, develop, and implement this technique for the selection of a pre-treatment biopsy location.
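A minimal sketch of the two-category lesion score described above: each candidate lesion receives a weighted sum of physician-feasibility terms and imaging-based terms, and lesions are ranked by total score. The specific sub-scores and the equal weighting ratio are illustrative assumptions.

```python
def lesion_score(lesion, w_physician=0.5, w_imaging=0.5):
    """Weighted combination of the two score categories for one lesion (dict)."""
    physician = (lesion["location_preference"] + lesion["volume_score"]) / 2.0
    imaging = (lesion["uptake_score"] + lesion["response_score"]) / 2.0
    return w_physician * physician + w_imaging * imaging

def rank_biopsy_targets(lesions, **weights):
    """Return lesions sorted from best to worst biopsy candidate."""
    return sorted(lesions, key=lambda l: lesion_score(l, **weights), reverse=True)
```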
Perioperative management of endocrine insufficiency after total pancreatectomy for neoplasia.
Maker, Ajay V; Sheikh, Raashid; Bhagia, Vinita
2017-09-01
Indications for total pancreatectomy (TP) have increased, including for diffuse main duct intrapapillary mucinous neoplasms of the pancreas and malignancy; therefore, the need persists for surgeons to develop appropriate endocrine post-operative management strategies. The brittle diabetes after TP differs from type 1/2 diabetes in that patients have an absolute deficiency of insulin and functional glucagon. This makes glucose management challenging, complicates recovery, and predisposes to hospital readmissions. This article aims to define the disease, describe the cause of its occurrence, review the anatomy of the endocrine pancreas, and explain how this condition differs from diabetes mellitus in the setting of post-operative management. The morbidity and mortality of post-TP endocrine insufficiency and practical treatment strategies are systematically reviewed from the literature. Finally, an evidence-based treatment algorithm is created for the practicing pancreatic surgeon and their care team of endocrinologists to aid in managing these complex patients. A PubMed, Science Citation Index/Social Sciences Citation Index, and Cochrane Evidence-Based Medicine database search was undertaken, along with an extensive backward search of the references of published articles, to identify studies evaluating endocrine morbidity and treatment after TP and to establish an evidence-based treatment strategy. Indications for TP and the etiology of pancreatogenic diabetes are reviewed. After TP, ~80% of patients develop hypoglycemic episodes and 40% experience severe hypoglycemia, resulting in 0-8% mortality and 25-45% morbidity. Referral to a nutritionist and endocrinologist for patient education before surgery, followed by surgical reevaluation to determine whether the patient has the appropriate understanding, support, and resources preoperatively, has significantly reduced morbidity and mortality. The use of modern recombinant long-acting insulin analogues, continuous subcutaneous insulin infusion, and glucagon rescue therapy has greatly improved management in the modern era and constitutes the current standard of care. A simple immediate post-operative algorithm was constructed. Successful perioperative surgical management of total pancreatectomy and the resulting pancreatogenic diabetes is critical to achieving acceptable post-operative outcomes, and we review the pertinent literature and provide a simple, evidence-based algorithm for immediate post-resection glycemic control.
Planning of electroporation-based treatments using Web-based treatment-planning software.
Pavliha, Denis; Kos, Bor; Marčan, Marija; Zupanič, Anže; Serša, Gregor; Miklavčič, Damijan
2013-11-01
Electroporation-based treatment combining high-voltage electric pulses and poorly permeant cytotoxic drugs, i.e., electrochemotherapy (ECT), is currently used for treating superficial tumor nodules by following standard operating procedures. Besides ECT, another electroporation-based treatment, nonthermal irreversible electroporation (N-TIRE), is also efficient at ablating deep-seated tumors. To perform ECT or N-TIRE of deep-seated tumors, following standard operating procedures is not sufficient, and patient-specific treatment planning is required for successful treatment. Treatment planning is required because of the use of individual long-needle electrodes and the diverse shape, size and location of deep-seated tumors. Many institutions that already perform ECT of superficial metastases could benefit from treatment-planning software that would enable the preparation of patient-specific treatment plans. To this end, we have developed Web-based treatment-planning software for planning electroporation-based treatments that does not require prior engineering knowledge from the user (e.g., the clinician). The software includes algorithms for automatic tissue segmentation and, after segmentation, generation of a 3D model of the tissue. The procedure allows the user to define how the electrodes will be inserted. Finally, the electric field distribution is computed, the position of the electrodes and the voltage to be applied are optimized using the 3D model, and a downloadable treatment plan is made available to the user.
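As a minimal illustration of the final optimization step, the sketch below computes the quantity such planning ultimately targets: the fraction of tumor voxels in which the electric field magnitude reaches an electroporation threshold. The array names and the threshold value are assumptions, not the software's actual interface.

```python
import numpy as np

def coverage(e_field, tumor_mask, threshold_v_per_cm=400.0):
    """Fraction of tumor voxels reaching the electroporation threshold.

    e_field: |E| per voxel (V/cm); tumor_mask: boolean array of the
    same shape.  The threshold is an illustrative placeholder.
    """
    return float((e_field[tumor_mask] >= threshold_v_per_cm).mean())

# An optimizer would adjust electrode positions and applied voltages to
# drive coverage -> 1 while limiting field strength outside the target.
```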
Zhang, Melvyn W B; Ho, Roger C M; Mcintyre, Roger S
2016-07-27
Over the past decade, there have been massive advances in technology. These advances have significantly transformed various aspects of healthcare. The advent of E-health and its influence on healthcare practice also implies a paradigm shift in the way healthcare professionals work. Conventionally, healthcare professionals would have to refer to books and journals for updates in treatment algorithms, but with the advent of technology, they can access this information via the web or via various smartphone applications on the go. In the field of Psychiatry, one of the commonest mental health disorders to date, with significant morbidity and mortality, is Major Depressive Disorder. Routinely, clinicians and healthcare professionals are advised to refer to standard guidelines to guide their treatment options. Given the high prevalence of conditions like Major Depressive Disorder, it is thus important that the guidelines to which clinicians and healthcare professionals refer are constantly kept up to date, so that patients can benefit from the latest evidence-based therapy and treatment. A review of the current literature highlights that while there is a multitude of smartphone applications designed for mental health care, a previous systematic review has highlighted a paucity of evidence-based applications. More importantly, the current literature on the provision of treatment information to healthcare professionals and patients is limited to web-based interventions. The aim of this technical note is to describe a methodology that the authors have conceptualized for the implementation of an evidence-based mental health guideline application, known as the 'Wiki Guidelines' smartphone application. The authors hope to illustrate the algorithms behind the development of the application, and how it can be easily updated by the guidelines working group.
Biomechanical deformable image registration of longitudinal lung CT images using vessel information
NASA Astrophysics Data System (ADS)
Cazoulat, Guillaume; Owen, Dawn; Matuszak, Martha M.; Balter, James M.; Brock, Kristy K.
2016-07-01
Spatial correlation of lung tissue across longitudinal images, as the patient responds to treatment, is a critical step in adaptive radiotherapy. The goal of this work is to expand a biomechanical model-based deformable registration algorithm (Morfeus) to achieve accurate registration in the presence of significant anatomical changes. Six lung cancer patients previously treated with conventionally fractionated radiotherapy were retrospectively evaluated. Exhale CT scans were obtained at treatment planning and following three weeks of treatment. For each patient, the planning CT was registered to the follow-up CT using Morfeus, a biomechanical model-based deformable registration algorithm. To model the complex response of the lung, an extension to Morfeus has been developed: an initial deformation was estimated with Morfeus consisting of boundary conditions on the chest wall and incorporating a sliding interface with the lungs. It was hypothesized that the addition of boundary conditions based on vessel tree matching would provide a robust reduction of the residual registration error. To achieve this, the vessel trees were segmented on the two images by thresholding a vesselness image based on the Hessian matrix's eigenvalues. For each point on the reference vessel tree centerline, the displacement vector was estimated by applying a variant of the Demons registration algorithm between the planning CT and the deformed follow-up CT. An expert independently identified corresponding landmarks well distributed in the lung to compute target registration errors (TRE). The TRE was 5.8 ± 2.9, 3.4 ± 2.3 and 1.6 ± 1.3 mm after rigid registration, Morfeus, and Morfeus with boundary conditions on the vessel tree, respectively. In conclusion, the addition of boundary conditions on the vessels significantly improved the accuracy in modeling the response of the lung and tumor over the course of radiotherapy. Minimizing and modeling these geometrical uncertainties will enable future plan adaptation strategies.
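The vessel segmentation step can be sketched as a single-scale, Frangi-style vesselness computed from the eigenvalues of the image Hessian. The parameter values below are illustrative assumptions, and the paper's pipeline additionally extracts centerlines and matches the two trees.

```python
import numpy as np
from scipy import ndimage

def vesselness(image, sigma=1.0, alpha=0.5, beta=0.5, c=100.0):
    """Single-scale Frangi-style tubularness for a 3D image (sketch)."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    grads = np.gradient(smoothed)
    hessian = np.empty(image.shape + (3, 3))
    for i in range(3):
        for j in range(3):
            hessian[..., i, j] = np.gradient(grads[i], axis=j)

    # Eigenvalues sorted by absolute value: |l1| <= |l2| <= |l3|.
    eig = np.linalg.eigvalsh(hessian)
    order = np.argsort(np.abs(eig), axis=-1)
    l1, l2, l3 = np.moveaxis(np.take_along_axis(eig, order, axis=-1), -1, 0)

    ra = np.abs(l2) / (np.abs(l3) + 1e-10)            # plate vs line
    rb = np.abs(l1) / (np.sqrt(np.abs(l2 * l3)) + 1e-10)  # blob vs line
    s = np.sqrt(l1**2 + l2**2 + l3**2)                # structure strength
    v = ((1 - np.exp(-ra**2 / (2 * alpha**2)))
         * np.exp(-rb**2 / (2 * beta**2))
         * (1 - np.exp(-s**2 / (2 * c**2))))
    v[(l2 > 0) | (l3 > 0)] = 0.0   # keep bright tubular structures only
    return v
```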
SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo
2016-06-15
Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for the general-purpose linac, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
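The core of a Clarkson-style independent check can be sketched compactly: the irregular field is divided into equal-angle sectors around the calculation point, and a tabulated scatter factor is averaged over the sector radii. The function and table below are hypothetical stand-ins, not the SMU implementation.

```python
def clarkson_scatter_factor(radii_by_sector, sar_table, n_sectors=36):
    """Averaged scatter factor by Clarkson sector integration (sketch).

    radii_by_sector: field-edge radius (cm) in each equal-angle sector
    around the calculation point; sar_table: hypothetical callable
    mapping radius to a tabulated scatter factor SAR(r).
    """
    assert len(radii_by_sector) == n_sectors
    return sum(sar_table(r) for r in radii_by_sector) / n_sectors

# The independent dose is then built from MU, output factors, and this
# averaged scatter term, and compared against the TPS dose.
```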
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCarroll, R; UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX; Beadle, B
Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse's Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher performing algorithm was selected as the primary contouring method, the other used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted from 0.5 to 2 cm. Using a logit model the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients and the sensitivity and specificity of the models verified. Results: Per physician rating, the multi-atlas method (4.8/5 point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5 cm (brain), 0.75 cm (mandible, cord), 1 cm (brainstem, cochlea), or 1.25 cm (parotid), with sensitivity and specificity greater than 0.95. If sensitivity and specificity constraints are reduced to 0.9, detectable shifts of mandible and brainstem were reduced by 0.25 cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified. This fully automated process could be used to flag auto-contours for special review or used with safety margins in a fully automatic treatment planning system.
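The verification model itself is a standard logistic regression on contour-comparison metrics. A hedged sketch with synthetic data follows; the covariates mirror those named in the abstract (mean distance to agreement and true positive rate), while the data distributions are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins: unshifted contours (label 0) vs shifted (label 1).
rng = np.random.default_rng(0)
mda = np.concatenate([rng.normal(1.0, 0.3, 50),    # mean distance to agreement (mm)
                      rng.normal(6.0, 1.0, 50)])
tpr = np.concatenate([rng.normal(0.90, 0.03, 50),  # true positive rate
                      rng.normal(0.60, 0.08, 50)])
y = np.array([0] * 50 + [1] * 50)

X = np.column_stack([mda, tpr])
model = LogisticRegression().fit(X, y)
p_shift = model.predict_proba(X)[:, 1]   # probability a contour is incorrect
```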
Kaiser, Tim; Laireiter, Anton Rupert
2017-07-20
In recent years, the assessment of mental disorders has become more and more personalized. Modern advancements such as Internet-enabled mobile phones and increased computing capacity make it possible to tap sources of information that have long been unavailable to mental health practitioners. Software packages that combine algorithm-based treatment planning, process monitoring, and outcome monitoring are scarce. The objective of this study was to assess whether the DynAMo Web application can fill this gap by providing a software solution that can be used by both researchers to conduct state-of-the-art psychotherapy process research and clinicians to plan treatments and monitor psychotherapeutic processes. In this paper, we report on the current state of a Web application that can be used for assessing the temporal structure of mental disorders using information on their temporal and synchronous associations. A treatment planning algorithm automatically interprets the data and delivers priority scores of symptoms to practitioners. The application is also capable of monitoring psychotherapeutic processes during therapy and of monitoring treatment outcomes. This application was developed using the R programming language (R Core Team, Vienna) and the Shiny Web application framework (RStudio, Inc, Boston). It is made entirely from open-source software packages and thus is easily extensible. The capabilities of the proposed application are demonstrated. Case illustrations are provided to exemplify its usefulness in clinical practice. With the broad availability of Internet-enabled mobile phones and similar devices, collecting data on psychopathology and psychotherapeutic processes has become easier than ever. The proposed application is a valuable tool for capturing, processing, and visualizing these data. The combination of dynamic assessment and process- and outcome monitoring has the potential to improve the efficacy and effectiveness of psychotherapy. ©Tim Kaiser, Anton Rupert Laireiter. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 20.07.2017.
Cascade process modeling with mechanism-based hierarchical neural networks.
Cong, Qiumei; Yu, Wen; Chai, Tianyou
2010-02-01
Cascade processes, such as wastewater treatment plants, include many nonlinear sub-systems and many variables. When the number of sub-systems is large, the input-output relation between the first block and the last block cannot represent the whole process. In this paper we use two techniques to overcome this problem. First, we propose a new neural model, hierarchical neural networks, to identify the cascade process; then we use a serial structural mechanism model based on the physical equations to connect with the neural model. A stable learning algorithm and theoretical analysis are given. Finally, this method is used to model a wastewater treatment plant. Real operational data from the wastewater treatment plant are used to illustrate the modeling approach.
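The hierarchical structure can be sketched in a few lines: one small neural block per sub-system, chained so that each block's output feeds the next, rather than a single first-to-last mapping. The weights below are random placeholders, and the paper's stable learning algorithm is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def block(x, w1, w2):
    # One sub-system model: a small one-hidden-layer network.
    return np.tanh(x @ w1) @ w2

u = rng.normal(size=(100, 3))                 # external inputs to stage 1
stages = [(0.1 * rng.normal(size=(3, 8)),     # four cascaded sub-systems
           0.1 * rng.normal(size=(8, 3))) for _ in range(4)]

y = u
for w1, w2 in stages:
    y = block(y, w1, w2)                      # each stage feeds the next
```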
Automated Assessment of Existing Patient's Revised Cardiac Risk Index Using Algorithmic Software.
Hofer, Ira S; Cheng, Drew; Grogan, Tristan; Fujimoto, Yohei; Yamada, Takashige; Beck, Lauren; Cannesson, Maxime; Mahajan, Aman
2018-05-25
Previous work in the field of medical informatics has shown that rules-based algorithms can be created to identify patients with various medical conditions; however, these techniques have not been compared to actual clinician notes, nor has the ability to predict complications been tested. We hypothesize that a rules-based algorithm can successfully identify patients with the diseases in the Revised Cardiac Risk Index (RCRI). Patients undergoing surgery at the University of California, Los Angeles Health System between April 1, 2013 and July 1, 2016 and who had at least 2 previous office visits were included. For each disease in the RCRI except renal failure (congestive heart failure, ischemic heart disease, cerebrovascular disease, and diabetes mellitus), diagnosis algorithms were created based on diagnostic and standard clinical treatment criteria. For each disease state, the prevalence of the disease as determined by the algorithm, International Classification of Disease (ICD) code, and the anesthesiologist's preoperative note was determined. Additionally, 400 American Society of Anesthesiologists class III and IV cases were randomly chosen for manual review by an anesthesiologist. The sensitivity, specificity, accuracy, positive predictive value, negative predictive value, and area under the receiver operating characteristic curve were determined using the manual review as a gold standard. Lastly, the ability of the RCRI as calculated by each of the methods to predict in-hospital mortality was determined, and the time necessary to run the algorithms was calculated. A total of 64,151 patients met the inclusion criteria for the study. In general, the incidence of definite or likely disease determined by the algorithms was higher than that detected by the anesthesiologist. Additionally, in all disease states, the prevalence of disease was always lowest for the ICD codes, followed by the preoperative note, followed by the algorithms. In the subset of patients for whom the records were manually reviewed, the algorithms were generally the most sensitive and the ICD codes the most specific. When computing the modified RCRI using each of the methods, the modified RCRI from the algorithms predicted in-hospital mortality with an area under the receiver operating characteristic curve of 0.70 (0.67-0.73), compared to 0.70 (0.67-0.72) for ICD codes and 0.64 (0.61-0.67) for the preoperative note. On average, the algorithms took 12.64 ± 1.20 minutes to run on 1.4 million patients. Rules-based algorithms for the diseases in the RCRI can be created that perform with discriminative ability similar to physician notes and ICD codes, but with significantly increased economies of scale.
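A single disease rule from such an algorithm might look like the hedged sketch below; the field names, thresholds, and two-of-three evidence rule are illustrative assumptions, not the published criteria.

```python
# Hedged sketch of one rules-based diagnosis flag (illustrative only).
def has_diabetes(patient):
    on_treatment = any(med in patient["medications"]
                       for med in ("insulin", "metformin"))
    lab_positive = patient.get("hba1c", 0.0) >= 6.5   # assumed cut-off
    coded = "diabetes" in patient["problem_list"]
    # Require at least two independent lines of evidence (assumption).
    return sum([on_treatment, lab_positive, coded]) >= 2

patient = {"medications": ["metformin"], "hba1c": 7.1, "problem_list": []}
print(has_diabetes(patient))  # True
```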
Factor-Analysis Methods for Higher-Performance Neural Prostheses
Santhanam, Gopal; Yu, Byron M.; Gilja, Vikash; Ryu, Stephen I.; Afshar, Afsheen; Sahani, Maneesh; Shenoy, Krishna V.
2009-01-01
Neural prostheses aim to provide treatment options for individuals with nervous-system disease or injury. It is necessary, however, to increase the performance of such systems before they can be clinically viable for patients with motor dysfunction. One performance limitation is the presence of correlated trial-to-trial variability that can cause neural responses to wax and wane in concert as the subject is, for example, more attentive or more fatigued. If a system does not properly account for this variability, it may mistakenly interpret such variability as an entirely different intention by the subject. We report here the design and characterization of factor-analysis (FA)-based decoding algorithms that can contend with this confound. We characterize the decoders (classifiers) on experimental data where monkeys performed both a real reach task and a prosthetic cursor task while we recorded from 96 electrodes implanted in dorsal premotor cortex. The decoder attempts to infer the underlying factors that comodulate the neurons' responses and can use this information to substantially lower error rates in one-of-eight reach endpoint prediction by ≲75% (e.g., ∼20% total prediction error using traditional independent Poisson models reduced to ∼5%). We also examine additional key aspects of these new algorithms: the effect of neural integration window length on performance, an extension of the algorithms to use Poisson statistics, and the effect of training set size on the decoding accuracy of test data. We found that FA-based methods are most effective for integration windows >150 ms, although still advantageous at shorter timescales; that Gaussian-based algorithms performed better than the analogous Poisson-based algorithms; and that the FA algorithm is robust even with a limited amount of training data. We propose that FA-based methods are effective in modeling correlated trial-to-trial neural variability and can be used to substantially increase overall prosthetic system performance. PMID:19297518
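The decoding pipeline is straightforward to sketch: fit a factor-analysis model to the spike counts, project trials into the low-dimensional factor space, and classify reach endpoints there. The sketch below uses scikit-learn, with synthetic counts standing in for the 96-electrode recordings, so the score reflects chance rather than the paper's results.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X = rng.poisson(5.0, size=(800, 96)).astype(float)  # trials x units (synthetic)
y = rng.integers(0, 8, size=800)                    # 8 reach endpoints

fa = FactorAnalysis(n_components=10).fit(X)   # shared latent factors absorb
Z = fa.transform(X)                           # correlated trial-to-trial variability
clf = LinearDiscriminantAnalysis().fit(Z, y)  # classify in the denoised space
print(clf.score(Z, y))                        # ~chance on this synthetic data
```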
McMahon, Ryan; Berbeco, Ross; Nishioka, Seiko; Ishikawa, Masayori; Papiez, Lech
2008-09-01
An MLC control algorithm for delivering intensity modulated radiation therapy (IMRT) to targets that are undergoing two-dimensional (2D) rigid motion in the beam's eye view (BEV) is presented. The goal of this method is to deliver 3D-derived fluence maps over a moving patient anatomy. Target motion measured prior to delivery is first used to design a set of planned dynamic-MLC (DMLC) sliding-window leaf trajectories. During actual delivery, the algorithm relies on real-time feedback to compensate for target motion that does not agree with the motion measured during planning. The methodology is based on an existing one-dimensional (1D) algorithm that uses on-the-fly intensity calculations to appropriately adjust the DMLC leaf trajectories in real-time during exposure delivery [McMahon et al., Med. Phys. 34, 3211-3223 (2007)]. To extend the 1D algorithm's application to 2D target motion, a real-time leaf-pair shifting mechanism has been developed. Target motion that is orthogonal to leaf travel is tracked by appropriately shifting the positions of all MLC leaves. The performance of the tracking algorithm was tested for a single beam of a fractionated IMRT treatment, using a clinically derived intensity profile and a 2D target trajectory based on measured patient data. Comparisons were made between 2D tracking, 1D tracking, and no tracking. The impact of the tracking lag time and the frequency of real-time imaging were investigated. A study of the dependence of the algorithm's performance on the level of agreement between the motion measured during planning and delivery was also included. Results demonstrated that tracking both components of the 2D motion (i.e., parallel and orthogonal to leaf travel) results in delivered fluence profiles that are superior to those that track only the component of motion that is parallel to leaf travel. Tracking lag time effects may lead to relatively large intensity delivery errors compared to the other sources of error investigated. However, the algorithm presented is robust in the sense that it does not rely on a high level of agreement between the target motion measured during treatment planning and delivery.
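The orthogonal-tracking idea can be sketched as a whole-bank shift by an integer number of leaf pairs; the edge handling below is an illustrative assumption, and the real algorithm additionally recomputes the 1D trajectories on the fly.

```python
def shift_leaf_bank(leaf_positions, dy_mm, leaf_width_mm=5.0):
    """Shift all leaf pairs to follow motion orthogonal to leaf travel.

    leaf_positions: list of (left_tip, right_tip) per leaf pair;
    dy_mm: target displacement orthogonal to leaf travel.  Quantized
    to whole leaf pairs; edge pairs are replicated (assumption).
    """
    n = round(dy_mm / leaf_width_mm)
    if n == 0:
        return list(leaf_positions)
    if n > 0:
        return [leaf_positions[0]] * n + list(leaf_positions)[:-n]
    return list(leaf_positions)[-n:] + [leaf_positions[-1]] * (-n)
```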
NASA Astrophysics Data System (ADS)
van Haver, Sven; Janssen, Olaf T. A.; Braat, Joseph J. M.; Janssen, Augustus J. E. M.; Urbach, H. Paul; Pereira, Silvania F.
2008-03-01
In this paper we introduce a new mask imaging algorithm that is based on the source point integration method (or Abbe method). The method presented here distinguishes itself from existing methods by exploiting the through-focus imaging feature of the Extended Nijboer-Zernike (ENZ) theory of diffraction. An introduction to ENZ theory and its application in general imaging is provided, after which we describe the mask imaging scheme that can be derived from it. The remainder of the paper is devoted to illustrating the advantages of the new method over existing (Hopkins-based) methods. To this end, several simulation results are included that illustrate the advantages arising from the accurate incorporation of isolated structures, the rigorous treatment of the object (mask topography), and the fully vectorial through-focus image formation of the ENZ-based algorithm.
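For orientation, the central ENZ quantities can be written in their commonly cited scalar form (normalization conventions vary across the literature, so treat this as a sketch): the pupil is expanded in Zernike terms with coefficients β, and the through-focus field follows from the basic functions V.

```latex
% Zernike expansion of the (complex) pupil function
P(\rho,\theta) = \sum_{n,m} \beta_n^m \, R_n^{|m|}(\rho)\, e^{i m \theta}
% Through-focus field amplitude via the ENZ basic functions
U(r,\varphi,f) = 2 \sum_{n,m} i^{|m|} \, \beta_n^m \, V_n^m(r,f)\, e^{i m \varphi},
\qquad
V_n^m(r,f) = \int_0^1 \rho \, e^{i f \rho^2} R_n^{|m|}(\rho)\, J_{|m|}(2\pi r \rho)\, d\rho
```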
Cost-Effective Fuel Treatment Planning
NASA Astrophysics Data System (ADS)
Kreitler, J.; Thompson, M.; Vaillant, N.
2014-12-01
The cost of fighting large wildland fires in the western United States has grown dramatically over the past decade. This trend will likely continue with growth of the WUI into fire-prone ecosystems, dangerous fuel conditions from decades of fire suppression, and a potentially increasing effect from prolonged drought and climate change. Fuel treatments are often considered the primary pre-fire mechanism to reduce the exposure of values at risk to wildland fire, and a growing suite of fire models and tools is employed to prioritize where treatments could mitigate wildland fire damages. Assessments using the likelihood and consequence of fire are critical because funds are insufficient to reduce risk on all lands needing treatment; prioritization is therefore required to maximize the effectiveness of fuel treatment budgets. Cost-effectiveness, doing the most good per dollar, would seem to be an important fuel treatment metric, yet studies or plans that prioritize fuel treatments using costs or cost-effectiveness measures are absent from the literature. Therefore, to explore the effect of using costs in fuel treatment planning, we test four prioritization algorithms designed to reduce risk in a case study examining fuel treatments on the Sisters Ranger District of central Oregon. For benefits we model sediment retention and standing biomass, and we measure the effectiveness of each algorithm by comparing the differences between treatment and no-treatment alternative scenarios. Our objective is to maximize the averted loss of net benefits subject to a representative fuel treatment budget. We model costs across the study landscape using the My Fuel Treatment Planner software, tree list data, local mill prices, and GIS-measured site characteristics. We use fire simulations to generate burn probabilities, and we estimate fire intensity as conditional flame length at each pixel. Two prioritization algorithms target treatments based on cost-effectiveness and show improvements over those that use only benefits. Variations across the heterogeneous surfaces of costs and benefits create opportunities for fuel treatments to maximize the expected averted loss of benefits. By targeting these opportunities we demonstrate how incorporating costs in fuel treatment prioritization can improve the outcome of fuel treatment planning.
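The budget-constrained selection can be sketched as a greedy ranking by expected averted loss per dollar; the field names are illustrative assumptions, and the study's four algorithms differ precisely in what they rank by.

```python
# Hedged sketch of cost-effectiveness prioritization under a budget.
def prioritize(stands, budget):
    ranked = sorted(stands,
                    key=lambda s: s["averted_loss"] / s["cost"],
                    reverse=True)           # most benefit per dollar first
    plan, spent = [], 0.0
    for stand in ranked:
        if spent + stand["cost"] <= budget:
            plan.append(stand)
            spent += stand["cost"]
    return plan
```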
Kathirvel, M; Subramanian, V Sai; Arun, G; Thirumalaiswamy, S; Ramalingam, K; Kumar, S Ashok; Jagadeesh, K
2012-06-01
To dosimetrically validate the AcurosXB algorithm for Volumetric Modulated Arc Therapy (VMAT) in comparison with the standard clinical Anisotropic Analytic Algorithm (AAA) and Collapsed Cone Convolution (CCC) dose calculation algorithms. The AcurosXB dose calculation algorithm is available with the Varian Eclipse treatment planning system (V10). It uses a grid-based Boltzmann equation solver to predict dose precisely in less time. This study was designed to assess the algorithm's ability to predict dose as accurately as it is delivered, for which five clinical cases each of brain, head & neck, thoracic, pelvic and SBRT treatments were taken. Verification plans were created on a multicube phantom with an iMatrixx-2D detector array, dose prediction was performed with the AcurosXB, AAA and CCC (COMPASS system) algorithms, and the plans were delivered on a CLINAC-iX treatment machine. Delivered dose was captured in the iMatrixx plane for all 25 plans. Measured dose was taken as the reference to quantify the agreement of the AcurosXB calculation algorithm against the previously validated AAA and CCC algorithms. Gamma evaluation was performed in omnipro-I'MRT software with clinical criteria of 3 and 2 mm distance-to-agreement and 3% and 2% dose difference. Plans were evaluated in terms of correlation coefficient, quantitative area gamma and average gamma. The study shows good agreement, with mean correlations of 0.9979±0.0012, 0.9984±0.0009 and 0.9979±0.0011 for AAA, CCC and Acuros, respectively. Mean area gamma for the 3 mm/3% criterion was 98.80±1.04, 98.14±2.31 and 98.08±2.01, and for 2 mm/2% was 93.94±3.83, 87.17±10.54 and 92.36±5.46, for AAA, CCC and Acuros, respectively. Mean average gamma for 3 mm/3% was 0.26±0.07, 0.42±0.08 and 0.28±0.09, and for 2 mm/2% was 0.39±0.10, 0.64±0.11 and 0.42±0.13, for AAA, CCC and Acuros, respectively. This study demonstrated that the AcurosXB algorithm had good agreement with AAA and CCC in terms of dose prediction. In conclusion, the AcurosXB algorithm provides a valid, accurate and speedy alternative to the AAA and CCC algorithms in a busy clinical environment. © 2012 American Association of Physicists in Medicine.
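The evaluation metric itself is easy to state in code. Below is a brute-force 1D global gamma sketch; clinical tools such as the one used here work in 2D/3D, but the criterion is identical.

```python
import numpy as np

def gamma_index(dose_eval, dose_ref, spacing, dta=3.0, dd=0.03):
    """Brute-force 1D global gamma evaluation (sketch).

    Profiles share one grid; `spacing` and `dta` in mm, `dd` as a
    fraction of the maximum reference dose.
    """
    x = np.arange(len(dose_ref)) * spacing
    dmax = dose_ref.max()
    gamma = np.empty(len(dose_ref))
    for i, (xi, di) in enumerate(zip(x, dose_eval)):
        # Generalized distance from this point to every reference point.
        g = np.sqrt(((x - xi) / dta) ** 2
                    + ((dose_ref - di) / (dd * dmax)) ** 2)
        gamma[i] = g.min()
    return gamma   # pass rate: np.mean(gamma <= 1)
```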
Acne Scarring—Pathogenesis, Evaluation, and Treatment Options
Connolly, Deirdre; Vu, Ha Linh; Mariwalla, Kavita
2017-01-01
Acne vulgaris is a ubiquitous problem affecting 80 percent of people ages 11 to 30 years, with many patients experiencing some degree of scarring. This review focuses on atrophic scars, the most common type of acne scar. We briefly address the cellular sequelae that lead to scar formation and the initial evaluation of patients with acne scars. We then discuss an algorithmic approach to the treatment of acne scarring based on the classification of scars into erythematous and atrophic types. Lastly, we discuss the future treatment of acne scars and ongoing clinical trials. PMID:29344322
'Treat to Target' - Lessons Learnt.
Kurti, Zsuzsanna; Vegh, Zsuzsanna; Golovics, Petra Anna; Lakatos, Peter Laszlo
2016-01-01
Therapeutic management in inflammatory bowel diseases (IBD) has significantly changed in the last decades with the advent of biological therapy, resulting in new treatment targets other than clinical symptoms. Patient stratification in the early stage of the disease is an important step to identify patients with a poor prognosis, who might benefit from early aggressive treatment to avoid complications in the later disease course. Recent randomized and hypothesis-driven clinical trials (e.g., Randomized Evaluation of an Algorithm for Crohn's Treatment, Post-Operative Crohn's Endoscopic Recurrence) conducted in the biological era underscore the need for objective disease monitoring, including assessment of biomarkers (e.g., C-reactive protein and calprotectin), mucosal healing and, for biologically treated patients, therapeutic drug monitoring, beside clinical symptom assessment in both Crohn's disease and ulcerative colitis. Assessing treatment efficacy objectively has become an important element of patient monitoring besides clinical symptom assessment. Further clinical studies are needed to assess whether implementation of new therapeutic algorithms based on these targets and tight monitoring in clinical practice has the potential to further improve long-term disease outcomes in IBD. © 2016 S. Karger AG, Basel.
A Decision Fusion Framework for Treatment Recommendation Systems.
Mei, Jing; Liu, Haifeng; Li, Xiang; Xie, Guotong; Yu, Yiqin
2015-01-01
Treatment recommendation is a nontrivial task: it requires not only domain knowledge from evidence-based medicine, but also data insights from descriptive, predictive and prescriptive analysis. A single treatment recommendation system is usually trained or modeled with a source of limited size or quality. This paper proposes a decision fusion framework combining both knowledge-driven and data-driven decision engines for treatment recommendation. End users (e.g. using the clinician workstation or mobile apps) can have a comprehensive view of the various engines' opinions, as well as the final decision after fusion. For implementation, we leverage several well-known fusion algorithms, such as decision templates and meta-classifiers (logistic regression, SVM, etc.). Using an outcome-driven evaluation metric, we compare the fusion engine with the base engines, and our experimental results show that decision fusion is a promising way towards more valuable treatment recommendation.
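With scikit-learn, the meta-classifier flavor of fusion can be sketched in a few lines; the two base estimators below are generic stand-ins for the knowledge-driven and data-driven engines, not the paper's actual engines.

```python
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Two base "engines" feed a logistic meta-classifier that produces the
# fused treatment decision (a stacking-style meta classifier).
fusion = StackingClassifier(
    estimators=[("knowledge", DecisionTreeClassifier(max_depth=3)),
                ("data", SVC(probability=True))],
    final_estimator=LogisticRegression())
# fusion.fit(X_train, y_train); fusion.predict(X_new)  # given labeled outcomes
```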
Validation of two algorithms for managing children with a non-blanching rash.
Riordan, F Andrew I; Jones, Laura; Clark, Julia
2016-08-01
Paediatricians are concerned that children who present with a non-blanching rash (NBR) may have meningococcal disease (MCD). Two algorithms have been devised to help identify which children with an NBR have MCD. To evaluate the NBR algorithms' ability to identify children with MCD. The Newcastle-Birmingham-Liverpool (NBL) algorithm was applied retrospectively to three cohorts of children who had presented with NBRs. This algorithm was also piloted in four hospitals, and then used prospectively for 12 months in one hospital. The National Institute for Health and Care Excellence (NICE) algorithm was validated retrospectively using data from all cohorts. The cohorts included 625 children, 145 (23%) of whom had confirmed or probable MCD. Paediatricians empirically treated 324 (52%) children with antibiotics. The NBL algorithm identified all children with MCD and suggested treatment for a further 86 children (sensitivity 100%, specificity 82%). One child with MCD did not receive immediate antibiotic treatment, despite this being suggested by the algorithm. The NICE algorithm suggested 382 children (61%) who should be treated with antibiotics. This included 141 of the 145 children with MCD (sensitivity 97%, specificity 50%). These algorithms may help paediatricians identify children with MCD who present with NBRs. The NBL algorithm may be more specific than the NICE algorithm as it includes fewer features suggesting MCD. The only significant delay in treatment of MCD occurred when the algorithms were not followed. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
A new algorithm for attitude-independent magnetometer calibration
NASA Technical Reports Server (NTRS)
Alonso, Roberto; Shuster, Malcolm D.
1994-01-01
A new algorithm is developed for inflight magnetometer bias determination without knowledge of the attitude. This algorithm combines the fast convergence of a heuristic algorithm currently in use with the correct treatment of the statistics and without discarding data. The algorithm performance is examined using simulated data and compared with previous algorithms.
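The underlying observable can be sketched as a linear least-squares problem: since |B_meas − b|² must equal the known model field magnitude squared, the bias enters linearly once |b|² is carried as a fourth unknown. The sketch below is the simple batch version only; the paper's contribution includes the correct statistical treatment, which is not shown.

```python
import numpy as np

def magnetometer_bias(B_meas, B_model_mag):
    """Attitude-independent bias estimate (batch least-squares sketch).

    From |B_meas - b|^2 = |B_model|^2 it follows that
        2 B_meas . b - |b|^2 = |B_meas|^2 - |B_model|^2,
    which is linear in the 4-vector [b, |b|^2].
    """
    A = np.hstack([2.0 * B_meas, -np.ones((len(B_meas), 1))])
    rhs = (B_meas ** 2).sum(axis=1) - B_model_mag ** 2
    x, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return x[:3]   # bias vector; x[3] approximates |b|^2
```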
Deleu, Dirk; Mesraoua, Boulenouar; El Khider, Hisham; Canibano, Beatriz; Melikyan, Gayane; Al Hail, Hassan; Mhjob, Noha; Bhagat, Anjushri; Ibrahim, Faiza; Hanssens, Yolande
2017-03-01
The introduction of disease-modifying therapies (DMTs) - with varying degrees of efficacy for reducing annual relapse rate and disability progression - has considerably transformed the therapeutic landscape of relapsing-remitting multiple sclerosis (RRMS). We aim to develop rational evidence-based treatment recommendations and algorithms for the management of clinically isolated syndrome (CIS) and RRMS that conform to the healthcare system in a fast-developing economic country such as Qatar. We conducted a systematic review using a comprehensive search of MEDLINE, PubMed, and Cochrane Database of Systematic Reviews (1 January 1990 through 30 September 2016). Additional searches of the American Academy of Neurology and European Committee for Treatment and Research in Multiple Sclerosis abstracts from 2012 through 2016 were performed, in addition to searches of the Food and Drug Administration and European Medicines Agency websites to obtain relevant safety information on these DMTs. For each of the DMTs, the mode of action, efficacy, safety and tolerability are briefly discussed. To facilitate the interpretation, the efficacy data of the pivotal phase III trials are expressed by their most clinically useful measure of therapeutic efficacy, the number needed to treat (NNT). In addition, an overview of head-to-head trials in RRMS is provided as well as a summary of the several different RRMS management strategies (lateral switching, escalation, induction, maintenance and combination therapy) and the potential role of each DMT. Finally, algorithms were developed for CIS, active and highly active or rapidly evolving RRMS and subsequent breakthrough disease or suboptimal treatment response while on DMTs. The benefit-to-risk profiles of the DMTs, taking into account patient preference, allowed the provision of rational and safe patient-tailored treatment algorithms. Recommendations and algorithms for the management of CIS and RRMS have been developed relevant to the healthcare system of this fast-developing economic country.
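Since the abstract expresses efficacy as the number needed to treat, the arithmetic is worth stating: NNT is the reciprocal of the absolute risk reduction over the trial horizon. The rates below are made-up placeholders, not figures from any cited trial.

```python
def nnt(event_rate_control, event_rate_treated):
    # Number needed to treat = 1 / absolute risk reduction.
    return 1.0 / (event_rate_control - event_rate_treated)

print(round(nnt(0.46, 0.32)))  # hypothetical relapse rates -> NNT of 7
```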
Cunha, Laura Pires da; Juncal, Verena; Carvalhaes, Cecília Godoy; Leão, Sylvia Cardoso; Chimara, Erica; Freitas, Denise
2018-06-01
To report a case of nocardial scleritis and to propose a logical treatment algorithm based on a literature review. It is important to suspect a nocardial infection when evaluating anterior unilateral scleritis accompanied by multiple purulent or necrotic abscesses, especially in male patients with a history of chronic ocular pain and redness, trauma inflicted by organic materials, or recent ophthalmic surgery. A microbiological investigation is essential. In positive cases, a direct smear reveals weakly acid-fast organisms or Gram-positive, thin, beading and branching filaments. Also, the organism (usually) grows on blood agar and Lowenstein-Jensen plates. An infection can generally be fully resolved by debridement of necrotic areas and application of topical amikacin drops accompanied by systemic sulfamethoxazole-trimethoprim. Together with the case report described, we review data on a total of 43 eyes with nocardial scleritis. Our proposed algorithm may afford a useful understanding of this sight-threatening disease, facilitating easier and faster diagnosis and management.
Sequencing batch-reactor control using Gaussian-process models.
Kocijan, Juš; Hvala, Nadja
2013-06-01
This paper presents a Gaussian-process (GP) model for the design of sequencing batch-reactor (SBR) control for wastewater treatment. The GP model is a probabilistic, nonparametric model with uncertainty predictions. In the case of SBR control, it is used for the on-line optimisation of the duration of the batch phases. The control algorithm follows the course of the indirect process variables (pH, redox potential and dissolved-oxygen concentration) and recognises the characteristic patterns in their time profiles. The control algorithm uses GP-based regression to smooth the signals and GP-based classification for the pattern recognition. When tested on the signals from an SBR laboratory pilot plant, the control algorithm provided a satisfactory agreement between the proposed completion times and the actual termination times of the biodegradation processes. In a set of tested batches the final ammonia and nitrate concentrations were below 1 and 0.5 mg L(-1), respectively, while the aeration time was shortened considerably. Copyright © 2013 Elsevier Ltd. All rights reserved.
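The smoothing stage maps directly onto standard GP regression. The sketch below smooths a synthetic pH profile with scikit-learn; the kernel choice and noise level are illustrative assumptions rather than the paper's tuned model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic noisy pH profile from one batch (redox or dissolved oxygen
# would be handled the same way).
t = np.linspace(0, 6, 120)[:, None]                 # hours into the batch
ph = 7.2 - 0.4 * np.tanh(t.ravel() - 3) + np.random.normal(0, 0.05, 120)

gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.01),
                              normalize_y=True).fit(t, ph)
mean, std = gp.predict(t, return_std=True)  # smoothed signal + uncertainty,
                                            # fed to the pattern classifier
```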
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alpuche Aviles, Jorge E.; VanBeek, Timothy
Purpose: This work presents an algorithm used to quantify intra-fraction motion for patients treated using deep inspiration breath hold (DIBH). The algorithm quantifies the position of the chest wall in breast tangent fields using electronic portal images. Methods: The algorithm assumes that image profiles, taken along a direction perpendicular to the medial border of the field, follow a monotonically and smoothly decreasing function. This assumption is invalid in the presence of lung and can be used to calculate chest wall position. The algorithm was validated by determining the position of the chest wall for varying field edge positions in portal images of a thoracic phantom. The algorithm was used to quantify intra-fraction motion in cine images for 7 patients treated with DIBH. Results: Phantom results show that changes in the distance between chest wall and field edge were accurate within 0.1 mm on average. For a fixed field edge, the algorithm calculates the position of the chest wall with a 0.2 mm standard deviation. Intra-fraction motion for DIBH patients was within 1 mm 91.4% of the time and within 1.5 mm 97.9% of the time. The maximum intra-fraction motion was 3.0 mm. Conclusions: A physics-based algorithm was developed and can be used to quantify the position of the chest wall irradiated in tangent portal images with an accuracy of 0.1 mm and a precision of 0.6 mm. Intra-fraction motion for patients treated with DIBH at our clinic is less than 3 mm.
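The monotonicity idea can be sketched simply: scan the profile from the medial field border and flag the first intensity rise above a small threshold, which marks lung and hence the chest-wall position. The threshold below is an illustrative assumption, not the authors' tuned value.

```python
import numpy as np

def chest_wall_position(profile, spacing_mm):
    """Locate the chest wall as the first violation of monotonic decrease.

    profile: image profile running from the medial field border into
    the patient; returns the distance (mm) to the first rise, or None
    if the profile is consistent with no lung in the field.
    """
    diffs = np.diff(profile.astype(float))
    rising = np.flatnonzero(diffs > 0.02 * profile.max())  # assumed threshold
    if rising.size == 0:
        return None
    return rising[0] * spacing_mm
```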
Diagnosis and Management of Functional Heartburn.
Hachem, Christine; Shaheen, Nicholas J
2016-01-01
Heartburn is among the most common gastrointestinal symptoms presenting to both generalist physicians and gastroenterologists. Heartburn that does not respond to traditional acid suppression is a diagnostic and therapeutic dilemma. In the era of high utilization of proton pump inhibitors, a substantial proportion of patients presenting to the gastroenterologist with chronic symptoms of heartburn do not have a reflux-mediated disease. Subjects without objective evidence of reflux as a cause of their symptoms have "functional heartburn". The diagnostic role of endoscopy, reflux and motility testing in functional heartburn (FH) patients is discussed. Lifestyle modifications, pharmacological interventions, and alternative therapies for FH are also presented. Recognition of patients with FH allows earlier assignment of these patients to different treatment algorithms, which may allow greater likelihood of success of treatment, diminished resource utilization and improved quality of life. Further data on this large and understudied group of patients is necessary to allow improvement in treatment algorithms and a more evidence-based approach to care of these patients.
Alvarez, Matheus; de Pina, Diana Rodrigues; Romeiro, Fernando Gomes; Duarte, Sérgio Barbosa; Miranda, José Ricardo de Arruda
2014-07-26
Hepatocellular carcinoma is a primary tumor of the liver and involves different treatment modalities according to the tumor stage. After local therapies, tumor evaluation is based on the mRECIST criteria, which involve the measurement of the maximum diameter of the viable lesion. This paper describes a computational methodology to measure the maximum diameter of the tumor through the contrast-enhanced area of the lesions. 63 computed tomography (CT) slices from 23 patients were assessed. Non-contrasted liver and typical HCC nodules were evaluated, and a virtual phantom was developed for this purpose. Optimization of the algorithm's detection and quantification was performed using the virtual phantom. After that, we compared the algorithm's findings for the maximum diameter of the target lesions against radiologist measures. Computed results for the maximum diameter are in good agreement with the results obtained by radiologist evaluation, indicating that the algorithm was able to properly detect the tumor limits. A comparison of the maximum diameter estimated by the radiologist versus the algorithm revealed differences on the order of 0.25 cm for large-sized tumors (diameter > 5 cm), whereas differences of less than 1.0 cm were found for small-sized tumors. Differences between algorithm and radiologist measures were accurate for small-sized tumors, with a trend to a small decrease for tumors greater than 5 cm. Therefore, traditional methods for measuring lesion diameter should be complemented by non-subjective measurement methods, which would allow a more correct evaluation of the contrast-enhanced areas of HCC according to the mRECIST criteria.
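Once the enhancing region is segmented, the measurement reduces to a simple geometric computation: the largest pairwise distance between pixels of the segmented region. A minimal sketch, assuming a boolean 2D mask of the viable (contrast-enhanced) tumor in one slice:

```python
import numpy as np
from scipy.spatial.distance import pdist

def max_diameter_cm(mask, pixel_cm):
    """mRECIST-style maximum diameter of a segmented lesion (sketch)."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([ys, xs]) * pixel_cm
    # Largest distance between any two pixels of the region.
    return pdist(pts).max() if len(pts) > 1 else 0.0
```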
Carver, Robert L; Sprunger, Conrad P; Hogstrom, Kenneth R; Popple, Richard A; Antolak, John A
2016-05-08
The purpose of this study was to evaluate the accuracy and calculation speed of electron dose distributions calculated by the Eclipse electron Monte Carlo (eMC) algorithm for use with bolus electron conformal therapy (ECT). The recent commercial availability of bolus ECT technology requires further validation of the eMC dose calculation algorithm. eMC-calculated electron dose distributions for bolus ECT have been compared to previously measured TLD-dose points throughout patient-based cylindrical phantoms (retromolar trigone and nose), whose axial cross sections were based on the mid-PTV (planning treatment volume) CT anatomy. The phantoms consisted of SR4 muscle substitute, SR4 bone substitute, and air. The treatment plans were imported into the Eclipse treatment planning system, and electron dose distributions calculated using 1% and < 0.2% statistical uncertainties. The accuracy of the dose calculations using moderate smoothing and no smoothing was evaluated. Dose differences (eMC-calculated less measured dose) were evaluated in terms of absolute dose difference, where 100% equals the given dose, as well as distance to agreement (DTA). Dose calculations were also evaluated for calculation speed. Results from the eMC for the retromolar trigone phantom using 1% statistical uncertainty without smoothing showed calculated dose at 89% (41/46) of the measured TLD-dose points was within 3% dose difference or 3 mm DTA of the measured value. The average dose difference was -0.21%, and the net standard deviation was 2.32%. Differences as large as 3.7% occurred immediately distal to the mandible bone. Results for the nose phantom, using 1% statistical uncertainty without smoothing, showed calculated dose at 93% (53/57) of the measured TLD-dose points within 3% dose difference or 3 mm DTA. The average dose difference was 1.08%, and the net standard deviation was 3.17%. Differences as large as 10% occurred lateral to the nasal air cavities. Including smoothing had insignificant effects on the accuracy of the retromolar trigone phantom calculations, but reduced the accuracy of the nose phantom calculations in the high-gradient dose areas. Dose calculation times with 1% statistical uncertainty for the retromolar trigone and nose treatment plans were 30 s and 24 s, respectively, using 16 processors (Intel Xeon E5-2690, 2.9 GHz) on a framework agent server (FAS). In comparison, the eMC was significantly more accurate than the pencil beam algorithm (PBA). The eMC has comparable accuracy to the pencil beam redefinition algorithm (PBRA) used for bolus ECT planning and has acceptably low dose calculation times. The eMC accuracy decreased when smoothing was used in high-gradient dose regions. The eMC accuracy was consistent with that previously reported for the eMC electron dose algorithm and shows that the algorithm is suitable for clinical implementation of bolus ECT.
Curtis, Jeffrey R; Schabert, Vernon F; Harrison, David J; Yeaw, Jason; Korn, Jonathan R; Quach, Caroleen; Yun, Huifeng; Joseph, George J; Collier, David H
2014-07-01
The aim of this analysis was to implement a claims-based algorithm to estimate biologic cost per effectively treated patient for biologics approved for moderate to severe rheumatoid arthritis (RA). This retrospective analysis included commercially insured adults (aged 18-63 years) with RA in a commercial database, who initiated biologic treatment with abatacept, adalimumab, etanercept, golimumab, or infliximab between 2007 and 2010. The algorithm defined effectiveness as having all of the following: high adherence, no biologic dose increase, no biologic switching, no new nonbiologic disease-modifying antirheumatic drug, no increased or new oral glucocorticoid use, and no more than 1 glucocorticoid injection. For each biologic, cost per effectively treated patient was defined as total drug and administration costs (from allowed amounts on claims), divided by the number of patients categorized as effectively treated. Of 15,351 patients, 12,018 (78.3%) were women, and the mean (SD) age was 49.7 (9.6) years. The algorithm categorized treatment as effective in the first year for 30% (1899/6374) of etanercept, 30% (1396/4661) of adalimumab, 20% (560/2765) of infliximab, 27% (361/1338) of abatacept, and 29% (62/213) of golimumab treated patients. The 1-year biologic cost per effectively treated patient, as defined by the algorithm, was nominally lower for subcutaneously injected biologics than for infused biologics. The 1-year biologic cost per effectively treated patient, as defined by the algorithm, was lowest for etanercept ($49,952), followed by golimumab ($50,189), adalimumab ($52,858), abatacept ($71,866), and infliximab ($104,333). Algorithm-defined effectiveness was similar for biologics other than infliximab. The 1-year biologic cost per effectively treated patient, as defined by the algorithm, was nominally lower for subcutaneously injected biologics than for infused biologics. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
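The algorithm's effectiveness rule and headline metric can be sketched directly; the field names and the adherence cut-off are illustrative assumptions (the paper defines high adherence on claims data).

```python
# Hedged sketch of the claims-based effectiveness rule and cost metric.
def effectively_treated(p):
    return (p["adherence"] >= 0.8                     # assumed cut-off
            and not p["dose_increase"]
            and not p["switched_biologic"]
            and not p["new_nonbiologic_dmard"]
            and not p["increased_or_new_oral_glucocorticoid"]
            and p["glucocorticoid_injections"] <= 1)

def cost_per_effectively_treated(patients):
    cost = sum(p["drug_and_admin_cost"] for p in patients)
    n_eff = sum(effectively_treated(p) for p in patients)
    return cost / n_eff if n_eff else float("inf")
```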
Algorithm based on the short-term Rényi entropy and IF estimation for noisy EEG signals analysis.
Lerga, Jonatan; Saulig, Nicoletta; Mozetič, Vladimir
2017-01-01
Stochastic electroencephalogram (EEG) signals are known to be nonstationary and often multicomponential. Detecting and extracting their components may help clinicians to localize brain neurological dysfunctionalities for patients with motor control disorders due to the fact that movement-related cortical activities are reflected in spectral EEG changes. A new algorithm for EEG signal components detection from its time-frequency distribution (TFD) has been proposed in this paper. The algorithm utilizes the modification of the Rényi entropy-based technique for number of components estimation, called short-term Rényi entropy (STRE), and upgraded by an iterative algorithm which was shown to enhance existing approaches. Combined with instantaneous frequency (IF) estimation, the proposed method was applied to EEG signal analysis both in noise-free and noisy environments for limb movements EEG signals, and was shown to be an efficient technique providing spectral description of brain activities at each electrode location up to moderate additive noise levels. Furthermore, the obtained information concerning the number of EEG signal components and their IFs show potentials to enhance diagnostics and treatment of neurological disorders for patients with motor control illnesses. Copyright © 2016 Elsevier Ltd. All rights reserved.
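The STRE computation is compact enough to sketch: normalize the time-frequency distribution within each sliding window and evaluate the order-α Rényi entropy, whose increase of roughly one bit signals one additional component. The window length and α below are illustrative choices, not the paper's tuned values.

```python
import numpy as np

def short_term_renyi(tfd, window=32, alpha=3):
    """Short-term Rényi entropy of a TFD (frequency x time array), sketch."""
    entropies = []
    for t0 in range(tfd.shape[1] - window + 1):
        block = np.abs(tfd[:, t0:t0 + window])
        block = block / block.sum()                    # unit-energy window
        h = np.log2((block ** alpha).sum()) / (1.0 - alpha)
        entropies.append(h)
    return np.array(entropies)  # ~ +1 bit per additional signal component
```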
Mass-casualty triage: time for an evidence-based approach.
Jenkins, Jennifer Lee; McCarthy, Melissa L; Sauer, Lauren M; Green, Gary B; Stuart, Stephanie; Thomas, Tamara L; Hsu, Edbert B
2008-01-01
Mass-casualty triage has developed from a wartime necessity into a civilian tool to ensure that constrained medical resources are directed at achieving the greatest good for the greatest number of people. Several primary and secondary triage tools have been developed, including Simple Triage and Rapid Treatment (START), JumpSTART, Care Flight Triage, Triage Sieve, Sacco Triage Method, Secondary Assessment of Victim Endpoint (SAVE), and Pediatric Triage Tape. Evidence to support the use of one triage algorithm over another is limited, and the development of effective triage protocols is an important research priority. The most widely recognized mass-casualty triage algorithms in use today are not evidence-based, and no studies directly address these issues in the mass-casualty setting. Furthermore, no studies have evaluated existing mass-casualty triage algorithms regarding ease of use, reliability, and validity when biological, chemical, or radiological agents are introduced. Currently, the lack of a standardized mass-casualty triage system that is well validated, reliable, and uniformly accepted remains an important gap. Future research directed at triage is recognized as a necessity, and the development of a practical, universal triage algorithm that incorporates requirements for decontamination or special precautions for infectious agents would facilitate a more organized mass-casualty medical response.
Towards frameless maskless SRS through real-time 6DoF robotic motion compensation.
Belcher, Andrew H; Liu, Xinmin; Chmura, Steven; Yenice, Kamil; Wiersma, Rodney D
2017-11-13
Stereotactic radiosurgery (SRS) uses precise dose placement to treat conditions of the CNS. Frame-based SRS uses a metal head ring fixed to the patient's skull to provide high treatment accuracy, but patient comfort and clinical workflow may suffer. Frameless SRS, while potentially more convenient, may increase uncertainty of treatment accuracy and be physiologically confining to some patients. By incorporating highly precise robotics and advanced software algorithms into frameless treatments, we present a novel frameless and maskless SRS system where a robot provides real-time 6DoF head motion stabilization allowing positional accuracies to match or exceed those of traditional frame-based SRS. A 6DoF parallel kinematics robot was developed and integrated with a real-time infrared camera in a closed loop configuration. A novel compensation algorithm was developed based on an iterative closest-path correction approach. The robotic SRS system was tested on six volunteers, whose motion was monitored and compensated for in real-time over 15 min simulated treatments. The system's effectiveness in maintaining the target's 6DoF position within preset thresholds was determined by comparing volunteer head motion with and without compensation. Comparing corrected and uncorrected motion, the 6DoF robotic system showed an overall improvement factor of 21 in terms of maintaining target position within 0.5 mm and 0.5 degree thresholds. Although the system's effectiveness varied among the volunteers examined, for all volunteers tested the target position remained within the preset tolerances 99.0% of the time when robotic stabilization was used, compared to 4.7% without robotic stabilization. The pre-clinical robotic SRS compensation system was found to be effective at responding to sub-millimeter and sub-degree cranial motions for all volunteers examined. The system's success with volunteers has demonstrated its capability for implementation with frameless and maskless SRS treatments, potentially able to achieve the same or better treatment accuracies compared to traditional frame-based approaches.
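The geometric core of such a correction loop can be sketched with the standard Kabsch/SVD rigid fit from tracked markers; converting (R, t) into robot axis commands and the iterative closest-path scheduling are the parts specific to the paper and are not shown.

```python
import numpy as np

def rigid_correction(markers_ref, markers_cur):
    """6DoF correction from tracked head markers (Kabsch sketch).

    Solves the rigid transform mapping current marker positions back to
    their reference positions: markers_ref ~ (R @ markers_cur.T).T + t.
    """
    c_ref = markers_ref.mean(axis=0)
    c_cur = markers_cur.mean(axis=0)
    H = (markers_cur - c_cur).T @ (markers_ref - c_ref)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_ref - R @ c_cur
    return R, t
```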
Sub-second pencil beam dose calculation on GPU for adaptive proton therapy
NASA Astrophysics Data System (ADS)
da Silva, Joakim; Ansorge, Richard; Jena, Rajesh
2015-06-01
Although proton therapy delivered using scanned pencil beams has the potential to produce better dose conformity than conventional radiotherapy, the created dose distributions are more sensitive to anatomical changes and patient motion. Therefore, the introduction of adaptive treatment techniques where the dose can be monitored as it is being delivered is highly desirable. We present a GPU-based dose calculation engine relying on the widely used pencil beam algorithm, developed for on-line dose calculation. The calculation engine was implemented from scratch, with each step of the algorithm parallelized and adapted to run efficiently on the GPU architecture. To ensure fast calculation, it employs several application-specific modifications and simplifications, and a fast scatter-based implementation of the computationally expensive kernel superposition step. The calculation time for a skull base treatment plan using two beam directions was 0.22 s on an Nvidia Tesla K40 GPU, whereas a test case of a cubic target in water from the literature took 0.14 s to calculate. The accuracy of the patient dose distributions was assessed by calculating the γ-index with respect to a gold standard Monte Carlo simulation. The passing rates were 99.2% and 96.7%, respectively, for the 3%/3 mm and 2%/2 mm criteria, matching those produced by a clinical treatment planning system.
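The pencil beam superposition underlying such an engine can be illustrated with a toy example: the total dose is the weighted sum, over scanned spots, of a depth-dose curve times a depth-dependent lateral Gaussian. The sketch below is a minimal NumPy version with an invented depth-dose shape and spot list; it is not the authors' GPU code.

```python
import numpy as np

# Minimal pencil-beam superposition sketch (not the authors' GPU engine):
# dose = sum over spots of (depth-dose curve x lateral Gaussian kernel).
depth = np.linspace(0, 150, 151)                   # mm along the beam axis
x = np.linspace(-40, 40, 81)                       # mm, lateral axis
idd = np.exp(-((depth - 120) / 12.0) ** 2) + 0.3   # toy Bragg-peak-like curve
sigma = 4.0 + 0.03 * depth                         # lateral spread grows with depth

spots = [(-10.0, 1.0), (0.0, 1.2), (10.0, 0.9)]    # (lateral position mm, weight)
dose = np.zeros((depth.size, x.size))
for x0, w in spots:
    # scatter each spot's dose onto the grid: depth term times lateral term
    lateral = np.exp(-0.5 * ((x[None, :] - x0) / sigma[:, None]) ** 2)
    dose += w * idd[:, None] * lateral

print("peak dose voxel (depth idx, lateral idx):",
      np.unravel_index(dose.argmax(), dose.shape))
```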
Ziacchi, Matteo; Palmisano, Pietro; Biffi, Mauro; Ricci, Renato P; Landolina, Maurizio; Zoni-Berisso, Massimo; Occhetta, Eraldo; Maglia, Giampiero; Botto, Gianluca; Padeletti, Luigi; Boriani, Giuseppe
2018-04-01
Modern pacemakers have an increasing number of programmable parameters and specific algorithms designed to optimize pacing therapy in relation to the individual characteristics of patients. When choosing the most appropriate pacemaker type and programming, the following variables must be taken into account: the type of bradyarrhythmia at the time of pacemaker implantation; the cardiac chamber requiring pacing, and the percentage of pacing actually needed to correct the rhythm disorder; the possible association of multiple rhythm disturbances and conduction diseases; and the evolution of conduction disorders during follow-up. The goals of device programming are to preserve or restore the heart rate response to metabolic and hemodynamic demands; to maintain physiological conduction; to maximize device longevity; and to detect, prevent, and treat atrial arrhythmia. In patients with sinus node disease, the optimal pacing mode is DDDR. Based on all the available evidence, in this setting, we consider appropriate the activation of the following algorithms: rate-responsive function in patients with chronotropic incompetence; algorithms to maximize intrinsic atrioventricular conduction in the absence of atrioventricular block; mode-switch algorithms; algorithms for autoadaptive management of the atrial pacing output; and algorithms for the prevention and treatment of atrial tachyarrhythmias in the subgroup of patients with atrial tachyarrhythmias/atrial fibrillation. The purpose of this two-part consensus document is to provide specific suggestions (based on an extensive literature review) on appropriate pacemaker settings in relation to patients' clinical features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao Daliang; Earl, Matthew A.; Luan, Shuang
2006-04-15
A new leaf-sequencing approach has been developed that is designed to reduce the number of required beam segments for step-and-shoot intensity modulated radiation therapy (IMRT). This approach to leaf sequencing is called continuous-intensity-map-optimization (CIMO). Using a simulated annealing algorithm, CIMO seeks to minimize differences between the optimized and sequenced intensity maps. Two distinguishing features of the CIMO algorithm are (1) CIMO does not require that each optimized intensity map be clustered into discrete levels and (2) CIMO is not rule-based but rather simultaneously optimizes both the aperture shapes and weights. To test the CIMO algorithm, ten IMRT patient cases were selected (four head-and-neck, two pancreas, two prostate, one brain, and one pelvis). For each case, the optimized intensity maps were extracted from the Pinnacle³ treatment planning system. The CIMO algorithm was applied, and the optimized aperture shapes and weights were loaded back into Pinnacle. A final dose calculation was performed using Pinnacle's convolution/superposition based dose calculation. On average, the CIMO algorithm provided a 54% reduction in the number of beam segments as compared with Pinnacle's leaf sequencer. The plans sequenced using the CIMO algorithm also provided improved target dose uniformity and a reduced discrepancy between the optimized and sequenced intensity maps. For ten clinical intensity maps, comparisons were performed between the CIMO algorithm and the power-of-two reduction algorithm of Xia and Verhey [Med. Phys. 25(8), 1424-1434 (1998)]. When the constraints of a Varian Millennium multileaf collimator were applied, the CIMO algorithm resulted in a 26% reduction in the number of segments. For an Elekta multileaf collimator, the CIMO algorithm resulted in a 67% reduction in the number of segments. An average leaf sequencing time of less than one minute per beam was observed.
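A minimal sketch of the simulated-annealing idea behind CIMO: given fixed candidate aperture shapes, anneal the aperture weights to minimize the squared difference between the target intensity map and the delivered sum. The aperture set, cooling schedule and cost function here are illustrative assumptions (the actual algorithm also optimizes the shapes).

```python
import numpy as np

rng = np.random.default_rng(1)
target = rng.uniform(0, 1, (10, 10))           # optimized intensity map
apertures = rng.integers(0, 2, (8, 10, 10))    # candidate open/closed shapes
weights = np.full(8, 0.1)

def cost(w):
    # squared discrepancy between target map and weighted aperture sum
    return np.sum((target - np.tensordot(w, apertures, axes=1)) ** 2)

temp, current = 1.0, cost(weights)
for _ in range(5000):
    trial = np.clip(weights + rng.normal(0, 0.02, 8), 0, None)  # non-negative
    c = cost(trial)
    # Metropolis rule: always accept improvements, sometimes accept uphill moves
    if c < current or rng.random() < np.exp((current - c) / temp):
        weights, current = trial, c
    temp *= 0.999                               # geometric cooling schedule
print(f"final map discrepancy: {current:.3f}")
```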
Parabolized Navier-Stokes solutions of separation and trailing-edge flows
NASA Technical Reports Server (NTRS)
Brown, J. L.
1983-01-01
A robust, iterative solution procedure is presented for the parabolized Navier-Stokes or higher order boundary layer equations as applied to subsonic viscous-inviscid interaction flows. The robustness of the present procedure is due, in part, to an improved algorithmic formulation. The present formulation is based on a reinterpretation of stability requirements for this class of algorithms and requires only second order accurate backward or central differences for all streamwise derivatives. Upstream influence is provided for through the algorithmic formulation and iterative sweeps in x. The primary contribution to robustness, however, is the boundary condition treatment, which imposes global constraints to control the convergence path. Discussed are successful calculations of subsonic, strong viscous-inviscid interactions, including separation. These results are consistent with Navier-Stokes solutions and triple deck theory.
A new algorithm for epilepsy seizure onset detection and spread estimation from EEG signals
NASA Astrophysics Data System (ADS)
Quintero-Rincón, Antonio; Pereyra, Marcelo; D'Giano, Carlos; Batatia, Hadj; Risk, Marcelo
2016-04-01
Appropriate diagnosis and treatment of epilepsy is a main public health issue. Patients suffering from this disease often exhibit different physical characterizations, which result from the synchronous and excessive discharge of a group of neurons in the cerebral cortex. Extracting this information from EEG signals is an important problem in biomedical signal processing. In this work we propose a new algorithm for seizure onset detection and spread estimation in epilepsy patients. The algorithm is based on a multilevel 1-D wavelet decomposition that captures the physiological brain frequency bands, coupled with a generalized Gaussian model. Preliminary experiments with signals from 30 epileptic seizures and 11 subjects suggest that the proposed methodology is a powerful tool for detecting the onset of epileptic seizures and their spread across the brain.
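The two ingredients of the method, a multilevel 1-D wavelet decomposition and a generalized Gaussian fit per sub-band, can be sketched with PyWavelets and SciPy as below. The synthetic EEG trace and the choice of a db4 wavelet at 5 levels are assumptions for illustration, not the authors' exact configuration.

```python
import numpy as np
import pywt
from scipy.stats import gennorm

fs = 256
t = np.arange(0, 10, 1 / fs)
eeg = np.random.default_rng(2).normal(0, 1, t.size)
eeg[5 * fs:] *= 4.0                          # crude "seizure" amplitude jump

# multilevel 1-D wavelet decomposition: approximation + 5 detail sub-bands
coeffs = pywt.wavedec(eeg, "db4", level=5)
for i, band in enumerate(coeffs[1:], start=1):
    beta, loc, scale = gennorm.fit(band)     # generalized Gaussian parameters
    print(f"detail sub-band {i}: shape={beta:.2f} scale={scale:.2f}")
```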
An algorithm for assessment and treatment of postherniorrhaphy pain.
Voorbrood, C E H; Burgmans, J P J; Van Dalen, T; Breel, J; Clevers, G J; Wille, F; Simmermacher, R K J
2015-08-01
Inguinal pain after groin hernia repair is a challenging issue. About 50% of postherniorrhaphy pain is allegedly neuropathic, treatment of which is cumbersome given the limited efficacy of current therapeutic modalities. A clear protocol for assessing the type of pain and treating it accordingly could possibly improve treatment. A prospective study was done to evaluate an algorithm in patients with chronic postherniorrhaphy groin pain, aiming to select those with neuropathic pain and to treat them appropriately. Treatment consisted of ultrasound-guided nerve blocks as an initial treatment for neuropathic pain. If long-term pain reduction proved inadequate, peripheral nerve stimulation was offered. After our diagnostic workup, consisting of anamnesis, physical examination and imaging, 68 patients out of 105 were diagnosed as having non-neuropathic pain. These patients were referred to the most appropriate consultant and treated accordingly; in some cases the pain proved self-limiting. Thirty-seven (35%) patients were diagnosed as having neuropathic pain with a median NRS of 7 (range 4-9) and were referred for further treatment to our pain clinic. The majority (21 of 28 patients) suffered ilioinguinal nerve involvement. After ultrasound-guided nerve blocks, a permanent reduction in pain was achieved in 18 patients (62%), with a median post-treatment NRS of 1 (range 0-3). In the six patients to whom an additional peripheral nerve stimulator (PNS) was offered, pain reduction to a level of mild complaints, with a median NRS of 2 (range 1-8), was observed. In total, 24 of the 28 patients (83%) diagnosed with neuropathic postherniorrhaphy pain achieved significant pain reduction after algorithm-based treatment. In the present study, we implemented a diagnostic workup for patients with postherniorrhaphy inguinal pain to select those with neuropathic pain. Eighty-three percent of the patients with neuropathic groin pain obtained significant improvement of their pain scores after our protocolled treatment. The effect was achieved by nerve infiltrations and, in some cases, by an implanted PNS when the former was unsuccessful.
Quah, Conal; Porteous, Matthew; Stephen, Arthur
2017-05-01
The management of periprosthetic fractures around total hip replacements is a complex and challenging problem. Getting it right first time is an important factor in reducing the morbidity, mortality and financial burden associated with these injuries. Understanding and applying the basic principles of fracture management helps increase the chance of successful treatment. Based on these principles, we suggest a treatment algorithm for managing periprosthetic fractures around polished tapered femoral stems.
Hippocampus shape analysis for temporal lobe epilepsy detection in magnetic resonance imaging
NASA Astrophysics Data System (ADS)
Kohan, Zohreh; Azmi, Reza
2016-03-01
There is evidence in the literature that Temporal Lobe Epilepsy (TLE) causes lateralized atrophy and deformation of the hippocampus and other substructures of the brain. Magnetic Resonance Imaging (MRI), due to its high-contrast soft tissue imaging, is one of the most popular imaging modalities used in TLE diagnosis and treatment procedures. An algorithm that helps clinicians analyze shape deformations more effectively could improve the diagnosis and treatment of the disease. In this project our purpose was to design, implement and test a classification algorithm for MRIs based on hippocampal asymmetry detection using shape- and size-based features. Our method consisted of two main parts: (1) shape feature extraction, and (2) image classification. We tested 11 different shape and size features and selected four of them that detect hippocampal asymmetry significantly in a randomly selected subset of the dataset. Then, we employed a support vector machine (SVM) classifier to classify the remaining images of the dataset as normal or epileptic using our selected features. The dataset contains 25 patient images, of which 12 cases were used as a training set and the remaining 13 cases for testing the performance of the classifier. We measured an accuracy, specificity and sensitivity of 76%, 100%, and 70%, respectively, for our algorithm. The preliminary results show that using shape and size features for detecting hippocampal asymmetry could be helpful in TLE diagnosis in MRI.
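The classification stage reduces to fitting an SVM on a small feature matrix; the sketch below mirrors the 12-train / 13-test split with four features per subject, but the feature values and labels are synthetic stand-ins.

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative stand-in for the classification stage: four asymmetry
# features per subject feed a support vector machine. Split sizes mirror
# the abstract's 12-train / 13-test design; the data are synthetic.
rng = np.random.default_rng(3)
X_train = rng.normal(0, 1, (12, 4))
y_train = np.array([0, 1] * 6)        # 0 = normal, 1 = epileptic (synthetic)
X_test = rng.normal(0, 1, (13, 4))

clf = SVC(kernel="rbf").fit(X_train, y_train)
print("predicted labels:", clf.predict(X_test))
```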
Casu, Sebastian; Häske, David
2016-06-01
Delayed antibiotic treatment for patients in severe sepsis and septic shock decreases the probability of survival. In this survey, medical directors of different emergency medical services (EMS) in Germany were asked whether they are prepared for pre-hospital sepsis therapy with antibiotics or special algorithms, in order to evaluate the individual preparations of the different rescue areas for the treatment of patients with this infectious disease. The objective of the survey was to obtain a general picture of the current status of the EMS with respect to rapid antibiotic treatment for sepsis. A total of 166 medical directors were invited to complete a short survey on behalf of the different rescue service districts in Germany via an electronic cover letter. Of the rescue districts, 25.6% (n = 20) stated that they keep antibiotics on EMS vehicles. In addition, 2.6% carry blood cultures on the vehicles. The most common antibiotic is ceftriaxone (a third-generation cephalosporin). In total, 8 (10.3%) rescue districts use an algorithm for patients with sepsis, severe sepsis or septic shock. Although the German EMS is an emergency physician-based rescue system, specific provisions in the form of antibiotics on emergency physician vehicles are lacking. At the same time, only 10.3% of the rescue districts use a special algorithm for sepsis therapy. Sepsis, severe sepsis and septic shock do not appear to be prioritized as highly in the pre-hospital setting as these deadly diseases should be.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labine, Alexandre; Carrier, Jean-François; Bedwani, Stéphane
2014-08-15
Purpose: To investigate an automatic bronchial and vessel bifurcation detection algorithm for deformable image registration (DIR) assessment to improve lung cancer radiation treatment. Methods: 4DCT datasets were acquired and exported to the Varian treatment planning system (TPS) Eclipse™ for contouring. The lung TPS contour was used as the prior shape for a segmentation algorithm based on hierarchical surface deformation that identifies the deformed lung volumes of the 10 breathing phases. A Hounsfield unit (HU) threshold filter was applied within the segmented lung volumes to identify blood vessels and airways. Segmented blood vessels and airways were skeletonised using a hierarchical curve-skeleton algorithm based on a generalized potential field approach. A graph representation of the computed skeleton was generated to assign one of three labels to each node: the termination node, the continuation node or the branching node. Results: 320 ± 51 bifurcations were detected in the right lung of a patient for the 10 breathing phases. The bifurcations were visually analyzed. 92 ± 10 bifurcations were found in the upper half of the lung and 228 ± 45 bifurcations were found in the lower half of the lung. Discrepancies between the ten vessel trees were mainly ascribed to large deformation and to regions where the HU varies. Conclusions: We established an automatic method for DIR assessment using the morphological information of the patient anatomy. This approach allows a description of the lung's internal structure movement, which is needed to validate the DIR deformation fields for accurate 4D cancer treatment planning.
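On a graph representation of the skeleton, the three node labels follow directly from node degree; the sketch below shows this on a tiny invented tree using networkx (the graph and labels are illustrative, not the study's data).

```python
import networkx as nx

# Label each skeleton node by its degree: termination (1), continuation (2),
# branching (>= 3). Branching nodes correspond to detected bifurcations.
skeleton = nx.Graph([(0, 1), (1, 2), (2, 3), (2, 4), (4, 5)])  # toy tree

def label(node: int) -> str:
    d = skeleton.degree(node)
    return "termination" if d == 1 else "continuation" if d == 2 else "branching"

for n in skeleton.nodes:
    print(n, label(n))
```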
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurz, Christopher; Bauer, Julia; Conti, Maurizio
Purpose: External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β⁺-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the PET/CT device used and at identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts. Methods: The authors acquired 30 subsequent PET scans of a cylindrical phantom emulating a patientlike activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner were investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated list-mode data were utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and additionally investigating the impact of scatter coincidences. Eventually, the recommended reconstruction scheme was applied to exemplary postirradiation patient data-sets. Results: Among the investigated reconstruction options, the overall best results in terms of image noise, activity quantification, and accurate geometrical recovery were achieved using the ordered subset expectation maximization reconstruction algorithm with time-of-flight (TOF) and point-spread function (PSF) information. For this algorithm, reasonably accurate (better than 5%) and precise (uncertainty of the mean activity below 10%) imaging can be provided down to 80 000 true coincidences at 96% RF. Image noise and geometrical fidelity are generally improved for fewer iterations. The main limitation for PET-based treatment monitoring was identified in the small number of true coincidences, rather than the high intrinsic random background. Application of the optimized reconstruction scheme to patient data-sets results in 25%-50% reduced image noise at comparable activity quantification accuracy and improved geometrical performance with respect to the formerly used reconstruction scheme at HIT, adopted from nuclear medicine applications. Conclusions: Under the poor statistical conditions of PET-based treatment monitoring, improved results can be achieved by considering PSF and TOF information during image reconstruction and by applying fewer iterations than in conventional nuclear medicine imaging. Geometrical fidelity and image noise are mainly limited by the low number of true coincidences, not the high LSO-related random background. The retrieved results might also impact other emerging PET applications at low counting statistics.
Schulz, A; Perbix, W; Shoham, Y; Daali, S; Charalampaki, C; Fuchs, P C; Schiefer, J
2017-03-01
Excisional surgical debridement (SD) is still the gold standard in the treatment of deeply burned hands, though the intricate anatomy is easily damaged. Previous studies demonstrated that enzymatic debridement with the bromelain debriding agent NexoBrid® (EDNX) is more selective and thus can preserve viable tissue with excellent outcome results. So far no methods paper has been published presenting different treatment algorithms in this new field. Therefore our aim was to close this gap by presenting our detailed learning curve in EDNX of deeply burned hands. We conducted a single-center prospective observational clinical trial treating 20 patients with deeply burned hands with EDNX. Different anaesthetic procedures, debridement and wound treatment algorithms were compared and the main pitfalls described. EDNX was efficient in 90% of the treatments, though correct wound bed evaluation was challenging and unfamiliar compared with SD. Surprisingly, after EDNX the burn surface area was found to have been overestimated in the majority of cases (18 wounds). Finally we simplified our process and reduced treatment costs by following a modified treatment algorithm and treating at the bedside under plexus anaesthesia, with a single nurse and one burn surgeon only. Suprathel® was shown to be an appropriate dressing for wound treatment after EDNX. Complete healing (less than 5% residual defect) was achieved at an average of day 28. EDNX in deeply burned hands is promising regarding handling and duration of the treatment, efficiency and selectivity of debridement, healing potential and early rehabilitation. Following our treatment algorithm, EDNX can be performed easily and even without special expertise in burn wound depth evaluation. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
Nesbitt, S D; Shojaee, A; Maa, J-F; Weir, M R
2013-07-01
A prespecified subgroup analysis of an open-label, multicenter, single-arm, dose-titration study is presented. The efficacy and safety of 20-week treatment with an amlodipine (AML)/olmesartan medoxomil (OM)±hydrochlorothiazide (HCTZ) algorithm were assessed in patients with hypertension and type 2 diabetes mellitus (T2DM) who were uncontrolled by antihypertensive monotherapy. Eligible patients received AML/OM 5/20 mg for 4 weeks, followed by stepwise uptitration to AML/OM 5/40 mg, AML/OM 10/40 mg, AML/OM 10/40 mg+HCTZ 12.5 mg and AML/OM 10/40 mg+HCTZ 25 mg at 4-week intervals if blood pressure (BP) remained uncontrolled. The primary end point was the achievement of the seated cuff systolic BP (SeSBP) goal (<140 mm Hg, or <130 mm Hg for patients with T2DM) at week 12. Seated cuff BP was significantly reduced from baseline at all titration dose periods. At week 12, the cumulative SeSBP goal was achieved by 57.9% and 80.1% of patients in the T2DM and non-T2DM subgroups, respectively. Treatment was well tolerated, with low rates of peripheral edema. In summary, switching to a treatment algorithm based on AML/OM±HCTZ after failed monotherapy was safe and improved BP control in patients with hypertension and T2DM.
Modified Treatment Algorithm for Pseudogynecomastia After Massive Weight Loss.
Ziegler, Ulrich E; Lorenz, Udo; Daigeler, Adrien; Ziegler, Selina N; Zeplin, Philip H
2018-06-19
Pseudogynecomastia is the increased aggregation of fatty tissue in the area of the male breast with a resultant female appearance. Two forms can appear: pseudogynecomastia after massive weight loss (pseudogynecomastia obese [PO]) and pseudogynecomastia caused only by adipose tissue (pseudogynecomastia fat). For PO, only the Gusenoff classification with corresponding operative treatment options exists. However, this classification is limited by the fact that it underestimates the extensive variability of residual fat tissue and skin excess, both crucial factors for operative planning. For this reason, we propose a modification of the treatment algorithm for the Gusenoff classification based on our results to achieve more masculine results. A total of 43 male patients with PO were included in this retrospective study (grade 1a, n = 1; grade 1b, n = 1; grade 2, n = 17; grade 3, n = 24). Forty-two mastectomies with free nipple-areola complex (NAC) transposition (grades 2 and 3) and 1 subcutaneous mastectomy (grade 1a) with periareolar lifting were performed. A retrospective chart review was performed to obtain data regarding age, body mass index, body mass index loss, weight loss, reason for weight loss, comorbidities, nicotine use, additional procedures, postoperative sensitivity of the NAC transplants, and complications. None of the free nipple grafts were lost. Forty (95%) of the 42 patients with mastectomy regained sensitivity of the NAC. For pseudogynecomastia, the treatment algorithm of the Gusenoff classification should be modified and adapted according to our recommendations to achieve more optimal masculine results.
Yuste, Valentin; Delgado, Julio; Agullo, Alberto; Sampietro, Jose Manuel
2017-06-01
Burns of the first commissure of the hand can evolve into an adduction contracture of the thumb. We decided to conduct a review of the existing literature on the treatment of full-thickness burns of the first commissure in order to develop a treatment algorithm that integrates the various currently available procedures. A search of the existing literature was conducted, focusing on the treatment of a burn of the first commissure in its chronic and acute phases. A total of 29 relevant articles were selected; 24 focused exclusively on the chronic contracture stage, while 3 focused exclusively on the acute burn stage, and 2 articles studied both stages. A therapeutic algorithm for full-thickness burns of the first commissure of the hand was developed. With this algorithm we sought to relate each degree and stage of the burn with a treatment. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.
New development of the image matching algorithm
NASA Astrophysics Data System (ADS)
Zhang, Xiaoqiang; Feng, Zhao
2018-04-01
To study image matching, the four elements of a matching algorithm are described: similarity measurement, feature space, search space and search strategy. Four common indexes for evaluating image matching algorithms are also described: matching accuracy, matching efficiency, robustness and universality. The paper then describes the principles of image matching algorithms based on gray values, on features, on frequency-domain analysis, on neural networks and on semantic recognition, and analyzes their characteristics and latest research achievements. Finally, the development trend of image matching algorithms is discussed. This study is significant for algorithm improvement, new algorithm design and algorithm selection in practice.
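To make the four-elements decomposition concrete, the sketch below implements one instance of gray-value matching: exhaustive normalized cross-correlation of a template over a search image (similarity measure: NCC; feature space: raw intensities; search space: all translations; search strategy: brute force). The image, template and helper names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
image = rng.uniform(0, 255, (64, 64))
template = image[20:28, 30:38].copy()          # ground-truth location (20, 30)

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    # normalized cross-correlation of two equal-sized patches
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

h, w = template.shape
rows, cols = image.shape[0] - h + 1, image.shape[1] - w + 1
scores = np.array([[ncc(image[i:i + h, j:j + w], template)
                    for j in range(cols)] for i in range(rows)])
print("best match at:", np.unravel_index(scores.argmax(), scores.shape))
```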
An Efficient Statistical Computation Technique for Health Care Big Data using R
NASA Astrophysics Data System (ADS)
Sushma Rani, N.; Srinivasa Rao, P., Dr; Parimala, P.
2017-08-01
Due to changes in living conditions and other factors, many critical health-related problems are arising. Diagnosis at an earlier stage increases the chances of survival and fast recovery, reducing both the time to recovery and the cost associated with treatment. One such issue is cancer; breast cancer has been identified as the second leading cause of cancer death. If detected at an early stage it can be cured. Once a patient is found to have a breast tumor, it should be classified as cancerous or non-cancerous. The paper therefore uses the k-nearest neighbors (KNN) algorithm, one of the simplest machine learning algorithms and an instance-based learning algorithm, to classify the data. New records are added daily, which increases the amount of data to be classified and turns this into a big data problem. The algorithm is implemented in R, a popular platform for statistical computing and machine learning. Experiments were conducted using various classification evaluation metrics over various values of k. The results show that the KNN algorithm performs better than existing models.
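The paper's implementation is in R; as a language-neutral illustration, the Python sketch below sweeps k for a k-nearest-neighbour classifier on a public breast-cancer dataset and reports cross-validated accuracy. It illustrates the method only, not the paper's data or results.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Sweep k and score a KNN classifier with 5-fold cross-validation.
X, y = load_breast_cancer(return_X_y=True)
for k in (1, 3, 5, 7, 9):
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
    print(f"k={k}: mean CV accuracy {acc:.3f}")
```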
The Texas Medication Algorithm Project antipsychotic algorithm for schizophrenia: 2006 update.
Moore, Troy A; Buchanan, Robert W; Buckley, Peter F; Chiles, John A; Conley, Robert R; Crismon, M Lynn; Essock, Susan M; Finnerty, Molly; Marder, Stephen R; Miller, Del D; McEvoy, Joseph P; Robinson, Delbert G; Schooler, Nina R; Shon, Steven P; Stroup, T Scott; Miller, Alexander L
2007-11-01
A panel of academic psychiatrists and pharmacists, clinicians from the Texas public mental health system, advocates, and consumers met in June 2006 in Dallas, Tex., to review recent evidence in the pharmacologic treatment of schizophrenia. The goal of the consensus conference was to update and revise the Texas Medication Algorithm Project (TMAP) algorithm for schizophrenia used in the Texas Implementation of Medication Algorithms, a statewide quality assurance program for treatment of major psychiatric illness. Four questions were identified via premeeting teleconferences. (1) Should antipsychotic treatment of first-episode schizophrenia be different from that of multiepisode schizophrenia? (2) In which algorithm stages should first-generation antipsychotics (FGAs) be an option? (3) How many antipsychotic trials should precede a clozapine trial? (4) What is the status of augmentation strategies for clozapine? Subgroups reviewed the evidence in each area and presented their findings at the conference. The algorithm was updated to incorporate the following recommendations. (1) Persons with first-episode schizophrenia typically require lower antipsychotic doses and are more sensitive to side effects such as weight gain and extrapyramidal symptoms (group consensus). Second-generation antipsychotics (SGAs) are preferred for treatment of first-episode schizophrenia (majority opinion). (2) FGAs should be included in algorithm stages after first episode that include SGAs other than clozapine as options (group consensus). (3) The recommended number of trials of other antipsychotics that should precede a clozapine trial is 2, but earlier use of clozapine should be considered in the presence of persistent problems such as suicidality, comorbid violence, and substance abuse (group consensus). (4) Augmentation is reasonable for persons with inadequate response to clozapine, but published results on augmenting agents have not identified replicable positive results (group consensus). These recommendations are meant to provide a framework for clinical decision making, not to replace clinical judgment. As with any algorithm, treatment practices will evolve beyond the recommendations of this consensus conference as new evidence and additional medications become available.
A software tool of digital tomosynthesis application for patient positioning in radiotherapy.
Yan, Hui; Dai, Jian-Rong
2016-03-08
Digital Tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose for data acquisition, and it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, in contrast to the conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from matching ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from matching ODTS and RDTS in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented on GPU environments for acceleration. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by the GPU-based algorithm and the CPU-based algorithm is 0.99. Based on measurements of a cube phantom on DTS, the geometric errors are within 0.5 mm in three axes. For both the cube phantom and a pelvic phantom, the registration errors are within 0.5 mm in three axes. Compared with the reconstruction performance of the CPU-based algorithms, the performance of DRR and DTS reconstruction is improved by a factor of 15 to 20. A GPU-based software tool was developed for the DTS application for patient positioning in radiotherapy. The geometric and registration accuracy met the clinical requirements for patient setup in radiotherapy. The high performance of the DRR and DTS reconstruction algorithms was achieved by the GPU-based computation environment. It is a useful software tool for researchers and clinicians evaluating the DTS application in patient positioning for radiotherapy.
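The two-view matching step can be illustrated by recovering a 2-D shift between a reference and a moving image, e.g. with FFT phase correlation as below; one such match in the coronal view would yield the lateral/longitudinal shifts and one in the sagittal view the longitudinal/vertical shifts. The synthetic images and the use of phase correlation are assumptions, not necessarily the paper's registration method.

```python
import numpy as np

rng = np.random.default_rng(5)

def shift_2d(ref: np.ndarray, moving: np.ndarray) -> list:
    # phase correlation: peak of the inverse FFT of the normalized
    # cross-power spectrum gives how far `moving` is displaced vs `ref`
    f = np.fft.fft2(moving) * np.conj(np.fft.fft2(ref))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-12)).real
    peak = np.unravel_index(corr.argmax(), corr.shape)
    # unwrap circular shifts larger than half the image size
    return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

ref = rng.uniform(0, 1, (128, 128))
moving = np.roll(ref, (3, -5), axis=(0, 1))     # known 3 px / -5 px offset
print("recovered shift (rows, cols):", shift_2d(ref, moving))
```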
A geometrically based method for automated radiosurgery planning.
Wagner, T H; Yi, T; Meeks, S L; Bova, F J; Brechner, B L; Chen, Y; Buatti, J M; Friedman, W A; Foote, K D; Bouchet, L G
2000-12-01
A geometrically based method of multiple isocenter linear accelerator radiosurgery treatment planning optimization was developed, based on a target's solid shape. Our method uses an edge detection process to determine the optimal sphere packing arrangement with which to cover the planning target. The sphere packing arrangement is converted into a radiosurgery treatment plan by substituting the isocenter locations and collimator sizes for the spheres. This method is demonstrated on a set of 5 irregularly shaped phantom targets, as well as a set of 10 clinical example cases ranging from simple to very complex in planning difficulty. Using a prototype implementation of the method and standard dosimetric radiosurgery treatment planning tools, feasible treatment plans were developed for each target. The treatment plans generated for the phantom targets showed excellent dose conformity and acceptable dose homogeneity within the target volume. The algorithm was able to generate a radiosurgery plan conforming to the Radiation Therapy Oncology Group (RTOG) guidelines on radiosurgery for every clinical and phantom target examined. This automated planning method can serve as a valuable tool to assist treatment planners in rapidly and consistently designing conformal multiple isocenter radiosurgery treatment plans.
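A minimal sketch of the greedy geometric idea, under the assumption of a 2-D target and a fixed set of collimator radii: repeatedly place the largest sphere (here a circle) that fits at the point deepest inside the uncovered target, as measured by a Euclidean distance transform. The paper's edge-detection and dosimetric steps are omitted, and the target shape is invented.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Toy 2-D "target": an ellipse on a 60 x 60 grid.
yy, xx = np.mgrid[:60, :60]
target = ((yy - 30) ** 2 / 400 + (xx - 30) ** 2 / 900) <= 1

collimators = [12, 8, 5, 3]                   # available radii, largest first
remaining, placements = target.copy(), []
while remaining.any():
    depth = distance_transform_edt(remaining)  # distance to uncovered edge
    r = next((c for c in collimators if c <= depth.max()), None)
    if r is None:
        break                                  # leftover slivers too thin to cover
    cy, cx = np.unravel_index(depth.argmax(), depth.shape)
    placements.append((cy, cx, r))             # isocenter position + collimator
    remaining &= ((yy - cy) ** 2 + (xx - cx) ** 2) > r ** 2

print(f"{len(placements)} isocenters, first few:", placements[:5])
```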
Shi, Hai-Bo; Cheng, Lei; Nakayama, Meiho; Kakazu, Yasuhiro; Yin, Min; Miyoshi, Akira; Komune, Shizuo
2005-09-01
Automatic continuous positive airway pressure (auto-CPAP) machines differ mainly in the algorithms used for respiratory event detection and pressure control. Auto-CPAP machines operated by novel algorithms are expected to perform better than earlier ones in the treatment of obstructive sleep apnea syndrome (OSAS). The purpose of this study was to determine the therapeutic characteristics of two different auto-CPAP devices, i.e., the third-generation flow-based (f-APAP) and the second-generation vibration-based (v-APAP) machines, during the first night of OSAS treatment. We retrospectively reviewed the polysomnography (PSG) recordings of 43 OSAS patients who initially underwent an overnight diagnostic PSG to confirm the disease and afterwards received a first night of auto-CPAP treatment, using either the f-APAP (n=22) or v-APAP (n=21) device, under another PSG evaluation. A residual apnea/hypopnea index of more than 5 remained in 13.6% and 61.9% of patients during f-APAP and v-APAP application, respectively (P<0.005). The f-APAP was more effective than the v-APAP in reducing the apnea/hypopnea index (P=0.003), hypopnea index (P=0.023) and apnea index (P=0.007), improving the lowest oxygen saturation index (P=0.007) and shortening stage 1 sleep (P=0.016). However, the f-APAP was less effective than the v-APAP in reducing the arousal/awakening index (P=0.02). These findings suggest that the f-APAP works better than the v-APAP in abolishing breathing abnormalities in the treatment of OSAS; however, the f-APAP device might still have some potential limitations in clinical application.
Chatzistamatiou, Kimon; Moysiadis, Theodoros; Moschaki, Viktoria; Panteleris, Nikolaos; Agorastos, Theodoros
2016-07-01
The objective of the present study was to identify the most effective cervical cancer screening algorithm incorporating different combinations of cytology, HPV testing and genotyping. Women 25-55 years old recruited for the "HERMES" (HEllenic Real life Multicentric cErvical Screening) study were screened in terms of cytology and high-risk (hr) HPV testing with HPV 16/18 genotyping. Women positive for cytology or/and hrHPV were referred for colposcopy, biopsy and treatment. Ten screening algorithms based on different combinations of cytology, HPV testing and HPV 16/18 genotyping were investigated in terms of diagnostic accuracy. Three clusters of algorithms were formed according to the balance between effectiveness and harm caused by screening. The cluster showing the best balance included two algorithms based on co-testing and two based on HPV primary screening with HPV 16/18 genotyping. Among these, hrHPV testing with HPV 16/18 genotyping and reflex cytology (atypical squamous cells of undetermined significance - ASCUS threshold) presented the optimal combination of sensitivity (82.9%) and specificity relative to cytology alone (0.99), with a false positive rate of 1.26 relative to cytology alone. HPV testing with HPV 16/18 genotyping, referring HPV 16/18-positive women directly to colposcopy and hrHPV (non-16/18)-positive women to reflex cytology (ASCUS threshold) as a triage method to colposcopy, reflects the best equilibrium between screening effectiveness and harm. Algorithms based on cytology as the initial screening method, on co-testing or HPV primary screening without genotyping, and on HPV primary screening with genotyping but without cytology triage are not supported according to the present analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou Jinghao; Kim, Sung; Jabbour, Salma
2010-03-15
Purpose: In the external beam radiation treatment of prostate cancers, successful implementation of adaptive radiotherapy and conformal radiation dose delivery is highly dependent on precise and expeditious segmentation and registration of the prostate volume between the simulation and the treatment images. The purpose of this study is to develop a novel, fast, and accurate segmentation and registration method to increase the computational efficiency to meet the restricted clinical treatment time requirement in image guided radiotherapy. Methods: The method developed in this study used soft tissues to capture the transformation between the 3D planning CT (pCT) images and 3D cone-beam CT (CBCT) treatment images. The method incorporated a global-to-local deformable mesh model based registration framework as well as an automatic anatomy-constrained robust active shape model (ACRASM) based segmentation algorithm in the 3D CBCT images. The global registration was based on the mutual information method, and the local registration minimized the Euclidean distance of the corresponding nodal points from the global transformation of the deformable mesh models, which implicitly used the information of the segmented target volume. The method was applied to six data sets of prostate cancer patients. Target volumes delineated by the same radiation oncologist on the pCT and CBCT were chosen as the benchmarks and were compared to the segmented and registered results. Distance-based and volume-based estimators were used to quantitatively evaluate the results of segmentation and registration. Results: The ACRASM segmentation algorithm was compared to the original active shape model (ASM) algorithm by evaluating the values of the distance-based estimators. With respect to the corresponding benchmarks, the mean distance ranged from -0.85 to 0.84 mm for ACRASM and from -1.44 to 1.17 mm for ASM. The mean absolute distance ranged from 1.77 to 3.07 mm for ACRASM and from 2.45 to 6.54 mm for ASM. The volume overlap ratio ranged from 79% to 91% for ACRASM and from 44% to 80% for ASM. These data demonstrated that the segmentation results of ACRASM were in better agreement with the corresponding benchmarks than those of ASM. The developed registration algorithm was quantitatively evaluated by comparing the registered target volumes from the pCT to the benchmarks on the CBCT. The mean distance and the root mean square error ranged from 0.38 to 2.2 mm and from 0.45 to 2.36 mm, respectively, between the CBCT images and the registered pCT. The mean overlap ratio of the prostate volumes ranged from 85.2% to 95% after registration. The average time of the ACRASM-based segmentation was under 1 min. The average time of the global transformation was from 2 to 4 min on two 3D volumes and the average time of the local transformation was from 20 to 34 s on two deformable superquadric mesh models. Conclusions: A novel and fast segmentation and deformable registration method was developed to capture the transformation between the planning and treatment images for external beam radiotherapy of prostate cancers. This method increases the computational efficiency and may provide a foundation for achieving real-time adaptive radiotherapy.
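The global step is mutual-information based; the helper below shows the standard histogram estimate of mutual information between two images, with toy random arrays standing in for pCT/CBCT slices. Bin count and image contents are assumptions.

```python
import numpy as np

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    # histogram estimate of MI: sum of p(x,y) * log(p(x,y) / (p(x) p(y)))
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()                    # joint probability table
    px, py = p.sum(axis=1), p.sum(axis=0)      # marginal distributions
    nz = p > 0                                 # avoid log(0)
    return float(np.sum(p[nz] * np.log(p[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(8)
fixed = rng.uniform(0, 1, (64, 64))
print(f"MI with itself:       {mutual_information(fixed, fixed):.2f}")
print(f"MI with unrelated:    {mutual_information(fixed, rng.uniform(0, 1, (64, 64))):.2f}")
```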
Barnes, M P; Ebert, M A
2008-03-01
The concept of electron pencil-beam dose distributions is central to the pencil-beam algorithms used in electron beam radiotherapy treatment planning. The Hogstrom algorithm, a common algorithm for electron treatment planning, models large electron field dose distributions by the superposition of a series of pencil-beam dose distributions. This means that the accurate characterisation of an electron pencil beam is essential for the accuracy of the dose algorithm. The aim of this study was to evaluate a measurement-based approach for obtaining electron pencil-beam dose distributions. The primary incentive for the study was the accurate calculation of dose distributions for narrow fields, as traditional electron algorithms are generally inaccurate for such geometries. Kodak X-Omat radiographic film was used in a solid water phantom to measure the dose distribution of circular 12 MeV beams from a Varian 21EX linear accelerator. Measurements were made for beams of diameter 1.5, 2, 4, 8, 16 and 32 mm. A blocked-field technique was used to subtract photon contamination in the beam. The "error function" derived from Fermi-Eyges Multiple Coulomb Scattering (MCS) theory for corresponding square fields was used to fit the resulting dose distributions so that extrapolation down to a pencil-beam distribution could be made. The Monte Carlo codes BEAM and EGSnrc were used to simulate the experimental arrangement. The 8 mm beam dose distribution was also measured with TLD-100 microcubes. Agreement between the film, TLD and Monte Carlo simulation results was found to be consistent with the spatial resolution used. The study has shown that it is possible to extrapolate narrow electron beam dose distributions down to a pencil-beam dose distribution using the error function. However, due to experimental uncertainties and measurement difficulties, Monte Carlo is recommended as the method of choice for characterising electron pencil-beam dose distributions.
A simplified surgical algorithm for flap reconstruction of eyebrow defects.
Liu, Hai-Peng; Shao, Ying; Yu, Xiao-Jie; Zhang, Duo
2017-04-01
Partial or total eyebrow defects after trauma or tumor excision have been repaired with several surgical techniques and algorithms. However, these algorithms are often complicated and difficult to apply clinically. We therefore established a simplified surgical algorithm for the treatment of eyebrow defects using flap reconstruction. Between January 2009 and December 2015, a total of 21 Chinese patients (12 males, 9 females) with eyebrow defects were treated with eyebrow flap reconstruction. Ages ranged from 12 to 51 years. The defects were located on the left eyebrow in 13 cases and on the right in 8 cases, and were caused by trauma (5 patients) or tumor excision (16 patients). Six patients were treated using a superficial temporal artery island flap, while 15 patients were treated using a V-Y advancement pedicle flap based on the orbicularis oculi muscle. The minimum defect area was 0.8 × 1.0 cm and the maximum was 2.3 × 4.3 cm. All patients were followed up for 6 months to 5 years postoperatively. The clinical results of eyebrow reconstruction were evaluated using a designated scoring system. All 21 flaps survived without significant complications, and the shapes of the reconstructed eyebrows were continuous and symmetrical, with good integrity. According to the rating scale, there were 13 excellent and 8 good reconstructions among all patients. After an average of 9 months of follow-up, no patient had recurrence of a tumor, infection or scarring. Based on our experience with these 21 patients who underwent reconstruction for various eyebrow defects, we believe that our simplified surgical algorithm can serve as a model for the treatment of patients with eyebrow defects. Copyright © 2017 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
Chanes, Daniella Cristina; da Luz Gonçalves Pedreira, Mavilde; de Gutiérrez, Maria Gaby Rivero
2012-02-01
The infusion of antineoplastic agents through peripheral lines may lead to several adverse events; extravasation is one of the most severe acute reactions in this type of treatment. Extravasation prevention and management must be part of safe and evidence-based nursing care. For this reason, two algorithms were developed to guide nursing care for children who undergo chemotherapy through a peripheral line. The objectives of this study were to determine the content validity of both algorithms with pediatric oncology nurses in Brazil and the United States of America, and to verify the agreement between the evaluations of the two groups. A descriptive validation study was carried out using the Delphi Technique, with the following steps: development of the data collection instrument, application to the specialists, data analysis, review of the algorithms, re-evaluation by the specialists, final data analysis and content validity determination. The data analysis was descriptive and based on a specialist agreement consensus equal to or higher than 80% at every step of the algorithms. The process showed that agreement with both instruments ranged from 92.8% to 99.0%. The algorithms are valid for application in nursing care with the main purpose of preventing and managing extravasation of antineoplastic agents. Copyright © 2011 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moura, Eduardo S.; Micka, John A.; Hammer, Cliff G.
Purpose: This work presents the development of a phantom to verify the treatment planning system (TPS) algorithms used for high-dose-rate (HDR) brachytherapy. It is designed to measure the relative dose in heterogeneous media. The experimental details used, simulation methods, and comparisons with a commercial TPS are also provided. Methods: To simulate heterogeneous conditions, four materials were used: Virtual Water™ (VM), BR50/50™, cork, and aluminum. The materials were arranged in 11 heterogeneity configurations. Three dosimeters were used to measure the relative response from an HDR ¹⁹²Ir source: TLD-100™, Gafchromic® EBT3 film, and an Exradin™ A1SL ionization chamber. To compare the results from the experimental measurements, the various configurations were modeled in the PENELOPE/penEasy Monte Carlo code. Images of each setup geometry were acquired from a CT scanner and imported into BrachyVision™ TPS software, which includes the grid-based Boltzmann solver Acuros™. The results of the measurements performed in the heterogeneous setups were normalized to the dose values measured in the homogeneous Virtual Water™ setup and the respective differences due to the heterogeneities were considered. Additionally, dose values calculated based on the American Association of Physicists in Medicine Task Group 43 formalism were compared to dose values calculated with the Acuros™ algorithm in the phantom. Calculated doses were compared at the same points where measurements had been performed. Results: Differences in the relative response as high as 11.5% from the homogeneous setup were found when the heterogeneous materials were inserted into the experimental phantom. The aluminum and cork materials produced larger differences than the plastic materials, with the BR50/50™ material producing results similar to the Virtual Water™ results. Our experimental methods agree with the PENELOPE/penEasy simulations for most setups and dosimeters. The TPS relative differences with the Acuros™ algorithm were similar in both the experimental and simulated setups. The discrepancy between the BrachyVision™ Acuros™ and TG-43 dose responses in the phantom described by this work exceeded 12% for certain setups. Conclusions: The results derived from the phantom measurements show good agreement with the simulations and TPS calculations using the Acuros™ algorithm. Differences in the dose responses were evident in the experimental results when heterogeneous materials were introduced. These measurements prove the usefulness of the heterogeneous phantom for verification of HDR treatment planning systems based on model-based dose calculation algorithms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mijnheer, B; Mans, A; Olaciregui-Ruiz, I
Purpose: To develop a 3D in vivo dosimetry method that can substitute for pre-treatment verification in an efficient way, and to terminate treatment delivery if the online measured 3D dose distribution deviates too much from the predicted dose distribution. Methods: A back-projection algorithm has been further developed and implemented to enable automatic 3D in vivo dose verification of IMRT/VMAT treatments using a-Si EPIDs. New software tools were clinically introduced to allow automated image acquisition, to periodically inspect the record-and-verify database, and to automatically run the EPID dosimetry software. The comparison of the EPID-reconstructed and planned dose distributions is done offline to raise alerts automatically and to schedule actions when deviations are detected. Furthermore, a software package for online dose reconstruction was also developed. The RMS of the difference between the cumulative planned and reconstructed 3D dose distributions was used for triggering a halt of the linac. Results: The implementation of fully automated 3D EPID-based in vivo dosimetry was able to replace pre-treatment verification for more than 90% of the patient treatments. The process has been fully automated and integrated into our clinical workflow, where over 3,500 IMRT/VMAT treatments are verified each year. By optimizing the dose reconstruction algorithm and the I/O performance, the delivered 3D dose distribution is verified in less than 200 ms per portal image, which includes the comparison between the reconstructed and planned dose distributions. In this way it was possible to generate a trigger that can stop the irradiation at less than 20 cGy after introducing large delivery errors. Conclusion: The automatic offline solution facilitated the large-scale clinical implementation of 3D EPID-based in vivo dose verification of IMRT/VMAT treatments; the online approach has been successfully tested for various severe delivery errors.
NASA Astrophysics Data System (ADS)
Leavens, Claudia; Vik, Torbjørn; Schulz, Heinrich; Allaire, Stéphane; Kim, John; Dawson, Laura; O'Sullivan, Brian; Breen, Stephen; Jaffray, David; Pekar, Vladimir
2008-03-01
Manual contouring of target volumes and organs at risk in radiation therapy is extremely time-consuming, in particular for treating the head-and-neck area, where a single patient treatment plan can take several hours to contour. As radiation treatment delivery moves towards adaptive treatment, the need for more efficient segmentation techniques will increase. We are developing a method for automatic model-based segmentation of the head and neck. This process can be broken down into three main steps: i) automatic landmark identification in the image dataset of interest, ii) automatic landmark-based initialization of deformable surface models to the patient image dataset, and iii) adaptation of the deformable models to the patient-specific anatomical boundaries of interest. In this paper, we focus on the validation of the first step of this method, quantifying the results of our automatic landmark identification method. We use an image atlas formed by applying thin-plate spline (TPS) interpolation to ten atlas datasets, using 27 manually identified landmarks in each atlas/training dataset. The principal variation modes returned by principal component analysis (PCA) of the landmark positions were used by an automatic registration algorithm, which sought the corresponding landmarks in the clinical dataset of interest using a controlled random search algorithm. With a run time of 60 seconds for the random search, a root mean square (rms) distance to the ground-truth landmark position of 9.5 ± 0.6 mm was calculated for the identified landmarks. Automatic segmentation of the brain, mandible and brain stem, using the detected landmarks, is demonstrated.
Stojadinovic, Strahinja; Hrycushko, Brian; Wardak, Zabi; Lau, Steven; Lu, Weiguo; Yan, Yulong; Jiang, Steve B.; Zhen, Xin; Timmerman, Robert; Nedzi, Lucien
2017-01-01
Accurate and automatic brain metastases target delineation is a key step for efficient and effective stereotactic radiosurgery (SRS) treatment planning. In this work, we developed a deep learning convolutional neural network (CNN) algorithm for segmenting brain metastases on contrast-enhanced T1-weighted magnetic resonance imaging (MRI) datasets. We integrated the CNN-based algorithm into an automatic brain metastases segmentation workflow and validated it on both Multimodal Brain Tumor Image Segmentation challenge (BRATS) data and clinical patients' data. Validation on BRATS data yielded average Dice coefficients (DCs) of 0.75±0.07 in the tumor core and 0.81±0.04 in the enhancing tumor, which outperformed most techniques in the 2015 BRATS challenge. Segmentation results on patient cases showed an average DC of 0.67±0.03 and achieved an area under the receiver operating characteristic curve of 0.98±0.01. The developed automatic segmentation strategy surpasses current benchmark levels and offers a promising tool for SRS treatment planning for multiple brain metastases. PMID:28985229
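The reported DCs follow the standard Dice overlap formula; the small helper below, shown on toy binary masks, makes the computation explicit.

```python
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    # Dice coefficient: 2 |A intersect B| / (|A| + |B|)
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

pred = np.zeros((32, 32), int)
pred[8:20, 8:20] = 1       # toy predicted segmentation
truth = np.zeros((32, 32), int)
truth[10:22, 10:22] = 1    # toy ground-truth segmentation
print(f"Dice coefficient: {dice(pred, truth):.3f}")
```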
Farahmand, Farid; Khadivi, Kevin O.; Rodrigues, Joel J. P. C.
2009-01-01
The utility of a novel, high-precision, non-intrusive, wireless, accelerometer-based patient orientation monitoring system (APOMS) in determining orientation change in patients undergoing radiation treatment is reported here. With this system, a small wireless accelerometer sensor is placed on the patient's skin, broadcasting its orientation to a receiving station connected to a PC in the control area. A threshold-based algorithm was developed to identify the exact amount of change in the patient's head orientation. Through real-time measurements, an audible alarm can alert the radiation therapist if the user-defined orientation threshold is violated. Our results indicate that, in spite of its low cost and simplicity, the APOMS is highly sensitive and offers accurate measurements. Furthermore, the APOMS is patient friendly, vendor neutral, and requires minimal user training. The versatile architecture of the APOMS makes it potentially suitable for a variety of applications, including the study of correlation between external and internal markers during Image-Guided Radiation Therapy (IGRT), with no major changes in hardware setup or algorithm. PMID:22423196
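A minimal version of the threshold-based alert logic could look like the sketch below, assuming head pitch is derived from the accelerometer's gravity vector and compared against a user-defined threshold each sample; the 3-degree threshold and the simulated data stream are invented for illustration.

```python
import numpy as np

THRESHOLD_DEG = 3.0                  # assumed user-defined orientation threshold
rng = np.random.default_rng(6)

def pitch_deg(accel_xyz) -> float:
    # tilt angle relative to the gravity direction from a 3-axis accelerometer
    x, y, z = accel_xyz
    return float(np.degrees(np.arctan2(-x, np.hypot(y, z))))

baseline = pitch_deg((0.0, 0.0, 1.0))               # reference pose at setup
for i in range(100):
    sample = np.array([0.0, 0.0, 1.0]) + rng.normal(0, 0.01, 3)
    if i > 60:
        sample[0] += 0.12                            # simulate a head rotation
    if abs(pitch_deg(sample) - baseline) > THRESHOLD_DEG:
        print(f"ALARM at sample {i}: orientation threshold exceeded")
        break
```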
The acute management of haemorrhoids.
Hardy, A; Cohen, C R G
2014-10-01
Although the acute thrombosis and strangulation of haemorrhoids is a common condition, there is no consensus as to its most effective treatment. A PubMed search was undertaken for papers describing the aetiology and treatment of the acute complications of haemorrhoids. The anatomy and treatments for strangulated internal haemorrhoids and thrombosed perianal varices are discussed. Studies of the effectiveness and complications of conservative and operative treatments are reviewed. Ambiguities exist in the terminology used to describe the two separate pathologies that make up the acute complications of haemorrhoids. These complications have traditionally been treated conservatively. There is evidence that early operative intervention for strangulated internal haemorrhoids is safe and effective. A suggested algorithm for treatment is given, based on the published literature.
Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter; Egger, Jan
2018-01-01
Computer assisted technologies based on algorithmic software segmentation are a topic of increasing interest in complex surgical cases. However, due to functional instability, time-consuming software processes, limited personnel resources or license-based financial costs, many segmentation processes are often outsourced from clinical centers to third parties and the industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable and license-free segmentation approach for use in clinical practice. In this retrospective, randomized, controlled trial, the accuracy and accordance of the open-source segmentation algorithm GrowCut were assessed through comparison to a manually generated ground truth of the same anatomy, using 10 CT lower jaw datasets from the clinical routine. Assessment parameters were the segmentation time, the volume, the voxel number, the Dice score and the Hausdorff distance. Overall, semi-automatic GrowCut segmentation times were about one minute. Mean Dice score values of over 85% and Hausdorff distances below 33.5 voxels were achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Differences between the assessment parameters were not statistically significant (p<0.05) and correlation coefficients were close to one (r > 0.94) for all comparisons between the two groups. Functionally stable and time-saving segmentations with high accuracy and high positive correlation could be performed with the presented interactive open-source approach. In the cranio-maxillofacial complex, the method could represent an algorithmic alternative for image-based segmentation in clinical practice, e.g. for surgical treatment planning or visualization of postoperative results, and offers several advantages. Owing to its open-source basis, the method could be further developed by other groups or specialists. Systematic comparisons to other segmentation approaches and studies with larger datasets are areas of future work.
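The Hausdorff distance used above quantifies the largest boundary disagreement between two segmentations. A minimal sketch using SciPy, assuming binary voxel masks as input (names hypothetical):

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff_voxels(mask_a, mask_b):
    """Symmetric Hausdorff distance (in voxels) between two binary masks."""
    pts_a = np.argwhere(mask_a)   # voxel coordinates of segmentation A
    pts_b = np.argwhere(mask_b)   # voxel coordinates of segmentation B
    d_ab = directed_hausdorff(pts_a, pts_b)[0]
    d_ba = directed_hausdorff(pts_b, pts_a)[0]
    return max(d_ab, d_ba)

# Toy usage with two small overlapping 3-D masks.
a = np.zeros((10, 10, 10), dtype=bool); a[2:5, 2:5, 2:5] = True
b = np.zeros((10, 10, 10), dtype=bool); b[3:7, 3:7, 3:7] = True
print(hausdorff_voxels(a, b))
```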
Long-term surface EMG monitoring using K-means clustering and compressive sensing
NASA Astrophysics Data System (ADS)
Balouchestani, Mohammadreza; Krishnan, Sridhar
2015-05-01
In this work, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory, in combination with the K-Singular Value Decomposition (K-SVD) method, for clustering long-term recordings of surface electromyography (sEMG) signals. Long-term monitoring of sEMG signals aims to record the electrical activity produced by muscles, a very useful procedure for treatment and diagnostic purposes as well as for the detection of various pathologies. The proposed algorithm is examined for three scenarios of sEMG signals: a healthy person (sEMG-Healthy), a patient with myopathy (sEMG-Myopathy), and a patient with neuropathy (sEMG-Neuropathy). The proposed algorithm can easily scan large sEMG datasets from long-term sEMG recording. We test the proposed algorithm with Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC) dimensionality reduction methods. The output of the proposed algorithm is then fed to K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers in order to calculate the clustering performance. The proposed algorithm achieves a classification accuracy of 99.22%, reducing the Average Classification Error (ACE) by 17%, the Training Error (TE) by 9%, and the Root Mean Square Error (RMSE) by 18%. The proposed algorithm also reduces clustering energy consumption by 14% compared to the existing K-means clustering algorithm.
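A simplified analog of this pipeline (dimensionality reduction, clustering, then classification) can be sketched with scikit-learn. The data here are random stand-ins for sEMG feature vectors, and the CS/K-SVD stages of the original algorithm are omitted:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per sEMG segment.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 64))
y = rng.integers(0, 3, 300)       # 0 = healthy, 1 = myopathy, 2 = neuropathy

X_red = PCA(n_components=10).fit_transform(X)        # dimensionality reduction
clusters = KMeans(n_clusters=3, n_init=10).fit_predict(X_red)
print("cluster sizes:", np.bincount(clusters))

# Feed the reduced features to a K-NN classifier, as in the evaluation stage.
X_tr, X_te, y_tr, y_te = train_test_split(X_red, y, test_size=0.3, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("K-NN accuracy:", knn.score(X_te, y_te))
```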
Sensitivity of NTCP parameter values against a change of dose calculation algorithm.
Brink, Carsten; Berg, Martin; Nielsen, Morten
2007-09-01
Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations, the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have been retrospectively recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for all three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models.
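The abstract does not name the three models, but the Lyman-Kutcher-Burman (LKB) model is one widely used example of how such a dose-distribution-to-NTCP mapping works. A minimal sketch with illustrative, not study-specific, parameter values:

```python
import numpy as np
from math import erf, sqrt

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.

    doses:   dose bin centers (Gy); volumes: fractional volume per bin.
    td50, m, n: organ-specific model parameters.
    """
    volumes = np.asarray(volumes, dtype=float)
    volumes = volumes / volumes.sum()
    # Generalized equivalent uniform dose (gEUD).
    geud = np.sum(volumes * np.asarray(doses, dtype=float) ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))  # standard normal CDF of t

# Illustrative DVH and parameters only; refitting td50/m/n per dose algorithm
# is exactly the exercise the paper describes.
print(lkb_ntcp([10, 20, 30], [0.5, 0.3, 0.2], td50=30.8, m=0.37, n=0.99))
```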
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Q; Devpura, S; Feghali, K
2016-06-15
Purpose: To investigate the correlation of normal lung CT density changes with dose accuracy and outcome after SBRT for patients with early stage lung cancer. Methods: Dose distributions for patients originally planned and treated using a 1-D pencil beam-based (PB-1D) dose algorithm were retrospectively recomputed using the following algorithms: 3-D pencil beam (PB-3D), and the model-based methods AAA, Acuros XB (AXB), and Monte Carlo (MC). Prescription dose was 12 Gy × 4 fractions. Planning CT images were rigidly registered to the follow-up CT datasets at 6–9 months after treatment, and the corresponding dose distributions were mapped from the planning to the follow-up CT images. Following the method of Palma et al. (1–2), Hounsfield Unit (HU) changes in lung density in individual 5 Gy dose bins from 5–45 Gy were assessed in the peri-tumor region, defined as a uniform 3 cm expansion around the ITV (1). Results: There is a 10–15% displacement of the high dose region (40–45 Gy) with the model-based algorithms, relative to the PB method, due to the electron scattering of dose away from the tumor into normal lung tissue (Fig. 1). Consequently, the high-dose lung region falls within the 40–45 Gy dose range, causing an increase in HU change in this region, as predicted by the model-based algorithms (Fig. 2). The patient with the highest HU change (∼110) had mild radiation pneumonitis, and the patient with an HU change of ∼80–90 had shortness of breath. No evidence of pneumonitis was observed for the 3 patients with smaller CT density changes (<50 HU). Changes in CT densities, and the dose-response correlation, as computed with the model-based algorithms, are in excellent agreement with the findings of Palma et al. (1–2). Conclusion: Dose computed with PB (1D or 3D) algorithms was poorly correlated with clinically relevant CT density changes, as opposed to model-based algorithms. A larger cohort of patients is needed to confirm these results. This work was supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
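Binning HU change by mapped dose, as described above, is a simple array operation once the dose and the registered planning/follow-up HU grids share a common frame. A hypothetical sketch (array names and the peri-tumor masking are assumptions):

```python
import numpy as np

def mean_hu_change_per_bin(dose, hu_plan, hu_followup, edges=range(5, 50, 5)):
    """Mean HU change per 5 Gy dose bin in the peri-tumor region.

    dose:        mapped dose (Gy) on the follow-up CT grid
    hu_plan:     planning-CT HU values (rigidly registered)
    hu_followup: follow-up-CT HU values
    All arrays share one shape and are restricted to the peri-tumor mask.
    """
    edges = np.asarray(list(edges))
    delta = hu_followup - hu_plan
    results = {}
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (dose >= lo) & (dose < hi)
        results[f"{lo}-{hi} Gy"] = delta[in_bin].mean() if in_bin.any() else np.nan
    return results

rng = np.random.default_rng(0)
print(mean_hu_change_per_bin(rng.uniform(0, 50, 10000),
                             rng.normal(-750, 30, 10000),
                             rng.normal(-730, 30, 10000)))
```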
Quantin, C; Collin, C; Frérot, M; Besson, J; Cottenet, J; Corneloup, M; Soudry-Faure, A; Mariet, A-S; Roussot, A
2017-10-01
The aim of the REDSIAM network is to foster communication between users of French medico-administrative databases and to validate and promote analysis methods suitable for the data. Within this network, the working group "Mental and behavioral disorders" took an interest in algorithms to identify adult schizophrenia in the SNIIRAM database and inventoried identification criteria for patients with schizophrenia in these databases. The methodology was based on interviews with nine experts in schizophrenia concerning the procedures they use to identify patients with schizophrenia disorders in databases. The interviews were based on a questionnaire and conducted by telephone. The synthesis of the interviews showed that the SNIIRAM contains various tables which allow coders to identify patients suffering from schizophrenia: chronic disease status, drugs and hospitalizations. Taken separately, these criteria are not sufficient to recognize patients with schizophrenia; an algorithm should combine all of them. Apparently, only one-third of people living with schizophrenia benefit from the longstanding disease status. Not all patients are hospitalized, and coding of diagnoses during hospitalization, notably for short stays in medicine, surgery or obstetrics departments, is not exhaustive. As for treatment with antipsychotics, it is not specific enough, as such treatments are also prescribed to patients with bipolar disorders, or even other disorders. It therefore seems appropriate to combine these complementary criteria, while keeping in mind outpatient care (every year 80,000 patients are seen exclusively in an outpatient setting), even if these data are difficult to link with other information. Finally, the experts proposed three selection algorithms for patients with schizophrenia. Patients with schizophrenia can be identified relatively accurately using SNIIRAM data. Different combinations of the selected criteria must be used depending on the objectives, and they must be applied over an appropriate length of time. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
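As an illustration of combining such complementary criteria, the toy sketch below flags patients from per-patient indicator columns. The table, column names and the simple OR rule are all hypothetical; the actual REDSIAM algorithms weight and combine criteria according to the study objective:

```python
import pandas as pd

# Hypothetical per-patient flags extracted from the claims tables.
patients = pd.DataFrame({
    "patient_id":       [1, 2, 3, 4],
    "chronic_status":   [True, False, False, True],   # longstanding-disease status
    "schizo_dx_hosp":   [False, True, False, True],   # hospital diagnosis code
    "antipsychotic_rx": [True, True, False, False],   # antipsychotic dispensing
})

# No single criterion suffices on its own; here they are simply OR-ed,
# although a real algorithm would weight them (antipsychotics alone are
# the least specific criterion, as the abstract notes).
patients["flagged_schizophrenia"] = (
    patients["chronic_status"]
    | patients["schizo_dx_hosp"]
    | patients["antipsychotic_rx"]
)
print(patients)
```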
Huang, Li; Yuan, Jiamin; Yang, Zhimin; Xu, Fuping; Huang, Chunhua
2015-01-01
Background. In this study, we use association rules to explore the latent rules and patterns of prescribing and adjusting the ingredients of herbal decoctions based on an empirical herbal formula of Chinese Medicine (CM). Materials and Methods. The consideration and development of CM prescriptions based on the knowledge of CM doctors are analyzed. The study contained three stages. The first stage was to identify the chief symptoms addressed by a specific empirical herbal formula, which can serve as the key indications for herb addition and cancellation. The second stage was to conduct a case study on the empirical CM herbal formula for insomnia, in which doctors add extra ingredients or cancel some of them according to CM syndrome diagnosis. The last stage was to divide the observed cases into an effective group and an ineffective group based on the clinical effect assessed by doctors. Patterns arising during diagnosis and treatment are selected by the applied algorithm, and the relations between clinical symptoms or indications and herb-choosing principles are selected by the association rules algorithm. Results. In total, 40 patients were observed in this study: treatment was considered effective in 28 patients and ineffective in the remaining 12. 206 patterns related to clinical indications of Chinese Medicine were checked and screened against each observed case. In the analysis of the effective group, we used the association rules algorithm to select combinations between the 28 herbal adjustment strategies of the empirical herbal formula and the 190 patterns of individual clinical manifestations. During this stage, 11 common patterns were eliminated and 5 major symptoms for insomnia remained. 12 association rules were identified, which included 5 herbal adjustment strategies. Conclusion. The association rules method is an effective algorithm for exploring the latent relations between clinical indications and herbal adjustment strategies in the study of empirical herbal formulas. PMID:26495415
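Association rule mining of this kind can be reproduced with standard tooling. A minimal sketch using mlxtend's apriori and association_rules functions on a hypothetical one-hot table of indications and herb adjustments (all column names and thresholds are invented, and the call assumes mlxtend's documented two-argument API):

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical one-hot table: rows = effective cases, columns = clinical
# indications and herbal adjustment strategies.
df = pd.DataFrame({
    "difficulty_falling_asleep": [1, 1, 0, 1, 1],
    "irritability":              [1, 0, 0, 1, 1],
    "add_suanzaoren":            [1, 1, 0, 1, 1],   # hypothetical herb addition
    "remove_dahuang":            [0, 0, 1, 0, 0],   # hypothetical herb removal
}).astype(bool)

frequent = apriori(df, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```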
RayPlus: a Web-Based Platform for Medical Image Processing.
Yuan, Rong; Luo, Ming; Sun, Zhi; Shi, Shuyue; Xiao, Peng; Xie, Qingguo
2017-04-01
Medical images can provide valuable information for preclinical research, clinical diagnosis, and treatment. With the widespread use of digital medical imaging, many researchers are currently developing medical image processing algorithms and systems in order to deliver better results to the clinical community, including accurate clinical parameters or processed images derived from the original images. In this paper, we propose a web-based platform to present and process medical images. By using Internet and novel database technologies, authorized users can easily access medical images and streamline their processing workflows with powerful server-side computing, without any installation. We implement a series of image processing and visualization algorithms in the initial version of RayPlus. Our integrated system offers great flexibility and convenience to both research and clinical communities.
TH-AB-202-01: Daily Lung Tumor Motion Characterization On EPIDs Using a Markerless Tiling Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozario, T; University of Texas at Dallas, Richardson, TX; Chiu, T
Purpose: Tracking lung tumor motion in real time allows for target dose escalation while simultaneously reducing dose to sensitive structures, thus increasing local control without increasing toxicity. We present a novel intra-fractional markerless lung tumor tracking algorithm using MV treatment beam images acquired during treatment delivery. Strong signals superimposed on the tumor significantly reduce soft tissue resolution, while the different imaging modalities involved introduce global imaging discrepancies; both reduce comparison accuracy. A simple yet elegant Tiling algorithm is reported to overcome these issues. Methods: MV treatment beam images were acquired continuously in beam's eye view (BEV) by an electronic portal imaging device (EPID) during treatment and analyzed to obtain tumor positions on every frame. Every frame of the MV image was simulated by a composite of two components with separate digitally reconstructed radiographs (DRRs): all non-moving structures and the tumor. The Tiling algorithm divides the global composite DRR and the corresponding MV projection into sub-images called tiles. Rigid registration is performed independently on tile-pairs in order to improve local soft tissue resolution. This enables the composite DRR to be transformed accurately to match the MV projection and attain a high correlation value through a pixel-based linear transformation. The highest cumulative correlation for all tile-pairs achieved over a user-defined search range indicates the 2-D coordinates of the tumor location on the MV projection. Results: This algorithm was successfully applied to cine-mode BEV images acquired during two SBRT plans delivered five times with different motion patterns to each of two phantoms. Approximately 15000 beam's eye view images were analyzed and tumor locations were successfully identified on every projection with a maximum/average error of 1.8 mm / 1.0 mm. Conclusion: Despite the presence of strong anatomical signal overlapping with tumor images, this markerless detection algorithm accurately tracks intrafractional lung tumor motion. This project is partially supported by an Elekta research grant.
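The core of such a tiling search, scoring candidate shifts by the cumulative tile-pair correlation, can be sketched in a few lines. This is a hypothetical simplification (one global 2-D shift, plain normalized cross-correlation), not the authors' full registration:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized tiles."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def find_tumor_shift(composite_drr, mv_frame, tile=64, search=10):
    """Shift (dy, dx) maximizing the cumulative tile-pair correlation."""
    h, w = mv_frame.shape
    best_score, best = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(composite_drr, (dy, dx), axis=(0, 1))
            # Sum NCC over a grid of non-overlapping tile pairs.
            score = sum(
                ncc(shifted[y:y + tile, x:x + tile],
                    mv_frame[y:y + tile, x:x + tile])
                for y in range(0, h - tile + 1, tile)
                for x in range(0, w - tile + 1, tile))
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

rng = np.random.default_rng(0)
frame = rng.random((256, 256))
print(find_tumor_shift(np.roll(frame, (3, -2), axis=(0, 1)), frame))
```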
Ultrafast treatment plan optimization for volumetric modulated arc therapy (VMAT).
Men, Chunhua; Romeijn, H Edwin; Jia, Xun; Jiang, Steve B
2010-11-01
To develop a novel aperture-based algorithm for volumetric modulated arc therapy (VMAT) treatment plan optimization with high quality and high efficiency. The VMAT optimization problem is formulated as a large-scale convex programming problem solved by a column generation approach. The authors consider a cost function consisting of two terms, the first enforcing a desired dose distribution and the second guaranteeing a smooth dose rate variation between successive gantry angles. A gantry rotation is discretized into 180 beam angles and for each beam angle, only one MLC aperture is allowed. The apertures are generated one by one in a sequential way. At each iteration of the column generation method, a deliverable MLC aperture is generated for one of the unoccupied beam angles by solving a subproblem with consideration of MLC mechanical constraints. A subsequent master problem is then solved to determine the dose rates at all currently generated apertures by minimizing the cost function. When all 180 beam angles are occupied, the optimization completes, yielding a set of deliverable apertures and associated dose rates that produce a high quality plan. The algorithm was preliminarily tested on five prostate and five head-and-neck clinical cases, each with one full gantry rotation without any couch/collimator rotations. High quality VMAT plans were generated for all ten cases with extremely high efficiency: it takes only 5-8 min on CPU (MATLAB code on an Intel Xeon 2.27 GHz CPU) and 18-31 s on GPU (CUDA code on an NVIDIA Tesla C1060 GPU card) to generate such plans. The authors have developed an aperture-based VMAT optimization algorithm which can generate clinically deliverable high quality treatment plans at very high efficiency.
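The column generation loop described above alternates between a pricing step (generate one new aperture) and a master problem (reoptimize dose rates over all apertures so far). The toy sketch below keeps that structure under heavy simplifications: candidate apertures are precomputed random columns instead of MLC-constrained shapes, and the master problem is a nonnegative least-squares fit to a target dose. Every name and dimension is hypothetical:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_vox, n_candidates = 200, 180
D = rng.random((n_vox, n_candidates))   # dose per unit rate, per candidate aperture
target = rng.random(n_vox) * 2.0        # prescribed voxel doses

chosen, residual = [], target.copy()
for _ in range(20):                     # generate apertures one by one
    # Pricing: pick the unused candidate whose column best matches the
    # current residual (a least-squares stand-in for the reduced cost).
    scores = D.T @ residual
    scores[chosen] = -np.inf
    chosen.append(int(np.argmax(scores)))
    # Master problem: optimal nonnegative dose rates for the chosen apertures.
    rates, _ = nnls(D[:, chosen], target)
    residual = target - D[:, chosen] @ rates

print(len(chosen), "apertures, residual norm:", np.linalg.norm(residual))
```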
Text extraction via an edge-bounded averaging and a parametric character model
NASA Astrophysics Data System (ADS)
Fan, Jian
2003-01-01
We present a deterministic text extraction algorithm that relies on three basic assumptions: color/luminance uniformity of the interior region, closed boundaries of sharp edges and the consistency of local contrast. The algorithm is basically independent of the character alphabet, text layout, font size and orientation. The heart of this algorithm is an edge-bounded averaging for the classification of smooth regions that enhances robustness against noise without sacrificing boundary accuracy. We have also developed a verification process to clean up the residue of incoherent segmentation. Our framework provides a symmetric treatment for both regular and inverse text. We have proposed three heuristics for identifying the type of text from a cluster consisting of two types of pixel aggregates. Finally, we have demonstrated the advantages of the proposed algorithm over adaptive thresholding and block-based clustering methods in terms of boundary accuracy, segmentation coherency, and capability to identify inverse text and separate characters from background patches.
Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweezy, Jeremy Ed
2016-01-21
The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron & gamma transport with multi-temperature treatment, static eigenvalue (keff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.
A software tool for determination of breast cancer treatment methods using data mining approach.
Cakır, Abdülkadir; Demirel, Burçin
2011-12-01
In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help the oncology doctor suggest appropriate treatment methods for breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, are used to determine treatment methods for new patients. This dataset is processed with the Weka data mining tool. Classification algorithms are applied one by one to this dataset and the results are compared to find the proper treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron and Decision Table) through a Java NetBeans interface to find out which one gives the better result for each attribute to be predicted. Treatment methods are determined for the post-surgical treatment of breast cancer patients using this software tool. At the modeling step of the data mining process, different Weka algorithms are used for the output attributes: IB1 showed the best accuracy for the hormonotherapy output, Multilayer Perceptron for the tamoxifen and radiotherapy outputs, and Decision Table for the chemotherapy output. In conclusion, this work shows that a data mining approach can be a useful tool for medical applications, particularly at the treatment decision step. Data mining helps the doctor decide in a short time.
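The underlying workflow, trying several classifiers per output attribute and keeping the most accurate, looks roughly like the following scikit-learn analog. The original study used Weka; the stand-ins below (and the synthetic data) are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier    # ~ Weka IB1 (1-NN)
from sklearn.neural_network import MLPClassifier      # ~ Multilayer Perceptron
from sklearn.tree import DecisionTreeClassifier       # rough stand-in for Decision Table

# Synthetic stand-in for the 462-patient dataset and one output attribute.
X, y = make_classification(n_samples=462, n_features=20, random_state=0)

for name, clf in [("IB1-like", KNeighborsClassifier(n_neighbors=1)),
                  ("MLP", MLPClassifier(max_iter=1000, random_state=0)),
                  ("Tree", DecisionTreeClassifier(random_state=0))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")   # keep the best classifier per output
```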
EAU guidelines on surgical treatment of urinary incontinence.
Lucas, M G; Bosch, R J L; Burkhard, F C; Cruz, F; Madden, T B; Nambiar, A K; Neisius, A; de Ridder, D J M K; Tubaro, A; Turner, W H; Pickard, R S
2013-09-01
The European Association of Urology (EAU) guidelines on urinary incontinence published in March 2012 have been rewritten based on an independent systematic review carried out by the EAU guidelines panel using a sustainable methodology. We present a short version here of the full guidelines on the surgical treatment of patients with urinary incontinence, with the aim of dissemination to a wider audience. Evidence appraisal included a pragmatic review of existing systematic reviews and independent new literature searches based on Population, Intervention, Comparator, Outcome (PICO) questions. The appraisal of papers was carried out by an international panel of experts, who also collaborated in a series of consensus discussions, to develop concise structured evidence summaries and action-based recommendations using a modified Oxford system. The full version of the guidance is available online (www.uroweb.org/guidelines/online-guidelines/). The guidance includes algorithms that refer the reader back to the supporting evidence and have greater accessibility in daily clinical practice. Two original meta-analyses were carried out specifically for these guidelines and are included in this report. These new guidelines present an up-to-date summary of the available evidence, together with clear clinical algorithms and action-based recommendations based on the best available evidence. Where high-level evidence is lacking, they present a consensus of expert panel opinion. Copyright © 2012 AEU. Published by Elsevier Espana. All rights reserved.
TU-D-201-05: Validation of Treatment Planning Dose Calculations: Experience Working with MPPG 5.a
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xue, J; Park, J; Kim, L
2016-06-15
Purpose: The newly published medical physics practice guideline (MPPG 5.a) sets minimum requirements for commissioning and QA of treatment planning dose calculations. We present our experience in the validation of a commercial treatment planning system based on MPPG 5.a. Methods: In addition to tests traditionally performed to commission a model-based dose calculation algorithm, extensive tests were carried out at short and extended SSDs, various depths, oblique gantry angles and off-axis conditions to verify the robustness and limitations of a dose calculation algorithm. A comparison between measured and calculated dose was performed based on the validation tests and evaluation criteria recommended by MPPG 5.a. An ion chamber was used for the measurement of dose at points of interest, and diodes were used for photon IMRT/VMAT validations. Dose profiles were measured with a three-dimensional scanning system and calculated in the TPS using a virtual water phantom. Results: Calculated and measured absolute dose profiles were compared at each specified SSD and depth for open fields. Disagreement is easily identifiable with the difference curve. Subtle discrepancies revealed the limitations of the measurements, e.g., a spike in the high dose region and an asymmetrical penumbra observed in the tests with an oblique MLC beam. The excellent results (> 98% pass rate on the 3%/3 mm gamma index) on the end-to-end tests for both IMRT and VMAT are attributed to the quality of the beam data and a good understanding of the modeling. The limitations of the model and the uncertainty of measurement were considered when comparing the results. Conclusion: The extensive tests recommended by the MPPG encourage us to understand the accuracy and limitations of a dose algorithm as well as the uncertainty of measurement. Our experience has shown how the suggested tests can be performed effectively to validate dose calculation models.
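The 3%/3 mm gamma index cited above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A brute-force 2-D sketch of a global gamma pass rate follows; grids, spacing and the search-radius cutoff are placeholders, and production tools use optimized searches rather than this exhaustive loop:

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_pct=3.0, dta_mm=3.0):
    """Brute-force 2-D global gamma pass rate (3%/3 mm by default).

    ref, ev: reference (measured) and evaluated (calculated) dose grids
    with identical shape and pixel spacing (mm).
    """
    dd = dose_pct / 100.0 * ref.max()   # global dose-difference criterion
    ys, xs = np.indices(ref.shape)
    gamma = np.full(ref.shape, np.inf)
    for j in range(ref.shape[0]):
        for i in range(ref.shape[1]):
            dist2 = ((ys - j) ** 2 + (xs - i) ** 2) * spacing_mm ** 2
            keep = dist2 <= (3 * dta_mm) ** 2     # practical search-radius cutoff
            g2 = dist2[keep] / dta_mm ** 2 + (ev[keep] - ref[j, i]) ** 2 / dd ** 2
            gamma[j, i] = np.sqrt(g2.min())
    return 100.0 * (gamma <= 1.0).mean()

rng = np.random.default_rng(0)
dose = rng.random((40, 40)) + 1.0
print(gamma_pass_rate(dose, dose * 1.01, spacing_mm=1.0))
```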
Technical Note: Improving the VMERGE treatment planning algorithm for rotational radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaddy, Melissa R., E-mail: mrgaddy@ncsu.edu; Papp,
2016-07-15
Purpose: The authors revisit the VMERGE treatment planning algorithm by Craft et al. ["Multicriteria VMAT optimization," Med. Phys. 39, 686–696 (2012)] for arc therapy planning and propose two changes to the method that are aimed at improving the achieved trade-off between treatment time and plan quality at little additional planning time cost, while retaining other desirable properties of the original algorithm. Methods: The original VMERGE algorithm first computes an "ideal," high quality but also highly time consuming treatment plan that irradiates the patient from all possible angles in a fine angular grid with a highly modulated beam, and then makes this plan deliverable within practical treatment time by an iterative fluence map merging and sequencing algorithm. We propose two changes to this method. First, we regularize the ideal plan obtained in the first step by adding an explicit constraint on treatment time. Second, we propose a different merging criterion that comprises identifying and merging adjacent maps whose merging results in the least degradation of radiation dose. Results: The effect of both suggested modifications is evaluated individually and jointly on clinical prostate and paraspinal cases. Details of the two cases are reported. Conclusions: In their computational study the authors found that both proposed modifications, especially the regularization, yield noticeably improved treatment plans, for the same treatment times, compared with the original VMERGE method. The resulting plans match the quality of 20-beam step-and-shoot IMRT plans with a delivery time of approximately 2 min.
Control algorithms and applications of the wavefront sensorless adaptive optics
NASA Astrophysics Data System (ADS)
Ma, Liang; Wang, Bin; Zhou, Yuanshen; Yang, Huizhen
2017-10-01
Compared with a conventional adaptive optics (AO) system, a wavefront sensorless (WFSless) AO system does not need to measure and reconstruct the wavefront. It is simpler than conventional AO in system architecture and can be applied under complex conditions. Based on an analysis of the principle and system model of the WFSless AO system, wavefront correction methods for WFSless AO are divided into two categories: model-free and model-based control algorithms. A WFSless AO system based on model-free control algorithms commonly treats the performance metric as a function of the control parameters and then uses a particular control algorithm to improve the metric. The model-based control algorithms include modal control algorithms, nonlinear control algorithms and control algorithms based on geometrical optics. Following a brief description of these typical control algorithms, hybrid methods combining model-free with model-based control algorithms are generalized. Additionally, the characteristics of the various control algorithms are compared and analyzed. We also discuss the extensive applications of WFSless AO systems in free space optical communication (FSO), retinal imaging in the human eye, confocal microscopy, coherent beam combination (CBC) techniques and extended objects.
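Stochastic parallel gradient descent (SPGD) is a representative model-free method of the kind described: all actuators are dithered in parallel and the commands are updated in proportion to the measured change in the performance metric. A minimal sketch with a toy metric; the gain, dither amplitude and the metric itself are invented:

```python
import numpy as np

def spgd(metric, n_act, gain=2.0, perturb=0.05, iters=1000, seed=0):
    """Stochastic parallel gradient descent for a WFSless AO loop.

    metric: callable mapping deformable-mirror commands to a scalar
            performance metric (e.g. focal-spot sharpness) to maximize.
    n_act:  number of mirror actuators.
    """
    rng = np.random.default_rng(seed)
    u = np.zeros(n_act)                                     # actuator commands
    for _ in range(iters):
        du = perturb * rng.choice([-1.0, 1.0], size=n_act)  # Bernoulli dither
        dj = metric(u + du) - metric(u - du)                # two-sided probe
        u += gain * dj * du                                 # parallel update
    return u

# Toy metric: maximized when the commands cancel a fixed aberration.
aberration = np.linspace(-1, 1, 12)
u_final = spgd(lambda u: -np.sum((u + aberration) ** 2), n_act=12)
print(np.round(u_final + aberration, 3))   # should approach zero
```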
A comparison of guidelines for the treatment of schizophrenia.
Milner, Karen K; Valenstein, Marcia
2002-07-01
Although the clinical and administrative rationales for the use of guidelines in the treatment of schizophrenia are convincing, meaningful implementation has been slow. Guideline characteristics themselves influence whether implementation occurs. The authors examine three widely distributed guidelines and one set of algorithms to compare characteristics that are likely to influence implementation, including their degree of scientific rigor, comprehensiveness, and clinical applicability (ease of use, timeliness, specificity, and ease of operationalizing). The three guidelines are the Expert Consensus Guideline Series' "Treatment of Schizophrenia"; the American Psychiatric Association's "Practice Guideline for the Treatment of Patients With Schizophrenia"; and the Schizophrenia Patient Outcomes Research Team (PORT) treatment recommendations. The algorithms are those of the Texas Medication Algorithm Project (TMAP). The authors outline the strengths of each and suggest how a future guideline might build on these strengths.
Fritz, Julie M; Rundell, Sean D; Dougherty, Paul; Gentili, Angela; Kochersberger, Gary; Morone, Natalia E; Naga Raja, Srinivasa; Rodriguez, Eric; Rossi, Michelle I; Shega, Joseph; Sowa, Gwendolyn; Weiner, Debra K
2016-03-01
To present the sixth in a series of articles designed to deconstruct chronic low back pain (CLBP) in older adults. This article focuses on the evaluation and management of lumbar spinal stenosis (LSS), the most common condition for which older adults undergo spinal surgery. The evaluation and treatment algorithm, a table articulating the rationale for the individual algorithm components, and stepped-care drug recommendations were developed using a modified Delphi approach. The Principal Investigator, a five-member content expert panel and a nine-member primary care panel were involved in the iterative development of these materials. The illustrative clinical case was taken from the clinical practice of a contributor's colleague (SR). We present an algorithm and supportive materials to help guide the care of older adults with LSS, a condition that occurs not uncommonly in those with CLBP. The case illustrates the importance of function-focused management and a rational approach to conservative care. Lumbar spinal stenosis exists not uncommonly in older adults with CLBP, and management often can be accomplished without surgery. Treatment should address all conditions contributing to pain and disability, in addition to LSS. © 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
An Algorithm for Neuropathic Pain Management in Older People.
Pickering, Gisèle; Marcoux, Margaux; Chapiro, Sylvie; David, Laurence; Rat, Patrice; Michel, Micheline; Bertrand, Isabelle; Voute, Marion; Wary, Bernard
2016-08-01
Neuropathic pain frequently affects older people, who generally also have several comorbidities. Elderly patients are often poly-medicated, which increases the risk of drug-drug interactions. These patients, especially those with cognitive problems, may also have restricted communication skills, making pain evaluation difficult and pain treatment challenging. Clinicians and other healthcare providers need a decisional algorithm to optimize the recognition and management of neuropathic pain. We present a decisional algorithm developed by a multidisciplinary group of experts, which focuses on pain assessment and therapeutic options for the management of neuropathic pain, particularly in the elderly. The algorithm involves four main steps: (1) detection, (2) evaluation, (3) treatment, and (4) re-evaluation. The detection of neuropathic pain is an essential step in ensuring successful management. The extent of the impact of the neuropathic pain is then assessed, generally with self-report scales, except in patients with communication difficulties who can be assessed using behavioral scales. The management of neuropathic pain frequently requires combination treatments, and recommended treatments should be prescribed with caution in these elderly patients, taking into consideration their comorbidities and potential drug-drug interactions and adverse events. This algorithm can be used in the management of neuropathic pain in the elderly to ensure timely and adequate treatment by a multidisciplinary team.
A Vorticity-preserving Hydrodynamical Scheme for Modeling Accretion Disk Flows
NASA Astrophysics Data System (ADS)
Seligman, Darryl; Laughlin, Gregory
2017-10-01
Vortices, turbulence, and unsteady nonlaminar flows are likely both prominent and dynamically important features of astrophysical disks. Such strongly nonlinear phenomena are often difficult, however, to simulate accurately, and are generally amenable to analytic treatment only in idealized form. In this paper, we explore the evolution of compressible two-dimensional flows using an implicit dual-time hydrodynamical scheme that strictly conserves vorticity (if applied to simulate inviscid flows for which Kelvin’s Circulation Theorem is applicable). The algorithm is based on the work of Lerat et al., who proposed it in the context of terrestrial applications such as the blade-vortex interactions generated by helicopter rotors. We present several tests of Lerat et al.'s vorticity-preserving approach, which we have implemented to second-order accuracy, providing side-by-side comparisons with other algorithms that are frequently used in protostellar disk simulations. The comparison codes include one based on explicit, second-order van Leer advection, one based on spectral methods, and another that implements a higher-order Godunov solver. Our results suggest that the Lerat et al. algorithm will be useful for simulations of astrophysical environments in which vortices play a dynamical role, and where strong shocks are not expected.
Defining the Need for Surgery in Small-Bowel Obstruction.
Kuehn, Florian; Weinrich, Malte; Ehmann, Sarah; Kloker, Katja; Pergolini, Ilaria; Klar, Ernst
2017-07-01
Small-bowel obstruction is a frequent disorder in emergency medicine and represents a major burden for patients and health care systems worldwide. Within the past years, progress has been made regarding the management of small-bowel obstructions, including the use of contrast agent swallow as a tool in the decision-making process. This is a prospective controlled study investigating the central role of contrast agent swallow in the diagnostic and treatment algorithm for small-bowel obstruction at a university department of surgery. Endpoints were the correct identification of patients who needed operative treatment and the accuracy of a conservative treatment decision including the analysis of dropout from this routine algorithm. We performed a single-center analysis of 181 consecutive patients diagnosed with a small-bowel obstruction based on clinical, radiologic, and sonographic findings. Patients with clinical signs of strangulation or peritonitis underwent immediate surgery (group 1). Patients without signs of peritonitis and incomplete stop in the initial abdominal plain film were considered eligible for Gastrografin® challenge (group 2). Seventy-six of the 181 patients (42.0%) underwent immediate surgery. A Gastrografin® challenge was initialized in 105 of the 181 patients (58.0%). Twenty of these 105 patients (19.1%) with persisting or progressive symptoms and absence of contrast agent in the colon after 12 and 24 h subsequently underwent surgery. Here, a segmental bowel resection was necessary in 6 of these 20 patients (30.0%). In 16 out of 20 patients (80.0%) who failed the Gastrografin® challenge, a corresponding correlate in terms of a strangulation was detected intraoperatively. The Gastrografin® challenge had a specificity of 96% and a sensitivity of 100%; accuracy to predict the need for exploration was 96%. A straightforward algorithm based mainly on contrast agent swallow for patients with small-bowel obstructions enabled a timely and very accurate differentiation between patients qualifying for conservative and operative treatment.
Wang, S W; Li, M; Yang, H F; Zhao, Y J; Wang, Y; Liu, Y
2016-04-18
To compare the accuracy of the iterative closest point (ICP) algorithm, the Procrustes analysis (PA) algorithm, and a landmark-dependent method for constructing the mid-sagittal plane (MSP) from cone beam computed tomography (CBCT), and to provide a theoretical basis for establishing the coordinate system of CBCT images and for symmetry analysis. Ten patients were selected and scanned by CBCT before orthodontic treatment. The scan data were imported into Mimics 10.0 to reconstruct three-dimensional skulls, and the MSP of each skull was generated by the ICP algorithm, the PA algorithm and the landmark-dependent method. MSP extraction by the ICP or PA algorithm involved three steps. First, the 3D skull was processed with the reverse engineering software Geomagic Studio 2012 to obtain a mirrored skull. Then, the original and mirrored skulls were registered, either by the ICP algorithm in Geomagic Studio 2012 or by the PA algorithm in NX Imageware 11.0. Finally, the registered data were combined into a new dataset used to calculate the MSP of the original data in Geomagic Studio 2012. In the traditional landmark-dependent method, conducted in the software InVivoDental 5.0, the mid-sagittal plane was determined by sella (S), nasion (N) and basion (Ba). The distances from 9 pairs of symmetric anatomical landmarks to the three sagittal planes were measured, and their absolute values compared. One-way ANOVA was used to analyze the differences among the 3 MSPs, with pairwise comparison performed by the LSD method. MSPs calculated by the three methods were all usable for clinical analysis, as judged from the frontal view. However, there were significant differences among the distances from the 9 pairs of symmetric anatomical landmarks to the MSPs (F=10.932, P=0.001). The LSD test showed no significant difference between the ICP algorithm and the landmark-dependent method (P=0.11), whereas there was a significant difference between the PA algorithm and the landmark-dependent method (P=0.01). The mid-sagittal plane of a 3D skull can be generated based on the ICP or PA algorithm. There was no significant difference between the ICP algorithm and the landmark-dependent method. For subjects with no evident asymmetry, the ICP algorithm is feasible for clinical analysis.
Convergence issues in domain decomposition parallel computation of hovering rotor
NASA Astrophysics Data System (ADS)
Xiao, Zhongyun; Liu, Gang; Mou, Bin; Jiang, Xiong
2018-05-01
The implicit LU-SGS time integration algorithm has been widely used in parallel computation in spite of its lack of information from adjacent domains. When applied to parallel computation of hovering rotor flows in a rotating frame, it gives rise to convergence issues. To remedy the problem, three LU factorization-based implicit schemes (LU-SGS, DP-LUR and HLU-SGS) are investigated comparatively. A test case of pure grid rotation is designed to verify these algorithms; it shows that the LU-SGS algorithm introduces errors on boundary cells. When partition boundaries are circumferential, errors arise in proportion to grid speed, accumulate as the rotation proceeds, and eventually lead to computational failure. Meanwhile, the DP-LUR and HLU-SGS methods show good convergence owing to their boundary treatment, which makes them desirable in domain decomposition parallel computations.
SU-F-J-23: Field-Of-View Expansion in Cone-Beam CT Reconstruction by Use of Prior Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haga, A; Magome, T; Nakano, M
Purpose: Cone-beam CT (CBCT) has become an integral part of online patient setup in image-guided radiation therapy (IGRT). In addition, the utility of CBCT for dose calculation has been actively investigated. However, the limited size of the field-of-view (FOV) and the resulting CBCT image, which lacks the peripheral area of the patient body, limit the reliability of dose calculation. In this study, we aim to develop an FOV-expanded CBCT in an IGRT system to allow dose calculation. Methods: Three lung cancer patients were selected for this study. We collected the cone-beam projection images in the CBCT-based IGRT system (X-ray volume imaging unit, Elekta), where the FOV size of the CBCT provided with these projections was 410 × 410 mm² (normal FOV). Using these projections, a CBCT with a size of 728 × 728 mm² was reconstructed by an a posteriori estimation algorithm including prior image constrained compressed sensing (PICCS). The treatment planning CT was used as the prior image. To assess the effectiveness of the FOV expansion, a dose calculation was performed on the expanded CBCT image with a region-of-interest (ROI) density mapping method, and compared with that of the treatment planning CT as well as that of a CBCT reconstructed by the filtered back projection (FBP) algorithm. Results: The a posteriori estimation algorithm with PICCS clearly visualized the area outside the normal FOV, whereas the FBP algorithm yielded severe streak artifacts outside the normal FOV due to under-sampling. The dose calculation result using the expanded CBCT agreed very well with that using the treatment planning CT; the maximum dose difference was 1.3% for gross tumor volumes. Conclusion: With the a posteriori estimation algorithm, the FOV in CBCT can be expanded. The dose comparison results suggest that the use of expanded CBCTs is acceptable for dose calculation in adaptive radiation therapy. This study has been supported by KAKENHI (15K08691).
ERIC Educational Resources Information Center
Pliszka, Steven R.; Crismon, M. Lynn; Hughes, Carroll W.; Corners, C. Keith; Emslie, Graham J.; Jensen, Peter S.; McCracken, James T.; Swanson, James M.; Lopez, Molly
2006-01-01
Objective: In 1998, the Texas Department of Mental Health and Mental Retardation developed algorithms for medication treatment of attention-deficit/hyperactivity disorder (ADHD). Advances in the psychopharmacology of ADHD and results of a feasibility study of algorithm use in community mental health centers caused the algorithm to be modified and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mabhouti, H; Sanli, E; Cebe, M
Purpose: Brain stereotactic radiosurgery involves the use of precisely directed, single-session radiation to create a desired radiobiologic response within the brain target, with acceptable minimal effects on surrounding structures or tissues. In this study, a dosimetric comparison of TrueBeam 2.0 and CyberKnife M6 treatment plans was made. Methods: For the TrueBeam 2.0 machine, treatment planning was done using a 2-full-arc VMAT technique with a 6 FFF beam on a CT scan of a Rando phantom, simulating stereotactic treatment of one brain metastasis. The dose distribution was calculated using the Eclipse treatment planning system with the Acuros XB algorithm. Treatment planning for the same target was also done for the CyberKnife M6 machine with the MultiPlan treatment planning system using the Monte Carlo algorithm. Using the same film batch, the net OD to dose calibration curve was obtained on both machines by delivering 0–800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. Dose distributions were measured using EBT3 film dosimeters, and the measured and calculated doses were compared. Results: The dose distributions in the target and 2 cm beyond the target edge were calculated on the TPSs and measured using EBT3 film. For the CyberKnife plans, the gamma analysis passing rates between measured and calculated dose distributions were 99.2% and 96.7% for the target and the peripheral region of the target, respectively. For the TrueBeam plans, the gamma analysis passing rates were 99.1% and 95.5%, respectively. Conclusion: Although the target dose distribution was calculated accurately by both the Acuros XB and Monte Carlo algorithms, the Monte Carlo algorithm predicts the dose distribution in the peripheral region of the target more accurately than the Acuros algorithm.
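A net-OD-to-dose calibration of the kind described is typically a smooth curve fit to the measured points. The sketch below fits one common radiochromic-film form, D = a·netOD + b·netOD^n; all numbers are invented placeholders, not the study's calibration data:

```python
import numpy as np

# Hypothetical calibration points: delivered dose (cGy) vs. net optical density.
dose = np.array([0, 100, 200, 400, 600, 800], dtype=float)
net_od = np.array([0.0, 0.11, 0.20, 0.34, 0.45, 0.54])

# Fit D = a*netOD + b*netOD^n with a fixed exponent (n = 2.5 here).
n = 2.5
A = np.column_stack([net_od, net_od ** n])
coeffs, *_ = np.linalg.lstsq(A, dose, rcond=None)

def od_to_dose(od):
    """Convert a measured net OD to dose (cGy) with the fitted curve."""
    return coeffs[0] * od + coeffs[1] * od ** n

print(od_to_dose(0.30))   # dose for a measured net OD of 0.30
```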
Suppes, T; Swann, A C; Dennehy, E B; Habermacher, E D; Mason, M; Crismon, M L; Toprac, M G; Rush, A J; Shon, S P; Altshuler, K Z
2001-06-01
Use of treatment guidelines for treatment of major psychiatric illnesses has increased in recent years. The Texas Medication Algorithm Project (TMAP) was developed to study the feasibility and process of developing and implementing guidelines for bipolar disorder, major depressive disorder, and schizophrenia in the public mental health system of Texas. This article describes the consensus process used to develop the first set of TMAP algorithms for the Bipolar Disorder Module (Phase 1) and the trial testing the feasibility of their implementation in inpatient and outpatient psychiatric settings across Texas (Phase 2). The feasibility trial answered core questions regarding implementation of treatment guidelines for bipolar disorder. A total of 69 patients were treated with the original algorithms for bipolar disorder developed in Phase 1 of TMAP. Results support that physicians accepted the guidelines, followed recommendations to see patients at certain intervals, and utilized sequenced treatment steps differentially over the course of treatment. While improvements in clinical symptoms (24-item Brief Psychiatric Rating Scale) were observed over the course of enrollment in the trial, these conclusions are limited by the fact that physician volunteers were utilized for both treatment and ratings, and there was no control group. Results from Phases 1 and 2 indicate that it is possible to develop and implement a treatment guideline for patients with a history of mania in public mental health clinics in Texas. TMAP Phase 3, a recently completed larger and controlled trial assessing the clinical and economic impact of treatment guidelines and patient and family education in the public mental health system of Texas, improves upon this methodology.
Capkun, Gorana; Lahoz, Raquel; Verdun, Elisabetta; Song, Xue; Chen, Weston; Korn, Jonathan R; Dahlke, Frank; Freitas, Rita; Fraeman, Kathy; Simeone, Jason; Johnson, Barbara H; Nordstrom, Beth
2015-05-01
Administrative claims databases provide a wealth of data for assessing the effect of treatments in clinical practice. Our aim was to propose methodology for real-world studies in multiple sclerosis (MS) using these databases. In three large US administrative claims databases: MarketScan, PharMetrics Plus and Department of Defense (DoD), patients with MS were selected using an algorithm identified in the published literature and refined for accuracy. Algorithms for detecting newly diagnosed ('incident') MS cases were also refined and tested. Methodology based on resource and treatment use was developed to differentiate between relapses with and without hospitalization. When various patient selection criteria were applied to the MarketScan database, an algorithm requiring two MS diagnoses at least 30 days apart was identified as the preferred method of selecting patient cohorts. Attempts to detect incident MS cases were confounded by the limited continuous enrollment of patients in these databases. Relapse detection algorithms identified similar proportions of patients in the MarketScan and PharMetrics Plus databases experiencing relapses with (2% in both databases) and without (15-20%) hospitalization in the 1 year follow-up period, providing findings in the range of those in the published literature. Additional validation of the algorithms proposed here would increase their credibility. The methods suggested in this study offer a good foundation for performing real-world research in MS using administrative claims databases, potentially allowing evidence from different studies to be compared and combined more systematically than in current research practice.
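The preferred selection rule, two MS diagnoses at least 30 days apart, translates directly into a claims-table query. A minimal pandas sketch on an invented extract (column names and dates are hypothetical):

```python
import pandas as pd

# Hypothetical claims extract: one row per MS diagnosis code occurrence.
claims = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3],
    "dx_date": pd.to_datetime(
        ["2013-01-05", "2013-03-01", "2013-02-10", "2013-04-01", "2013-04-15"]),
})

def two_dx_30_days_apart(group):
    """True if the patient has two MS diagnoses at least 30 days apart."""
    d = group["dx_date"].sort_values()
    return (d.max() - d.min()).days >= 30

selected = claims.groupby("patient_id").filter(two_dx_30_days_apart)
print(selected["patient_id"].unique())   # cohort meeting the algorithm
```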
Xiao, Li; Cai, Qin; Li, Zhilin; Zhao, Hongkai; Luo, Ray
2014-11-25
A multi-scale framework is proposed for more realistic molecular dynamics simulations in continuum solvent models by coupling a molecular mechanics treatment of solute with a fluid mechanics treatment of solvent. This article reports our initial efforts to formulate the physical concepts necessary for coupling the two mechanics and develop a 3D numerical algorithm to simulate the solvent fluid via the Navier-Stokes equation. The numerical algorithm was validated with multiple test cases. The validation shows that the algorithm is effective and stable, with observed accuracy consistent with our design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esquivel, C; Patton, L; Walker, S
Purpose: To use Sun Nuclear Quality Reports™ with PlanIQ™ to evaluate different treatment delivery techniques for various treatment sites. Methods: Fifteen random patients with different treatment sites were evaluated, including head/neck, prostate, pelvis, lung, esophagus, axilla, bladder and abdomen. Initially, these sites were planned on the Pinnacle³ V9.6 treatment planning system using nine 6 MV step-and-shoot IMRT fields. The RT plan, dose and structure sets were sent to Quality Reports™, where a DVH was recreated and the plans were compared to a unique Plan Algorithm for each treatment site. Each algorithm has its own plan quality metrics and objectives, which include the PTV coverage, the PTV maximum dose, the prescription dose outside the target, doses to the critical structures, and the global maximum dose and its location. Each plan was scored based on meeting each objective. Plans may have been reoptimized and reevaluated with Quality Reports™ based on the initial score. PlanIQ™ was used to evaluate whether any objective not met was achievable or difficult to obtain. A second plan using VMAT delivery was created for each patient and scored with Quality Reports™. Results: There was a wide range of scores for the different treatment sites, with some scoring better for the IMRT plans and some better for the VMAT deliveries. The variation in the scores could be attributed to the treatment site, and the location and shape of the target. Most VMAT deliveries were chosen due to their short treatment times and quick patient throughput, with acceptable plan scores. Conclusion: These tools allow both physician and dosimetrist to objectively evaluate the use of VMAT delivery versus step-and-shoot IMRT delivery for various sites, and PlanIQ™ validates whether objectives can be met. For the physicist, a concise pass/fail report is created for plan evaluation.
Wu, Guangsheng; Liu, Juan; Wang, Caihua
2017-12-28
Prediction of drug-disease interactions is promising for both drug repositioning and disease treatment. The discovery of novel drug-disease interactions can, on one hand, help to find novel indications for approved drugs, and on the other hand provide new therapeutic approaches for diseases. Recently, computational methods for finding drug-disease interactions have attracted a lot of attention because of their far higher efficiency and lower cost than traditional wet experiment methods. However, they still face several challenges, such as the organization of heterogeneous data, the performance of the model, and so on. In this work, we hierarchically integrate the heterogeneous data into three layers. The drug-drug and disease-disease similarities are first calculated separately in each layer, and then the similarities from the three layers are linearly fused into comprehensive drug similarities and disease similarities, which can be used to measure the similarity between two drug-disease pairs. We construct a novel weighted drug-disease pair network, where a node is a drug-disease pair with a known or unknown treatment relation, and an edge represents the node-node relation, weighted with the similarity score between the two pairs. Since similar drug-disease pairs are expected to show similar treatment patterns, we can find the optimal graph cut of the network; a drug-disease pair with unknown relation can then be considered to share the treatment relation of the pairs within the same cut. We therefore develop a semi-supervised graph cut algorithm, SSGC, to find the optimal graph cut, based on which we can identify potential drug-disease treatment interactions. Compared with three representative network-based methods, SSGC achieves the highest performance, in terms of both AUC score and the identification rate of true drug-disease pairs. Experiments with different integration strategies also demonstrate that considering several sources of data improves the performance of the predictors. In further case studies on four diseases, the top-ranked drug-disease associations were confirmed by the KEGG and CTD databases and the literature, illustrating the usefulness of SSGC. The proposed comprehensive similarity scores from multiple views and layers, together with the graph-cut-based algorithm, can greatly improve the prediction performance for drug-disease associations.
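The linear similarity fusion and pair-similarity weighting described above can be sketched as follows; the layer matrices, the fusion weights and the product rule for pair similarity are hypothetical illustrations, not the paper's exact formulas:

```python
import numpy as np

def fuse(similarity_layers, weights):
    """Linearly fuse per-layer similarity matrices into one matrix."""
    return sum(w * s for w, s in zip(weights, similarity_layers))

def pair_similarity(drug_sim, disease_sim, pair_a, pair_b):
    """Similarity of two drug-disease pairs, here taken as the product of
    the drug-drug and disease-disease similarities (one plausible choice)."""
    (dr_a, di_a), (dr_b, di_b) = pair_a, pair_b
    return drug_sim[dr_a, dr_b] * disease_sim[di_a, di_b]

rng = np.random.default_rng(0)
drug_sim = fuse([rng.random((5, 5)) for _ in range(3)], weights=[0.5, 0.3, 0.2])
disease_sim = fuse([rng.random((4, 4)) for _ in range(3)], weights=[0.4, 0.4, 0.2])

# Edge weight between pair (drug 0, disease 1) and pair (drug 2, disease 3).
print(pair_similarity(drug_sim, disease_sim, (0, 1), (2, 3)))
```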
Current status of 3D EPID-based in vivo dosimetry in The Netherlands Cancer Institute
NASA Astrophysics Data System (ADS)
Mijnheer, B.; Olaciregui-Ruiz, I.; Rozendaal, R.; Spreeuw, H.; van Herk, M.; Mans, A.
2015-01-01
3D in vivo dose verification using a-Si EPIDs is performed routinely in our institution for almost all RT treatments. The EPID-based 3D dose distribution is reconstructed using a back-projection algorithm and compared with the planned dose distribution using 3D gamma evaluation. Dose-reconstruction and gamma-evaluation software runs automatically, and deviations outside the alert criteria are immediately available and investigated, in combination with inspection of cone-beam CT scans. The implementation of our 3D EPID-based in vivo dosimetry approach was able to replace pre-treatment verification for more than 90% of the patient treatments. Clinically relevant deviations could be detected for approximately 1 out of 300 patient treatments (IMRT and VMAT). Most of these errors were patient-related anatomical changes or deviations from the routine clinical procedure, and would not have been detected by pre-treatment verification. Moreover, 3D EPID-based in vivo dose verification is a fast and accurate tool to assure the safe delivery of RT treatments. It provides clinically more useful information and is less time-consuming than pre-treatment verification measurements. Automated 3D in vivo dosimetry is therefore a prerequisite for large-scale implementation of patient-specific quality assurance of RT treatments.
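A 3D gamma evaluation of the kind used to compare reconstructed and planned dose can be sketched as follows. This brute-force global-gamma implementation is illustrative only, not the institution's clinical software; criteria and search radius are typical defaults.

```python
# Brute-force global 3D gamma (default 3%/3 mm) for small dose grids.
import numpy as np

def gamma_3d(ref, ev, spacing, dd=0.03, dta=3.0, search_mm=6.0):
    """ref, ev: 3D dose arrays; spacing: voxel size in mm per axis.
    Edge voxels wrap around via np.roll, acceptable for a sketch."""
    norm = dd * ref.max()                         # global dose criterion
    r = np.ceil(search_mm / np.asarray(spacing)).astype(int)
    gamma = np.full(ref.shape, np.inf)
    for i in range(-r[0], r[0] + 1):
        for j in range(-r[1], r[1] + 1):
            for k in range(-r[2], r[2] + 1):
                shifted = np.roll(ev, (i, j, k), axis=(0, 1, 2))
                dist2 = ((i * spacing[0]) ** 2 + (j * spacing[1]) ** 2 +
                         (k * spacing[2]) ** 2) / dta ** 2
                dose2 = ((shifted - ref) / norm) ** 2
                gamma = np.minimum(gamma, np.sqrt(dose2 + dist2))
    return gamma

rng = np.random.default_rng(0)
planned = rng.random((20, 20, 20)) * 2.0          # Gy, toy dose grid
measured = planned * rng.normal(1.0, 0.01, planned.shape)
g = gamma_3d(planned, measured, spacing=(2.0, 2.0, 2.0))
print("pass rate:", np.mean(g <= 1.0))
```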
NASA Astrophysics Data System (ADS)
Cécillon, Lauric; Quénéa, Katell; Anquetil, Christelle; Barré, Pierre
2015-04-01
Due to its large heterogeneity at all scales (from the soil core to the globe), several measurements are often required to obtain a meaningful value of a soil property, so a large number of measurements may be needed whatever the scale of the study. Moreover, several soil investigation techniques produce large and complex datasets, such as pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), which produces complex three-way data. In this context, straightforward methods designed to speed up data treatment are needed to deal with large datasets. Py-GC-MS is a powerful and frequently used tool to characterize soil organic matter (SOM). However, the treatment of the results of a Py-GC-MS analysis of a soil sample is time-consuming (number of peaks, co-elution, etc.), and the treatment of large sets of Py-GC-MS results is rather laborious. Moreover, peak-position shifts and baseline drifts between analyses make automated treatment of GC-MS data difficult. These problems can be addressed using Parallel Factor Analysis 2 (PARAFAC2; Kiers et al., 1999; Bro et al., 1999). This algorithm has been applied frequently to chromatography data but has never been applied to analyses of SOM. We developed a Matlab routine, based on existing Matlab packages, dedicated to the simultaneous treatment of dozens of pyro-chromatogram mass spectra, and applied it to 40 soil samples. The benefits and expected improvements of our method will be discussed in our poster. References: Kiers et al. (1999) PARAFAC2 - Part I. A direct fitting algorithm for the PARAFAC2 model. Journal of Chemometrics, 13: 275-294. Bro et al. (1999) PARAFAC2 - Part II. Modeling chromatographic data with retention time shifts. Journal of Chemometrics, 13: 295-309.
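A Python analogue of such a routine can be sketched with the tensorly library's PARAFAC2 implementation. The authors used Matlab; the data shapes, rank, and API access patterns below are assumptions and may vary across tensorly versions.

```python
# Hedged sketch: PARAFAC2 on a set of pyro-chromatograms, each
# elution-time x m/z, with elution lengths varying between samples.
import numpy as np
from tensorly.decomposition import parafac2

rng = np.random.default_rng(0)
# 40 soil samples; retention-time axis lengths vary to mimic peak shifts
slices = [rng.random((300 + int(rng.integers(-20, 21)), 200))
          for _ in range(40)]

# rank = number of pseudo-components resolved despite retention shifts
decomp = parafac2(slices, rank=5, n_iter_max=100)
# Parafac2Tensor bundles (weights, factors, projections); exact access
# patterns vary across tensorly versions.
weights, (A, B, C), projections = decomp
print(A.shape, C.shape)   # per-sample loadings, m/z loadings
```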
C-learning: A new classification framework to estimate optimal dynamic treatment regimes.
Zhang, Baqun; Zhang, Min
2017-12-11
A dynamic treatment regime is a sequence of decision rules, each corresponding to a decision point, that determines the next treatment based on each individual's available characteristics and treatment history up to that point. We show that identifying the optimal dynamic treatment regime can be recast as a sequential optimization problem and propose a direct sequential optimization method to estimate the optimal treatment regimes. In particular, at each decision point the optimization is equivalent to sequentially minimizing a weighted expected misclassification error. Based on this classification perspective, we propose a powerful and flexible C-learning algorithm to learn the optimal dynamic treatment regimes backward sequentially from the last stage to the first. C-learning is a direct optimization method that targets the decision rules themselves by exploiting powerful optimization/classification techniques, and it allows the incorporation of patients' characteristics and treatment history to improve performance, hence enjoying the advantages of both traditional outcome-regression-based methods (Q- and A-learning) and the more recent direct optimization methods. The superior performance and flexibility of the proposed methods are illustrated through extensive simulation studies. © 2017, The International Biometric Society.
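The classification perspective can be illustrated with a single-stage toy example: the decision rule is fit by weighted classification, with labels given by the sign of a treatment contrast and weights by its magnitude. In practice the contrast is estimated from outcome regression; here it is a hypothetical known function, so this is a sketch of the idea rather than the authors' estimator.

```python
# Single-stage illustration of learning a treatment rule by minimizing
# a weighted misclassification error; contrast function is hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 2))                  # patient characteristics
# Hypothetical contrast: benefit of treatment 1 over treatment 0
contrast = X[:, 0] - 0.5 * X[:, 1]

labels = (contrast > 0).astype(int)          # which treatment looks best
weights = np.abs(contrast)                   # cost of misclassifying

rule = DecisionTreeClassifier(max_depth=3)
rule.fit(X, labels, sample_weight=weights)   # weighted classification
print(rule.predict(X[:5]))                   # recommended treatments
```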
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicolae, Alexandru; Department of Medical Physics, Odette Cancer Center, Sunnybrook Health Sciences Centre, Toronto, Ontario; Morton, Gerard
Purpose: This work presents the application of a machine learning (ML) algorithm to automatically generate high-quality prostate low-dose-rate (LDR) brachytherapy treatment plans. The ML algorithm can mimic characteristics of preoperative treatment plans deemed clinically acceptable by brachytherapists. The planning efficiency, dosimetry, and quality (as assessed by experts) of preoperative plans generated with the ML planning approach were retrospectively evaluated in this study. Methods and Materials: Preimplantation and postimplantation treatment plans were extracted from 100 high-quality LDR treatments and stored within a training database. The ML training algorithm matches similar features from a new LDR case to those within the training database to rapidly obtain an initial seed distribution; plans were then further fine-tuned using stochastic optimization. Preimplantation treatment plans generated by the ML algorithm were compared with brachytherapist (BT) treatment plans in terms of planning time (Wilcoxon rank sum, α = 0.05) and dosimetry (1-way analysis of variance, α = 0.05). Qualitative preimplantation plan quality was evaluated by expert LDR radiation oncologists using a Likert scale questionnaire. Results: The average planning time for the ML approach was 0.84 ± 0.57 minutes, compared with 17.88 ± 8.76 minutes for the expert planner (P=.020). Preimplantation plans were dosimetrically equivalent to the BT plans; the average prostate V150% was 4% lower for ML plans (P=.002), although the difference was not clinically significant. Respondents ranked the ML-generated plans as equivalent to expert BT treatment plans in terms of target coverage, normal tissue avoidance, implant confidence, and the need for plan modifications. Respondents had difficulty differentiating between plans generated by a human and those generated by the ML algorithm. Conclusions: Prostate LDR preimplantation treatment plans of quality equivalent to those created by brachytherapists can be rapidly generated using ML. The adoption of ML in the brachytherapy workflow is expected to improve LDR treatment plan uniformity while reducing planning time and resources.
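The feature-matching step can be sketched as a nearest-neighbour lookup over a plan database; the features, scaling, and number of neighbours below are illustrative assumptions, not details of the published planner.

```python
# Minimal sketch of retrieving similar historical cases to initialize
# seed placement; features and stored attributes are hypothetical.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# 100 historical cases: [prostate volume (cc), width, height, length (mm)]
features = rng.normal([35, 45, 30, 40], [8, 5, 4, 5], size=(100, 4))
seed_counts = rng.integers(50, 90, size=100)   # stored plan attribute

scaler = StandardScaler().fit(features)
index = NearestNeighbors(n_neighbors=3).fit(scaler.transform(features))

new_case = np.array([[38.0, 46.0, 31.0, 41.0]])
dist, idx = index.kneighbors(scaler.transform(new_case))
# Matched plans' seed distributions would seed stochastic optimization
print(idx[0], seed_counts[idx[0]])
```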
Stanley, Nick; Glide-Hurst, Carri; Kim, Jinkoo; Adams, Jeffrey; Li, Shunshan; Wen, Ning; Chetty, Indrin J.; Zhong, Hualiang
2014-01-01
The quality of adaptive treatment planning depends on the accuracy of its underlying deformable image registration (DIR). The purpose of this study is to evaluate the performance of two DIR algorithms, B-spline-based deformable multipass (DMP) and deformable demons (Demons), implemented in a commercial software package. Evaluations were conducted using both computational and physical deformable phantoms. Based on a finite element method (FEM), a total of 11 computational models were developed from a set of CT images acquired from four lung cancer patients and one prostate cancer patient. FEM-generated displacement vector fields (DVFs) were used to construct the lung and prostate image phantoms. Based on a fast Fourier transform technique, an image noise power spectrum was incorporated into the prostate image phantoms to create simulated CBCT images. The FEM-DVF served as a gold standard for verification of the two registration algorithms performed on these phantoms. The registration algorithms were also evaluated at the homologous points quantified in the CT images of a physical lung phantom. The results indicated that the mean errors of the DMP algorithm were in the range of 1.0–3.1 mm for the computational phantoms and 1.9 mm for the physical lung phantom. For the computational prostate phantoms, the corresponding mean error was 1.0–1.9 mm in the prostate, 1.9–2.4 mm in the rectum, and 1.8–2.1 mm over the entire patient body. Sinusoidal errors induced by B-spline interpolations were observed in all the displacement profiles of the DMP registrations. Regions of large displacements were observed to have more registration errors. Patient-specific FEM models have been developed to evaluate the DIR algorithms implemented in the commercial software package. It has been found that the accuracy of these algorithms is patient-dependent and related to various factors, including tissue deformation magnitudes and image intensity gradients across the regions of interest. This may suggest that DIR algorithms need to be verified for each registration instance when implementing adaptive radiation therapy. PMID:24257278
Bollestad, Marianne; Grude, Nils; Lindbaek, Morten
2015-01-01
Objective. To compare the clinical outcome of patients presenting with symptoms of uncomplicated cystitis who were seen by a doctor, with patients who were given treatment following a diagnostic algorithm. Design. Randomized controlled trial. Setting. Out-of-hours service, Oslo, Norway. Intervention. Women with typical symptoms of uncomplicated cystitis were included in the trial in the time period September 2010–November 2011. They were randomized into two groups. One group received standard treatment according to the diagnostic algorithm, the other group received treatment after a regular consultation by a doctor. Subjects. Women (n = 441) aged 16–55 years. Mean age in both groups 27 years. Main outcome measures. Number of days until symptomatic resolution. Results. No significant differences were found between the groups in the basic patient demographics, severity of symptoms, or percentage of urine samples with single culture growth. A median of three days until symptomatic resolution was found in both groups. By day four 79% in the algorithm group and 72% in the regular consultation group were free of symptoms (p = 0.09). The number of patients who contacted a doctor again in the follow-up period and received alternative antibiotic treatment was insignificantly higher (p = 0.08) after regular consultation than after treatment according to the diagnostic algorithm. There were no cases of severe pyelonephritis or hospital admissions during the follow-up period. Conclusion. Using a diagnostic algorithm is a safe and efficient method for treating women with symptoms of uncomplicated cystitis at an out-of-hours service. This simplification of treatment strategy can lead to a more rational use of consultation time and a stricter adherence to National Antibiotic Guidelines for a common disorder. PMID:25961367
Prakash, Gaurav; Agarwal, Amar; Kumar, Dhivya Ashok; Jacob, Soosan; Agarwal, Athiya; Maity, Amrita
2011-03-01
To evaluate the visual and refractive outcomes and expected benefits of Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking. This prospective, interventional case series comprised 122 eyes (70 patients). Pre- and postoperative assessment included uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), refraction, and higher order aberrations. All patients underwent Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking using the Technolas 217z 100-Hz excimer platform (Technolas Perfect Vision GmbH). Follow-up was performed up to 6 months postoperatively. Theoretical benefit analysis was performed to evaluate the algorithm's outcomes compared to others. Preoperative spherocylindrical power was sphere -3.62 ± 1.60 diopters (D) (range: 0 to -6.75 D), cylinder -1.15 ± 1.00 D (range: 0 to -3.50 D), and spherical equivalent -4.19 ± 1.60 D (range: -7.75 to -2.00 D). At 6 months, 91% (111/122) of eyes were within ± 0.50 D of attempted correction. Postoperative UDVA was comparable to preoperative CDVA at 1 month (P=.47) and progressively improved at 6 months (P=.004). Two eyes lost one line of CDVA at 6 months. Theoretical benefit analysis revealed that of 101 eyes with astigmatism, 29 would have had cyclotorsion-induced astigmatism of ≥ 10% if iris recognition and dynamic rotational eye tracking were not used. Furthermore, the mean percentage decrease in maximum depth of ablation by using the Tissue Saving Treatment was 11.8 ± 2.9% over Aspheric, 17.8 ± 6.2% over Personalized, and 18.2 ± 2.8% over Planoscan algorithms. Tissue saving surface ablation with iris recognition and dynamic rotational eye tracking was safe and effective in this series of eyes. Copyright 2011, SLACK Incorporated.
A detail enhancement and dynamic range adjustment algorithm for high dynamic range images
NASA Astrophysics Data System (ADS)
Xu, Bo; Wang, Huachuang; Liang, Mingtao; Yu, Cong; Hu, Jinlong; Cheng, Hua
2014-08-01
Although high dynamic range (HDR) images contain large amounts of information, they have weak texture and low contrast. Moreover, these images are difficult to reproduce on low-dynamic-range display media. If much more information is to be extracted when these images are displayed on PCs, specific transforms are needed, such as compressing the dynamic range, enhancing the portions of little difference in original contrast, and highlighting texture details while preserving the parts of large contrast. To this end, a multi-scale guided filter enhancement algorithm, derived from the single-scale guided filter and based on the analysis of a non-physical model, is proposed in this paper. The algorithm first decomposes the original HDR image into a base image and detail images of different scales, and then adaptively selects a transform function which acts on the enhanced detail images and original images. Comparing the treatment effects on HDR images and low dynamic range (LDR) images of different scene features shows that this algorithm, while maintaining the hierarchy and texture details of images, not only improves the contrast and enhances the details of images but also adjusts the dynamic range well. It is thus well suited for human observation or machine analysis.
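A minimal sketch of the base/detail decomposition is shown below, using the guided filter from the opencv-contrib package (cv2.ximgproc). The scales, gains, and the log compression of the base layer are illustrative choices, not the paper's tuned parameters or its adaptive transform function.

```python
# Multi-scale guided-filter decomposition with boosted detail layers;
# requires opencv-contrib-python for cv2.ximgproc.
import cv2
import numpy as np

def enhance_hdr(img, radii=(4, 16), eps=1e-3, gains=(2.0, 1.5)):
    base = img.astype(np.float32)
    base = base / (base.max() + 1e-6)     # normalize so eps is meaningful
    details = []
    for r in radii:                       # progressively coarser base layers
        smoothed = cv2.ximgproc.guidedFilter(base, base, r, eps)
        details.append(base - smoothed)   # detail at this scale
        base = smoothed
    out = np.log1p(64.0 * base) / np.log(65.0)   # compress base layer
    for d, g in zip(details, gains):
        out += g * d                              # boost detail layers
    return np.clip(out, 0.0, 1.0)

hdr = np.random.rand(256, 256).astype(np.float32) * 1000.0  # stand-in HDR
ldr = enhance_hdr(hdr)
```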
Treating alcoholism as a chronic disease: approaches to long-term continuing care.
McKay, James R; Hiller-Sturmhofel, Susanne
2011-01-01
For many patients, alcohol and other drug (AOD) use disorders are chronic, recurring conditions involving multiple cycles of treatment, abstinence, and relapse. To disrupt this cycle, treatment can include continuing care to reduce the risk of relapse. The most commonly used treatment approach is initial intensive inpatient or outpatient care based on 12-step principles, followed by continuing care involving self-help groups, 12-step group counseling, or individual therapy. Although these programs can be effective, many patients drop out of initial treatment or do not complete continuing care. Thus, researchers and clinicians have begun to develop alternative approaches to enhance treatment retention in both initial and continuing care. One focus of these efforts has been the design of extended treatment models. These approaches increasingly blur the distinction between initial and continuing care and aim to prolong treatment participation by providing a continuum of care. Other researchers have focused on developing alternative treatment strategies (e.g., telephone-based interventions) that go beyond traditional settings and adaptive treatment algorithms that may improve outcomes for clients who do not respond well to traditional approaches.
Management of anaphylaxis in an austere or operational environment.
Ellis, B Craig; Brown, Simon G A
2014-01-01
We present a case report of a Special Operations Soldier who developed anaphylaxis as a consequence of a bee sting, resulting in compromise of the operation. We review the current literature as it relates to the pathophysiology of the disease process, its diagnosis, and its management. An evidence-based field treatment algorithm is suggested.
Abejuela, Harmony Raylen; Osser, David N
2016-01-01
This revision of previous algorithms for the pharmacotherapy of generalized anxiety disorder was developed by the Psychopharmacology Algorithm Project at the Harvard South Shore Program. Algorithms from 1999 and 2010 and associated references were reevaluated. Newer studies and reviews published from 2008-14 were obtained from PubMed and analyzed with a focus on their potential to justify changes in the recommendations. Exceptions to the main algorithm for special patient populations, such as women of childbearing potential, pregnant women, the elderly, and those with common medical and psychiatric comorbidities, were considered. Selective serotonin reuptake inhibitors (SSRIs) are still the basic first-line medication. Early alternatives include duloxetine, buspirone, hydroxyzine, pregabalin, or bupropion, in that order. If response is inadequate, then the second recommendation is to try a different SSRI. Additional alternatives now include benzodiazepines, venlafaxine, kava, and agomelatine. If the response to the second SSRI is unsatisfactory, then the recommendation is to try a serotonin-norepinephrine reuptake inhibitor (SNRI). Other alternatives to SSRIs and SNRIs for treatment-resistant or treatment-intolerant patients include tricyclic antidepressants, second-generation antipsychotics, and valproate. This revision of the GAD algorithm responds to issues raised by new treatments under development (such as pregabalin) and organizes the evidence systematically for practical clinical application.
Bonafede, Machaon MK; Curtis, Jeffrey R; McMorrow, Donna; Mahajan, Puneet; Chen, Chieh-I
2016-01-01
Objectives After treatment failure with a tumor necrosis factor inhibitor (TNFi), patients with rheumatoid arthritis (RA) can switch to another TNFi (TNFi cyclers) or to a targeted disease-modifying antirheumatic drug (DMARD) with a non-TNFi mechanism of action (non-TNFi switchers). This study compared treatment patterns and treatment effectiveness between TNFi cyclers and non-TNFi switchers in patients with RA. Methods The analysis included a cohort of patients from the Truven Health Analytics MarketScan Commercial database with RA who switched from a TNFi (adalimumab, certolizumab pegol, etanercept, golimumab, or infliximab) either to another TNFi or to a non-TNFi targeted DMARD (abatacept, tocilizumab, or tofacitinib) between January 1, 2010 and September 30, 2014. A claims-based algorithm was used to estimate treatment effectiveness based on six criteria (adherence, no dose increase, no new conventional therapy, no switch to another targeted DMARD, no new/increased oral glucocorticoid, and intra-articular injections on <2 days). Results The cohort included 5,020 TNFi cyclers and 1,925 non-TNFi switchers. Non-TNFi switchers were significantly less likely than TNFi cyclers to switch therapy again within 6 months (13.2% vs 19.5%; P<0.001) or within 12 months (29.7% vs 34.6%; P<0.001) and significantly more likely to be persistent on therapy at 12 months (61.8% vs 58.2%; P<0.001). Non-TNFi switchers were significantly more likely than TNFi cyclers to achieve all six of the claims-based effectiveness algorithm criteria for the 12 months after the initial switch (27% vs 24%; P=0.011). Conclusion Although the absolute differences were small, these results support switching to a non-TNFi targeted DMARD instead of TNFi cycling when patients with RA require another therapy after TNFi failure. PMID:27980429
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Bernardi, E., E-mail: elisabetta.debernardi@unimib.it; Ricotti, R.; Riboldi, M.
2016-02-15
Purpose: An innovative strategy to improve the sensitivity of positron emission tomography (PET)-based treatment verification in ion beam radiotherapy is proposed. Methods: Low-counting-statistics PET images acquired during or shortly after the treatment (Measured PET) and a Monte Carlo estimate of the same PET images derived from the treatment plan (Expected PET) are considered as two frames of a 4D dataset. A 4D maximum likelihood reconstruction strategy was adapted to iteratively estimate the annihilation events distribution in a reference frame and the deformation motion fields that map it in the Expected PET and Measured PET frames. The outputs generated by the proposed strategy are as follows: (1) an estimate of the Measured PET with an image quality comparable to the Expected PET and (2) an estimate of the motion field mapping Expected PET to Measured PET. The details of the algorithm are presented and the strategy is preliminarily tested on analytically simulated datasets. Results: The algorithm demonstrates (1) robustness against noise, even in the worst conditions where 1.5 × 10⁴ true coincidences and a random fraction of 73% are simulated; (2) a proper sensitivity to different kinds and grades of mismatch ranging between 1 and 10 mm; (3) robustness against bias due to incorrect washout modeling in the Monte Carlo simulation up to 1/3 of the original signal amplitude; and (4) an ability to describe the mismatch even in the presence of complex annihilation distributions such as those induced by two perpendicular superimposed ion fields. Conclusions: The promising results obtained in this work suggest the applicability of the method as a quantification tool for PET-based treatment verification in ion beam radiotherapy. An extensive assessment of the proposed strategy on real treatment verification data is planned.
Brain tumor segmentation in MR slices using improved GrowCut algorithm
NASA Astrophysics Data System (ADS)
Ji, Chunhong; Yu, Jinhua; Wang, Yuanyuan; Chen, Liang; Shi, Zhifeng; Mao, Ying
2015-12-01
The detection of brain tumors in MR images is very significant for medical diagnosis and treatment. However, existing methods are mostly based on manual or semiautomatic segmentation, which is impractical when dealing with a large number of MR slices. In this paper, a new fully automatic method for the segmentation of brain tumors in MR slices is presented. Based on the hypothesis of a symmetric brain structure, the method improves the interactive GrowCut algorithm by further using a bounding box algorithm in the pre-processing step. More importantly, local reflectional symmetry is used to make up for the deficiency of the bounding box method. After segmentation, a 3D tumor image is reconstructed. We evaluate the accuracy of the proposed method on MR slices with synthetic tumors and on actual clinical MR images. Results of the proposed method are compared with the actual positions of simulated 3D tumors both qualitatively and quantitatively. In addition, our automatic method produces performance equivalent to manual segmentation and to interactive GrowCut with manual interference, while providing fully automatic segmentation.
2013-01-01
Background A large-scale, highly accurate, machine-understandable drug-disease treatment relationship knowledge base is important for computational approaches to drug repurposing. The large body of published biomedical research articles and clinical case reports available on MEDLINE is a rich source of FDA-approved drug-disease indications as well as drug-repurposing knowledge that is crucial for applying FDA-approved drugs to new diseases. However, much of this information is buried in free text and not captured in any existing databases. The goal of this study is to extract a large number of accurate drug-disease treatment pairs from the published literature. Results In this study, we developed a simple but highly accurate pattern-learning approach to extract treatment-specific drug-disease pairs from 20 million biomedical abstracts available on MEDLINE. We extracted a total of 34,305 unique drug-disease treatment pairs, the majority of which are not included in existing structured databases. Our algorithm achieved a precision of 0.904 and a recall of 0.131 in extracting all pairs, and a precision of 0.904 and a recall of 0.842 in extracting frequent pairs. In addition, we have shown that the extracted pairs strongly correlate with both drug target genes and therapeutic classes, and therefore may have high potential in drug discovery. Conclusions We demonstrated that our simple pattern-learning relationship extraction algorithm is able to accurately extract many drug-disease pairs from the free text of the biomedical literature that are not captured in structured databases. The large-scale, accurate, machine-understandable drug-disease treatment knowledge base resulting from our study, in combination with pairs from structured databases, will have high potential in computational drug repurposing tasks. PMID:23742147
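A toy version of pattern-based pair extraction illustrates the idea; the single hand-written pattern and example sentences below are illustrative stand-ins for the learned pattern set.

```python
# Toy pattern-based extraction of treatment-specific drug-disease pairs.
import re

PATTERN = re.compile(
    r"\b([A-Z][a-z]+(?:in|ol|ide|ine|ab))\b"          # crude drug-name shape
    r" (?:in|for) the treatment of ([a-z][a-z0-9 ]+?)(?=[.,;])",
    re.IGNORECASE)

abstracts = [
    "We evaluated metformin in the treatment of type 2 diabetes.",
    "Rituximab for the treatment of rheumatoid arthritis, among others, "
    "was reviewed.",
]

pairs = set()
for text in abstracts:
    for drug, disease in PATTERN.findall(text):
        pairs.add((drug.lower(), disease.strip().lower()))
print(sorted(pairs))
# [('metformin', 'type 2 diabetes'), ('rituximab', 'rheumatoid arthritis')]
```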
Manegold, Christian
2014-09-01
The availability of antineoplastic monoclonal antibodies, small molecules, and newer cytotoxics such as pemetrexed, the EGFR tyrosine kinase inhibitors erlotinib, gefitinib, and afatinib, the anti-angiogenic bevacizumab, and the ALK inhibitor crizotinib has recently changed the treatment algorithm of advanced non-small cell lung cancer. Decision making in 2014 is characterized by customizing therapy: selecting a specific therapeutic regimen based on the histotype and genotype of the tumour. This applies to first-line induction therapy and maintenance therapy, but also to subsequent lines of therapy, since the antineoplastic drugs and regimens used upfront clinically influence the selection of agents/regimens considered for second- and third-line treatment. Consequently, therapy customization through tumour histology and molecular markers has significantly influenced the work of pathologists around the globe and the process of obtaining an extended, therapeutically relevant tumour diagnosis. Not only has histological subtyping become standard, but molecular information is also considered of increasing importance for treatment selection. Routine molecular testing in certified laboratories must be established, and the diagnostic process should ideally be performed under the guidance of evidence-based recommendations. The process of investigating and implementing medical targeting in lung cancer therefore requires advanced diagnostic techniques and expertise; because of its large scale, it is costly and constrained by limited financial and clinical resources. Copyright © 2014. Published by Elsevier Urban & Partner Sp. z o.o.
Rossini, Zefferino; Milani, Davide; Costa, Francesco; Castellani, Carlotta; Lasio, Giovanni; Fornari, Maurizio
2017-10-01
Chiari malformation type I is a hindbrain abnormality characterized by descent of the cerebellar tonsils beneath the foramen magnum, frequently associated with symptoms of brainstem compression, impaired cerebrospinal fluid circulation, and syringomyelia. Foramen magnum decompression is the most common treatment. Rarely, subdural fluid collection and hydrocephalus occur as postoperative adverse events. The treatment of this complication is still debated, and physicians are sometimes uncertain when to perform diversion surgery and when to opt for more conservative management. We report an unusual occurrence of subdural fluid collection and hydrocephalus that developed in a 23-year-old patient after foramen magnum decompression for Chiari malformation type I. Following a management protocol based on a step-by-step approach, from conservative therapy to diversion surgery, the patient was managed with urgent external ventricular drainage, and then with conservative management and wound revision. Because of the rarity of this adverse event, previous case reports differ on the form of treatment. In future cases, identifying clinical and radiologic risk factors that predict whether a patient will benefit from conservative management or will need diversion surgery is only possible if a uniform form of treatment is used. We therefore believe that a management algorithm based on a step-by-step approach will reduce the use of invasive therapies and help to create a standard of care. Copyright © 2017 Elsevier Inc. All rights reserved.
Chen, Jeon-Hor; Pan, Wei-Fan; Kao, Julian; Lu, Jocelyn; Chen, Li-Kuang; Kuo, Chih-Chen; Chang, Chih-Kai; Chen, Wen-Pin; McLaren, Christine E.; Bahri, Shadfar; Mehta, Rita S.; Su, Min-Ying
2013-01-01
The aim of this study was to evaluate the change of breast density in the normal breast of patients receiving neoadjuvant chemotherapy (NAC). Forty-four breast cancer patients were studied. MRI acquisition was performed before treatment (baseline) and 4 and 12 weeks after treatment. A computer algorithm-based program was used to segment breast tissue and calculate breast volume (BV), fibroglandular tissue volume (FV), and percent density (PD, the ratio of FV over BV × 100%). The reduction of FV and PD after treatment was compared to baseline using paired t-tests with a Bonferroni-Holm correction. The association of density reduction with age was analyzed. FV and PD after NAC showed significant decreases compared to baseline. FV was 110.0 ml (67.2, 189.8) (geometric mean (interquartile range)) at baseline, 104.3 ml (66.6, 164.4) after 4 weeks (p < 0.0001), and 94.7 ml (60.2, 144.4) after 12 weeks (comparison to baseline, p < 0.0001; comparison to 4 weeks, p = 0.016). PD was 11.2% (6.4, 22.4) at baseline, 10.6% (6.6, 20.3) after 4 weeks (p < 0.0001), and 9.7% (6.2, 17.9) after 12 weeks (comparison to baseline, p = 0.0001; comparison to 4 weeks, p = 0.018). Younger patients tended to show a higher density reduction, but overall the correlation with age was only moderate (r = 0.28 for FV, p = 0.07, and r = 0.52 for PD, p = 0.0003). Our study showed that breast density measured from MR images acquired at 3T can be accurately quantified using a robust computer-aided algorithm based on nonparametric nonuniformity normalization (N3) and an adaptive fuzzy C-means algorithm. Similar to doxorubicin and cyclophosphamide regimens, the taxane-based NAC regimen also caused density atrophy in the normal breast and showed reductions in FV and PD. The effect of breast density reduction was age-related and duration-related. PMID:23940080
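The statistical comparison can be reproduced in outline as follows, with simulated volumes standing in for the study data; paired t-tests on log-transformed volumes approximate the geometric-mean comparison, followed by a Holm correction.

```python
# Paired t-tests of post-treatment fibroglandular volume vs. baseline,
# with Bonferroni-Holm correction; data are simulated, not the study's.
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
baseline = rng.lognormal(mean=4.7, sigma=0.5, size=44)   # FV at baseline
week4 = baseline * rng.normal(0.95, 0.03, size=44)       # ~5% reduction
week12 = baseline * rng.normal(0.86, 0.05, size=44)      # ~14% reduction

# Log-transform so the paired t-test compares geometric means
pvals = [ttest_rel(np.log(week4), np.log(baseline)).pvalue,
         ttest_rel(np.log(week12), np.log(baseline)).pvalue,
         ttest_rel(np.log(week12), np.log(week4)).pvalue]

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")
print(p_adj, reject)
```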
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rong, Y; Rao, S; Daly, M
Purpose: Adaptive radiotherapy requires complete new sets of regions of interest (ROIs) to be delineated on the mid-treatment CT images. This work aims at evaluating the accuracy of the RayStation hybrid deformable image registration (DIR) algorithm, in terms of its overall integrity and accuracy in contour propagation for adaptive planning. Methods: The hybrid DIR combines an intensity-based algorithm with anatomical information provided by contours. Patients who received mid-treatment CT scans were identified for the study, including six lung patients (two mid-treatment CTs) and six head-and-neck (HN) patients (one mid-treatment CT). DIR-propagated ROIs were compared with physician-drawn ROIs for 8 ITVs and 7 critical organs (lungs, heart, esophagus, etc.) for the lung patients, as well as 14 GTVs and 20 critical organs (mandible, eyes, parotids, etc.) for the HN patients. Volume difference, center-of-mass (COM) difference, and the Dice index were used for evaluation. The clinical relevance of each propagated ROI was scored by two physicians and correlated with the Dice index. Results: For critical organs, good agreement (Dice > 0.9) was seen in all 7 for lung patients and in 13 out of 20 for HN patients, with the rest requiring minimal edits. For targets, COM differences were within 5 mm on average for all patients. For lung, 5 out of 8 ITVs required minimal edits (Dice 0.8–0.9), with the remaining 2 needing to be redrawn because of their small volumes (<10 cc). However, the propagated HN GTVs showed relatively low Dice values (0.5–0.8) owing to their small volumes (3–40 cc) and high variability; 2 of them required redrawing because new nodal targets were identified on the mid-treatment CT scans. Conclusion: The hybrid DIR algorithm was found to be clinically useful and efficient for lung and HN patients, especially for propagated critical-organ ROIs. It has the potential to significantly improve the workflow in adaptive planning.
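Two of the evaluation measures, the Dice index and the COM difference, are simple to state in code; the sketch below assumes binary masks on a voxel grid with known spacing, with toy masks standing in for real contours.

```python
# Dice index and center-of-mass distance for binary ROI masks.
import numpy as np
from scipy.ndimage import center_of_mass

def dice(a, b):
    """Dice index of two boolean masks."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def com_diff_mm(a, b, spacing):
    """Center-of-mass distance between two masks, in mm."""
    ca = np.array(center_of_mass(a)) * spacing
    cb = np.array(center_of_mass(b)) * spacing
    return np.linalg.norm(ca - cb)

spacing = np.array([2.5, 1.0, 1.0])        # slice, row, col spacing in mm
drawn = np.zeros((40, 64, 64), bool)
drawn[10:20, 20:40, 20:40] = True          # physician-drawn toy ROI
propagated = np.roll(drawn, (1, 2, 0), axis=(0, 1, 2))
print(dice(drawn, propagated), com_diff_mm(drawn, propagated, spacing))
```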
Management of tibial non-unions according to a novel treatment algorithm.
Ferreira, Nando; Marais, Leonard Charles
2015-12-01
Tibial non-unions represent a spectrum of conditions that are challenging to treat. The optimal management remains unclear despite the frequency with which these diagnoses are encountered. The aim of this study was to determine the outcome of tibial non-unions managed according to a novel tibial non-union treatment algorithm. One hundred and eighteen consecutive patients with 122 uninfected tibial non-unions were treated according to our proposed tibial non-union treatment algorithm. All patients were followed-up clinically and radiologically for a minimum of six months after external fixator removal. Four patients were excluded because they did not complete the intended treatment process. The final study population consisted of 94 men and 24 women with a mean age of 34 years. Sixty-seven non-unions were stiff hypertrophic, 32 mobile atrophic, 16 mobile oligotrophic and one true pseudoarthrosis. Six non-unions were classified as type B1 defect non-unions. Bony union was achieved after the initial surgery in 113/122 (92.6%) tibias. Nine patients had failure of treatment. Seven persistent non-unions were successfully retreated according to the tibial non-union treatment algorithm. This resulted in final bony union in 120/122 (98.3%) tibias. The proposed tibial non-union treatment algorithm appears to produce high union rates across a diverse group of tibial non-unions. Tibial non-unions however, remain difficult to treat and should be referred to specialist units where advanced reconstructive techniques are practiced on a regular basis. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Naijin
2013-03-01
Level Based Partitioning (LBP), Cluster Based Partitioning (CBP), and Enhanced Static List (ESL) temporal partitioning algorithms, based on the adjacency matrix and the adjacency table, are designed and implemented in this paper. The partitioning time and memory occupation of the three algorithms are also compared. Experimental results show that the LBP algorithm has the shortest partitioning time and better parallelism; with respect to memory occupation and partitioning time, the algorithms based on the adjacency table require less partitioning time and less memory.
Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering
NASA Astrophysics Data System (ADS)
Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech
2015-03-01
We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, tracking its progression over time, and testing the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis, or through dimensionality reduction algorithms followed by a classification algorithm, e.g., a Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection in single images, and a combination of PCA and VMF. LE combined with VMF performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than as individual images.
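The LE step can be sketched with scikit-learn's SpectralEmbedding as the Laplacian Eigenmaps implementation; the per-pixel features, neighbourhood size, and synthetic anomaly below are illustrative assumptions, not the paper's setup.

```python
# Laplacian Eigenmaps on per-pixel features, producing "eigenimages"
# whose anomaly-accentuating components can feed matched filtering.
import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(4)
img = rng.random((32, 32))
img[12:16, 12:16] += 1.5                     # synthetic bright anomaly

# Features per pixel: intensity plus (scaled) spatial coordinates
rows, cols = np.mgrid[0:32, 0:32]
X = np.column_stack([img.ravel(), rows.ravel() / 32, cols.ravel() / 32])

le = SpectralEmbedding(n_components=4, n_neighbors=10, random_state=0)
eigenimages = le.fit_transform(X).T.reshape(4, 32, 32)
# Next step: pick components that accentuate the anomaly, then apply VMF
```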
Continuous Glucose Monitoring Enables the Detection of Losses in Infusion Set Actuation (LISAs)
Howsmon, Daniel P.; Cameron, Faye; Baysal, Nihat; Ly, Trang T.; Forlenza, Gregory P.; Maahs, David M.; Buckingham, Bruce A.; Hahn, Juergen; Bequette, B. Wayne
2017-01-01
Reliable continuous glucose monitoring (CGM) enables a variety of advanced technology for the treatment of type 1 diabetes. In addition to artificial pancreas algorithms that use CGM to automate continuous subcutaneous insulin infusion (CSII), CGM can also inform fault detection algorithms that alert patients to problems in CGM or CSII. Losses in infusion set actuation (LISAs) can adversely affect clinical outcomes, resulting in hyperglycemia due to impaired insulin delivery. Prolonged hyperglycemia may lead to diabetic ketoacidosis—a serious metabolic complication in type 1 diabetes. Therefore, an algorithm for the detection of LISAs based on CGM and CSII signals was developed to improve patient safety. The LISA detection algorithm is trained retrospectively on data from 62 infusion set insertions from 20 patients. The algorithm collects glucose and insulin data, and computes relevant fault metrics over two different sliding windows; an alarm sounds when these fault metrics are exceeded. With the chosen algorithm parameters, the LISA detection strategy achieved a sensitivity of 71.8% and issued 0.28 false positives per day on the training data. Validation on two independent data sets confirmed that similar performance is seen on data that was not used for training. The developed algorithm is able to effectively alert patients to possible infusion set failures in open-loop scenarios, with limited evidence of its extension to closed-loop scenarios. PMID:28098839
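A much-simplified stand-in for such a detector is sketched below; the window lengths, fault metrics, and thresholds are illustrative, not the trained values from the paper.

```python
# Toy LISA-style detector: fault metrics over two sliding windows of
# CGM and insulin data, alarming when both are exceeded.
import numpy as np

def lisa_alarm(glucose, insulin, t,
               short_win=12, long_win=36,        # samples (5-min CGM)
               rise_thresh=60.0, insulin_thresh=1.5):
    """Alarm if glucose keeps rising over the short window while
    meaningful insulin was delivered over the long window."""
    if t < long_win:
        return False
    rise = glucose[t] - glucose[t - short_win]        # mg/dL over ~1 h
    delivered = np.sum(insulin[t - long_win:t])       # units over ~3 h
    return rise > rise_thresh and delivered > insulin_thresh

rng = np.random.default_rng(5)
cgm = np.cumsum(rng.normal(1.5, 3.0, size=100)) + 120  # drifting upward
basal = np.full(100, 0.08)                             # units per 5 min
alarms = [t for t in range(100) if lisa_alarm(cgm, basal, t)]
print(alarms[:5])
```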
NASA Astrophysics Data System (ADS)
Moliner, L.; Correcher, C.; González, A. J.; Conde, P.; Hernández, L.; Orero, A.; Rodríguez-Álvarez, M. J.; Sánchez, F.; Soriano, A.; Vidal, L. F.; Benlloch, J. M.
2013-02-01
In this work we present an innovative algorithm for the reconstruction of PET images based on the List-Mode (LM) technique, which improves their spatial resolution compared with results obtained with current MLEM algorithms. This study is part of a large project aiming to improve diagnosis in early Alzheimer's disease stages by means of a newly developed hybrid PET-MR insert. At present, Alzheimer's is the most relevant neurodegenerative disease, and the best way to apply an effective treatment is early diagnosis. The PET device will consist of several monolithic LYSO crystals coupled to SiPM detectors. Monolithic crystals can reduce scanner costs, with the advantage of enabling the implementation of very small virtual pixels in their geometry. This is especially useful for LM reconstruction algorithms, since they do not need a pre-calculated system matrix. We have developed an LM algorithm which has been initially tested with a large-aperture (186 mm) breast PET system. Instead of using the common lines of response, the algorithm incorporates a novel calculation of tubes of response. The new approach improves the volumetric spatial resolution by about a factor of 2 at the border of the field of view when compared with the traditionally used MLEM algorithm. Moreover, it has also been shown to decrease the image noise, thus increasing the image quality.
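For contrast, the baseline MLEM update that the LM algorithm is compared against can be written compactly; the dense toy system matrix below stands in for the scanner's tube-of-response model, and the sizes are illustrative.

```python
# Generic MLEM iteration for emission tomography on a toy problem.
import numpy as np

def mlem(A, counts, n_iter=20):
    """A: (n_lors, n_voxels) system matrix; counts: measured LOR counts."""
    x = np.ones(A.shape[1])              # uniform initial image
    sens = A.sum(axis=0) + 1e-12         # per-voxel sensitivity
    for _ in range(n_iter):
        proj = A @ x + 1e-12             # forward projection
        x *= (A.T @ (counts / proj)) / sens
    return x

rng = np.random.default_rng(6)
A = rng.random((500, 100)) * (rng.random((500, 100)) < 0.1)  # sparse-ish
truth = rng.random(100)
counts = rng.poisson(A @ truth * 50).astype(float)
img = mlem(A, counts)
```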
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Youngblood, John N.; Saha, Aindam
1987-01-01
Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in the development of a specialized computer architecture for the algorithmic execution of an avionics guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
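The critical-path computation underlying such an allocation can be sketched as a longest-path pass over the task DAG; the task durations and dependencies below are illustrative, not the paper's task graph.

```python
# Critical path (longest path) of a task DAG via a topological sweep.
from collections import defaultdict

def critical_path(duration, edges):
    """duration: {task: time}; edges: list of (pred, succ) pairs."""
    succs, indeg = defaultdict(list), defaultdict(int)
    for u, v in edges:
        succs[u].append(v)
        indeg[v] += 1
    order = [t for t in duration if indeg[t] == 0]
    finish = {t: duration[t] for t in order}
    for u in order:                       # Kahn-style topological sweep
        for v in succs[u]:
            finish[v] = max(finish.get(v, 0), finish[u] + duration[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                order.append(v)
    return max(finish.values())           # schedule-length lower bound

tasks = {"integrate": 4, "estimate": 3, "guidance": 5, "output": 1}
deps = [("integrate", "estimate"), ("estimate", "guidance"),
        ("guidance", "output")]
print(critical_path(tasks, deps))         # 13
```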
D'Amours, Michel; Pouliot, Jean; Dagnault, Anne; Verhaegen, Frank; Beaulieu, Luc
2011-12-01
Brachytherapy planning software relies on the Task Group report 43 dosimetry formalism. This formalism, based on a water approximation, neglects the various heterogeneous materials present during treatment. Various studies have suggested that these heterogeneities should be taken into account to improve treatment quality. The present study sought to demonstrate the feasibility of incorporating Monte Carlo (MC) dosimetry within an inverse planning algorithm to improve dose conformity and increase treatment quality. The method is based on dose kernels precalculated in full patient geometries, each representing the dose distribution of a brachytherapy source at a single dwell position, obtained using MC simulations and the Geant4 toolkit. These dose kernels are used by the inverse planning by simulated annealing tool to produce a fast MC-based plan. A test was performed for an interstitial brachytherapy breast treatment using two different high-dose-rate brachytherapy sources: the microSelectron iridium-192 source and the electronic brachytherapy source Axxent operating at 50 kVp. A research version of the inverse planning by simulated annealing algorithm was combined with MC to provide a method that fully accounts for the heterogeneities in dose optimization. The effect of the water approximation was found to depend on photon energy, with greater dose attenuation for the lower energies of the Axxent source compared with iridium-192. For the latter, an underdosage of 5.1% for the dose received by 90% of the clinical target volume was found. A new method to optimize afterloading brachytherapy plans that uses MC dosimetric information was developed. Including computed tomography-based information in MC dosimetry in the inverse planning process was shown to account for the full range of scatter and heterogeneity conditions. This led to significant dose differences compared with the Task Group report 43 approach for the Axxent source. Copyright © 2011 Elsevier Inc. All rights reserved.
The acute management of haemorrhoids
Cohen, CRG
2014-01-01
Introduction Although the acute thrombosis and strangulation of haemorrhoids is a common condition, there is no consensus as to its most effective treatment. Methods A PubMed search was undertaken for papers describing the aetiology and treatment of the acute complications of haemorrhoids. Results The anatomy and treatments for strangulated internal haemorrhoids and thrombosed perianal varices are discussed. Studies of the effectiveness and complications of conservative and operative treatments are reviewed. Conclusions Ambiguities exist in the terminology used to describe the two separate pathologies that make up the acute complications of haemorrhoids. These complications have traditionally been treated conservatively. There is evidence that early operative intervention for strangulated internal haemorrhoids is safe and effective. A suggested algorithm for treatment is given, based on the published literature. PMID:25245728
Treatment of Pediatric Bipolar Disorder: A Review
Washburn, Jason J.; West, Amy E.; Heil, Jennifer A.
2011-01-01
Aim To review the diagnosis of pediatric bipolar disorder (PBD) and its pharmacologic and psychosocial interventions. Methods A comprehensive literature review of studies discussing the diagnosis and treatment of PBD was conducted. Results A context for understanding controversies and difficulties in the diagnosis of PBD is provided. An evidence-based assessment protocol for PBD is reviewed. The evidence for the following three categories of pharmacologic interventions is reviewed: lithium, antiepileptics, and second-generation antipsychotics. Algorithms for medication decisions are briefly reviewed. Existing psychosocial treatments and the evidence for those treatments are also reviewed. Conclusion Despite recent developments in understanding the phenomenology of PBD and in identifying pharmacologic and psychosocial interventions, critical gaps remain. PMID:21822352
SU-F-J-194: Development of Dose-Based Image Guided Proton Therapy Workflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pham, R; Sun, B; Zhao, T
Purpose: To implement image-guided proton therapy (IGPT) based on daily proton dose distributions. Methods: Unlike in x-ray therapy, simple alignment based on anatomy cannot ensure proper dose coverage in proton therapy. Anatomy changes along the beam path may lead to underdosing the target or overdosing the organs-at-risk (OARs). With an in-room mobile computed tomography (CT) system, we are developing a dose-based IGPT software tool that allows patient positioning and treatment adaptation based on daily dose distributions. During an IGPT treatment, daily CT images are acquired in treatment position. After initial positioning based on rigid image registration, the proton dose distribution is calculated on the daily CT images. The target and OARs are automatically delineated via deformable image registration. Dose distributions are evaluated to decide whether repositioning or plan adaptation is necessary in order to achieve proper coverage of the target and sparing of OARs. Besides online dose-based image guidance, the software tool can also map daily treatment doses onto the treatment planning CT images for offline adaptive treatment. Results: An in-room helical CT system was commissioned for IGPT purposes. It produces accurate CT numbers that allow proton dose calculation. GPU-based deformable image registration algorithms were developed and evaluated for automatic ROI delineation and dose mapping. The online and offline IGPT functionalities were evaluated with daily CT images of proton patients. Conclusion: The online and offline IGPT software tool may improve the safety and quality of proton treatment by allowing dose-based IGPT and adaptive proton treatments. Research is partially supported by Mevion Medical Systems.
Evaluation of Demons- and FEM-Based Registration Algorithms for Lung Cancer.
Yang, Juan; Li, Dengwang; Yin, Yong; Zhao, Fen; Wang, Hongjun
2016-04-01
We evaluated and compared the accuracy of 2 deformable image registration algorithms in 4-dimensional computed tomography images for patients with lung cancer. Ten patients with non-small cell lung cancer or small cell lung cancer were enrolled in this institutional review board-approved study. The displacement vector fields relative to a specific reference image were calculated by using the diffeomorphic demons (DD) algorithm and the finite element method (FEM)-based algorithm. The registration accuracy was evaluated by using normalized mutual information (NMI), the sum of squared intensity differences (SSD), modified Hausdorff distance (dH_M), and the ratio of gross tumor volume (rGTV) difference between the reference image and the deformed phase image. We also compared the registration speed of the 2 algorithms. For all patients, the FEM-based algorithm showed a stronger ability to align the 2 images than the DD algorithm. The means (±standard deviation) of NMI were 0.86 (±0.05) and 0.90 (±0.05) using the DD algorithm and the FEM-based algorithm, respectively. The means of SSD were 0.006 (±0.003) and 0.003 (±0.002) using the DD algorithm and the FEM-based algorithm, respectively. The means of dH_M were 0.04 (±0.02) and 0.03 (±0.03) using the DD algorithm and the FEM-based algorithm, respectively. The means of rGTV were 3.9% (±1.01%) and 2.9% (±1.1%) using the DD algorithm and the FEM-based algorithm, respectively. However, the FEM-based algorithm takes longer than the DD algorithm, with an average running time of 31.4 minutes compared to 21.9 minutes over all patients. The preliminary results showed that the FEM-based algorithm was more accurate than the DD algorithm at the cost of registration speed. © The Author(s) 2015.
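One of the agreement measures, NMI, can be computed from a joint intensity histogram as follows; this sketch uses Studholme's normalization, which may differ from the paper's exact formulation.

```python
# Normalized mutual information of two images from a joint histogram.
import numpy as np

def nmi(a, b, bins=32):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    hxy = -np.sum(pxy[nz] * np.log(pxy[nz]))
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return (hx + hy) / hxy               # Studholme's NMI, in [1, 2]

rng = np.random.default_rng(7)
ref = rng.random((64, 64))
warped = ref + rng.normal(0, 0.05, ref.shape)   # imperfect registration
print(nmi(ref, warped))
```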
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Y; Liu, B; Liang, B
Purpose: The current CyberKnife treatment planning system (TPS) provides two dose calculation algorithms: Ray-tracing and Monte Carlo. The Ray-tracing algorithm is fast but less accurate, and it cannot handle the irregular fields produced by the multi-leaf collimator system recently introduced with the CyberKnife M6 system. The Monte Carlo method has well-known accuracy, but the current version still takes a long time to finish dose calculations. The purpose of this paper is to develop a GPU-based fast convolution/superposition (C/S) dose engine for the CyberKnife system that achieves both accuracy and efficiency. Methods: The TERMA distribution from a poly-energetic source was calculated in the beam's-eye-view coordinate system, which is GPU friendly and has linear complexity. The dose distribution was then computed by inversely collecting the energy depositions from all TERMA points along 192 collapsed-cone directions. The EGSnrc user code was used to pre-calculate energy deposition kernels (EDKs) for a series of mono-energetic photons. The energy spectrum was reconstructed from the measured tissue maximum ratio (TMR) curve, and the TERMA-averaged cumulative kernels were then calculated. Beam hardening parameters and intensity profiles were optimized based on measurement data from the CyberKnife system. Results: The difference between measured and calculated TMR is less than 1% for all collimators except in the build-up regions. The calculated profiles also showed agreement with the measured doses within 1% except in the penumbra regions. The developed C/S dose engine was also used to evaluate four clinical CyberKnife treatment plans; for heterogeneous cases, it showed better dose calculation accuracy than the Ray-tracing algorithm when benchmarked against the Monte Carlo method. The dose calculation time is a few seconds per beam, depending on collimator size and dose calculation grid. Conclusion: A GPU-based C/S dose engine has been developed for the CyberKnife system; it is efficient and accurate for clinical purposes and can be easily implemented in the TPS.
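The TERMA-then-superposition structure of a C/S engine can be illustrated in one dimension: attenuate the primary fluence to get TERMA along the ray, then spread the released energy with a pre-tabulated deposition kernel. This toy mono-energetic, homogeneous-water sketch only mirrors the structure of the poly-energetic, 192-direction collapsed-cone engine described above; every constant in it is illustrative.

```python
import numpy as np

def terma_1d(depths_cm, mu=0.05, psi0=1.0):
    """TERMA along a ray for a mono-energetic beam: T(z) = mu * Psi0 * exp(-mu z).
    `mu` is a placeholder linear attenuation coefficient (1/cm)."""
    return mu * psi0 * np.exp(-mu * depths_cm)

def dose_1d(terma, kernel):
    """Superposition step: dose = TERMA convolved with an energy deposition
    kernel (EDK). The 1-D kernel stands in for the 192 collapsed-cone
    directions of the full 3-D engine."""
    return np.convolve(terma, kernel)[:len(terma)]

depths = np.arange(0.0, 30.0, 0.1)       # 0-30 cm depth, 1 mm grid
kernel = np.exp(-0.2 * np.arange(50))    # toy forward-peaked EDK
kernel /= kernel.sum()                   # normalize to unit energy
dose = dose_1d(terma_1d(depths), kernel)
```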
Motion prediction in MRI-guided radiotherapy based on interleaved orthogonal cine-MRI
NASA Astrophysics Data System (ADS)
Seregni, M.; Paganelli, C.; Lee, D.; Greer, P. B.; Baroni, G.; Keall, P. J.; Riboldi, M.
2016-01-01
In-room cine-MRI guidance can provide non-invasive target localization during radiotherapy treatment. However, in order to cope with finite imaging frequency and system latencies between target localization and dose delivery, tumour motion prediction is required. This work proposes a framework for motion prediction dedicated to cine-MRI guidance, aiming at quantifying the geometric uncertainties introduced by this process for both tumour tracking and beam gating. The tumour position, identified through scale-invariant features detected in cine-MRI slices, is estimated at high frequency (25 Hz) using three independent predictors, one for each anatomical coordinate. Linear extrapolation, auto-regressive and support vector machine algorithms are compared against systems that use no prediction or surrogate-based motion estimation. Geometric uncertainties are reported as a function of image acquisition period and system latency. Average results show that the tracking error RMS can be reduced to the [0.2; 1.2] mm range for acquisition periods between 250 and 750 ms and system latencies between 50 and 300 ms. Except for the linear extrapolator, tracking and gating prediction errors were, on average, lower than those measured for surrogate-based motion estimation. This finding suggests that cine-MRI guidance, combined with appropriate prediction algorithms, could substantially decrease geometric uncertainties in motion-compensated treatments.
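Two of the compared predictor families are simple enough to sketch per anatomical coordinate: linear extrapolation from the last two samples, and a least-squares autoregressive model iterated across the latency. The order, history length, and function names below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def linear_extrapolation(t, x, latency):
    """Predict the position `latency` seconds ahead from the last two samples."""
    velocity = (x[-1] - x[-2]) / (t[-1] - t[-2])
    return x[-1] + velocity * latency

def ar_predict(x, order, steps_ahead):
    """Fit AR coefficients by least squares, then iterate the model
    `steps_ahead` sampling periods into the future."""
    x = np.asarray(x, dtype=float)
    windows = np.array([x[i:i + order] for i in range(len(x) - order)])
    coeffs, *_ = np.linalg.lstsq(windows, x[order:], rcond=None)
    recent = list(x[-order:])
    for _ in range(steps_ahead):
        recent.append(float(np.dot(coeffs, recent[-order:])))
    return recent[-1]
```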
Development of an algorithm to plan and simulate a new interventional procedure.
Fujita, Buntaro; Kütting, Maximilian; Scholtz, Smita; Utzenrath, Marc; Hakim-Meibodi, Kavous; Paluszkiewicz, Lech; Schmitz, Christoph; Börgermann, Jochen; Gummert, Jan; Steinseifer, Ulrich; Ensminger, Stephan
2015-07-01
The number of implanted biological valves for the treatment of valvular heart disease is growing, and a percentage of these patients will eventually undergo a transcatheter valve-in-valve (ViV) procedure. Some of these patients will represent challenging cases. The aim of this study was to develop a feasible algorithm to plan and simulate in vitro a new interventional procedure to improve patient outcome. In addition to the standard diagnostic routine, our algorithm includes 3D printing of the annulus, hydrodynamic measurements and high-speed analysis of leaflet kinematics after simulation of the procedure in different prosthesis positions, as well as X-ray imaging of the most suitable valve position to create a 'blueprint' for the patient procedure. This algorithm was developed for a patient with a degenerated Perceval aortic sutureless prosthesis requiring a ViV procedure. Different ViV procedures were assessed in the algorithm, and based on these results the best option for the patient was chosen. The actual procedure went exactly as planned with the help of this algorithm. Here, we have developed a new, technically feasible algorithm that simulates important aspects of a novel interventional procedure prior to the actual procedure. This algorithm can be applied to virtually all patients requiring a novel interventional procedure to help identify risks and find optimal parameters for prosthesis selection and placement in order to maximize safety for the patient. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
Planning and delivery of four-dimensional radiation therapy with multileaf collimators
NASA Astrophysics Data System (ADS)
McMahon, Ryan L.
This study is an investigation of the application of multileaf collimators (MLCs) to the treatment of moving anatomy with external beam radiation therapy. First, a method for delivering intensity modulated radiation therapy (IMRT) to moving tumors is presented. This method uses an MLC control algorithm that calculates appropriate MLC leaf speeds in response to feedback from real-time imaging. The algorithm does not require a priori knowledge of a tumor's motion, and is based on the concept of self-correcting DMLC leaf trajectories. This gives the algorithm the distinct advantage of allowing for correction of DMLC delivery errors without interrupting delivery. The algorithm is first tested for the case of one-dimensional (1D) rigid tumor motion in the beam's eye view (BEV). For this type of motion, it is shown that the real-time tracking algorithm results in more accurate deliveries, with respect to delivered intensity, than those which ignore motion altogether. This is followed by an appropriate extension of the algorithm to two-dimensional (2D) rigid motion in the BEV. For this type of motion, it is shown that the 2D real-time tracking algorithm results in improved accuracy (in the delivered intensity) in comparison to deliveries which ignore tumor motion or only account for tumor motion which is aligned with MLC leaf travel. Finally, a method is presented for designing DMLC leaf trajectories which deliver a specified intensity over a moving tumor without overexposing critical structures which exhibit motion patterns that differ from that of the tumor. In addition to avoiding overexposure of critical organs, the method can, in the case shown, produce deliveries that are superior to anything achievable using stationary anatomy. In this regard, the method represents a systematic way to include anatomical motion as a degree of freedom in the optimization of IMRT while producing treatment plans that are deliverable with currently available technology. These results, combined with those related to the real-time MLC tracking algorithm, show that an MLC is a promising tool to investigate for the delivery of four-dimensional radiation therapy.
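The self-correcting idea can be condensed into a single control expression: the commanded leaf speed combines the planned speed, the tumor velocity reported by real-time imaging, and a proportional correction of the accumulated position error, clipped to the physical leaf speed limit. This is a schematic reading of the concept; the gain and speed limit are placeholders, not machine specifications.

```python
def corrected_leaf_speed(planned_pos, actual_pos, planned_speed,
                         tumor_velocity, gain=1.0, max_speed=3.5):
    """Self-correcting leaf speed (cm/s): track the planned trajectory in
    the tumor's frame and bleed off accumulated delivery error without
    interrupting delivery."""
    error = planned_pos - actual_pos          # delivery error accumulated so far
    speed = planned_speed + tumor_velocity + gain * error
    return max(-max_speed, min(max_speed, speed))
```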
Physics Based Model for Cryogenic Chilldown and Loading. Part I: Algorithm
NASA Technical Reports Server (NTRS)
Luchinsky, Dmitry G.; Smelyanskiy, Vadim N.; Brown, Barbara
2014-01-01
We report progress in the development of a physics-based model for cryogenic chilldown and loading. Chilldown and loading are modeled as a fully separated, non-equilibrium two-phase flow of cryogenic fluid thermally coupled to the pipe walls. The solution closely follows the nearly-implicit and semi-implicit algorithms developed by Idaho National Laboratory for autonomous control of thermal-hydraulic systems. Special attention is paid to the treatment of instabilities. The model is applied to the analysis of chilldown in the rapid loading system developed at NASA Kennedy Space Center. The nontrivial characteristic feature of the analyzed chilldown regime is its active control by dump valves. The numerical predictions are in reasonable agreement with the experimental time traces. The obtained results pave the way to the development of autonomous loading operations on the ground and in space.
Faithfull, S; Lemanska, A; Aslet, P; Bhatt, N; Coe, J; Drudge-Coates, L; Feneley, M; Glynn-Jones, R; Kirby, M; Langley, S; McNicholas, T; Newman, J; Smith, C C; Sahai, A; Trueman, E; Payne, H
2015-10-01
To develop a non-invasive management strategy for men with lower urinary tract symptoms (LUTS) after treatment for pelvic cancer that is suitable for use in a primary healthcare context. PubMed literature searches of LUTS management in this patient group were carried out, and a consensus on management strategies was obtained from a panel of authors from across the UK. Data from 41 articles were investigated and collated. Clinical experience was sought from authors where there was no clinical evidence. The findings discussed in this paper confirm that LUTS after cancer treatment can significantly impair men's quality of life. While many men recover from LUTS spontaneously over time, a significant proportion require long-term management. Despite the prevalence of LUTS, there is a lack of consensus on best management. This article offers a comprehensive treatment algorithm to manage patients with LUTS following pelvic cancer treatment. Based on published research literature and clinical experience, recommendations are proposed for the standardisation of management strategies employed for men with LUTS after pelvic cancer treatment. In addition to implementing the algorithm, understanding the rationale for the type and timing of LUTS management strategies is crucial for clinicians and patients. © 2015 The Authors. International Journal of Clinical Practice Published by John Wiley & Sons Ltd.
Xiao, Li; Cai, Qin; Li, Zhilin; Zhao, Hongkai; Luo, Ray
2014-01-01
A multi-scale framework is proposed for more realistic molecular dynamics simulations in continuum solvent models by coupling a molecular mechanics treatment of solute with a fluid mechanics treatment of solvent. This article reports our initial efforts to formulate the physical concepts necessary for coupling the two mechanics and develop a 3D numerical algorithm to simulate the solvent fluid via the Navier-Stokes equation. The numerical algorithm was validated with multiple test cases. The validation shows that the algorithm is effective and stable, with observed accuracy consistent with our design. PMID:25404761
A new algorithm for reducing the workload of experts in performing systematic reviews.
Matwin, Stan; Kouznetsov, Alexandre; Inkpen, Diana; Frunza, Oana; O'Blenis, Peter
2010-01-01
To determine whether a factorized version of the complement naïve Bayes (FCNB) classifier can reduce the time spent by experts reviewing journal articles for inclusion in systematic reviews of drug class efficacy for disease treatment. The proposed classifier was evaluated on a test collection built from 15 systematic drug class reviews used in previous work. The FCNB classifier was constructed to classify each article as containing high-quality, drug class-specific evidence or not. Weight engineering (WE) techniques were added to reduce underestimation for Medical Subject Headings (MeSH)-based and Publication Type (PubType)-based features. Cross-validation experiments were performed to evaluate the classifier's parameters and performance. Work saved over sampling (WSS) at no less than a 95% recall was used as the main measure of performance. The minimum workload reduction for a systematic review for one topic, achieved with a FCNB/WE classifier, was 8.5%; the maximum was 62.2% and the average over the 15 topics was 33.5%. This is 15.0% higher than the average workload reduction obtained using a voting perceptron-based automated citation classification system. The FCNB/WE classifier is simple, easy to implement, and produces significantly better results in reducing the workload than previously achieved. The results support it being a useful algorithm for machine-learning-based automation of systematic reviews of drug class efficacy for disease treatment.
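The performance measure used here has a closed form, WSS@R = (TN + FN)/N − (1 − R), the standard definition in the systematic-review automation literature; a short worked example follows.

```python
def work_saved_over_sampling(tn, fn, total, recall):
    """Work saved over sampling at a given recall level."""
    return (tn + fn) / total - (1.0 - recall)

# Example: of 1000 retrieved articles, the classifier lets reviewers skip
# 400 (380 true negatives + 20 false negatives) while keeping 95% recall:
print(work_saved_over_sampling(tn=380, fn=20, total=1000, recall=0.95))  # 0.35
```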
"Radio-oncomics" : The potential of radiomics in radiation oncology.
Peeken, Jan Caspar; Nüsslin, Fridtjof; Combs, Stephanie E
2017-10-01
Radiomics, a recently introduced concept, describes quantitative computerized algorithm-based feature extraction from imaging data, including computed tomography (CT), magnetic resonance imaging (MRI), or positron-emission tomography (PET) images. For radiation oncology it offers the potential to significantly influence clinical decision-making and thus therapy planning and follow-up workflow. After image acquisition, image preprocessing, and definition of regions of interest by structure segmentation, algorithms are applied to calculate shape, intensity, texture, and multiscale filter features. By combining multiple features and correlating them with clinical outcome, prognostic models can be created. Retrospective studies have proposed radiomics classifiers predicting, e.g., overall survival, radiation treatment response, distant metastases, or radiation-related toxicity. In addition, radiomics features can be correlated with genomic information ("radiogenomics") and could be used for tumor characterization. Distinct patterns based on data-based as well as genomics-based features will influence radiation oncology in the future. Individualized treatments in terms of dose level adaptation and target volume definition, as well as other outcome-related parameters, will depend on radiomics and radiogenomics. By integrating various datasets, the prognostic power can be increased, making radiomics a valuable part of future precision medicine approaches. This perspective demonstrates the evidence for the radiomics concept in radiation oncology. The necessity of further studies to integrate radiomics classifiers into clinical decision-making and the radiation therapy workflow is emphasized.
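The extraction step can be made concrete with a few first-order intensity features computed over a segmented region of interest; shape, texture, and multiscale filter features follow the same extract-then-correlate pattern. The feature set and bin count below are an illustrative sketch, not a reference radiomics implementation.

```python
import numpy as np

def first_order_features(image, mask, bins=32):
    """First-order (intensity) radiomics features over a segmented ROI."""
    roi = image[mask > 0].astype(float)
    hist, _ = np.histogram(roi, bins=bins)
    p = hist / roi.size
    p = p[p > 0]
    return {
        "mean": roi.mean(),
        "std": roi.std(),
        "skewness": float(((roi - roi.mean()) ** 3).mean() / roi.std() ** 3),
        "energy": float(np.sum(roi ** 2)),
        "entropy": float(-np.sum(p * np.log2(p))),
    }
```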
NASA Astrophysics Data System (ADS)
Bu, Sunyoung; Huang, Jingfang; Boyer, Treavor H.; Miller, Cass T.
2010-07-01
The focus of this work is on the modeling of an ion exchange process that occurs in drinking water treatment applications. The model formulation consists of a two-scale model in which a set of microscale diffusion equations representing ion exchange resin particles that vary in size and age are coupled through a boundary condition with a macroscopic ordinary differential equation (ODE), which represents the concentration of a species in a well-mixed reactor. We introduce a new age-averaged model (AAM) that averages all ion exchange particle ages for a given size particle to avoid the expensive Monte-Carlo simulation associated with previous modeling applications. We discuss two different numerical schemes to approximate both the original Monte-Carlo algorithm and the new AAM for this two-scale problem. The first scheme is based on the finite element formulation in space coupled with an existing backward difference formula-based ODE solver in time. The second scheme uses an integral equation based Krylov deferred correction (KDC) method and a fast elliptic solver (FES) for the resulting elliptic equations. Numerical results are presented to validate the new AAM algorithm, which is also shown to be more computationally efficient than the original Monte-Carlo algorithm. We also demonstrate that the higher order KDC scheme is more efficient than the traditional finite element solution approach and this advantage becomes increasingly important as the desired accuracy of the solution increases. We also discuss issues of smoothness, which affect the efficiency of the KDC-FES approach, and outline additional algorithmic changes that would further improve the efficiency of these developing methods for a wide range of applications.
Development of sensor-based nitrogen recommendation algorithms for cereal crops
NASA Astrophysics Data System (ADS)
Asebedo, Antonio Ray
Nitrogen (N) management is one of the most recognizable components of farming both within and outside the world of agriculture. Interest over the past decade has greatly increased in improving N management systems in corn (Zea mays) and winter wheat (Triticum aestivum) to achieve high N use efficiency (NUE) and high yield while remaining environmentally sustainable. Nine winter wheat experiments were conducted across seven locations from 2011 through 2013. The objectives of this study were to evaluate the impacts of fall-winter, Feekes 4, Feekes 7, and Feekes 9 N applications on winter wheat grain yield, grain protein, and total grain N uptake. Nitrogen treatments were applied as single or split applications in the fall-winter, and top-dressed in the spring at Feekes 4, Feekes 7, and Feekes 9, with applied N rates ranging from 0 to 134 kg ha-1. Results indicate that Feekes 7 and 9 N applications provide more optimal combinations of grain yield, grain protein levels, and fertilizer N recovered in the grain when compared to comparable rates of N applied in the fall-winter or at Feekes 4. Winter wheat N management studies from 2006 through 2013 were utilized to develop sensor-based N recommendation algorithms for winter wheat in Kansas. Algorithm RosieKat v.2.6 was designed for multiple N application strategies and utilized N reference strips for establishing N response potential. Algorithm NRS v1.5 addressed single top-dress N applications and does not require an N reference strip. In 2013, field validations of both algorithms were conducted at eight locations across Kansas. Results show algorithm RK v2.6 consistently provided highly efficient N recommendations for improving NUE while achieving high grain yield and grain protein. Without the use of the N reference strip, NRS v1.5 performed statistically equally to the KSU soil test N recommendation with regard to grain yield, but with lower applied N rates. Six corn N fertigation experiments were conducted at KSU irrigated experiment fields from 2012 through 2014 to evaluate the previously developed KSU sensor-based N recommendation algorithm in corn N fertigation systems. Results indicate that the current KSU corn algorithm was effective at achieving high yields but has a tendency to overestimate N requirements. To optimize sensor-based N recommendations for N fertigation systems, algorithms must be specifically designed for these systems to take advantage of their full capabilities, thus allowing implementation of high-NUE N management systems.
Using medication list--problem list mismatches as markers of potential error.
Carpenter, James D.; Gorman, Paul N.
2002-01-01
The goal of this project was to specify and develop an algorithm that checks for drug and problem list mismatches in an electronic medical record (EMR). The algorithm is based on the premise that a patient's problem list and medication list should agree, and a mismatch may indicate medication error. Successful development of this algorithm could mean detection of some errors, such as medication orders entered into a wrong patient record, or drug therapy omissions, that are not otherwise detected via automated means. Additionally, mismatches may identify opportunities to improve problem list integrity. To assess the concept's feasibility, this study compared medications listed in a pharmacy information system with findings in an online nursing adult admission assessment, serving as a proxy for the problem list. Where drug and problem list mismatches were discovered, examination of the patient record confirmed the mismatch and identified potential causes. Evaluation of the algorithm in diabetes treatment indicates that it successfully detects both potential medication error and opportunities to improve problem list completeness. This algorithm, once fully developed and deployed, could prove a valuable way to improve the patient problem list, and could decrease the risk of medication error. PMID:12463796
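The premise translates directly into a set-intersection check. The sketch below is hypothetical: the drug-to-indication map is a toy dictionary standing in for a curated drug knowledge base, and the function names are invented for illustration.

```python
# Toy drug -> indication map; a deployed system would query a curated
# drug knowledge base instead of this hypothetical dictionary.
DRUG_INDICATIONS = {
    "metformin": {"diabetes mellitus"},
    "insulin glargine": {"diabetes mellitus"},
    "lisinopril": {"hypertension", "heart failure"},
}

def find_mismatches(medications, problems):
    """Flag drugs with no supporting problem (possible wrong-record order)
    and treatable problems with no mapped drug (possible omission)."""
    probs = {p.lower() for p in problems}
    unsupported = [d for d in medications
                   if not DRUG_INDICATIONS.get(d.lower(), set()) & probs]
    treated = set()
    for d in medications:
        treated |= DRUG_INDICATIONS.get(d.lower(), set())
    treatable = set().union(*DRUG_INDICATIONS.values())
    untreated = [p for p in problems
                 if p.lower() in treatable and p.lower() not in treated]
    return unsupported, untreated
```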
A scalable, fully implicit algorithm for the reduced two-field low-β extended MHD model
Chacon, Luis; Stanier, Adam John
2016-12-01
Here, we demonstrate a scalable fully implicit algorithm for the two-field low-β extended MHD model. This reduced model describes plasma behavior in the presence of strong guide fields, and is of significant practical impact both in nature and in laboratory plasmas. The model displays strong hyperbolic behavior, as manifested by the presence of fast dispersive waves, which make a fully implicit treatment very challenging. In this study, we employ a Jacobian-free Newton–Krylov nonlinear solver, for which we propose a physics-based preconditioner that renders the linearized set of equations suitable for inversion with multigrid methods. As a result, the algorithm is shown to scale both algorithmically (i.e., the iteration count is insensitive to grid refinement and timestep size) and in parallel in a weak-scaling sense, with the wall-clock time scaling weakly with the number of cores for up to 4096 cores. For a 4096 × 4096 mesh, we demonstrate a wall-clock-time speedup of ~6700 with respect to explicit algorithms. The model is validated linearly (against linear theory predictions) and nonlinearly (against fully kinetic simulations), demonstrating excellent agreement.
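The Jacobian-free Newton–Krylov pattern with a physics-based preconditioner can be sketched on a toy problem, here a 1-D nonlinear reaction-diffusion equation in which the stiff linear (Laplacian) part is inverted exactly as the preconditioner. This is only an analogue of the paper's multigrid-preconditioned MHD solver; the equation, grid size, and preconditioner choice are illustrative.

```python
import numpy as np
from scipy.optimize import newton_krylov
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, splu

n = 128
h = 1.0 / (n + 1)

def residual(u):
    """Residual of -u'' + exp(u) - 1 = 0 on (0,1) with u(0) = u(1) = 0."""
    d2 = np.empty_like(u)
    d2[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2
    d2[0] = (-2.0 * u[0] + u[1]) / h**2          # Dirichlet ghost value = 0
    d2[-1] = (u[-2] - 2.0 * u[-1]) / h**2
    return -d2 + np.exp(u) - 1.0

# "Physics-based" preconditioner: exact inverse of the stiff linear part
# (the discrete Laplacian), applied inside the Krylov iteration.
laplacian = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2
lu = splu(laplacian.tocsc())
precond = LinearOperator((n, n), matvec=lu.solve)

u = newton_krylov(residual, np.zeros(n), method="gmres", inner_M=precond)
```

The role of `inner_M` mirrors the approach described in the abstract: the Krylov iteration never forms the Jacobian explicitly, and the conditioning comes entirely from the physics-motivated operator.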
Wallace, Meredith L; Anderson, Stewart J; Mazumdar, Sati
2010-12-20
Missing covariate data present a challenge to tree-structured methodology due to the fact that a single tree model, as opposed to an estimated parameter value, may be desired for use in a clinical setting. To address this problem, we suggest a multiple imputation algorithm that adds draws of stochastic error to a tree-based single imputation method presented by Conversano and Siciliano (Technical Report, University of Naples, 2003). Unlike previously proposed techniques for accommodating missing covariate data in tree-structured analyses, our methodology allows the modeling of complex and nonlinear covariate structures while still resulting in a single tree model. We perform a simulation study to evaluate our stochastic multiple imputation algorithm when covariate data are missing at random and compare it to other currently used methods. Our algorithm is advantageous for identifying the true underlying covariate structure when complex data and larger percentages of missing covariate observations are present. It is competitive with other current methods with respect to prediction accuracy. To illustrate our algorithm, we create a tree-structured survival model for predicting time to treatment response in older, depressed adults. Copyright © 2010 John Wiley & Sons, Ltd.
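A minimal sketch of the central idea, tree-based single imputation plus draws of stochastic error to yield multiple imputed datasets, is shown below, assuming a continuous covariate and using scikit-learn's regression tree as a stand-in for the authors' imputation tree; all parameters are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def stochastic_tree_imputations(X_obs, y_obs, X_mis, m=5, seed=None):
    """Produce `m` imputations of a missing continuous covariate: fit a
    regression tree on complete cases, then add random draws of the
    residual error to the deterministic tree predictions. A simplified
    stand-in for the published procedure."""
    rng = np.random.default_rng(seed)
    tree = DecisionTreeRegressor(min_samples_leaf=10).fit(X_obs, y_obs)
    resid_sd = np.std(y_obs - tree.predict(X_obs))
    base = tree.predict(X_mis)
    return [base + rng.normal(0.0, resid_sd, size=len(base)) for _ in range(m)]
```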
Hou, Gary Y; Provost, Jean; Grondin, Julien; Wang, Shutao; Marquet, Fabrice; Bunting, Ethan; Konofagou, Elisa E
2014-11-01
Harmonic motion imaging for focused ultrasound (HMIFU) utilizes an amplitude-modulated HIFU beam to induce a localized focal oscillatory motion that is simultaneously estimated. The objective of this study is to develop and show the feasibility of a novel fast beamforming algorithm for image reconstruction using GPU-based sparse-matrix operation with real-time feedback. In this study, the algorithm was implemented onto a fully integrated, clinically relevant HMIFU system. A single divergent transmit beam was used while fast beamforming was implemented using a GPU-based delay-and-sum method and a sparse-matrix operation. Axial HMI displacements were then estimated from the RF signals using a 1-D normalized cross-correlation method and streamed to a graphic user interface with frame rates up to 15 Hz, a 100-fold increase compared to conventional CPU-based processing. The real-time feedback rate does not require interrupting the HIFU treatment. Phantom experiments showed reproducible HMI images, and monitoring of 22 in vitro HIFU treatments using the new 2-D system demonstrated reproducible displacement imaging with a consistent average focal displacement decrease of 46.7±14.6% during lesion formation. Complementary focal temperature monitoring indicated average rates of displacement increase and decrease with focal temperature of 0.84±1.15%/°C and 2.03±0.93%/°C, respectively. These results reinforce the capability of HMIFU for estimating and monitoring stiffness-related changes in real time. Ongoing studies include clinical translation of the presented system for monitoring HIFU treatment of breast and pancreatic tumors.
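The displacement estimation stage can be sketched independently of the GPU beamformer: a 1-D normalized cross-correlation between windows of consecutive RF lines, with the best lag converted to axial displacement via the pulse-echo relation. Window, search range, and function names are illustrative.

```python
import numpy as np

def axial_displacement(rf_ref, rf_cur, window, search, fs, c=1540.0):
    """Axial displacement (m) between two RF lines by 1-D normalized
    cross-correlation. `window` is a slice into the reference line,
    `search` the maximum lag in samples, `fs` the sampling rate (Hz)."""
    ref = rf_ref[window]
    ref = (ref - ref.mean()) / ref.std()
    best_lag, best_ncc = 0, -np.inf
    for lag in range(-search, search + 1):
        seg = rf_cur[window.start + lag:window.stop + lag]
        seg = (seg - seg.mean()) / seg.std()
        ncc = float(np.mean(ref * seg))
        if ncc > best_ncc:
            best_lag, best_ncc = lag, ncc
    return best_lag * c / (2.0 * fs)   # time shift -> one-way displacement
```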
Reynier, Frédéric; Petit, Fabien; Paye, Malick; Turrel-Davin, Fanny; Imbert, Pierre-Emmanuel; Hot, Arnaud; Mougin, Bruno; Miossec, Pierre
2011-01-01
The analysis of gene expression data shows that many genes display similarity in their expression profiles, suggesting some co-regulation. Here, we investigated the co-expression patterns in gene expression data and proposed a correlation-based research method to stratify individuals. Using blood from rheumatoid arthritis (RA) patients, we investigated the gene expression profiles from whole blood using Affymetrix microarray technology. Co-expressed genes were analyzed by a biclustering method, followed by gene ontology analysis of the relevant biclusters. Taking the type I interferon (IFN) pathway as an example, a classification algorithm was developed from the 102 RA patients and extended to 10 systemic lupus erythematosus (SLE) patients and 100 healthy volunteers to further characterize individuals. We developed a correlation-based algorithm referred to as Classification Algorithm Based on a Biological Signature (CABS), an alternative to other approaches focused specifically on expression levels. Applied to the expression of 35 IFN-related genes, this algorithm showed that the IFN signature was heterogeneously expressed across RA patients, SLE patients and healthy controls, which could reflect the level of global IFN signature activation. Moreover, monitoring of the IFN-related genes during anti-TNF treatment identified changes in type I IFN gene activity induced in RA patients. In conclusion, we have proposed an original method to analyze genes sharing an expression pattern and a biological function, showing that the activation levels of a biological signature could be characterized by its overall state of correlation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, R; Kamima, T; Tachibana, H
2015-06-15
Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate bias in the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve) and Xio (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean±2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated by comparing the dose between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0±3.7%, 2.0±2.5% and 6.2±4.4%, respectively. In conventional plans, most of the sites were within the 5% TG-114 action level; however, there were systematic differences (4.0±4.0% and 2.5±5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy from the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity correction can strongly affect the dose distribution.
Lung tumor tracking in fluoroscopic video based on optical flow
Xu, Qianyi; Hamilton, Russell J.; Schowengerdt, Robert A.; Alexander, Brian; Jiang, Steve B.
2008-01-01
Respiratory gating and tumor tracking for dynamic multileaf collimator delivery require accurate and real-time localization of the lung tumor position during treatment. Deriving tumor position from external surrogates such as abdominal surface motion may have large uncertainties due to the intra- and interfraction variations of the correlation between the external surrogates and internal tumor motion. Implanted fiducial markers can be used to track tumors fluoroscopically in real time with sufficient accuracy. However, it may not be a practical procedure when implanting fiducials bronchoscopically. In this work, a method is presented to track the lung tumor mass or relevant anatomic features projected in fluoroscopic images without implanted fiducial markers based on an optical flow algorithm. The algorithm generates the centroid position of the tracked target and ignores shape changes of the tumor mass shadow. The tracking starts with a segmented tumor projection in an initial image frame. Then, the optical flow between this and all incoming frames acquired during treatment delivery is computed as initial estimations of tumor centroid displacements. The tumor contour in the initial frame is transferred to the incoming frames based on the average of the motion vectors, and its positions in the incoming frames are determined by fine-tuning the contour positions using a template matching algorithm with a small search range. The tracking results were validated by comparing with clinician determined contours on each frame. The position difference in 95% of the frames was found to be less than 1.4 pixels (∼0.7 mm) in the best case and 2.8 pixels (∼1.4 mm) in the worst case for the five patients studied. PMID:19175094
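A single tracking step in the spirit of this method can be sketched with OpenCV: dense optical flow supplies the initial centroid shift, and template matching over a small search range fine-tunes the contour position. Frames are assumed to be 8-bit grayscale; the Farneback parameters, padding, and function names are illustrative, not the authors' implementation.

```python
import cv2
import numpy as np

def track_centroid(prev_frame, next_frame, contour_mask, template, pad=5):
    """One tracking step: optical flow for the coarse shift, then
    template matching in a small window for refinement."""
    flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    dx, dy = flow[contour_mask > 0].mean(axis=0)   # mean motion inside contour
    ys, xs = np.nonzero(contour_mask)
    x, y = xs.min(), ys.min()
    h, w = template.shape
    # Small search window around the flow-predicted position.
    x0, y0 = int(x + dx) - pad, int(y + dy) - pad
    roi = next_frame[y0:y0 + h + 2 * pad, x0:x0 + w + 2 * pad]
    scores = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (mx, my) = cv2.minMaxLoc(scores)
    return x0 + mx + w / 2.0, y0 + my + h / 2.0    # refined centroid (x, y)
```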
Menas, Pamela; Merkel, Douglas; Hui, Wendy; Lawton, Jessica; Harper, Abigail; Carro, George
2012-12-01
Aromatase inhibitors (AIs) are routinely used as first-line adjuvant treatment of breast cancer in postmenopausal women with hormone receptor positive tumors. The current recommended length of treatment with an AI is 5 years. Arthralgias have been frequently cited as the primary reason for discontinuation of AI therapy. Various treatment strategies are proposed in the literature, but a standardized treatment algorithm has not been established. The initial purpose of this study was to describe the incidence and management of AI-induced arthralgias in patients treated at Kellogg Cancer Center (KCC). Further evaluation led to the development and implementation of a treatment algorithm and electronic medical record (EMR) documentation tools. The retrospective chart review included 206 adult patients with hormone receptor positive breast cancer who were receiving adjuvant therapy with an AI. A multidisciplinary treatment team consisting of pharmacists, collaborative practice nurses, and physicians met to develop a standardized treatment algorithm and corresponding EMR documentation tool. The treatment algorithm and documentation tool were developed after the study to better monitor and proactively treat patients with AI-induced arthralgias. Results/Conclusions: The overall incidence of arthralgias at KCC was 48% (n = 98/206). Of these patients, 32% were documented as having arthralgias within the first 6 months of therapy initiation. Patients who reported AI-induced arthralgias were younger than patients who did not (61 vs. 65 years, p = 0.002). There was no statistical difference in the incidence of arthralgias in patients with a history of chemotherapy (including taxane therapy) compared to those who did not receive chemotherapy (p = 0.352). Of patients presenting with AI-induced arthralgias, 41% did not have physician-managed treatment documented in the EMR. A standardized treatment algorithm and electronic chart documentation tools were then developed by the multidisciplinary team.
SPHINX--an algorithm for taxonomic binning of metagenomic sequences.
Mohammed, Monzoorul Haque; Ghosh, Tarini Shankar; Singh, Nitin Kumar; Mande, Sharmila S
2011-01-01
Compared with composition-based binning algorithms, the binning accuracy and specificity of alignment-based binning algorithms are significantly higher. However, being alignment-based, the latter class of algorithms requires enormous amounts of time and computing resources for binning huge metagenomic datasets. The motivation was to develop a binning approach that can analyze metagenomic datasets as rapidly as composition-based approaches, but nevertheless has the accuracy and specificity of alignment-based algorithms. This article describes a hybrid binning approach (SPHINX) that achieves high binning efficiency by utilizing the principles of both 'composition'- and 'alignment'-based binning algorithms. Validation results with simulated sequence datasets indicate that SPHINX is able to analyze metagenomic sequences as rapidly as composition-based algorithms. Furthermore, the binning efficiency (in terms of accuracy and specificity of assignments) of SPHINX is observed to be comparable with results obtained using alignment-based algorithms. A web server for the SPHINX algorithm is available at http://metagenomics.atc.tcs.com/SPHINX/.
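The generic hybrid idea, which is to use cheap composition statistics to shortlist candidate taxa so that expensive alignment only runs against a small reference subset, can be sketched as follows. This is a schematic of the composition step only, not SPHINX's actual clustering or alignment machinery; names and the distance measure are assumptions.

```python
from itertools import product
import numpy as np

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]
INDEX = {k: i for i, k in enumerate(KMERS)}

def tetranucleotide_vector(seq):
    """Normalized tetranucleotide frequencies: the 'composition' signal."""
    v = np.zeros(len(KMERS))
    for i in range(len(seq) - 3):
        j = INDEX.get(seq[i:i + 4])   # skips k-mers with ambiguous bases
        if j is not None:
            v[j] += 1
    return v / max(v.sum(), 1.0)

def shortlist_clades(read, clade_centroids, top_n=5):
    """Rank clades by distance between the read's composition vector and
    pre-computed clade centroid vectors, so that alignment only runs
    against the `top_n` candidates instead of the full database."""
    v = tetranucleotide_vector(read)
    dist = {c: float(np.linalg.norm(v - cen))
            for c, cen in clade_centroids.items()}
    return sorted(dist, key=dist.get)[:top_n]
```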
Acceptance and Commitment Therapy modules: Differential impact on treatment processes and outcomes.
Villatte, Jennifer L; Vilardaga, Roger; Villatte, Matthieu; Plumb Vilardaga, Jennifer C; Atkins, David C; Hayes, Steven C
2016-02-01
A modular, transdiagnostic approach to treatment design and implementation may increase the public health impact of evidence-based psychosocial interventions. Such an approach relies on algorithms for selecting and implementing treatment components intended to have a specific therapeutic effect, yet there is little evidence for how components function independent of their treatment packages when employed in clinical service settings. This study aimed to demonstrate the specificity of treatment effects for two components of Acceptance and Commitment Therapy (ACT), a promising candidate for modularization. A randomized, nonconcurrent, multiple-baseline across participants design was used to examine component effects on treatment processes and outcomes in 15 adults seeking mental health treatment. The ACT OPEN module targeted acceptance and cognitive defusion; the ACT ENGAGED module targeted values-based activation and persistence. According to Tau-U analyses, both modules produced significant improvements in psychiatric symptoms, quality of life, and targeted therapeutic processes. ACT ENGAGED demonstrated greater improvements in quality of life and values-based activation. ACT OPEN showed greater improvements in symptom severity, acceptance, and defusion. Both modules improved awareness and non-reactivity, which were mutually targeted, though using distinct intervention procedures. Both interventions demonstrated high treatment acceptability, completion, and patient satisfaction. Treatment effects were maintained at 3-month follow up. ACT components should be considered for inclusion in a modular approach to implementing evidence-based psychosocial interventions for adults. Copyright © 2015 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGhee, J.M.; Roberts, R.M.; Morel, J.E.
1997-06-01
A spherical harmonics research code (DANTE) has been developed which is compatible with parallel computer architectures. DANTE provides 3-D, multi-material, deterministic, transport capabilities using an arbitrary finite element mesh. The linearized Boltzmann transport equation is solved in a second-order self-adjoint form utilizing a Galerkin finite element spatial differencing scheme. The core solver utilizes a preconditioned conjugate gradient algorithm. Other distinguishing features of the code include options for discrete-ordinates and simplified spherical harmonics angular differencing, an exact Marshak boundary treatment for arbitrarily oriented boundary faces, in-line matrix construction techniques to minimize memory consumption, and an effective diffusion-based preconditioner for scattering-dominated problems. Algorithm efficiency is demonstrated for a massively parallel SIMD architecture (CM-5), and compatibility with MPP multiprocessor platforms or workstation clusters is anticipated.
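The core solver pattern named here, a preconditioned conjugate gradient iteration, is compact enough to sketch in full. `M_inv` stands in for the diffusion-based preconditioner; any symmetric positive-definite system and preconditioner application would fit the same template.

```python
import numpy as np

def preconditioned_cg(A, b, M_inv, tol=1e-8, max_iter=500):
    """Preconditioned conjugate gradients for a symmetric positive-definite
    system A x = b. `M_inv` applies the preconditioner to a vector."""
    x = np.zeros_like(b)
    r = b - A @ x                      # initial residual
    z = M_inv(r)                       # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # new search direction
        rz = rz_new
    return x
```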
Automatic Detection of Seizures with Applications
NASA Technical Reports Server (NTRS)
Olsen, Dale E.; Harris, John C.; Cutchis, Protagoras N.; Cristion, John A.; Lesser, Ronald P.; Webber, W. Robert S.
1993-01-01
There are an estimated two million people with epilepsy in the United States. Many of these people do not respond to anti-epileptic drug therapy. Two devices can be developed to assist in the treatment of epilepsy. The first is a microcomputer-based system designed to process massive amounts of electroencephalogram (EEG) data collected during long-term monitoring of patients for the purpose of diagnosing seizures, assessing the effectiveness of medical therapy, or selecting patients for epilepsy surgery. Such a device would select and display important EEG events. Currently many such events are missed. A second device could be implanted and would detect seizures and initiate therapy. Both of these devices require a reliable seizure detection algorithm. A new algorithm is described. It is believed to represent an improvement over existing seizure detection algorithms because better signal features were selected and better standardization methods were used.
Image segmentation based upon topological operators: real-time implementation case study
NASA Astrophysics Data System (ADS)
Mahmoudi, R.; Akil, M.
2009-02-01
Thinning and crest restoration are of considerable interest in many image processing applications. The preferred algorithms for these procedures are those able to act directly on grayscale images while preserving topology, but their high computational cost remains the major drawback to their adoption. In this paper we present an efficient hardware implementation, on a RISC processor, of two powerful thinning and crest-restoring algorithms developed by our team. The proposed implementation reduces execution time. A segmentation chain applied to medical imaging serves as a concrete example of the improvements achieved through optimization at both the algorithmic and architectural levels. In particular, use of the SSE instruction set on x86_32 processors (Pentium IV, 3.06 GHz) enables real-time processing: a throughput of 33 images (512×512) per second is achieved.
NASA Technical Reports Server (NTRS)
Bauschlicher, C. W., Jr.; Yarkony, D. R.
1980-01-01
A previously reported multi-configuration self-consistent field (MCSCF) algorithm based on the generalized Brillouin theorem is extended in order to treat the excited states of polar molecules. In particular, the algorithm takes into account the proper treatment of nonorthogonality in the space of single excitations and invokes, when necessary, a constrained optimization procedure to prevent the variational collapse of excited states. In addition, a configuration selection scheme (suitable for use in conjunction with extended configuration interaction methods) is proposed for the MCSCF procedure. The algorithm is used to study the low-lying singlet states of BeO, a system which has not previously been studied using an MCSCF procedure. MCSCF wave functions are obtained for three ¹Σ⁺ and two ¹Π states. The ¹Σ⁺ results are juxtaposed with comparable results for MgO in order to assess the generality of the description presented here.
A software tool of digital tomosynthesis application for patient positioning in radiotherapy
Dai, Jian‐Rong
2016-01-01
Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for DTS application, in contrast to conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by the GPU-based and CPU-based algorithms is 0.99. Based on measurements of a cube phantom on DTS, the geometric errors are within 0.5 mm in three axes. For both the cube phantom and a pelvic phantom, the registration errors are within 0.5 mm in three axes. Compared with the CPU-based algorithms, the performance of the DRR and DTS reconstructions is improved by a factor of 15 to 20. A GPU-based software tool was developed for DTS application in patient positioning for radiotherapy. The geometric and registration accuracy met the clinical requirements of patient setup in radiotherapy. The high performance of the DRR and DTS reconstruction algorithms was achieved by the GPU-based computation environment. It is a useful software tool for researchers and clinicians in evaluating DTS application in patient positioning for radiotherapy. PACS number(s): 87.57.nf PMID:27074482
The Texas Medication Algorithm Project antipsychotic algorithm for schizophrenia: 2003 update.
Miller, Alexander L; Hall, Catherine S; Buchanan, Robert W; Buckley, Peter F; Chiles, John A; Conley, Robert R; Crismon, M Lynn; Ereshefsky, Larry; Essock, Susan M; Finnerty, Molly; Marder, Stephen R; Miller, Del D; McEvoy, Joseph P; Rush, A John; Saeed, Sy A; Schooler, Nina R; Shon, Steven P; Stroup, Scott; Tarin-Godoy, Bernardo
2004-04-01
The Texas Medication Algorithm Project (TMAP) has been a public-academic collaboration in which guidelines for medication treatment of schizophrenia, bipolar disorder, and major depressive disorder were used in selected public outpatient clinics in Texas. Subsequently, these algorithms were implemented throughout Texas and are being used in other states. Guidelines require updating when significant new evidence emerges; the antipsychotic algorithm for schizophrenia was last updated in 1999. This article reports the recommendations developed in 2002 and 2003 by a group of experts, clinicians, and administrators. A conference in January 2002 began the update process. Before the conference, experts in the pharmacologic treatment of schizophrenia, clinicians, and administrators reviewed literature topics and prepared presentations. Topics included ziprasidone's inclusion in the algorithm, the number of antipsychotics tried before clozapine, and the role of first generation antipsychotics. Data were rated according to Agency for Healthcare Research and Quality criteria. After discussing the presentations, conference attendees arrived at consensus recommendations. Consideration of aripiprazole's inclusion was subsequently handled by electronic communications. The antipsychotic algorithm for schizophrenia was updated to include ziprasidone and aripiprazole among the first-line agents. Relative to the prior algorithm, the number of stages before clozapine was reduced. First generation antipsychotics were included but not as first-line choices. For patients refusing or not responding to clozapine and clozapine augmentation, preference was given to trying monotherapy with another antipsychotic before resorting to antipsychotic combinations. Consensus on algorithm revisions was achieved, but only further well-controlled research will answer many key questions about sequence and type of medication treatments of schizophrenia.
Fuzzy hidden Markov chains segmentation for volume determination and quantitation in PET.
Hatt, M; Lamare, F; Boussion, N; Turzo, A; Collet, C; Salzenstein, F; Roux, C; Jarritt, P; Carson, K; Cheze-Le Rest, C; Visvikis, D
2007-06-21
Accurate volume of interest (VOI) estimation in PET is crucial in different oncology applications such as response to therapy evaluation and radiotherapy treatment planning. The objective of our study was to evaluate the performance of the proposed algorithm for automatic lesion volume delineation, namely the fuzzy hidden Markov chains (FHMC) algorithm, against the current state-of-the-art threshold-based techniques used in clinical practice. Like the classical hidden Markov chain (HMC) algorithm, FHMC takes into account noise, voxel intensity and spatial correlation in order to classify a voxel as background or functional VOI. However, the novelty of the fuzzy model is the inclusion of an estimation of imprecision, which should lead to better modelling of the 'fuzzy' nature of the object-of-interest boundaries in emission tomography data. The performance of the algorithms was assessed on both simulated and acquired datasets of the IEC phantom, covering a large range of spherical lesion sizes (from 10 to 37 mm), contrast ratios (4:1 and 8:1) and image noise levels. Both lesion activity recovery and VOI determination tasks were assessed in reconstructed images using two different voxel sizes (8 mm³ and 64 mm³). In order to account for both the functional volume location and its size, the concept of % classification errors was introduced in the evaluation of volume segmentation using the simulated datasets. Results reveal that FHMC performs substantially better than the threshold-based methodology for functional volume determination or activity concentration recovery for a contrast ratio of 4:1 and lesion sizes of <28 mm. Furthermore, both classification and volume estimation errors were smaller for the segmented volumes provided by the FHMC algorithm. Finally, the performance of the automatic algorithms was less susceptible to image noise levels than that of the threshold-based techniques. The analysis of both simulated and acquired datasets led to similar results and conclusions as far as the performance of the evaluated segmentation algorithms is concerned.
Ciemniewska-Gorzela, Kinga; Piontek, Tomasz; Szulc, Andrzej
2014-11-14
Intra-abdominal hypertension and abdominal compartment syndrome have been increasingly recognized as a hip arthroscopy complication over the past decade. In the absence of consensus definitions and treatment guidelines, the diagnosis and management of intra-abdominal hypertension and abdominal compartment syndrome remain variable from institution to institution. We report extravasation of fluid into the abdomen during arthroscopic treatment of femoroacetabular impingement combined with resection of the trochanteric bursa, and our management of the condition, in a 55-year-old Caucasian woman. We present an algorithm for treatment of abdominal compartment syndrome, as a hip arthroscopy complication, according to the consensus definitions and recommendations of the World Society of the Abdominal Compartment Syndrome. In the algorithm options, we have included paracentesis and percutaneous catheter decompression as the main points of treatment. Our algorithm will have a broader clinical impact on orthopedic surgery, anesthesiology and emergency medicine.
[Evidence-based therapy for cartilage lesions in the knee - regenerative treatment options].
Proffen, B; von Keudell, A; Vavken, P
2012-06-01
The treatment of cartilage defects has seen a shift from replacement to regeneration in the last few years. The rationale behind this development is the improvement in quality of care for the growing segment of young patients who are prone to arthroplasty complications because of their specific characteristics: young age, high level of activity, and high demand for functionality. These days, two of the most popular regenerative treatments are microfracture and autologous chondrocyte implantation (ACI). Although these new options show promising results, no final algorithm for the treatment of cartilage lesions has been established as yet. The objective of this review is to describe and compare these two treatment options and to present an evidence-based treatment algorithm for focal cartilage defects. Microfracture is a cost-effective, arthroscopic one-stage procedure in which, by drilling of the subchondral plate, mesenchymal stem cells from the bone marrow migrate into the defect and rebuild the cartilage. ACI is a two-stage procedure in which chondrocytes are first harvested and expanded in cell culture, and then reimplanted into the cartilage defect in a second, open procedure. Microfracture is usually used for focal cartilage defects <4 cm²; the treated defect size for ACI appears to have a wider range. The effectiveness of these two treatments has been shown in long-term longitudinal studies, in which microfracture showed improvement in up to 95% of patients, and 92% of patients showed improvement over a 2-9 year follow-up after ACI. Successful outcome of treatment depends on multiple factors such as the location of the defect, cell differentiation and proliferation, concomitant problems, and the age of the patient. Associated complications and disadvantages are, for microfracture, poor tissue differentiation or formation of an intra-lesional osteophyte, and for ACI, periosteal hypertrophy and the need for two procedures. Only a few studies provide detailed and evidence-based information from a comparative assessment. These studies, however, show broadly similar clinical outcomes but better histological results for ACI, which are likely to translate into better long-term outcomes. Although evidence-based studies comparing microfracture and ACI have not found significant differences in clinical outcome, the literature does show that choosing the treatment based on the size and characteristics of the osteochondral lesion may be beneficial. The American Academy of Orthopaedic Surgeons suggests that contained lesions <4 cm² should be treated by microfracture and larger lesions by ACI. Georg Thieme Verlag KG Stuttgart · New York.
Treating convection in sequential solvers
NASA Technical Reports Server (NTRS)
Shyy, Wei; Thakur, Siddharth
1992-01-01
The treatment of convection terms in sequential solvers, a standard procedure found in virtually all pressure-based algorithms, is investigated for flow problems with sharp gradients and source terms. Both scalar model problems and the one-dimensional gas dynamics equations have been used to study the various issues involved. Different approaches, including the use of nonlinear filtering techniques and the adoption of TVD-type schemes, have been investigated. Special treatments of the source terms, such as pressure gradients and heat release, have also been devised, yielding insight and improved accuracy of the numerical procedure adopted.
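A standard instance of the TVD-type treatment investigated is one explicit update of 1-D linear advection with a minmod-limited MUSCL reconstruction: second-order in smooth regions, non-oscillatory at sharp gradients. This generic scheme illustrates the class of methods, not the paper's specific formulation; periodic boundaries are assumed.

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: zero at extrema, the smaller slope elsewhere."""
    return np.where(a * b > 0.0,
                    np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def tvd_advection_step(u, c):
    """One explicit step of 1-D linear advection (0 < c <= 1, CFL number)
    with minmod-limited MUSCL reconstruction and periodic boundaries."""
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    face = u + 0.5 * (1.0 - c) * slope     # limited value at each right face
    flux = c * face
    return u - (flux - np.roll(flux, 1))   # conservative flux difference
```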
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiwari, P; Chen, Y; Hong, L
2015-06-15
Purpose: We developed an automated treatment planning system based on a hierarchical goal programming approach. To demonstrate the feasibility of our method, we report a comparison of prostate treatment plans produced by the automated treatment planning system with those produced by a commercial treatment planning system. Methods: In our approach, we prioritized the goals of the optimization and solved one goal at a time. The purpose of prioritization is to ensure that higher priority dose-volume planning goals are not sacrificed to improve lower priority goals. The algorithm has four steps. The first step optimizes dose to the target structures while sparing key sensitive organs from radiation. In the second step, the algorithm finds the best beamlet weights to reduce toxicity risks to normal tissue while holding the objective function value achieved in the first step as a constraint, with a small amount of allowed slip. Likewise, the third and fourth steps introduce lower priority normal tissue goals and beam smoothing. We compared against prostate treatment plans from Memorial Sloan Kettering Cancer Center developed using Eclipse, with a prescription dose of 72 Gy. A combination of linear, quadratic, and gEUD objective functions was used with a modified open source solver (IPOPT). Results: Initial plan results on 3 different cases show that the automated planning system is capable of competing with or improving on expert-driven Eclipse plans. Compared to the Eclipse planning system, the automated system produced up to 26% less mean dose to the rectum and 24% less mean dose to the bladder while achieving the same D95 (after matching) to the target. Conclusion: We have demonstrated that Pareto optimal treatment plans can be generated automatically without a trial-and-error process. The solver finds an optimal plan for the given patient, as opposed to database-driven approaches that set parameters based on geometry and population modeling.
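The prioritize-then-constrain mechanic can be sketched as two linear programming stages: solve the top-priority goal, then re-optimize the next goal while constraining the first objective to its optimum plus a small slip. The actual four-step method with linear, quadratic, and gEUD objectives solved in IPOPT is richer; everything below (matrices, objectives, names) is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import linprog

def lexicographic_plan(A_target, d_target, A_oar, slip=0.01):
    """Two-stage hierarchical goal programming for beamlet weights x >= 0,
    with dose linear in x (dose = A @ x). Stage 1 minimizes the worst
    target underdose t; stage 2 minimizes mean OAR dose subject to the
    stage-1 objective degrading by at most `slip`."""
    n = A_target.shape[1]
    # Stage 1: min t  s.t.  d_target - A_target @ x <= t.
    c1 = np.r_[np.zeros(n), 1.0]
    A_ub = np.c_[-A_target, -np.ones(len(d_target))]
    bounds = [(0, None)] * n + [(None, None)]
    res1 = linprog(c1, A_ub=A_ub, b_ub=-d_target, bounds=bounds)
    # Stage 2: min mean OAR dose, holding the stage-1 objective (with slip).
    c2 = np.r_[A_oar.mean(axis=0), 0.0]
    A_ub2 = np.vstack([A_ub, np.r_[np.zeros(n), 1.0]])
    b_ub2 = np.r_[-d_target, res1.x[-1] + slip]
    res2 = linprog(c2, A_ub=A_ub2, b_ub=b_ub2, bounds=bounds)
    return res2.x[:n]   # optimized beamlet weights
```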
Schöfer, Helmut; Tatti, Silvio; Lynde, Charles W; Skerlev, Mihael; Hercogová, Jana; Rotaru, Maria; Ballesteros, Juan; Calzavara-Pinton, Piergiacomo
2017-12-01
This review about the proactive sequential therapy (PST) of external genital and perianal warts (EGW) is based on the most current available clinical literature and on the broad clinical experience of a group of international experts, physicians who are well versed in the treatment of human papillomavirus-associated diseases. It provides a practical guide for the treatment of EGW, including epidemiology, etiology, clinical appearance, and diagnostic procedures for these viral infections. Furthermore, the treatment goals and current treatment options, elucidating provider- and patient-applied therapies, and the parameters driving treatment decisions are summarized. Specifically, the mode of action of the topical treatments sinecatechins and imiquimod, as well as the PST for EGW to achieve rapid and sustained clearance is discussed. The group of experts has developed a treatment algorithm giving healthcare providers a practical tool for the treatment of EGW which is very valuable in the presence of many different treatment options.
Identifying Seizure Onset Zone From the Causal Connectivity Inferred Using Directed Information
NASA Astrophysics Data System (ADS)
Malladi, Rakesh; Kalamangalam, Giridhar; Tandon, Nitin; Aazhang, Behnaam
2016-10-01
In this paper, we develop a model-based and a data-driven estimator for directed information (DI) to infer the causal connectivity graph between electrocorticographic (ECoG) signals recorded from the brain and to identify the seizure onset zone (SOZ) in epileptic patients. Directed information, an information-theoretic quantity, is a general metric for inferring causal connectivity between time series and, unlike popular metrics based on Granger causality or transfer entropy, is not restricted to a particular class of models. The proposed estimators are shown to be almost surely convergent. Causal connectivity between ECoG electrodes in five epileptic patients is inferred using the proposed DI estimators, after validating their performance on simulated data. We then propose a model-based and a data-driven SOZ identification algorithm to identify the SOZ from the causal connectivity inferred using the model-based and data-driven DI estimators, respectively. The data-driven SOZ identification outperforms the model-based SOZ identification algorithm when benchmarked against visual analysis by a neurologist, the current clinical gold standard. The causal connectivity analysis presented here is a first step towards developing novel non-surgical treatments for epilepsy.
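As a rough illustration of data-driven DI estimation, the sketch below computes a first-order plug-in estimate of I(X_{t-1}; Y_t | Y_{t-1}) for binary sequences. In this memory-one setting the quantity coincides with lag-one transfer entropy; the paper's estimators are considerably more general, so treat this purely as a toy.

```python
import numpy as np

def plug_in_cmi(x, y):
    """First-order plug-in estimate of I(X_{t-1}; Y_t | Y_{t-1}), in bits,
    for binary 0/1 series; a crude stand-in for a directed-information rate."""
    x, y = np.asarray(x), np.asarray(y)
    xp, yp, yc = x[:-1], y[:-1], y[1:]       # past of X, past of Y, current Y
    n = len(yc)
    p = np.zeros((2, 2, 2))
    for a, b, c in zip(xp, yp, yc):
        p[a, b, c] += 1.0 / n                # empirical joint distribution
    cmi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                if p[a, b, c] > 0:
                    p_ab = p[a, b, :].sum()
                    p_b = p[:, b, :].sum()
                    p_bc = p[:, b, c].sum()
                    cmi += p[a, b, c] * np.log2(p[a, b, c] * p_b / (p_ab * p_bc))
    return cmi
```

In the study, pairwise electrode-level DI values feed a separate SOZ identification step; that logic is not sketched here.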
Deep learning and texture-based semantic label fusion for brain tumor segmentation
NASA Astrophysics Data System (ADS)
Vidyaratne, L.; Alam, M.; Shboul, Z.; Iftekharuddin, K. M.
2018-02-01
Brain tumor segmentation is a fundamental step in surgical treatment and therapy. Many hand-crafted and learning-based methods have been proposed for automatic brain tumor segmentation from MRI. Studies have shown that these approaches have their inherent advantages and limitations. This work proposes a semantic label fusion algorithm that combines two representative state-of-the-art segmentation approaches, a texture-based hand-crafted method and a deep learning method, to obtain robust tumor segmentation. We evaluate the proposed method using the publicly available BRATS 2017 brain tumor segmentation challenge dataset. The results show that the proposed method offers improved segmentation by alleviating the inherent weaknesses of each approach: the extensive false positives of the texture-based method and the false tumor tissue classification problem of the deep learning method. Furthermore, we investigate the effect of patient gender on segmentation performance using a subset of the validation dataset. The substantial improvement in brain tumor segmentation performance achieved in this work recently enabled our group to secure first place in the overall patient survival prediction task at the BRATS 2017 challenge.
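The fusion idea can be made concrete with a simple rule. The sketch below is a hypothetical fusion scheme written for this summary (the paper's actual rule is not reproduced here): confident CNN probabilities decide outright, while in the uncertain band the two methods must agree, which is one way to suppress texture false positives and CNN misclassifications.

```python
import numpy as np

def fuse_labels(texture_mask, cnn_prob, prob_low=0.2, prob_high=0.8):
    """Hypothetical label fusion of a binary texture-based mask with CNN
    tumor probabilities; thresholds are illustrative, not from the paper."""
    confident_fg = cnn_prob >= prob_high          # CNN sure it is tumor
    confident_bg = cnn_prob <= prob_low           # CNN sure it is background
    uncertain = ~(confident_fg | confident_bg)    # band where methods must agree
    fused = np.where(confident_fg, 1, 0)
    fused = np.where(uncertain & (texture_mask == 1) & (cnn_prob >= 0.5), 1, fused)
    return fused.astype(np.uint8)
```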
Brenton, Ashley; Lee, Chee; Lewis, Katrina; Sharma, Maneesh; Kantorovich, Svetlana; Smith, Gregory A; Meshkin, Brian
2018-01-01
The purpose of this study was to determine the clinical utility of an algorithm-based decision tool designed to assess risk associated with opioid use. Specifically, we sought to assess how physicians were using the profile in patient care and how its use affected patient outcomes. A prospective, longitudinal study was conducted to assess the utility of precision medicine testing in 5,397 patients across 100 clinics in the USA. Using a patent-protected, validated algorithm combining specific genetic risk factors with phenotypic traits, patients were categorized as low, moderate, or high risk for opioid abuse. Physicians who ordered precision medicine testing were asked to complete patient evaluations and document their actions, decisions, and perceptions regarding the utility of the precision medicine tests. The patient outcomes associated with each treatment action were carefully documented. Physicians used the profile to guide treatment decisions for over half of the patients. Of those, 24.5% of the guided treatment decisions were opioid related, including changing the opioid prescribed, starting an opioid, or titrating a patient off the opioid. Treatment guidance was strongly influenced by profile-predicted opioid use disorder (OUD) risk. Most importantly, patients whose physicians used the profile to guide opioid-related treatment decisions had improved clinical outcomes, including better pain management through medication adjustments, with an average pain decrease of 3.4 points on a scale of 1-10. The clinical utility of the profile is twofold: it provides clinically actionable recommendations that can be used to 1) prevent OUD by limiting initial opioid prescriptions and 2) reduce pain in patients at low risk of developing OUD.
Comparative analysis of multiple-casualty incident triage algorithms.
Garner, A; Lee, A; Harrison, K; Schultz, C H
2001-11-01
We sought to retrospectively measure the accuracy of multiple-casualty incident (MCI) triage algorithms and their component physiologic variables in predicting adult patients with critical injury. We performed a retrospective review of 1,144 consecutive adult patients transported by ambulance and admitted to 2 trauma centers. Association between first-recorded out-of-hospital physiologic variables and a resource-based definition of severe injury appropriate to the MCI context was determined. The association between severe injury and Triage Sieve, Simple Triage and Rapid Treatment, modified Simple Triage and Rapid Treatment, and CareFlight Triage was determined in the patient population. Of the physiologic variables, the Motor Component of the Glasgow Coma Scale had the strongest association with severe injury, followed by systolic blood pressure. The differences between CareFlight Triage, Simple Triage and Rapid Treatment, and modified Simple Triage and Rapid Treatment were not dramatic, with sensitivities of 82% (95% confidence interval [CI] 75% to 88%), 85% (95% CI 78% to 90%), and 84% (95% CI 76% to 89%), respectively, and specificities of 96% (95% CI 94% to 97%), 86% (95% CI 84% to 88%), and 91% (95% CI 89% to 93%), respectively. Both forms of Triage Sieve were significantly poorer predictors of severe injury. Of the physiologic variables used in the triage algorithms, the Motor Component of the Glasgow Coma Scale and systolic blood pressure had the strongest association with severe injury. CareFlight Triage, Simple Triage and Rapid Treatment, and modified Simple Triage and Rapid Treatment had similar sensitivities in predicting critical injury in designated trauma patients, but CareFlight Triage had better specificity. Because patients in a true mass casualty situation may not be completely comparable with designated trauma patients transported to emergency departments in routine circumstances, the best triage instrument in this study may not be the best in an actual MCI. These findings must be validated prospectively before their accuracy can be confirmed.
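The algorithms compared here are short rule cascades over a handful of physiologic variables. For orientation, a minimal sketch of START-style logic follows, using commonly published cut-offs (respiratory rate > 30, absent radial pulse, failure to obey commands); it is illustrative, not the exact instruments evaluated in the study.

```python
def start_triage(walking, breathing_after_airway, resp_rate, radial_pulse,
                 obeys_commands):
    """Simplified START triage: returns a standard four-colour category."""
    if walking:
        return "MINOR (green)"
    if not breathing_after_airway:       # no breathing after airway repositioning
        return "DECEASED (black)"
    if resp_rate > 30:
        return "IMMEDIATE (red)"
    if not radial_pulse:                 # perfusion check (or capillary refill > 2 s)
        return "IMMEDIATE (red)"
    if not obeys_commands:               # mental status, cf. GCS motor component
        return "IMMEDIATE (red)"
    return "DELAYED (yellow)"

# Example: non-ambulatory, breathing, RR 24, pulse present, confused patient
category = start_triage(False, True, 24, True, False)   # -> IMMEDIATE (red)
```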
Swarm intelligence applied to the risk evaluation for congenital heart surgery.
Zapata-Impata, Brayan S; Ruiz-Fernandez, Daniel; Monsalve-Torra, Ana
2015-01-01
Particle swarm optimization is an optimization technique based on the positions of several particles created to find the best solution to a problem. In this work we analyze the accuracy of a modification of this algorithm for classifying the levels of risk of surgery used to correct congenital heart malformations in children.
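For readers unfamiliar with the base technique, a minimal canonical PSO loop is sketched below; the inertia and acceleration coefficients are common textbook values, and the sphere function stands in for the risk-classification objective, which the abstract does not specify.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer minimizing f over [-1, 1]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_particles, dim))        # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()              # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy use: the sphere function stands in for a classification error rate
best_x, best_val = pso(lambda z: float(np.sum(z ** 2)), dim=5)
```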
Zhou, Dong; Zhang, Hui; Ye, Peiqing
2016-01-01
The lateral penumbra of a multileaf collimator plays an important role in radiotherapy treatment planning. Growing evidence has revealed that, for a single-focused multileaf collimator, the lateral penumbra width is leaf-position dependent and largely attributable to the leaf end shape. In our study, an analytical method for modelling the leaf-end-induced lateral penumbra is formulated using Tangent Secant Theory. Compared with Monte Carlo simulation and a ray tracing algorithm, our model serves well the purpose of cost-efficient penumbra evaluation. Leaf ends represented in parametric form as circular arcs, elliptical arcs, Bézier curves, and B-splines are implemented. With a biobjective function of penumbra mean and variance introduced, a genetic algorithm is used to approximate the Pareto frontier. Results show that for the circular-arc leaf end the objective function is convex, and convergence to the optimal solution is guaranteed using a gradient-based iterative method. It is found that the optimal leaf end in the shape of a Bézier curve achieves minimal standard deviation, while with a B-spline the minimum penumbra mean is obtained. For treatment modalities in clinical application, the optimized leaf ends are in close agreement with actual shapes. Taken together, the method we propose can provide insight into the leaf end shape design of multileaf collimators.
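A Bézier leaf-end profile of the kind optimized here is evaluated by repeated linear interpolation (de Casteljau's algorithm). The sketch below shows that evaluation; the control points are illustrative, not the optimized shapes reported in the study.

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bézier curve at parameter t in [0, 1] via de Casteljau."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]   # repeated linear interpolation
    return pts[0]

# Illustrative leaf-end profile: four 2D control points (units of mm)
ctrl = [(0.0, 0.0), (2.0, 1.0), (4.0, 1.5), (6.0, 1.5)]
curve = np.array([de_casteljau(ctrl, t) for t in np.linspace(0, 1, 50)])
```

A genetic algorithm over such control points, scored by penumbra mean and variance, is one natural way to realize the biobjective search the abstract describes.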
Johnsen, Kay-Martin; Goll, Rasmus; Hansen, Vegard; Olsen, Trine; Rismo, Renathe; Heitmann, Richard; Gundersen, Mona D; Kvamme, Jan M; Paulssen, Eyvind J; Kileng, Hege; Johnsen, Knut; Florholmen, Jon
2017-01-01
Anti-tumour necrosis factor (TNF) agents play a pivotal role in the treatment of moderate to severe ulcerative colitis (UC), and yet no international consensus exists on when to discontinue therapy. The aim of this study was to assess the long-term performance of a treatment algorithm of repeated intensified induction therapy with infliximab (IFX) to remission, followed by discontinuation, in patients with UC. Patients with moderate to severe UC were enrolled in an open prospective study. The following algorithm was implemented: (a) intensified induction treatment to remission (Ulcerative Colitis Disease Activity Index score 0-2); (b) discontinuation of IFX; and (c) reinduction treatment upon relapse. Mucosal gene expression of TNF was measured with qPCR. A total of 116 patients were included. The median observation time was 47 months by intention to treat and 51 months per protocol. Remission rates for the first three inductions were 95, 93 and 91% per protocol and 83, 56 and 59% by intention to treat. The median time in remission was 40 months per protocol and 34 months by intention to treat. Long-term remission without further anti-TNF treatment during the observation period was obtained in 41% of patients, with a median observation time of 48 months (range: 18-129 months). The median time to relapse was 33 months with, and 11 months without, normalization of mucosal TNF. The 5-year success rate for maintaining the effect of IFX in the algorithm was 66%. The treatment algorithm is highly effective for achieving long-term clinical remission in UC. Normalization of mucosal TNF gene expression predicts long-term remission upon discontinuation of IFX.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moignier, C; Huet, C; Barraux, V
Purpose: Advanced stereotactic radiotherapy (SRT) treatments require accurate dose calculation for treatment planning, especially for treatment sites involving heterogeneous patient anatomy. The purpose of this study was to evaluate the accuracy of the dose calculation algorithms, Raytracing and Monte Carlo (MC), implemented in the MultiPlan treatment planning system (TPS) in the presence of heterogeneities. Methods: First, the LINAC of a CyberKnife radiotherapy facility was modeled with the PENELOPE MC code. A protocol for the measurement of dose distributions with EBT3 films was established and validated through comparison between experimental dose distributions and dose distributions calculated with the MultiPlan Raytracing and MC algorithms, as well as with the PENELOPE MC model, for treatments planned with the homogeneous Easycube phantom. Finally, bone and lung inserts were used to set up a heterogeneous Easycube phantom. Treatment plans with the 10, 7.5 or 5 mm field sizes were generated in the MultiPlan TPS with different tumor localizations (in the lung and at the lung/bone/soft tissue interface). Experimental dose distributions were compared to the PENELOPE MC and MultiPlan calculations using the gamma index method. Results: Regarding the experiment in the homogeneous phantom, 100% of the points passed the 3%/3mm tolerance criteria. These criteria include the global error of the method (CT-scan resolution, EBT3 dosimetry, LINAC positioning …) and were used afterwards to estimate the accuracy of the MultiPlan algorithms in heterogeneous media. Comparison of the dose distributions obtained in the heterogeneous phantom is in progress. Conclusion: This work has led to the development of numerical and experimental dosimetric tools for small-beam dosimetry. The Raytracing and MC algorithms implemented in the MultiPlan TPS were evaluated in heterogeneous media.
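The gamma index used for these comparisons combines a dose-difference and a distance-to-agreement criterion. A minimal 1D global-gamma sketch follows (3%/3 mm, toy Gaussian profiles); real film-versus-TPS comparisons are 2D or 3D, so this shows only the core of the computation.

```python
import numpy as np

def gamma_index_1d(x_eval, d_eval, x_ref, d_ref, dta=3.0, dd=0.03):
    """1D global gamma index (e.g. 3%/3 mm): gamma <= 1 means a passing point."""
    d_max = d_ref.max()                       # global normalization dose
    gammas = np.empty(len(x_eval))
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        dist2 = ((x_ref - xe) / dta) ** 2             # spatial term
        dose2 = ((d_ref - de) / (dd * d_max)) ** 2    # dose-difference term
        gammas[i] = np.sqrt((dist2 + dose2).min())    # best match over reference
    return gammas

# Pass rate for 3%/3 mm criteria over a toy pair of profiles
x = np.linspace(-20, 20, 201)                 # positions in mm
ref = np.exp(-x ** 2 / 120.0)                 # reference (film) profile
ev = np.exp(-(x - 0.5) ** 2 / 118.0)          # evaluated (TPS) profile
g = gamma_index_1d(x, ev, x, ref)
pass_rate = 100.0 * np.mean(g <= 1.0)
```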
Changes and challenges: managing ADHD in a fast-paced world.
Manos, Michael J; Tom-Revzon, Catherine; Bukstein, Oscar G; Crismon, M Lynn
2007-11-01
Attention-deficit/hyperactivity disorder (ADHD) impairs the lives of both children and adults. Undiagnosed and untreated, ADHD may have serious lifelong consequences. Research has identified diagnostic clues, neurotransmitter pathways, and psychiatric comorbidities related to ADHD, as well as effective pharmacologic, behavioral, and psychosocial interventions. Stimulant agents have been the foundation of ADHD therapy for more than 50 years. Availability of new extended-release (XR or ER) and longer-acting (LA) formulations and novel agents allows for wider and more individualized treatment choices. Side effects of stimulants are generally mild, short lived, and responsive to adjustments in dosage or timing. Outcomes in ADHD treatment can be improved with the use of clear treatment guidelines and tools to aid clinicians in implementing them efficiently and effectively. The Texas Children's Medication Algorithm Project (CMAP) provides a system of algorithm-driven treatment decisions that is evidence based and easy to implement. To (1) review the psychological components of attention, the neurotransmitter pathways associated with ADHD, and the array of therapeutic options for ADHD, with an emphasis on the most recent introductions to the therapeutic armamentarium; (2) discuss the rare psychiatric and cardiovascular side effects associated with stimulants; (3) review abuse liability, comorbidities, and suggested approaches to these issues; and (4) review the development and use of CMAP and offer resources for its implementation in clinical practice. The pathophysiology of ADHD is linked to dysfunction of fronto-subcortical networks and dysregulation of dopaminergic, noradrenergic, and nicotinic neurotransmitter systems. An additive effect of multiple genes as well as environmental influences contributes to the clinical picture. Treatment with stimulants and nonstimulants has proven effective in different subgroups, with the effectiveness of specific agents most likely related to the primary neurotransmitter involved. Availability of XR, ER, LA, and transdermal stimulant formulations, as well as alternative nonstimulant agents, offers new options for the pharmacotherapy of ADHD. Major concerns associated with abuse liability of stimulants have been allayed by the availability of ER formulations, which have reduced reinforcing effects associated with short-acting preparations. Medication outcomes in ADHD can be enhanced by the use of evidence-based algorithms such as CMAP. Keys to success are adequate initial assessment and diagnosis, the use of sustained-release products, sufficient dose titration, and the use of clinical rating scales with feedback from caregivers and teachers. Optimal treatment outcomes can be achieved by appropriate pharmacotherapy combined with psychosocial interventions.
A clinical study of lung cancer dose calculation accuracy with Monte Carlo simulation.
Zhao, Yanqun; Qi, Guohai; Yin, Gang; Wang, Xianliang; Wang, Pei; Li, Jian; Xiao, Mingyong; Li, Jie; Kang, Shengwei; Liao, Xiongfei
2014-12-16
The accuracy of dose calculation is crucial to the quality of treatment planning and, consequently, to the dose delivered to patients undergoing radiation therapy. Current general calculation algorithms such as Pencil Beam Convolution (PBC) and Collapsed Cone Convolution (CCC) have shortcomings in the presence of severe inhomogeneities, particularly in regions where charged particle equilibrium does not hold. The aim of this study was to evaluate the accuracy of the PBC and CCC algorithms in lung cancer radiotherapy using Monte Carlo (MC) technology. Four treatment plans were designed using the Oncentra Masterplan TPS for each patient: two intensity-modulated radiation therapy (IMRT) plans developed using the PBC and CCC algorithms, and two three-dimensional conformal therapy (3DCRT) plans developed using the PBC and CCC algorithms. The DICOM-RT files of the treatment plans were exported to the Monte Carlo system for recalculation. The dose distributions of the GTV, PTV and ipsilateral lung calculated by the TPS and MC were compared. For both 3DCRT and IMRT plans, the mean dose differences for the GTV between CCC and MC increased as the GTV volume decreased, and the differences were larger for IMRT than for 3DCRT. The CCC algorithm overestimated the GTV mean dose by approximately 3% for IMRT. For 3DCRT plans, when the volume of the GTV was greater than 100 cm3, the mean doses calculated by CCC and MC were nearly identical, whereas PBC showed large deviations from MC. For the dose to the ipsilateral lung, the CCC algorithm overestimated the dose to the entire lung, and the PBC algorithm overestimated V20 but underestimated V5; the difference in V10 was not statistically significant. PBC substantially overestimates the dose to the tumour, whereas CCC agrees closely with the MC simulation. It is therefore recommended that treatment plans for lung cancer be developed using an advanced dose calculation algorithm rather than PBC. MC can accurately calculate the dose distribution in lung cancer and provides a notably effective tool for benchmarking the performance of other dose calculation algorithms within patients.
Block-Based Connected-Component Labeling Algorithm Using Binary Decision Trees
Chang, Wan-Yu; Chiu, Chung-Cheng; Yang, Jia-Horng
2015-01-01
In this paper, we propose a fast labeling algorithm based on block-based concepts. Because the number of memory accesses directly affects the time consumption of labeling algorithms, the aim of the proposed algorithm is to minimize neighborhood operations. Our algorithm utilizes a block-based view and correlates a raster scan to select the necessary pixels generated by a block-based scan mask. We analyze the advantages of a sequential raster scan for the block-based scan mask, and integrate the block-connected relationships using two different procedures with binary decision trees to reduce unnecessary memory access. This greatly simplifies the pixel locations of the block-based scan mask. Furthermore, our algorithm significantly reduces the number of leaf nodes and depth levels required in the binary decision tree. We analyze the labeling performance of the proposed algorithm alongside that of other labeling algorithms using high-resolution images and foreground images. The experimental results from synthetic and real image datasets demonstrate that the proposed algorithm is faster than other methods.
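For contrast with the block-based method, the baseline it improves upon is the classic two-pass, pixel-based labeling with union-find, sketched below; the paper's contribution is precisely to reduce the per-pixel neighborhood accesses this baseline performs.

```python
import numpy as np

def label_components(img):
    """Two-pass 4-connected labeling with union-find (pixel-based baseline).
    Returns an integer label image; labels need not be consecutive."""
    parent = [0]                              # union-find forest; index 0 unused

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path halving
            i = parent[i]
        return i

    h, w = img.shape
    labels = np.zeros((h, w), dtype=int)
    for y in range(h):                        # first pass: provisional labels
        for x in range(w):
            if not img[y, x]:
                continue
            up = labels[y - 1, x] if y > 0 else 0
            left = labels[y, x - 1] if x > 0 else 0
            if up == 0 and left == 0:
                parent.append(len(parent))    # new provisional label
                labels[y, x] = len(parent) - 1
            elif up and left:
                ru, rl = find(up), find(left)
                parent[max(ru, rl)] = min(ru, rl)   # merge equivalent labels
                labels[y, x] = min(ru, rl)
            else:
                labels[y, x] = up or left
    for y in range(h):                        # second pass: flatten equivalences
        for x in range(w):
            if labels[y, x]:
                labels[y, x] = find(labels[y, x])
    return labels
```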
Linear feasibility algorithms for treatment planning in interstitial photodynamic therapy
NASA Astrophysics Data System (ADS)
Rendon, A.; Beck, J. C.; Lilge, Lothar
2008-02-01
Interstitial photodynamic therapy (IPDT) has been under intense investigation in recent years, with multiple clinical trials underway. This effort has demanded the development of optimization strategies that determine the best locations and output powers for light sources (cylindrical or point diffusers) to achieve optimal light delivery. Furthermore, we have recently introduced cylindrical diffusers with customizable emission profiles, placing additional requirements on the optimization algorithms, particularly in terms of the stability of the inverse problem. Here, we present a general class of linear feasibility algorithms and their properties. Moreover, we compare two particular instances of these algorithms that have been used in the context of IPDT: the Cimmino algorithm and a weighted gradient descent (WGD) algorithm. The algorithms were compared in terms of their convergence properties, the cost function they minimize in the infeasible case, their ability to regularize the inverse problem, and the resulting optimal light dose distributions. Our results show that the WGD algorithm overall performs slightly better than the Cimmino algorithm and that it converges to a minimizer of a clinically relevant cost function in the infeasible case. Interestingly, however, treatment plans resulting from either algorithm were very similar in terms of the resulting fluence maps and dose-volume histograms, once the diffuser powers were adjusted to achieve equal prostate coverage.
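The Cimmino iteration itself is simple to state: project the current solution onto every violated dose constraint and take a weighted average of the projections. A minimal sketch for constraints of the form Ax <= b follows, with x standing in for diffuser powers and A for a fluence kernel; the weights, relaxation parameter, and dimensions are illustrative.

```python
import numpy as np

def cimmino(A, b, x0=None, iters=500, lam=1.0):
    """Cimmino's algorithm for the linear feasibility problem A x <= b.

    Each step averages projections onto the violated half-spaces; in the
    infeasible case the iterates settle on a weighted least-squares
    compromise, which is part of the method's appeal for IPDT planning.
    """
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float)
    w = np.full(m, 1.0 / m)                     # equal weights on constraints
    row_norm2 = np.einsum('ij,ij->i', A, A)     # squared row norms ||a_i||^2
    for _ in range(iters):
        viol = np.maximum(A @ x - b, 0.0)       # positive part: violated rows only
        # Weighted sum of projections onto each violated half-space
        x = x - lam * ((w * viol / row_norm2) @ A)
    return x
```

With lam between 0 and 2 the iteration is the standard relaxed Cimmino scheme; lam = 1 is the plain averaged projection.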
Randomized algorithms for high quality treatment planning in volumetric modulated arc therapy
NASA Astrophysics Data System (ADS)
Yang, Yu; Dong, Bin; Wen, Zaiwen
2017-02-01
In recent years, volumetric modulated arc therapy (VMAT) has become an increasingly important radiation technique widely used in clinical cancer treatment. One of the key problems in VMAT is treatment plan optimization, which is complicated by the constraints imposed by the equipment involved. In this paper, we consider a model with four major constraints: a bound on the beam intensity, an upper bound on the rate of change of the beam intensity, the moving speed of the leaves of the multi-leaf collimator (MLC), and its directional convexity. We solve the model with a two-stage algorithm, performing minimization with respect to the aperture shapes and the beam intensities alternately. Specifically, the aperture shapes are obtained by a greedy algorithm whose performance is enhanced by random sampling of the leaf pairs with a decremental rate. The beam intensity is optimized using a gradient projection method with non-monotone line search. We further improve the proposed algorithm with incremental random importance sampling of the voxels to reduce the computational cost of the energy functional. Numerical simulations on two clinical cancer data sets demonstrate that our method is highly competitive with state-of-the-art algorithms in terms of both computational time and quality of treatment planning.
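The intensity subproblem described above can be sketched generically: gradient projection onto box constraints with a non-monotone (last-M maximum) Armijo line search. The toy below uses a least-squares dose-matching objective; the paper's full VMAT model and parameters are not reproduced.

```python
import numpy as np

def projected_gradient(grad_f, f, x0, lower, upper, iters=100, M=5):
    """Gradient projection with a non-monotone Armijo line search: sufficient
    decrease is measured against the maximum of the last M objective values."""
    proj = lambda z: np.clip(z, lower, upper)   # projection onto the box
    x = proj(x0)
    history = [f(x)]
    for _ in range(iters):
        g = grad_f(x)
        t = 1.0
        ref = max(history[-M:])                 # non-monotone reference value
        while True:
            x_new = proj(x - t * g)
            # Armijo-type condition against the recent maximum, not f(x)
            if f(x_new) <= ref - 1e-4 * np.dot(g, x - x_new) or t < 1e-12:
                break
            t *= 0.5
        x = x_new
        history.append(f(x))
    return x

# Toy use: least-squares dose matching with intensities bounded in [0, 10]
A = np.random.default_rng(1).random((50, 20))
d = np.ones(50)
f = lambda x: 0.5 * np.linalg.norm(A @ x - d) ** 2
grad_f = lambda x: A.T @ (A @ x - d)
x_opt = projected_gradient(grad_f, f, np.zeros(20), 0.0, 10.0)
```

Allowing occasional increases against the recent maximum is what lets the non-monotone search take larger steps than a strictly monotone Armijo rule.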
van Manen, J G; Kamphuis, J H; Goossensen, A; Timman, R; Busschbach, J J V; Verheul, R
2012-08-01
Using the concept map method, this study aimed to summarize and describe patient characteristics pertinent to treatment selection for patients with personality disorders (PDs). Initial patient characteristics were derived from the research literature and a survey among Dutch expert clinicians. Concept mapping is a formalized conceptualization procedure that describes the underlying cognitive structures people use in complex tasks, such as treatment allocation. Based on expert opinions of 29 Dutch clinicians, a concept map was generated that yielded eight domains of patient characteristics, i.e., Severity of symptoms, Severity of personality pathology, Ego-adaptive capacities, Motivation and working alliance, Social context, Social demographic characteristics, Trauma, and Treatment history and medical condition. These domains can be ordered along two bipolar axes, running from internal to external concepts and from vulnerability to strength concepts, respectively. Our findings may serve as input for the delineation of algorithms for patient-treatment matching research in PD.
The Texas medication algorithm project: clinical results for schizophrenia.
Miller, Alexander L; Crismon, M Lynn; Rush, A John; Chiles, John; Kashner, T Michael; Toprac, Marcia; Carmody, Thomas; Biggs, Melanie; Shores-Wilson, Kathy; Chiles, Judith; Witte, Brad; Bow-Thomas, Christine; Velligan, Dawn I; Trivedi, Madhukar; Suppes, Trisha; Shon, Steven
2004-01-01
In the Texas Medication Algorithm Project (TMAP), patients were given algorithm-guided treatment (ALGO) or treatment as usual (TAU). The ALGO intervention included a clinical coordinator to assist the physicians and administer a patient and family education program. The primary comparison in the schizophrenia module of TMAP was between patients seen in clinics in which ALGO was used (n = 165) and patients seen in clinics in which no algorithms were used (n = 144). A third group of patients, seen in clinics using an algorithm for bipolar or major depressive disorder but not for schizophrenia, was also studied (n = 156). The ALGO group had modestly greater improvement in symptoms (Brief Psychiatric Rating Scale) during the first quarter of treatment. The TAU group caught up by the end of 12 months. Cognitive functions were more improved in ALGO than in TAU at 3 months, and this difference was greater at 9 months (the final cognitive assessment). In secondary comparisons of ALGO with the second TAU group, the greater improvement in cognitive functioning was again noted, but the initial symptom difference was not significant.