Sample records for acenocoumarol dosing algorithm

  1. A novel acenocoumarol pharmacogenomic dosing algorithm for the Greek population of EU-PACT trial.

    PubMed

    Ragia, Georgia; Kolovou, Vana; Kolovou, Genovefa; Konstantinides, Stavros; Maltezos, Efstratios; Tavridou, Anna; Tziakas, Dimitrios; Maitland-van der Zee, Anke H; Manolopoulos, Vangelis G

    2017-01-01

    To generate and validate a pharmacogenomic-guided (PG) dosing algorithm for acenocoumarol in the Greek population, and to compare its performance with other PG algorithms developed for the Greek population. A total of 140 Greek patients, participants in the EU-PACT trial for acenocoumarol (a randomized clinical trial that prospectively compared the effect of a PG dosing algorithm with that of a clinical dosing algorithm on the percentage of time within the INR therapeutic range), who reached a stable acenocoumarol dose were included in the study. CYP2C9 and VKORC1 genotypes, age and weight affected acenocoumarol dose and predicted 53.9% of its variability. The EU-PACT PG algorithm overestimated acenocoumarol dose across all CYP2C9/VKORC1 functional phenotype bins (predicted vs stable dose: normal responders 2.31 vs 2.00 mg/day, p = 0.028; sensitive responders 1.72 vs 1.50 mg/day, p = 0.003; highly sensitive responders 1.39 vs 1.00 mg/day, p = 0.029). The PG algorithm previously developed for the Greek population overestimated the dose in normal responders (2.51 vs 2.00 mg/day, p < 0.001). An ethnicity-specific dosing algorithm is suggested for better prediction of acenocoumarol dosage requirements in patients of Greek origin.
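
    Dosing algorithms of this kind are typically multiple linear regressions of the stable dose on genotype and clinical covariates. The sketch below is illustrative only: all data and coefficients are simulated, not the study's, and the genotype coding is a hypothetical 0-2 count of reduced-function alleles.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 140  # same cohort size as the study; the data itself is simulated

# Hypothetical predictors: age (years), weight (kg), and genotype scores
# counting reduced-function CYP2C9 / VKORC1 alleles (0, 1, or 2).
age = rng.uniform(40, 85, n)
weight = rng.uniform(55, 100, n)
cyp2c9 = rng.integers(0, 3, n)
vkorc1 = rng.integers(0, 3, n)

# Simulated stable dose (mg/day): older age and variant alleles lower it,
# higher weight raises it; the noise term keeps R^2 below 1.
dose = (3.5 - 0.02 * age + 0.01 * weight
        - 0.4 * cyp2c9 - 0.5 * vkorc1
        + rng.normal(0, 0.4, n))

X = np.column_stack([age, weight, cyp2c9, vkorc1])
model = LinearRegression().fit(X, dose)
print(f"R^2 = {model.score(X, dose):.3f}")  # fraction of dose variability explained
```

    On real cohort data, the analogous R^2 is the "53.9% of its variability" figure reported above.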

  2. Evaluation of genotype-guided acenocoumarol dosing algorithms in Russian patients.

    PubMed

    Sychev, Dmitriy Alexeyevich; Rozhkov, Aleksandr Vladimirovich; Ananichuk, Anna Viktorovna; Kazakov, Ruslan Evgenyevich

    2017-05-24

    Acenocoumarol dose is normally determined via a step-by-step adjustment process based on International Normalized Ratio (INR) measurements. During this time, the risk of adverse reactions is especially high. Several genotype-based acenocoumarol dosing algorithms have been created to predict ideal doses at the start of anticoagulant therapy. Nine dosing algorithms were selected through a literature search. These were evaluated using a cohort of 63 patients with atrial fibrillation receiving acenocoumarol therapy. None of the existing algorithms could predict the ideal acenocoumarol dose in 50% of Russian patients. The Wolkanin-Bartnik algorithm, based on a European population, was the best-performing one, with the highest correlation value (r = 0.397) and a mean absolute error (MAE) of 0.82 (±0.61). EU-PACT also managed to give an estimate within the ideal range in 43% of the cases. The two least accurate results were yielded by the Indian population-based algorithms. Among patients receiving amiodarone, the algorithms by Schie and Tong proved to be the most effective, with MAEs of 0.48±0.42 mg/day and 0.56±0.31 mg/day, respectively. Patient ethnicity and amiodarone intake are factors that must be considered when building future algorithms. Further research is required to find an optimal formula for acenocoumarol maintenance doses in Russian patients.
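
    The evaluation metrics used in this comparison (correlation between predicted and observed doses, MAE, and the share of predictions within an ideal range) are simple to compute. A minimal sketch with invented doses, assuming a ±20% window as the "ideal" criterion (the paper's exact definition may differ):

```python
import numpy as np

def evaluate_algorithm(predicted, observed):
    """Pearson r, mean absolute error (MAE), and the share of patients
    whose predicted dose falls within 20% of the observed dose."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    r = np.corrcoef(predicted, observed)[0, 1]
    mae = np.mean(np.abs(predicted - observed))
    within = np.mean(np.abs(predicted - observed) <= 0.2 * observed)
    return r, mae, within

# Hypothetical stable doses (mg/day) for five patients, illustration only.
obs = [2.0, 1.5, 3.0, 2.5, 4.0]
pred = [2.2, 1.4, 2.6, 2.5, 3.1]
r, mae, within = evaluate_algorithm(pred, obs)
print(f"r = {r:.3f}, MAE = {mae:.2f} mg/day, within +/-20%: {within:.0%}")
```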

  3. Efficiency and effectiveness of the use of an acenocoumarol pharmacogenetic dosing algorithm versus usual care in patients with venous thromboembolic disease initiating oral anticoagulation: study protocol for a randomized controlled trial

    PubMed Central

    2012-01-01

    Background Hemorrhagic events are frequent in patients on treatment with antivitamin-K oral anticoagulants due to their narrow therapeutic margin. Studies performed with acenocoumarol have shown the relationship between demographic, clinical and genotypic variants and the response to these drugs. Once the influence of these genetic and clinical factors on the dose of acenocoumarol needed to maintain a stable international normalized ratio (INR) has been demonstrated, new strategies need to be developed to predict the appropriate doses of this drug. Several pharmacogenetic algorithms have been developed for warfarin, but only three have been developed for acenocoumarol. After the development of a pharmacogenetic algorithm, the obvious next step is to demonstrate its effectiveness and utility by means of a randomized controlled trial. The aim of this study is to evaluate the effectiveness and efficiency of an acenocoumarol dosing algorithm developed by our group which includes demographic, clinical and pharmacogenetic variables (VKORC1, CYP2C9, CYP4F2 and ApoE) in patients with venous thromboembolism (VTE). Methods and design This is a multicenter, single blind, randomized controlled clinical trial. The protocol has been approved by La Paz University Hospital Research Ethics Committee and by the Spanish Drug Agency. Two hundred and forty patients with VTE in whom oral anticoagulant therapy is indicated will be included. Randomization (case/control 1:1) will be stratified by center. Acenocoumarol dose in the control group will be scheduled and adjusted following common clinical practice; in the experimental arm, dosing will follow an individualized algorithm developed and validated by our group. Patients will be followed for three months. 
The main endpoints are: 1) the percentage of patients with an INR within the therapeutic range on day seven after initiation of oral anticoagulant therapy; 2) the time from the start of oral anticoagulant treatment to achievement of a stable INR within the therapeutic range; and 3) the number of INR determinations within the therapeutic range in the first six weeks of treatment. Discussion To date, there are no clinical trials comparing a pharmacogenetic acenocoumarol dosing algorithm with routine clinical practice in VTE. Implementation of this pharmacogenetic algorithm in routine clinical practice could reduce side effects and improve patient safety. Trial registration EudraCT Identifier: 2009-016643-18. PMID:23237631

  4. Pharmacogenetic-guided dosing of coumarin anticoagulants: algorithms for warfarin, acenocoumarol and phenprocoumon

    PubMed Central

    Verhoef, Talitha I; Redekop, William K; Daly, Ann K; van Schie, Rianne M F; de Boer, Anthonius; Maitland-van der Zee, Anke-Hilse

    2014-01-01

    Coumarin derivatives, such as warfarin, acenocoumarol and phenprocoumon, are frequently prescribed oral anticoagulants used to treat and prevent thromboembolism. Because there is large inter-individual and intra-individual variability in dose–response and a small therapeutic window, treatment with coumarin derivatives is challenging. Certain polymorphisms in CYP2C9 and VKORC1 are associated with lower dose requirements and a higher risk of bleeding. In this review we describe the use of the different coumarin derivatives, the pharmacokinetic characteristics of these drugs and the differences among the coumarins. We also describe the current clinical challenges and the role of pharmacogenetic factors. These genetic factors are used to develop dosing algorithms and can be used to predict the right coumarin dose. The effectiveness of this new dosing strategy is currently being investigated in clinical trials. PMID:23919835

  5. Therapeutic Effect of Low Doses of Acenocoumarol in the Course of Ischemia/Reperfusion-Induced Acute Pancreatitis in Rats.

    PubMed

    Warzecha, Zygmunt; Sendur, Paweł; Ceranowicz, Piotr; Cieszkowski, Jakub; Dembiński, Marcin; Sendur, Ryszard; Bonior, Joanna; Jaworek, Jolanta; Ambroży, Tadeusz; Olszanecki, Rafał; Kuśnierz-Cabala, Beata; Kaczmarzyk, Tomasz; Tomaszewska, Romana; Dembiński, Artur

    2017-04-21

    Intravascular activation of coagulation is observed in acute pancreatitis and is related to the severity of this inflammation. The aim of our study was to evaluate the impact of acenocoumarol therapy on the course of acute pancreatitis induced in male rats by pancreatic ischemia followed by reperfusion. Acenocoumarol at a dose of 50, 100, or 150 µg/kg/dose was administered intragastrically once a day, starting the first dose 24 h after the initiation of pancreatic reperfusion. Histological examination showed that treatment with acenocoumarol reduces pancreatic edema, necrosis, and hemorrhages in rats with pancreatitis. Moreover, the administration of acenocoumarol decreased pancreatic inflammatory infiltration and vacuolization of pancreatic acinar cells. These findings were accompanied by a reduction in the serum activity of lipase and amylase, the concentration of interleukin-1β, and the plasma D-dimer concentration. Moreover, the administration of acenocoumarol improved pancreatic blood flow and pancreatic DNA synthesis. Acenocoumarol given at a dose of 150 µg/kg/dose was the most effective in the treatment of early-phase acute pancreatitis. At later stages, however, acenocoumarol given at the highest dose failed to exhibit any therapeutic effect, whereas lower doses of acenocoumarol were still effective in the treatment of acute pancreatitis. Treatment with acenocoumarol accelerates the recovery of ischemia/reperfusion-induced acute pancreatitis in rats.

  6. Therapeutic Effect of Low Doses of Acenocoumarol in the Course of Ischemia/Reperfusion-Induced Acute Pancreatitis in Rats

    PubMed Central

    Warzecha, Zygmunt; Sendur, Paweł; Ceranowicz, Piotr; Cieszkowski, Jakub; Dembiński, Marcin; Sendur, Ryszard; Bonior, Joanna; Jaworek, Jolanta; Ambroży, Tadeusz; Olszanecki, Rafał; Kuśnierz-Cabala, Beata; Kaczmarzyk, Tomasz; Tomaszewska, Romana; Dembiński, Artur

    2017-01-01

    Intravascular activation of coagulation is observed in acute pancreatitis and is related to the severity of this inflammation. The aim of our study was to evaluate the impact of acenocoumarol therapy on the course of acute pancreatitis induced in male rats by pancreatic ischemia followed by reperfusion. Acenocoumarol at a dose of 50, 100, or 150 µg/kg/dose was administered intragastrically once a day, starting the first dose 24 h after the initiation of pancreatic reperfusion. Results: Histological examination showed that treatment with acenocoumarol reduces pancreatic edema, necrosis, and hemorrhages in rats with pancreatitis. Moreover, the administration of acenocoumarol decreased pancreatic inflammatory infiltration and vacuolization of pancreatic acinar cells. These findings were accompanied by a reduction in the serum activity of lipase and amylase, the concentration of interleukin-1β, and the plasma D-dimer concentration. Moreover, the administration of acenocoumarol improved pancreatic blood flow and pancreatic DNA synthesis. Acenocoumarol given at a dose of 150 µg/kg/dose was the most effective in the treatment of early-phase acute pancreatitis. At later stages, however, acenocoumarol given at the highest dose failed to exhibit any therapeutic effect, whereas lower doses of acenocoumarol were still effective in the treatment of acute pancreatitis. Conclusion: Treatment with acenocoumarol accelerates the recovery of ischemia/reperfusion-induced acute pancreatitis in rats. PMID:28430136

  7. Influence of CYP2C9 polymorphism and phenytoin co-administration on acenocoumarol dose in patients with cerebral venous thrombosis.

    PubMed

    De, Tanima; Christopher, Rita; Nagaraja, Dindagur

    2014-05-01

    The study aimed at evaluating the contribution of genetic variations in the drug-metabolizing enzyme CYP2C9, and the influence of co-medication with the antiepileptic drug phenytoin, to variability in acenocoumarol response in patients with cerebral venous thrombosis (CVT). 476 acenocoumarol-treated CVT patients (153 males and 323 females) were genotyped for the CYP2C9*2 and CYP2C9*3 polymorphisms by the PCR-RFLP method. The mean acenocoumarol dose required for achieving and maintaining a stable international normalized ratio (INR) was calculated for each genotype. The effect of co-administration of phenytoin was determined. Genotype distributions of CYP2C9 were as follows: 83% CYP2C9*1/*1, 8.6% CYP2C9*1/*3, 5.9% CYP2C9*1/*2, 1.9% CYP2C9*3/*3, 0.4% CYP2C9*2/*3 and 0.2% CYP2C9*2/*2. During the initiation phase of anticoagulation, the CYP2C9*2 allele was independently associated with a low acenocoumarol dose requirement (adjusted OR 5.38; 95% CI 1.65-17.49; p=0.005). Similarly, the adjusted odds ratio for requiring a low dose during the induction phase in patients bearing the CYP2C9*3 allele was 12.79 (95% CI 4.74-34.57; p<0.0001). During the maintenance phase, the CYP2C9*2 and CYP2C9*3 alleles were associated with 19-fold (adjusted OR 19.67; 95% CI 2.46-157.19; p=0.005) and 11.9-fold (adjusted OR 11.98; 95% CI 2.61-55.08; p=0.001) odds of requiring a low dose, respectively. Clinical covariates such as age, alcohol consumption, postpartum state and oral contraceptive intake also influenced acenocoumarol dosage. Co-medication with phenytoin was associated with a lower dose requirement across genotypes during the initiation phase. However, during the maintenance phase, phenytoin-treated patients of all genotypes required higher doses of acenocoumarol. This study emphasizes that polymorphisms in the CYP2C9 gene and co-medication with phenytoin alter the anticoagulant effect of acenocoumarol.

  8. Genotype-based dosage of acenocoumarol in highly-sensitive geriatric patients.

    PubMed

    Lozano, Roberto; Franco, María-Esther; López, Luis; Moneva, Juan-José; Carrasco, Vicente; Pérez-Layo, Maria-Angeles

    2015-03-01

    Our aim was to determine the acenocoumarol dose requirement in highly sensitive geriatric patients based on minimal genotype data (VKORC1 and CYP2C9). We used a Gaussian kernel density estimation test to identify patients highly sensitive to the drug and the PHARMACHIP®-Cuma test (Progenika Biopharma, SA, Grifols, Spain) to determine the CYP2C9 and VKORC1 genotypes. All highly sensitive geriatric patients were taking ≤5.6 mg/week of acenocoumarol (AC), and 86% of these patients presented the following genotypes: CYP2C9*1/*3 or CYP2C9*1/*2 plus VKORC1 A/G, CYP2C9*3/*3, or VKORC1 A/A. The VKORC1 A and CYP2C9*2 and/or *3 allelic variants strongly influence the AC dose requirement of highly sensitive geriatric patients. These patients display an acenocoumarol dose requirement of ≤5.6 mg/week.

  9. Genetic determinants of acenocoumarol and warfarin maintenance dose requirements in Slavic population: a potential role of CYP4F2 and GGCX polymorphisms.

    PubMed

    Wypasek, Ewa; Branicka, Agnieszka; Awsiuk, Magdalena; Sadowski, Jerzy; Undas, Anetta

    2014-09-01

    VKORC1 and cytochrome CYP2C9 genetic variants contribute largely to inter-individual variations in vitamin K antagonist (VKA) dose requirements. Cytochrome P450 4F2 isoform (CYP4F2), gamma-glutamyl carboxylase (GGCX) and apolipoprotein E (APOE) polymorphisms have been suggested to be of minor significance. We sought to assess the impact of those polymorphisms on dose requirements in a Central-Eastern European cohort of 479 patients receiving acenocoumarol (n=260) or warfarin (n=219). There were no differences between the acenocoumarol and warfarin groups with regard to gender, age, body mass index and international normalized ratio. VKORC1 c.-1639A allele carriers required a lower dose of acenocoumarol and warfarin than non-carriers (28.0 [21.0-35.0] vs. 42.0 [28.0-56.0] mg/week, p<0.0001; 35.0 [28.0-52.0] vs. 52.0 [35.0-70.0] mg/week, p=0.0001, respectively). Carriers of the CYP2C9*2 and/or *3 variant alleles also required a lower dose of warfarin as compared with *1/*1 carriers (35.0 [31.5-52.5] vs. 43.8 [35.0-60.2] mg/week, p=0.02; 35.0 [23.5-35.0] vs. 43.8 [35.0-60.2] mg/week, p<0.0001, respectively). Similarly, possession of the G allele of the GGCX c.2084+45 polymorphism was associated with a lower warfarin dose (35.0 [26.3-39.2] vs. 45.5 [35.0-65.1] mg/week, p=0.03). No effect of the CYP2C9*2, *3 and GGCX c.2084+45G>C polymorphisms on acenocoumarol dosage was observed. Interestingly, carriers of the CYP4F2 c.1297A variant required a higher dose of acenocoumarol and warfarin than non-carriers (43.8 [35.0-60.2] vs. 35.0 [35.0-52.5] mg/week, p=0.01; 35.0 [28.0-52.5] vs. 28.0 [28.0-42.0] mg/week, p=0.05). We have shown for the first time that, besides the VKORC1 and CYP2C9 genetic variants, the CYP4F2 c.1297A and GGCX c.2084+45G variants have a moderate effect on VKA dose requirements in the Slavic population of Central-Eastern Europe.
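
    The dose comparisons above are reported as median [interquartile range] per carrier group. A small sketch of that summary statistic, with invented weekly doses:

```python
import numpy as np

def median_iqr(doses):
    """Summarize weekly doses as 'median [25th-75th percentile]',
    the format used for the dose comparisons in this abstract."""
    q25, med, q75 = np.percentile(np.asarray(doses, dtype=float), [25, 50, 75])
    return f"{med:.1f} [{q25:.1f}-{q75:.1f}] mg/week"

# Invented weekly acenocoumarol doses by VKORC1 c.-1639A carrier status;
# the values are illustrative, not the study's individual-level data.
carriers = [21.0, 28.0, 28.0, 35.0, 35.0]
non_carriers = [28.0, 42.0, 42.0, 56.0, 56.0]
print("carriers:    ", median_iqr(carriers))
print("non-carriers:", median_iqr(non_carriers))
```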

  10. [Comparison of quality and hemorrhagic risk of oral anticoagulant therapy using acenocoumarol versus warfarin].

    PubMed

    Oliva Berini, Elvira; Galán Alvarez, Pilar; Pacheco Onrubia, Ana María

    2008-06-21

    Long half-life oral anticoagulants have shown higher anticoagulation stability and a lower hemorrhagic risk than those with a short half-life. We compared the therapeutic stability and hemorrhagic risk of acenocoumarol versus warfarin in 2 groups of patients on preventive anticoagulation because of atrial fibrillation (international normalized ratio [INR]: 2-3). Data on 120 patients treated with acenocoumarol and 120 on warfarin treatment who had started and continued treatment in our hospital for a minimum of a year were collected. The percentage of visits within the intended INR range (2 to 3) was 65.5% with warfarin and 63.4% with acenocoumarol. Thirty percent of patients on warfarin had 75% or more of their controls within range, while for those treated with acenocoumarol this percentage was 22.5%. In the acenocoumarol group, 0.3 visits/patient/year presented an INR ≥ 6 versus 0.07 in the warfarin group (p = 0.003). Patients treated with acenocoumarol show a higher risk of presenting with an INR ≥ 6, but no statistically significant differences are observed in therapeutic stability.

  11. Effectiveness and safety of dabigatran versus acenocoumarol in ‘real-world’ patients with atrial fibrillation

    PubMed Central

    Korenstra, Jennie; Wijtvliet, E. Petra J.; Veeger, Nic J.G.M.; Geluk, Christiane A.; Bartels, G. Louis; Posma, Jan L.; Piersma-Wichers, Margriet; Van Gelder, Isabelle C.; Rienstra, Michiel; Tieleman, Robert G.

    2016-01-01

    Aims Randomized trials showed non-inferior or superior results for the non-vitamin-K-antagonist oral anticoagulants (NOACs) compared with warfarin. The aim of this study was to assess the effectiveness and safety of dabigatran (a direct thrombin inhibitor) vs. acenocoumarol (a vitamin K antagonist) in patients with atrial fibrillation (AF) in daily clinical practice. Methods and results In this observational study, we evaluated all consecutive patients who started anticoagulation because of AF in our outpatient clinic from 2010 to 2013. Data were collected from electronic patient charts. Primary outcomes were stroke or systemic embolism and major bleeding. Propensity score matching was applied to address the non-randomized design. In total, 920 consecutive AF patients were enrolled (442 dabigatran, 478 acenocoumarol), of whom 2 × 383 were available for analysis after propensity score matching. Mean follow-up duration was 1.5 ± 0.56 years. The mean calculated stroke risk according to the CHA2DS2-VASc score was 3.5%/year in dabigatran- vs. 3.7%/year in acenocoumarol-treated patients. The actual incidence rate of stroke or systemic embolism was 0.8%/year [95% confidence interval (CI): 0.2–2.1] vs. 1.0%/year (95% CI: 0.4–2.1), respectively. Multivariable analysis confirmed this lower but non-significant risk for dabigatran vs. acenocoumarol after adjustment for the CHA2DS2-VASc score [hazard ratio (HR)dabigatran = 0.72, 95% CI: 0.20–2.63, P = 0.61]. According to the HAS-BLED score, the mean calculated bleeding risk was 1.7%/year in both groups. The actual incidence rate of major bleeding was 2.1%/year (95% CI: 1.0–3.8) in the dabigatran vs. 4.3%/year (95% CI: 2.9–6.2) in the acenocoumarol group. This over 50% reduction remained significant after adjustment for the HAS-BLED score (HRdabigatran = 0.45, 95% CI: 0.22–0.93, P = 0.031). Conclusion In ‘real-world’ patients with AF, dabigatran appears to be as effective as, but significantly safer than, acenocoumarol. PMID:26843571

  12. Latin American Clinical Epidemiology Network Series - Paper 2: Apixaban was cost-effective vs. acenocoumarol in patients with nonvalvular atrial fibrillation with moderate to severe risk of embolism in Chile.

    PubMed

    Lanas, Fernando; Castro, Constanza; Vallejos, Carlos; Bustos, Luis; de La Puente, Catherine; Velasquez, Monica; Zaror, Carlos

    2017-06-01

    Nonvalvular atrial fibrillation (NVAF) is a risk factor for ischemic stroke and systemic embolism. New oral anticoagulants are currently available. The objective of this study was to assess the incremental cost-utility ratio (ICUR) for apixaban vs. acenocoumarol in patients treated in Chile's public health system. We assessed cost-utility from the payer perspective with a lifetime Markov model. Epidemiologic characteristics, costs, and utilities were obtained from a Chilean cohort; data were completed with information from the international literature. The incremental cost of using apixaban vs. acenocoumarol over a lifetime is CH$2,108,600, with an incremental effectiveness of 0.173 years of life gained (YLG) and 0.182 quality-adjusted life-years (QALY). The ICUR of apixaban vs. acenocoumarol was CH$12,188,439 per YLG and CH$11,585,714 per QALY. A threshold of 1 to 3 times gross domestic product (GDP) per capita is considered acceptable under World Health Organization (WHO) norms; Chilean GDP per capita was CH$7,797,021 in 2013. The sensitivity analysis shows that these results are sensitive to the ischemic stroke risk with apixaban and the intracranial hemorrhage risk with acenocoumarol. The use of apixaban in patients with NVAF at moderate-to-high risk of stroke is cost-effective, considering the payment threshold suggested by the WHO.
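
    The reported ICURs follow directly from dividing the incremental cost by the incremental effectiveness; the abstract's figures can be reproduced from its own numbers:

```python
# Incremental cost-utility ratio (ICUR) arithmetic behind the abstract's
# figures: incremental cost divided by incremental effectiveness.
incremental_cost = 2_108_600   # CH$, apixaban vs. acenocoumarol, lifetime
incremental_ylg = 0.173        # years of life gained
incremental_qaly = 0.182       # quality-adjusted life-years

icur_per_ylg = incremental_cost / incremental_ylg
icur_per_qaly = incremental_cost / incremental_qaly
print(f"ICUR per YLG:  CH${icur_per_ylg:,.0f}")   # CH$12,188,439 per YLG
print(f"ICUR per QALY: CH${icur_per_qaly:,.0f}")  # CH$11,585,714 per QALY
```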

  13. Cost-effectiveness Analysis Comparing Apixaban and Acenocoumarol in the Prevention of Stroke in Patients With Nonvalvular Atrial Fibrillation in Spain.

    PubMed

    Barón Esquivias, Gonzalo; Escolar Albaladejo, Ginés; Zamorano, José Luis; Betegón Nicolás, Lourdes; Canal Fontcuberta, Cristina; de Salas-Cansado, Marina; Rubio-Rodríguez, Darío; Rubio-Terrés, Carlos

    2015-08-01

    Cost-effectiveness analysis of apixaban (5 mg twice daily) vs acenocoumarol (5 mg/day) in the prevention of stroke in patients with nonvalvular atrial fibrillation in Spain. Markov model covering the patient's entire lifespan with 10 health states. Data on the efficacy and safety of the drugs were provided by the ARISTOTLE trial. Warfarin and acenocoumarol were assumed to have therapeutic equivalence. The perspectives were those of the Spanish National Health System and society. Information on the cost of the drugs, complications, and the management of the disease was obtained from Spanish sources. In a cohort of 1000 patients with nonvalvular atrial fibrillation, administration of apixaban rather than acenocoumarol would avoid 18 strokes, 71 hemorrhages (28 intracranial or major), 2 myocardial infarctions, 1 systemic embolism, and 23 related deaths. Apixaban would prolong life (by 0.187 years) and result in more quality-adjusted life years (by 0.194 years) per patient. With apixaban, the incremental costs for the Spanish National Health System and for society would be €2,488 and €1,826 per patient, respectively. Consequently, the costs per life year gained would be €13,305 and €9,765, and the costs per quality-adjusted life year gained would be €12,825 and €9,412, for the Spanish National Health System and for society, respectively. The stability of the baseline case was confirmed by sensitivity analyses. According to this analysis, apixaban may be cost-effective in the prevention of stroke in patients with nonvalvular atrial fibrillation compared with acenocoumarol.

  14. Effect of Ramadan fasting on acenocoumarol-induced anticoagulant effect.

    PubMed

    Mzoughi, Khadija; Zairi, Ihsen; Fennira, Sana; Kamoun, Sofien; Jnifene, Zouhayer; Ben Moussa, Fethia; Kraiem, Sondos

    2017-10-01

    Eating patterns, food intake and type of alimentation vary greatly during the month of Ramadan. Furthermore, the fasting practiced during Ramadan can have an impact on drug metabolism. These two factors, fasting and changes in eating habits during Ramadan, may affect the anticoagulant effect of acenocoumarol, reflected in variations of INR values. The aim of our study was to assess the effects of Ramadan fasting on INR variations in patients treated with acenocoumarol. A prospective monocentric study was conducted during the month of Ramadan on fasting outpatients who were treated with acenocoumarol. Baseline INR values (i.e., the most recent available value before Ramadan) were compared to INR values obtained during Ramadan. All patients were monitored for signs of secondary haemorrhagic complications linked to treatment with anti-vitamin K agents (AVK). Thirty patients were included in the study, with a sex ratio of 1. The patients' mean age was 65 years. Around two thirds of the patients were treated with AVK for atrial fibrillation. The majority of patients (94%) had been treated with AVK for more than a year. Mean INR was significantly higher during Ramadan than at baseline (3.51 vs 2.52; p<0.0001). There were also more overdoses during Ramadan than at baseline (9 vs. 0; p=0.014). The increased INR values highlight the need for close monitoring of INR during Ramadan, particularly in patients with a high haemorrhagic risk.

  15. Hepatic uptake and storage of warfarin. The relation with the target enzyme vitamin K 2,3-epoxide reductase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thijssen, H.H.; Baars, L.G.

    The mechanisms of the reported dose-dependent warfarin pharmacokinetics were investigated using [14C]warfarin. When administered in microdoses (9 micrograms i.v.) to rats (male Wistars, 270-300 g), a steep distribution phase (T1/2 = 0.25 hr) was followed by a relatively slow beta-phase (T1/2 = 40 hr). The observed volume of distribution was 390 ml. This pharmacokinetic behavior contrasted sharply with that seen for higher doses (greater than 0.2 mg/kg) of unlabeled warfarin: volume of distribution = 45 ml, T1/2 = 12.5 hr. If a macrodose (0.2 mg/kg) preceded (16 hr) the microdose, normal pharmacokinetics were observed for the latter, suggesting a saturable deep compartment. The administration of 4-hydroxycoumarins (i.e., acenocoumarol, phenprocoumon and warfarin) while the microdose of [14C]warfarin was in its beta-phase caused a rapid rise in plasma [14C]warfarin, indicating that [14C]warfarin was displaced from the deep compartment. The rate of appearance of [14C]warfarin was 0.3 hr-1 irrespective of the 4-hydroxycoumarin used. The hepatic distribution of [14C]warfarin was investigated, along with the effect of a displacer on it. Fifty-three hours after [14C]warfarin administration, the liver contained about 40% of the dose; 45% of it was bound to microsomes. The administration of acenocoumarol (0.2 mg/kg) at 48 hr halved the liver content. [14C]warfarin was redistributed from the microsomes (-65%) and the 10,000 × g pellet (-65%) into the cytosol (+260%) and the plasma (+320%). Microsomal-bound [14C]warfarin in vitro could not be washed out or displaced unless dithiothreitol (50 mM) was included in the washing buffers.

  16. Raman spectroscopy for the analytical quality control of low-dose break-scored tablets.

    PubMed

    Gómez, Diego A; Coello, Jordi; Maspoch, Santiago

    2016-05-30

    Quality control of solid dosage forms involves the analysis of end products according to well-defined criteria, including the assessment of the uniformity of dosage units (UDU). However, in the case of break-scored tablets, given that tablet splitting is widespread as a means to adjust doses, the uniform distribution of the active pharmaceutical ingredient (API) in all the possible fractions of the tablet must be assessed. A general procedure to address both issues, using Raman spectroscopy, is presented. It is based on the acquisition of a collection of spectra in different regions of the tablet, which can later be selected to determine the amount of API in the potential fractions that can result after splitting. The procedure has been applied to two commercial products, Sintrom 1 and Sintrom 4, with API (acenocoumarol) mass proportions of 2% and 0.7%, respectively. Partial least squares (PLS) calibration models were constructed for the quantification of acenocoumarol in whole tablets, using HPLC as the reference analytical method. Once validated, the calibration models were used to determine the API content in the different potential fragments of the scored Sintrom 4 tablets. Fragment mass measurements were also performed to estimate the range of masses of the halves and quarters that could result after tablet splitting. The results show that Raman spectroscopy can be an alternative analytical procedure for assessing the uniformity of content, both in whole tablets and in their potential fragments, and that Sintrom 4 tablets can be split cleanly into halves, although some caution is needed when considering fragmentation into quarters. A practical alternative to the UDU test for the assessment of tablet fragments is proposed.

  17. Evaluation of a reverse-hybridization StripAssay for the detection of genetic polymorphisms leading to acenocoumarol sensitivity.

    PubMed

    Gialeraki, Argyri; Markatos, Christos; Grouzi, Elisabeth; Merkouri, Efrosyni; Travlou, Anthi; Politou, Marianna

    2010-04-01

    Acenocoumarol is mainly catabolized by the CYP2C9 isoform of the cytochrome P450 (CYP) liver complex and exerts its anticoagulant effect through the inhibition of vitamin K epoxide reductase (VKOR). The most important genetic polymorphisms that lead to impaired enzymatic activity, and therefore predispose to acenocoumarol sensitivity, are considered to be CYP2C9*2 (Arg144Cys), CYP2C9*3 (Ile359Leu) and VKORC1 -1639G>A, respectively. In this study we compared the results of the PGXThrombo StripAssay kit (ViennaLab Diagnostics, Vienna, Austria) with direct DNA sequencing and in-house restriction fragment length polymorphism (RFLP) analysis for the detection of the aforementioned single nucleotide polymorphisms (SNPs). The reverse-hybridization StripAssay was found to be as effective as RFLP and direct DNA sequencing for the detection of the CYP2C9*2 and CYP2C9*3 polymorphisms, respectively. The comparison of the RFLP reference method with the reverse-hybridization StripAssay for the detection of the VKORC1 -1639G>A polymorphism showed that the StripAssay might misclassify some A/A homozygotes as heterozygotes. Optimization of the hybridization procedures may eliminate the extra low-signal band observed in some samples and improve the assay's diagnostic value.

  18. Effect of diseases on response to vitamin K antagonists.

    PubMed

    Self, Timothy H; Owens, Ryan E; Sakaan, Sami A; Wallace, Jessica L; Sands, Christopher W; Howard-Thompson, Amanda

    2016-01-01

    The purpose of this review article is to summarize the literature on diseases that are documented to have an effect on response to warfarin and other VKAs. We searched the English literature from 1946 to September 2015 via PubMed, EMBASE, and Scopus for the effect of diseases on response to vitamin K antagonists, including warfarin, acenocoumarol, phenprocoumon, and fluindione. Among the many factors modifying response to VKAs, several disease states are clinically relevant. Liver disease, hyperthyroidism, and CKD are well documented to increase response to VKAs. Decompensated heart failure, fever, and diarrhea may also elevate response to VKAs, but more study is needed. Hypothyroidism is associated with a decreased effect of VKAs, and obese patients will likely require higher initial doses of VKAs. In order to minimize risks with VKAs while ensuring efficacy, clinicians must be aware of the effect of disease states when prescribing these oral anticoagulants.

  19. Statin use decreases coagulation in users of vitamin K antagonists.

    PubMed

    van Rein, Nienke; Biedermann, J S; Bonafacio, S M; Kruip, M J H A; van der Meer, F J M; Lijfering, W M

    2016-12-01

    The purpose of the study was to determine the immediate and long-term effect of statins on coagulation in patients treated with vitamin K antagonists (VKAs). We selected patients on VKAs at two Dutch anticoagulation clinics who initiated treatment with a statin between 2009 and 2013. Patients who initiated or stopped concomitant drugs that interact with VKAs, or who were hospitalised during follow-up, were excluded. The VKA dosage (mg/day) after statin initiation was compared with the last VKA dosage before the statin was started. Immediate and long-term differences in VKA dosage (at 6 and 12 weeks) were calculated with a paired Student's t test. Four hundred thirty-five phenprocoumon users (mean age 70 years, 60 % men) and 303 acenocoumarol users (mean age 69 years, 58 % men) were included. After the start of statin use, the immediate phenprocoumon dosage was 0.02 mg/day (95 % CI, 0.00 to 0.03) lower. At 6 and 12 weeks, phenprocoumon dosages were 0.03 (95 % CI, 0.01 to 0.05) and 0.07 mg/day (95 % CI, 0.04 to 0.09) lower as compared with the dosage before first statin use. In acenocoumarol users, the VKA dosage was 0.04 mg/day (95 % CI, 0.01 to 0.07) (immediate effect), 0.10 (95 % CI, 0.03 to 0.16) (at 6 weeks), and 0.11 mg/day (95 % CI, 0.04 to 0.18) (after 12 weeks) lower. Initiation of statin treatment was associated with an immediate and long-term minor, although statistically significant, decrease in VKA dosage in both phenprocoumon and acenocoumarol users, which suggests that statins may have anticoagulant properties.
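
    The before/after dosage comparison above uses a paired Student's t test. A stdlib-only sketch of the paired t statistic, on toy dose data (the numbers below are invented for illustration, not the study's):

```python
import math
import statistics

def paired_t(before, after):
    """Paired Student's t statistic for before/after measurements."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean_d = statistics.fmean(diffs)   # mean paired difference
    sd_d = statistics.stdev(diffs)     # sample SD of the differences
    se = sd_d / math.sqrt(n)           # standard error of the mean difference
    return mean_d / se

# Toy example: daily VKA dose (mg) before and after statin initiation.
before = [2.0, 2.5, 3.0, 2.2, 2.8, 3.1]
after  = [1.9, 2.4, 2.9, 2.2, 2.7, 3.0]
t = paired_t(before, after)   # large t => systematic dose decrease
```

    The p-value would then come from a t distribution with n − 1 degrees of freedom; that lookup is omitted here to keep the sketch standard-library only.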

  20. Adult-onset Morgagni's hernia.

    PubMed

    Valdivielso Cortázar, Eduardo; Carral Martínez, David; Gómez Gutiérrez, Manuel; Bouzón Alejandro, Alberto

    2018-05-01

    We report the case of a 65-year-old male patient with Down's syndrome and a deep venous thrombosis on anticoagulation with acenocoumarol. The patient presented with nonspecific, predominantly postprandial epigastric discomfort, meteorism and aerophagia. A thoracoabdominal computed tomography (CT) scan revealed a Morgagni hernia with cephalad migration of part of the stomach, ascending colon and transverse colon. After laparotomy, the defect was repaired using a titanium mesh and the patient had a favorable outcome.

  1. GTV-based prescription in SBRT for lung lesions using advanced dose calculation algorithms.

    PubMed

    Lacornerie, Thomas; Lisbona, Albert; Mirabel, Xavier; Lartigau, Eric; Reynaert, Nick

    2014-10-16

    The aim of the current study was to investigate the way dose is prescribed to lung lesions during SBRT using advanced dose calculation algorithms that take into account electron transport (type B algorithms). As type A algorithms do not take into account secondary electron transport, they overestimate the dose to lung lesions. Type B algorithms are more accurate, but no consensus has yet been reached regarding dose prescription. The positive clinical results obtained using type A algorithms should be used as a starting point. In the current work a dose-calculation experiment was performed, comparing different prescription methods. Three cases with three different sizes of peripheral lung lesions were planned using three different treatment platforms. For each individual case, 60 Gy was prescribed to the PTV using a type A algorithm and the dose distribution was recalculated using a type B algorithm in order to evaluate the impact of secondary electron transport. Second, for each case a type B algorithm was used to prescribe 48 Gy to the PTV, and the resulting doses to the GTV were analyzed. Finally, prescriptions based on specific GTV dose volumes were evaluated. When using a type A algorithm to prescribe the same dose to the PTV, the differences in median GTV dose among platforms and cases were always less than 10% of the prescription dose. Prescription to the PTV based on type B algorithms led to greater variability of the median GTV dose among cases (24%) and among platforms (28%). However, when 54 Gy was prescribed as the median GTV dose using a type B algorithm, the variability observed was minimal. Normalizing the prescription dose to the median GTV dose for lung lesions avoids variability among different cases and treatment platforms of SBRT when type B algorithms are used to calculate the dose. The combination of using a type A algorithm to optimize a homogeneous dose in the PTV and using a type B algorithm to prescribe the median GTV dose provides a very robust method for treating lung lesions.
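
    Prescribing to the median GTV dose amounts to rescaling the whole dose distribution (i.e. the plan's monitor units) by a single factor so that the type B-recalculated median GTV dose hits the target. A minimal sketch, with toy per-voxel doses:

```python
import statistics

def renormalize_to_median_gtv(gtv_doses, target_median=54.0):
    """Scale a dose distribution so its median GTV dose equals the target.

    gtv_doses -- per-voxel doses (Gy) inside the GTV as recalculated by a
                 type B algorithm; multiplying them by one factor mimics
                 rescaling the plan's monitor units.
    """
    current_median = statistics.median(gtv_doses)
    scale = target_median / current_median
    return scale, [d * scale for d in gtv_doses]

# Toy GTV dose sample (Gy) from a type B recalculation.
doses = [50.2, 51.0, 49.8, 52.5, 50.9]
scale, rescaled = renormalize_to_median_gtv(doses)
```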

  2. Summary of evidence-based guideline update: Prevention of stroke in nonvalvular atrial fibrillation

    PubMed Central

    Culebras, Antonio; Messé, Steven R.; Chaturvedi, Seemant; Kase, Carlos S.; Gronseth, Gary

    2014-01-01

    Objective: To update the 1998 American Academy of Neurology practice parameter on stroke prevention in nonvalvular atrial fibrillation (NVAF). How often do various technologies identify previously undetected NVAF? Which therapies reduce ischemic stroke risk with the least risk of hemorrhage, including intracranial hemorrhage? The complete guideline on which this summary is based is available as an online data supplement to this article. Methods: Systematic literature review; modified Delphi process recommendation formulation. Major conclusions: In patients with recent cryptogenic stroke, cardiac rhythm monitoring probably detects occult NVAF. In patients with NVAF, dabigatran, rivaroxaban, and apixaban are probably at least as effective as warfarin in preventing stroke and have a lower risk of intracranial hemorrhage. Triflusal plus acenocoumarol is likely more effective than acenocoumarol alone in reducing stroke risk. Clopidogrel plus aspirin is probably less effective than warfarin in preventing stroke and has a lower risk of intracranial bleeding. Clopidogrel plus aspirin as compared with aspirin alone probably reduces stroke risk but increases the risk of major hemorrhage. Apixaban is likely more effective than aspirin for decreasing stroke risk and has a bleeding risk similar to that of aspirin. Major recommendations: Clinicians might obtain outpatient cardiac rhythm studies in patients with cryptogenic stroke to identify patients with occult NVAF (Level C) and should routinely offer anticoagulation to patients with NVAF and a history of TIA/stroke (Level B). Specific patient considerations will inform anticoagulant selection in patients with NVAF judged to need anticoagulation. PMID:24566225

  3. Dose calculation accuracy of the Monte Carlo algorithm for CyberKnife compared with other commercially available dose calculation algorithms.

    PubMed

    Sharma, Subhash; Ott, Joseph; Williams, Jamone; Dickow, Danny

    2011-01-01

    Monte Carlo dose calculation algorithms have the potential for greater accuracy than traditional model-based algorithms. This enhanced accuracy is particularly evident in regions of lateral scatter disequilibrium, which can develop during treatments incorporating small field sizes and low-density tissue. A heterogeneous slab phantom was used to evaluate the accuracy of several commercially available dose calculation algorithms, including Monte Carlo dose calculation for CyberKnife, the Analytical Anisotropic Algorithm and Pencil Beam convolution for the Eclipse planning system, and convolution-superposition for the Xio planning system. The phantom accommodated slabs of varying density; comparisons between planned and measured dose distributions were accomplished with radiochromic film. The Monte Carlo algorithm provided the most accurate agreement between planned and measured dose distributions. In each phantom irradiation, the Monte Carlo predictions resulted in gamma analysis comparisons >97%, using acceptance criteria of 3% dose and 3-mm distance to agreement. In general, the gamma analysis comparisons for the other algorithms were <95%. The Monte Carlo dose calculation algorithm for CyberKnife provides more accurate dose distribution calculations in regions of lateral electron disequilibrium than commercially available model-based algorithms. This is primarily because Monte Carlo algorithms implicitly account for tissue heterogeneities; density scaling functions and/or effective depth correction factors are not required. Copyright © 2011 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
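
    The 3%/3mm gamma analysis combines a dose-difference criterion and a distance-to-agreement criterion into a single pass/fail index per reference point. A simplified one-dimensional sketch of the standard gamma computation (a global-normalization, brute-force version for illustration only):

```python
import math

def gamma_pass_rate_1d(ref, eval_, spacing_mm, dose_crit=0.03, dta_mm=3.0):
    """Fraction of reference points passing a 1D gamma test.

    ref, eval_  -- dose profiles sampled on the same grid
    spacing_mm  -- grid spacing in mm
    dose_crit   -- dose-difference criterion, fraction of max reference dose
    dta_mm      -- distance-to-agreement criterion in mm
    """
    d_max = max(ref)
    passed = 0
    for i, dr in enumerate(ref):
        best = math.inf
        for j, de in enumerate(eval_):
            dd = (de - dr) / (dose_crit * d_max)      # normalized dose diff
            dist = (j - i) * spacing_mm / dta_mm      # normalized distance
            best = min(best, math.hypot(dd, dist))    # gamma for this pair
        passed += best <= 1.0                         # point passes if gamma <= 1
    return passed / len(ref)

# Identical profiles pass everywhere.
profile = [0.2, 0.5, 1.0, 0.5, 0.2]
rate = gamma_pass_rate_1d(profile, profile, spacing_mm=1.0)   # 1.0
```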

  4. Simulation-Based Evaluation of Dose-Titration Algorithms for Rapid-Acting Insulin in Subjects with Type 2 Diabetes Mellitus Inadequately Controlled on Basal Insulin and Oral Antihyperglycemic Medications.

    PubMed

    Ma, Xiaosu; Chien, Jenny Y; Johnson, Jennal; Malone, James; Sinha, Vikram

    2017-08-01

    The purpose of this prospective, model-based simulation approach was to evaluate the impact of various rapid-acting mealtime insulin dose-titration algorithms on glycemic control (hemoglobin A1c [HbA1c]). Seven stepwise, glucose-driven insulin dose-titration algorithms were evaluated with a model-based simulation approach by using insulin lispro. Pre-meal blood glucose readings were used to adjust insulin lispro doses. Two control dosing algorithms were included for comparison: no insulin lispro (basal insulin+metformin only) or insulin lispro with fixed doses without titration. Of the seven dosing algorithms assessed, daily adjustment of insulin lispro dose, when glucose targets were met at pre-breakfast, pre-lunch, and pre-dinner, sequentially, demonstrated greater HbA1c reduction at 24 weeks, compared with the other dosing algorithms. Hypoglycemic rates were comparable among the dosing algorithms except for higher rates with the insulin lispro fixed-dose scenario (no titration), as expected. The inferior HbA1c response for the "basal plus metformin only" arm supports the additional glycemic benefit with prandial insulin lispro. Our model-based simulations support a simplified dosing algorithm that does not include carbohydrate counting, but that includes glucose targets for daily dose adjustment to maintain glycemic control with a low risk of hypoglycemia.
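
    A stepwise, glucose-driven titration rule of the kind simulated above adjusts the mealtime dose up or down depending on where the pre-meal glucose falls relative to a target range. The thresholds and step size below are illustrative assumptions, not the trial's actual algorithm:

```python
def adjust_mealtime_dose(dose_units, premeal_glucose_mgdl,
                         target_low=70, target_high=130, step=1):
    """Next mealtime insulin dose under a simple glucose-driven
    titration rule (illustrative thresholds and step size)."""
    if premeal_glucose_mgdl < target_low:
        return max(0, dose_units - step)   # below target: step down
    if premeal_glucose_mgdl > target_high:
        return dose_units + step           # above target: step up
    return dose_units                      # within target: hold

print(adjust_mealtime_dose(6, 150))  # 7
print(adjust_mealtime_dose(6, 100))  # 6
print(adjust_mealtime_dose(6, 60))   # 5
```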

  5. Warfarin Dosing Algorithms Underpredict Dose Requirements in Patients Requiring ≥7 mg Daily: A Systematic Review and Meta-analysis.

    PubMed

    Saffian, S M; Duffull, S B; Wright, Dfb

    2017-08-01

    There is preliminary evidence to suggest that some published warfarin dosing algorithms produce biased maintenance dose predictions in patients who require higher than average doses. We conducted a meta-analysis of warfarin dosing algorithms to determine if there exists a systematic under- or overprediction of dose requirements for patients requiring ≥7 mg/day across published algorithms. Medline and Embase databases were searched up to September 2015. We quantified the proportion of over- and underpredicted doses in patients whose observed maintenance dose was ≥7 mg/day. The meta-analysis included 47 evaluations of 22 different warfarin dosing algorithms from 16 studies. The meta-analysis included data from 1,492 patients who required warfarin doses of ≥7 mg/day. All 22 algorithms were found to underpredict warfarin dosing requirements in patients who required ≥7 mg/day by an average of 2.3 mg/day, with a pooled estimate of underpredicted doses of 92.3% (95% confidence interval 90.3-94.1, I² = 24%). © 2017 American Society for Clinical Pharmacology and Therapeutics.

  6. Influence of different dose calculation algorithms on the estimate of NTCP for lung complications.

    PubMed

    Hedin, Emma; Bäck, Anna

    2013-09-06

    Due to limitations and uncertainties in dose calculation algorithms, different algorithms can predict different dose distributions and dose-volume histograms for the same treatment. This can be a problem when estimating the normal tissue complication probability (NTCP) for patient-specific dose distributions. Published NTCP model parameters are often derived for a different dose calculation algorithm than the one used to calculate the actual dose distribution. The use of algorithm-specific NTCP model parameters can prevent errors caused by differences in dose calculation algorithms. The objective of this work was to determine how to change the NTCP model parameters for lung complications derived for a simple correction-based pencil beam dose calculation algorithm, in order to make them valid for three other common dose calculation algorithms. NTCP was calculated with the relative seriality (RS) and Lyman-Kutcher-Burman (LKB) models. The four dose calculation algorithms used were the pencil beam (PB) and collapsed cone (CC) algorithms employed by Oncentra, and the pencil beam convolution (PBC) and anisotropic analytical algorithm (AAA) employed by Eclipse. Original model parameters for lung complications were taken from four published studies on different grades of pneumonitis, and new algorithm-specific NTCP model parameters were determined. The difference between original and new model parameters was presented in relation to the reported model parameter uncertainties. Three different types of treatments were considered in the study: tangential and locoregional breast cancer treatment and lung cancer treatment. Changing the algorithm without the derivation of new model parameters caused changes in the NTCP value of up to 10 percentage points for the cases studied. Furthermore, the error introduced could be of the same magnitude as the confidence intervals of the calculated NTCP values. 
The new NTCP model parameters were tabulated as the algorithm was varied from PB to PBC, AAA, or CC. Moving from the PB to the PBC algorithm did not require new model parameters; however, moving from PB to AAA or CC did require a change in the NTCP model parameters, with CC requiring the largest change. It was shown that the new model parameters for a given algorithm are different for the different treatment types.
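
    The LKB model referenced above maps a (generalized) equivalent uniform dose to a complication probability through the normal cumulative distribution function, NTCP = Φ((EUD − TD50) / (m · TD50)). A sketch of that formula; the parameter values in the example are illustrative, not taken from the four cited pneumonitis studies:

```python
import math

def lkb_ntcp(eud_gy, td50_gy, m):
    """Lyman-Kutcher-Burman NTCP: Phi((EUD - TD50) / (m * TD50)),
    where Phi is the standard normal CDF."""
    t = (eud_gy - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative parameters (not from the paper). By construction,
# NTCP is 50% when EUD equals TD50.
ntcp = lkb_ntcp(eud_gy=24.5, td50_gy=24.5, m=0.4)
```

    Because published TD50 and m values are fitted against doses from a particular calculation algorithm, swapping the algorithm without refitting these parameters shifts the computed NTCP, which is exactly the effect the study quantifies.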

  7. Comparison of the Performance of the Warfarin Pharmacogenetics Algorithms in Patients with Surgery of Heart Valve Replacement and Heart Valvuloplasty.

    PubMed

    Xu, Hang; Su, Shi; Tang, Wuji; Wei, Meng; Wang, Tao; Wang, Dongjin; Ge, Weihong

    2015-09-01

    A large number of warfarin pharmacogenetics algorithms have been published. Our research aimed to evaluate the performance of selected pharmacogenetic algorithms in patients who underwent heart valve replacement or heart valvuloplasty, during the initial and stable phases of anticoagulation treatment. Ten pharmacogenetic algorithms were selected by searching PubMed. We compared the performance of the selected algorithms in a cohort of 193 patients during the initial and stable phases of anticoagulation therapy. Predicted dose was compared to therapeutic dose using the percentage of predicted doses falling within 20% of the actual dose (percentage within 20%) and the mean absolute error (MAE). The average warfarin dose was 3.05±1.23 mg/day for initial treatment and 3.45±1.18 mg/day for stable treatment. The percentages of predicted doses within 20% of the therapeutic dose were 44.0±8.8% and 44.6±9.7% for the initial and stable phases, respectively. The MAEs of the selected algorithms were 0.85±0.18 mg/day and 0.93±0.19 mg/day, respectively. All algorithms performed better in the ideal-dose group than in the low-dose and high-dose groups; the only exception was the Wadelius et al. algorithm, which performed better in the high-dose group. The algorithms had similar performance except for the Wadelius et al. and Miao et al. algorithms, which had poor accuracy in our study cohort. The Gage et al. algorithm performed better in both the initial and stable phases of treatment. Algorithms had relatively higher accuracy in patients aged >50 years during the stable phase. Copyright © 2015 Elsevier Ltd. All rights reserved.
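
    The two evaluation metrics used here (and in several of the other algorithm-comparison studies in this listing) are straightforward to compute. A sketch with invented toy doses:

```python
def dose_prediction_metrics(predicted, actual):
    """Mean absolute error (mg/day) and fraction of predictions
    falling within 20% of the actual maintenance dose."""
    n = len(predicted)
    mae = sum(abs(p - a) for p, a in zip(predicted, actual)) / n
    within = sum(abs(p - a) <= 0.2 * a
                 for p, a in zip(predicted, actual)) / n
    return mae, within

# Toy predicted vs actual maintenance doses (mg/day).
predicted = [3.1, 2.0, 4.0, 2.9]
actual    = [3.0, 3.0, 3.5, 3.0]
mae, within20 = dose_prediction_metrics(predicted, actual)
```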

  8. The impact of different algorithms for ideal body weight on screening for hydroxychloroquine retinopathy in women.

    PubMed

    Browning, David J; Lee, Chong; Rotberg, David

    2014-01-01

    To determine how algorithms for ideal body weight (IBW) affect hydroxychloroquine dosing in women. This was a retrospective study of 520 patients screened for hydroxychloroquine retinopathy. Charts were reviewed for sex, height, weight, and daily dose. The outcome measures were ranges of IBW across algorithms; rates of potentially toxic dosing; height thresholds below which 400 mg/d dosing is potentially toxic; and rates at which actual body weight (ABW) was less than IBW. Women made up 474 (91%) of the patients. For a given height, IBW varied by 30-34 pounds (13.6-15.5 kg) across algorithms. The threshold heights below which toxic dosing occurred varied from 62-70 inches (157.5-177.8 cm). Different algorithms placed 16%-98% of women in the toxic dosing range. The proportion for whom dosing should have been based on ABW rather than IBW ranged from 5%-31% across algorithms. Although hydroxychloroquine dosing should be based on the lesser of ABW and IBW, there is no consensus on the definition of IBW. The Michaelides algorithm is associated with the most frequent need to adjust dosing; the Metropolitan Life Insurance large-frame mean-value table with the least frequent need. No evidence indicates that one algorithm is superior to the others.
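
    As an illustration of the "lesser of ABW and IBW" rule, here is one widely used IBW formula (the Devine formula; choosing it is an assumption on our part, since the paper compares several competing IBW algorithms):

```python
def devine_ibw_kg(height_inches, female=True):
    """Devine ideal body weight: 45.5 kg (women) or 50 kg (men)
    plus 2.3 kg per inch over 60 inches. Just one of several IBW
    algorithms; the study shows different algorithms can disagree
    by 30-34 pounds for the same height."""
    base = 45.5 if female else 50.0
    return base + 2.3 * max(0.0, height_inches - 60.0)

def dosing_weight_kg(abw_kg, ibw_kg):
    """Dosing weight is the lesser of actual and ideal body weight."""
    return min(abw_kg, ibw_kg)

ibw = devine_ibw_kg(65)          # 45.5 + 2.3*5 = 57.0 kg
w = dosing_weight_kg(52.0, ibw)  # ABW 52.0 < IBW, so dose on ABW
```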

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pacaci, P; Cebe, M; Mabhouti, H

    Purpose: In this study, a dosimetric comparison of the field-in-field (FIF) and intensity modulated radiation therapy (IMRT) techniques used for whole breast radiotherapy (WBRT) was made. The dosimetric accuracy of the treatment planning system (TPS) with the Anisotropic Analytical Algorithm (AAA) and Acuros XB (AXB) algorithms in predicting PTV and OAR doses was also investigated. Methods: Two different treatment plans for left-sided breast cancer were generated for a Rando phantom. FIF and IMRT plans were compared for doses in PTV and OAR volumes, including the ipsilateral lung, heart, left anterior descending coronary artery, contralateral lung and contralateral breast. PTV and OAR doses and homogeneity and conformality indexes were compared between the two techniques. The accuracy of the TPS dose calculation algorithms was tested by comparing PTV and OAR doses measured by thermoluminescent dosimetry with the doses calculated by the TPS using AAA and AXB for both techniques. Results: IMRT plans had better conformality and homogeneity indexes than the FIF technique and spared OARs better than FIF. While both algorithms overestimated PTV doses, they underestimated all OAR doses. For the IMRT plan PTV doses, overestimation of up to 2.5% was seen with the AAA algorithm, decreasing to 1.8% when the AXB algorithm was used. Based on the results of the anthropomorphic measurements for OAR doses, underestimation greater than 7% is possible with AAA. The results from AXB are much better than those from the AAA algorithm; however, underestimations of 4.8% were found at some points even for AXB. For the FIF plan, a similar trend was seen for PTV and OAR doses with both algorithms. Conclusion: When using the Eclipse TPS for breast cancer, AXB should be used instead of the AAA algorithm, bearing in mind that AXB may still underestimate all OAR doses.

  10. Liquid chromatography with tandem mass spectrometry for the simultaneous identification and quantification of cardiovascular drugs applied to the detection of substandard and falsified drugs.

    PubMed

    Bernard, Mélisande; Akrout, Wiem; Van Buu, Christelle Tran; Metz, Carole; Antignac, Marie; Yagoubi, Najet; Do, Bernard

    2015-02-01

    The counterfeiting of pharmaceuticals has been detected since about 1990 and has continued to grow at an alarming rate. We have recently been involved in an evaluation program of some of the most commonly prescribed cardiovascular drugs in Africa, analysing a large number of tablets and capsules obtained from different places in seven African countries. A reversed-phase high-performance liquid chromatography with tandem mass spectrometry method was developed and validated to simultaneously control the identity and the quantity of acenocoumarol, amlodipine, atenolol, captopril, furosemide, hydrochlorothiazide and simvastatin in tablets. Their separation was performed on a Kinetex® C(18) (100 mm × 2.1 mm inside diameter, 2.6 μm) column using a gradient elution of 20 mM ammonium formate buffer and acetonitrile (90:10 to 10:90, v/v) at a flow rate of 0.5 mL/min. The analytes were detected using electrospray ionisation tandem mass spectrometry in both positive and negative modes with multiple reaction monitoring. Tandem mass spectrometry fragmentation patterns of captopril, furosemide and acenocoumarol, until now not detailed in the literature, were also studied to assist in the selection of the most relevant transitions. The developed method was validated as per International Conference on Harmonisation guidelines with respect to specificity, linearity, trueness, precision, and limits of detection and quantification. It has been successfully applied to the control of oral forms of seven cardiovascular drugs collected in African countries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Evaluation of 16 genotype-guided Warfarin Dosing Algorithms in 310 Korean Patients Receiving Warfarin Treatment: Poor Prediction Performance in VKORC1 1173C Carriers.

    PubMed

    Yang, Mina; Choi, Rihwa; Kim, June Soo; On, Young Keun; Bang, Oh Young; Cho, Hyun-Jung; Lee, Soo-Youn

    2016-12-01

    The purpose of this study was to evaluate the performance of 16 previously published warfarin dosing algorithms in Korean patients. The 16 algorithms were selected through a literature search and evaluated using a cohort of 310 Korean patients with atrial fibrillation or cerebral infarction who were receiving warfarin therapy. A large interindividual variation (up to 11-fold) in warfarin dose was observed (median, 25 mg/wk; range, 7-77 mg/wk). Estimated dose and actual maintenance dose correlated well overall (r range, 0.52-0.73). Mean absolute error (MAE) of the 16 algorithms ranged from -1.2 to -20.1 mg/wk. The percentage of patients whose estimated dose fell within 20% of the actual dose ranged from 1.0% to 49%. All algorithms showed poor accuracy with increased MAE in a higher dose range. Performance of the dosing algorithms was worse in patients with VKORC1 1173TC or CC than in total (r range, 0.38-0.61 vs 0.52-0.73; MAE range, -2.6 to -28.0 mg/wk vs -1.2 to -20.1 mg/wk). The algorithms had comparable prediction abilities but showed limited accuracy depending on ethnicity, warfarin dose, and VKORC1 genotype. Further studies are needed to develop genotype-guided warfarin dosing algorithms with greater accuracy in the Korean population. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.

  12. SU-E-T-91: Accuracy of Dose Calculation Algorithms for Patients Undergoing Stereotactic Ablative Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tajaldeen, A; Ramachandran, P; Geso, M

    2015-06-15

    Purpose: The purpose of this study was to investigate and quantify the variation in dose distributions in small-field lung cancer radiotherapy using seven different dose calculation algorithms. Methods: The study was performed in 21 lung cancer patients who underwent Stereotactic Ablative Body Radiotherapy (SABR). Two different methods, (i) the same dose coverage to the target volume (the "same dose" method) and (ii) the same monitor units in all algorithms (the "same monitor units" method), were used to study the performance of seven different dose calculation algorithms in the XiO and Eclipse treatment planning systems. The seven dose calculation algorithms were Superposition, Fast Superposition, Fast Fourier Transform (FFT) Convolution, Clarkson, Anisotropic Analytical Algorithm (AAA), Acuros XB and Pencil Beam (PB). Prior to this, a phantom study was performed to assess the accuracy of these algorithms. The Superposition algorithm was used as the reference algorithm in this study. The treatment plans were compared using different dosimetric parameters including conformity, heterogeneity and dose fall-off index. In addition, the doses to critical structures such as the lungs, heart, oesophagus and spinal cord were also studied. Statistical analysis was performed using Prism software. Results: The mean±SD conformity index for the Superposition, Fast Superposition, Clarkson and FFT Convolution algorithms was 1.29±0.13, 1.31±0.16, 2.2±0.7 and 2.17±0.59, respectively, whereas for AAA, Pencil Beam and Acuros XB it was 1.4±0.27, 1.66±0.27 and 1.35±0.24, respectively. Conclusion: Our study showed significant variations among the seven different algorithms. The Superposition and Acuros XB algorithms showed similar values for most of the dosimetric parameters. The Clarkson, FFT Convolution and Pencil Beam algorithms showed large differences compared to the Superposition algorithm. Based on our study, we recommend the Superposition and Acuros XB algorithms as the first choice in lung cancer radiotherapy involving small fields. However, further investigation by Monte Carlo simulation is required to confirm our results.

  13. Accuracy of patient-specific organ dose estimates obtained using an automated image segmentation algorithm.

    PubMed

    Schmidt, Taly Gilat; Wang, Adam S; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh

    2016-10-01

    The overall goal of this work is to develop a rapid, accurate, and automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using simulations to generate dose maps combined with automated segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. We hypothesized that the autosegmentation algorithm is sufficiently accurate to provide organ dose estimates, since small errors delineating organ boundaries will have minimal effect when computing mean organ dose. A leave-one-out validation study of the automated algorithm was performed with 20 head-neck CT scans expertly segmented into nine regions. Mean organ doses of the automatically and expertly segmented regions were computed from Monte Carlo-generated dose maps and compared. The automated segmentation algorithm estimated the mean organ dose to be within 10% of the expert segmentation for regions other than the spinal canal, with the median error for each organ region below 2%. In the spinal canal region, the median error was −7%, with a maximum absolute error of 28% for the single-atlas approach and 11% for the multiatlas approach. The results demonstrate that the automated segmentation algorithm can provide accurate organ dose estimates despite some segmentation errors.

  14. Accuracy of patient-specific organ dose estimates obtained using an automated image segmentation algorithm

    PubMed Central

    Schmidt, Taly Gilat; Wang, Adam S.; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh

    2016-01-01

    The overall goal of this work is to develop a rapid, accurate, and automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using simulations to generate dose maps combined with automated segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. We hypothesized that the autosegmentation algorithm is sufficiently accurate to provide organ dose estimates, since small errors delineating organ boundaries will have minimal effect when computing mean organ dose. A leave-one-out validation study of the automated algorithm was performed with 20 head-neck CT scans expertly segmented into nine regions. Mean organ doses of the automatically and expertly segmented regions were computed from Monte Carlo-generated dose maps and compared. The automated segmentation algorithm estimated the mean organ dose to be within 10% of the expert segmentation for regions other than the spinal canal, with the median error for each organ region below 2%. In the spinal canal region, the median error was −7%, with a maximum absolute error of 28% for the single-atlas approach and 11% for the multiatlas approach. The results demonstrate that the automated segmentation algorithm can provide accurate organ dose estimates despite some segmentation errors. PMID:27921070
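
    The core computation in both of the organ-dose records above is simple: given a voxel dose map and a segmentation label map, the mean organ dose is the average dose over that organ's voxels, which is why small boundary errors tend to wash out. A toy sketch with flat voxel lists:

```python
def mean_organ_dose(dose_map, label_map, organ_label):
    """Mean dose over voxels assigned to one organ label.

    dose_map, label_map -- flat, equal-length per-voxel lists
    """
    doses = [d for d, lab in zip(dose_map, label_map) if lab == organ_label]
    if not doses:
        raise ValueError("organ label not present in segmentation")
    return sum(doses) / len(doses)

# Toy 6-voxel example: label 1 marks the organ of interest.
dose  = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
label = [0,   1,   1,   1,   0,   0]
md = mean_organ_dose(dose, label, 1)   # (2 + 3 + 4) / 3 = 3.0
```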

  15. Evaluation of an iterative model-based CT reconstruction algorithm by intra-patient comparison of standard and ultra-low-dose examinations.

    PubMed

    Noël, Peter B; Engels, Stephan; Köhler, Thomas; Muenzel, Daniela; Franz, Daniela; Rasper, Michael; Rummeny, Ernst J; Dobritz, Martin; Fingerle, Alexander A

    2018-01-01

    Background: The explosive growth of computed tomography (CT) has led to a growing public health concern about patient and population radiation dose. A recently introduced technique for dose reduction, which can be combined with tube-current modulation, over-beam reduction, and organ-specific dose reduction, is iterative reconstruction (IR). Purpose: To evaluate the quality, at different radiation dose levels, of three reconstruction algorithms for the diagnosis of patients with proven liver metastases under tumor follow-up. Material and Methods: A total of 40 thorax-abdomen-pelvis CT examinations acquired from 20 patients in a tumor follow-up were included. All patients were imaged using both a standard-dose and a specific low-dose CT protocol. Reconstructed slices were generated using three different reconstruction algorithms: classical filtered back projection (FBP); a first-generation iterative noise-reduction algorithm (iDose4); and a next-generation model-based IR algorithm (IMR). Results: The overall detection of liver lesions tended to be higher with the IMR algorithm than with FBP or iDose4. The IMR dataset at standard dose yielded the highest overall detectability, while the low-dose FBP dataset showed the lowest. For the low-dose protocols, significantly improved detectability of liver lesions can be reported for IMR compared to FBP or iDose4 (P = 0.01). The radiation dose decreased by a factor of approximately 5 between the standard-dose and the low-dose protocol. Conclusion: The latest generation of IR algorithms significantly improved diagnostic image quality and provided virtually noise-free images for ultra-low-dose CT imaging.

  16. A clinical study of lung cancer dose calculation accuracy with Monte Carlo simulation.

    PubMed

    Zhao, Yanqun; Qi, Guohai; Yin, Gang; Wang, Xianliang; Wang, Pei; Li, Jian; Xiao, Mingyong; Li, Jie; Kang, Shengwei; Liao, Xiongfei

    2014-12-16

    The accuracy of dose calculation is crucial to the quality of treatment planning and, consequently, to the dose delivered to patients undergoing radiation therapy. Current general calculation algorithms such as Pencil Beam Convolution (PBC) and Collapsed Cone Convolution (CCC) have shortcomings in regard to severe inhomogeneities, particularly in regions where charged particle equilibrium does not hold. The aim of this study was to evaluate the accuracy of the PBC and CCC algorithms in lung cancer radiotherapy using Monte Carlo (MC) technology. Four treatment plans were designed using the Oncentra Masterplan TPS for each patient: two intensity-modulated radiation therapy (IMRT) plans developed using the PBC and CCC algorithms, and two three-dimensional conformal therapy (3DCRT) plans developed using the PBC and CCC algorithms. The DICOM-RT files of the treatment plans were exported to the Monte Carlo system for recalculation. The dose distributions of the GTV, PTV and ipsilateral lung calculated by the TPS and MC were compared. For 3DCRT and IMRT plans, the mean dose differences for the GTV between CCC and MC increased as GTV volume decreased. For IMRT, the mean dose differences were higher than for 3DCRT. The CCC algorithm overestimated the GTV mean dose by approximately 3% for IMRT. For 3DCRT plans, when the volume of the GTV was greater than 100 cm(3), the mean doses calculated by CCC and MC differed negligibly. PBC showed large deviations from MC. For the dose to the ipsilateral lung, the CCC algorithm overestimated the dose to the entire lung, and the PBC algorithm overestimated V20 but underestimated V5; the difference in V10 was not statistically significant. PBC substantially overestimates the dose to the tumour, whereas CCC is similar to the MC simulation. It is recommended that treatment plans for lung cancer be developed using an advanced dose calculation algorithm other than PBC. MC can accurately calculate the dose distribution in lung cancer and provides a notably effective tool for benchmarking the performance of other dose calculation algorithms within patients.

  17. SU-F-P-56: On a New Approach to Reconstruct the Patient Dose From Phantom Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bangtsson, E; Vries, W de

    Purpose: The development of complex radiation treatment schemes emphasizes the need for advanced QA analysis methods to ensure patient safety. One such tool is the Delta4 DVH Anatomy software, in which the patient dose is reconstructed from phantom measurements. Deviations in the measured dose are transferred to the patient anatomy and their clinical impact is evaluated in situ. Results from the original algorithm revealed weaknesses that may introduce artefacts in the reconstructed dose. These can lead to false negatives or obscure the effects of minor dose deviations from delivery failures. Here, we present results from a new patient dose reconstruction algorithm. Methods: The main steps of the new algorithm are: (1) the dose delivered to a phantom is measured at a number of detector positions. (2) The measured dose is compared to an internally calculated dose distribution evaluated at those positions. The resulting dose difference is (3) used to calculate an energy fluence difference, which is (4) used as input to a patient dose correction calculation routine. Finally, the patient dose is reconstructed by adding this patient dose correction to the planned patient dose. The internal dose calculations in steps (2) and (4) are based on the Pencil Beam algorithm. Results: The new patient dose reconstruction algorithm has been tested on a number of patients, and the standard metrics of dose deviation (DDev), distance-to-agreement (DTA) and Gamma index are improved compared to the original algorithm. In one case the Gamma index (3%/3 mm) pass rate increased from 72.9% to 96.6%. Conclusion: The patient dose reconstruction algorithm is improved, which reduces non-physical artefacts in the reconstructed patient dose and consequently improves the ability to detect deviations in the dose delivered to the patient. An increase in Gamma index for the PTV can be seen. The corresponding author is an employee of ScandiDos.
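    The measure-compare-correct loop in steps (1)-(4) can be sketched numerically. This is a hypothetical linear-algebra caricature: `detector_kernel` and `patient_kernel` stand in for the Pencil Beam operators mapping a fluence perturbation to dose at the detector positions and in the patient, respectively; none of these names come from the Delta4 software.

```python
import numpy as np

def reconstruct_patient_dose(measured, calculated, planned,
                             detector_kernel, patient_kernel):
    """Sketch of the patient dose reconstruction loop.

    measured, calculated: dose at the detector positions (steps 1 and 2).
    detector_kernel: assumed linear map from a fluence perturbation to dose
    at the detector positions; patient_kernel: the corresponding map into
    the patient volume. Both are stand-ins for the Pencil Beam calculations
    of the real algorithm.
    """
    dose_diff = measured - calculated                 # step (2)
    # step (3): back out an energy fluence difference (least squares)
    fluence_diff, *_ = np.linalg.lstsq(detector_kernel, dose_diff, rcond=None)
    correction = patient_kernel @ fluence_diff        # step (4)
    return planned + correction                       # reconstructed dose
```

In this toy form the reconstruction is exact when the kernels are invertible; the real algorithm works with many more voxels than detectors, so the fluence step is an ill-posed inversion rather than a simple least-squares solve.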

  18. Influence of dose calculation algorithms on the predicted dose distribution and NTCP values for NSCLC patients.

    PubMed

    Nielsen, Tine B; Wieslander, Elinore; Fogliata, Antonella; Nielsen, Morten; Hansen, Olfred; Brink, Carsten

    2011-05-01

    To investigate differences in calculated doses and normal tissue complication probability (NTCP) values between different dose algorithms. Six dose algorithms from four different treatment planning systems were investigated: Eclipse AAA, Oncentra MasterPlan Collapsed Cone and Pencil Beam, Pinnacle Collapsed Cone, and XiO Multigrid Superposition and Fast Fourier Transform Convolution. Twenty NSCLC patients treated in the period 2001-2006 at the same accelerator were included, and the accelerator used for the treatments was modeled in the different systems. The treatment plans were recalculated with the same number of monitor units and beam arrangements across the dose algorithms. Dose volume histograms of the GTV, PTV, combined lungs (excluding the GTV), and heart were exported and evaluated. NTCP values for heart and lungs were calculated using the relative seriality model and the LKB model, respectively. Furthermore, NTCP for the lungs was calculated with two different model parameter sets. Calculations and evaluations were performed both including and excluding density corrections. Statistically significant differences were found between the calculated doses to heart, lung, and targets across the algorithms. Mean lung dose and V20 are not very sensitive to a change between the investigated dose calculation algorithms. However, the PTV dose levels averaged over the patient population vary by up to 11%. The predicted NTCP values for pneumonitis vary between 0.20 and 0.24, or between 0.35 and 0.48, across the investigated dose algorithms, depending on the chosen model parameter set. The influence of density correction on the predicted NTCP values depends on the specific dose calculation algorithm and the model parameter set; for fixed values of these, the changes in NTCP can be up to 45%. 
Calculated NTCP values for pneumonitis are more sensitive to the choice of algorithm than mean lung dose and V20 which are also commonly used for plan evaluation. The NTCP values for heart complication are, in this study, not very sensitive to the choice of algorithm. Dose calculations based on density corrections result in quite different NTCP values than calculations without density corrections. It is therefore important when working with NTCP planning to use NTCP parameter values based on calculations and treatments similar to those for which the NTCP is of interest.
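    The LKB model used above reduces a dose-volume histogram to a generalized equivalent uniform dose and maps it to a complication probability through a probit function. A minimal sketch, with illustrative parameter values only; any real evaluation must use a published, algorithm-matched parameter set, as the abstract stresses:

```python
import math

def lkb_ntcp(bin_doses, bin_volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.

    bin_doses: dose per DVH bin (Gy); bin_volumes: fractional volumes
    (summing to 1). td50, m, n are model parameters (illustrative here).
    """
    # generalized equivalent uniform dose (gEUD)
    geud = sum(v * d ** (1.0 / n)
               for d, v in zip(bin_doses, bin_volumes)) ** n
    t = (geud - td50) / (m * td50)
    # probit link: cumulative standard normal evaluated at t
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

By construction, a uniform whole-organ dose equal to TD50 yields an NTCP of 0.5, and the slope of the response around that point is controlled by m.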

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vikraman, S; Ramu, M; Karrthick, Kp

    Purpose: The purpose of this study was to validate the advent of COMPASS 3D dosimetry as a routine pre treatment verification tool with commercially available CMS Monaco and Oncentra Masterplan planning system. Methods: Twenty esophagus patients were selected for this study. All these patients underwent radical VMAT treatment in Elekta Linac and plans were generated in Monaco v5.0 with MonteCarlo(MC) dose calculation algorithm. COMPASS 3D dosimetry comprises an advanced dose calculation algorithm of collapsed cone convolution(CCC). To validate CCC algorithm in COMPASS, The DICOM RT Plans generated using Monaco MC algorithm were transferred to Oncentra Masterplan v4.3 TPS. Only finalmore » dose calculations were performed using CCC algorithm with out optimization in Masterplan planning system. It is proven that MC algorithm is an accurate algorithm and obvious that there will be a difference with MC and CCC algorithms. Hence CCC in COMPASS should be validated with other commercially available CCC algorithm. To use the CCC as pretreatment verification tool with reference to MC generated treatment plans, CCC in OMP and CCC in COMPASS were validated using dose volume based indices such as D98, D95 for target volumes and OAR doses. Results: The point doses for open beams were observed <1% with reference to Monaco MC algorithms. Comparisons of CCC(OMP) Vs CCC(COMPASS) showed a mean difference of 1.82%±1.12SD and 1.65%±0.67SD for D98 and D95 respectively for Target coverage. Maximum point dose of −2.15%±0.60SD difference was observed in target volume. The mean lung dose of −2.68%±1.67SD was noticed between OMP and COMPASS. The maximum point doses for spinal cord were −1.82%±0.287SD. Conclusion: In this study, the accuracy of CCC algorithm in COMPASS 3D dosimetry was validated by compared with CCC algorithm in OMP TPS. Dose calculation in COMPASS is feasible within < 2% in comparison with commercially available TPS algorithms.« less

  20. Verification of Pharmacogenetics-Based Warfarin Dosing Algorithms in Han-Chinese Patients Undertaking Mechanical Heart Valve Replacement

    PubMed Central

    Zhao, Li; Chen, Chunxia; Li, Bei; Dong, Li; Guo, Yingqiang; Xiao, Xijun; Zhang, Eryong; Qin, Li

    2014-01-01

    Objective To study the performance of pharmacogenetics-based warfarin dosing algorithms in the initial and the stable warfarin treatment phases in a cohort of Han-Chinese patients undergoing mechanical heart valve replacement. Methods We searched the PubMed, Chinese National Knowledge Infrastructure and Wanfang databases to select pharmacogenetics-based warfarin dosing models. Patients with mechanical heart valve replacement were consecutively recruited between March 2012 and July 2012. The predicted warfarin dose of each patient was calculated and compared with the observed initial and stable warfarin doses. The percentage of patients whose predicted dose fell within 20% of their actual therapeutic dose (percentage within 20%) and the mean absolute error (MAE) were used to evaluate the predictive accuracy of all the selected algorithms. Results A total of 8 algorithms, including the Du, Huang, Miao, Wei, Zhang, Lou, Gage, and International Warfarin Pharmacogenetics Consortium (IWPC) models, were tested in 181 patients. The MAE of the Gage, IWPC and 6 Han-Chinese pharmacogenetics-based warfarin dosing algorithms was less than 0.6 mg/day, and the percentage within 20% exceeded 45% for all of the selected models in both the initial and the stable treatment stages. When patients were stratified according to the warfarin dose range, all of the equations demonstrated better performance in the ideal-dose range (1.88–4.38 mg/day) than in the low-dose range (<1.88 mg/day). Among the 8 algorithms compared, those of Wei, Huang, and Miao showed a lower MAE and a higher percentage within 20% in both the initial and the stable warfarin dose prediction and in the low-dose and the ideal-dose ranges. Conclusions All of the selected pharmacogenetics-based warfarin dosing regimens performed similarly in our cohort. 
However, the algorithms of Wei, Huang, and Miao showed a better potential for warfarin dose prediction in the initial and the stable treatment phases in Han-Chinese patients undergoing mechanical heart valve replacement. PMID:24728385
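    The two accuracy metrics used in this study, MAE and the percentage of patients within 20% of the actual therapeutic dose, are straightforward to compute; a minimal sketch (function and variable names are ours, not from the paper):

```python
def dosing_accuracy(predicted, observed):
    """Mean absolute error (mg/day) and the percentage of patients whose
    predicted dose falls within 20% of the observed therapeutic dose."""
    n = len(observed)
    mae = sum(abs(p - o) for p, o in zip(predicted, observed)) / n
    within = sum(abs(p - o) <= 0.2 * o
                 for p, o in zip(predicted, observed))
    return mae, 100.0 * within / n
```

Note that the 20% band is taken relative to the observed dose, so the same absolute error counts against an algorithm more heavily in the low-dose range, which is consistent with the poorer performance all models showed below 1.88 mg/day.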

  2. [New oral anticoagulant drugs].

    PubMed

    Berkovits, Alejandro; Aizman, Andrés; Zúñiga, Pamela; Pereira, Jaime; Mezzano, Diego

    2011-10-01

    Thromboembolic disease (TED) is the leading cause of morbidity and mortality worldwide. The mainstay of long-term oral anticoagulant therapy has been the use of vitamin K antagonists, whose anticoagulant effect is exerted by inhibiting vitamin K epoxide reductase; warfarin and acenocoumarol are the most commonly used. In the last five years several new drugs for long-term anticoagulation have been developed, which inhibit single clotting factors with the purpose of improving the drug therapeutic range and, ideally, minimizing bleeding risks. This review addresses the state of the art on the clinical use of inhibitors of activated factor X and thrombin.

  3. Evaluation of hybrid inverse planning and optimization (HIPO) algorithm for optimization in real-time, high-dose-rate (HDR) brachytherapy for prostate.

    PubMed

    Pokharel, Shyam; Rana, Suresh; Blikenstaff, Joseph; Sadeghi, Amir; Prestidge, Bradley

    2013-07-08

    The purpose of this study is to investigate the effectiveness of the HIPO planning and optimization algorithm for real-time prostate HDR brachytherapy. This study consists of 20 patients who underwent ultrasound-based real-time HDR brachytherapy of the prostate using the treatment planning system Oncentra Prostate (SWIFT version 3.0). The treatment plans for all patients were optimized using inverse dose-volume histogram-based optimization followed by graphical optimization (GRO) in real time. GRO is the manual manipulation of isodose lines slice by slice, so the quality of the plan depends heavily on planner expertise and experience. The data for all patients were later retrieved, and treatment plans were created and optimized using the HIPO algorithm with the same set of dose constraints, number of catheters, and set of contours as in the real-time optimization. The HIPO algorithm is a hybrid because it combines stochastic and deterministic algorithms. The stochastic algorithm, simulated annealing, searches for the optimal catheter distribution for a given set of dose objectives. The deterministic algorithm, dose-volume histogram-based optimization (DVHO), then optimizes the three-dimensional dose distribution quickly by moving straight downhill once it is in the advantageous region of the search space found by the stochastic algorithm. The PTV receiving 100% of the prescription dose (V100) was 97.56% and 95.38% with GRO and HIPO, respectively. The mean dose (Dmean) and the minimum dose to the hottest 10% of the volume (D10) for the urethra, rectum, and bladder were all statistically lower with HIPO than with GRO using the paired Student's t-test at the 5% significance level. HIPO can provide treatment plans with target coverage comparable to that of GRO with a reduction in dose to the critical structures.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwai, P; Lins, L Nadler

    Purpose: There is a lack of studies with significant cohort data about patients using a pacemaker (PM), implanted cardioverter defibrillator (ICD) or cardiac resynchronization therapy (CRT) device undergoing radiotherapy. There is no literature comparing the cumulative doses delivered to these cardiac implanted electronic devices (CIED) as calculated by different algorithms, nor studies comparing doses with and without heterogeneity correction. The aim of this study was to evaluate the influence of the algorithms Pencil Beam Convolution (PBC), Analytical Anisotropic Algorithm (AAA) and Acuros XB (AXB), as well as of heterogeneity correction, on the risk categorization of patients. Methods: A retrospective analysis of 19 3DCRT or IMRT plans of 17 patients was conducted, calculating the dose delivered to the CIED using the three different calculation algorithms. Doses were evaluated with and without heterogeneity correction for comparison. Risk categorization of the patients was based on their CIED dependency and the cumulative dose in the devices. Results: Total estimated doses at the CIED calculated by AAA or AXB were higher than those calculated by PBC in 56% of the cases. On average, the doses at the CIED calculated by AAA and AXB were 29% and 4% higher, respectively, than those calculated by PBC. The maximum difference between doses calculated by the algorithms was about 1 Gy, with or without heterogeneity correction. Maximum doses calculated with heterogeneity correction were equal to or higher than those obtained without it in 84% of the cases with PBC, 77% with AAA and 67% with AXB. Conclusion: The choice of dose calculation algorithm and heterogeneity correction did not change the risk categorization. 
Since the higher estimated doses delivered to the CIED do not alter the treatment precautions to be taken, it is recommended that the most sophisticated algorithm available be used to predict the dose at the CIED, with heterogeneity correction.

  5. A new warfarin dosing algorithm including VKORC1 3730 G > A polymorphism: comparison with results obtained by other published algorithms.

    PubMed

    Cini, Michela; Legnani, Cristina; Cosmi, Benilde; Guazzaloca, Giuliana; Valdrè, Lelia; Frascaro, Mirella; Palareti, Gualtiero

    2012-08-01

    Warfarin dosing is affected by clinical and genetic variants, but the contribution of the genotype associated with warfarin resistance to pharmacogenetic algorithms has not yet been well assessed. We developed a new dosing algorithm including polymorphisms associated with both warfarin sensitivity and resistance in the Italian population, and its performance was compared with that of eight previously published algorithms. Clinical and genetic data (CYP2C9*2, CYP2C9*3, VKORC1 -1639 G > A, and VKORC1 3730 G > A) were used to elaborate the new algorithm. The derivation and validation groups comprised 55 (58.2% men, mean age 69 years) and 40 (57.5% men, mean age 70 years) patients, respectively, who had been on stable anticoagulation therapy for at least 3 months with different oral anticoagulation therapy (OAT) indications. The performance of the new algorithm, evaluated by the mean absolute error (MAE, defined as the absolute value of the difference between the observed daily maintenance dose and the predicted daily dose), the correlation with the observed dose, and the R² value, was comparable with or slightly lower than that obtained using the other algorithms. The new algorithm correctly assigned 53.3%, 50.0%, and 57.1% of patients to the low (≤25 mg/week), intermediate (26-44 mg/week) and high (≥45 mg/week) dosing ranges, respectively. Our data showed a significant increase in predictive accuracy among patients requiring a high warfarin dose compared with the other algorithms (which ranged from 0% to 28.6%). The algorithm including VKORC1 3730 G > A, associated with warfarin resistance, allowed a more accurate identification of resistant patients who require a higher warfarin dosage.

  6. Hybrid dose calculation: a dose calculation algorithm for microbeam radiation therapy

    NASA Astrophysics Data System (ADS)

    Donzelli, Mattia; Bräuer-Krisch, Elke; Oelfke, Uwe; Wilkens, Jan J.; Bartzsch, Stefan

    2018-02-01

    Microbeam radiation therapy (MRT) is still a preclinical approach in radiation oncology that uses planar, micrometre-wide beamlets with extremely high peak doses, separated by low-dose regions a few hundred micrometres wide. Abundant preclinical evidence demonstrates that MRT spares normal tissue more effectively than conventional radiation therapy, at equivalent tumour control. In order to launch the first clinical trials, accurate and efficient dose calculation methods are an indispensable prerequisite. In this work a hybrid dose calculation approach is presented that is based on a combination of Monte Carlo and kernel-based dose calculation. In various examples the performance of the algorithm is compared to purely Monte Carlo and purely kernel-based dose calculations. The accuracy of the developed algorithm is comparable to conventional pure Monte Carlo calculations. In particular for inhomogeneous materials, the hybrid dose calculation algorithm outperforms purely convolution-based dose calculation approaches. It is demonstrated that the hybrid algorithm can efficiently calculate even complicated pencil beam and cross-firing beam geometries. The required calculation times are substantially lower than for pure Monte Carlo calculations.
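    As a rough illustration of the kernel-based half of such a hybrid scheme, a microbeam fluence profile can be convolved with a dose-deposition kernel. The beamlet width, pitch and Gaussian kernel below are assumed stand-ins for illustration only, not the MC-derived kernels of the paper:

```python
import numpy as np

# Assumed geometry: 50 um wide beamlets on a 200 um centre-to-centre pitch,
# on a 1 um grid. These are typical MRT orders of magnitude, not values
# taken from the paper.
x = np.arange(0.0, 2000.0, 1.0)
fluence = ((x % 200.0) < 50.0).astype(float)

# Hypothetical Gaussian scatter kernel standing in for an MC-derived kernel.
kx = np.arange(-100.0, 101.0, 1.0)
kernel = np.exp(-0.5 * (kx / 20.0) ** 2)
kernel /= kernel.sum()

# Kernel-based dose: convolution of the fluence with the deposition kernel.
dose = np.convolve(fluence, kernel, mode="same")

# Peak-to-valley dose ratio, a key dosimetric quantity in MRT:
# peak at a beamlet centre (x = 25 um) vs. valley between beamlets (x = 125 um).
pvdr = dose[25] / dose[125]
```

The convolution captures how lateral scatter fills in the valley dose; the hybrid method of the paper replaces the idealized Gaussian with Monte Carlo transport where the kernel approximation breaks down.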

  7. Recommendations for dose calculations of lung cancer treatment plans treated with stereotactic ablative body radiotherapy (SABR)

    NASA Astrophysics Data System (ADS)

    Devpura, S.; Siddiqui, M. S.; Chen, D.; Liu, D.; Li, H.; Kumar, S.; Gordon, J.; Ajlouni, M.; Movsas, B.; Chetty, I. J.

    2014-03-01

    The purpose of this study was to systematically evaluate dose distributions computed with 5 different dose algorithms for patients with lung cancers treated using stereotactic ablative body radiotherapy (SABR). Treatment plans for 133 lung cancer patients, initially computed with a 1D pencil-beam (equivalent-path-length, EPL-1D) algorithm, were recalculated with 4 other algorithms commissioned for treatment planning: 3D pencil-beam (EPL-3D), anisotropic analytical algorithm (AAA), collapsed cone convolution superposition (CCC), and Monte Carlo (MC). The plan prescription dose was 48 Gy in 4 fractions normalized to the 95% isodose line. Tumors were classified according to location: peripheral tumors surrounded by lung (lung-island, N=39), peripheral tumors attached to the rib cage or chest wall (lung-wall, N=44), and centrally located tumors (lung-central, N=50). Relative to the EPL-1D algorithm, PTV D95 and mean dose values computed with the other 4 algorithms were lowest for lung-island tumors with the smallest field sizes (3-5 cm). Conversely, the smallest differences were noted for lung-central tumors treated with the largest field widths (7-10 cm). Among all locations, dose distribution differences were most strongly correlated with tumor size for lung-island tumors. For most cases, the convolution/superposition and MC algorithms were in good agreement. Mean lung dose (MLD) values computed with the EPL-1D algorithm were highly correlated with those of the other algorithms (correlation coefficient = 0.99). The MLD values were found to be approximately 10% lower for small lung-island tumors with the model-based (convolution/superposition and MC) versus the correction-based (pencil-beam) algorithms, with the model-based algorithms predicting greater low-dose spread within the lungs. This study suggests that pencil beam algorithms should be avoided for lung SABR planning. 
For the most challenging cases, small tumors surrounded entirely by lung tissue (lung-island type), a Monte-Carlo-based algorithm may be warranted.

  8. Influence of different dose calculation algorithms on the estimate of NTCP for lung complications

    PubMed Central

    Bäck, Anna

    2013-01-01

    Due to limitations and uncertainties in dose calculation algorithms, different algorithms can predict different dose distributions and dose‐volume histograms for the same treatment. This can be a problem when estimating the normal tissue complication probability (NTCP) for patient‐specific dose distributions. Published NTCP model parameters are often derived for a different dose calculation algorithm than the one used to calculate the actual dose distribution. The use of algorithm‐specific NTCP model parameters can prevent errors caused by differences in dose calculation algorithms. The objective of this work was to determine how to change the NTCP model parameters for lung complications derived for a simple correction‐based pencil beam dose calculation algorithm, in order to make them valid for three other common dose calculation algorithms. NTCP was calculated with the relative seriality (RS) and Lyman‐Kutcher‐Burman (LKB) models. The four dose calculation algorithms used were the pencil beam (PB) and collapsed cone (CC) algorithms employed by Oncentra, and the pencil beam convolution (PBC) and anisotropic analytical algorithm (AAA) employed by Eclipse. Original model parameters for lung complications were taken from four published studies on different grades of pneumonitis, and new algorithm‐specific NTCP model parameters were determined. The difference between original and new model parameters was presented in relation to the reported model parameter uncertainties. Three different types of treatments were considered in the study: tangential and locoregional breast cancer treatment and lung cancer treatment. Changing the algorithm without the derivation of new model parameters caused changes in the NTCP value of up to 10 percentage points for the cases studied. Furthermore, the error introduced could be of the same magnitude as the confidence intervals of the calculated NTCP values. 
The new NTCP model parameters were tabulated as the algorithm was varied from PB to PBC, AAA, or CC. Moving from the PB to the PBC algorithm did not require new model parameters; however, moving from PB to AAA or CC did require a change in the NTCP model parameters, with CC requiring the largest change. It was shown that the new model parameters for a given algorithm are different for the different treatment types. PACS numbers: 87.53.‐j, 87.53.Kn, 87.55.‐x, 87.55.dh, 87.55.kd PMID:24036865

  9. Dosing algorithm to target a predefined AUC in patients with primary central nervous system lymphoma receiving high dose methotrexate.

    PubMed

    Joerger, Markus; Ferreri, Andrés J M; Krähenbühl, Stephan; Schellens, Jan H M; Cerny, Thomas; Zucca, Emanuele; Huitema, Alwin D R

    2012-02-01

    There is no consensus regarding optimal dosing of high dose methotrexate (HDMTX) in patients with primary CNS lymphoma. Our aim was to develop a convenient dosing algorithm to target AUC(MTX) in the range between 1000 and 1100 µmol·l⁻¹·h. A population covariate model from a pooled dataset of 131 patients receiving HDMTX was used to simulate concentration-time curves of 10,000 patients and test the efficacy of a dosing algorithm based on 24 h MTX plasma concentrations to target the prespecified AUC(MTX). These data simulations included interindividual, interoccasion and residual unidentified variability. Patients received a total of four simulated cycles of HDMTX and adjusted MTX dosages were given for cycles two to four. The dosing algorithm proposes MTX dose adaptations ranging from +75% in patients with MTX C24 < 0.5 µmol·l⁻¹ up to -35% in patients with MTX C24 > 12 µmol·l⁻¹. The proposed dosing algorithm resulted in a marked improvement of the proportion of patients within the AUC(MTX) target between 1000 and 1100 µmol·l⁻¹·h (11% with the standard MTX dose, 35% with the adjusted dose) and a marked reduction of the interindividual variability of MTX exposure. A simple and practical dosing algorithm for HDMTX has been developed based on MTX 24 h plasma concentrations, and its potential efficacy in improving the proportion of patients within a prespecified target AUC(MTX) and reducing the interindividual variability of MTX exposure has been shown by data simulations. The clinical benefit of this dosing algorithm should be assessed in patients with primary central nervous system lymphoma (PCNSL). © 2011 The Authors. British Journal of Clinical Pharmacology © 2011 The British Pharmacological Society.
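    The shape of the C24-guided adaptation can be sketched as a dose-multiplier function. Only the two end-points stated in the abstract are reproduced (+75% below 0.5 µmol/l, -35% above 12 µmol/l); the intermediate steps of the published algorithm are not given here, so a hypothetical log-linear interpolation bridges them:

```python
import math

def adjust_mtx_dose(prev_dose, c24):
    """Sketch: adapt the next HDMTX dose from the 24 h plasma level (umol/l).

    Only the published end-points are reproduced; the interpolation between
    them is our own assumption, NOT the authors' published table.
    """
    if c24 < 0.5:
        factor = 1.75     # +75%, as stated in the abstract
    elif c24 > 12.0:
        factor = 0.65     # -35%, as stated in the abstract
    else:
        # hypothetical log-linear bridge between the two end-points
        frac = ((math.log(c24) - math.log(0.5))
                / (math.log(12.0) - math.log(0.5)))
        factor = 1.75 + frac * (0.65 - 1.75)
    return prev_dose * factor
```

A log scale is chosen for the bridge because the trigger concentrations span more than an order of magnitude, but any monotone interpolation would reproduce the qualitative behaviour.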

  10. Comparison of selected dose calculation algorithms in radiotherapy treatment planning for tissues with inhomogeneities

    NASA Astrophysics Data System (ADS)

    Woon, Y. L.; Heng, S. P.; Wong, J. H. D.; Ung, N. M.

    2016-03-01

    Inhomogeneity correction is recommended for accurate dose calculation in radiotherapy treatment planning, since the human body is highly inhomogeneous owing to the presence of bones and air cavities. However, each dose calculation algorithm has its own limitations. This study assesses the accuracy of five algorithms that are currently implemented for treatment planning: pencil beam convolution (PBC), superposition (SP), anisotropic analytical algorithm (AAA), Monte Carlo (MC) and Acuros XB (AXB). The calculated dose was compared with the dose measured using radiochromic film (Gafchromic EBT2) in inhomogeneous phantoms. In addition, the dosimetric impact of the different algorithms on intensity-modulated radiotherapy (IMRT) was studied for the head and neck region. MC had the best agreement with the measured percentage depth dose (PDD) within the inhomogeneous region, followed by AXB, AAA, SP and PBC. For IMRT planning, the MC algorithm is recommended in preference to PBC and SP. The MC and AXB algorithms were found to have better accuracy in terms of inhomogeneity correction and should be used for tumour volumes in the proximity of inhomogeneous structures.

  11. Dental management of patients receiving anticoagulant and/or antiplatelet treatment

    PubMed Central

    Chaveli-López, Begonya; Gavaldá-Esteve, Carmen

    2014-01-01

    Introduction: Adequate hemostasis is crucial for the success of invasive dental treatment, since bleeding problems can give rise to complications associated with important morbidity and mortality. The dental treatment of patients with an increased risk of bleeding due to the use of anticoagulant and/or antiplatelet drugs poses a challenge in the daily practice of dental professionals. Adequate knowledge of the mechanisms underlying hemostasis, and the optimized management of such patients, are therefore very important issues. Objectives: A study is made of the anticoagulant/antiplatelet drugs currently available on the market, with evaluation of the risks and benefits of suspending such drugs prior to invasive dental treatment. In addition, a review is made of the current management protocols used in these patients. Material and Methods: A literature search was made in the PubMed, Cochrane Library and Scopus databases, covering all studies published in the last 5 years in English and Spanish. Studies conducted in humans and with scientific evidence levels 1 and 2 (meta-analyses, systematic reviews, randomized phase 1 and 2 trials, cohort studies and case-control studies) were considered. The keywords used for the search were: tooth extraction, oral surgery, hemostasis, platelet aggregation inhibitors, antiplatelet drugs, anticoagulants, warfarin, acenocoumarol. Results and Conclusions: Many management protocols have been developed, though in all cases a full clinical history is required, together with complementary hemostatic tests, to minimize any risks derived from dental treatment. Many authors consider that patient medication indicated for the treatment of background disease should not be altered or suspended unless so indicated by the prescribing physician. Local hemostatic measures have been shown to suffice for controlling possible bleeding problems resulting from dental treatment. 
Key words: Tooth extraction, oral surgery, hemostasis, platelet aggregation inhibitors, antiplatelet drugs, anticoagulants, warfarin, acenocoumarol. PMID:24790716

  12. Sparsity constrained split feasibility for dose-volume constraints in inverse planning of intensity-modulated photon or proton therapy

    NASA Astrophysics Data System (ADS)

    Penfold, Scott; Zalas, Rafał; Casiraghi, Margherita; Brooke, Mark; Censor, Yair; Schulte, Reinhard

    2017-05-01

    A split feasibility formulation for the inverse problem of intensity-modulated radiation therapy treatment planning with dose-volume constraints included in the planning algorithm is presented. It involves a new type of sparsity constraint that enables the inclusion of a percentage-violation constraint in the model problem and its handling by continuous (as opposed to integer) methods. We propose an iterative algorithmic framework for solving such a problem by applying the feasibility-seeking CQ-algorithm of Byrne combined with the automatic relaxation method that uses cyclic projections. Detailed implementation instructions are furnished. Functionality of the algorithm was demonstrated through the creation of an intensity-modulated proton therapy plan for a simple 2D C-shaped geometry and also for a realistic base-of-skull chordoma treatment site. Monte Carlo simulations of proton pencil beams of varying energy were conducted to obtain dose distributions for the 2D test case. A research release of the Pinnacle 3 proton treatment planning system was used to extract pencil beam doses for a clinical base-of-skull chordoma case. In both cases the beamlet doses were calculated to satisfy dose-volume constraints according to our new algorithm. Examination of the dose-volume histograms following inverse planning with our algorithm demonstrated that it performed as intended. The application of our proposed algorithm to dose-volume constraint inverse planning was successfully demonstrated. Comparison with optimized dose distributions from the research release of the Pinnacle 3 treatment planning system showed the algorithm could achieve equivalent or superior results.
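    The core of the scheme above is the feasibility-seeking CQ-algorithm of Byrne. A minimal sketch, omitting the paper's sparsity/percentage-violation machinery and the automatic relaxation method: C is taken as the nonnegative orthant of beamlet intensities and Q as a per-voxel dose interval, a simplification of the clinical constraint sets:

```python
import numpy as np

def cq_algorithm(dose_matrix, d_min, d_max, iters=2000):
    """Basic CQ iteration: seek x in C with dose_matrix @ x in Q.

    C = nonnegative orthant (beamlet intensities); Q = the per-voxel
    interval [d_min, d_max]. The dose-volume/sparsity constraints of the
    paper are not modeled here.
    """
    A = dose_matrix
    x = np.zeros(A.shape[1])
    # step size must lie in (0, 2 / ||A||^2); use 1 / ||A||^2
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(iters):
        Ax = A @ x
        pq = np.clip(Ax, d_min, d_max)                        # project onto Q
        x = np.maximum(x + gamma * (A.T @ (pq - Ax)), 0.0)    # project onto C
    return x
```

Each iteration moves x along the gradient of the distance of A @ x from Q and then projects back onto C; when the intersection is nonempty the iterates converge to a feasible intensity vector rather than to a minimizer of any objective, which is the defining feature of feasibility-seeking methods.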

  13. Pharmacogenetics-based warfarin dosing algorithm decreases time to stable anticoagulation and the risk of major hemorrhage: an updated meta-analysis of randomized controlled trials.

    PubMed

    Wang, Zhi-Quan; Zhang, Rui; Zhang, Peng-Pai; Liu, Xiao-Hong; Sun, Jian; Wang, Jun; Feng, Xiang-Fei; Lu, Qiu-Fen; Li, Yi-Gang

    2015-04-01

Warfarin is still the most widely used oral anticoagulant for thromboembolic diseases, despite the recent emergence of novel anticoagulants. However, difficulty in maintaining a stable dose within the therapeutic range and subsequent serious adverse effects have markedly limited its use in clinical practice. Pharmacogenetics-based warfarin dosing is a recently developed strategy to predict the initial and maintenance doses of warfarin. However, whether this algorithm is superior to the conventional clinically guided dosing algorithm remains controversial. We compared pharmacogenetics-based and clinically guided dosing algorithms in an updated meta-analysis. We searched OVID MEDLINE, EMBASE, and the Cochrane Library for relevant citations. The primary outcome was the percentage of time in therapeutic range. The secondary outcomes were time to stable therapeutic dose and the risks of adverse events, including all-cause mortality, thromboembolic events, total bleedings, and major bleedings. Eleven randomized controlled trials with 2639 participants were included. Our pooled estimates indicated that the pharmacogenetics-based dosing algorithm did not improve the percentage of time in therapeutic range [weighted mean difference, 4.26; 95% confidence interval (CI), -0.50 to 9.01; P = 0.08], but it significantly shortened the time to stable therapeutic dose (weighted mean difference, -8.67; 95% CI, -11.86 to -5.49; P < 0.00001). Additionally, the pharmacogenetics-based algorithm significantly reduced the risk of major bleedings (odds ratio, 0.48; 95% CI, 0.23 to 0.98; P = 0.04), but it did not reduce the risks of all-cause mortality, total bleedings, or thromboembolic events. Our results suggest that a pharmacogenetics-based warfarin dosing algorithm significantly improves the efficiency of International Normalized Ratio correction and reduces the risk of major hemorrhage.
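The pooled estimates quoted above come from standard inverse-variance meta-analysis. A minimal fixed-effect sketch of pooling per-trial weighted mean differences (the trial numbers below are made up for illustration, not the eleven trials in this meta-analysis):

```python
import numpy as np

def pooled_mean_difference(diffs, ses):
    """Fixed-effect inverse-variance pooling of per-trial mean differences.

    diffs: per-trial mean differences; ses: their standard errors.
    Returns the pooled estimate and a 95% confidence interval."""
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # inverse-variance weights
    pooled = float(np.sum(w * np.asarray(diffs)) / np.sum(w))
    se = 1.0 / np.sqrt(np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Invented trial results: mean difference in time to stable dose (days)
# and the corresponding standard errors.
est, ci = pooled_mean_difference([-9.0, -7.5, -10.2], [2.0, 1.5, 3.0])
```

Random-effects pooling (e.g. DerSimonian-Laird) adds a between-trial variance term to the weights but follows the same shape.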

  14. Experimental evaluation of a GPU-based Monte Carlo dose calculation algorithm in the Monaco treatment planning system.

    PubMed

    Paudel, Moti R; Kim, Anthony; Sarfehnia, Arman; Ahmad, Sayed B; Beachey, David J; Sahgal, Arjun; Keller, Brian M

    2016-11-08

A new GPU-based Monte Carlo dose calculation algorithm (GPUMCD), developed by the vendor Elekta for the Monaco treatment planning system (TPS), is capable of modeling dose for both a standard linear accelerator and an Elekta MRI linear accelerator. We have experimentally evaluated this algorithm for a standard Elekta Agility linear accelerator. A beam model was developed in the Monaco TPS (research version 5.09.06) using the commissioned beam data for a 6 MV Agility linac. A heterogeneous phantom representing several scenarios - tumor-in-lung, lung, and bone-in-tissue - was designed and built. Dose calculations in Monaco were done using both the current clinical Monte Carlo algorithm, XVMC, and the new GPUMCD algorithm. Dose calculations in a Pinnacle TPS were also produced using the collapsed cone convolution (CCC) algorithm with heterogeneity correction. Calculations were compared with the measured doses using an ionization chamber (A1SL) and Gafchromic EBT3 films for 2 × 2 cm2, 5 × 5 cm2, and 10 × 10 cm2 field sizes. The percentage depth doses (PDDs) calculated by XVMC and GPUMCD in a homogeneous solid water phantom were within 2%/2 mm of film measurements and within 1% of ion chamber measurements. For the tumor-in-lung phantom, the calculated doses were within 2.5%/2.5 mm of film measurements for GPUMCD. For the lung phantom, doses calculated by all of the algorithms were within 3%/3 mm of film measurements, except for the 2 × 2 cm2 field size where the CCC algorithm underestimated the depth dose by ~ 5% in a larger extent of the lung region. For the bone phantom, all of the algorithms were equivalent and calculated dose to within 2%/2 mm of film measurements, except at the interfaces. Both GPUMCD and XVMC showed interface effects, which were more pronounced for GPUMCD and were comparable to film measurements, whereas the CCC algorithm showed these effects poorly. © 2016 The Authors.
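The 2%/2 mm and 3%/3 mm agreement statements above are gamma-index criteria that combine a dose-difference test with a distance-to-agreement test. A minimal 1D global gamma sketch on synthetic depth-dose curves (the curves and parameters are invented; clinical implementations work on 2D/3D grids with interpolation):

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, positions, dd=0.02, dta=2.0):
    """1D global gamma index (default 2%/2 mm): for each reference point,
    the minimum combined dose/distance discrepancy over all evaluated points."""
    d_norm = dd * dose_ref.max()   # global dose criterion (fraction of max dose)
    gammas = []
    for x_r, d_r in zip(positions, dose_ref):
        g2 = ((positions - x_r) / dta) ** 2 + ((dose_eval - d_r) / d_norm) ** 2
        gammas.append(np.sqrt(g2.min()))
    return np.array(gammas)

# Synthetic depth-dose curves on a 1 mm grid; the "evaluated" curve is the
# reference scaled by 1%, well inside the 2%/2 mm tolerance.
z = np.arange(0.0, 100.0, 1.0)
ref = 100.0 * np.exp(-0.03 * z)
ev = 1.01 * ref
g = gamma_1d(ref, ev, z)
pass_rate = float(np.mean(g <= 1.0))
```

A point passes when its gamma value is at most 1; here the evaluated curve differs by only 1% of the maximum dose everywhere, so every point passes.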

  15. The Impact of Monte Carlo Dose Calculations on Intensity-Modulated Radiation Therapy

    NASA Astrophysics Data System (ADS)

    Siebers, J. V.; Keall, P. J.; Mohan, R.

    The effect of dose calculation accuracy for IMRT was studied by comparing different dose calculation algorithms. A head and neck IMRT plan was optimized using a superposition dose calculation algorithm. Dose was re-computed for the optimized plan using both Monte Carlo and pencil beam dose calculation algorithms to generate patient and phantom dose distributions. Tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP) were computed to estimate the plan outcome. For the treatment plan studied, Monte Carlo best reproduces phantom dose measurements, the TCP was slightly lower than the superposition and pencil beam results, and the NTCP values differed little.

  16. Accuracy of patient specific organ-dose estimates obtained using an automated image segmentation algorithm

    NASA Astrophysics Data System (ADS)

    Gilat-Schmidt, Taly; Wang, Adam; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh

    2016-03-01

    The overall goal of this work is to develop a rapid, accurate and fully automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using a deterministic Boltzmann Transport Equation solver and automated CT segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. The investigated algorithm uses a combination of feature-based and atlas-based methods. A multiatlas approach was also investigated. We hypothesize that the auto-segmentation algorithm is sufficiently accurate to provide organ dose estimates since random errors at the organ boundaries will average out when computing the total organ dose. To test this hypothesis, twenty head-neck CT scans were expertly segmented into nine regions. A leave-one-out validation study was performed, where every case was automatically segmented with each of the remaining cases used as the expert atlas, resulting in nineteen automated segmentations for each of the twenty datasets. The segmented regions were applied to gold-standard Monte Carlo dose maps to estimate mean and peak organ doses. The results demonstrated that the fully automated segmentation algorithm estimated the mean organ dose to within 10% of the expert segmentation for regions other than the spinal canal, with median error for each organ region below 2%. In the spinal canal region, the median error was 7% across all data sets and atlases, with a maximum error of 20%. The error in peak organ dose was below 10% for all regions, with a median error below 4% for all organ regions. The multiple-case atlas reduced the variation in the dose estimates and additional improvements may be possible with more robust multi-atlas approaches. Overall, the results support potential feasibility of an automated segmentation algorithm to provide accurate organ dose estimates.
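The averaging-out hypothesis described above can be exercised on synthetic data: perturb a segmentation boundary by about one voxel and compare the mean organ dose against the unperturbed mask. Everything below (the dose model, organ shape, and boundary noise) is invented for illustration:

```python
import numpy as np

def organ_dose_stats(dose_map, mask):
    """Mean and peak dose over the voxels selected by a binary organ mask."""
    organ = dose_map[mask]
    return float(organ.mean()), float(organ.max())

# Synthetic test: a smooth radial dose map, an "expert" disk organ, and an
# "automated" contour whose boundary radius is jittered by up to one voxel.
rng = np.random.default_rng(1)
yy, xx = np.mgrid[0:64, 0:64]
r = np.hypot(yy - 32, xx - 32)
dose = 2.0 + 0.02 * (32.0 - r)                  # Gy, falls off with radius
expert = r < 10.0
auto = r < 10.0 + rng.uniform(-1.0, 1.0, size=r.shape)
mean_expert, _ = organ_dose_stats(dose, expert)
mean_auto, _ = organ_dose_stats(dose, auto)
pct_err = 100.0 * abs(mean_auto - mean_expert) / mean_expert
```

Because boundary voxels carry doses close to the organ mean and the jitter is zero-mean, the error in mean dose stays small even though many individual boundary voxels are misclassified.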

  17. Comparison of build-up region doses in oblique tangential 6 MV photon beams calculated by AAA and CCC algorithms in breast Rando phantom

    NASA Astrophysics Data System (ADS)

    Masunun, P.; Tangboonduangjit, P.; Dumrongkijudom, N.

    2016-03-01

The purpose of this study is to compare, between two algorithms, the build-up region doses at the bolus-covered surface of a breast Rando phantom, the doses within the breast phantom, and the doses in the lung, which is the heterogeneous region. The AAA in the Eclipse TPS and the collapsed cone convolution (CCC) algorithm in the Pinnacle treatment planning system were used to plan a tangential-field technique with a 6 MV photon beam to a 200 cGy total dose in the breast Rando phantom with bolus covering (5 mm and 10 mm). TLDs were calibrated with Cobalt-60 and used to measure the doses during irradiation. The treatment planning results show that the doses in the build-up region and in the breast phantom were closely matched between the two algorithms, with less than 2% difference. However, the AAA overestimated doses in the lung (L2), with differences of 13.78% and 6.06% at 5 mm and 10 mm bolus thickness, respectively, compared with the CCC algorithm. The TLD measurements showed underestimates in the build-up region and in the breast phantom, whereas the doses in the lung (L2) were overestimated, when compared with the doses in the two plans at both bolus thicknesses.

  18. SU-F-T-600: Influence of Acuros XB and AAA Dose Calculation Algorithms On Plan Quality Metrics and Normal Lung Doses in Lung SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yaparpalvi, R; Mynampati, D; Kuo, H

Purpose: To study the influence of the superposition-beam model (AAA) and the deterministic photon transport solver (Acuros XB) dose calculation algorithms on treatment plan quality metrics and on normal lung dose in lung SBRT. Methods: Treatment plans of 10 lung SBRT patients were randomly selected. Patients were prescribed a total dose of 50-54 Gy in 3-5 fractions (10 Gy × 5 or 18 Gy × 3). Plans were optimized with 6 MV beams using 2 arcs (VMAT). Doses were calculated using the AAA algorithm with heterogeneity correction. For each plan, plan quality metrics in the categories of coverage, homogeneity, conformity, and gradient were quantified. Repeat dosimetry for these AAA treatment plans was performed using the AXB algorithm with heterogeneity correction for the same beam and MU parameters. Plan quality metrics were again evaluated and compared with the AAA plan metrics. For normal lung dose, V20 and V5 of (total lung - GTV) were evaluated. Results: The results are summarized in Supplemental Table 1. PTV volume was a mean of 11.4 (±3.3) cm3. Comparing against RTOG 0813 protocol criteria for conformality, AXB plans yielded, on average, a similar PITV ratio (individual PITV ratio differences varied from −9 to +15%), reduced target coverage (−1.6%), and increased R50% (+2.6%). Comparing normal lung doses, lung V20 (+3.1%) and V5 (+1.5%) were slightly higher for AXB plans than for AAA plans. High-dose spillage ((V105%PD - PTV)/PTV) was slightly lower for AXB plans, but low-dose spillage (D2cm) was similar between the two calculation algorithms. Conclusion: The AAA algorithm overestimates lung target dose. Routinely adopting AXB for dose calculations in lung SBRT planning may improve dose calculation accuracy, as AXB-based calculations have been shown to be closer to Monte Carlo dose predictions in accuracy, with relatively fast computation time.
For clinical practice, revisiting dose fractionation in lung SBRT to correct for algorithm-attributable dose overestimates may well be warranted.
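The conformality metrics referenced above (PITV ratio, R50%, high-dose spillage) reduce to threshold counts on the dose array. A minimal sketch with a hypothetical function name and a deterministic five-voxel toy plan (the numbers are invented for the example):

```python
import numpy as np

def sbrt_plan_metrics(dose, ptv_mask, rx, voxel_cc=1.0):
    """RTOG-style conformality metrics from a dose array and a PTV mask:
    PITV (prescription isodose volume / PTV volume), R50% (half-prescription
    isodose volume / PTV volume), and high-dose spillage
    ((volume >= 105% of prescription outside the PTV) / PTV volume)."""
    ptv = voxel_cc * ptv_mask.sum()
    piv = voxel_cc * (dose >= rx).sum()
    v50 = voxel_cc * (dose >= 0.5 * rx).sum()
    hot_outside = voxel_cc * ((dose >= 1.05 * rx) & ~ptv_mask).sum()
    return {"PITV": piv / ptv, "R50%": v50 / ptv, "spillage": hot_outside / ptv}

# Deterministic 5-voxel toy "plan": prescription 50 Gy to a 2-voxel PTV.
dose = np.array([60.0, 55.0, 53.0, 30.0, 20.0])
ptv = np.array([True, True, False, False, False])
metrics = sbrt_plan_metrics(dose, ptv, rx=50.0)
```

In clinical use the arrays are 3D and voxel_cc carries the real voxel volume, but the thresholding logic is the same.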

  19. Sensitivity of NTCP parameter values against a change of dose calculation algorithm.

    PubMed

    Brink, Carsten; Berg, Martin; Nielsen, Morten

    2007-09-01

    Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have retrospectively been recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models.
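One common way to carry out such a refit is with the Lyman-Kutcher-Burman (LKB) NTCP model. The sketch below uses invented parameter values and voxel doses, and stands in for the CC recalculation with a uniform 5% dose scaling; because gEUD scales linearly with dose for fixed volume parameter, rescaling TD50 by the same factor reproduces the PB NTCP exactly, mirroring the paper's fit-to-match procedure:

```python
import math

def geud(doses, a):
    """Generalized equivalent uniform dose with volume-effect parameter a."""
    return (sum(d ** a for d in doses) / len(doses)) ** (1.0 / a)

def lkb_ntcp(eud, td50, m):
    """LKB probit model: NTCP = Phi((EUD - TD50) / (m * TD50))."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Invented lung voxel doses scored with hypothetical PB parameters, then a
# stand-in "CC recalculation" that lowers every dose by 5%. Refitting TD50
# by the same 5% restores the same NTCP for the same patient.
pb_doses = [5.0, 12.0, 18.0, 25.0]
cc_doses = [0.95 * d for d in pb_doses]
ntcp_pb = lkb_ntcp(geud(pb_doses, a=1.0), td50=30.5, m=0.3)
ntcp_cc = lkb_ntcp(geud(cc_doses, a=1.0), td50=0.95 * 30.5, m=0.3)
```

A real refit, as in the paper, adjusts the parameters over all patients simultaneously because CC-vs-PB differences are not a uniform scaling; this sketch only shows why systematic dose shifts map directly onto shifted parameter values.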

  1. SU-E-T-516: Dosimetric Validation of AcurosXB Algorithm in Comparison with AAA & CCC Algorithms for VMAT Technique.

    PubMed

    Kathirvel, M; Subramanian, V Sai; Arun, G; Thirumalaiswamy, S; Ramalingam, K; Kumar, S Ashok; Jagadeesh, K

    2012-06-01

To dosimetrically validate the AcurosXB algorithm for volumetric modulated arc therapy (VMAT) in comparison with the standard clinical Anisotropic Analytical Algorithm (AAA) and collapsed cone convolution (CCC) dose calculation algorithms. The AcurosXB dose calculation algorithm is available with the Varian Eclipse treatment planning system (V10). It uses a grid-based Boltzmann equation solver to predict dose precisely in less time. This study assessed the algorithm's ability to predict dose as accurately as it is delivered, using five clinical cases each of brain, head & neck, thoracic, pelvic, and SBRT treatments. Verification plans were created on a multicube phantom with an iMatrixx-2D detector array; dose prediction was done with the AcurosXB, AAA, and CCC (COMPASS system) algorithms, and the plans were delivered on a CLINAC-iX treatment machine. The delivered dose was captured in the iMatrixx plane for all 25 plans. The measured dose was taken as the reference to quantify the agreement of the AcurosXB calculation algorithm against the previously validated AAA and CCC algorithms. Gamma evaluation was performed with clinical criteria of 3 mm/3% and 2 mm/2% (distance-to-agreement/dose difference) in omnipro-I'MRT software. Plans were evaluated in terms of correlation coefficient, quantitative area gamma, and average gamma. The study shows good agreement, with mean correlations of 0.9979±0.0012, 0.9984±0.0009, and 0.9979±0.0011 for AAA, CCC, and Acuros, respectively. Mean area gamma for the 3 mm/3% criterion was 98.80±1.04, 98.14±2.31, and 98.08±2.01, and for 2 mm/2% was 93.94±3.83, 87.17±10.54, and 92.36±5.46 for AAA, CCC, and Acuros, respectively. Mean average gamma for 3 mm/3% was 0.26±0.07, 0.42±0.08, and 0.28±0.09, and for 2 mm/2% was 0.39±0.10, 0.64±0.11, and 0.42±0.13 for AAA, CCC, and Acuros, respectively. This study demonstrated that the AcurosXB algorithm agreed well with the AAA and CCC in terms of dose prediction.
In conclusion, the AcurosXB algorithm provides a valid, accurate, and fast alternative to the AAA and CCC algorithms in a busy clinical environment. © 2012 American Association of Physicists in Medicine.

  2. Effect of deformable registration on the dose calculated in radiation therapy planning CT scans of lung cancer patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cunliffe, Alexandra R.; Armato, Samuel G.; White, Bradley

    2015-01-15

Purpose: To characterize the effects of deformable image registration of serial computed tomography (CT) scans on the radiation dose calculated from a treatment planning scan. Methods: Eighteen patients who received curative doses (≥60 Gy, 2 Gy/fraction) of photon radiation therapy for lung cancer treatment were retrospectively identified. For each patient, a diagnostic-quality pretherapy (4-75 days) CT scan and a treatment planning scan with an associated dose map were collected. To establish correspondence between scan pairs, a researcher manually identified anatomically corresponding landmark point pairs between the two scans. Pretherapy scans then were coregistered with planning scans (and associated dose maps) using the demons deformable registration algorithm and two variants of the Fraunhofer MEVIS algorithm ("Fast" and "EMPIRE10"). Landmark points in each pretherapy scan were automatically mapped to the planning scan using the displacement vector field output from each of the three algorithms. The Euclidean distance between manually and automatically mapped landmark points (d_E) and the absolute difference in planned dose (|ΔD|) were calculated. Using regression modeling, |ΔD| was modeled as a function of d_E, dose (D), dose standard deviation (SD_dose) in an eight-pixel neighborhood, and the registration algorithm used. Results: Over 1400 landmark point pairs were identified, with 58-93 (median: 84) points identified per patient. Average |ΔD| across patients was 3.5 Gy (range: 0.9-10.6 Gy). Registration accuracy was highest using the Fraunhofer MEVIS EMPIRE10 algorithm, with an average d_E across patients of 5.2 mm (compared with >7 mm for the other two algorithms). Consequently, average |ΔD| was also lowest using the Fraunhofer MEVIS EMPIRE10 algorithm. |ΔD| increased significantly as a function of d_E (0.42 Gy/mm), D (0.05 Gy/Gy), SD_dose (1.4 Gy/Gy), and the algorithm used (≤1 Gy).
Conclusions: An average error of <4 Gy in radiation dose was introduced when points were mapped between CT scan pairs using deformable registration, with the majority of points yielding dose-mapping errors <2 Gy (approximately 3% of the total prescribed dose). Registration accuracy was highest using the Fraunhofer MEVIS EMPIRE10 algorithm, resulting in the smallest errors in mapped dose. Dose differences following registration increased significantly with increasing spatial registration error, dose, and dose gradient (i.e., SD_dose). This model provides a measurement of the uncertainty in the radiation dose when points are mapped between serial CT scans through deformable registration.
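The regression reported in the Results can be mimicked on synthetic data: generate the absolute dose difference from the abstract's slopes plus noise, then recover the coefficients with ordinary least squares. The data below are entirely simulated, with only the slope values echoing the abstract:

```python
import numpy as np

# Simulated version of the paper's regression: |dD| as a linear function of
# registration error d_E (mm), planned dose D (Gy), and local dose SD (Gy).
rng = np.random.default_rng(2)
n = 500
d_e = rng.uniform(0.0, 15.0, n)
dose = rng.uniform(0.0, 70.0, n)
sd = rng.uniform(0.0, 3.0, n)
abs_dd = 0.42 * d_e + 0.05 * dose + 1.4 * sd + rng.normal(0.0, 0.2, n)
X = np.column_stack([d_e, dose, sd, np.ones(n)])   # design matrix + intercept
coef, *_ = np.linalg.lstsq(X, abs_dd, rcond=None)  # ordinary least squares
```

With 500 samples and small noise, the fitted coefficients land close to the generating slopes; the real analysis additionally includes a categorical algorithm term.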

  3. Investigation of photon beam models in heterogeneous media of modern radiotherapy.

    PubMed

    Ding, W; Johnston, P N; Wong, T P Y; Bubb, I F

    2004-06-01

This study investigates the performance of photon beam models in dose calculations involving heterogeneous media in modern radiotherapy. Three dose calculation algorithms implemented in the CMS FOCUS treatment planning system have been assessed and validated using ionization chambers, thermoluminescent dosimeters (TLDs) and film. The algorithms include the multigrid superposition (MGS) algorithm, fast Fourier transform convolution (FFTC) algorithm and Clarkson algorithm. Heterogeneous phantoms used in the study consist of air cavities, lung analogue and an anthropomorphic phantom. Depth dose distributions along the central beam axis for 6 MV and 10 MV photon beams with field sizes of 5 cm x 5 cm and 10 cm x 10 cm were measured in the air cavity phantoms and lung analogue phantom. Point dose measurements were performed in the anthropomorphic phantom. Calculated results with the three dose calculation algorithms were compared with measured results. In the air cavity phantoms, the maximum dose differences between the algorithms and the measurements were found at the distal surface of the air cavity with a 10 MV photon beam and a 5 cm x 5 cm field size. The differences were 3.8%, 24.9% and 27.7% for the MGS, FFTC and Clarkson algorithms, respectively. Experimental measurements of secondary electron build-up range beyond the air cavity showed an increase with decreasing field size, increasing energy and increasing air cavity thickness. The maximum dose differences in the lung analogue with a 5 cm x 5 cm field size were found to be 0.3%, 4.9% and 6.9% for the MGS, FFTC and Clarkson algorithms with a 6 MV photon beam and 0.4%, 6.3% and 9.1% with a 10 MV photon beam, respectively.
In the anthropomorphic phantom, the dose differences between calculations using the MGS algorithm and measurements with TLD rods were less than +/-4.5% for 6 MV and 10 MV photon beams with a 10 cm x 10 cm field size and a 6 MV photon beam with a 5 cm x 5 cm field size, and within +/-7.5% for 10 MV with a 5 cm x 5 cm field size. The FFTC and Clarkson algorithms overestimate doses at all dose points in the lung of the anthropomorphic phantom. In conclusion, the MGS is the most accurate of the investigated photon beam dose calculation algorithms. It is strongly recommended for implementation in modern radiotherapy with multiple small fields when heterogeneous media are in the treatment fields.

  4. Three-Dimensional Electron Beam Dose Calculations.

    NASA Astrophysics Data System (ADS)

    Shiu, Almon Sowchee

The MDAH pencil-beam algorithm developed by Hogstrom et al (1981) has been widely used in clinics for electron beam dose calculations for radiotherapy treatment planning. The primary objective of this research was to address several deficiencies of that algorithm and to develop an enhanced version. Two enhancements have been incorporated into the pencil-beam algorithm; one models fluence rather than planar fluence, and the other models the bremsstrahlung dose using measured beam data. Comparisons of the resulting calculated dose distributions with measured dose distributions for several test phantoms have been made. From these results it is concluded (1) that the fluence-based algorithm is more accurate for dose calculation in an inhomogeneous slab phantom, and (2) that the fluence-based calculation provides only a limited improvement to the accuracy of the calculated dose in the region just downstream of the lateral edge of an inhomogeneity. The source of the latter inaccuracy is believed to be primarily the assumptions made in the pencil beam's modeling of the complex phantom or patient geometry. A pencil-beam redefinition model was developed for the calculation of electron beam dose distributions in three dimensions. The primary aim of this redefinition model was to solve the dosimetry problem presented by deep inhomogeneities, which was the major deficiency of the enhanced version of the MDAH pencil-beam algorithm. The pencil-beam redefinition model is based on the theory of electron transport, redefining the pencil beams at each layer of the medium. The unique approach of this model is that all the physical parameters of a given pencil beam are characterized for multiple energy bins. Comparisons of the calculated dose distributions with measured dose distributions for a homogeneous water phantom and for phantoms with deep inhomogeneities have been made.
From these results it is concluded that the redefinition algorithm is superior to the conventional, fluence-based, pencil-beam algorithm, especially in predicting the dose distribution downstream of a local inhomogeneity. The accuracy of this algorithm appears sufficient for clinical use, and the algorithm is structured for future expansion of the physical model if required for site specific treatment planning problems.

  5. Fatal association of mechanical valve thrombosis with dabigatran: a report of two cases.

    PubMed

    Atar, Shaul; Wishniak, Alice; Shturman, Alexander; Shtiwi, Sewaed; Brezins, Marc

    2013-07-01

    Several new oral anticoagulants have been approved for thromboembolism prevention in patients with nonvalvular atrial fibrillation. However, they are not yet approved for anticoagulation use in patients with prosthetic mechanical valves, and no randomized data have been published so far on their safety of use in these patients. We present two cases of patients with prosthetic mechanical mitral valves who were switched from warfarin and acenocoumarol to dabigatran and within 1 month experienced severe valve complications resulting in death. One patient experienced stroke and later cardiogenic shock and death, and the other experienced pulmonary edema, cardiogenic shock, and subsequent death.

  6. Evaluation of six TPS algorithms in computing entrance and exit doses.

    PubMed

    Tan, Yun I; Metwaly, Mohamed; Glegg, Martin; Baggarley, Shaun; Elliott, Alex

    2014-05-08

    Entrance and exit doses are commonly measured in in vivo dosimetry for comparison with expected values, usually generated by the treatment planning system (TPS), to verify accuracy of treatment delivery. This report aims to evaluate the accuracy of six TPS algorithms in computing entrance and exit doses for a 6 MV beam. The algorithms tested were: pencil beam convolution (Eclipse PBC), analytical anisotropic algorithm (Eclipse AAA), AcurosXB (Eclipse AXB), FFT convolution (XiO Convolution), multigrid superposition (XiO Superposition), and Monte Carlo photon (Monaco MC). Measurements with ionization chamber (IC) and diode detector in water phantoms were used as a reference. Comparisons were done in terms of central axis point dose, 1D relative profiles, and 2D absolute gamma analysis. Entrance doses computed by all TPS algorithms agreed to within 2% of the measured values. Exit doses computed by XiO Convolution, XiO Superposition, Eclipse AXB, and Monaco MC agreed with the IC measured doses to within 2%-3%. Meanwhile, Eclipse PBC and Eclipse AAA computed exit doses were higher than the IC measured doses by up to 5.3% and 4.8%, respectively. Both algorithms assume that full backscatter exists even at the exit level, leading to an overestimation of exit doses. Despite good agreements at the central axis for Eclipse AXB and Monaco MC, 1D relative comparisons showed profiles mismatched at depths beyond 11.5 cm. Overall, the 2D absolute gamma (3%/3 mm) pass rates were better for Monaco MC, while Eclipse AXB failed mostly at the outer 20% of the field area. The findings of this study serve as a useful baseline for the implementation of entrance and exit in vivo dosimetry in clinical departments utilizing any of these six common TPS algorithms for reference comparison.

  7. Short-term reproducibility of computed tomography-based lung density measurements in alpha-1 antitrypsin deficiency and smokers with emphysema.

    PubMed

    Shaker, S B; Dirksen, A; Laursen, L C; Maltbaek, N; Christensen, L; Sander, U; Seersholm, N; Skovgaard, L T; Nielsen, L; Kok-Jensen, A

    2004-07-01

    To study the short-term reproducibility of lung density measurements by multi-slice computed tomography (CT) using three different radiation doses and three reconstruction algorithms. Twenty-five patients with smoker's emphysema and 25 patients with alpha1-antitrypsin deficiency underwent 3 scans at 2-week intervals. Low-dose protocol was applied, and images were reconstructed with bone, detail, and soft algorithms. Total lung volume (TLV), 15th percentile density (PD-15), and relative area at -910 Hounsfield units (RA-910) were obtained from the images using Pulmo-CMS software. Reproducibility of PD-15 and RA-910 and the influence of radiation dose, reconstruction algorithm, and type of emphysema were then analysed. The overall coefficient of variation of volume adjusted PD-15 for all combinations of radiation dose and reconstruction algorithm was 3.7%. The overall standard deviation of volume-adjusted RA-910 was 1.7% (corresponding to a coefficient of variation of 6.8%). Radiation dose, reconstruction algorithm, and type of emphysema had no significant influence on the reproducibility of PD-15 and RA-910. However, bone algorithm and very low radiation dose result in overestimation of the extent of emphysema. Lung density measurement by CT is a sensitive marker for quantitating both subtypes of emphysema. A CT-protocol with radiation dose down to 16 mAs and soft or detail reconstruction algorithm is recommended.
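PD-15 and RA-910 are simple functionals of the voxel HU histogram: the HU value below which 15% of lung voxels fall, and the percentage of voxels below -910 HU. A sketch on an invented HU sample (normal lung plus an emphysematous tail; the distribution parameters are made up):

```python
import numpy as np

def density_indices(hu, percentile=15.0, threshold=-910.0):
    """Lung-density indices from voxel HU values: PD-15, the HU value below
    which 15% of voxels fall, and RA-910, the percentage below -910 HU."""
    pd = float(np.percentile(hu, percentile))
    ra = float(100.0 * np.mean(hu < threshold))
    return pd, ra

# Invented HU sample: 90% normal lung around -860 HU, 10% emphysematous
# tail around -950 HU.
rng = np.random.default_rng(3)
hu = np.concatenate([rng.normal(-860.0, 30.0, 9000),
                     rng.normal(-950.0, 15.0, 1000)])
pd15, ra910 = density_indices(hu)
```

In practice the HU values come from the segmented lung volume, and PD-15 is usually volume-adjusted before comparison across scans, as described above.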

  8. SU-F-J-133: Adaptive Radiation Therapy with a Four-Dimensional Dose Calculation Algorithm That Optimizes Dose Distribution Considering Breathing Motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, I; Algan, O; Ahmad, S

Purpose: To model patient motion and produce four-dimensional (4D) optimized dose distributions that account for motion artifacts in the dose calculation during the treatment planning process. Methods: An algorithm is developed in which patient motion is considered during dose calculation at the treatment planning stage. First, optimal dose distributions are calculated for the stationary target volume, with the dose distributions optimized for intensity-modulated radiation therapy (IMRT). Second, a convolution kernel is produced from the best-fitting curve that matches the motion trajectory of the patient. Third, the motion kernel is deconvolved with the initial dose distribution optimized for the stationary target to produce a dose distribution that is optimized in four dimensions. This algorithm is tested against measured doses using a mobile phantom that moves with controlled motion patterns. Results: A motion-optimized dose distribution is obtained from the initial dose distribution of the stationary target by deconvolution with the motion kernel of the mobile target. This motion-optimized dose distribution is equivalent to that optimized for the stationary target using IMRT. The motion-optimized and measured dose distributions agree, with a gamma index passing rate of >95% for 3% dose difference and 3 mm distance-to-agreement. If the dose delivery per beam takes place over several respiratory cycles, then the spread-out of the dose distributions depends only on the motion amplitude and is not affected by motion frequency or phase. This algorithm is limited to motion amplitudes that are smaller than the length of the target along the direction of motion. Conclusion: An algorithm is developed to optimize dose in 4D. Besides IMRT, which provides optimal dose coverage for a stationary target, it extends dose optimization to 4D by considering target motion.
This algorithm provides an alternative to motion management techniques such as beam gating or breath holding and has potential applications in adaptive radiation therapy.
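The convolution relationship this approach exploits can be illustrated in 1D: over many breathing cycles, the delivered dose is approximately the static dose profile convolved with the probability density of target position (an arcsine-like distribution for sinusoidal motion). The sketch below shows only this forward blurring model, not the paper's deconvolution step, and all numbers are invented:

```python
import numpy as np

def motion_kernel(amplitude_mm, grid_mm, n=100_000):
    """Empirical position PDF for sinusoidal motion, binned onto the dose
    grid (the arcsine distribution, heaviest at the motion extremes)."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    pos = amplitude_mm * np.sin(t)
    edges = np.arange(-amplitude_mm - grid_mm / 2.0,
                      amplitude_mm + grid_mm, grid_mm)
    counts, _ = np.histogram(pos, bins=edges)
    return counts / counts.sum()

# Delivered dose over many cycles ~ static dose convolved with the motion
# PDF: a 20 mm flat target blurred by 5 mm sinusoidal motion on a 1 mm grid.
profile = np.zeros(101)
profile[40:61] = 2.0                     # Gy, static planned profile
kernel = motion_kernel(5.0, 1.0)
blurred = np.convolve(profile, kernel, mode="same")
```

Because the target is wider than the motion amplitude, its center still receives the full dose while the edges blur, consistent with the amplitude-only dependence (and the amplitude limit) noted in the abstract.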

  9. Dosimetric comparison of lung stereotactic body radiotherapy treatment plans using averaged computed tomography and end-exhalation computed tomography images: Evaluation of the effect of different dose-calculation algorithms and prescription methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitsuyoshi, Takamasa; Nakamura, Mitsuhiro, E-mail: m_nkmr@kuhp.kyoto-u.ac.jp; Matsuo, Yukinori

    The purpose of this article is to quantitatively evaluate differences in dose distributions calculated using various computed tomography (CT) datasets, dose-calculation algorithms, and prescription methods in stereotactic body radiotherapy (SBRT) for patients with early-stage lung cancer. Data on 29 patients with early-stage lung cancer treated with SBRT were retrospectively analyzed. Averaged CT (Ave-CT) and expiratory CT (Ex-CT) images were reconstructed for each patient using 4-dimensional CT data. Dose distributions were initially calculated using the Ave-CT images and recalculated (with the same monitor units [MUs]) employing Ex-CT images with the same beam arrangements. The dose-volume parameters, including D95, D90, D50, and D2 of the planning target volume (PTV), were compared between the 2 image sets. To explore the influence of dose-calculation algorithms and prescription methods on the differences in dose distributions evident between Ave-CT and Ex-CT images, we calculated dose distributions using the following 3 different algorithms: x-ray Voxel Monte Carlo (XVMC), Acuros XB (AXB), and the anisotropic analytical algorithm (AAA). We also used 2 different dose-prescription methods: the isocenter prescription and the PTV periphery prescription methods. All differences in PTV dose-volume parameters calculated using Ave-CT and Ex-CT data were within 3 percentage points (%pts) employing the isocenter prescription method, and within 1.5%pts using the PTV periphery prescription method, irrespective of which of the 3 algorithms (XVMC, AXB, and AAA) was employed. The frequencies of dose-volume parameters differing by >1%pt when the XVMC and AXB were used were greater than those associated with the use of the AAA, regardless of the dose-prescription method employed.
All differences in PTV dose-volume parameters calculated using Ave-CT and Ex-CT data on patients who underwent lung SBRT were within 3%pts, regardless of the dose-calculation algorithm or the dose-prescription method employed.
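The dose-volume parameters compared above follow directly from the voxel dose array: D_x is the minimum dose received by the hottest x% of the structure, i.e. the (100 − x)th percentile. A minimal sketch with made-up voxel doses:

```python
import numpy as np

def dvh_dose(voxel_doses, volume_percent):
    """D_x: minimum dose received by the hottest x% of the structure,
    i.e. the (100 - x)th percentile of the voxel dose array."""
    return float(np.percentile(voxel_doses, 100.0 - volume_percent))

rng = np.random.default_rng(0)
# Hypothetical PTV voxel doses (Gy), for illustration only:
ptv = rng.normal(loc=48.0, scale=1.0, size=10_000)

d95, d90, d50, d2 = (dvh_dose(ptv, v) for v in (95, 90, 50, 2))
assert d95 <= d90 <= d50 <= d2  # monotone by definition
```

Differences between two CT datasets would then be reported in percentage points of the prescription dose, as in the abstract.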

  10. In vivo verification of radiation dose delivered to healthy tissue during radiotherapy for breast cancer

    NASA Astrophysics Data System (ADS)

    Lonski, P.; Taylor, M. L.; Hackworth, W.; Phipps, A.; Franich, R. D.; Kron, T.

    2014-03-01

    Different treatment planning system (TPS) algorithms calculate radiation dose in different ways. This work compares measurements made in vivo to the dose calculated at out-of-field locations by three different commercially available algorithms in the Eclipse treatment planning system. LiF:Mg,Cu,P thermoluminescent dosimeter (TLD) chips were placed with 1 cm build-up at six locations on the contralateral side of 5 patients undergoing radiotherapy for breast cancer. TLD readings were compared to calculations of the Pencil Beam Convolution (PBC), Anisotropic Analytical Algorithm (AAA), and Acuros XB (XB) algorithms. AAA predicted zero dose at points beyond 16 cm from the field edge. In the same region, PBC returned an unrealistically constant result independent of distance, while XB showed good agreement with measured data, although it consistently underestimated dose by ~0.1% of the prescription dose. At points closer to the field edge, XB was the superior algorithm, agreeing with TLD results to within 15% of measured dose. Both AAA and PBC showed mixed agreement, with overall discrepancies considerably greater than XB's. While XB is clearly the preferable algorithm, it should be noted that TPS algorithms in general are not designed to calculate dose at peripheral locations, and calculation results in such regions should be treated with caution.

  11. Midline Dose Verification with Diode In Vivo Dosimetry for External Photon Therapy of Head and Neck and Pelvis Cancers During Initial Large-Field Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tung, Chuan-Jong; Department of Biomedical Engineering and Environmental Sciences, National Tsing Hua University, Hsinchu, Taiwan; Yu, Pei-Chieh

    2010-01-01

    During radiotherapy treatments, quality assurance/control is essential, particularly for dose delivery to patients. This study was designed to verify midline doses with diode in vivo dosimetry. Dosimetry was studied for 6-MV bilateral fields in head and neck cancer treatments and 10-MV bilateral and anteroposterior/posteroanterior (AP/PA) fields in pelvic cancer treatments. Diode calibrations, with correction factors, were performed using plastic water phantoms; 190 and 100 portals were studied for head and neck and pelvis treatments, respectively. Midline doses were calculated using the midline transmission, arithmetic mean, and geometric mean algorithms. These midline doses were compared with the treatment planning system target doses for lateral or AP (PA) portals and paired opposed portals. For head and neck treatments, all 3 algorithms were satisfactory, although the geometric mean algorithm was less accurate and more uncertain. For pelvis treatments, the arithmetic mean algorithm seemed unacceptable, whereas the other algorithms were satisfactory. The random error was reduced by using averaged midline doses of paired opposed portals because the asymmetric effect was averaged out. Considering the simplicity of in vivo dosimetry, the arithmetic mean and geometric mean algorithms should be adopted for head/neck and pelvis treatments, respectively.
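The arithmetic-mean and geometric-mean estimators mentioned above can be written down in a few lines. This is the common textbook form, not the paper's exact calibration chain (diode correction factors are omitted):

```python
import math

def midline_arithmetic(entrance_dose, exit_dose):
    """Arithmetic-mean estimate of the midline dose for one portal."""
    return 0.5 * (entrance_dose + exit_dose)

def midline_geometric(entrance_dose, exit_dose):
    """Geometric-mean estimate; for roughly exponential attenuation the
    midplane dose lies near sqrt(entrance * exit)."""
    return math.sqrt(entrance_dose * exit_dose)

# Hypothetical diode readings (Gy) for a thick AP/PA pelvic portal:
# purely exponential falloff puts the true midplane dose at
# sqrt(2.0 * 0.5) = 1.0 Gy, while the arithmetic mean gives 1.25 Gy,
# illustrating why the two algorithms can disagree for thick sites.
entrance, exit_ = 2.0, 0.5
```

Averaging the estimates of a parallel-opposed pair then cancels the asymmetric component, which is the random-error reduction the abstract describes.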

  12. SU-E-T-454: Dosimetric Comparison between Pencil Beam and Monte Carlo Algorithms for SBRT Lung Treatment Using IPlan V4.1 TPS and CIRS Thorax Phantom.

    PubMed

    Fernandez, M Castrillon; Venencia, C; Garrigó, E; Caussa, L

    2012-06-01

    To compare measured and calculated doses using the Pencil Beam (PB) and Monte Carlo (MC) algorithms on a CIRS thorax phantom for SBRT lung treatments. A 6-MV photon beam generated by a Primus linac with an Optifocus MLC (Siemens) was used. Dose calculation was done with the iPlan v4.1.2 TPS (BrainLAB) using the PB and MC (dose to water and dose to medium) algorithms. The commissioning of both algorithms was done by reproducing experimental measurements in water. A CIRS thorax phantom was used to compare doses using a Farmer-type ion chamber (PTW) and EDR2 radiographic films (KODAK). The ionization chamber, in a tissue-equivalent insert, was placed in two positions in lung tissue and was irradiated using three treatment plans. Axial dose distributions were measured for four treatment plans using conformal and IMRT techniques. Dose distribution comparisons were done by dose profiles and the gamma index (3%/3 mm). For the studied beam configurations, ion chamber measurements show that PB overestimates the dose by up to 8.5%, whereas MC has a maximum variation of 1.6%. Dosimetric analysis using dose profiles shows that PB overestimates the dose in the region corresponding to the lung by up to 16%. For the axial dose distribution comparison, the percentage of pixels within tolerance (gamma index not exceeding one) for MC versus PB was, plan 1: 95.6% versus 87.4%; plan 2: 91.2% versus 77.6%; plan 3: 99.7% versus 93.1%; and plan 4: 98.8% versus 91.7%. It was confirmed that the dosimetric errors calculated with the MC algorithm decrease when the spatial resolution and variance are reduced, at the expense of increased computation time. The agreement between measured and calculated doses, in a phantom with lung heterogeneities, is better with the MC algorithm. The PB algorithm overestimates doses in lung tissue, which could have a clinical impact in SBRT lung treatments. © 2012 American Association of Physicists in Medicine.
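The 3%/3 mm gamma criterion used for the comparisons above can be sketched in one dimension with the standard global-gamma definition; the profiles here are toy curves standing in for the measured film data:

```python
import numpy as np

def gamma_1d(ref, evl, x, dose_tol=0.03, dist_tol=3.0):
    """Global 1D gamma: for each reference point, the minimum over all
    evaluated points of sqrt((dose diff/tol)^2 + (distance/tol)^2),
    with dose differences normalized to the reference maximum."""
    d_max = ref.max()
    gam = np.empty_like(ref)
    for i in range(ref.size):
        dd = (evl - ref[i]) / (dose_tol * d_max)
        dx = (x - x[i]) / dist_tol
        gam[i] = np.sqrt(dd**2 + dx**2).min()
    return gam

x = np.linspace(0.0, 100.0, 201)                 # mm, 0.5 mm grid
ref = np.exp(-((x - 50.0) / 15.0)**2)            # toy "measured" profile
evl = 1.01 * np.exp(-((x - 50.5) / 15.0)**2)     # +1% dose, 0.5 mm shift

g = gamma_1d(ref, evl, x)
passing_rate = 100.0 * np.mean(g <= 1.0)         # percent of points passing
```

A small dose scaling plus sub-millimeter shift passes easily at 3%/3 mm, whereas the lung-region discrepancies of up to 16% reported for PB would drive gamma well above one.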

  13. Evidence-based algorithm for heparin dosing before cardiopulmonary bypass. Part 1: Development of the algorithm.

    PubMed

    McKinney, Mark C; Riley, Jeffrey B

    2007-12-01

    The incidence of heparin resistance during adult cardiac surgery with cardiopulmonary bypass has been reported at 15%-20%. The consistent use of a clinical decision-making algorithm may increase the consistency of patient care and likely reduce the total required heparin dose and other problems associated with heparin dosing. After a directed survey of practicing perfusionists regarding treatment of heparin resistance and a literature search for high-level evidence regarding the diagnosis and treatment of heparin resistance, an evidence-based decision-making algorithm was constructed. The face validity of the algorithm decisive steps and logic was confirmed by a second survey of practicing perfusionists. The algorithm begins with review of the patient history to identify predictors for heparin resistance. The definition for heparin resistance contained in the algorithm is an activated clotting time < 450 seconds with > 450 IU/kg heparin loading dose. Based on the literature, the treatment for heparin resistance used in the algorithm is anti-thrombin III supplement. The algorithm seems to be valid and is supported by high-level evidence and clinician opinion. The next step is a human randomized clinical trial to test the clinical procedure guideline algorithm vs. current standard clinical practice.
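The decisive definition in the algorithm (ACT < 450 s despite a loading dose > 450 IU/kg) lends itself to a simple decision sketch. The branch labels are illustrative; the published algorithm contains more steps than shown here:

```python
def is_heparin_resistant(act_seconds, loading_dose_iu_per_kg):
    """Heparin resistance as defined in the algorithm: ACT below 450 s
    despite a loading dose above 450 IU/kg."""
    return act_seconds < 450 and loading_dose_iu_per_kg > 450

def next_step(act_seconds, loading_dose_iu_per_kg):
    """Simplified decision branch (illustrative labels only)."""
    if is_heparin_resistant(act_seconds, loading_dose_iu_per_kg):
        return "antithrombin III supplement"
    if act_seconds < 450:
        return "additional heparin, re-check ACT"
    return "proceed to cardiopulmonary bypass"
```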

  14. Algorithms for the optimization of RBE-weighted dose in particle therapy.

    PubMed

    Horcicka, M; Meyer, C; Buschbacher, A; Durante, M; Krämer, M

    2013-01-21

    We report on various algorithms used for the nonlinear optimization of RBE-weighted dose in particle therapy. For the dose calculation, carbon ions are considered, and biological effects are calculated with the Local Effect Model. Taking biological effects fully into account requires iterative methods to solve the optimization problem. We implemented several additional algorithms in GSI's treatment planning system TRiP98, such as the BFGS algorithm and the method of conjugate gradients, in order to investigate their computational performance. We modified textbook iteration procedures to improve the convergence speed. The performance of the algorithms is presented in terms of iterations to convergence and computation time. We found that the Fletcher-Reeves variant of the method of conjugate gradients has the best computational performance. With this algorithm we could speed up computation times by a factor of 4 compared with the method of steepest descent, which was used before. With our new methods it is possible to optimize complex treatment plans in a few minutes, leading to good dose distributions. Finally, we discuss future goals concerning dose optimization in particle therapy that might benefit from fast optimization solvers.
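The Fletcher-Reeves variant singled out above is a textbook method; the sketch below runs it on a toy least-squares objective standing in for an RBE-weighted dose cost (TRiP98's actual implementation and objective are not reproduced here):

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=200):
    """Fletcher-Reeves nonlinear conjugate gradient with Armijo
    backtracking line search and a steepest-descent restart safeguard."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5                         # backtrack until Armijo holds
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Toy quadratic objective: minimize ||A x - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: float(np.sum((A @ x - b)**2))
grad = lambda x: 2.0 * A.T @ (A @ x - b)

x_opt = fletcher_reeves(f, grad, np.zeros(2))
```

On a quadratic this converges to the least-squares solution; the real objective with Local Effect Model biology is nonlinear, which is why an iterative solver is required at all.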

  15. Algorithms for the optimization of RBE-weighted dose in particle therapy

    NASA Astrophysics Data System (ADS)

    Horcicka, M.; Meyer, C.; Buschbacher, A.; Durante, M.; Krämer, M.

    2013-01-01

    We report on various algorithms used for the nonlinear optimization of RBE-weighted dose in particle therapy. For the dose calculation, carbon ions are considered, and biological effects are calculated with the Local Effect Model. Taking biological effects fully into account requires iterative methods to solve the optimization problem. We implemented several additional algorithms in GSI's treatment planning system TRiP98, such as the BFGS algorithm and the method of conjugate gradients, in order to investigate their computational performance. We modified textbook iteration procedures to improve the convergence speed. The performance of the algorithms is presented in terms of iterations to convergence and computation time. We found that the Fletcher-Reeves variant of the method of conjugate gradients has the best computational performance. With this algorithm we could speed up computation times by a factor of 4 compared with the method of steepest descent, which was used before. With our new methods it is possible to optimize complex treatment plans in a few minutes, leading to good dose distributions. Finally, we discuss future goals concerning dose optimization in particle therapy that might benefit from fast optimization solvers.

  16. The Creating an Optimal Warfarin Nomogram (CROWN) Study

    PubMed Central

    Perlstein, Todd S.; Goldhaber, Samuel Z.; Nelson, Kerrie; Joshi, Victoria; Morgan, T. Vance; Lesko, Lawrence J.; Lee, Joo-Yeon; Gobburu, Jogarao; Schoenfeld, David; Kucherlapati, Raju; Freeman, Mason W.; Creager, Mark A.

    2014-01-01

    A significant proportion of warfarin dose variability is explained by variation in the genotypes of the cytochrome P450 CYP2C9 and the vitamin K epoxide reductase complex, VKORC1, enzymes that influence warfarin metabolism and sensitivity, respectively. We sought to develop an optimal pharmacogenetic warfarin dosing algorithm that incorporated clinical and genetic information. We enrolled patients initiating warfarin therapy. Genotyping was performed for the VKORC1 –1639G>A, CYP2C9*2 430C>T, and CYP2C9*3 1075C>A variants. The initial warfarin dosing algorithm (Algorithm A) was based upon established clinical practice and published warfarin pharmacogenetic information. Subsequent dosing algorithms (Algorithms B and C) were derived from pharmacokinetic/pharmacodynamic (PK/PD) modelling of warfarin dose, international normalised ratio (INR), and clinical and genetic factors from patients treated by the preceding algorithm(s). The primary outcome was the time in the therapeutic range, considered an INR of 1.8 to 3.2. A total of 344 subjects are included in the study analyses. The mean (SD) percentage time within the therapeutic range for each subject increased progressively from Algorithm A to Algorithm C: from 58.9 (22.0) to 59.7 (23.0) to 65.8 (16.9) percent (p = 0.04). Improvement also occurred in most secondary endpoints, which included the per-patient percentage of INRs outside of the therapeutic range (p = 0.004), the time to the first therapeutic INR (p = 0.07), and the time to achieve stable therapeutic anticoagulation (p < 0.001). In conclusion, warfarin pharmacogenetic dosing can be optimised in real time utilising observed PK/PD information in an adaptive fashion. Clinical Trial Registration: ClinicalTrials.gov (NCT00401414) PMID:22116191
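The primary outcome, time in therapeutic range (INR 1.8 to 3.2), is conventionally computed by linear interpolation between successive INR measurements. A sketch of that convention (the abstract does not state the study's exact method, so this is an assumption):

```python
def time_in_range(days, inrs, low=1.8, high=3.2):
    """Percentage of time with low <= INR <= high, assuming the INR
    changes linearly between consecutive measurements."""
    in_range = 0.0
    for t0, y0, t1, y1 in zip(days, inrs, days[1:], inrs[1:]):
        span = t1 - t0
        if y0 == y1:
            in_range += span if low <= y0 <= high else 0.0
            continue
        # segment parameters (0..1) at which the INR crosses each limit
        a, b = sorted(((low - y0) / (y1 - y0), (high - y0) / (y1 - y0)))
        in_range += max(0.0, min(1.0, b) - max(0.0, a)) * span
    return 100.0 * in_range / (days[-1] - days[0])
```

For example, an INR rising linearly from 0.8 to 2.8 over ten days crosses 1.8 at the midpoint, so `time_in_range([0, 10], [0.8, 2.8])` returns 50.0.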

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serin, E.; Codel, G.; Mabhouti, H.

    Purpose: In small-field geometries, electronic equilibrium can be lost, making it challenging for a dose-calculation algorithm to accurately predict the dose, especially in the presence of tissue heterogeneities. In this study, the dosimetric accuracy of the Monte Carlo (MC) advanced dose-calculation and sequential algorithms of the Multiplan treatment planning system was investigated for small radiation fields incident on homogeneous and heterogeneous geometries. Methods: Small open fields of the fixed cones of a Cyberknife M6 unit, 100 to 500 mm2, were used for this study. The fields were incident on an in-house phantom containing lung, air, and bone inhomogeneities, and also on a homogeneous phantom. Using the same film batch, the net OD-to-dose calibration curve was obtained on the CyberKnife with the 60 mm fixed cone by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. The dosimetric accuracy of the MC and sequential algorithms in the presence of the inhomogeneities was compared against EBT3 film dosimetry. Results: Open-field tests in a homogeneous phantom showed good agreement between the two algorithms and film measurements. For the MC algorithm, the minimum gamma analysis passing rates between measured and calculated dose distributions were 99.7% and 98.3% for homogeneous and inhomogeneous fields, respectively, in the case of lung and bone. For the sequential algorithm, the minimum gamma analysis passing rates were 98.9% and 92.5% for homogeneous and inhomogeneous fields, respectively, across all cone sizes used. In the case of the air heterogeneity, the differences were larger for both calculation algorithms. Overall, when compared to measurement, MC had better agreement than the sequential algorithm. Conclusion: The Monte Carlo calculation algorithm in the Multiplan treatment planning system is an improvement over the existing sequential algorithm. Dose discrepancies were observed in the presence of air inhomogeneities.
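The net-OD-to-dose calibration step described above amounts to fitting a smooth curve through known dose/netOD pairs. A sketch with hypothetical calibration points (the study's actual EBT3 data and fit function are not given in the abstract):

```python
import numpy as np

# Hypothetical calibration points: delivered dose (cGy) versus net
# optical density read from the scanned film strips.
dose = np.array([0.0, 100.0, 200.0, 400.0, 600.0, 800.0])
net_od = np.array([0.0, 0.12, 0.21, 0.35, 0.45, 0.52])

# A low-order polynomial in netOD is one common choice of fit function
# for radiochromic film.
coeffs = np.polyfit(net_od, dose, deg=2)

def od_to_dose(od):
    """Map a measured net OD to dose via the fitted calibration curve."""
    return np.polyval(coeffs, od)
```

Measured film doses for the phantom irradiations would then be obtained by applying `od_to_dose` pixel by pixel before the gamma comparison.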

  18. Performance of dose calculation algorithms from three generations in lung SBRT: comparison with full Monte Carlo‐based dose distributions

    PubMed Central

    Kapanen, Mika K.; Hyödynmaa, Simo J.; Wigren, Tuija K.; Pitkänen, Maunu A.

    2014-01-01

    The accuracy of dose calculation is a key challenge in stereotactic body radiotherapy (SBRT) of the lung. We have benchmarked three photon beam dose calculation algorithms — pencil beam convolution (PBC), anisotropic analytical algorithm (AAA), and Acuros XB (AXB) — implemented in a commercial treatment planning system (TPS), Varian Eclipse. Dose distributions from full Monte Carlo (MC) simulations were regarded as a reference. In the first stage, for four patients with central lung tumors, treatment plans using the 3D conformal radiotherapy (CRT) technique applying 6 MV photon beams were made using the AXB algorithm, with planning criteria according to the Nordic SBRT study group. The plans were recalculated (with the same number of monitor units (MUs) and identical field settings) using the BEAMnrc and DOSXYZnrc MC codes. The MC‐calculated dose distributions were compared to the corresponding AXB‐calculated dose distributions to assess the accuracy of the AXB algorithm, to which the other TPS algorithms were then compared. In the second stage, treatment plans were made for ten patients with the 3D CRT technique using both the PBC algorithm and the AAA. The plans were recalculated (with the same number of MUs and identical field settings) with the AXB algorithm, then compared to the original plans. Throughout the study, the comparisons were made as a function of the size of the planning target volume (PTV), using various dose‐volume histogram (DVH) and other parameters to quantitatively assess plan quality. In the first stage, 3D gamma analyses with threshold criteria of 3%/3 mm and 2%/2 mm were also applied. The AXB‐calculated dose distributions showed a relatively high level of agreement in the light of 3D gamma analysis and DVH comparison against the full MC simulation, especially with large PTVs, but, with smaller PTVs, larger discrepancies were found.
Gamma agreement index (GAI) values between 95.5% and 99.6% were achieved for all the plans with the 3%/3 mm threshold criteria, but the 2%/2 mm criteria showed larger discrepancies. The TPS algorithm comparison showed large dose discrepancies in the PTV mean dose (D50%): nearly 60% for the PBC algorithm, and nearly 20% for the AAA, occurring also in the small PTV size range. This work suggests the application of independent plan verification when the AAA or the AXB algorithm is utilized in lung SBRT with PTVs smaller than 20‐25 cc. The calculated data from this study can be used in converting SBRT protocols based on type ‘a’ and/or type ‘b’ algorithms to the most recent generation of type ‘c’ algorithms, such as the AXB algorithm. PACS numbers: 87.55.‐x, 87.55.D‐, 87.55.K‐, 87.55.kd, 87.55.Qr PMID:24710454

  19. SU-E-T-371: Evaluating the Convolution Algorithm of a Commercially Available Radiosurgery Irradiator Using a Novel Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cates, J; Drzymala, R

    2015-06-15

    Purpose: The purpose of this study was to develop and use a novel phantom to evaluate the accuracy and usefulness of the Leksell GammaPlan convolution-based dose calculation algorithm compared with the current TMR10 algorithm. Methods: A novel phantom was designed to fit the Leksell Gamma Knife G Frame and could accommodate various materials in the form of one-inch-diameter cylindrical plugs. The plugs were split axially to allow EBT2 film placement. Film measurements were made during two experiments. The first utilized plans generated on a homogeneous acrylic phantom setup using the TMR10 algorithm, with various materials inserted into the phantom during film irradiation to assess the effect on delivered dose of unplanned heterogeneities upstream in the beam path. The second experiment utilized plans made on CT scans of different heterogeneous setups, with one plan using the TMR10 dose calculation algorithm and the second using the convolution-based algorithm. Materials used to introduce heterogeneities included air, LDPE, polystyrene, Delrin, Teflon, and aluminum. Results: The data show that, as would be expected, heterogeneities in the beam path do induce dose delivery error when using the TMR10 algorithm, with the largest errors due to the heterogeneities with electron densities most different from that of water, i.e., air, Teflon, and aluminum. Additionally, the convolution algorithm did account for the heterogeneous material and provided a more accurate predicted dose, in extreme cases a 7-12% improvement over the TMR10 algorithm. The convolution algorithm's expected dose was accurate to within 3% in all cases. Conclusion: This study shows that the convolution algorithm is an improvement over the TMR10 algorithm when heterogeneities are present.
More work is needed to determine the heterogeneity size/volume limits within which this improvement exists, and in which clinical and/or research cases it would be relevant.

  20. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics.

    PubMed

    Duconge, Jorge; Ramos, Alga S; Claudio-Campos, Karla; Rivera-Miranda, Giselle; Bermúdez-Bosch, Luis; Renta, Jessicca Y; Cadilla, Carmen L; Cruz, Iadelisse; Feliu, Juan F; Vergara, Cunegundo; Ruaño, Gualberto

    2016-01-01

    This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, and clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose, as compared with only 29% when using the clinical non-genetic algorithm (p < 0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in the 55 individuals of the independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. ClinicalTrials.gov NCT01318057.
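The model-building step above is a multiple linear regression of effective dose on genotype, clinical, and admixture covariates. A sketch on synthetic data (all predictors and coefficients are invented; only the fitting procedure and the R2/MAE metrics mirror the abstract):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 255
# Invented covariates standing in for the study's predictors:
age = rng.uniform(30.0, 85.0, n)
vkorc1 = rng.integers(0, 3, n).astype(float)   # variant allele count
cyp2c9 = rng.integers(0, 3, n).astype(float)
ancestry = rng.uniform(0.0, 1.0, n)            # admixture proportion

# Synthetic "effective warfarin dose" (mg/day) with noise:
dose = (7.0 - 0.03 * age - 1.1 * vkorc1 - 0.8 * cyp2c9
        + 1.5 * ancestry + rng.normal(0.0, 0.5, n))

X = np.column_stack([np.ones(n), age, vkorc1, cyp2c9, ancestry])
beta, *_ = np.linalg.lstsq(X, dose, rcond=None)
pred = X @ beta

r2 = 1.0 - np.sum((dose - pred)**2) / np.sum((dose - dose.mean())**2)
mae = np.mean(np.abs(dose - pred))
```

External validation, as in the study, would apply the fitted `beta` to an independent cohort's design matrix and recompute R2 and MAE there.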

  1. A comparison of two dose calculation algorithms-anisotropic analytical algorithm and Acuros XB-for radiation therapy planning of canine intranasal tumors.

    PubMed

    Nagata, Koichi; Pethel, Timothy D

    2017-07-01

    Although the anisotropic analytical algorithm (AAA) and Acuros XB (AXB) are both radiation dose calculation algorithms that take into account the heterogeneity within the radiation field, Acuros XB is inherently more accurate. The purpose of this retrospective method-comparison study was to compare them and evaluate the dose discrepancy within the planning target volume (PTV). Radiation therapy (RT) plans of 11 dogs with intranasal tumors treated by radiation therapy at the University of Georgia were evaluated. All dogs were planned for intensity-modulated radiation therapy using nine equally spaced coplanar X-ray beams, and the dose was calculated with the anisotropic analytical algorithm. The same plan with the same monitor units was then recalculated using Acuros XB for comparison. Each dog's planning target volume was separated into air, bone, and tissue and evaluated. The mean dose to the planning target volume estimated by Acuros XB was 1.3% lower overall: 1.4% higher for air, 3.7% lower for bone, and 0.9% lower for tissue. The volume of the planning target volume covered by the prescribed dose decreased by 21% when Acuros XB was used, owing to increased dose heterogeneity within the planning target volume. The anisotropic analytical algorithm relatively underestimates the dose heterogeneity and relatively overestimates the dose to the bone and tissue within the planning target volume for radiation therapy planning of canine intranasal tumors. This can be clinically significant, especially if tumor cells are present within the bone, because it may result in relative underdosing of the tumor. © 2017 American College of Veterinary Radiology.

  2. Evaluation of six TPS algorithms in computing entrance and exit doses

    PubMed Central

    Metwaly, Mohamed; Glegg, Martin; Baggarley, Shaun P.; Elliott, Alex

    2014-01-01

    Entrance and exit doses are commonly measured in in vivo dosimetry for comparison with expected values, usually generated by the treatment planning system (TPS), to verify accuracy of treatment delivery. This report aims to evaluate the accuracy of six TPS algorithms in computing entrance and exit doses for a 6 MV beam. The algorithms tested were: pencil beam convolution (Eclipse PBC), analytical anisotropic algorithm (Eclipse AAA), AcurosXB (Eclipse AXB), FFT convolution (XiO Convolution), multigrid superposition (XiO Superposition), and Monte Carlo photon (Monaco MC). Measurements with ionization chamber (IC) and diode detector in water phantoms were used as a reference. Comparisons were done in terms of central axis point dose, 1D relative profiles, and 2D absolute gamma analysis. Entrance doses computed by all TPS algorithms agreed to within 2% of the measured values. Exit doses computed by XiO Convolution, XiO Superposition, Eclipse AXB, and Monaco MC agreed with the IC measured doses to within 2%‐3%. Meanwhile, Eclipse PBC and Eclipse AAA computed exit doses were higher than the IC measured doses by up to 5.3% and 4.8%, respectively. Both algorithms assume that full backscatter exists even at the exit level, leading to an overestimation of exit doses. Despite good agreements at the central axis for Eclipse AXB and Monaco MC, 1D relative comparisons showed profiles mismatched at depths beyond 11.5 cm. Overall, the 2D absolute gamma (3%/3 mm) pass rates were better for Monaco MC, while Eclipse AXB failed mostly at the outer 20% of the field area. The findings of this study serve as a useful baseline for the implementation of entrance and exit in vivo dosimetry in clinical departments utilizing any of these six common TPS algorithms for reference comparison. PACS numbers: 87.55.‐x, 87.55.D‐, 87.55.N‐, 87.53.Bn PMID:24892349

  3. Agreement between gamma passing rates using computed tomography in radiotherapy and secondary cancer risk prediction from more advanced dose calculated models

    PubMed Central

    Balosso, Jacques

    2017-01-01

    Background During the past decades in radiotherapy, dose distributions were calculated using density-correction methods with pencil beam as a type ‘a’ algorithm. The objective of this study is to assess and evaluate the impact of the dose distribution shift on the predicted secondary cancer risk (SCR) when using modern advanced dose calculation algorithms (point kernel, type ‘b’), which consider changes in lateral electron transport. Methods Clinical examples of pediatric cranio-spinal irradiation patients were evaluated. For each case, two radiotherapy treatment plans were generated using the same prescribed dose to the target, resulting in different numbers of monitor units (MUs) per field. The dose distributions were calculated using both algorithm types. A gamma index (γ) analysis was used to compare the dose distributions in the lung. The organ equivalent dose (OED) was calculated with three different models: the linear, the linear-exponential, and the plateau dose-response curves. The excess absolute risk ratio (EAR) was also evaluated as EAR = OED type ‘b’ / OED type ‘a’. Results The γ analysis indicated an acceptable dose distribution agreement of 95% with 3%/3 mm. However, the γ-maps displayed dose displacements >1 mm around the healthy lungs. Compared with type ‘a’, the OED values from type ‘b’ dose distributions were about 8% to 16% higher, leading to an EAR ratio >1, ranging from 1.08 to 1.13 depending on the SCR model. Conclusions The shift of dose calculation in radiotherapy, according to the algorithm, can significantly influence the SCR prediction and the plan optimization, since OEDs are calculated from the DVH for a specific treatment. The agreement between dose distribution and SCR prediction depends on the dose-response models and epidemiological data. In addition, a γ passing rate of 3%/3 mm does not convey the difference, up to 15%, in the SCR predictions resulting from alternative algorithms.
Considering that modern algorithms are more accurate, showing the dose distributions more precisely, but that the prediction of absolute SCR is still very imprecise, only the EAR ratio should be used to rank radiotherapy plans. PMID:28811995
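The three OED models named above have simple closed forms when applied to a differential DVH. The sketch below uses toy DVH bins and placeholder model parameters (the fitted parameter values from the SCR literature are not reproduced here):

```python
import numpy as np

def oed_linear(dose, vol):
    """Linear response: OED is the volume-weighted mean organ dose."""
    return np.sum(vol * dose) / np.sum(vol)

def oed_linear_exponential(dose, vol, alpha=0.05):
    """Linear-exponential response: cell kill suppresses risk at high
    dose (alpha in 1/Gy is a placeholder value)."""
    return np.sum(vol * dose * np.exp(-alpha * dose)) / np.sum(vol)

def oed_plateau(dose, vol, delta=0.1):
    """Plateau response: risk saturates at high dose (delta in 1/Gy is
    a placeholder value)."""
    return np.sum(vol * (1.0 - np.exp(-delta * dose)) / delta) / np.sum(vol)

# Toy lung DVH bins (Gy, relative volume) for the two algorithm types,
# with the type 'b' algorithm computing ~10% higher dose here:
dose_a = np.array([2.0, 5.0, 10.0, 20.0])
dose_b = 1.10 * dose_a
vol = np.array([0.4, 0.3, 0.2, 0.1])

ear_ratio = oed_linear(dose_b, vol) / oed_linear(dose_a, vol)  # OED_b / OED_a
```

Under the linear model a uniform 10% dose shift maps directly to an EAR ratio of 1.10; the nonlinear models weight the dose bins differently, which is why the study's EAR ratio ranges from 1.08 to 1.13 across models.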

  4. The energy-dependent electron loss model: backscattering and application to heterogeneous slab media.

    PubMed

    Lee, Tae Kyu; Sandison, George A

    2003-01-21

    Electron backscattering has been incorporated into the energy-dependent electron loss (EL) model and the resulting algorithm is applied to predict dose deposition in slab heterogeneous media. This algorithm utilizes a reflection coefficient from the interface that is computed on the basis of Goudsmit-Saunderson theory and an average energy for the backscattered electrons based on Everhart's theory. Predictions of dose deposition in slab heterogeneous media are compared to the Monte Carlo based dose planning method (DPM) and a numerical discrete ordinates method (DOM). The slab media studied comprised water/Pb, water/Al, water/bone, water/bone/water, and water/lung/water, and incident electron beam energies of 10 MeV and 18 MeV. The predicted dose enhancement due to backscattering is accurate to within 3% of dose maximum even for lead as the backscattering medium. Dose discrepancies at large depths beyond the interface were as high as 5% of dose maximum and we speculate that this error may be attributed to the EL model assuming a Gaussian energy distribution for the electrons at depth. The computational cost is low compared to Monte Carlo simulations making the EL model attractive as a fast dose engine for dose optimization algorithms. The predictive power of the algorithm demonstrates that the small angle scattering restriction on the EL model can be overcome while retaining dose calculation accuracy and requiring only one free variable, χ, in the algorithm to be determined in advance of calculation.

  5. SU-F-T-444: Quality Improvement Review of Radiation Therapy Treatment Planning in the Presence of Dental Implants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parenica, H; Ford, J; Mavroidis, P

    Purpose: To quantify and compare the effect of metallic dental implants (MDI) on dose distributions calculated using the Collapsed Cone Convolution Superposition (CCCS) algorithm or a Monte Carlo algorithm, with and without correcting for the density of the MDI. Methods: Seven patients previously treated in the head and neck region were included in this study. The MDI and the streaking artifacts on the CT images were carefully contoured. For each patient, a plan was optimized and calculated using the Pinnacle3 treatment planning system (TPS). For each patient two dose calculations were performed: a) with the densities of the MDI and CT artifacts overridden (12 g/cc and 1 g/cc, respectively) and b) without density overrides. The plans were then exported to the Monaco TPS and recalculated using a Monte Carlo dose calculation algorithm. The changes in dose to PTVs and surrounding regions of interest (ROIs) were examined between all plans. Results: The Monte Carlo dose calculation indicated that PTVs received 6% lower dose than the CCCS algorithm predicted. In some cases, the Monte Carlo algorithm indicated that surrounding ROIs received higher dose (up to a factor of 2). Conclusion: Not properly accounting for dental implants can impact both the high-dose regions (PTV) and the low-dose regions (OAR). This study implies that if the MDI and artifacts are not appropriately contoured and given the correct density, there is a potentially significant impact on PTV coverage and OAR maximum doses.

  6. The energy-dependent electron loss model: backscattering and application to heterogeneous slab media

    NASA Astrophysics Data System (ADS)

    Lee, Tae Kyu; Sandison, George A.

    2003-01-01

    Electron backscattering has been incorporated into the energy-dependent electron loss (EL) model and the resulting algorithm is applied to predict dose deposition in slab heterogeneous media. This algorithm utilizes a reflection coefficient from the interface that is computed on the basis of Goudsmit-Saunderson theory and an average energy for the backscattered electrons based on Everhart's theory. Predictions of dose deposition in slab heterogeneous media are compared to the Monte Carlo based dose planning method (DPM) and a numerical discrete ordinates method (DOM). The slab media studied comprised water/Pb, water/Al, water/bone, water/bone/water, and water/lung/water, and incident electron beam energies of 10 MeV and 18 MeV. The predicted dose enhancement due to backscattering is accurate to within 3% of dose maximum even for lead as the backscattering medium. Dose discrepancies at large depths beyond the interface were as high as 5% of dose maximum and we speculate that this error may be attributed to the EL model assuming a Gaussian energy distribution for the electrons at depth. The computational cost is low compared to Monte Carlo simulations making the EL model attractive as a fast dose engine for dose optimization algorithms. The predictive power of the algorithm demonstrates that the small angle scattering restriction on the EL model can be overcome while retaining dose calculation accuracy and requiring only one free variable, χ, in the algorithm to be determined in advance of calculation.

  7. Dosimetric evaluation of a commercial proton spot scanning Monte-Carlo dose algorithm: comparisons against measurements and simulations

    NASA Astrophysics Data System (ADS)

    Saini, Jatinder; Maes, Dominic; Egan, Alexander; Bowen, Stephen R.; St. James, Sara; Janson, Martin; Wong, Tony; Bloch, Charles

    2017-10-01

    RaySearch Americas Inc. (NY) has introduced a commercial Monte Carlo dose algorithm (RS-MC) for routine clinical use in proton spot scanning. In this report, we provide a validation of this algorithm against phantom measurements and simulations in the GATE software package. We also compared the performance of the RayStation analytical algorithm (RS-PBA) against the RS-MC algorithm. A beam model (G-MC) for a spot scanning gantry at our proton center was implemented in the GATE software package. The model was validated against measurements in a water phantom and was used for benchmarking the RS-MC. Validation of the RS-MC was performed in a water phantom by measuring depth doses and profiles for three spread-out Bragg peak (SOBP) beams with normal incidence, an SOBP with oblique incidence, and an SOBP with a range shifter and large air gap. The RS-MC was also validated against measurements and simulations in heterogeneous phantoms created by placing lung or bone slabs in a water phantom. Lateral dose profiles near the distal end of the beam were measured with a microDiamond detector and compared to the G-MC simulations, RS-MC and RS-PBA. Finally, the RS-MC and RS-PBA were validated against measured dose distributions in an Alderson-Rando (AR) phantom. Measurements were made using Gafchromic film in the AR phantom and compared to doses using the RS-PBA and RS-MC algorithms. For SOBP depth doses in a water phantom, all three algorithms matched the measurements to within  ±3% at all points and a range within 1 mm. The RS-PBA algorithm showed up to a 10% difference in dose at the entrance for the beam with a range shifter and  >30 cm air gap, while the RS-MC and G-MC were always within 3% of the measurement. For an oblique beam incident at 45°, the RS-PBA algorithm showed up to 6% local dose differences and broadening of distal fall-off by 5 mm. Both the RS-MC and G-MC accurately predicted the depth dose to within  ±3% and distal fall-off to within 2 mm. 
In an anthropomorphic phantom, the gamma index (dose tolerance = 3%, distance-to-agreement = 3 mm) was greater than 90% for six out of seven planes using the RS-MC, and three out of seven for the RS-PBA. The RS-MC algorithm demonstrated improved dosimetric accuracy over the RS-PBA in homogeneous, heterogeneous and anthropomorphic phantoms. The computation performance of the RS-MC was similar to that of the RS-PBA algorithm. For complex disease sites like breast, head and neck, and lung cancer, the RS-MC algorithm will provide significantly more accurate treatment planning.

  8. Dosimetric evaluation of a commercial proton spot scanning Monte-Carlo dose algorithm: comparisons against measurements and simulations.

    PubMed

    Saini, Jatinder; Maes, Dominic; Egan, Alexander; Bowen, Stephen R; St James, Sara; Janson, Martin; Wong, Tony; Bloch, Charles

    2017-09-12

    RaySearch Americas Inc. (NY) has introduced a commercial Monte Carlo dose algorithm (RS-MC) for routine clinical use in proton spot scanning. In this report, we provide a validation of this algorithm against phantom measurements and simulations in the GATE software package. We also compared the performance of the RayStation analytical algorithm (RS-PBA) against the RS-MC algorithm. A beam model (G-MC) for a spot scanning gantry at our proton center was implemented in the GATE software package. The model was validated against measurements in a water phantom and was used for benchmarking the RS-MC. Validation of the RS-MC was performed in a water phantom by measuring depth doses and profiles for three spread-out Bragg peak (SOBP) beams with normal incidence, an SOBP with oblique incidence, and an SOBP with a range shifter and large air gap. The RS-MC was also validated against measurements and simulations in heterogeneous phantoms created by placing lung or bone slabs in a water phantom. Lateral dose profiles near the distal end of the beam were measured with a microDiamond detector and compared to the G-MC simulations, RS-MC and RS-PBA. Finally, the RS-MC and RS-PBA were validated against measured dose distributions in an Alderson-Rando (AR) phantom. Measurements were made using Gafchromic film in the AR phantom and compared to doses using the RS-PBA and RS-MC algorithms. For SOBP depth doses in a water phantom, all three algorithms matched the measurements to within  ±3% at all points and a range within 1 mm. The RS-PBA algorithm showed up to a 10% difference in dose at the entrance for the beam with a range shifter and  >30 cm air gap, while the RS-MC and G-MC were always within 3% of the measurement. For an oblique beam incident at 45°, the RS-PBA algorithm showed up to 6% local dose differences and broadening of distal fall-off by 5 mm. Both the RS-MC and G-MC accurately predicted the depth dose to within  ±3% and distal fall-off to within 2 mm. 
In an anthropomorphic phantom, the gamma index (dose tolerance = 3%, distance-to-agreement = 3 mm) was greater than 90% for six out of seven planes using the RS-MC, and three out of seven for the RS-PBA. The RS-MC algorithm demonstrated improved dosimetric accuracy over the RS-PBA in homogeneous, heterogeneous and anthropomorphic phantoms. The computation performance of the RS-MC was similar to that of the RS-PBA algorithm. For complex disease sites like breast, head and neck, and lung cancer, the RS-MC algorithm will provide significantly more accurate treatment planning.
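    The gamma analysis used above (3% dose tolerance, 3 mm distance-to-agreement) can be illustrated for a 1-D dose profile. The sketch below is a minimal global-gamma implementation in the sense of the standard formulation, not the analysis software used in the study:

```python
import numpy as np

def gamma_index_1d(ref_dose, eval_dose, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """1-D global gamma index.

    ref_dose, eval_dose: dose samples on the same uniform grid.
    dose_tol: fractional dose tolerance (here 3% of the reference maximum).
    dta_mm: distance-to-agreement tolerance in mm.
    Returns the gamma value at each reference point; a point passes if gamma <= 1.
    """
    ref = np.asarray(ref_dose, float)
    ev = np.asarray(eval_dose, float)
    x = np.arange(ev.size) * spacing_mm
    dmax = ref.max()
    gammas = np.empty(ref.size)
    for i, (xi, di) in enumerate(zip(x, ref)):
        # Squared generalized distance to every evaluated point
        g2 = ((x - xi) / dta_mm) ** 2 + ((ev - di) / (dose_tol * dmax)) ** 2
        gammas[i] = np.sqrt(g2.min())
    return gammas

# Identical distributions agree perfectly (gamma = 0 everywhere)
d = np.array([0.2, 0.5, 1.0, 0.5, 0.2])
print(gamma_index_1d(d, d, 1.0).max())  # → 0.0
```

    A linear dose gradient shifted by exactly 3 mm sits at the DTA tolerance and still passes, which is the behavior the combined dose/distance criterion is designed to capture.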

  9. Dose specification for radiation therapy: dose to water or dose to medium?

    NASA Astrophysics Data System (ADS)

    Ma, C.-M.; Li, Jinsheng

    2011-05-01

The Monte Carlo method enables accurate dose calculation for radiation therapy treatment planning and has been implemented in some commercial treatment planning systems. Unlike conventional dose calculation algorithms that provide patient dose information in terms of dose to water with variable electron density, the Monte Carlo method calculates the energy deposition in different media and expresses dose to a medium. This paper discusses the differences between dose calculated using water with different electron densities and dose calculated for different biological media, as well as the clinical issues of dose specification, including dose prescription and plan evaluation, using dose to water and dose to medium. We demonstrate that conventional photon dose calculation algorithms compute doses similar to those simulated by Monte Carlo using water with different electron densities, which are close (<4% differences) to doses to media but significantly different (up to 11%) from doses to water converted from doses to media following the American Association of Physicists in Medicine (AAPM) Task Group 105 recommendations. Our results suggest that, for consistency with previous radiation therapy experience, Monte Carlo photon algorithms should report dose to medium for radiotherapy dose prescription, treatment plan evaluation and treatment outcome analysis.
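    The conversion discussed above reduces, in the TG-105-style approach, to multiplying dose to medium by a water-to-medium stopping-power ratio. A minimal sketch follows; the ratio values are illustrative assumptions chosen to echo the <4% (soft tissue) and up-to-11% (bone) differences mentioned in the abstract, not tabulated TG-105 data:

```python
# Water-to-medium stopping-power ratios. These numbers are assumed for
# illustration only; real conversions use energy-dependent tabulated data.
STOPPING_POWER_RATIO_W_M = {
    "water": 1.00,
    "soft_tissue": 1.01,
    "cortical_bone": 1.11,
}

def dose_to_water(dose_to_medium_gy, medium):
    """TG-105-style conversion: D_w = D_m * (S/rho)_w,m."""
    return dose_to_medium_gy * STOPPING_POWER_RATIO_W_M[medium]

# A 2 Gy dose to cortical bone, reported as dose to water, shifts by 11%
print(dose_to_water(2.0, "cortical_bone"))  # → 2.22
```

    The clinical question in the paper is precisely whether this multiplicative re-expression should be applied before prescription and plan evaluation, since it changes reported bone doses far more than soft-tissue doses.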

  10. Evaluation of the Eclipse eMC algorithm for bolus electron conformal therapy using a standard verification dataset.

    PubMed

    Carver, Robert L; Sprunger, Conrad P; Hogstrom, Kenneth R; Popple, Richard A; Antolak, John A

    2016-05-08

The purpose of this study was to evaluate the accuracy and calculation speed of electron dose distributions calculated by the Eclipse electron Monte Carlo (eMC) algorithm for use with bolus electron conformal therapy (ECT). The recent commercial availability of bolus ECT technology requires further validation of the eMC dose calculation algorithm. eMC-calculated electron dose distributions for bolus ECT have been compared to previously measured TLD-dose points throughout patient-based cylindrical phantoms (retromolar trigone and nose), whose axial cross sections were based on the mid-PTV (planning target volume) CT anatomy. The phantoms consisted of SR4 muscle substitute, SR4 bone substitute, and air. The treatment plans were imported into the Eclipse treatment planning system, and electron dose distributions calculated using 1% and < 0.2% statistical uncertainties. The accuracy of the dose calculations using moderate smoothing and no smoothing was evaluated. Dose differences (eMC-calculated dose less measured dose) were evaluated in terms of absolute dose difference, where 100% equals the given dose, as well as distance to agreement (DTA). Dose calculations were also evaluated for calculation speed. Results from the eMC for the retromolar trigone phantom using 1% statistical uncertainty without smoothing showed calculated dose at 89% (41/46) of the measured TLD-dose points was within 3% dose difference or 3 mm DTA of the measured value. The average dose difference was -0.21%, and the net standard deviation was 2.32%. Differences as large as 3.7% occurred immediately distal to the mandible bone. Results for the nose phantom, using 1% statistical uncertainty without smoothing, showed calculated dose at 93% (53/57) of the measured TLD-dose points within 3% dose difference or 3 mm DTA. The average dose difference was 1.08%, and the net standard deviation was 3.17%. Differences as large as 10% occurred lateral to the nasal air cavities.
Including smoothing had insignificant effects on the accuracy of the retromolar trigone phantom calculations, but reduced the accuracy of the nose phantom calculations in the high-gradient dose areas. Dose calculation times with 1% statistical uncertainty for the retromolar trigone and nose treatment plans were 30 s and 24 s, respectively, using 16 processors (Intel Xeon E5-2690, 2.9 GHz) on a framework agent server (FAS). In comparison, the eMC was significantly more accurate than the pencil beam algorithm (PBA). The eMC has comparable accuracy to the pencil beam redefinition algorithm (PBRA) used for bolus ECT planning and has acceptably low dose calculation times. The eMC accuracy decreased when smoothing was used in high-gradient dose regions. The eMC accuracy was consistent with that previously reported for accuracy of the eMC electron dose algorithm and shows that the algorithm is suitable for clinical implementation of bolus ECT.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thrower, Sara L., E-mail: slloupot@mdanderson.org; Shaitelman, Simona F.; Bloom, Elizabeth

Purpose: To compare the treatment plans for accelerated partial breast irradiation calculated by the new commercially available collapsed cone convolution (CCC) and current standard TG-43-based algorithms for 50 patients treated at our institution with either a Strut-Adjusted Volume Implant (SAVI) or Contura device. Methods and Materials: We recalculated target coverage, volume of highly dosed normal tissue, and dose to organs at risk (ribs, skin, and lung) with each algorithm. For 1 case an artificial air pocket was added to simulate 10% nonconformance. We performed a Wilcoxon signed rank test to determine the median differences in the clinical indices V90, V95, V100, V150, V200, and highest-dosed 0.1 cm³ and 1.0 cm³ of rib, skin, and lung between the two algorithms. Results: The CCC algorithm calculated lower values on average for all dose-volume histogram parameters. Across the entire patient cohort, the median difference in the clinical indices calculated by the 2 algorithms was <10% for dose to organs at risk, <5% for target volume coverage (V90, V95, and V100), and <4 cm³ for dose to normal breast tissue (V150 and V200). No discernable difference was seen in the nonconformance case. Conclusions: We found that on average over our patient population CCC calculated lower (<10%) doses than TG-43. These results should inform clinicians as they prepare for the transition to heterogeneous dose calculation algorithms and determine whether clinical tolerance limits warrant modification.

  12. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics

    PubMed Central

    Claudio-Campos, Karla; Rivera-Miranda, Giselle; Bermúdez-Bosch, Luis; Renta, Jessicca Y.; Cadilla, Carmen L.; Cruz, Iadelisse; Feliu, Juan F.; Vergara, Cunegundo; Ruaño, Gualberto

    2016-01-01

    Aim This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. Patients & Methods A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. Results The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose as compared with only 29% when using the clinical non-genetic algorithm (p<0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Conclusions Results supported our rationale to incorporate individual’s genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. Trial Registration ClinicalTrials.gov NCT01318057 PMID:26745506
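    Dosing models of the kind described above are typically fit by ordinary least squares on (log-)dose against genotype, admixture and clinical covariates, and judged by R² and MAE. The sketch below uses synthetic data and hypothetical covariates (age, CYP2C9 and VKORC1 allele counts, an ancestry fraction); it illustrates the fitting and evaluation procedure, not the published algorithm or its coefficients:

```python
import numpy as np

# Synthetic cohort: every coefficient and covariate here is an assumption
# made up for illustration, not estimated from the study's patients.
rng = np.random.default_rng(0)
n = 255
age = rng.uniform(30, 85, n)
cyp2c9 = rng.integers(0, 3, n)        # variant allele count, 0-2
vkorc1 = rng.integers(0, 3, n)        # -1639 A allele count, 0-2
ancestry = rng.uniform(0, 1, n)       # individual admixture fraction
log_dose = (2.0 - 0.01 * age - 0.3 * cyp2c9 - 0.4 * vkorc1
            + 0.2 * ancestry + rng.normal(0, 0.15, n))

# Ordinary least squares on log-dose with an intercept column
X = np.column_stack([np.ones(n), age, cyp2c9, vkorc1, ancestry])
beta, *_ = np.linalg.lstsq(X, log_dose, rcond=None)
pred = X @ beta

ss_res = ((log_dose - pred) ** 2).sum()
ss_tot = ((log_dose - log_dose.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
mae = np.abs(np.exp(log_dose) - np.exp(pred)).mean()  # back on mg/day scale
print(f"R^2 = {r2:.2f}, MAE = {mae:.2f} mg/day")
```

    Reporting MAE on the back-transformed mg/day scale, as the abstract does, keeps the error in clinically meaningful units even when the regression itself is done on log-dose.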

  13. WE-AB-207B-05: Correlation of Normal Lung Density Changes with Dose After Stereotactic Body Radiotherapy (SBRT) for Early Stage Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Q; Devpura, S; Feghali, K

    2016-06-15

Purpose: To investigate correlation of normal lung CT density changes with dose accuracy and outcome after SBRT for patients with early stage lung cancer. Methods: Dose distributions for patients originally planned and treated using a 1-D pencil beam-based (PB-1D) dose algorithm were retrospectively recomputed using four algorithms: 3-D pencil beam (PB-3D), and the model-based methods AAA, Acuros XB (AXB), and Monte Carlo (MC). Prescription dose was 12 Gy × 4 fractions. Planning CT images were rigidly registered to the follow-up CT datasets at 6–9 months after treatment. Corresponding dose distributions were mapped from the planning to follow-up CT images. Following the method of Palma et al. (1–2), Hounsfield Unit (HU) changes in lung density in individual, 5 Gy, dose bins from 5–45 Gy were assessed in the peri-tumor region, defined as a uniform, 3 cm expansion around the ITV (1). Results: There is a 10–15% displacement of the high dose region (40–45 Gy) with the model-based algorithms, relative to the PB method, due to the electron scattering of dose away from the tumor into normal lung tissue (Fig. 1). Consequently, the high-dose lung region falls within the 40–45 Gy dose range, causing an increase in HU change in this region, as predicted by model-based algorithms (Fig. 2). The patient with the highest HU change (∼110) had mild radiation pneumonitis, and the patient with HU change of ∼80–90 had shortness of breath. No evidence of pneumonitis was observed for the 3 patients with smaller CT density changes (<50 HU). Changes in CT densities, and dose-response correlation, as computed with model-based algorithms, are in excellent agreement with the findings of Palma et al. (1–2). Conclusion: Dose computed with PB (1D or 3D) algorithms was poorly correlated with clinically relevant CT density changes, as opposed to model-based algorithms. A larger cohort of patients is needed to confirm these results. 
This work was supported in part by a grant from Varian Medical Systems, Palo Alto, CA.

  14. Superficial dose evaluation of four dose calculation algorithms

    NASA Astrophysics Data System (ADS)

    Cao, Ying; Yang, Xiaoyu; Yang, Zhen; Qiu, Xiaoping; Lv, Zhiping; Lei, Mingjun; Liu, Gui; Zhang, Zijian; Hu, Yongmei

    2017-08-01

Accurate superficial dose calculation is of major importance because of the skin toxicity in radiotherapy, with the initial 2 mm depth being considered the most clinically relevant. The aim of this study is to evaluate the superficial dose calculation accuracy of four commonly used algorithms in commercially available treatment planning systems (TPS) by Monte Carlo (MC) simulation and film measurements. The superficial dose in a simple geometrical phantom with size of 30 cm×30 cm×30 cm was calculated by PBC (Pencil Beam Convolution), AAA (Analytical Anisotropic Algorithm) and AXB (Acuros XB) in the Eclipse system and CCC (Collapsed Cone Convolution) in the Raystation system under the conditions of source to surface distance (SSD) of 100 cm and field size (FS) of 10×10 cm². The EGSnrc (BEAMnrc/DOSXYZnrc) program was used to simulate the central axis dose distribution of a Varian Trilogy accelerator, combined with measurements of the superficial dose distribution by an extrapolation method using multilayer radiochromic films, to estimate the dose calculation accuracy of the four algorithms in the superficial region recommended in detail by the ICRU (International Commission on Radiation Units and Measurements) and the ICRP (International Commission on Radiological Protection). In the superficial region, good agreement was achieved between MC simulation and the film extrapolation method, with mean differences less than 1%, 2% and 5% for 0°, 30° and 60°, respectively. The relative skin dose errors were 0.84%, 1.88% and 3.90%; the mean dose discrepancies (0°, 30° and 60°) between each of the four algorithms and MC simulation were (2.41±1.55%, 3.11±2.40%, and 1.53±1.05%), (3.09±3.00%, 3.10±3.01%, and 3.77±3.59%), (3.16±1.50%, 8.70±2.84%, and 18.20±4.10%) and (14.45±4.66%, 10.74±4.54%, and 3.34±3.26%) for AXB, CCC, AAA and PBC, respectively. Monte Carlo simulation verified the feasibility of superficial dose measurements by multilayer Gafchromic films. 
The superficial dose calculation accuracy of the four algorithms ranked AXB > CCC > AAA > PBC. Care should be taken when using the AAA and PBC algorithms for superficial dose calculation.
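    The multilayer-film extrapolation idea referenced above can be sketched as a fit over the doses measured at the effective depths of the stacked film layers, extrapolated to the shallow depth of interest. The depths and dose values below are made-up illustrative numbers, not the study's measurements:

```python
import numpy as np

# Effective depths of four stacked film layers (mm) and the relative dose
# (% of dose maximum) measured at each; values are assumed for illustration.
depths_mm = np.array([0.15, 0.45, 0.75, 1.05])
doses_pct = np.array([18.0, 26.0, 34.0, 42.0])

# Fit the build-up trend and extrapolate to a shallower depth of interest,
# e.g. 0.07 mm, the basal skin layer depth often cited by the ICRU/ICRP.
slope, intercept = np.polyfit(depths_mm, doses_pct, 1)
surface_dose = slope * 0.07 + intercept
print(f"extrapolated dose at 0.07 mm: {surface_dose:.1f}% of dose maximum")
```

    A linear fit is the simplest choice; over the steep electron build-up region a higher-order fit over more layers may be warranted, which is why the study stacks multiple films rather than relying on a single surface measurement.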

  15. Accuracy assessment of pharmacogenetically predictive warfarin dosing algorithms in patients of an academic medical center anticoagulation clinic.

    PubMed

    Shaw, Paul B; Donovan, Jennifer L; Tran, Maichi T; Lemon, Stephenie C; Burgwinkle, Pamela; Gore, Joel

    2010-08-01

The objectives of this retrospective cohort study were to evaluate the accuracy of pharmacogenetic warfarin dosing algorithms in predicting therapeutic dose and to determine whether this degree of accuracy warrants the routine use of genotyping to prospectively dose patients newly started on warfarin. Seventy-one patients of an outpatient anticoagulation clinic at an academic medical center who were age 18 years or older, on a stable, therapeutic warfarin dose with an international normalized ratio (INR) goal between 2.0 and 3.0, and with cytochrome P450 isoenzyme 2C9 (CYP2C9) and vitamin K epoxide reductase complex subunit 1 (VKORC1) genotypes available between January 1, 2007 and September 30, 2008 were included. Six pharmacogenetic warfarin dosing algorithms were identified from the medical literature. Additionally, a 5 mg fixed dose approach was evaluated. Three algorithms, Zhu et al. (Clin Chem 53:1199-1205, 2007), Gage et al. (Clin Pharmacol Ther 84:326-331, 2008), and the International Warfarin Pharmacogenetics Consortium (IWPC) (N Engl J Med 360:753-764, 2009), were similar in the primary accuracy endpoints, with mean absolute error (MAE) ranging from 1.7 to 1.8 mg/day and coefficient of determination (R²) from 0.61 to 0.66. However, the Zhu et al. algorithm severely over-predicted dose (defined as ≥2× or ≥2 mg/day more than the actual dose) in twice as many patients (14 vs. 7%) as Gage et al. 2008 and IWPC 2009. In conclusion, the algorithms published by Gage et al. 2008 and the IWPC 2009 were the two most accurate pharmacogenetically based equations available in the medical literature for predicting therapeutic warfarin dose in our study population. However, the degree of accuracy demonstrated does not support the routine use of genotyping to prospectively dose all patients newly started on warfarin.
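    The accuracy endpoints used above (MAE, R², and the severe over-prediction rate of ≥2× or ≥2 mg/day above the actual dose) are straightforward to compute. A minimal sketch with made-up doses, not the study's patient data:

```python
import numpy as np

def evaluate_dosing(actual, predicted):
    """MAE, R^2, and the severe over-prediction rate: the fraction of
    patients whose predicted dose is >= 2x the actual dose or
    >= 2 mg/day above it."""
    actual = np.asarray(actual, float)
    predicted = np.asarray(predicted, float)
    mae = np.abs(predicted - actual).mean()
    ss_res = ((actual - predicted) ** 2).sum()
    ss_tot = ((actual - actual.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    over = ((predicted >= 2 * actual) |
            (predicted >= actual + 2.0)).mean()
    return mae, r2, over

# Illustrative stable doses vs. algorithm predictions (mg/day)
actual = np.array([5.0, 3.5, 7.0, 4.0, 2.0])
pred = np.array([5.5, 3.0, 6.5, 4.5, 4.5])
mae, r2, over = evaluate_dosing(actual, pred)
print(f"MAE = {mae:.2f} mg/day, R^2 = {r2:.2f}, over-predicted = {over:.0%}")
```

    Separating the over-prediction rate from MAE matters clinically: two algorithms with similar average error can differ substantially in how often they push a patient into the bleeding-risk range, which is exactly the distinction the study draws between Zhu et al. and the other two algorithms.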

  16. TU-A-12A-07: CT-Based Biomarkers to Characterize Lung Lesion: Effects of CT Dose, Slice Thickness and Reconstruction Algorithm Based Upon a Phantom Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, B; Tan, Y; Tsai, W

    2014-06-15

Purpose: Radiogenomics promises the ability to study cancer tumor genotype from the phenotype obtained through radiographic imaging. However, little attention has been paid to the sensitivity of image features, the image-based biomarkers, to imaging acquisition techniques. This study explores the impact of CT dose, slice thickness and reconstruction algorithm on measuring image features using a thorax phantom. Methods: Twenty-four phantom lesions of known volume (1 and 2 mm), shape (spherical, elliptical, lobular and spicular) and density (-630, -10 and +100 HU) were scanned on a GE VCT at four doses (25, 50, 100, and 200 mAs). For each scan, six image series were reconstructed at three slice thicknesses of 5, 2.5 and 1.25 mm with contiguous intervals, using the lung and standard reconstruction algorithms. The lesions were segmented with an in-house 3D algorithm. Fifty (50) image features representing lesion size, shape, edge, and density distribution/texture were computed. A regression method was employed to analyze the effect of CT dose, slice thickness and reconstruction algorithm on these features, adjusting for 3 confounding factors (size, density and shape of phantom lesions). Results: The coefficients of CT dose, slice thickness and reconstruction algorithm are presented in Table 1 in the supplementary material. No significant difference was found between the image features calculated on low dose CT scans (25 mAs and 50 mAs). About 50% of texture features were found statistically different between low doses and high doses (100 and 200 mAs). Significant differences were found for almost all features when calculated on 1.25 mm, 2.5 mm, and 5 mm slice thickness images. Reconstruction algorithms significantly affected all density-based image features, but not morphological features. 
Conclusions: There is a great need to standardize CT imaging protocols for radiogenomics studies because CT dose, slice thickness and reconstruction algorithm impact quantitative image features to various degrees, as our study has shown.

  17. Pilot validation of an individualised pharmacokinetic algorithm for protamine dosing after systemic heparinisation for cardiopulmonary bypass.

    PubMed

    Miles, Lachlan F; Marchiori, Paolo; Falter, Florian

    2017-09-01

This manuscript represents a pilot study assessing the feasibility of a single-compartment, individualised, pharmacokinetic algorithm for protamine dosing after cardiopulmonary bypass. A pilot cohort study was conducted in a specialist NHS cardiothoracic hospital targeting patients undergoing elective cardiac surgery using cardiopulmonary bypass. Patients received protamine doses according to a pharmacokinetic algorithm (n = 30) or using an empirical, fixed-dose model (n = 30). Categorical differences between the groups were evaluated using the Chi-squared test or Fisher's exact test. Continuous data were analysed using a paired Student's t-test for parametric data and the paired samples Wilcoxon test for non-parametric data. Patients whose protamine dosing followed the algorithm demonstrated a lower protamine requirement post-bypass relative to empirical management, as measured by absolute dose (243 ± 49 mg vs. 305 ± 34.7 mg; p<0.001) and the heparin to protamine ratio (0.79 ± 0.12 vs. 1.1 ± 0.15; p<0.001). There was no difference in the pre- to post-bypass activated clotting time (ACT) ratio (1.05 ± 0.12 vs. 1.02 ± 0.15; p=0.9). Patients who received protamine according to the algorithm had no significant difference in transfusion requirement (13.3% vs. 30.0%; p=0.21). This study showed that an individualised pharmacokinetic algorithm for the reversal of heparin after cardiopulmonary bypass is feasible in comparison with a fixed dosing strategy and may reduce the protamine requirement following on-pump cardiac surgery.
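    The single-compartment idea described above can be sketched as first-order decay of circulating heparin, with protamine dosed against the estimated remainder rather than the total heparin given. Every parameter below (half-life, neutralization ratio) is an illustrative assumption, not the algorithm validated in the study:

```python
import math

def protamine_dose_mg(heparin_units_given, minutes_since_dose,
                      heparin_half_life_min=90.0,
                      mg_protamine_per_100_units=1.0):
    """Single-compartment sketch: estimate circulating heparin by
    first-order (exponential) decay, then neutralize the remainder at
    the classical ratio of ~1 mg protamine per 100 IU heparin.
    All parameters are assumptions for illustration."""
    k = math.log(2) / heparin_half_life_min          # elimination rate constant
    remaining_units = heparin_units_given * math.exp(-k * minutes_since_dose)
    return remaining_units / 100.0 * mg_protamine_per_100_units

# 30,000 IU given 90 min ago: half remains, so ~150 mg instead of the
# ~300 mg a fixed-ratio scheme would give against the full bolus.
print(round(protamine_dose_mg(30_000, 90.0)))  # → 150
```

    This is the mechanism by which an individualised pharmacokinetic scheme can lower total protamine requirement relative to empirical fixed-ratio dosing, as the study reports.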

  18. A comparison of the convolution and TMR10 treatment planning algorithms for Gamma Knife® radiosurgery

    PubMed Central

    Wright, Gavin; Harrold, Natalie; Bownes, Peter

    2018-01-01

Aims: To compare the accuracies of the convolution and TMR10 Gamma Knife treatment planning algorithms, and assess the impact upon clinical practice of implementing convolution-based treatment planning. Methods: Doses calculated by both algorithms were compared against ionisation chamber measurements in homogeneous and heterogeneous phantoms. Relative dose distributions calculated by both algorithms were compared against film-derived 2D isodose plots in a heterogeneous phantom, with distance-to-agreement (DTA) measured at the 80%, 50% and 20% isodose levels. A retrospective planning study compared 19 clinically acceptable metastasis convolution plans against TMR10 plans with matched shot times, allowing novel comparison of true dosimetric parameters rather than total beam-on time. Gamma analysis and dose-difference analysis were performed on each pair of dose distributions. Results: Both algorithms matched point-dose measurements within ±1.1% in homogeneous conditions. Convolution provided superior point-dose accuracy in the heterogeneous phantom (-1.1% vs 4.0%), with no discernible differences in relative dose distribution accuracy. In our study, convolution-calculated plans yielded a D99% that was 6.4% (95% CI: 5.5%-7.3%, p<0.001) less than that of shot-matched TMR10 plans. For gamma passing criteria of 1%/1 mm, 16% of targets had passing rates >95%. The range of dose differences in the targets was 0.2-4.6 Gy. Conclusions: Convolution provides superior accuracy versus TMR10 in heterogeneous conditions. Implementing convolution would result in increased target doses; therefore its implementation may require a re-evaluation of prescription doses. PMID:29657896

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paudel, M R; Beachey, D J; Sarfehnia, A

Purpose: A new commercial GPU-based Monte Carlo dose calculation algorithm (GPUMCD) developed by the vendor Elekta™ to be used in the Monaco Treatment Planning System (TPS) is capable of modeling dose both for a standard linear accelerator and for an Elekta MRI-linear accelerator (modeling magnetic field effects). We are evaluating this algorithm in two parts: commissioning the algorithm for an Elekta Agility linear accelerator (the focus of this work) and evaluating the algorithm's ability to model magnetic field effects for an MRI-linear accelerator. Methods: A beam model was developed in the Monaco TPS (v.5.09.06) using the commissioned beam data for a 6 MV Agility linac. A heterogeneous phantom representing tumor-in-lung, lung, bone-in-tissue, and prosthetic was designed and built. Dose calculations in Monaco were done using the current clinical algorithm (XVMC) and the new GPUMCD algorithm (1 mm³ voxel size, 0.5% statistical uncertainty) and in the Pinnacle TPS using the collapsed cone convolution (CCC) algorithm. These were compared with the measured doses using an ionization chamber (A1SL) and Gafchromic EBT3 films for 2×2 cm², 5×5 cm², and 10×10 cm² field sizes. Results: The calculated central axis percentage depth doses (PDDs) in homogeneous solid water were within 2% of measurements for XVMC and GPUMCD. For tumor-in-lung and lung phantoms, doses calculated by all of the algorithms were within the experimental uncertainty of the measurements (±2% in the homogeneous phantom and ±3% for the tumor-in-lung or lung phantoms), except for the 2×2 cm² field size, where only the CCC algorithm differed from film by 5% in the lung region. The analysis for the bone-in-tissue and prosthetic phantoms is ongoing. 
Conclusion: The new GPUMCD algorithm calculated dose comparable to both the XVMC algorithm and to measurements in both a homogeneous solid water medium and the heterogeneous phantom representing lung or tumor-in-lung for 2×2 cm² to 10×10 cm² field sizes. Funding support was obtained from Elekta.

  20. Speed and convergence properties of gradient algorithms for optimization of IMRT.

    PubMed

    Zhang, Xiaodong; Liu, Helen; Wang, Xiaochun; Dong, Lei; Wu, Qiuwen; Mohan, Radhe

    2004-05-01

Gradient algorithms are the most commonly employed search methods in the routine optimization of IMRT plans. It is well known that local minima can exist for dose-volume-based and biology-based objective functions. The purpose of this paper is to compare the relative speed of different gradient algorithms, to investigate the strategies for accelerating the optimization process, to assess the validity of these strategies, and to study the convergence properties of these algorithms for dose-volume and biological objective functions. With these aims in mind, we implemented Newton's, conjugate gradient (CG), and steepest descent (SD) algorithms for dose-volume- and EUD-based objective functions. Our implementation of Newton's algorithm approximates the second derivative matrix (Hessian) by its diagonal. The standard SD algorithm and the CG algorithm with "line minimization" were also implemented. In addition, we investigated the use of a variation of the CG algorithm, called the "scaled conjugate gradient" (SCG) algorithm. To accelerate the optimization process, we investigated the validity of the use of a "hybrid optimization" strategy, in which approximations to calculated dose distributions are used during most of the iterations. Published studies have indicated that getting trapped in local minima is not a significant problem. To investigate this issue further, we first obtained, by trial and error, and starting with uniform intensity distributions, the parameters of the dose-volume- or EUD-based objective functions which produced IMRT plans that satisfied the clinical requirements. Using the resulting optimized intensity distributions as the initial guess, we investigated the possibility of getting trapped in a local minimum. For most of the results presented, we used a lung cancer case. To illustrate the generality of our methods, the results for a prostate case are also presented. 
For both dose-volume and EUD based objective functions, Newton's method far outperforms other algorithms in terms of speed. The SCG algorithm, which avoids expensive "line minimization," can speed up the standard CG algorithm by at least a factor of 2. For the same initial conditions, all algorithms converge essentially to the same plan. However, we demonstrate that for any of the algorithms studied, starting with previously optimized intensity distributions as the initial guess but for different objective function parameters, the solution frequently gets trapped in local minima. We found that the initial intensity distribution obtained from IMRT optimization utilizing objective function parameters, which favor a specific anatomic structure, would lead to a local minimum corresponding to that structure. Our results indicate that from among the gradient algorithms tested, Newton's method appears to be the fastest by far. Different gradient algorithms have the same convergence properties for dose-volume- and EUD-based objective functions. The hybrid dose calculation strategy is valid and can significantly accelerate the optimization process. The degree of acceleration achieved depends on the type of optimization problem being addressed (e.g., IMRT optimization, intensity modulated beam configuration optimization, or objective function parameter optimization). Under special conditions, gradient algorithms will get trapped in local minima, and reoptimization, starting with the results of previous optimization, will lead to solutions that are generally not significantly different from the local minimum.
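The speed comparison above can be illustrated with a toy sketch (not the paper's implementation, and with a made-up diagonal Hessian): on a quadratic objective, a Newton step that approximates the Hessian by its diagonal converges far faster than steepest descent with exact line minimization.

```python
import numpy as np

# Toy sketch: compare steepest descent (SD) with exact "line minimization"
# against a Newton step using a diagonal Hessian approximation, on
#   f(x) = 0.5 x^T A x - b^T x,  grad f(x) = A x - b.
# A stands in for the (ill-conditioned) Hessian of an IMRT objective; its
# values are invented for illustration.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])

def grad(x):
    return A @ x - b

def steepest_descent(x, iters):
    for _ in range(iters):
        g = grad(x)
        alpha = (g @ g) / (g @ A @ g)   # exact line-minimization step for a quadratic
        x = x - alpha * g
    return x

def diagonal_newton(x, iters):
    d = np.diag(A)                      # diagonal Hessian approximation
    for _ in range(iters):
        x = x - grad(x) / d
    return x

x_exact = np.linalg.solve(A, b)
# Because A is exactly diagonal here, one diagonal-Newton step lands on the
# minimizer, while SD crawls along the ill-conditioned valley.
newton_err = np.linalg.norm(diagonal_newton(np.zeros(3), 1) - x_exact)
sd_err = np.linalg.norm(steepest_descent(np.zeros(3), 1) - x_exact)
```

The SCG variant discussed above gains its speed by replacing this explicit per-iteration line search with a scaled step.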

  1. New Anticoagulant Agents: Incidence of Adverse Drug Reactions and New Signals Thereof.

    PubMed

    Treceño-Lobato, Carlos; Jiménez-Serranía, María-Isabel; Martínez-García, Raquel; Corzo-Delibes, Francisco; Martín Arias, Luis H

    2018-06-04

    The aim of this study was to evaluate the adverse drug reaction (ADR) incidence rate and new signals thereof for classic compared with new anticoagulants in real-life ambulatory settings. The authors performed an observational cross-sectional study in two cohorts of surveyed patients treated with vitamin K antagonists (VKAs; acenocoumarol or warfarin) or nonvitamin K antagonist oral anticoagulants (NOACs; apixaban, edoxaban, rivaroxaban, dabigatran etexilate). Descriptive, clinical, and ADR data were reported and analyzed through a bivariate analysis (odds ratio [OR]) to compare ADR incidence rates and an adaptation of Bayesian methodology (false discovery rate [FDR] < 0.05) to detect new signals. A total of 334 patients were surveyed (average international normalized ratio [INR] of 2.6), with 45.4% taking new anticoagulants. In all, 835 ADRs were reported: 2.5 per patient (2.8 in the VKA cohort, 2.1 in the NOAC cohort). The authors found a higher risk of epistaxis (OR, 2.18; 95% confidence interval [CI], 1.01-4.74) and hematoma (OR, 2.43; 95% CI, 1.39-4.25) with VKAs and a lower risk of global bleeding symptoms with NOACs (OR, 0.45; 95% CI, 0.28-0.71). After standardizing the data, a significant risk of diarrhea with VKAs was observed (OR, 3.37; 95% CI, 1.09-10.41). They also detected an intense positive signal regarding the use of VKAs and osteoporosis (FDR < 0.001), specifically acenocoumarol (FDR < 0.002). NOACs presented a lower risk of bleeding, especially dabigatran (FDR < 0.031), and of dermatological pathologies, with apixaban being the safest (FDR = 0.050). The lower risk of global bleeding and a potential protective effect against osteoporosis in patients treated with NOACs suggest that they are safer than VKAs.
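The risk estimates quoted above are odds ratios with 95% confidence intervals. As an illustrative sketch (the counts below are hypothetical, not the study's data), an OR and its Wald confidence interval are computed from a 2x2 exposure/outcome table as follows:

```python
import math

# Hypothetical 2x2 table sketch of how an OR and Wald 95% CI of the kind
# reported above (e.g. OR 2.43, 95% CI 1.39-4.25 for hematoma with VKAs)
# are computed.
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a: exposed with event, b: exposed without; c: unexposed with event, d: unexposed without."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of ln(OR)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# hypothetical counts, not taken from the study
or_hematoma, lower, upper = odds_ratio_ci(30, 152, 14, 138)
```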

  2. [Gastrointestinal lesions and characteristics of acute gastrointestinal bleeding in acenocoumarol-treated patients].

    PubMed

    Nantes, Óscar; Zozaya, José Manuel; Montes, Ramón; Hermida, José

    2014-01-01

    In the last few years, the number of anticoagulated patients has significantly increased and, as a consequence, so have hemorrhagic complications due to this therapy. We analyzed gastrointestinal (GI) bleeding because it is the most frequent type of major bleeding in these patients, and we hypothesized that they would have lesions responsible for GI bleeding regardless of the intensity of anticoagulation, although excessively anticoagulated patients would have more serious hemorrhages. To study the characteristics of anticoagulated patients with GI bleeding and the relationship between the degree of anticoagulation and a finding of causative lesions and bleeding severity. We prospectively studied 96 patients, all anticoagulated with acenocoumarol and consecutively admitted to hospital between 01/01/2003 and 09/30/2005 because of acute GI bleeding. We excluded patients with severe liver disease, as well as nine patients with incomplete details. The incidence of GI bleeding requiring hospitalization was 19.6 cases/100,000 inhabitants-year. In 90% of patients, we found a causative (85% of upper GI bleeding and 50% of lower GI bleeding) or potentially causative lesion, and 30% of them required endoscopic treatment, without differences depending on the intensity of anticoagulation. No relationship was found between the type of lesions observed and the degree of anticoagulation in these patients. Patients who received more intense anticoagulation therapy had more severe hemorrhages (23% of patients with an INR ≥4 had a life-threatening bleed versus only 4% of patients with INR <4). We found an incidence of 20 severe GI bleeding episodes in anticoagulated patients per 100,000 inhabitants-year, with no difference in localization or in the frequency of causative lesions depending on the intensity of anticoagulation. Patients receiving more intense anticoagulation had more severe GI bleeding episodes.

  3. SU-E-T-374: Evaluation and Verification of Dose Calculation Accuracy with Different Dose Grid Sizes for Intracranial Stereotactic Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, C; Schultheiss, T

    Purpose: In this study, we aim to evaluate the effect of dose grid size on the accuracy of calculated dose for small lesions in intracranial stereotactic radiosurgery (SRS), and to verify dose calculation accuracy with radiochromic film dosimetry. Methods: 15 intracranial lesions from previous SRS patients were retrospectively selected for this study. The planning target volume (PTV) ranged from 0.17 to 2.3 cm³. A commercial treatment planning system was used to generate SRS plans using the volumetric modulated arc therapy (VMAT) technique with two arc fields. Two convolution-superposition-based dose calculation algorithms (Anisotropic Analytical Algorithm and Acuros XB algorithm) were used to calculate volume dose distribution with dose grid size ranging from 1 mm to 3 mm in 0.5 mm steps. First, while the plan monitor units (MU) were kept constant, PTV dose variations were analyzed. Second, with 95% of the PTV covered by the prescription dose, variations of the plan MUs as a function of dose grid size were analyzed. Radiochromic films were used to compare the delivered dose and profiles with the calculated dose distribution for different dose grid sizes. Results: The dose to the PTV, in terms of the mean, maximum, and minimum dose, showed a steady decrease with increasing dose grid size using both algorithms. With 95% of the PTV covered by the prescription dose, the total MU increased with increasing dose grid size in most of the plans. Radiochromic film measurements showed better agreement with dose distributions calculated with 1-mm dose grid size. Conclusion: Dose grid size has a significant impact on the calculated dose distribution in intracranial SRS treatment planning with small target volumes. Using the default dose grid size could lead to under-estimation of the delivered dose. A small dose grid size should be used to ensure calculation accuracy and agreement with QA measurements.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badkul, R; Nicolai, W; Pokhrel, D

    Purpose: To compare the impact of the Pencil Beam (PB) and Anisotropic Analytical Algorithm (AAA) dose calculation algorithms on OARs and the planning target volume (PTV) in thoracic spine stereotactic body radiation therapy (SBRT). Methods: Ten spine SBRT patients were planned on the Brainlab iPlan system using a hybrid plan consisting of 1-2 non-coplanar conformal dynamic arcs and a few IMRT beams, treated on a Novalis Tx with 6-MV photons. Dose prescription varied from 20 Gy to 30 Gy in 5 fractions depending on the clinical situation. PB plans were retrospectively recalculated in Varian Eclipse with the AAA algorithm using the same MUs, MLC pattern, and grid size (3 mm). Differences in dose-volume parameters for the PTV, spinal cord, lung, and esophagus were analyzed and compared for the PB and AAA algorithms. OAR constraints followed RTOG-0631. Results: Since patients were treated using the PB calculation, we compared all AAA DVH values against the PB plan values as the standard, although AAA predicts the dose more accurately than PB. PTV(min), PTV(max), PTV(mean), PTV(D99%), and PTV(D90%) were overestimated with the AAA calculation on average by 3.5%, 1.84%, 0.95%, 3.98%, and 1.55%, respectively, as compared to PB. All lung DVH parameters were underestimated with the AAA algorithm; mean deviations of lung V20, V10, V5, and 1000 cc were 42.81%, 19.83%, 18.79%, and 18.35%, respectively. AAA overestimated cord (0.35 cc) by a mean of 17.3%, cord (0.03 cc) by 12.19%, and cord (max) by 10.5% as compared to PB. Esophagus max dose was overestimated by 4.4% and 5 cc by 3.26% for the AAA algorithm as compared to PB. Conclusion: AAA overestimated the PTV dose values by up to 4%. The lung DVH had the greatest underestimation of dose by AAA versus PB. Spinal cord dose was overestimated by AAA versus PB. Given the critical importance of accurate OAR and PTV dose calculation for spine SBRT, more accurate algorithms and validation of calculated doses in phantom models are indicated.

  5. Comparison of dosimetric and radiobiological parameters on plans for prostate stereotactic body radiotherapy using an endorectal balloon for different dose-calculation algorithms and delivery-beam modes

    NASA Astrophysics Data System (ADS)

    Kang, Sang-Won; Suh, Tae-Suk; Chung, Jin-Beom; Eom, Keun-Yong; Song, Changhoon; Kim, In-Ah; Kim, Jae-Sung; Lee, Jeong-Woo; Cho, Woong

    2017-02-01

    The purpose of this study was to evaluate the impact of dosimetric and radiobiological parameters on treatment plans by using different dose-calculation algorithms and delivery-beam modes for prostate stereotactic body radiation therapy using an endorectal balloon. For 20 patients with prostate cancer, stereotactic body radiation therapy (SBRT) plans were generated by using a 10-MV photon beam with flattening filter (FF) and flattening-filter-free (FFF) modes. The total treatment dose prescribed was 42.7 Gy in 7 fractions to cover at least 95% of the planning target volume (PTV) with 95% of the prescribed dose. The dose computation was initially performed using an anisotropic analytical algorithm (AAA) in the Eclipse treatment planning system (Varian Medical Systems, Palo Alto, CA) and was then re-calculated using Acuros XB (AXB V. 11.0.34) with the same monitor units and multileaf collimator files. The dosimetric and the radiobiological parameters for the PTV and organs at risk (OARs) were analyzed from the dose-volume histogram. An obvious difference in dosimetric parameters between the AAA and the AXB plans was observed in the PTV and rectum. Doses to the PTV, excluding the maximum dose, were always higher in the AAA plans than in the AXB plans. However, doses to the other OARs were similar in both algorithm plans. In addition, no difference was observed in the dosimetric parameters for different delivery-beam modes when using the same algorithm to generate plans. As a result of the dosimetric parameters, the radiobiological parameters for the two algorithm plans presented an apparent difference in the PTV and the rectum. The average tumor control probability of the AAA plans was higher than that of the AXB plans. The average normal tissue complication probability (NTCP) to rectum was lower in the AXB plans than in the AAA plans. The AAA and the AXB plans yielded very similar NTCPs for the other OARs. 
    In plans using the same algorithm, the NTCPs for the two delivery-beam modes showed no differences. This study demonstrated that the dosimetric and radiobiological parameters for the PTV and the rectum were affected by the choice of dose-calculation algorithm for prostate SBRT using an endorectal balloon. However, the dosimetric and radiobiological parameters in the AAA and AXB plans for the other OARs were similar. Furthermore, no differences between the dosimetric and radiobiological parameters for different delivery-beam modes were found when the same algorithm was used to generate the treatment plan.
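The radiobiological endpoints compared above are typically derived from DVH reductions. As a hedged sketch, the generalized equivalent uniform dose (gEUD) and the Lyman-Kutcher-Burman (LKB) NTCP model can be written as follows; the parameter values (n, m, TD50) and the DVH are illustrative placeholders, not the study's fitted values.

```python
import math

# gEUD reduction of a DVH plus the LKB NTCP model; all numbers illustrative.
def geud(doses, volumes, n):
    """gEUD = (sum_i v_i * D_i**(1/n))**n over a normalized differential DVH."""
    return sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def lkb_ntcp(eud, td50, m):
    """NTCP = Phi((EUD - TD50) / (m * TD50)), with Phi the standard normal CDF."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

doses = [10.0, 30.0, 50.0]     # Gy, hypothetical dose bins
volumes = [0.5, 0.3, 0.2]      # fractional volumes (sum to 1)
eud = geud(doses, volumes, n=0.1)          # small n: serial organ, near-max dose
ntcp = lkb_ntcp(eud, td50=80.0, m=0.15)    # illustrative rectum-like parameters
```

With small n the gEUD tracks the near-maximum dose, which is why rectal dose differences between algorithms translate into NTCP differences.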

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cebe, M; Pacaci, P; Mabhouti, H

    Purpose: In this study, the two available electron dose calculation algorithms of the Varian Eclipse treatment planning system (TPS), the electron Monte Carlo (eMC) and Generalized Gaussian Pencil Beam (GGPB) algorithms, were used to compare measured and calculated peripheral dose distributions of electron beams. Methods: Peripheral dose measurements were carried out for 6, 9, 12, 15, 18, and 22 MeV electron beams of a Varian Trilogy machine using a parallel-plate ionization chamber and EBT3 films in a slab phantom. Measurements were performed for 6 × 6, 10 × 10, and 25 × 25 cm² cone sizes at the dmax of each energy, up to 20 cm beyond the field edges. Using the same film batch, the net OD-to-dose calibration curve was obtained for each energy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. Dose distributions measured using the parallel-plate ionization chamber and EBT3 film and calculated by the eMC and GGPB algorithms were compared to find which algorithm calculates peripheral dose distributions more accurately. Results: The agreement between measurement and eMC was better than with GGPB. The TPS underestimated the out-of-field doses, and the difference between measured and calculated doses increased with cone size. The largest deviation between calculated and parallel-plate ionization chamber measured dose was less than 4.93% for eMC, but up to 7.51% for GGPB. For the film measurements, the minimum gamma analysis passing rates between measured and calculated dose distributions were 98.2% and 92.7% for eMC and GGPB, respectively, over all field sizes and energies. Conclusion: Our results show that the Monte Carlo algorithm for electron planning in Eclipse is more accurate than previous algorithms for peripheral dose distributions. It must be emphasized that the use of GGPB for planning large-field treatments with 6 MeV could lead to inaccuracies of clinical significance.

  7. SU-E-T-538: Evaluation of IMRT Dose Calculation Based on Pencil-Beam and AAA Algorithms.

    PubMed

    Yuan, Y; Duan, J; Popple, R; Brezovich, I

    2012-06-01

    To evaluate the accuracy of dose calculation for intensity modulated radiation therapy (IMRT) based on the Pencil Beam (PB) and Analytical Anisotropic Algorithm (AAA) computation algorithms. IMRT plans of twelve patients with different treatment sites, including head/neck, lung, and pelvis, were investigated. For each patient, dose calculations with the PB and AAA algorithms using dose grid sizes of 0.5 cm, 0.25 cm, and 0.125 cm were compared with composite-beam ion chamber and film measurements in patient-specific QA. Discrepancies between calculation and measurement were evaluated by the percentage error for ion chamber dose and by the γ > 1 failure rate in gamma analysis (3%/3 mm) for film dosimetry. For 9 patients, the ion chamber dose calculated with the AAA algorithm was closer to the ion chamber measurement than that calculated with the PB algorithm with a grid size of 2.5 mm, though all calculated ion chamber doses were within 3% of the measurements. For head/neck patients and other patients with large treatment volumes, the γ > 1 failure rate was significantly reduced (to within 5%) with AAA-based treatment planning, compared to generally more than 10% with PB-based treatment planning (grid size = 2.5 mm). For lung and brain cancer patients with medium and small treatment volumes, γ > 1 failure rates were typically within 5% for both AAA- and PB-based treatment planning (grid size = 2.5 mm). For both PB- and AAA-based treatment planning, improvements in dose calculation accuracy with finer dose grids were observed in film dosimetry for 11 patients and in ion chamber measurements for 3 patients. AAA-based treatment planning provides more accurate dose calculation for head/neck patients and other patients with large treatment volumes. Compared with film dosimetry, a γ > 1 failure rate within 5% can be achieved for AAA-based treatment planning. © 2012 American Association of Physicists in Medicine.
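The film comparisons above use the gamma analysis (3%/3 mm): a measured point passes when its gamma index, a combined dose-difference/distance-to-agreement metric, is at most 1. A minimal 1-D sketch with made-up profile values (and a discrete search over reference points, rather than a continuous one):

```python
import math

# 1-D gamma index: minimum over reference points of the combined
# dose-difference / distance-to-agreement distance.  Doses are relative to
# the prescription; profile values below are invented for illustration.
def gamma_index(x_eval, d_eval, xs_ref, ds_ref, dd=0.03, dta=3.0):
    """dd: dose-difference criterion (fraction); dta: distance-to-agreement (mm)."""
    return min(
        math.sqrt(((x_eval - xr) / dta) ** 2 + ((d_eval - dr) / dd) ** 2)
        for xr, dr in zip(xs_ref, ds_ref)
    )

xs = [0.0, 1.0, 2.0, 3.0]              # positions (mm)
measured = [1.00, 0.98, 0.60, 0.10]    # film, relative dose
calculated = [1.01, 0.97, 0.62, 0.11]  # TPS, relative dose
gammas = [gamma_index(x, d, xs, calculated) for x, d in zip(xs, measured)]
pass_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
```

The γ > 1 failure rate reported above is simply 1 minus this pass rate over all film pixels.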

  8. Direct dose mapping versus energy/mass transfer mapping for 4D dose accumulation: fundamental differences and dosimetric consequences.

    PubMed

    Li, Haisen S; Zhong, Hualiang; Kim, Jinkoo; Glide-Hurst, Carri; Gulam, Misbah; Nurushev, Teamour S; Chetty, Indrin J

    2014-01-06

    The direct dose mapping (DDM) and energy/mass transfer (EMT) mapping are two essential algorithms for accumulating the dose from different anatomic phases to the reference phase when there is organ motion or tumor/tissue deformation during the delivery of radiation therapy. DDM is based on interpolation of the dose values from one dose grid to another and thus lacks rigor in defining the dose when there are multiple dose values mapped to one dose voxel in the reference phase due to tissue/tumor deformation. On the other hand, EMT counts the total energy and mass transferred to each voxel in the reference phase and calculates the dose by dividing the energy by mass. Therefore it is based on fundamentally sound physics principles. In this study, we implemented the two algorithms and integrated them within the Eclipse treatment planning system. We then compared the clinical dosimetric difference between the two algorithms for ten lung cancer patients receiving stereotactic radiosurgery treatment, by accumulating the delivered dose to the end-of-exhale (EE) phase. Specifically, the respiratory period was divided into ten phases and the dose to each phase was calculated and mapped to the EE phase and then accumulated. The displacement vector field generated by Demons-based registration of the source and reference images was used to transfer the dose and energy. The DDM and EMT algorithms produced noticeably different cumulative dose in the regions with sharp mass density variations and/or high dose gradients. For the planning target volume (PTV) and internal target volume (ITV) minimum dose, the difference was up to 11% and 4% respectively. This suggests that DDM might not be adequate for obtaining an accurate dose distribution of the cumulative plan, instead, EMT should be considered.
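The distinction drawn above can be shown with a toy 1-D sketch (numbers invented): when a deformation maps two source voxels onto the same reference voxel, a DDM-style scheme has no physical rule for the collision (here the last mapped dose simply wins), while EMT sums transferred energy and mass and divides, conserving integral dose.

```python
# Each mapped source voxel: (reference_voxel_index, dose_Gy, mass_g)
mapped = [(0, 2.0, 1.0), (0, 4.0, 3.0), (1, 1.0, 2.0)]

# DDM-like accumulation: a colliding dose value is simply overwritten
ddm = {}
for idx, dose, mass in mapped:
    ddm[idx] = dose

# EMT accumulation: sum energy (dose * mass) and mass, then divide
energy, mass_tot = {}, {}
for idx, dose, mass in mapped:
    energy[idx] = energy.get(idx, 0.0) + dose * mass
    mass_tot[idx] = mass_tot.get(idx, 0.0) + mass
emt = {i: energy[i] / mass_tot[i] for i in energy}
```

Voxel 0 receives 4.0 Gy under DDM but the mass-weighted 3.5 Gy under EMT, which is why the two schemes diverge exactly where mass density and dose gradients are steep.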

  10. The accuracy of the out-of-field dose calculations using a model based algorithm in a commercial treatment planning system

    NASA Astrophysics Data System (ADS)

    Wang, Lilie; Ding, George X.

    2014-07-01

    The out-of-field dose can be clinically important as it relates to the dose to organs at risk, although the accuracy of its calculation in commercial radiotherapy treatment planning systems (TPSs) receives less attention. This study evaluates the uncertainties of the out-of-field dose calculated with a model-based dose calculation algorithm, the anisotropic analytical algorithm (AAA), implemented in a commercial radiotherapy TPS, Varian Eclipse V10, by using Monte Carlo (MC) simulations in which the entire accelerator head is modeled, including the multi-leaf collimators. The MC-calculated out-of-field doses were validated by experimental measurements. The dose calculations were performed in a water phantom as well as in CT-based patient geometries, and both static and highly modulated intensity-modulated radiation therapy (IMRT) fields were evaluated. We compared the calculated out-of-field doses, defined as lower than 5% of the prescription dose, in four H&N cancer patients and two lung cancer patients treated with volumetric modulated arc therapy (VMAT) and IMRT techniques. The results show that the discrepancy in calculated out-of-field dose profiles between AAA and MC depends on the depth and is generally less than 1% for in-water phantom comparisons and CT-based patient dose calculations for static fields and IMRT. For VMAT plans, the difference between AAA and MC is <0.5%. The clinical impact resulting from the error in the calculated organ doses was analyzed using dose-volume histograms. Although the AAA algorithm significantly underestimated the out-of-field doses, the clinical impact on the calculated organ doses in out-of-field regions may not be significant in practice due to very low out-of-field doses relative to the target dose.

  11. SU-E-T-339: Dosimetric Verification of Acuros XB Dose Calculation Algorithm On An Air Cavity for 6-MV Flattening Filter-Free Beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, S; Suh, T; Chung, J

    Purpose: This study was to verify the accuracy of the Acuros XB (AXB) dose calculation algorithm in an air cavity for a single radiation field using a 6-MV flattening filter-free (FFF) beam. Methods: A rectangular slab phantom containing an air cavity was made for this study. The CT images of the phantom for dose calculation were scanned with and without film at the measurement depths (4.5, 5.5, 6.5, and 7.5 cm). The central axis doses (CADs) and the off-axis doses (OADs) were measured by film and calculated with the Analytical Anisotropic Algorithm (AAA) and AXB for field sizes ranging from 2 × 2 to 5 × 5 cm² of 6-MV FFF beams. The calculations were labeled AXB-w and AAA-w when the film was included in the phantom, and AXB-w/o and AAA-w/o when it was not. The calculated OADs for both algorithms were compared with the measured OADs, and differences were quantified using root mean square error (RMSE) and gamma evaluation. Results: The percentage differences (%Diffs) between the measured and calculated CADs showed the best agreement for AXB-w. Comparing the %Diffs with and without film, the %Diffs with film were smaller for both algorithms. The %Diffs for both algorithms decreased with increasing field size and increased with depth. RMSEs of the CAD for AXB-w were within 10.32% for both the inner profile and penumbra, while the corresponding values for AAA-w reached 96.50%. Conclusion: This study demonstrated that dose calculation with AXB within an air cavity is more accurate than with AAA when compared to the measured dose. Furthermore, we found that AXB-w was superior to AXB-w/o in this region when compared against the measurements.

  12. SU-F-I-09: Improvement of Image Registration Using Total-Variation Based Noise Reduction Algorithms for Low-Dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mukherjee, S; Farr, J; Merchant, T

    Purpose: To study the effect of total-variation based noise reduction algorithms on the image registration of low-dose CBCT for patient positioning in radiation therapy. Methods: In low-dose CBCT, the reconstructed image is degraded by excessive quantum noise. In this study, we developed a total-variation based noise reduction algorithm and studied its effect on noise reduction and image registration accuracy. To study the effect of noise reduction, we calculated the peak signal-to-noise ratio (PSNR). To study the improvement of image registration, we performed image registration between volumetric CT and MV-CBCT images of different head-and-neck patients and calculated the mutual information (MI) and Pearson correlation coefficient (PCC) as similarity metrics. The PSNR, MI, and PCC were calculated for both the noisy and noise-reduced CBCT images. Results: The algorithms were shown to be effective in reducing the noise level and improving the MI and PCC for the low-dose CBCT images tested. For the different head-and-neck patients, a maximum improvement in PSNR of 10 dB with respect to the noisy image was calculated. The improvements in MI and PCC were 9% and 2%, respectively. Conclusion: A total-variation based noise reduction algorithm was studied to improve the image registration between CT and low-dose CBCT. The algorithm showed promising results in reducing the noise in low-dose CBCT images and improving the similarity metrics in terms of MI and PCC.
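The study's exact algorithm and parameters are not given in the abstract, so the following is only a hedged 1-D sketch of the general idea: minimize a data-fidelity term plus a total-variation penalty (here with a smoothed absolute value) by gradient descent, and score the result with PSNR. All signal values and parameters are invented.

```python
import math

# Gradient descent on 0.5*sum_i (u_i - f_i)^2 + lam * sum_i |u_i - u_{i-1}|,
# with |d| smoothed as sqrt(d^2 + eps) so the gradient is defined everywhere.
def tv_denoise_1d(f, lam=0.1, step=0.1, iters=300, eps=1e-2):
    u = list(f)
    n = len(u)
    for _ in range(iters):
        g = [u[i] - f[i] for i in range(n)]            # data-fidelity gradient
        for i in range(n):
            if i > 0:
                d = u[i] - u[i - 1]
                g[i] += lam * d / math.sqrt(d * d + eps)
            if i < n - 1:
                d = u[i] - u[i + 1]
                g[i] += lam * d / math.sqrt(d * d + eps)
        u = [u[i] - step * g[i] for i in range(n)]
    return u

def total_variation(u):
    return sum(abs(u[i + 1] - u[i]) for i in range(len(u) - 1))

def psnr(ref, img, peak=1.0):
    mse = sum((r - x) ** 2 for r, x in zip(ref, img)) / len(ref)
    return 10.0 * math.log10(peak * peak / mse)

clean = [0.0] * 8 + [1.0] * 8                          # piecewise-constant "image"
noise = [0.05, -0.04, 0.03, -0.05, 0.04, -0.03, 0.05, -0.04,
         0.03, -0.05, 0.04, -0.03, 0.05, -0.04, 0.03, -0.05]
noisy = [c + n for c, n in zip(clean, noise)]
denoised = tv_denoise_1d(noisy)
```

The TV penalty flattens the noisy plateaus while largely preserving the sharp edge, which is the property that makes such filters attractive before registration.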

  13. Warfarin Pharmacogenomics in Diverse Populations.

    PubMed

    Kaye, Justin B; Schultz, Lauren E; Steiner, Heidi E; Kittles, Rick A; Cavallari, Larisa H; Karnes, Jason H

    2017-09-01

    Genotype-guided warfarin dosing algorithms are a rational approach to optimize warfarin dosing and potentially reduce adverse drug events. Diverse populations, such as African Americans and Latinos, have greater variability in warfarin dose requirements and are at greater risk for experiencing warfarin-related adverse events compared with individuals of European ancestry. Although these data suggest that patients of diverse populations may benefit from improved warfarin dose estimation, the vast majority of literature on genotype-guided warfarin dosing, including data from prospective randomized trials, is in populations of European ancestry. Despite differing frequencies of variants by race/ethnicity, most evidence in diverse populations evaluates variants that are most common in populations of European ancestry. Algorithms that do not include variants important across race/ethnic groups are unlikely to benefit diverse populations. In some race/ethnic groups, development of race-specific or admixture-based algorithms may facilitate improved genotype-guided warfarin dosing algorithms above and beyond that seen in individuals of European ancestry. These observations should be considered in the interpretation of literature evaluating the clinical utility of genotype-guided warfarin dosing. Careful consideration of race/ethnicity and additional evidence focused on improving warfarin dosing algorithms across race/ethnic groups will be necessary for successful clinical implementation of warfarin pharmacogenomics. The evidence for warfarin pharmacogenomics has a broad significance for pharmacogenomic testing, emphasizing the consideration of race/ethnicity in discovery of gene-drug pairs and development of clinical recommendations for pharmacogenetic testing. © 2017 Pharmacotherapy Publications, Inc.
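Genotype-guided dosing algorithms of the kind reviewed above usually take the form of a linear regression on clinical and genetic predictors of a transformed dose. The sketch below shows only that form; every coefficient is a hypothetical placeholder, not a published or validated model, and real algorithms are fitted to cohort data (and, as the review argues, should include variants relevant across race/ethnic groups).

```python
# Hypothetical linear model on the sqrt(dose) scale; coefficients invented.
def predicted_weekly_dose_mg(age_decades, height_cm, weight_kg,
                             vkorc1_a_alleles, cyp2c9_variant_alleles):
    sqrt_dose = (8.0                          # hypothetical intercept
                 - 0.3 * age_decades          # dose requirement falls with age
                 + 0.01 * height_cm
                 + 0.007 * weight_kg
                 - 0.9 * vkorc1_a_alleles     # VKORC1 -1639 G>A alleles (0-2)
                 - 0.5 * cyp2c9_variant_alleles)  # CYP2C9 *2/*3 alleles (0-2)
    return sqrt_dose ** 2

wild_type = predicted_weekly_dose_mg(6, 170, 75, 0, 0)
sensitive = predicted_weekly_dose_mg(6, 170, 75, 2, 2)  # variant-allele carrier
```

An algorithm of this shape can only serve populations whose informative variants appear among its predictors, which is the review's central point about race/ethnic-specific and admixture-based models.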

  14. Influence of radiation dose and iterative reconstruction algorithms for measurement accuracy and reproducibility of pulmonary nodule volumetry: A phantom study.

    PubMed

    Kim, Hyungjin; Park, Chang Min; Song, Yong Sub; Lee, Sang Min; Goo, Jin Mo

    2014-05-01

    To evaluate the influence of radiation dose settings and reconstruction algorithms on the measurement accuracy and reproducibility of semi-automated pulmonary nodule volumetry. CT scans were performed on a chest phantom containing various nodules (10 and 12mm; +100, -630 and -800HU) at 120kVp with tube current-time settings of 10, 20, 50, and 100mAs. Each CT was reconstructed using filtered back projection (FBP), iDose(4) and iterative model reconstruction (IMR). Semi-automated volumetry was performed by two radiologists using commercial volumetry software for nodules at each CT dataset. Noise, contrast-to-noise ratio and signal-to-noise ratio of CT images were also obtained. The absolute percentage measurement errors and differences were then calculated for volume and mass. The influence of radiation dose and reconstruction algorithm on measurement accuracy, reproducibility and objective image quality metrics was analyzed using generalized estimating equations. Measurement accuracy and reproducibility of nodule volume and mass were not significantly associated with CT radiation dose settings or reconstruction algorithms (p>0.05). Objective image quality metrics of CT images were superior in IMR than in FBP or iDose(4) at all radiation dose settings (p<0.05). Semi-automated nodule volumetry can be applied to low- or ultralow-dose chest CT with usage of a novel iterative reconstruction algorithm without losing measurement accuracy and reproducibility. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
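The accuracy and reproducibility endpoints above reduce to two simple metrics: the absolute percentage error of a measurement against the known phantom volume, and the inter-reader absolute percentage difference. A sketch with hypothetical measured values:

```python
import math

def abs_percentage_error(measured, truth):
    return abs(measured - truth) / truth * 100.0

def abs_percentage_difference(reader1, reader2):
    return abs(reader1 - reader2) / ((reader1 + reader2) / 2.0) * 100.0

true_vol = 4.0 / 3.0 * math.pi * 5.0 ** 3       # 10 mm sphere, mm^3 (~523.6)
err = abs_percentage_error(540.0, true_vol)     # one reader's (hypothetical) measurement
diff = abs_percentage_difference(540.0, 530.0)  # two readers, hypothetical values
```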

  15. Pediatric chest HRCT using the iDose4 Hybrid Iterative Reconstruction Algorithm: Which iDose level to choose?

    NASA Astrophysics Data System (ADS)

    Smarda, M.; Alexopoulou, E.; Mazioti, A.; Kordolaimi, S.; Ploussi, A.; Priftis, K.; Efstathopoulos, E.

    2015-09-01

    The purpose of this study was to determine the appropriate iterative reconstruction (IR) algorithm level that combines image quality and diagnostic confidence for pediatric patients undergoing high-resolution computed tomography (HRCT). During the last 2 years, a total of 20 children up to 10 years old with a clinical presentation of chronic bronchitis underwent HRCT in our department's 64-detector row CT scanner using the iDose IR algorithm, with almost identical image settings (80 kVp, 40-50 mAs). CT images were reconstructed with all iDose levels (1 to 7) as well as with the filtered back projection (FBP) algorithm. Subjective image quality was evaluated by 2 experienced radiologists in terms of image noise, sharpness, contrast, and diagnostic acceptability using a 5-point scale (1 = excellent image, 5 = non-acceptable image). The presence of artifacts was also noted. All mean scores from both radiologists corresponded to satisfactory image quality (score ≤3), even with the FBP algorithm. Almost excellent (score <2) overall image quality was achieved with iDose levels 5 to 7, but oversmoothing artifacts appearing with iDose levels 6 and 7 affected diagnostic confidence. In conclusion, the use of iDose level 5 enables almost excellent image quality without considerable artifacts affecting the diagnosis. Further evaluation is needed in order to draw more precise conclusions.

  16. Characterisation of mega-voltage electron pencil beam dose distributions: viability of a measurement-based approach.

    PubMed

    Barnes, M P; Ebert, M A

    2008-03-01

    The concept of electron pencil-beam dose distributions is central to pencil-beam algorithms used in electron beam radiotherapy treatment planning. The Hogstrom algorithm, which is a common algorithm for electron treatment planning, models large electron field dose distributions by the superposition of a series of pencil-beam dose distributions. This means that the accurate characterisation of an electron pencil beam is essential for the accuracy of the dose algorithm. The aim of this study was to evaluate a measurement-based approach for obtaining electron pencil-beam dose distributions. The primary incentive for the study was the accurate calculation of dose distributions for narrow fields, as traditional electron algorithms are generally inaccurate for such geometries. Kodak X-Omat radiographic film was used in a solid water phantom to measure the dose distribution of circular 12 MeV beams from a Varian 21EX linear accelerator. Measurements were made for beams of diameter 1.5, 2, 4, 8, 16 and 32 mm. A blocked-field technique was used to subtract photon contamination in the beam. The "error function" derived from Fermi-Eyges Multiple Coulomb Scattering (MCS) theory for corresponding square fields was used to fit the resulting dose distributions so that extrapolation down to a pencil-beam distribution could be made. The Monte Carlo codes BEAM and EGSnrc were used to simulate the experimental arrangement. The 8 mm beam dose distribution was also measured with TLD-100 microcubes. Agreement between film, TLD and Monte Carlo simulation results was found to be consistent with the spatial resolution used. The study has shown that it is possible to extrapolate narrow electron beam dose distributions down to a pencil-beam dose distribution using the error function. However, due to experimental uncertainties and measurement difficulties, Monte Carlo is recommended as the method of choice for characterising electron pencil-beam dose distributions.

  17. Dosimetric Evaluation of Metal Artefact Reduction using Metal Artefact Reduction (MAR) Algorithm and Dual-energy Computed Tomography (CT) Method

    NASA Astrophysics Data System (ADS)

    Laguda, Edcer Jerecho

    Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient's medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast acquisition imaging device with higher spatial resolution and higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented through a gray scale of independent values in Hounsfield units (HU). Higher HU values represent higher density. High-density materials, such as metal, tend to erroneously increase the HU values around them due to reconstruction software limitations. This problem of increased HU values due to metal presence is referred to as metal artefacts. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects that are of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts for better image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This research study evaluates the dosimetric effect of metal artefact reduction algorithms on CT images with severe artefacts. This study uses the Gemstone Spectral Imaging (GSI)-based MAR algorithm, the projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method. Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), and the Dual-Energy Imaging Method was developed at Duke University.
All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc plans and single-arc plans, using the Volumetric Modulated Arc Therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used the projection-based MAR algorithm and the Dual-Energy method. Calculated doses (mean, minimum, and maximum) to the planning treatment volume (PTV) were compared and the homogeneity index (HI) was calculated. Results: (1) Without the GSI-based MAR application, a percent error between the mean dose and the absolute dose ranging from 3.4-5.7% per fraction was observed. In contrast, the error was decreased to a range of 0.09-2.3% per fraction with the GSI-based MAR algorithm. There was a percent difference ranging from 1.7-4.2% per fraction between plans with and without the GSI-based MAR algorithm. (2) Differences of 0.1-3.2% were observed for the maximum dose values, 1.5-10.4% for the minimum doses, and 1.4-1.7% for the mean doses. Homogeneity indexes (HI) ranging from 0.068-0.065 for the dual-energy method and 0.063-0.141 with the projection-based MAR algorithm were also calculated. Conclusion: (1) The percent error without the GSI-based MAR algorithm may deviate by as much as 5.7%. This error undermines the goal of radiation therapy to provide a more precise treatment. Thus, the GSI-based MAR algorithm was desirable due to its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of the different techniques, but deviation was evident in the maximum and minimum doses. The HI for the dual-energy method almost achieved the desirable null values.
In conclusion, for images with metal artefacts, the Dual-Energy method gave better dose calculation accuracy to the planning treatment volume (PTV) than the GE MAR algorithm or no correction.
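
The homogeneity index (HI) values quoted above depend on the exact definition, which varies between studies. As an illustration only, here is a sketch of one common ICRU-style form, (D2% − D98%) / prescription dose; this is an assumption for the example, not necessarily the formula used in the thesis:

```python
import numpy as np

def homogeneity_index(ptv_doses, prescription_dose):
    """Homogeneity index as (D2% - D98%) / prescription dose.

    HI definitions vary between studies; this ICRU-style form is one
    common choice, used here purely for illustration.
    """
    d2 = np.percentile(ptv_doses, 98)   # dose to the hottest 2% of voxels
    d98 = np.percentile(ptv_doses, 2)   # dose covering 98% of the volume
    return (d2 - d98) / prescription_dose

# Hypothetical PTV dose sample (Gy): tightly clustered doses give HI near 0.
doses = np.array([59.0, 59.5, 60.0, 60.5, 61.0])
print(round(homogeneity_index(doses, 60.0), 3))  # → 0.032
```

An HI near zero indicates a nearly uniform PTV dose, which is why the near-null values reported above favour the dual-energy method.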

  18. Automated coronary artery calcification detection on low-dose chest CT images

    NASA Astrophysics Data System (ADS)

    Xie, Yiting; Cham, Matthew D.; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.

    2014-03-01

    Coronary artery calcification (CAC) measurement from low-dose CT images can be used to assess the risk of coronary artery disease. A fully automatic algorithm to detect and measure CAC from low-dose non-contrast, non-ECG-gated chest CT scans is presented. Based on the automatically detected CAC, the Agatston score (AS), mass score and volume score were computed. These were compared with scores obtained manually from standard-dose ECG-gated scans and low-dose un-gated scans of the same patient. The automatic algorithm segments the heart region based on other pre-segmented organs to provide a coronary region mask. The mitral valve and aortic valve calcification is identified and excluded. All remaining voxels greater than 180HU within the mask region are considered as CAC candidates. The heart segmentation algorithm was evaluated on 400 non-contrast cases with both low-dose and regular dose CT scans. By visual inspection, 371 (92.8%) of the segmentations were acceptable. The automated CAC detection algorithm was evaluated on 41 low-dose non-contrast CT scans. Manual markings were performed on both low-dose and standard-dose scans for these cases. Using linear regression, the correlation of the automatic AS with the standard-dose manual scores was 0.86; with the low-dose manual scores the correlation was 0.91. Standard risk categories were also computed. The automated method risk category agreed with manual markings of gated scans for 24 cases while 15 cases were 1 category off. For low-dose scans, the automatic method agreed with 33 cases while 7 cases were 1 category off.
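
The Agatston score (AS) computed above weights each calcified area by its peak attenuation. A simplified per-slice sketch (standard Agatston scoring thresholds at 130 HU on gated scans, while the study above used 180 HU for low-dose un-gated scans; per-lesion connected-component grouping is omitted here for brevity):

```python
import numpy as np

def agatston_score(slice_hu, pixel_area_mm2, threshold=130):
    """Agatston score contribution of a single axial slice.

    Simplified sketch: treats all calcified pixels in the slice as one
    lesion instead of grouping connected components.
    """
    calcified = slice_hu >= threshold
    if not calcified.any():
        return 0.0
    peak = slice_hu[calcified].max()
    # Density weighting factor from the peak HU of the lesion.
    if peak >= 400:
        weight = 4
    elif peak >= 300:
        weight = 3
    elif peak >= 200:
        weight = 2
    else:
        weight = 1
    area_mm2 = calcified.sum() * pixel_area_mm2
    return area_mm2 * weight

# Hypothetical 2x2 slice (HU) with 0.5 mm^2 pixels: two calcified pixels,
# peak 250 HU gives weight 2, area 1.0 mm^2, so the score is 2.0.
print(agatston_score(np.array([[0, 150], [250, 90]]), 0.5))  # → 2.0
```

Summing the per-slice contributions over the whole scan yields the total AS used for the risk categories mentioned above.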

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pokhrel, D; Sood, S; Badkul, R

    Purpose: To compare dose distributions calculated using PB-hete vs. XVMC algorithms for SRT treatments of cavernous sinus tumors. Methods: Using PB-hete SRT, five patients with cavernous sinus tumors received the prescription dose of 25 Gy in 5 fractions for planning target volume PTV(V100%)=95%. Gross tumor volume (GTV) and organs at risk (OARs) were delineated on T1/T2 MRI-CT-fused images. PTV (range 2.1–84.3cc, mean=21.7cc) was generated using a 5mm uniform margin around the GTV. PB-hete SRT plans included a combination of non-coplanar conformal arcs/static beams delivered by Novalis-TX consisting of HD-MLCs and a 6MV-SRS (1000 MU/min) beam. Plans were re-optimized using the XVMC algorithm with identical beam geometry and MLC positions. Plan-specific PTV(V99%), maximal, mean, and isocenter doses, and total monitor units (MUs) were compared. Maximal doses to OARs such as the brainstem, optic pathway, spinal cord, and lenses, as well as the normal tissue volume receiving 12 Gy (V12), were compared between the two algorithms. All analysis was performed using two-tailed paired t-tests with an upper-bound p-value of <0.05. Results: Using either algorithm, no dosimetrically significant differences in PTV coverage (PTV V99%, maximal, mean, isocenter doses) and total number of MUs were observed (all p-values >0.05, mean ratios within 2%). However, maximal doses to the optic chiasm and nerves were significantly under-predicted using PB-hete (p=0.04). Maximal brainstem, spinal cord, and lens doses and V12 were all comparable between the two algorithms, with the exception of one patient with the largest PTV who exhibited 11% higher V12 with XVMC. Conclusion: Unlike lung tumors, XVMC and PB-hete treatment plans provided similar PTV coverage for cavernous sinus tumors. The majority of OAR doses were comparable between the two algorithms, except for small structures such as the optic chiasm/nerves, which could potentially receive higher doses when using the XVMC algorithm. Special attention may need to be paid on a case-by-case basis when planning for sinus SRT based on tumor size and location relative to OARs, particularly the optic apparatus.

  20. Comparison of normal tissue dose calculation methods for epidemiological studies of radiotherapy patients.

    PubMed

    Mille, Matthew M; Jung, Jae Won; Lee, Choonik; Kuzmin, Gleb A; Lee, Choonsik

    2018-06-01

    Radiation dosimetry is an essential input for epidemiological studies of radiotherapy patients aimed at quantifying the dose-response relationship of late-term morbidity and mortality. Individualised organ dose must be estimated for all tissues of interest located in-field, near-field, or out-of-field. Whereas conventional measurement approaches are limited to points in water or anthropomorphic phantoms, computational approaches using patient images or human phantoms offer greater flexibility and can provide more detailed three-dimensional dose information. In the current study, we systematically compared four different dose calculation algorithms so that dosimetrists and epidemiologists can better understand the advantages and limitations of the various approaches at their disposal. The four dose calculation algorithms considered were as follows: the (1) Analytical Anisotropic Algorithm (AAA) and (2) Acuros XB algorithm (Acuros XB), as implemented in the Eclipse treatment planning system (TPS); (3) a Monte Carlo radiation transport code, EGSnrc; and (4) an accelerated Monte Carlo code, the x-ray Voxel Monte Carlo (XVMC). The four algorithms were compared in terms of their accuracy and appropriateness in the context of dose reconstruction for epidemiological investigations. Accuracy in peripheral dose was evaluated first by benchmarking the calculated dose profiles against measurements in a homogeneous water phantom. Additional simulations in a heterogeneous cylinder phantom evaluated the performance of the algorithms in the presence of tissue heterogeneity. In general, we found that the algorithms contained within the commercial TPS (AAA and Acuros XB) were fast and accurate in-field or near-field, but not acceptable out-of-field. Therefore, the TPS is best suited for epidemiological studies involving large cohorts and where the organs of interest are located in-field or partially in-field.
The EGSnrc and XVMC codes showed excellent agreement with measurements both in-field and out-of-field. The EGSnrc code was the most accurate dosimetry approach, but was too slow to be used for large-scale epidemiological cohorts. The XVMC code showed similar accuracy to EGSnrc, but was significantly faster, and thus epidemiological applications seem feasible, especially when the organs of interest reside far away from the field edge.

  1. Deep Convolutional Framelet Denosing for Low-Dose CT via Wavelet Residual Network.

    PubMed

    Kang, Eunhee; Chang, Won; Yoo, Jaejun; Ye, Jong Chul

    2018-06-01

    Model-based iterative reconstruction algorithms for low-dose X-ray computed tomography (CT) are computationally expensive. To address this problem, we recently proposed a deep convolutional neural network (CNN) for low-dose X-ray CT and won second place in the 2016 AAPM Low-Dose CT Grand Challenge. However, some of the textures were not fully recovered. To address this problem, here we propose a novel framelet-based denoising algorithm using a wavelet residual network which synergistically combines the expressive power of deep learning and the performance guarantee of framelet-based denoising algorithms. The new algorithms were inspired by the recent interpretation of the deep CNN as a cascaded convolution framelet signal representation. Extensive experimental results confirm that the proposed networks have significantly improved performance and preserve the detail texture of the original images.

  2. Improving Cancer Detection and Dose Efficiency in Dedicated Breast Cancer CT

    DTIC Science & Technology

    2010-02-01

    source trajectory and data truncation, which can however be solved with the back-projection filtration (BPF) algorithm [6,7]. I have used the BPF ... high to low radiation dose levels. I have investigated noise properties in images reconstructed by use of the FDK and BPF algorithms at different noise ... analytic algorithms such as the FDK and BPF algorithms are applied to sparse-view data, the reconstruction images will contain artifacts such as streak

  3. Automated algorithm for CBCT-based dose calculations of prostate radiotherapy with bilateral hip prostheses.

    PubMed

    Almatani, Turki; Hugtenburg, Richard P; Lewis, Ryan D; Barley, Susan E; Edwards, Mark A

    2016-10-01

    Cone beam CT (CBCT) images contain more scatter than a conventional CT image and therefore provide inaccurate Hounsfield units (HUs). Consequently, CBCT images cannot be used directly for radiotherapy dose calculation. The aim of this study is to enable dose calculations to be performed with the use of CBCT images taken during radiotherapy and to evaluate the necessity of replanning. A patient with prostate cancer with bilateral metallic prosthetic hip replacements was imaged using both CT and CBCT. The multilevel threshold (MLT) algorithm was used to categorize pixel values in the CBCT images into segments of homogeneous HU. The variation in HU with position in the CBCT images was taken into consideration. This segmentation method relies on the operator dividing the CBCT data into a set of volumes where the variation in the relationship between pixel values and HUs is small. An automated MLT algorithm was developed to reduce the operator time associated with the process. An intensity-modulated radiation therapy plan was generated from CT images of the patient. The plan was then copied to the segmented CBCT (sCBCT) data sets with identical settings, and the doses were recalculated and compared. Gamma evaluation showed that the percentages of points in the rectum with γ < 1 (3%/3 mm) were 98.7% and 97.7% in the sCBCT using the MLT and the automated MLT algorithms, respectively. Compared with the planning CT (pCT) plan, the MLT algorithm showed a -0.46% dose difference with 8 h of operator time, while the automated MLT algorithm showed -1.3%; both are considered clinically acceptable when using the collapsed cone algorithm. The segmentation of CBCT images using the method in this study can be used for dose calculation. For a patient with prostate cancer with bilateral hip prostheses and the associated issues with CT imaging, the MLT algorithms achieved a dose calculation accuracy that is clinically acceptable.
The automated MLT algorithm reduced the operator time associated with implementing the MLT algorithm to achieve clinically acceptable accuracy. This saved time makes the automated MLT algorithm superior and easier to implement in the clinical setting. The MLT algorithm has been extended to the complex example of a patient with bilateral hip prostheses, which with the introduction of automation is feasible for use in adaptive radiotherapy, as an alternative to obtaining a new pCT and reoutlining the structures.
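
The multilevel-threshold idea of binning CBCT pixel values into segments of uniform HU can be sketched with `numpy.digitize`. The thresholds and segment HUs below are hypothetical illustration values; the paper's thresholds are position-dependent and chosen per image region:

```python
import numpy as np

def multilevel_threshold(cbct_pixels, thresholds, hu_values):
    """Map raw CBCT pixel values into segments of uniform HU.

    `thresholds` are pixel-value boundaries between segments and
    `hu_values` the HU assigned to each segment. Both are hypothetical
    illustration values, not the paper's calibrated thresholds.
    """
    segment = np.digitize(cbct_pixels, thresholds)
    return np.asarray(hu_values)[segment]

# Hypothetical four-segment mapping: air / soft tissue / bone / metal.
pixels = np.array([-950.0, 20.0, 600.0, 2500.0])
print(multilevel_threshold(pixels, [-400.0, 300.0, 1500.0],
                           [-1000, 0, 700, 3000]))
```

The resulting uniform-HU volume can then be passed to the TPS dose engine in place of the raw CBCT.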

  4. Toward adaptive radiotherapy for head and neck patients: Uncertainties in dose warping due to the choice of deformable registration algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veiga, Catarina, E-mail: catarina.veiga.11@ucl.ac.uk; Royle, Gary; Lourenço, Ana Mónica

    2015-02-15

    Purpose: The aims of this work were to evaluate the performance of several deformable image registration (DIR) algorithms implemented in our in-house software (NiftyReg) and the uncertainties inherent to using different algorithms for dose warping. Methods: The authors describe a DIR based adaptive radiotherapy workflow, using CT and cone-beam CT (CBCT) imaging. The transformations that mapped the anatomy between the two time points were obtained using four different DIR approaches available in NiftyReg. These included a standard unidirectional algorithm and more sophisticated bidirectional ones that encourage or ensure inverse consistency. The forward (CT-to-CBCT) deformation vector fields (DVFs) were used to propagate the CT Hounsfield units and structures to the daily geometry for “dose of the day” calculations, while the backward (CBCT-to-CT) DVFs were used to remap the dose of the day onto the planning CT (pCT). Data from five head and neck patients were used to evaluate the performance of each implementation based on geometrical matching, physical properties of the DVFs, and similarity between warped dose distributions. Geometrical matching was verified in terms of dice similarity coefficient (DSC), distance transform, false positives, and false negatives. The physical properties of the DVFs were assessed by calculating the harmonic energy, determinant of the Jacobian, and inverse consistency error of the transformations. Dose distributions were displayed on the pCT dose space and compared using dose difference (DD), distance to dose difference, and dose volume histograms. Results: All the DIR algorithms gave similar results in terms of geometrical matching, with an average DSC of 0.85 ± 0.08, but the underlying properties of the DVFs varied in terms of smoothness and inverse consistency. When comparing the doses warped by different algorithms, we found a root mean square DD of 1.9% ± 0.8% of the prescribed dose (pD) and that an average of 9% ± 4% of voxels within the treated volume failed a 2%pD DD-test (DD2%-pp). Larger DD2%-pp was found within the high dose gradient (21% ± 6%) and regions where the CBCT quality was poorer (28% ± 9%). The differences when estimating the mean and maximum dose delivered to organs-at-risk were up to 2.0%pD and 2.8%pD, respectively. Conclusions: The authors evaluated several DIR algorithms for CT-to-CBCT registrations. In spite of all methods resulting in comparable geometrical matching, the choice of DIR implementation leads to uncertainties in the warped dose, particularly in regions of high gradient and/or poor imaging quality.
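
The dice similarity coefficient (DSC) used above for geometrical matching is 2|A∩B| / (|A| + |B|); a minimal sketch on hypothetical binary contour masks:

```python
import numpy as np

def dice_similarity(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks: 2|A∩B|/(|A|+|B|)."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty structures are trivially identical
    return 2.0 * np.logical_and(a, b).sum() / denom

# Hypothetical reference vs propagated contour on a tiny grid.
ref = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 0]])
prop = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 1]])
print(round(dice_similarity(ref, prop), 2))  # → 0.75
```

A DSC of 1 means perfect overlap and 0 means none, so the 0.85 ± 0.08 average reported above indicates close but imperfect agreement between the four DIR implementations.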

  5. Quantitative Image Quality and Histogram-Based Evaluations of an Iterative Reconstruction Algorithm at Low-to-Ultralow Radiation Dose Levels: A Phantom Study in Chest CT

    PubMed Central

    Lee, Ki Baek

    2018-01-01

    Objective To describe the quantitative image quality and histogram-based evaluation of an iterative reconstruction (IR) algorithm in chest computed tomography (CT) scans at low-to-ultralow CT radiation dose levels. Materials and Methods In an adult anthropomorphic phantom, chest CT scans were performed with 128-section dual-source CT at 70, 80, 100, 120, and 140 kVp, at the reference (3.4 mGy in volume CT Dose Index [CTDIvol]) and 30%-, 60%-, and 90%-reduced radiation dose levels (2.4, 1.4, and 0.3 mGy). The CT images were reconstructed using filtered back projection (FBP) algorithms and the IR algorithm with strengths 1, 3, and 5. Image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were statistically compared between different dose levels, tube voltages, and reconstruction algorithms. Moreover, histograms of subtraction images before and after standardization in x- and y-axes were visually compared. Results Compared with FBP images, IR images with strengths 1, 3, and 5 demonstrated image noise reduction up to 49.1%, SNR increase up to 100.7%, and CNR increase up to 67.3%. Noteworthy image quality degradations on IR images, including a 184.9% increase in image noise, a 63.0% decrease in SNR, and a 51.3% decrease in CNR, were shown between the 60%- and 90%-reduced levels of radiation dose (p < 0.0001). Subtraction histograms between FBP and IR images showed progressively increased dispersion with increased IR strength and increased dose reduction. After standardization, the histograms appeared deviated and ragged between FBP images and IR images with strength 3 or 5, but almost normally distributed between FBP images and IR images with strength 1. Conclusion The IR algorithm may be used to save radiation doses without substantial image quality degradation in chest CT scanning of the adult anthropomorphic phantom, down to approximately 1.4 mGy in CTDIvol (60% reduced dose). PMID:29354008
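
The noise, SNR, and CNR figures above come from region-of-interest statistics. Exact conventions vary between studies, so the sketch below states its assumptions explicitly (hypothetical HU samples, noise taken as the background standard deviation):

```python
import numpy as np

def roi_metrics(signal_roi, background_roi):
    """Image noise, SNR and CNR from two regions of interest.

    Assumed conventions (they vary between papers): noise is the
    background standard deviation, SNR = mean(signal) / noise, and
    CNR = (mean(signal) - mean(background)) / noise.
    """
    noise = background_roi.std(ddof=1)
    snr = signal_roi.mean() / noise
    cnr = (signal_roi.mean() - background_roi.mean()) / noise
    return noise, snr, cnr

# Hypothetical HU samples from a tissue ROI and an air/background ROI.
signal = np.array([100.0, 102.0, 98.0, 100.0])
background = np.array([0.0, 2.0, -2.0, 0.0])
noise, snr, cnr = roi_metrics(signal, background)
```

Percentage changes such as the 49.1% noise reduction quoted above are then simple ratios of these metrics between the FBP and IR reconstructions.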

  6. Evaluation of an analytic linear Boltzmann transport equation solver for high-density inhomogeneities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S. A. M.; Ansbacher, W.; Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6

    2013-01-15

    Purpose: Acuros external beam (Acuros XB) is a novel dose calculation algorithm implemented through the ECLIPSE treatment planning system. The algorithm finds a deterministic solution to the linear Boltzmann transport equation, the same equation commonly solved stochastically by Monte Carlo methods. This work is an evaluation of Acuros XB, by comparison with Monte Carlo, for dose calculation applications involving high-density materials. Existing non-Monte Carlo clinical dose calculation algorithms, such as the analytic anisotropic algorithm (AAA), do not accurately model dose perturbations due to increased electron scatter within high-density volumes. Methods: Acuros XB, AAA, and EGSnrc-based Monte Carlo are used to calculate dose distributions from 18 MV and 6 MV photon beams delivered to a cubic water phantom containing a rectangular high-density (4.0-8.0 g/cm³) volume at its center. The algorithms are also used to recalculate a clinical prostate treatment plan involving a unilateral hip prosthesis, originally evaluated using AAA. These results are compared graphically and numerically using gamma-index analysis. Radio-chromic film measurements are presented to augment Monte Carlo and Acuros XB dose perturbation data. Results: Using a 2% and 1 mm gamma-analysis, between 91.3% and 96.8% of Acuros XB dose voxels containing greater than 50% of the normalized dose were in agreement with Monte Carlo data for virtual phantoms involving 18 MV and 6 MV photons, stainless steel and titanium alloy implants, and on-axis and oblique field delivery. A similar gamma-analysis of AAA against Monte Carlo data showed between 80.8% and 87.3% agreement. Comparing Acuros XB and AAA evaluations of a clinical prostate patient plan involving a unilateral hip prosthesis, Acuros XB showed good overall agreement with Monte Carlo while AAA underestimated dose on the upstream medial surface of the prosthesis due to electron scatter from the high-density material. Film measurements support the dose perturbations demonstrated by Monte Carlo and Acuros XB data. Conclusions: Acuros XB is shown to perform as well as Monte Carlo methods and better than existing clinical algorithms for dose calculations involving high-density volumes.
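
Gamma-index analysis, as in the 2% / 1 mm criterion above, combines dose difference and distance-to-agreement into one pass/fail metric per point. A brute-force 1D sketch under stated simplifications (global normalization to the reference maximum, no interpolation between points; clinical tools operate in 3D):

```python
import numpy as np

def gamma_pass_rate(dose_eval, dose_ref, positions, dose_tol, dist_tol):
    """1D global gamma analysis between two dose profiles.

    `dose_tol` is a fraction of the reference maximum (e.g. 0.02 for 2%)
    and `dist_tol` is in the same units as `positions` (e.g. mm).
    Simplified illustration, not a clinical implementation.
    """
    dref_max = dose_ref.max()
    passed = 0
    for x_i, d in zip(positions, dose_eval):
        dd = (dose_ref - d) / (dose_tol * dref_max)   # normalized dose diff
        dx = (positions - x_i) / dist_tol             # normalized distance
        gamma = np.sqrt(dd ** 2 + dx ** 2).min()      # best match anywhere
        passed += gamma <= 1.0
    return passed / len(dose_eval)

# Identical profiles pass everywhere.
profile = np.array([1.00, 0.95, 0.50, 0.10])
x = np.array([0.0, 1.0, 2.0, 3.0])
print(gamma_pass_rate(profile, profile, x, dose_tol=0.02, dist_tol=1.0))  # → 1.0
```

The agreement percentages quoted above are exactly such pass rates, restricted to voxels above 50% of the normalized dose.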

  7. Quantifying the effect of air gap, depth, and range shifter thickness on TPS dosimetric accuracy in superficial PBS proton therapy.

    PubMed

    Shirey, Robert J; Wu, Hsinshun Terry

    2018-01-01

    This study quantifies the dosimetric accuracy of a commercial treatment planning system as functions of treatment depth, air gap, and range shifter thickness for superficial pencil beam scanning proton therapy treatments. The RayStation 6 pencil beam and Monte Carlo dose engines were each used to calculate the dose distributions for a single treatment plan with varying range shifter air gaps. Central axis dose values extracted from each of the calculated plans were compared to dose values measured with a calibrated PTW Markus chamber at various depths in RW3 solid water. Dose was measured at 12 depths, ranging from the surface to 5 cm, for each of the 18 different air gaps, which ranged from 0.5 to 28 cm. TPS dosimetric accuracy, defined as the ratio of calculated dose relative to the measured dose, was plotted as functions of depth and air gap for the pencil beam and Monte Carlo dose algorithms. The accuracy of the TPS pencil beam dose algorithm was found to be clinically unacceptable at depths shallower than 3 cm with air gaps wider than 10 cm, and increased range shifter thickness only added to the dosimetric inaccuracy of the pencil beam algorithm. Each configuration calculated with Monte Carlo was determined to be clinically acceptable. Further comparisons of the Monte Carlo dose algorithm to the measured spread-out Bragg Peaks of multiple fields used during machine commissioning verified the dosimetric accuracy of Monte Carlo in a variety of beam energies and field sizes. Discrepancies between measured and TPS calculated dose values can mainly be attributed to the ability (or lack thereof) of the TPS pencil beam dose algorithm to properly model secondary proton scatter generated in the range shifter. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  8. Assessment of dedicated low-dose cardiac micro-CT reconstruction algorithms using the left ventricular volume of small rodents as a performance measure.

    PubMed

    Maier, Joscha; Sawall, Stefan; Kachelrieß, Marc

    2014-05-01

    Phase-correlated microcomputed tomography (micro-CT) imaging plays an important role in the assessment of mouse models of cardiovascular diseases and the determination of functional parameters such as the left ventricular volume. As the current gold standard, the phase-correlated Feldkamp reconstruction (PCF), shows poor performance in the case of low-dose scans, more sophisticated reconstruction algorithms have been proposed to enable low-dose imaging. In this study, the authors focus on the McKinnon-Bates (MKB) algorithm, the low dose phase-correlated (LDPC) reconstruction, and the high-dimensional total variation minimization reconstruction (HDTV) and investigate their potential to accurately determine the left ventricular volume at different dose levels from 50 to 500 mGy. The results were verified in phantom studies of a five-dimensional (5D) mathematical mouse phantom. Micro-CT data of eight mice, each administered with an x-ray dose of 500 mGy, were acquired, retrospectively gated for cardiac and respiratory motion and reconstructed using PCF, MKB, LDPC, and HDTV. Dose levels down to 50 mGy were simulated by using only a fraction of the projections. Contrast-to-noise ratio (CNR) was evaluated as a measure of image quality. Left ventricular volume was determined using different segmentation algorithms (Otsu, level sets, region growing). Forward projections of the 5D mouse phantom were performed to simulate a micro-CT scan. The simulated data were processed the same way as the real mouse data sets. Compared to the conventional PCF reconstruction, the MKB, LDPC, and HDTV algorithms yield images of increased quality in terms of CNR. While the MKB reconstruction only provides small improvements, a significant increase of the CNR is observed in LDPC and HDTV reconstructions. The phantom studies demonstrate that left ventricular volumes can be determined accurately at 500 mGy. For the lower dose levels which were simulated for real mouse data sets, the HDTV algorithm shows the best performance. At 50 mGy, the deviation from the reference obtained at 500 mGy was less than 4%. The LDPC algorithm also provides reasonable results, with deviations of less than 10% at 50 mGy, while the PCF and MKB reconstructions show larger deviations even at higher dose levels. LDPC and HDTV increase CNR and allow for quantitative evaluations even at dose levels as low as 50 mGy. The left ventricular volume results illustrate that cardiac parameters can be accurately estimated at the lowest dose levels if sophisticated algorithms are used. This allows dose to be reduced by a factor of 10 compared with today's gold standard and opens new options for longitudinal studies of the heart.
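
Of the segmentation algorithms named above (Otsu, level sets, region growing), Otsu's method is the most compact to sketch: it picks the histogram threshold that maximises between-class variance. A minimal illustration on synthetic data, not the authors' implementation:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's threshold: maximise between-class variance of a histogram."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                      # class-0 probability up to each bin
    w1 = 1.0 - w0                          # class-1 probability
    mu = np.cumsum(p * centers)            # cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = w0 * w1 * ((mu / w0 - (mu_total - mu) / w1) ** 2)
    between = np.nan_to_num(between)       # empty classes contribute nothing
    return centers[np.argmax(between)]

# Synthetic bimodal "image": the threshold cleanly separates the two classes.
values = np.concatenate([np.zeros(100), np.full(100, 10.0)])
t = otsu_threshold(values, bins=10)
```

Thresholding the reconstructed ventricle images this way (then counting voxels times voxel volume) is one simple route to the left ventricular volumes compared above.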

  9. Assessment of dedicated low-dose cardiac micro-CT reconstruction algorithms using the left ventricular volume of small rodents as a performance measure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maier, Joscha, E-mail: joscha.maier@dkfz.de; Sawall, Stefan; Kachelrieß, Marc

    2014-05-15

    Purpose: Phase-correlated microcomputed tomography (micro-CT) imaging plays an important role in the assessment of mouse models of cardiovascular diseases and the determination of functional parameters as the left ventricular volume. As the current gold standard, the phase-correlated Feldkamp reconstruction (PCF), shows poor performance in case of low dose scans, more sophisticated reconstruction algorithms have been proposed to enable low-dose imaging. In this study, the authors focus on the McKinnon-Bates (MKB) algorithm, the low dose phase-correlated (LDPC) reconstruction, and the high-dimensional total variation minimization reconstruction (HDTV) and investigate their potential to accurately determine the left ventricular volume at different dose levelsmore » from 50 to 500 mGy. The results were verified in phantom studies of a five-dimensional (5D) mathematical mouse phantom. Methods: Micro-CT data of eight mice, each administered with an x-ray dose of 500 mGy, were acquired, retrospectively gated for cardiac and respiratory motion and reconstructed using PCF, MKB, LDPC, and HDTV. Dose levels down to 50 mGy were simulated by using only a fraction of the projections. Contrast-to-noise ratio (CNR) was evaluated as a measure of image quality. Left ventricular volume was determined using different segmentation algorithms (Otsu, level sets, region growing). Forward projections of the 5D mouse phantom were performed to simulate a micro-CT scan. The simulated data were processed the same way as the real mouse data sets. Results: Compared to the conventional PCF reconstruction, the MKB, LDPC, and HDTV algorithm yield images of increased quality in terms of CNR. While the MKB reconstruction only provides small improvements, a significant increase of the CNR is observed in LDPC and HDTV reconstructions. The phantom studies demonstrate that left ventricular volumes can be determined accurately at 500 mGy. 
For lower dose levels, which were simulated for the real mouse data sets, the HDTV algorithm shows the best performance. At 50 mGy, the deviation from the reference obtained at 500 mGy was less than 4%. The LDPC algorithm also provides reasonable results, with deviations of less than 10% at 50 mGy, while the PCF and MKB reconstructions show larger deviations even at higher dose levels. Conclusions: LDPC and HDTV increase CNR and allow for quantitative evaluations even at dose levels as low as 50 mGy. The left ventricular volumes illustrate that cardiac parameters can be accurately estimated at the lowest dose levels if sophisticated algorithms are used. This allows the dose to be reduced by a factor of 10 compared to today's gold standard and opens new options for longitudinal studies of the heart.
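The CNR figure of merit used in this study can be computed from two regions of interest in a single image. A minimal sketch in Python follows; the disk phantom, noise model, and masks are illustrative, not taken from the study's data:

```python
import numpy as np

def cnr(image, signal_mask, background_mask):
    """Contrast-to-noise ratio: contrast between the two regions
    divided by the standard deviation (noise) of the background."""
    contrast = image[signal_mask].mean() - image[background_mask].mean()
    return abs(contrast) / image[background_mask].std()

# Toy image: a disk of extra signal on a noisy background.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (64, 64))
yy, xx = np.mgrid[:64, :64]
disk = (yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2
img[disk] += 5.0
print(cnr(img, disk, ~disk))  # roughly 5 for this contrast/noise level
```

Measuring noise in the background region only is one common convention; reconstruction-algorithm comparisons like the one above simply evaluate this ratio on matched regions across reconstructions.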

  10. Monte Carlo evaluation of Acuros XB dose calculation algorithm for intensity modulated radiation therapy of nasopharyngeal carcinoma

    NASA Astrophysics Data System (ADS)

    Yeh, Peter C. Y.; Lee, C. C.; Chao, T. C.; Tung, C. J.

    2017-11-01

    Intensity-modulated radiation therapy is an effective treatment modality for nasopharyngeal carcinoma. One important aspect of this cancer treatment is the need for an accurate dose algorithm that can deal with the complex air/bone/tissue interfaces in the head-neck region, so as to achieve cure without radiation-induced toxicities. The Acuros XB algorithm explicitly solves the linear Boltzmann transport equation in voxelized volumes to account for tissue heterogeneities such as lungs, bone, air, and soft tissues in the treatment field. With single-beam setups in phantoms, this algorithm has already been demonstrated to achieve accuracy comparable to Monte Carlo simulations. In the present study, five nasopharyngeal carcinoma patients treated with intensity-modulated radiation therapy were examined for their dose distributions calculated using Acuros XB in the planning target volume and the organs-at-risk. Corresponding results of Monte Carlo simulations were computed from the electronic portal image data and the BEAMnrc/DOSXYZnrc code. Analysis of dose distributions in terms of the clinical indices indicated that Acuros XB was comparable in accuracy to Monte Carlo simulations and better than the anisotropic analytical algorithm for dose calculations in real patients.

  11. Comparison of Nine Statistical Model Based Warfarin Pharmacogenetic Dosing Algorithms Using the Racially Diverse International Warfarin Pharmacogenetic Consortium Cohort Database

    PubMed Central

    Liu, Rong; Li, Xi; Zhang, Wei; Zhou, Hong-Hao

    2015-01-01

    Objective Multiple linear regression (MLR) and machine learning techniques in pharmacogenetic algorithm-based warfarin dosing have been reported. However, the performances of these algorithms in racially diverse groups have never been objectively evaluated and compared. In this literature-based study, we compared the performances of eight machine learning techniques with that of MLR in a large, racially diverse cohort. Methods MLR, artificial neural network (ANN), regression tree (RT), multivariate adaptive regression splines (MARS), boosted regression tree (BRT), support vector regression (SVR), random forest regression (RFR), lasso regression (LAR) and Bayesian additive regression trees (BART) were applied to warfarin dose algorithms in a cohort from the International Warfarin Pharmacogenetics Consortium database. Covariates obtained by stepwise regression from 80% of randomly selected patients were used to develop the algorithms. To compare the performances of these algorithms, the mean percentage of patients whose predicted dose fell within 20% of the actual dose (mean percentage within 20%) and the mean absolute error (MAE) were calculated in the remaining 20% of patients. The performances of these techniques in different races, as well as across the therapeutic warfarin dose ranges, were compared. Robust results were obtained after 100 rounds of resampling. Results BART, MARS and SVR were statistically indistinguishable and significantly outperformed all the other approaches in the whole cohort (MAE: 8.84–8.96 mg/week, mean percentage within 20%: 45.88%–46.35%). In the White population, MARS and BART showed a higher mean percentage within 20% and a lower MAE than MLR (all p values < 0.05). In the Asian population, SVR, BART, MARS and LAR performed the same as MLR. MLR and LAR performed best in the Black population.
When patients were grouped in terms of warfarin dose range, all machine learning techniques except ANN and LAR showed a significantly higher mean percentage within 20% and a lower MAE (all p values < 0.05) than MLR in the low- and high-dose ranges. Conclusion Overall, the machine learning-based techniques BART, MARS and SVR performed better than MLR in warfarin pharmacogenetic dosing. Differences in algorithm performance exist among races. Moreover, machine learning-based algorithms tended to perform better than MLR in the low- and high-dose ranges. PMID:26305568
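The two evaluation metrics used here, MAE and the percentage of patients predicted within 20% of the actual dose, are simple to state directly; the dose values below are invented for illustration:

```python
import numpy as np

def dose_metrics(predicted, actual):
    """Return (MAE, fraction of patients whose predicted dose falls
    within +/-20% of the actual stable dose)."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    err = np.abs(predicted - actual)
    return err.mean(), np.mean(err <= 0.2 * actual)

# Hypothetical weekly warfarin doses in mg/week.
actual = [35.0, 28.0, 42.0, 21.0]
predicted = [30.0, 30.0, 55.0, 20.0]
mae, within20 = dose_metrics(predicted, actual)
print(mae, within20)  # -> 5.25 0.75
```

Note that the 20% band is relative to each patient's actual dose, so the same absolute error can pass for a high-dose patient and fail for a low-dose one, which is why the low-dose range is where algorithms tend to diverge.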

  12. The impact of low-Z and high-Z metal implants in IMRT: A Monte Carlo study of dose inaccuracies in commercial dose algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spadea, Maria Francesca, E-mail: mfspadea@unicz.it; Verburg, Joost Mathias; Seco, Joao

    2014-01-15

    Purpose: The aim of the study was to evaluate the dosimetric impact of low-Z and high-Z metallic implants on IMRT plans. Methods: Computed tomography (CT) scans of three patients were analyzed to study effects due to the presence of titanium (low-Z), platinum and gold (high-Z) inserts. To eliminate artifacts in the CT images, a sinogram-based metal artifact reduction algorithm was applied. IMRT dose calculations were performed on both the uncorrected and corrected images using a commercial planning system (convolution/superposition algorithm) and an in-house Monte Carlo platform. Dose differences between uncorrected and corrected datasets were computed and analyzed using the gamma index (Pγ<1), setting 2 mm and 2% as the distance-to-agreement and dose-difference criteria, respectively. Beam-specific depth dose profiles across the metal were also examined. Results: Dose discrepancies between corrected and uncorrected datasets were not significant for the low-Z material. High-Z materials caused underdosage of 20%–25% in the region surrounding the metal and overdosage of 10%–15% downstream of the hardware. The gamma index test yielded Pγ<1 > 99% for all low-Z cases, while for high-Z cases it returned 91% < Pγ<1 < 99%. Analysis of the depth dose curve of a single beam for low-Z cases revealed that, although the dose attenuation is altered inside the metal, it does not differ downstream of the insert. However, for high-Z metal implants the dose is increased by up to 10%–12% around the insert. In addition, the Monte Carlo method was more sensitive to the presence of metal inserts than the superposition/convolution algorithm. Conclusions: The reduction of metal artifacts in CT images is dosimetrically relevant for high-Z implants. In this case, dose distributions should be calculated using Monte Carlo algorithms, given their superior accuracy in dose modeling in and around the metal.
In addition, knowledge of the composition of the metal inserts significantly improves the accuracy of the Monte Carlo dose calculation.
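The 2%/2 mm gamma-index comparison used above (and in several other entries in this listing) can be sketched in one dimension. This simplified version evaluates only at grid points (a full implementation interpolates between them) and uses global normalization to the reference maximum; the profiles are illustrative:

```python
import numpy as np

def gamma_1d(ref, evl, spacing_mm, dd=0.02, dta_mm=2.0):
    """Simplified global 1D gamma index: dd is the dose criterion as a
    fraction of the reference maximum, dta_mm the distance criterion.
    Evaluated at grid points only (no interpolation)."""
    x = np.arange(len(ref)) * spacing_mm
    dose_norm = dd * ref.max()
    gammas = np.empty(len(ref))
    for i in range(len(ref)):
        dist_term = ((x - x[i]) / dta_mm) ** 2
        dose_term = ((evl - ref[i]) / dose_norm) ** 2
        gammas[i] = np.sqrt((dist_term + dose_term).min())
    return gammas

# A 1 mm shift of a profile sampled on a 1 mm grid should largely pass 2%/2 mm.
ref = np.array([0., 10., 50., 100., 100., 50., 10., 0.])
g = gamma_1d(ref, np.roll(ref, 1), spacing_mm=1.0)
print((g <= 1).mean())  # -> 0.875 (one edge point fails)
```

A point passes when some nearby evaluated point is simultaneously close in position and in dose; the pass rate Pγ<1 reported above is the fraction of points with gamma at or below unity.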

  13. TU-G-204-09: The Effects of Reduced-Dose Lung Cancer Screening CT On Lung Nodule Detection Using a CAD Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S; Lo, P; Kim, G

    2015-06-15

    Purpose: While lung cancer screening CT is performed at low doses, the purpose of this study was to investigate the effects of further reducing the dose on the performance of a CAD nodule-detection algorithm. Methods: We selected 50 cases from our local database of National Lung Screening Trial (NLST) patients for which we had both the image series and the raw CT data from the original scans. All scans were acquired with fixed mAs (25 for standard-sized patients, 40 for large patients) on a 64-slice scanner (Sensation 64, Siemens Healthcare). All images were reconstructed with 1-mm slice thickness and a B50 kernel. Ten of the cases had at least one nodule reported on the NLST reader forms. Based on a previously published technique, we added noise to the raw data to simulate reduced-dose versions of each case at 50% and 25% of the original NLST dose (i.e., approximately 1.0 and 0.5 mGy CTDIvol). For each case at each dose level, the CAD detection algorithm was run and nodules greater than 4 mm in diameter were reported. These CAD results were compared to “truth”, defined as the approximate nodule centroids from the NLST reports. Subject-level mean sensitivities and false-positive rates were calculated for each dose level. Results: The mean sensitivities of the CAD algorithm were 35% at the original dose, 20% at 50% dose, and 42.5% at 25% dose. The false-positive rates, in decreasing-dose order, were 3.7, 2.9, and 10 per case. In certain cases, particularly in larger patients, there were severe photon-starvation artifacts, especially in the apical region due to the highly attenuating shoulders. Conclusion: The detection task was challenging for the CAD algorithm at all dose levels, including the original NLST dose. However, the false-positive rate at 25% dose approximately tripled, suggesting a loss of CAD robustness somewhere between 0.5 and 1.0 mGy.
NCI grant U01 CA181156 (Quantitative Imaging Network); Tobacco Related Disease Research Project grant 22RT-0131.

  14. SU-F-T-428: An Optimization-Based Commissioning Tool for Finite Size Pencil Beam Dose Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Tian, Z; Song, T

    Purpose: Finite size pencil beam (FSPB) algorithms are commonly used to pre-calculate the beamlet dose distribution for IMRT treatment planning. FSPB commissioning, which usually requires fine tuning of the FSPB kernel parameters, is crucial to dose calculation accuracy and hence to plan quality. Yet due to the large number of beamlets, FSPB commissioning can be very tedious. This abstract reports an optimization-based FSPB commissioning tool we have developed in MATLAB to facilitate commissioning. Methods: An FSPB dose kernel generally contains two types of parameters: the profile parameters determining the dose kernel shape, and a 2D array of scaling factors accounting for longitudinal and off-axis corrections. The former were fitted to the penumbra of a reference broad beam's dose profile using the Levenberg-Marquardt algorithm. Since the dose distribution of a broad beam is simply a linear superposition of the dose kernels of its beamlets, calculated with the fitted profile parameters and scaled using the scaling factors, these factors can be determined by solving an optimization problem that minimizes the discrepancies between the calculated dose of broad beams and the reference dose. Results: We have commissioned an FSPB algorithm for three linac photon beams (6 MV, 15 MV and 6 MV FFF). Doses for four field sizes (6×6, 10×10, 15×15 and 20×20 cm²) were calculated and compared with the reference dose exported from the Eclipse TPS. For depth dose curves, the differences are less than 1% of the maximum dose beyond the depth of maximum dose for most cases. For lateral dose profiles, the differences are less than 2% of the central dose in inner-beam regions. The differences in output factors are within 1% for all three beams. Conclusion: We have developed an optimization-based commissioning tool for FSPB algorithms, providing sufficient beamlet dose calculation accuracy for IMRT optimization.

  15. National dosimetric audit network finds discrepancies in AAA lung inhomogeneity corrections.

    PubMed

    Dunn, Leon; Lehmann, Joerg; Lye, Jessica; Kenny, John; Kron, Tomas; Alves, Andrew; Cole, Andrew; Zifodya, Jackson; Williams, Ivan

    2015-07-01

    This work presents the Australian Clinical Dosimetry Service's (ACDS) findings from an investigation of systematic discrepancies between treatment planning system (TPS) calculated and measured audit doses. Specifically, a comparison between the Anisotropic Analytical Algorithm (AAA) and other common dose-calculation algorithms in regions downstream (≥2 cm) from low-density material in anthropomorphic and slab phantom geometries is presented. Two measurement setups involving rectilinear slab phantoms (ACDS Level II audit) and anthropomorphic geometries (ACDS Level III audit) were used in conjunction with ion chamber (planar 2D array and Farmer-type) measurements. Measured doses were compared to calculated doses for a variety of cases, with and without the presence of inhomogeneities and beam modifiers, in 71 audits. Results demonstrate a systematic AAA underdose with an average discrepancy of 2.9 ± 1.2% in regions distal from lung-tissue interfaces when lateral beams are used with anthropomorphic phantoms. This systematic discrepancy was found for all Level III audits of facilities using the AAA algorithm. The discrepancy is not seen when identical measurements are compared for other common dose-calculation algorithms (average discrepancy -0.4 ± 1.7%), including the Acuros XB algorithm also available with the Eclipse TPS. For slab-phantom geometries (Level II audits) with similar measurement points downstream from inhomogeneities, the discrepancy is also not seen. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  16. Single-dose volume regulation algorithm for a gas-compensated intrathecal infusion pump.

    PubMed

    Nam, Kyoung Won; Kim, Kwang Gi; Sung, Mun Hyun; Choi, Seong Wook; Kim, Dae Hyun; Jo, Yung Ho

    2011-01-01

    The internal pressure of the medication reservoir of a gas-compensated intrathecal medication infusion pump decreases as medication is discharged, and these discharge-induced pressure drops can reduce the volume of medication discharged. To prevent these reductions, the volumes discharged must be adjusted to maintain the required dosage levels. In this study, the authors developed an automatic control algorithm that regulates single-dose volumes for an intrathecal infusion pump developed by the Korean National Cancer Center. The proposed algorithm estimates the amount of medication remaining and adjusts control parameters automatically to maintain single-dose volumes at predetermined levels. Experimental results demonstrated that the proposed algorithm can regulate mean single-dose volumes with a variation of <3% and estimate the remaining medication volume with an accuracy of >98%. © 2010, Copyright the Authors. Artificial Organs © 2010, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
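The abstract does not give the control law, but the idea of compensating the pressure-induced drop in discharged volume can be illustrated with a hypothetical proportional controller. The decay rate, gain, and normalized units below are invented for the sketch and are not the actual device algorithm:

```python
def adjust_open_time(target_volume, measured_volume, open_time, gain=1.0):
    """Hypothetical proportional correction: lengthen the valve-open
    time as the measured single-dose volume falls below the target."""
    error = target_volume - measured_volume
    return open_time * (1.0 + gain * error / target_volume)

def simulate(n_doses, target=1.0, open_time=1.0):
    """Reservoir whose flow rate decays ~2% per dose as the
    compensating gas pressure drops."""
    flow, volumes = 1.0, []
    for _ in range(n_doses):
        v = flow * open_time              # volume discharged this dose
        volumes.append(v)
        open_time = adjust_open_time(target, v, open_time)
        flow *= 0.98                      # pressure-induced flow loss
    return volumes

vols = simulate(20)
print(min(vols), max(vols))  # the controller holds doses near the target
```

Without the correction, the discharged volume in this toy model would simply decay with the flow rate; with it, each dose's shortfall feeds back into a longer valve-open time.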

  17. A simplified analytical random walk model for proton dose calculation

    NASA Astrophysics Data System (ADS)

    Yao, Weiguang; Merchant, Thomas E.; Farr, Jonathan B.

    2016-10-01

    We propose an analytical random walk model for proton dose calculation in a laterally homogeneous medium. A formula for the spatial fluence distribution of primary protons is derived; the variance of the spatial distribution follows a distance-squared law in the variance of the angular distribution. To improve the accuracy of dose calculation in the Bragg peak region, the energy spectrum of the protons is used. The accuracy is validated against Monte Carlo simulation in water phantoms with either air gaps or a slab of bone inserted. The algorithm accurately reflects the dependence of the dose on the depth of the bone and can deal with small-field dosimetry. We further applied the algorithm to patient cases in the highly heterogeneous head and pelvis sites and used a gamma test to show the reasonable accuracy of the algorithm in these sites. The algorithm is fast enough for clinical use.

  18. Comparison of optimization algorithms in intensity-modulated radiation therapy planning

    NASA Astrophysics Data System (ADS)

    Kendrick, Rachel

    Intensity-modulated radiation therapy is used to better conform the radiation dose to the target while avoiding healthy tissue. Planning programs employ optimization methods to search for the best fluence of each photon beam and thereby create the best treatment plan. The Computational Environment for Radiotherapy Research (CERR), a program written in MATLAB, was used to examine some commonly used algorithms for one 5-beam plan. The algorithms include the genetic algorithm, quadratic programming, pattern search, constrained nonlinear optimization, simulated annealing, the optimization method used in Varian Eclipse™, and some hybrids of these. Quadratic programming, simulated annealing, and a quadratic/simulated-annealing hybrid were also compared separately using different prescription doses. The dose-volume histograms as well as the visual dose color washes were used to compare the plans. CERR's built-in quadratic programming provided the best overall plan, but its avoidance of the organ-at-risk was rivaled by other programs. Hybrids of quadratic programming with some of these algorithms suggest the possibility of better planning programs, as shown by the improved quadratic/simulated-annealing plan compared to the simulated annealing algorithm alone. Further experimentation will be done to improve cost functions and computational time.
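Of the methods compared, the quadratic-programming formulation is the simplest to sketch: minimize a quadratic dose-mismatch cost over non-negative beamlet fluences. The dose-influence matrix, prescription, and projected-gradient solver below are toy stand-ins, not CERR's actual implementation:

```python
import numpy as np

# Dose-influence matrix: rows are voxels, columns are beamlets (toy numbers).
D = np.array([[1.0, 0.2],    # target voxel
              [0.2, 1.0],    # target voxel
              [0.3, 0.3]])   # organ-at-risk voxel
d_presc = np.array([60.0, 60.0, 10.0])  # prescribed / limiting doses

# Minimize ||D @ w - d_presc||^2 subject to w >= 0 by projected gradient
# descent; the step size is kept below 1/L for stability.
L = np.linalg.norm(D.T @ D, 2)          # largest eigenvalue of the Hessian
w = np.zeros(2)
for _ in range(5000):
    w = np.maximum(w - (0.5 / L) * (D.T @ (D @ w - d_presc)), 0.0)
print(np.round(D @ w, 2))  # target voxels near prescription, OAR traded off
```

Even in this two-beamlet toy, the solution illustrates the central tension of the section: the quadratic cost trades target coverage against organ-at-risk dose, which is exactly where the compared algorithms differed.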

  19. Efficient implementation of the 3D-DDA ray traversal algorithm on GPU and its application in radiation dose calculation.

    PubMed

    Xiao, Kai; Chen, Danny Z; Hu, X Sharon; Zhou, Bo

    2012-12-01

    The three-dimensional digital differential analyzer (3D-DDA) algorithm is a widely used ray traversal method, which is also at the core of many convolution/superposition (C/S) dose calculation approaches. However, porting existing C/S dose calculation methods onto the graphics processing unit (GPU) has brought challenges to retaining the efficiency of this algorithm. In particular, a straightforward implementation of the original 3D-DDA algorithm introduces substantial branch divergence, which conflicts with the GPU programming model and leads to suboptimal performance. In this paper, an efficient GPU implementation of the 3D-DDA algorithm is proposed, which effectively reduces such branch divergence and improves the performance of C/S dose calculation programs running on the GPU. The main idea of the proposed method is to convert a number of conditional statements in the original 3D-DDA algorithm into a set of simple operations (e.g., arithmetic, comparison, and logic) that are better supported by the GPU architecture. To verify and demonstrate the performance improvement, this ray traversal method was integrated into a GPU-based collapsed cone convolution/superposition (CCCS) dose calculation program. The proposed method has been tested using a water phantom and various clinical cases on an NVIDIA GTX 570 GPU. The CCCS dose calculation program based on the efficient 3D-DDA ray traversal implementation runs 1.42-2.67× faster than the one based on the original 3D-DDA implementation, without losing any accuracy. The results show that the proposed method effectively reduces branch divergence in the original 3D-DDA ray traversal algorithm and improves the performance of the CCCS program running on the GPU. Considering the wide utilization of the 3D-DDA algorithm, various applications can benefit from this implementation method.
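The branch-divergence idea can be illustrated by rewriting a single 3D-DDA traversal step so that the axis selection is computed from comparison masks rather than an if/else chain. This Python sketch only mirrors the concept (the paper targets CUDA kernels, and the ray geometry here is illustrative):

```python
def dda_step_branchless(t_max, t_delta, voxel, step):
    """One 3D-DDA traversal step without an if/else chain: comparisons
    produce 0/1 masks that select the axis with the smallest t_max,
    mirroring how a GPU kernel avoids branch divergence."""
    tx, ty, tz = t_max
    mx = int(tx <= ty and tx <= tz)       # 1 if x wins (ties -> x)
    my = int(not mx and ty <= tz)         # 1 if y wins
    mz = int(not mx and not my)           # 1 if z wins
    mask = (mx, my, mz)
    voxel = tuple(v + s * m for v, s, m in zip(voxel, step, mask))
    t_max = tuple(t + d * m for t, d, m in zip(t_max, t_delta, mask))
    return t_max, voxel, mask

# Trace a few steps of a ray through a unit-voxel grid.
t_max, voxel = (0.5, 0.7, 0.9), (0, 0, 0)
for _ in range(4):
    t_max, voxel, _ = dda_step_branchless(t_max, (1.0, 1.4, 1.8), voxel, (1, 1, 1))
print(voxel)  # -> (2, 1, 1)
```

On a GPU, every thread in a warp executes the same mask arithmetic regardless of which axis wins, so rays that advance along different axes no longer serialize against each other.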

  20. Radiation dose reduction using a neck detection algorithm for single spiral brain and cervical spine CT acquisition in the trauma setting.

    PubMed

    Ardley, Nicholas D; Lau, Ken K; Buchan, Kevin

    2013-12-01

    Cervical spine injuries occur in 4-8% of adults with head trauma. A dual-acquisition technique has traditionally been used for CT scanning of the brain and cervical spine. The purpose of this study was to determine the efficacy of radiation dose reduction using a single-acquisition technique that incorporated both anatomical regions with a dedicated neck detection algorithm. Thirty trauma patients referred for brain and cervical spine CT were included and scanned with the single-acquisition technique. The radiation doses from the single CT acquisition with the neck detection algorithm, which allowed appropriate independent dose administration for the brain and cervical spine regions, were recorded. Comparison was made both to the doses calculated from a simulation of the traditional dual acquisition with matching parameters, and to the doses of the retrospective dual-acquisition legacy technique with the same sample size. The mean simulated dose for the traditional dual-acquisition technique was 3.99 mSv, comparable to the average dose of 4.2 mSv from 30 previous patients who had CT of the brain and cervical spine as dual acquisitions. The mean dose from the single-acquisition technique was 3.35 mSv, a 16% overall dose reduction. The images from the single-acquisition technique were of excellent diagnostic quality. The new single-acquisition CT technique incorporating the neck detection algorithm for the brain and cervical spine significantly reduces the overall radiation dose by eliminating the unavoidable overlap between the two anatomical regions that occurs with the traditional dual-acquisition technique.

  1. SU-E-T-33: A Feasibility-Seeking Algorithm Applied to Planning of Intensity Modulated Proton Therapy: A Proof of Principle Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Penfold, S; Casiraghi, M; Dou, T

    2015-06-15

    Purpose: To investigate the applicability of feasibility-seeking cyclic orthogonal projections to intensity modulated proton therapy (IMPT) inverse planning. Seeking feasibility of constraints only, as opposed to optimization of a merit function, is algorithmically less demanding and holds promise for parallel computation with non-cyclic orthogonal projection algorithms such as string-averaging or block-iterative strategies. Methods: A virtual 2D geometry was designed containing a C-shaped planning target volume (PTV) surrounding an organ at risk (OAR). The geometry was pixelized into 1 mm pixels. Four beams containing a subset of proton pencil beams were simulated in Geant4 to provide the system matrix A, whose elements a_ij correspond to the dose delivered to pixel i by a unit-intensity pencil beam j. A cyclic orthogonal projections algorithm was applied with the goal of finding a pencil beam intensity distribution that would meet the following dose requirements: D_OAR < 54 Gy and 57 Gy < D_PTV < 64.2 Gy. The cyclic algorithm was based on the concept of orthogonal projections onto half-spaces according to the Agmon-Motzkin-Schoenberg algorithm, also known as ‘ART for inequalities’. Results: The cyclic orthogonal projections algorithm resulted in less than 5% of PTV pixels and less than 1% of OAR pixels violating their dose constraints. Because of the abutting OAR-PTV geometry and the realistic modelling of the pencil beam penumbra, complete satisfaction of the dose objectives was not achieved, although this would be a clinically acceptable plan for, for example, a meningioma abutting the brainstem. Conclusion: The cyclic orthogonal projections algorithm was demonstrated to be an effective tool for inverse IMPT planning in the 2D test geometry described. We plan to further develop this linear algorithm to incorporate dose-volume constraints into the feasibility-seeking algorithm.
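The Agmon-Motzkin-Schoenberg scheme projects the current intensity vector cyclically onto the half-spaces defined by the dose bounds. A two-pixel toy sketch follows; the system matrix and the bound values reuse the abstract's dose limits, but the matrix entries are invented and the real problem has thousands of pencil beams:

```python
import numpy as np

def art_inequalities(A, lo, hi, x0, sweeps=100, relax=1.0):
    """Agmon-Motzkin-Schoenberg ('ART for inequalities'): cyclically
    project x onto each half-space lo_i <= a_i . x <= hi_i, keeping
    pencil-beam intensities non-negative."""
    x = x0.astype(float).copy()
    for _ in range(sweeps):
        for a, l, h in zip(A, lo, hi):
            d = a @ x
            if d > h:                       # upper bound violated
                x -= relax * (d - h) / (a @ a) * a
            elif d < l:                     # lower bound violated
                x += relax * (l - d) / (a @ a) * a
            x = np.maximum(x, 0.0)
    return x

# Two-pixel toy problem: row 1 is a "PTV" pixel, row 2 an "OAR" pixel.
A = np.array([[1.0, 0.2],
              [0.3, 1.0]])
lo = np.array([57.0, 0.0])    # 57 Gy < D_PTV
hi = np.array([64.2, 54.0])   # D_PTV < 64.2 Gy, D_OAR < 54 Gy
x = art_inequalities(A, lo, hi, x0=np.zeros(2))
print(A @ x)  # both pixel doses land inside their bounds
```

If a constraint is already satisfied, the projection leaves x unchanged, which is what makes pure feasibility seeking cheaper per iteration than minimizing a merit function.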

  2. Validation of a track repeating algorithm for intensity modulated proton therapy: clinical cases study

    NASA Astrophysics Data System (ADS)

    Yepes, Pablo P.; Eley, John G.; Liu, Amy; Mirkovic, Dragan; Randeniya, Sharmalee; Titt, Uwe; Mohan, Radhe

    2016-04-01

    Monte Carlo (MC) methods are acknowledged as the most accurate technique for calculating dose distributions. However, due to their lengthy calculation times, they are difficult to use in the clinic or for large retrospective studies. Track-repeating algorithms, based on MC-generated particle track data in water, accelerate dose calculations substantially while essentially preserving the accuracy of MC. In this study, we present the validation of an efficient dose calculation algorithm for intensity modulated proton therapy, the fast dose calculator (FDC), based on a track-repeating technique. We validated the FDC algorithm for 23 patients: 7 brain, 6 head-and-neck, 5 lung, 1 spine, 1 pelvis and 3 prostate cases. For validation, we compared FDC-generated dose distributions with those from a full-fledged Monte Carlo code based on GEANT4 (G4). We compared dose-volume histograms and 3D gamma indices, and analyzed a series of dosimetric indices. More than 99% of the voxels in the voxelized phantoms describing the patients have a gamma index smaller than unity for the 2%/2 mm criteria. In addition, the difference relative to the prescribed dose between the dosimetric indices calculated with FDC and G4 is less than 1%. FDC reduces the calculation time from 5 ms per proton to around 5 μs.

  3. SU-E-T-465: Dose Calculation Method for Dynamic Tumor Tracking Using a Gimbal-Mounted Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sugimoto, S; Inoue, T; Kurokawa, C

    Purpose: Dynamic tumor tracking using a gimbal-mounted linac (Vero4DRT, Mitsubishi Heavy Industries, Ltd., Japan) is available when respiratory motion is significant. The irradiation accuracy of dynamic tumor tracking has been reported to be excellent. In addition to irradiation accuracy, a fast and accurate dose calculation algorithm is needed to validate the dose distribution in the presence of respiratory motion, because its multiple phases have to be considered. A modification of the dose calculation algorithm is necessary for the gimbal-mounted linac due to the degrees of freedom of the gimbal swing. The dose calculation algorithm for the gimbal motion was implemented using linear transformations between coordinate systems. Methods: The linear transformation matrices between the coordinate systems with and without gimbal swings were constructed from combinations of translation and rotation matrices. A coordinate system with the radiation source at the origin and the beam axis along the z axis was adopted. The transformation can be divided into a translation from the radiation source to the gimbal rotation center, two rotations around the center corresponding to the gimbal swings, and a translation from the gimbal center back to the radiation source. After applying the transformation matrix to the phantom or patient image, the dose calculation can be performed as if there were no gimbal swing. The algorithm was implemented in the treatment planning system PlanUNC (University of North Carolina, NC), using its convolution/superposition algorithm. Dose calculations with and without gimbal swings were performed for a 3 × 3 cm² field with a grid size of 5 mm. Results: The calculation time was about 3 minutes per beam. No significant additional time due to the gimbal swing was observed. Conclusions: A dose calculation algorithm for finite gimbal swing was implemented. The calculation time was moderate.
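The decomposition described in the Methods (translate to the gimbal rotation center, apply the two swing rotations, translate back) can be sketched with homogeneous matrices. The angles and source-to-center distance below are illustrative, not Vero4DRT specifications:

```python
import numpy as np

def gimbal_transform(pan_deg, tilt_deg, src_to_center):
    """4x4 homogeneous transform for a gimbal swing: translate the
    rotation center to the origin, apply the two gimbal rotations,
    translate back.  The source sits at the origin, beam along +z."""
    def trans(z):
        t = np.eye(4)
        t[2, 3] = z
        return t
    def rot_x(a):  # tilt about the x axis
        c, s = np.cos(a), np.sin(a)
        r = np.eye(4)
        r[1, 1], r[1, 2], r[2, 1], r[2, 2] = c, -s, s, c
        return r
    def rot_y(a):  # pan about the y axis
        c, s = np.cos(a), np.sin(a)
        r = np.eye(4)
        r[0, 0], r[0, 2], r[2, 0], r[2, 2] = c, s, -s, c
        return r
    pan, tilt = np.radians(pan_deg), np.radians(tilt_deg)
    return trans(src_to_center) @ rot_y(pan) @ rot_x(tilt) @ trans(-src_to_center)

# The rotation center itself must be a fixed point of the transform.
M = gimbal_transform(2.0, 1.0, src_to_center=1000.0)
center = np.array([0.0, 0.0, 1000.0, 1.0])
print(M @ center)  # the center maps to itself
```

Applying the inverse of such a matrix to the image grid lets the dose engine treat the swung beam as an unswung one, which is the trick the abstract describes.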

  4. Dose calculation algorithm of fast fine-heterogeneity correction for heavy charged particle radiotherapy.

    PubMed

    Kanematsu, Nobuyuki

    2011-04-01

    This work addresses computing techniques for dose calculations in treatment planning with proton and ion beams, based on an efficient kernel-convolution method referred to as grid-dose spreading (GDS) and an accurate heterogeneity-correction method referred to as Gaussian beam splitting. The original GDS algorithm suffered from distortion of the dose distribution for beams tilted with respect to the dose-grid axes. Use of intermediate grids normal to the beam field has solved the beam-tilting distortion. The interplay between the beam and grid arrangements was found to be another intrinsic source of artifacts. Inclusion of rectangular-kernel convolution in the beam transport, to share the beam contribution among the nearest grids in a regulated manner, has solved the interplay problem. This algorithmic framework was applied to a tilted proton pencil beam and a broad carbon-ion beam. In these cases, while each elementary pencil beam split into several tens of beams, the calculation time increased only severalfold with the GDS algorithm. The GDS and beam-splitting methods will complementarily enable accurate and efficient dose calculations for radiotherapy with protons and ions. Copyright © 2010 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  5. Priori mask guided image reconstruction (p-MGIR) for ultra-low dose cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Park, Justin C.; Zhang, Hao; Chen, Yunmei; Fan, Qiyong; Kahler, Darren L.; Liu, Chihray; Lu, Bo

    2015-11-01

    Recently, compressed sensing (CS) based iterative reconstruction methods have received attention because of their ability to reconstruct cone beam computed tomography (CBCT) images of good quality from sparsely sampled or noisy projections, thus enabling dose reduction. However, some challenges remain. In particular, there is always a tradeoff between image resolution and noise/streak-artifact reduction, governed by the amount of regularization weighting applied uniformly across the CBCT volume. The purpose of this study is to develop a novel low-dose CBCT reconstruction framework called priori mask guided image reconstruction (p-MGIR) that allows reconstruction of high-quality low-dose CBCT images while preserving image resolution. In p-MGIR, the unknown CBCT volume is mathematically modeled as a combination of two regions: (1) where anatomical structures are complex, and (2) where intensities are relatively uniform. The priori mask, the key concept of the p-MGIR algorithm, is defined as the matrix that distinguishes between the two CBCT regions: where resolution needs to be preserved and where streaks or noise need to be suppressed. We then alternately update each part of the image by iteratively solving two sub-minimization problems, where one minimization focuses on preserving the edge information of the first part while the other concentrates on removing noise/artifacts from the second part. To evaluate the performance of the p-MGIR algorithm, a numerical head-and-neck phantom, a Catphan 600 physical phantom, and a clinical head-and-neck cancer case were used for analysis. The results were compared with the standard Feldkamp-Davis-Kress algorithm as well as conventional CS-based algorithms. Examination of the p-MGIR algorithm showed that high-quality low-dose CBCT images can be reconstructed without compromising image resolution.
For both the phantom and patient cases, p-MGIR achieves a clinically reasonable image with 60 projections. A clinically viable, high-resolution head-and-neck CBCT image can therefore be obtained while cutting the dose by 83%. Moreover, the image quality obtained using p-MGIR is better than that obtained using the other algorithms. In this work, we propose a novel low-dose CBCT reconstruction algorithm called p-MGIR. It can potentially be used as a CBCT reconstruction algorithm when low-dose scans are requested.

  6. Predicting warfarin dosage in European–Americans and African–Americans using DNA samples linked to an electronic health record

    PubMed Central

    Ramirez, Andrea H; Shi, Yaping; Schildcrout, Jonathan S; Delaney, Jessica T; Xu, Hua; Oetjens, Matthew T; Zuvich, Rebecca L; Basford, Melissa A; Bowton, Erica; Jiang, Min; Speltz, Peter; Zink, Raquel; Cowan, James; Pulley, Jill M; Ritchie, Marylyn D; Masys, Daniel R; Roden, Dan M; Crawford, Dana C; Denny, Joshua C

    2012-01-01

    Aim Warfarin pharmacogenomic algorithms reduce dosing error, but perform poorly in non-European–Americans. Electronic health record (EHR) systems linked to biobanks may allow for pharmacogenomic analysis, but they have not yet been used for this purpose. Patients & methods We used BioVU, the Vanderbilt EHR-linked DNA repository, to identify European–Americans (n = 1022) and African–Americans (n = 145) on stable warfarin therapy and evaluated the effect of 15 pharmacogenetic variants on stable warfarin dose. Results Associations with weekly dose were observed for variants in VKORC1, CYP2C9 and CYP4F2 in European–Americans, and for additional variants in CYP2C9 and CALU in African–Americans. Compared with traditional 5 mg/day dosing, implementing the US FDA recommendations or the International Warfarin Pharmacogenomics Consortium (IWPC) algorithm reduced error in weekly dose in European–Americans (from 13.5 to 12.4 and 9.5 mg/week, respectively) but less so in African–Americans (from 15.2 to 15.0 and 13.8 mg/week, respectively). By further incorporating associated variants specific for European–Americans and African–Americans in an expanded algorithm, dose-prediction error was reduced to 9.1 mg/week (95% CI: 8.4–9.6) in European–Americans and 12.4 mg/week (95% CI: 10.0–13.2) in African–Americans. The expanded algorithm explained 41% and 53% of dose variation in African–Americans and European–Americans, respectively, compared with 29% and 50%, respectively, for the IWPC algorithm. Implementing these predictions via dispensable pill regimens similarly reduced dosing error. Conclusion These results validate EHR-linked DNA biorepositories as real-world resources for pharmacogenomic validation and discovery. PMID:22329724
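The structure of such dosing algorithms, a regression of a transformed stable dose on genotype counts plus clinical covariates, can be sketched on synthetic data. Everything below is illustrative: the covariates, coefficients, and noise level are invented and mimic only the shape of IWPC-style square-root-dose models, not the published algorithm or cohort.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Synthetic covariates (illustrative only, not the published cohort):
age = rng.uniform(30, 85, n)          # years
vkorc1 = rng.integers(0, 3, n)        # count of VKORC1 -1639A alleles
cyp2c9 = rng.integers(0, 3, n)        # count of reduced-function CYP2C9 alleles
amiodarone = rng.integers(0, 2, n)    # concomitant amiodarone (0/1)

# Simulated dose-generating process on the sqrt(weekly dose) scale.
sqrt_dose = (5.6 - 0.61 * vkorc1 - 0.45 * cyp2c9
             - 0.012 * age - 0.3 * amiodarone
             + rng.normal(0, 0.25, n))

# Fit the pharmacogenomic algorithm by ordinary least squares.
X = np.column_stack([np.ones(n), age, vkorc1, cyp2c9, amiodarone])
beta, *_ = np.linalg.lstsq(X, sqrt_dose, rcond=None)
pred = X @ beta

# Fraction of dose variability explained (R^2), as reported in such studies.
ss_res = np.sum((sqrt_dose - pred) ** 2)
ss_tot = np.sum((sqrt_dose - sqrt_dose.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
weekly_dose_mg = pred ** 2  # back-transform to mg/week
```

With this synthetic setup the fitted model recovers the negative genotype effects and explains most of the simulated dose variance, mirroring how real cohorts are analyzed.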

  7. SU-E-T-626: Accuracy of Dose Calculation Algorithms in MultiPlan Treatment Planning System in Presence of Heterogeneities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moignier, C; Huet, C; Barraux, V

    Purpose: Advanced stereotactic radiotherapy (SRT) treatments require accurate dose calculation for treatment planning, especially for treatment sites involving heterogeneous patient anatomy. The purpose of this study was to evaluate the accuracy of the dose calculation algorithms, Raytracing and Monte Carlo (MC), implemented in the MultiPlan treatment planning system (TPS) in the presence of heterogeneities. Methods: First, the LINAC of a CyberKnife radiotherapy facility was modeled with the PENELOPE MC code. A protocol for the measurement of dose distributions with EBT3 films was established and validated through comparison of experimental dose distributions with dose distributions calculated by the MultiPlan Raytracing and MC algorithms, as well as with the PENELOPE MC model, for treatments planned with the homogeneous Easycube phantom. Finally, bone and lung inserts were used to set up a heterogeneous Easycube phantom. Treatment plans with the 10, 7.5 or 5 mm field sizes were generated in the MultiPlan TPS with different tumor localizations (in the lung and at the lung/bone/soft tissue interface). Experimental dose distributions were compared to the PENELOPE MC and MultiPlan calculations using the gamma index method. Results: For the experiment in the homogeneous phantom, 100% of the points passed the 3%/3 mm tolerance criteria. These criteria include the global error of the method (CT-scan resolution, EBT3 dosimetry, LINAC positioning, …) and were used afterwards to estimate the accuracy of the MultiPlan algorithms in heterogeneous media. Comparison of the dose distributions obtained in the heterogeneous phantom is in progress. Conclusion: This work has led to the development of numerical and experimental dosimetric tools for small-beam dosimetry. The Raytracing and MC algorithms implemented in the MultiPlan TPS were evaluated in heterogeneous media.

  8. SU-F-SPS-11: The Dosimetric Comparison of Truebeam 2.0 and Cyberknife M6 Treatment Plans for Brain SRS Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mabhouti, H; Sanli, E; Cebe, M

    Purpose: Brain stereotactic radiosurgery involves the use of precisely directed, single-session radiation to create a desired radiobiologic response within the brain target with acceptably minimal effects on surrounding structures or tissues. In this study, a dosimetric comparison of Truebeam 2.0 and Cyberknife M6 treatment plans was made. Methods: For the Truebeam 2.0 machine, treatment planning was done using a 2-full-arc VMAT technique with a 6 FFF beam on the CT scan of a Rando phantom, simulating stereotactic treatment of one brain metastasis. The dose distribution was calculated using the Eclipse treatment planning system with the Acuros XB algorithm. Treatment planning for the same target was also done for the Cyberknife M6 machine with the Multiplan treatment planning system using the Monte Carlo algorithm. Using the same film batch, the net OD to dose calibration curve was obtained on both machines by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. Dose distributions were measured using EBT3 film dosimeters, and the measured and calculated doses were compared. Results: The dose distributions in the target and 2 cm beyond the target edge were calculated on the TPSs and measured using EBT3 film. For the Cyberknife plans, the gamma analysis passing rates between measured and calculated dose distributions were 99.2% and 96.7% for the target and the peripheral region of the target, respectively. For the Truebeam plans, the gamma analysis passing rates were 99.1% and 95.5%, respectively. Conclusion: Although the target dose distribution was calculated accurately by both the Acuros XB and Monte Carlo algorithms, the Monte Carlo algorithm predicts the dose distribution around the peripheral region of the target more accurately than the Acuros algorithm.
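The gamma analysis passing rates quoted in abstracts like this one come from the gamma index of Low et al., which combines a dose-difference criterion with a distance-to-agreement criterion. A simplified 1D global-gamma sketch (not the commercial software used in these studies) looks like this:

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose,
             dose_tol=0.03, dist_tol=3.0):
    """Global 1D gamma index: for each evaluated point, the minimum
    combined dose-difference / distance-to-agreement metric over all
    reference points. dose_tol is a fraction of the reference maximum
    (3% here); positions and dist_tol share one unit (e.g. mm)."""
    ref_pos = np.asarray(ref_pos, dtype=float)
    ref_dose = np.asarray(ref_dose, dtype=float)
    dmax = ref_dose.max()
    gammas = []
    for xe, de in zip(eval_pos, eval_dose):
        dd = (ref_dose - de) / (dose_tol * dmax)  # dose-difference term
        dx = (ref_pos - xe) / dist_tol            # distance term
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return np.array(gammas)

def pass_rate(gammas):
    """Percentage of evaluated points with gamma <= 1."""
    return 100.0 * np.mean(gammas <= 1.0)
```

Identical distributions give a 100% passing rate; a uniform 10% dose offset on a flat profile fails everywhere under 3%/3 mm, since the distance term cannot compensate for a constant dose error.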

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kieselmann, J; Bartzsch, S; Oelfke, U

    Purpose: Microbeam Radiation Therapy is a preclinical method in radiation oncology that modulates radiation fields on a micrometre scale. Dose calculation is challenging due to the arising dose gradients and therapeutically important dose ranges. Monte Carlo (MC) simulations, often used as the gold standard, are computationally expensive and hence too slow for the optimisation of treatment parameters in future clinical applications. On the other hand, conventional kernel-based dose calculation leads to inaccurate results close to material interfaces. The purpose of this work is to overcome these inaccuracies while keeping computation times low. Methods: A point kernel superposition algorithm is modified to account for tissue inhomogeneities. Instead of conventional ray tracing approaches, methods from differential geometry are applied and the space around the primary photon interaction is locally warped. The performance of this approach is compared to MC simulations and a simple convolution algorithm (CA) for two different phantoms and photon spectra. Results: While the peak doses of all dose calculation methods agreed within less than 4% deviation, the proposed approach surpassed the simple convolution algorithm in scatter-dose accuracy by a factor of up to 3. In a treatment geometry similar to possible future clinical situations, differences between Monte Carlo and the differential geometry algorithm were less than 3%. At the same time the calculation time did not exceed 15 minutes. Conclusion: With the developed method it was possible to improve dose calculation accuracy relative to the CA method, especially at sharp tissue boundaries. While the calculation is more extensive than for the CA method and depends on field size, the typical calculation time for a 20×20 mm² field on a 3.4 GHz processor with 8 GB RAM remained below 15 minutes.
Parallelisation and optimisation of the algorithm could lead to further significant reductions in calculation time.

  10. High-dose-rate prostate brachytherapy inverse planning on dose-volume criteria by simulated annealing.

    PubMed

    Deist, T M; Gorissen, B L

    2016-02-07

    High-dose-rate brachytherapy is a tumor treatment method where a highly radioactive source is brought in close proximity to the tumor. In this paper we develop a simulated annealing algorithm to optimize the dwell times at preselected dwell positions to maximize tumor coverage under dose-volume constraints on the organs at risk. Compared to existing algorithms, our algorithm has advantages in terms of speed and objective value and does not require an expensive general purpose solver. Its success mainly depends on exploiting the efficiency of matrix multiplication and a careful selection of the neighboring states. In this paper we outline its details and make an in-depth comparison with existing methods using real patient data.
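The optimization described here can be sketched as a generic simulated-annealing loop on a toy problem. This is not the authors' algorithm: the dose-influence matrix, prescription, organ-at-risk (OAR) limit, step size, and cooling schedule below are all invented, and the neighborhood selection is a plain Gaussian perturbation rather than the careful state selection the paper emphasizes.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy dose-influence matrix: dose at voxel i is sum_j A[i, j] * t[j],
# where t holds the dwell times at preselected dwell positions.
n_tumor, n_oar, n_dwell = 60, 40, 10
A_tumor = rng.uniform(0.5, 1.5, (n_tumor, n_dwell))
A_oar = rng.uniform(0.1, 0.8, (n_oar, n_dwell))
presc, oar_limit, oar_frac_max = 10.0, 8.0, 0.10

def coverage(t):
    """Fraction of tumor voxels receiving at least the prescription."""
    return np.mean(A_tumor @ t >= presc)

def feasible(t):
    """Dose-volume constraint: at most 10% of OAR voxels above limit."""
    return np.mean(A_oar @ t > oar_limit) <= oar_frac_max

t = np.full(n_dwell, 1.0)
start_cov = coverage(t)
best = t.copy()
temp = 0.5
for _ in range(3000):
    cand = np.clip(t + rng.normal(0.0, 0.1, n_dwell), 0.0, None)
    if not feasible(cand):      # reject states violating the constraint
        continue
    delta = coverage(cand) - coverage(t)
    if delta >= 0 or rng.random() < np.exp(delta / temp):
        t = cand
        if coverage(t) > coverage(best):
            best = t.copy()
    temp *= 0.999               # geometric cooling schedule
```

The matrix product `A_tumor @ t` is the step whose efficiency the authors exploit; here it simply re-evaluates coverage for each candidate state.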

  11. An algorithm for intelligent sorting of CT-related dose parameters.

    PubMed

    Cook, Tessa S; Zimmerman, Stefan L; Steingall, Scott R; Boonn, William W; Kim, Woojin

    2012-02-01

    Imaging centers nationwide are seeking innovative means to record and monitor computed tomography (CT)-related radiation dose in light of multiple instances of patient overexposure to medical radiation. As a solution, we have developed RADIANCE, an automated pipeline for extraction, archival, and reporting of CT-related dose parameters. Estimation of whole-body effective dose from CT dose length product (DLP)--an indirect estimate of radiation dose--requires anatomy-specific conversion factors that cannot be applied to total DLP, but instead necessitate individual anatomy-based DLPs. A challenge exists because the total DLP reported on a dose sheet often includes multiple separate examinations (e.g., chest CT followed by abdominopelvic CT). Furthermore, the individual reported series DLPs may not be clearly or consistently labeled. For example, "arterial" could refer to the arterial phase of the triple liver CT or the arterial phase of a CT angiogram. To address this problem, we have designed an intelligent algorithm to parse dose sheets for multi-series CT examinations and correctly separate the total DLP into its anatomic components. The algorithm uses information from the departmental PACS to determine how many distinct CT examinations were concurrently performed. Then, it matches the number of distinct accession numbers to the series that were acquired and anatomically matches individual series DLPs to their appropriate CT examinations. This algorithm allows for more accurate dose analytics, but there remain instances where automatic sorting is not feasible. To ultimately improve radiology patient care, we must standardize series names and exam names to unequivocally sort exams by anatomy and correctly estimate whole-body effective dose.
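The anatomy-matching step can be sketched with a keyword classifier and DLP-to-effective-dose conversion factors. The keyword lists and function names are hypothetical, and the k-factors are commonly cited adult conversion coefficients (in mSv per mGy·cm, e.g. as tabulated in AAPM Report 96); they stand in for whatever table RADIANCE actually uses.

```python
# Illustrative adult k-factors, mSv per mGy*cm (treat as placeholders).
K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

KEYWORDS = {  # hypothetical series-description keywords per anatomy
    "head": ("head", "brain"),
    "chest": ("chest", "thorax", "lung"),
    "abdomen_pelvis": ("abdomen", "pelvis", "abd pel"),
}

def classify(series_desc):
    """Map a series description to an anatomy, or None if ambiguous
    (e.g. a bare 'Arterial' series needs manual sorting)."""
    desc = series_desc.lower()
    for anatomy, words in KEYWORDS.items():
        if any(w in desc for w in words):
            return anatomy
    return None

def effective_dose(series):
    """series: list of (description, DLP in mGy*cm) pairs.
    Returns (total effective dose in mSv, unclassifiable descriptions)."""
    total, unsorted_ = 0.0, []
    for desc, dlp in series:
        anatomy = classify(desc)
        if anatomy is None:
            unsorted_.append(desc)
        else:
            total += K_FACTORS[anatomy] * dlp
    return total, unsorted_
```

A concurrent chest plus abdominopelvic examination is then summed per-anatomy rather than from the total DLP, and ambiguously named series are flagged instead of guessed, matching the fallback behavior the abstract describes.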

  12. An algorithm for intelligent sorting of CT-related dose parameters

    NASA Astrophysics Data System (ADS)

    Cook, Tessa S.; Zimmerman, Stefan L.; Steingal, Scott; Boonn, William W.; Kim, Woojin

    2011-03-01

    Imaging centers nationwide are seeking innovative means to record and monitor CT-related radiation dose in light of multiple instances of patient over-exposure to medical radiation. As a solution, we have developed RADIANCE, an automated pipeline for extraction, archival and reporting of CT-related dose parameters. Estimation of whole-body effective dose from CT dose-length product (DLP)--an indirect estimate of radiation dose--requires anatomy-specific conversion factors that cannot be applied to total DLP, but instead necessitate individual anatomy-based DLPs. A challenge exists because the total DLP reported on a dose sheet often includes multiple separate examinations (e.g., chest CT followed by abdominopelvic CT). Furthermore, the individual reported series DLPs may not be clearly or consistently labeled. For example, "arterial" could refer to the arterial phase of the triple liver CT or the arterial phase of a CT angiogram. To address this problem, we have designed an intelligent algorithm to parse dose sheets for multi-series CT examinations and correctly separate the total DLP into its anatomic components. The algorithm uses information from the departmental PACS to determine how many distinct CT examinations were concurrently performed. Then, it matches the number of distinct accession numbers to the series that were acquired, and anatomically matches individual series DLPs to their appropriate CT examinations. This algorithm allows for more accurate dose analytics, but there remain instances where automatic sorting is not feasible. To ultimately improve radiology patient care, we must standardize series names and exam names to unequivocally sort exams by anatomy and correctly estimate whole-body effective dose.

  13. Dose reduction potential of iterative reconstruction algorithms in neck CTA-a simulation study.

    PubMed

    Ellmann, Stephan; Kammerer, Ferdinand; Allmendinger, Thomas; Brand, Michael; Janka, Rolf; Hammon, Matthias; Lell, Michael M; Uder, Michael; Kramer, Manuel

    2016-10-01

    This study aimed to determine the degree of radiation dose reduction in neck CT angiography (CTA) achievable with Sinogram-affirmed iterative reconstruction (SAFIRE) algorithms. 10 consecutive patients scheduled for neck CTA were included in this study. CTA images of the external carotid arteries were either reconstructed with filtered back projection (FBP) at the full radiation dose level or underwent simulated dose reduction by proprietary reconstruction software. The dose-reduced images were reconstructed using either SAFIRE 3 or SAFIRE 5 and compared with full-dose FBP images in terms of vessel definition. 5 observers performed a total of 3000 pairwise comparisons. SAFIRE allowed substantial radiation dose reductions in neck CTA while maintaining vessel definition. The possible levels of radiation dose reduction ranged from approximately 34% to approximately 90% and depended on the SAFIRE algorithm strength and the size of the vessel of interest. In general, larger vessels permitted higher degrees of radiation dose reduction, especially with higher SAFIRE strength levels. With small vessels, the superiority of SAFIRE 5 over SAFIRE 3 was lost. Neck CTA can be performed with substantially less radiation dose when SAFIRE is applied. The exact degree of radiation dose reduction should be adapted to the clinical question, in particular to the smallest vessel needing excellent definition.

  14. SU-E-T-373: Evaluation and Reduction of Contralateral Skin /subcutaneous Dose for Tangential Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butson, M; Carroll, S; Whitaker, M

    2015-06-15

    Purpose: Tangential breast irradiation is a standard treatment technique for breast cancer therapy. One aspect of dose delivery is the dose delivered to the skin by electron contamination. This effect is especially important for the highly oblique beams used on the medial tangent, where the electron contamination deposits dose on the contralateral breast side. This work aims to investigate and predict, as well as define a method to reduce, this dose during tangential breast radiotherapy. Methods: Analysis and calculation of breast skin and subcutaneous dose were performed using a Varian Eclipse planning system with the AAA algorithm for 6 MV x-ray treatments. Measurements were made using EBT3 Gafchromic film to verify the accuracy of the planning data. Various materials were tested to assess their ability to remove electron contamination on the contralateral breast. Results: Results showed that the Varian Eclipse AAA algorithm could accurately estimate contralateral breast dose in the build-up region at depths of 2 mm or deeper. Surface dose was underestimated by the AAA algorithm. Doses up to 12% of the applied dose were seen on the contralateral breast surface and up to 9% at 2 mm depth. Because this radiation is mainly low-energy electron contamination, a bolus material could be used to reduce this dose to less than 3%; this is accomplished by 10 mm of Superflab bolus or by 1 mm of lead. Conclusion: Contralateral breast skin and subcutaneous dose is present for tangential breast treatment and has been measured to be up to 12% of the applied dose from the medial tangent beam. This dose is deposited at shallow depths and is accurately calculated by the Eclipse AAA algorithm at depths of 2 mm or greater. Bolus material placed over the contralateral breast can be used to effectively reduce this skin dose.

  15. Performance comparison between total variation (TV)-based compressed sensing and statistical iterative reconstruction algorithms.

    PubMed

    Tang, Jie; Nett, Brian E; Chen, Guang-Hong

    2009-10-07

    Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit, including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution, were introduced to objectively evaluate reconstruction performance. A comparison between the three algorithms is presented for a constant undersampling factor at several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.

  16. Percentage depth dose calculation accuracy of model based algorithms in high energy photon small fields through heterogeneous media and comparison with plastic scintillator dosimetry.

    PubMed

    Alagar, Ananda Giri Babu; Mani, Ganesh Kadirampatti; Karunakaran, Kaviarasu

    2016-01-08

    Small fields smaller than 4 × 4 cm² are used in stereotactic and conformal treatments, where heterogeneity is normally present. Since dose calculation in both small fields and heterogeneity often involves larger discrepancies, the algorithms used by treatment planning systems (TPSs) should be evaluated to achieve better treatment results. This report evaluates the accuracy of four model-based algorithms, X-ray Voxel Monte Carlo (XVMC) from Monaco, Superposition (SP) from CMS-Xio, and Acuros XB (AXB) and the analytical anisotropic algorithm (AAA) from Eclipse, tested against measurement. Measurements are done using an Exradin W1 plastic scintillator in a Solid Water phantom with heterogeneities such as air, lung, bone, and aluminum, irradiated with 6 and 15 MV photons at square field sizes with sides ranging from 1 to 4 cm. Each heterogeneity is introduced individually at two different depths from the depth of dose maximum (Dmax), one setup nearer to and the other farther from Dmax. The central axis percentage depth-dose (CADD) curve for each setup is measured separately and compared with the TPS algorithm calculation for the same setup. The percentage normalized root mean squared deviation (%NRMSD), which represents the whole CADD curve's deviation from the measured curve, is calculated. It is found that for air and lung heterogeneity, for both 6 and 15 MV, all algorithms show maximum deviation for the 1 × 1 cm² field size, which gradually reduces as field size increases, except for AAA. For aluminum and bone, all algorithms' deviations are smaller for 15 MV irrespective of setup. In all heterogeneity setups, the 1 × 1 cm² field showed maximum deviation, except in the 6 MV bone setup. For all algorithms in the study, irrespective of energy and field size, the dose deviation is higher when a heterogeneity is nearer to Dmax than when the same heterogeneity is farther from Dmax. Also, all algorithms show maximum deviation in lower-density materials compared with high-density materials.
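The %NRMSD figure of merit can be written compactly. The abstract does not state the exact normalization, so the sketch below assumes one plausible convention: the RMS deviation between the calculated and measured depth-dose curves, normalized to the measured maximum.

```python
import numpy as np

def nrmsd_percent(measured, calculated):
    """Percentage normalized root-mean-square deviation between a
    measured and a TPS-calculated percentage depth-dose curve,
    normalized to the measured maximum (one assumed convention)."""
    measured = np.asarray(measured, dtype=float)
    calculated = np.asarray(calculated, dtype=float)
    rmsd = np.sqrt(np.mean((calculated - measured) ** 2))
    return 100.0 * rmsd / measured.max()
```

A single number then summarizes how well the whole calculated curve tracks the measured one for a given algorithm and setup.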

  17. Can image enhancement allow radiation dose to be reduced whilst maintaining the perceived diagnostic image quality required for coronary angiography?

    PubMed Central

    Joshi, Anuja; Gislason-Lee, Amber J; Keeble, Claire; Sivananthan, Uduvil M

    2017-01-01

    Objective: The aim of this research was to quantify the reduction in radiation dose facilitated by image processing alone for percutaneous coronary intervention (PCI) patient angiograms, without reducing the perceived image quality required to confidently make a diagnosis. Methods: Incremental amounts of image noise were added to five PCI angiograms, simulating the angiogram as having been acquired at corresponding lower dose levels (10–89% dose reduction). 16 observers with relevant experience scored the image quality of these angiograms in 3 states—with no image processing and with 2 different modern image processing algorithms applied. These algorithms are used on state-of-the-art and previous generation cardiac interventional X-ray systems. Ordinal regression allowing for random effects and the delta method were used to quantify the dose reduction possible by the processing algorithms, for equivalent image quality scores. Results: Observers rated the quality of the images processed with the state-of-the-art and previous generation image processing with a 24.9% and 15.6% dose reduction, respectively, as equivalent in quality to the unenhanced images. The dose reduction facilitated by the state-of-the-art image processing relative to previous generation processing was 10.3%. Conclusion: Results demonstrate that statistically significant dose reduction can be facilitated with no loss in perceived image quality using modern image enhancement; the most recent processing algorithm was more effective in preserving image quality at lower doses. Advances in knowledge: Image enhancement was shown to maintain perceived image quality in coronary angiography at a reduced level of radiation dose using computer software to produce synthetic images from real angiograms simulating a reduction in dose. PMID:28124572

  18. Monte Carlo uncertainty analysis of dose estimates in radiochromic film dosimetry with single-channel and multichannel algorithms.

    PubMed

    Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen; González-López, Antonio

    2018-03-01

    To provide a multi-stage model to calculate uncertainty in radiochromic film dosimetry with Monte Carlo techniques. This new approach is applied to single-channel and multichannel algorithms. Two lots of Gafchromic EBT3 film are exposed using two different Varian linacs. They are read with an EPSON V800 flatbed scanner. The Monte Carlo techniques in uncertainty analysis provide a numerical representation of the probability density functions of the output magnitudes. From this numerical representation, traditional parameters of uncertainty analysis such as the standard deviations and bias are calculated. Moreover, these numerical representations are used to investigate the shape of the probability density functions of the output magnitudes. Also, another calibration film is read in four EPSON scanners (two V800 and two 10000XL) and the uncertainty analysis is carried out with the four images. The dose estimates of single-channel and multichannel algorithms show a Gaussian behavior and low bias. The multichannel algorithms lead to less uncertainty in the final dose estimates when the EPSON V800 is employed as the reading device. In the case of the EPSON 10000XL, the single-channel algorithms provide less uncertainty in the dose estimates for doses higher than 4 Gy. A multi-stage model has been presented. With the aid of this model and the use of Monte Carlo techniques, the uncertainties of dose estimates for single-channel and multichannel algorithms are estimated. The application of the model together with Monte Carlo techniques leads to a complete characterization of the uncertainties in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
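The core Monte Carlo idea, propagating reading noise through the calibration curve to obtain a numerical dose probability density, can be sketched for a single channel. The calibration function, its parameters, and the noise level below are hypothetical stand-ins, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-channel calibration: D(netOD) = a*netOD + b*netOD**n
a, b, n = 10.0, 35.0, 2.5   # illustrative fit parameters (dose in Gy)

def dose_from_netod(net_od):
    return a * net_od + b * net_od ** n

true_od = 0.30     # nominal net optical density of the film reading
sigma_od = 0.01    # assumed scanner/film reading noise (1 s.d.)

# Monte Carlo propagation: sample noisy readings, map through the
# calibration curve, and characterize the resulting dose distribution.
samples = dose_from_netod(rng.normal(true_od, sigma_od, 100_000))
dose_mean = samples.mean()
dose_std = samples.std()                       # standard uncertainty
bias = dose_mean - dose_from_netod(true_od)    # curvature-induced bias
```

The array `samples` is the numerical representation of the dose probability density function; its standard deviation and bias are the traditional uncertainty parameters the abstract refers to, and its histogram reveals the (here nearly Gaussian) shape.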

  19. Development of a pharmacogenetic-guided warfarin dosing algorithm for Puerto Rican patients.

    PubMed

    Ramos, Alga S; Seip, Richard L; Rivera-Miranda, Giselle; Felici-Giovanini, Marcos E; Garcia-Berdecia, Rafael; Alejandro-Cowan, Yirelia; Kocherla, Mohan; Cruz, Iadelisse; Feliu, Juan F; Cadilla, Carmen L; Renta, Jessica Y; Gorowski, Krystyna; Vergara, Cunegundo; Ruaño, Gualberto; Duconge, Jorge

    2012-12-01

    This study was aimed at developing a pharmacogenetic-driven warfarin-dosing algorithm in 163 admixed Puerto Rican patients on stable warfarin therapy. A multiple linear-regression analysis was performed using log-transformed effective warfarin dose as the dependent variable, and combining CYP2C9 and VKORC1 genotyping with other relevant nongenetic clinical and demographic factors as independent predictors. The model explained more than two-thirds of the observed variance in the warfarin dose among Puerto Ricans, and also produced significantly better 'ideal dose' estimates than two pharmacogenetic models and clinical algorithms published previously, with the greatest benefit seen in patients ultimately requiring <7 mg/day. We also assessed the clinical validity of the model using an independent validation cohort of 55 Puerto Rican patients from Hartford, CT, USA (R(2) = 51%). Our findings provide the basis for planning prospective pharmacogenetic studies to demonstrate the clinical utility of genotyping warfarin-treated Puerto Rican patients.

  20. From prompt gamma distribution to dose: a novel approach combining an evolutionary algorithm and filtering based on Gaussian-powerlaw convolutions.

    PubMed

    Schumann, A; Priegnitz, M; Schoene, S; Enghardt, W; Rohling, H; Fiedler, F

    2016-10-07

    Range verification and dose monitoring in proton therapy is considered highly desirable. Different methods have been developed worldwide, such as particle therapy positron emission tomography (PT-PET) and prompt gamma imaging (PGI). In general, these methods allow for a verification of the proton range. However, quantification of the dose from these measurements remains challenging. For the first time, we present an approach for estimating the dose from prompt γ-ray emission profiles. It combines a filtering procedure based on Gaussian-powerlaw convolution with an evolutionary algorithm. By means of convolving depth dose profiles with an appropriate filter kernel, prompt γ-ray depth profiles are obtained. In order to reverse this step, the evolutionary algorithm is applied. The feasibility of this approach is demonstrated for a spread-out Bragg-peak in a water target.
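The forward-then-invert structure can be sketched on toy data: convolve a depth-dose profile with a filter kernel to get a prompt-gamma-like profile, then recover a dose profile by evolutionary search. Everything here is illustrative: the depth-dose shape, the plain Gaussian kernel standing in for the Gaussian-powerlaw filter, and the simple (1+1) evolution strategy replacing the authors' evolutionary algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
z = np.arange(100)  # depth bins

# Toy depth-dose with a Bragg-peak-like shape (not a physical beam model).
true_dose = np.exp(-0.5 * ((z - 70) / 4.0) ** 2) + 0.3 * (z < 70)

# Smoothing kernel standing in for the Gaussian-powerlaw filter.
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
kernel /= kernel.sum()

def forward(dose):
    """Map a depth-dose profile to a prompt-gamma-like profile."""
    return np.convolve(dose, kernel, mode="same")

measured = forward(true_dose)  # the "measured" prompt-gamma profile

# (1+1) evolution strategy: mutate the candidate dose profile and keep
# the mutation whenever the forward-projected profile fits better.
cand = np.full(z.shape, measured.mean())
err = np.sum((forward(cand) - measured) ** 2)
err0 = err
for _ in range(5000):
    trial = np.clip(cand + rng.normal(0.0, 0.02, cand.shape), 0.0, None)
    trial_err = np.sum((forward(trial) - measured) ** 2)
    if trial_err < err:
        cand, err = trial, trial_err
```

The loop steadily reduces the mismatch between the forward-projected candidate and the measured profile, which is the sense in which the evolutionary step "reverses" the convolution.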

  1. Testing of the analytical anisotropic algorithm for photon dose calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esch, Ann van; Tillikainen, Laura; Pyykkonen, Jukka

    2006-11-15

    The analytical anisotropic algorithm (AAA) was implemented in the Eclipse (Varian Medical Systems) treatment planning system to replace the single pencil beam (SPB) algorithm for the calculation of dose distributions for photon beams. AAA was developed to improve dose calculation accuracy, especially in heterogeneous media. The total dose deposition is calculated as the superposition of the dose deposited by two photon sources (primary and secondary) and by an electron contamination source. The photon dose is calculated as a three-dimensional convolution of Monte Carlo precalculated scatter kernels, scaled according to the electron density matrix. For the configuration of AAA, an optimization algorithm determines the parameters characterizing the multiple source model by optimizing the agreement between the calculated and measured depth dose curves and profiles for the basic beam data. We have combined the acceptance tests obtained in three different departments for 6, 15, and 18 MV photon beams. The accuracy of AAA was tested for different field sizes (symmetric and asymmetric) for open fields, wedged fields, and static and dynamic multileaf collimation fields. Depth dose behavior at different source-to-phantom distances was investigated. Measurements were performed on homogeneous, water-equivalent phantoms, on simple phantoms containing cork inhomogeneities, and on the thorax of an anthropomorphic phantom. Comparisons were made among measurements, AAA, and SPB calculations. The optimization procedure for the configuration of the algorithm was successful in reproducing the basic beam data with an overall accuracy of 3%, 1 mm in the build-up region, and 1%, 1 mm elsewhere. Testing of the algorithm in more clinical setups showed comparable results for depth dose curves, profiles, and monitor units of symmetric open and wedged beams below dmax.
The electron contamination model was found to be suboptimal for modeling the dose around dmax, especially for physical wedges at smaller source-to-phantom distances. For the asymmetric field verification, absolute dose differences of up to 4% were observed for the most extreme asymmetries. Compared with the SPB, penumbra modeling is considerably improved (1%, 1 mm). At the interface between solid water and cork, profiles show better agreement with AAA. Depth dose curves in the cork are substantially better with AAA than with SPB. Improvements are more pronounced for 18 MV than for 6 MV. Point dose measurements in the thoracic phantom are mostly within 5%. In general, we can conclude that, compared with SPB, AAA improves the accuracy of dose calculations. Particular progress was made with respect to the penumbra and low-dose regions. In heterogeneous materials, improvements are substantial and more pronounced for high (18 MV) than for low (6 MV) energies.

  2. Influence of radiation dose and reconstruction algorithm in MDCT assessment of airway wall thickness: A phantom study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gomez-Cardona, Daniel; Nagle, Scott K.; Department of Radiology, University of Wisconsin-Madison School of Medicine and Public Health, 600 Highland Avenue, Madison, Wisconsin 53792

    Purpose: Wall thickness (WT) is an airway feature of great interest for the assessment of morphological changes in the lung parenchyma. Multidetector computed tomography (MDCT) has recently been used to evaluate airway WT, but the potential risk of radiation-induced carcinogenesis—particularly in younger patients—might limit a wider use of this imaging method in clinical practice. The recent commercial implementation of the statistical model-based iterative reconstruction (MBIR) algorithm, instead of the conventional filtered back projection (FBP) algorithm, has enabled considerable radiation dose reduction in many other clinical applications of MDCT. The purpose of this work was to study the impact of radiation dose and MBIR in the MDCT assessment of airway WT. Methods: An airway phantom was scanned using a clinical MDCT system (Discovery CT750 HD, GE Healthcare) at 4 kV levels and 5 mAs levels. Both FBP and a commercial implementation of MBIR (Veo™, GE Healthcare) were used to reconstruct CT images of the airways. For each kV–mAs combination and each reconstruction algorithm, the contrast-to-noise ratio (CNR) of the airways was measured, and the WT of each airway was measured and compared with the nominal value; the relative bias and the angular standard deviation in the measured WT were calculated. For each airway and reconstruction algorithm, the overall performance of WT quantification across all of the 20 kV–mAs combinations was quantified by the sum of squares (SSQ) of the difference between the measured and nominal WT values. Finally, the particular kV–mAs combination and reconstruction algorithm that minimized radiation dose while still achieving a reference WT quantification accuracy level was chosen as the optimal acquisition and reconstruction settings. Results: The wall thicknesses of seven airways of different sizes were analyzed in the study.
Compared with FBP, MBIR improved the CNR of the airways, particularly at low radiation dose levels. For FBP, the relative bias and the angular standard deviation of the measured WT increased steeply with decreasing radiation dose. Except for the smallest airway, MBIR enabled significant reduction in both the relative bias and angular standard deviation of the WT, particularly at low radiation dose levels; the SSQ was reduced by 50%–96% by using MBIR. The optimal reconstruction algorithm was found to be MBIR for the seven airways being assessed, and the combined use of MBIR and optimal kV–mAs selection resulted in a radiation dose reduction of 37%–83% compared with a reference scan protocol with a dose level of 1 mGy. Conclusions: The quantification accuracy of airway WT is strongly influenced by radiation dose and reconstruction algorithm. The MBIR algorithm potentially allows the desired WT quantification accuracy to be achieved with reduced radiation dose, which may enable a wider clinical use of MDCT for the assessment of airway WT, particularly for younger patients who may be more sensitive to exposures with ionizing radiation.

  3. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, J; Grassberger, C; Paganetti, H

    2014-06-15

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90, and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), average range difference (ARD), and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80–R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate, and whole-brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain).
However, we recommend treatment plan verification using Monte Carlo simulations for patients with complex geometries.
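The distal range metrics used above (R90, R50) can be extracted from a depth-dose curve by linear interpolation on the distal falloff. A minimal Python sketch follows; the idealized curve and 0.1 cm grid are invented for illustration and are not data from the study:

```python
import numpy as np

def distal_range(depth_cm, dose, level):
    """Distal depth (cm) at which the dose falls to `level` of its maximum,
    found by linear interpolation on the falloff side of the peak."""
    norm = np.asarray(dose, dtype=float) / np.max(dose)
    i_peak = int(np.argmax(norm))
    for i in range(i_peak, len(norm) - 1):
        if norm[i] >= level >= norm[i + 1]:
            frac = (norm[i] - level) / (norm[i] - norm[i + 1])
            return float(depth_cm[i] + frac * (depth_cm[i + 1] - depth_cm[i]))
    return float(depth_cm[-1])

# Idealized depth-dose curve: flat plateau, then a linear distal falloff
depth = np.linspace(0.0, 10.0, 101)                            # 0.1 cm grid
dose = np.where(depth < 8.0, 1.0, np.clip(1.0 - (depth - 8.0), 0.0, None))

r90 = distal_range(depth, dose, 0.90)   # distal 90% dose level
r50 = distal_range(depth, dose, 0.50)   # distal 50% dose level
```

Comparing such per-field range values between the analytical algorithm and Monte Carlo is the kind of difference the RMSD and ARD metrics above summarize.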

  4. A correction scheme for a simplified analytical random walk model algorithm of proton dose calculation in distal Bragg peak regions

    NASA Astrophysics Data System (ADS)

    Yao, Weiguang; Merchant, Thomas E.; Farr, Jonathan B.

    2016-10-01

    The lateral homogeneity assumption is used in most analytical algorithms for proton dose, such as the pencil-beam algorithms and our simplified analytical random walk model. To improve the dose calculation in the distal fall-off region in heterogeneous media, we analyzed primary proton fluence near heterogeneous media and propose to calculate the lateral fluence with voxel-specific Gaussian distributions. The lateral fluence from a beamlet is no longer expressed by a single Gaussian for all the lateral voxels, but by a specific Gaussian for each lateral voxel. The voxel-specific Gaussian for the beamlet of interest is calculated by re-initializing the fluence deviation on an effective surface where the proton energies of the beamlet of interest and the beamlet passing the voxel are the same. The dose improvement from the correction scheme was demonstrated by the dose distributions in two sets of heterogeneous phantoms consisting of cortical bone, lung, and water and by evaluating distributions in example patients with a head-and-neck tumor and metal spinal implants. The dose distributions from Monte Carlo simulations were used as the reference. The correction scheme effectively improved the dose calculation accuracy in the distal fall-off region and increased the gamma test pass rate. The extra computation for the correction was about 20% of that for the original algorithm but is dependent upon patient geometry.
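The voxel-specific Gaussian idea can be illustrated with a short Python sketch. The function below is only the normalized Gaussian form of the lateral fluence; the sigma values and voxel positions are invented, and the authors' re-initialization on an effective surface is not reproduced:

```python
import math

def lateral_fluence(x_mm, sigma_mm):
    """Normalized Gaussian lateral fluence at off-axis distance x."""
    return math.exp(-0.5 * (x_mm / sigma_mm) ** 2) / (sigma_mm * math.sqrt(2.0 * math.pi))

x_positions = [-4.0, -2.0, 0.0, 2.0, 4.0]   # lateral voxel centres (mm), invented

# Single-Gaussian model: one sigma shared by every lateral voxel
f_single = [lateral_fluence(x, 3.0) for x in x_positions]

# Voxel-specific model: sigma chosen per voxel, e.g. wider behind
# low-density tissue (values invented)
voxel_sigmas = [3.0, 3.0, 3.0, 4.5, 4.5]
f_voxel = [lateral_fluence(x, s) for x, s in zip(x_positions, voxel_sigmas)]
```

The wider voxel-specific Gaussian yields more fluence in the penumbra than the single-Gaussian assumption, which is exactly the effect the correction scheme captures in the distal fall-off region.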

  5. Measuring radiation dose in computed tomography using elliptic phantom and free-in-air, and evaluating iterative metal artifact reduction algorithm

    NASA Astrophysics Data System (ADS)

    Morgan, Ashraf

    The need for an accurate and reliable way of measuring patient dose in multi-row detector computed tomography (MDCT) has increased significantly. This research focused on the possibility of measuring CT dose in air to estimate the Computed Tomography Dose Index (CTDI) for routine quality control purposes. A new elliptic CTDI phantom that better represents human geometry was manufactured to investigate the effect of subject shape on the measured CTDI. Monte Carlo simulation was used to determine the dose distribution in comparison to the traditional cylindrical CTDI phantom. This research also investigated the effect of Siemens Healthcare's newly developed iMAR (iterative metal artifact reduction) algorithm; an arthroplasty phantom was designed and manufactured for that purpose. The design of new phantoms was part of the research, as they mimic human geometry more closely than the existing CTDI phantom. The standard CTDI phantom is a right cylinder that does not adequately represent the geometry of the majority of the patient population. Any dose reduction algorithm that is used during a patient scan will not be engaged when scanning the CTDI phantom, so a better-designed phantom will allow the use of dose reduction algorithms when measuring dose, which leads to better dose estimation and/or better understanding of dose delivery. Doses from a standard CTDI phantom and the newly designed phantoms were compared to doses measured in air. Iterative reconstruction is a promising technique for MDCT dose reduction and artifact correction. Iterative reconstruction algorithms have been developed to address specific imaging tasks, as is the case with iterative metal artifact reduction (iMAR), which was developed by Siemens for use with the company's future computed tomography platform. The goal of iMAR is to reduce metal artifacts when imaging patients with metal implants and to recover the CT numbers of tissues adjacent to the implant.
This research evaluated iMAR's capability to recover CT numbers and reduce noise. The use of iMAR should also allow the use of a lower tube voltage than the 140 kVp frequently used to image patients with shoulder implants. The evaluations of image quality and dose reduction were carried out using an arthroplasty phantom.
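For context, the weighted and volume CTDI combinations that such phantom measurements feed into can be sketched as follows; the chamber readings and pitch are illustrative, and the formulas are the conventional IEC-style definitions, not this thesis's elliptic-phantom method:

```python
def ctdi_w(ctdi_center_mGy, ctdi_periphery_mGy):
    """Weighted CTDI: one third centre plus two thirds periphery."""
    return ctdi_center_mGy / 3.0 + 2.0 * ctdi_periphery_mGy / 3.0

def ctdi_vol(ctdi_w_mGy, pitch):
    """Volume CTDI for a helical scan: weighted CTDI divided by pitch."""
    return ctdi_w_mGy / pitch

# Illustrative 100 mm pencil-chamber readings (mGy) and helical pitch
w = ctdi_w(10.0, 13.0)
v = ctdi_vol(w, 1.2)
```

A phantom whose cross-section better matches patient anatomy changes the centre and periphery readings that enter these formulas, which is the motivation for the elliptic design.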

  6. Commissioning and initial acceptance tests for a commercial convolution dose calculation algorithm for radiotherapy treatment planning in comparison with Monte Carlo simulation and measurement

    PubMed Central

    Moradi, Farhad; Mahdavi, Seyed Rabi; Mostaar, Ahmad; Motamedi, Mohsen

    2012-01-01

    In this study the commissioning of a dose calculation algorithm in a currently used treatment planning system was performed, and the calculation accuracy of the two methods available in the treatment planning system, i.e., collapsed cone convolution (CCC) and equivalent tissue-air ratio (ETAR), was verified in tissue heterogeneities. For this purpose an inhomogeneous phantom (IMRT thorax phantom) was used, and dose curves obtained by the TPS (treatment planning system) were compared with experimental measurements and Monte Carlo (MCNP code) simulation. Dose measurements were performed using EDR2 radiographic films within the phantom. The dose difference (DD) between the experimental results and the two calculation methods was obtained. Results indicate a maximum difference of 12% in the lung and 3% in the bone tissue of the phantom between the two methods, and the CCC algorithm shows more accurate depth dose curves in tissue heterogeneities. Simulation results show accurate dose estimation by MCNP4C in the soft tissue region of the phantom and also better agreement than the ETAR method in bone and lung tissues. PMID:22973081

  7. Warfarin pharmacogenetics: a single VKORC1 polymorphism is predictive of dose across 3 racial groups.

    PubMed

    Limdi, Nita A; Wadelius, Mia; Cavallari, Larisa; Eriksson, Niclas; Crawford, Dana C; Lee, Ming-Ta M; Chen, Chien-Hsiun; Motsinger-Reif, Alison; Sagreiya, Hersh; Liu, Nianjun; Wu, Alan H B; Gage, Brian F; Jorgensen, Andrea; Pirmohamed, Munir; Shin, Jae-Gook; Suarez-Kurtz, Guilherme; Kimmel, Stephen E; Johnson, Julie A; Klein, Teri E; Wagner, Michael J

    2010-05-06

    Warfarin-dosing algorithms incorporating CYP2C9 and VKORC1 -1639G>A improve dose prediction compared with algorithms based solely on clinical and demographic factors. However, these algorithms better capture dose variability among whites than Asians or blacks. Herein, we evaluate whether other VKORC1 polymorphisms and haplotypes explain additional variation in warfarin dose beyond that explained by VKORC1 -1639G>A among Asians (n = 1103), blacks (n = 670), and whites (n = 3113). Participants were recruited from 11 countries as part of the International Warfarin Pharmacogenetics Consortium effort. Evaluation of the effects of individual VKORC1 single nucleotide polymorphisms (SNPs) and haplotypes on warfarin dose used both univariate and multivariable linear regression. VKORC1 -1639G>A and 1173C>T individually explained the greatest variance in dose in all 3 racial groups. Incorporation of additional VKORC1 SNPs or haplotypes did not further improve dose prediction. VKORC1 explained greater variability in dose among whites than blacks and Asians. Differences in the percentage of variance in dose explained by VKORC1 across race were largely accounted for by the frequency of the -1639A (or 1173T) allele. Thus, clinicians should recognize that, although the contribution of VKORC1 toward dose requirements is higher in whites than in nonwhites at the population level, genotype predicts similar dose requirements across racial groups.
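The per-SNP "variance explained" figures reported in studies like this come from univariate regression. A minimal Python sketch with invented genotype and dose data shows the computation:

```python
import numpy as np

def variance_explained(allele_count, dose):
    """R^2 from a univariate least-squares regression of dose on allele count."""
    g = np.asarray(allele_count, dtype=float)
    y = np.asarray(dose, dtype=float)
    slope, intercept = np.polyfit(g, y, 1)
    residuals = y - (slope * g + intercept)
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((y - y.mean()) ** 2))
    return 1.0 - ss_res / ss_tot

# Toy data: copies of the VKORC1 -1639A allele vs. daily dose (mg), invented
geno = [0, 0, 1, 1, 2, 2, 0, 1, 2, 1]
dose = [1.8, 1.7, 1.4, 1.5, 1.0, 1.1, 1.9, 1.45, 1.05, 1.5]
r2 = variance_explained(geno, dose)
```

Comparing this R² across SNPs, and then adding SNPs to a multivariable model, is how one tests whether additional polymorphisms improve prediction beyond -1639G>A.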

  8. SU-E-T-344: Validation and Clinical Experience of Eclipse Electron Monte Carlo Algorithm (EMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pokharel, S; Rana, S

    2014-06-01

    Purpose: The purpose of this study is to validate the Eclipse Electron Monte Carlo (EMC) algorithm for routine clinical use. Methods: The PTW inhomogeneity phantom (T40037) with different combinations of heterogeneous slabs was CT-scanned with a Philips Brilliance 16-slice scanner. The phantom contains blocks of Rando Alderson materials mimicking lung, polystyrene (tissue), PTFE (bone), and PMMA. The phantom has a 30×30×2.5 cm base plate with 2 cm recesses to insert inhomogeneities. The detector systems used in this study are a diode, TLDs, and Gafchromic EBT2 films. The diode and TLDs were included in the CT scans. The CT sets were transferred to the Eclipse treatment planning system. Several plans were created with the Eclipse Monte Carlo (EMC) algorithm 11.0.21. Measurements were carried out on a Varian TrueBeam machine for energies from 6–22 MeV. Results: The measured and calculated doses agreed very well for tissue-like media. The agreement was reasonable in the presence of lung inhomogeneity. The point dose agreement was within 3.5%, and the gamma passing rate at 3%/3mm was greater than 93% except for 6 MeV (85%). The disagreement can reach as high as 10% in the presence of bone inhomogeneity. This is due to Eclipse reporting dose to the medium as opposed to dose to water as in conventional calculation engines. Conclusion: Care must be taken when using the Varian Eclipse EMC algorithm for dose calculation in routine clinical use. The algorithm does not report dose to water, on which most clinical experience is based; rather, it reports dose to medium directly. In the presence of inhomogeneity such as bone, the dose discrepancy can be as high as 10% or even more, depending on the location of the normalization point or volume. As radiation oncology is an empirical science, care must be taken before using EMC-reported monitor units for clinical use.

  9. Fast 3D dosimetric verifications based on an electronic portal imaging device using a GPU calculation engine.

    PubMed

    Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei

    2015-04-11

    To use a graphics processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to those of Monte Carlo simulations. The planned and EPID-based reconstructed dose distributions in overridden-to-water phantoms and the original patients were compared for 6 MV and 10 MV photon beams in intensity-modulated radiation therapy (IMRT) treatment plans based on dose differences and gamma analysis. The total single-field dose computation time was less than 8 s, and the gamma evaluation for a 0.1-cm grid resolution was completed in approximately 1 s. The results of the GPU-based CCCS algorithm exhibited good agreement with those of the Monte Carlo simulations. The gamma analysis indicated good agreement between the planned and reconstructed dose distributions for the treatment plans. For the target volume, the differences in the mean dose were less than 1.8%, and the differences in the maximum dose were less than 2.5%. For the critical organs, minor differences were observed between the reconstructed and planned doses. The GPU calculation engine was used to boost the speed of 3D dose and gamma evaluation calculations, thus offering the possibility of true real-time 3D dosimetric verification.
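The gamma evaluation step can be illustrated with a brute-force 1D sketch in Python. This is a generic global gamma (3%/3 mm) with invented profiles, not the paper's GPU implementation:

```python
import numpy as np

def gamma_pass_rate(ref, test, spacing_mm, dd=0.03, dta_mm=3.0):
    """Global 1D gamma pass rate: dose difference normalized to the reference
    maximum, distance-to-agreement in mm, brute-force search over all points."""
    ref = np.asarray(ref, dtype=float)
    test = np.asarray(test, dtype=float)
    x = np.arange(len(ref)) * spacing_mm
    d_max = ref.max()
    n_pass = 0
    for i in range(len(ref)):
        dose_term = ((test - ref[i]) / (dd * d_max)) ** 2
        dist_term = ((x - x[i]) / dta_mm) ** 2
        if np.sqrt(np.min(dose_term + dist_term)) <= 1.0:
            n_pass += 1
    return n_pass / len(ref)

planned = np.linspace(0.0, 1.0, 11)   # planned dose profile (a.u.), 1 mm grid
measured_good = planned * 1.01        # within 1% of planned everywhere
measured_bad = planned + 0.5          # gross offset of 50% of the maximum

rate_good = gamma_pass_rate(planned, measured_good, spacing_mm=1.0)
rate_bad = gamma_pass_rate(planned, measured_bad, spacing_mm=1.0)
```

The GPU speedup reported in the paper comes from evaluating exactly this kind of per-point minimum search in parallel over a full 3D grid.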

  10. A comparison between anisotropic analytical and multigrid superposition dose calculation algorithms in radiotherapy treatment planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Vincent W.C., E-mail: htvinwu@polyu.edu.hk; Tse, Teddy K.H.; Ho, Cola L.M.

    2013-07-01

    Monte Carlo (MC) simulation is currently the most accurate dose calculation algorithm in radiotherapy planning but requires relatively long processing time. Faster model-based algorithms such as the anisotropic analytical algorithm (AAA) in the Eclipse treatment planning system and multigrid superposition (MGS) in the XiO treatment planning system are 2 commonly used algorithms. This study compared AAA and MGS against MC, as the gold standard, on brain, nasopharynx, lung, and prostate cancer patients. Computed tomography of 6 patients of each cancer type was used. The same hypothetical treatment plan using the same machine and treatment prescription was computed for each case by each planning system using its respective dose calculation algorithm. The doses at reference points including (1) soft tissues only, (2) bones only, (3) air cavities only, (4) the soft tissue–bone boundary (Soft/Bone), (5) the soft tissue–air boundary (Soft/Air), and (6) the bone–air boundary (Bone/Air) were measured and compared using the mean absolute percentage error (MAPE), which was a function of the percentage dose deviations from MC. In addition, the computation time of each treatment plan was recorded and compared. The MAPEs of MGS were significantly lower than those of AAA in all types of cancers (p<0.001). With regard to body density combinations, the MAPE of AAA ranged from 1.8% (soft tissue) to 4.9% (Bone/Air), whereas that of MGS ranged from 1.6% (air cavities) to 2.9% (Soft/Bone). The MAPEs of MGS (2.6%±2.1) were significantly lower than those of AAA (3.7%±2.5) in all tissue density combinations (p<0.001). The mean computation time of AAA for all treatment plans was significantly lower than that of MGS (p<0.001). Both AAA and MGS algorithms demonstrated dose deviations of less than 4.0% in most clinical cases, and their performance was better in homogeneous tissues than at tissue boundaries.
In general, MGS demonstrated relatively smaller dose deviations than AAA but required longer computation time.
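The MAPE metric used for the comparison is straightforward to compute; a minimal Python sketch with invented point doses:

```python
def mape(reference_doses, test_doses):
    """Mean absolute percentage error of test doses against reference doses,
    in percent; the reference here plays the role of the Monte Carlo gold standard."""
    errors = [abs(t - r) / r * 100.0 for r, t in zip(reference_doses, test_doses)]
    return sum(errors) / len(errors)

mc = [2.00, 1.80, 1.50, 1.20]    # Monte Carlo reference point doses (Gy), invented
aaa = [2.04, 1.85, 1.57, 1.26]   # a model-based algorithm at the same points, invented

err = mape(mc, aaa)
```

Grouping the reference points by tissue-density combination before averaging, as the study does, yields per-boundary MAPE values such as the Soft/Bone and Bone/Air figures quoted above.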

  11. SU-E-T-188: Film Dosimetry Verification of Monte Carlo Generated Electron Treatment Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enright, S; Asprinio, A; Lu, L

    2014-06-01

    Purpose: The purpose of this study was to compare dose distributions from film measurements to Monte Carlo generated electron treatment plans. Irradiation with electrons offers the advantages of dose uniformity in the target volume and of minimizing the dose to deeper healthy tissue. Using the Monte Carlo algorithm will improve dose accuracy in regions with heterogeneities and irregular surfaces. Methods: Dose distributions from GafChromic™ EBT3 films were compared to dose distributions from the Electron Monte Carlo algorithm in the Eclipse™ radiotherapy treatment planning system. These measurements were obtained for 6 MeV, 9 MeV, and 12 MeV electrons at two depths. All phantoms studied were imported into Eclipse by CT scan. A 1 cm thick solid water template with holes for bone-like and lung-like plugs was used. Different configurations were used with the different plugs inserted into the holes. Configurations with solid-water plugs stacked on top of one another were also used to create an irregular surface. Results: The dose distributions measured from the film agreed with those from the Electron Monte Carlo treatment plan. The accuracy of the Electron Monte Carlo algorithm was also compared to that of Pencil Beam. Dose distributions from Monte Carlo had much higher pass rates than distributions from Pencil Beam when compared to the film. The pass rate for Monte Carlo was in the 80%–99% range, whereas the pass rate for Pencil Beam was as low as 10.76%. Conclusion: The dose distribution from Monte Carlo agreed with the measured dose from the film. When compared to the Pencil Beam algorithm, pass rates for Monte Carlo were much higher. Monte Carlo should be used over Pencil Beam for regions with heterogeneities and irregular surfaces.

  12. A point kernel algorithm for microbeam radiation therapy

    NASA Astrophysics Data System (ADS)

    Debus, Charlotte; Oelfke, Uwe; Bartzsch, Stefan

    2017-11-01

    Microbeam radiation therapy (MRT) is a treatment approach in radiation therapy where the treatment field is spatially fractionated into arrays of a few tens of micrometre wide planar beams of unusually high peak doses separated by low-dose regions several hundred micrometres wide. In preclinical studies, this treatment approach has proven to spare normal tissue more effectively than conventional radiation therapy, while being equally efficient in tumour control. So far, dose calculations in MRT, a prerequisite for future clinical applications, have been based on Monte Carlo simulations. However, they are computationally expensive, since scoring volumes have to be small. In this article a kernel-based dose calculation algorithm is presented that splits the calculation into photon- and electron-mediated energy transport, and performs the calculation of peak and valley doses in typical MRT treatment fields within a few minutes. Kernels are analytically calculated depending on the energy spectrum and material composition. In various homogeneous materials, peak doses, valley doses, and microbeam profiles are calculated and compared to Monte Carlo simulations. For a microbeam exposure of an anthropomorphic head phantom, calculated dose values are compared to measurements and Monte Carlo calculations. Except for regions close to material interfaces, calculated peak dose values match Monte Carlo results within 4% and valley dose values within 8% deviation. No significant differences are observed between profiles calculated by the kernel algorithm and Monte Carlo simulations. Measurements in the head phantom agree within 4% in the peak and within 10% in the valley region. The presented algorithm is attached to the treatment planning platform VIRTUOS. It was and is used for dose calculations in preclinical and pet-clinical trials at the biomedical beamline ID17 of the European Synchrotron Radiation Facility in Grenoble, France.
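The core of a kernel approach, convolving a spatially fractionated fluence pattern with a scatter kernel to obtain peak and valley doses, can be sketched in 1D Python. The beam geometry and the Gaussian kernel width below are invented and far simpler than the analytically derived, material-dependent kernels the paper describes:

```python
import numpy as np

def microbeam_profile(x_um, beam_width_um, pitch_um, n_beams, sigma_um):
    """Lateral dose profile of a microbeam array: a comb of narrow rectangular
    beams convolved with a Gaussian kernel standing in for lateral scatter."""
    fluence = np.zeros_like(x_um)
    for k in range(n_beams):
        centre = (k - (n_beams - 1) / 2.0) * pitch_um
        fluence[np.abs(x_um - centre) <= beam_width_um / 2.0] = 1.0
    kx = np.arange(-5 * sigma_um, 5 * sigma_um + 1)
    kernel = np.exp(-0.5 * (kx / sigma_um) ** 2)
    kernel /= kernel.sum()
    return np.convolve(fluence, kernel, mode="same")

x = np.arange(-600.0, 601.0)      # 1 um lateral grid
dose = microbeam_profile(x, beam_width_um=50, pitch_um=400, n_beams=3, sigma_um=10)
peak = dose[len(x) // 2]          # centre of the middle microbeam
valley = dose[len(x) // 2 + 200]  # midway between neighbouring microbeams
```

Even this toy model reproduces the defining feature of MRT dose distributions: a large peak-to-valley dose ratio set by the beam width, pitch, and lateral scatter range.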

  13. A Pharmacogenetics-Based Warfarin Maintenance Dosing Algorithm from Northern Chinese Patients

    PubMed Central

    Luo, Fang; Wang, Jin'e; Shi, Yi; Tan, Yu; Chen, Qianlong; Zhang, Yu; Hui, Rutai; Wang, Yibo

    2014-01-01

    Inconsistent associations with warfarin dose have been observed for genetic variants other than the VKORC1 haplotype and CYP2C9*3 in Chinese people, and few studies on warfarin dosing algorithms have been performed in a large Chinese Han population living in northern China. Of 787 consenting patients with heart-valve replacements who were receiving long-term warfarin maintenance therapy, 20 related single nucleotide polymorphisms (SNPs) were genotyped. Only VKORC1 and CYP2C9 SNPs were observed to be significantly associated with warfarin dose. In the derivation cohort (n = 551), warfarin dose variability was influenced, in decreasing order, by VKORC1 rs7294 (27.3%), CYP2C9*3 (7.0%), body surface area (4.2%), age (2.7%), target INR (1.4%), CYP4F2 rs2108622 (0.7%), amiodarone use (0.6%), diabetes mellitus (0.6%), and digoxin use (0.5%), which together account for 45.1% of the warfarin dose variability. In the validation cohort (n = 236), the actual maintenance dose was significantly correlated with the predicted dose (r = 0.609, P<0.001). Our algorithm could improve the personalized management of warfarin use in Northern Chinese patients. PMID:25126975
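A dosing algorithm of this form is typically a (log-)linear regression over genotype and clinical covariates. The Python sketch below uses invented coefficients purely to show the structure; they are NOT the fitted values from this cohort, and the sign assumed for the VKORC1 allele is an assumption:

```python
import math

def predicted_daily_dose_mg(vkorc1_rs7294_alleles, cyp2c9_star3_alleles,
                            bsa_m2, age_years, target_inr):
    """Toy log-linear maintenance-dose model illustrating the structure of a
    pharmacogenetic dosing algorithm; all coefficients are invented."""
    log_dose = (0.40
                + 0.30 * vkorc1_rs7294_alleles   # assumed dose-raising allele count
                - 0.45 * cyp2c9_star3_alleles    # loss-of-function allele count
                + 0.35 * bsa_m2                  # body surface area, m^2
                - 0.006 * age_years
                + 0.20 * target_inr)
    return math.exp(log_dose)

typical = predicted_daily_dose_mg(0, 0, bsa_m2=1.7, age_years=60, target_inr=2.5)
poor_metabolizer = predicted_daily_dose_mg(0, 2, bsa_m2=1.7, age_years=60, target_inr=2.5)
older = predicted_daily_dose_mg(0, 0, bsa_m2=1.7, age_years=80, target_inr=2.5)
```

Fitting such a model on a derivation cohort and correlating predicted against actual maintenance doses in a held-out validation cohort is exactly the design described above.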

  14. SU-E-T-397: Evaluation of Planned Dose Distributions by Monte Carlo (0.5%) and Ray Tracing Algorithm for the Spinal Tumors with CyberKnife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, H; Brindle, J; Hepel, J

    2015-06-15

    Purpose: To analyze and evaluate the dose distributions calculated by the Ray Tracing (RT) and Monte Carlo (MC) algorithms, the latter at 0.5% uncertainty, for a critical structure (the spinal cord), the gross target volume, and the planning target volume. Methods: Twenty-four spinal tumor patients were treated with stereotactic body radiotherapy (SBRT) by CyberKnife in 2013 and 2014. The MC algorithm with 0.5% uncertainty was used to recalculate the dose distribution for the treatment plan of each patient using the same beams, beam directions, and monitor units (MUs). Results: The prescription doses are uniformly larger for MC plans than RT except in one case. Dose differences of up to a factor of 1.19 for the 0.25 cc threshold volume and 1.14 for the 1.2 cc threshold volume are observed for the spinal cord. Conclusion: The MC recalculated dose distributions are larger than the original RT calculations for the spinal tumor cases. Based on the accuracy of the MC calculations, more radiation dose might be delivered to the tumor targets and spinal cords with the increased prescription doses.

  15. SU-E-T-268: Differences in Treatment Plan Quality and Delivery Between Two Commercial Treatment Planning Systems for Volumetric Arc-Based Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, S; Zhang, H; Zhang, B

    2015-06-15

    Purpose: To clinically evaluate the differences in volumetric modulated arc therapy (VMAT) treatment plan and delivery between two commercial treatment planning systems. Methods: Two commercial VMAT treatment planning systems with different VMAT optimization algorithms and delivery approaches were evaluated. This study included 16 clinical VMAT plans performed with the first system: 2 spine, 4 head and neck (HN), 2 brain, 4 pancreas, and 4 pelvis plans. These 16 plans were then re-optimized with the same number of arcs using the second treatment planning system. Planning goals were invariant between the two systems. Gantry speed, dose rate modulation, MLC modulation, plan quality, number of monitor units (MUs), VMAT quality assurance (QA) results, and treatment delivery time were compared between the 2 systems. VMAT QA results were performed using Mapcheck2 and analyzed with gamma analysis (3mm/3% and 2mm/2%). Results: Similar plan quality was achieved with each VMAT optimization algorithm, and the difference in delivery time was minimal. Algorithm 1 achieved planning goals by highly modulating the MLC (total distance traveled by leaves (TL) = 193 cm average over control points per plan), while maintaining a relatively constant dose rate (dose-rate change <100 MU/min). Algorithm 2 involved less MLC modulation (TL = 143 cm per plan), but greater dose-rate modulation (range = 0-600 MU/min). The average number of MUs was 20% less for algorithm 2 (ratio of MUs for algorithms 2 and 1 ranged from 0.5-1). VMAT QA results were similar for all disease sites except HN plans. For HN plans, the average gamma passing rates were 88.5% (2mm/2%) and 96.9% (3mm/3%) for algorithm 1 and 97.9% (2mm/2%) and 99.6% (3mm/3%) for algorithm 2. Conclusion: Both VMAT optimization algorithms achieved comparable plan quality; however, fewer MUs were needed and QA results were more robust for Algorithm 2, which more highly modulated dose rate.

  16. Improving the estimation of mealtime insulin dose in adults with type 1 diabetes: the Normal Insulin Demand for Dose Adjustment (NIDDA) study.

    PubMed

    Bao, Jiansong; Gilbertson, Heather R; Gray, Robyn; Munns, Diane; Howard, Gabrielle; Petocz, Peter; Colagiuri, Stephen; Brand-Miller, Jennie C

    2011-10-01

    Although carbohydrate counting is routine practice in type 1 diabetes, hyperglycemic episodes are common. A food insulin index (FII) has been developed and validated for predicting the normal insulin demand generated by mixed meals in healthy adults. We sought to compare a novel algorithm on the basis of the FII for estimating mealtime insulin dose with carbohydrate counting in adults with type 1 diabetes. A total of 28 patients using insulin pump therapy consumed two different breakfast meals of equal energy, glycemic index, fiber, and calculated insulin demand (both FII = 60) but approximately twofold difference in carbohydrate content, in random order on three consecutive mornings. On one occasion, a carbohydrate-counting algorithm was applied to meal A (75 g carbohydrate) for determining bolus insulin dose. On the other two occasions, carbohydrate counting (about half the insulin dose as meal A) and the FII algorithm (same dose as meal A) were applied to meal B (41 g carbohydrate). A real-time continuous glucose monitor was used to assess 3-h postprandial glycemia. Compared with carbohydrate counting, the FII algorithm significantly decreased glucose incremental area under the curve over 3 h (-52%, P = 0.013) and peak glucose excursion (-41%, P = 0.01) and improved the percentage of time within the normal blood glucose range (4-10 mmol/L) (31%, P = 0.001). There was no significant difference in the occurrence of hypoglycemia. An insulin algorithm based on physiological insulin demand evoked by foods in healthy subjects may be a useful tool for estimating mealtime insulin dose in patients with type 1 diabetes.
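The contrast between the two dosing rules in this study can be sketched in Python. The insulin-to-carbohydrate ratio is an assumed value, and the FII scaling shown is an illustrative formulation of the idea rather than the study's exact algorithm:

```python
def carb_count_dose(carb_g, insulin_to_carb_ratio):
    """Conventional carbohydrate counting: grams of carbohydrate per unit of insulin."""
    return carb_g / insulin_to_carb_ratio

def fii_dose(meal_fii, reference_fii, reference_dose_units):
    """Food-insulin-index dosing: scale a reference dose by relative insulin
    demand rather than by carbohydrate content (illustrative formulation)."""
    return reference_dose_units * meal_fii / reference_fii

# The study's meals: equal FII (60) but roughly twofold different carbohydrate
meal_a_carb, meal_b_carb = 75.0, 41.0
icr = 10.0                                       # assumed g of carbohydrate per unit

dose_a = carb_count_dose(meal_a_carb, icr)       # bolus for meal A
dose_b_carb = carb_count_dose(meal_b_carb, icr)  # roughly half the meal A dose
dose_b_fii = fii_dose(60.0, 60.0, dose_a)        # same dose as meal A
```

This mirrors the study design: carbohydrate counting gives meal B about half the insulin of meal A, while the FII rule gives both meals the same dose because their insulin demand is equal.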

  17. Improvements in pencil beam scanning proton therapy dose calculation accuracy in brain tumor cases with a commercial Monte Carlo algorithm.

    PubMed

    Widesott, Lamberto; Lorentini, Stefano; Fracchiolla, Francesco; Farace, Paolo; Schwarz, Marco

    2018-05-04

    Purpose: To validate a commercial Monte Carlo (MC) algorithm (RayStation ver6.0.024) for the treatment of brain tumours with pencil beam scanning (PBS) proton therapy, comparing it via measurements and analytical calculations in clinically realistic scenarios. Methods: For the measurements, a 2D ion chamber array detector (MatriXX PT) was placed underneath the following targets: 1) an anthropomorphic head phantom (with two different thicknesses) and 2) a biological sample (i.e. half a lamb's head). In addition, we compared the MC dose engine vs. the RayStation pencil beam (PB) algorithm clinically implemented so far, in critical conditions such as superficial targets (i.e. in need of a range shifter), different air gaps, and gantry angles simulating both orthogonal and tangential beam arrangements. For every plan, the PB and MC dose calculations were compared to measurements using a gamma analysis metric (3%, 3 mm). Results: Regarding the head phantom, the gamma passing rate (GPR) was always >96% and on average >99% for the MC algorithm; the PB algorithm had a GPR ≤90% for all the delivery configurations with a single slab (apart from a 95% GPR at gantry 0° and small air gap), and in the case of two slabs of the head phantom the GPR was >95% only for small air gaps for all three (0°, 45°, and 70°) simulated beam gantry angles. Overall, the PB algorithm tends to overestimate the dose to the target (up to 25%) and underestimate the dose to the organs at risk (up to 30%). We found similar results (though slightly worse for the PB algorithm) for the two targets of the lamb's head, where only two beam gantry angles were simulated. Conclusions: Our results suggest that in PBS proton therapy a range shifter (RS) needs to be used with extreme caution when planning the treatment with an analytical algorithm, due to potentially large discrepancies between the planned dose and the dose delivered to the patient, also in the case of brain tumours where this issue could be underestimated.
Our results also suggest that an MC evaluation of the dose should be performed every time the RS is used and, especially, when it is used with large air gaps and beam directions tangential to the patient surface. © 2018 Institute of Physics and Engineering in Medicine.

  18. Concept development of X-ray mass thickness detection for irradiated items upon electron beam irradiation processing

    NASA Astrophysics Data System (ADS)

    Qin, Huaili; Yang, Guang; Kuang, Shan; Wang, Qiang; Liu, Jingjing; Zhang, Xiaomin; Li, Cancan; Han, Zhiwei; Li, Yuanjing

    2018-02-01

    The present project adopts the principle and technology of X-ray imaging to quickly measure the mass thickness of irradiated items (where mass thickness of the item = density of the item × thickness of the item) and thus to determine whether the packaging size and the location of items inside will meet the thickness requirements for electron beam irradiation processing. The development of the X-ray mass thickness detection algorithm, as well as the prediction of dose distribution, has been completed. The development of the algorithm was based on X-ray attenuation. Four standard modules, an Al sheet, Al ladders, a PMMA sheet, and PMMA ladders, were selected for the algorithm development. The algorithm was optimized until the error between the tested mass thickness and the standard mass thickness was less than 5%. The dose distribution at all energies (1-10 MeV) for each mass thickness was obtained using the Monte Carlo method and used for the analysis of dose distribution, which provides information on whether the item will be penetrated or not, as well as the maximum dose, minimum dose, and dose uniformity ratio (DUR) of the whole item.
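The detection principle rests on X-ray attenuation. A minimal Python sketch inverts the Beer-Lambert law for mass thickness; the effective mass attenuation coefficient and intensities are assumed values, not calibration data from the project:

```python
import math

def mass_thickness_g_cm2(transmitted, incident, mu_over_rho):
    """Invert the Beer-Lambert law I = I0 * exp(-(mu/rho) * x) for the mass
    thickness x in g/cm^2, given transmitted and incident intensities and
    the mass attenuation coefficient mu/rho in cm^2/g."""
    return math.log(incident / transmitted) / mu_over_rho

mu_rho = 0.2                  # cm^2/g, assumed effective value at the mean beam energy
incident = 1000.0             # open-field detector signal, arbitrary units
transmitted = incident * math.exp(-mu_rho * 2.5)   # a 2.5 g/cm^2 calibration step
x = mass_thickness_g_cm2(transmitted, incident, mu_rho)
```

In practice the Al and PMMA ladder modules described above serve to calibrate this relationship across the detector's energy spectrum, keeping the recovered mass thickness within the stated 5% error bound.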

  19. Changes in prescribed doses for the Seattle neutron therapy system

    NASA Astrophysics Data System (ADS)

    Popescu, A.

    2008-06-01

    From the beginning of the neutron therapy program at the University of Washington Medical Center, the neutron dose distribution in tissue has been calculated using an in-house treatment planning system called PRISM. In order to increase the accuracy of the absorbed dose calculations, two main improvements were made to the PRISM treatment planning system: (a) the algorithm was changed by the addition of an analytical expression, developed at UWMC, for the dependence of the central axis wedge factor on field size and depth. Older versions of the treatment-planning algorithm used a constant central axis wedge factor; (b) a complete, newly commissioned set of measured data was introduced in the latest version of PRISM. The new version of the PRISM algorithm allowed for the use of wedge profiles measured at different depths instead of one wedge profile measured at one depth. The comparison of the absorbed dose calculations using the old and the improved algorithm showed discrepancies, mainly due to the missing dependence of the central axis wedge factor on field size and depth and due to the absence of wedge profiles at depths other than 10 cm. This study concludes that the previously reported prescribed doses for neutron therapy should be changed.
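    The UWMC analytical expression itself is not given in the abstract; as a purely hypothetical illustration of the improvement, the sketch below replaces a constant central axis wedge factor with one that varies linearly with field size and depth (all coefficient values are made up).

```python
# Hypothetical illustration only -- the UWMC expression is not given in
# the abstract. A central-axis wedge factor modelled as varying linearly
# with field size A (cm) and depth d (cm), instead of staying constant.
def wedge_factor(A, d, wf0=0.70, ka=0.002, kd=0.003):
    """wf0, ka, kd are made-up fit coefficients for illustration only."""
    return wf0 + ka * (A - 10.0) + kd * (d - 10.0)
```

    A constant-factor algorithm corresponds to ka = kd = 0; commissioning measurements at several field sizes and depths would supply the actual fit.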

  20. The dosimetric effects of tissue heterogeneities in intensity-modulated radiation therapy (IMRT) of the head and neck

    NASA Astrophysics Data System (ADS)

    Al-Hallaq, H. A.; Reft, C. S.; Roeske, J. C.

    2006-03-01

    The dosimetric effects of bone and air heterogeneities in head and neck IMRT treatments were quantified. An anthropomorphic RANDO phantom was CT-scanned with 16 thermoluminescent dosimeter (TLD) chips placed in and around the target volume. A standard IMRT plan generated with CORVUS was used to irradiate the phantom five times. On average, measured dose was 5.1% higher than calculated dose. Measurements were higher by 7.1% near the heterogeneities and by 2.6% in tissue. The dose difference between measurement and calculation was outside the 95% measurement confidence interval for six TLDs. Using CORVUS' heterogeneity correction algorithm, the average difference between measured and calculated doses decreased by 1.8% near the heterogeneities and by 0.7% in tissue. Furthermore, dose differences lying outside the 95% confidence interval were eliminated for five of the six TLDs. TLD doses recalculated by Pinnacle3's convolution/superposition algorithm were consistently higher than CORVUS doses, a trend that matched our measured results. These results indicate that the dosimetric effects of air cavities are larger than those of bone heterogeneities, thereby leading to a higher delivered dose compared to CORVUS calculations. More sophisticated algorithms such as convolution/superposition or Monte Carlo should be used for accurate tailoring of IMRT dose in head and neck tumours.
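    The acceptance test implied above (whether a measured-to-calculated dose difference lies outside the 95% measurement confidence interval) can be sketched as follows, assuming a normal-distribution interval of ±1.96σ around the measurement.

```python
import numpy as np

# Sketch of the acceptance test implied above (assumed form): flag TLD
# positions where the measured vs calculated dose difference falls
# outside the 95% confidence interval of the TLD measurement.
def outside_95ci(measured, calculated, sigma):
    """Boolean mask; 1.96*sigma approximates the 95% CI half-width."""
    return np.abs(measured - calculated) > 1.96 * sigma
```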

  1. Inter-patient image registration algorithms to disentangle regional dose bioeffects.

    PubMed

    Monti, Serena; Pacelli, Roberto; Cella, Laura; Palma, Giuseppe

    2018-03-20

    Radiation therapy (RT) technological advances call for a comprehensive reconsideration of the definition of the dose features leading to radiation induced morbidity (RIM). In this context, the voxel-based approach (VBA) to dose distribution analysis in RT offers a radically new philosophy to evaluate local dose response patterns, as an alternative to dose-volume histograms for identifying dose sensitive regions of normal tissue. The VBA relies on mapping patient dose distributions into a single reference case anatomy which serves as an anchor for local dosimetric evaluations. The inter-patient elastic image registrations (EIRs) of the planning CTs provide the deformation fields necessary for the actual warp of dose distributions. In this study we assessed the impact of the EIR on the VBA results in thoracic patients by applying two state-of-the-art EIR algorithms (Demons and B-spline). Our analysis demonstrated that both EIR algorithms may be successfully used to highlight subregions with dose differences associated with RIM, and that these subregions substantially overlap. Furthermore, the inclusion, for the first time, of covariates within a dosimetric statistical model that accounts for the multiple comparison problem expands the potential of the VBA, thus paving the way to a reliable voxel-based analysis of RIM in datasets with strong correlation of the outcome with non-dosimetric variables.
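    The dose-warping step at the heart of the VBA can be sketched as follows; this is a minimal nearest-neighbour version driven by an arbitrary deformation field, not the output of the Demons or B-spline EIR.

```python
import numpy as np

# Minimal sketch of dose warping: a deformation field from an
# inter-patient registration gives, for each voxel of the reference
# anatomy, the position in the source patient at which the dose is
# sampled. Nearest-neighbour sampling keeps the sketch dependency-free.
def warp_dose(dose, deformation):
    """deformation: array (2, H, W) of (row, col) source coordinates."""
    r = np.clip(np.rint(deformation[0]).astype(int), 0, dose.shape[0] - 1)
    c = np.clip(np.rint(deformation[1]).astype(int), 0, dose.shape[1] - 1)
    return dose[r, c]
```

    In practice the deformation field comes from the EIR of the planning CTs and the dose is sampled with higher-order interpolation.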

  2. A dose error evaluation study for 4D dose calculations

    NASA Astrophysics Data System (ADS)

    Milz, Stefan; Wilkens, Jan J.; Ullrich, Wolfgang

    2014-10-01

    Previous studies have shown that respiration induced motion is not negligible for Stereotactic Body Radiation Therapy. The intrafractional breathing induced motion influences the delivered dose distribution on the underlying patient geometry such as the lung or the abdomen. If a static geometry is used, a planning process for these indications does not represent the entire dynamic process. The quality of a full 4D dose calculation approach depends on the dose coordinate transformation process between deformable geometries. This article provides an evaluation study that introduces an advanced method to verify the quality of numerical dose transformation generated by four different algorithms. The transformation metric used is based on the deviation of the dose mass histogram (DMH) and the mean dose throughout dose transformation. The study compares the results of four algorithms. In general, two elementary approaches are used: dose mapping and energy transformation. Dose interpolation (DIM) and an advanced concept, the so-called divergent dose mapping model (dDMM), are used for dose mapping. The algorithms are compared to the basic energy transformation model (bETM) and the energy mass congruent mapping (EMCM). For evaluation, 900 small sample regions of interest (ROI) are generated inside an exemplary lung geometry (4DCT). A homogeneous fluence distribution is assumed for dose calculation inside the ROIs. The dose transformations are performed with the four different algorithms. The study investigates the DMH metric and the mean dose metric for different scenarios (voxel sizes: 8 mm, 4 mm, 2 mm, 1 mm; 9 different breathing phases). dDMM achieves the best transformation accuracy in all measured test cases, with 3-5% lower errors than the other models. The results of dDMM are reasonable and most efficient in this study, although the model is simple and easy to implement.
The EMCM model also achieved suitable results, but the approach requires a more complex programming structure. The study discloses disadvantages for the bETM and for the DIM. DIM yielded insufficient results for large voxel sizes, while bETM is prone to errors for small voxel sizes.
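    The two elementary approaches compared above can be caricatured for voxels that map one-to-one (real algorithms also handle voxel splitting and partial overlaps): dose mapping copies dose values between geometries, whereas energy transformation moves the energy E = D·m and re-divides by the voxel mass in the target geometry.

```python
import numpy as np

# Caricature of the two elementary 4D dose-transformation approaches
# above, for voxels mapping one-to-one between breathing phases.
def map_dose(dose, mapping):
    """Dose mapping: copy dose values along the voxel correspondence."""
    return dose[mapping]

def transform_energy(dose, mass_src, mass_dst, mapping):
    """Energy transformation: move E = D*m, then divide by target mass."""
    energy = dose * mass_src           # energy per voxel in source phase
    return energy[mapping] / mass_dst  # back to dose in target phase
```

    With equal voxel masses the two agree; when the mapped voxel mass differs (e.g. lung compression), only the energy-based result conserves total deposited energy.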

  3. A dose error evaluation study for 4D dose calculations.

    PubMed

    Milz, Stefan; Wilkens, Jan J; Ullrich, Wolfgang

    2014-11-07

    Previous studies have shown that respiration induced motion is not negligible for Stereotactic Body Radiation Therapy. The intrafractional breathing induced motion influences the delivered dose distribution on the underlying patient geometry such as the lung or the abdomen. If a static geometry is used, a planning process for these indications does not represent the entire dynamic process. The quality of a full 4D dose calculation approach depends on the dose coordinate transformation process between deformable geometries. This article provides an evaluation study that introduces an advanced method to verify the quality of numerical dose transformation generated by four different algorithms. The transformation metric used is based on the deviation of the dose mass histogram (DMH) and the mean dose throughout dose transformation. The study compares the results of four algorithms. In general, two elementary approaches are used: dose mapping and energy transformation. Dose interpolation (DIM) and an advanced concept, the so-called divergent dose mapping model (dDMM), are used for dose mapping. The algorithms are compared to the basic energy transformation model (bETM) and the energy mass congruent mapping (EMCM). For evaluation, 900 small sample regions of interest (ROI) are generated inside an exemplary lung geometry (4DCT). A homogeneous fluence distribution is assumed for dose calculation inside the ROIs. The dose transformations are performed with the four different algorithms. The study investigates the DMH metric and the mean dose metric for different scenarios (voxel sizes: 8 mm, 4 mm, 2 mm, 1 mm; 9 different breathing phases). dDMM achieves the best transformation accuracy in all measured test cases, with 3-5% lower errors than the other models. The results of dDMM are reasonable and most efficient in this study, although the model is simple and easy to implement.
The EMCM model also achieved suitable results, but the approach requires a more complex programming structure. The study discloses disadvantages for the bETM and for the DIM. DIM yielded insufficient results for large voxel sizes, while bETM is prone to errors for small voxel sizes.

  4. Clinical implementation and evaluation of the Acuros dose calculation algorithm.

    PubMed

    Yan, Chenyu; Combine, Anthony G; Bednarz, Greg; Lalonde, Ronald J; Hu, Bin; Dickens, Kathy; Wynn, Raymond; Pavord, Daniel C; Saiful Huq, M

    2017-09-01

    The main aim of this study is to validate the Acuros XB dose calculation algorithm for a Varian Clinac iX linac in our clinics, and subsequently compare it with the widely used AAA algorithm. The source models for both Acuros XB and AAA were configured by importing the same measured beam data into the Eclipse treatment planning system. Both algorithms were validated by comparing calculated dose with measured dose on a homogeneous water phantom for field sizes ranging from 6 cm × 6 cm to 40 cm × 40 cm. Central axis and off-axis points at different depths were chosen for the comparison. In addition, the accuracy of Acuros was evaluated for wedge fields with wedge angles from 15 to 60°. Similarly, variable field sizes for an inhomogeneous phantom were chosen to validate the Acuros algorithm. In addition, doses calculated by Acuros and AAA at the center of lung-equivalent tissue from three different VMAT plans were compared to ion chamber measured doses in a QUASAR phantom, and the dose distributions calculated by the two algorithms, and their differences, were compared on patient plans. Computation time for VMAT plans was also evaluated for Acuros and AAA. Differences between dose-to-water (calculated by AAA and Acuros XB) and dose-to-medium (calculated by Acuros XB) on patient plans were compared and evaluated. For open 6 MV photon beams on the homogeneous water phantom, both Acuros XB and AAA calculations were within 1% of measurements. For 23 MV photon beams, the calculated doses were within 1.5% of measured doses for Acuros XB and 2% for AAA. Testing on the inhomogeneous phantom demonstrated that AAA overestimated doses by up to 8.96% at a point close to the lung/solid water interface, while Acuros XB reduced that to 1.64%. The test on the QUASAR phantom showed that Acuros achieved better agreement in lung-equivalent tissue, while AAA underestimated dose for all VMAT plans by up to 2.7%.
Acuros XB computation was about three times faster than AAA for VMAT plans; computation times for other plans are also discussed. The maximum difference between the dose calculated by AAA and the dose-to-medium calculated by Acuros XB (Acuros_Dm,m) was 4.3% on patient plans at the isocenter, and the maximum difference between D100 calculated by AAA and by Acuros_Dm,m was 11.3%. When calculating the maximum dose to the spinal cord on patient plans, differences between the dose calculated by AAA and Acuros_Dm,m were more than 3%. Compared with AAA, Acuros XB improves accuracy in the presence of inhomogeneity, and also significantly reduces computation time for VMAT plans. Dose differences between AAA and Acuros_Dw,m were generally less than the dose differences between AAA and Acuros_Dm,m. Clinical practitioners should consider making Acuros XB available in clinics; however, further investigation and clarification is needed about which dose reporting mode (dose-to-water or dose-to-medium) should be used in clinics. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, R; Popple, R; Benhabib, S

    Purpose: To evaluate the accuracy of electron dose distributions calculated by the Varian Eclipse electron Monte Carlo (eMC) algorithm for use with recently commercially available bolus electron conformal therapy (ECT). Methods: eMC-calculated electron dose distributions for bolus ECT have been compared to those previously measured for cylindrical phantoms (retromolar trigone and nose), whose axial cross sections were based on the mid-PTV CT anatomy for each site. The phantoms consisted of SR4 muscle substitute, SR4 bone substitute, and air. The bolus ECT treatment plans were imported into the Eclipse treatment planning system and calculated using the maximum allowable histories (2×10⁹), resulting in a statistical error of <0.2%. Smoothing was not used for these calculations. Differences between eMC-calculated and measured dose distributions were evaluated in terms of absolute dose difference as well as distance to agreement (DTA). Results: Results from the eMC for the retromolar trigone phantom showed 89% (41/46) of dose points within 3% dose difference or 3 mm DTA. There was an average dose difference of −0.12% with a standard deviation of 2.56%. Results for the nose phantom showed 95% (54/57) of dose points within 3% dose difference or 3 mm DTA. There was an average dose difference of 1.12% with a standard deviation of 3.03%. Dose calculation times for the retromolar trigone and nose treatment plans were 15 min and 22 min, respectively, using 16 processors (Intel Xeon E5-2690, 2.9 GHz) on a Varian Eclipse framework agent server (FAS). Results of this study were consistent with those previously reported for the accuracy of the eMC electron dose algorithm and for the .decimal, Inc. pencil beam redefinition algorithm used to plan the bolus. Conclusion: These results show that the accuracy of the Eclipse eMC algorithm is suitable for clinical implementation of bolus ECT.

  6. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    NASA Astrophysics Data System (ADS)

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J.

    2010-08-01

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). 
Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC-based SBRT treatment planning in the routine clinical setting.

  7. A deep convolutional neural network using directional wavelets for low-dose X-ray CT reconstruction.

    PubMed

    Kang, Eunhee; Min, Junhong; Ye, Jong Chul

    2017-10-01

    Due to the potential risk of inducing cancer, radiation exposure by X-ray CT devices should be reduced for routine patient scanning. However, in low-dose X-ray CT, severe artifacts typically occur due to photon starvation, beam hardening, and other causes, all of which decrease the reliability of the diagnosis. Thus, a high-quality reconstruction method from low-dose X-ray CT data has become a major research topic in the CT community. Conventional model-based de-noising approaches are, however, computationally very expensive, and image-domain de-noising approaches cannot readily remove CT-specific noise patterns. To tackle these problems, we sought to develop a new low-dose X-ray CT algorithm based on a deep-learning approach. We propose an algorithm which uses a deep convolutional neural network (CNN) applied to the wavelet transform coefficients of low-dose CT images. More specifically, by using a directional wavelet transform to extract the directional components of artifacts and exploit the intra- and inter-band correlations, our deep network can effectively suppress CT-specific noise. In addition, our CNN is designed with a residual learning architecture for faster network training and better performance. Experimental results confirm that the proposed algorithm effectively removes complex noise patterns from CT images derived from a reduced X-ray dose. In addition, we show that the wavelet-domain CNN is efficient when used to remove noise from low-dose CT compared to existing approaches. Our results were rigorously evaluated by several radiologists at the Mayo Clinic and won second place at the 2016 "Low-Dose CT Grand Challenge." To the best of our knowledge, this work is the first deep-learning architecture for low-dose CT reconstruction which has been rigorously evaluated and proven to be effective.
In addition, the proposed algorithm, in contrast to existing model-based iterative reconstruction (MBIR) methods, has considerable potential to benefit from large data sets. Therefore, we believe that the proposed algorithm opens a new direction in the area of low-dose CT research. © 2017 American Association of Physicists in Medicine.
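    The wavelet-domain decomposition exploited above can be illustrated with a single-level 2-D Haar transform, which splits an image into an approximation band plus horizontal, vertical and diagonal detail bands; this toy transform stands in for the paper's directional wavelets, and the CNN itself is omitted.

```python
import numpy as np

# Toy single-level 2-D Haar transform (stand-in for the directional
# wavelets above): detail bands h, v, d isolate oriented structure,
# where streak-like CT noise concentrates and can be attenuated.
def haar2d(img):
    a = (img[0::2, 0::2] + img[1::2, 0::2] + img[0::2, 1::2] + img[1::2, 1::2]) / 4
    h = (img[0::2, 0::2] - img[1::2, 0::2] + img[0::2, 1::2] - img[1::2, 1::2]) / 4
    v = (img[0::2, 0::2] + img[1::2, 0::2] - img[0::2, 1::2] - img[1::2, 1::2]) / 4
    d = (img[0::2, 0::2] - img[1::2, 0::2] - img[0::2, 1::2] + img[1::2, 1::2]) / 4
    return a, h, v, d

def ihaar2d(a, h, v, d):
    """Exact inverse of haar2d (perfect reconstruction)."""
    img = np.empty((2 * a.shape[0], 2 * a.shape[1]))
    img[0::2, 0::2] = a + h + v + d
    img[1::2, 0::2] = a - h + v - d
    img[0::2, 1::2] = a + h - v - d
    img[1::2, 1::2] = a - h - v + d
    return img
```

    A denoiser (the paper's residual CNN, or a simple shrinkage rule) would operate on h, v, d before inverting the transform.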

  8. An adaptive algorithm for the detection of microcalcifications in simulated low-dose mammography.

    PubMed

    Treiber, O; Wanninger, F; Führ, H; Panzer, W; Regulla, D; Winkler, G

    2003-02-21

    This paper uses the task of microcalcification detection as a benchmark problem to assess the potential for dose reduction in x-ray mammography. We present the results of a newly developed algorithm for detection of microcalcifications as a case study for a typical commercial film-screen system (Kodak Min-R 2000/2190). The first part of the paper deals with the simulation of dose reduction for film-screen mammography based on a physical model of the imaging process. Use of a more sensitive film-screen system is expected to result in additional smoothing of the image. We introduce two different models of that behaviour, called moderate and strong smoothing. We then present an adaptive, model-based microcalcification detection algorithm. Comparing detection results with ground-truth images obtained under the supervision of an expert radiologist allows us to establish the soundness of the detection algorithm. We measure the performance on the dose-reduced images in order to assess the loss of information due to dose reduction. It turns out that the smoothing behaviour has a strong influence on detection rates. For moderate smoothing, a dose reduction by 25% has no serious influence on the detection results, whereas a dose reduction by 50% already entails a marked deterioration of the performance. Strong smoothing generally leads to an unacceptable loss of image quality. The test results emphasize the impact of the more sensitive film-screen system and its characteristics on the problem of assessing the potential for dose reduction in film-screen mammography. The general approach presented in the paper can be adapted to fully digital mammography.

  9. An adaptive algorithm for the detection of microcalcifications in simulated low-dose mammography

    NASA Astrophysics Data System (ADS)

    Treiber, O.; Wanninger, F.; Führ, H.; Panzer, W.; Regulla, D.; Winkler, G.

    2003-02-01

    This paper uses the task of microcalcification detection as a benchmark problem to assess the potential for dose reduction in x-ray mammography. We present the results of a newly developed algorithm for detection of microcalcifications as a case study for a typical commercial film-screen system (Kodak Min-R 2000/2190). The first part of the paper deals with the simulation of dose reduction for film-screen mammography based on a physical model of the imaging process. Use of a more sensitive film-screen system is expected to result in additional smoothing of the image. We introduce two different models of that behaviour, called moderate and strong smoothing. We then present an adaptive, model-based microcalcification detection algorithm. Comparing detection results with ground-truth images obtained under the supervision of an expert radiologist allows us to establish the soundness of the detection algorithm. We measure the performance on the dose-reduced images in order to assess the loss of information due to dose reduction. It turns out that the smoothing behaviour has a strong influence on detection rates. For moderate smoothing, a dose reduction by 25% has no serious influence on the detection results, whereas a dose reduction by 50% already entails a marked deterioration of the performance. Strong smoothing generally leads to an unacceptable loss of image quality. The test results emphasize the impact of the more sensitive film-screen system and its characteristics on the problem of assessing the potential for dose reduction in film-screen mammography. The general approach presented in the paper can be adapted to fully digital mammography.
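    The dose-reduction simulation idea can be caricatured with a quantum-limited noise model (the paper's simulation additionally models the film-screen response and smoothing): scaling the expected photon count by the dose factor and resampling Poisson noise raises the relative noise by 1/√factor.

```python
import numpy as np

# Rough sketch of simulating dose reduction under an assumed
# quantum-limited model (not the paper's full film-screen model):
# halving the dose halves the expected photon count per pixel, so the
# relative Poisson noise grows by sqrt(2).
def simulate_dose_reduction(counts, factor, rng=None):
    """Scale expected counts by `factor` (<1), resample Poisson noise,
    and rescale back to the original signal level."""
    rng = np.random.default_rng(rng)
    return rng.poisson(counts * factor) / factor
```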

  10. SU-F-J-23: Field-Of-View Expansion in Cone-Beam CT Reconstruction by Use of Prior Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haga, A; Magome, T; Nakano, M

    Purpose: Cone-beam CT (CBCT) has become an integral part of online patient setup in image-guided radiation therapy (IGRT). In addition, the utility of CBCT for dose calculation has actively been investigated. However, the limited size of the field-of-view (FOV) and the resulting CBCT image, which lacks the peripheral area of the patient body, limit the reliability of dose calculation. In this study, we aim to develop an FOV-expanded CBCT in an IGRT system to allow dose calculation. Methods: Three lung cancer patients were selected in this study. We collected the cone-beam projection images in the CBCT-based IGRT system (X-ray volume imaging unit, ELEKTA), where the FOV size of the CBCT provided with these projections was 410 × 410 mm² (normal FOV). Using these projections, a CBCT with a size of 728 × 728 mm² was reconstructed by an a posteriori estimation algorithm including prior image constrained compressed sensing (PICCS). The treatment planning CT was used as the prior image. To assess the effectiveness of FOV expansion, a dose calculation was performed on the expanded CBCT image with a region-of-interest (ROI) density mapping method, and it was compared with that of the treatment planning CT as well as that of a CBCT reconstructed by the filtered back projection (FBP) algorithm. Results: The a posteriori estimation algorithm with PICCS clearly visualized the area outside the normal FOV, whereas the FBP algorithm yielded severe streak artifacts outside the normal FOV due to under-sampling. The dose calculation result using the expanded CBCT agreed with that using the treatment planning CT very well; the maximum dose difference was 1.3% for gross tumor volumes. Conclusion: With an a posteriori estimation algorithm, the FOV in CBCT can be expanded. Dose comparison results suggested that the use of expanded CBCTs is acceptable for dose calculation in adaptive radiation therapy. This study has been supported by KAKENHI (15K08691).
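    The PICCS objective referenced above is conventionally written as follows (our paraphrase of the standard formulation; the study's exact a posteriori estimation algorithm may differ), where x_prior is the treatment planning CT:

```latex
\min_{x}\;\; \alpha\,\mathrm{TV}\!\left(x - x_{\text{prior}}\right)
           + (1-\alpha)\,\mathrm{TV}(x)
\quad \text{subject to} \quad A x = y ,
```

    with A the cone-beam projection operator, y the measured projections, and α trading off fidelity to the prior image against plain total-variation sparsity.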

  11. Photon beam dosimetry with EBT3 film in heterogeneous regions: Application to the evaluation of dose-calculation algorithms

    NASA Astrophysics Data System (ADS)

    Jung, Hyunuk; Kum, Oyeon; Han, Youngyih; Park, Byungdo; Cheong, Kwang-Ho

    2014-12-01

    For a better understanding of the accuracy of state-of-the-art-radiation therapies, 2-dimensional dosimetry in a patient-like environment will be helpful. Therefore, the dosimetry of EBT3 films in non-water-equivalent tissues was investigated, and the accuracy of commercially-used dose-calculation algorithms was evaluated with EBT3 measurement. Dose distributions were measured with EBT3 films for an in-house-designed phantom that contained a lung or a bone substitute, i.e., an air cavity (3 × 3 × 3 cm3) or teflon (2 × 2 × 2 cm3 or 3 × 3 × 3 cm3), respectively. The phantom was irradiated with 6-MV X-rays with field sizes of 2 × 2, 3 × 3, and 5 × 5 cm2. The accuracy of EBT3 dosimetry was evaluated by comparing the measured dose with the dose obtained from Monte Carlo (MC) simulations. The dose to the bone-equivalent material was obtained by multiplying the EBT3 measurements by the stopping power ratio (SPR). The EBT3 measurements were then compared with the predictions from four algorithms: Monte Carlo (MC) in iPlan, acuros XB (AXB), analytical anisotropic algorithm (AAA) in Eclipse, and superposition-convolution (SC) in Pinnacle. For the air cavity, the EBT3 measurements agreed with the MC calculation to within 2% on average. For teflon, the EBT3 measurements differed by 9.297% (±0.9229%) on average from the Monte Carlo calculation before dose conversion, and by 0.717% (±0.6546%) after applying the SPR. The doses calculated by using the MC, AXB, AAA, and SC algorithms for the air cavity differed from the EBT3 measurements on average by 2.174, 2.863, 18.01, and 8.391%, respectively; for teflon, the average differences were 3.447, 4.113, 7.589, and 5.102%. The EBT3 measurements corrected with the SPR agreed within 2% on average with the MC results, both within and beyond the heterogeneities, thereby indicating that EBT3 dosimetry can be used in heterogeneous media. The MC and the AXB dose calculation algorithms exhibited clinically-acceptable accuracy (<5%) in heterogeneities.
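    The SPR correction step described above can be sketched as a per-voxel scaling; the ratio value used below is a placeholder for illustration, not the paper's measured SPR for teflon.

```python
import numpy as np

# Minimal sketch of the SPR conversion step above: the dose in the
# medium relates to the film-measured dose via the medium-to-water
# stopping power ratio. The SPR value used in the test is a placeholder,
# not the paper's measured value.
def dose_in_medium(film_dose, spr_medium_to_water):
    """Convert film-measured dose to dose in the surrounding medium."""
    return np.asarray(film_dose) * spr_medium_to_water
```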

  12. Research on Ratio of Dosage of Drugs in Traditional Chinese Prescriptions by Data Mining.

    PubMed

    Yu, Xing-Wen; Gong, Qing-Yue; Hu, Kong-Fa; Mao, Wen-Jing; Zhang, Wei-Ming

    2017-01-01

    Maximizing the effectiveness of prescriptions and minimizing the adverse effects of drugs is a key component of patient health care. In the practice of traditional Chinese medicine (TCM), it is important to provide clinicians a reference for the dosing of prescribed drugs. The traditional Cheng-Church biclustering algorithm (CC) was optimized, and TCM prescription dose data were analyzed using the optimized algorithm. Based on an analysis of 212 prescriptions related to TCM treatment of kidney diseases, the study generated 87 prescription dose sub-matrices, each representing the reference doses of drugs in different recipes. The optimized CC algorithm can effectively eliminate the interference of zeros in the original dose matrix of TCM prescriptions and avoid zeros appearing in the output sub-matrices. This makes it possible to effectively analyze the reference doses of drugs in different prescriptions related to kidney diseases, so as to provide a valuable reference for clinicians to use drugs rationally.
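    The coherence score at the heart of Cheng-Church biclustering can be sketched as the mean squared residue (MSR) of a candidate sub-matrix; the paper's zero-elimination optimisation is not reproduced here.

```python
import numpy as np

# Sketch of the Cheng-Church mean squared residue (MSR): a low MSR
# means the selected prescriptions (rows) dose the selected drugs
# (columns) in a mutually consistent, additive pattern.
def mean_squared_residue(sub):
    row_mean = sub.mean(axis=1, keepdims=True)
    col_mean = sub.mean(axis=0, keepdims=True)
    residue = sub - row_mean - col_mean + sub.mean()
    return (residue ** 2).mean()
```

    CC greedily deletes rows and columns until the MSR of the remaining sub-matrix falls below a chosen threshold, then reports it as a bicluster.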

  13. CT brush and CancerZap!: two video games for computed tomography dose minimization.

    PubMed

    Alvare, Graham; Gordon, Richard

    2015-05-12

    X-ray dose from computed tomography (CT) scanners has become a significant public health concern. All CT scanners spray x-ray photons across a patient, including those using compressive sensing algorithms. New technologies make it possible to aim x-ray beams where they are most needed to form a diagnostic or screening image. We have designed a computer game, CT Brush, that takes advantage of this new flexibility. It uses a standard MART algorithm (Multiplicative Algebraic Reconstruction Technique), but with a user defined dynamically selected subset of the rays. The image appears as the player moves the CT brush over an initially blank scene, with dose accumulating with every "mouse down" move. The goal is to find the "tumor" with as few moves (least dose) as possible. We have successfully implemented CT Brush in Java and made it available publicly, requesting crowdsourced feedback on improving the open source code. With this experience, we also outline a "shoot 'em up game" CancerZap! for photon limited CT. We anticipate that human computing games like these, analyzed by methods similar to those used to understand eye tracking, will lead to new object dependent CT algorithms that will require significantly less dose than object independent nonlinear and compressive sensing algorithms that depend on sprayed photons. Preliminary results suggest substantial dose reduction is achievable.
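    The ray-subset mechanic described above can be sketched with a standard MART update restricted to the rays the player has "painted"; the system matrix and convergence settings here are illustrative.

```python
import numpy as np

# Minimal MART sketch (Multiplicative Algebraic Reconstruction
# Technique) restricted to a user-chosen subset of rays -- the mechanism
# CT Brush exploits. A is the system matrix (rays x pixels), b the
# measured ray sums; sizes and iteration count are illustrative.
def mart(A, b, ray_subset, n_iter=50, x0=None):
    x = np.ones(A.shape[1]) if x0 is None else x0.copy()
    for _ in range(n_iter):
        for i in ray_subset:          # only the rays the player "painted"
            proj = A[i] @ x
            if proj > 0:
                # multiplicative correction, weighted by ray coverage
                x *= (b[i] / proj) ** A[i]
    return x
```

    Dose accumulates with every ray added to `ray_subset`, so finding the target with the fewest rays corresponds to the least dose.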

  14. Percentage depth dose calculation accuracy of model based algorithms in high energy photon small fields through heterogeneous media and comparison with plastic scintillator dosimetry

    PubMed Central

    Mani, Ganesh Kadirampatti; Karunakaran, Kaviarasu

    2016-01-01

    Fields smaller than 4×4 cm2 are used in stereotactic and conformal treatments, where heterogeneity is normally present. Since dose calculation in both small fields and heterogeneous media often involves larger discrepancies, the algorithms used by treatment planning systems (TPS) should be evaluated for achieving better treatment results. This report aims at evaluating the accuracy of four model‐based algorithms, X‐ray Voxel Monte Carlo (XVMC) from Monaco, Superposition (SP) from CMS‐XiO, and Acuros XB (AXB) and the analytical anisotropic algorithm (AAA) from Eclipse, tested against measurement. Measurements were done using an Exradin W1 plastic scintillator in a Solid Water phantom with heterogeneities such as air, lung, bone, and aluminum, irradiated with 6 and 15 MV photons of square field sizes ranging from 1×1 to 4×4 cm2. Each heterogeneity was introduced individually at two different depths from the depth of dose maximum (Dmax), one setup nearer to and the other farther from Dmax. The central axis percentage depth‐dose (CADD) curve for each setup was measured separately and compared with that calculated by each TPS algorithm for the same setup. The percentage normalized root mean squared deviation (%NRMSD), which represents the deviation of the whole CADD curve from the measured one, was calculated. It was found that for air and lung heterogeneity, for both 6 and 15 MV, all algorithms show maximum deviation for the 1×1 cm2 field size, which gradually reduces as the field size increases, except for AAA. For aluminum and bone, all algorithms' deviations are smaller at 15 MV, irrespective of setup. In all heterogeneity setups, the 1×1 cm2 field showed maximum deviation, except in the 6 MV bone setup. For all algorithms in the study, irrespective of energy and field size, the dose deviation is higher when a heterogeneity is nearer to Dmax than when the same heterogeneity is farther from it. Also, all algorithms show maximum deviation in lower‐density materials compared to high‐density materials.
PACS numbers: 87.53.Bn, 87.53.kn, 87.56.bd, 87.55.Kd, 87.56.jf PMID:26894345
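    The %NRMSD metric used above to score a whole depth-dose curve can be sketched as follows. The exact normalization convention is not spelled out in the abstract; here RMSD is divided by the maximum of the measured curve, which is one common choice and should be treated as an assumption.

```python
import numpy as np

def pct_nrmsd(measured, calculated):
    """Percentage normalized root-mean-squared deviation between a measured
    and a TPS-calculated depth-dose curve.  Normalizing by the measured
    maximum is an assumed convention, not necessarily the paper's."""
    measured = np.asarray(measured, dtype=float)
    calculated = np.asarray(calculated, dtype=float)
    rmsd = np.sqrt(np.mean((calculated - measured) ** 2))
    return 100.0 * rmsd / measured.max()

# Identical curves deviate by 0%.
pdd = np.array([100.0, 95.0, 88.0, 80.0])
print(pct_nrmsd(pdd, pdd))  # -> 0.0
```

    A larger %NRMSD flags a TPS curve that tracks the measurement poorly over its whole length, rather than at a single depth.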

  15. Image reconstruction algorithm for optically stimulated luminescence 2D dosimetry using laser-scanned Al2O3:C and Al2O3:C,Mg films

    NASA Astrophysics Data System (ADS)

    Ahmed, M. F.; Schnell, E.; Ahmad, S.; Yukihara, E. G.

    2016-10-01

    The objective of this work was to develop an image reconstruction algorithm for 2D dosimetry using Al2O3:C and Al2O3:C,Mg optically stimulated luminescence (OSL) films imaged with a laser scanning system. The algorithm takes into account parameters associated with detector properties and the readout system. Pieces of Al2O3:C films (~8 mm  ×  8 mm  ×  125 µm) were irradiated and used to simulate dose distributions with extreme dose gradients (zero and non-zero dose regions). The OSL film pieces were scanned using a custom-built laser-scanning OSL reader, and the data obtained were used to develop and demonstrate a dose reconstruction algorithm. The algorithm includes corrections for: (a) galvo hysteresis, (b) photomultiplier tube (PMT) linearity, (c) phosphorescence, (d) ‘pixel bleeding’ caused by the 35 ms luminescence lifetime of F-centers in Al2O3, (e) geometrical distortion inherent to the galvo scanning system, and (f) position dependence of the light collection efficiency. The algorithm was also applied to 6.0 cm  ×  6.0 cm  ×  125 µm or 10.0 cm  ×  10.0 cm  ×  125 µm Al2O3:C and Al2O3:C,Mg films exposed to megavoltage x-rays (6 MV) and 12C beams (430 MeV u-1). The results obtained using pieces of irradiated films show the ability of the image reconstruction algorithm to correct for pixel bleeding even in the presence of extremely sharp dose gradients. Corrections for geometric distortion and position dependence of light collection efficiency were shown to minimize characteristic limitations of this system design. We also exemplify the application of the algorithm to a more clinically relevant 6 MV x-ray beam and a 12C pencil beam, demonstrating the potential for small field dosimetry. The image reconstruction algorithm described here provides the foundation for laser-scanned OSL applied to 2D dosimetry.
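    The ‘pixel bleeding’ correction in item (d) can be illustrated with a toy model: if the 35 ms F-center luminescence decays as a single exponential and the laser dwells a fixed time per pixel, each measured pixel carries an exponentially decaying tail of the previous pixels, and the correction reduces to inverting a one-tap recursive filter. The dwell time, and the one-tap form itself, are assumptions for this sketch, not the paper's published correction.

```python
import numpy as np

TAU_MS = 35.0      # F-center luminescence lifetime in Al2O3 (from the abstract)
DWELL_MS = 10.0    # assumed per-pixel dwell time of the laser scanner

def bleed(signal, f):
    """Forward model: each pixel inherits a fraction f of the running tail."""
    out = np.zeros_like(signal)
    tail = 0.0
    for i, s in enumerate(signal):
        tail = s + f * tail          # exponential persistence of luminescence
        out[i] = tail
    return out

def unbleed(measured, f):
    """Invert the one-tap IIR: true[i] = measured[i] - f * measured[i-1]."""
    restored = measured.copy()
    restored[1:] -= f * measured[:-1]
    return restored

f = np.exp(-DWELL_MS / TAU_MS)
truth = np.array([0.0, 0.0, 5.0, 5.0, 0.0, 0.0])   # sharp dose edge
print(np.allclose(unbleed(bleed(truth, f), f), truth))  # -> True
```

    Even for the sharp edge in `truth`, the inversion recovers the zero-dose region exactly, which mirrors the paper's observation that bleeding can be corrected in the presence of extreme gradients.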

  16. Low Dose CT Reconstruction via Edge-preserving Total Variation Regularization

    PubMed Central

    Tian, Zhen; Jia, Xun; Yuan, Kehong; Pan, Tinsu; Jiang, Steve B.

    2014-01-01

    High radiation dose in CT scans increases the lifetime risk of cancer and has become a major clinical concern. Recently, iterative reconstruction algorithms with Total Variation (TV) regularization have been developed to reconstruct CT images from highly undersampled data acquired at low mAs levels in order to reduce the imaging dose. Nonetheless, low contrast structures tend to be smoothed out by the TV regularization, posing a great challenge for the TV method. To solve this problem, in this work we develop an iterative CT reconstruction algorithm with edge-preserving TV regularization to reconstruct CT images from highly undersampled data obtained at low mAs levels. The CT image is reconstructed by minimizing an energy consisting of an edge-preserving TV norm and a data fidelity term posed by the x-ray projections. The edge-preserving TV term preferentially performs smoothing only on the non-edge parts of the image in order to better preserve the edges, which is realized by introducing a penalty weight into the original total variation norm. During the reconstruction process, pixels at edges are gradually identified and given small penalty weights. Our iterative algorithm is implemented on GPU to improve its speed. We test our reconstruction algorithm on a digital NCAT phantom, a physical chest phantom, and a Catphan phantom. Reconstruction results from a conventional FBP algorithm and a TV regularization method without the edge-preserving penalty are also presented for comparison purposes. The experimental results illustrate that both the TV-based algorithm and our edge-preserving TV algorithm outperform the conventional FBP algorithm in suppressing streaking artifacts and image noise in the low-dose context. Our edge-preserving algorithm is superior to the TV-based algorithm in that it preserves more information in low contrast structures and therefore maintains acceptable spatial resolution. PMID:21860076
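    The core idea, down-weighting the TV penalty where the gradient looks like an edge, can be sketched in 1D with a denoising surrogate (the paper's fidelity term uses x-ray projections; plain signal fidelity stands in here). The weight formula and all parameter values are illustrative assumptions.

```python
import numpy as np

def edge_preserving_tv_denoise(y, lam=0.3, delta=1.0, iters=500, step=0.1, eps=1e-2):
    """1D sketch of TV denoising with an edge-preserving penalty weight:
    differences that look like edges (|gradient| >> delta) get a small
    weight, so smoothing concentrates on non-edge samples.  The weight
    formula is illustrative, not the paper's exact scheme."""
    x = y.astype(float).copy()
    for _ in range(iters):
        g = np.diff(x)
        w = 1.0 / (1.0 + (g / delta) ** 2)     # small weight at edges
        d = w * g / np.sqrt(g ** 2 + eps)      # gradient of smoothed |g|
        tv_grad = np.zeros_like(x)
        tv_grad[:-1] -= d
        tv_grad[1:] += d
        x -= step * ((x - y) + lam * tv_grad)  # fidelity + weighted TV
    return x

rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])   # one sharp edge
noisy = clean + 0.1 * rng.standard_normal(100)
denoised = edge_preserving_tv_denoise(noisy)
```

    With the weight `w` near 1 on flat regions and well below 1 across the step, noise is suppressed while the edge survives largely unblurred, which is the behavior the paper reports for low-contrast structures.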

  17. SU-F-T-431: Dosimetric Validation of Acuros XB Algorithm for Photon Dose Calculation in Water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, L; Yadav, G; Kishore, V

    2016-06-15

    Purpose: To validate the Acuros XB algorithm implemented in the Eclipse treatment planning system version 11 (Varian Medical Systems, Inc., Palo Alto, CA, USA) for photon dose calculation. Methods: Acuros XB is a linear Boltzmann transport equation (LBTE) solver that solves the LBTE explicitly and gives results equivalent to Monte Carlo. A 6 MV photon beam from a Varian Clinac-iX (2300CD) was used for dosimetric validation of Acuros XB. Percentage depth dose (PDD) and profile (at dmax, 5, 10, 20, and 30 cm) measurements were performed in water for field sizes of 2×2, 4×4, 6×6, 10×10, 20×20, 30×30, and 40×40 cm². Acuros XB results were compared against measurements and the anisotropic analytical algorithm (AAA). Results: Acuros XB shows good agreement with measurements and was comparable to the AAA algorithm. PDDs and profiles differ by less than one percent from measurements, and from the PDDs and profiles calculated by the AAA algorithm, for all field sizes. From TPS-calculated gamma error histograms, the average gamma errors in the PDD curves before and after dmax were 0.28 and 0.15 for Acuros XB and 0.24 and 0.17 for AAA, respectively; the average gamma errors in the profile curves in the central region, penumbra region, and outside-field region were 0.17, 0.21, and 0.42 for Acuros XB and 0.10, 0.22, and 0.35 for AAA, respectively. Conclusion: The dosimetric validation of the Acuros XB algorithm in water was satisfactory. Acuros XB has the potential to perform photon dose calculation with high accuracy, which is desirable in the modern radiotherapy environment.
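    The gamma errors quoted above come from gamma analysis, which combines a dose-difference and a distance-to-agreement criterion. A minimal 1D version of the standard gamma index (global dose difference, normalized to the reference maximum) can be sketched as follows; the specific criteria values are the common 3%/3 mm defaults, not necessarily those used in this record.

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta_mm=3.0, dd_pct=3.0):
    """Gamma index of each reference point against an evaluated curve
    (global dose-difference criterion, normalized to the reference max)."""
    norm = ref_dose.max() * dd_pct / 100.0
    gammas = np.empty(len(ref_pos))
    for i, (p, d) in enumerate(zip(ref_pos, ref_dose)):
        dist2 = ((eval_pos - p) / dta_mm) ** 2
        dose2 = ((eval_dose - d) / norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

pos = np.arange(0.0, 50.0, 1.0)            # depth in mm
pdd = 100.0 * np.exp(-0.05 * pos)          # toy depth-dose curve
g = gamma_1d(pos, pdd, pos, pdd)
print((g <= 1.0).mean() * 100)             # -> 100.0 (identical curves pass)
```

    Points with gamma ≤ 1 satisfy the combined criterion; the average gamma over a curve region gives the kind of summary statistic reported in the abstract.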

  18. Dose Titration Algorithm Tuning (DTAT) should supersede 'the' Maximum Tolerated Dose (MTD) in oncology dose-finding trials.

    PubMed

    Norris, David C

    2017-01-01

    Background. Absent adaptive, individualized dose-finding in early-phase oncology trials, subsequent 'confirmatory' Phase III trials risk suboptimal dosing, with resulting loss of statistical power and reduced probability of technical success for the investigational therapy. While progress has been made toward explicitly adaptive dose-finding and quantitative modeling of dose-response relationships, most such work continues to be organized around a concept of 'the' maximum tolerated dose (MTD). The purpose of this paper is to demonstrate concretely how the aim of early-phase trials might be conceived, not as 'dose-finding', but as dose titration algorithm (DTA)-finding. Methods. A Phase I dosing study is simulated for a notional cytotoxic chemotherapy drug, with neutropenia constituting the critical dose-limiting toxicity. The drug's population pharmacokinetics and myelosuppression dynamics are simulated using published parameter estimates for docetaxel. The amenability of this model to linearization is explored empirically. The properties of a simple DTA targeting a neutrophil nadir of 500 cells/mm³ using a Newton-Raphson heuristic are explored through simulation in 25 simulated study subjects. Results. Individual-level myelosuppression dynamics in the simulation model approximately linearize under simple transformations of neutrophil concentration and drug dose. The simulated dose titration exhibits largely satisfactory convergence, with great variance in individualized optimal dosing. Some titration courses exhibit overshooting. Conclusions. The large inter-individual variability in simulated optimal dosing underscores the need to replace 'the' MTD with an individualized concept, MTDi. To illustrate this principle, the simplest possible DTA capable of realizing such a concept is demonstrated. Qualitative phenomena observed in this demonstration support discussion of the notion of tuning such algorithms. 
Although here illustrated specifically in relation to cytotoxic chemotherapy, the DTAT principle appears similarly applicable to Phase I studies of cancer immunotherapy and molecularly targeted agents.
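    A Newton-style titration against a linearized nadir model can be sketched as follows. The subject model (log-nadir falling linearly with dose) is a hypothetical stand-in consistent with the linearization the abstract reports, not the paper's pharmacokinetic/myelosuppression simulation; all parameter values are invented.

```python
import numpy as np

TARGET = 500.0   # target neutrophil nadir, cells/mm^3 (from the abstract)

def simulate_nadir(dose, a=8.5, b=0.004, rng=None):
    """Hypothetical subject model: log-nadir falls linearly with dose."""
    noise = rng.normal(0.0, 0.05) if rng is not None else 0.0
    return np.exp(a - b * dose + noise)

def titrate(d0=100.0, d1=200.0, cycles=6, rng=None):
    """Secant (Newton-style) titration on the log-nadir scale: fit the line
    through the two most recent (dose, log-nadir) points, step to target."""
    doses = [d0, d1]
    nadirs = [simulate_nadir(d, rng=rng) for d in doses]
    for _ in range(cycles):
        x0, x1 = doses[-2], doses[-1]
        y0, y1 = np.log(nadirs[-2]), np.log(nadirs[-1])
        if np.isclose(y1, np.log(TARGET)) or np.isclose(x0, x1):
            break                                # converged or degenerate step
        slope = (y1 - y0) / (x1 - x0)
        d_next = x1 + (np.log(TARGET) - y1) / slope
        doses.append(d_next)
        nadirs.append(simulate_nadir(d_next, rng=rng))
    return doses, nadirs

doses, nadirs = titrate()                        # noise-free subject
print(f"{nadirs[-1]:.3f}")  # -> 500.000
```

    Because the noise-free model is exactly linear on this scale, the titration lands on the target in one step; with observation noise (pass a `rng`), the same loop wanders around the target, which echoes the overshooting the paper observes.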

  19. Phase I study of continuous MKC-1 in patients with advanced or metastatic solid malignancies using the modified Time-to-Event Continual Reassessment Method (TITE-CRM) dose escalation design.

    PubMed

    Tevaarwerk, Amye; Wilding, George; Eickhoff, Jens; Chappell, Rick; Sidor, Carolyn; Arnott, Jamie; Bailey, Howard; Schelman, William; Liu, Glenn

    2012-06-01

    MKC-1 is an oral cell-cycle inhibitor with broad antitumor activity in preclinical models. Clinical studies demonstrated modest antitumor activity using an intermittent dosing schedule; however, additional preclinical data suggested that continuous dosing could be efficacious, with additional effects against the mTor/AKT pathway. The primary objectives were to determine the maximum tolerated dose (MTD) and response of continuous MKC-1. Secondary objectives included characterizing the dose-limiting toxicities (DLTs) and pharmacokinetics (PK). Patients with solid malignancies were eligible if they had measurable disease, ECOG PS ≤1, and adequate organ function. Exclusions included brain metastases and inability to receive oral drug. MKC-1 was dosed twice daily, continuously, in 28-day cycles. Other medications were eliminated if there were possible drug interactions. Doses were assigned using a TITE-CRM algorithm following enrollment of the first 3 pts. Disease response was assessed every 8 weeks. Between 5/08 and 9/09, 24 patients enrolled (15 M/9 F, median 58 years, range 44-77). Patients 1-3 received 120 mg/d of MKC-1; patients 4-24 were dosed per the TITE-CRM algorithm: 150 mg [n = 1], 180 [2], 200 [1], 230 [1], 260 [5], 290 [6], 320 [5]. The median time on drug was 8 weeks (range 4-28). The only DLT occurred at 320 mg (grade 3 fatigue). Stable disease occurred at 150 mg/d (28 weeks; RCC) and 320 mg/d (16 weeks; breast, parotid). Escalation halted at 320 mg/d. Day 28 pharmacokinetics indicated absorption and active metabolites. Continuous MKC-1 was well tolerated; there were no RECIST responses, although clinical benefit occurred in 3/24 pts. Dose escalation stopped at 320 mg/d, and this is the MTD as defined by the CRM dose escalation algorithm; this cumulative dose/cycle exceeds that determined from intermittent dosing studies. The TITE-CRM allowed for rapid dose escalation and was able to account for late toxicities with continuous dosing via a modified algorithm.
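    The CRM machinery behind TITE-CRM can be illustrated with the plain (non-time-to-event) one-parameter model: a skeleton of prior DLT probabilities is raised to the power exp(a), a posterior over a is updated from observed toxicities, and the next patient gets the level whose posterior-mean toxicity is closest to the target. The skeleton, target, and prior below are invented for the demo and do not come from this trial.

```python
import numpy as np

SKELETON = np.array([0.05, 0.10, 0.20, 0.30, 0.40])  # prior DLT probabilities (assumed)
TARGET = 0.25                                        # target DLT rate (assumed)

def crm_recommend(levels, dlts, grid=np.linspace(-2.0, 2.0, 401)):
    """Bare-bones one-parameter CRM: p_i(a) = skeleton_i ** exp(a),
    standard-normal prior on a (evaluated on a grid), posterior-mean
    toxicity per level, recommend the level closest to the target."""
    log_post = -0.5 * grid ** 2                      # log prior (up to a constant)
    for lvl, y in zip(levels, dlts):
        p = SKELETON[lvl] ** np.exp(grid)
        log_post += np.log(p if y else 1.0 - p)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    p_hat = np.array([(SKELETON[i] ** np.exp(grid) * post).sum()
                      for i in range(len(SKELETON))])
    return int(np.argmin(np.abs(p_hat - TARGET)))

rec_safe = crm_recommend([2, 2, 2], [0, 0, 0])   # 3 patients at level 2, no DLTs
rec_tox = crm_recommend([2, 2, 2], [1, 1, 1])    # 3 patients at level 2, all DLTs
print(rec_safe, rec_tox)                         # clean data recommends a higher level
```

    TITE-CRM extends this by weighting each patient's likelihood contribution by their time at risk, which is what lets late toxicities under continuous dosing be accounted for.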

  20. TU-F-BRF-03: Effect of Radiation Therapy Planning Scan Registration On the Dose in Lung Cancer Patient CT Scans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cunliffe, A; Contee, C; White, B

    Purpose: To characterize the effect of deformable registration of serial computed tomography (CT) scans on the radiation dose calculated from a treatment planning scan. Methods: Eighteen patients who received curative doses (≥60Gy, 2Gy/fraction) of photon radiation therapy for lung cancer treatment were retrospectively identified. For each patient, a diagnostic-quality pre-therapy (4-75 days) CT scan and a treatment planning scan with an associated dose map calculated in Pinnacle were collected. To establish baseline correspondence between scan pairs, a researcher manually identified anatomically corresponding landmark point pairs between the two scans. Pre-therapy scans were co-registered with planning scans (and associated dose maps) using the Plastimatch demons and Fraunhofer MEVIS deformable registration algorithms. Landmark points in each pre-therapy scan were automatically mapped to the planning scan using the displacement vector field output from both registration algorithms. The absolute difference in planned dose (|ΔD|) between manually and automatically mapped landmark points was calculated. Using regression modeling, |ΔD| was modeled as a function of the distance between manually and automatically matched points (registration error, E), the dose standard deviation (SD-dose) in the eight-pixel neighborhood, and the registration algorithm used. Results: 52-92 landmark point pairs (median: 82) were identified in each patient's scans. Average |ΔD| across patients was 3.66Gy (range: 1.2-7.2Gy). |ΔD| was significantly reduced by 0.53Gy using Plastimatch demons compared with Fraunhofer MEVIS. |ΔD| increased significantly as a function of E (0.39Gy/mm) and SD-dose (2.23Gy/Gy). Conclusion: An average error of <4Gy in radiation dose was introduced when points were mapped between CT scan pairs using deformable registration. 
    Dose differences following registration were significantly increased when the Fraunhofer MEVIS registration algorithm was used, when spatial registration errors were larger, and when the dose gradient was higher (i.e., higher SD-dose). To our knowledge, this is the first study to directly compute dose errors following deformable registration of lung CT scans.
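    The regression model of |ΔD| on registration error, local dose SD, and algorithm choice can be sketched as an ordinary least-squares fit. The data below are synthetic, generated to match the effect sizes the abstract reports (0.39 Gy/mm, 2.23 Gy/Gy, 0.53 Gy algorithm offset); the paper used a mixed-effects model, which plain OLS only approximates.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
E = rng.uniform(0, 10, n)          # registration error, mm
sd_dose = rng.uniform(0, 2, n)     # local dose SD, Gy
algo = rng.integers(0, 2, n)       # 0 = Plastimatch demons, 1 = Fraunhofer MEVIS

# Synthetic |dD| built from the abstract's reported effect sizes plus noise.
dD = 0.39 * E + 2.23 * sd_dose + 0.53 * algo + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), E, sd_dose, algo])
coef, *_ = np.linalg.lstsq(X, dD, rcond=None)
print(np.round(coef[1:], 2))       # close to [0.39, 2.23, 0.53]
```

    Recovering the planted coefficients illustrates how each covariate's contribution to the dose-mapping error can be isolated even when all three vary at once.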

  1. Estimation of internal organ motion-induced variance in radiation dose in non-gated radiotherapy

    NASA Astrophysics Data System (ADS)

    Zhou, Sumin; Zhu, Xiaofeng; Zhang, Mutian; Zheng, Dandan; Lei, Yu; Li, Sicong; Bennion, Nathan; Verma, Vivek; Zhen, Weining; Enke, Charles

    2016-12-01

    In the delivery of non-gated radiotherapy (RT), owing to intra-fraction organ motion, a certain degree of RT dose uncertainty is present. Herein, we propose a novel mathematical algorithm to estimate the mean and variance of RT dose that is delivered without gating. These parameters are specific to individual internal organ motion, dependent on individual treatment plans, and relevant to the RT delivery process. This algorithm uses images from a patient’s 4D simulation study to model the actual patient internal organ motion during RT delivery. All necessary dose rate calculations are performed in fixed patient internal organ motion states. The analytical and deterministic formulae of mean and variance in dose from non-gated RT were derived directly via statistical averaging of the calculated dose rate over possible random internal organ motion initial phases, and did not require constructing relevant histograms. All results are expressed in dose rate Fourier transform coefficients for computational efficiency. Exact solutions are provided to simplified, yet still clinically relevant, cases. Results from a volumetric-modulated arc therapy (VMAT) patient case are also presented. The results obtained from our mathematical algorithm can aid clinical decisions by providing information regarding both mean and variance of radiation dose to non-gated patients prior to RT delivery.
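    The abstract's key computational device, expressing the mean and variance of dose over a random initial motion phase through Fourier coefficients, can be demonstrated on a toy periodic dose-rate trace. The trace and beam-on time below are invented; the identity shown (DC term gives the mean, non-DC power gives the variance, i.e. Parseval's theorem) is the general mathematical fact the paper exploits.

```python
import numpy as np

# Total dose as a function of the (discretized) initial motion phase phi,
# for a beam-on interval covering a non-integer number of motion periods.
n_phase = 64
phi = np.arange(n_phase)
rate = 1.0 + 0.3 * np.cos(2 * np.pi * phi / n_phase)   # periodic dose rate (toy)
beam_on = int(2.5 * n_phase)                           # 2.5 motion periods

D = np.array([rate[(phi0 + np.arange(beam_on)) % n_phase].sum()
              for phi0 in range(n_phase)])

# Mean and variance over a uniformly random initial phase, directly...
mean_direct, var_direct = D.mean(), D.var()

# ...and from the Fourier coefficients of D(phi): the DC coefficient is the
# mean; the summed power of the non-DC coefficients is the variance.
c = np.fft.fft(D) / n_phase
mean_fft = c[0].real
var_fft = np.sum(np.abs(c[1:]) ** 2)
print(np.isclose(mean_direct, mean_fft), np.isclose(var_direct, var_fft))  # -> True True
```

    Working in Fourier coefficients avoids building a histogram over initial phases, which is the computational-efficiency point made in the abstract.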

  2. DRESS syndrome with thrombotic microangiopathy revealing a Noonan syndrome: Case report.

    PubMed

    Bobot, Mickaël; Coen, Matteo; Simon, Clémentine; Daniel, Laurent; Habib, Gilbert; Serratrice, Jacques

    2018-04-01

    The life-threatening drug rash with eosinophilia and systemic symptoms (DRESS) syndrome occurs most commonly after exposure to drugs; its clinical features mimic those found in other serious systemic disorders. It is rarely associated with thrombotic microangiopathy. We describe the unique case of a 44-year-old man who simultaneously experienced DRESS syndrome and thrombotic microangiopathy (TMA) after a 5-day treatment with fluindione. Clinical evaluation led to the discovery of an underlying lymphangiomatosis due to a Noonan syndrome. The anticoagulant was withdrawn, and corticosteroids (1 mg/kg/day) and acenocoumarol were started. Clinical improvement ensued. At follow-up, the patient remains well. The association of DRESS with TMA is a rare condition; we believe that the presence of the underlying Noonan syndrome could have been the trigger. Moreover, we speculate about the potential interrelations between these entities.

  3. The characteristics of dose at mass interface on lung cancer Stereotactic Body Radiotherapy (SBRT) simulation

    NASA Astrophysics Data System (ADS)

    Wulansari, I. H.; Wibowo, W. E.; Pawiro, S. A.

    2017-05-01

    In lung cancer cases, it is difficult for the treatment planning system (TPS) to predict the dose at or near the mass interface. This prediction error might influence the minimum or maximum dose received by the lung tumor. In addition to target motion, the target dose prediction error also contributes to the combined error during the course of treatment. The objective of this work was to verify dose plans calculated by the adaptive convolution algorithm in Pinnacle3 at the mass interface against a set of measurements. The measurements were performed using Gafchromic EBT3 film in static and dynamic CIRS phantoms with motion amplitudes of 5 mm, 10 mm, and 20 mm in the superior-inferior direction. The static and dynamic phantoms were scanned with fast CT and slow CT before planning. The results showed that the adaptive convolution algorithm mostly predicted the mass interface dose lower than the measured dose, by -0.63% to 8.37% for the static phantom with fast CT scanning and -0.27% to 15.9% for the static phantom with slow CT scanning. In the dynamic phantom, the algorithm's prediction of the mass interface dose differed from the measured dose by up to -89% for fast CT and varied from -17% to 37% for slow CT. These interface dose differences decreased the mass dose in fast CT, except for the 10 mm motion amplitude, and increased it in slow CT for the greater motion amplitudes.

  4. TH-E-BRE-07: Development of Dose Calculation Error Predictors for a Widely Implemented Clinical Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egan, A; Laub, W

    2014-06-15

    Purpose: Several shortcomings of the current implementation of the analytic anisotropic algorithm (AAA) may lead to dose calculation errors in highly modulated treatments delivered to highly heterogeneous geometries. Here we introduce a set of dosimetric error predictors that can be applied to a clinical treatment plan and patient geometry in order to identify high-risk plans. Once a problematic plan is identified, the treatment can be recalculated with a more accurate algorithm in order to better assess its viability. Methods: Here we focus on three distinct sources of dosimetric error in the AAA algorithm. First, due to a combination of discrepancies in small-field beam modeling as well as volume averaging effects, dose calculated through small MLC apertures can be underestimated, while that behind small MLC blocks can be overestimated. Second, due to the rectilinear scaling of the Monte Carlo generated pencil beam kernel, energy is not properly transported through heterogeneities near, but not impeding, the central axis of the beamlet. And third, AAA overestimates dose in regions of very low density (< 0.2 g/cm³). We have developed an algorithm to detect the location and magnitude of each scenario within the patient geometry, namely the field-size index (FSI), the heterogeneous scatter index (HSI), and the low-density index (LDI), respectively. Results: The error indices successfully identify deviations between AAA and Monte Carlo dose distributions in simple phantom geometries. The algorithms are currently implemented in the MATLAB computing environment and are able to run on a typical RapidArc head-and-neck geometry in less than an hour. Conclusion: Because these error indices successfully identify each type of error in contrived cases, with sufficient benchmarking this method can be developed into a clinical tool that may help estimate AAA dose calculation errors and indicate when it might be advisable to use Monte Carlo calculations.
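    The third predictor, flagging very-low-density regions where AAA overestimates dose, lends itself to a simple sketch: threshold the density grid inside the beam and report the flagged fraction. The paper does not publish its FSI/HSI/LDI definitions, so the index below is a stand-in with an invented name-for-name mapping; only the 0.2 g/cm³ threshold comes from the abstract.

```python
import numpy as np

LOW_DENSITY_G_CM3 = 0.2   # AAA overestimates dose below this density (from the abstract)

def low_density_index(density, beam_mask):
    """Illustrative low-density index (LDI): fraction of in-beam voxels
    whose density falls below 0.2 g/cm^3.  A stand-in for the paper's
    unpublished definition."""
    flagged = (density < LOW_DENSITY_G_CM3) & beam_mask
    return flagged.sum() / beam_mask.sum()

density = np.full((4, 4), 1.0)        # water-equivalent slab
density[0, :] = 0.05                  # a row of near-air voxels
beam = np.ones((4, 4), dtype=bool)    # beam covers the whole grid
print(low_density_index(density, beam))  # -> 0.25
```

    A plan whose index exceeds some benchmarked threshold would be routed to a Monte Carlo recalculation, which is the triage workflow the abstract proposes.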

  5. Technical Note: A novel leaf sequencing optimization algorithm which considers previous underdose and overdose events for MLC tracking radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wisotzky, Eric, E-mail: eric.wisotzky@charite.de, E-mail: eric.wisotzky@ipk.fraunhofer.de; O’Brien, Ricky; Keall, Paul J., E-mail: paul.keall@sydney.edu.au

    2016-01-15

    Purpose: Multileaf collimator (MLC) tracking radiotherapy is complex, as the beam pattern needs to be modified due to the planned intensity modulation as well as the real-time target motion. The target motion cannot be planned; therefore, the modified beam pattern differs from the original plan and the MLC sequence needs to be recomputed online. Current MLC tracking algorithms use a greedy heuristic in that they optimize for a given time but ignore past errors. To overcome this problem, the authors have developed and improved an algorithm that minimizes large underdose and overdose regions. Additionally, previous underdose and overdose events are taken into account to avoid regions with a high quantity of dose events. Methods: The authors improved the existing MLC motion control algorithm by introducing a cumulative underdose/overdose map. This map represents the actual projection of the planned tumor shape and logs occurring dose events at specific regions. These events have an impact on the dose cost calculation and reduce recurrence of dose events at each region. The authors studied the improvement of the new temporal optimization algorithm in terms of the L1-norm minimization of the sum of overdose and underdose, compared to not accounting for previous dose events. For evaluation, the authors simulated the delivery of 5 conformal and 14 intensity-modulated radiotherapy (IMRT) plans with seven 3D patient-measured tumor motion traces. Results: Simulations with conformal shapes showed an improvement in L1-norm of up to 8.5% after 100 MLC modification steps. Experiments showed comparable improvements with the same type of treatment plans. Conclusions: A novel leaf sequencing optimization algorithm which considers previous dose events for MLC tracking radiotherapy has been developed and investigated. Reductions in underdose/overdose are observed for conformal and IMRT delivery.
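    The cumulative underdose/overdose map and its effect on the cost calculation can be sketched in miniature: the map accumulates signed dose events per region, and a candidate leaf configuration pays extra cost wherever it would repeat the sign of past events. The class name, weighting scheme, and cost form are illustrative, not the authors' exact formulation.

```python
import numpy as np

class DoseEventMap:
    """Sketch of a cumulative underdose/overdose map: per-region running
    error plus a recurrence penalty in the leaf-cost calculation."""
    def __init__(self, n_regions, recurrence_weight=0.5):
        self.cum = np.zeros(n_regions)   # signed history: under(-) / over(+)
        self.w = recurrence_weight

    def cost(self, proposed_error):
        # Penalize proposed errors that repeat the sign of past events.
        recurrence = np.where(proposed_error * self.cum > 0,
                              self.w * np.abs(self.cum), 0.0)
        return np.abs(proposed_error) + recurrence

    def record(self, delivered_error):
        self.cum += delivered_error

m = DoseEventMap(3)
m.record(np.array([+1.0, 0.0, -1.0]))     # region 0 overdosed, region 2 underdosed
c = m.cost(np.array([+1.0, +1.0, +1.0]))  # proposing an overdose everywhere
print(c)                                  # region 0 now costs extra for repeating
```

    Because the optimizer picks the lowest-cost leaf configuration at each step, regions with a history of same-signed events are steered away from, which is how recurrence of dose events is reduced.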

  6. Dosimetric comparison of peripheral NSCLC SBRT using Acuros XB and AAA calculation algorithms.

    PubMed

    Ong, Chloe C H; Ang, Khong Wei; Soh, Roger C X; Tin, Kah Ming; Yap, Jerome H H; Lee, James C L; Bragg, Christopher M

    2017-01-01

    There is a concern about dose calculation in highly heterogeneous environments such as the thorax region. This study compares the quality of treatment plans for peripheral non-small cell lung cancer (NSCLC) stereotactic body radiation therapy (SBRT) using 2 calculation algorithms, namely the Eclipse Anisotropic Analytical Algorithm (AAA) and Acuros External Beam (AXB), for 3-dimensional conformal radiation therapy (3DCRT) and volumetric-modulated arc therapy (VMAT). Four-dimensional computed tomography (4DCT) data from 20 anonymized patients were studied using the Varian Eclipse planning system, AXB, and AAA version 10.0.28. A 3DCRT plan and a VMAT plan were generated using AAA and AXB with constant plan parameters for each patient. The prescription and dose constraints were benchmarked against the Radiation Therapy Oncology Group (RTOG) 0915 protocol. Planning parameters were compared statistically using Mann-Whitney U tests. Results showed that 3DCRT and VMAT plans have a lower target coverage, by up to 8%, when calculated using AXB as compared with AAA. The conformity index (CI) for AXB plans was 4.7% lower than for AAA plans, but was closer to unity, indicating better target conformity. AXB produced plans with global maximum doses which were, on average, 2% hotter than AAA plans. Both 3DCRT and VMAT plans were able to achieve D95%. VMAT plans were shown to be more conformal (CI = 1.01) and were at least 3.2% and 1.5% lower in terms of PTV maximum and mean dose, respectively. There was no statistically significant difference in doses received by organs at risk (OARs) regardless of calculation algorithm or treatment technique. In general, the difference in tissue modeling between AXB and AAA is responsible for the difference in dose distribution between the two algorithms. AXB VMAT plans could be used to benefit patients receiving peripheral NSCLC SBRT. Copyright © 2017 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
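    The conformity index compared above is, in its common RTOG form, the ratio of the volume enclosed by the prescription isodose to the PTV volume, with 1.0 meaning the prescription dose exactly wraps the target. A voxelized sketch (the exact CI variant used in the study is not stated, so the RTOG form is assumed):

```python
import numpy as np

def conformity_index(dose, ptv_mask, prescription):
    """RTOG-style conformity index: prescription isodose volume divided by
    PTV volume (1.0 = perfectly conformal; >1 = dose spills outside PTV)."""
    v_iso = np.count_nonzero(dose >= prescription)
    return v_iso / np.count_nonzero(ptv_mask)

dose = np.zeros((10, 10))
dose[2:6, 2:6] = 50.0                 # 16 voxels at/above prescription dose
ptv = np.zeros((10, 10), dtype=bool)
ptv[3:6, 3:6] = True                  # 9-voxel PTV inside the isodose
print(conformity_index(dose, ptv, 50.0))  # -> 1.777... (isodose spills outside PTV)
```

    Note the RTOG form does not check overlap, so a CI near 1 can still hide geometric mismatch; that caveat is why "closer to unity" in the abstract is paired with coverage metrics like D95%.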

  7. Quantitative Features of Liver Lesions, Lung Nodules, and Renal Stones at Multi-Detector Row CT Examinations: Dependency on Radiation Dose and Reconstruction Algorithm.

    PubMed

    Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan

    2016-04-01

    To determine if radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, six women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on extracted features. Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR reconstruction had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured size of lung nodules and renal stones with MBIR was significantly different than those for the other two algorithms (P < .002 for all comparisons). 
Although lesion texture was significantly affected by the reconstruction algorithm used (average of 3.33 features affected by MBIR throughout lesion types; P < .002, for all comparisons), no significant effect of the radiation dose setting was observed for all but one of the texture features (P = .002-.998). Radiation dose settings and reconstruction algorithms affect the extraction and analysis of quantitative imaging features in lesions at multi-detector row CT.

  8. Improving the Estimation of Mealtime Insulin Dose in Adults With Type 1 Diabetes

    PubMed Central

    Bao, Jiansong; Gilbertson, Heather R.; Gray, Robyn; Munns, Diane; Howard, Gabrielle; Petocz, Peter; Colagiuri, Stephen; Brand-Miller, Jennie C.

    2011-01-01

    OBJECTIVE Although carbohydrate counting is routine practice in type 1 diabetes, hyperglycemic episodes are common. A food insulin index (FII) has been developed and validated for predicting the normal insulin demand generated by mixed meals in healthy adults. We sought to compare a novel algorithm on the basis of the FII for estimating mealtime insulin dose with carbohydrate counting in adults with type 1 diabetes. RESEARCH DESIGN AND METHODS A total of 28 patients using insulin pump therapy consumed two different breakfast meals of equal energy, glycemic index, fiber, and calculated insulin demand (both FII = 60) but approximately twofold difference in carbohydrate content, in random order on three consecutive mornings. On one occasion, a carbohydrate-counting algorithm was applied to meal A (75 g carbohydrate) for determining bolus insulin dose. On the other two occasions, carbohydrate counting (about half the insulin dose as meal A) and the FII algorithm (same dose as meal A) were applied to meal B (41 g carbohydrate). A real-time continuous glucose monitor was used to assess 3-h postprandial glycemia. RESULTS Compared with carbohydrate counting, the FII algorithm significantly decreased glucose incremental area under the curve over 3 h (–52%, P = 0.013) and peak glucose excursion (–41%, P = 0.01) and improved the percentage of time within the normal blood glucose range (4–10 mmol/L) (31%, P = 0.001). There was no significant difference in the occurrence of hypoglycemia. CONCLUSIONS An insulin algorithm based on physiological insulin demand evoked by foods in healthy subjects may be a useful tool for estimating mealtime insulin dose in patients with type 1 diabetes. PMID:21949219
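    The contrast between the two bolus algorithms can be sketched numerically: carbohydrate counting scales the dose with grams of carbohydrate, while the FII algorithm scales it with the meal's measured insulin demand. The insulin-to-carb ratio and FII scaling factor below are invented for the demo; only the meal carbohydrate contents and the equal FII of 60 come from the study.

```python
# Hypothetical illustration of the two bolus algorithms compared in the study.
ICR = 10.0          # insulin-to-carb ratio, g per unit (assumed)
FII_FACTOR = 0.10   # units per FII demand point (assumed)

def carb_dose(carbs_g):
    """Carbohydrate counting: dose proportional to grams of carbohydrate."""
    return carbs_g / ICR

def fii_dose(fii_demand):
    """FII algorithm: dose proportional to the meal's insulin demand."""
    return fii_demand * FII_FACTOR

meal_a = {"carbs": 75.0, "fii": 60.0}   # meal A from the study
meal_b = {"carbs": 41.0, "fii": 60.0}   # meal B: roughly half the carbs, same FII

print(carb_dose(meal_a["carbs"]), carb_dose(meal_b["carbs"]))  # 7.5 4.1
print(fii_dose(meal_a["fii"]) == fii_dose(meal_b["fii"]))      # True
```

    This reproduces the study design in miniature: carbohydrate counting roughly halves the bolus for meal B, while the FII algorithm, seeing equal insulin demand, gives both meals the same dose, the condition under which the FII arm showed lower postprandial glucose.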

  9. SU-F-T-619: Dose Evaluation of Specific Patient Plans Based On Monte Carlo Algorithm for a CyberKnife Stereotactic Radiosurgery System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piao, J; PLA 302 Hospital, Beijing; Xu, S

    2016-06-15

    Purpose: This study uses Monte Carlo to simulate the CyberKnife system and intends to develop a third-party tool to evaluate the dose verification of specific patient plans in the TPS. Methods: After simulating the treatment head using the BEAMnrc and DOSXYZnrc software, the calculated and measured data were compared to determine the beam parameters. The dose distributions calculated by the Ray-tracing and Monte Carlo algorithms of the TPS (MultiPlan Ver. 4.0.2) and by the in-house Monte Carlo simulation method were analyzed for 30 patient plans, comprising 10 head, 10 lung, and 10 liver cases. γ analysis with the combined 3mm/3% criteria was introduced to quantitatively evaluate the difference in accuracy between the three algorithms. Results: More than 90% of the global error points were less than 2% in the comparison of the PDD and OAR curves after determining the mean energy and FWHM; a reasonably ideal Monte Carlo beam model was thus established. Based on the quantitative evaluation of dose accuracy for the three algorithms, the γ analysis shows that the PTV passing rates in the 30 plans between the Monte Carlo simulation and the TPS Monte Carlo algorithm were good (84.88±9.67% for head, 98.83±1.05% for liver, 98.26±1.87% for lung). The PTV passing rates in head and liver plans between the Monte Carlo simulation and the TPS Ray-tracing algorithm were also good (95.93±3.12% and 99.84±0.33%, respectively). However, the difference in DVHs for lung plans between the Monte Carlo simulation and the Ray-tracing algorithm was obvious, and the γ passing rate (51.263±38.964%) was not good. It is feasible to use Monte Carlo simulation to verify the dose distribution of patient plans. Conclusion: The Monte Carlo simulation algorithm developed for the CyberKnife system in this study can be used as a reference third-party tool, which plays an important role in dose verification of patient plans. 
    This work was supported in part by a grant from the Chinese Natural Science Foundation (Grant No. 11275105). Thanks for the support from Accuray Corp.

  10. SU-E-T-202: Impact of Monte Carlo Dose Calculation Algorithm On Prostate SBRT Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venencia, C; Garrigo, E; Cardenas, J

    2014-06-01

    Purpose: The purpose of this work was to quantify the dosimetric impact of using a Monte Carlo algorithm on SBRT prostate treatments previously calculated with a pencil beam dose calculation algorithm. Methods: A 6 MV photon beam produced by a Novalis TX (BrainLAB-Varian) linear accelerator equipped with HDMLC was used. Treatment plans were created using 9 fields with iPlan v4.5 (BrainLAB) and dynamic IMRT modality. The institutional SBRT protocol uses a total dose to the prostate of 40Gy in 5 fractions, every other day. Dose calculation was done by pencil beam (2mm dose resolution) with heterogeneity correction, using dose volume constraints (UCLA) of PTV D95%=40Gy and D98%>39.2Gy, Rectum V20Gy<50%, V32Gy<20%, V36Gy<10% and V40Gy<5%, Bladder V20Gy<40% and V40Gy<10%, femoral heads V16Gy<5%, penile bulb V25Gy<3cc, and urethra and the overlap region between PTV and PRV Rectum Dmax<42Gy. 10 SBRT treatment plans were selected and recalculated using Monte Carlo with 2mm spatial resolution and a mean variance of 2%. DVH comparisons between plans were done. Results: The average differences in PTV dose constraints were within 2%. However, 3 plans had differences higher than 3%, which does not meet the D98% criterion (>39.2Gy), and should have been renormalized. Dose volume constraint differences for rectum, bladder, femoral heads and penile bulb were less than 2% and within tolerances. The urethra region and the overlap between PTV and PRV Rectum showed a dose increase in all plans. The average difference for the urethra region was 2.1% with a maximum of 7.8%, and for the overlap region 2.5% with a maximum of 8.7%. Conclusion: Monte Carlo dose calculation on dynamic IMRT treatments can affect plan normalization. The dose increase in the critical urethra region and in the overlap region with the PTV could have clinical consequences, which need to be studied. The use of a Monte Carlo dose calculation algorithm is limited because inverse planning dose optimization uses only pencil beam.
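    The VxGy constraints listed above read directly off a structure's dose-volume histogram: VxGy is the percent of the structure's volume receiving at least x Gy. A voxelized checker for the rectum constraints can be sketched as follows (the toy dose distribution is invented):

```python
import numpy as np

# Institutional rectum constraints from the abstract, as (dose Gy, max % volume).
RECTUM_CONSTRAINTS = [(20.0, 50.0), (32.0, 20.0), (36.0, 10.0), (40.0, 5.0)]

def v_dose_pct(dose_voxels, threshold):
    """VxGy: percent of the structure receiving at least `threshold` Gy."""
    dose_voxels = np.asarray(dose_voxels, dtype=float)
    return 100.0 * np.count_nonzero(dose_voxels >= threshold) / dose_voxels.size

def rectum_ok(dose_voxels):
    """True if every V-dose constraint is satisfied."""
    return all(v_dose_pct(dose_voxels, d) < limit
               for d, limit in RECTUM_CONSTRAINTS)

# Toy rectum dose distribution: mostly low dose with a small hot tail.
rectum = np.concatenate([np.full(80, 10.0), np.full(15, 25.0), np.full(5, 33.0)])
print(rectum_ok(rectum))  # -> True  (V20Gy = 20%, V32Gy = 5%, V36Gy = V40Gy = 0%)
```

    Running such a check on both the pencil beam and the recalculated Monte Carlo DVHs is precisely how the sub-2% constraint differences in the abstract would be confirmed to stay within tolerance.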

  11. Validation of a method for in vivo 3D dose reconstruction for IMRT and VMAT treatments using on-treatment EPID images and a model-based forward-calculation algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Uytven, Eric, E-mail: eric.vanuytven@cancercare.mb.ca; Van Beek, Timothy; McCowan, Peter M.

    2015-12-15

    Purpose: Radiation treatments are trending toward delivering higher doses per fraction under stereotactic radiosurgery and hypofractionated treatment regimens. There is a need for accurate 3D in vivo patient dose verification using electronic portal imaging device (EPID) measurements. This work presents a model-based technique to compute full three-dimensional patient dose reconstructed from on-treatment EPID portal images (i.e., transmission images). Methods: EPID dose is converted to incident fluence entering the patient using a series of steps which include converting measured EPID dose to fluence at the detector plane and then back-projecting the primary source component of the EPID fluence upstream of the patient. Incident fluence is then recombined with predicted extra-focal fluence and used to calculate 3D patient dose via a collapsed-cone convolution method. This method is implemented in an iterative manner, although in practice it provides accurate results in a single iteration. The robustness of the dose reconstruction technique is demonstrated with several simple slab phantom and nine anthropomorphic phantom cases. Prostate, head and neck, and lung treatments are all included as well as a range of delivery techniques including VMAT and dynamic intensity modulated radiation therapy (IMRT). Results: Results indicate that the patient dose reconstruction algorithm compares well with treatment planning system computed doses for controlled test situations. For simple phantom and square field tests, agreement was excellent with a 2%/2 mm 3D chi pass rate ≥98.9%. On anthropomorphic phantoms, the 2%/2 mm 3D chi pass rates ranged from 79.9% to 99.9% in the planning target volume (PTV) region and 96.5% to 100% in the low dose region (>20% of prescription, excluding PTV and skin build-up region). Conclusions: An algorithm to reconstruct delivered patient 3D doses from EPID exit dosimetry measurements was presented. The method was applied to phantom and patient data sets, as well as for dynamic IMRT and VMAT delivery techniques. Results indicate that the EPID dose reconstruction algorithm presented in this work is suitable for clinical implementation.

  12. A class solution for volumetric-modulated arc therapy planning in postprostatectomy radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forde, Elizabeth, E-mail: eforde@tcd.ie; Bromley, Regina; Institute of Medical Physics, School of Physics, University of Sydney, New South Wales

    This study aimed to test a postprostatectomy volumetric-modulated arc therapy (VMAT) planning class solution. The solution applies to both the progressive resolution optimizer algorithm version 2 (PRO 2) and version 3 (PRO 3), addressing the effect of an upgraded algorithm. A total of 10 radical postprostatectomy patients received 68 Gy to 95% of the planning target volume (PTV), planned using VMAT. Each case followed a set of planning instructions, including contouring, field setup, and predetermined optimization parameters. Each case was run through both algorithms only once, with no user interaction. Results were averaged and compared against Radiation Therapy Oncology Group (RTOG) 0534 end points. In addition, the clinical target volume (CTV) D{sub 100}, PTV D{sub 99}, and PTV mean doses were recorded, along with conformity indices (CIs) (95% and 98%) and the homogeneity index. All cases satisfied PTV D{sub 95} of 68 Gy and a maximum dose < 74.8 Gy. The average result for the PTV D{sub 99} was 64.1 Gy for PRO 2 and 62.1 Gy for PRO 3. The average PTV mean dose was 71.4 Gy for PRO 2 and 71.5 Gy for PRO 3. The CTV D{sub 100} average dose was 67.7 and 68.0 Gy for PRO 2 and PRO 3, respectively. The mean homogeneity index for both algorithms was 0.08. The average 95% CI was 1.17 for PRO 2 and 1.19 for PRO 3. For 98%, the average results were 1.08 and 1.12 for PRO 2 and PRO 3, respectively. All cases for each algorithm met the RTOG organs-at-risk dose constraints. A successful class solution has been established for prostate bed VMAT radiotherapy regardless of the algorithm used.

  13. TU-D-209-03: Alignment of the Patient Graphic Model Using Fluoroscopic Images for Skin Dose Mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oines, A; Oines, A; Kilian-Meneghin, J

    2016-06-15

    Purpose: The Dose Tracking System (DTS) was developed to provide real-time feedback of skin dose and dose rate during interventional fluoroscopic procedures. A color map on a 3D graphic of the patient represents the cumulative dose distribution on the skin. Automated image correlation algorithms are described which use the fluoroscopic procedure images to align and scale the patient graphic for more accurate dose mapping. Methods: Currently, the DTS employs manual patient-graphic selection and alignment. To improve the accuracy of dose mapping and automate the software, various methods are explored to extract information about the beam location and patient morphology from the procedure images. To match patient anatomy with a reference projection image, preprocessing is first used, including edge enhancement, edge detection, and contour detection. Template matching algorithms from OpenCV are then employed to find the location of the beam. Once a match is found, the reference graphic is scaled and rotated to fit the patient, using image registration correlation functions in Matlab. The algorithm runs correlation functions for all points and maps all correlation confidences to a surface map. The highest point of correlation is used for alignment and scaling. The transformation data are saved for later model scaling. Results: Anatomic recognition is used to find matching features between model and image, and image registration correlation provides alignment and scaling at any rotation angle with less than one-second runtime, and at noise levels in excess of 150% of those found in normal procedures. Conclusion: The algorithm provides the necessary scaling and alignment tools to improve the accuracy of dose distribution mapping on the patient graphic with the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.

  14. Implementation of Monte Carlo Dose calculation for CyberKnife treatment planning

    NASA Astrophysics Data System (ADS)

    Ma, C.-M.; Li, J. S.; Deng, J.; Fan, J.

    2008-02-01

    Accurate dose calculation is essential to advanced stereotactic radiosurgery (SRS) and stereotactic radiotherapy (SRT) especially for treatment planning involving heterogeneous patient anatomy. This paper describes the implementation of a fast Monte Carlo dose calculation algorithm in SRS/SRT treatment planning for the CyberKnife® SRS/SRT system. A superposition Monte Carlo algorithm is developed for this application. Photon mean free paths and interaction types for different materials and energies as well as the tracks of secondary electrons are pre-simulated using the MCSIM system. Photon interaction forcing and splitting are applied to the source photons in the patient calculation and the pre-simulated electron tracks are repeated with proper corrections based on the tissue density and electron stopping powers. Electron energy is deposited along the tracks and accumulated in the simulation geometry. Scattered and bremsstrahlung photons are transported, after applying the Russian roulette technique, in the same way as the primary photons. Dose calculations are compared with full Monte Carlo simulations performed using EGS4/MCSIM and the CyberKnife treatment planning system (TPS) for lung, head & neck and liver treatments. Comparisons with full Monte Carlo simulations show excellent agreement (within 0.5%). More than 10% differences in the target dose are found between Monte Carlo simulations and the CyberKnife TPS for SRS/SRT lung treatment while negligible differences are shown in head and neck and liver for the cases investigated. The calculation time using our superposition Monte Carlo algorithm is reduced up to 62 times (46 times on average for 10 typical clinical cases) compared to full Monte Carlo simulations. SRS/SRT dose distributions calculated by simple dose algorithms may be significantly overestimated for small lung target volumes, which can be improved by accurate Monte Carlo dose calculations.
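The Russian roulette technique applied to scattered and bremsstrahlung photons above is a standard variance-reduction step: low-weight particles are either terminated or kept with a boosted weight so that the expected total weight is unchanged. A minimal sketch (the cutoff and survival weight are illustrative, not the values used in the CyberKnife implementation):

```python
import random

def russian_roulette(photon_weights, weight_cutoff=0.1, survival_weight=0.5):
    """Terminate low-weight photons probabilistically; survivors get their
    weight raised to survival_weight, so the expected total weight is conserved:
    a photon of weight w survives with probability w / survival_weight."""
    survivors = []
    for w in photon_weights:
        if w >= weight_cutoff:
            survivors.append(w)            # heavy photons always continue
        elif random.random() < w / survival_weight:
            survivors.append(survival_weight)
    return survivors

# many low-weight photons: fewer survivors, but conserved expected weight
random.seed(7)
kept = russian_roulette([0.05] * 1000)
print(len(kept), round(sum(kept), 1))
```

The payoff is fewer particles to track (faster transport) with no bias in the deposited dose, only a controlled increase in variance.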

  15. A sparsity-based iterative algorithm for reconstruction of micro-CT images from highly undersampled projection datasets obtained with a synchrotron X-ray source

    NASA Astrophysics Data System (ADS)

    Melli, S. Ali; Wahid, Khan A.; Babyn, Paul; Cooper, David M. L.; Gopi, Varun P.

    2016-12-01

    Synchrotron X-ray Micro Computed Tomography (Micro-CT) is an imaging technique which is increasingly used for non-invasive in vivo preclinical imaging. However, it often requires a large number of projections from many different angles to reconstruct high-quality images leading to significantly high radiation doses and long scan times. To utilize this imaging technique further for in vivo imaging, we need to design reconstruction algorithms that reduce the radiation dose and scan time without reduction of reconstructed image quality. This research is focused on using a combination of gradient-based Douglas-Rachford splitting and discrete wavelet packet shrinkage image denoising methods to design an algorithm for reconstruction of large-scale reduced-view synchrotron Micro-CT images with acceptable quality metrics. These quality metrics are computed by comparing the reconstructed images with a high-dose reference image reconstructed from 1800 equally spaced projections spanning 180°. Visual and quantitative-based performance assessment of a synthetic head phantom and a femoral cortical bone sample imaged in the biomedical imaging and therapy bending magnet beamline at the Canadian Light Source demonstrates that the proposed algorithm is superior to the existing reconstruction algorithms. Using the proposed reconstruction algorithm to reduce the number of projections in synchrotron Micro-CT is an effective way to reduce the overall radiation dose and scan time which improves in vivo imaging protocols.

  16. Low dose CT reconstruction via L1 norm dictionary learning using alternating minimization algorithm and balancing principle.

    PubMed

    Wu, Junfeng; Dai, Fang; Hu, Gang; Mou, Xuanqin

    2018-04-18

    Excessive radiation exposure in computed tomography (CT) scans increases the chance of developing cancer and has become a major clinical concern. Recently, statistical iterative reconstruction (SIR) with l0-norm dictionary learning regularization has been developed to reconstruct CT images from low-dose and few-view datasets in order to reduce radiation dose. Nonetheless, the sparse regularization term adopted in this approach is the l0-norm, which cannot guarantee the global convergence of the algorithm. To address this problem, in this study we introduced an l1-norm dictionary learning penalty into the SIR framework for low dose CT image reconstruction, and developed an alternating minimization algorithm to minimize the associated objective function, which transforms the CT image reconstruction problem into a sparse coding subproblem and an image updating subproblem. During the image updating process, an efficient model function approach based on the balancing principle is applied to choose the regularization parameters. The proposed alternating minimization algorithm was evaluated first using real projection data of a sheep lung CT perfusion and then using numerical simulations based on sheep lung CT and chest images. Both visual assessment and quantitative comparison in terms of root mean square error (RMSE) and structural similarity (SSIM) index demonstrated that the new image reconstruction algorithm yielded performance similar to the l0-norm dictionary learning penalty and outperformed the conventional filtered backprojection (FBP) and total variation (TV) minimization algorithms.
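The l1-norm sparse coding subproblem of such an alternating scheme has a well-known proximal structure: each step combines a gradient step on the data-fidelity term with soft thresholding. A minimal ISTA-style sketch for a single patch (the dictionary, patch and parameters are illustrative, not the paper's implementation):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm: shrink coefficients toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_code(D, patch, lam=0.1, n_iter=200):
    """ISTA iterations for  min_a 0.5*||D a - patch||^2 + lam*||a||_1,
    the l1 sparse-coding subproblem of the alternating minimization."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2  # 1 / Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - patch)        # gradient of the quadratic data term
        a = soft_threshold(a - step * grad, step * lam)
    return a
```

In the full algorithm this step alternates with a dictionary/image update; the soft threshold is exactly why the l1 penalty, unlike l0, yields a convex subproblem with convergence guarantees.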

  17. A fast method to emulate an iterative POCS image reconstruction algorithm.

    PubMed

    Zeng, Gengsheng L

    2017-10-01

    Iterative image reconstruction algorithms are commonly used to optimize an objective function, especially when the objective function is nonquadratic. Generally speaking, iterative algorithms are computationally inefficient. This paper presents a fast algorithm that has one backprojection and no forward projection. This paper derives a new method to solve an optimization problem. The nonquadratic constraint, for example an edge-preserving denoising constraint, is implemented as a nonlinear filter. The algorithm is derived based on the POCS (projections onto convex sets) approach. A windowed FBP (filtered backprojection) algorithm enforces the data fidelity. An iterative procedure, divided into segments, enforces edge-enhancement denoising. Each segment performs nonlinear filtering. The derived iterative algorithm is computationally efficient: it contains only one backprojection and no forward projection. Low-dose CT data are used for algorithm feasibility studies. The nonlinearity is implemented as an edge-enhancing noise-smoothing filter. The patient study results demonstrate its effectiveness in processing low-dose x-ray CT data. This fast algorithm can be used to replace many iterative algorithms. © 2017 American Association of Physicists in Medicine.

  18. Statistic and dosimetric criteria to assess the shift of the prescribed dose for lung radiotherapy plans when integrating point kernel models in medical physics: are we ready?

    PubMed

    Chaikh, Abdulhamid; Balosso, Jacques

    2016-12-01

    To apply statistical bootstrap analysis and dosimetric criteria to assess the change of prescribed dose (PD) for lung cancer needed to maintain the same clinical results when using new generations of dose calculation algorithms. Nine lung cancer cases were studied. For each patient, three treatment plans were generated using exactly the same beam arrangements. In plan 1, the dose was calculated using the pencil beam convolution (PBC) algorithm with heterogeneity correction turned on using modified Batho (PBC-MB). In plan 2, the dose was calculated using the anisotropic analytical algorithm (AAA) and the same PD as plan 1. In plan 3, the dose was calculated using AAA with the monitor units (MUs) obtained from PBC-MB as input. The dosimetric criteria include MUs, delivered dose at the isocentre (Diso) and calculated dose to 95% of the target volume (D95). The bootstrap method was used to assess the significance of the dose differences and to accurately estimate the 95% confidence interval (95% CI). Wilcoxon and Spearman's rank tests were used to calculate P values and the correlation coefficient (ρ). A statistically significant dose difference was found using the point kernel model. A good correlation was observed between both algorithm types, with ρ>0.9. When using AAA instead of PBC-MB, an adjustment of the PD at the isocentre is suggested. For a given set of patients, we assessed the need to readjust the PD for lung cancer using dosimetric indices and the bootstrap statistical method. Thus, if the goal is to maintain the same clinical results, the PD for lung tumors has to be adjusted with AAA. According to our simulation we suggest readjusting the PD by 5%, with an optimization of beam arrangements to better protect the organs at risk (OARs).
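The percentile bootstrap used here resamples per-patient differences with replacement to estimate a 95% confidence interval for the mean; if the interval excludes zero, the difference is taken as significant. A minimal sketch with hypothetical per-patient dose differences (the values are invented for illustration, not from the study):

```python
import numpy as np

def bootstrap_ci(diffs, n_boot=10000, alpha=0.05, seed=1):
    """Percentile bootstrap (1 - alpha) CI for the mean paired difference."""
    rng = np.random.default_rng(seed)
    n = len(diffs)
    means = np.array([rng.choice(diffs, n, replace=True).mean()
                      for _ in range(n_boot)])
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# hypothetical per-patient dose differences (%), AAA minus PBC-MB, nine patients
diffs = np.array([3.1, 4.8, 2.2, 5.6, 4.0, 3.7, 6.1, 2.9, 4.4])
lo, hi = bootstrap_ci(diffs)
print(f"mean = {diffs.mean():.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

With small cohorts such as nine patients, the bootstrap avoids the normality assumption a t-interval would require, which is why it pairs naturally with the nonparametric Wilcoxon test mentioned above.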

  19. Fiber-Coupled, Time-Gated { {Al}}_{2}{ {O}}_{3} : { {C}} Radioluminescence Dosimetry Technique and Algorithm for Radiation Therapy With LINACs

    NASA Astrophysics Data System (ADS)

    Magne, Sylvain; Deloule, Sybelle; Ostrowsky, Aimé; Ferdinand, Pierre

    2013-08-01

    An original algorithm for real-time In Vivo Dosimetry (IVD) based on Radioluminescence (RL) of dosimetric-grade Al2O3:C crystals is described and demonstrated in reference conditions with 12-MV photon beams from a Saturne 43 linear accelerator (LINAC), simulating External Beam Radiation Therapy (EBRT) treatments. During the course of irradiation, a portion of electrons is trapped within the Al2O3:C crystal while another portion recombines and generates RL, recorded on-line using an optical fiber. The RL sensitivity is dose-dependent and increases in accordance with the concentration of trapped electrons. Once irradiation is completed, the Al2O3:C crystal is reset by laser light (reusable) and the resultant OSL (Optically Stimulated Luminescence) is also collected back by the remote RL-OSL reader and finally integrated to yield the absorbed dose. During irradiation, scintillation and Cerenkov lights generated within the optical fiber (“stem effect”) are removed by a time-discrimination method involving a discriminating unit and a fiber-coupled BGO scintillator placed in the irradiation room, next to the LINAC. The RL signals were then calibrated with respect to reference dose and dose rate data using an ionization chamber (IC). The algorithm relies upon the integral of the RL and provides the accumulated dose (useful to the medical physicist) at any time during irradiation, the dose rate being derived afterwards. It is tested with both step and arbitrary dose rate profiles, manually operated from the LINAC control desk. The doses measured by RL and OSL are both compared to reference doses and deviations are about ±2% and ±1% respectively, thus demonstrating the reliability of the algorithm for arbitrary profiles and wide range of dose rates. Although the calculation was done off-line, it is amenable to real-time processing during irradiation.

  20. Are pharmacological properties of anticoagulants reflected in pharmaceutical pricing and reimbursement policy? Out-patient treatment of venous thromboembolism and utilization of anticoagulants in Poland.

    PubMed

    Bochenek, T; Czarnogorski, M; Nizankowski, R; Pilc, A

    2014-06-01

    Pharmacotherapy with vitamin K antagonists (VKA) and low-molecular-weight heparins (LMWH) is a major cost driver in the treatment of venous thromboembolism (VTE). Major representatives of anticoagulants in Europe include acenocoumarol and warfarin (VKA), and enoxaparin, dalteparin, nadroparin, reviparin, parnaparin and bemiparin (LMWH). The aim of this report is to measure and critically assess the utilization of anticoagulants and other resources used in the out-patient treatment of VTE in Poland, and to confront the findings with available scientific evidence on the pharmacological and clinical properties of anticoagulants. The perspectives of the National Health Fund (NHF) and the patients were adopted, and descriptive statistics methods were used. The data were gathered at the NHF and at a clinic specialized in the treatment of coagulation disorders. Non-pharmacological costs of treatment for the NHF were 1.6 times higher with VKA than with LMWH. The daily cost of pharmacotherapy with LMWH turned out to be higher than with VKA (234 times for the NHF, 42 times per patient). Within both LMWH and VKA, the reimbursement due for the daily doses of a particular medication varied inversely with the level of patient co-payment. Utilization of long-marketed and cheap VKA was dominated by LMWH, when assessed both through monetary measures and by the actual volume of sales. Pharmaceutical reimbursement policy favored the more expensive equivalents among VKA and LMWH, whereas in financial terms the patients were far better off when remaining on a more expensive alternative. The pharmaceutical pricing and reimbursement policy of the state should be more closely related to the pharmacological properties of anticoagulants.

  1. A Toolbox to Improve Algorithms for Insulin-Dosing Decision Support

    PubMed Central

    Donsa, K.; Plank, J.; Schaupp, L.; Mader, J. K.; Truskaller, T.; Tschapeller, B.; Höll, B.; Spat, S.; Pieber, T. R.

    2014-01-01

    Summary Background: Standardized insulin order sets for subcutaneous basal-bolus insulin therapy are recommended by clinical guidelines for the inpatient management of diabetes. The algorithm-based GlucoTab system electronically assists health care personnel by supporting clinical workflow and providing insulin-dose suggestions. Objective: To develop a toolbox for improving clinical decision-support algorithms. Methods: The toolbox has three main components. 1) Data preparation: data from several heterogeneous sources are extracted, cleaned and stored in a uniform data format. 2) Simulation: the effects of algorithm modifications are estimated by simulating treatment workflows based on real data from clinical trials. 3) Analysis: algorithm performance is measured, analyzed and simulated using data from three clinical trials with a total of 166 patients. Results: Use of the toolbox led to algorithm improvements as well as the detection of potential individualized subgroup-specific algorithms. Conclusion: These results are a first step towards individualized algorithm modifications for specific patient subgroups. PMID:25024768

  2. SU-F-SPS-06: Implementation of a Back-Projection Algorithm for 2D in Vivo Dosimetry with An EPID System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez Reyes, B; Rodriguez Perez, E; Sosa Aquino, M

    Purpose: To implement a back-projection algorithm for 2D dose reconstruction for in vivo dosimetry in radiation therapy using an Electronic Portal Imaging Device (EPID) based on amorphous silicon. Methods: An EPID system was used to determine the dose-response function, pixel sensitivity map, exponential scatter kernels and beam hardening correction for the back-projection algorithm. All measurements were done with a 6 MV beam. A 2D dose reconstruction for an irradiated water phantom (30×30×30 cm{sup 3}) was done to verify the algorithm implementation. Gamma index evaluation between the 2D reconstructed dose and that calculated with a treatment planning system (TPS) was done. Results: A linear fit was found for the dose-response function. The pixel sensitivity map has radial symmetry and was calculated from a profile of the pixel sensitivity variation. The parameters for the scatter kernels were determined only for the 6 MV beam. The primary dose was estimated by applying the scatter kernel within the EPID and the scatter kernel within the patient. The beam hardening coefficient is σBH = 3.788×10{sup −4} cm{sup 2} and the effective linear attenuation coefficient is µAC = 0.06084 cm{sup −1}. 95% of the evaluated points had γ values no greater than unity, with gamma criteria of ΔD = 3% and Δd = 3 mm, within the 50% isodose surface. Conclusion: EPID systems proved to be a fast tool for in vivo dosimetry, but the implementation is more complex than that elaborated for pre-treatment dose verification; therefore, a simpler method should be investigated. The accuracy of this method could be improved by modifying the algorithm so that lower isodose curves can also be compared.
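The gamma evaluation quoted above (ΔD = 3%, Δd = 3 mm) combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D global-gamma sketch using a brute-force search over evaluated points (illustrative only, not the clinical implementation):

```python
import numpy as np

def gamma_index(ref, ev, positions, dd=0.03, dta=3.0):
    """Global 1D gamma index: for each reference point, the minimum over all
    evaluated points of sqrt((dose diff / DD)^2 + (distance / DTA)^2).
    dd is the fractional dose criterion (3%), dta the distance criterion (mm)."""
    ref = np.asarray(ref, float)
    ev = np.asarray(ev, float)
    positions = np.asarray(positions, float)
    norm = dd * ref.max()                    # global dose-difference normalization
    gammas = np.empty(len(ref))
    for i in range(len(ref)):
        dose_term = ((ev - ref[i]) / norm) ** 2
        dist_term = ((positions - positions[i]) / dta) ** 2
        gammas[i] = np.sqrt(np.min(dose_term + dist_term))
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1 (the pass criterion)."""
    return float(np.mean(np.asarray(gammas) <= 1.0) * 100.0)
```

A clinical 2D/3D implementation would additionally interpolate the evaluated distribution and restrict the search radius, but the acceptance criterion is the same.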

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thiyagarajan, Rajesh; Vikraman, S; Karrthick, KP

    Purpose: To evaluate the impact of the dose calculation algorithm on the dose distribution of biologically optimized Volumetric Modulated Arc Therapy (VMAT) plans for esophageal cancer. Methods: Eighteen retrospectively treated patients with carcinoma of the esophagus were studied. VMAT plans were optimized using biological objectives in the Monaco (5.0) TPS for a 6MV photon beam (Elekta Infinity). These plans were calculated for final dose using the Monte Carlo (MC), Collapsed Cone Convolution (CCC) and Pencil Beam Convolution (PBC) algorithms from the Monaco and Oncentra Masterplan TPS. A dose grid of 2mm was used for all algorithms and 1% per-plan uncertainty was maintained for the MC calculation. MC-based calculations were considered the reference for CCC and PBC. Dose volume histogram (DVH) indices (D95, D98, D50 etc.) of the target (PTV) and critical structures were compared to study the impact of all three algorithms. Results: Beam models were consistent with measured data. The mean differences with reference to the MC calculation for D98, D95, D50 and D2 of the PTV were 0.37%, −0.21%, 1.51% and 1.18% respectively for CCC, and 3.28%, 2.75%, 3.61% and 3.08% for PBC. The heart D25 mean difference was 4.94% and 11.21% for CCC and PBC respectively. The lung Dmean mean difference was 1.5% (CCC) and 4.1% (PBC). The spinal cord D2 mean difference was 2.35% (CCC) and 3.98% (PBC). Similar differences were observed for the liver and kidneys. The overall mean difference found for target and critical structures was 0.71±1.52% and 2.71±3.10% for CCC, and 3.18±1.55% and 6.61±5.1% for PBC, respectively. Conclusion: We observed a significant overestimate of the dose distribution by CCC and PBC as compared to MC. The dose prediction of CCC is closer (<3%) to MC than that of PBC. This can be attributed to the poor performance of CCC and PBC in the inhomogeneous regions around the esophagus. CCC can be considered as an alternative in the absence of an MC algorithm.

  4. Comparison of three algorithms for initiation and titration of insulin glargine in insulin-naive patients with type 2 diabetes mellitus.

    PubMed

    Dailey, George; Aurand, Lisa; Stewart, John; Ameer, Barbara; Zhou, Rong

    2014-03-01

    Several titration algorithms can be used to adjust insulin dose and attain blood glucose targets. We compared clinical outcomes using three initiation and titration algorithms for insulin glargine in insulin-naive patients with type 2 diabetes mellitus (T2DM); focusing on those receiving both metformin and sulfonylurea (SU) at baseline. This was a pooled analysis of patient-level data from prospective, randomized, controlled 24-week trials. Patients received algorithm 1 (1 IU increase once daily, if fasting plasma glucose [FPG] > target), algorithm 2 (2 IU increase every 3 days, if FPG > target), or algorithm 3 (treat-to-target, generally 2-8 IU increase weekly based on 2-day mean FPG levels). Glycemic control, insulin dose, and hypoglycemic events were compared between algorithms. Overall, 1380 patients were included. In patients receiving metformin and SU at baseline, there were no significant differences in glycemic control between algorithms. Weight-adjusted dose was higher for algorithm 2 vs algorithms 1 and 3 (P = 0.0037 and P < 0.0001, respectively), though results were not significantly different when adjusted for reductions in HbA1c (0.36 IU/kg, 0.43 IU/kg, and 0.31 IU/kg for algorithms 1, 2, and 3, respectively). Yearly hypoglycemic event rates (confirmed blood glucose <56 mg/dL) were higher for algorithm 3 than algorithms 1 (P = 0.0003) and 2 (P < 0.0001). Three algorithms for initiation and titration of insulin glargine in patients with T2DM resulted in similar levels of glycemic control, with lower rates of hypoglycemia for patients treated using simpler algorithms 1 and 2. © 2013 Ruijin Hospital, Shanghai Jiaotong University School of Medicine and Wiley Publishing Asia Pty Ltd.
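Algorithm 2 above is simple enough to express directly: every 3 days, increase the dose by 2 IU if fasting plasma glucose remains above target. A minimal sketch (the helper name, target value and FPG readings are illustrative assumptions, not trial protocol details):

```python
def titrate_algorithm2(dose, fpg_readings, target=100, increment=2, interval=3):
    """Sketch of the 'algorithm 2' pattern: every `interval` days, add
    `increment` IU of insulin glargine if that day's fasting plasma glucose
    (mg/dL) is still above `target`. Returns the dose on each day, starting
    with the initial dose."""
    doses = [dose]
    for day, fpg in enumerate(fpg_readings, start=1):
        if day % interval == 0 and fpg > target:
            dose += increment
        doses.append(dose)
    return doses

# six days of illustrative FPG readings (mg/dL), starting at 10 IU
print(titrate_algorithm2(10, [150, 140, 130, 120, 110, 95]))
```

Algorithm 1 is the same loop with `interval=1` and `increment=1`; the treat-to-target algorithm 3 instead maps a 2-day mean FPG onto a 2-8 IU weekly adjustment table.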

  5. Fluence map optimization (FMO) with dose-volume constraints in IMRT using the geometric distance sorting method.

    PubMed

    Lan, Yihua; Li, Cunhua; Ren, Haozheng; Zhang, Yong; Min, Zhifang

    2012-10-21

    A new heuristic algorithm based on the so-called geometric distance sorting technique is proposed for solving the fluence map optimization with dose-volume constraints, which is one of the most essential tasks for inverse planning in IMRT. The framework of the proposed method is basically an iterative process which begins with a simple linearly constrained quadratic optimization model without considering any dose-volume constraints; then dose constraints for the voxels violating the dose-volume constraints are gradually added into the quadratic optimization model, step by step, until all the dose-volume constraints are satisfied. In each iteration step, an interior point method is adopted to solve each new linearly constrained quadratic program. To choose proper candidate voxels when adding the current dose constraints, a so-called geometric distance, defined in the transformed standard quadratic form of the fluence map optimization model, is used to guide the selection of voxels. The new geometric distance sorting technique largely reduces the unexpected increase in objective function value that constraint adding inevitably causes. It can be regarded as an upgrade of the traditional dose sorting technique. A geometric explanation of the proposed method is also given, and a proposition is proved to support the heuristic idea. In addition, a smart constraint adding/deleting strategy is designed to ensure stable iteration convergence. The new algorithm is tested on four cases (head-and-neck, prostate, lung and oropharyngeal) and compared with the algorithm based on the traditional dose sorting technique. Experimental results showed that the proposed method is more suitable for guiding the selection of new constraints than the traditional dose sorting method, especially for cases whose target regions have non-convex shapes. It is also a somewhat more efficient technique for choosing constraints than the dose sorting method. By integrating a smart constraint adding/deleting scheme within the iteration framework, the new technique builds up an improved algorithm for solving the fluence map optimization with dose-volume constraints.

  6. [Clinical applications of dosing algorithm in the prediction of warfarin maintenance dose].

    PubMed

    Huang, Sheng-wen; Xiang, Dao-kang; An, Bang-quan; Li, Gui-fang; Huang, Ling; Wu, Hai-li

    2011-12-27

    To evaluate the feasibility of clinical application of a genetics-based dosing algorithm for the prediction of warfarin maintenance dose in a Chinese population. Clinical data and blood samples were collected from a total of 126 patients undergoing heart valve replacement. The genotypes of VKORC1 and CYP2C9 were determined by melting curve analysis after PCR. Patients were divided randomly into study and control groups. In the study group, the first three doses of warfarin were prescribed according to the predicted warfarin maintenance dose, while warfarin was initiated at 2.5 mg/d in the control group. Warfarin doses were adjusted according to the measured international normalized ratio (INR) values, and all subjects were followed for 50 days after initiation of warfarin therapy. At the end of the 50-day follow-up period, the proportions of patients on a stable dose were 82.4% (42/51) and 62.5% (30/48) for the study and control groups, respectively. The mean durations to reach a stable dose of warfarin were (27.5 ± 1.8) and (34.7 ± 1.8) days, and the median durations were (24.0 ± 1.7) and (33.0 ± 4.5) days, in the study and control groups respectively. A significant difference existed in the duration to reach a stable dose between the two groups (P = 0.012). Compared with the control group, the hazard ratio (HR) for reaching a stable dose was 1.786 in the study group (95%CI 1.088 - 2.875, P = 0.026). A dosing algorithm incorporating genetic and non-genetic factors may shorten the time to achieve a stable dose of warfarin, and the present study validates the feasibility of its clinical application.
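Dosing algorithms of this kind are typically linear models on a transformed dose scale with indicator terms for the VKORC1 and CYP2C9 genotypes. The sketch below uses entirely hypothetical coefficients in the style of published square-root-scale models; it is NOT the algorithm validated in this study:

```python
# Hypothetical coefficients in the style of published square-root-scale
# pharmacogenetic warfarin models; NOT the coefficients from this study.
COEF = {
    "intercept": 5.6,
    "age_decade": -0.26,                     # per decade of age
    "VKORC1_AG": -0.52, "VKORC1_AA": -1.20,  # vs. the GG reference genotype
    "CYP2C9_13": -0.61, "CYP2C9_33": -1.90,  # vs. *1/*1 (illustrative subset)
}

def predicted_weekly_dose(age, vkorc1="GG", cyp2c9="11"):
    """Linear predictor on the square-root scale, squared to give mg/week."""
    s = COEF["intercept"] + COEF["age_decade"] * (age / 10)
    s += COEF.get(f"VKORC1_{vkorc1}", 0.0)   # reference genotypes contribute nothing
    s += COEF.get(f"CYP2C9_{cyp2c9}", 0.0)
    return s ** 2

print(predicted_weekly_dose(60, vkorc1="AA", cyp2c9="11"))
```

The square-root transform keeps predicted doses positive and reflects the right-skewed dose distribution; in a real algorithm the coefficients are fit by regression on a derivation cohort and then validated prospectively, as done here.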

  7. Local ROI Reconstruction via Generalized FBP and BPF Algorithms along More Flexible Curves.

    PubMed

    Yu, Hengyong; Ye, Yangbo; Zhao, Shiying; Wang, Ge

    2006-01-01

    We study the local region-of-interest (ROI) reconstruction problem, also referred to as the local CT problem. Our scheme includes two steps: (a) the local truncated normal-dose projections are extended to global dataset by combining a few global low-dose projections; (b) the ROI are reconstructed by either the generalized filtered backprojection (FBP) or backprojection-filtration (BPF) algorithms. The simulation results show that both the FBP and BPF algorithms can reconstruct satisfactory results with image quality in the ROI comparable to that of the corresponding global CT reconstruction.

  8. Fully Convolutional Architecture for Low-Dose CT Image Noise Reduction

    NASA Astrophysics Data System (ADS)

    Badretale, S.; Shaker, F.; Babyn, P.; Alirezaie, J.

    2017-10-01

    One of the critical topics in medical low-dose computed tomography (CT) imaging is how best to maintain image quality. As image quality decreases with lowering of the X-ray radiation dose, improving image quality is extremely important and challenging. We propose a novel approach to denoise low-dose CT images. Our algorithm learns an end-to-end mapping from low-dose CT images to their normal-dose counterparts. The method is based on a deep convolutional neural network with rectified linear units. By learning various low-level to high-level features from a low-dose image, the proposed algorithm is capable of creating a high-quality denoised image. We demonstrate the superiority of our technique by comparing its results with those of two other state-of-the-art methods in terms of peak signal-to-noise ratio, root mean square error, and a structural similarity index.
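The evaluation metrics named above are standard and easy to state precisely. A minimal pure-Python sketch of two of them (PSNR and RMSE; SSIM is omitted for brevity), operating on flat pixel sequences:

```python
# Image-quality metrics used to compare denoising results.
import math

def rmse(reference, test):
    """Root mean square error between two equal-length pixel sequences."""
    n = len(reference)
    return math.sqrt(sum((r - t) ** 2 for r, t in zip(reference, test)) / n)

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    e = rmse(reference, test)
    if e == 0:
        return float("inf")  # identical images
    return 20 * math.log10(max_val / e)
```

A denoiser that raises PSNR (equivalently, lowers RMSE) against the normal-dose reference is, by these criteria, the better one.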

  9. Warfarin Pharmacogenetics

    PubMed Central

    Johnson, Julie A.; Cavallari, Larisa H.

    2014-01-01

    The cytochrome P450 (CYP) 2C9 and vitamin K epoxide reductase complex 1 (VKORC1) genotypes have been strongly and consistently associated with warfarin dose requirements, and dosing algorithms incorporating genetic and clinical information have been shown to be predictive of stable warfarin dose. However, clinical trials evaluating genotype-guided warfarin dosing produced mixed results, calling into question the utility of this approach. Recent trials used surrogate markers as endpoints rather than clinical endpoints, further complicating translation of the data to clinical practice. The present data do not support genetic testing to guide warfarin dosing, but in the setting where genotype data are available, use of such data in those of European ancestry is reasonable. Outcomes data are expected from an ongoing trial, observational studies continue, and more work is needed to define dosing algorithms that incorporate appropriate variants in minority populations; all these will further shape guidelines and recommendations on the clinical utility of genotype-guided warfarin dosing. PMID:25282448

  10. Small field depth dose profile of 6 MV photon beam in a simple air-water heterogeneity combination: A comparison between anisotropic analytical algorithm dose estimation with thermoluminescent dosimeter dose measurement.

    PubMed

    Mandal, Abhijit; Ram, Chhape; Mourya, Ankur; Singh, Navin

    2017-01-01

    To establish trends in the estimation error of dose calculations by the anisotropic analytical algorithm (AAA) with respect to doses measured by thermoluminescent dosimeters (TLDs) in air-water heterogeneity for small photon fields. TLDs were irradiated along the central axis of the photon beam in four different solid water phantom geometries using three small-field single beams. Depth dose profiles were estimated using the AAA calculation model for each field size, and the estimated and measured depth dose profiles were compared. The overestimation (OE) within the air cavity depended on field size (f) and distance (x) from the solid water-air interface, and was formulated as OE = -(0.63f + 9.40)x² + (-2.73f + 58.11)x + (0.06f² - 1.42f + 15.67). Beyond the cavity, the point adjacent to the interface and the points distal to it depended on field size (f) according to OE = 0.42f² - 8.17f + 71.63 and OE = 0.84f² - 1.56f + 17.57, respectively. The trend of the estimation error of the AAA dose calculation algorithm with respect to measured values was thus formulated throughout the radiation path length along the central axis of the 6 MV photon beam in an air-water heterogeneity combination, for small fields generated by a 6 MV linear accelerator.
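The fitted overestimation expressions reported above translate directly into code. The functions below simply evaluate those polynomials as printed in the abstract; note the abstract does not state units for f and x (centimeters are a plausible assumption), so treat the inputs accordingly.

```python
# Evaluate the over-estimation (OE) fits reported for the AAA algorithm,
# as functions of field size f and distance x from the solid water-air
# interface. Units are not stated in the abstract (cm assumed).

def oe_in_cavity(f, x):
    """OE = -(0.63f + 9.40)x^2 + (-2.73f + 58.11)x + (0.06f^2 - 1.42f + 15.67)."""
    return (-(0.63 * f + 9.40) * x ** 2
            + (-2.73 * f + 58.11) * x
            + (0.06 * f ** 2 - 1.42 * f + 15.67))

def oe_postcavity_adjacent(f):
    """OE at the point just beyond the cavity: 0.42f^2 - 8.17f + 71.63."""
    return 0.42 * f ** 2 - 8.17 * f + 71.63

def oe_postcavity_distal(f):
    """OE at points distal to the interface: 0.84f^2 - 1.56f + 17.57."""
    return 0.84 * f ** 2 - 1.56 * f + 17.57
```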

  11. Deformable structure registration of bladder through surface mapping.

    PubMed

    Xiong, Li; Viswanathan, Akila; Stewart, Alexandra J; Haker, Steven; Tempany, Clare M; Chin, Lee M; Cormack, Robert A

    2006-06-01

    Cumulative dose distributions in fractionated radiation therapy depict the dose to normal tissues and therefore may permit an estimation of the risk of normal tissue complications. However, calculation of these distributions is highly challenging because of interfractional changes in the geometry of patient anatomy. This work presents an algorithm for deformable structure registration of the bladder, and verification of the accuracy of the algorithm using phantom and patient data. In this algorithm, the registration process involves conformal mapping of genus zero surfaces using finite element analysis, guided by three control landmarks. The registration produces a correspondence between the triangular meshes used to describe the bladder surface in different fractions. For validation of the algorithm, two types of balloons were inflated gradually to three times their original size, and several computerized tomography (CT) scans were taken during the process. The registration algorithm yielded a local accuracy of 4 mm along the balloon surface. The algorithm was then applied to CT data of patients receiving fractionated high-dose-rate brachytherapy to the vaginal cuff, with the vaginal cylinder in situ. The patients' bladder filling status was intentionally different for each fraction. The three required control landmark points were identified for the bladder based on anatomy. Out of an Institutional Review Board (IRB) approved study of 20 patients, 3 had radiographically identifiable points near the bladder surface that were used for verification of the accuracy of the registration. The verification point as seen in each fraction was compared with its predicted location based on affine as well as deformable registration. Despite the variation in bladder shape and volume, the deformable registration was accurate to 5 mm, consistently outperforming the affine registration. We conclude that the structure registration algorithm presented works with reasonable accuracy and provides a means of calculating cumulative dose distributions.

  13. SU-F-J-148: A Collapsed Cone Algorithm Can Be Used for Quality Assurance for Monaco Treatment Plans for the MR-Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hackett, S; Asselen, B van; Wolthaus, J

    2016-06-15

    Purpose: Treatment plans for the MR-linac, calculated in Monaco v5.19, include direct simulation of the effects of the 1.5 T B0-field. We tested the feasibility of using a collapsed-cone (CC) algorithm in Oncentra, which does not account for effects of the B0-field, as a fast online, independent 3D check of dose calculations. Methods: Treatment plans for six patients were generated in Monaco with a 6 MV FFF beam and the B0-field. All plans were recalculated with a CC model of the same beam. Plans for the same patients were also generated in Monaco without the B0-field. The mean dose (Dmean) and the doses to 10% (D10%) and 90% (D90%) of the volume were determined, as percentages of the prescribed dose, for target volumes and OARs in each calculated dose distribution. Student's t-tests between paired parameters from Monaco plans and corresponding CC calculations were performed. Results: Figure 1 shows an example of the difference between dose distributions calculated in Monaco, with the B0-field, and with the CC algorithm. Figure 2 shows distributions of (absolute) differences between parameters for Monaco plans, with the B0-field, and CC calculations. The Dmean and D90% values for the CTVs and PTVs were significantly different, but differences in dose distributions arose predominantly at the edges of the target volumes. Inclusion of the B0-field had little effect on agreement of the Dmean values, as illustrated by Figure 3, nor on agreement of the D10% and D90% values. Conclusion: Dose distributions recalculated with a CC algorithm show good agreement with those calculated by Monaco, for plans both with and without the B0-field, indicating that the CC algorithm could be used to check online treatment planning for the MR-linac. Agreement for a wider range of treatment sites, and the feasibility of using the γ-test as a simple pass/fail criterion, will be investigated.
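The plan-comparison parameters used above (Dmean, D10%, D90%) are dose-volume metrics that can be computed from a structure's per-voxel doses. A minimal sketch, assuming equal voxel volumes and defining D_x% as the minimum dose received by the hottest x% of the volume:

```python
# Dose-volume metrics of the kind compared between Monaco and the
# collapsed-cone recalculation: mean dose and dose to 10%/90% of volume.

def dose_metrics(voxel_doses):
    """Return (Dmean, D10%, D90%) from a list of per-voxel doses."""
    ordered = sorted(voxel_doses, reverse=True)  # hottest voxel first
    n = len(ordered)
    dmean = sum(ordered) / n
    # D_x%: minimum dose received by the hottest x% of the volume
    d10 = ordered[max(0, int(round(n * 0.10)) - 1)]
    d90 = ordered[max(0, int(round(n * 0.90)) - 1)]
    return dmean, d10, d90
```

Paired values of these metrics from two dose calculations are then compared, e.g. with a paired t-test as in the study.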

  14. Algorithm of pulmonary emphysema extraction using low dose thoracic 3D CT images

    NASA Astrophysics Data System (ADS)

    Saita, S.; Kubo, M.; Kawata, Y.; Niki, N.; Nakano, Y.; Omatsu, H.; Tominaga, K.; Eguchi, K.; Moriyama, N.

    2006-03-01

    Recently, owing to aging and smoking, the number of emphysema patients has been increasing. Alveoli destroyed by emphysema cannot be restored, so early detection of emphysema is desirable. We describe a quantitative algorithm for extracting emphysematous lesions and quantitatively evaluating their distribution patterns using low-dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. Applying the algorithm to 100 thoracic 3-D CT images and to follow-up 3-D CT images, we demonstrate its potential to assist radiologists and physicians in quantitatively evaluating the distribution of emphysematous lesions and their evolution over time.
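Low-attenuation-area extraction is commonly quantified as the fraction of lung voxels below a fixed attenuation threshold (about -950 HU is a widely used cutoff for emphysema on inspiratory CT). The sketch below is a generic version of that quantification, not the authors' exact pipeline:

```python
# Illustrative LAA% computation: fraction of lung voxels below a HU
# threshold. Generic sketch; the paper's candidate-extraction step is
# more elaborate (anatomy identification, candidate filtering).

def laa_percent(lung_hu_values, threshold=-950):
    """Percentage of lung voxels with attenuation below `threshold` HU."""
    if not lung_hu_values:
        raise ValueError("empty lung region")
    low = sum(1 for hu in lung_hu_values if hu < threshold)
    return 100.0 * low / len(lung_hu_values)
```

Tracking this percentage across follow-up scans gives the kind of time-interval evaluation the abstract describes.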

  15. SU-F-T-273: Using a Diode Array to Explore the Weakness of TPS Dose Calculation Algorithm for VMAT and Sliding Window Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, J; Lu, B; Yan, G

    Purpose: To identify weaknesses of the dose calculation algorithm in a treatment planning system for the volumetric modulated arc therapy (VMAT) and sliding window (SW) techniques, using a two-dimensional diode array. Methods: VMAT quality assurance (QA) was implemented with a diode array using multiple partial arcs divided from a VMAT plan; each partial arc had the same segments and the original monitor units. Arc angles were less than ±30°. Multiple arcs were delivered through consecutive, repeated gantry rotation clockwise and counterclockwise. A source-to-axis distance setup with effective depths of 10 and 20 cm was used for the diode array. To isolate dose errors arising in delivery of the VMAT fields, fields having the same segments as the VMAT field were irradiated using the static and step-and-shoot delivery techniques. Dose distributions of the SW technique were evaluated by creating split fields with finely stepped multi-leaf collimator leaf motion. Doses calculated using the adaptive convolution algorithm were analyzed against measurements with distance-to-agreement and dose-difference criteria of 3 mm and 3%. Results: While beam delivery through the static and step-and-shoot techniques showed a passing rate of 97 ± 2%, partial arc delivery of the VMAT fields yielded a passing rate of 85%. However, when leaf motion was restricted to less than 4.6 mm/°, the passing rate improved to 95 ± 2%. Similar passing rates were obtained for both the 10 and 20 cm effective depth setups. Doses calculated for the SW technique showed dose differences of over 7% at the final arrival point of the moving leaves. Conclusion: Error components in dynamic delivery of modulated beams were distinguished using the suggested QA method. This partial arc method can be used for routine VMAT QA. An improved SW calculation algorithm is required to provide accurate estimated doses.

  16. SU-F-T-74: Experimental Validation of Monaco Electron Monte Carlo Dose Calculation for Small Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varadhan; Way, S; Arentsen, L

    2016-06-15

    Purpose: To verify experimentally the accuracy of the Monaco (Elekta) electron Monte Carlo (eMC) algorithm in calculating small-field-size depth doses, monitor units and isodose distributions. Methods: Beam modeling of the eMC algorithm was performed for electron energies of 6, 9, 12, 15 and 18 MeV for an Elekta Infinity linac and all available (6, 10, 14, 20 and 25 cone) applicator sizes. Electron cutouts of incrementally smaller field sizes (20, 40, 60 and 80% blocked from the open cone) were fabricated. Dose calculation was performed using a grid size smaller than one-tenth of the R80-20 electron distal falloff distance, and the number of particle histories was set at 500,000 per cm². Percent depth dose scans and beam profiles at the dmax, d90 and d80 depths were measured for each cutout and energy with a Wellhofer (IBA) Blue Phantom² scanning system and compared against eMC-calculated doses. Results: The measured doses and output factors of incrementally reduced cutout sizes (down to 3 cm diameter) agreed with eMC-calculated doses within ±2.5%. The profile comparisons at the dmax, d90 and d80 depths and the percent depth doses at reduced field sizes agreed within 2.5% or 2 mm. Conclusion: Our results indicate that the Monaco eMC algorithm can accurately predict depth doses, isodose distributions, and monitor units in a homogeneous water phantom for field sizes as small as 3.0 cm diameter for energies in the 6 to 18 MeV range at 100 cm SSD. Consequently, the old rule of thumb approximating the limiting cutout size for an electron field by the lateral scatter equilibrium (E (MeV)/2.5 in centimeters of water) does not apply to the Monaco eMC algorithm.

  17. TH-AB-202-09: Direct-Aperture Optimization for Combined MV+kV Dose Planning in Fluoroscopic Real-Time Tumor-Tracking Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, X; Belcher, AH; Grelewicz, Z

    Purpose: Real-time kV fluoroscopic tumor tracking has the benefit of direct tumor position monitoring. However, there is clinical concern over the excess kV imaging dose to the patient when imaging in continuous fluoroscopic mode. This work addresses this issue by proposing a combined MV+kV direct-aperture optimization (DAO) approach that integrates the kV imaging beam into treatment planning, so that the kV radiation is considered a contributor to the overall dose delivery. Methods: The combined MV+kV DAO approach includes three algorithms. First, a projected quasi-Newton algorithm (L-BFGS) is used to find the optimized MV+kV fluence for the best possible dose distribution. Then, Engel's algorithm is applied to optimize the total number of monitor units and heuristically optimize the number of apertures. Finally, an aperture shape optimization (ASO) algorithm is applied to locally optimize the MLC leaf positions. Results: Compared to conventional DAO MV plans with continuous kV fluoroscopic tracking, the combined MV+kV DAO plan led to a reduction in the total number of MV monitor units, because the kV dose is included as part of the PTV dose, and was also found to reduce the mean and maximum doses to the organs at risk (OAR). Compared to a conventional DAO MV plan without kV tracking, the OAR dose in the combined MV+kV DAO plan was only slightly higher. DVH curves show that the combined MV+kV DAO plan provided about the same PTV coverage as the conventional DAO plans without kV imaging. Conclusion: We report a combined MV+kV DAO approach that allows real-time kV tumor tracking with only a trivial increase in OAR doses while providing the same PTV coverage. The approach is suitable for clinical implementation.

  18. A nonvoxel-based dose convolution/superposition algorithm optimized for scalable GPU architectures.

    PubMed

    Neylon, J; Sheng, K; Yu, V; Chen, Q; Low, D A; Kupelian, P; Santhanam, A

    2014-10-01

    Real-time adaptive planning and treatment has been infeasible due in part to its high computational complexity. There have been many recent efforts to utilize graphics processing units (GPUs) to accelerate the computational performance and dose accuracy in radiation therapy. Data structure and memory access patterns are the key GPU factors that determine the computational performance and accuracy. In this paper, the authors present a nonvoxel-based (NVB) approach to maximize computational and memory access efficiency and throughput on the GPU. The proposed algorithm employs a ray-tracing mechanism to restructure the 3D data sets computed from the CT anatomy into a nonvoxel-based framework. In a process that takes only a few milliseconds of computing time, the algorithm restructured the data sets by ray-tracing through precalculated CT volumes to realign the coordinate system along the convolution direction, as defined by zenithal and azimuthal angles. During the ray-tracing step, the data were resampled according to radial sampling and parallel ray-spacing parameters making the algorithm independent of the original CT resolution. The nonvoxel-based algorithm presented in this paper also demonstrated a trade-off in computational performance and dose accuracy for different coordinate system configurations. In order to find the best balance between the computed speedup and the accuracy, the authors employed an exhaustive parameter search on all sampling parameters that defined the coordinate system configuration: zenithal, azimuthal, and radial sampling of the convolution algorithm, as well as the parallel ray spacing during ray tracing. The angular sampling parameters were varied between 4 and 48 discrete angles, while both radial sampling and parallel ray spacing were varied from 0.5 to 10 mm. The gamma distribution analysis method (γ) was used to compare the dose distributions using 2% and 2 mm dose difference and distance-to-agreement criteria, respectively. 
Accuracy was investigated using three distinct phantoms with varied geometries and heterogeneities and on a series of 14 segmented lung CT data sets. Performance gains were calculated using three 256 mm cube homogenous water phantoms, with isotropic voxel dimensions of 1, 2, and 4 mm. The nonvoxel-based GPU algorithm was independent of the data size and provided significant computational gains over the CPU algorithm for large CT data sizes. The parameter search analysis also showed that the ray combination of 8 zenithal and 8 azimuthal angles along with 1 mm radial sampling and 2 mm parallel ray spacing maintained dose accuracy with greater than 99% of voxels passing the γ test. Combining the acceleration obtained from GPU parallelization with the sampling optimization, the authors achieved a total performance improvement factor of >175 000 when compared to our voxel-based ground truth CPU benchmark and a factor of 20 compared with a voxel-based GPU dose convolution method. The nonvoxel-based convolution method yielded substantial performance improvements over a generic GPU implementation, while maintaining accuracy as compared to a CPU computed ground truth dose distribution. Such an algorithm can be a key contribution toward developing tools for adaptive radiation therapy systems.
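The gamma analysis described above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1D version for two dose profiles on the same uniform grid, using the 2%/2 mm global criteria from the paper, can be sketched as follows (a brute-force simplification, not the authors' implementation):

```python
# Minimal 1D gamma-index evaluation with global dose normalization.

def gamma_pass_rate(ref, evl, spacing_mm, dose_pct=2.0, dta_mm=2.0):
    """Fraction of reference points with gamma <= 1."""
    dmax = max(ref)
    tol = dose_pct / 100.0 * dmax  # global dose tolerance
    passed = 0
    for i, dr in enumerate(ref):
        best = float("inf")
        # search all evaluated points for the minimum combined metric
        for j, de in enumerate(evl):
            dist = abs(i - j) * spacing_mm
            gam2 = (dist / dta_mm) ** 2 + ((de - dr) / tol) ** 2
            best = min(best, gam2)
        if best <= 1.0:  # gamma^2 <= 1 is equivalent to gamma <= 1
            passed += 1
    return passed / len(ref)
```

Real 3D implementations restrict the search to a neighborhood around each point and interpolate the evaluated distribution; the metric itself is the same.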

  20. SU-F-T-148: Are the Approximations in Analytic Semi-Empirical Dose Calculation Algorithms for Intensity Modulated Proton Therapy for Complex Heterogeneities of Head and Neck Clinically Significant?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yepes, P; UT MD Anderson Cancer Center, Houston, TX; Titt, U

    2016-06-15

    Purpose: To evaluate the differences in dose distributions between the analytic semi-empirical proton dose calculation algorithm used in the clinic and Monte Carlo calculations for a sample of 50 head-and-neck (H&N) patients, and to estimate the potential clinical significance of the differences. Methods: A cohort of 50 H&N patients, treated at the University of Texas MD Anderson Cancer Center with intensity modulated proton therapy (IMPT), was selected for evaluation of the clinical significance of approximations in the computed dose distributions. The H&N site was selected because of the highly inhomogeneous nature of the anatomy. The Fast Dose Calculator (FDC), a fast track-repeating accelerated Monte Carlo algorithm for proton therapy, was utilized to calculate the dose distributions delivered by the treatment plans. Because of its short processing time, FDC allows the processing of large patient cohorts. FDC has been validated against GEANT4, a full Monte Carlo system, and against measurements in water and in inhomogeneous phantoms. A gamma-index analysis, DVHs, EUDs, and TCPs and NTCPs computed using published models were utilized to evaluate the differences between the treatment planning system (TPS) and FDC. Results: The Monte Carlo results systematically predict a lower dose delivered to the target. The observed differences can be as large as 8 Gy and should have a clinical impact. Gamma analysis also showed significant differences between the two approaches, especially for the target volumes. Conclusion: Monte Carlo calculation with fast algorithms is practical and should be considered for the clinic, at least as a treatment plan verification tool.
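One of the summary metrics listed above, the equivalent uniform dose, has a compact standard form: the generalized EUD is gEUD = (Σᵢ vᵢ dᵢᵃ)^(1/a), where vᵢ are fractional volumes and a is a structure-specific parameter. A minimal sketch assuming equal voxel volumes:

```python
# Generalized equivalent uniform dose (gEUD) for equal-volume voxels.
# a = 1 reduces to the mean dose; large positive a approaches Dmax
# (appropriate for serial organs), large negative a approaches Dmin.

def geud(voxel_doses, a):
    """gEUD = (mean of d_i^a)^(1/a) for a list of per-voxel doses."""
    n = len(voxel_doses)
    return (sum(d ** a for d in voxel_doses) / n) ** (1.0 / a)
```

Comparing gEUD values between TPS and Monte Carlo dose distributions condenses each DVH difference into a single number per structure.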

  1. Optimization-based image reconstruction from sparse-view data in offset-detector CBCT

    NASA Astrophysics Data System (ADS)

    Bian, Junguo; Wang, Jiong; Han, Xiao; Sidky, Emil Y.; Shao, Lingxiong; Pan, Xiaochuan

    2013-01-01

    The field of view (FOV) of a cone-beam computed tomography (CBCT) unit in a single-photon emission computed tomography (SPECT)/CBCT system can be increased by offsetting the CBCT detector. Analytic-based algorithms have been developed for image reconstruction from data collected at a large number of densely sampled views in offset-detector CBCT. However, the radiation dose involved in a large number of projections can be of health concern to the imaged subject. CBCT imaging dose can be reduced by lowering the number of projections. As analytic-based algorithms are unlikely to reconstruct accurate images from sparse-view data, we investigate and characterize in this work optimization-based algorithms, including the adaptive steepest descent-weighted projection onto convex sets (ASD-WPOCS) algorithm, for image reconstruction from sparse-view data collected with offset-detector CBCT. Using simulated data and real data collected from a physical pelvis phantom and a patient, we verify and characterize properties of the algorithms under study. Results of our study suggest that optimization-based algorithms such as ASD-WPOCS may be developed to yield images of potential utility from a number of projections substantially smaller than those currently used in clinical SPECT/CBCT imaging, thus leading to a dose reduction in CBCT imaging.

  2. Objective performance assessment of five computed tomography iterative reconstruction algorithms.

    PubMed

    Omotayo, Azeez; Elbakri, Idris

    2016-11-22

    Iterative algorithms are gaining clinical acceptance in CT. We performed an objective phantom-based image quality evaluation of five commercial iterative reconstruction algorithms, available on four different multi-detector CT (MDCT) scanners, at different dose levels, as well as of conventional filtered back-projection (FBP) reconstruction. Using the Catphan500 phantom, we evaluated image noise, contrast-to-noise ratio (CNR), modulation transfer function (MTF) and noise-power spectrum (NPS). The algorithms were evaluated over a CTDIvol range of 0.75-18.7 mGy on four major MDCT scanners: GE DiscoveryCT750HD (algorithms: ASIR™ and VEO™); Siemens Somatom Definition AS+ (algorithm: SAFIRE™); Toshiba Aquilion64 (algorithm: AIDR3D™); and Philips Ingenuity iCT256 (algorithm: iDose4™). Images were reconstructed using FBP and the respective iterative algorithms on the four scanners. Use of iterative algorithms decreased image noise and increased CNR, relative to FBP. In the dose range of 1.3-1.5 mGy, noise reduction using iterative algorithms was in the range of 11%-51% on the GE DiscoveryCT750HD, 10%-52% on the Siemens Somatom Definition AS+, 49%-62% on the Toshiba Aquilion64, and 13%-44% on the Philips Ingenuity iCT256. The corresponding CNR increases were in the ranges of 11%-105% on GE, 11%-106% on Siemens, 85%-145% on Toshiba and 13%-77% on Philips. Most algorithms did not affect the MTF, except for VEO™, which produced an increase in the limiting resolution of up to 30%. A shift of the NPS peak towards lower frequencies and a decrease in NPS amplitude were obtained with all iterative algorithms. VEO™ required long reconstruction times, while all other algorithms produced reconstructions in real time. Compared to FBP, iterative algorithms reduced image noise and increased CNR; the iterative algorithms available on different scanners achieved different levels of noise reduction and CNR increase, while spatial resolution improvements were obtained only with VEO™. This study is useful in that it provides a performance assessment of the iterative algorithms available from several mainstream CT manufacturers.
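The CNR metric reported throughout the comparison above is typically computed from region-of-interest statistics: the contrast between an object ROI and a background ROI, divided by the background noise. A generic sketch of that definition (not tied to any one scanner's analysis software):

```python
# Contrast-to-noise ratio from two regions of interest.
import statistics

def cnr(roi_object, roi_background):
    """CNR = |mean_obj - mean_bg| / stdev_bg."""
    contrast = abs(statistics.mean(roi_object) - statistics.mean(roi_background))
    noise = statistics.stdev(roi_background)  # background standard deviation
    return contrast / noise
```

Iterative reconstruction raises CNR chiefly by lowering the denominator (noise) at a fixed contrast, which is exactly the pattern the study reports.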

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pokhrel, D; Badkul, R; Jiang, H

    Purpose: To compare dose distributions calculated using the iPlan XVMC algorithm and heterogeneity-corrected/uncorrected pencil beam (PB-hete/PB-homo) algorithms for SBRT treatments of lung tumors. Methods: Ten patients with centrally located solitary lung tumors were treated using MC-based SBRT to 60 Gy in 5 fractions for PTV V100% = 95%. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1-106.5 cc (mean = 48.6 cc). MC-SBRT plans were generated with a combination of non-coplanar conformal arcs/beams using the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX consisting of HD-MLCs and a 6 MV-SRS (1000 MU/min) mode, following RTOG 0813 dosimetric criteria. For comparison, the PB-hete/PB-homo algorithms were used to recalculate dose distributions using the same beam configurations, MLCs and monitor units. Plans were evaluated with isocenter/maximal/mean doses to the PTV. Normal lung doses were evaluated with V5/V10/V20 and mean lung dose (MLD), excluding the PTV. Other OAR doses, such as maximal spinal cord, 2cc-esophagus, maximal bronchial tree (BT) and maximal heart doses, were tabulated. Results: Maximal/mean/isocenter doses to the PTV calculated by PB-hete were uniformly larger than in the MC plans, by factors of 1.09/1.13/1.07 on average, whereas they were consistently lower with PB-homo, by factors of 0.9/0.84/0.9, respectively. The lung volumes covered by the 5 Gy/10 Gy/20 Gy isodose lines were comparable (on average within ±3%) when calculated by PB-hete compared to XVMC, but consistently lower with PB-homo, by factors of 0.90/0.88/0.85, respectively. MLD was higher with PB-hete by a factor of 1.05, but lower with PB-homo by 0.9, on average, compared to XVMC. XVMC max-cord/max-BT/max-heart and 2cc-esophagus doses were comparable to PB-hete; however, PB-homo underestimated them by factors of 0.82/0.89/0.88/0.86, on average, respectively. Conclusion: PB-hete significantly overestimates the dose to the PTV relative to XVMC, hence underdosing the target. MC is more complex and more accurate in the presence of tissue heterogeneities. The magnitude of the variation is greatest for small island tumors surrounded by low-density lung tissue, where PB algorithms lack lateral electron scattering. Dose calculation with XVMC for lung SBRT is routinely performed in our clinic; its performance for head and neck/sinus cases will also be investigated.

  4. Dose Titration Algorithm Tuning (DTAT) should supersede ‘the’ Maximum Tolerated Dose (MTD) in oncology dose-finding trials

    PubMed Central

    Norris, David C.

    2017-01-01

    Background. Absent adaptive, individualized dose-finding in early-phase oncology trials, subsequent ‘confirmatory’ Phase III trials risk suboptimal dosing, with resulting loss of statistical power and reduced probability of technical success for the investigational therapy. While progress has been made toward explicitly adaptive dose-finding and quantitative modeling of dose-response relationships, most such work continues to be organized around a concept of ‘the’ maximum tolerated dose (MTD). The purpose of this paper is to demonstrate concretely how the aim of early-phase trials might be conceived, not as ‘dose-finding’, but as dose titration algorithm (DTA)-finding. Methods. A Phase I dosing study is simulated for a notional cytotoxic chemotherapy drug, with neutropenia constituting the critical dose-limiting toxicity. The drug’s population pharmacokinetics and myelosuppression dynamics are simulated using published parameter estimates for docetaxel. The amenability of this model to linearization is explored empirically. The properties of a simple DTA targeting a neutrophil nadir of 500 cells/mm³ using a Newton-Raphson heuristic are explored through simulation in 25 simulated study subjects. Results. Individual-level myelosuppression dynamics in the simulation model approximately linearize under simple transformations of neutrophil concentration and drug dose. The simulated dose titration exhibits largely satisfactory convergence, with wide variance in individualized optimal dosing. Some titration courses exhibit overshooting. Conclusions. The large inter-individual variability in simulated optimal dosing underscores the need to replace ‘the’ MTD with an individualized concept, MTDi. To illustrate this principle, the simplest possible DTA capable of realizing such a concept is demonstrated. Qualitative phenomena observed in this demonstration support discussion of the notion of tuning such algorithms. Although illustrated here specifically in relation to cytotoxic chemotherapy, the DTAT principle appears similarly applicable to Phase I studies of cancer immunotherapies and molecularly targeted agents. PMID:28663782
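    The Newton-Raphson titration heuristic described in this record can be sketched with a toy myelosuppression model. Everything below is illustrative: the exponential nadir model, the baseline count of 5000 cells/mm³, and the `sensitivity` parameter are stand-ins for the paper's docetaxel pharmacokinetic/pharmacodynamic simulation, not its actual equations.

```python
import numpy as np

TARGET_NADIR = 500.0  # neutrophil nadir target, cells/mm^3

def simulate_nadir(dose, sensitivity):
    """Toy stand-in for the myelosuppression model: the log nadir
    falls linearly with dose (so the model linearizes exactly)."""
    baseline = 5000.0  # hypothetical pre-treatment neutrophil count
    return baseline * np.exp(-sensitivity * dose)

def titrate(sensitivity, dose0=50.0, n_cycles=10, h=1.0):
    """Newton-Raphson on f(dose) = log(nadir(dose)) - log(target),
    one update per treatment cycle, using a numerical derivative."""
    dose = dose0
    for _ in range(n_cycles):
        f = np.log(simulate_nadir(dose, sensitivity)) - np.log(TARGET_NADIR)
        fprime = (np.log(simulate_nadir(dose + h, sensitivity)) -
                  np.log(simulate_nadir(dose, sensitivity))) / h
        dose -= f / fprime
    return dose
```

    Because each simulated subject would carry its own `sensitivity`, the converged doses differ widely across subjects, which is the paper's MTDi argument in miniature.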

  5. Optimal field-splitting algorithm in intensity-modulated radiotherapy: Evaluations using head-and-neck and female pelvic IMRT cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dou, Xin; Kim, Yusung, E-mail: yusung-kim@uiowa.edu; Bayouth, John E.

    2013-04-01

    To develop an optimal field-splitting algorithm of minimal complexity and verify the algorithm using head-and-neck (H and N) and female pelvic intensity-modulated radiotherapy (IMRT) cases. An optimal field-splitting algorithm was developed in which a large intensity map (IM) is split into multiple sub-IMs (≥2). The algorithm reduces the total complexity by minimizing the monitor units (MU) delivered and the segment number of each sub-IM. The algorithm was verified through comparison studies with the algorithm used in a commercial treatment planning system. Seven H and N and female pelvic cancer IMRT cases (54 IMs) were analyzed in terms of MU, segment numbers, and dose distributions. The optimal field-splitting algorithm was found to reduce both total MU and the total number of segments. We found on average a 7.9 ± 11.8% and 9.6 ± 18.2% reduction in MU and segment numbers for the H and N IMRT cases, with an 11.9 ± 17.4% and 11.1 ± 13.7% reduction for the female pelvic cases. The overall percent (absolute) reductions in the numbers of MU and segments were on average −9.7 ± 14.6% (−15 ± 25 MU) and −10.3 ± 16.3% (−3 ± 5), respectively. In addition, all dose distributions from the optimal field-splitting method were improved. The optimal field-splitting algorithm shows considerable improvements in both total MU and total segment number. The algorithm is expected to be beneficial for the radiotherapy treatment of large-field IMRT.
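    The abstract does not spell out its splitting criterion, so here is only a minimal sketch of the idea: try every feasible split column of a too-wide intensity map and keep the split with the lowest combined MU estimate. The MU estimate used below is the standard sliding-window lower bound (the largest row-wise sum of positive left-to-right intensity increments); treating it as the complexity measure is an assumption, and the sketch assumes the map is wider than the maximum deliverable field but no wider than twice it.

```python
import numpy as np

def min_mu(im):
    """Sliding-window MU lower bound for one intensity map: the maximum,
    over rows, of the sum of positive left-to-right increments."""
    padded = np.concatenate([np.zeros((im.shape[0], 1)), im], axis=1)
    diffs = np.diff(padded, axis=1)
    return np.max(np.sum(np.maximum(diffs, 0), axis=1))

def best_split(im, max_width):
    """Try every split column that leaves both sub-maps deliverable
    (width <= max_width) and return (column, combined MU estimate)."""
    ncols = im.shape[1]
    best = None
    for c in range(ncols - max_width, max_width + 1):
        mu = min_mu(im[:, :c]) + min_mu(im[:, c:])
        if best is None or mu < best[1]:
            best = (c, mu)
    return best
```

    A real implementation would also score segment counts and allow more than two sub-maps, per the abstract.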

  6. DRESS syndrome with thrombotic microangiopathy revealing a Noonan syndrome

    PubMed Central

    Bobot, Mickaël; Coen, Matteo; Simon, Clémentine; Daniel, Laurent; Habib, Gilbert; Serratrice, Jacques

    2018-01-01

    Abstract Rationale: The life-threatening drug rash with eosinophilia and systemic symptoms (DRESS) syndrome occurs most commonly after exposure to drugs, and its clinical features mimic those found in other serious systemic disorders. It is rarely associated with thrombotic microangiopathy. Patient concerns: We describe the unique case of a 44-year-old man who simultaneously experienced DRESS syndrome and thrombotic microangiopathy (TMA) after a 5-day treatment with fluindione. Diagnoses: Clinical evaluation led to the discovery of an underlying lymphangiomatosis due to Noonan syndrome. Interventions: The anticoagulant was withdrawn, and corticosteroids (1 mg/kg/day) and acenocoumarol were started. Outcomes: Clinical improvement ensued. At follow-up, the patient is well. Lessons: The association of DRESS with TMA is a rare condition; we believe that the presence of the underlying Noonan syndrome could have been the trigger. Moreover, we speculate about the potential interrelations between these entities. PMID:29642153

  7. Development of a pharmacogenetic-guided warfarin dosing algorithm for Puerto Rican patients

    PubMed Central

    Ramos, Alga S; Seip, Richard L; Rivera-Miranda, Giselle; Felici-Giovanini, Marcos E; Garcia-Berdecia, Rafael; Alejandro-Cowan, Yirelia; Kocherla, Mohan; Cruz, Iadelisse; Feliu, Juan F; Cadilla, Carmen L; Renta, Jessica Y; Gorowski, Krystyna; Vergara, Cunegundo; Ruaño, Gualberto; Duconge, Jorge

    2012-01-01

    Aim This study was aimed at developing a pharmacogenetic-driven warfarin-dosing algorithm in 163 admixed Puerto Rican patients on stable warfarin therapy. Patients & methods A multiple linear-regression analysis was performed using log-transformed effective warfarin dose as the dependent variable, and combining CYP2C9 and VKORC1 genotyping with other relevant nongenetic clinical and demographic factors as independent predictors. Results The model explained more than two-thirds of the observed variance in the warfarin dose among Puerto Ricans, and also produced significantly better ‘ideal dose’ estimates than two pharmacogenetic models and clinical algorithms published previously, with the greatest benefit seen in patients ultimately requiring <7 mg/day. We also assessed the clinical validity of the model using an independent validation cohort of 55 Puerto Rican patients from Hartford, CT, USA (R2 = 51%). Conclusion Our findings provide the basis for planning prospective pharmacogenetic studies to demonstrate the clinical utility of genotyping warfarin-treated Puerto Rican patients. PMID:23215886
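    The modeling recipe in this record (multiple linear regression on log-transformed dose against genotype counts plus clinical covariates) can be sketched on synthetic data. The predictor names, coefficients, and simulated cohort below are invented for illustration; only the regress-then-back-transform pattern follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 163  # cohort size matching the abstract; data itself is simulated

# Hypothetical predictors mirroring the variable types used in such models:
age = rng.uniform(40, 85, n)
cyp2c9_variants = rng.integers(0, 3, n)  # count of reduced-function alleles
vkorc1_a = rng.integers(0, 3, n)         # VKORC1 variant allele count
amiodarone = rng.integers(0, 2, n)       # concomitant amiodarone (0/1)

# Simulated "true" relationship on the log-dose scale (coefficients invented)
log_dose = (1.8 - 0.008 * age - 0.35 * cyp2c9_variants
            - 0.30 * vkorc1_a - 0.20 * amiodarone + rng.normal(0, 0.1, n))

# Ordinary least squares on log(dose), then back-transform to mg/day
X = np.column_stack([np.ones(n), age, cyp2c9_variants, vkorc1_a, amiodarone])
beta, *_ = np.linalg.lstsq(X, log_dose, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((log_dose - pred) ** 2) / np.sum((log_dose - log_dose.mean()) ** 2)
predicted_dose_mg = np.exp(pred)
```

    Log-transforming the dose keeps the back-transformed predictions positive and tends to stabilize the variance of dose requirements, which is why these pharmacogenetic algorithms typically model log dose.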

  8. Applications of nonlocal means algorithm in low-dose X-ray CT image processing and reconstruction: a review

    PubMed Central

    Zhang, Hao; Zeng, Dong; Zhang, Hua; Wang, Jing; Liang, Zhengrong

    2017-01-01

    Low-dose X-ray computed tomography (LDCT) imaging is highly recommended for use in the clinic because of growing concerns over excessive radiation exposure. However, the CT images reconstructed by the conventional filtered back-projection (FBP) method from low-dose acquisitions may be severely degraded with noise and streak artifacts due to excessive X-ray quantum noise, or with view-aliasing artifacts due to insufficient angular sampling. In 2005, the nonlocal means (NLM) algorithm was introduced as a non-iterative edge-preserving filter to denoise natural images corrupted by additive Gaussian noise, and showed superior performance. It has since been adapted and applied to many other image types and various inverse problems. This paper specifically reviews the applications of the NLM algorithm in LDCT image processing and reconstruction, and explicitly demonstrates its improving effects on the reconstructed CT image quality from low-dose acquisitions. The effectiveness of these applications on LDCT and their relative performance are described in detail. PMID:28303644
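    The NLM filter this review surveys can be written compactly. The sketch below is the textbook pixelwise form (each pixel becomes a weighted average over a search window, with weights given by Gaussian similarity of surrounding patches), not any specific LDCT variant from the review; `h` is the usual filtering parameter.

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.1):
    """Minimal nonlocal means for a 2D image."""
    pr, sr = patch // 2, search // 2
    padded = np.pad(img, pr + sr, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pr + sr, j + pr + sr  # position in padded image
            ref = padded[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            weights, values = [], []
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    d2 = np.mean((ref - cand) ** 2)  # patch dissimilarity
                    weights.append(np.exp(-d2 / h ** 2))
                    values.append(padded[ni, nj])
            out[i, j] = np.average(values, weights=weights)
    return out
```

    Production implementations (e.g. the variants reviewed here) vectorize this and adapt the weights to the CT noise model rather than assuming additive Gaussian noise.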

  9. SU-E-J-109: Evaluation of Deformable Accumulated Parotid Doses Using Different Registration Algorithms in Adaptive Head and Neck Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, S; Chinese PLA General Hospital, Beijing, 100853 China; Liu, B

    2015-06-15

    Purpose: Three deformable image registration (DIR) algorithms were utilized to perform deformable dose accumulation for head and neck tomotherapy treatments, and the differences among the accumulated doses were evaluated. Methods: Daily MVCT data for 10 patients with pathologically proven nasopharyngeal cancers were analyzed. The data were acquired using tomotherapy (TomoTherapy, Accuray) at the PLA General Hospital. The prescription dose to the primary target was 70 Gy in 33 fractions. Three DIR methods (B-spline, Diffeomorphic Demons, and MIMvista) were used to propagate parotid structures from the planning CTs to the daily CTs and to accumulate the fractionated dose on the planning CTs. The mean accumulated doses of the parotids were quantitatively compared, and the uncertainties of the propagated parotid contours were evaluated using the Dice similarity index (DSI). Results: The planned mean dose of the ipsilateral parotids (32.42±3.13 Gy) was slightly higher than that of the contralateral parotids (31.38±3.19 Gy) in the 10 patients. The differences among the accumulated mean doses of the ipsilateral parotids with the B-spline, Demons, and MIMvista deformation algorithms (36.40±5.78 Gy, 34.08±6.72 Gy, and 33.72±2.63 Gy) were statistically significant (B-spline vs Demons, p < 0.0001; B-spline vs MIMvista, p = 0.002). The differences for the contralateral parotids (34.08±4.82 Gy, 32.42±4.80 Gy, and 33.92±4.65 Gy) were significant for B-spline vs Demons (p = 0.009) but not for B-spline vs MIMvista (p = 0.074). For the DSI analysis, the scores of the B-spline, Demons, and MIMvista DIRs were 0.90, 0.89, and 0.76. Conclusion: Shrinkage of the parotid volumes results in a dose increase to the parotid glands in adaptive head and neck radiotherapy. The accumulated parotid doses differ significantly among the DIR algorithms between kVCT and MVCT. Therefore, a volume-based criterion (i.e., DSI) is essential as a quantitative evaluation of registration accuracy, in addition to visual assessment by the treating physician. This work was supported in part by a grant from the Chinese Natural Science Foundation (Grant No. 11105225).
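    The DSI used above to score contour agreement is straightforward to compute from binary masks; a minimal sketch:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity index between two binary masks: 2|A∩B| / (|A| + |B|).
    Returns 1.0 for identical non-empty masks, 0.0 for disjoint ones."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())
```

    In the adaptive workflow described above, `mask_a` would be the DIR-propagated parotid contour rasterized on the daily CT grid and `mask_b` the physician-drawn contour.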

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moura, Eduardo S., E-mail: emoura@wisc.edu; Micka, John A.; Hammer, Cliff G.

    Purpose: This work presents the development of a phantom to verify the treatment planning system (TPS) algorithms used for high-dose-rate (HDR) brachytherapy. It is designed to measure the relative dose in heterogeneous media. The experimental details, simulation methods, and comparisons with a commercial TPS are also provided. Methods: To simulate heterogeneous conditions, four materials were used: Virtual Water™ (VM), BR50/50™, cork, and aluminum. The materials were arranged in 11 heterogeneity configurations. Three dosimeters were used to measure the relative response from an HDR ¹⁹²Ir source: TLD-100™, Gafchromic® EBT3 film, and an Exradin™ A1SL ionization chamber. To compare the results from the experimental measurements, the various configurations were modeled in the PENELOPE/penEasy Monte Carlo code. Images of each setup geometry were acquired from a CT scanner and imported into the BrachyVision™ TPS software, which includes the grid-based Boltzmann solver Acuros™. The results of the measurements performed in the heterogeneous setups were normalized to the dose values measured in the homogeneous Virtual Water™ setup, and the respective differences due to the heterogeneities were considered. Additionally, dose values calculated with the American Association of Physicists in Medicine Task Group 43 (TG-43) formalism were compared to dose values calculated with the Acuros™ algorithm in the phantom. Calculated doses were compared at the same points where measurements had been performed. Results: Differences in the relative response as high as 11.5% from the homogeneous setup were found when the heterogeneous materials were inserted into the experimental phantom. The aluminum and cork materials produced larger differences than the plastic materials, with BR50/50™ producing results similar to Virtual Water™. Our experimental measurements agree with the PENELOPE/penEasy simulations for most setups and dosimeters. The relative differences with the Acuros™ algorithm were similar in both the experimental and simulated setups. The discrepancy between the BrachyVision™ Acuros™ and TG-43 dose calculations in the phantom described by this work exceeded 12% for certain setups. Conclusions: The results derived from the phantom measurements show good agreement with the simulations and with TPS calculations using the Acuros™ algorithm. Differences in the dose response were evident in the experimental results when heterogeneous materials were introduced. These measurements prove the usefulness of the heterogeneous phantom for verification of HDR treatment planning systems based on model-based dose calculation algorithms.

  11. SU-D-202-04: Validation of Deformable Image Registration Algorithms for Head and Neck Adaptive Radiotherapy in Routine Clinical Setting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L; Pi, Y; Chen, Z

    2016-06-15

    Purpose: To evaluate the ROI contour and accumulated-dose differences arising from different deformable image registration (DIR) algorithms in head and neck (H&N) adaptive radiotherapy. Methods: Eight H&N cancer patients were randomly selected from the affiliated hospital. During the treatment course, patients were rescanned every week, with ROIs delineated by a radiation oncologist on each weekly CT. New weekly treatment plans with a consistent dose prescription were also designed on the rescanned CTs and executed for one week on a Siemens CT-on-rails accelerator. In total, six weekly CT scans (CT1 to CT6) and six weekly treatment plans were obtained for each patient. The primary CT1 was set as the reference CT for DIR with the remaining five weekly CTs, using the ANACONDA and MORFEUS algorithms separately in RayStation; the external skin ROI was set as the controlling ROI for both. The calculated weekly doses were deformed and accumulated on the corresponding reference CT1 according to the deformation vector fields (DVFs) generated by the two DIR algorithms. We thus obtained both ANACONDA-based and MORFEUS-based accumulated total doses on CT1 for each patient. At the same time, the ROIs on CT1 were mapped to generate corresponding ROIs on CT6 using the ANACONDA and MORFEUS DIR algorithms. DICE coefficients between the DIR-deformed and oncologist-delineated ROIs on CT6 were calculated. Results: For the DIR-accumulated dose, PTV D95 and left-eyeball Dmax showed significant differences of 67.13 cGy and 109.29 cGy, respectively (Table 1). For the DIR-mapped ROIs, the PTV, spinal cord, and left optic nerve showed differences of −0.025, −0.127, and −0.124 (Table 2). Conclusion: Even two well-established DIR algorithms can give divergent results for ROI deformation and dose accumulation. As more and more TPSs integrate DIR modules, there is an urgent need to recognize the potential risks of using DIR clinically.

  12. Report of the AAPM Task Group No. 105: Issues associated with clinical implementation of Monte Carlo-based photon and electron external beam treatment planning.

    PubMed

    Chetty, Indrin J; Curran, Bruce; Cygler, Joanna E; DeMarco, John J; Ezzell, Gary; Faddegon, Bruce A; Kawrakow, Iwan; Keall, Paul J; Liu, Helen; Ma, C M Charlie; Rogers, D W O; Seuntjens, Jan; Sheikh-Bagheri, Daryoush; Siebers, Jeffrey V

    2007-12-01

    The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques. 
The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.

  13. Investigation of effective decision criteria for multiobjective optimization in IMRT.

    PubMed

    Holdsworth, Clay; Stewart, Robert D; Kim, Minsun; Liao, Jay; Phillips, Mark H

    2011-06-01

    To investigate how using different sets of decision criteria impacts the quality of intensity-modulated radiation therapy (IMRT) plans obtained by multiobjective optimization. A multiobjective optimization evolutionary algorithm (MOEA) was used to produce sets of IMRT plans. The MOEA consisted of two interacting algorithms: (i) a deterministic inverse planning optimization of beamlet intensities that minimizes a weighted sum of quadratic penalty objectives to generate IMRT plans and (ii) an evolutionary algorithm that selects the superior IMRT plans using decision criteria and uses those plans to determine the new weights and penalty objectives of each new plan. Plans resulting from the deterministic algorithm were evaluated by the evolutionary algorithm using a set of decision criteria for both targets and organs at risk (OARs). Decision criteria used included variation in the target dose distribution, mean dose, maximum dose, generalized equivalent uniform dose (gEUD), an equivalent uniform dose formula (EUD(α,β)) derived from the linear-quadratic survival model, and points on dose-volume histograms (DVHs). In order to quantitatively compare results from trials using different decision criteria, a neutral set of comparison metrics was used. For each set of decision criteria investigated, IMRT plans were calculated for four different cases: two simple prostate cases, one complex prostate case, and one complex head and neck case. When smaller numbers of decision criteria, more descriptive decision criteria, or less anticorrelated decision criteria were used to characterize plan quality during multiobjective optimization, dose to OARs and target dose variation were reduced in the final population of plans. Mean OAR dose and gEUD (a = 4) decision criteria were comparable. Using maximum-dose decision criteria for OARs near targets resulted in inferior populations that focused solely on low target variance at the expense of high OAR dose. Target dose range (D(max) − D(min)) decision criteria were found to be most effective for keeping target doses uniform. Using target gEUD decision criteria resulted in much lower OAR doses but much higher target dose variation. EUD(α,β)-based decision criteria focused on a region of plan space that was a compromise between target and OAR objectives. None of these target decision criteria dominated the plans produced using other criteria; each merely approached a different area of the Pareto front. The choice of decision criteria implemented in the MOEA had a significant impact on the region explored and on the rate of convergence toward the Pareto front. When more decision criteria, anticorrelated decision criteria, or decision criteria with insufficient information were implemented, inferior populations resulted. When more informative decision criteria were used, such as gEUD, EUD(α,β), target dose range, and mean dose, MOEA optimizations approached different regions of the Pareto front but did not dominate each other. Using simple OAR decision criteria together with target EUD(α,β) decision criteria demonstrated the potential to generate IMRT plans that significantly reduce dose to OARs while achieving the same or better tumor control when clinical requirements on target dose variance can be met or relaxed.
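    The gEUD decision criterion discussed above has the standard closed form gEUD = (1/N · Σ dᵢᵃ)^(1/a); a minimal implementation:

```python
import numpy as np

def gEUD(doses, a):
    """Generalized equivalent uniform dose of a voxel dose list:
    (mean(d_i^a))^(1/a). a = 1 gives the mean dose; large positive a
    approaches the maximum dose (appropriate for serial OARs)."""
    d = np.asarray(doses, dtype=float)
    return np.mean(d ** a) ** (1.0 / a)
```

    This makes the paper's observation concrete: mean OAR dose is just the a = 1 special case, so gEUD with a = 4 behaves similarly to mean dose while penalizing hot spots somewhat more.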

  14. A spatially encoded dose difference maximal intensity projection map for patient dose evaluation: A new first line patient quality assurance tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu Weigang; Graff, Pierre; Boettger, Thomas

    2011-04-15

    Purpose: To develop a spatially encoded dose difference maximal intensity projection (DD-MIP) as an online patient dose evaluation tool for visualizing the differences between the planning dose and the dose on the treatment day. Methods: Megavoltage cone-beam CT (MVCBCT) images acquired on the treatment day are used for generating the dose difference index. Each index is represented by a different color for the underdose, acceptable, and overdose regions. A maximal intensity projection (MIP) algorithm is developed to compress all the information of an arbitrary 3D dose difference index into a 2D DD-MIP image. In this algorithm, a distance transformation is generated based on the planning CT. Then, two new volumes representing the overdose and underdose regions of the dose difference index are encoded with the distance transformation map. The distance-encoded indices of each volume are normalized using the skin distance obtained on the planning CT. After that, two MIPs are generated based on the underdose and overdose volumes with green-to-blue and green-to-red lookup tables, respectively. Finally, the two MIPs are merged with an appropriate transparency level and rendered on the planning CT images. Results: The spatially encoded DD-MIP was implemented in a dose-guided radiotherapy prototype and tested on 33 MVCBCT images from six patients. The user can easily establish the thresholds for overdose and underdose. A 3% difference between the treatment and planning doses was used as the threshold in this study; hence, the DD-MIP shows red for overdoses exceeding 3% and blue for underdoses exceeding 3%. With such a method, the overdose and underdose regions can be visualized and distinguished without being overshadowed by superficial dose differences. Conclusions: A DD-MIP algorithm was developed that compresses information from 3D into a single projection or two orthogonal projections while indicating to the user whether a dose difference lies on the skin surface or deeper.
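    The projection step can be sketched as follows. This simplified version keeps only the signed relative dose difference beyond the ±3% threshold and projects the voxel with the largest magnitude along one axis; the distance encoding and lookup-table rendering described in the abstract are omitted.

```python
import numpy as np

def dose_difference_mip(planned, delivered, threshold=0.03, axis=0):
    """Sketch of a dose-difference MIP: classify each voxel against a
    relative threshold, then project the largest-magnitude difference
    along one axis into a 2D map (+ = overdose, - = underdose)."""
    rel = (delivered - planned) / np.maximum(planned, 1e-9)
    signed = np.where(np.abs(rel) > threshold, rel, 0.0)  # zero = acceptable
    # maximal-intensity projection of |difference|, keeping the sign
    idx = np.argmax(np.abs(signed), axis=axis)
    return np.take_along_axis(signed, np.expand_dims(idx, axis),
                              axis=axis).squeeze(axis)
```

    Rendering would then map positive values through a green-to-red lookup table and negative values through green-to-blue, as in the abstract.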

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Saenz, D

    Purpose: Stereotactic radiosurgery (SRS) outcomes are related to the dose delivered to the target and to the surrounding tissue. We commissioned a Monte Carlo-based dose calculation algorithm to recalculate delivered doses that had been planned with a pencil-beam dose engine. Methods: Twenty consecutive previously treated patients were selected for this study. All plans were generated using the iPlan treatment planning system (TPS) and calculated using the pencil beam algorithm. Each patient plan consisted of 1 to 3 targets and was treated using dynamically conformal arcs or intensity-modulated beams. Multi-target treatments were delivered using multiple isocenters, one for each target. These plans were recalculated for the purpose of this study using a single isocenter. The CT image sets, along with the plans, doses, and structures, were exported via DICOM to the Monaco TPS, and the dose was recalculated using the same voxel resolution and monitor units. Benchmark data were also generated prior to the patient calculations to assess the accuracy of the two TPSs against measurements made with a micro ionization chamber in solid water. Results: Good agreement, within −0.4% for Monaco and +2.2% for iPlan, was observed for measurements in the water phantom. Doses in patient geometry revealed differences of up to 9.6% for single-target plans and 9.3% for multiple-target, multiple-isocenter plans. The average dose difference for multi-target, single-isocenter plans was approximately 1.4%. Similar differences were observed for the OARs and the integral dose. Conclusion: Accuracy of the beam model is crucial for dose calculation, especially for the small fields used in SRS treatments. A superior dose calculation algorithm such as Monte Carlo, with a properly commissioned beam model and unaffected by the lack of electronic equilibrium, should be preferred for the calculation of small fields to improve accuracy.

  16. Incorporating partial shining effects in proton pencil-beam dose calculation

    NASA Astrophysics Data System (ADS)

    Li, Yupeng; Zhang, Xiaodong; Fwu Lii, Ming; Sahoo, Narayan; Zhu, Ron X.; Gillin, Michael; Mohan, Radhe

    2008-02-01

    A range modulator wheel (RMW) is an essential component in passively scattered proton therapy. We have observed that a proton beam spot may shine on multiple steps of the RMW. Proton dose calculation algorithms normally do not consider this partial shining effect and thus overestimate the dose at the proximal shoulder of the spread-out Bragg peak (SOBP) compared with measurement. If the SOBP is adjusted to better fit the plateau region, the entrance dose is likely to be underestimated. In this work, we developed an algorithm that models this effect and allows dose calculations that better fit the measured SOBP. First, a set of apparent modulator weights was calculated without considering partial shining. Next, protons spilled from the accelerator reaching the modulator wheel were simplified as a circular spot of uniform intensity. A weight-splitting process was then performed to generate a set of effective modulator weights with the partial shining effect incorporated. The SOBPs of eight options, which label different combinations of proton-beam energy and scattering devices, were calculated with the generated effective weights. For all eight options, our algorithm fitted the measured SOBP in the proximal and entrance regions much better than calculations that ignored the partial shining effect. In a prostate patient, we found that dose calculation without the partial shining effect underestimated the femoral head and skin doses.
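    The weight-splitting step can be sketched as a redistribution of each step's apparent weight over neighboring steps. The `spot_fractions` kernel below (the fraction of the uniform circular spot shining on each neighboring step) is a hypothetical input; the paper derives it from the spot size and step geometry, which this sketch does not attempt.

```python
import numpy as np

def split_weights(apparent_weights, spot_fractions):
    """Redistribute each modulator step's apparent weight over neighboring
    steps according to the fraction of the beam spot covering each step.
    spot_fractions is a centered kernel that must sum to 1."""
    assert np.isclose(sum(spot_fractions), 1.0)
    n = len(apparent_weights)
    k = len(spot_fractions) // 2
    effective = np.zeros(n)
    for i, w in enumerate(apparent_weights):
        for j, f in enumerate(spot_fractions):
            step = min(max(i + j - k, 0), n - 1)  # clamp at the wheel ends
            effective[step] += w * f
    return effective
```

    Note the redistribution conserves the total weight, so the overall output stays fixed while the SOBP shoulder is smoothed, consistent with the effect described in the abstract.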

  17. Local ROI Reconstruction via Generalized FBP and BPF Algorithms along More Flexible Curves

    PubMed Central

    Ye, Yangbo; Zhao, Shiying; Wang, Ge

    2006-01-01

    We study the local region-of-interest (ROI) reconstruction problem, also referred to as the local CT problem. Our scheme includes two steps: (a) the locally truncated normal-dose projections are extended to a global dataset by combining them with a few global low-dose projections; (b) the ROI is reconstructed by either the generalized filtered backprojection (FBP) or the backprojection-filtration (BPF) algorithm. The simulation results show that both the FBP and BPF algorithms can reconstruct satisfactory results, with image quality in the ROI comparable to that of the corresponding global CT reconstruction. PMID:23165018

  18. MO-PIS-Exhibit Hall-01: Imaging: CT Dose Optimization Technologies I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denison, K; Smith, S

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The imaging topic this year is CT scanner dose optimization capabilities. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Dose Optimization Capabilities of GE Computed Tomography Scanners Presentation Time: 11:15 – 11:45 AM GE Healthcare is dedicated to the delivery of high quality clinical images through the development of technologies that optimize the application of ionizing radiation. In computed tomography, dose management solutions fall into four categories. The first employs projection data and statistical modeling to decrease noise in the reconstructed image, creating an opportunity for mA reduction in the acquisition of diagnostic images. Veo represents true Model Based Iterative Reconstruction (MBiR). Using high-level algorithms in tandem with advanced computing power, Veo enables lower pixel noise standard deviation and improved spatial resolution within a single image. Advanced Adaptive Image Filters allow for maintenance of spatial resolution while reducing image noise. Examples of adaptive image-space filters include Neuro 3-D filters and Cardiac Noise Reduction Filters. AutomA adjusts mA along the z-axis and is the CT equivalent of auto exposure control in conventional x-ray systems. Dynamic Z-axis Tracking offers an additional opportunity for dose reduction in helical acquisitions, while SmartTrack Z-axis Tracking serves to ensure beam, collimator, and detector alignment during tube rotation. SmartmA provides angular mA modulation. ECG Helical Modulation reduces mA during the systolic phase of the heart cycle. SmartBeam optimization uses bowtie beam-shaping hardware and software to filter off-axis x-rays, minimizing dose and reducing x-ray scatter.
    The DICOM Radiation Dose Structured Report (RDSR) generates a dose report at the conclusion of every examination. Dose Check preemptively notifies CT operators when scan parameters exceed user-defined dose thresholds. DoseWatch is an information technology application providing vendor-agnostic dose tracking and analysis for CT (and all other diagnostic x-ray modalities). SnapShot Pulse improves coronary CTA dose management. VolumeShuttle uses two acquisitions to increase coverage, decrease dose, and conserve contrast administration. Color-Coding for Kids applies the Broselow-Luten Pediatric System to facilitate pediatric emergency care and reduce medical errors. FeatherLight achieves dose optimization through pediatric procedure-based protocols. Adventure Series scanners provide a child-friendly imaging environment, promoting patient cooperation with a resultant reduction in retakes and patient motion. Philips CT Dose Optimization Tools and Advanced Reconstruction Presentation Time: 11:45 – 12:15 PM The first part of the talk will cover “Dose Reduction and Dose Optimization Technologies” present in Philips CT scanners. The main technologies to be presented include: DoseRight and tube current modulation (DoseRight, Z-DOM, 3D-DOM, DoseRight Cardiac); special acquisition modes; beam filtration and beam shapers; the Eclipse and ClearRay collimators; and the NanoPanel detector. DoseRight covers automatic tube current selection that automatically adjusts the dose for the individual patient. The presentation will explore the modulation techniques currently employed in Philips CT scanners and will include the algorithmic concepts as well as illustrative examples.
Modulation and current selection technologies to be covered include the Automatic Current Selection component of DoseRight, Z-DOM longitudinal dose modulation, 3D-DOM (a combination of longitudinal and rotational dose modulation), Cardiac DoseRight (an ECG-based dose modulation scheme), and the DoseRight Index (DRI) IQ index. The special acquisition modes portion covers techniques such as prospective gating, which is designed to reduce exposure to the patient through the Cardiac Step and Shoot scan mode. This mode can substitute for the much higher-dose retrospective scan modes for certain types of cardiac imaging. The beam filtration and beam shaper portion will discuss the variety of filtration and beam shaping configurations available on Philips scanners. This topic includes the x-ray beam characteristics and tube filtration as well as dose compensator characteristics. The Eclipse collimator, ClearRay collimator and NanoPanel detector portion will discuss additional technologies specific to wide-coverage CT that address some of the unique challenges encountered and techniques employed to optimize image quality and dose utilization. The Eclipse collimator reduces extraneous exposure by actively blocking the radiation tails at either end of helical scans that do not contribute to image generation. The ClearRay collimator and the NanoPanel detector optimize the quality of the signal that reaches the detectors by addressing the increased scattered radiation present in wide-coverage acquisitions, and the NanoPanel detector adds superior electronic noise characteristics, valuable when imaging at a low dose level. The second part of the talk will present “Advanced Reconstruction Technologies” currently available on Philips CT scanners. The talk will cover filtered back projection (FBP), iDose4 and Iterative Model Reconstruction (IMR). Each reconstruction method will include a discussion of the algorithm as well as similarities and differences between the algorithms.
Examples illustrating the merits of each algorithm will be presented, along with techniques and metrics to characterize the performance of each type of algorithm. The filtered back projection portion will provide a brief summary of relevant standard image reconstruction techniques in common use and discuss the common tradeoffs when using the FBP algorithm. The iDose4 portion will present the algorithms used for iDose4 as well as the different levels available; the meaning of the different iDose4 levels will be presented and quantified. Guidelines for selecting iDose4 parameters based on the imaging need will be explained. The different image quality goals available with iDose4, and specifically how iDose4 enables noise reduction, spatial resolution improvement, or both, will be explained. The approaches to leveraging the benefits of iDose4, such as improved spatial resolution, decreased noise, and artifact prevention, will be described and quantified, and the measurements and metrics behind the improvements will be presented. The image quality benefits in specific imaging situations, as well as how to best combine the technology with other dose reduction strategies to ensure the best image quality at a given dose level, will also be covered. Insight into the IMR algorithm, as well as contrasts with the iDose4 techniques and performance characteristics, will be discussed. Metrics and techniques for characterizing this class of algorithm and its IQ performance will be presented. The image quality benefits and the dose reduction capabilities of IMR will be explored.
Illustrative examples of the noise reduction, spatial resolution improvement, and low contrast detectability improvements of the reconstruction method will be presented: clinical cases and phantom measurements demonstrating the benefits of IMR in the areas of low-dose imaging, spatial resolution and low contrast resolution will be discussed, and the technical details behind the measurements will be presented in comparison to both iDose4 and traditional filtered back projection (FBP).

  19. SU-C-207-05: A Comparative Study of Noise-Reduction Algorithms for Low-Dose Cone-Beam Computed Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mukherjee, S; Yao, W

    2015-06-15

    Purpose: To study different noise-reduction algorithms and to improve the image quality of low-dose cone beam CT for patient positioning in radiation therapy. Methods: In low-dose cone-beam CT, the reconstructed image is contaminated with excessive quantum noise. In this study, three well-developed noise reduction algorithms, namely (a) the penalized weighted least square (PWLS) method, (b) the split-Bregman total variation (TV) method, and (c) the compressed sensing (CS) method, were studied and applied to images of a computer-simulated “Shepp-Logan” phantom and a physical CATPHAN phantom. Up to 20% additive Gaussian noise was added to the Shepp-Logan phantom. The CATPHAN phantom was scanned by a Varian OBI system at 100 kVp, 4 ms and 20 mA. To compare the performance of these algorithms, the peak signal-to-noise ratio (PSNR) of the denoised images was computed. Results: The algorithms were shown to have potential for reducing the noise level in low-dose CBCT images. For the Shepp-Logan phantom, an improvement in PSNR of 2 dB, 3.1 dB and 4 dB was observed using PWLS, TV and CS, respectively, while for CATPHAN, the improvement was 1.2 dB, 1.8 dB and 2.1 dB, respectively. Conclusion: Penalized weighted least square, total variation and compressed sensing methods were studied and compared for reducing the noise on a simulated phantom and a physical phantom scanned by low-dose CBCT. The techniques have shown promising results for noise reduction in terms of PSNR improvement. However, reducing the noise without compromising the smoothness and resolution of the image needs more extensive research.
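    The PSNR figure of merit used above can be computed directly from a reference image and a test image. A minimal sketch (the function name and the toy usage are illustrative, not from the study):

```python
import numpy as np

def psnr(reference, test, data_range=None):
    """Peak signal-to-noise ratio in dB between two images.
    data_range defaults to the dynamic range of the reference."""
    reference = np.asarray(reference, dtype=float)
    test = np.asarray(test, dtype=float)
    if data_range is None:
        data_range = reference.max() - reference.min()
    mse = np.mean((reference - test) ** 2)  # mean squared error
    return 10.0 * np.log10(data_range ** 2 / mse)
```

    The reported PSNR improvements then correspond to psnr(reference, denoised) minus psnr(reference, noisy).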

  20. Impact of dose engine algorithm in pencil beam scanning proton therapy for breast cancer.

    PubMed

    Tommasino, Francesco; Fellin, Francesco; Lorentini, Stefano; Farace, Paolo

    2018-06-01

    Proton therapy for the treatment of breast cancer is attracting increasing interest, due to the potential reduction of radiation-induced side effects such as cardiac and pulmonary toxicity. While several in silico studies demonstrated the gain in plan quality offered by pencil beam scanning (PBS) compared to passive scattering techniques, the related dosimetric uncertainties have been poorly investigated so far. Five breast cancer patients were planned with the RayStation 6 analytical pencil beam (APB) and Monte Carlo (MC) dose calculation algorithms. Plans were optimized with APB, and then MC was used to recalculate the dose distribution. Movable snout and beam splitting techniques (i.e. using two sub-fields for the same beam entrance, one with and the other without a range shifter) were considered. PTV dose statistics were recorded. The same planning configurations were adopted for the experimental benchmark. Dose distributions were measured with a 2D array of ionization chambers and compared to the APB- and MC-calculated ones by means of a γ analysis (agreement criteria 3%, 3 mm). Our results indicate that, when using proton PBS for breast cancer treatment, the RayStation 6 APB algorithm does not provide sufficient accuracy, especially with large air gaps. On the contrary, the MC algorithm resulted in much higher accuracy in all beam configurations tested and is to be recommended. Centers where an MC algorithm is not yet available should consider a careful use of APB, possibly combined with a movable snout system or in any case with strategies aimed at minimizing air gaps. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
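    The γ analysis used for the benchmark combines a dose-difference criterion and a distance-to-agreement criterion at every evaluated point. A brute-force 2D sketch under assumed conventions (global normalization to the reference maximum and a 10% low-dose cutoff are assumptions; all names are illustrative):

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing, dose_tol=0.03, dist_tol=3.0, cutoff=0.1):
    """Global 2D gamma analysis (brute force). ref/ev are dose grids on the
    same regular grid; spacing is the pixel size in mm."""
    dmax = ref.max()
    dd = dose_tol * dmax                  # dose criterion in absolute dose
    r = int(np.ceil(dist_tol / spacing))  # search radius in pixels
    ny, nx = ref.shape
    gammas = []
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < cutoff * dmax:  # skip the low-dose region
                continue
            best = np.inf
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < ny and 0 <= jj < nx):
                        continue
                    dist2 = (di * spacing) ** 2 + (dj * spacing) ** 2
                    dose2 = (ev[ii, jj] - ref[i, j]) ** 2
                    best = min(best, dist2 / dist_tol**2 + dose2 / dd**2)
            gammas.append(np.sqrt(best))
    return 100.0 * np.mean(np.array(gammas) <= 1.0)
```

    Production tools interpolate the evaluated distribution rather than searching only grid points, so this sketch slightly overestimates γ values.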

  1. MO-C-17A-11: A Segmentation and Point Matching Enhanced Deformable Image Registration Method for Dose Accumulation Between HDR CT Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhen, X; Chen, H; Zhou, L

    2014-06-15

    Purpose: To propose and validate a novel and accurate deformable image registration (DIR) scheme to facilitate dose accumulation among treatment fractions of high-dose-rate (HDR) gynecological brachytherapy. Methods: We have developed a method to adapt DIR algorithms to gynecologic anatomies with HDR applicators by incorporating a segmentation step and a point-matching step into an existing DIR framework. In the segmentation step, a random walks algorithm is used to accurately segment and remove the applicator region (AR) in the HDR CT image. A semi-automatic seed point generation approach is developed to obtain the incremented foreground and background point sets to feed the random walks algorithm. In the subsequent point-matching step, a feature-based thin-plate spline robust point matching (TPS-RPM) algorithm is employed for AR surface point matching. With the resulting mapping, a DVF characteristic of the deformation between the two AR surfaces is generated by B-spline approximation, which serves as the initial DVF for the following Demons DIR between the two AR-free HDR CT images. Finally, the DVF calculated via Demons, combined with the initial one, serves as the final DVF to map doses between HDR fractions. Results: The segmentation and registration accuracy were quantitatively assessed on nine clinical HDR cases from three gynecological cancer patients. The quantitative results as well as visual inspection of the DIR indicate that our proposed method can suppress the interference of the applicator with the DIR algorithm, and accurately register HDR CT images as well as deform and accumulate interfractional HDR doses. Conclusions: We have developed a novel and robust DIR scheme that can perform registration between HDR gynecological CT images and yield accurate registration results. This new DIR scheme has potential for accurate interfractional HDR dose accumulation.
This work is supported in part by the National Natural Science Foundation of China (No. 30970866 and No. 81301940)
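    Combining the initial B-spline DVF with the Demons DVF into a final DVF amounts to composing two displacement fields. A 2D sketch, assuming both fields are sampled on the same grid and that the second field is applied after the first (the composition order and the use of linear interpolation are assumptions of this sketch, not stated in the abstract):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def compose_dvf(u_first, u_second):
    """Compose two 2D displacement fields of shape (2, H, W), in pixels.
    The combined field maps x -> x + u_first(x) + u_second(x + u_first(x))."""
    _, h, w = u_first.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # sample u_second at the points displaced by u_first
    coords = np.stack([yy + u_first[0], xx + u_first[1]])
    u2_warped = np.stack([
        map_coordinates(u_second[c], coords, order=1, mode="nearest")
        for c in range(2)
    ])
    return u_first + u2_warped
```

    For two constant shifts the composition reduces to their sum, which makes a convenient sanity check.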

  2. TH-E-BRE-05: Analysis of Dosimetric Characteristics in Two Leaf Motion Calculator Algorithms for Sliding Window IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, L; Huang, B; Rowedder, B

    Purpose: The Smart leaf motion calculator (SLMC) in the Eclipse treatment planning system is an advanced fluence delivery modeling algorithm, as it takes into account fine MLC features including inter-leaf leakage, rounded leaf tips, non-uniform leaf thickness, and the spindle cavity. In this study, the SLMC and traditional Varian LMC (VLMC) algorithms were investigated, for the first time, in terms of dosimetric characteristics and delivery accuracy of sliding window (SW) IMRT. Methods: SW IMRT plans of 51 cancer cases were included to evaluate the dosimetric characteristics and dose delivery accuracy from leaf motion calculated by SLMC and VLMC, respectively. All plans were delivered using a Varian TrueBeam Linac. The DVHs and MUs of the plans were analyzed. Three patient-specific QA tools (the independent dose calculation software IMSure, the Delta4 phantom, and EPID portal dosimetry) were also used to measure the delivered dose distribution. Results: Significant differences in the MUs were observed between the two LMCs (p≤0.001). Gamma analysis shows excellent agreement between the planned dose distribution calculated by both LMC algorithms and the delivered dose distribution measured by the three QA tools in all plans at 3%/3 mm, leading to a mean pass rate exceeding 97%. The mean fraction of pixels with gamma < 1 of SLMC is slightly lower than that of VLMC in the IMSure and Delta4 results, but higher in portal dosimetry (the highest spatial resolution), especially in complex cases such as nasopharynx. Conclusion: The study suggests that the two LMCs generate similar target coverage and sparing patterns of critical structures. However, SLMC is modestly more accurate than VLMC in modeling advanced MLC features, which may lead to a more accurate dose delivery in SW IMRT. Current clinical QA tools might not be specific enough to differentiate the dosimetric discrepancies at the millimeter level calculated by these two LMC algorithms.
NIH/NIGMS grant U54 GM104944, Lincy Endowed Assistant Professorship.« less

  3. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: As for the CLs, conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: Independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac.
This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
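    The abstract does not detail its modified Clarkson algorithm, but the classic Clarkson technique estimates scatter at a point by averaging tabulated scatter-air ratios (SAR) over equal-angle sectors drawn from the calculation point to the irregular field edge. A generic sketch with a hypothetical SAR table (function name and data are illustrative, not from the SMU program):

```python
import numpy as np

def clarkson_sar(radii, sar_table_r, sar_table_v):
    """Average scatter-air ratio at a point by Clarkson sector integration.
    radii: distance (cm) from the calculation point to the field edge in each
    equal-angle sector; sar_table_r / sar_table_v: tabulated SAR versus
    circular-field radius at the depth of interest (hypothetical data)."""
    sar = np.interp(radii, sar_table_r, sar_table_v)  # SAR of each sector
    return np.mean(sar)                               # average over sectors
```

    The averaged SAR is then added to the zero-area tissue-air ratio to obtain the effective TAR of the irregular field.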

  4. SU-F-P-39: End-To-End Validation of a 6 MV High Dose Rate Photon Beam, Configured for Eclipse AAA Algorithm Using Golden Beam Data, for SBRT Treatments Using RapidArc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreyra, M; Salinas Aranda, F; Dodat, D

    Purpose: To use end-to-end testing to validate a 6 MV high dose rate photon beam, configured for the Eclipse AAA algorithm using Golden Beam Data (GBD), for SBRT treatments using RapidArc. Methods: Beam data was configured for the Varian Eclipse AAA algorithm using the GBD provided by the vendor. Transverse and diagonal dose profiles, PDDs and output factors down to a field size of 2×2 cm2 were measured on a Varian Trilogy Linac and compared with the GBD library using 2%/2 mm 1D gamma analysis. The MLC transmission factor and dosimetric leaf gap were determined to characterize the MLC in Eclipse. Mechanical and dosimetric tests were performed combining different gantry rotation speeds, dose rates and leaf speeds to evaluate the delivery system performance according to VMAT accuracy requirements. An end-to-end test was implemented by planning several SBRT RapidArc treatments on a CIRS 002LFC IMRT Thorax Phantom. The CT scanner calibration curve was acquired and loaded in Eclipse. A PTW 31013 ionization chamber was used with a Keithley 35617EBS electrometer for absolute point dose measurements in water and lung equivalent inserts. TPS-calculated planar dose distributions were compared to those measured using EPID and MapCheck, as an independent verification method. Results were evaluated with gamma criteria of 2% dose difference and 2 mm DTA for 95% of points. Results: The GBD set vs. measured data passed 2%/2 mm 1D gamma analysis even for small fields. Machine performance tests show results are independent of machine delivery configuration, as expected. Absolute point dosimetry comparison resulted within 4% for the worst-case scenario in lung. Over 97% of the points evaluated in dose distributions passed gamma index analysis. Conclusion: Eclipse AAA algorithm configuration of the 6 MV high dose rate photon beam using GBD proved efficient. End-to-end test dose calculation results indicate it can be used clinically for SBRT using RapidArc.

  5. Clinical implementation of AXB from AAA for breast: Plan quality and subvolume analysis.

    PubMed

    Guebert, Alexandra; Conroy, Leigh; Weppler, Sarah; Alghamdi, Majed; Conway, Jessica; Harper, Lindsay; Phan, Tien; Olivotto, Ivo A; Smith, Wendy L; Quirk, Sarah

    2018-05-01

    Two dose calculation algorithms are available in Varian Eclipse software: the Anisotropic Analytical Algorithm (AAA) and Acuros External Beam (AXB). Many Varian Eclipse-based centers have access to AXB; however, a thorough understanding of how it will affect plan characteristics and, subsequently, clinical practice is necessary prior to implementation. We characterized the difference in breast plan quality between AXB and AAA for dissemination to clinicians during implementation. Locoregional irradiation plans were created with AAA for 30 breast cancer patients with a prescription dose of 50 Gy to the breast and 45 Gy to the regional nodes, in 25 fractions. The internal mammary chain CTV (IMC CTV) nodes were covered by 80% of the breast dose. AXB, with both dose-to-water and dose-to-medium reporting, was used to recalculate plans while maintaining constant monitor units. Target coverage and organ-at-risk doses were compared between the two algorithms using dose-volume parameters. An analysis to assess location-specific changes was performed by dividing the breast into nine subvolumes in the superior-inferior and left-right directions. There were minimal differences found between the AXB- and AAA-calculated plans. The median difference between AXB and AAA for breast CTV V95% was <2.5%. For the IMC CTV, the median differences in V95% and V80% were <5% and 0%, respectively, indicating that IMC CTV coverage only decreased when marginally covered. Mean superficial dose increased by a median of 3.2 Gy. In the subvolume analysis, the medial subvolumes were "hotter" when recalculated with AXB and the lateral subvolumes "cooler"; however, all differences were within 2 Gy. We observed minimal difference in magnitude and spatial distribution of dose when comparing the two algorithms. The largest observable differences occurred in superficial dose regions.
Therefore, clinical implementation of AXB from AAA for breast radiotherapy is not expected to result in changes in clinical practice for prescribing or planning breast radiotherapy. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
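    Dose-volume parameters such as V95% can be evaluated directly from the voxel doses belonging to a structure. A minimal sketch (names illustrative):

```python
import numpy as np

def v_pct(dose, prescription, threshold_pct):
    """Percent of the structure volume receiving at least threshold_pct of
    the prescription dose. `dose` holds the voxel doses inside the structure
    (equal voxel volumes assumed)."""
    dose = np.asarray(dose, dtype=float)
    cutoff = threshold_pct / 100.0 * prescription
    return 100.0 * np.mean(dose >= cutoff)
```

    The V95% differences quoted above correspond to v_pct evaluated on the AXB and AAA dose grids for the same structure.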

  6. Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.

    PubMed

    Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin

    2013-08-01

    The old treatment planning systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been the main dose calculation formalism in TPSs. The purpose of this study is to design and establish a treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism and applying the effects of the applicator and dummy spacers. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on the old formalisms. In the new planning system, the dosimetry parameters of each pellet at different places inside the applicators were obtained with the MCNP4c code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for special combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalism was investigated by comparing the results of the STPS and PLATO software with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
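    For reference, the TG-43U1 point-source approximation underlying such a system has the form D(r) = S_K · Λ · (r0/r)² · gP(r) · φan(r), with r0 = 1 cm, and the dose around a pellet train follows by superposition. A sketch with placeholder tabulated data (the numeric values in the example are arbitrary, not consensus data for Cs-137):

```python
import numpy as np

R0 = 1.0  # TG-43 reference distance (cm)

def dose_rate_point(r, air_kerma_strength, dose_rate_constant,
                    g_r, g_val, phi_r, phi_val):
    """TG-43U1 point-source approximation:
    D(r) = S_K * Lambda * (R0/r)**2 * g_P(r) * phi_an(r).
    g_r/g_val and phi_r/phi_val are tabulated radial dose function and 1D
    anisotropy factor; real values come from a consensus dataset."""
    g = np.interp(r, g_r, g_val)        # radial dose function
    phi = np.interp(r, phi_r, phi_val)  # 1D anisotropy factor
    return air_kerma_strength * dose_rate_constant * (R0 / r) ** 2 * g * phi
```

    Summing dose_rate_point over the positions of the active pellets gives the superposition step described in the abstract.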

  7. SU-E-J-92: Validating Dose Uncertainty Estimates Produced by AUTODIRECT, An Automated Program to Evaluate Deformable Image Registration Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Pouliot, J

    2015-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool with the potential to deformably map dose from one computed-tomography (CT) image to another. Errors in the DIR, however, will produce errors in the transferred dose distribution. We have proposed a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), which predicts voxel-specific dose mapping errors on a patient-by-patient basis. This work validates the effectiveness of AUTODIRECT in predicting dose mapping errors with virtual and physical phantom datasets. Methods: AUTODIRECT requires four inputs: the moving and fixed CT images, and two noise scans of a water phantom (for noise characterization). AUTODIRECT then uses algorithms to generate test deformations and applies them to the moving and fixed images (along with processing) to digitally create sets of test images, with known ground-truth deformations that are similar to the actual one. The clinical DIR algorithm is then applied to these test image sets (currently 4). From these tests, AUTODIRECT generates spatial and dose uncertainty estimates for each image voxel based on a Student's t distribution. This work compares these uncertainty estimates to the actual errors made by the Velocity Deformable Multi Pass algorithm on 11 virtual and 1 physical phantom datasets. Results: For 11 of the 12 tests, the dose error distributions predicted by AUTODIRECT are well matched to the actual error distributions, within 1–6% for the 10 virtual phantoms and 9% for the physical phantom. For one of the cases, though, the predictions underestimated the errors in the tail of the distribution. Conclusion: Overall, the AUTODIRECT algorithm performed well on the 12 phantom cases for Velocity and was shown to generate accurate estimates of dose warping uncertainty. AUTODIRECT is able to automatically generate patient-, organ-, and voxel-specific DIR uncertainty estimates.
This ability would be useful for patient-specific DIR quality assurance.
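    The Student's t-based estimate can be illustrated for a single voxel: from a handful of test-deformation error samples, a confidence half-width for the mean error follows from the t quantile. This is a generic t interval, a sketch only; the exact statistic AUTODIRECT computes is not specified in the abstract:

```python
import numpy as np
from scipy.stats import t

def t_uncertainty(samples, confidence=0.95):
    """Half-width of a two-sided Student-t confidence interval for the mean
    error, from a small number of test-deformation trials per voxel."""
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    s = samples.std(ddof=1)                        # sample standard deviation
    return t.ppf(0.5 + confidence / 2.0, n - 1) * s / np.sqrt(n)
```

    With only 4 test registrations per voxel, the t quantile (3.18 for 95%, 3 degrees of freedom) appropriately widens the interval relative to a normal approximation.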

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, C; Zhang, H; Chen, Y

    Purpose: Recently, compressed sensing (CS) based iterative reconstruction (IR) methods have been receiving attention for reconstructing high-quality cone beam computed tomography (CBCT) images from sparsely sampled or noisy projections. The aim of this study is to develop a novel baseline algorithm called Mask Guided Image Reconstruction (MGIR), which can provide superior image quality for both low-dose 3DCBCT and 4DCBCT under a single mathematical framework. Methods: In MGIR, the unknown CBCT volume is mathematically modeled as a combination of two regions, where anatomical structures are 1) within the a priori defined mask and 2) outside the mask. Each part of the image is then updated alternately through solving minimization problems based on CS-type IR. For low-dose 3DCBCT, the former region is defined as the anatomically complex region, where the focus is on preserving edge information, while the latter region is defined as contrast-uniform and is hence aggressively updated to remove noise/artifact. In 4DCBCT, the regions are separated into the common static part and the moving part. The static and moving volumes are then updated with global and phase-sorted projections, respectively, to optimize the image quality of both parts simultaneously. Results: Examination of the MGIR algorithm showed that high-quality low-dose 3DCBCT and 4DCBCT images can be reconstructed without compromising the image resolution, imaging dose or scanning time. For low-dose 3DCBCT, a clinically viable and high-resolution head-and-neck image can be obtained while cutting the dose by 83%. In 4DCBCT, excellent quality 4DCBCT images could be reconstructed while requiring no more projection data and imaging dose than a typical clinical 3DCBCT scan. Conclusion: The results showed that the image quality of MGIR was superior to that of other published CS-based IR algorithms for both 4DCBCT and low-dose 3DCBCT.
This makes our MGIR algorithm potentially useful in various on-line clinical applications. Provisional Patent: UF#15476; WGS Ref. No. U1198.70067US00.
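    The alternating masked update at the heart of MGIR can be illustrated on a 1D denoising toy: two regions share one objective but are updated in turn with different smoothing strengths, so one region keeps edges while the other is aggressively smoothed. This sketch is a simplified analogue, not the MGIR algorithm itself (no projection model; all parameters are illustrative):

```python
import numpy as np

def masked_denoise(y, mask, beta_in=0.1, beta_out=2.0, iters=200, step=0.05):
    """Toy 1D analogue of mask-guided reconstruction: alternately update the
    signal inside the mask (weak smoothing) and outside it (strong smoothing),
    taking gradient steps on ||x - y||^2 + beta * sum (x[i+1] - x[i])^2
    with a region-dependent beta."""
    x = np.asarray(y, dtype=float).copy()
    for _ in range(iters):
        for region, beta in ((mask, beta_in), (~mask, beta_out)):
            grad = 2.0 * (x - y)              # data-fidelity gradient
            d = np.diff(x)
            grad[:-1] += -2.0 * beta * d      # smoothness-term gradient
            grad[1:] += 2.0 * beta * d
            x[region] -= step * grad[region]  # update only this region
    return x
```

    A clean constant signal is a fixed point of the iteration, which makes a simple sanity check.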

  9. Dosimetric Impact of Using the Acuros XB Algorithm for Intensity Modulated Radiation Therapy and RapidArc Planning in Nasopharyngeal Carcinomas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kan, Monica W.K., E-mail: kanwkm@ha.org.hk; Department of Physics and Materials Science, City University of Hong Kong, Hong Kong; Leung, Lucullus H.T.

    2013-01-01

    Purpose: To assess the dosimetric implications for intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy with RapidArc (RA) of nasopharyngeal carcinomas (NPC) due to the use of the Acuros XB (AXB) algorithm versus the anisotropic analytical algorithm (AAA). Methods and Materials: Nine-field sliding window IMRT and triple-arc RA plans produced for 12 patients with NPC using AAA were recalculated using AXB. The dose distributions to multiple planning target volumes (PTVs) with different prescribed doses and critical organs were compared. The PTVs were separated into components in bone, air, and tissue. The change of doses by AXB due to air and bone, and the variation of the amount of dose changes with number of fields, was also studied using simple geometric phantoms. Results: Using AXB instead of AAA, the averaged mean dose to PTV70 (70 Gy was prescribed to PTV70) was found to be 0.9% and 1.2% lower for IMRT and RA, respectively. It was approximately 1% lower in tissue, 2% lower in bone, and 1% higher in air. The averaged minimum dose to PTV70 in bone was approximately 4% lower for both IMRT and RA, whereas it was approximately 1.5% lower for PTV70 in tissue. The decrease in target doses estimated by AXB was contributed mostly by the presence of bone, less by tissue, and not at all by air. A similar trend was observed for PTV60 (60 Gy was prescribed to PTV60). The doses to most serial organs were found to be 1% to 3% lower and to other organs 4% to 10% lower for both techniques. Conclusions: The use of the AXB algorithm is highly recommended for IMRT and RapidArc planning for NPC cases.

  10. Optimization of the double dosimetry algorithm for interventional cardiologists

    NASA Astrophysics Data System (ADS)

    Chumak, Vadim; Morgun, Artem; Bakhanova, Elena; Voloskiy, Vitalii; Borodynchik, Elena

    2014-11-01

    A double dosimetry method is recommended in interventional cardiology (IC) to assess occupational exposure; yet currently there is no common and universal algorithm for effective dose estimation. In this work, flexible and adaptive algorithm building methodology was developed and some specific algorithm applicable for typical irradiation conditions of IC procedures was obtained. It was shown that the obtained algorithm agrees well with experimental measurements and is less conservative compared to other known algorithms.
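    Double dosimetry algorithms of the kind compared in this work are typically linear combinations of the under-apron reading Hu and the over-apron reading Ho. A generic sketch; the default coefficients below are placeholders for illustration, not values from any published algorithm:

```python
def effective_dose(h_under, h_over, alpha=1.0, beta=0.1):
    """Generic linear double-dosimetry estimator E = alpha*Hu + beta*Ho,
    where Hu is the personal dose equivalent measured under the protective
    apron and Ho the one measured over it. alpha and beta are
    algorithm-specific weighting coefficients (placeholders here)."""
    return alpha * h_under + beta * h_over
```

    Each published algorithm fixes its own (alpha, beta) pair, and the conservatism the abstract refers to corresponds to how large these coefficients are chosen.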

  11. SU-E-I-01: Iterative CBCT Reconstruction with a Feature-Preserving Penalty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyu, Q; Li, B; Southern Medical University, Guangzhou

    2015-06-15

    Purpose: Low-dose CBCT is desired in various clinical applications. Iterative image reconstruction algorithms have shown advantages in suppressing noise in low-dose CBCT. However, due to the smoothness constraint enforced during the reconstruction process, edges may be blurred and image features may be lost in the reconstructed image. In this work, we propose a new penalty design to preserve image features in images reconstructed by iterative algorithms. Methods: Low-dose CBCT is reconstructed by minimizing the penalized weighted least-squares (PWLS) objective function. Binary Robust Independent Elementary Features (BRIEF) of the image were integrated into the penalty of PWLS. BRIEF is a general-purpose point descriptor that can be used to identify important features of an image. In this work, the BRIEF distance of two neighboring pixels was used to weigh the smoothing parameter in PWLS. For pixels with a large BRIEF distance, a weaker smoothness constraint is enforced, so image features are better preserved. The performance of the PWLS algorithm with the BRIEF penalty was evaluated with a CatPhan 600 phantom. Results: The image quality reconstructed by the proposed PWLS-BRIEF algorithm is superior to that of the conventional PWLS method and the standard FDK method. At matched noise level, edges in the PWLS-BRIEF reconstructed image are better preserved. Conclusion: This study demonstrated that the proposed PWLS-BRIEF algorithm has great potential for preserving image features in low-dose CBCT.
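    The feature-weighted penalty can be sketched as follows: each pixel carries a binary descriptor, and the smoothing weight for a neighboring pair decays with the Hamming distance between their descriptors, so likely edges (large distance) are smoothed less in the quadratic penalty. The descriptor length, decay constant and function name here are illustrative assumptions, not the paper's exact design:

```python
import numpy as np

def feature_weights(descriptors, h=8.0):
    """Smoothing weights for consecutive pixel pairs from binary feature
    descriptors (one bit-vector per pixel, shape (N, n_bits)). Pairs with a
    large Hamming distance get a small weight, so a quadratic penalty
    sum_i w_i * (x[i+1] - x[i])**2 smooths less across edges."""
    d = np.count_nonzero(descriptors[:-1] != descriptors[1:], axis=1)
    return np.exp(-d / h)  # Hamming distance -> weight in (0, 1]
```

    Plugging these weights into the PWLS penalty recovers uniform smoothing (all weights 1) in flat regions and relaxed smoothing across descriptor discontinuities.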

  12. Extending Three-Dimensional Weighted Cone Beam Filtered Backprojection (CB-FBP) Algorithm for Image Reconstruction in Volumetric CT at Low Helical Pitches

    PubMed Central

    Hsieh, Jiang; Nilsen, Roy A.; McOlash, Scott M.

    2006-01-01

    A three-dimensional (3D) weighted helical cone beam filtered backprojection (CB-FBP) algorithm (namely, the original 3D weighted helical CB-FBP algorithm) has already been proposed to reconstruct images from projection data acquired along a helical trajectory in angular ranges up to [0, 2π]. However, an overscan is usually employed in the clinic to reconstruct tomographic images with superior noise characteristics at the most challenging anatomic structures, such as head and spine, extremity imaging, and CT angiography as well. To obtain the best achievable noise characteristics or dose efficiency in a helical overscan, we extended the 3D weighted helical CB-FBP algorithm to handle helical pitches that are smaller than 1:1 (namely, the extended 3D weighted helical CB-FBP algorithm). By decomposing a helical overscan with an angular range of [0, 2π + Δβ] into a union of full scans corresponding to an angular range of [0, 2π], the extended 3D weighting function is a summation of all 3D weighting functions corresponding to each full scan. An experimental evaluation shows that the extended 3D weighted helical CB-FBP algorithm can improve the noise characteristics or dose efficiency of the 3D weighted helical CB-FBP algorithm at helical pitches smaller than 1:1, while its reconstruction accuracy and computational efficiency are maintained. It is believed that such an efficient CB reconstruction algorithm, providing superior noise characteristics or dose efficiency at low helical pitches, may find extensive applications in CT medical imaging. PMID:23165031

  13. Study of 201 non-small cell lung cancer patients given stereotactic ablative radiation therapy shows local control dependence on dose calculation algorithm.

    PubMed

    Latifi, Kujtim; Oliver, Jasmine; Baker, Ryan; Dilling, Thomas J; Stevens, Craig W; Kim, Jongphil; Yue, Binglin; Demarco, Marylou; Zhang, Geoffrey G; Moros, Eduardo G; Feygelman, Vladimir

    2014-04-01

Pencil beam (PB) and collapsed cone convolution (CCC) dose calculation algorithms differ significantly when used in the thorax. However, such differences have seldom been directly correlated with outcomes of lung stereotactic ablative body radiation (SABR). Data for 201 non-small cell lung cancer patients treated with SABR were analyzed retrospectively. All patients were treated with 50 Gy in 5 fractions of 10 Gy each. The radiation prescription mandated that 95% of the planning target volume (PTV) receive the prescribed dose. One hundred sixteen patients were planned with BrainLab treatment planning software (TPS) with the PB algorithm and treated on a Novalis unit. The other 85 were planned on the Pinnacle TPS with the CCC algorithm and treated on a Varian linac. Treatment planning objectives were numerically identical for both groups. The median follow-up times were 24 and 17 months for the PB and CCC groups, respectively. The primary endpoint was local/marginal control of the irradiated lesion. Gray's competing risk method was used to determine the statistical differences in local/marginal control rates between the PB and CCC groups. Twenty-five patients planned with the PB and 4 patients planned with the CCC algorithm to the same nominal doses experienced local recurrence. There was a statistically significant difference in recurrence rates between the PB and CCC groups (hazard ratio 3.4 [95% confidence interval: 1.18-9.83], Gray's test P=.019). The differences (Δ) between the 2 algorithms for target coverage were as follows: ΔD99GITV = 7.4 Gy, ΔD99PTV = 10.4 Gy, ΔV90GITV = 13.7%, ΔV90PTV = 37.6%, ΔD95PTV = 9.8 Gy, and ΔDISO = 3.4 Gy. GITV = gross internal tumor volume. Local control rates in patients planned to the same nominal dose with the PB and CCC algorithms were statistically significantly different. Possible alternative explanations are described in the report, although they are not thought likely to explain the difference. We conclude that the difference is due to relative dosimetric underdosing of tumors with the PB algorithm. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devpura, S; Li, H; Liu, C

Purpose: To correlate dose distributions computed using six algorithms for recurrent early stage non-small cell lung cancer (NSCLC) patients treated with stereotactic body radiotherapy (SBRT), with outcome (local failure). Methods: Of 270 NSCLC patients treated with 12 Gy × 4, 20 were found to have local recurrence prior to the 2-year time point. These patients were originally planned with a 1-D pencil beam (1-D PB) algorithm. 4D imaging was performed to manage tumor motion. Regions of local failure were determined from follow-up PET-CT scans. Follow-up CT images were rigidly fused to the planning CT (pCT), and recurrent tumor volumes (Vrecur) were mapped to the pCT. Dose was recomputed, retrospectively, using five algorithms: 3-D PB, collapsed cone convolution (CCC), anisotropic analytical algorithm (AAA), AcurosXB, and Monte Carlo (MC). Tumor control probability (TCP) was computed using the Marsden model (1,2). Patterns of failure were classified as central, in-field, marginal, and distant for Vrecur ≥95% of prescribed dose, 95–80%, 80–20%, and ≤20%, respectively (3). Results: Average PTV D95 (dose covering 95% of the PTV) for 3-D PB, CCC, AAA, AcurosXB, and MC relative to 1-D PB was 95.3±2.1%, 84.1±7.5%, 84.9±5.7%, 86.3±6.0%, and 85.1±7.0%, respectively. TCP values for 1-D PB, 3-D PB, CCC, AAA, AcurosXB, and MC were 98.5±1.2%, 95.7±3.0%, 79.6±16.1%, 79.7±16.5%, 81.1±17.5%, and 78.1±20%, respectively. Patterns of local failure were similar for the 1-D and 3-D PB plans, which predicted that the majority of failures occur in central regions, with only ∼15% occurring distantly. However, with convolution/superposition and MC type algorithms, the majority of failures (65%) were predicted to be distant, consistent with the literature. Conclusion: Based on MC and convolution/superposition type algorithms, average PTV D95 and TCP were ∼15% lower than for the planned 1-D PB dose calculation. Patterns-of-failure results suggest that MC and convolution/superposition type algorithms predict different outcomes relative to PB algorithms. Work supported in part by Varian Medical Systems, Palo Alto, CA.
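The four-bin patterns-of-failure scheme used in the record above (central / in-field / marginal / distant, by the dose level covering Vrecur) maps directly onto a small helper. The function name and the handling of bin boundaries are assumptions:

```python
def classify_failure(covered_fraction):
    """Patterns-of-failure bins from the abstract: label a recurrence by
    the dose level (as a fraction of the prescription) that covers the
    recurrent volume Vrecur. Boundary assignment is an assumption:
    >=95% central, 95-80% in-field, 80-20% marginal, <=20% distant."""
    if covered_fraction >= 0.95:
        return "central"
    if covered_fraction >= 0.80:
        return "in-field"
    if covered_fraction > 0.20:
        return "marginal"
    return "distant"
```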

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, S; Suh, T; Chung, J

Purpose: The purpose of this study is to evaluate the dosimetric and radiobiological impact of the Acuros XB (AXB) and Anisotropic Analytic Algorithm (AAA) dose calculation algorithms on prostate stereotactic body radiation therapy plans with both conventional flattened (FF) and flattening-filter-free (FFF) modes. Methods: For thirteen patients with prostate cancer, SBRT planning was performed using a 10-MV photon beam with FF and FFF modes. The total dose prescribed to the PTV was 42.7 Gy in 7 fractions. All plans were initially calculated using the AAA algorithm in the Eclipse treatment planning system (11.0.34), and then were re-calculated using AXB with the same MUs and MLC files. The four types of plans for the different algorithms and beam modes were compared in terms of homogeneity and conformity. To evaluate the radiobiological impact, tumor control probability (TCP) and normal tissue complication probability (NTCP) calculations were performed. Results: For the PTV, both calculation algorithms and beam modes led to comparable homogeneity and conformity. However, the averaged TCP values in AXB plans were always lower than in AAA plans, with an average difference of 5.3% and 6.1% for the 10-MV FFF and FF beams, respectively. In addition, the averaged NTCP values for organs at risk (OARs) were comparable. Conclusion: This study showed that prostate SBRT plans yielded comparable dosimetric results across dose calculation algorithms and delivery beam modes. For the biological results, even though NTCP values for both calculation algorithms and beam modes were similar, AXB plans produced slightly lower TCP compared to the AAA plans.

  16. Dosimetric verification of radiotherapy treatment planning systems in Serbia: national audit

    PubMed Central

    2012-01-01

Background Independent external audits play an important role in the quality assurance programme in radiation oncology. The audit supported by the IAEA in Serbia was designed to review the whole chain of activities in the 3D conformal radiotherapy (3D-CRT) workflow, from patient data acquisition to treatment planning and dose delivery. The audit was based on the IAEA recommendations and focused on the dosimetry part of the treatment planning and delivery processes. Methods The audit was conducted in three radiotherapy departments of Serbia. An anthropomorphic phantom was scanned with a computed tomography (CT) unit, and treatment plans for eight different test cases involving various beam configurations suggested by the IAEA were prepared on local treatment planning systems (TPSs). The phantom was irradiated following the treatment plans for these test cases and doses at specific points were measured with an ionization chamber. The differences between the measured and calculated doses were reported. Results The measurements were conducted for different photon beam energies and TPS calculation algorithms. The deviations between the measured and calculated values for all test cases made with advanced algorithms were within the agreement criteria, while larger deviations were observed for simpler algorithms. The number of measurements with results outside the agreement criteria increased with the beam energy and decreased with TPS calculation algorithm sophistication. Also, a few errors in the basic dosimetry data in the TPSs were detected and corrected. Conclusions The audit helped the users to better understand the operational features and limitations of their TPSs and resulted in increased confidence in dose calculation accuracy using TPSs. The audit results indicated the shortcomings of simpler algorithms for the test cases performed and, therefore, the transition to more advanced algorithms is highly desirable. PMID:22971539

  17. Dosimetric verification of radiotherapy treatment planning systems in Serbia: national audit.

    PubMed

    Rutonjski, Laza; Petrović, Borislava; Baucal, Milutin; Teodorović, Milan; Cudić, Ozren; Gershkevitsh, Eduard; Izewska, Joanna

    2012-09-12

Independent external audits play an important role in the quality assurance programme in radiation oncology. The audit supported by the IAEA in Serbia was designed to review the whole chain of activities in the 3D conformal radiotherapy (3D-CRT) workflow, from patient data acquisition to treatment planning and dose delivery. The audit was based on the IAEA recommendations and focused on the dosimetry part of the treatment planning and delivery processes. The audit was conducted in three radiotherapy departments of Serbia. An anthropomorphic phantom was scanned with a computed tomography (CT) unit, and treatment plans for eight different test cases involving various beam configurations suggested by the IAEA were prepared on local treatment planning systems (TPSs). The phantom was irradiated following the treatment plans for these test cases and doses at specific points were measured with an ionization chamber. The differences between the measured and calculated doses were reported. The measurements were conducted for different photon beam energies and TPS calculation algorithms. The deviations between the measured and calculated values for all test cases made with advanced algorithms were within the agreement criteria, while larger deviations were observed for simpler algorithms. The number of measurements with results outside the agreement criteria increased with the beam energy and decreased with TPS calculation algorithm sophistication. Also, a few errors in the basic dosimetry data in the TPSs were detected and corrected. The audit helped the users to better understand the operational features and limitations of their TPSs and resulted in increased confidence in dose calculation accuracy using TPSs. The audit results indicated the shortcomings of simpler algorithms for the test cases performed and, therefore, the transition to more advanced algorithms is highly desirable.

  18. GPU-based fast cone beam CT reconstruction from undersampled and noisy projection data via total variation.

    PubMed

    Jia, Xun; Lou, Yifei; Li, Ruijiang; Song, William Y; Jiang, Steve B

    2010-04-01

Cone-beam CT (CBCT) plays an important role in image guided radiation therapy (IGRT). However, the large radiation dose from serial CBCT scans in most IGRT procedures raises a clinical concern, especially for pediatric patients, who are essentially excluded from receiving IGRT for this reason. The goal of this work is to develop a fast GPU-based algorithm to reconstruct CBCT from undersampled and noisy projection data so as to lower the imaging dose. The CBCT is reconstructed by minimizing an energy functional consisting of a data fidelity term and a total variation regularization term. The authors developed a GPU-friendly version of the forward-backward splitting algorithm to solve this model. A multigrid technique is also employed. It is found that 20-40 x-ray projections are sufficient to reconstruct images with satisfactory quality for IGRT. The reconstruction time ranges from 77 to 130 s on an NVIDIA Tesla C1060 (NVIDIA, Santa Clara, CA) GPU card, depending on the number of projections used, which is estimated to be about 100 times faster than similar iterative reconstruction approaches. Moreover, phantom studies indicate that the algorithm enables the CBCT to be reconstructed under a scanning protocol with as low as 0.1 mA s/projection. Compared with the currently widely used full-fan head and neck scanning protocol of approximately 360 projections at 0.4 mA s/projection, it is estimated that an overall 36-72 times dose reduction is achieved with this fast CBCT reconstruction algorithm. This work indicates that the developed GPU-based CBCT reconstruction algorithm is capable of lowering the imaging dose considerably. The high computational efficiency of this algorithm makes the iterative CBCT reconstruction approach applicable in real clinical environments.
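The energy functional in this record (data fidelity plus total variation) can be sketched in 1-D. For brevity this sketch uses plain gradient descent on a smoothed TV rather than the paper's GPU forward-backward splitting, and a sampling mask stands in for the CBCT projection operator:

```python
import numpy as np

def tv_reconstruct(b, mask, lam=0.1, step=0.2, iters=2000, eps=1e-2):
    """1-D sketch of the model: minimize
        0.5 * ||M x - b||^2 + lam * TV(x),
    where M is a binary sampling mask standing in for the projection
    operator and TV is smoothed as sum(sqrt(diff(x)^2 + eps))."""
    m = mask.astype(float)
    x = b.astype(float).copy()
    for _ in range(iters):
        grad_fid = m * (m * x - b)          # data-fidelity gradient
        d = np.diff(x)
        w = d / np.sqrt(d * d + eps)        # gradient of smoothed |d|
        grad_tv = np.zeros_like(x)
        grad_tv[:-1] -= w
        grad_tv[1:] += w
        x -= step * (grad_fid + lam * grad_tv)
    return x
```

On a piecewise-constant signal sampled at every other point, the TV term fills in the missing samples while keeping the jump sharp, which is the behavior that makes 20-40 projections sufficient in the paper's setting.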

  19. SU-E-T-381: Evaluation of Calculated Dose Accuracy for Organs-At-Risk Located at Out-Of-Field in a Commercial Treatment Planning System for High Energy Photon Beams Produced From TrueBeam Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, L; Ding, G

Purpose: Dose calculation accuracy for the out-of-field dose is important for predicting the dose to organs-at-risk located outside the primary beams. Existing publications evaluating the out-of-field dose calculation accuracy of treatment planning systems (TPS) have focused on low-energy (6 MV) photon beams. This study evaluates the out-of-field dose calculation accuracy of the AAA algorithm for 15-MV high energy photon beams. Methods: We used the EGSnrc Monte Carlo (MC) codes to evaluate the AAA algorithm in the Varian Eclipse TPS (v.11). The incident beams start from validated Varian phase-space sources for a TrueBeam linac equipped with a Millennium 120 MLC. Dose comparisons between AAA and MC for CT-based realistic patient treatment plans using VMAT techniques for prostate and lung were performed, and the uncertainties of organ doses predicted by AAA at out-of-field locations were evaluated. Results: The results show that AAA calculations underestimate doses at the level of 1% (or less) of the prescribed dose for CT-based patient treatment plans using VMAT techniques. In regions where the dose is only 1% of the prescribed dose, although AAA underestimates the out-of-field dose by 30% relative to the local dose, this is only about 0.3% of the prescribed dose. For example, the uncertainty of the calculated organ dose to liver or kidney located out-of-field is <0.3% of the prescribed dose. Conclusion: For 15-MV high energy photon beams, very good agreement (<1%) in calculated dose distributions was obtained between AAA and MC. The uncertainty of out-of-field dose calculations predicted by the AAA algorithm for realistic patient VMAT plans is <0.3% of the prescribed dose in regions where the dose relative to the prescribed dose is <1%, although the uncertainties can be much larger relative to local doses. For organs-at-risk located at out-of-field, the error of the dose predicted by Eclipse using AAA is negligible. This work was conducted in part using the resources of Varian research grant VUMC40590-R.

  20. Implementation of a dose gradient method into optimization of dose distribution in prostate cancer 3D-CRT plans

    PubMed Central

    Giżyńska, Marta K.; Kukołowicz, Paweł F.; Kordowski, Paweł

    2014-01-01

Aim The aim of this work is to present a method of beam weight and wedge angle optimization for patients with prostate cancer. Background 3D-CRT is usually realized with forward planning based on a trial-and-error method. Several authors have published methods of beam weight optimization applicable to 3D-CRT; still, none of these methods is in common use. Materials and methods Optimization is based on the assumption that the best plan is achieved if the dose gradient at the ICRU point is equal to zero. Our optimization algorithm requires the beam quality index, depth of maximum dose, profiles of wedged fields, and maximum dose to the femoral heads. The method was tested for 10 patients with prostate cancer treated with the 3-field technique. Optimized plans were compared with plans prepared by 12 experienced planners. Dose standard deviation in the target volume, and minimum and maximum doses, were analyzed. Results The quality of plans obtained with the proposed optimization algorithms was comparable to that of plans prepared by experienced planners. The mean difference in target dose standard deviation was 0.1% in favor of the plans prepared by planners for optimization of beam weights and wedge angles. Introducing a correction factor for the patient body outline into the dose gradient at the ICRU point improved dose distribution homogeneity. On average, a 0.1% lower standard deviation was achieved with the optimization algorithm. No significant difference in the mean dose-volume histogram for the rectum was observed. Conclusions Optimization greatly shortens planning time. The average planning time was 5 min for forward planning and less than a minute for computer optimization. PMID:25337411
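The central assumption above, that the best plan nulls the dose gradient at the ICRU point, can be illustrated as a small constrained least-squares problem. The per-beam gradient vectors and the solver below are hypothetical; the published method works from the beam quality index, wedged-field profiles, and femoral-head dose limits rather than explicit gradient vectors:

```python
import numpy as np

def balance_weights(beam_gradients):
    """Hypothetical sketch of the 'zero dose gradient at the ICRU point'
    criterion: find weights w (w_i >= 0, sum w = 1) such that
    sum_i w_i * g_i = 0, solved in the least-squares sense, where g_i is
    the dose-gradient vector of beam i at the ICRU point."""
    g = np.asarray(beam_gradients, dtype=float)   # shape (n_beams, 3)
    n = g.shape[0]
    A = np.vstack([g.T, np.ones((1, n))])         # null the gradient + normalize
    rhs = np.zeros(g.shape[1] + 1)
    rhs[-1] = 1.0
    w, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    w = np.clip(w, 0.0, None)                     # crude nonnegativity repair
    return w / w.sum()
```

For a symmetric 3-field arrangement the balanced solution is the intuitive one: equal weights.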

  1. Optimization for high-dose-rate brachytherapy of cervical cancer with adaptive simulated annealing and gradient descent.

    PubMed

    Yao, Rui; Templeton, Alistair K; Liao, Yixiang; Turian, Julius V; Kiel, Krystyna D; Chu, James C H

    2014-01-01

    To validate an in-house optimization program that uses adaptive simulated annealing (ASA) and gradient descent (GD) algorithms and investigate features of physical dose and generalized equivalent uniform dose (gEUD)-based objective functions in high-dose-rate (HDR) brachytherapy for cervical cancer. Eight Syed/Neblett template-based cervical cancer HDR interstitial brachytherapy cases were used for this study. Brachytherapy treatment plans were first generated using inverse planning simulated annealing (IPSA). Using the same dwell positions designated in IPSA, plans were then optimized with both physical dose and gEUD-based objective functions, using both ASA and GD algorithms. Comparisons were made between plans both qualitatively and based on dose-volume parameters, evaluating each optimization method and objective function. A hybrid objective function was also designed and implemented in the in-house program. The ASA plans are higher on bladder V75% and D2cc (p=0.034) and lower on rectum V75% and D2cc (p=0.034) than the IPSA plans. The ASA and GD plans are not significantly different. The gEUD-based plans have higher homogeneity index (p=0.034), lower overdose index (p=0.005), and lower rectum gEUD and normal tissue complication probability (p=0.005) than the physical dose-based plans. The hybrid function can produce a plan with dosimetric parameters between the physical dose-based and gEUD-based plans. The optimized plans with the same objective value and dose-volume histogram could have different dose distributions. Our optimization program based on ASA and GD algorithms is flexible on objective functions, optimization parameters, and can generate optimized plans comparable with IPSA. Copyright © 2014 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
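The gEUD quantity that distinguishes the two objective functions in the record above has a compact closed form, (mean of d_i^a)^(1/a); a minimal sketch:

```python
import numpy as np

def geud(doses, a):
    """Generalized equivalent uniform dose of a voxel dose list:
    gEUD = (mean(d_i^a))^(1/a). a = 1 gives the mean dose; large
    positive a approaches the maximum dose (serial-organ behavior),
    which is why rectum gEUD is a useful penalty in HDR planning."""
    d = np.asarray(doses, dtype=float)
    return float(np.mean(d ** a) ** (1.0 / a))
```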

  2. Comparison of low-contrast detectability between two CT reconstruction algorithms using voxel-based 3D printed textured phantoms.

    PubMed

    Solomon, Justin; Ba, Alexandre; Bochud, François; Samei, Ehsan

    2016-12-01

    To use novel voxel-based 3D printed textured phantoms in order to compare low-contrast detectability between two reconstruction algorithms, FBP (filtered-backprojection) and SAFIRE (sinogram affirmed iterative reconstruction) and determine what impact background texture (i.e., anatomical noise) has on estimating the dose reduction potential of SAFIRE. Liver volumes were segmented from 23 abdominal CT cases. The volumes were characterized in terms of texture features from gray-level co-occurrence and run-length matrices. Using a 3D clustered lumpy background (CLB) model, a fitting technique based on a genetic optimization algorithm was used to find CLB textures that were reflective of the liver textures, accounting for CT system factors of spatial blurring and noise. With the modeled background texture as a guide, four cylindrical phantoms (Textures A-C and uniform, 165 mm in diameter, and 30 mm height) were designed, each containing 20 low-contrast spherical signals (6 mm diameter at nominal contrast levels of ∼3.2, 5.2, 7.2, 10, and 14 HU with four repeats per signal). The phantoms were voxelized and input into a commercial multimaterial 3D printer (Object Connex 350), with custom software for voxel-based printing (using principles of digital dithering). Images of the textured phantoms and a corresponding uniform phantom were acquired at six radiation dose levels (SOMATOM Flash, Siemens Healthcare) and observer model detection performance (detectability index of a multislice channelized Hotelling observer) was estimated for each condition (5 contrasts × 6 doses × 2 reconstructions × 4 backgrounds = 240 total conditions). A multivariate generalized regression analysis was performed (linear terms, no interactions, random error term, log link function) to assess whether dose, reconstruction algorithm, signal contrast, and background type have statistically significant effects on detectability. 
Also, fitted curves of detectability (averaged across contrast levels) as a function of dose were constructed for each reconstruction algorithm and background texture. FBP and SAFIRE were compared for each background type to determine the improvement in detectability at a given dose, and the reduced dose at which SAFIRE had performance equivalent to FBP at 100% dose. Detectability increased with increasing radiation dose (P = 2.7 × 10⁻⁵⁹) and contrast level (P = 2.2 × 10⁻⁸⁶) and was higher in the uniform phantom compared to the textured phantoms (P = 6.9 × 10⁻⁵¹). Overall, SAFIRE had higher d' compared to FBP (P = 0.02). The estimated dose reduction potential of SAFIRE was found to be 8%, 10%, 27%, and 8% for the Texture-A, Texture-B, Texture-C, and uniform phantoms, respectively. In all background types, detectability was higher with SAFIRE compared to FBP. However, the relative improvement observed with SAFIRE was highly dependent on the complexity of the background texture. Iterative algorithms such as SAFIRE should be assessed in the most realistic context possible.

  3. Linear feasibility algorithms for treatment planning in interstitial photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Rendon, A.; Beck, J. C.; Lilge, Lothar

    2008-02-01

Interstitial photodynamic therapy (IPDT) has been under intense investigation in recent years, with multiple clinical trials underway. This effort has demanded the development of optimization strategies that determine the best locations and output powers for light sources (cylindrical or point diffusers) to achieve optimal light delivery. Furthermore, we have recently introduced cylindrical diffusers with customizable emission profiles, placing additional requirements on the optimization algorithms, particularly in terms of the stability of the inverse problem. Here, we present a general class of linear feasibility algorithms and their properties. Moreover, we compare two particular instances of these algorithms, which have been used in the context of IPDT: the Cimmino algorithm and a weighted gradient descent (WGD) algorithm. The algorithms were compared in terms of their convergence properties, the cost function they minimize in the infeasible case, their ability to regularize the inverse problem, and the resulting optimal light dose distributions. Our results show that the WGD algorithm overall performs slightly better than the Cimmino algorithm and that it converges to a minimizer of a clinically relevant cost function in the infeasible case. Interestingly, however, treatment plans resulting from either algorithm were very similar in terms of the resulting fluence maps and dose-volume histograms, once the diffuser powers were adjusted to achieve equal prostate coverage.
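The Cimmino algorithm named in this record is a simultaneous-projection method for linear feasibility. A minimal unweighted sketch for constraints of the form A x <= b follows; the IPDT implementations use clinically derived per-constraint weights, which are replaced by uniform 1/m weights here:

```python
import numpy as np

def cimmino(A, b, iters=500, relax=1.0):
    """Unweighted Cimmino iteration for the linear feasibility problem
    A x <= b: each step moves x by the (relaxed) average of its
    orthogonal projections onto the currently violated half-spaces."""
    m, n = A.shape
    x = np.zeros(n)
    row_norm2 = np.sum(A * A, axis=1)   # ||a_i||^2 for each constraint row
    for _ in range(iters):
        viol = np.maximum(A @ x - b, 0.0)   # positive only where violated
        x = x - relax * (A.T @ (viol / row_norm2)) / m
    return x
```

Because every violated constraint is handled simultaneously, the iteration parallelizes trivially, one reason this family is attractive for treatment-planning problems with many dose constraints.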

  4. SU-E-T-577: Commissioning of a Deterministic Algorithm for External Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, T; Finlay, J; Mesina, C

Purpose: We report commissioning results for a deterministic algorithm for external photon beam treatment planning. A deterministic algorithm solves the radiation transport equations directly using a finite difference method, thus improving the accuracy of dose calculation, particularly under heterogeneous conditions, with results similar to those of Monte Carlo (MC) simulation. Methods: Commissioning data for photon energies 6-15 MV include the percentage depth dose (PDD) measured at SSD = 90 cm and the output ratio in water (Spc), both normalized to 10 cm depth, for field sizes between 2 and 40 cm and depths between 0 and 40 cm. The off-axis ratio (OAR) for the same set of field sizes was used at 5 depths (dmax, 5, 10, 20, 30 cm). The final model was compared with the commissioning data as well as additional benchmark data. The benchmark data include dose per MU determined at 17 points for SSDs between 80 and 110 cm, depths between 5 and 20 cm, and lateral offsets of up to 16.5 cm. Relative comparisons were made in a heterogeneous phantom made of cork and solid water. Results: Compared to the commissioning beam data, the agreement is generally better than 2%, with larger errors (up to 13%) observed in the buildup regions of the PDD and the penumbra regions of the OAR profiles. The overall mean standard deviation is 0.04% when all data are taken into account. Compared to the benchmark data, the agreement is generally better than 2%. Relative comparison in the heterogeneous phantom is in general better than 4%. Conclusion: A commercial deterministic algorithm was commissioned for megavoltage photon beams. In a homogeneous medium, the agreement between the algorithm and measurement at the benchmark points is generally better than 2%. The dose accuracy of the deterministic algorithm is better than that of a convolution algorithm in heterogeneous media.

  5. Denoising of polychromatic CT images based on their own noise properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ji Hye; Chang, Yongjin; Ra, Jong Beom, E-mail: jbra@kaist.ac.kr

Purpose: Because of its high diagnostic accuracy and fast scan time, computed tomography (CT) has been widely used in various clinical applications. Since a CT scan exposes patients to radiation, however, dose reduction has recently been recognized as an important issue in CT imaging. Low-dose CT, though, increases noise in the image and thereby deteriorates the accuracy of diagnosis. In this paper, the authors develop an efficient denoising algorithm for low-dose CT images obtained using a polychromatic x-ray source. The algorithm is based on two steps: (i) estimation of the space-variant noise statistics, which are uniquely determined according to the system geometry and scanned object, and (ii) a subsequent novel conversion of the estimated noise to Gaussian noise so that an existing high-performance Gaussian noise filtering algorithm can be directly applied to CT images with non-Gaussian noise. Methods: For efficient polychromatic CT image denoising, the authors first reconstruct an image with the iterative maximum-likelihood polychromatic algorithm for CT to alleviate the beam-hardening problem. They then estimate the space-variant noise variance distribution in the image domain. Since there are many high-performance denoising algorithms available for Gaussian noise, image denoising can become much more efficient if they can be used. Hence, the authors propose a novel conversion scheme to transform the estimated space-variant noise to near-Gaussian noise. In the suggested scheme, the authors first convert the image so that its mean and variance have a linear relationship, and then produce a Gaussian image via a variance stabilizing transform. They then apply a block-matching 4D algorithm that is optimized for noise reduction in Gaussian images, and reconvert the result to obtain the final denoised image. To examine the performance of the proposed method, an XCAT phantom simulation and a physical phantom experiment were conducted. Results: Both simulation and experimental results show that, unlike the existing denoising algorithms, the proposed algorithm can effectively reduce the noise over the whole region of CT images while preventing degradation of image resolution. Conclusions: To effectively denoise polychromatic low-dose CT images, a novel denoising algorithm is proposed. Because this algorithm is based on the noise statistics of the reconstructed polychromatic CT image, the spatially varying noise in the image is effectively reduced, so that the denoised image has homogeneous quality over the image domain. Through a simulation and a real experiment, it is verified that the proposed algorithm delivers considerably better performance than the existing denoising algorithms.
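The conversion step described above, linearize the mean-variance relation and then apply a variance-stabilizing transform, can be sketched with an Anscombe-style transform. The scale constant `c` and the simple algebraic inverse are illustrative; the abstract does not give the authors' exact transform, and the exact unbiased inverse is more involved:

```python
import numpy as np

def stabilize(x, c):
    """Anscombe-style variance-stabilizing transform: once the noise is
    approximately Poisson-like with var ~ c * mean, 2*sqrt(x/c + 3/8)
    has near-unit variance, so an off-the-shelf Gaussian denoiser
    (the authors use a block-matching 4D filter) can be applied."""
    return 2.0 * np.sqrt(np.maximum(x, 0.0) / c + 0.375)

def unstabilize(y, c):
    """Plain algebraic inverse of stabilize."""
    return c * ((y / 2.0) ** 2 - 0.375)
```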

  6. Preliminary assessment of the impact of incorporating a detailed algorithm for the effects of nuclear irradiation on combat crew performance into the Janus combat simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warshawsky, A.S.; Uzelac, M.J.; Pimper, J.E.

The Crew III algorithm for assessing time- and dose-dependent combat crew performance subsequent to nuclear irradiation was incorporated into the Janus combat simulation system. Battle outcomes using this algorithm were compared to outcomes based on the currently used time-independent "cookie-cutter" assessment methodology. The results illustrate quantifiable differences in battle outcome between the two assessment techniques, and suggest that tactical nuclear weapons are more effective than currently assumed if the performance degradation attributed to radiation doses between 150 and 3000 rad is taken into account. 6 refs., 9 figs.

  7. Technical Note: A direct ray-tracing method to compute integral depth dose in pencil beam proton radiography with a multilayer ionization chamber.

    PubMed

    Farace, Paolo; Righetto, Roberto; Deffet, Sylvain; Meijers, Arturs; Vander Stappen, Francois

    2016-12-01

    To introduce a fast ray-tracing algorithm in pencil-beam proton radiography (PR) with a multilayer ionization chamber (MLIC) for in vivo range error mapping. Pencil-beam PR was obtained by delivering 9 × 9 spots uniformly positioned in a square (45 × 45 mm² field of view) capable of crossing the phantoms (210 MeV). The exit beam was collected by a MLIC to sample the integral depth dose (IDD_MLIC). PRs of an electron-density phantom and of a head phantom were acquired by moving the couch to obtain multiple 45 × 45 mm² frames. To map the corresponding range errors, the two-dimensional set of IDD_MLIC was compared with (i) the integral depth dose computed by the treatment planning system (TPS) by both analytic (IDD_TPS) and Monte Carlo (IDD_MC) algorithms in a volume of water simulating the MLIC at the CT, and (ii) the integral depth dose directly computed by a simple ray-tracing algorithm (IDD_direct) through the same CT data. The exact spatial position of the spot pattern was numerically adjusted by testing different in-plane positions and selecting the one that minimized the range differences between IDD_direct and IDD_MLIC. Range error mapping was feasible with both the TPS and the ray-tracing methods, but very sensitive to even small misalignments. In homogeneous regions, the range errors computed by the direct ray-tracing algorithm matched the results obtained by both the analytic and the Monte Carlo algorithms. In both phantoms, lateral heterogeneities were modeled better by the ray-tracing and Monte Carlo algorithms than by the analytic TPS computation. Accordingly, when the pencil beam crossed lateral heterogeneities, the range errors mapped by the direct algorithm matched the Monte Carlo maps better than those obtained by the analytic algorithm. Finally, the simplicity of the ray-tracing algorithm allowed us to implement a prototype procedure for automated spatial alignment. The ray-tracing algorithm can reliably replace the TPS method in MLIC PR for in vivo range verification, and it can be a key component in developing software tools for spatial alignment and correction of CT calibration.
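    The core of such a direct ray-tracing computation can be sketched as a water-equivalent path-length (WEPL) integral along the beam axis through the CT volume. The HU-to-stopping-power conversion below is a made-up two-point calibration for illustration only, not the study's calibration curve:

    ```python
    # A minimal sketch of the "direct" ray-tracing idea: accumulate water-equivalent
    # path length (WEPL) along a straight ray through a voxelized CT volume.
    # The HU -> relative stopping power (RSP) conversion below is a hypothetical
    # two-point calibration, NOT the calibration used in the study.

    def hu_to_rsp(hu):
        """Crude piecewise-linear HU -> relative stopping power conversion."""
        if hu <= 0:                       # air/lung up to water
            return max(0.0, 1.0 + hu / 1000.0)
        return 1.0 + hu / 1950.0          # soft tissue to bone (assumed slope)

    def wepl_along_ray(ct_line, step_mm):
        """Sum RSP * step length over HU samples taken along the ray."""
        return sum(hu_to_rsp(hu) * step_mm for hu in ct_line)

    # A uniform water column (HU = 0) has WEPL equal to its physical length:
    print(wepl_along_ray([0] * 100, 2.0))
    ```

    Comparing such a WEPL profile against the MLIC-measured depth dose is what yields a per-spot range error estimate.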

  8. Site-specific range uncertainties caused by dose calculation algorithms for proton therapy

    NASA Astrophysics Data System (ADS)

    Schuemann, J.; Dowdell, S.; Grassberger, C.; Min, C. H.; Paganetti, H.

    2014-08-01

    The purpose of this study was to assess the possibility of introducing site-specific range margins to replace current generic margins in proton therapy. A further goal was to study the potential for reducing margins with current analytical dose calculation methods. For this purpose we investigated the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict the range of proton fields. Dose distributions predicted by an analytical pencil-beam algorithm were compared with those obtained using Monte Carlo (MC) simulations (TOPAS). A total of 508 passively scattered treatment fields were analyzed for seven disease sites (liver, prostate, breast, medulloblastoma-spine, medulloblastoma-whole brain, lung, and head and neck). Voxel-by-voxel comparisons were performed on two-dimensional distal dose surfaces calculated by the pencil-beam and MC algorithms to obtain the average range differences and root mean square deviation for each field at the distal position of the 90% dose level (R90) and the 50% dose level (R50). The average dose degradation of the distal falloff region, defined as the distance between the distal positions of the 80% and 20% dose levels (R80-R20), was also analyzed. All ranges were calculated in water-equivalent distances. Considering total range uncertainties and uncertainties from dose calculation alone, we were able to deduce site-specific estimates. For liver, prostate, and whole-brain fields, our results demonstrate that a reduction of currently used uncertainty margins is feasible even without introducing MC dose calculations. We recommend range margins of 2.8% + 1.2 mm for liver and prostate treatments and 3.1% + 1.2 mm for whole-brain treatments. On the other hand, current margins seem to be insufficient for some breast, lung and head-and-neck patients, at least if used generically. 
If no case-specific adjustments are applied, a generic margin of 6.3% + 1.2 mm would be needed for breast, lung, and head-and-neck treatments. We conclude that the currently used generic range uncertainty margins in proton therapy should be redefined site-specifically and that complex geometries may require a field-specific adjustment. Routine verification of treatment plans using MC simulations is recommended for patients with heterogeneous geometries.
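    The recipes above all have the form "p% of beam range + fixed mm"; the helper below simply encodes the values quoted in the abstract to show how such a margin is applied:

    ```python
    # The site-specific recipes above all have the form (p% of beam range + fixed mm).
    # The percentages and the 1.2 mm offset are the values quoted in the abstract;
    # the helper itself is just an illustration of applying such a recipe.

    MARGINS = {                       # site -> (percent of range, fixed mm)
        "liver":       (2.8, 1.2),
        "prostate":    (2.8, 1.2),
        "whole_brain": (3.1, 1.2),
        "generic":     (6.3, 1.2),    # breast/lung/head-and-neck without case review
    }

    def range_margin_mm(site, beam_range_mm):
        pct, fixed = MARGINS[site]
        return beam_range_mm * pct / 100.0 + fixed

    # For a 150 mm beam range in liver: 150 * 2.8% + 1.2 mm
    print(range_margin_mm("liver", 150.0))
    ```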

  9. In vivo measurements for high dose rate brachytherapy with optically stimulated luminescent dosimeters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Renu; Jursinic, Paul A.

    2013-07-15

    Purpose: To show the feasibility of clinical implementation of OSLDs for high-dose-rate (HDR) in vivo dosimetry for gynecological and breast patients. To discuss how the OSLDs were characterized for an Ir-192 source, taking into account low gamma energy and high dose gradients. To describe differences caused by the dose calculation formalism of treatment planning systems. Methods: OSLD irradiations were made using the GammaMedplus iX Ir-192 HDR unit (Varian Medical Systems, Milpitas, CA). BrachyVision versions 8.9 and 10.0 (Varian Medical Systems, Milpitas, CA) were used for calculations; version 8.9 used the TG-43 algorithm and version 10.0 used the Acuros algorithm. The OSLDs (InLight Nanodots) were characterized for Ir-192. Various phantoms were created to assess calculated and measured doses and the angular dependence and self-absorption of the Nanodots. Following successful phantom measurements, patient measurements for gynecological and breast cancer patients were made and compared to calculated doses. Results: The OSLD sensitivity to Ir-192 relative to 6 MV is between 1.10 and 1.25, is unique to each detector, and changes with accumulated dose. The measured doses were compared to those predicted by the treatment planning system and found to be in agreement for the gynecological patients to within measurement uncertainty. The range of differences between the measured and Acuros-calculated doses was -10% to +14%. For the breast patients, there was a discrepancy of -4.4% to +6.5% between the measured and calculated doses at the skin surface when the Acuros algorithm was used. These differences were within experimental uncertainty due to random error in the location of the detector with respect to the treatment catheter. Conclusions: OSLDs can be successfully used for HDR in vivo dosimetry. However, for the measurements to be meaningful, one must account for the angular dependence, volume averaging, and the greater sensitivity to Ir-192 gamma rays than to the 6 MV x-rays used for OSLD calibration. The limitations of the treatment planning algorithm must be understood, especially for surface dose measurements. Use of in vivo dosimetry for HDR brachytherapy treatments is feasible and has the potential to detect and prevent gross errors. In vivo dosimetry should be included as part of the QA for an HDR brachytherapy program.
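    The sensitivity correction described above amounts to dividing the 6 MV-calibrated reading by a detector-specific Ir-192 response factor. The drift-with-accumulated-dose term below is a hypothetical linear form, included only to show where such a dependence would enter:

    ```python
    # Sketch of the correction described above: an OSLD calibrated with 6 MV x-rays
    # over-responds to Ir-192 by a detector-specific factor (1.10-1.25 per the
    # abstract), so the raw reading is divided by that factor. The linear drift
    # with accumulated dose is a hypothetical form, shown only to indicate where
    # such a dependence would enter.

    def ir192_dose_cgy(raw_reading_cgy, sensitivity, drift_per_gy=0.0, accumulated_gy=0.0):
        """Convert a 6 MV-calibrated OSLD reading to Ir-192 dose (cGy)."""
        factor = sensitivity + drift_per_gy * accumulated_gy
        return raw_reading_cgy / factor

    # A 115 cGy raw reading from a detector with factor 1.15 corresponds to ~100 cGy.
    print(ir192_dose_cgy(115.0, 1.15))
    ```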

  10. Quantitative assessment of the accuracy of dose calculation using pencil beam and Monte Carlo algorithms and requirements for clinical quality assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, Imad, E-mail: iali@ouhsc.edu; Ahmad, Salahuddin

    2013-10-01

    To compare the doses calculated using the BrainLAB pencil beam (PB) and Monte Carlo (MC) algorithms for tumors located in various sites, including the lung, and to evaluate the quality assurance procedures required for verification of the accuracy of dose calculation. The dose-calculation accuracy of PB and MC was also assessed quantitatively against measurements using an ionization chamber and Gafchromic film placed in solid-water and heterogeneous phantoms. The dose was calculated using the PB convolution and MC algorithms in the iPlan treatment planning system from BrainLAB. The dose calculation was performed on the patients' computed tomography images with lesions in various treatment sites, including 5 lung, 5 prostate, 4 brain, 2 head-and-neck, and 2 paraspinal cases. A combination of conventional, conformal, and intensity-modulated radiation therapy plans was used in the dose calculation. The leaf sequences from intensity-modulated radiation therapy plans or beam shapes from conformal plans, monitor units, and other planning parameters calculated by PB were identical to those used for calculating dose with MC. Heterogeneity correction was considered in both PB and MC dose calculations. Dose-volume parameters such as V95 (volume covered by 95% of the prescription dose), dose distributions, and gamma analysis were used to evaluate the doses calculated by PB and MC. The doses measured by ionization chamber and EBT Gafchromic film in solid-water and heterogeneous phantoms were used to quantitatively assess the accuracy of the doses calculated by PB and MC. The dose-volume histograms and dose distributions calculated by PB and MC in the brain, prostate, paraspinal, and head-and-neck cases were in good agreement with one another (within 5%) and provided acceptable planning target volume coverage. However, dose distributions of the patients with lung cancer showed large discrepancies. For a plan optimized with PB, the dose coverage appeared clinically acceptable, whereas in reality MC showed a systematic lack of dose coverage. The dose calculated by PB for lung tumors was overestimated by up to 40%. Notably, despite large discrepancies in dose-volume histogram coverage of the planning target volume between PB and MC, the point doses at the isocenter (center of the lesions) calculated by the two algorithms were within 7%, even for the lung cases. The dose distributions measured with EBT Gafchromic films in heterogeneous phantoms showed large discrepancies, nearly 15% lower than PB, at interfaces between heterogeneous media; these lower film-measured doses agreed with those from MC. The doses (V95) calculated by MC and PB agreed within 5% for treatment sites with small tissue heterogeneities such as the prostate, brain, head-and-neck, and paraspinal tumors. Considerable discrepancies, up to 40%, were observed in the dose-volume coverage between MC and PB in lung tumors, which may affect clinical outcomes. The discrepancies between MC and PB increased for 15 MV compared with 6 MV, indicating the importance of implementing accurate clinical treatment planning such as MC. The comparison of point doses is not representative of the discrepancies in dose coverage and might be misleading when evaluating the accuracy of dose calculation between PB and MC. Thus, the clinical quality assurance procedures required to verify the accuracy of dose calculation using PB and MC need to consider measurements of 2- and 3-dimensional dose distributions, rather than a single point measurement, using heterogeneous phantoms instead of homogeneous water-equivalent phantoms.
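    The V95 metric used above can be sketched in a few lines; a real evaluation would operate on the 3D dose grid restricted to the planning target volume rather than a flat list of voxel doses:

    ```python
    # V95 as used above: the percentage of target volume receiving at least 95% of
    # the prescription dose. A minimal sketch over a flat list of voxel doses; a
    # real evaluation would use the 3D dose grid restricted to the target contour.

    def v95(voxel_doses, prescription):
        threshold = 0.95 * prescription
        covered = sum(1 for d in voxel_doses if d >= threshold)
        return 100.0 * covered / len(voxel_doses)

    # 8 of 10 voxels at or above 57 Gy for a 60 Gy prescription -> V95 = 80%
    doses = [60, 61, 59, 58, 57, 57, 56, 50, 62, 60]
    print(v95(doses, 60.0))
    ```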

  11. Beyond Gaussians: a study of single spot modeling for scanning proton dose calculation

    PubMed Central

    Li, Yupeng; Zhu, Ronald X.; Sahoo, Narayan; Anand, Aman; Zhang, Xiaodong

    2013-01-01

    Active spot scanning proton therapy is becoming increasingly adopted by proton therapy centers worldwide. Unlike passive-scattering proton therapy, active spot scanning proton therapy, especially intensity-modulated proton therapy, requires proper modeling of each scanning spot to ensure accurate computation of the total dose distribution contributed from a large number of spots. During commissioning of the spot scanning gantry at the Proton Therapy Center in Houston, it was observed that the long-range scattering protons in a medium may have been inadequately modeled for high-energy beams by a commercial treatment planning system, which could lead to incorrect prediction of field-size effects on dose output. In the present study, we developed a pencil-beam algorithm for scanning-proton dose calculation by focusing on properly modeling individual scanning spots. All modeling parameters required by the pencil-beam algorithm can be generated based solely on a few sets of measured data. We demonstrated that low-dose halos in single-spot profiles in the medium could be adequately modeled with the addition of a modified Cauchy-Lorentz distribution function to a double-Gaussian function. The field-size effects were accurately computed at all depths and field sizes for all energies, and good dose accuracy was also achieved for patient dose verification. The implementation of the proposed pencil beam algorithm also enabled us to study the importance of different modeling components and parameters at various beam energies. The results of this study may be helpful in improving dose calculation accuracy and simplifying beam commissioning and treatment planning processes for spot scanning proton therapy. PMID:22297324
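    The single-spot model described above (a double Gaussian plus a modified Cauchy-Lorentz halo term) can be sketched as follows. All weights and widths are invented illustration values, not commissioned beam data, and the Lorentz-like kernel is only a stand-in for the modified form used in the paper:

    ```python
    import math

    # Sketch of the single-spot lateral model described above: two Gaussians plus a
    # Cauchy-Lorentz-like term for the long-range halo. All weights and widths are
    # invented illustration values, not commissioned beam data, and the kernel is a
    # stand-in for the modified Cauchy-Lorentz form used in the paper.

    def gaussian2d(r, sigma):
        return math.exp(-r * r / (2.0 * sigma * sigma)) / (2.0 * math.pi * sigma * sigma)

    def lorentz2d(r, gamma):
        return (gamma / (2.0 * math.pi)) / (r * r + gamma * gamma) ** 1.5

    def spot_fluence(r_mm, w1=0.90, s1=4.0, w2=0.08, s2=12.0, w3=0.02, g=20.0):
        """Lateral fluence at radius r_mm from the spot axis (toy parameters)."""
        return (w1 * gaussian2d(r_mm, s1)
                + w2 * gaussian2d(r_mm, s2)
                + w3 * lorentz2d(r_mm, g))

    # Far from the axis the halo term dominates, where pure Gaussians are negligible:
    print(spot_fluence(50.0) > 100.0 * gaussian2d(50.0, 4.0))
    ```

    Summing such halo tails over many spots is what produces the field-size dependence of the output noted in the abstract.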

  12. An algorithm for treatment of patients with hypersensitivity reactions after vaccines.

    PubMed

    Wood, Robert A; Berger, Melvin; Dreskin, Stephen C; Setse, Rosanna; Engler, Renata J M; Dekker, Cornelia L; Halsey, Neal A

    2008-09-01

    Concerns about possible allergic reactions to immunizations are raised frequently by both patients/parents and primary care providers. Estimates of true allergic, or immediate hypersensitivity, reactions to routine vaccines range from 1 per 50,000 doses for diphtheria-tetanus-pertussis to approximately 1 per 500,000 to 1,000,000 doses for most other vaccines. In a large study from New Zealand, data were collected during a 5-year period on 15 marketed vaccines and revealed an estimated rate of 1 immediate hypersensitivity reaction per 450,000 doses of vaccine administered. Another large study, conducted within the Vaccine Safety Datalink, described a range of reaction rates to >7.5 million doses. Depending on the study design and the time after the immunization event, reaction rates varied from 0.65 cases per million doses to 1.53 cases per million doses when additional allergy codes were included. For some vaccines, particularly when allergens such as gelatin are part of the formulation (eg, Japanese encephalitis), higher rates of serious allergic reactions may occur. Although these per-dose estimates suggest that true hypersensitivity reactions are quite rare, the large number of doses that are administered, especially for the commonly used vaccines, makes this a relatively common clinical problem. In this review, we present background information on vaccine hypersensitivity, followed by a detailed algorithm that provides a rational and organized approach for the evaluation and treatment of patients with suspected hypersensitivity. We then include 3 cases of suspected allergic reactions to vaccines that have been referred to the Clinical Immunization Safety Assessment network to demonstrate the practical application of the algorithm.

  13. SU-E-T-800: Verification of Acuros XB Dose Calculation Algorithm at Air Cavity-Tissue Interface Using Film Measurement for Small Fields of 6-MV Flattening Filter-Free Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, S; Suh, T; Chung, J

    2015-06-15

    Purpose: To verify the dose accuracy of the Acuros XB (AXB) dose calculation algorithm at the air-tissue interface using an inhomogeneous phantom for 6-MV flattening filter-free (FFF) beams. Methods: An inhomogeneous phantom containing an air cavity was manufactured for verifying dose accuracy at the air-tissue interface. The phantom was constructed with air cavities of 1 and 3 cm thickness. To evaluate the central axis doses (CAD) and dose profiles at the interface, dose calculations were performed for 3 × 3 and 4 × 4 cm² fields of 6-MV FFF beams with AAA and AXB in the Eclipse treatment planning system. Measurements in this region were performed with Gafchromic film. Root mean square errors (RMSE) were analyzed for the calculated and measured dose profiles, which were divided into inner-profile (>80%) and penumbra (20% to 80%) regions for RMSE evaluation. To quantify the difference between distributions, gamma evaluation was used with 3%/3 mm agreement criteria. Results: For the percentage differences (%Diffs) between measured and calculated CAD at the interface, AXB showed better agreement than AAA. The %Diffs increased with air cavity thickness, similarly for both algorithms. For inner-profile RMSEs, AXB was more accurate than AAA; the difference was up to a factor of 6, owing to overestimation by AAA. Penumbra RMSEs increased with measurement depth, and gamma evaluation likewise showed decreased passing rates in the penumbra. Conclusion: This study demonstrated that dose calculation with AXB is more accurate than with AAA at the air-tissue interface. The 2D dose distributions with AXB, for both the inner profile and the penumbra, showed better agreement than AAA across the measurement depths and air cavity sizes studied.
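    The inner-profile/penumbra split used for the RMSE analysis above can be sketched as follows (thresholds at 80% and 20% of the profile maximum, per the abstract; the profile values are arbitrary illustration numbers):

    ```python
    # Sketch of the profile comparison described above: split matched measured and
    # calculated profiles into inner (>80% of maximum) and penumbra (20%-80%)
    # regions and report the RMSE of their differences in each region.

    def rmse(pairs):
        if not pairs:
            return 0.0
        return (sum((m - c) ** 2 for m, c in pairs) / len(pairs)) ** 0.5

    def region_rmse(measured, calculated):
        peak = max(measured)
        inner, penumbra = [], []
        for m, c in zip(measured, calculated):
            frac = m / peak
            if frac > 0.8:
                inner.append((m, c))
            elif 0.2 <= frac <= 0.8:
                penumbra.append((m, c))
        return rmse(inner), rmse(penumbra)

    measured   = [10, 40, 90, 100, 95, 50, 15]
    calculated = [12, 35, 92, 100, 94, 45, 18]
    inner_rmse, penumbra_rmse = region_rmse(measured, calculated)
    print(inner_rmse, penumbra_rmse)
    ```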

  14. Characterisation of a MOSFET-based detector for dose measurement under megavoltage electron beam radiotherapy

    NASA Astrophysics Data System (ADS)

    Jong, W. L.; Ung, N. M.; Tiong, A. H. L.; Rosenfeld, A. B.; Wong, J. H. D.

    2018-03-01

    The aim of this study is to investigate the fundamental dosimetric characteristics of the MOSkin detector for megavoltage electron beam dosimetry. The reproducibility, linearity, energy dependence, dose rate dependence, depth dose measurement, output factor measurement, and surface dose measurement under megavoltage electron beams were tested. The MOSkin detector showed excellent reproducibility (>98%) and linearity (R² = 1.00) up to 2000 cGy for 4-20 MeV electron beams. The MOSkin detector also showed minimal dose rate dependence (within ±3%) and energy dependence (within ±2%) over the clinical range of electron beams, except for an energy dependence at 4 MeV. An energy-dependence correction factor of 1.075 is needed when the MOSkin detector is used for 4 MeV electron beams. The output factors measured by the MOSkin detector were within ±2% of those measured with the EBT3 film and CC13 chamber. The depth doses measured using the MOSkin detector agreed with those measured using the CC13 chamber, except in the build-up region, owing to the dose-volume averaging effect of the CC13 chamber. For surface dose measurements, MOSkin measurements were within ±3% of those measured using EBT3 film. Measurements using the MOSkin detector were also compared to electron dose calculation algorithms, namely the GGPB and eMC algorithms. Both algorithms agreed with measurements to within ±2% and ±4% for output factor (except for the 4 × 4 cm² field size) and surface dose, respectively. With the uncertainties taken into account, the MOSkin detector was found to be a suitable detector for dose measurement under megavoltage electron beams. This has been demonstrated in in vivo skin dose measurement on patients during electron boost to the breast tumour bed.
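    Applying the 4 MeV energy-dependence correction reported above is a simple multiplicative step. Only the 1.075 factor comes from the abstract; the calibration factor (cGy per mV of threshold-voltage shift) and the unit factors at other energies are hypothetical placeholders:

    ```python
    # Sketch of applying the corrections noted above. Only the 1.075 factor at
    # 4 MeV comes from the abstract; the calibration factor and the unit factors
    # at other energies are hypothetical placeholders.

    CF_ENERGY = {4: 1.075, 6: 1.0, 9: 1.0, 12: 1.0, 16: 1.0, 20: 1.0}

    def moskin_dose_cgy(reading_mv, energy_mev, cal_cgy_per_mv=0.5):
        """Convert a MOSkin threshold-voltage shift (mV) to dose (cGy)."""
        return reading_mv * cal_cgy_per_mv * CF_ENERGY[energy_mev]

    # The same raw reading yields 7.5% more dose at 4 MeV than at 6 MeV:
    print(moskin_dose_cgy(200.0, 4) / moskin_dose_cgy(200.0, 6))
    ```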

  15. Intensity-modulated radiotherapy for locally advanced non-small-cell lung cancer: a dose-escalation planning study.

    PubMed

    Lievens, Yolande; Nulens, An; Gaber, Mousa Amr; Defraene, Gilles; De Wever, Walter; Stroobants, Sigrid; Van den Heuvel, Frank

    2011-05-01

    To evaluate the potential for dose escalation with intensity-modulated radiotherapy (IMRT) in positron emission tomography-based radiotherapy planning for locally advanced non-small-cell lung cancer (LA-NSCLC). For 35 LA-NSCLC patients, three-dimensional conformal radiotherapy and IMRT plans were made to a prescription dose (PD) of 66 Gy in 2-Gy fractions. Dose escalation was performed toward the maximal PD using secondary endpoint constraints for the lung, spinal cord, and heart, with de-escalation according to defined esophageal tolerance. Dose calculation was performed using the Eclipse pencil beam algorithm, and all plans were recalculated using a collapsed cone algorithm. Normal tissue complication probabilities (NTCPs) were calculated for the lung (grade 2 pneumonitis) and esophagus (acute toxicity, grade 2 or greater, and late toxicity). IMRT resulted in statistically significant decreases in the mean lung (p < .0001) and maximal spinal cord (p = .002 and .0005) doses, allowing an average increase in the PD of 8.6-14.2 Gy (p ≤ .0001). This advantage was lost after de-escalation within the defined esophageal dose limits. The lung NTCPs were significantly lower for IMRT (p < .0001), even after dose escalation. For esophageal toxicity, IMRT significantly decreased the acute NTCP values at the low dose levels (p = .0009 and p < .0001). After maximal dose escalation, late esophageal tolerance became critical (p < .0001), especially when using IMRT, owing to the parallel increases in the esophageal dose and PD. In LA-NSCLC, IMRT offers the potential to significantly escalate the PD, dependent on the lung and spinal cord tolerance. However, parallel increases in the esophageal dose abolished this advantage, even when using collapsed cone algorithms. This is important to consider in the context of concomitant chemoradiotherapy schedules using IMRT. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. [New calculation algorithms in brachytherapy for iridium 192 treatments].

    PubMed

    Robert, C; Dumas, I; Martinetti, F; Chargari, C; Haie-Meder, C; Lefkopoulos, D

    2018-05-18

    Since 1995, brachytherapy dosimetry protocols have followed the methodology recommended by Task Group 43. This methodology, which has the advantage of being fast, is based on several approximations that are not always valid in clinical conditions. Model-based dose calculation algorithms have recently emerged in treatment planning stations and are considered a major advance, as they account for the patient's finite dimensions, tissue heterogeneities, and the presence of high-atomic-number materials in applicators. In 2012, a report from the American Association of Physicists in Medicine Radiation Therapy Task Group 186 reviewed these models and made recommendations for their clinical implementation. This review focuses on the use of model-based dose calculation algorithms in the context of iridium 192 treatments. After a description of these algorithms and their clinical implementation, the main questions raised by these new methods are summarized. Considerations regarding the choice of the medium used for dose specification and the recommended methodology for assigning material characteristics are described in particular. In the last part, recent concrete examples from the literature illustrate the capabilities of these new algorithms on clinical cases. Copyright © 2018 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  17. The Texas Medication Algorithm Project (TMAP) schizophrenia algorithms.

    PubMed

    Miller, A L; Chiles, J A; Chiles, J K; Crismon, M L; Rush, A J; Shon, S P

    1999-10-01

    In the Texas Medication Algorithm Project (TMAP), detailed guidelines for medication management of schizophrenia and related disorders, bipolar disorders, and major depressive disorders have been developed and implemented. This article describes the algorithms developed for medication treatment of schizophrenia and related disorders. The guidelines recommend a sequence of medications and discuss dosing, duration, and switch-over tactics. They also specify response criteria at each stage of the algorithm for both positive and negative symptoms. The rationale and evidence for each aspect of the algorithms are presented.

  18. Low-dose CT reconstruction via L1 dictionary learning regularization using iteratively reweighted least-squares.

    PubMed

    Zhang, Cheng; Zhang, Tao; Li, Ming; Peng, Chengtao; Liu, Zhaobang; Zheng, Jian

    2016-06-18

    In order to reduce the radiation dose of CT (computed tomography), compressed sensing theory has been a hot topic, since it offers the possibility of high-quality recovery from sparsely sampled data. Recently, an algorithm based on DL (dictionary learning) was developed to deal with the sparse CT reconstruction problem. However, the existing DL algorithm focuses on the minimization problem with an L2-norm regularization term, which leads to deteriorating reconstruction quality as the sampling rate declines further. It is therefore essential to improve the DL method to meet the demand for further dose reduction. In this paper, we replaced the L2-norm regularization term with an L1-norm one. It is expected that the proposed L1-DL method can alleviate the over-smoothing effect of L2 minimization and preserve more image detail. The proposed algorithm handles the L1-minimization problem through a weighting strategy, solving a sequence of weighted L2-minimization problems via IRLS (iteratively reweighted least squares). Through numerical simulation, the proposed algorithm was compared with the existing DL method (adaptive dictionary based statistical iterative reconstruction, ADSIR) and two other typical compressed sensing algorithms. The results show that the proposed algorithm is more accurate than the other algorithms, especially when the sampling rate is reduced further or the noise is increased. The proposed L1-DL algorithm can utilize more prior information on image sparsity than ADSIR. By replacing the L2-norm regularization term of ADSIR with an L1-norm one and solving the L1-minimization problem with an IRLS strategy, L1-DL reconstructs the image more accurately.
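    The IRLS idea, replacing an L1 term with an iteratively reweighted L2 term, can be shown on a toy problem. The sketch below solves a plain L1-penalized denoising objective, not the full dictionary-learning CT objective of the paper:

    ```python
    import numpy as np

    # Toy version of the IRLS idea: the L1 penalty |x_i| is replaced each iteration
    # by a weighted L2 term with weight 1/(|x_i| + eps), so every subproblem has a
    # closed-form weighted least-squares solution. This solves the separable problem
    # min_x 0.5*(x - y)^2 + lam*|x| per entry, not the paper's CT objective.

    def irls_l1(y, lam, iters=50, eps=1e-8):
        x = y.copy()
        for _ in range(iters):
            w = 1.0 / (np.abs(x) + eps)        # reweighting from current estimate
            # minimize 0.5*(x - y)^2 + 0.5*lam*w*x^2  =>  x = y / (1 + lam*w)
            x = y / (1.0 + lam * w)
        return x

    y = np.array([3.0, 0.5, -2.0, 0.05])
    x = irls_l1(y, lam=0.2)
    # Converges toward soft-thresholding: large entries shrink by ~lam, entries
    # smaller than lam collapse toward zero.
    print(np.round(x, 3))
    ```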

  19. Inverse determination of the penalty parameter in penalized weighted least-squares algorithm for noise reduction of low-dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Guan, Huaiqun; Solberg, Timothy

    2011-07-15

    Purpose: A statistical projection restoration algorithm based on the penalized weighted least-squares (PWLS) criterion can substantially improve the image quality of low-dose CBCT images. The performance of PWLS is largely dependent on the choice of the penalty parameter. Previously, the penalty parameter was chosen empirically by trial and error. In this work, the authors developed an inverse technique to calculate the penalty parameter in PWLS for noise suppression of low-dose CBCT in image-guided radiotherapy (IGRT). Methods: In IGRT, a daily CBCT is acquired for the same patient during a treatment course. In this work, the authors acquired the CBCT with a high-mAs protocol for the first session and a lower-mAs protocol for the subsequent sessions. The high-mAs projections served as the goal (ideal) toward which the low-mAs projections were to be smoothed by minimizing the PWLS objective function. The penalty parameter was determined through an inverse calculation of the derivative of the objective function incorporating both the high- and low-mAs projections. The parameter thus obtained can then be used in PWLS to smooth the noise in low-dose projections. CBCT projections for a CatPhan 600 and an anthropomorphic head phantom, as well as for a brain patient, were used to evaluate the performance of the proposed technique. Results: The penalty parameter in PWLS was obtained for each CBCT projection using the proposed strategy. The noise in the low-dose CBCT images reconstructed from the smoothed projections was greatly suppressed. Image quality in PWLS-processed low-dose CBCT was comparable to that of the corresponding high-dose CBCT. Conclusions: A technique was proposed to estimate the penalty parameter for the PWLS algorithm. It provides an objective and efficient way to obtain the penalty parameter for image restoration algorithms that require predefined smoothing parameters.
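    The calibration idea, choosing the penalty so that smoothed low-mAs data best match the high-mAs "ideal", can be illustrated with a toy 1D PWLS smoother. Note the paper derives the parameter from the objective's derivative; the grid search below is only a conceptual stand-in, and all data are synthetic:

    ```python
    import numpy as np

    # Conceptual stand-in for the calibration described above: given a high-mAs
    # ("ideal") projection and a noisy low-mAs one, pick the PWLS penalty beta
    # whose smoothed output is closest to the ideal.

    def pwls_smooth(y, w, beta, sweeps=200):
        """Gauss-Seidel minimization of sum w_i (x_i - y_i)^2 + beta * sum (x_i - x_{i-1})^2."""
        x = y.copy()
        for _ in range(sweeps):
            for i in range(len(x)):
                s, k = 0.0, 0
                if i > 0:
                    s += x[i - 1]; k += 1
                if i < len(x) - 1:
                    s += x[i + 1]; k += 1
                x[i] = (w[i] * y[i] + beta * s) / (w[i] + beta * k)
        return x

    rng = np.random.default_rng(0)
    ideal = np.sin(np.linspace(0.0, np.pi, 64))        # stand-in high-mAs projection
    noisy = ideal + rng.normal(0.0, 0.1, ideal.size)   # stand-in low-mAs projection
    w = np.ones_like(noisy)                            # equal statistical weights

    betas = [0.01, 0.1, 1.0, 10.0]
    errors = [np.mean((pwls_smooth(noisy, w, b) - ideal) ** 2) for b in betas]
    best_beta = betas[int(np.argmin(errors))]
    print(best_beta)
    ```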

  20. Caffeine dosing strategies to optimize alertness during sleep loss.

    PubMed

    Vital-Lopez, Francisco G; Ramakrishnan, Sridhar; Doty, Tracy J; Balkin, Thomas J; Reifman, Jaques

    2018-05-28

    Sleep loss, which affects about one-third of the US population, can severely impair physical and neurobehavioural performance. Although caffeine, the most widely used stimulant in the world, can mitigate these effects, currently there are no tools to guide the timing and amount of caffeine consumption to optimize its benefits. In this work, we provide an optimization algorithm, suited for mobile computing platforms, to determine when and how much caffeine to consume, so as to safely maximize neurobehavioural performance at the desired time of the day, under any sleep-loss condition. The algorithm is based on our previously validated Unified Model of Performance, which predicts the effect of caffeine consumption on a psychomotor vigilance task. We assessed the algorithm by comparing the caffeine-dosing strategies (timing and amount) it identified with the dosing strategies used in four experimental studies, involving total and partial sleep loss. Through computer simulations, we showed that the algorithm yielded caffeine-dosing strategies that enhanced performance of the predicted psychomotor vigilance task by up to 64% while using the same total amount of caffeine as in the original studies. In addition, the algorithm identified strategies that resulted in equivalent performance to that in the experimental studies while reducing caffeine consumption by up to 65%. Our work provides the first quantitative caffeine optimization tool for designing effective strategies to maximize neurobehavioural performance and to avoid excessive caffeine consumption during any arbitrary sleep-loss condition. © 2018 The Authors. Journal of Sleep Research published by John Wiley & Sons Ltd on behalf of European Sleep Research Society.
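    The structure of such an optimization can be illustrated with a toy model: enumerate candidate (time slot, dose) schedules under a total-caffeine cap and keep the one maximizing predicted performance at a target time. The exponential "benefit" function below is invented for illustration; the actual algorithm uses the validated Unified Model of Performance:

    ```python
    from itertools import product

    # Toy illustration of the optimization structure described above. The benefit
    # model is a made-up exponential decay, NOT the Unified Model of Performance.

    def benefit(dose_mg, taken_h, target_h, half_life_h=5.0):
        """Hypothetical performance benefit at target_h from a dose at taken_h."""
        if taken_h > target_h:
            return 0.0
        return dose_mg * 0.5 ** ((target_h - taken_h) / half_life_h)

    def best_schedule(times_h, dose_options_mg, target_h, max_total_mg):
        best, best_score = None, -1.0
        for combo in product(dose_options_mg, repeat=len(times_h)):
            if sum(combo) > max_total_mg:
                continue                      # respects the total-caffeine cap
            score = sum(benefit(d, t, target_h) for t, d in zip(times_h, combo))
            if score > best_score:
                best, best_score = combo, score
        return best, best_score

    schedule, score = best_schedule(times_h=[0, 4, 8], dose_options_mg=[0, 100, 200],
                                    target_h=10, max_total_mg=300)
    print(schedule)   # later doses retain more benefit at the 10 h target
    ```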

  1. Continuous intensity map optimization (CIMO): A novel approach to leaf sequencing in step and shoot IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao Daliang; Earl, Matthew A.; Luan, Shuang

    2006-04-15

    A new leaf-sequencing approach has been developed that is designed to reduce the number of required beam segments for step-and-shoot intensity modulated radiation therapy (IMRT). This approach to leaf sequencing is called continuous-intensity-map optimization (CIMO). Using a simulated annealing algorithm, CIMO seeks to minimize differences between the optimized and sequenced intensity maps. Two distinguishing features of the CIMO algorithm are (1) CIMO does not require that each optimized intensity map be clustered into discrete levels and (2) CIMO is not rule-based but rather simultaneously optimizes both the aperture shapes and weights. To test the CIMO algorithm, ten IMRT patient cases were selected (four head-and-neck, two pancreas, two prostate, one brain, and one pelvis). For each case, the optimized intensity maps were extracted from the Pinnacle³ treatment planning system. The CIMO algorithm was applied, and the optimized aperture shapes and weights were loaded back into Pinnacle. A final dose calculation was performed using Pinnacle's convolution/superposition-based dose calculation. On average, the CIMO algorithm provided a 54% reduction in the number of beam segments as compared with Pinnacle's leaf sequencer. The plans sequenced using the CIMO algorithm also provided improved target dose uniformity and a reduced discrepancy between the optimized and sequenced intensity maps. For ten clinical intensity maps, comparisons were performed between the CIMO algorithm and the power-of-two reduction algorithm of Xia and Verhey [Med. Phys. 25(8), 1424-1434 (1998)]. When the constraints of a Varian Millennium multileaf collimator were applied, the CIMO algorithm resulted in a 26% reduction in the number of segments. For an Elekta multileaf collimator, the CIMO algorithm resulted in a 67% reduction in the number of segments. An average leaf-sequencing time of less than one minute per beam was observed.
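    The CIMO idea, simultaneous simulated-annealing optimization of aperture shapes and weights against the optimized intensity map, can be sketched in 1D. Real CIMO operates on 2D MLC apertures under machine constraints; the profile, move set, and cooling schedule here are all toy choices:

    ```python
    import math, random

    # Highly simplified 1D sketch: perturb aperture shapes (interval endpoints)
    # and weights simultaneously with simulated annealing to minimize the squared
    # difference between the sequenced and optimized intensity profiles.

    random.seed(1)
    TARGET = [0, 1, 3, 3, 2, 2, 1, 0]          # "optimized" intensity profile (toy)
    N = len(TARGET)

    def delivered(apertures):
        """Sum each aperture's weight over its open interval [a, b)."""
        out = [0.0] * N
        for a, b, w in apertures:
            for i in range(a, b):
                out[i] += w
        return out

    def cost(apertures):
        d = delivered(apertures)
        return sum((d[i] - TARGET[i]) ** 2 for i in range(N))

    def anneal(n_ap=3, steps=4000, t0=2.0):
        aps = [[1, N - 1, 1.0] for _ in range(n_ap)]
        c = cost(aps)
        best_aps, best_c = [a[:] for a in aps], c
        for step in range(steps):
            temp = t0 * (1.0 - step / steps) + 1e-6    # linear cooling
            k = random.randrange(n_ap)
            old = aps[k][:]
            field = random.randrange(3)
            if field < 2:                              # move an aperture edge
                aps[k][field] += random.choice((-1, 1))
            else:                                      # perturb the weight
                aps[k][2] = max(0.0, aps[k][2] + random.uniform(-0.3, 0.3))
            if not 0 <= aps[k][0] < aps[k][1] <= N:
                aps[k] = old                           # reject degenerate aperture
                continue
            new_c = cost(aps)
            if new_c < c or random.random() < math.exp((c - new_c) / temp):
                c = new_c
                if c < best_c:
                    best_aps, best_c = [a[:] for a in aps], c
            else:
                aps[k] = old
        return best_aps, best_c

    apertures, final_cost = anneal()
    print(final_cost)
    ```

    Note the two CIMO features from the abstract appear even in this sketch: the target profile is never clustered into discrete levels, and shape and weight moves are drawn from the same move set.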

  2. SU-F-BRCD-09: Total Variation (TV) Based Fast Convergent Iterative CBCT Reconstruction with GPU Acceleration.

    PubMed

    Xu, Q; Yang, D; Tan, J; Anastasio, M

    2012-06-01

    To improve image quality and reduce imaging dose in CBCT for radiation therapy applications, and to realize near real-time image reconstruction based on a fast-convergence iterative algorithm accelerated by multiple GPUs. An iterative image reconstruction that minimizes a weighted least-squares cost function with total variation (TV) regularization was employed to mitigate projection data incompleteness and noise. To achieve rapid 3D image reconstruction (< 1 min), a highly optimized multiple-GPU implementation of the algorithm was developed. The convergence rate and reconstruction accuracy were evaluated using a modified 3D Shepp-Logan digital phantom and a Catphan-600 physical phantom. The reconstructed images were compared with clinical FDK reconstruction results. Digital phantom studies showed that only 15 iterations and 60 iterations were needed to achieve algorithm convergence for the 360-view and 60-view cases, respectively. The RMSE was reduced to 10⁻⁴ and 10⁻², respectively, using 15 iterations for each case. Our algorithm required 5.4 s to complete one iteration for the 60-view case using one Tesla C2075 GPU. The few-view study indicated that our iterative algorithm has great potential to reduce the imaging dose while preserving good image quality. For the physical Catphan studies, the images obtained from the iterative algorithm possessed better spatial resolution and higher SNRs than those obtained by use of a clinical FDK reconstruction algorithm. We have developed a fast-convergence iterative algorithm for CBCT image reconstruction. The developed algorithm yielded images with better spatial resolution and higher SNR than those produced by a commercial FDK tool. In addition, the few-view study showed that the iterative algorithm has great potential for significantly reducing imaging dose. 
We expect that the developed reconstruction approach will facilitate applications including IGART and patient daily CBCT-based treatment localization. © 2012 American Association of Physicists in Medicine.
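The weighted least-squares-plus-TV objective described above can be sketched in a few lines. This is a minimal NumPy illustration with a toy system matrix standing in for the CBCT projector and a smoothed TV gradient; all names, sizes, and parameters are our own assumptions, not the authors' GPU implementation.

```python
import numpy as np

def tv_grad(img, eps=1e-6):
    """Gradient of a smoothed anisotropic total-variation penalty:
    sum of sqrt(d^2 + eps) over forward differences along both axes."""
    g = np.zeros_like(img)
    for axis in (0, 1):
        d = np.diff(img, axis=axis)
        w = d / np.sqrt(d**2 + eps)          # derivative w.r.t. each difference
        lo = [slice(None)] * 2; lo[axis] = slice(0, -1)
        hi = [slice(None)] * 2; hi[axis] = slice(1, None)
        g[tuple(lo)] -= w                    # each pixel gets -w_i ...
        g[tuple(hi)] += w                    # ... and +w_{i-1}
    return g

rng = np.random.default_rng(0)
n = 8
x_true = np.zeros((n, n)); x_true[2:6, 2:6] = 1.0   # piecewise-constant phantom
A = rng.normal(size=(40, n * n))                     # toy "projection" matrix
wts = np.ones(40)                                    # WLS weights (uniform here)
b = A @ x_true.ravel()                               # noiseless few-view data

x = np.zeros((n, n))
lam, step = 0.05, 1e-3
for _ in range(500):                                 # plain gradient descent
    resid = A @ x.ravel() - b
    grad = (A.T @ (wts * resid)).reshape(n, n) + lam * tv_grad(x)
    x -= step * grad
```

Because the toy data are few-view (40 equations, 64 unknowns), the TV term supplies the missing information, which is the role it plays for the undersampled 60-view case in the abstract.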

  3. Dosimetric validation of the Acuros XB Advanced Dose Calculation algorithm: fundamental characterization in water

    NASA Astrophysics Data System (ADS)

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Mancosu, Pietro; Cozzi, Luca

    2011-05-01

    This corrigendum intends to clarify some important points that were not clearly or properly addressed in the original paper, and for which the authors apologize. The original description of the first Acuros algorithm is from the developers, published in Physics in Medicine and Biology by Vassiliev et al (2010) in the paper entitled 'Validation of a new grid-based Boltzmann equation solver for dose calculation in radiotherapy with photon beams'. The main equations describing the algorithm reported in our paper, implemented as the 'Acuros XB Advanced Dose Calculation Algorithm' in the Varian Eclipse treatment planning system, were originally described (for the original Acuros algorithm) in the above mentioned paper by Vassiliev et al. The intention of our description in our paper was to give readers an overview of the algorithm, not pretending to have authorship of the algorithm itself (used as implemented in the planning system). Unfortunately our paper was not clear, particularly in not allocating full credit to the work published by Vassiliev et al on the original Acuros algorithm. Moreover, it is important to clarify that we have not adapted any existing algorithm, but have used the Acuros XB implementation in the Eclipse planning system from Varian. In particular, the original text of our paper should have been as follows: On page 1880 the sentence 'A prototype LBTE solver, called Attila (Wareing et al 2001), was also applied to external photon beam dose calculations (Gifford et al 2006, Vassiliev et al 2008, 2010). Acuros XB builds upon many of the methods in Attila, but represents a ground-up rewrite of the solver where the methods were adapted especially for external photon beam dose calculations' should be corrected to 'A prototype LBTE solver, called Attila (Wareing et al 2001), was also applied to external photon beam dose calculations (Gifford et al 2006, Vassiliev et al 2008). A new algorithm called Acuros, developed by the Transpire Inc. 
group, was built upon many of the methods in Attila, but represents a ground-up rewrite of the solver where the methods were especially adapted for external photon beam dose calculations, and described in Vassiliev et al (2010). Acuros XB is the Varian implementation of the original Acuros algorithm in the Eclipse planning system'. On page 1881, the sentence 'Monte Carlo and explicit LBTE solution, with sufficient refinement, will converge on the same solution. However, both methods produce errors (inaccuracies). In explicit LBTE solution methods, errors are primarily systematic, and result from discretization of the solution variables in space, angle, and energy. In both Monte Carlo and explicit LBTE solvers, a trade-off exists between speed and accuracy: reduced computational time may be achieved when less stringent accuracy criteria are specified, and vice versa' should cite the reference Vassiliev et al (2010). On page 1882, the beginning of the sub-paragraph The radiation transport model should start with 'The following description of the Acuros XB algorithm is as outlined by Vassiliev et al (2010) and reports the main steps of the radiation transport model as implemented in Eclipse'. The authors apologize for this lack of clarity in our published paper, and trust that this corrigendum gives full credit to Vassiliev et al in their earlier paper, with respect to previous work on the Acuros algorithm. However we wish to note that the entire contents of the data and results published in our paper are original and the work of the listed authors. References Gifford K A, Horton J L Jr, Wareing T A, Failla G and Mourtada F 2006 Comparison of a finite-element multigroup discrete-ordinates code with Monte Carlo for radiotherapy calculations Phys. Med. Biol. 
51 2253-65 Vassiliev O N, Wareing T A, Davis I M, McGhee J, Barnett D, Horton J L, Gifford K, Failla G, Titt U and Mourtada F 2008 Feasibility of a multigroup deterministic solution method for three-dimensional radiotherapy dose calculations Int. J. Radiat. Oncol. Biol. Phys. 72 220-7 Vassiliev O N, Wareing T A, McGhee J, Failla G, Salehpour M R and Mourtada F 2010 Validation of a new grid based Boltzmann equation solver for dose calculation in radiotherapy with photon beams Phys. Med. Biol. 55 581-98 Wareing T A, McGhee J M, Morel J E and Pautz S D 2001 Discontinuous finite element Sn methods on three-dimensional unstructured grids Nucl. Sci. Eng. 138 256-68

  4. Performance Characteristics of an Independent Dose Verification Program for Helical Tomotherapy

    PubMed Central

    Chang, Isaac C. F.; Chen, Jeff; Yartsev, Slav

    2017-01-01

Helical tomotherapy with its advanced method of intensity-modulated radiation therapy delivery has been used clinically for over 20 years. The standard delivery quality assurance procedure to measure the accuracy of delivered radiation dose from each treatment plan to a phantom is time-consuming. RadCalc®, a radiotherapy dose verification software package, has released a module for tomotherapy plan dose calculations specifically for beta testing. RadCalc®'s accuracy for tomotherapy dose calculations was evaluated through examination of point doses in ten lung and ten prostate clinical plans. Doses calculated by the TomoHDA™ tomotherapy treatment planning system were used as the baseline. For lung cases, RadCalc® overestimated point doses in the lung by an average of 13%. Doses within the spinal cord and esophagus were overestimated by 10%. Prostate plans showed better agreement, with overestimations of 6% in the prostate, bladder, and rectum. The systematic overestimation likely resulted from limitations of the pencil beam dose calculation algorithm implemented by RadCalc®. Limitations were more severe in areas of greater inhomogeneity and less prominent in regions of homogeneity with densities closer to 1 g/cm³. Recommendations for RadCalc® dose calculation algorithms and anatomical representation were provided based on the results of the study. PMID:28974862

  5. Use of a channelized Hotelling observer to assess CT image quality and optimize dose reduction for iteratively reconstructed images.

    PubMed

    Favazza, Christopher P; Ferrero, Andrea; Yu, Lifeng; Leng, Shuai; McMillan, Kyle L; McCollough, Cynthia H

    2017-07-01

    The use of iterative reconstruction (IR) algorithms in CT generally decreases image noise and enables dose reduction. However, the amount of dose reduction possible using IR without sacrificing diagnostic performance is difficult to assess with conventional image quality metrics. Through this investigation, achievable dose reduction using a commercially available IR algorithm without loss of low contrast spatial resolution was determined with a channelized Hotelling observer (CHO) model and used to optimize a clinical abdomen/pelvis exam protocol. A phantom containing 21 low contrast disks-three different contrast levels and seven different diameters-was imaged at different dose levels. Images were created with filtered backprojection (FBP) and IR. The CHO was tasked with detecting the low contrast disks. CHO performance indicated dose could be reduced by 22% to 25% without compromising low contrast detectability (as compared to full-dose FBP images) whereas 50% or more dose reduction significantly reduced detection performance. Importantly, default settings for the scanner and protocol investigated reduced dose by upward of 75%. Subsequently, CHO-based protocol changes to the default protocol yielded images of higher quality and doses more consistent with values from a larger, dose-optimized scanner fleet. CHO assessment provided objective data to successfully optimize a clinical CT acquisition protocol.
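A channelized Hotelling observer of the kind used in this study can be sketched as follows. The Gaussian channel set, the low-contrast blob signal, and the white-noise backgrounds below are toy stand-ins for the scanned phantom images; none of the specific values come from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16
yy, xx = np.mgrid[0:n, 0:n]
r2 = (xx - n / 2) ** 2 + (yy - n / 2) ** 2
signal = 2.0 * np.exp(-r2 / 8.0)          # low-contrast disk-like target

# Toy channel set: radially symmetric Gaussians of increasing width
U = np.stack([np.exp(-r2 / (2.0 * w * w)).ravel() for w in (2, 4, 8, 16)], axis=1)

def make_images(m, with_signal):
    imgs = rng.normal(size=(m, n * n))    # white-noise background
    return imgs + signal.ravel() if with_signal else imgs

v_sig = make_images(200, True) @ U        # channelized training responses
v_bkg = make_images(200, False) @ U
S = 0.5 * (np.cov(v_sig.T) + np.cov(v_bkg.T))   # mean channel covariance
dv = v_sig.mean(0) - v_bkg.mean(0)
w_hot = np.linalg.solve(S, dv)            # Hotelling template in channel space
dprime = np.sqrt(w_hot @ dv)              # detectability index

# Score fresh test images with the template
t_sig = make_images(200, True) @ U @ w_hot
t_bkg = make_images(200, False) @ U @ w_hot
thresh = 0.5 * (t_sig.mean() + t_bkg.mean())
pc = 0.5 * ((t_sig > thresh).mean() + (t_bkg <= thresh).mean())
```

Comparing `dprime` (or the percent-correct `pc`) across dose levels and reconstruction algorithms is how achievable dose reduction without loss of low-contrast detectability can be read off, as in the study.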

  6. Outcomes using exhaled nitric oxide measurements as an adjunct to primary care asthma management.

    PubMed

    Hewitt, Richard S; Modrich, Catherine M; Cowan, Jan O; Herbison, G Peter; Taylor, D Robin

    2009-12-01

Exhaled nitric oxide (FENO) measurements may help to highlight when inhaled corticosteroid (ICS) therapy should or should not be adjusted in asthma, a judgment that is often difficult to make. Our aim was to evaluate a decision-support algorithm incorporating FENO measurements in a nurse-led asthma clinic. Asthma management was guided by an algorithm based on high (>45 ppb), intermediate (30-45 ppb), or low (<30 ppb) FENO levels and asthma control status. This provided for one of eight possible treatment options, including diagnosis review and ICS dose adjustment. Well-controlled asthma increased from 41% at visit 1 to 68% at visit 5 (p=0.001). The mean fluticasone dose decreased from 312 mcg/day at visit 2 to 211 mcg/day at visit 5 (p=0.022). There was a high rate of protocol deviations (25%), often related to concerns about reducing the ICS dose. The percentage fall in FENO associated with a change in asthma status from poor control to good control was 35%. An FENO-based algorithm provided for a reduction in ICS doses without compromising asthma control. However, the results may have been influenced by the education and support which patients received. Reluctance to reduce the ICS dose was an issue which may have influenced the overall results. Australian Clinical Trials Registry # 012605000354684.
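The three FENO bands and their pairing with control status lend themselves to a simple decision function. The thresholds below are the ones quoted in the abstract; the band names and the returned actions are our own simplifications of the eight-option algorithm.

```python
def feno_category(feno_ppb):
    """Classify an exhaled nitric oxide reading into the three bands used
    by the algorithm (thresholds from the abstract; names are ours)."""
    if feno_ppb > 45:
        return "high"
    if feno_ppb >= 30:
        return "intermediate"
    return "low"

def ics_action(feno_ppb, well_controlled):
    """Toy decision rule pairing FENO band with control status; the actual
    eight-option algorithm also covered diagnosis review and other steps."""
    band = feno_category(feno_ppb)
    if band == "high":
        return "review adherence / increase ICS"
    if band == "low" and well_controlled:
        return "consider ICS dose reduction"
    return "maintain current ICS dose"
```

The last branch is where the reported reluctance came into play: a low FENO with good control suggests the ICS dose can be stepped down, but clinicians often deviated from that recommendation.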

  7. Optimization of Treatment Geometry to Reduce Normal Brain Dose in Radiosurgery of Multiple Brain Metastases with Single-Isocenter Volumetric Modulated Arc Therapy.

    PubMed

    Wu, Qixue; Snyder, Karen Chin; Liu, Chang; Huang, Yimei; Zhao, Bo; Chetty, Indrin J; Wen, Ning

    2016-09-30

Treatment of patients with multiple brain metastases using single-isocenter volumetric modulated arc therapy (VMAT) has been shown to decrease treatment time at the cost of a larger low-dose volume in normal brain tissue. We have developed an efficient Projection Summing Optimization Algorithm to optimize the treatment geometry in order to reduce dose to normal brain tissue for radiosurgery of multiple metastases with single-isocenter VMAT. The algorithm: (a) measures coordinates of outer boundary points of each lesion to be treated using the Eclipse Scripting Application Programming Interface, (b) determines the rotations of couch, collimator, and gantry using three matrices about the cardinal axes, (c) projects the outer boundary points of the lesion onto the beam's-eye-view projection plane, (d) optimizes couch and collimator angles by selecting the least total unblocked area for each specific treatment arc, and (e) generates a treatment plan with the optimized angles. The results showed significant reduction in the mean dose and low-dose volume to normal brain, while maintaining similar treatment plan quality, on the thirteen patients treated previously. The algorithm is flexible with regard to beam arrangements and can be integrated directly into the treatment planning system for clinical application.
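Steps (a)-(e) can be sketched as a small search: rotate each lesion's boundary points, project them into a beam's-eye-view plane, and pick the couch/collimator pair with the least total projected area. Everything here (beam along +y, bounding-box area as the "unblocked area" proxy, a single fixed gantry angle) is a simplifying assumption, not the authors' Eclipse-integrated implementation.

```python
import numpy as np

def rot_z(a):
    """Rotation matrix about the vertical (couch) axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def bev_area(points_3d, couch, collimator):
    """Axis-aligned bounding-box area of the lesion boundary points after
    couch rotation, projection onto a BEV plane (beam along +y), and an
    in-plane collimator rotation -- a crude 'unblocked area' proxy."""
    p = points_3d @ rot_z(couch).T
    bev = p[:, [0, 2]]                        # drop the beam-axis coordinate
    c, s = np.cos(collimator), np.sin(collimator)
    bev = bev @ np.array([[c, -s], [s, c]]).T
    return np.ptp(bev[:, 0]) * np.ptp(bev[:, 1])

def best_angles(lesions, couch_grid, coll_grid):
    """Exhaustive search for the couch/collimator pair with the least
    summed projected area over all lesions."""
    best = None
    for couch in couch_grid:
        for coll in coll_grid:
            area = sum(bev_area(lsn, couch, coll) for lsn in lesions)
            if best is None or area < best[0]:
                best = (area, couch, coll)
    return best

# A diagonal, elongated lesion: a 45-degree collimator turn aligns the
# bounding box with it and shrinks the projected area dramatically.
lesion = np.array([[-1.0, 0.0, -1.0], [1.0, 0.0, 1.0]])
best = best_angles([lesion], [0.0], np.linspace(0.0, np.pi / 2, 91))
```

With several lesions the summed area trades off their individual orientations, which is what makes the joint couch/collimator search worthwhile.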

  8. Automated segmentation of cardiac visceral fat in low-dose non-contrast chest CT images

    NASA Astrophysics Data System (ADS)

    Xie, Yiting; Liang, Mingzhu; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2015-03-01

Cardiac visceral fat was segmented from low-dose non-contrast chest CT images using a fully automated method. Cardiac visceral fat is defined as the fatty tissue surrounding the heart region, enclosed by the lungs and posterior to the sternum. It is measured by constraining the heart region with an Anatomy Label Map that contains robust segmentations of the lungs and other major organs and estimating the fatty tissue within this region. The algorithm was evaluated on 124 low-dose and 223 standard-dose non-contrast chest CT scans from two public datasets. Based on visual inspection, 343 cases had good cardiac visceral fat segmentation. For quantitative evaluation, manual markings of cardiac visceral fat regions were made in 3 image slices for 45 low-dose scans and the Dice similarity coefficient (DSC) was computed. The automated algorithm achieved an average DSC of 0.93. Cardiac visceral fat volume (CVFV), heart region volume (HRV) and their ratio were computed for each case. The correlation between cardiac visceral fat measurement and coronary artery and aortic calcification was also evaluated. Results indicated the automated algorithm for measuring cardiac visceral fat volume may be an alternative to traditional manual assessment of thoracic fat content in evaluating cardiovascular disease risk.
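The Dice similarity coefficient used for the quantitative evaluation is simple to compute from two binary masks; a minimal sketch with toy masks:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient: 2|A∩B| / (|A|+|B|) for binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two partially overlapping 4x4 squares on a 10x10 grid
a = np.zeros((10, 10)); a[2:6, 2:6] = 1
b = np.zeros((10, 10)); b[3:7, 3:7] = 1
d = dice(a, b)   # intersection is 3x3 = 9 pixels -> 2*9/32 = 0.5625
```

An average DSC of 0.93, as reported, indicates near-complete overlap between the automated segmentation and the manual markings.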

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J; Zhang, W; Lu, J

    Purpose: To investigate the accuracy and feasibility of dose calculations using kilovoltage cone beam computed tomography in cervical cancer radiotherapy using a correction algorithm. Methods: The Hounsfield unit (HU) versus electron density (HU-density) curve was obtained for both planning CT (pCT) and kilovoltage cone beam CT (CBCT) using a CIRS-062 calibration phantom. Because pCT and kV-CBCT images have different HU values, directly using the HU-density curve of CBCT for dose calculation in CBCT images may introduce deviations in the dose distribution; it is therefore necessary to normalize the HU values between pCT and CBCT. A HU correction algorithm was applied to the CBCT images (cCBCT). Fifteen intensity-modulated radiation therapy (IMRT) plans of cervical cancer were chosen, and the plans were transferred to the pCT and cCBCT data sets without any changes for dose calculations. Phantom and patient studies were carried out. The dose differences and dose distributions were compared between the cCBCT and pCT plans. Results: The HU number of CBCT was measured several times, and the maximum change was less than 2%. Compared with pCT, the dose differences for the CBCT and cCBCT images were 2.48%±0.65% (range: 1.3%∼3.8%) and 0.48%±0.21% (range: 0.1%∼0.82%) in the phantom study, respectively. For dose calculation in patient images, the dose differences were 2.25%±0.43% (range: 1.4%∼3.4%) and 0.63%±0.35% (range: 0.13%∼0.97%), respectively. For the dose distributions, the passing rate of cCBCT was higher than that of CBCT. Conclusion: CBCT-based dose calculation is feasible in cervical cancer radiotherapy, and the correction algorithm offers acceptable accuracy. It will become a useful tool for adaptive radiation therapy.
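The normalization step can be illustrated with piecewise-linear lookups: map a CBCT HU value to insert density via the CBCT calibration curve, then back to the pCT HU scale. The calibration points below are invented for illustration; real values come from phantom scans such as the CIRS-062 used in the abstract.

```python
import numpy as np

# Hypothetical calibration points (HU, relative electron density) for a
# planning-CT curve and a CBCT curve measured on the same inserts.
pct_hu  = np.array([-1000.0, -700.0, -90.0, 0.0, 300.0, 1200.0])
density = np.array([0.0, 0.29, 0.95, 1.0, 1.16, 1.70])
cbct_hu = np.array([-950.0, -650.0, -60.0, 30.0, 360.0, 1300.0])  # shifted HU

def cbct_to_pct_hu(hu):
    """Map a CBCT HU value onto the planning-CT HU scale by matching the
    densities of the calibration inserts (a simple piecewise-linear
    normalization in the spirit of the abstract's correction step)."""
    d = np.interp(hu, cbct_hu, density)   # CBCT HU -> density
    return np.interp(d, density, pct_hu)  # density -> pCT HU

def hu_to_density(hu):
    """Planning-CT HU to relative electron density lookup."""
    return np.interp(hu, pct_hu, density)
```

After correction, the planning system's pCT HU-density table can be applied to the cCBCT images, which is what brings the dose differences down to well under 1% in the reported results.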

  10. TU-D-201-05: Validation of Treatment Planning Dose Calculations: Experience Working with MPPG 5.a

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xue, J; Park, J; Kim, L

    2016-06-15

    Purpose: The newly published medical physics practice guideline (MPPG 5.a.) sets the minimum requirements for commissioning and QA of treatment planning dose calculations. We present our experience in the validation of a commercial treatment planning system based on MPPG 5.a. Methods: In addition to tests traditionally performed to commission a model-based dose calculation algorithm, extensive tests were carried out at short and extended SSDs, various depths, oblique gantry angles, and off-axis conditions to verify the robustness and limitations of a dose calculation algorithm. A comparison between measured and calculated dose was performed based on the validation tests and evaluation criteria recommended by MPPG 5.a. An ion chamber was used for the measurement of dose at points of interest, and diodes were used for photon IMRT/VMAT validations. Dose profiles were measured with a three-dimensional scanning system and calculated in the TPS using a virtual water phantom. Results: Calculated and measured absolute dose profiles were compared at each specified SSD and depth for open fields. Disagreement is easily identifiable from the difference curve. Subtle discrepancies revealed the limitations of the measurements, e.g., a spike in the high-dose region and an asymmetrical penumbra observed on the tests with an oblique MLC beam. The excellent results (>98% pass rate on a 3%/3 mm gamma index) on the end-to-end tests for both IMRT and VMAT are attributed to the quality of the beam data and a good understanding of the modeling. The limitations of the model and the uncertainty of measurement were considered when comparing the results. Conclusion: The extensive tests recommended by the MPPG encourage us to understand the accuracy and limitations of a dose algorithm as well as the uncertainty of measurement. Our experience has shown how the suggested tests can be performed effectively to validate dose calculation models.
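The 3%/3 mm gamma comparison used for profile and end-to-end tests can be sketched in one dimension. This is a simplified global-gamma implementation (no sub-grid interpolation), with made-up profile data:

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Global 1D gamma index: for each reference point, the minimum
    combined dose-difference (fraction of max dose) and distance-to-
    agreement (mm) metric over all evaluated points."""
    ref_pos, ref_dose = np.asarray(ref_pos, float), np.asarray(ref_dose, float)
    eval_pos, eval_dose = np.asarray(eval_pos, float), np.asarray(eval_dose, float)
    dmax = ref_dose.max()
    out = np.empty(len(ref_pos))
    for i, (xr, dr) in enumerate(zip(ref_pos, ref_dose)):
        dist2 = ((eval_pos - xr) / dta) ** 2
        dose2 = ((eval_dose - dr) / (dd * dmax)) ** 2
        out[i] = np.sqrt(np.min(dist2 + dose2))
    return out

pos = np.arange(0.0, 50.0, 1.0)                      # positions in mm
dose = 100.0 * np.exp(-((pos - 25.0) / 10.0) ** 2)   # toy dose profile
g_pert = gamma_1d(pos, dose, pos, 1.02 * dose)       # 2% global perturbation
pass_rate = (g_pert <= 1.0).mean()                   # fraction passing 3%/3 mm
```

A point passes when its gamma value is at most 1; the pass rate over all reference points is the figure (>98% here in the abstract) quoted for IMRT/VMAT validation.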

  11. Examination of the suitability of an implementation of the Jette localized heterogeneities fluence term L(1)(x,y,z) in an electron beam treatment planning algorithm

    NASA Astrophysics Data System (ADS)

    Rodebaugh, Raymond Francis, Jr.

    2000-11-01

    In this project we applied modifications of the Fermi-Eyges multiple scattering theory in an attempt to achieve the goals of a fast, accurate electron dose calculation algorithm. The dose was first calculated for an "average configuration" based on the patient's anatomy using a modification of the Hogstrom algorithm. It was split into a measured central-axis depth dose component based on the material between the source and the dose calculation point, and an off-axis component based on the physics of multiple Coulomb scattering for the average configuration. The former provided the general depth-dose characteristics along the beam fan lines, while the latter provided the effects of collimation. The Gaussian localized heterogeneities theory of Jette provided the lateral redistribution of the electron fluence by heterogeneities. Here we terminated Jette's infinite series of fluence redistribution terms after the second term. Experimental comparison data were collected for 1 cm thick × 1 cm diameter air and aluminum pillboxes using the Varian 2100C linear accelerator at Rush-Presbyterian-St. Luke's Medical Center. For the air pillbox, the algorithm results were in reasonable agreement with measured data at both 9 and 20 MeV. For the aluminum pillbox, there were significant discrepancies between the results of this algorithm and experiment, particularly for the 9 MeV beam. Of course, a 1 cm thick aluminum heterogeneity is unlikely to be encountered in a clinical situation; the thickness, linear stopping power, and linear scattering power of aluminum are all well above what would normally be encountered. We found that the algorithm is highly sensitive to the choice of the average configuration, an indication that the series of fluence redistribution terms does not converge fast enough to be terminated after the second term.
It also makes it difficult to apply the algorithm to cases where there is no a priori means of choosing the best average configuration, or where there is a complex geometry containing both weakly and strongly scattering heterogeneities. There is some hope of decreasing the sensitivity to the average configuration by including portions of the next term of the localized heterogeneities series.

  12. A Monte Carlo investigation of contaminant electrons due to a novel in vivo transmission detector.

    PubMed

    Asuni, G; Jensen, J M; McCurdy, B M C

    2011-02-21

    A novel transmission detector (IBA Dosimetry, Germany) developed as an IMRT quality assurance tool, intended for in vivo patient dose measurements, is studied here. The goal of this investigation is to use Monte Carlo techniques to characterize treatment beam parameters in the presence of the detector and to compare them to those of a plastic block tray (a frequently used clinical device). Particular attention is paid to the impact of the detector on electron contamination model parameters of two commercial dose calculation algorithms. The linac head together with the COMPASS transmission detector (TRD) was modeled using the BEAMnrc code. To understand the effect of the TRD on treatment beams, the contaminant electron fluence, energy spectra, and angular distributions at different SSDs were analyzed for open and non-open (i.e., TRD and block tray) fields. Contaminant electrons in the BEAMnrc simulations were separated according to where they were created. Calculation of surface dose and the evaluation of contributions from contaminant electrons were performed using the DOSXYZnrc user code. The effect of the TRD on contaminant electron model parameters in the Eclipse AAA and Pinnacle³ dose calculation algorithms was investigated. Comparisons of the contaminant electron fluence produced in the non-open fields versus the open field show that electrons created in the non-open fields increase at shorter SSD, but most of the electrons at shorter SSD are of low energy with large angular spread. These electrons are out-scattered or absorbed in air and contribute less to surface dose at larger SSD. Calculated surface doses with the block tray are higher than those with the TRD. The contribution of contaminant electrons to dose in the buildup region increases with increasing field size. The additional contribution of electrons to surface dose increases with field size for both the TRD and the block tray.
The introduction of the TRD results in a 12% and 15% increase in the Gaussian widths used in the contaminant electron source model of the Eclipse AAA dose algorithm. The off-axis coefficient in the Pinnacle³ dose calculation algorithm decreases in the presence of the TRD compared to without the device. The electron model parameters were modified to reflect the increase in electron contamination with the TRD, a necessary step for accurate beam modeling when using the device.

  13. SU-E-T-219: Comprehensive Validation of the Electron Monte Carlo Dose Calculation Algorithm in RayStation Treatment Planning System for An Elekta Linear Accelerator with AgilityTM Treatment Head

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yi; Park, Yang-Kyun; Doppke, Karen P.

    2015-06-15

    Purpose: This study evaluated the performance of the electron Monte Carlo dose calculation algorithm in RayStation v4.0 for an Elekta machine with Agility™ treatment head. Methods: The machine has five electron energies (6–18 MeV) and five applicators (6×6 to 25×25 cm²). The dose (cGy/MU at d_max), depth dose, and profiles were measured in water using an electron diode at 100 cm SSD for nine square fields ≥2×2 cm² and four complex fields at normal incidence, and a 14×14 cm² field at 15° and 30° incidence. The dose was also measured for three square fields ≥4×4 cm² at 98, 105 and 110 cm SSD. Using selected energies, EBT3 radiochromic film was used for dose measurements in slab-shaped inhomogeneous phantoms and a breast phantom with surface curvature. The measured and calculated doses were analyzed using a gamma criterion of 3%/3 mm. Results: The calculated and measured doses varied by <3% for 116 of the 120 points, and <5% for the 4×4 cm² field at 110 cm SSD at 9–18 MeV. The gamma analysis comparing the 105 pairs of in-water isodoses passed by >98.1%. The planar doses measured from films placed at 0.5 cm below a lung/tissue layer (12 MeV) and 1.0 cm below a bone/air layer (15 MeV) showed excellent agreement with calculations, with gamma passing rates of 99.9% and 98.5%, respectively. At the breast-tissue interface, the gamma passing rate was >98.8% at 12–18 MeV. The film results directly validated the accuracy of MU calculation and spatial dose distribution in the presence of tissue inhomogeneity and surface curvature - situations challenging for simpler pencil-beam algorithms. Conclusion: The electron Monte Carlo algorithm in RayStation v4.0 is fully validated for clinical use with the Elekta Agility™ machine. The comprehensive validation included small fields, complex fields, oblique beams, extended distance, tissue inhomogeneity and surface curvature.

  14. Low dose reconstruction algorithm for differential phase contrast imaging.

    PubMed

    Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Marco, Stampanoni

    2011-01-01

    Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method that reconstructs the distribution of the refractive index rather than the attenuation coefficient in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT which benefits from the new compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of differential phase contrast imaging into a linear partial-derivative matrix. In this way the compressed sensing reconstruction problem of DPCI can be transformed into a well-studied problem in transmission CT. Our algorithm has the potential to reconstruct the refractive index distribution of the sample from highly undersampled projection data, and thus can significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.
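The key move — discretizing the differential projection into a linear operator so that standard algebraic reconstruction applies — can be demonstrated on a toy 1D problem. The projector P, the finite-difference matrix D, and the use of plain cyclic Kaczmarz (ART) iterations are all our simplifications; the paper's method adds compressed-sensing regularization on top of this structure.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 32, 48
x_true = np.zeros(n); x_true[10:20] = 1.0    # piecewise-constant profile

P = rng.normal(size=(m, n)) / np.sqrt(n)     # toy projection operator
D = np.eye(m, k=1)[:-1] - np.eye(m)[:-1]     # forward-difference matrix
M = D @ P                                    # combined differential system matrix
y = M @ x_true                               # noiseless DPC-style data

# Cyclic Kaczmarz (ART) sweeps applied directly to the combined operator M
x = np.zeros(n)
for _ in range(500):
    for i in range(M.shape[0]):
        r = M[i]
        x += (y[i] - r @ x) / (r @ r) * r
```

Because M has full column rank in this toy setup, ART alone recovers the profile; with genuinely undersampled data a sparsity prior must take over, which is the compressed-sensing part of the paper.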

  15. Explicit Filtering Based Low-Dose Differential Phase Reconstruction Algorithm with the Grating Interferometry.

    PubMed

    Jiang, Xiaolei; Zhang, Li; Zhang, Ran; Yin, Hongxia; Wang, Zhenchang

    2015-01-01

    X-ray grating interferometry offers a novel framework for the study of weakly absorbing samples. Three kinds of information, that is, the attenuation, differential phase contrast (DPC), and dark-field images, can be obtained after a single scanning, providing additional and complementary information to the conventional attenuation image. Phase shifts of X-rays are measured by the DPC method; hence, DPC-CT reconstructs refraction indexes rather than attenuation coefficients. In this work, we propose an explicit filtering based low-dose differential phase reconstruction algorithm, which enables reconstruction from reduced scanning without artifacts. The algorithm adopts a differential algebraic reconstruction technique (DART) with the explicit filtering based sparse regularization rather than the commonly used total variation (TV) method. Both the numerical simulation and the biological sample experiment demonstrate the feasibility of the proposed algorithm.

  16. Explicit Filtering Based Low-Dose Differential Phase Reconstruction Algorithm with the Grating Interferometry

    PubMed Central

    Zhang, Li; Zhang, Ran; Yin, Hongxia; Wang, Zhenchang

    2015-01-01

    X-ray grating interferometry offers a novel framework for the study of weakly absorbing samples. Three kinds of information, that is, the attenuation, differential phase contrast (DPC), and dark-field images, can be obtained after a single scanning, providing additional and complementary information to the conventional attenuation image. Phase shifts of X-rays are measured by the DPC method; hence, DPC-CT reconstructs refraction indexes rather than attenuation coefficients. In this work, we propose an explicit filtering based low-dose differential phase reconstruction algorithm, which enables reconstruction from reduced scanning without artifacts. The algorithm adopts a differential algebraic reconstruction technique (DART) with the explicit filtering based sparse regularization rather than the commonly used total variation (TV) method. Both the numerical simulation and the biological sample experiment demonstrate the feasibility of the proposed algorithm. PMID:26089971

  17. SU-E-T-616: Plan Quality Assessment of Both Treatment Planning System Dose and Measurement-Based 3D Reconstructed Dose in the Patient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olch, A

    2015-06-15

    Purpose: Systematic radiotherapy plan quality assessment promotes quality improvement. Software tools can perform this analysis by applying site-specific structure dose metrics. The next step is to similarly evaluate the quality of the dose delivery. This study defines metrics for acceptable doses to targets and normal organs for a particular treatment site and scores each plan accordingly. The input can be the TPS or the measurement-based 3D patient dose. From this analysis, one can determine whether the dose distribution delivered to the patient receives a score comparable to the TPS plan score; otherwise, replanning may be indicated. Methods: Eleven neuroblastoma patient plans were exported from Eclipse to the Quality Reports program. A scoring algorithm defined a score for each normal and target structure based on dose-volume parameters. Each plan was scored by this algorithm and the percentage of total possible points was obtained. Each plan also underwent IMRT QA measurements with a MapCHECK 2 or ArcCHECK. These measurements were input into the 3DVH program to compute the patient 3D dose distribution, which was analyzed using the same scoring algorithm as the TPS plan. Results: The mean quality score for the TPS plans was 75.37% (std dev = 14.15%) compared to 71.95% (std dev = 13.45%) for the 3DVH dose distributions. For 3 of 11 plans, the 3DVH-based quality score was higher than the TPS score, by 0.5 to 8.4 percentage points. Scores for 8 of 11 plans decreased based on the IMRT QA measurements, by 1.2 to 18.6 points. Conclusion: Software was used to determine the degree to which the plan quality score differed between the TPS and measurement-based dose. Although the delivery score was generally in good agreement with the planned dose score, some scores improved while one plan's delivered dose quality was significantly lower than planned. This methodology helps evaluate both planned and delivered dose quality.
Sun Nuclear Corporation has provided a license for the software described.
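The structure-by-structure scoring can be sketched as a table of dose-volume criteria with point values; the percentage of achievable points is the plan score. Every metric name, limit, and point value below is illustrative — the study's neuroblastoma-specific metrics are not given in the abstract.

```python
def score_plan(dose_metrics, criteria):
    """Percentage of achievable points: each structure's points are
    awarded when its dose-volume metric meets the criterion."""
    earned, total = 0, 0
    for name, (limit, points, kind) in criteria.items():
        total += points
        value = dose_metrics.get(name)
        if value is None:
            continue                     # missing metric earns nothing
        if (value <= limit) if kind == "max" else (value >= limit):
            earned += points
    return 100.0 * earned / total

criteria = {
    "PTV_V95": (95.0, 10, "min"),   # >=95% of PTV receives 95% of Rx dose
    "cord_Dmax": (45.0, 5, "max"),  # spinal cord max dose <= 45 Gy
    "lung_V20": (30.0, 5, "max"),   # <=30% of lung volume above 20 Gy
}
tps_score = score_plan({"PTV_V95": 96.0, "cord_Dmax": 50.0, "lung_V20": 25.0},
                       criteria)
```

Running the same criteria against the TPS dose and the 3DVH-reconstructed dose yields the paired scores compared in the study.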

  18. A comparison between simplified and intensive dose-titration algorithms using AIR inhaled insulin for insulin-naive patients with type 2 diabetes in a randomized noninferiority trial.

    PubMed

    Mathieu, C; Cuddihy, R; Arakaki, R F; Belin, R M; Planquois, J-M; Lyons, J N; Heilmann, C R

    2009-09-01

    Insulin initiation and optimization is a challenge for patients with type 2 diabetes. Our objective was to determine whether the safety and efficacy of AIR inhaled insulin (Eli Lilly and Co., Indianapolis, IN) (AIR is a registered trademark of Alkermes, Inc., Cambridge, MA) using a simplified regimen was noninferior to an intensive regimen. This was an open-label, randomized study in insulin-naive adults not optimally controlled by oral antihyperglycemic medications. Simplified titration included a 6 U per-meal AIR insulin starting dose. Individual doses were adjusted at mealtime in 2-U increments from the previous day's four-point self-monitored blood glucose (SMBG) (total ≤6 U). Starting AIR insulin doses for intensive titration were based on fasting blood glucose, gender, height, and weight. Patients conducted four-point SMBG daily for the study duration. Insulin doses were titrated based on the previous 3 days' mean SMBG (total ≤8 U). End point hemoglobin A1C (A1C) was 7.07 ± 0.09% and 6.87 ± 0.09% for the simplified (n = 178) and intensive (n = 180) algorithms, respectively. Noninferiority between algorithms was not established. The fasting blood glucose values (least squares mean ± standard error) for the simplified (137.27 ± 3.42 mg/dL) and intensive (133.13 ± 3.42 mg/dL) algorithms were comparable, as were the safety profiles. The hypoglycemic rate at 4, 8, 12, and 24 weeks was higher in patients receiving intensive titration (all P < 0.0001). The nocturnal hypoglycemic rate for patients receiving intensive titration was higher than for those receiving simplified titration at 8 weeks (P < 0.015) and 12 weeks (P < 0.001). Noninferiority between the algorithms, as measured by A1C, was not demonstrated. This finding re-emphasizes the difficulty of identifying optimal, simplified insulin regimens for patients.
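The simplified arm's mealtime rule is mechanical enough to sketch. The 2-U step and the cap on total daily change come from the abstract; the glucose target band and the pairing of SMBG readings to meals are our own illustrative assumptions, not the trial protocol.

```python
def titrate_doses(doses, smbg, step=2, max_change=6, target=(70, 130)):
    """Per-meal dose adjustment in the spirit of the simplified arm:
    2-U increments driven by the previous day's four-point SMBG, with
    the total daily change capped at 6 U. Target band and pairing of
    readings to meals are illustrative assumptions."""
    lo, hi = target
    changes = []
    for g in smbg[:len(doses)]:
        if g > hi:
            changes.append(step)       # persistent hyperglycemia: step up
        elif g < lo:
            changes.append(-step)      # hypoglycemia risk: step down
        else:
            changes.append(0)
    budget = max_change                # cap the summed absolute adjustment
    for i, c in enumerate(changes):
        if abs(c) > budget:
            changes[i] = 0             # adjustment beyond the cap is dropped
        else:
            budget -= abs(c)
    return [max(0, d + c) for d, c in zip(doses, changes)]
```

The intensive arm, by contrast, averaged three days of readings and allowed a larger total change, which is where its higher hypoglycemia rate plausibly originated.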

  19. SU-E-T-295: Simultaneous Beam Sampling and Aperture Shape Optimization for Station Parameter Optimized Radiation Therapy (SPORT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zarepisheh, M; Li, R; Xing, L

    Purpose: Station Parameter Optimized Radiation Therapy (SPORT) was recently proposed to fully utilize the technical capability of emerging digital LINACs, in which the station parameters of a delivery system (such as aperture shape and weight, couch position/angle, gantry/collimator angle) are optimized altogether. SPORT promises to deliver unprecedented radiation dose distributions efficiently, yet no optimization algorithm exists to implement it. The purpose of this work is to propose an optimization algorithm to simultaneously optimize the beam sampling and aperture shapes. Methods: We build a mathematical model whose variables are beam angles (including non-coplanar and even non-isocentric beams) and aperture shapes. To solve the resulting large-scale optimization problem, we devise an exact, convergent and fast optimization algorithm by integrating three advanced optimization techniques: column generation, the gradient method, and pattern search. Column generation is used to find a good set of aperture shapes as an initial solution by adding apertures sequentially. We then apply the gradient method to iteratively improve the current solution by reshaping the apertures and updating the beam angles along the gradient. The algorithm continues with a pattern search to explore the part of the search space that cannot be reached by the gradient method. Results: The proposed technique was applied to a series of patient cases and significantly improved plan quality. In a head-and-neck case, for example, the left parotid gland mean dose, brainstem max dose, spinal cord max dose, and mandible mean dose were reduced by 10%, 7%, 24% and 12%, respectively, compared to the conventional VMAT plan while maintaining the same PTV coverage. Conclusion: Combined use of the column generation, gradient search and pattern search algorithms provides an effective way to simultaneously optimize the large collection of station parameters and significantly improves the quality of the resultant treatment plans compared with conventional VMAT or IMRT treatments.
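
    The pattern-search stage can be illustrated with a minimal coordinate search of the Hooke-Jeeves flavor; this is a generic sketch on a toy objective, not the authors' implementation:

```python
def pattern_search(f, x0, step=1.0, tol=1e-3, max_iter=1000):
    """Probe each parameter in +/- step moves, keep any improvement,
    and halve the step when no move helps (mesh refinement)."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= 0.5          # no move helped: refine the mesh
            if step < tol:
                break
    return x, fx

# Toy smooth objective with its minimum at (3, -2)
x, fx = pattern_search(lambda v: (v[0] - 3) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```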

  20. SU-E-T-764: Track Repeating Algorithm for Proton Therapy Applied to Intensity Modulated Proton Therapy for Head-And-Neck Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yepes, P; Mirkovic, D; Mohan, R

    Purpose: To determine the suitability of fast Monte Carlo techniques based on a track-repeating algorithm for dose calculation in Intensity Modulated Proton Therapy (IMPT). The application of this technique would make possible detailed retrospective studies of large cohorts of patients, which may lead to a better determination of relative biological effects from the analysis of patient data. Methods: A cohort of six head-and-neck patients treated at the University of Texas MD Anderson Cancer Center with IMPT was utilized. The dose distributions were calculated for the verification and patient plans with the standard treatment planning system (TPS), MCNPX, GEANT4 and FDC, a fast track-repeating algorithm for proton therapy. FDC is based on a GEANT4 database of proton trajectories in water. The obtained dose distributions were compared to each other using the γ-index criteria for 3 mm/3% and 2 mm/2% maximum spatial and dose differences. The γ-index was calculated for voxels with a dose of at least 10% of the maximum delivered dose. Dose volume histograms were also calculated for the various dose distributions. Results: Good agreement between GEANT4 and FDC was found, with less than 1% of the voxels having a γ-index larger than 1 for 2 mm/2%. The agreement between MCNPX and FDC was within the requirements of clinical standards, even though it was slightly worse than the comparison with GEANT4. The comparison with the TPS yielded larger differences, which is also to be expected because pencil beam algorithms do not always perform well in highly inhomogeneous areas like the head and neck. Conclusion: The good agreement between a track-repeating algorithm and a full Monte Carlo for a large cohort of patients and a challenging site like head-and-neck opens the path to systematic and detailed studies of large cohorts, which may yield a better understanding of biological effects.
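
    The γ-index comparison used above has a compact definition; here is a 1-D sketch, assuming a global dose normalization (the profiles, spacing, and criteria are illustrative):

```python
import math

def gamma_index(ref, ev, spacing_mm, dose_crit, dist_mm):
    """Gamma for each evaluated point: the minimum over reference points of
    sqrt((dose diff / dose criterion)^2 + (distance / DTA)^2); gamma <= 1 passes."""
    dmax = max(ref)
    out = []
    for i, de in enumerate(ev):
        best = float("inf")
        for j, dr in enumerate(ref):
            dd = (de - dr) / (dose_crit * dmax)   # global (% of max) dose term
            dx = (i - j) * spacing_mm / dist_mm   # spatial term
            best = min(best, math.hypot(dd, dx))
        out.append(best)
    return out

ref = [0.1, 0.5, 1.0, 0.5, 0.1]
ev = [0.1, 0.52, 1.0, 0.5, 0.1]        # one point perturbed by 2% of max
g = gamma_index(ref, ev, spacing_mm=1.0, dose_crit=0.03, dist_mm=3.0)
passing = sum(1 for v in g if v <= 1.0) / len(g)   # fraction passing 3%/3 mm
```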

  1. SU-C-BRC-04: Efficient Dose Calculation Algorithm for FFF IMRT with a Simplified Bivariate Gaussian Source Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, F; Park, J; Barraclough, B

    2016-06-15

    Purpose: To develop an efficient and accurate independent dose calculation algorithm with a simplified analytical source model for the quality assurance and safe delivery of flattening filter free (FFF) IMRT on an Elekta Versa HD. Methods: The source model consisted of a point source and a 2D bivariate Gaussian source, respectively modeling the primary photons and the combined effect of head scatter, monitor chamber backscatter and the collimator exchange effect. The in-air fluence was first calculated by back-projecting the edges of the beam-defining devices onto the source plane and integrating the visible source distribution. The effects of the rounded MLC leaf end, tongue-and-groove and interleaf transmission were taken into account in the back-projection. The in-air fluence was then modified with a fourth-degree polynomial modeling the cone-shaped dose distribution of FFF beams. The planar dose distribution was obtained by convolving the in-air fluence with a dose deposition kernel (DDK) consisting of the sum of three 2D Gaussian functions. The parameters of the source model and the DDK were commissioned using measured in-air output factors (Sc) and cross-beam profiles, respectively. A novel method was used to eliminate the volume averaging effect of ion chambers in determining the DDK. Planar dose distributions of five head-and-neck FFF-IMRT plans were calculated and compared against measurements performed with a 2D diode array (MapCHECK™) to validate the accuracy of the algorithm. Results: The proposed source model predicted Sc for both 6 MV and 10 MV with an accuracy better than 0.1%. With a stringent gamma criterion (2%/2 mm/local difference), the passing rate of the FFF-IMRT dose calculation was 97.2 ± 2.6%. Conclusion: The removal of the flattening filter represents a simplification of the head structure which allows the use of a simpler source model for very accurate dose calculation. The proposed algorithm offers an effective way to ensure the safe delivery of FFF-IMRT.
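
    The final convolution step can be sketched directly: an in-air fluence map convolved with a DDK built as a weighted sum of 2-D Gaussians. The weights and sigmas below are invented for illustration, not commissioned values:

```python
import math

def gaussian2d(x, y, sigma):
    return math.exp(-(x * x + y * y) / (2 * sigma * sigma)) / (2 * math.pi * sigma * sigma)

def ddk(x, y, components=((0.7, 1.0), (0.2, 3.0), (0.1, 8.0))):
    """Dose deposition kernel: weighted sum of three 2-D Gaussians (mm)."""
    return sum(w * gaussian2d(x, y, s) for w, s in components)

def planar_dose(fluence, spacing_mm=1.0, radius=10):
    """Direct (brute-force) convolution of the fluence map with the DDK."""
    ny, nx = len(fluence), len(fluence[0])
    dose = [[0.0] * nx for _ in range(ny)]
    for iy in range(ny):
        for ix in range(nx):
            f = fluence[iy][ix]
            if f == 0.0:
                continue
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    jy, jx = iy + dy, ix + dx
                    if 0 <= jy < ny and 0 <= jx < nx:
                        dose[jy][jx] += f * ddk(dx * spacing_mm, dy * spacing_mm)
    return dose

# Open 5x5 "field" of unit fluence centered in a 21x21 grid
fluence = [[1.0 if 8 <= i <= 12 and 8 <= j <= 12 else 0.0
            for j in range(21)] for i in range(21)]
dose = planar_dose(fluence)
```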

  2. Inherent smoothness of intensity patterns for intensity modulated radiation therapy generated by simultaneous projection algorithms

    NASA Astrophysics Data System (ADS)

    Xiao, Ying; Michalski, Darek; Censor, Yair; Galvin, James M.

    2004-07-01

    The efficient delivery of intensity modulated radiation therapy (IMRT) depends on finding optimized beam intensity patterns that produce dose distributions, which meet given constraints for the tumour as well as any critical organs to be spared. Many optimization algorithms that are used for beamlet-based inverse planning are susceptible to large variations of neighbouring intensities. Accurately delivering an intensity pattern with a large number of extrema can prove impossible given the mechanical limitations of standard multileaf collimator (MLC) delivery systems. In this study, we apply Cimmino's simultaneous projection algorithm to the beamlet-based inverse planning problem, modelled mathematically as a system of linear inequalities. We show that using this method allows us to arrive at a smoother intensity pattern. Including nonlinear terms in the simultaneous projection algorithm to deal with dose-volume histogram (DVH) constraints does not compromise this property from our experimental observation. The smoothness properties are compared with those from other optimization algorithms which include simulated annealing and the gradient descent method. The simultaneous property of these algorithms is ideally suited to parallel computing technologies.
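
    Cimmino's method itself is short: project the current intensity vector onto every violated half-space a·x ≤ b simultaneously, then average the corrections. A toy feasibility problem stands in here for the clinical constraint system:

```python
def cimmino(A, b, x0, iters=500):
    """Simultaneous projection onto the half-spaces a_i . x <= b_i."""
    m = len(A)
    x = list(x0)
    for _ in range(iters):
        step = [0.0] * len(x)
        for ai, bi in zip(A, b):
            dot = sum(a * xi for a, xi in zip(ai, x))
            if dot > bi:                                  # constraint violated
                coef = (bi - dot) / sum(a * a for a in ai)
                for k, a in enumerate(ai):
                    step[k] += coef * a                   # projection correction
        if all(s == 0.0 for s in step):                   # already feasible
            break
        x = [xi + s / m for xi, s in zip(x, step)]        # averaged update
    return x

# Feasibility problem: x1 <= 1, x2 <= 1, x1 >= 0, x2 >= 0 (the unit box)
A = [[1, 0], [0, 1], [-1, 0], [0, -1]]
b = [1, 1, 0, 0]
x = cimmino(A, b, [3.0, -2.0])   # converges toward the feasible box
```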

  3. Stereotactic radiotherapy of intrapulmonary lesions: comparison of different dose calculation algorithms for Oncentra MasterPlan®.

    PubMed

    Troeller, Almut; Garny, Sylvia; Pachmann, Sophia; Kantz, Steffi; Gerum, Sabine; Manapov, Farkhad; Ganswindt, Ute; Belka, Claus; Söhn, Matthias

    2015-02-22

    High-accuracy dose calculation algorithms, such as Monte Carlo (MC) and collapsed cone (CC), determine dose in inhomogeneous tissue more accurately than pencil beam (PB) algorithms. However, prescription protocols based on clinical experience with PB are often used for treatment plans calculated with CC. This may lead to treatment plans with changes in field size (FS) and changes in dose to organs at risk (OAR), especially for small tumor volumes in lung tissue treated with SABR. We re-evaluated 17 3D-conformal treatment plans for small intrapulmonary lesions with a prescription of 60 Gy in fractions of 7.5 Gy to the 80% isodose. All treatment plans were initially calculated in Oncentra MasterPlan® using a PB algorithm and recalculated with CC (CCre-calc). Furthermore, a CC-based plan with coverage similar to the PB plan (CCcov) and a CC plan with relaxed coverage criteria (CCclin) were created. The plans were analyzed in terms of Dmean, Dmin, Dmax and coverage for GTV, PTV and ITV. Changes in mean lung dose (MLD), V10Gy and V20Gy were evaluated for the lungs. The re-planned CC plans were compared to the original PB plans regarding changes in total monitor units (MU) and average FS. When PB plans were recalculated with CC, the average V60Gy of GTV, ITV and PTV decreased by 13.2%, 19.9% and 41.4%, respectively. Average Dmean decreased by 9% (GTV), 11.6% (ITV) and 14.2% (PTV). Dmin decreased by 18.5% (GTV), 21.3% (ITV) and 17.5% (PTV). Dmax declined by 7.5%. PTV coverage correlated with PTV volume (p < 0.001). MLD, V10Gy, and V20Gy were significantly reduced in the CC plans. Both CCcov and CCclin had significantly increased MU and FS compared to PB. Recalculation of PB plans for small lung lesions with CC showed a strong decline in dose and coverage for GTV, ITV and PTV, and a decline in dose to the lung. Thus, switching from a PB algorithm to CC while aiming to obtain similar target coverage can be associated with the application of more MU and extension of radiotherapy fields, causing greater OAR exposure.
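
    The dosimetric parameters compared above (Dmean, Dmin, Dmax, V60Gy) can be read off a voxel dose array directly; a minimal sketch with made-up doses in Gy:

```python
def dvh_metrics(doses_gy, vx_level_gy):
    """Basic DVH-style metrics for one structure: mean/min/max dose and
    the percentage of voxels receiving at least vx_level_gy (VxGy)."""
    n = len(doses_gy)
    return {
        "Dmean": sum(doses_gy) / n,
        "Dmin": min(doses_gy),
        "Dmax": max(doses_gy),
        f"V{vx_level_gy}Gy": 100.0 * sum(d >= vx_level_gy for d in doses_gy) / n,
    }

# Five illustrative voxel doses for a target prescribed 60 Gy
m = dvh_metrics([58.0, 60.0, 61.5, 63.0, 59.5], vx_level_gy=60.0)
# m["V60.0Gy"] is the percentage of voxels receiving >= 60 Gy
```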

  4. Compressed sensing with gradient total variation for low-dose CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Seo, Chang-Woo; Cha, Bo Kyung; Jeon, Seongchae; Huh, Young; Park, Justin C.; Lee, Byeonghun; Baek, Junghee; Kim, Eunyoung

    2015-06-01

    This paper describes the improvement of convergence speed with a gradient total variation (GTV) approach in compressed sensing (CS) for low-dose cone-beam computed tomography (CBCT) reconstruction. We derive a fast algorithm for constrained total variation (TV)-based reconstruction from a minimum number of noisy projections. To achieve this we combine the GTV with a TV-norm regularization term to promote sparsity in the X-ray attenuation characteristics of the human body. The GTV is derived from the TV and is more computationally efficient, converging faster to a desired solution. The numerical algorithm is simple and achieves relatively fast convergence. We apply a gradient projection algorithm that seeks a solution iteratively in the direction of the projected gradient while enforcing non-negativity of the solution. In comparison with the Feldkamp-Davis-Kress (FDK) and conventional TV algorithms, the proposed GTV algorithm converged in ≤18 iterations, whereas the original TV algorithm needed at least 34 iterations, when reconstructing the chest phantom images with 50% fewer projections than the FDK algorithm. Future investigation includes improving imaging quality, particularly regarding X-ray cone-beam scatter and motion artifacts in CBCT reconstruction.
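
    The gradient-projection idea can be shown on a 1-D denoising stand-in for the full reconstruction problem: descend on a least-squares fidelity term plus a smoothed TV penalty, then clamp to enforce non-negativity. The step size, weight, and smoothing below are illustrative:

```python
import math

def tv_grad(x, eps=1e-3):
    """Gradient of a smoothed total-variation term sum sqrt((x[i+1]-x[i])^2 + eps)."""
    g = [0.0] * len(x)
    for i in range(len(x) - 1):
        d = x[i + 1] - x[i]
        t = d / math.sqrt(d * d + eps)   # smoothed derivative of |x[i+1]-x[i]|
        g[i] -= t
        g[i + 1] += t
    return g

def gp_tv_denoise(y, lam=0.3, step=0.02, iters=600):
    """Gradient-projection iteration: descend on fidelity + lam*TV, then
    project onto the non-negative orthant (as for attenuation values)."""
    x = list(y)
    for _ in range(iters):
        gt = tv_grad(x)
        x = [max(xi - step * ((xi - yi) + lam * gi), 0.0)
             for xi, yi, gi in zip(x, y, gt)]
    return x

noisy = [0.0, 0.1, 0.0, 1.1, 0.9, 1.0, 0.05, 0.0]
clean = gp_tv_denoise(noisy)   # plateau flattened, background smoothed
```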

  5. SU-E-T-48: A Multi-Institutional Study of Independent Dose Verification for Conventional, SRS and SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2015-06-15

    Purpose: To show the results of a multi-institutional study of independent dose verification for conventional, stereotactic radiosurgery and stereotactic body radiotherapy (SRS and SBRT) plans based on the action level of AAPM TG-114. Methods: This study was performed at 12 institutions in Japan. To eliminate the bias of the independent dose verification program (Indp), all of the institutions used the same CT-based independent dose verification software (Simple MU Analysis, Triangle Products, JP) with a Clarkson-based algorithm. Eclipse (AAA, PBC), Pinnacle³ (Adaptive Convolve) and XiO (Superposition) were used as treatment planning systems (TPS). The confidence limits (CL, mean ± 2SD) for 18 sites (head, breast, lung, pelvis, etc.) were evaluated for the dose comparison between the TPS and the Indp. Results: A retrospective analysis of 6352 treatment fields was conducted. The CLs for conventional, SRS and SBRT plans were 1.0 ± 3.7%, 2.0 ± 2.5% and 6.2 ± 4.4%, respectively. In conventional plans, most sites fell within the 5% TG-114 action level; however, there were systematic differences (4.0 ± 4.0% and 2.5 ± 5.8% for breast and lung, respectively). In SRS plans, our results showed good agreement with the action level. In SBRT plans, the discrepancy between the TPS and the Indp varied depending on the dose calculation algorithm of the TPS. Conclusion: The dose calculation algorithms of the TPS and the Indp affect the action level. It is effective to set site-specific tolerances, especially for sites where inhomogeneity corrections strongly affect the dose distribution.
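
    The TG-114-style check reduces to a confidence limit, mean ± 2SD, on the per-field percent dose differences; the sample values and the 5% action level below are illustrative:

```python
import math

def confidence_limit(diffs_pct):
    """Mean and 2*SD (sample standard deviation) of per-field percent differences."""
    n = len(diffs_pct)
    mean = sum(diffs_pct) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs_pct) / (n - 1))
    return mean, 2.0 * sd

# Illustrative TPS-vs-Indp percent differences for one site
mean, two_sd = confidence_limit([0.5, -1.2, 2.0, 1.1, -0.4, 0.8])
within_action_level = abs(mean) + two_sd <= 5.0   # 5% action level for this site
```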

  6. Performance of two commercial electron beam algorithms over regions close to the lung-mediastinum interface, against Monte Carlo simulation and point dosimetry in virtual and anthropomorphic phantoms.

    PubMed

    Ojala, J; Hyödynmaa, S; Barańczyk, R; Góra, E; Waligórski, M P R

    2014-03-01

    Electron radiotherapy is applied to treat the chest wall close to the mediastinum. The performance of the GGPB and eMC algorithms implemented in the Varian Eclipse treatment planning system (TPS) was studied in this region for 9 and 16 MeV beams, against Monte Carlo (MC) simulations, point dosimetry in a water phantom and dose distributions calculated in virtual phantoms. For the 16 MeV beam, the accuracy of these algorithms was also compared over the lung-mediastinum interface region of an anthropomorphic phantom, against MC calculations and thermoluminescence dosimetry (TLD). In the phantom with a lung-equivalent slab the results were generally congruent, the eMC results for the 9 MeV beam slightly overestimating the lung dose, and the GGPB results for the 16 MeV beam underestimating the lung dose. Over the lung-mediastinum interface, for 9 and 16 MeV beams, the GGPB code underestimated the lung dose and overestimated the dose in water close to the lung, compared to the congruent eMC and MC results. In the anthropomorphic phantom, results of TLD measurements and MC and eMC calculations agreed, while the GGPB code underestimated the lung dose. Good agreement between TLD measurements and MC calculations attests to the accuracy of "full" MC simulations as a reference for benchmarking TPS codes. Application of the GGPB code in chest wall radiotherapy may result in significant underestimation of the lung dose and overestimation of dose to the mediastinum, affecting plan optimization over volumes close to the lung-mediastinum interface, such as the lung or heart.

  7. Sci-Sat AM: Radiation Dosimetry and Practical Therapy Solutions - 04: On 3D Fabrication of Phantoms and Experimental Verification of Patient Dose Computation Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, Rao; Zavan, Rodolfo; McGeachy, Philip

    2016-08-15

    Purpose: The transport-based dose calculation algorithm Acuros XB (AXB) has been shown to accurately account for heterogeneities, mostly through comparisons with Monte Carlo simulations. This study aims to provide additional experimental verification of AXB for flattened and unflattened clinical energies in low-density phantoms of the same material. Materials and Methods: Polystyrene slabs were created using a bench-top 3D printer. Six slabs were printed at varying densities from 0.23 g/cm³ to 0.68 g/cm³, corresponding to different-density humanoid tissues. The slabs were used to form different single- and multilayer geometries. Dose was calculated with AXB 11.0.31 for 6 MV flattened, 15 MV flattened and 6 FFF (flattening filter free) energies for field sizes of 2×2 cm² and 5×5 cm². The phantoms containing radiochromic EBT3 films were irradiated. Absolute dose profiles and 2D gamma analyses were performed for 96 dose planes. Results: For all single-slab and multislab configurations and energies, absolute dose differences between the AXB calculation and film measurements remained <3% for both field sizes, with slightly poorer agreement in the penumbra. The gamma index at 2%/2 mm averaged 98% over all combinations of fields, phantoms and photon energies. Conclusions: The transport-based dose algorithm AXB is in good agreement with experimental measurements for small field sizes using 6 MV, 6 FFF and 15 MV beams adjacent to low-density heterogeneous media. This work provides sufficient experimental ground to support the use of AXB for heterogeneous dose calculation.

  8. Population pharmacokinetics of busulfan in pediatric and young adult patients undergoing hematopoietic cell transplant: a model-based dosing algorithm for personalized therapy and implementation into routine clinical use.

    PubMed

    Long-Boyle, Janel R; Savic, Rada; Yan, Shirley; Bartelink, Imke; Musick, Lisa; French, Deborah; Law, Jason; Horn, Biljana; Cowan, Morton J; Dvorak, Christopher C

    2015-04-01

    Population pharmacokinetic (PK) studies of busulfan in children have shown that individualized model-based algorithms provide improved targeted busulfan therapy when compared with conventional dose guidelines. The adoption of population PK models into routine clinical practice has been hampered by the tendency of pharmacologists to develop complex models that are impractical for clinicians to use. The authors aimed to develop a population PK model for busulfan in children that can reliably achieve therapeutic exposure (concentration at steady state) and to implement a simple model-based tool for the initial dosing of busulfan in children undergoing hematopoietic cell transplantation. Model development was conducted using retrospective data available for 90 pediatric and young adult patients who had undergone hematopoietic cell transplantation with busulfan conditioning. Busulfan drug levels and potential covariates influencing drug exposure were analyzed using the nonlinear mixed effects modeling software NONMEM. The final population PK model was implemented in a clinician-friendly Microsoft Excel-based tool and used to recommend initial doses of busulfan in a group of 21 pediatric patients prospectively dosed on the population PK model. Modeling of busulfan time-concentration data indicates that busulfan clearance displays nonlinearity in children, decreasing by up to approximately 20% between concentrations of 250 and 2000 ng/mL. Important patient-specific covariates found to significantly impact busulfan clearance were actual body weight and age. The percentage of individuals achieving a therapeutic concentration at steady state was significantly higher in subjects receiving initial doses based on the population PK model (81%) than in historical controls dosed on conventional guidelines (52%) (P = 0.02). When compared with the conventional dosing guidelines, the model-based algorithm demonstrates significant improvement in providing targeted busulfan therapy in children and young adults.
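
    The shape of such a model-based starting dose is easy to sketch: choose the dose that hits a target steady-state concentration given a covariate-predicted clearance. The allometric exponent, age adjustment, and all numbers below are invented for illustration and are not the published covariate model:

```python
def predicted_clearance(weight_kg, age_yr, cl_ref=3.0, wt_ref=20.0):
    """Clearance (L/h): allometric weight scaling plus a crude maturation
    adjustment in infants. Parameters are illustrative assumptions."""
    cl = cl_ref * (weight_kg / wt_ref) ** 0.75
    if age_yr < 2:
        cl *= 0.85   # assumed lower clearance in the youngest patients
    return cl

def starting_dose_mg(target_css_ng_ml, weight_kg, age_yr, interval_h=6):
    """Dose per interval sustaining the target steady-state concentration:
    Css (ng/mL == ug/L) times CL (L/h) gives the required input rate in ug/h."""
    rate_ug_h = target_css_ng_ml * predicted_clearance(weight_kg, age_yr)
    return rate_ug_h * interval_h / 1000.0   # ug -> mg per dosing interval

# Illustrative 20 kg, 6-year-old patient targeting 800 ng/mL
dose = starting_dose_mg(target_css_ng_ml=800.0, weight_kg=20.0, age_yr=6)
```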

  9. On the sensitivity of TG-119 and IROC credentialing to TPS commissioning errors.

    PubMed

    McVicker, Drew; Yin, Fang-Fang; Adamson, Justus D

    2016-01-08

    We investigate the sensitivity of IMRT commissioning using the TG-119 C-shape phantom and credentialing with the IROC head and neck phantom to treatment planning system commissioning errors. We introduced errors into the various aspects of the commissioning process for a 6X photon energy modeled using the analytical anisotropic algorithm within a commercial treatment planning system. Errors were implemented into the various components of the dose calculation algorithm including primary photons, secondary photons, electron contamination, and MLC parameters. For each error we evaluated the probability that it could be committed unknowingly during the dose algorithm commissioning stage, and the probability of it being identified during the verification stage. The clinical impact of each commissioning error was evaluated using representative IMRT plans including low and intermediate risk prostate, head and neck, mesothelioma, and scalp; the sensitivity of the TG-119 and IROC phantoms was evaluated by comparing dosimetric changes to the dose planes where film measurements occur and changes in point doses where dosimeter measurements occur. No commissioning errors were found to have both a low probability of detection and high clinical severity. When errors do occur, the IROC credentialing and TG-119 commissioning criteria are generally effective at detecting them; however, for the IROC phantom, OAR point-dose measurements are the most sensitive despite being currently excluded from IROC analysis. Point-dose measurements with an absolute dose constraint were the most effective at detecting errors, while film analysis using a gamma comparison and the IROC film distance-to-agreement criteria were less effective at detecting the specific commissioning errors implemented here.

  10. Achieving routine submillisievert CT scanning: report from the summit on management of radiation dose in CT.

    PubMed

    McCollough, Cynthia H; Chen, Guang Hong; Kalender, Willi; Leng, Shuai; Samei, Ehsan; Taguchi, Katsuyuki; Wang, Ge; Yu, Lifeng; Pettigrew, Roderic I

    2012-08-01

    This Special Report presents the consensus of the Summit on Management of Radiation Dose in Computed Tomography (CT) (held in February 2011), which brought together participants from academia, clinical practice, industry, and regulatory and funding agencies to identify the steps required to reduce the effective dose from routine CT examinations to less than 1 mSv. The most promising technologies and methods discussed at the summit include innovations and developments in x-ray sources; detectors; and image reconstruction, noise reduction, and postprocessing algorithms. Access to raw projection data and standard data sets for algorithm validation and optimization is a clear need, as is the need for new, clinically relevant metrics of image quality and diagnostic performance. Current commercially available techniques such as automatic exposure control, optimization of tube potential, beam-shaping filters, and dynamic z-axis collimators are important, and education to successfully implement these methods routinely is critically needed. Other methods that are just becoming widely available, such as iterative reconstruction, noise reduction, and postprocessing algorithms, will also have an important role. Together, these existing techniques can reduce dose by a factor of two to four. Technical advances that show considerable promise for additional dose reduction but are several years or more from commercial availability include compressed sensing, volume of interest and interior tomography techniques, and photon-counting detectors. This report offers a strategic roadmap for the CT user and research and manufacturer communities toward routinely achieving effective doses of less than 1 mSv, which is well below the average annual dose from naturally occurring sources of radiation.

  11. A study of optimization techniques in HDR brachytherapy for the prostate

    NASA Astrophysics Data System (ADS)

    Pokharel, Ghana Shyam

    Several studies carried out thus far favor dose escalation to the prostate gland for better local control of the disease, but the optimal way to deliver higher doses of radiation therapy to the prostate without harming neighboring critical structures is still debated. In this study, we proposed that real-time high dose rate (HDR) brachytherapy with highly efficient and effective optimization could be an alternative means of precise delivery of such higher doses. This delivery approach eliminates critical issues such as treatment setup uncertainties and target localization in external beam radiation therapy. Likewise, dosimetry in HDR brachytherapy is not influenced by organ edema and potential source migration as in permanent interstitial implants. Moreover, recent reports of radiobiological parameters further strengthen the argument for using hypofractionated HDR brachytherapy in the management of prostate cancer. Firstly, we studied the essential features and requirements of a real-time HDR brachytherapy treatment planning system. Automated catheter reconstruction with fast editing tools, a fast yet accurate dose engine, and a robust, fast optimization and evaluation engine are some of the essential requirements for such procedures. Moreover, in most of the cases we performed, treatment plan optimization took a significant fraction of the overall procedure time, so making treatment plan optimization automatic or semi-automatic with sufficient speed and accuracy was the goal of the remaining part of the project. Secondly, we studied the role of the optimization function and constraints in the overall quality of the optimized plan. We studied a gradient-based deterministic algorithm with dose-volume histogram (DVH) and more conventional variance-based objective functions for optimization. In this optimization strategy, the relative weight of a particular objective in the aggregate objective function signifies its importance with respect to the other objectives. Based on our study, the DVH-based objective function performed better than the traditional variance-based objective function in creating a clinically acceptable plan when executed under identical conditions. Thirdly, we studied a multiobjective optimization strategy using both DVH- and variance-based objective functions. The strategy was to create several Pareto-optimal solutions by scanning the clinically relevant part of the Pareto front. This was adopted to decouple optimization from decision making, so that the user could select the final solution from a pool of alternative solutions based on his or her clinical goals. The overall quality of the treatment plan improved using this approach compared to the traditional class-solution approach; in fact, the final optimized plan selected using the decision engine with the DVH-based objective was comparable to a typical clinical plan created by an experienced physicist. Next, we studied a hybrid technique comprising both stochastic and deterministic algorithms to optimize both dwell positions and dwell times. The simulated annealing algorithm was used to find an optimal catheter distribution, and the DVH-based algorithm was used to optimize the 3D dose distribution for a given catheter distribution. This unique treatment planning and optimization tool was capable of producing clinically acceptable, highly reproducible treatment plans in a clinically reasonable time. As this algorithm was able to create clinically acceptable plans automatically within a clinically reasonable time, it is appealing for real-time procedures. Next, we studied the feasibility of multiobjective optimization using an evolutionary algorithm for real-time HDR brachytherapy of the prostate. With properly tuned algorithm-specific parameters, it was able to create clinically acceptable plans within a clinically reasonable time. However, the algorithm was run for only a limited number of generations, fewer than is generally considered optimal for such algorithms; this was done to keep the time window suitable for real-time procedures. Therefore, further study under improved conditions is required to realize the full potential of the algorithm.
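
    The contrast between the two objective types can be made concrete: a DVH-based penalty charges only the voxels violating a dose-volume goal beyond its allowed volume, while a variance-type penalty pulls every voxel toward the prescription. All numbers below are illustrative:

```python
def dvh_penalty(doses, level, max_volume_frac):
    """Penalty only if more than max_volume_frac of voxels exceed `level`:
    mean squared excess over the voxels beyond the allowed quota."""
    over = sorted((d for d in doses if d > level), reverse=True)
    allowed = int(max_volume_frac * len(doses))
    excess = over[allowed:]                     # violating voxels beyond quota
    if not excess:
        return 0.0
    return sum((d - level) ** 2 for d in excess) / len(doses)

def variance_penalty(doses, prescription):
    """Conventional quadratic deviation from a single prescription value."""
    return sum((d - prescription) ** 2 for d in doses) / len(doses)

doses = [9.0, 10.0, 11.0, 12.0, 14.0]
p_dvh = dvh_penalty(doses, level=11.0, max_volume_frac=0.2)   # goal: V11 <= 20%
p_var = variance_penalty(doses, prescription=10.0)
```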

  12. Dose algorithm for EXTRAD 4100S extremity dosimeter for use at Sandia National Laboratories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, Charles Augustus

    An updated algorithm for the EXTRAD 4100S extremity dosimeter has been derived. This algorithm optimizes the binning of dosimeter element ratios and uses a quadratic function to determine the response factors for low response ratios. This results in lower systematic bias across all test categories and eliminates the need for the 'red strap' algorithm that was used for high-energy beta/gamma-emitting radionuclides. The Radiation Protection Dosimetry Program (RPDP) at Sandia National Laboratories uses the Thermo Fisher EXTRAD 4100S extremity dosimeter, shown in Fig. 1.1, to determine shallow dose to the extremities of potentially exposed individuals. This dosimeter consists of two LiF TLD elements or 'chipstrates', one of TLD-700 (⁷Li) and one of TLD-100 (natural Li), separated by a tin filter. Following readout and background subtraction, the ratio of the responses of the two elements is determined, defining the penetrability of the incident radiation. While this penetrability approximates the incident energy of the radiation, X-rays and beta particles occur in energy distributions that make the determination of dose conversion factors less straightforward.
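
    The overall shape of such an algorithm, binning the element ratio and applying a quadratic response factor at low ratios, can be sketched as follows. The bin edges and coefficients are invented for illustration; the real calibration is facility-specific:

```python
def response_factor(ratio):
    """Dose response factor vs element ratio (TLD-700 / TLD-100), a proxy
    for penetrability. Bin edges and coefficients are hypothetical."""
    if ratio < 0.6:                      # low penetrability: quadratic fit
        a, b, c = 0.9, -1.5, 1.8         # illustrative coefficients
        return a * ratio ** 2 + b * ratio + c
    if ratio < 0.85:                     # intermediate bin
        return 1.10
    return 1.00                          # near-unity ratio: penetrating field

def shallow_dose(reading_700, reading_100):
    """Apply the ratio-binned response factor to the TLD-700 reading."""
    ratio = reading_700 / reading_100
    return reading_700 * response_factor(ratio)

d = shallow_dose(reading_700=40.0, reading_100=50.0)   # ratio 0.8 -> mid bin
```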

  13. Evaluation of On-Board kV Cone Beam Computed Tomography–Based Dose Calculation With Deformable Image Registration Using Hounsfield Unit Modifications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onozato, Yusuke; Kadoya, Noriyuki, E-mail: kadoya.n@rad.med.tohoku.ac.jp; Fujita, Yukio

    2014-06-01

    Purpose: The purpose of this study was to estimate the accuracy of the dose calculation of On-Board Imager (Varian, Palo Alto, CA) cone beam computed tomography (CBCT) with deformable image registration (DIR), using the multilevel-threshold (MLT) algorithm and the histogram matching (HM) algorithm, in pelvic radiation therapy. Methods and Materials: One pelvis phantom and 10 patients with prostate cancer treated with intensity modulated radiation therapy were studied. To minimize the effect of organ deformation and different Hounsfield unit values between the planning CT (PCT) and CBCT, we modified the CBCT (mCBCT) with DIR by using the MLT (mCBCT-MLT) and HM (mCBCT-HM) algorithms. To evaluate the accuracy of the dose calculation, we compared dose differences in dosimetric parameters (mean dose [Dmean], minimum dose [Dmin], and maximum dose [Dmax]) for the planning target volume, rectum, and bladder between the PCT (reference) and the CBCTs or mCBCTs. Furthermore, we investigated the effect of organ deformation by comparing DIR with rigid registration (RR). We determined whether dose differences between the PCT and the mCBCTs were significantly lower than with CBCT by using Student's t test. Results: For patients, the average dose differences in all dosimetric parameters of CBCT with DIR were smaller than those of CBCT with RR (e.g., rectum: 0.54% for DIR vs 1.24% for RR). For the mCBCTs with DIR, the average dose differences in all dosimetric parameters were less than 1.0%. Conclusions: We evaluated the accuracy of the dose calculation in CBCT, mCBCT-MLT, and mCBCT-HM with DIR for 10 patients. The results showed that dose differences in Dmean, Dmin, and Dmax for the mCBCTs were within 1%, which was significantly better than for CBCT, especially for the rectum (P<.05). Our results indicate that mCBCT-MLT and mCBCT-HM can be useful for improving the dose calculation for adaptive radiation therapy.

  14. Iterative reconstruction for CT perfusion with a prior-image induced hybrid nonlocal means regularization: Phantom studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Bin; Lyu, Qingwen; Ma, Jianhua

    2016-04-15

    Purpose: In computed tomography perfusion (CTP) imaging, an initial phase CT acquired with a high-dose protocol can be used to improve the image quality of a later phase CT acquired with a low-dose protocol. For dynamic regions, signals in the later low-dose CT may not be completely recovered if the initial CT heavily regularizes the iterative reconstruction process. The authors propose a hybrid nonlocal means (hNLM) regularization model for iterative reconstruction of low-dose CTP to overcome the limitation of the conventional prior-image induced penalty. Methods: The hybrid penalty was constructed by combining the NLM of the initial phase high-dose CT in the stationary region and the later phase low-dose CT in the dynamic region. The stationary and dynamic regions were determined by the similarity between the initial high-dose scan and the later low-dose scan. The similarity was defined as a Gaussian kernel-based distance between the patch-windows of the same pixel in the two scans, and its measurement was then used to weigh the influence of the initial high-dose CT. For regions with high similarity (e.g., the stationary region), the initial high-dose CT played a dominant role in regularizing the solution. For regions with low similarity (e.g., the dynamic region), the regularization relied on the low-dose scan itself. This new hNLM penalty was incorporated into the penalized weighted least-squares (PWLS) framework for CTP reconstruction. Digital and physical phantom studies were performed to evaluate the PWLS-hNLM algorithm. Results: Both phantom studies showed that the PWLS-hNLM algorithm is superior to the conventional prior-image induced penalty term, which does not consider the signal changes within the dynamic region. In the dynamic region of the Catphan phantom, the reconstruction error measured by root mean square error was reduced by 42.9% in the PWLS-hNLM reconstructed image. Conclusions: The PWLS-hNLM algorithm can effectively use the initial high-dose CT to reconstruct low-dose CTP in the stationary region while reducing its influence in the dynamic region.
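The Gaussian patch-similarity weighting at the heart of the hNLM penalty can be sketched as follows. The patch size and bandwidth `h` are illustrative assumptions, not the authors' values; the weight is near 1 where the two scans agree (stationary region, prior dominates) and near 0 where they differ (dynamic region):

```python
import numpy as np

def similarity_weights(high_dose, low_dose, patch=3, h=20.0):
    """Per-pixel similarity between an initial high-dose scan and a later
    low-dose scan, as a Gaussian kernel of the patch-window distance."""
    a = np.asarray(high_dose, dtype=float)
    b = np.asarray(low_dose, dtype=float)
    pad = patch // 2
    ap = np.pad(a, pad, mode="edge")
    bp = np.pad(b, pad, mode="edge")
    # Sum of squared differences over each sliding patch window
    ssd = np.zeros_like(a)
    for dy in range(patch):
        for dx in range(patch):
            da = ap[dy:dy + a.shape[0], dx:dx + a.shape[1]]
            db = bp[dy:dy + b.shape[0], dx:dx + b.shape[1]]
            ssd += (da - db) ** 2
    # Gaussian kernel of the patch distance, normalized by patch size
    return np.exp(-ssd / (h ** 2 * patch ** 2))
```

In the hybrid penalty, this weight map would blend the prior-image NLM term against the self-regularizing low-dose term pixel by pixel.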

  15. SU-F-T-377: Monte Carlo Re-Evaluation of Volumetric-Modulated Arc Plans of Advanced Stage Nasopharyngeal Cancers Optimized with Convolution-Superposition Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, K; Leung, R; Law, G

    Background: The commercial treatment planning system Pinnacle3 (Philips, Fitchburg, WI, USA) employs a convolution-superposition (CS) algorithm for volumetric-modulated arc radiotherapy (VMAT) optimization and dose calculation. Study of Monte Carlo (MC) dose recalculation of VMAT plans for advanced-stage nasopharyngeal cancers (NPC) is currently limited. Methods: Twenty-nine VMAT plans prescribing 70 Gy, 60 Gy, and 54 Gy to the planning target volumes (PTVs) were included. These clinical plans, achieved with a CS dose engine on Pinnacle3 v9.0, were recalculated by the Monaco TPS v5.0 (Elekta, Maryland Heights, MO, USA) with an XVMC-based MC dose engine. The MC virtual source model was built using the same measurement beam dataset as for the Pinnacle beam model. All MC recalculations were based on absorbed dose to medium in medium (Dm,m). Differences in dose constraint parameters per our institution protocol (Supplementary Table 1) were analyzed. Results: Only the differences in maximum dose to the left brachial plexus, left temporal lobe, and PTV54Gy were found to be statistically insignificant (p > 0.05). Dosimetric differences of the other tumor targets and normal organs are given in Supplementary Table 1. Generally, doses outside the PTV in the normal organs are lower with MC than with CS. This is also true for the PTV54-70Gy doses, but a higher dose in the nasal cavity near the bone interfaces is consistently predicted by MC, possibly due to increased backscattering of short-range scattered photons and secondary electrons that is not properly modeled by the CS algorithm. The straight shoulders of the PTV dose-volume histograms (DVH) initially resulting from the CS optimization are barely preserved after MC recalculation. Conclusion: Significant dosimetric differences in VMAT NPC plans were observed between CS and MC calculations. Adjustments of the planning dose constraints to incorporate the physics differences from the conventional CS algorithm should be made when VMAT optimization is carried out directly with an MC dose engine.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Liu, B; Liang, B

    Purpose: The current CyberKnife treatment planning system (TPS) provides two dose calculation algorithms: Ray-tracing and Monte Carlo. The Ray-tracing algorithm is fast but less accurate, and it cannot handle the irregular fields enabled by the multi-leaf collimator system recently introduced to the CyberKnife M6 system. The Monte Carlo method has well-known accuracy, but the current version still takes a long time to finish dose calculations. The purpose of this paper is to develop a GPU-based fast convolution/superposition (C/S) dose engine for the CyberKnife system that achieves both accuracy and efficiency. Methods: The TERMA distribution from a poly-energetic source was calculated in a beam's-eye-view coordinate system, which is GPU friendly and has linear complexity. The dose distribution was then computed by inversely collecting the energy depositions from all TERMA points along 192 collapsed-cone directions. The EGSnrc user code was used to pre-calculate energy deposition kernels (EDKs) for a series of mono-energetic photons. The energy spectrum was reconstructed based on the measured tissue maximum ratio (TMR) curve, and the TERMA-averaged cumulative kernels were then calculated. Beam hardening parameters and intensity profiles were optimized based on measurement data from the CyberKnife system. Results: The difference between measured and calculated TMR is less than 1% for all collimators except in the build-up regions. The calculated profiles also showed good agreement with the measured doses, within 1% except in the penumbra regions. The developed C/S dose engine was also used to evaluate four clinical CyberKnife treatment plans; the results showed better dose calculation accuracy than the Ray-tracing algorithm when benchmarked against the Monte Carlo method for heterogeneous cases. The dose calculation time is several seconds per beam, depending on collimator size and dose calculation grid. Conclusion: A GPU-based C/S dose engine has been developed for the CyberKnife system, which was shown to be efficient and accurate for clinical purposes, and can be easily implemented in a TPS.
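For a mono-energetic beam, the TERMA computation described above reduces, along each ray, to attenuating the incident energy fluence through the voxels and scaling by the mass attenuation coefficient; poly-energetic beams sum this over spectrum bins. A single-ray sketch (variable names and the 1D treatment are assumptions, not the paper's GPU implementation):

```python
import numpy as np

def terma_along_ray(mu_over_rho, mu, psi0, dz):
    """TERMA (total energy released per unit mass) along one ray:
    T(z) = (mu/rho) * Psi0 * exp(-sum of mu*dz up to z)."""
    mu = np.asarray(mu, dtype=float)  # linear attenuation coefficient per voxel
    # Attenuation path length up to each voxel's entrance face
    path = np.concatenate(([0.0], np.cumsum(mu * dz)[:-1]))
    psi = psi0 * np.exp(-path)        # primary energy fluence entering each voxel
    return mu_over_rho * psi
```

The collapsed-cone step then spreads this released energy to surrounding voxels via the pre-computed EDKs along the discrete cone directions.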

  17. WE-DE-BRA-09: Fast Megavoltage CT Imaging with Rapid Scan Time and Low Imaging Dose in Helical Tomotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magome, T; University of Tokyo Hospital, Tokyo; University of Minnesota, Minneapolis, MN

    Purpose: Megavoltage computed tomography (MVCT) imaging has been widely used for daily patient setup with helical tomotherapy (HT). One drawback of MVCT is its very long imaging time, owing to slow couch speed. The purpose of this study was to develop an MVCT imaging method allowing faster couch speeds, and to assess its accuracy for image guidance in HT. Methods: Three cadavers (closely mimicking the physiological and physical properties of patients) were scanned four times with couch speeds of 1, 2, 3, and 4 mm/s. The resulting MVCT images were reconstructed using an iterative reconstruction (IR) algorithm. The MVCT images were registered with kilovoltage CT images, and the registration errors were compared with those of the conventional filtered back projection (FBP) algorithm. Moreover, the fast MVCT imaging was tested in three cases of total marrow irradiation as a clinical trial. Results: Three-dimensional registration errors of the MVCT images reconstructed with the IR algorithm were significantly smaller (p < 0.05) than the errors of images reconstructed with the FBP algorithm at fast couch speeds (3, 4 mm/s). The scan time and imaging dose at a speed of 4 mm/s were reduced to 30% of those from a conventional coarse-mode scan. For the patient imaging, a limited number of conventional MVCT (1.2 mm/s) and fast MVCT (3 mm/s) scans showed that imaging time and dose can be reduced while remaining usable for anatomical registration. Conclusion: Fast MVCT with the IR algorithm may be a clinically feasible alternative for rapid 3D patient localization. This technique may also be useful for calculating daily dose distributions or for organ motion analyses over a wide area in HT treatment.

  18. X-ray dose reduction in abdominal computed tomography using advanced iterative reconstruction algorithms.

    PubMed

    Ning, Peigang; Zhu, Shaocheng; Shi, Dapeng; Guo, Ying; Sun, Minghua

    2014-01-01

    This work aims to explore the effects of adaptive statistical iterative reconstruction (ASiR) and model-based iterative reconstruction (MBIR) algorithms in reducing computed tomography (CT) radiation dosages in abdominal imaging. CT scans on a standard male phantom were performed at different tube currents. Images at the different tube currents were reconstructed with the filtered back-projection (FBP), 50% ASiR and MBIR algorithms and compared. The CT value, image noise and contrast-to-noise ratios (CNRs) of the reconstructed abdominal images were measured. Volumetric CT dose indexes (CTDIvol) were recorded. At different tube currents, 50% ASiR and MBIR significantly reduced image noise and increased the CNR when compared with FBP. The minimal tube current values required by FBP, 50% ASiR, and MBIR to achieve acceptable image quality using this phantom were 200, 140, and 80 mA, respectively. At the identical image quality, 50% ASiR and MBIR reduced the radiation dose by 35.9% and 59.9% respectively when compared with FBP. Advanced iterative reconstruction techniques are able to reduce image noise and increase image CNRs. Compared with FBP, 50% ASiR and MBIR reduced radiation doses by 35.9% and 59.9%, respectively.
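The contrast-to-noise ratio used above to compare FBP, ASiR, and MBIR images can be computed from two regions of interest. This sketch uses one common definition (ROI mean difference over background standard deviation), which may differ in detail from the study's exact ROI protocol:

```python
import numpy as np

def contrast_to_noise(roi, background):
    """CNR for phantom image-quality studies: absolute difference of the
    ROI and background means, divided by the background standard deviation
    (sample std, ddof=1, taken as the noise estimate)."""
    roi = np.asarray(roi, dtype=float)
    bg = np.asarray(background, dtype=float)
    return abs(roi.mean() - bg.mean()) / bg.std(ddof=1)
```

Lowering tube current raises the background noise term, which is why iterative reconstruction (lower noise at the same current) allows the same CNR at a reduced dose.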

  19. Dental cone-beam CT reconstruction from limited-angle view data based on compressed-sensing (CS) theory for fast, low-dose X-ray imaging

    NASA Astrophysics Data System (ADS)

    Je, Uikyu; Cho, Hyosung; Lee, Minsik; Oh, Jieun; Park, Yeonok; Hong, Daeki; Park, Cheulkyu; Cho, Heemoon; Choi, Sungil; Koo, Yangseo

    2014-06-01

    Recently, reducing radiation doses has become an issue of critical importance in the broader radiological community. As a possible technical approach, especially, in dental cone-beam computed tomography (CBCT), reconstruction from limited-angle view data (< 360°) would enable fast scanning with reduced doses to the patient. In this study, we investigated and implemented an efficient reconstruction algorithm based on compressed-sensing (CS) theory for the scan geometry and performed systematic simulation works to investigate the image characteristics. We also performed experimental works by applying the algorithm to a commercially-available dental CBCT system to demonstrate its effectiveness for image reconstruction in incomplete data problems. We successfully reconstructed CBCT images with incomplete projections acquired at selected scan angles of 120, 150, 180, and 200° with a fixed angle step of 1.2° and evaluated the reconstruction quality quantitatively. Both simulation and experimental demonstrations of the CS-based reconstruction from limited-angle view data show that the algorithm can be applied directly to current dental CBCT systems for reducing the imaging doses and further improving the image quality.
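CS-based reconstructions of this kind typically minimize a total-variation (TV) sparsity prior subject to fidelity with the measured limited-angle projections. A minimal sketch of the anisotropic TV seminorm and one smoothed-TV descent step; the paper's actual algorithm is not specified here, so this is a generic illustration:

```python
import numpy as np

def total_variation(img):
    """Anisotropic TV seminorm: sum of absolute horizontal and vertical
    finite differences, the sparsity prior of CS-based CT reconstruction."""
    img = np.asarray(img, dtype=float)
    return np.abs(np.diff(img, axis=1)).sum() + np.abs(np.diff(img, axis=0)).sum()

def tv_denoise_step(img, step=0.1, eps=1e-8):
    """One explicit (sub)gradient step on a smoothed TV objective."""
    img = np.asarray(img, dtype=float)
    grad = np.zeros_like(img)
    gx = np.diff(img, axis=1)
    gy = np.diff(img, axis=0)
    nx = gx / np.sqrt(gx ** 2 + eps)   # smoothed sign of the differences
    ny = gy / np.sqrt(gy ** 2 + eps)
    # Accumulate the TV subgradient at each pixel
    grad[:, :-1] -= nx
    grad[:, 1:] += nx
    grad[:-1, :] -= ny
    grad[1:, :] += ny
    return img - step * grad
```

In a full reconstruction, such TV steps alternate with projections onto the data-consistency constraint (e.g., ART/SART updates) until both the fidelity and sparsity terms stabilize.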

  20. Comparison of exposure assessment methods in a lung cancer case-control study: performance of a lifelong task-based questionnaire for asbestos and PAHs.

    PubMed

    Bourgkard, Eve; Wild, Pascal; Gonzalez, Maria; Févotte, Joëlle; Penven, Emmanuelle; Paris, Christophe

    2013-12-01

    To describe the performance of a lifelong task-based questionnaire (TBQ) in estimating exposures compared with other approaches in the context of a case-control study. A sample of 93 subjects, corresponding to 497 jobs, was randomly selected from a lung cancer case-control study. For each job, exposure assessments for asbestos and polycyclic aromatic hydrocarbons (PAHs) were obtained by expertise (TBQ expertise) and by algorithm using the TBQ (TBQ algorithm), as well as by expert appraisals based on all available occupational data (REFERENCE expertise), considered to be the gold standard. Additionally, a Job Exposure Matrix (JEM)-based evaluation for asbestos was also obtained. On the 497 jobs, the various evaluations were contrasted using Cohen's κ coefficient of agreement. Additionally, on the total case-control population, the asbestos dose-response relationship based on the TBQ algorithm was compared with the JEM-based assessment. Regarding asbestos, the TBQ-exposure estimates agreed well with the REFERENCE estimate (TBQ expertise: level-weighted κ (lwk)=0.68; TBQ algorithm: lwk=0.61) but less so with the JEM estimate (TBQ expertise: lwk=0.31; TBQ algorithm: lwk=0.26). Regarding PAHs, the agreement between REFERENCE expertise and TBQ was weaker (TBQ expertise: lwk=0.43; TBQ algorithm: lwk=0.36). In the case-control study analysis, the dose-response relationship between lung cancer and cumulative asbestos based on the JEM is less steep than with the TBQ-algorithm exposure assessment, and statistically non-significant. Asbestos-exposure estimates based on the TBQ were consistent with the REFERENCE expertise and yielded a steeper dose-response relationship than the JEM. For PAHs, results were less clear.
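The level-weighted κ used to contrast the exposure assessments is a weighted Cohen's kappa over ordinal exposure levels. A linear-weights sketch (the study's exact weighting scheme is an assumption here):

```python
import numpy as np

def linear_weighted_kappa(r1, r2, n_levels):
    """Linearly weighted Cohen's kappa for two raters assigning ordinal
    levels 0..n_levels-1: kappa = 1 - sum(w*p_obs) / sum(w*p_exp),
    with disagreement weights w proportional to |level_i - level_j|."""
    conf = np.zeros((n_levels, n_levels))
    for a, b in zip(r1, r2):
        conf[a, b] += 1                      # observed confusion matrix
    conf /= conf.sum()
    i, j = np.indices((n_levels, n_levels))
    w = np.abs(i - j) / (n_levels - 1)       # linear disagreement weights
    expected = np.outer(conf.sum(axis=1), conf.sum(axis=0))  # chance agreement
    return 1.0 - (w * conf).sum() / (w * expected).sum()
```

Unlike unweighted κ, adjacent-level disagreements (e.g., "low" vs "medium" exposure) are penalized less than distant ones, which suits graded exposure scales.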

  1. A new correction method serving to eliminate the parabola effect of flatbed scanners used in radiochromic film dosimetry.

    PubMed

    Poppinga, D; Schoenfeld, A A; Doerner, K J; Blanck, O; Harder, D; Poppe, B

    2014-02-01

    The purpose of this study is the correction of the lateral scanner artifact, i.e., the effect that, on a large homogeneously exposed EBT3 film, a flatbed scanner measures different optical densities at different positions along the x axis, the axis parallel to the elongated light source. At constant dose, the measured optical density profiles along this axis have a parabolic shape with significant dose-dependent curvature. Therefore, the effect is called the parabola effect for short. The objective of the algorithm developed in this study is to correct for the parabola effect. Any optical density measured at a given position x is transformed into the equivalent optical density c at the apex of the parabola and then converted into the corresponding dose via the calibration of c versus dose. For the present study EBT3 films and an Epson 10000XL scanner including transparency unit were used for the analysis of the parabola effect. The films were irradiated with 6 MV photons from an Elekta Synergy accelerator in a RW3 slab phantom. In order to quantify the effect, ten film pieces with doses graded from 0 to 20.9 Gy were sequentially scanned at eight positions along the x axis and at six positions along the z axis (the movement direction of the light source), both for the portrait and landscape film orientations. In order to test the effectiveness of the new correction algorithm, the dose profiles of an open square field and an IMRT plan were measured by EBT3 films and compared with ionization chamber and ionization chamber array measurements. The parabola effect has been numerically studied over the whole measuring field of the Epson 10000XL scanner for doses up to 20.9 Gy and for both film orientations. The presented algorithm transforms any optical density at position x into the equivalent optical density that would be measured at the same dose at the apex of the parabola. This correction method has been validated up to doses of 5.2 Gy all over the scanner bed with 2D dose distributions of an open square photon field and an IMRT distribution. The algorithm presented in this study quantifies and corrects the parabola effect of EBT3 films scanned in commonly used commercial flatbed scanners at doses up to 5.2 Gy. It is easy to implement, and no additional work steps are necessary in daily routine film dosimetry.
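The correction maps each measured optical density to its apex equivalent. Under the parabolic model od(x) = c + a·(x − x0)², and approximating the dose-dependent curvature a as a function of the measured od rather than of c (a first-order simplification of the paper's procedure), the transform is one line:

```python
def apex_equivalent_od(od, x, x0, curvature):
    """Correct the lateral-scanner ('parabola') artifact: map an optical
    density measured at lateral position x to the equivalent value c at
    the parabola apex x0. `curvature` is a callable a(od) obtained from
    calibration scans (hypothetical here)."""
    return od - curvature(od) * (x - x0) ** 2
```

The corrected value c is then converted to dose with the ordinary c-versus-dose calibration curve, so the lateral position of the film piece on the scanner bed no longer biases the dose reading.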

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poppinga, D., E-mail: daniela.poppinga@uni-oldenburg.de; Schoenfeld, A. A.; Poppe, B.

    Purpose: The purpose of this study is the correction of the lateral scanner artifact, i.e., the effect that, on a large homogeneously exposed EBT3 film, a flatbed scanner measures different optical densities at different positions along the x axis, the axis parallel to the elongated light source. At constant dose, the measured optical density profiles along this axis have a parabolic shape with significant dose-dependent curvature. Therefore, the effect is called the parabola effect for short. The objective of the algorithm developed in this study is to correct for the parabola effect. Any optical density measured at a given position x is transformed into the equivalent optical density c at the apex of the parabola and then converted into the corresponding dose via the calibration of c versus dose. Methods: For the present study EBT3 films and an Epson 10000XL scanner including transparency unit were used for the analysis of the parabola effect. The films were irradiated with 6 MV photons from an Elekta Synergy accelerator in a RW3 slab phantom. In order to quantify the effect, ten film pieces with doses graded from 0 to 20.9 Gy were sequentially scanned at eight positions along the x axis and at six positions along the z axis (the movement direction of the light source), both for the portrait and landscape film orientations. In order to test the effectiveness of the new correction algorithm, the dose profiles of an open square field and an IMRT plan were measured by EBT3 films and compared with ionization chamber and ionization chamber array measurements. Results: The parabola effect has been numerically studied over the whole measuring field of the Epson 10000XL scanner for doses up to 20.9 Gy and for both film orientations. The presented algorithm transforms any optical density at position x into the equivalent optical density that would be measured at the same dose at the apex of the parabola. This correction method has been validated up to doses of 5.2 Gy all over the scanner bed with 2D dose distributions of an open square photon field and an IMRT distribution. Conclusions: The algorithm presented in this study quantifies and corrects the parabola effect of EBT3 films scanned in commonly used commercial flatbed scanners at doses up to 5.2 Gy. It is easy to implement, and no additional work steps are necessary in daily routine film dosimetry.

  3. Satellite change detection of forest damage near the Chernobyl accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClellan, G.E.; Anno, G.H.

    1992-01-01

    A substantial amount of forest within a few kilometers of the Chernobyl nuclear reactor station was badly contaminated with radionuclides by the April 26, 1986, explosion and ensuing fire at reactor No. 4. Radiation doses to conifers in some areas were sufficient to cause discoloration of needles within a few weeks. Other areas, receiving smaller doses, showed foliage changes beginning 6 months to a year later. Multispectral imagery available from Landsat sensors is especially suited for monitoring such changes in vegetation. A series of Landsat Thematic Mapper images was developed that spans the 2 yr following the accident. Quantitative dose estimation for the exposed conifers requires an objective change detection algorithm and knowledge of the dose-time response of conifers to ionizing radiation. Pacific-Sierra Research Corporation's Hyperscout™ algorithm is based on an advanced, sensitive technique for change detection particularly suited for multispectral images. The Hyperscout algorithm has been used to assess radiation damage to the forested areas around the Chernobyl nuclear power plant.

  4. Evaluation of the new electron-transport algorithm in MCNP6.1 for the simulation of dose point kernel in water

    NASA Astrophysics Data System (ADS)

    Antoni, Rodolphe; Bourgois, Laurent

    2017-12-01

    In this work, the calculation of dose distributions in water is evaluated in MCNP6.1 with both the regular condensed-history algorithm, the "detailed electron energy-loss straggling logic", and the newly proposed electron-transport algorithm, the "single event algorithm". The Dose Point Kernel (DPK) is calculated with monoenergetic electrons of 50, 100, 500, 1000 and 3000 keV for different scoring-cell dimensions. A comparison between MCNP6 results and well-validated codes for electron dosimetry, i.e., EGSnrc and Penelope, is performed. When the detailed electron energy-loss straggling logic is used with the default setting (down to the cut-off energy of 1 keV), the depth of the dose peak increases with decreasing thickness of the scoring cell, largely due to combined step-size and boundary-crossing artifacts. This finding is less prominent for the 500 keV, 1 MeV and 3 MeV dose profiles. With an appropriate number of sub-steps (the ESTEP value in MCNP6), the dose-peak shift is almost completely absent for 50 keV and 100 keV electrons. However, the dose peak is more prominent than in EGSnrc and the absorbed dose tends to be underestimated at greater depths, meaning that boundary-crossing artifacts still occur even though step-size artifacts are greatly reduced. When the single-event mode is used for the whole transport, we observe good agreement between the reference and calculated profiles for 50 and 100 keV electrons. The remaining artifacts vanish entirely, showing a possible transport treatment for energies below about a hundred keV that agrees with the reference for any scoring-cell dimension, even though the single-event method was initially intended to support electron transport at energies below 1 keV. Conversely, results for 500 keV, 1 MeV and 3 MeV deviate dramatically from the reference curves. These poor results, and thus the current unreliability of the method at these energies, are partly due to inappropriate elastic cross-section treatment in the ENDF/B-VI.8 library in those energy ranges. Accordingly, special care has to be taken in the choice of settings when calculating electron dose distributions with MCNP6, particularly with regard to dosimetry or nuclear medicine applications.
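A dose point kernel is typically tallied by scoring the energy deposited in concentric spherical shells around the point source and dividing by each shell's mass. A sketch of that post-processing step (units are left to the caller; this is not MCNP's tally machinery):

```python
import numpy as np

def shell_doses(radii, energy_per_shell, density=1.0):
    """Convert energy tallied in concentric spherical shells around a
    point source into a dose point kernel: D_i = E_i / (rho * V_i),
    where V_i = (4/3)*pi*(r_{i+1}^3 - r_i^3) is the shell volume."""
    r = np.asarray(radii, dtype=float)            # shell boundaries, length n+1
    e = np.asarray(energy_per_shell, dtype=float) # energy deposited per shell
    vol = 4.0 / 3.0 * np.pi * (r[1:] ** 3 - r[:-1] ** 3)
    return e / (density * vol)
```

The step-size and boundary-crossing artifacts discussed above show up as distortions of exactly this shell-by-shell dose profile, which is why the scoring-cell (shell) thickness matters.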

  5. Patient-specific CT dosimetry calculation: a feasibility study.

    PubMed

    Fearon, Thomas; Xie, Huchen; Cheng, Jason Y; Ning, Holly; Zhuge, Ying; Miller, Robert W

    2011-11-15

    Current estimation of radiation dose from computed tomography (CT) scans on patients has relied on the measurement of Computed Tomography Dose Index (CTDI) in standard cylindrical phantoms, and calculations based on mathematical representations of "standard man". Radiation dose to both adult and pediatric patients from a CT scan has been a concern, as noted in recent reports. The purpose of this study was to investigate the feasibility of adapting a radiation treatment planning system (RTPS) to provide patient-specific CT dosimetry. A radiation treatment planning system was modified to calculate patient-specific CT dose distributions, which can be represented by dose at specific points within an organ of interest, as well as organ dose-volumes (after image segmentation) for a GE Light Speed Ultra Plus CT scanner. The RTPS calculation algorithm is based on a semi-empirical, measured correction-based algorithm, which has been well established in the radiotherapy community. Digital representations of the physical phantoms (virtual phantom) were acquired with the GE CT scanner in axial mode. Thermoluminescent dosimeter (TLDs) measurements in pediatric anthropomorphic phantoms were utilized to validate the dose at specific points within organs of interest relative to RTPS calculations and Monte Carlo simulations of the same virtual phantoms (digital representation). Congruence of the calculated and measured point doses for the same physical anthropomorphic phantom geometry was used to verify the feasibility of the method. The RTPS algorithm can be extended to calculate the organ dose by calculating a dose distribution point-by-point for a designated volume. Electron Gamma Shower (EGSnrc) codes for radiation transport calculations developed by National Research Council of Canada (NRCC) were utilized to perform the Monte Carlo (MC) simulation. In general, the RTPS and MC dose calculations are within 10% of the TLD measurements for the infant and child chest scans. 
With respect to the dose comparisons for the head, the RTPS dose calculations are slightly higher (10%-20%) than the TLD measurements, while the MC results were within 10% of the TLD measurements. The advantage of the algebraic dose calculation engine of the RTPS is a substantially reduced computation time (minutes vs. days) relative to Monte Carlo calculations, as well as providing patient-specific dose estimation. It also provides the basis for a more elaborate reporting of dosimetric results, such as patient specific organ dose volumes after image segmentation.

  6. Poster - Thurs Eve-23: Effect of lung density and geometry variation on inhomogeneity correction algorithms: A Monte Carlo dosimetry evaluation.

    PubMed

    Chow, J; Leung, M; Van Dyk, J

    2008-07-01

    This study provides new information on the evaluation of lung dose calculation algorithms as a function of the relative electron density of lung, ρ_e,lung. Doses calculated using the collapsed cone convolution (CCC) and adaptive convolution (AC) algorithms in lung with the Pinnacle3 system were compared to those calculated using Monte Carlo (MC) simulation (EGSnrc-based code). Three groups of lung phantoms, namely "Slab", "Column" and "Cube", with different ρ_e,lung (0.05-0.7), positions, volumes and shapes of lung in water were used. 6 and 18 MV photon beams with 4 × 4 and 10 × 10 cm² field sizes produced by a Varian 21EX Linac were used in the MC dose calculations. Results show that the CCC algorithm agrees with AC to within ±1% for doses calculated in the lung phantoms, indicating that the AC, which requires 3-4 times less computing time than CCC, is a good substitute for the CCC method. Comparing the CCC and AC with MC, dose deviations are found when ρ_e,lung ≤ 0.1-0.3. The degree of deviation depends on the photon beam energy and field size, and is relatively large when high-energy photon beams with small fields are used. For the penumbra widths (20%-80%), the CCC and AC agree well with MC for the "Slab" and "Cube" phantoms with the lung volumes at the central beam axis (CAX). However, deviations >2 mm occur in the "Column" phantoms, with two lung volumes separated by a water column along the CAX, using the 18 MV (4 × 4 cm²) photon beams with ρ_e,lung ≤ 0.1. © 2008 American Association of Physicists in Medicine.
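The 20%-80% penumbra width compared above can be extracted from a beam-edge profile by interpolating the normalized dose. A sketch assuming a monotonically rising edge (the authors' exact extraction procedure is not stated):

```python
import numpy as np

def penumbra_width(x, profile, lo=0.2, hi=0.8):
    """20%-80% penumbra width of one edge of a dose profile: the lateral
    distance between the points where the normalized profile crosses the
    `lo` and `hi` dose levels, found by linear interpolation. Assumes the
    profile rises monotonically across the edge."""
    p = np.asarray(profile, dtype=float)
    p = p / p.max()                     # normalize to the profile maximum
    x = np.asarray(x, dtype=float)
    x_lo = np.interp(lo, p, x)          # position of the 20% dose level
    x_hi = np.interp(hi, p, x)          # position of the 80% dose level
    return abs(x_hi - x_lo)
```

Broadened penumbra predictions in low-density lung are precisely where convolution-type algorithms and Monte Carlo tend to diverge, as the abstract reports for the "Column" phantoms.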

  7. Effects of Iterative Reconstruction Algorithms on Computer-assisted Detection (CAD) Software for Lung Nodules in Ultra-low-dose CT for Lung Cancer Screening.

    PubMed

    Nomura, Yukihiro; Higaki, Toru; Fujita, Masayo; Miki, Soichiro; Awaya, Yoshikazu; Nakanishi, Toshio; Yoshikawa, Takeharu; Hayashi, Naoto; Awai, Kazuo

    2017-02-01

    This study aimed to evaluate the effects of iterative reconstruction (IR) algorithms on computer-assisted detection (CAD) software for lung nodules in ultra-low-dose computed tomography (ULD-CT) for lung cancer screening. We selected 85 subjects who underwent both a low-dose CT (LD-CT) scan and an additional ULD-CT scan in our lung cancer screening program for high-risk populations. The LD-CT scans were reconstructed with filtered back projection (FBP; LD-FBP). The ULD-CT scans were reconstructed with FBP (ULD-FBP), adaptive iterative dose reduction 3D (AIDR 3D; ULD-AIDR 3D), and forward projected model-based IR solution (FIRST; ULD-FIRST). CAD software for lung nodules was applied to each image dataset, and the performance of the CAD software was compared among the different IR algorithms. The mean volume CT dose indexes were 3.02 mGy (LD-CT) and 0.30 mGy (ULD-CT). For overall nodules, the sensitivities of CAD software at 3.0 false positives per case were 78.7% (LD-FBP), 9.3% (ULD-FBP), 69.4% (ULD-AIDR 3D), and 77.8% (ULD-FIRST). Statistical analysis showed that the sensitivities of ULD-AIDR 3D and ULD-FIRST were significantly higher than that of ULD-FBP (P < .001). The performance of CAD software in ULD-CT was improved by using IR algorithms. In particular, the performance of CAD in ULD-FIRST was almost equivalent to that in LD-FBP. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  8. Accelerating IMRT optimization by voxel sampling

    NASA Astrophysics Data System (ADS)

    Martin, Benjamin C.; Bortfeld, Thomas R.; Castañon, David A.

    2007-12-01

    This paper presents a new method for accelerating intensity-modulated radiation therapy (IMRT) optimization using voxel sampling. Rather than calculating the dose to the entire patient at each step in the optimization, the dose is only calculated for some randomly selected voxels. Those voxels are then used to calculate estimates of the objective and gradient which are used in a randomized version of a steepest descent algorithm. By selecting different voxels on each step, we are able to find an optimal solution to the full problem. We also present an algorithm to automatically choose the best sampling rate for each structure within the patient during the optimization. Seeking further improvements, we experimented with several other gradient-based optimization algorithms and found that the delta-bar-delta algorithm performs well despite the randomness. Overall, we were able to achieve approximately an order of magnitude speedup on our test case as compared to steepest descent.
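The voxel-sampling idea, estimating the objective and gradient from a random subset of voxels at each step, can be sketched on a simple quadratic dose objective. Here `D` is a hypothetical dose-deposition matrix (voxels × beamlets); the fixed step size and sampling rate are illustrative, whereas the paper adapts the sampling rate per structure:

```python
import numpy as np

def sampled_descent(D, target, iters=300, sample=0.3, step=0.05, seed=0):
    """Steepest descent on f(w) = ||D w - target||^2 / n_vox, with the
    gradient estimated from a random voxel subset at every iteration."""
    rng = np.random.default_rng(seed)
    n_vox, n_beam = D.shape
    k = max(1, int(sample * n_vox))
    w = np.zeros(n_beam)
    for _ in range(iters):
        idx = rng.choice(n_vox, size=k, replace=False)  # sampled voxels
        r = D[idx] @ w - target[idx]                    # residual on the subset
        g = 2.0 * D[idx].T @ r / k                      # unbiased gradient estimate
        w = np.maximum(w - step * g, 0.0)               # beamlet weights stay >= 0
    return w
```

Because a fresh subset is drawn each step, the noisy gradients average out and the iterates converge toward the optimum of the full objective, while each iteration touches only a fraction of the dose matrix.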

  9. SU-F-BRF-09: A Non-Rigid Point Matching Method for Accurate Bladder Dose Summation in Cervical Cancer HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, H; Zhen, X; Zhou, L

    2014-06-15

    Purpose: To propose and validate a deformable point matching scheme for surface deformation to facilitate accurate bladder dose summation for fractionated HDR cervical cancer treatment. Method: A deformable point matching scheme based on the thin plate spline robust point matching (TPSRPM) algorithm is proposed for bladder surface registration. The surface of bladders segmented from fractional CT images is extracted and discretized with triangular surface mesh. Deformation between the two bladder surfaces are obtained by matching the two meshes' vertices via the TPS-RPM algorithm, and the deformation vector fields (DVFs) characteristic of this deformation is estimated by B-spline approximation. Numerically, themore » algorithm is quantitatively compared with the Demons algorithm using five clinical cervical cancer cases by several metrics: vertex-to-vertex distance (VVD), Hausdorff distance (HD), percent error (PE), and conformity index (CI). Experimentally, the algorithm is validated on a balloon phantom with 12 surface fiducial markers. The balloon is inflated with different amount of water, and the displacement of fiducial markers is benchmarked as ground truth to study TPS-RPM calculated DVFs' accuracy. Results: In numerical evaluation, the mean VVD is 3.7(±2.0) mm after Demons, and 1.3(±0.9) mm after TPS-RPM. The mean HD is 14.4 mm after Demons, and 5.3mm after TPS-RPM. The mean PE is 101.7% after Demons and decreases to 18.7% after TPS-RPM. The mean CI is 0.63 after Demons, and increases to 0.90 after TPS-RPM. In the phantom study, the mean Euclidean distance of the fiducials is 7.4±3.0mm and 4.2±1.8mm after Demons and TPS-RPM, respectively. Conclusions: The bladder wall deformation is more accurate using the feature-based TPS-RPM algorithm than the intensity-based Demons algorithm, indicating that TPS-RPM has the potential for accurate bladder dose deformation and dose summation for multi-fractional cervical HDR brachytherapy. 
    This work is supported in part by the National Natural Science Foundation of China (nos. 30970866 and 81301940).
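
The core of the point-matching step above is a thin-plate-spline warp fitted to corresponding mesh vertices. As a rough numpy-only sketch (the full TPS-RPM algorithm adds soft correspondences and deterministic annealing, which are omitted here), one can solve the classical TPS linear system for a set of matched points; `fit_tps` and the toy shift example are illustrative, not the authors' code:

```python
import numpy as np

def tps_kernel(r):
    # Classical TPS radial basis U(r) = r^2 log r, with U(0) = 0
    with np.errstate(divide='ignore', invalid='ignore'):
        return np.where(r > 0, r * r * np.log(r), 0.0)

def fit_tps(src, dst):
    """Fit a thin-plate-spline warp taking src points to dst points
    (N x d arrays with known one-to-one correspondence); returns a
    callable evaluating the warp at arbitrary query points."""
    n, d = src.shape
    K = tps_kernel(np.linalg.norm(src[:, None] - src[None, :], axis=-1))
    P = np.hstack([np.ones((n, 1)), src])              # affine part
    A = np.zeros((n + d + 1, n + d + 1))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    b = np.zeros((n + d + 1, d))
    b[:n] = dst
    coef = np.linalg.solve(A, b)                       # [w; a] per component
    w, a = coef[:n], coef[n:]
    def warp(q):
        Kq = tps_kernel(np.linalg.norm(q[:, None] - src[None, :], axis=-1))
        return Kq @ w + np.hstack([np.ones((len(q), 1)), q]) @ a
    return warp

# Toy check: a uniform 2 mm shift of 30 random "vertices" is recovered
rng = np.random.default_rng(0)
src = rng.uniform(0, 50, size=(30, 3))
tps = fit_tps(src, src + np.array([2.0, 0.0, 0.0]))
dvf = tps(src) - src                                   # displacement field
```

Because a uniform translation lies in the affine part of the model, the fitted warp reproduces it exactly at and away from the vertices; real bladder surfaces would of course produce a spatially varying DVF.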

  10. Effects of anti-aggregant, anti-inflammatory and anti-coagulant drug consumption on the preparation and therapeutic potential of plasma rich in growth factors (PRGF).

    PubMed

    Anitua, Eduardo; Troya, María; Zalduendo, Mar; Orive, Gorka

    2015-02-01

    The prevalence and incidence of trauma-related injuries, coronary heart disease and other chronic diseases increase dramatically with age. This population sector is therefore a regular consumer of different types of drugs that may affect platelet aggregation and the coagulation cascade. We have evaluated whether the consumption of acetylsalicylic acid, acenocoumarol, glucosamine sulfate and chondroitin sulfate, and therefore their presence in blood, could interfere with the preparation and biological outcomes of plasma rich in growth factors (PRGF). Clotting time, clot retraction and platelet activation of PRGF were evaluated. PRGF growth factor content and the release of different biomolecules by tendon fibroblasts were also quantified, as well as cell proliferation and cell migration. The preparation and biological potential of PRGF are not affected by the intake of the evaluated drugs; only its angiogenic potential and its capacity to induce HA and fibronectin synthesis are reduced in patients taking anti-coagulants.

  11. Automated aortic calcification detection in low-dose chest CT images

    NASA Astrophysics Data System (ADS)

    Xie, Yiting; Htwe, Yu Maw; Padgett, Jennifer; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.

    2014-03-01

    The extent of aortic calcification has been shown to be a risk indicator for vascular events including cardiac events. We have developed a fully automated computer algorithm to segment and measure aortic calcification in low-dose, non-contrast, non-ECG-gated chest CT scans. The algorithm first segments the aorta using a pre-computed Anatomy Label Map (ALM). Then, based on the segmented aorta, aortic calcification is detected and measured in terms of the Agatston score, mass score, and volume score. The automated scores are compared with reference scores obtained from manual markings. For aorta segmentation, the aorta is modeled as a series of discrete overlapping cylinders and the aortic centerline is determined using a cylinder-tracking algorithm. Then the aortic surface location is detected using the centerline and a triangular mesh model. The segmented aorta is used as a mask for the detection of aortic calcification. For calcification detection, the image is first filtered, then an elevated threshold of 160 Hounsfield units (HU) is applied within the aorta mask region to reduce the effect of noise in low-dose scans, and finally non-aortic calcification voxels (bony structures, calcification in other organs) are eliminated. The remaining candidates are considered true aortic calcification. The computer algorithm was evaluated on 45 low-dose non-contrast CT scans. Using linear regression, the automated Agatston score is 98.42% correlated with the reference Agatston score. The automated mass and volume scores are, respectively, 98.46% and 98.28% correlated with the reference mass and volume scores.
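
The detection step described above reduces to a masked threshold. A minimal numpy sketch of the volume-score part, under assumed inputs (a HU array, a binary aorta mask, and the voxel volume; the function name and toy values are illustrative):

```python
import numpy as np

def aortic_calcium_volume(ct_hu, aorta_mask, voxel_volume_mm3, threshold_hu=160):
    """Volume score: total volume of voxels at or above the HU threshold
    inside the aorta mask (bony and other-organ voxels assumed removed)."""
    calc = (ct_hu >= threshold_hu) & aorta_mask
    return calc.sum() * voxel_volume_mm3

# Toy 3x3x3 patch: soft-tissue background with one calcified voxel
ct = np.full((3, 3, 3), 40.0)
ct[1, 1, 1] = 400.0
mask = np.ones_like(ct, dtype=bool)
volume_mm3 = aortic_calcium_volume(ct, mask, voxel_volume_mm3=0.5)
print(volume_mm3)  # -> 0.5
```

The Agatston and mass scores add per-lesion HU weighting and calibration on top of this same thresholded mask.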

  12. Intra-patient comparison of reduced-dose model-based iterative reconstruction with standard-dose adaptive statistical iterative reconstruction in the CT diagnosis and follow-up of urolithiasis.

    PubMed

    Tenant, Sean; Pang, Chun Lap; Dissanayake, Prageeth; Vardhanabhuti, Varut; Stuckey, Colin; Gutteridge, Catherine; Hyde, Christopher; Roobottom, Carl

    2017-10-01

    To evaluate the accuracy of reduced-dose CT scans reconstructed using a new generation of model-based iterative reconstruction (MBIR) in the imaging of urinary tract stone disease, compared with a standard-dose CT using 30% adaptive statistical iterative reconstruction. This single-institution prospective study recruited 125 patients presenting either with acute renal colic or for follow-up of known urinary tract stones. They underwent two immediately consecutive scans, one at standard dose settings and one at the lowest dose (highest noise index) the scanner would allow. The reduced-dose scans were reconstructed using both ASIR 30% and MBIR algorithms and reviewed independently by two radiologists. Objective and subjective image quality measures as well as diagnostic data were obtained. The reduced-dose MBIR scan was 100% concordant with the reference standard for the assessment of ureteric stones. It was extremely accurate at identifying calculi of 3 mm and above. The algorithm allowed a dose reduction of 58% without any loss of scan quality. A reduced-dose CT scan using MBIR is accurate in acute imaging for renal colic symptoms and for urolithiasis follow-up and allows a significant reduction in dose. • MBIR allows reduced CT dose with similar diagnostic accuracy • MBIR outperforms ASIR when used for the reconstruction of reduced-dose scans • MBIR can be used to accurately assess stones 3 mm and above.

  13. Investigating the generalisation of an atlas-based synthetic-CT algorithm to another centre and MR scanner for prostate MR-only radiotherapy

    NASA Astrophysics Data System (ADS)

    Wyatt, Jonathan J.; Dowling, Jason A.; Kelly, Charles G.; McKenna, Jill; Johnstone, Emily; Speight, Richard; Henry, Ann; Greer, Peter B.; McCallum, Hazel M.

    2017-12-01

    There is increasing interest in MR-only radiotherapy planning since it provides superb soft-tissue contrast without the registration uncertainties inherent in a CT-MR registration. However, MR images cannot readily provide the electron density information necessary for radiotherapy dose calculation. An algorithm which generates synthetic CTs for dose calculations from MR images of the prostate using an atlas of 3 T MR images has been previously reported by two of the authors. This paper aimed to evaluate this algorithm using MR data acquired at a different field strength and a different centre to the algorithm atlas. Twenty-one prostate patients received planning 1.5 T MR and CT scans with routine immobilisation devices on a flat-top couch set-up using external lasers. The MR receive coils were supported by a coil bridge. Synthetic CTs were generated from the planning MR images with (sCT1V) and without (sCT) a one voxel body contour expansion included in the algorithm. This was to test whether this expansion was required for 1.5 T images. Both synthetic CTs were rigidly registered to the planning CT (pCT). A 6 MV volumetric modulated arc therapy plan was created on the pCT and recalculated on the sCT and sCT1V. The synthetic CTs' dose distributions were compared to the dose distribution calculated on the pCT. The percentage dose difference at isocentre without the body contour expansion (sCT-pCT) was ΔD_sCT = (0.9 ± 0.8)% and with (sCT1V-pCT) was ΔD_sCT1V = (-0.7 ± 0.7)% (mean ± one standard deviation). The sCT1V result was within one standard deviation of zero and agreed with the result reported previously using 3 T MR data. The sCT dose difference only agreed within two standard deviations. The mean ± one standard deviation gamma pass rate was Γ_sCT = (96.1 ± 2.9)% for the sCT and Γ_sCT1V = (98.8 ± 0.5)% for the sCT1V (with 2% global dose difference and 2 mm distance to agreement gamma criteria).
The one voxel body contour expansion improves the synthetic CT accuracy for MR images acquired at 1.5 T but requires the MR voxel size to be similar to the atlas MR voxel size. This study suggests that the atlas-based algorithm can be generalised to MR data acquired using a different field strength at a different centre.
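
The isocentre comparison above amounts to per-patient percentage dose differences summarised as a cohort mean and standard deviation. A small numpy sketch with made-up dose values (not the study's data):

```python
import numpy as np

def isocentre_dose_diff(d_sct, d_pct):
    """Percentage dose difference at the isocentre per patient,
    100 * (D_sCT - D_pCT) / D_pCT, with cohort mean and sample SD."""
    diff = 100.0 * (np.asarray(d_sct) - np.asarray(d_pct)) / np.asarray(d_pct)
    return diff, diff.mean(), diff.std(ddof=1)

# Toy cohort of three plans (dose at isocentre, Gy)
diff, mean, sd = isocentre_dose_diff([78.6, 78.2, 78.9], [78.0, 78.0, 78.0])
print(f"{mean:.1f} +/- {sd:.1f} %")  # -> 0.7 +/- 0.5 %
```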

  14. A back-projection algorithm in the presence of an extra attenuating medium: towards EPID dosimetry for the MR-Linac

    NASA Astrophysics Data System (ADS)

    Torres-Xirau, I.; Olaciregui-Ruiz, I.; Rozendaal, R. A.; González, P.; Mijnheer, B. J.; Sonke, J.-J.; van der Heide, U. A.; Mans, A.

    2017-08-01

    In external beam radiotherapy, electronic portal imaging devices (EPIDs) are frequently used for pre-treatment and for in vivo dose verification. Currently, various MR-guided radiotherapy systems are being developed and clinically implemented. Independent dosimetric verification is highly desirable. For this purpose we adapted our EPID-based dose verification system for use with the MR-Linac combination developed by Elekta in cooperation with UMC Utrecht and Philips. In this study we extended our back-projection method to cope with the presence of an extra attenuating medium between the patient and the EPID. Experiments were performed at a conventional linac, using an aluminum mock-up of the MRI scanner housing between the phantom and the EPID. For a 10 cm square field, the attenuation by the mock-up was 72%, while 16% of the remaining EPID signal resulted from scattered radiation. 58 IMRT fields were delivered to a 20 cm slab phantom with and without the mock-up. EPID-reconstructed dose distributions were compared to planned dose distributions using the γ-evaluation method (global, 3%, 3 mm). With our adapted back-projection algorithm the averaged γ_mean was 0.27 ± 0.06, while with the conventional algorithm it was 0.28 ± 0.06. Dose profiles of several square fields reconstructed with our adapted algorithm showed excellent agreement when compared to the TPS.
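
The γ-evaluation used above combines a dose-difference and a distance-to-agreement criterion. A brute-force 2D sketch for small grids (global normalisation; the names and toy field are illustrative, and real EPID grids would need a faster search than this double loop):

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_frac=0.03, dta_mm=3.0):
    """Global gamma (3%, 3 mm by default) between two dose planes on the
    same grid: for every reference point take the minimum over evaluated
    points of sqrt(dist^2/dta^2 + dose_diff^2/dd^2); report % with gamma <= 1."""
    ny, nx = ref.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing='ij')
    dd = dose_frac * ref.max()                  # global dose criterion
    gamma = np.empty_like(ref)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            dose2 = (ev - ref[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2 / dd ** 2))
    return 100.0 * np.mean(gamma <= 1.0)

ref = np.outer(np.hanning(32), np.hanning(32)) * 2.0  # toy dose plane (Gy)
rate = gamma_pass_rate(ref, ref * 1.01, spacing_mm=1.0)
print(rate)  # a 1% scaling error passes a 3% global criterion -> 100.0
```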

  15. Validation of clinical testing for warfarin sensitivity: comparison of CYP2C9-VKORC1 genotyping assays and warfarin-dosing algorithms.

    PubMed

    Langley, Michael R; Booker, Jessica K; Evans, James P; McLeod, Howard L; Weck, Karen E

    2009-05-01

    Responses to warfarin (Coumadin) anticoagulation therapy are affected by genetic variability in both the CYP2C9 and VKORC1 genes. Validation of pharmacogenetic testing for warfarin responses includes demonstration of analytical validity of testing platforms and of the clinical validity of testing. We compared four platforms for determining the relevant single nucleotide polymorphisms (SNPs) in both CYP2C9 and VKORC1 that are associated with warfarin sensitivity (Third Wave Invader Plus, ParagonDx/Cepheid Smart Cycler, Idaho Technology LightCycler, and AutoGenomics Infiniti). Each method was examined for accuracy, cost, and turnaround time. All genotyping methods demonstrated greater than 95% accuracy for identifying the relevant SNPs (CYP2C9 *2 and *3; VKORC1 -1639 or 1173). The ParagonDx and Idaho Technology assays had the shortest turnaround and hands-on times. The Third Wave assay was readily scalable to higher test volumes but had the longest hands-on time. The AutoGenomics assay interrogated the largest number of SNPs but had the longest turnaround time. Four published warfarin-dosing algorithms (Washington University, UCSF, Louisville, and Newcastle) were compared for accuracy in predicting warfarin dose in a retrospective analysis of a local patient population on long-term, stable warfarin therapy. The predicted doses from both the Washington University and UCSF algorithms demonstrated the best correlation with actual warfarin doses.

  16. LBP-based penalized weighted least-squares approach to low-dose cone-beam computed tomography reconstruction

    NASA Astrophysics Data System (ADS)

    Ma, Ming; Wang, Huafeng; Liu, Yan; Zhang, Hao; Gu, Xianfeng; Liang, Zhengrong

    2014-03-01

    Cone-beam computed tomography (CBCT) has attracted growing interest of researchers in image reconstruction. In practical applications of CBCT, the mAs level of the X-ray tube current is lowered in order to reduce the CBCT dose. The lowering of the X-ray tube current, however, results in degradation of image quality. Thus, low-dose CBCT image reconstruction is in effect a noise problem. To acquire clinically acceptable image quality while keeping the X-ray tube current as low as achievable, some penalized weighted least-squares (PWLS)-based image reconstruction algorithms have been developed. One representative strategy in previous work is to model the prior information for solution regularization using an anisotropic penalty term. To enhance edge preservation and noise suppression at a finer scale, a novel algorithm combining the local binary pattern (LBP) with penalized weighted least-squares (PWLS), called the LBP-PWLS-based image reconstruction algorithm, is proposed in this work. The proposed LBP-PWLS-based algorithm adaptively encourages strong diffusion on local spot/flat regions around a voxel and less diffusion on edge/corner ones by adjusting the penalty in the cost function, after the LBP is utilized to classify the region around the voxel as spot, flat, or edge. The LBP-PWLS-based reconstruction algorithm was evaluated using sinogram data acquired by a clinical CT scanner from the CatPhan® 600 phantom. Experimental results on the noise-resolution tradeoff measurement and other quantitative measurements demonstrated its feasibility and effectiveness in edge preserving and noise suppressing in comparison with a previous PWLS reconstruction algorithm.
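
As an illustration of the LBP texture code that the proposed penalty relies on, here is a minimal 8-neighbour variant (a generic LBP sketch, not the paper's exact encoding or its spot/flat/edge classification rule):

```python
import numpy as np

def lbp_codes(img):
    """Basic 8-neighbour local binary pattern: each interior pixel gets
    an 8-bit code with one bit per neighbour whose value >= the centre."""
    c = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise from top-left
    code = np.zeros(c.shape, dtype=np.uint8)
    h, w = img.shape
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << np.uint8(bit)
    return code

flat = np.full((3, 3), 5.0)   # flat region: every neighbour >= centre
spot = flat.copy()
spot[1, 1] = 9.0              # bright spot: no neighbour >= centre
print(lbp_codes(flat)[0, 0], lbp_codes(spot)[0, 0])  # -> 255 0
```

Codes like 255 (flat) and 0 (spot) are exactly the kinds of local patterns such an algorithm can map to stronger or weaker diffusion.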

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Codel, G; Serin, E; Pacaci, P

    Purpose: In this study, the dosimetric accuracy of the Acuros XB and AAA algorithms was compared for small radiation fields incident on homogeneous and heterogeneous geometries. Methods: Small open fields of a Truebeam 2.0 unit (1×1, 2×2, 3×3, 4×4 fields) were used for this study. The fields were incident on a homogeneous phantom and an in-house phantom containing lung, air, and bone inhomogeneities. Using the same film batch, the net OD-to-dose calibration curve was obtained using the Truebeam 2.0 for 6 MV, 6 FFF, 10 MV, 10 FFF, and 15 MV energies by delivering 0–800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. The dosimetric accuracy of the Acuros XB and AAA algorithms in the presence of the inhomogeneities was compared against EBT3 film dosimetry. Results: Open field tests in a homogeneous phantom showed good agreement between the two algorithms and measurement. For Acuros XB, the minimum gamma analysis passing rates between measured and calculated dose distributions were 99.3% and 98.1% for homogeneous and inhomogeneous fields in the case of lung and bone, respectively. For AAA, the minimum gamma analysis passing rates were 99.1% and 96.5% for homogeneous and inhomogeneous fields, respectively, for all used energies and field sizes. In the case of the air heterogeneity, the differences were larger for both calculation algorithms. Overall, when compared to measurement, the Acuros XB had better agreement than AAA. Conclusion: The Acuros XB calculation algorithm in the TPS is an improvement over the existing AAA algorithm. Dose discrepancies were observed in the presence of air inhomogeneities.

  18. The TROPOMI surface UV algorithm

    NASA Astrophysics Data System (ADS)

    Lindfors, Anders V.; Kujanpää, Jukka; Kalakoski, Niilo; Heikkilä, Anu; Lakkala, Kaisa; Mielonen, Tero; Sneep, Maarten; Krotkov, Nickolay A.; Arola, Antti; Tamminen, Johanna

    2018-02-01

    The TROPOspheric Monitoring Instrument (TROPOMI) is the only payload of the Sentinel-5 Precursor (S5P), which is a polar-orbiting satellite mission of the European Space Agency (ESA). TROPOMI is a nadir-viewing spectrometer measuring in the ultraviolet, visible, near-infrared, and the shortwave infrared that provides near-global daily coverage. Among other things, TROPOMI measurements will be used for calculating the UV radiation reaching the Earth's surface. Thus, the TROPOMI surface UV product will contribute to the monitoring of UV radiation by providing daily information on the prevailing UV conditions over the globe. The TROPOMI UV algorithm builds on the heritage of the Ozone Monitoring Instrument (OMI) and the Satellite Application Facility for Atmospheric Composition and UV Radiation (AC SAF) algorithms. This paper provides a description of the algorithm that will be used for estimating surface UV radiation from TROPOMI observations. The TROPOMI surface UV product includes the following UV quantities: the UV irradiance at 305, 310, 324, and 380 nm; the erythemally weighted UV; and the vitamin-D weighted UV. Each of these are available as (i) daily dose or daily accumulated irradiance, (ii) overpass dose rate or irradiance, and (iii) local noon dose rate or irradiance. In addition, all quantities are available corresponding to actual cloud conditions and as clear-sky values, which otherwise correspond to the same conditions but assume a cloud-free atmosphere. This yields 36 UV parameters altogether. The TROPOMI UV algorithm has been tested using input based on OMI and the Global Ozone Monitoring Experiment-2 (GOME-2) satellite measurements. These preliminary results indicate that the algorithm is functioning according to expectations.
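
The 36 parameters are simply the product of the six UV quantities, three temporal forms, and two sky conditions listed above; a quick enumeration sketch (label strings are paraphrases of the product list, not official product names):

```python
from itertools import product

quantities = ["irradiance 305 nm", "irradiance 310 nm", "irradiance 324 nm",
              "irradiance 380 nm", "erythemally weighted", "vitamin-D weighted"]
forms = ["daily dose", "overpass dose rate", "local-noon dose rate"]
skies = ["actual clouds", "clear sky"]

params = [" / ".join(combo) for combo in product(quantities, forms, skies)]
print(len(params))  # 6 x 3 x 2 -> 36
```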

  19. Adaptive radiotherapy for NSCLC patients: utilizing the principle of energy conservation to evaluate dose mapping operations

    NASA Astrophysics Data System (ADS)

    Zhong, Hualiang; Chetty, Indrin J.

    2017-06-01

    Tumor regression during the course of fractionated radiotherapy confounds the ability to accurately estimate the total dose delivered to tumor targets. Here we present a new criterion to improve the accuracy of image intensity-based dose mapping operations for adaptive radiotherapy for patients with non-small cell lung cancer (NSCLC). Six NSCLC patients were retrospectively investigated in this study. An image intensity-based B-spline registration algorithm was used for deformable image registration (DIR) of weekly CBCT images to a reference image. The resultant displacement vector fields were employed to map the doses calculated on weekly images to the reference image. The concept of energy conservation was introduced as a criterion to evaluate the accuracy of the dose mapping operations. A finite element method (FEM)-based mechanical model was implemented to improve the performance of the B-spline-based registration algorithm in regions involving tumor regression. For the six patients, deformed tumor volumes changed by 21.2 ± 15.0% and 4.1 ± 3.7% on average for the B-spline and FEM-based registrations performed from fraction 1 to fraction 21, respectively. The energy deposited in the gross tumor volume (GTV) was 0.66 joules (J) per fraction on average. The energy derived from the fractional dose reconstructed by the B-spline and FEM-based DIR algorithms in the deformed GTVs was 0.51 J and 0.64 J, respectively. Based on landmark comparisons for the six patients, the mean error for the FEM-based DIR algorithm was 2.5 ± 1.9 mm. The cross-correlation coefficient between the landmark-measured displacement error and the loss of radiation energy was -0.16 for the FEM-based algorithm. To avoid uncertainties in measuring distorted landmarks, the B-spline-based registrations were compared to the FEM registrations; their displacement differences were 4.2 ± 4.7 mm on average.
The displacement differences were correlated with the relative loss of radiation energy, with a cross-correlation coefficient of 0.68. Based on the principle of energy conservation, the FEM-based mechanical model performs better than the B-spline-based DIR algorithm. It is recommended that the principle of energy conservation be incorporated into a comprehensive QA protocol for adaptive radiotherapy.
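
Since dose in gray is energy per unit mass (J/kg), the energy bookkeeping above reduces to a voxel-wise sum of dose times voxel mass. A sketch with made-up numbers (not the study's data):

```python
import numpy as np

def deposited_energy_joules(dose_gy, density_g_cm3, voxel_volume_cm3, mask):
    """Radiation energy deposited in a masked region:
    E = sum over voxels of dose [J/kg] times voxel mass [kg]."""
    mass_kg = density_g_cm3 * voxel_volume_cm3 * 1e-3   # g -> kg per voxel
    return float(np.sum(dose_gy * mass_kg * mask))

# Toy GTV: 100 cm^3 of unit-density tissue receiving 2 Gy uniformly
dose = np.full((10, 10, 10), 2.0)
rho = np.ones_like(dose)
mask = np.ones_like(dose, dtype=bool)
energy = deposited_energy_joules(dose, rho, voxel_volume_cm3=0.1, mask=mask)
print(energy)  # -> 0.2 J
```

Comparing this sum before and after a dose mapping operation is the energy-conservation check the criterion is built on.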

  20. Fully 3D refraction correction dosimetry system.

    PubMed

    Manjappa, Rakesh; Makki, S Sharath; Kumar, Rajesh; Vasu, Ram Mohan; Kanhirodan, Rajan

    2016-02-21

    The irradiation of selective regions in a polymer gel dosimeter results in an increase in optical density and refractive index (RI) at those regions. An optical tomography-based dosimeter depends on the rayline path through the dosimeter to estimate and reconstruct the dose distribution. The refraction of light passing through a dose region results in artefacts in the reconstructed images. These refraction errors are dependent on the scanning geometry and collection optics. We developed a fully 3D image reconstruction algorithm, algebraic reconstruction technique-refraction correction (ART-rc), that corrects for the refractive index mismatches present in a gel dosimeter scanner not only at the boundary, but also for any rayline refraction due to multiple dose regions inside the dosimeter. In this study, simulation and experimental studies have been carried out to reconstruct a 3D dose volume using 2D CCD measurements taken for various views. The study also focuses on the effectiveness of using different refractive-index matching media surrounding the gel dosimeter. Since the optical density is assumed to be low for a dosimeter, filtered backprojection is routinely used for reconstruction. We carry out the reconstructions using the conventional algebraic reconstruction technique (ART) and the refraction-corrected ART (ART-rc) algorithm. Reconstructions based on the FDK algorithm for cone-beam tomography have also been carried out for comparison. Line scanners and point detectors are used to obtain reconstructions plane by plane. Rays passing through a dose region with an RI mismatch do not reach the detector in the same plane, depending on the angle of incidence and the RI. In the fully 3D scanning setup using 2D array detectors, light rays that undergo refraction are still collected and hence can still be accounted for in the reconstruction algorithm.
It is found that, for the central region of the dosimeter, the usable radius using the ART-rc algorithm with water as the RI-matched medium is 71.8%, an increase of 6.4% compared to that achieved using the conventional ART algorithm. Smaller-diameter dosimeters are scanned with dry-air scanning by using a wide-angle lens that collects refracted light. The images reconstructed using cone-beam geometry are seen to deteriorate in some planes, as those regions are not scanned. Refraction correction is important and needs to be taken into consideration to achieve quantitatively accurate dose reconstructions. Refraction modeling is crucial in array-based scanners, as it is not possible to identify refracted rays in the sinogram space.
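
The ART family used in ART-rc updates the image one ray at a time (Kaczmarz sweeps); the refraction correction changes which voxels a ray traverses, not this update rule. A generic dense-matrix sketch on a toy system (illustrative, far from a production implementation):

```python
import numpy as np

def art_reconstruct(A, b, n_sweeps=2000, relax=1.0):
    """Algebraic reconstruction technique (Kaczmarz): repeatedly project
    the estimate onto each ray equation a_i . x = b_i in turn.
    A is the (n_rays x n_voxels) system matrix, b the measured ray sums."""
    x = np.zeros(A.shape[1])
    row_norm2 = (A * A).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norm2[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norm2[i] * A[i]
    return x

# Toy 2x2 "image" probed by row, column and one diagonal ray-sum
x_true = np.array([1.0, 2.0, 3.0, 4.0])
A = np.array([[1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 0],
              [0, 1, 0, 1], [1, 0, 0, 1]], dtype=float)
x = art_reconstruct(A, A @ x_true)
print(np.round(x, 3))  # converges to [1. 2. 3. 4.]
```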

  1. Technical Note: Dosimetric evaluation of Monte Carlo algorithm in iPlan for stereotactic ablative body radiotherapy (SABR) for lung cancer patients using RTOG 0813 parameters.

    PubMed

    Pokhrel, Damodar; Badkul, Rajeev; Jiang, Hongyu; Kumar, Pravesh; Wang, Fen

    2015-01-08

    For stereotactic ablative body radiotherapy (SABR) in lung cancer patients, Radiation Therapy Oncology Group (RTOG) protocols currently require radiation dose to be calculated using tissue heterogeneity corrections. Dosimetric criteria of RTOG 0813 were established based on the results obtained from non-Monte Carlo (MC) algorithms, such as superposition/convolutions. Clinically, MC-based algorithms are now routinely used for lung SABR dose calculations. It is essential to confirm that MC calculations in lung SABR meet RTOG guidelines. This report evaluates iPlan MC plans for SABR in lung cancer patients using dose-volume histogram normalization per current RTOG 0813 compliance criteria. Eighteen Stage I-II non-small cell lung cancer (NSCLC) patients with centrally located tumors, who underwent MC-based lung SABR with heterogeneity correction using the X-ray Voxel Monte Carlo (XVMC) algorithm (BrainLAB iPlan version 4.1.2), were analyzed. A total dose of 60 Gy in 5 fractions was delivered to the planning target volume (PTV) with at least V100% = 95%. Internal target volumes (ITVs) were delineated on maximum intensity projection (MIP) images of 4D CT scans. PTV (ITV + 5 mm margin) volumes ranged from 10.0 to 99.9 cc (mean = 36.8 ± 20.7 cc). Organs at risk (OARs) were delineated on average images of 4D CT scans. Optimal clinical MC SABR plans were generated using a combination of non-coplanar conformal arcs and beams for a Novalis-TX equipped with high-definition multileaf collimators (MLCs) and a 6 MV-SRS (1000 MU/min) mode. All plans were evaluated using the RTOG 0813 high and intermediate dose spillage criteria: conformity index (R100%), ratio of 50% isodose volume to the PTV (R50%), maximum dose 2 cm away from the PTV in any direction (D2 cm), and percent of normal lung receiving 20 Gy (V20) or more.
Other organ-at-risk (OAR) doses were tabulated, including the volume of normal lung receiving 5 Gy (V5), maximum cord dose, dose to < 15 cc of heart, and dose to < 5 cc of esophagus. Only six out of 18 patients met all RTOG 0813 compliance criteria. Eight of 18 patients had minor deviations in R100%, four in R50%, and nine in D2 cm. However, only one patient had a minor deviation in V20. All other OAR doses, such as maximum cord dose, dose to < 15 cc of heart, and dose to < 5 cc of esophagus, were satisfactory per RTOG criteria, except for one patient, for whom the dose to < 15 cc of heart was higher than RTOG guidelines. The preliminary results for our limited iPlan XVMC dose calculations indicate that the majority (i.e., 2/3) of our patients had minor deviations from the dosimetric guidelines set by the RTOG 0813 protocol in one way or another. When exclusively using a highly sophisticated XVMC algorithm, the RTOG 0813 dosimetric compliance criteria such as R100% and D2 cm may need to be revisited. Based on our limited number of patient datasets, corrections of about 6% for R100% and 9% for D2 cm could, in general, be applied to pass the RTOG 0813 compliance criteria in most of those patients. More patient plans need to be evaluated to make a recommendation for R50%. No adjustment is necessary for OAR dose tolerances, including normal lung V20. In order to establish new MC-specific dose parameters, further investigation with a large cohort of patients including central, as well as peripheral, lung tumors is anticipated and strongly recommended.
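
The compliance metrics above can be computed directly from the dose grid and structure masks. A hedged sketch with a flattened toy geometry (names and numbers are illustrative, not RTOG data):

```python
import numpy as np

def sbrt_spillage_metrics(dose, ptv_mask, lung_mask, voxel_cc, rx_gy=60.0):
    """High/intermediate dose spillage in the style of RTOG 0813:
    R100 = V(100% isodose)/PTV, R50 = V(50% isodose)/PTV,
    lung V20 = % of normal lung receiving >= 20 Gy."""
    ptv_cc = ptv_mask.sum() * voxel_cc
    r100 = (dose >= rx_gy).sum() * voxel_cc / ptv_cc
    r50 = (dose >= 0.5 * rx_gy).sum() * voxel_cc / ptv_cc
    v20 = 100.0 * (lung_mask & (dose >= 20.0)).sum() / lung_mask.sum()
    return r100, r50, v20

# Toy flattened geometry: an 8-voxel PTV at 60 Gy plus spillage outside it
dose = np.array([60.0] * 8 + [60.0] * 2 + [30.0] * 22 + [20.0] * 8 + [5.0] * 24)
ptv = np.array([True] * 8 + [False] * 56)
lung = ~ptv
r100, r50, v20 = sbrt_spillage_metrics(dose, ptv, lung, voxel_cc=1.0)
print(r100, r50)  # -> 1.25 4.0
```

D2 cm would additionally need a distance transform from the PTV surface, which is omitted here.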

  2. Selection of the initial design for the two-stage continual reassessment method.

    PubMed

    Jia, Xiaoyu; Ivanova, Anastasia; Lee, Shing M

    2017-01-01

    In the two-stage continual reassessment method (CRM), model-based dose escalation is preceded by a pre-specified escalating sequence starting from the lowest dose level. This is appealing to clinicians because it allows a sufficient number of patients to be assigned to each of the lower dose levels before escalating to higher dose levels. While a theoretical framework to build the two-stage CRM has been proposed, the selection of the initial dose-escalation sequence, generally referred to as the initial design, remains arbitrary, either by specifying cohorts of three patients or by trial and error through extensive simulations. Motivated by a currently ongoing oncology dose-finding study for which clinicians explicitly stated their desire to assign at least one patient to each of the lower dose levels, we proposed a systematic approach for selecting the initial design for the two-stage CRM. The initial design obtained using the proposed algorithm yields better operating characteristics than an initial design with cohorts of three used with a calibrated CRM. The proposed algorithm simplifies the selection of the initial design for the two-stage CRM. Moreover, initial designs to be used as a reference for planning a two-stage CRM are provided.
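
After the initial escalating sequence, the second stage is the model-based CRM itself. A compact sketch of a one-parameter power-model CRM update with a normal prior evaluated on a grid (the skeleton, prior SD, and target are illustrative choices, not this paper's calibration):

```python
import numpy as np

def crm_recommend(skeleton, doses_given, tox, target=0.25, sigma=1.34):
    """One-parameter power-model CRM: p_i(a) = skeleton_i ** exp(a).
    The posterior over a is computed on a grid with a N(0, sigma^2) prior;
    recommend the dose whose posterior-mean toxicity is closest to target."""
    a = np.linspace(-4, 4, 801)
    prior = np.exp(-a ** 2 / (2 * sigma ** 2))
    p = np.array(skeleton)[:, None] ** np.exp(a)     # (n_dose, n_grid)
    like = np.ones_like(a)
    for d, y in zip(doses_given, tox):               # binary toxicity outcomes
        like *= p[d] if y else (1 - p[d])
    post = prior * like
    post /= post.sum()
    p_hat = (p * post).sum(axis=1)                   # posterior-mean tox probs
    return int(np.argmin(np.abs(p_hat - target))), p_hat

skeleton = [0.05, 0.12, 0.25, 0.40, 0.55]
rec, p_hat = crm_recommend(skeleton, doses_given=[0, 1, 2], tox=[0, 0, 0])
print(rec, np.round(p_hat, 3))
```

The initial design the paper studies decides which `doses_given` sequence accrues before this model takes over.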

  3. Improvement of dose calculation in radiation therapy due to metal artifact correction using the augmented likelihood image reconstruction.

    PubMed

    Ziemann, Christian; Stille, Maik; Cremers, Florian; Buzug, Thorsten M; Rades, Dirk

    2018-04-17

    Metal artifacts caused by high-density implants lead to incorrectly reconstructed Hounsfield units in computed tomography images. This can result in a loss of accuracy in dose calculation in radiation therapy. This study investigates the potential of the metal artifact reduction algorithms Augmented Likelihood Image Reconstruction and linear interpolation to improve dose calculation in the presence of metal artifacts. In order to simulate a pelvis with a double-sided total endoprosthesis, a polymethylmethacrylate phantom was equipped with two steel bars. Artifacts were reduced by applying the Augmented Likelihood Image Reconstruction, a linear interpolation, and a manual correction approach. Using the treatment planning system Eclipse™, identical planning target volumes for an idealized prostate as well as structures for bladder and rectum were defined in corrected and non-corrected images. Volumetric modulated arc therapy plans were created with double-arc rotations, with and without avoidance sectors that mask out the prosthesis. The irradiation plans were analyzed for variations in the dose distribution and its homogeneity. Dosimetric measurements were performed using isocentrically positioned ionization chambers. Irradiation plans based on images containing artifacts led to a dose error in the isocenter of up to 8.4%. Corrections with the Augmented Likelihood Image Reconstruction reduced this dose error to 2.7%, corrections with linear interpolation to 3.2%, and manual artifact correction to 4.1%. When applying artifact correction, the dose homogeneity was slightly improved for all investigated methods. Furthermore, the calculated mean doses for rectum and bladder are higher if avoidance sectors are applied. Streaking artifacts cause imprecise dose calculation within irradiation plans. Using a metal artifact correction algorithm, the planning accuracy can be significantly improved.
Best results were accomplished using the Augmented Likelihood Image Reconstruction algorithm. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
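
Of the two correction approaches compared above, the linear-interpolation method is easy to sketch: in sinogram space, detector bins crossing the metal trace are replaced by interpolation from their unaffected neighbours (toy arrays; the Augmented Likelihood Image Reconstruction is a full iterative reconstruction and is not sketched here):

```python
import numpy as np

def linear_mar(sinogram, metal_trace):
    """Linear-interpolation metal artifact reduction: in every projection
    (row), detector bins flagged as metal are replaced by values linearly
    interpolated from the nearest unaffected bins."""
    out = sinogram.copy()
    bins = np.arange(sinogram.shape[1])
    for i in range(sinogram.shape[0]):
        bad = metal_trace[i]
        if bad.any():
            out[i, bad] = np.interp(bins[bad], bins[~bad], sinogram[i, ~bad])
    return out

sino = np.tile(np.linspace(0.0, 10.0, 11), (3, 1))  # toy ramp projections
trace = np.zeros(sino.shape, dtype=bool)
trace[:, 4:7] = True                                # bins shadowed by metal
corrupted = sino.copy()
corrupted[trace] = 50.0                             # bright metal trace
restored = linear_mar(corrupted, trace)
print(restored[0])  # the linear ramp is recovered exactly in this toy case
```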

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, M; Lee, V; Leung, R

    Purpose: Investigating the relative sensitivity of Monte Carlo (MC) and Pencil Beam (PB) dose calculation algorithms to low-Z (titanium) metallic artifacts is important for accurate and consistent dose reporting in postoperative spinal RS. Methods: Sensitivity analysis of the MC and PB dose calculation algorithms in the Monaco v.3.3 treatment planning system (Elekta CMS, Maryland Heights, MO, USA) was performed using CT images reconstructed without (plain) and with Orthopedic Metal Artifact Reduction (OMAR; Philips Healthcare, Cleveland, OH, USA). 6 MV and 10 MV volumetric-modulated arc (VMAT) RS plans were obtained for MC and PB on the plain and OMAR images (MC-plain/OMAR and PB-plain/OMAR). Results: Maximum differences in dose to 0.2 cc (D0.2cc) of the spinal cord and cord +2 mm for the 6 MV and 10 MV VMAT plans were 0.1 Gy, both between MC-OMAR and MC-plain and between PB-OMAR and PB-plain. Planning target volume (PTV) dose coverage changed by 0.1±0.7% and 0.2±0.3% for 6 MV and 10 MV from MC-OMAR to MC-plain, and by 0.1±0.1% for both 6 MV and 10 MV from PB-OMAR to PB-plain, respectively. In no case, for either MC or PB, did the D0.2cc to the spinal cord exceed the planned tolerance when changing from OMAR to plain CT in the dose calculations. Conclusion: The dosimetric impact of metallic artifacts caused by low-Z metallic spinal hardware (mainly titanium alloy) is not clinically important in VMAT-based spine RS, without significant dependence on the dose calculation method (MC or PB) for photon energies ≥ 6 MV. There is no need to use one algorithm instead of the other to reduce uncertainty in dose reporting. The dose calculation method used in spine RS should be consistent with usual clinical practice.
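
The cord constraint above is a near-maximum dose-volume point, D0.2cc. A small sketch of how such a D_vol statistic can be read off voxel doses (the function name and toy values are illustrative):

```python
import numpy as np

def dose_to_hottest_volume(dose, mask, voxel_cc, vol_cc=0.2):
    """D_vol: the minimum dose within the hottest vol_cc of a structure,
    e.g. D0.2cc of the spinal cord."""
    n = max(1, int(np.ceil(vol_cc / voxel_cc)))
    hottest = np.sort(dose[mask])[::-1]
    return hottest[min(n, hottest.size) - 1]

cord_dose = np.array([10.0, 12.0, 14.0, 15.0, 9.0])  # voxel doses (Gy)
cord_mask = np.ones(5, dtype=bool)
d02cc = dose_to_hottest_volume(cord_dose, cord_mask, voxel_cc=0.1)
print(d02cc)  # -> 14.0 (the 2 hottest voxels cover 0.2 cc)
```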

  5. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawai, D; Takahashi, R; Kamima, T

    2015-06-15

    Purpose: The accuracy of dose distributions depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans shown in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average ± 2SD) of the dose differences between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC), and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs from the CLs were evaluated. A Clarkson-based system shows a large systematic variation because of the inhomogeneity correction. The AAA showed a significant variation. Thus, we must consider the difference in inhomogeneity correction as well as the dependence on the dose calculation engine.
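
A Clarkson-style secondary check like the SMU needs the radiological (water-equivalent) path length along each beam, i.e. relative density integrated over the geometric steps from the CT. A one-function numpy sketch with made-up densities:

```python
import numpy as np

def radiological_path_length(rel_density, step_mm):
    """Water-equivalent depth along a ray: sum of relative (to water)
    density times the geometric step length for each traversed voxel."""
    return float(np.sum(np.asarray(rel_density) * step_mm))

# 30 mm of chest wall (density ~ 1.0) followed by 50 mm of lung (~ 0.25)
ray = [1.0] * 30 + [0.25] * 50
depth = radiological_path_length(ray, step_mm=1.0)
print(depth)  # -> 42.5 mm water-equivalent
```

The gap between this water-equivalent depth and the 80 mm geometric depth is exactly where inhomogeneity corrections diverge between algorithms.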

  6. Prediction of Warfarin Dose Reductions in Puerto Rican Patients, Based on Combinatorial CYP2C9 and VKORC1 Genotypes

    PubMed Central

    Valentin, Isa Ivette; Vazquez, Joan; Rivera-Miranda, Giselle; Seip, Richard L; Velez, Meredith; Kocherla, Mohan; Bogaard, Kali; Cruz-Gonzalez, Iadelisse; Cadilla, Carmen L; Renta, Jessica Y; Felliu, Juan F; Ramos, Alga S; Alejandro-Cowan, Yirelia; Gorowski, Krystyna; Ruaño, Gualberto; Duconge, Jorge

    2012-01-01

    BACKGROUND The influence of CYP2C9 and VKORC1 polymorphisms on warfarin dose has been investigated in white, Asian, and African American populations but not in Puerto Rican Hispanic patients. OBJECTIVE To test the associations between genotypes, international normalized ratio (INR) measurements, and warfarin dosing and gauge the impact of these polymorphisms on warfarin dose, using a published algorithm. METHODS A retrospective warfarin pharmacogenetic association study in 106 Puerto Rican patients was performed. DNA samples from patients were assayed for 12 variants in both CYP2C9 and VKORC1 loci by HILOmet PhyzioType assay. Demographic and clinical nongenetic data were retrospectively collected from medical records. Allele and genotype frequencies were determined and Hardy-Weinberg equilibrium (HWE) was tested. RESULTS Sixty-nine percent of patients were carriers of at least one polymorphism in either the CYP2C9 or the VKORC1 gene. Double, triple, and quadruple carriers accounted for 22%, 5%, and 1%, respectively. No significant departure from HWE was found. Among patients with a given CYP2C9 genotype, warfarin dose requirements declined from GG to AA haplotypes; whereas, within each VKORC1 haplotype, the dose decreased as the number of CYP2C9 variants increased. The presence of these loss-of-function alleles was associated with more out-of-range INR measurements (OR = 1.38) but not with significant INR >4 during the initiation phase. Analyses based on a published pharmacogenetic algorithm predicted dose reductions of up to 4.9 mg/day in carriers and provided better dose prediction in an extreme subgroup of highly sensitive patients, but also suggested the need to improve predictability by developing a customized model for use in Puerto Rican patients. 
CONCLUSIONS This study laid important groundwork for supporting a prospective pharmacogenetic trial in Puerto Ricans to detect the benefits of incorporating relevant genomic information into a customized DNA-guided warfarin dosing algorithm. PMID:22274142

  7. Prediction of warfarin dose reductions in Puerto Rican patients, based on combinatorial CYP2C9 and VKORC1 genotypes.

    PubMed

    Valentin, Isa Ivette; Vazquez, Joan; Rivera-Miranda, Giselle; Seip, Richard L; Velez, Meredith; Kocherla, Mohan; Bogaard, Kali; Cruz-Gonzalez, Iadelisse; Cadilla, Carmen L; Renta, Jessica Y; Feliu, Juan F; Ramos, Alga S; Alejandro-Cowan, Yirelia; Gorowski, Krystyna; Ruaño, Gualberto; Duconge, Jorge

    2012-02-01

    The influence of CYP2C9 and VKORC1 polymorphisms on warfarin dose has been investigated in white, Asian, and African American populations but not in Puerto Rican Hispanic patients. To test the associations between genotypes, international normalized ratio (INR) measurements, and warfarin dosing and gauge the impact of these polymorphisms on warfarin dose, using a published algorithm. A retrospective warfarin pharmacogenetic association study in 106 Puerto Rican patients was performed. DNA samples from patients were assayed for 12 variants in both CYP2C9 and VKORC1 loci by HILOmet PhyzioType assay. Demographic and clinical nongenetic data were retrospectively collected from medical records. Allele and genotype frequencies were determined and Hardy-Weinberg equilibrium (HWE) was tested. Sixty-nine percent of patients were carriers of at least one polymorphism in either the CYP2C9 or the VKORC1 gene. Double, triple, and quadruple carriers accounted for 22%, 5%, and 1%, respectively. No significant departure from HWE was found. Among patients with a given CYP2C9 genotype, warfarin dose requirements declined from GG to AA haplotypes; whereas, within each VKORC1 haplotype, the dose decreased as the number of CYP2C9 variants increased. The presence of these loss-of-function alleles was associated with more out-of-range INR measurements (OR = 1.38) but not with significant INR >4 during the initiation phase. Analyses based on a published pharmacogenetic algorithm predicted dose reductions of up to 4.9 mg/day in carriers and provided better dose prediction in an extreme subgroup of highly sensitive patients, but also suggested the need to improve predictability by developing a customized model for use in Puerto Rican patients. This study laid important groundwork for supporting a prospective pharmacogenetic trial in Puerto Ricans to detect the benefits of incorporating relevant genomic information into a customized DNA-guided warfarin dosing algorithm.
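Pharmacogenetic dosing algorithms of the kind applied in this study are typically linear regression models, often on the square root of the dose, with penalties for each variant allele. The sketch below is a generic illustration with hypothetical coefficients, not the published algorithm or its fitted values:

```python
# Sketch of a generic pharmacogenetic dosing model of the general form
# used in such studies. The coefficients below are hypothetical
# placeholders, NOT the published Puerto Rican or IWPC values.

def predicted_weekly_dose(age_decades, vkorc1_variants, cyp2c9_variants):
    """Square-root-of-dose linear model: more variant alleles -> lower dose."""
    sqrt_dose = (
        5.6                      # intercept (hypothetical)
        - 0.26 * age_decades     # dose falls with age
        - 0.9 * vkorc1_variants  # 0, 1, or 2 VKORC1 -1639A alleles
        - 0.5 * cyp2c9_variants  # number of CYP2C9 *2/*3 alleles
    )
    return max(sqrt_dose, 0.0) ** 2  # mg/week

# A carrier of variant alleles is predicted a lower dose than a non-carrier:
print(predicted_weekly_dose(6, 0, 0))  # non-carrier
print(predicted_weekly_dose(6, 1, 1))  # double carrier
```

This structure makes the combinatorial effect described in the abstract explicit: within a fixed VKORC1 term, each additional CYP2C9 variant lowers the predicted dose.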

  8. Delivery confirmation of bolus electron conformal therapy combined with intensity modulated x-ray therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kavanaugh, James A.; Hogstrom, Kenneth R.; Fontenot, Jonas P.

    2013-02-15

    Purpose: The purpose of this study was to demonstrate that a bolus electron conformal therapy (ECT) dose plan and a mixed beam plan, composed of an intensity modulated x-ray therapy (IMXT) dose plan optimized on top of the bolus ECT plan, can be accurately delivered. Methods: Calculated dose distributions were compared with measured dose distributions for parotid and chest wall (CW) bolus ECT and mixed beam plans, each simulated in a cylindrical polystyrene phantom that allowed film dose measurements. Bolus ECT plans were created for both parotid and CW PTVs (planning target volumes) using 20 and 16 MeV beams, respectively, whose 90% dose surface conformed to the PTV. Mixed beam plans consisted of an IMXT dose plan optimized on top of the bolus ECT dose plan. The bolus ECT, IMXT, and mixed beam dose distributions were measured using radiographic films in five transverse planes and one sagittal plane for a total of 36 measurement conditions. Corrections for film dose response, effects of edge-on photon irradiation, and effects of irregular phantom optical properties on the Cerenkov component of the film signal resulted in high precision measurements. Data set consistency was verified by agreement of depth dose at the intersections of the sagittal plane with the five measured transverse planes. For these same depth doses, results for the mixed beam plan agreed with the sum of the individual depth doses for the bolus ECT and IMXT plans. The six mean measured planar dose distributions were compared with those calculated by the treatment planning system for all modalities. Dose agreement was assessed using the 4% dose difference and 0.2 cm distance to agreement. Results: For the combined high-dose region and low-dose region, pass rates for the parotid and CW plans were 98.7% and 96.2%, respectively, for the bolus ECT plans and 97.9% and 97.4%, respectively, for the mixed beam plans. 
For the high-dose gradient region, pass rates for the parotid and CW plans were 93.1% and 94.62%, respectively, for the bolus ECT plans and 89.2% and 95.1%, respectively, for the mixed beam plans. For all regions, pass rates for the parotid and CW plans were 98.8% and 97.3%, respectively, for the bolus ECT plans and 97.5% and 95.9%, respectively, for the mixed beam plans. For the IMXT component of the mixed beam plans, pass rates for the parotid and CW plans were 93.7% and 95.8%. Conclusions: Bolus ECT and mixed beam therapy dose delivery to the phantom were more accurate than IMXT delivery, adding confidence to the use of planning, fabrication, and delivery for bolus ECT tools either alone or as part of mixed beam therapy. The methodology reported in this work could serve as a basis for future standardization of the commissioning of bolus ECT or mixed beam therapy. When applying this technology to patients, it is recommended that an electron dose algorithm more accurate than the pencil beam algorithm, e.g., a Monte Carlo algorithm or analytical transport such as the pencil beam redefinition algorithm, be used for planning to ensure the desired accuracy.

  9. Clinical Implications of TiGRT Algorithm for External Audit in Radiation Oncology.

    PubMed

    Shahbazi-Gahrouei, Daryoush; Saeb, Mohsen; Monadi, Shahram; Jabbari, Iraj

    2017-01-01

    Performing audits plays an important role in the quality assurance program in radiation oncology. Among different algorithms, TiGRT is one of the commonly used software applications for dose calculation. This study aimed to assess the clinical implications of the TiGRT algorithm by measuring doses and comparing them to the calculated doses delivered to the patients for a variety of cases, with and without the presence of inhomogeneities and beam modifiers. A nonhomogeneous phantom as quality dose verification phantom, Farmer ionization chambers, and a PC-electrometer (Sun Nuclear, USA) as a reference class electrometer were employed throughout the audit on linear accelerators at 6 and 18 MV energies (Siemens ONCOR Impression Plus, Germany). Seven test cases were performed using a semi CIRS phantom. In homogeneous regions and simple plans for both energies, there was good agreement between measured and treatment planning system calculated doses. Their relative error was found to be between 0.8% and 3%, which is acceptable for an audit, but in nonhomogeneous organs, such as lung, a few errors were observed. In complex treatment plans, when a wedge or shield was placed in the beam path, the error was within the accepted criteria. In complex beam plans, the difference between measured and calculated dose was found to be 2%-3%. All differences were obtained between 0.4% and 1%. Good consistency was observed for the same type of energy in the homogeneous and nonhomogeneous phantom for three-dimensional conformal fields with wedge, shield, and asymmetry using the TiGRT treatment planning software in the studied center. The results revealed that the national status of TPS calculations and dose delivery for 3D conformal radiotherapy was globally within acceptable standards with no major causes for concern.

  10. SU-F-SPS-10: The Dosimetric Comparison of GammaKnife and Cyberknife Treatment Plans for Brain SRS Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanli, E; Mabhouti, H; Cebe, M

    Purpose: Brain stereotactic radiosurgery (SRS) involves the use of precisely directed, single-session radiation to create a desired radiobiologic response within the brain target with acceptable minimal effects on surrounding structures or tissues. In this study, a dosimetric comparison of GammaKnife Perfexion and Cyberknife M6 treatment plans was made. Methods: Treatment planning was done for the GammaKnife Perfexion unit using the Gammaplan treatment planning system (TPS) on the CT scan of a head-and-neck RANDO phantom, simulating stereotactic treatment of a single brain metastasis. The dose distribution was calculated using the TMR 10 algorithm. Treatment planning for the same target was also done for the Cyberknife M6 machine using the Multiplan TPS with a Monte Carlo algorithm. Using the same film batch, the net OD to dose calibration curve was obtained on both machines by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. Dose distributions were measured using EBT3 film dosimeters. The measured and calculated doses were compared. Results: The dose distributions in the target and 2 cm beyond the target edge were calculated on the TPSs and measured using EBT3 film. For the Cyberknife treatment plans, the gamma analysis passing rates between measured and calculated dose distributions were 99.2% and 96.7% for the target and the peripheral region of the target, respectively. For the GammaKnife treatment plans, the gamma analysis passing rates were 98.9% and 93.2% for the target and the peripheral region of the target, respectively. Conclusion: The study shows that dosimetrically comparable plans are achievable with Cyberknife and GammaKnife. Although the TMR 10 algorithm predicts the target dose.
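The gamma analysis passing rates quoted above come from the standard gamma-index comparison of measured and calculated dose distributions. Below is a minimal 1-D sketch of that metric; real film analyses are 2-D, and the criteria and dose profiles here are purely illustrative:

```python
# Minimal 1-D gamma-index sketch illustrating the pass-rate metric
# reported in film dosimetry studies. Toy profiles and criteria only;
# actual film analyses operate on 2-D dose planes.

import math

def gamma_pass_rate(measured, calculated, dx=1.0, dose_tol=0.03, dist_tol=1.0):
    """Fraction of measured points with gamma <= 1 (global dose normalization)."""
    d_max = max(calculated)
    passed = 0
    for i, dm in enumerate(measured):
        # Search all calculated points for the minimum combined
        # dose-difference / distance-to-agreement metric.
        gamma_sq = min(
            ((j - i) * dx / dist_tol) ** 2
            + ((dc - dm) / (dose_tol * d_max)) ** 2
            for j, dc in enumerate(calculated)
        )
        if math.sqrt(gamma_sq) <= 1.0:
            passed += 1
    return passed / len(measured)

measured = [10.0, 50.0, 100.0, 50.0, 10.0]     # hypothetical film profile
calculated = [10.5, 52.0, 99.0, 49.0, 10.2]    # hypothetical TPS profile
print(f"pass rate: {gamma_pass_rate(measured, calculated):.0%}")
```

A passing rate is then simply the percentage of evaluated points with gamma at or below one for the chosen criteria (e.g. 3%/1 mm).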

  11. Achieving Routine Submillisievert CT Scanning: Report from the Summit on Management of Radiation Dose in CT

    PubMed Central

    Chen, Guang Hong; Kalender, Willi; Leng, Shuai; Samei, Ehsan; Taguchi, Katsuyuki; Wang, Ge; Yu, Lifeng; Pettigrew, Roderic I.

    2012-01-01

    This Special Report presents the consensus of the Summit on Management of Radiation Dose in Computed Tomography (CT) (held in February 2011), which brought together participants from academia, clinical practice, industry, and regulatory and funding agencies to identify the steps required to reduce the effective dose from routine CT examinations to less than 1 mSv. The most promising technologies and methods discussed at the summit include innovations and developments in x-ray sources; detectors; and image reconstruction, noise reduction, and postprocessing algorithms. Access to raw projection data and standard data sets for algorithm validation and optimization is a clear need, as is the need for new, clinically relevant metrics of image quality and diagnostic performance. Current commercially available techniques such as automatic exposure control, optimization of tube potential, beam-shaping filters, and dynamic z-axis collimators are important, and education to successfully implement these methods routinely is critically needed. Other methods that are just becoming widely available, such as iterative reconstruction, noise reduction, and postprocessing algorithms, will also have an important role. Together, these existing techniques can reduce dose by a factor of two to four. Technical advances that show considerable promise for additional dose reduction but are several years or more from commercial availability include compressed sensing, volume of interest and interior tomography techniques, and photon-counting detectors. This report offers a strategic roadmap for the CT user and research and manufacturer communities toward routinely achieving effective doses of less than 1 mSv, which is well below the average annual dose from naturally occurring sources of radiation. © RSNA, 2012 PMID:22692035

  12. Clinical Implications of TiGRT Algorithm for External Audit in Radiation Oncology

    PubMed Central

    Shahbazi-Gahrouei, Daryoush; Saeb, Mohsen; Monadi, Shahram; Jabbari, Iraj

    2017-01-01

    Background: Performing audits plays an important role in the quality assurance program in radiation oncology. Among different algorithms, TiGRT is one of the commonly used software applications for dose calculation. This study aimed to assess the clinical implications of the TiGRT algorithm by measuring doses and comparing them to the calculated doses delivered to the patients for a variety of cases, with and without the presence of inhomogeneities and beam modifiers. Materials and Methods: A nonhomogeneous phantom as quality dose verification phantom, Farmer ionization chambers, and a PC-electrometer (Sun Nuclear, USA) as a reference class electrometer were employed throughout the audit on linear accelerators at 6 and 18 MV energies (Siemens ONCOR Impression Plus, Germany). Seven test cases were performed using a semi CIRS phantom. Results: In homogeneous regions and simple plans for both energies, there was good agreement between measured and treatment planning system calculated doses. Their relative error was found to be between 0.8% and 3%, which is acceptable for an audit, but in nonhomogeneous organs, such as lung, a few errors were observed. In complex treatment plans, when a wedge or shield was placed in the beam path, the error was within the accepted criteria. In complex beam plans, the difference between measured and calculated dose was found to be 2%–3%. All differences were obtained between 0.4% and 1%. Conclusions: Good consistency was observed for the same type of energy in the homogeneous and nonhomogeneous phantom for three-dimensional conformal fields with wedge, shield, and asymmetry using the TiGRT treatment planning software in the studied center. The results revealed that the national status of TPS calculations and dose delivery for 3D conformal radiotherapy was globally within acceptable standards with no major causes for concern. PMID:28989910

  13. Cost-effectiveness of pharmacogenetics-guided warfarin therapy vs. alternative anticoagulation in atrial fibrillation.

    PubMed

    Pink, J; Pirmohamed, M; Lane, S; Hughes, D A

    2014-02-01

    Pharmacogenetics-guided warfarin dosing is an alternative to standard clinical algorithms and new oral anticoagulants for patients with nonvalvular atrial fibrillation. However, clinical evidence for pharmacogenetics-guided warfarin dosing is limited to intermediary outcomes, and consequently, there is a lack of information on the cost-effectiveness of anticoagulation treatment options. A clinical trial simulation of S-warfarin was used to predict times within therapeutic range for different dosing algorithms. Relative risks of clinical events, obtained from a meta-analysis of trials linking times within therapeutic range with outcomes, served as inputs to an economic analysis. Neither dabigatran nor rivaroxaban was a cost-effective option. Along the cost-effectiveness frontier, in relation to clinically dosed warfarin, pharmacogenetics-guided warfarin and apixaban had incremental cost-effectiveness ratios of £13,226 and £20,671 per quality-adjusted life year gained, respectively. On the basis of our simulations, apixaban appears to be the most cost-effective treatment.
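The incremental cost-effectiveness ratios reported above (cost per quality-adjusted life year gained relative to clinically dosed warfarin) follow the standard ICER definition. The sketch below uses hypothetical costs and QALYs, not the study's inputs:

```python
# Standard incremental cost-effectiveness ratio (ICER) calculation of the
# kind underlying figures like GBP 13,226 per QALY gained. The cost and
# QALY inputs below are hypothetical, not the study's data.

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost per QALY gained versus the reference strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical strategy vs. clinically dosed warfarin as reference:
ratio = icer(cost_new=3200.0, qaly_new=6.30, cost_ref=2900.0, qaly_ref=6.27)
print(f"ICER: GBP {ratio:,.0f} per QALY gained")
```

A strategy is then typically judged cost-effective if its ICER falls below the applicable willingness-to-pay threshold (commonly GBP 20,000-30,000 per QALY in UK analyses).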

  14. Evaluation of the usefulness of a MOSFET detector in an anthropomorphic phantom for 6-MV photon beam.

    PubMed

    Kohno, Ryosuke; Hirano, Eriko; Kitou, Satoshi; Goka, Tomonori; Matsubara, Kana; Kameoka, Satoru; Matsuura, Taeko; Ariji, Takaki; Nishio, Teiji; Kawashima, Mitsuhiko; Ogino, Takashi

    2010-07-01

    In order to evaluate the usefulness of a metal oxide-silicon field-effect transistor (MOSFET) detector as an in vivo dosimeter, we performed in vivo dosimetry using the MOSFET detector with an anthropomorphic phantom. We used the RANDO phantom as an anthropomorphic phantom, and dose measurements were carried out in the abdominal, thoracic, and head and neck regions for simple square field sizes of 10 x 10, 5 x 5, and 3 x 3 cm² with a 6-MV photon beam. The dose measured by the MOSFET detector was verified by the dose calculations of the superposition (SP) algorithm in the XiO radiotherapy treatment-planning system. In most cases, the measured doses agreed with the results of the SP algorithm within +/-3%. Our results demonstrated the utility of the MOSFET detector for in vivo dosimetry even in the presence of clinical tissue inhomogeneities.

  15. Warfarin therapy: in need of improvement after all these years

    PubMed Central

    Kimmel, Stephen E

    2010-01-01

    Background Warfarin therapy has been used clinically for over 60 years, yet continues to be problematic because of its narrow therapeutic index and large inter-individual variability in patient response. As a result, warfarin is a leading cause of serious medication-related adverse events, and its efficacy is also suboptimal. Objective To review factors that are responsible for variable response to warfarin, including clinical, environmental, and genetic factors, and to explore some possible approaches to improving warfarin therapy. Results Recent efforts have focused on developing dosing algorithms that included genetic information to try to improve warfarin dosing. These dosing algorithms hold promise, but have not been fully validated or tested in rigorous clinical trials. Perhaps equally importantly, adherence to warfarin is a major problem that should be addressed with innovative and cost-effective interventions. Conclusion Additional research is needed to further test whether interventions can be used to improve warfarin dosing and outcomes. PMID:18345947

  16. Characterization of differences in calculated and actual measured skin doses to canine limbs during stereotactic radiosurgery using Gafchromic film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walters, Jerri; Colorado State University, Fort Collins, CO; Ryan, Stewart

    Accurate calculation of absorbed dose to the skin, especially the superficial and radiosensitive basal cell layer, is difficult for many reasons including, but not limited to, the build-up effect of megavoltage photons, tangential beam effects, mixed energy scatter from support devices, and dose interpolation caused by a finite resolution calculation matrix. Stereotactic body radiotherapy (SBRT) has been developed as an alternative limb salvage treatment option at Colorado State University Veterinary Teaching Hospital for dogs with extremity bone tumors. Optimal dose delivery to the tumor during SBRT treatment can be limited by uncertainty in skin dose calculation. The aim of this study was to characterize the difference between measured and calculated radiation dose by the Varian Eclipse (Varian Medical Systems, Palo Alto, CA) AAA treatment planning algorithm (for 1-mm, 2-mm, and 5-mm calculation voxel dimensions) as a function of distance from the skin surface. The study used Gafchromic EBT film (International Specialty Products, Wayne, NJ), FilmQA analysis software, a limb phantom constructed from Plastic Water™ (Fluke Biomedical, Everett, WA) and a canine cadaver forelimb. The limb phantom was exposed to 6-MV treatments consisting of a single-beam, a pair of parallel opposed beams, and a 7-beam coplanar treatment plan. The canine forelimb was exposed to the 7-beam coplanar plan. Radiation dose to the forelimb skin at the surface and at depths of 1.65 mm and 1.35 mm below the skin surface were also measured with the Gafchromic film. The calculation algorithm estimated the dose well at depths beyond buildup for all calculation voxel sizes. The calculation algorithm underestimated the dose in portions of the buildup region of tissue for all comparisons, with the most significant differences observed in the 5-mm calculation voxel and the least difference in the 1-mm voxel. 
Results indicate a significant difference between measured and calculated data extending to average depths of 2.5 mm, 3.4 mm, and 10 mm for the 1-mm, 2-mm, and 5-mm dimension calculation matrices, respectively. These results emphasize the importance of selecting as small a treatment planning software calculation matrix dimension as is practically possible and of taking a conservative approach for skin treatment planning objectives. One suggested conservative approach is accomplished by defining the skin organ as the outermost 2-3 mm of the body such that the high dose tail of the skin organ dose-volume histogram curve represents dose on the deep side of the skin where the algorithm is more accurate.

  17. Comparison of Acuros (AXB) and Anisotropic Analytical Algorithm (AAA) for dose calculation in treatment of oesophageal cancer: effects on modelling tumour control probability.

    PubMed

    Padmanaban, Sriram; Warren, Samantha; Walsh, Anthony; Partridge, Mike; Hawkins, Maria A

    2014-12-23

    To investigate systematic changes in dose arising when treatment plans optimised using the Anisotropic Analytical Algorithm (AAA) are recalculated using Acuros XB (AXB) in patients treated with definitive chemoradiotherapy (dCRT) for locally advanced oesophageal cancers. We have compared treatment plans created using AAA with those recalculated using AXB. Although the Anisotropic Analytical Algorithm (AAA) is currently more widely used in clinical routine, Acuros XB (AXB) has been shown to more accurately calculate the dose distribution, particularly in heterogeneous regions. Studies to predict clinical outcome should be based on modelling the dose delivered to the patient as accurately as possible. CT datasets from ten patients were selected for this retrospective study. VMAT (Volumetric modulated arc therapy) plans with 2 arcs, collimator rotation ± 5-10° and dose prescription 50 Gy / 25 fractions were created using Varian Eclipse (v10.0). The initial dose calculation was performed with AAA, and AXB plans were created by re-calculating the dose distribution using the same number of monitor units (MU) and multileaf collimator (MLC) files as the original plan. The difference in calculated dose to organs at risk (OAR) was compared using dose-volume histogram (DVH) statistics and p values were calculated using the Wilcoxon signed rank test. The potential clinical effect of dosimetric differences in the gross tumour volume (GTV) was evaluated using three different TCP models from the literature. PTV median dose was apparently 0.9 Gy lower (range: 0.5 Gy - 1.3 Gy; p < 0.05) for VMAT AAA plans re-calculated with AXB and GTV mean dose was reduced by on average 1.0 Gy (0.3 Gy - 1.5 Gy; p < 0.05). An apparent difference in TCP of between 1.2% and 3.1% was found depending on the choice of TCP model. OAR mean dose was lower in the AXB recalculated plan than the AAA plan (on average, dose reduction: lung 1.7%, heart 2.4%). Similar trends were seen for CRT plans. 
Differences in dose distribution are observed with VMAT and CRT plans recalculated with AXB particularly within soft tissue at the tumour/lung interface, where AXB has been shown to more accurately represent the true dose distribution. AAA apparently overestimates dose, particularly the PTV median dose and GTV mean dose, which could result in a difference in TCP model parameters that reaches clinical significance.
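TCP models of the kind used in this study commonly combine a Poisson cell-kill model with linear-quadratic (LQ) survival. The sketch below illustrates how a roughly 1 Gy dose difference maps to a small TCP change; the parameter values are hypothetical, not those of the three cited models:

```python
# Illustrative Poisson/linear-quadratic TCP model of the general kind
# used to translate a ~1 Gy dose difference into a TCP change.
# Parameter values (N0, alpha, beta) are hypothetical placeholders.

import math

def tcp_poisson_lq(total_dose, n_fractions, n0=1e7, alpha=0.3, beta=0.03):
    """TCP = exp(-N0 * surviving fraction) under the LQ model."""
    d = total_dose / n_fractions                      # dose per fraction (Gy)
    sf = math.exp(-n_fractions * (alpha * d + beta * d * d))
    return math.exp(-n0 * sf)

# A ~1 Gy drop in delivered dose lowers the modelled TCP:
for dose in (50.0, 49.0):
    print(f"{dose:.0f} Gy in 25 fx: TCP = {tcp_poisson_lq(dose, 25):.3f}")
```

Because TCP is steep around the prescription dose in such models, even sub-Gy systematic differences between dose calculation algorithms can shift modelled TCP by a few percent, consistent with the 1.2%-3.1% range reported above.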

  18. Total body irradiation, toward optimal individual delivery: dose evaluation with metal oxide field effect transistors, thermoluminescence detectors, and a treatment planning system.

    PubMed

    Bloemen-van Gurp, Esther J; Mijnheer, Ben J; Verschueren, Tom A M; Lambin, Philippe

    2007-11-15

    To predict the three-dimensional dose distribution of our total body irradiation technique, using a commercial treatment planning system (TPS). In vivo dosimetry, using metal oxide field effect transistors (MOSFETs) and thermoluminescence detectors (TLDs), was used to verify the calculated dose distributions. A total body computed tomography scan was performed and loaded into our TPS, and a three-dimensional-dose distribution was generated. In vivo dosimetry was performed at five locations on the patient. Entrance and exit dose values were converted to midline doses using conversion factors, previously determined with phantom measurements. The TPS-predicted dose values were compared with the MOSFET and TLD in vivo dose values. The MOSFET and TLD dose values agreed within 3.0% and the MOSFET and TPS data within 0.5%. The convolution algorithm of the TPS, which is routinely applied in the clinic, overestimated the dose in the lung region. Using a superposition algorithm reduced the calculated lung dose by approximately 3%. The dose inhomogeneity, as predicted by the TPS, can be reduced using a simple intensity-modulated radiotherapy technique. The use of a TPS to calculate the dose distributions in individual patients during total body irradiation is strongly recommended. Using a TPS gives good insight of the over- and underdosage in a patient and the influence of patient positioning on dose homogeneity. MOSFETs are suitable for in vivo dosimetry purposes during total body irradiation, when using appropriate conversion factors. The MOSFET, TLD, and TPS results agreed within acceptable margins.

  19. Feasibility of a low-dose orbital CT protocol with a knowledge-based iterative model reconstruction algorithm for evaluating Graves' orbitopathy.

    PubMed

    Lee, Ho-Joon; Kim, Jinna; Kim, Ki Wook; Lee, Seung-Koo; Yoon, Jin Sook

    2018-06-23

    To evaluate the clinical feasibility of low-dose orbital CT with a knowledge-based iterative model reconstruction (IMR) algorithm for evaluating Graves' orbitopathy. Low-dose orbital CT was performed with a CTDIvol of 4.4 mGy. In 12 patients for whom prior or subsequent non-low-dose orbital CT data obtained within 12 months were available, background noise, SNR, and CNR were compared for images generated using filtered back projection (FBP), hybrid iterative reconstruction (iDose4), and IMR and non-low-dose CT images. Comparison of clinically relevant measurements for Graves' orbitopathy, such as rectus muscle thickness and retrobulbar fat area, was performed in a subset of 6 patients who underwent CT for causes other than Graves' orbitopathy, by using the Wilcoxon signed-rank test. The lens dose estimated from skin dosimetry on a phantom was 4.13 mGy, which was on average 59.34% lower than that of the non-low-dose protocols. Image quality in terms of background noise, SNR, and CNR was the best for IMR, followed by non-low-dose CT, iDose4, and FBP, in descending order. A comparison of clinically relevant measurements revealed no significant difference in the retrobulbar fat area and the inferior and medial rectus muscle thicknesses between the low-dose and non-low-dose CT images. Low-dose CT with IMR may be performed without significantly affecting the measurement of prognostic parameters for Graves' orbitopathy while lowering the lens dose and image noise. Copyright © 2018 Elsevier Inc. All rights reserved.
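The image-quality metrics compared across reconstructions above (background noise, SNR, CNR) are simple region-of-interest statistics. A sketch with hypothetical HU samples:

```python
# Sketch of the ROI-based image-quality metrics compared across CT
# reconstructions (FBP, iDose4, IMR): background noise, SNR, and CNR.
# The HU samples below are hypothetical, not the study's measurements.

import statistics

def roi_metrics(roi_hu, background_hu):
    """Return (noise, SNR, CNR) from HU samples of a tissue ROI and a background ROI."""
    noise = statistics.stdev(background_hu)  # background noise as SD
    snr = statistics.mean(roi_hu) / noise
    cnr = (statistics.mean(roi_hu) - statistics.mean(background_hu)) / noise
    return noise, snr, cnr

roi = [62.0, 58.0, 61.0, 59.0]        # hypothetical muscle ROI (HU)
bg = [-80.0, -76.0, -84.0, -80.0]     # hypothetical fat ROI (HU)
noise, snr, cnr = roi_metrics(roi, bg)
print(f"noise={noise:.2f} HU, SNR={snr:.2f}, CNR={cnr:.2f}")
```

Lower background noise at fixed ROI means raises both SNR and CNR, which is why a noise-reducing reconstruction such as IMR can rank best on all three metrics at the same tube output.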

  20. Comparison of pencil beam–based homogeneous vs inhomogeneous target dose planning for stereotactic body radiotherapy of peripheral lung tumors through Monte Carlo–based recalculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohtakara, Kazuhiro, E-mail: ohtakara@murakami.asahi-u.ac.jp; Hoshi, Hiroaki

    2015-10-01

    This study was conducted to ascertain whether homogeneous target dose planning is suitable for stereotactic body radiotherapy (SBRT) of peripheral lung cancer under appropriate breath-holding. For 20 peripheral lung tumors, paired dynamic conformal arc plans were generated by only adjusting the leaf margin to the planning target volume (PTV) edge for fulfilling the conditions such that the prescription isodose surface (IDS) encompassing exactly 95% of the PTV (PTV D95) corresponds to 95% and 80% IDS, normalized to 100% at the PTV isocenter under a pencil beam (PB) algorithm with radiologic path length correction. These plans were recalculated using the x-ray voxel Monte Carlo (XVMC) algorithm under otherwise identical conditions, and then compared. Lesions abutting the parietal pleura or not were defined as edge or island tumors, respectively, and the influences of the target volume and its location relative to the chest wall on the target dose were examined. The median (range) leaf margin required for the 95% and 80% plans was 3.9 mm (1.3 to 5.0) and −1.2 mm (−1.8 to 0.1), respectively. Notably, the latter was significantly correlated negatively with PTV. In the 80% plans, the PTV D95 was slightly higher under XVMC, whereas the PTV D98 was significantly lower, irrespective of the dose calculation algorithm used. Other PTV and all gross tumor volume doses were significantly higher, while the lung doses outside the PTV were slightly lower. The target doses increased as a function of PTV and were significantly lower for island tumors than for edge tumors. In conclusion, inhomogeneous target dose planning using a smaller leaf margin for a larger tumor volume was deemed suitable for ensuring a more sufficient target dose while slightly reducing lung dose. In addition, more inhomogeneous target dose planning using <80% IDS (e.g., 70%) for PTV covering would be preferable for island tumors.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venencia, C; Pino, M; Caussa, L

    Purpose: The purpose of this work was to quantify the dosimetric impact of the Monte Carlo (MC) dose calculation algorithm compared to Pencil Beam (PB) on spine SBRT with HybridARC (HA) and sliding-window IMRT (dMLC) treatment modalities. Methods: A 6MV beam (1000MU/min) produced by a Novalis TX (BrainLAB-Varian) equipped with HDMLC was used. HA uses 1 arc plus 8 IMRT beams (arc weight between 60–40%) and dIMRT 15 beams. Plans were calculated using iPlan v.4.5.3 (BrainLAB) and the treatment dose prescription was 27Gy in 3 fractions. Dose calculation was done by PB (4mm spatial resolution) with heterogeneity correction and MC dose to water (4mm spatial resolution and 4% mean variance). PTV and spinal cord dose comparisons were done. The study was done on 12 patients. The IROC spine phantom was used to validate HA and quantify dose variation using the PB and MC algorithms. Results: The differences between PB and MC for PTV D98%, D95%, Dmean, and D2% were 2.6% [−5.1, 6.8], 0.1% [−4.2, 5.4], 0.9% [−1.5, 3.8] and 2.4% [−0.5, 8.3]. The differences between PB and MC for spinal cord Dmax, D1.2cc and D0.35cc were 5.3% [−6.4, 18.4], 9% [−7.0, 17.0] and 7.6% [−0.6, 14.8], respectively. The IROC spine phantom showed a PTV TLD dose variation of 0.98% for PB and 1.01% for MC. Axial and sagittal film plane gamma index (5%-3mm) pass rates were 95% and 97% for PB, and 95% and 99% for MC. Conclusion: PB slightly underestimates the dose for the PTV. For the spinal cord, PB underestimates the dose, and dose differences could be as high as 18%, which could have unexpected clinical impact. CI shows no variation between PB and MC for both treatment modalities. Treatment modalities have no impact on the dose calculation algorithms used. Following the IROC pass-fail criteria, the treatment acceptance requirement was fulfilled for both PB and MC.

  2. The role of advanced reconstruction algorithms in cardiac CT

    PubMed Central

    Halliburton, Sandra S.; Tanabe, Yuki; Partovi, Sasan

    2017-01-01

    Non-linear iterative reconstruction (IR) algorithms have been increasingly incorporated into clinical cardiac CT protocols at institutions around the world. Multiple IR algorithms are available commercially from various vendors. IR algorithms decrease image noise and are primarily used to enable lower radiation dose protocols. IR can also be used to improve image quality for imaging of obese patients, coronary atherosclerotic plaques, coronary stents, and myocardial perfusion. In this article, we will review the various applications of IR algorithms in cardiac imaging and evaluate how they have changed practice. PMID:29255694

  3. A Phantom Study on Fetal Dose Reducing Factors in Pregnant Patients with Breast Cancer during Radiotherapy Treatment

    PubMed Central

    Öğretici, Akın; Çakır, Aydın; Akbaş, Uğur; Köksal, Canan; Kalafat, Ümmühan; Tambaş, Makbule; Bilge, Hatice

    2017-01-01

Purpose: This study aims to investigate the factors that reduce fetal dose in pregnant patients with breast cancer throughout their radiation treatment. Two main factors in a standard radiation oncology center are considered: the treatment planning system (TPS) and simple shielding for the intensity-modulated radiation therapy technique. Materials and Methods: The TPS factor was evaluated with two different planning algorithms: the anisotropic analytical algorithm and Acuros XB (external beam). To evaluate the shielding factor, a standard radiological-purpose lead apron was chosen. For both studies, thermoluminescence dosimeters were used to measure the point dose, and an Alderson RANDO phantom was used to simulate a pregnant female patient. Thirteen measurement points were chosen in the 32nd slice of the phantom to cover all possible locations of a fetus up to the 8th week of gestation. Results: The results show that both TPS algorithms are incapable of calculating the fetal doses and are therefore unable to reduce them at the planning stage. Shielding with a standard lead apron, however, provided slight radiation protection (about 4.7%) to the fetus, decreasing the mean fetal dose from 84.8 mGy to 80.8 mGy, which cannot be disregarded in case of fetal irradiation. Conclusions: Using a lead apron to shield the abdominal region of a pregnant patient during breast irradiation showed a minor advantage; however, its possible side effects (i.e., increased scattered radiation and skin dose) should be investigated further to solidify its benefits. PMID:28974857

  4. Pharmacogenetic versus clinical dosing of warfarin in individuals of Chinese and African-American ancestry: assessment using data simulation.

    PubMed

    Syn, Nicholas L X; Lee, Soo-Chin; Brunham, Liam R; Goh, Boon-Cher

    2015-10-01

Clinical trials of genotype-guided dosing of warfarin have yielded mixed results, which may in part reflect ethnic differences among study participants. However, no previous study has compared genotype-guided versus clinically guided or standard-of-care dosing in a Chinese population, whereas those involving African-Americans were underpowered to detect significant differences. We present a preclinical strategy that integrates pharmacogenetics (PG) and pharmacometrics to predict the outcome or guide the design of dosing strategies for drugs that show large interindividual variability. We use the example of warfarin and focus on two underrepresented groups in warfarin research. We identified the parameters required to simulate a patient population and the outcome of dosing strategies. PG and pharmacogenetic plus loading (PG+L) algorithms that take into account a patient's VKORC1 and CYP2C9 genotype status were considered and compared against a clinical (CA) algorithm for a simulated Chinese population using a predictive Monte Carlo and pharmacokinetic-pharmacodynamic framework. We also examined a simulated population of African-American ancestry to assess the robustness of the model in relation to real-world clinical trial data. The simulations replicated trends similar to those observed with clinical data in African-Americans. They further predict that the PG+L regimen is superior to both the CA and the PG regimen in maximizing percentage time in therapeutic range in a Chinese cohort, whereas the CA regimen poses the highest risk of overanticoagulation during warfarin initiation. The findings supplement the literature with an unbiased comparison of warfarin dosing algorithms and highlight interethnic differences in anticoagulation control.
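The core idea of the study, comparing dosing strategies by simulating time in therapeutic range (TTR) over a virtual population, can be sketched as a toy Monte Carlo model. Everything below is illustrative: the INR model, the titration rule, and all parameters are hypothetical placeholders, not the paper's PK-PD framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ttr(dose_error_sd, n_patients=2000, weeks=12):
    """Toy Monte Carlo comparison of warfarin dosing strategies.

    Each virtual patient has a true maintenance dose; a strategy's
    initial prediction misses it by a lognormal factor whose spread is
    dose_error_sd (smaller for a genotype-guided prediction).  INR is
    modeled, purely for illustration, as proportional to the ratio
    dose / true_dose, and the dose is nudged toward target after each
    weekly INR check.  Returns mean percent time in range (INR 2-3).
    """
    true_dose = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n_patients)
    dose = true_dose * rng.lognormal(mean=0.0, sigma=dose_error_sd, size=n_patients)
    in_range_weeks = np.zeros(n_patients)
    for _ in range(weeks):
        inr = 2.5 * dose / true_dose           # toy steady-state INR
        in_range_weeks += (inr >= 2.0) & (inr <= 3.0)
        dose *= (2.5 / inr) ** 0.1             # gentle titration toward INR 2.5
    return 100.0 * in_range_weeks.mean() / weeks

ttr_clinical = simulate_ttr(dose_error_sd=0.35)  # less accurate initial prediction
ttr_pg = simulate_ttr(dose_error_sd=0.15)        # genotype-guided-style prediction
```

Even this crude model reproduces the qualitative conclusion that a more accurate initial dose prediction raises the simulated TTR.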

  5. Comparison of doses and NTCP to risk organs with enhanced inspiration gating and free breathing for left-sided breast cancer radiotherapy using the AAA algorithm.

    PubMed

    Edvardsson, Anneli; Nilsson, Martin P; Amptoulach, Sousana; Ceberg, Sofie

    2015-04-10

The purpose of this study was to investigate the potential dose reduction to the heart, left anterior descending (LAD) coronary artery and the ipsilateral lung for patients treated with tangential and locoregional radiotherapy for left-sided breast cancer with enhanced inspiration gating (EIG) compared to free breathing (FB), using the AAA algorithm. The radiobiological implication of such dose sparing was also investigated. Thirty-two patients who received tangential or locoregional adjuvant radiotherapy with EIG for left-sided breast cancer were retrospectively enrolled in this study. Each patient was CT-scanned during FB and EIG. Similar treatment plans, with comparable target coverage, were created in the two CT sets using the AAA algorithm. Further, the probabilities of radiation-induced cardiac mortality and pneumonitis were calculated using NTCP models. For tangential treatment, the median V25Gy for the heart and LAD was decreased for EIG from 2.2% to 0.2% and from 40.2% to 0.1% (p < 0.001), respectively, whereas there was no significant difference in V20Gy for the ipsilateral lung (p = 0.109). For locoregional treatment, the median V25Gy for the heart and LAD was decreased for EIG from 3.3% to 0.2% and from 51.4% to 5.1% (p < 0.001), respectively, and the median ipsilateral lung V20Gy decreased from 27.0% for FB to 21.5% (p = 0.020) for EIG. The median excess cardiac mortality probability decreased from 0.49% for FB to 0.02% for EIG (p < 0.001) for tangential treatment and from 0.75% to 0.02% (p < 0.001) for locoregional treatment. There was no significant difference in the risk of radiation pneumonitis for tangential treatment (p = 0.179), whereas it decreased for locoregional treatment from 6.82% for FB to 3.17% for EIG (p = 0.004). In this study the AAA algorithm was used for dose calculation to the heart, LAD and left lung when comparing the EIG and FB techniques for tangential and locoregional radiotherapy of breast cancer patients.
The results support the dose and NTCP reductions reported in previous studies where dose calculations were performed using the pencil beam algorithm.
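The abstract does not name the specific NTCP models used. A common choice for this kind of complication-probability estimate is the Lyman-Kutcher-Burman (LKB) model, sketched below. The DVH and the pneumonitis-style parameters (TD50, m, n) are illustrative only and are not the values fitted in this study.

```python
import math

def geud(dose_bins, vol_fracs, n):
    """Generalized equivalent uniform dose from a differential DVH.
    n is the LKB volume-effect parameter (exponent a = 1/n)."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(dose_bins, vol_fracs)) ** (1.0 / a)

def lkb_ntcp(dose_bins, vol_fracs, td50, m, n):
    """LKB NTCP: probit function of t = (gEUD - TD50) / (m * TD50)."""
    t = (geud(dose_bins, vol_fracs, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative differential DVH (fractional volumes sum to 1) and
# lung-pneumonitis-style parameters, used here only as placeholders
dvh_doses = [5.0, 15.0, 25.0, 35.0]      # Gy, bin centers
dvh_vols  = [0.4, 0.3, 0.2, 0.1]
p = lkb_ntcp(dvh_doses, dvh_vols, td50=30.8, m=0.37, n=0.99)
```

With n close to 1, the gEUD approaches the mean organ dose, which is why mean-lung-dose reductions from gating translate directly into lower modeled pneumonitis risk.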

  6. SU-E-T-802: Verification of Implanted Cardiac Pacemaker Doses in Intensity-Modulated Radiation Therapy: Dose Prediction Accuracy and Reduction Effect of a Lead Sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, J; Chung, J

    2015-06-15

Purpose: To verify the delivered doses to an implanted cardiac pacemaker, predicted doses with and without a dose reduction method were verified using MOSFET detectors in terms of beam delivery and dose calculation techniques in intensity-modulated radiation therapy (IMRT). Methods: The pacemaker doses for a patient with tongue cancer were predicted according to the beam delivery method [step-and-shoot (SS) and sliding window (SW)], the intensity level for dose optimization, and the dose calculation algorithm. Dosimetric effects on the pacemaker were calculated with three dose engines: pencil-beam convolution (PBC), analytical anisotropic algorithm (AAA), and Acuros-XB. A lead shield of 2 mm thickness was designed to minimize irradiated doses to the pacemaker. Dose variations affected by the heterogeneous material properties of the pacemaker and the effectiveness of the lead shield were predicted by Acuros-XB. Dose prediction accuracy and the feasibility of the dose reduction strategy were verified based on the skin doses measured right above the pacemaker using MOSFET detectors during the radiation treatment. Results: Acuros-XB underestimated skin doses and overestimated the lead-shield effect, although the dose disagreement was small. Dose prediction improved with a higher intensity level of dose optimization in IMRT. The dedicated tertiary lead sheet effectively reduced the pacemaker dose by up to 60%. Conclusion: The current SS technique delivered scattered doses below the recommended criteria; nevertheless, use of the lead sheet further reduced the scattered dose. A thin lead plate can be a useful tertiary shield and did not cause malfunction or electrical damage of the implanted pacemaker in IMRT. More accurate estimation of the scattered doses to patients with implanted medical devices is required to design a proper dose reduction strategy.

  7. From AAA to Acuros XB-clinical implications of selecting either Acuros XB dose-to-water or dose-to-medium.

    PubMed

    Zifodya, Jackson M; Challens, Cameron H C; Hsieh, Wen-Long

    2016-06-01

    When implementing Acuros XB (AXB) as a substitute for anisotropic analytic algorithm (AAA) in the Eclipse Treatment Planning System, one is faced with a dilemma of reporting either dose to medium, AXB-Dm or dose to water, AXB-Dw. To assist with decision making on selecting either AXB-Dm or AXB-Dw for dose reporting, a retrospective study of treated patients for head & neck (H&N), prostate, breast and lung is presented. Ten patients, previously treated using AAA plans, were selected for each site and re-planned with AXB-Dm and AXB-Dw. Re-planning was done with fixed monitor units (MU) as well as non-fixed MUs. Dose volume histograms (DVH) of targets and organs at risk (OAR), were analyzed in conjunction with ICRU-83 recommended dose reporting metrics. Additionally, comparisons of plan homogeneity indices (HI) and MUs were done to further highlight the differences between the algorithms. Results showed that, on average AAA overestimated dose to the target volume and OARs by less than 2.0 %. Comparisons between AXB-Dw and AXB-Dm, for all sites, also showed overall dose differences to be small (<1.5 %). However, in non-water biological media, dose differences between AXB-Dw and AXB-Dm, as large as 4.6 % were observed. AXB-Dw also tended to have unexpectedly high 3D maximum dose values (>135 % of prescription dose) for target volumes with high density materials. Homogeneity indices showed that AAA planning and optimization templates would need to be adjusted only for the H&N and Lung sites. MU comparison showed insignificant differences between AXB-Dw relative to AAA and between AXB-Dw relative to AXB-Dm. However AXB-Dm MUs relative to AAA, showed an average difference of about 1.3 % signifying an underdosage by AAA. In conclusion, when dose is reported as AXB-Dw, the effect that high density structures in the PTV has on the dose distribution should be carefully considered. 
As the results show overall small dose differences between the algorithms, no significant change to existing prescription protocols is expected when transitioning from AAA to AXB. Because most clinical experience is dose-to-water based, calibration protocols and clinical trials are also dose-to-water based, and uncertainties still exist in converting CT number to medium, selecting AXB-Dw is strongly recommended.

  8. Effects of refractive index mismatch in optical CT imaging of polymer gel dosimeters.

    PubMed

    Manjappa, Rakesh; Makki S, Sharath; Kumar, Rajesh; Kanhirodan, Rajan

    2015-02-01

We propose an image reconstruction technique, algebraic reconstruction technique-refraction correction (ART-rc). The proposed method accounts for refractive index mismatches present at the boundary of the gel dosimeter scanner and also corrects for interior ray refraction. Polymer gel dosimeters with high-dose regions have a higher refractive index and optical density than the background medium; these changes in refractive index at high dose result in interior ray bending. The inclusion of the effects of refraction is an important step in the reconstruction of optical density in gel dosimeters. The proposed ray tracing algorithm models the interior multiple refraction at the inhomogeneities. Jacob's ray tracing algorithm has been modified to calculate the path lengths of the ray that traverses the higher-dose regions. The algorithm computes the length of the ray in each pixel along its path, which is used as the weight matrix. Algebraic reconstruction technique and pixel-based reconstruction algorithms are used for solving the reconstruction problem. The proposed method is tested with numerical phantoms for various noise levels. The experimental dosimetric results are also presented. The results show that the proposed scheme ART-rc is able to reconstruct optical density inside the dosimeter better than filtered backprojection and conventional algebraic reconstruction approaches. The quantitative improvement using ART-rc is evaluated using the gamma index. The refraction errors due to regions of different refractive indices are discussed. The effects of modeling interior refraction in the dose region are presented. The errors propagated due to multiple refraction effects have been modeled, and the improvements in reconstruction using the proposed model are presented. The refractive index of the dosimeter has a mismatch with the surrounding medium (for dry air or water scanning). The algorithm reconstructs the dose profiles by estimating the refractive indices of multiple inhomogeneities with different refractive indices and optical densities embedded in the dosimeter. This is achieved by tracking the path of the ray that traverses the dosimeter. Extensive simulation studies have been carried out, and the results are found to match the experimental results.
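The baseline machinery the paper builds on (a weight matrix of per-pixel ray path lengths solved by the algebraic reconstruction technique) can be sketched without the refraction correction. This is a plain Kaczmarz-style ART on a tiny 2x2 "dosimeter" with straight rays; the refraction-corrected path lengths of ART-rc would simply replace the entries of `W`.

```python
import numpy as np

def art(W, p, n_iters=200, relax=0.5):
    """Kaczmarz-style algebraic reconstruction technique.

    W: weight matrix (row i holds the path length of ray i in each pixel),
    p: measured projections (line integrals of optical density).
    Sweeps over the rays, relaxing the estimate toward each ray equation.
    """
    x = np.zeros(W.shape[1])
    row_norms = (W ** 2).sum(axis=1)
    for _ in range(n_iters):
        for i in range(W.shape[0]):
            if row_norms[i] == 0:
                continue
            residual = p[i] - W[i] @ x
            x += relax * residual / row_norms[i] * W[i]
    return x

# Tiny 2x2 phantom probed by 2 horizontal and 2 vertical unit-length rays
W = np.array([[1., 1., 0., 0.],   # row sums
              [0., 0., 1., 1.],
              [1., 0., 1., 0.],   # column sums
              [0., 1., 0., 1.]])
true_od = np.array([0.1, 0.2, 0.3, 0.4])
recon = art(W, W @ true_od)
```

Starting from zero, Kaczmarz iterations converge to the minimum-norm solution consistent with the projections, which for this phantom coincides with the true optical densities.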

  9. WE-D-18A-04: How Iterative Reconstruction Algorithms Affect the MTFs of Variable-Contrast Targets in CT Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dodge, C.T.; Rong, J.; Dodge, C.W.

    2014-06-15

Purpose: To determine how filtered back-projection (FBP), adaptive statistical (ASiR), and model based (MBIR) iterative reconstruction algorithms affect the measured modulation transfer functions (MTFs) of variable-contrast targets over a wide range of clinically applicable dose levels. Methods: The Catphan 600 CTP401 module, surrounded by an oval, fat-equivalent ring to mimic patient size/shape, was scanned on a GE HD750 CT scanner at 1, 2, 3, 6, 12 and 24 mGy CTDIvol levels with typical patient scan parameters: 120kVp, 0.8s, 40mm beam width, large SFOV, 2.5mm thickness, 0.984 pitch. The images were reconstructed using GE's Standard kernel with FBP; 20%, 40% and 70% ASiR; and MBIR. A task-based MTF (MTFtask) was computed for six cylindrical targets: 2 low-contrast (Polystyrene, LDPE), 2 medium-contrast (Delrin, PMP), and 2 high-contrast (Teflon, air). MTFtask was used to compare the performance of the reconstruction algorithms with decreasing CTDIvol from 24mGy, which is currently used in the clinic. Results: For the air target and 75% dose savings (6 mGy), MBIR MTFtask at 5 lp/cm measured 0.24, compared to 0.20 for 70% ASiR and 0.11 for FBP. Overall, for both high-contrast targets, MBIR MTFtask improved with increasing CTDIvol and consistently outperformed ASiR and FBP near the system's Nyquist frequency. Conversely, for Polystyrene at 6 mGy, the MBIR (0.10) and 70% ASiR (0.07) MTFtask values were lower than for FBP (0.18). For medium- and low-contrast targets, FBP remains the best overall algorithm for improved resolution at low CTDIvol (1–6 mGy) levels, whereas MBIR is comparable at higher dose levels (12–24 mGy). Conclusion: MBIR improved the MTF of small, high-contrast targets compared to FBP and ASiR at doses of 50%–12.5% of those currently used in the clinic. However, for imaging low- and medium-contrast targets, FBP performed the best across all dose levels. For assessing the MTF of different reconstruction algorithms, task-based MTF measurements are necessary.
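As a simplified illustration of what an MTF is (not the task-based cylindrical-target method of the abstract), the classic construction takes the normalized Fourier magnitude of a line spread function (LSF); a wider blur yields a lower MTF at any given spatial frequency, e.g., the 5 lp/cm point quoted above.

```python
import numpy as np

def mtf_from_lsf(lsf, spacing_cm):
    """MTF as the normalized magnitude of the Fourier transform of a
    line spread function.  spacing_cm is the sample spacing in cm;
    returns (spatial frequencies in lp/cm, MTF values)."""
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]             # normalize to 1 at zero frequency
    freqs = np.fft.rfftfreq(len(lsf), d=spacing_cm)
    return freqs, mtf

# Two Gaussian LSFs: the wider (blurrier) one has a lower MTF everywhere
x = np.arange(-32, 32) * 0.05                # 0.05 cm sampling over ~3.2 cm
sharp = np.exp(-x**2 / (2 * 0.05**2))        # sigma = 0.5 mm
blurry = np.exp(-x**2 / (2 * 0.10**2))       # sigma = 1.0 mm
f, mtf_sharp = mtf_from_lsf(sharp, 0.05)
_, mtf_blur = mtf_from_lsf(blurry, 0.05)
```

Task-based MTFs extend this idea by deriving the spread function from a specific target and contrast level, which is why they can differ between algorithms that are nominally "equally sharp" on high-contrast objects.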

  10. Effects of refractive index mismatch in optical CT imaging of polymer gel dosimeters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manjappa, Rakesh; Makki S, Sharath; Kanhirodan, Rajan, E-mail: rajan@physics.iisc.ernet.in

    2015-02-15

Purpose: We propose an image reconstruction technique, algebraic reconstruction technique-refraction correction (ART-rc). The proposed method accounts for refractive index mismatches present at the boundary of the gel dosimeter scanner and also corrects for interior ray refraction. Polymer gel dosimeters with high-dose regions have a higher refractive index and optical density than the background medium; these changes in refractive index at high dose result in interior ray bending. Methods: The inclusion of the effects of refraction is an important step in the reconstruction of optical density in gel dosimeters. The proposed ray tracing algorithm models the interior multiple refraction at the inhomogeneities. Jacob's ray tracing algorithm has been modified to calculate the path lengths of the ray that traverses the higher-dose regions. The algorithm computes the length of the ray in each pixel along its path, which is used as the weight matrix. Algebraic reconstruction technique and pixel-based reconstruction algorithms are used for solving the reconstruction problem. The proposed method is tested with numerical phantoms for various noise levels. The experimental dosimetric results are also presented. Results: The results show that the proposed scheme ART-rc is able to reconstruct optical density inside the dosimeter better than filtered backprojection and conventional algebraic reconstruction approaches. The quantitative improvement using ART-rc is evaluated using the gamma index. The refraction errors due to regions of different refractive indices are discussed. The effects of modeling interior refraction in the dose region are presented. Conclusions: The errors propagated due to multiple refraction effects have been modeled, and the improvements in reconstruction using the proposed model are presented. The refractive index of the dosimeter has a mismatch with the surrounding medium (for dry air or water scanning). The algorithm reconstructs the dose profiles by estimating the refractive indices of multiple inhomogeneities with different refractive indices and optical densities embedded in the dosimeter. This is achieved by tracking the path of the ray that traverses the dosimeter. Extensive simulation studies have been carried out, and the results are found to match the experimental results.

  11. Safety reports on the off-label use of baclofen for alcohol-dependence: recommendations to improve causality assessment.

    PubMed

    Rolland, Benjamin; Auffret, Marine; Franchitto, Nicolas

    2016-06-01

The off-label use of high-dose baclofen (HDB) for alcohol-dependence has recently spread. However, HDB has been associated with numerous reports of adverse events (AEs). Pharmacovigilance reporting is supposed to differentiate AEs from adverse drug reactions (ADRs), for which the causality of the drug is determined using validated methods. Since 2010, we found 20 publications on baclofen-related AEs in alcohol dependence, in Medline-referenced journals or national pharmacovigilance reports. We focused on whether these reports used causality algorithms and provided the essential elements for determining baclofen causality and excluding the involvement of alcohol and other psychoactive substances or psychotropic drugs. In half of the cases, no causality algorithm was used. Detailed information on baclofen dosing was found in 17 of 20 (85%) articles, whereas alcohol doses were given in only 10 (50%) publications. Other psychoactive substances and psychotropic drugs were broached in 14 (70%) publications. Future publications reporting suspected HDB-induced ADRs should use validated causality algorithms and provide a sufficient amount of contextual information to exclude other potential causes. For HDB, the psychiatric history and the longitudinal description of alcohol consumption and associated doses of psychoactive substances or psychotropic medications should be detailed for every reported case.
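One widely used causality algorithm of the kind the authors recommend is the Naranjo adverse-drug-reaction probability scale, which sums item scores from a ten-question checklist and maps the total to a causality category. The commonly cited cut-offs can be sketched as follows (the item questionnaire itself is omitted; only the category mapping is shown):

```python
def naranjo_category(total_score: int) -> str:
    """Map a Naranjo ADR probability scale total score to its causality
    category, using the commonly cited cut-offs:
    >= 9 definite, 5-8 probable, 1-4 possible, <= 0 doubtful."""
    if total_score >= 9:
        return "definite"
    if total_score >= 5:
        return "probable"
    if total_score >= 1:
        return "possible"
    return "doubtful"

# Example: a hypothetical case summing to 6 would be rated "probable"
rating = naranjo_category(6)
```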

  12. Fast local reconstruction by selective backprojection for low dose in dental computed tomography

    NASA Astrophysics Data System (ADS)

    Yan, Bin; Deng, Lin; Han, Yu; Zhang, Feng; Wang, Xian-Chao; Li, Lei

    2014-10-01

The high radiation dose in computed tomography (CT) scans increases the lifetime risk of cancer, which has become a major clinical concern. The backprojection-filtration (BPF) algorithm can reduce the radiation dose by reconstructing images from truncated data acquired in a short scan. In dental CT, it can reduce the radiation dose to the teeth by using projections acquired in a short scan, and can avoid irradiating other regions by using truncated projections. However, the limit of integration for backprojection varies per PI-line, resulting in low calculation efficiency and poor parallel performance. Recently, a tent BPF has been proposed to improve the calculation efficiency by rearranging the projection; however, it includes a memory-consuming data rebinning process. Accordingly, the selective BPF (S-BPF) algorithm is proposed in this paper. In this algorithm, the derivative of the projection is backprojected to the points whose x coordinate is less than that of the source focal spot to obtain the differentiated backprojection. The finite Hilbert inverse is then applied to each PI-line segment. S-BPF avoids the influence of the variable limit of integration by selective backprojection, without additional time or memory cost. Simulation and real experiments demonstrated the higher reconstruction efficiency of S-BPF.

  13. Genetic polymorphisms are associated with variations in warfarin maintenance dose in Han Chinese patients with venous thromboembolism.

    PubMed

    Zhang, Wei; Zhang, Wei-Juan; Zhu, Jin; Kong, Fan-Cui; Li, Yan-Yan; Wang, He-Yao; Yang, Yuan-Hua; Wang, Chen

    2012-02-01

Warfarin is a clinical anticoagulant that requires periodic monitoring because it is associated with adverse outcomes. Personalized medicine, which is based on pharmacogenetics, holds great promise in solving these types of problems. It aims to provide the tools and knowledge to tailor drug therapy to an individual patient, with the potential of increasing the safety and efficacy of medications. In the present study we analyzed genotypes of 14 SNPs in seven genes using DNA from 297 Han Chinese venous thromboembolism patients treated with warfarin. Multiple regression analyses revealed that CYP2C9 genotype (p = 0.001), VKORC1 genotype (p < 0.001), age (p < 0.01) and weight (p < 0.001) were all associated with warfarin dose requirements, together explaining 37.4% of the variability of warfarin dose among Han Chinese patients. Meanwhile, in the validation cohort, the predicted warfarin daily dose was calculated using the best model, with 64.5% of predicted doses being acceptable (-1 mg/day ≤ Δwarfarin dose ≤ 1 mg/day). We developed a pharmacogenetic dose algorithm for warfarin treatment that uses genotypes from two genes (VKORC1 and CYP2C9) and clinical variables to predict therapeutic maintenance doses in Chinese patients with venous thromboembolism. The validity of the dosing algorithm was confirmed in a cohort of venous thromboembolism patients on warfarin therapy.
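The structure of such a pharmacogenetic dosing algorithm, a linear regression on genotype and clinical covariates, can be sketched as below. All coefficients are hypothetical placeholders chosen only to show the direction of each effect reported above (dose falls with variant alleles and age, rises with weight); they are not the fitted values from this study.

```python
def predict_warfarin_dose(age, weight_kg, vkorc1_variant_alleles, cyp2c9_variant_alleles):
    """Sketch of a pharmacogenetic maintenance-dose model of the kind
    described above.  Coefficients are HYPOTHETICAL placeholders."""
    dose = (4.0                                   # baseline mg/day (hypothetical)
            - 0.9 * vkorc1_variant_alleles        # VKORC1 variant alleles (0/1/2)
            - 0.7 * cyp2c9_variant_alleles        # CYP2C9 variant alleles (0/1/2)
            - 0.01 * (age - 50)                   # dose decreases with age
            + 0.01 * (weight_kg - 70))            # dose increases with weight
    return max(dose, 0.5)                         # floor at a minimal dose

d_wildtype = predict_warfarin_dose(50, 70, 0, 0)   # baseline patient
d_sensitive = predict_warfarin_dose(70, 60, 2, 1)  # older, lighter, variant carrier
```

A model of this form is validated exactly as in the abstract: predict a dose for each patient in a held-out cohort and count the fraction whose prediction falls within ±1 mg/day of the observed stable dose.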

  14. Adaptive, dose-finding phase 2 trial evaluating the safety and efficacy of ABT-089 in mild to moderate Alzheimer disease.

    PubMed

    Lenz, Robert A; Pritchett, Yili L; Berry, Scott M; Llano, Daniel A; Han, Shu; Berry, Donald A; Sadowsky, Carl H; Abi-Saab, Walid M; Saltarelli, Mario D

    2015-01-01

ABT-089, an α4β2 neuronal nicotinic receptor partial agonist, was evaluated for efficacy and safety in mild to moderate Alzheimer disease patients receiving stable doses of acetylcholinesterase inhibitors. This phase 2 double-blind, placebo-controlled, proof-of-concept, and dose-finding study adaptively randomized patients to receive ABT-089 (5, 10, 15, 20, 30, or 35 mg once daily) or placebo for 12 weeks. The primary efficacy endpoint was the Alzheimer's Disease Assessment Scale, cognition subscale (ADAS-Cog) total score. A Bayesian response-adaptive randomization algorithm dynamically assigned allocation probabilities based on interim ADAS-Cog total scores. A normal dynamic linear model for dose-response relationships and a longitudinal model for predicting the final ADAS-Cog score were employed in the algorithm. Stopping criteria for futility or success were defined. The futility stopping criterion was met, terminating the study with 337 patients randomized. No dose-response relationship was observed, and no dose demonstrated statistically significant improvement over placebo on the ADAS-Cog or any secondary endpoint. ABT-089 was well tolerated at all dose levels. When administered as adjunctive therapy to acetylcholinesterase inhibitors, ABT-089 was not efficacious in mild to moderate Alzheimer disease. The adaptive study design enabled the examination of a broad dose range, enabled rapid determination of futility, and reduced patient exposure to nonefficacious doses of the investigational compound.
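The core of response-adaptive randomization, shifting allocation probabilities toward arms that look best at interim analysis, can be sketched with a simple Thompson-sampling-style rule. This is a generic illustration with made-up interim data, not the trial's proprietary dynamic linear model.

```python
import numpy as np

rng = np.random.default_rng(1)

def allocation_probs(means, sds, ns, n_draws=20000):
    """Response-adaptive allocation sketch.

    Approximates each arm's posterior mean outcome (higher = better) as
    normal with standard error sd/sqrt(n), then allocates in proportion
    to the posterior probability that each arm is best.
    """
    means, sds, ns = map(np.asarray, (means, sds, ns))
    draws = rng.normal(means, sds / np.sqrt(ns), size=(n_draws, len(means)))
    best_counts = np.bincount(draws.argmax(axis=1), minlength=len(means))
    return best_counts / n_draws

# Hypothetical interim data: arm 2 shows the largest mean improvement,
# so it captures most of the future allocation probability
probs = allocation_probs(means=[1.0, 1.5, 3.0], sds=[4.0, 4.0, 4.0], ns=[30, 30, 30])
```

In a real trial this step is combined with a dose-response model (so neighboring doses share information) and with pre-specified futility/success stopping rules, as the abstract describes.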

  15. PIVET rFSH dosing algorithms for individualized controlled ovarian stimulation enables optimized pregnancy productivity rates and avoidance of ovarian hyperstimulation syndrome.

    PubMed

    Yovich, John L; Alsbjerg, Birgit; Conceicao, Jason L; Hinchliffe, Peter M; Keane, Kevin N

    2016-01-01

The first PIVET algorithm for individualized recombinant follicle stimulating hormone (rFSH) dosing in in vitro fertilization, reported in 2012, was based on age and antral follicle count grading with adjustments for anti-Müllerian hormone level, body mass index, day-2 FSH, and smoking history. In 2007, it was enabled by the introduction of a metered rFSH pen allowing small dosage increments of ~8.3 IU per click. In 2011, a second rFSH pen was introduced allowing more precise dosages of 12.5 IU per click, and both pens with their individual algorithms have been applied continuously at our clinic. The objective of this observational study was to validate the PIVET algorithms pertaining to the two rFSH pens, with the aim of collecting ≤15 oocytes and minimizing the risk of ovarian hyperstimulation syndrome. The data set included 2,822 in vitro fertilization stimulations over a 6-year period until April 2014, applying either of the two individualized dosing algorithms and corresponding pens. The main outcome measures were the mean number of oocytes retrieved and the resultant embryos designated for transfer or cryopreservation, which permitted calculation of oocyte and embryo utilization rates. Ensuing pregnancies were tracked until live birth, and live birth productivity rates embracing fresh and frozen transfers were calculated. Overall, the results showed that the mean oocyte number was 10.0 for all women <40 years, with 24% requiring rFSH dosages <150 IU. Applying both specific algorithms in our clinic meant that the starting dose was not altered for 79.1% of patients and for 30.1% of those receiving the very lowest rFSH dosages (≤75 IU). Only 0.3% of patients were diagnosed with severe ovarian hyperstimulation syndrome, all deemed avoidable due to definable breaches of the protocols. The live birth productivity rates exceeded 50% for women <35 years, and the rate was 33.2% for the group aged 35-39 years.
Routine use of both algorithms led to only 11.6% of women generating >15 oocytes, significantly lower than recently published data applying conventional dosages (38.2%; P<0.0001). When comparing the two specific algorithms to each other, the outcomes were largely comparable for pregnancy, live birth, and miscarriage rates. There were significant differences in the number of oocytes retrieved, but the mean for both algorithms remained well below 15 oocytes. Consequently, application of both algorithms in our in vitro fertilization clinic allows the use of both rFSH products with very similar results, and they can be considered validated on the basis of effectiveness and safety, clearly avoiding ovarian hyperstimulation syndrome.
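A small practical detail from the abstract above is that each pen can only deliver doses in multiples of its click increment (~8.3 IU vs 12.5 IU per click), so any algorithm-computed dose must be quantized. A minimal sketch of that final rounding step (the function name and interface are illustrative, not from the PIVET system):

```python
def pen_dose(target_iu: float, click_iu: float) -> float:
    """Round an algorithm-computed rFSH dose to the nearest dose the pen
    can actually deliver, given its per-click increment in IU."""
    clicks = round(target_iu / click_iu)
    return clicks * click_iu

# A 150 IU target maps to slightly different deliverable doses per pen
dose_pen1 = pen_dose(150.0, 8.3)    # 18 clicks -> 149.4 IU
dose_pen2 = pen_dose(150.0, 12.5)   # 12 clicks -> 150.0 IU
```

The finer the click increment, the closer the delivered dose tracks the individualized target, which is the rationale the abstract gives for the metered pens.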

  16. Poster — Thur Eve — 76: Dosimetric Comparison of Pinnacle and iPlan Algorithms with an Anthropomorphic Lung Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, P.; Tambasco, M.; LaFontaine, R.

    2014-08-15

Our goal is to compare the dosimetric accuracy of the Pinnacle-3 9.2 Collapsed Cone Convolution Superposition (CCCS) and the iPlan 4.1 Monte Carlo (MC) and Pencil Beam (PB) algorithms in an anthropomorphic lung phantom using measurement as the gold standard. Ion chamber measurements were taken for 6, 10, and 18 MV beams in a CIRS E2E SBRT Anthropomorphic Lung Phantom, which mimics lung, spine, ribs, and tissue. The plan implemented six beams with a 5×5 cm{sup 2} field size, delivering a total dose of 48 Gy. Data from the planning systems were computed at the treatment isocenter in the left lung and at two off-axis points, the spinal cord and the right lung. The measurements were taken using a pinpoint chamber. The best agreement between the algorithm data and our measurements occurs at the treatment isocenter. For the 6, 10, and 18 MV beams, the iPlan 4.1 MC software performs the best, with 0.3%, 0.2%, and 4.2% absolute percent difference from measurement, respectively. Differences between our measurements and algorithm data are much greater for the off-axis points. The best agreement seen for the right lung and spinal cord is 11.4% absolute percent difference, with 6 MV iPlan 4.1 PB and 18 MV iPlan 4.1 MC, respectively. As energy increases, the absolute percent difference from measured data increases, up to 54.8% for the 18 MV CCCS algorithm. This study suggests that iPlan 4.1 MC computes peripheral dose and target dose in the lung more accurately than the iPlan 4.1 PB and Pinnacle CCCS algorithms.

  17. Poster - Thur Eve - 68: Evaluation and analytical comparison of different 2D and 3D treatment planning systems using dosimetry in anthropomorphic phantom.

    PubMed

    Khosravi, H R; Nodehi, Mr Golrokh; Asnaashari, Kh; Mahdavi, S R; Shirazi, A R; Gholami, S

    2012-07-01

The aim of this study was to evaluate and analytically compare the different calculation algorithms applied in our country's radiotherapy centers, based on the methodology developed by the IAEA for treatment planning system (TPS) commissioning (IAEA TEC-DOC 1583). A thorax anthropomorphic phantom (002LFC, CIRS Inc.) was used to perform 7 tests that simulate the whole chain of external beam TPS planning. The doses were measured with ion chambers, and the deviation between measured and TPS-calculated dose was reported. This methodology, which employs the same phantom and the same setup test cases, was tested in 4 different hospitals using 5 different algorithms/inhomogeneity correction methods implemented in different TPSs. The algorithms in this study were divided into two groups: correction-based and model-based algorithms. A total of 84 clinical test case datasets for different energies and calculation algorithms were produced; the differences at inhomogeneity points of low density (lung) and high density (bone) decreased meaningfully with more advanced algorithms. The number of deviations outside the agreement criteria increased with beam energy and decreased with the advancement of the TPS calculation algorithm. Large deviations were seen in some correction-based algorithms, so sophisticated algorithms would be preferred in clinical practice, especially for calculation in inhomogeneous media. Use of model-based algorithms with lateral transport calculation is recommended. Some systematic errors revealed during this study show the necessity of performing periodic audits of TPSs in radiotherapy centers. © 2012 American Association of Physicists in Medicine.

  18. Cone-Beam Computed Tomography for Image-Guided Radiation Therapy of Prostate Cancer

    DTIC Science & Technology

    2008-01-01

for exact volumetric image reconstruction. As a consequence, images reconstructed by approximate algorithms, mostly based on the Feldkamp algorithm...patient dose from CBCT. Reverse helical CBCT has been developed for exact reconstruction of volumetric images, region-of-interest (ROI) reconstruction...algorithm with a priori information in few-view CBCT for IGRT. We expect the proposed algorithm can reduce the number of projections needed for volumetric

  19. A simplified approach to characterizing a kilovoltage source spectrum for accurate dose computation.

    PubMed

    Poirier, Yannick; Kouznetsov, Alexei; Tambasco, Mauro

    2012-06-01

    To investigate and validate the clinical feasibility of using half-value layer (HVL) and peak tube potential (kVp) for characterizing a kilovoltage (kV) source spectrum for the purpose of computing kV x-ray dose accrued from imaging procedures. To use this approach to characterize a Varian® On-Board Imager® (OBI) source and perform experimental validation of a novel in-house hybrid dose computation algorithm for kV x-rays. We characterized the spectrum of an imaging kV x-ray source using the HVL and the kVp as the sole beam quality identifiers using third-party freeware Spektr to generate the spectra. We studied the sensitivity of our dose computation algorithm to uncertainties in the beam's HVL and kVp by systematically varying these spectral parameters. To validate our approach experimentally, we characterized the spectrum of a Varian® OBI system by measuring the HVL using a Farmer-type Capintec ion chamber (0.06 cc) in air and compared dose calculations using our computationally validated in-house kV dose calculation code to measured percent depth-dose and transverse dose profiles for 80, 100, and 125 kVp open beams in a homogeneous phantom and a heterogeneous phantom comprising tissue, lung, and bone equivalent materials. The sensitivity analysis of the beam quality parameters (i.e., HVL, kVp, and field size) on dose computation accuracy shows that typical measurement uncertainties in the HVL and kVp (±0.2 mm Al and ±2 kVp, respectively) source characterization parameters lead to dose computation errors of less than 2%. Furthermore, for an open beam with no added filtration, HVL variations affect dose computation accuracy by less than 1% for a 125 kVp beam when field size is varied from 5 × 5 cm(2) to 40 × 40 cm(2). The central axis depth dose calculations and experimental measurements for the 80, 100, and 125 kVp energies agreed within 2% for the homogeneous and heterogeneous block phantoms, and agreement for the transverse dose profiles was within 6%. 
The HVL and kVp are sufficient for characterizing a kV x-ray source spectrum for accurate dose computation. As these parameters can be easily and accurately measured, they provide for a clinically feasible approach to characterizing a kV energy spectrum to be used for patient specific x-ray dose computations. Furthermore, these results provide experimental validation of our novel hybrid dose computation algorithm. © 2012 American Association of Physicists in Medicine.
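    The HVL determination step lends itself to a short sketch. Assuming attenuation is approximately exponential over each measurement interval, the HVL can be read off transmission-versus-filtration data by log-linear interpolation; the measurements below are invented, not the paper's OBI data.

```python
import numpy as np

# Hypothetical narrow-beam transmission measurements: ion chamber reading
# (normalized to the open beam) versus added Al filtration.
thickness_mm = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
transmission = np.array([1.00, 0.70, 0.52, 0.40, 0.31])

def hvl_from_transmission(t_mm, trans):
    """Estimate the half-value layer (thickness at 50% transmission)
    by interpolating linearly in log-transmission."""
    log_t = np.log(trans)  # attenuation is approximately exponential
    # np.interp needs increasing x; log-transmission falls with thickness,
    # so reverse both arrays before interpolating.
    return float(np.interp(np.log(0.5), log_t[::-1], t_mm[::-1]))

hvl = hvl_from_transmission(thickness_mm, transmission)
print(f"HVL ~ {hvl:.2f} mm Al")
```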

  20. SU-D-12A-06: A Comprehensive Parameter Analysis for Low Dose Cone-Beam CT Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, W; Southern Medical University, Guangzhou; Yan, H

Purpose: In compressive sensing based iterative reconstruction (IR) methods for low dose cone-beam CT (CBCT), there is always a parameter that controls the weight of the regularization relative to the data fidelity. A clear understanding of the relationship between image quality and parameter values is important. The purpose of this study is to investigate this subject based on experimental data and a representative advanced IR algorithm using Tight-frame (TF) regularization. Methods: Three data sets of a Catphan phantom acquired at low, regular and high dose levels are used. For each test, 90 projections covering a 200-degree scan range are used for reconstruction. Three regions-of-interest (ROIs) of different contrasts are used to calculate contrast-to-noise ratios (CNR) for contrast evaluation. A single point structure is used to measure the modulation transfer function (MTF) for spatial-resolution evaluation. Finally, we analyze CNRs and MTFs to study the relationship between image quality and parameter selections. Results: It was found that: 1) there is no universal optimal parameter; the optimal parameter value depends on the specific task and dose level. 2) There is a clear trade-off between CNR and resolution; the parameter for the best CNR is always smaller than that for the best resolution. 3) Optimal parameters are also dose-specific: data acquired under a high dose protocol require less regularization, yielding smaller optimal parameter values. 4) Compared with conventional FDK images, TF-based CBCT images are better under optimally selected parameters, and the advantages are more obvious for low dose data. Conclusion: We have investigated the relationship between image quality and parameter values in the TF-based IR algorithm. Preliminary results indicate optimal parameters are specific to both the task type and dose level, providing guidance for selecting parameters in advanced IR algorithms. 
This work is supported in part by NIH (1R01CA154747-01).
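    The fidelity-regularization trade-off described above can be illustrated with a toy example. The sketch uses a simple quadratic (Tikhonov) smoothing penalty rather than the paper's tight-frame regularization, and synthetic 1-D data, but shows the same qualitative behavior: a larger regularization weight lowers noise while degrading edge sharpness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "phantom": a step edge plus noise standing in for a low-dose scan.
n = 200
truth = np.where(np.arange(n) < n // 2, 1.0, 2.0)
noisy = truth + rng.normal(0.0, 0.2, size=n)

def smooth(y, lam):
    """Minimize ||x - y||^2 + lam * ||D x||^2 with D = first differences.
    Closed form: solve (I + lam * D^T D) x = y."""
    D = np.diff(np.eye(len(y)), axis=0)
    return np.linalg.solve(np.eye(len(y)) + lam * D.T @ D, y)

for lam in (0.1, 1.0, 10.0):
    x = smooth(noisy, lam)
    noise = np.std(x[:60])                     # residual noise, flat region
    edge = np.max(np.abs(np.diff(x[90:110])))  # sharpness across the step
    print(f"lam={lam:5.1f}  noise={noise:.3f}  edge sharpness={edge:.3f}")
```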

  1. Dosing algorithm for warfarin using CYP2C9 and VKORC1 genotyping from a multi-ethnic population: comparison with other equations.

    PubMed

    Wu, Alan H B; Wang, Ping; Smith, Andrew; Haller, Christine; Drake, Katherine; Linder, Mark; Valdes, Roland

    2008-02-01

Polymorphisms in the genes for cytochrome (CYP)2C9 and the vitamin K epoxide reductase complex subunit 1 (VKORC1) affect the pharmacokinetics and pharmacodynamics of warfarin. We developed and validated a warfarin-dosing algorithm for a multi-ethnic population that predicts the best dose for stable anticoagulation, and compared its performance against other regression equations. We determined the allele and haplotype frequencies of the genes for CYP2C9 and VKORC1 in 167 Caucasian, African-American, Asian and Hispanic patients on warfarin. On a subset where complete data were available (n=92), we developed a dosing equation that predicts the actual dose needed to maintain target anticoagulation using demographic variables and genotypes. This regression was validated against an independent group of subjects. We also applied our data to five other published warfarin-dosing equations. The allele frequency for CYP2C9*2 and *3 and the A allele for VKORC1 3673 was similar to previously published reports. For Caucasians and Asians, VKORC1 SNPs were in Hardy-Weinberg equilibrium. Some VKORC1 SNPs among the African-American population and one SNP among Hispanics were not in equilibrium. The linear regression of predicted versus actual warfarin dose produced r-values of 0.71 for the training set and 0.67 for the validation set. The regression coefficient improved (to r=0.78 and 0.75, respectively) when rare genotypes were eliminated or when the 7566 VKORC1 genotype was added to the model. All of the regression models tested produced a similar degree of correlation. The exclusion of rare genotypes that are more associated with certain ethnicities improved the model. Minor improvements in algorithms can be observed with the inclusion of ethnicity and more CYP2C9 and VKORC1 SNPs as variables. Major improvements will likely require the identification of new gene associations with warfarin dosing.
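    The record does not reproduce the fitted equation, but genotype-guided dosing regressions of this kind generally take a linear form: an intercept, demographic terms, and additive genotype offsets. The sketch below is purely illustrative; every coefficient and offset is hypothetical, not a value from this study.

```python
# Illustrative genotype-guided warfarin dose equation. The coefficients
# below are hypothetical placeholders, NOT fitted values from this study.
CYP2C9_EFFECT = {"*1/*1": 0.0, "*1/*2": -0.6, "*1/*3": -1.1,
                 "*2/*2": -1.3, "*2/*3": -1.7, "*3/*3": -2.3}
VKORC1_3673_EFFECT = {"GG": 0.0, "GA": -1.0, "AA": -2.0}

def predicted_daily_dose_mg(age_yr, weight_kg, cyp2c9, vkorc1):
    """Linear model: intercept + demographic terms + genotype offsets."""
    dose = 6.0 - 0.03 * age_yr + 0.02 * weight_kg
    dose += CYP2C9_EFFECT[cyp2c9] + VKORC1_3673_EFFECT[vkorc1]
    return max(dose, 0.5)  # floor at a minimal plausible dose

print(predicted_daily_dose_mg(65, 80, "*1/*1", "GG"))  # wild-type patient
print(predicted_daily_dose_mg(65, 80, "*1/*3", "AA"))  # sensitive responder
```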

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosarge, Christina L., E-mail: cbosarge@umail.iu.edu; Ewing, Marvene M.; DesRosiers, Colleen M.

To demonstrate the dosimetric advantages and disadvantages of standard anteroposterior-posteroanterior (S-AP/PA(AAA)), inverse-planned AP/PA (IP-AP/PA) and volumetric-modulated arc therapy (VMAT) in the treatment of children undergoing whole-lung irradiation. Each technique was evaluated by means of target coverage and normal tissue sparing, including data regarding low doses. A historical approach with and without tissue heterogeneity corrections is also demonstrated. Computed tomography (CT) scans of 10 children scanned from the neck to the reproductive organs were used. For each scan, 6 plans were created: (1) S-AP/PA(AAA) using the anisotropic analytical algorithm (AAA), (2) IP-AP/PA, (3) VMAT, (4) S-AP/PA(NONE) without heterogeneity corrections, (5) S-AP/PA(PB) using the Pencil-Beam algorithm and enforcing monitor units from technique 4, and (6) S-AP/PA(AAA[FM]) using AAA and forcing fixed monitor units. The first 3 plans compare modern methods and were evaluated based on target coverage and normal tissue sparing. Body maximum and lower body doses (50% and 30%) were also analyzed. Plans 4 to 6 provide a historic view on the progression of heterogeneity algorithms and elucidate what was actually delivered in the past. Averages of each comparison parameter were calculated for all techniques. The S-AP/PA(AAA) technique resulted in superior target coverage but had the highest maximum dose to every normal tissue structure. The IP-AP/PA technique provided the lowest doses to the esophagus, stomach, and lower body. VMAT excelled at body maximum dose and maximum doses to the heart, spine, and spleen, but resulted in the highest dose in the 30% body range. It was, however, superior to the S-AP/PA(AAA) approach in the 50% range. Each approach thus has associated strengths and weaknesses. Techniques may be selected on a case-by-case basis and by physician preference of target coverage vs normal tissue sparing.

  3. Population Pharmacokinetics of Busulfan in Pediatric and Young Adult Patients Undergoing Hematopoietic Cell Transplant: A Model-Based Dosing Algorithm for Personalized Therapy and Implementation into Routine Clinical Use

    PubMed Central

    Long-Boyle, Janel; Savic, Rada; Yan, Shirley; Bartelink, Imke; Musick, Lisa; French, Deborah; Law, Jason; Horn, Biljana; Cowan, Morton J.; Dvorak, Christopher C.

    2014-01-01

Background: Population pharmacokinetic (PK) studies of busulfan in children have shown that individualized model-based algorithms provide improved targeted busulfan therapy when compared to conventional dosing. The adoption of population PK models into routine clinical practice has been hampered by the tendency of pharmacologists to develop complex models too impractical for clinicians to use. The authors aimed to develop a population PK model for busulfan in children that can reliably achieve therapeutic exposure (concentration-at-steady-state, Css) and implement a simple, model-based tool for the initial dosing of busulfan in children undergoing HCT. Patients and Methods: Model development was conducted using retrospective data available in 90 pediatric and young adult patients who had undergone HCT with busulfan conditioning. Busulfan drug levels and potential covariates influencing drug exposure were analyzed using the non-linear mixed effects modeling software, NONMEM. The final population PK model was implemented into a clinician-friendly, Microsoft Excel-based tool and used to recommend initial doses of busulfan in a group of 21 pediatric patients prospectively dosed based on the population PK model. Results: Modeling of busulfan time-concentration data indicates busulfan CL displays non-linearity in children, decreasing up to approximately 20% between the concentrations of 250–2000 ng/mL. Important patient-specific covariates found to significantly impact busulfan CL were actual body weight and age. The percentage of individuals achieving a therapeutic Css was significantly higher in subjects receiving initial doses based on the population PK model (81%) versus historical controls dosed on conventional guidelines (52%) (p = 0.02). Conclusion: When compared to the conventional dosing guidelines, the model-based algorithm demonstrates significant improvement for providing targeted busulfan therapy in children and young adults. PMID:25162216
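    The model-based initial dose calculation reduces to a steady-state mass balance: at steady state the dosing rate equals the elimination rate, so dose = Css_target × CL × dosing interval. The sketch below illustrates this with a toy linear-in-weight clearance model; the parameter values are hypothetical stand-ins, not the fitted NONMEM model from this study.

```python
# Generic model-based initial dose, illustrating dosing to a target
# steady-state concentration (Css). The clearance model and its parameter
# are hypothetical, not the authors' fitted population PK model.

def busulfan_clearance_l_per_hr(weight_kg, cl_per_kg=0.2):
    """Toy linear-in-weight clearance model (L/hr)."""
    return cl_per_kg * weight_kg

def initial_dose_mg(target_css_ng_ml, weight_kg, interval_hr=6.0):
    """At steady state dose/tau = CL * Css, so dose = Css * CL * tau."""
    cl_l_hr = busulfan_clearance_l_per_hr(weight_kg)
    css_mg_l = target_css_ng_ml / 1000.0  # ng/mL -> mg/L
    return css_mg_l * cl_l_hr * interval_hr

# e.g. a 20 kg child dosed every 6 h to a 900 ng/mL target
print(f"{initial_dose_mg(target_css_ng_ml=900, weight_kg=20):.1f} mg")
```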

  4. Adaptive statistical iterative reconstruction and bismuth shielding for evaluation of dose reduction to the eye and image quality during head CT

    NASA Astrophysics Data System (ADS)

    Kim, Myeong Seong; Choi, Jiwon; Kim, Sun Young; Kweon, Dae Cheol

    2014-03-01

    There is a concern regarding the adverse effects of increasing radiation doses due to repeated computed tomography (CT) scans, especially in radiosensitive organs and portions thereof, such as the lenses of the eyes. Bismuth shielding with an adaptive statistical iterative reconstruction (ASIR) algorithm was recently introduced in our clinic as a method to reduce the absorbed radiation dose. This technique was applied to the lens of the eye during CT scans. The purpose of this study was to evaluate the reduction in the absorbed radiation dose and to determine the noise level when using bismuth shielding and the ASIR algorithm with the GE DC 750 HD 64-channel CT scanner for CT of the head of a humanoid phantom. With the use of bismuth shielding, the noise level was higher in the beam-hardening artifact areas than in the revealed artifact areas. However, with the use of ASIR, the noise level was lower than that with the use of bismuth alone; it was also lower in the artifact areas. The reduction in the radiation dose with the use of bismuth was greatest at the surface of the phantom to a limited depth. In conclusion, it is possible to reduce the radiation level and slightly decrease the bismuth-induced noise level by using a combination of ASIR as an algorithm process and bismuth as an in-plane hardware-type shielding method.

  5. SU-F-T-609: Impact of Dosimetric Variation for Prescription Dose Using Analytical Anisotropic Algorithm (AAA) in Lung SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawai, D; Takahashi, R; Kamima, T

Purpose: The actual prescription dose delivered to patients cannot be verified directly. Thus, independent dose verification and a second treatment planning system are used as a secondary check. The AAA dose calculation engine has contributed to lung SBRT. We conducted a multi-institutional study to assess the variation of prescription dose for lung SBRT when using AAA, in reference to the Acuros XB and Clarkson algorithms. Methods: Six institutes in Japan participated in this study. All SBRT treatments were planned using AAA in Eclipse and Adaptive Convolve (AC) in Pinnacle3. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, Japan), which implemented a Clarkson-based dose calculation algorithm using the CT image dataset. A retrospective analysis of lung SBRT plans (73 patients) was performed to compute the confidence limit (CL, Average±2SD) of the dose difference between the AAA and the SMU. In one of the institutes, an additional analysis was conducted to evaluate the variations between the AAA and the Acuros XB (AXB). Results: The CL for SMU shows larger systematic and random errors of 8.7±9.9% for AAA than the errors of 5.7±4.2% for AC. The variations of AAA correlated with the mean CT values in the voxels of the PTV (correlation coefficient: −0.7). The comparison of AXB vs. AAA shows smaller systematic and random errors of −0.7±1.7%. The correlation between the dose variations for AXB and the mean CT values in the PTV was weak (0.4). However, there were several plans with more than the 2% deviation of AAPM TG-114 (maximum: −3.3%). Conclusion: In comparison with AC, the prescription dose calculated by AAA may be more variable in lung SBRT patients. Even the AXB comparison shows unexpected variation. Care should be taken in the use of AAA for lung SBRT. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  6. SU-E-T-329: Dosimetric Impact of Implementing Metal Artifact Reduction Methods and Metal Energy Deposition Kernels for Photon Dose Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, J; Followill, D; Howell, R

    2015-06-15

Purpose: To investigate two strategies for reducing dose calculation errors near metal implants: use of CT metal artifact reduction methods and implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) method. Methods: Radiochromic film was used to measure the dose upstream and downstream of titanium and Cerrobend implants. To assess the dosimetric impact of metal artifact reduction methods, dose calculations were performed using baseline, uncorrected images and three metal artifact reduction methods: Philips O-MAR, GE's monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI imaging with metal artifact reduction software applied (MARs). To assess the impact of metal kernels, titanium and silver kernels were implemented into a commercial collapsed cone C/S algorithm. Results: The CT artifact reduction methods were more successful for titanium than for Cerrobend. Interestingly, for beams traversing the metal implant, we found that errors in the dimensions of the metal in the CT images were more important for dose calculation accuracy than reduction of imaging artifacts. The MARs algorithm caused a distortion in the shape of the titanium implant that substantially worsened the calculation accuracy. In comparison to water kernel dose calculations, metal kernels resulted in better modeling of the increased backscatter dose at the upstream interface but decreased accuracy directly downstream of the metal. We also found that the success of metal kernels was dependent on dose grid size, with smaller calculation voxels giving better accuracy. Conclusion: Our study yielded mixed results, with neither the metal artifact reduction methods nor the metal kernels being globally effective at improving dose calculation accuracy. However, some successes were observed. The MARs algorithm decreased errors downstream of Cerrobend by a factor of two, and metal kernels resulted in more accurate backscatter dose upstream of metals. 
Thus, these two strategies do have the potential to improve accuracy for patients with metal implants in certain scenarios. This work was supported by Public Health Service grants CA 180803 and CA 10953 awarded by the National Cancer Institute, U.S. Department of Health and Human Services, and in part by Mobius Medical Systems.

  7. Accurate condensed history Monte Carlo simulation of electron transport. II. Application to ion chamber response simulations.

    PubMed

    Kawrakow, I

    2000-03-01

In this report the condensed history Monte Carlo simulation of electron transport and its application to the calculation of ion chamber response is discussed. It is shown that the strong step-size dependencies and lack of convergence to the correct answer previously observed are the combined effect of the following artifacts caused by the EGS4/PRESTA implementation of the condensed history technique: dose underprediction due to PRESTA's pathlength correction and lateral correlation algorithm; dose overprediction due to the boundary crossing algorithm; and dose overprediction due to the breakdown of the fictitious cross section method for sampling distances between discrete interactions and the inaccurate evaluation of energy-dependent quantities. These artifacts are now understood quantitatively, and analytical expressions for their effects are given.

  8. Robust dynamic myocardial perfusion CT deconvolution using adaptive-weighted tensor total variation regularization

    NASA Astrophysics Data System (ADS)

    Gong, Changfei; Zeng, Dong; Bian, Zhaoying; Huang, Jing; Zhang, Xinyu; Zhang, Hua; Lu, Lijun; Feng, Qianjin; Liang, Zhengrong; Ma, Jianhua

    2016-03-01

Dynamic myocardial perfusion computed tomography (MPCT) is a promising technique for diagnosis and risk stratification of coronary artery disease by assessing the myocardial perfusion hemodynamic maps (MPHM). However, the repeated scanning of the same region potentially results in a relatively large radiation dose to patients. In this work, we present a robust MPCT deconvolution algorithm with adaptive-weighted tensor total variation regularization, termed `MPD-AwTTV', to estimate the residue function accurately in the low-dose context. More specifically, the AwTTV regularization takes into account the anisotropic edge property of the MPCT images, which mitigates the drawbacks of conventional total variation (TV) regularization. Subsequently, an effective iterative algorithm was adopted to minimize the associated objective function. Experimental results on a modified XCAT phantom demonstrated that the present MPD-AwTTV algorithm outperforms other existing deconvolution algorithms in terms of noise-induced artifact suppression, edge detail preservation and accurate MPHM estimation.
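    The deconvolution problem being solved has a simple discrete form: the tissue enhancement curve is the arterial input function (AIF) convolved with the residue function. The sketch below builds that forward model with synthetic curves and inverts it directly; the direct solve only works because the data are noiseless, and the ill-conditioning under realistic low-dose noise is precisely why regularized methods such as the paper's MPD-AwTTV are needed.

```python
import numpy as np

# Synthetic perfusion forward model: tissue(t) = dt * (AIF * R)(t),
# written as a lower-triangular Toeplitz system. All curves are invented.
dt = 1.0
t = np.arange(30) * dt
aif = np.exp(-t / 3.0)      # synthetic arterial input function
r_true = np.exp(-t / 5.0)   # synthetic residue function

# Convolution matrix: (A r)[i] = dt * sum_{j<=i} aif[i-j] * r[j]
A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(t.size)]
                   for i in range(t.size)])
tissue = A @ r_true

# Direct inversion -- adequate only for noiseless data.
r_est = np.linalg.solve(A, tissue)
print("max |r_est - r_true| =", np.max(np.abs(r_est - r_true)))
```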

  9. Thermoluminescence dosimetry applied to in vivo dose measurements for total body irradiation techniques.

    PubMed

    Duch, M A; Ginjaume, M; Chakkor, H; Ortega, X; Jornet, N; Ribas, M

    1998-06-01

    In total body irradiation (TBI) treatments in vivo dosimetry is recommended because it makes it possible to ensure the accuracy and quality control of dose delivery. The aim of this work is to set up an in vivo thermoluminescence dosimetry (TLD) system to measure the dose distribution during the TBI technique used prior to bone marrow transplant. Some technical problems due to the presence of lung shielding blocks are discussed. Irradiations were performed in the Hospital de la Santa Creu i Sant Pau by means of a Varian Clinac-1800 linear accelerator with 18 MV X-ray beams. Different TLD calibration experiments were set up to optimize in vivo dose assessment and to analyze the influence on dose measurement of shielding blocks. An algorithm to estimate midplane doses from entrance and exit doses is proposed and the estimated dose in critical organs is compared to internal dose measurements performed in an Alderson anthropomorphic phantom. The predictions of the dose algorithm, even in heterogeneous zones of the body such as the lungs, are in good agreement with the experimental results obtained with and without shielding blocks. The differences between measured and predicted values are in all cases lower than 2%. The TLD system described in this work has been proven to be appropriate for in vivo dosimetry in TBI irradiations. The described calibration experiments point out the difficulty of calibrating an in vivo dosimetry system when lung shielding blocks are used.
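    The record does not reproduce the proposed midplane algorithm, but the simplest common estimator, the arithmetic mean of the entrance and exit doses, conveys the idea; the TLD readings below are invented.

```python
def midplane_dose(entrance_gy, exit_gy):
    """Simplest midline estimator: arithmetic mean of entrance and exit
    dose. (The paper's actual algorithm is not given in this record.)"""
    return 0.5 * (entrance_gy + exit_gy)

def percent_difference(measured_gy, predicted_gy):
    return 100.0 * (measured_gy - predicted_gy) / predicted_gy

# Hypothetical TLD readings (Gy) for one TBI fraction
predicted = midplane_dose(entrance_gy=1.32, exit_gy=1.10)
print(f"midplane estimate: {predicted:.3f} Gy")
print(f"deviation from a 1.20 Gy internal measurement: "
      f"{percent_difference(1.20, predicted):+.2f}%")
```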

  10. Accuracy of radiotherapy dose calculations based on cone-beam CT: comparison of deformable registration and image correction based methods

    NASA Astrophysics Data System (ADS)

    Marchant, T. E.; Joshi, K. D.; Moore, C. J.

    2018-03-01

    Radiotherapy dose calculations based on cone-beam CT (CBCT) images can be inaccurate due to unreliable Hounsfield units (HU) in the CBCT. Deformable image registration of planning CT images to CBCT, and direct correction of CBCT image values are two methods proposed to allow heterogeneity corrected dose calculations based on CBCT. In this paper we compare the accuracy and robustness of these two approaches. CBCT images for 44 patients were used including pelvis, lung and head & neck sites. CBCT HU were corrected using a ‘shading correction’ algorithm and via deformable registration of planning CT to CBCT using either Elastix or Niftyreg. Radiotherapy dose distributions were re-calculated with heterogeneity correction based on the corrected CBCT and several relevant dose metrics for target and OAR volumes were calculated. Accuracy of CBCT based dose metrics was determined using an ‘override ratio’ method where the ratio of the dose metric to that calculated on a bulk-density assigned version of the same image is assumed to be constant for each patient, allowing comparison to the patient’s planning CT as a gold standard. Similar performance is achieved by shading corrected CBCT and both deformable registration algorithms, with mean and standard deviation of dose metric error less than 1% for all sites studied. For lung images, use of deformed CT leads to slightly larger standard deviation of dose metric error than shading corrected CBCT with more dose metric errors greater than 2% observed (7% versus 1%).
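    The 'override ratio' accuracy test can be written out explicitly. In the sketch below all dose metric values are invented; the method assumes that the ratio of a metric to its bulk-density-override counterpart is patient-specific but image-independent, so any departure of the CBCT ratio from the planning-CT ratio is attributed to CBCT HU error.

```python
def override_ratio_error(metric_cbct, metric_cbct_bulk,
                         metric_ct, metric_ct_bulk):
    """Percent error of a CBCT-based dose metric under the override-ratio
    assumption: metric / bulk-density-metric should match between CBCT
    and the planning-CT gold standard."""
    ratio_cbct = metric_cbct / metric_cbct_bulk
    ratio_ct = metric_ct / metric_ct_bulk
    return 100.0 * (ratio_cbct / ratio_ct - 1.0)

# Hypothetical PTV median-dose values (Gy) for one lung patient
err = override_ratio_error(metric_cbct=54.6, metric_cbct_bulk=55.1,
                           metric_ct=54.9, metric_ct_bulk=55.2)
print(f"dose metric error: {err:+.2f}%")
```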

  11. Effect of Genetic Variants, Especially CYP2C9 and VKORC1, on the Pharmacology of Warfarin

    PubMed Central

    Fung, Erik; Patsopoulos, Nikolaos A.; Belknap, Steven M.; O’Rourke, Daniel J.; Robb, John F.; Anderson, Jeffrey L.; Shworak, Nicholas W.; Moore, Jason H.

    2014-01-01

    The genes encoding the cytochrome P450 2C9 enzyme (CYP2C9) and vitamin K-epoxide reductase complex unit 1 (VKORC1) are major determinants of anticoagulant response to warfarin. Together with patient demographics and clinical information, they account for approximately one-half of the warfarin dose variance in individuals of European descent. Recent prospective and randomized controlled trial data support pharmacogenetic guidance with their use in warfarin dose initiation and titration. Benefits from pharmacogenetics-guided warfarin dosing have been reported to extend beyond the period of initial dosing, with supportive data indicating benefits to at least 3 months. The genetic effects of VKORC1 and CYP2C9 in African and Asian populations are concordant with those in individuals of European ancestry; however, frequency distribution of allelic variants can vary considerably between major populations. Future randomized controlled trials in multiethnic settings using population-specific dosing algorithms will allow us to further ascertain the generalizability and cost-effectiveness of pharmacogenetics-guided warfarin therapy. Additional genome-wide association studies may help us to improve and refine dosing algorithms and potentially identify novel biological pathways. PMID:23041981

  12. Energy-based dosimetry of low-energy, photon-emitting brachytherapy sources

    NASA Astrophysics Data System (ADS)

    Malin, Martha J.

    Model-based dose calculation algorithms (MBDCAs) for low-energy, photon-emitting brachytherapy sources have advanced to the point where the algorithms may be used in clinical practice. Before these algorithms can be used, a methodology must be established to verify the accuracy of the source models used by the algorithms. Additionally, the source strength metric for these algorithms must be established. This work explored the feasibility of verifying the source models used by MBDCAs by measuring the differential photon fluence emitted from the encapsulation of the source. The measured fluence could be compared to that modeled by the algorithm to validate the source model. This work examined how the differential photon fluence varied with position and angle of emission from the source, and the resolution that these measurements would require for dose computations to be accurate to within 1.5%. Both the spatial and angular resolution requirements were determined. The techniques used to determine the resolution required for measurements of the differential photon fluence were applied to determine why dose-rate constants determined using a spectroscopic technique disagreed with those computed using Monte Carlo techniques. The discrepancy between the two techniques had been previously published, but the cause of the discrepancy was not known. This work determined the impact that some of the assumptions used by the spectroscopic technique had on the accuracy of the calculation. The assumption of isotropic emission was found to cause the largest discrepancy in the spectroscopic dose-rate constant. Finally, this work improved the instrumentation used to measure the rate at which energy leaves the encapsulation of a brachytherapy source. This quantity is called emitted power (EP), and is presented as a possible source strength metric for MBDCAs. A calorimeter that measured EP was designed and built. 
The theoretical framework that the calorimeter relied upon to measure EP was established. Four clinically relevant 125I brachytherapy sources were measured with the instrument. The measured EP was compared to an air-kerma-strength-derived EP to test the accuracy of the instrument. The instrument was accurate to within 10%, with three of the four source measurements accurate to within 4%.

  13. SU-E-T-252: Developing a Pencil Beam Dose Calculation Algorithm for CyberKnife System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, B; Duke University Medical Center, Durham, NC; Liu, B

    2015-06-15

Purpose: Currently there are two dose calculation algorithms available in the CyberKnife planning system: ray-tracing and Monte Carlo, which are either not accurate or too time-consuming for the irregular fields shaped by the recently introduced MLC. The purpose of this study is to develop a fast and accurate pencil beam dose calculation algorithm which can handle irregular fields. Methods: A pencil beam dose calculation algorithm widely used in Linac systems is modified. The algorithm models both primary (short range) and scatter (long range) components with a single input parameter: TPR20/10. The TPR20/10 value was first estimated to derive an initial set of pencil beam model parameters (PBMP). The agreement between predicted and measured TPRs for all cones was evaluated using the root mean square of the difference (RMS-TPR), which was then minimized by adjusting the PBMPs. The PBMPs were further tuned to minimize the OCR RMS (RMS-OCR), focusing on the out-of-field region. Finally, an arbitrary intensity profile is optimized by minimizing the RMS-OCR difference in the in-field region. To test model validity, the PBMPs were obtained by fitting to only a subset of cones (4) and applied to all cones (12) for evaluation. Results: With RMS values normalized to dmax and all cones combined, the average RMS-TPR in the build-up and descending regions is 2.3% and 0.4%, respectively. The RMS-OCR in the in-field, penumbra and out-of-field regions is 1.5%, 7.8% and 0.6%, respectively. The average DTA in the penumbra region is 0.5 mm. There is no trend found in TPR or OCR agreement among cones or depths. Conclusion: We have developed a pencil beam algorithm for the CyberKnife system. The prediction agrees well with commissioning data, and only a subset of measurements is needed to derive the model. Further improvements are needed for the TPR build-up region and the OCR penumbra, and experimental validation on MLC-shaped irregular fields needs to be performed. 
This work was partially supported by the National Natural Science Foundation of China (61171005) and the China Scholarship Council (CSC).

  14. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebe, A.; Leveling, A.; Lu, T.

The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay gamma-quanta by the residuals in the activated structures and scoring the prompt doses of these gamma-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and showed good agreement. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment, and the results have been compared to approximate dosimetric approaches.

  15. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    NASA Astrophysics Data System (ADS)

    Grebe, A.; Leveling, A.; Lu, T.; Mokhov, N.; Pronskikh, V.

    2018-01-01

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay γ-quanta by the residuals in the activated structures and scoring the prompt doses of these γ-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and against experimental data from the CERF facility at CERN, and FermiCORD showed reasonable agreement with these. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.
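The two-stage structure of the algorithm can be illustrated with a deliberately simplified stand-in: stage 1 is reduced to a hand-written radionuclide inventory (in FermiCORD it is simulated by MARS15), and stage 2 to a point-source, vacuum, inverse-square gamma dose-rate estimate using a single energy-absorption coefficient. All numerical values and nuclides here are hypothetical, not from the paper:

```python
import math

# Stage 1 stand-in (MARS15 activation step in the real code): a
# radionuclide inventory with activity in Bq, gamma energy per decay
# in MeV, and photons per decay. Values are illustrative only.
inventory = [
    {"nuclide": "Na-22", "activity_bq": 1.0e6, "e_gamma_mev": 1.275, "yield": 1.0},
    {"nuclide": "Mn-54", "activity_bq": 5.0e5, "e_gamma_mev": 0.835, "yield": 1.0},
]

def gamma_dose_rate(inventory, r_m, mu_en_over_rho=0.003):
    """Stage 2 stand-in (gamma transport and scoring in the real code):
    point-source, inverse-square dose-rate estimate in Gy/s at distance
    r_m (meters). mu_en_over_rho (m^2/kg) is a single illustrative
    mass energy-absorption coefficient applied to all gamma lines."""
    mev_to_j = 1.602e-13
    total = 0.0
    for line in inventory:
        # Photon energy fluence rate at distance r (J / m^2 / s)
        phi_e = (line["activity_bq"] * line["yield"] * line["e_gamma_mev"]
                 * mev_to_j) / (4.0 * math.pi * r_m ** 2)
        total += phi_e * mu_en_over_rho  # Gy/s = (J/kg)/s
    return total

d1 = gamma_dose_rate(inventory, 1.0)  # dose rate at 1 m
d2 = gamma_dose_rate(inventory, 2.0)  # dose rate at 2 m
```

The separation mirrors the paper's design: once the inventory is computed, doses at arbitrary distances can be rescored without re-running the expensive activation stage.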

  16. An empirical model for calculation of the collimator contamination dose in therapeutic proton beams

    NASA Astrophysics Data System (ADS)

    Vidal, M.; De Marzi, L.; Szymanowski, H.; Guinement, L.; Nauraye, C.; Hierso, E.; Freud, N.; Ferrand, R.; François, P.; Sarrut, D.

    2016-02-01

    Collimators are used as lateral beam shaping devices in proton therapy with passive scattering beam lines. The dose contamination due to collimator scattering can be as high as 10% of the maximum dose and influences calculation of the output factor or monitor units (MU). To date, commercial treatment planning systems generally use a zero-thickness collimator approximation ignoring edge scattering in the aperture collimator and few analytical models have been proposed to take scattering effects into account, mainly limited to the inner collimator face component. The aim of this study was to characterize and model aperture contamination by means of a fast and accurate analytical model. The entrance face collimator scatter distribution was modeled as a 3D secondary dose source. Predicted dose contaminations were compared to measurements and Monte Carlo simulations. Measurements were performed on two different proton beam lines (a fixed horizontal beam line and a gantry beam line) with divergent apertures and for several field sizes and energies. Discrepancies between analytical algorithm dose prediction and measurements were decreased from 10% to 2% using the proposed model. Gamma-index (2%/1 mm) was respected for more than 90% of pixels. The proposed analytical algorithm increases the accuracy of analytical dose calculations with reasonable computation times.
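The idea of adding an aperture-scatter term as a secondary dose source on top of a zero-thickness-collimator primary can be sketched in 1D as below. The tanh penumbra, the Gaussian edge-scatter shape, and its amplitude/width are illustrative assumptions, not the paper's fitted 3D model:

```python
import numpy as np

def primary_profile(x, half_width=5.0, penumbra=0.3):
    """Zero-thickness-collimator primary dose: a flat field with a
    smooth penumbra (tanh as a stand-in for an error-function edge).
    x and widths are in cm; dose normalized to 1 on the axis."""
    return 0.5 * (np.tanh((x + half_width) / penumbra)
                  - np.tanh((x - half_width) / penumbra))

def edge_scatter(x, half_width=5.0, amp=0.08, sigma=1.5):
    """Secondary source: Gaussian contamination emitted from each
    aperture edge, in the spirit of the entrance-face scatter model.
    Amplitude and width are illustrative, not fitted values."""
    return (amp * np.exp(-0.5 * ((x - half_width) / sigma) ** 2)
            + amp * np.exp(-0.5 * ((x + half_width) / sigma) ** 2))

x = np.linspace(-10, 10, 401)
d_primary = primary_profile(x)
d_total = d_primary + edge_scatter(x)   # corrected analytical profile
extra = (d_total - d_primary).max()     # peak contamination near the edge
```

With an ~8% edge amplitude this toy profile reproduces the order of magnitude quoted above: contamination of up to about 10% of the maximum dose, concentrated near the field edge.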

  17. Sub-second pencil beam dose calculation on GPU for adaptive proton therapy.

    PubMed

    da Silva, Joakim; Ansorge, Richard; Jena, Rajesh

    2015-06-21

    Although proton therapy delivered using scanned pencil beams has the potential to produce better dose conformity than conventional radiotherapy, the created dose distributions are more sensitive to anatomical changes and patient motion. Therefore, the introduction of adaptive treatment techniques where the dose can be monitored as it is being delivered is highly desirable. We present a GPU-based dose calculation engine relying on the widely used pencil beam algorithm, developed for on-line dose calculation. The calculation engine was implemented from scratch, with each step of the algorithm parallelized and adapted to run efficiently on the GPU architecture. To ensure fast calculation, it employs several application-specific modifications and simplifications, and a fast scatter-based implementation of the computationally expensive kernel superposition step. The calculation time for a skull base treatment plan using two beam directions was 0.22 s on an Nvidia Tesla K40 GPU, whereas a test case of a cubic target in water from the literature took 0.14 s to calculate. The accuracy of the patient dose distributions was assessed by calculating the γ-index with respect to a gold standard Monte Carlo simulation. The passing rates were 99.2% and 96.7%, respectively, for the 3%/3 mm and 2%/2 mm criteria, matching those produced by a clinical treatment planning system.
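The gamma-index comparison used above to validate the GPU engine against Monte Carlo can be sketched in 1D. This is a brute-force global gamma (dose criterion as a fraction of the reference maximum, distance criterion in the units of x), a simplified stand-in for the clinical 3D 3%/3 mm test:

```python
import numpy as np

def gamma_index(dose_eval, dose_ref, x, dose_crit=0.03, dist_crit=3.0):
    """1D global gamma index: for each evaluated point, the minimum over
    all reference points of the combined dose-difference /
    distance-to-agreement metric. gamma <= 1 means the point passes."""
    d_norm = dose_crit * dose_ref.max()
    gammas = np.empty_like(dose_eval)
    for i, (xi, de) in enumerate(zip(x, dose_eval)):
        dd = (dose_ref - de) / d_norm   # dose-difference term
        dx = (x - xi) / dist_crit       # distance-to-agreement term
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas

# Reference: a Gaussian "depth dose"; evaluated: the same with a 1%
# global dose offset, which should pass a 3%/3 mm test everywhere.
x = np.linspace(0, 100, 101)          # mm
ref = np.exp(-((x - 50) / 15.0) ** 2)
eval_dose = ref * 1.01
g = gamma_index(eval_dose, ref, x)
passing = np.mean(g <= 1.0)           # fraction of points passing
```

Production implementations restrict the search to a neighborhood of each point and interpolate the reference grid; the brute-force form above is only meant to make the metric concrete.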

  18. SU-E-T-280: Reconstructed Rectal Wall Dose Map-Based Verification of Rectal Dose Sparing Effect According to Rectum Definition Methods and Dose Perturbation by Air Cavity in Endo-Rectal Balloon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, J; Research Institute of Biomedical Engineering, The Catholic University of Korea, Seoul; Park, H

    Purpose: Dosimetric effects and discrepancies according to the rectum definition method, and dose perturbation by the air cavity in an endo-rectal balloon (ERB), were verified using rectal-wall (Rwall) dose maps, considering systematic errors in dose optimization and calculation accuracy in intensity-modulated radiation treatment (IMRT) for prostate cancer patients. Methods: With an inflated ERB of average diameter 4.5 cm and air volume 100 cc in place, Rwall doses were predicted by the pencil-beam convolution (PBC), anisotropic analytic algorithm (AAA), and AcurosXB (AXB) with material assignment function. The errors of dose optimization and calculation introduced by separating the air cavity from the whole rectum (Rwhole) were verified against measured rectal doses. The Rwall doses affected by the dose perturbation of the air cavity were evaluated using a featured rectal phantom allowing insertion of rolled-up gafchromic films and glass rod detectors placed along the rectum perimeter. Inner and outer Rwall doses were verified with reconstructed predicted rectal wall dose maps. Dose errors and their extent at each dose level were evaluated with estimated rectal toxicity. Results: While AXB showed an insignificant difference in target dose coverage, Rwall doses were underestimated by up to 20% when optimizing on the Rwhole rather than the Rwall, at all dose ranges except the maximum dose. When dose optimization for the Rwall was applied, the Rwall doses showed errors of less than 3% between dose calculation algorithms, except for an overestimation of the maximum rectal dose of up to 5% in PBC. Dose optimization for the Rwhole caused dose differences in the Rwall, especially at intermediate doses. Conclusion: Dose optimization for the Rwall is suggested for more accurate prediction of rectal wall dose and of the dose perturbation effect caused by the air cavity in IMRT for prostate cancer.
This research was supported by the Leading Foreign Research Institute Recruitment Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (MSIP) (Grant No. 200900420).

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latifi, Kujtim, E-mail: Kujtim.Latifi@Moffitt.org; Oliver, Jasmine; Department of Physics, University of South Florida, Tampa, Florida

    Purpose: Pencil beam (PB) and collapsed cone convolution (CCC) dose calculation algorithms differ significantly when used in the thorax. However, such differences have seldom been previously directly correlated with outcomes of lung stereotactic ablative body radiation (SABR). Methods and Materials: Data for 201 non-small cell lung cancer patients treated with SABR were analyzed retrospectively. All patients were treated with 50 Gy in 5 fractions of 10 Gy each. The radiation prescription mandated that 95% of the planning target volume (PTV) receive the prescribed dose. One hundred sixteen patients were planned with BrainLab treatment planning software (TPS) with the PB algorithm and treated on a Novalis unit. The other 85 were planned on the Pinnacle TPS with the CCC algorithm and treated on a Varian linac. Treatment planning objectives were numerically identical for both groups. The median follow-up times were 24 and 17 months for the PB and CCC groups, respectively. The primary endpoint was local/marginal control of the irradiated lesion. Gray's competing risk method was used to determine the statistical differences in local/marginal control rates between the PB and CCC groups. Results: Twenty-five patients planned with the PB algorithm and 4 patients planned with the CCC algorithm to the same nominal doses experienced local recurrence. There was a statistically significant difference in recurrence rates between the PB and CCC groups (hazard ratio 3.4 [95% confidence interval: 1.18-9.83], Gray's test P=.019). The differences (Δ) between the 2 algorithms for target coverage were as follows: ΔD99(GITV) = 7.4 Gy, ΔD99(PTV) = 10.4 Gy, ΔV90(GITV) = 13.7%, ΔV90(PTV) = 37.6%, ΔD95(PTV) = 9.8 Gy, and ΔD(ISO) = 3.4 Gy. GITV = gross internal tumor volume. Conclusions: Local control rates in patients planned to the same nominal dose with the PB and CCC algorithms were statistically significantly different.
Possible alternative explanations are described in the report, although they are not thought likely to explain the difference. We conclude that the difference is due to relative dosimetric underdosing of tumors with the PB algorithm.

  20. Commissioning dose computation models for spot scanning proton beams in water for a commercially available treatment planning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, X. R.; Poenisch, F.; Lii, M.

    2013-04-15

    Purpose: To present our method and experience in commissioning dose models in water for spot scanning proton therapy in a commercial treatment planning system (TPS). Methods: The input data required by the TPS included in-air transverse profiles and integral depth doses (IDDs). All input data were obtained from Monte Carlo (MC) simulations that had been validated by measurements. MC-generated IDDs were converted to units of Gy mm²/MU using the measured IDDs at a depth of 2 cm employing the largest commercially available parallel-plate ionization chamber. The sensitive area of the chamber was insufficient to fully encompass the entire lateral dose deposited at depth by a pencil beam (spot). To correct for the detector size, correction factors as a function of proton energy were defined and determined using MC. The fluence of individual spots was initially modeled as a single Gaussian (SG) function and later as a double Gaussian (DG) function. The DG fluence model was introduced to account for the spot fluence due to contributions of large angle scattering from the devices within the scanning nozzle, especially from the spot profile monitor. To validate the DG fluence model, we compared calculations and measurements, including doses at the center of spread out Bragg peaks (SOBPs) as a function of nominal field size, range, and SOBP width, lateral dose profiles, and depth doses for different widths of SOBP. Dose models were validated extensively with patient treatment field-specific measurements. Results: We demonstrated that the DG fluence model is necessary for predicting the field size dependence of dose distributions. With this model, the calculated doses at the center of SOBPs as a function of nominal field size, range, and SOBP width, lateral dose profiles and depth doses for rectangular target volumes agreed well with respective measured values.
With the DG fluence model for our scanning proton beam line, we successfully treated more than 500 patients from March 2010 through June 2012 with acceptable agreement between TPS calculated and measured dose distributions. However, the current dose model still has limitations in predicting field size dependence of doses at some intermediate depths of proton beams with high energies. Conclusions: We have commissioned a DG fluence model for clinical use. It is demonstrated that the DG fluence model is significantly more accurate than the SG fluence model. However, some deficiencies in modeling the low-dose envelope in the current dose algorithm still exist. Further improvements to the current dose algorithm are needed. The method presented here should be useful for commissioning pencil beam dose algorithms in new versions of TPS in the future.
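The single-Gaussian versus double-Gaussian spot fluence models discussed above can be sketched as follows. The sigmas and halo weight here are illustrative stand-ins, not the commissioned beam-line values; the point of the DG form is the broad, low-amplitude second Gaussian that carries the large-angle-scatter "halo":

```python
import numpy as np

def single_gaussian(r, sigma=4.0):
    """SG spot fluence (radially symmetric, r and sigma in mm),
    normalized so the 2D integral over the plane is 1."""
    return np.exp(-0.5 * (r / sigma) ** 2) / (2 * np.pi * sigma ** 2)

def double_gaussian(r, sigma1=4.0, sigma2=12.0, w=0.05):
    """DG spot fluence: weight (1-w) in the narrow core plus weight w
    in a broad second Gaussian representing large-angle scattering from
    the nozzle devices. Parameters are illustrative, not commissioned
    values from the paper."""
    core = np.exp(-0.5 * (r / sigma1) ** 2) / (2 * np.pi * sigma1 ** 2)
    halo = np.exp(-0.5 * (r / sigma2) ** 2) / (2 * np.pi * sigma2 ** 2)
    return (1 - w) * core + w * halo

r = np.linspace(0, 40, 401)   # radial distance from spot center, mm
sg = single_gaussian(r)
dg = double_gaussian(r)
```

Because many spots overlap in a scanned field, even a few percent of fluence in the halo accumulates with field size, which is why the SG model fails to predict the field-size dependence of central dose while the DG model captures it.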

  1. Commissioning dose computation models for spot scanning proton beams in water for a commercially available treatment planning system

    PubMed Central

    Zhu, X. R.; Poenisch, F.; Lii, M.; Sawakuchi, G. O.; Titt, U.; Bues, M.; Song, X.; Zhang, X.; Li, Y.; Ciangaru, G.; Li, H.; Taylor, M. B.; Suzuki, K.; Mohan, R.; Gillin, M. T.; Sahoo, N.

    2013-01-01

    Purpose: To present our method and experience in commissioning dose models in water for spot scanning proton therapy in a commercial treatment planning system (TPS). Methods: The input data required by the TPS included in-air transverse profiles and integral depth doses (IDDs). All input data were obtained from Monte Carlo (MC) simulations that had been validated by measurements. MC-generated IDDs were converted to units of Gy mm2/MU using the measured IDDs at a depth of 2 cm employing the largest commercially available parallel-plate ionization chamber. The sensitive area of the chamber was insufficient to fully encompass the entire lateral dose deposited at depth by a pencil beam (spot). To correct for the detector size, correction factors as a function of proton energy were defined and determined using MC. The fluence of individual spots was initially modeled as a single Gaussian (SG) function and later as a double Gaussian (DG) function. The DG fluence model was introduced to account for the spot fluence due to contributions of large angle scattering from the devices within the scanning nozzle, especially from the spot profile monitor. To validate the DG fluence model, we compared calculations and measurements, including doses at the center of spread out Bragg peaks (SOBPs) as a function of nominal field size, range, and SOBP width, lateral dose profiles, and depth doses for different widths of SOBP. Dose models were validated extensively with patient treatment field-specific measurements. Results: We demonstrated that the DG fluence model is necessary for predicting the field size dependence of dose distributions. With this model, the calculated doses at the center of SOBPs as a function of nominal field size, range, and SOBP width, lateral dose profiles and depth doses for rectangular target volumes agreed well with respective measured values. 
With the DG fluence model for our scanning proton beam line, we successfully treated more than 500 patients from March 2010 through June 2012 with acceptable agreement between TPS calculated and measured dose distributions. However, the current dose model still has limitations in predicting field size dependence of doses at some intermediate depths of proton beams with high energies. Conclusions: We have commissioned a DG fluence model for clinical use. It is demonstrated that the DG fluence model is significantly more accurate than the SG fluence model. However, some deficiencies in modeling the low-dose envelope in the current dose algorithm still exist. Further improvements to the current dose algorithm are needed. The method presented here should be useful for commissioning pencil beam dose algorithms in new versions of TPS in the future. PMID:23556893

  2. Commissioning dose computation models for spot scanning proton beams in water for a commercially available treatment planning system.

    PubMed

    Zhu, X R; Poenisch, F; Lii, M; Sawakuchi, G O; Titt, U; Bues, M; Song, X; Zhang, X; Li, Y; Ciangaru, G; Li, H; Taylor, M B; Suzuki, K; Mohan, R; Gillin, M T; Sahoo, N

    2013-04-01

    To present our method and experience in commissioning dose models in water for spot scanning proton therapy in a commercial treatment planning system (TPS). The input data required by the TPS included in-air transverse profiles and integral depth doses (IDDs). All input data were obtained from Monte Carlo (MC) simulations that had been validated by measurements. MC-generated IDDs were converted to units of Gy mm²/MU using the measured IDDs at a depth of 2 cm employing the largest commercially available parallel-plate ionization chamber. The sensitive area of the chamber was insufficient to fully encompass the entire lateral dose deposited at depth by a pencil beam (spot). To correct for the detector size, correction factors as a function of proton energy were defined and determined using MC. The fluence of individual spots was initially modeled as a single Gaussian (SG) function and later as a double Gaussian (DG) function. The DG fluence model was introduced to account for the spot fluence due to contributions of large angle scattering from the devices within the scanning nozzle, especially from the spot profile monitor. To validate the DG fluence model, we compared calculations and measurements, including doses at the center of spread out Bragg peaks (SOBPs) as a function of nominal field size, range, and SOBP width, lateral dose profiles, and depth doses for different widths of SOBP. Dose models were validated extensively with patient treatment field-specific measurements. We demonstrated that the DG fluence model is necessary for predicting the field size dependence of dose distributions. With this model, the calculated doses at the center of SOBPs as a function of nominal field size, range, and SOBP width, lateral dose profiles and depth doses for rectangular target volumes agreed well with respective measured values.
With the DG fluence model for our scanning proton beam line, we successfully treated more than 500 patients from March 2010 through June 2012 with acceptable agreement between TPS calculated and measured dose distributions. However, the current dose model still has limitations in predicting field size dependence of doses at some intermediate depths of proton beams with high energies. We have commissioned a DG fluence model for clinical use. It is demonstrated that the DG fluence model is significantly more accurate than the SG fluence model. However, some deficiencies in modeling the low-dose envelope in the current dose algorithm still exist. Further improvements to the current dose algorithm are needed. The method presented here should be useful for commissioning pencil beam dose algorithms in new versions of TPS in the future.

  3. Use of a hybrid iterative reconstruction technique to reduce image noise and improve image quality in obese patients undergoing computed tomographic pulmonary angiography.

    PubMed

    Kligerman, Seth; Mehta, Dhruv; Farnadesh, Mahmmoudreza; Jeudy, Jean; Olsen, Kathryn; White, Charles

    2013-01-01

    To determine whether an iterative reconstruction (IR) technique (iDose, Philips Healthcare) can reduce image noise and improve image quality in obese patients undergoing computed tomographic pulmonary angiography (CTPA). The study was Health Insurance Portability and Accountability Act compliant and approved by our institutional review board. A total of 33 obese patients (average body mass index: 42.7) underwent CTPA studies following standard departmental protocols. The data were reconstructed with filtered back projection (FBP) and 3 iDose strengths (iDoseL1, iDoseL3, and iDoseL5) for a total of 132 studies. FBP data were collected from 33 controls (average body mass index: 22) undergoing CTPA. Regions of interest were drawn at 6 identical levels in the pulmonary artery (PA), from the main PA to a subsegmental branch, in both the control group and study groups using each algorithm. Noise and attenuation were measured at all PA levels. Three thoracic radiologists graded each study on a scale of 1 (very poor) to 5 (ideal) by 4 categories: image quality, noise, PA enhancement, and "plastic" appearance. Statistical analysis was performed using an unpaired t test, 1-way analysis of variance, and linear weighted κ. Compared with the control group, there was significantly higher noise with FBP, iDoseL1, and iDoseL3 algorithms (P<0.001) in the study group. There was no significant difference between the noise in the control group and iDoseL5 algorithm in the study group. Analysis within the study group showed a significant and progressive decrease in noise and increase in the contrast-to-noise ratio as the level of IR was increased (P<0.001). Compared with FBP, readers graded overall image quality as being higher using iDoseL1 (P=0.0018), iDoseL3 (P<0.001), and iDoseL5 (P<0.001). Compared with FBP, there was subjective improvement in image noise and PA enhancement with increasing levels of iDose. 
The use of an IR technique leads to qualitative and quantitative improvements in image noise and image quality in obese patients undergoing CTPA.
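The quantitative part of the evaluation above (noise as the standard deviation within a region of interest, plus a contrast-to-noise ratio) can be sketched as below. The specific CNR definition and the synthetic HU values are assumptions for the sketch; the paper does not spell out its exact formula:

```python
import numpy as np

def roi_stats(roi):
    """Mean attenuation (HU) and noise (sample standard deviation)
    within a region of interest, as measured in the PA ROIs above."""
    roi = np.asarray(roi, dtype=float)
    return roi.mean(), roi.std(ddof=1)

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: (mean_signal - mean_background) divided
    by the background noise. This is one common definition, used here
    as an assumption."""
    ms, _ = roi_stats(signal_roi)
    mb, sb = roi_stats(background_roi)
    return (ms - mb) / sb

# Synthetic ROIs (illustrative HU values, not study data): an enhanced
# pulmonary artery against chest-wall muscle, equal noise levels.
rng = np.random.default_rng(0)
pa = rng.normal(350.0, 25.0, 500)
muscle = rng.normal(50.0, 25.0, 500)
value = cnr(pa, muscle)
```

Under this definition, iterative reconstruction raises CNR purely by shrinking the noise term in the denominator, which matches the study's finding of increasing CNR with increasing iDose strength at unchanged enhancement.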

  4. Patient‐specific CT dosimetry calculation: a feasibility study

    PubMed Central

    Xie, Huchen; Cheng, Jason Y.; Ning, Holly; Zhuge, Ying; Miller, Robert W.

    2011-01-01

    Current estimation of radiation dose from computed tomography (CT) scans on patients has relied on the measurement of Computed Tomography Dose Index (CTDI) in standard cylindrical phantoms, and calculations based on mathematical representations of “standard man”. Radiation dose to both adult and pediatric patients from a CT scan has been a concern, as noted in recent reports. The purpose of this study was to investigate the feasibility of adapting a radiation treatment planning system (RTPS) to provide patient‐specific CT dosimetry. A radiation treatment planning system was modified to calculate patient‐specific CT dose distributions, which can be represented by dose at specific points within an organ of interest, as well as organ dose‐volumes (after image segmentation) for a GE Light Speed Ultra Plus CT scanner. The RTPS calculation algorithm is based on a semi‐empirical, measured correction‐based algorithm, which has been well established in the radiotherapy community. Digital representations of the physical phantoms (virtual phantom) were acquired with the GE CT scanner in axial mode. Thermoluminescent dosimeter (TLDs) measurements in pediatric anthropomorphic phantoms were utilized to validate the dose at specific points within organs of interest relative to RTPS calculations and Monte Carlo simulations of the same virtual phantoms (digital representation). Congruence of the calculated and measured point doses for the same physical anthropomorphic phantom geometry was used to verify the feasibility of the method. The RTPS algorithm can be extended to calculate the organ dose by calculating a dose distribution point‐by‐point for a designated volume. Electron Gamma Shower (EGSnrc) codes for radiation transport calculations developed by National Research Council of Canada (NRCC) were utilized to perform the Monte Carlo (MC) simulation. In general, the RTPS and MC dose calculations are within 10% of the TLD measurements for the infant and child chest scans. 
With respect to the dose comparisons for the head, the RTPS dose calculations are slightly higher (10%–20%) than the TLD measurements, while the MC results were within 10% of the TLD measurements. The advantage of the algebraic dose calculation engine of the RTPS is a substantially reduced computation time (minutes vs. days) relative to Monte Carlo calculations, as well as providing patient‐specific dose estimation. It also provides the basis for a more elaborate reporting of dosimetric results, such as patient specific organ dose volumes after image segmentation. PACS numbers: 87.55.D‐, 87.57.Q‐, 87.53.Bn, 87.55.K‐ PMID:22089016

  5. A versatile multi-objective FLUKA optimization using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Vlachoudis, Vasilis; Antoniucci, Guido Arnau; Mathot, Serge; Kozlowska, Wioletta Sandra; Vretenar, Maurizio

    2017-09-01

    Quite often, Monte Carlo simulation studies require a multi-phase-space optimization: a complicated task that relies heavily on operator experience and judgment. Examples of such calculations are shielding calculations with stringent constraints on cost, residual dose, material properties and available space, or, in the medical field, optimizing the dose delivered to a patient under hadron treatment. The present paper describes our implementation inside flair [1], the advanced user interface of FLUKA [2,3], of a multi-objective Genetic Algorithm to facilitate the search for the optimum solution.
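The selection core of any multi-objective GA of this kind is Pareto dominance over the competing objectives (e.g. shielding cost versus residual dose). A minimal sketch, with hypothetical candidate points and the GA loop itself omitted:

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives to be
    minimized): a is no worse in every objective and strictly better
    in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors.
    In a multi-objective GA wrapped around FLUKA runs, this filter
    selects the parents for the next generation."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical trade-off candidates: (shielding cost, residual dose)
candidates = [(1.0, 9.0), (2.0, 5.0), (3.0, 3.0), (4.0, 3.5), (5.0, 1.0)]
front = pareto_front(candidates)
```

The optimizer then presents the whole front rather than a single answer, leaving the final cost/dose trade-off to the operator, which is exactly where the "experience and judgment" mentioned above re-enters.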

  6. The art and science of switching antipsychotic medications, part 2.

    PubMed

    Weiden, Peter J; Miller, Alexander L; Lambert, Tim J; Buckley, Peter F

    2007-01-01

    In the presentation "Switching and Metabolic Syndrome," Weiden summarizes reasons to switch antipsychotics, highlighting weight gain and other metabolic adverse events as recent treatment targets. In "Texas Medication Algorithm Project (TMAP)," Miller reviews the TMAP study design, discusses results related to the algorithm versus treatment as usual, and concludes with the implications of the study. Lambert's presentation, "Dosing and Titration Strategies to Optimize Patient Outcome When Switching Antipsychotic Therapy," reviews the decision-making process when switching patients' medication, addresses dosing and titration strategies to effectively transition between medications, and examines other factors to consider when switching pharmacotherapy.

  7. The combination of a reduction in contrast agent dose with low tube voltage and an adaptive statistical iterative reconstruction algorithm in CT enterography: Effects on image quality and radiation dose.

    PubMed

    Feng, Cui; Zhu, Di; Zou, Xianlun; Li, Anqin; Hu, Xuemei; Li, Zhen; Hu, Daoyu

    2018-03-01

    To investigate the subjective and quantitative image quality and radiation exposure of CT enterography (CTE) examination performed at low tube voltage and low concentration of contrast agent with an adaptive statistical iterative reconstruction (ASIR) algorithm, compared with conventional CTE. One hundred thirty-seven patients with suspected or proven gastrointestinal diseases underwent contrast-enhanced CTE in a multidetector computed tomography (MDCT) scanner. All cases were assigned to 2 groups. Group A (n = 79) underwent CT with low tube voltage based on patient body mass index (BMI) (BMI < 23 kg/m², 80 kVp; BMI ≥ 23 kg/m², 100 kVp) and low concentration of contrast agent (270 mg I/mL); the images were reconstructed with the standard filtered back projection (FBP) algorithm and the 50% ASIR algorithm. Group B (n = 58) underwent conventional CTE with 120 kVp and 350 mg I/mL contrast agent; the images were reconstructed with the FBP algorithm. The computed tomography dose index volume (CTDIvol), dose length product (DLP), effective dose (ED), and total iodine dosage were calculated and compared. The CT values, contrast-to-noise ratio (CNR), and signal-to-noise ratio (SNR) of the normal bowel wall, gastrointestinal lesions, and mesenteric vessels were assessed and compared. The subjective image quality was assessed independently and blindly by 2 radiologists using a 5-point Likert scale. The differences in CTDIvol (8.64 ± 2.72 vs 11.55 ± 3.95, P < .001), ED (6.34 ± 2.24 vs 8.52 ± 3.02, P < .001), and DLP (422.6 ± 149.40 vs 568.30 ± 213.90, P < .001) were significant between group A and group B, with reductions of 25.2%, 25.7%, and 25.7% in group A, respectively. The total iodine dosage in group A was reduced by 26.1%. The subjective image quality did not differ between the 2 groups (P > .05) and all image quality scores were greater than or equal to 3 (moderate).
    Fifty percent ASIR group A images provided lower image noise but similar or higher quantitative image quality in comparison with FBP group B images. Compared with the conventional protocol, CTE performed at low tube voltage and low concentration of contrast agent with the 50% ASIR algorithm produces diagnostically acceptable image quality with a mean ED of 6.34 mSv and a total iodine dose reduction of 26.1%.
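The effective dose values above are conventionally derived from the dose-length product via a region-specific conversion coefficient, ED = k · DLP. A minimal sketch, assuming k = 0.015 mSv/(mGy·cm), the commonly used adult abdomen/pelvis factor, which is consistent with the ED/DLP ratios reported in this abstract:

```python
def effective_dose(dlp_mgy_cm, k=0.015):
    """Effective dose estimate in mSv from the dose-length product
    (mGy*cm), using a body-region conversion coefficient k in
    mSv/(mGy*cm). k = 0.015 is the commonly used adult
    abdomen/pelvis value (an assumption, not stated in the abstract)."""
    return k * dlp_mgy_cm

ed_low = effective_dose(422.6)    # group A mean DLP from the abstract
ed_conv = effective_dose(568.3)   # group B mean DLP from the abstract
reduction = 1 - ed_low / ed_conv  # fractional ED reduction
```

Because ED is a fixed multiple of DLP under this model, the percentage reduction in ED necessarily equals the percentage reduction in DLP, matching the identical 25.7% figures reported for both.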

  8. SU-E-I-82: Improving CT Image Quality for Radiation Therapy Using Iterative Reconstruction Algorithms and Slightly Increasing Imaging Doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noid, G; Chen, G; Tai, A

    2014-06-01

    Purpose: Iterative reconstruction (IR) algorithms are developed to improve CT image quality (IQ) by reducing noise without diminishing spatial resolution or contrast. For CT in radiation therapy (RT), slightly increasing the imaging dose to improve IQ may be justified if it can substantially enhance structure delineation. The purpose of this study is to investigate and quantify the IQ enhancement resulting from increased imaging doses and the use of IR algorithms. Methods: CT images were acquired for phantoms, built to evaluate IQ metrics including spatial resolution, contrast and noise, with a variety of imaging protocols using a CT scanner (Definition AS Open, Siemens) installed inside a Linac room. Representative patients were scanned once the protocols were optimized. Both phantom and patient scans were reconstructed using the Sinogram Affirmed Iterative Reconstruction (SAFIRE) and filtered back projection (FBP) methods. IQ metrics of the obtained CTs were compared. Results: The IR techniques are demonstrated to preserve spatial resolution, as measured by the point spread function, and to reduce noise in comparison with traditional FBP. Driven by the reduction in noise, the contrast-to-noise ratio is doubled by adopting the highest SAFIRE strength. As expected, increasing the imaging dose reduces noise for both SAFIRE and FBP reconstructions. The contrast-to-noise ratio increases from 3 to 5 when the dose is increased by a factor of 4. Similar IQ improvement was observed on the CTs of selected patients with pancreatic and prostate cancers. Conclusion: The IR techniques produce a measurable enhancement to CT IQ by reducing noise. Increasing the imaging dose further reduces noise independent of the IR techniques. The improved CT enables more accurate delineation of tumors and/or organs at risk during RT planning and delivery guidance.

  9. Edge enhancement algorithm for low-dose X-ray fluoroscopic imaging.

    PubMed

    Lee, Min Seok; Park, Chul Hee; Kang, Moon Gi

    2017-12-01

    Low-dose X-ray fluoroscopy has continually evolved to reduce radiation risk to patients during clinical diagnosis and surgery. However, the reduction in dose exposure causes quality degradation of the acquired images. In general, an X-ray device has a time-average pre-processor to remove the generated quantum noise. However, this pre-processor causes blurring and artifacts within moving edge regions, and noise remains in the image. During high-pass filtering (HPF) to enhance edge detail, this residual noise is amplified. In this study, a 2D edge enhancement algorithm comprising region-adaptive HPF with the transient improvement (TI) method, together with artifact and noise reduction (ANR), was developed for degraded X-ray fluoroscopic images. The proposed method was applied to a static scene pre-processed by a low-dose X-ray fluoroscopy device. First, the sharpness of the X-ray image was improved using region-adaptive HPF with the TI method, which sharpens edge details without overshoot problems. Then, an ANR filter using an edge-directional kernel was developed to remove the artifacts and noise that can occur during sharpening, while preserving edge details. The developed method was applied to actual low-dose X-ray fluoroscopic images and the results were compared, visually and numerically, with images improved using conventional edge enhancement techniques; the proposed method outperformed existing methods in terms of both objective criteria and subjective visual perception. It performed well not only by improving sharpness, but also by removing artifacts and noise, including overshoot.
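The overshoot problem that the TI method is designed to avoid can be demonstrated with plain (non-adaptive) high-pass sharpening. Below is a 1D unsharp-mask sketch, a generic stand-in for conventional HPF rather than the paper's region-adaptive algorithm; applied to a step edge, it produces the characteristic over- and undershoot:

```python
import numpy as np

def smooth(img, k=5):
    """Moving-average low-pass filter (edge-padded), standing in for
    the device's time-average pre-processor."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    return np.convolve(padded, kernel, mode="valid")

def unsharp_mask(img, amount=1.0, k=5):
    """Classic high-pass sharpening: add the detail layer
    (image minus low-pass) back to the image. This generic HPF
    exhibits the overshoot that the TI method suppresses."""
    return img + amount * (img - smooth(img, k))

# A step edge, as at a structure boundary in a fluoroscopic frame
signal = np.concatenate([np.zeros(20), np.ones(20)])
sharp = unsharp_mask(signal)
```

On this 0-to-1 step, the sharpened output exceeds 1 on the bright side and dips below 0 on the dark side of the edge, the halo artifact that region-adaptive transient improvement is meant to prevent.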

  10. An investigation of the impact of variations of DVH calculation algorithms on DVH dependant radiation therapy plan evaluation metrics

    NASA Astrophysics Data System (ADS)

    Kennedy, A. M.; Lane, J.; Ebert, M. A.

    2014-03-01

    Plan review systems often allow dose volume histogram (DVH) recalculation as part of a quality assurance process for trials. A review of the algorithms provided by a number of systems indicated that they are often very similar. One notable point of variation between implementations is in the location and frequency of dose sampling. This study explored the impact such variations can have on DVH-based plan evaluation metrics (Normal Tissue Complication Probability (NTCP), min, mean and max dose) for a plan with small structures placed over areas of high dose gradient. The dose grids considered were exported from the original planning system at a range of resolutions. We found that for the CT-based resolutions used in all but one of the plan review systems (CT, and CT with a guaranteed minimum number of sampling voxels in the x and y directions), results were very similar and changed in a similar manner with changes in the dose grid resolution, despite the extreme conditions. Differences became noticeable, however, when resolution was increased in the axial (z) direction. Evaluation metrics also varied differently with changing dose grid for CT-based resolutions compared to dose-grid-based resolutions. This suggests that if DVHs are being compared between systems that use a different basis for selecting sampling resolution, it may become important to confirm that a similar resolution was used during calculation.
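
    The dependence of DVH metrics on dose sampling can be illustrated with a minimal cumulative-DVH sketch; the sigmoid dose profile and the sampling rates below are hypothetical, not taken from the study.

```python
import numpy as np

def cumulative_dvh(dose_samples, n_bins=100):
    """Cumulative DVH: fraction of the sampled volume receiving at least each
    dose level. The density of `dose_samples` is exactly the implementation
    choice (sampling location/frequency) that the study examines."""
    levels = np.linspace(0.0, dose_samples.max(), n_bins)
    volume_frac = np.array([(dose_samples >= d).mean() for d in levels])
    return levels, volume_frac

# Hypothetical 1D dose profile with a steep fall-off across a small structure.
x = np.linspace(0.0, 1.0, 1000)
dose_fine = 60.0 / (1.0 + np.exp((x - 0.5) / 0.02))   # dense sampling
dose_coarse = dose_fine[::50]                         # sparse sampling, same plan

levels, dvh_fine = cumulative_dvh(dose_fine)
_, dvh_coarse = cumulative_dvh(dose_coarse)
```

    Min, mean and max statistics read off `dose_fine` versus `dose_coarse` differ slightly even though both sample the same underlying dose, which is the effect the study quantifies for high-gradient regions.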

  11. DNA sensors to assess the effect of VKORC1 and CYP2C9 gene polymorphisms on warfarin dose requirement in Chinese patients with atrial fibrillation.

    PubMed

    Huang, Tao-Sheng; Zhang, Ling; He, Qiong; Li, Yu-Bin; Dai, Zhong-Li; Zheng, Jian-Rui; Cheng, Pei-Qi; He, Yun-Shao

    2017-03-01

    The optimal dose of warfarin depends on polymorphisms in the VKORC1 (vitamin K epoxide reductase complex subunit 1) and CYP2C9 (cytochrome P450 2C9) genes. To minimize the risk of adverse reactions, warfarin dosages should be adjusted according to results from rapid and simple monitoring methods. However, there are few pharmacogenetic-guided warfarin dosing algorithms that are based on large cohorts from the Chinese population, especially patients with atrial fibrillation. This study aimed to validate a pharmacogenetic-guided warfarin dosing algorithm based on results from a new rapid electrochemical detection method used in a multicenter study. Three SNPs (CYP2C9 *2, *3 and VKORC1 c.-1639G > A) were genotyped by electrochemical detection using a sandwich-type format that included a 3' short thiol capture probe and a 5' ferrocene-labeled signal probe. A total of 1285 samples from four clinical hospitals were evaluated. The concordance rate between the results from the electrochemical DNA biosensor and the sequencing test was 99.8%. The results for gene distribution showed that most Chinese patients had higher warfarin susceptibility, because mutant-type and heterozygous alleles were present in the majority of subjects (99.4%) at locus c.-1639G > A. When the International Warfarin Pharmacogenetics Consortium algorithm was used to estimate therapeutic dosages for 362 patients with AF and the values were compared with their actual dosages, 56.9% of predictions were within 20% of the actual dosage. A novel electrochemical detection method for the CYP2C9 *2, *3 and VKORC1 c.-1639G > A alleles was evaluated. The warfarin dosing algorithm based on data gathered from a large patient cohort can facilitate the reasonable and effective use of warfarin in Chinese patients with AF.
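
    Genotype-guided algorithms such as the IWPC model cited above are typically linear models on the square-root-dose scale. A sketch with hypothetical placeholder coefficients (deliberately not the published IWPC values):

```python
# Illustrative genotype-guided warfarin dose model in the spirit of the IWPC
# algorithm: a linear model predicting the square root of the weekly dose.
# Every coefficient below is a hypothetical placeholder.
def predicted_weekly_dose(age_yr, height_cm, weight_kg,
                          vkorc1_a_alleles, cyp2c9_star2, cyp2c9_star3,
                          amiodarone=False):
    sqrt_dose = (1.50
                 - 0.25 * (age_yr / 10.0)       # dose falls with age
                 + 0.010 * height_cm
                 + 0.005 * weight_kg
                 - 0.80 * vkorc1_a_alleles      # copies of VKORC1 -1639A (0-2)
                 - 0.35 * cyp2c9_star2          # copies of CYP2C9*2 (0-2)
                 - 0.50 * cyp2c9_star3          # copies of CYP2C9*3 (0-2)
                 - 0.55 * (1 if amiodarone else 0))
    return max(sqrt_dose, 0.0) ** 2             # mg/week

wild_type = predicted_weekly_dose(50, 170, 70, 0, 0, 0)
sensitive = predicted_weekly_dose(50, 170, 70, 2, 0, 1)   # "highly sensitive" bin
```

    With these placeholder numbers, the wild-type patient receives a much higher predicted dose than a carrier of two VKORC1 -1639A alleles plus one CYP2C9*3 allele, mirroring the functional-phenotype bins discussed in the records above.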

  12. A comparison of TPS and different measurement techniques in small-field electron beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donmez Kesen, Nazmiye, E-mail: nazo94@gmail.com; Cakir, Aydin; Okutan, Murat

    In recent years, small-field electron beams have been used for the treatment of superficial lesions, which requires small circular fields. However, when using very small electron fields, some significant dosimetric problems may occur. In this study, dose distributions and outputs of circular fields with dimensions of 5 cm and smaller, for nominal energies of 6, 9, and 15 MeV from the Siemens ONCOR Linac, were measured and compared with data from a treatment planning system using the pencil-beam algorithm in electron beam calculations. All dose distribution measurements were performed using the Gafchromic EBT film; these measurements were compared with data that were obtained from the Computerized Medical Systems (CMS) XiO treatment planning system (TPS), using the gamma-index method in the PTW VeriSoft software program. Output measurements were performed using the Gafchromic EBT film, an Advanced Markus ion chamber, and thermoluminescent dosimetry (TLD). Although the pencil-beam algorithm is used to model electron beams in many clinics, there is no substantial amount of detailed information in the literature about its use. As the field size decreased, the point of maximum dose moved closer to the surface. Output factors were consistent; differences from the values obtained from the TPS were, at maximum, 42% for 6 and 15 MeV and 32% for 9 MeV. When the dose distributions from the TPS were compared with the measurements from the Gafchromic EBT films, it was observed that the results were consistent for 2-cm diameter and larger fields, but the outputs for fields of 1-cm diameter and smaller were not consistent. In CMS XiO TPS, calculated using the pencil-beam algorithm, the dose distributions of electron treatment fields that were created with circular cutout of a 1-cm diameter were not appropriate for patient treatment and the pencil-beam algorithm is not convenient for monitor unit (MU) calculations in electron dosimetry.

  13. SU-F-I-41: Calibration-Free Material Decomposition for Dual-Energy CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, W; Xing, L; Zhang, Q

    2016-06-15

    Purpose: To eliminate tedious phantom calibration or manual region of interest (ROI) selection as required in dual-energy CT material decomposition, we establish a new projection-domain material decomposition framework with incorporation of the energy spectrum. Methods: Similar to the case of dual-energy CT, the integral of the basis material image in our model is expressed as a linear combination of basis functions, which are polynomials of the high- and low-energy raw projection data. To yield the unknown coefficients of the linear combination, the proposed algorithm minimizes the quadratic error between the high- and low-energy raw projection data and the projection calculated using material images. We evaluate the algorithm with an iodine concentration numerical phantom at different dose and iodine concentration levels. The x-ray energy spectra of the high and low energy are estimated using an indirect transmission method. The derived monochromatic images are compared with the high- and low-energy CT images to demonstrate beam hardening artifacts reduction. Quantitative results were measured and compared to the true values. Results: The differences between the true density value used for simulation and those obtained from the monochromatic images are 1.8%, 1.3%, 2.3%, and 2.9% for the dose levels from standard dose to 1/8 dose, and 0.4%, 0.7%, 1.5%, and 1.8% for the four iodine concentration levels from 6 mg/mL to 24 mg/mL. For all of the cases, beam hardening artifacts, especially streaks between dense inserts, are almost completely removed in the monochromatic images. Conclusion: The proposed algorithm provides an effective way to yield material images and artifact-free monochromatic images at different dose levels without the need for phantom calibration or ROI selection. Furthermore, the approach also yields accurate results when the concentration of the iodine insert is very low, suggesting the algorithm is robust in the low-contrast scenario.
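
    The projection-domain model described above, a linear combination of polynomials of the high- and low-energy projections with coefficients found by least squares, can be sketched on synthetic data. The "true" mapping below is hypothetical, used only to generate training data.

```python
import numpy as np

def poly_features(p_h, p_l, order=2):
    """Polynomial basis in the high/low-energy projections, the basis-function
    family used by projection-domain dual-energy decomposition models."""
    return np.column_stack([p_h**i * p_l**j
                            for i in range(order + 1)
                            for j in range(order + 1 - i)])

rng = np.random.default_rng(0)
p_h = rng.uniform(0.1, 2.0, 200)          # high-energy raw projections
p_l = rng.uniform(0.1, 2.0, 200)          # low-energy raw projections

# Hypothetical "true" basis-material line integral for the synthetic data.
a_true = 1.2 * p_h - 0.4 * p_l + 0.1 * p_h * p_l

# Coefficients that minimize the quadratic error, as in the abstract.
coef, *_ = np.linalg.lstsq(poly_features(p_h, p_l), a_true, rcond=None)
a_fit = poly_features(p_h, p_l) @ coef
```

    Because the synthetic mapping lies in the span of the second-order basis, the least-squares fit recovers it essentially exactly; real data would add spectrum estimation and noise on top of this skeleton.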

  14. TEMIS UV product validation using NILU-UV ground-based measurements in Thessaloniki, Greece

    NASA Astrophysics Data System (ADS)

    Zempila, Melina-Maria; van Geffen, Jos H. G. M.; Taylor, Michael; Fountoulakis, Ilias; Koukouli, Maria-Elissavet; van Weele, Michiel; van der A, Ronald J.; Bais, Alkiviadis; Meleti, Charikleia; Balis, Dimitrios

    2017-06-01

    This study aims to cross-validate ground-based and satellite-based models of three photobiological UV effective dose products: the Commission Internationale de l'Éclairage (CIE) erythemal UV, the production of vitamin D in the skin, and DNA damage, using high-temporal-resolution surface-based measurements of solar UV spectral irradiances from a synergy of instruments and models. The satellite-based Tropospheric Emission Monitoring Internet Service (TEMIS; version 1.4) UV daily dose data products were evaluated over the period 2009 to 2014 with ground-based data from a Norsk Institutt for Luftforskning (NILU)-UV multifilter radiometer located at the northern midlatitude super-site of the Laboratory of Atmospheric Physics, Aristotle University of Thessaloniki (LAP/AUTh), in Greece. For the NILU-UV effective dose rates retrieval algorithm, a neural network (NN) was trained to learn the nonlinear functional relation between NILU-UV irradiances and collocated Brewer-based photobiological effective dose products. The algorithm was then subjected to sensitivity analysis and validation. The correlation of the NN estimates with target outputs was high (r = 0.988 to 0.990), with a very low bias (0.000 to 0.011 in absolute units), proving the robustness of the NN algorithm. For further evaluation of the NILU NN-derived products, retrievals of the vitamin D and DNA-damage effective doses from a collocated Yankee Environmental Systems (YES) UVB-1 pyranometer were used. For cloud-free days, differences in the derived UV doses are better than 2% for all UV dose products, revealing the reference quality of the ground-based UV doses at Thessaloniki from the NILU-UV NN retrievals. The TEMIS UV doses used in this study are derived from ozone measurements by the SCIAMACHY/Envisat and GOME2/MetOp-A satellite instruments over the European domain, in combination with the SEVIRI/Meteosat-based diurnal cycle of the cloud cover fraction per 0.5° × 0.5° (lat × long) grid cell. TEMIS UV doses were found to be ~12.5% higher than the NILU NN estimates but, despite the presence of a visually apparent seasonal pattern, the R² values were found to be robustly high, equal to 0.92-0.93 for 1588 all-sky coincidences. These results improve significantly when limiting the dataset to cloud-free days, with differences of 0.57% for the erythemal doses, 1.22% for the vitamin D doses, and 1.18% for the DNA-damage doses, with standard deviations of the order of 11-13%. The improvement of the comparative statistics under cloud-free conditions further testifies to the importance of appropriately accounting for the contribution of clouds to the UV radiation reaching the Earth's surface. For the urban area of Thessaloniki, with highly variable aerosol, the weakness of the implicit aerosol information used in the TEMIS UV dose algorithm was revealed by comparison of the datasets with aerosol optical depths at 340 nm reported by a collocated CIMEL sun photometer, operating in Thessaloniki at LAP/AUTh as part of the NASA Aerosol Robotic Network.

  15. Validation of contour-driven thin-plate splines for tracking fraction-to-fraction changes in anatomy and radiation therapy dose mapping.

    PubMed

    Schaly, B; Bauman, G S; Battista, J J; Van Dyk, J

    2005-02-07

    The goal of this study is to validate a deformable model using contour-driven thin-plate splines for application to radiation therapy dose mapping. Our testing includes a virtual spherical phantom as well as real computed tomography (CT) data from ten prostate cancer patients with radio-opaque markers surgically implanted into the prostate and seminal vesicles. In the spherical mathematical phantom, homologous control points generated automatically from input contour data in CT slice geometry were compared to homologous control point placement using analytical geometry as the ground truth. The doses delivered to specific voxels, as tracked by each set of homologous control points, were compared to determine the accuracy of dose tracking via the deformable model. A 3D analytical spherically symmetric dose distribution with a dose gradient of approximately 10% per mm was used for this phantom. This test showed that the uncertainty in calculating the delivered dose to a tissue element depends on slice thickness and the variation in defining homologous landmarks, where dose agreement of 3-4% in high dose gradient regions was achieved. In the patient data, radio-opaque marker positions driven by the thin-plate spline algorithm were compared to the actual marker positions as identified in the CT scans. It is demonstrated that the deformable model is accurate (approximately 2.5 mm) to within the intra-observer contouring variability. This work shows that the algorithm is appropriate for describing changes in pelvic anatomy and for the dose mapping application with dose gradients characteristic of conformal and intensity modulated radiation therapy.
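
    A contour-driven deformation of this kind can be sketched with a thin-plate spline fitted to homologous control points. The coordinates and dose function below are hypothetical, and SciPy's RBFInterpolator stands in for the authors' implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Homologous control points: positions on a later treatment fraction and the
# same anatomical points on the planning CT (2D here, hypothetical coordinates).
pts_frac = np.array([[0., 0.], [11., 0.], [0., 11.], [11., 11.], [5.5, 5.5]])
pts_plan = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.], [5.0, 5.0]])

# Thin-plate spline mapping fraction coordinates back to planning coordinates.
warp = RBFInterpolator(pts_frac, pts_plan, kernel="thin_plate_spline")

def planned_dose(xy):
    # Hypothetical planning dose: a smooth gradient along x (arbitrary units).
    return 60.0 - 2.0 * xy[:, 0]

# The dose delivered to a voxel on the later fraction is taken as the planning
# dose at its mapped (deformed) position.
voxels_frac = np.array([[6.0, 6.0], [11.0, 5.5]])
dose_mapped = planned_dose(warp(voxels_frac))
```

    With zero smoothing, the spline reproduces each control point exactly, so the accuracy of the mapped dose is governed by how well the contours pin down the deformation between landmarks, the uncertainty the study quantifies.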

  16. Pharmacogenetics of warfarin: challenges and opportunities

    PubMed Central

    Lee, Ming Ta Michael; Klein, Teri E

    2014-01-01

    Since its introduction in the 1950s, warfarin has become the most commonly used oral anticoagulant for the prevention of thromboembolism in patients with deep vein thrombosis, atrial fibrillation or prosthetic heart valve replacement. Warfarin is highly efficacious; however, achieving the desired anticoagulation is difficult because of its narrow therapeutic window and highly variable dose response among individuals. Bleeding is often associated with overdose of warfarin. There is overwhelming evidence that an individual's warfarin maintenance dose is associated with clinical factors and genetic variations, most notably polymorphisms in cytochrome P450 2C9 and vitamin K epoxide reductase complex subunit 1. Numerous dose-prediction algorithms incorporating both genetic and clinical factors have been developed and tested clinically. However, results from major clinical trials are not yet available. This review aims to provide an overview of the field of warfarin pharmacogenetics, including information about the drug, the genetics of warfarin dose requirements, the dosing algorithms developed and the challenges for clinical implementation. PMID:23657428

  17. Adaptation of the CVT algorithm for catheter optimization in high dose rate brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poulin, Eric; Fekete, Charles-Antoine Collins; Beaulieu, Luc

    2013-11-15

    Purpose: An innovative, simple, and fast method to optimize the number and position of catheters is presented for prostate and breast high dose rate (HDR) brachytherapy, both for arbitrary templates and template-free implants (such as robotic templates). Methods: Eight clinical cases were chosen randomly from a bank of patients previously treated in our clinic to test our method. The 2D Centroidal Voronoi Tessellations (CVT) algorithm was adapted to distribute catheters uniformly in space within the maximum external contour of the planning target volume. The catheter optimization procedure includes the inverse planning simulated annealing algorithm (IPSA). Complete treatment plans can then be generated from the algorithm for different numbers of catheters. The best plan is chosen from different dosimetry criteria and automatically provides the number of catheters and their positions. After the CVT algorithm parameters were optimized for speed and dosimetric results, the method was validated against prostate clinical cases using clinically relevant dose parameters. The robustness to implantation error was also evaluated. Finally, the efficiency of the method was tested on breast interstitial HDR brachytherapy cases. Results: The effect of the number and locations of the catheters was studied on prostate cancer patients. Treatment plans with better or equivalent dose distributions could be obtained with fewer catheters. A better or equal prostate V100 was obtained down to 12 catheters. Plans with nine or fewer catheters would not be clinically acceptable in terms of prostate V100 and D90. Implantation errors up to 3 mm were acceptable, since no statistical difference was found when compared to 0 mm error (p > 0.05). No significant difference in dosimetric indices was observed for the different combinations of parameters within the CVT algorithm. A linear relation was found between the number of random points and the optimization time of the CVT algorithm. Because the computation time scales with the number of points, and no effect on the dosimetric indices was observed when varying the number of sampling points and the number of iterations, these were fixed to 2500 and 100, respectively. The computation time to obtain ten complete treatment plans, ranging from 9 to 18 catheters, with the corresponding dosimetric indices, was 90 s. However, 93% of the computation time is used by a research version of IPSA. For the breast, on average, the Radiation Therapy Oncology Group recommendations would be satisfied down to 12 catheters. Plans with nine or fewer catheters would not be clinically acceptable in terms of V100, dose homogeneity index, and D90. Conclusions: The authors have devised a simple, fast, and efficient method to optimize the number and position of catheters in interstitial HDR brachytherapy. The method was shown to be robust for both prostate and breast HDR brachytherapy. More importantly, the computation time of the algorithm is acceptable for clinical use. Ultimately, this catheter optimization algorithm could be coupled with a 3D ultrasound system to allow real-time guidance and planning in HDR brachytherapy.
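
    The catheter-distribution step can be sketched as Lloyd's algorithm for a centroidal Voronoi tessellation, here over a unit disc standing in for the target contour. The sampling and iteration counts follow the values quoted in the abstract; everything else is illustrative.

```python
import numpy as np

def cvt(n_catheters, n_samples=2500, n_iter=100, seed=1):
    """Centroidal Voronoi tessellation via Lloyd's algorithm: each catheter
    position moves to the centroid of its Voronoi cell, estimated from random
    samples of the target region (a unit disc here, standing in for the PTV's
    maximum external contour)."""
    rng = np.random.default_rng(seed)
    # Rejection-sample the unit disc.
    pts = rng.uniform(-1.0, 1.0, (4 * n_samples, 2))
    pts = pts[np.hypot(pts[:, 0], pts[:, 1]) <= 1.0][:n_samples]
    gens = pts[rng.choice(len(pts), n_catheters, replace=False)]
    for _ in range(n_iter):
        # Assign each sample to its nearest generator (its Voronoi cell).
        d2 = ((pts[:, None, :] - gens[None, :, :]) ** 2).sum(-1)
        lab = d2.argmin(axis=1)
        for k in range(n_catheters):
            cell = pts[lab == k]
            if len(cell):
                gens[k] = cell.mean(axis=0)     # move to cell centroid
    return gens

catheters = cvt(12)     # 12 catheters, the lowest clinically acceptable count
```

    Each candidate catheter count would then be handed to the inverse planner (IPSA in the paper) and the best plan selected on dosimetric criteria.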

  18. Iterative raw measurements restoration method with penalized weighted least squares approach for low-dose CT

    NASA Astrophysics Data System (ADS)

    Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu

    2014-03-01

    Statistical iterative reconstruction and post-log data restoration algorithms for CT noise reduction have been widely studied, and these techniques have enabled us to reduce irradiation doses while maintaining image quality. In low-dose scanning, electronic noise becomes significant and results in some non-positive signals in the raw measurements. These non-positive signals must be converted to positive values before they can be log-transformed. Since conventional conversion methods do not consider the local variance on the sinogram, they have difficulty controlling the strength of the filtering. In this work, we therefore propose a method that converts non-positive signals to positive ones mainly by controlling the local variance. The method is implemented in two separate steps. First, an iterative restoration algorithm based on penalized weighted least squares is used to mitigate the effect of electronic noise. The algorithm preserves the local mean and reduces the local variance induced by the electronic noise. Second, the raw measurements smoothed by the iterative algorithm are converted to positive signals by a function that replaces each non-positive signal with its local mean. In phantom studies, we confirm that the proposed method properly preserves the local mean and reduces the variance induced by the electronic noise. Our technique dramatically reduces shading artifacts and can also successfully cooperate with the post-log data filter to reduce streak artifacts.
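
    The conversion step, replacing non-positive measurements so the log transform is defined everywhere, can be sketched in one dimension. This simplified version applies only the local-mean replacement and omits the iterative PWLS smoothing that precedes it in the paper.

```python
import numpy as np

def make_log_safe(raw, window=3):
    """Replace non-positive raw measurements with the mean of the positive
    values in their local neighbourhood, so the log transform required for
    CT reconstruction is defined at every detector sample."""
    out = raw.astype(float).copy()
    half = window // 2
    for i in np.flatnonzero(out <= 0):
        lo, hi = max(0, i - half), min(len(out), i + half + 1)
        neigh = out[lo:hi]
        pos = neigh[neigh > 0]
        out[i] = pos.mean() if len(pos) else 1e-6   # tiny floor as a fallback
    return out

raw = np.array([120.0, 80.0, -3.0, 60.0, 0.0, 90.0])  # electronic-noise dips
safe = make_log_safe(raw)
line_integrals = -np.log(safe / safe.max())           # post-log data
```

    The replacement preserves the local mean by construction, which is the property the paper's phantom study verifies for its full two-step method.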

  19. Experimental verification of a 4D MLEM reconstruction algorithm used for in-beam PET measurements in particle therapy

    NASA Astrophysics Data System (ADS)

    Stützer, K.; Bert, C.; Enghardt, W.; Helmbrecht, S.; Parodi, K.; Priegnitz, M.; Saito, N.; Fiedler, F.

    2013-08-01

    In-beam positron emission tomography (PET) has been proven to be a reliable technique in ion beam radiotherapy for the in situ and non-invasive evaluation of the correct dose deposition in static tumour entities. In the presence of intra-fractional target motion an appropriate time-resolved (four-dimensional, 4D) reconstruction algorithm has to be used to avoid reconstructed activity distributions suffering from motion-related blurring artefacts and to allow for a dedicated dose monitoring. Four-dimensional reconstruction algorithms from diagnostic PET imaging that can properly handle the typically low counting statistics of in-beam PET data have been adapted and optimized for the characteristics of the double-head PET scanner BASTEI installed at GSI Helmholtzzentrum Darmstadt, Germany (GSI). Systematic investigations with moving radioactive sources demonstrate the more effective reduction of motion artefacts by applying a 4D maximum likelihood expectation maximization (MLEM) algorithm instead of the retrospective co-registration of phasewise reconstructed quasi-static activity distributions. Further 4D MLEM results are presented from in-beam PET measurements of irradiated moving phantoms which verify the accessibility of relevant parameters for the dose monitoring of intra-fractionally moving targets. From in-beam PET listmode data sets acquired together with a motion surrogate signal, valuable images can be generated by the 4D MLEM reconstruction for different motion patterns and motion-compensated beam delivery techniques.
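
    The static MLEM update that the 4D algorithm extends per motion phase can be sketched on a toy system; the 2×2 system matrix below is hypothetical.

```python
import numpy as np

def mlem(A, counts, n_iter=500):
    """Maximum likelihood expectation maximization for emission tomography:
    x <- x * A^T(counts / Ax) / A^T 1.  The 4D variant in the paper applies
    this update per motion phase using the surrogate motion signal."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                  # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                      # forward projection
        ratio = np.where(proj > 0, counts / proj, 0.0)
        x *= (A.T @ ratio) / sens         # multiplicative EM update
    return x

A = np.array([[0.8, 0.2],
              [0.3, 0.7]])               # hypothetical detection probabilities
activity_true = np.array([4.0, 1.0])
counts = A @ activity_true               # noise-free measurements
recon = mlem(A, counts)
```

    On noise-free data from an invertible toy system the iteration converges to the true activity; real in-beam PET data add the low counting statistics that motivated the paper's optimizations.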

  20. Effect of algorithm aggressiveness on the performance of the Hypoglycemia-Hyperglycemia Minimizer (HHM) System.

    PubMed

    Finan, Daniel A; McCann, Thomas W; Rhein, Kathleen; Dassau, Eyal; Breton, Marc D; Patek, Stephen D; Anhalt, Henry; Kovatchev, Boris P; Doyle, Francis J; Anderson, Stacey M; Zisser, Howard; Venugopalan, Ramakrishna

    2014-07-01

    The Hypoglycemia-Hyperglycemia Minimizer (HHM) System aims to mitigate glucose excursions by preemptively modulating insulin delivery based on continuous glucose monitor (CGM) measurements. The "aggressiveness factor" is a key parameter in the HHM System algorithm, affecting how readily the system adjusts insulin infusion in response to changing CGM levels. Twenty adults with type 1 diabetes were studied in closed-loop in a clinical research center for approximately 26 hours. This analysis focused on the effect of the aggressiveness factor on the insulin dosing characteristics of the algorithm and, to a lesser extent, on the glucose control results observed. As the aggressiveness factor increased from conservative to medium to aggressive: the maximum observed insulin dose delivered by the algorithm—which is designed to give doses that are corrective in nature every 5 minutes—increased (1.00 vs 1.15 vs 2.20 U, respectively); tendency to adhere to the subject's nominal basal dose decreased (61.9% vs 56.6% vs 53.4%); and readiness to decrease insulin below basal also increased (18.4% vs 19.4% vs 25.2%). Glucose analyses by both CGM and Yellow Springs Instruments (YSI) indicated that the aggressive setting of the algorithm resulted in the least time spent at levels >180 mg/dL, and the most time spent between 70-180 mg/dL. There was no severe hyperglycemia, diabetic ketoacidosis, or severe hypoglycemia for any of the aggressiveness values investigated. These analyses underscore the importance of investigating the sensitivity of the HHM System to its key parameters, such as the aggressiveness factor, to guide future development decisions. © 2014 Diabetes Technology Society.

  1. A dosimetric evaluation of the Eclipse AAA algorithm and Millennium 120 MLC for cranial intensity-modulated radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvo Ortega, Juan Francisco, E-mail: jfcdrr@yahoo.es; Moragues, Sandra; Pozo, Miquel

    2014-07-01

    The aim of this study is to assess the accuracy of a convolution-based algorithm (anisotropic analytical algorithm [AAA]) implemented in the Eclipse planning system for intensity-modulated radiosurgery (IMRS) planning of small cranial targets by using a 5-mm leaf-width multileaf collimator (MLC). Overall, 24 patient-based IMRS plans for cranial lesions of variable size (0.3 to 15.1 cc) were planned (Eclipse, AAA, version 10.0.28) using fixed field-based IMRS produced by a Varian linear accelerator equipped with a 120 MLC (5-mm width on central leaves). Plan accuracy was evaluated against phantom-based measurements performed with radiochromic film (EBT2, ISP, Wayne, NJ). Film 2D dose distributions were analyzed with the FilmQA Pro software (version 2011, Ashland, OH) using the triple-channel dosimetry method. Comparison between computed and measured 2D dose distributions was performed using the gamma method (3%/1 mm). Performance of the MLC was checked by inspection of the DynaLog files created by the linear accelerator during the delivery of each dynamic field. The absolute difference between the calculated and measured isocenter doses for all the IMRS plans was 2.5% ± 2.1%. The gamma evaluation method resulted in high average passing rates of 98.9% ± 1.4% (red channel) and 98.9% ± 1.5% (blue and green channels). DynaLog file analysis revealed a maximum root mean square error of 0.46 mm. According to our results, we conclude that the Eclipse/AAA algorithm provides accurate cranial IMRS dose distributions that may be accurately delivered by a Varian linac equipped with a Millennium 120 MLC.
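
    A brute-force gamma evaluation, shown here in 1D for brevity with the same 3%/1 mm criterion, can be sketched as follows; the dose profiles are hypothetical.

```python
import numpy as np

def gamma_index(dose_eval, dose_ref, spacing_mm, dd=0.03, dta_mm=1.0):
    """Global gamma analysis (1D for brevity; the study used 2D): for each
    reference point, the minimum combined dose-difference / distance-to-
    agreement metric over all evaluated points. gamma <= 1 is a pass.
    Both distributions are assumed to share the same spatial grid."""
    x = np.arange(len(dose_ref)) * spacing_mm
    d_max = dose_ref.max()                       # global normalisation dose
    gammas = np.empty(len(dose_ref))
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dose_term = ((dose_eval - di) / (dd * d_max)) ** 2
        dist_term = ((x - xi) / dta_mm) ** 2
        gammas[i] = np.sqrt((dose_term + dist_term).min())
    return gammas

ref = np.array([10., 50., 90., 100., 90., 50., 10.])   # hypothetical profile
shifted = np.roll(ref, 1)                              # small spatial shift
passing = (gamma_index(shifted, ref, spacing_mm=0.5) <= 1.0).mean()
```

    A sub-millimetre shift mostly passes the 1 mm distance-to-agreement component, which is why passing rates like the 98.9% reported above tolerate small setup and delivery errors.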

  2. Sinogram-based adaptive iterative reconstruction for sparse view x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Trinca, D.; Zhong, Y.; Wang, Y.-Z.; Mamyrbayev, T.; Libin, E.

    2016-10-01

    With the availability of more powerful computing processors, iterative reconstruction algorithms have recently been successfully implemented as an approach to achieving significant dose reduction in X-ray CT. In this paper, we propose an adaptive iterative reconstruction algorithm for X-ray CT that is shown to provide results comparable to those obtained by proprietary algorithms, in terms of both reconstruction accuracy and execution time. The proposed algorithm is thus provided free of charge to the scientific community, for regular use and for possible further optimization.

  3. Improved tissue assignment using dual-energy computed tomography in low-dose rate prostate brachytherapy for Monte Carlo dose calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Côté, Nicolas; Bedwani, Stéphane; Carrier, Jean-François, E-mail: jean-francois.carrier.chum@ssss.gouv.qc.ca

    Purpose: An improvement in tissue assignment for low-dose rate brachytherapy (LDRB) patients, enabling more accurate Monte Carlo (MC) dose calculation, was accomplished with a metallic artifact reduction (MAR) method specific to dual-energy computed tomography (DECT). Methods: The proposed MAR algorithm followed a four-step procedure. The first step involved applying a weighted blend of both DECT scans (I_H/L) to generate a new image (I_Mix). This action minimized Hounsfield unit (HU) variations surrounding the brachytherapy seeds. In the second step, the mean HU of the prostate in I_Mix was calculated and shifted toward the mean HU of the two original DECT images (I_H/L). The third step involved smoothing the newly shifted I_Mix and the two original I_H/L, followed by a subtraction of both, generating an image that represented the metallic artifact (I_A,(H/L)) with reduced noise levels. The final step consisted of subtracting the newly generated I_A,(H/L) from the original I_H/L, yielding a final image corrected for metallic artifacts. Following the completion of the algorithm, a DECT stoichiometric method was used to extract the relative electronic density (ρ_e) and effective atomic number (Z_eff) at each voxel of the corrected scans. Tissue assignment could then be determined from these two newly acquired physical parameters: each voxel was assigned the tissue bearing the closest resemblance in terms of ρ_e and Z_eff, compared with values from the ICRU 42 database. A MC study was then performed to compare the dosimetric impacts of alternative MAR algorithms. Results: An improvement in tissue assignment was observed with the DECT MAR algorithm, compared to the single-energy computed tomography (SECT) approach. In a phantom study, tissue misassignment was found to reach 0.05% of voxels using the DECT approach, compared with 0.40% using the SECT method. Comparison of the DECT and SECT D_90 dose parameter (dose received by 90% of the volume) indicated that D_90 could be underestimated by up to 2.3% using the SECT method. Conclusions: The DECT MAR approach is a simple alternative for reducing the metallic artifacts found in LDRB patient scans. Images can be processed quickly and do not require the determination of x-ray spectra. Substantial information on density and atomic number can also be obtained. Furthermore, calcifications within the prostate are detected by the tissue assignment algorithm. This enables more accurate, patient-specific MC dose calculations.
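
    A condensed sketch of the four-step blend-and-subtract idea follows. A box filter stands in for the unspecified smoothing, the prostate-specific HU shift is omitted, and the sign conventions follow the abstract loosely.

```python
import numpy as np

def box_smooth(img, k=3):
    # Simple box filter (a stand-in for the paper's unspecified smoothing).
    n, m = img.shape
    pad = np.pad(img, k // 2, mode="edge")
    return sum(pad[i:i + n, j:j + m] for i in range(k) for j in range(k)) / (k * k)

def dect_mar(img_h, img_l, w=0.5):
    """Loose sketch of the four-step DECT MAR procedure: (1) blend the two
    energy images; (2-3) build a reduced-noise artifact estimate from the
    difference of smoothed images; (4) subtract it from each original."""
    img_mix = w * img_h + (1.0 - w) * img_l               # step 1 (HU shift omitted)
    art_h = box_smooth(img_mix) - box_smooth(img_h)       # steps 2-3
    art_l = box_smooth(img_mix) - box_smooth(img_l)
    return img_h - art_h, img_l - art_l                   # step 4

# Identical inputs contain no energy-dependent artifact, so they pass through.
uniform = np.full((6, 6), 100.0)
corr_h, corr_l = dect_mar(uniform, uniform)
```

    The blend exploits the fact that metal artifacts differ between the two energy images while anatomy largely agrees, so their difference isolates the artifact component.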

  4. Comparison study of image quality and effective dose in dual energy chest digital tomosynthesis

    NASA Astrophysics Data System (ADS)

    Lee, Donghoon; Choi, Sunghoon; Lee, Haenghwa; Kim, Dohyeon; Choi, Seungyeon; Kim, Hee-Joung

    2018-07-01

    The present study aimed to introduce a recently developed digital tomosynthesis system for the chest and describe the procedure for acquiring dual energy bone-decomposed tomosynthesis images. Various beam qualities and reconstruction algorithms were evaluated for acquiring dual energy chest digital tomosynthesis (CDT) images, and the effective dose was calculated with ion chamber measurements and Monte Carlo simulations. The results demonstrated that dual energy CDT improved visualization of the lung field by eliminating the bony structures. In addition, the qualitative and quantitative image quality of dual energy CDT using iterative reconstruction was better than that with the filtered backprojection (FBP) algorithm. The contrast-to-noise ratio and figure of merit values of dual energy CDT acquired with iterative reconstruction were three times better than those acquired with FBP reconstruction. The difference in image quality according to the acquisition conditions was not noticeable, but the effective dose was significantly affected by the acquisition conditions. The high energy acquisition condition using 130 kVp recorded a relatively high effective dose. We conclude that dual energy CDT has the potential to compensate for major problems in CDT due to decomposed bony structures, which induce significant artifacts. Although there are many variables in clinical practice, our results regarding reconstruction algorithms and acquisition conditions may be used as the basis for clinical use of dual energy CDT imaging.

  5. SU-F-T-441: Dose Calculation Accuracy in CT Images Reconstructed with Artifact Reduction Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, C; Chan, S; Lee, F

    Purpose: The accuracy of radiotherapy dose calculation in patients with surgical implants is complicated by two factors: first, the accuracy of the CT numbers; second, the dose calculation accuracy. We compared measured dose with dose calculated on CT images reconstructed with FBP and an artifact reduction algorithm (OMAR, Philips) for a phantom with high density inserts. Dose calculations were done with Varian AAA and AcurosXB. Methods: A phantom was constructed from solid water in which two titanium or stainless steel rods could be inserted. The phantom was scanned with the Philips Brilliance Big Bore CT. Image reconstruction was done with FBP and OMAR. Two 6 MV single-field photon plans were constructed for each phantom. Radiochromic films were placed at different locations to measure the dose deposited. One plan had normal incidence on the titanium/steel rods; in the second plan, the beam was at almost glancing incidence on the metal rods. Measurements were then compared with dose calculated with AAA and AcurosXB. Results: The use of OMAR images slightly improved the dose calculation accuracy. The agreement between measured and calculated dose was best with AcurosXB and images reconstructed with OMAR. Dose calculated on the titanium phantom had better agreement with measurement. Large discrepancies were seen at points directly above and below the high density inserts. Both AAA and AcurosXB underestimated the dose directly above the metal surface, and overestimated the dose below the metal surface. Doses measured downstream of the metal were all within 3% of calculated values. Conclusion: When doing treatment planning for patients with metal implants, care must be taken to acquire correct CT images to improve dose calculation accuracy. Moreover, great discrepancies between measured and calculated dose were observed at the metal/tissue interface. Care must be taken in estimating the dose to critical structures that come into contact with metals.

  6. Effective Dose Calculation Program (EDCP) for the usage of NORM-added consumer product.

    PubMed

    Yoo, Do Hyeon; Lee, Jaekook; Min, Chul Hee

    2018-04-09

    The aim of this study was to develop the Effective Dose Calculation Program (EDCP) for the usage of Naturally Occurring Radioactive Material (NORM)-added consumer products. The EDCP was developed based on a database of effective dose conversion coefficients and the MATLAB (Matrix Laboratory) program, and incorporates a Graphical User Interface (GUI) for ease of use. To validate EDCP, the effective dose calculated with EDCP, with the source region determined manually via the GUI, was compared with that calculated using the reference mathematical algorithm for a pillow, waist supporter, eye-patch and sleeping mattress. The results show that the annual effective dose calculated with EDCP was almost identical to that calculated using the reference mathematical algorithm in most of the assessment cases. With the assumption of a gamma energy of 1 MeV and an activity of 1 MBq, the annual effective doses of the pillow, waist supporter, sleeping mattress, and eye-patch determined using the reference algorithm were 3.444, 2.770, 4.629, and 3.567 mSv year⁻¹, respectively, while those calculated using EDCP were 3.561, 2.630, 4.740, and 3.780 mSv year⁻¹, respectively. The differences in the annual effective doses were less than 5%, despite the different calculation methods employed. The EDCP can therefore be effectively used for radiation protection management in the context of the usage of NORM-added consumer products. Additionally, EDCP can be used by members of the public through the GUI for various studies in the field of radiation protection, thus facilitating easy access to the program. Copyright © 2018. Published by Elsevier Ltd.
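    The validation step above amounts to a relative-difference check of EDCP against the reference algorithm; a minimal sketch using the pillow figures quoted in the abstract:

```python
def relative_diff_percent(edcp_dose, reference_dose):
    """Relative difference of the EDCP result from the reference result, in %."""
    return 100.0 * abs(edcp_dose - reference_dose) / reference_dose

# Pillow values from the abstract (mSv/year): reference 3.444, EDCP 3.561.
diff = relative_diff_percent(3.561, 3.444)
print(round(diff, 1))  # → 3.4
```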

  7. Pharmacogenomics of warfarin in populations of African descent

    PubMed Central

    Suarez-Kurtz, Guilherme; Botton, Mariana R

    2013-01-01

    Warfarin is the most commonly prescribed oral anticoagulant worldwide, despite its narrow therapeutic index and the notorious inter- and intra-individual variability in the dose required for the target clinical effect. Pharmacogenetic polymorphisms are major determinants of warfarin pharmacokinetics and pharmacodynamics and are included in several warfarin dosing algorithms. This review focuses on warfarin pharmacogenomics in sub-Saharan peoples, African Americans and admixed Brazilians. These ‘Black’ populations differ in several aspects, notably in their extent of recent admixture with Europeans, a factor which affects the frequency distribution of pharmacogenomic polymorphisms relevant to warfarin dose requirements. Whereas a small number of polymorphisms in VKORC1 (3673G > A, rs9923231), CYP2C9 (alleles *2 and *3, rs1799853 and rs1057910, respectively) and arguably CYP4F2 (rs2108622) may capture most of the pharmacogenomic influence on warfarin dose variance in White populations, additional polymorphisms in these and other genes (e.g. CALU rs339097) increase the predictive power of pharmacogenetic warfarin dosing algorithms in the Black populations examined. A personalized strategy for the initiation of warfarin therapy, allowing for improved safety and cost-effectiveness in populations of African descent, must take into account their pharmacogenomic diversity, as well as socio-economic, cultural and medical factors. Accounting for this heterogeneity in algorithms that are ‘friendly’ enough to be adopted by warfarin prescribers worldwide requires gathering information from trials at different population levels, but also demands a critical appraisal of the racial/ethnic labels that are commonly used in the clinical pharmacology literature but do not accurately reflect genetic ancestry and population diversity. PMID:22676711

  8. Application of a dummy eye shield for electron treatment planning

    PubMed Central

    Kang, Sei-Kwon; Park, Soah; Hwang, Taejin; Cheong, Kwang-Ho; Han, Taejin; Kim, Haeyoung; Lee, Me-Yeon; Kim, Kyoung Ju; Oh, Do Hoon; Bae, Hoonsik

    2013-01-01

    Metallic eye shields have been widely used in near-eye treatments to protect critical regions, but have never been incorporated into treatment plans because of the unwanted metal artifacts they produce on CT images. The purpose of this work was to test the use of an acrylic dummy eye shield as a substitute for a metallic eye shield during CT scans. An acrylic dummy shield of the same size as the tungsten eye shield was machined and CT scanned. BEAMnrc and DOSXYZnrc were used for the Monte Carlo (MC) simulation, with the appropriate material information and density for the aluminum cover, steel knob and tungsten body of the eye shield. The Pinnacle system, which adopts the Hogstrom electron pencil-beam algorithm, was used for the one-port 6-MeV beam plan after delineation and density override of the metallic parts. The results were confirmed with metal oxide semiconductor field effect transistor (MOSFET) detectors and Gafchromic EBT2 film measurements. For both the maximum eyelid dose over the shield and the maximum dose under the shield, the MC results agreed with the EBT2 measurements within 1.7%. For the Pinnacle plan, the maximum dose under the shield agreed with MC within 0.3%; however, the eyelid dose differed by −19.3%. The adoption of the acrylic dummy eye shield was successful for treatment planning. However, the Pinnacle pencil-beam algorithm was not sufficient to predict the eyelid dose on the tungsten shield, and more accurate algorithms such as MC should be considered for treatment planning. PMID:22915776

  9. A simple method for low-contrast detectability, image quality and dose optimisation with CT iterative reconstruction algorithms and model observers.

    PubMed

    Bellesi, Luca; Wyttenbach, Rolf; Gaudino, Diego; Colleoni, Paolo; Pupillo, Francesco; Carrara, Mauro; Braghetti, Antonio; Puligheddu, Carla; Presilla, Stefano

    2017-01-01

    The aim of this work was to evaluate the detection of low-contrast objects and image quality in computed tomography (CT) phantom images acquired at different tube loadings (i.e. mAs) and reconstructed with different algorithms, in order to find appropriate settings to reduce the dose to the patient without any detriment to the image. Images of supraslice low-contrast objects of a CT phantom were acquired using different mAs values. Images were reconstructed using filtered back projection (FBP), hybrid and iterative model-based methods. Image quality parameters were evaluated in terms of modulation transfer function, noise, and uniformity using two software resources. For the definition of low-contrast detectability, studies based on both human (i.e. four-alternative forced-choice test) and model observers were performed across the various images. Compared to FBP, image quality parameters were improved by using iterative reconstruction (IR) algorithms. In particular, IR model-based methods provided a 60% noise reduction and a 70% dose reduction, preserving image quality and low-contrast detectability for human radiological evaluation. According to the model observer, the diameter of the minimum detectable detail was around 2 mm (down to 100 mAs). Below 100 mAs, the model observer was unable to provide a result. IR methods improve CT protocol quality, providing a potential dose reduction while maintaining good image detectability. Model observers can in principle be used to assist human performance in CT low-contrast detection tasks and in dose optimisation.
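    As a rough illustration of the model-observer idea (not the specific observer used in this work), a non-prewhitening matched-filter detectability index for a known disk signal in uncorrelated noise can be sketched as:

```python
import numpy as np

def npw_dprime(signal, noise_std):
    """Non-prewhitening matched-filter detectability index d' for a known
    signal in uncorrelated Gaussian noise with standard deviation noise_std."""
    return np.sqrt(np.sum(signal ** 2)) / noise_std

# Hypothetical low-contrast disk (radius 4 pixels, 5 HU contrast), assumed noise.
y, x = np.mgrid[-8:9, -8:9]
disk = ((x ** 2 + y ** 2) <= 4 ** 2).astype(float) * 5.0
print(npw_dprime(disk, noise_std=10.0))  # → 3.5
```

A larger or higher-contrast disk raises d'; more noise lowers it, mirroring the dose/detectability trade-off studied above.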

  10. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT.

    PubMed

    Park, Justin C; Li, Jonathan G; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-04-01

    The use of sophisticated dose calculation procedures in modern radiation therapy treatment planning is inevitable in order to account for the complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of the clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving accuracy. The computational time of the finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modeled such that the beamlets representing an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with a minimal number of beamlets of different sizes. In addition, the authors included model parameters to account for the rounded leaf edge and transmission of the MLC. Root mean square errors (RMSE) between the treatment planning system and conventional FSPB on a 10 × 10 cm² square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 4.90%, 3.19%, and 2.87%, respectively, compared with RMSEs of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm², where the RMSEs for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSEs of 0.86%, 0.83%, and 0.88% for AB-FSPB. AB-FSPB was found to account for MLC transmission without major discrepancy. The algorithm was also graphics processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy field (∼12 segments) and a volumetric modulated arc therapy field (∼90 control points) with a 3D grid size of 2.0 × 2.0 × 2.0 mm³, dose was computed within 3-5 and 10-15 s, respectively. The authors have developed an efficient adaptive beamlet-based pencil beam dose calculation algorithm. Its fast computation and GPU compatibility give it better performance than conventional FSPB, enabling the implementation of AB-FSPB in the clinical environment for independent volumetric dose verification.
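    The RMSE figures above are a straightforward aggregate over the dose grid; a minimal sketch with assumed dose values (not the study's data):

```python
import numpy as np

def rmse_percent(dose_a, dose_b, norm):
    """Root mean square difference between two dose grids, as a percent of
    a normalization dose (e.g. the prescription dose)."""
    a, b = np.asarray(dose_a, float), np.asarray(dose_b, float)
    return 100.0 * np.sqrt(np.mean((a - b) ** 2)) / norm

# Hypothetical reference vs pencil-beam doses (Gy) at a few grid points.
reference = [2.00, 1.98, 1.95, 2.02]
pencil = [1.96, 2.00, 1.99, 1.98]
print(round(rmse_percent(reference, pencil, norm=2.00), 2))  # → 1.8
```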

  11. Experimental verification of a commercial Monte Carlo-based dose calculation module for high-energy photon beams.

    PubMed

    Künzler, Thomas; Fotina, Irina; Stock, Markus; Georg, Dietmar

    2009-12-21

    The dosimetric performance of a Monte Carlo algorithm as implemented in a commercial treatment planning system (iPlan, BrainLAB) was investigated. After commissioning and basic beam data tests in homogeneous phantoms, a variety of single regular beams and clinical field arrangements were tested in heterogeneous conditions (conformal therapy, arc therapy and intensity-modulated radiotherapy including simultaneous integrated boosts). More specifically, a cork phantom containing a concave-shaped target was designed to challenge the Monte Carlo algorithm in more complex treatment cases. All test irradiations were performed on an Elekta linac providing 6, 10 and 18 MV photon beams. Absolute and relative dose measurements were performed with ion chambers and near-tissue-equivalent radiochromic films placed within a transverse plane of the cork phantom. For simple fields, a 1D gamma (γ) procedure with a 2% dose difference and a 2 mm distance to agreement (DTA) was applied to depth dose curves, as well as to inplane and crossplane profiles. The average γ value was 0.21 across all energies for the simple test cases. Depth dose curves in asymmetric beams yielded γ results similar to those of symmetric beams. Simple regular fields showed excellent absolute dosimetric agreement with measured values, with a dose difference of 0.1% ± 0.9% (1 standard deviation) at the dose prescription point. A more detailed analysis at tissue interfaces revealed dose discrepancies of 2.9% for an 18 MV 10 × 10 cm² field at the first density interface from tissue to lung-equivalent material. Small fields (2 × 2 cm²) had their largest discrepancy in the re-build-up at the second interface (from lung- to tissue-equivalent material), with a local dose difference of about 9% and a DTA of 1.1 mm at 18 MV. Conformal field arrangements, arc therapy, as well as IMRT beams and simultaneous integrated boosts were in good agreement with absolute dose measurements in the heterogeneous phantom. For the clinical test cases, the average dose discrepancy was 0.5% ± 1.1%. Relative dose investigations of the transverse plane for clinical beam arrangements were performed with a 2D γ-evaluation procedure. For 3% dose difference and 3 mm DTA criteria, the average fraction of points with γ > 1 was 4.7% ± 3.7%, the average γ1% value was 1.19 ± 0.16 and the mean 2D γ value was 0.44 ± 0.07 in the heterogeneous phantom. The iPlan MC algorithm leads to accurate dosimetric results under clinical test conditions.
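    The 1D gamma evaluation used above can be sketched in a few lines; this is a generic global-gamma implementation with assumed sample data, not the authors' code:

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.02, dta=2.0):
    """Global 1D gamma index: dose criterion dd (fraction of the reference
    maximum) and distance-to-agreement dta (mm). For each reference point,
    gamma is the minimum combined dose/distance metric over evaluated points."""
    ref_max = np.max(ref_dose)
    eval_pos = np.asarray(eval_pos, float)
    eval_dose = np.asarray(eval_dose, float)
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        dist_term = ((eval_pos - rp) / dta) ** 2
        dose_term = ((eval_dose - rd) / (dd * ref_max)) ** 2
        gammas.append(np.sqrt(np.min(dist_term + dose_term)))
    return np.array(gammas)

# Hypothetical measured vs calculated depth-dose samples (positions in mm).
pos = np.arange(0.0, 10.0, 1.0)
ref = np.array([50, 80, 100, 95, 90, 85, 80, 76, 72, 68], float)
ev = ref + np.array([0.5, -1.0, 1.0, 0.5, -0.5, 0.0, 0.8, -0.4, 0.2, 0.1])
g = gamma_1d(pos, ref, pos, ev)
print(round(float(g.mean()), 3), float((g <= 1.0).mean()))
```

The pass rate is the fraction of reference points with γ ≤ 1; clinical tools add dose thresholds and finer spatial interpolation on top of this core.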

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klüter, Sebastian, E-mail: sebastian.klueter@med.uni-heidelberg.de; Schubert, Kai; Lissner, Steffen

    Purpose: The dosimetric verification of treatment plans in helical tomotherapy is usually carried out via verification measurements. In this study, a method for independent dose calculation of tomotherapy treatment plans is presented that uses a conventional treatment planning system with a pencil kernel dose calculation algorithm to generate verification dose distributions based on patient CT data. Methods: A pencil beam algorithm that directly uses measured beam data was configured for dose calculation for a tomotherapy machine. Tomotherapy treatment plans were converted into a format readable by an in-house treatment planning system by assigning each projection to one static treatment field and shifting the calculation isocenter for each field in order to account for the couch movement. The modulation of the fluence for each projection is read out of the delivery sinogram, and with the kernel-based dose calculation this information can be used directly without the need for decomposition of the sinogram. The sinogram values are only corrected for leaf output and leaf latency. Using the converted treatment plans, dose was recalculated with the independent treatment planning system. Multiple treatment plans, ranging from simple static fields to real patient treatment plans, were calculated using the new approach and either compared to actual measurements or to the 3D dose distribution calculated by the tomotherapy treatment planning system. In addition, dose–volume histograms were calculated for the patient plans. Results: Except for minor deviations at the maximum field size, the pencil beam dose calculation for static beams agreed with measurements in a water tank within 2%/2 mm. A mean deviation from point dose measurements in the cheese phantom of 0.89% ± 0.81% was found for unmodulated helical plans.
    A mean voxel-based deviation of −0.67% ± 1.11% for all voxels in the respective high dose region (dose values >80%), and a mean local voxel-based deviation of −2.41% ± 0.75% for all voxels with dose values >20%, were found for 11 modulated plans in the cheese phantom. Averaged over nine patient plans, the deviations amounted to −0.14% ± 1.97% (voxels >80%) and −0.95% ± 2.27% (>20%, local deviations). For a lung case, mean voxel-based deviations of more than 4% were found, while for all other patient plans all mean voxel-based deviations were within ±2.4%. Conclusions: The presented method is suitable for independent dose calculation for helical tomotherapy within the known limitations of the pencil beam algorithm. It can serve as verification of the primary dose calculation and thereby reduce the need for time-consuming measurements. By using the patient anatomy and generating full 3D dose data, combined with measurements of additional machine parameters, it can substantially contribute to overall patient safety.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCulloch, M; Cazoulat, G; Polan, D

    Purpose: It is well documented that the dose delivered to patients undergoing radiotherapy (RT) is often different from the planned dose, due to geometric variability and uncertainties in patient positioning. Recent work suggests that the accumulated dose to the GTV is a better predictor of progression than the minimum planned dose to the PTV. The purpose of this study is to evaluate whether deviations from the planned dose may have contributed to tumor progression. Methods: From 2010 to 2014, an in-house Phase II clinical trial of adaptive stereotactic body RT was completed. Of the 90 patients enrolled, 7 had a local recurrence defined on contrast-enhanced CT or MR imaging 3–21 months after completion of RT. Retrospective dose accumulation was performed using a biomechanical model-based deformable image registration (DIR) algorithm to accumulate the dose based on the kV CBCT acquired prior to each fraction for soft tissue alignment of the patient. The DIR algorithm was previously validated for geometric accuracy in the liver (target registration error = 2.0 mm) and for dose accumulation in a homogeneous image similar to a liver CBCT (gamma index = 91%). Following dose accumulation, the minimum dose to 0.5 cc of the GTV was compared between the planned and accumulated dose. Work is ongoing to evaluate the tumor control probability based on the planned and accumulated dose. Results: DIR and dose accumulation were performed on all fractions for 6 patients with local recurrence. The difference in minimum dose to 0.5 cc of the GTV ranged from −0.3 to 2.3 Gy over 3–5 fractions. One patient had a potentially significant difference in minimum dose of 2.3 Gy. Conclusion: Dose accumulation can reveal tumor underdosage, improving our ability to understand recurrence and tumor progression patterns, and could aid in adaptive re-planning during therapy to correct for this. This work was supported in part by NIH P01CA059827.

  14. Deformable Dose Reconstruction to Optimize the Planning and Delivery of Liver Cancer Radiotherapy

    NASA Astrophysics Data System (ADS)

    Velec, Michael

    The precise delivery of radiation to liver cancer patients results in improved control with higher tumor doses and minimized normal tissue doses. However, a margin of normal tissue around the tumor must be irradiated to account for treatment delivery uncertainties. Daily image guidance allows targeting of the liver, a surrogate for the tumor, to reduce geometric errors. However, poor direct tumor visualization, anatomical deformation and breathing motion introduce uncertainties between the planned dose, calculated on a single pre-treatment computed tomography image, and the dose that is delivered. A novel deformable image registration algorithm based on tissue biomechanics was applied to previous liver cancer patients to track targets and surrounding organs during radiotherapy. Modeling these daily anatomic variations permitted dose accumulation, thereby improving calculations of the delivered doses. The accuracy of the algorithm in tracking dose was validated using imaging from a deformable, 3-dimensional dosimeter able to optically track absorbed dose. Reconstructing the delivered dose revealed that 70% of patients had substantial deviations from the initial planned dose. An alternative image-guidance technique using respiratory-correlated imaging was simulated, which reduced both the residual tumor targeting errors and the magnitude of the delivered dose deviations. A planning and delivery strategy for liver radiotherapy was then developed that minimizes the impact of breathing motion and applies a margin to account for the impact of liver deformation during treatment. This margin is 38% smaller on average than the margin used clinically, and permitted an average dose escalation to liver tumors of 9% for the same risk of toxicity. Simulating the delivered dose with deformable dose reconstruction demonstrated that the plans with smaller margins were robust, as 90% of patients' tumors received the intended dose.
This strategy can be readily implemented with widely available technologies and thus can potentially improve local control for liver cancer patients receiving radiotherapy.

  15. Targeting MRS-Defined Dominant Intraprostatic Lesions with Inverse-Planned High Dose Rate Brachytherapy

    DTIC Science & Technology

    2008-06-01

    brachytherapy treatment planning has been demonstrated. Using the inverse planning program IPSA, dose escalation of target regions with a higher tumor...algorithm (called IPSA) was used to generate dose distributions for five different levels of DIL-boost, at least 110%, 120%, 130%, 140% and 150...and LDR, VI Last Generation Radiotherapy Course, São Paulo, Brazil, Oct. 19, 2006. Principles and Clinical Applications of IPSA; Nucletron

  16. Treatment planning with intensity modulated particle therapy for multiple targets in stage IV non-small cell lung cancer

    NASA Astrophysics Data System (ADS)

    Anderle, Kristjan; Stroom, Joep; Vieira, Sandra; Pimentel, Nuno; Greco, Carlo; Durante, Marco; Graeff, Christian

    2018-01-01

    Intensity modulated particle therapy (IMPT) can produce highly conformal plans, but is limited in advanced lung cancer patients with multiple lesions due to motion and planning complexity. A 4D IMPT optimization including all motion states was expanded to include multiple targets, where each target (isocenter) is assigned to specific field(s). Furthermore, to achieve stereotactic treatment planning objectives, target and OAR weights plus objective doses were automatically and iteratively adapted. Finally, 4D doses were calculated for different motion scenarios. The results from our algorithm were compared to clinical stereotactic body radiation therapy (SBRT) plans. The study included eight patients with 24 lesions in total. The intended dose regimen for SBRT was 24 Gy in one fraction, but lower fractionated doses had to be delivered in three cases due to OAR constraints or failed plan quality assurance. The resulting IMPT treatment plans showed no significant difference in target coverage compared to the SBRT plans. The maximum point dose and the dose to specific volumes in OARs were on average 65% and 22% smaller with IMPT. IMPT could also deliver 24 Gy in one fraction in a patient for whom SBRT was limited by OAR vicinity. The developed algorithm shows the potential of IMPT for the treatment of multiple moving targets in a complex geometry.
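    The automatic weight adaptation described above can be caricatured with a one-variable toy problem (the quadratic objective and all numbers are assumptions, not the authors' implementation): a single fluence weight x gives target dose a·x and OAR dose b·x, and the OAR penalty weight is escalated and the problem re-solved until the OAR constraint is met:

```python
def solve(w_t, w_o, a, b, d_presc):
    """Closed-form minimizer of w_t*(a*x - d_presc)**2 + w_o*(b*x)**2."""
    return w_t * a * d_presc / (w_t * a * a + w_o * b * b)

a, b = 1.0, 0.5                  # dose per unit fluence to target and OAR (assumed)
d_presc, oar_limit = 24.0, 8.0   # Gy: SBRT-like prescription and a toy OAR limit
w_t, w_o = 1.0, 0.1
for _ in range(50):
    x = solve(w_t, w_o, a, b, d_presc)
    if b * x <= oar_limit:       # OAR objective met: stop adapting
        break
    w_o *= 2.0                   # escalate the OAR weight and re-optimize
print(round(a * x, 2), round(b * x, 2))
```

The loop illustrates the trade-off the adaptation navigates: raising the OAR weight pulls its dose under the limit at the cost of target coverage, which real planners then recover with more fields and degrees of freedom.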

  17. Application of Machine-Learning Models to Predict Tacrolimus Stable Dose in Renal Transplant Recipients

    NASA Astrophysics Data System (ADS)

    Tang, Jie; Liu, Rong; Zhang, Yue-Li; Liu, Mou-Ze; Hu, Yong-Fang; Shao, Ming-Jie; Zhu, Li-Jun; Xin, Hua-Wen; Feng, Gui-Wen; Shang, Wen-Jun; Meng, Xiang-Guang; Zhang, Li-Rong; Ming, Ying-Zi; Zhang, Wei

    2017-02-01

    Tacrolimus has a narrow therapeutic window and considerable variability in clinical use. Our goal was to compare the performance of multiple linear regression (MLR) and eight machine learning techniques in pharmacogenetic algorithm-based prediction of tacrolimus stable dose (TSD) in a large Chinese cohort. A total of 1,045 renal transplant patients were recruited, 80% of whom were randomly selected as the “derivation cohort” to develop the dose-prediction algorithms, while the remaining 20% constituted the “validation cohort” used to test the final selected algorithm. MLR, artificial neural network (ANN), regression tree (RT), multivariate adaptive regression splines (MARS), boosted regression tree (BRT), support vector regression (SVR), random forest regression (RFR), lasso regression (LAR) and Bayesian additive regression trees (BART) were applied and their performances compared. Among all the machine learning models, RT performed best in both the derivation [0.71 (0.67-0.76)] and validation cohorts [0.73 (0.63-0.82)]. In addition, the ideal rate of RT was 4% higher than that of MLR. To our knowledge, this is the first study to use machine learning models to predict TSD, which will further facilitate personalized medicine in tacrolimus administration.
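    The "ideal rate" reported for such dosing algorithms is typically the fraction of patients whose predicted dose falls within ±20% of the actual stable dose; a minimal sketch (the ±20% window and the example doses are assumptions, not data from the study):

```python
def ideal_rate(predicted, actual, tol=0.20):
    """Fraction of predictions within ±tol (default 20%) of the actual dose."""
    hits = sum(1 for p, a in zip(predicted, actual) if abs(p - a) <= tol * a)
    return hits / len(actual)

# Hypothetical tacrolimus stable doses (mg/day) for five patients.
actual = [3.0, 4.0, 2.5, 5.0, 3.5]
predicted = [3.2, 3.0, 2.6, 5.5, 4.4]
print(ideal_rate(predicted, actual))  # → 0.6
```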

  18. Analyser-based mammography using single-image reconstruction.

    PubMed

    Briedis, Dahliyani; Siu, Karen K W; Paganin, David M; Pavlov, Konstantin M; Lewis, Rob A

    2005-08-07

    We implement an algorithm that decodes a single analyser-based x-ray phase-contrast image of a sample, converting it into an equivalent conventional absorption-contrast radiograph. The algorithm assumes the projection approximation for x-ray propagation in a single-material object embedded in a substrate of approximately uniform thickness. Unlike the phase-contrast images, which have both a directional bias and a bias towards edges present in the sample, the reconstructed images are directly interpretable in terms of the projected absorption coefficient of the sample. The technique was applied to a Leeds TOR[MAM] phantom, which is designed to test mammogram quality through the inclusion of simulated microcalcifications, filaments and circular discs. This phantom was imaged at varying doses using three modalities: analyser-based synchrotron phase-contrast imaging converted to equivalent absorption radiographs using our algorithm, slot-scanned synchrotron imaging, and imaging using a conventional mammography unit. Features in the resulting images were then assigned a quality score by volunteers. The single-image reconstruction method achieved higher scores at equivalent and lower doses than conventional mammography, but showed no improvement in the visualization of the simulated microcalcifications and some degradation in image quality at reduced doses for filament features.

  19. Sub-second pencil beam dose calculation on GPU for adaptive proton therapy

    NASA Astrophysics Data System (ADS)

    da Silva, Joakim; Ansorge, Richard; Jena, Rajesh

    2015-06-01

    Although proton therapy delivered using scanned pencil beams has the potential to produce better dose conformity than conventional radiotherapy, the created dose distributions are more sensitive to anatomical changes and patient motion. Therefore, the introduction of adaptive treatment techniques where the dose can be monitored as it is being delivered is highly desirable. We present a GPU-based dose calculation engine relying on the widely used pencil beam algorithm, developed for on-line dose calculation. The calculation engine was implemented from scratch, with each step of the algorithm parallelized and adapted to run efficiently on the GPU architecture. To ensure fast calculation, it employs several application-specific modifications and simplifications, and a fast scatter-based implementation of the computationally expensive kernel superposition step. The calculation time for a skull base treatment plan using two beam directions was 0.22 s on an Nvidia Tesla K40 GPU, whereas a test case of a cubic target in water from the literature took 0.14 s to calculate. The accuracy of the patient dose distributions was assessed by calculating the γ-index with respect to a gold standard Monte Carlo simulation. The passing rates were 99.2% and 96.7%, respectively, for the 3%/3 mm and 2%/2 mm criteria, matching those produced by a clinical treatment planning system.

  20. Insulin algorithms in the self-management of insulin-dependent diabetes: the interactive 'Apple Juice' program.

    PubMed

    Williams, A G

    1996-01-01

    The 'Apple Juice' program is an interactive diabetes self-management program which runs on a lap-top Macintosh PowerBook 100 computer. The dose-by-dose insulin advisory program was initially designed for children with insulin-dependent (type 1) diabetes mellitus. It utilizes several different insulin algorithms, measurement formulae, and compensation factors for meals, activity, medication and the dawn phenomenon. It was developed to assist the individual with diabetes and/or care providers in determining specific insulin dosage recommendations throughout a 24 h period. Information technology functions include, but are not limited to, automated record keeping, data recall, event reminders, data trend/pattern analyses and education. This paper highlights issues, observations and recommendations surrounding the use of the current version of the software, along with a detailed description of the insulin algorithms and measurement formulae applied successfully with the author's daughter over a six-year period.
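    A dose-by-dose advisory algorithm of this kind typically combines a carbohydrate-coverage term with a blood-glucose correction term and an activity compensation factor. The sketch below is a generic textbook formula with assumed parameters, for illustration only; it is not the actual 'Apple Juice' algorithm and is not dosing advice:

```python
def bolus_dose(carbs_g, bg, target_bg, carb_ratio, sensitivity, activity_factor=1.0):
    """Suggested insulin bolus (units).
    carb_ratio: grams of carbohydrate covered by 1 U of insulin.
    sensitivity: mg/dL drop in blood glucose per 1 U of insulin.
    activity_factor: <1.0 reduces the dose around exercise (assumed convention)."""
    meal_dose = carbs_g / carb_ratio
    correction = max(0.0, (bg - target_bg) / sensitivity)
    return round((meal_dose + correction) * activity_factor, 1)

# 60 g meal, BG 180 mg/dL vs a 100 mg/dL target, 1:15 ratio, 50 mg/dL/U.
print(bolus_dose(60, 180, 100, 15, 50))  # → 5.6
```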

  1. SU-F-J-66: Anatomy Deformation Based Comparison Between One-Step and Two-Step Optimization for Online ART

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Z; Yu, G; Qin, S

    Purpose: This study investigated how the quality of the adapted plan is affected by inter-fractional anatomy deformation when using one-step and two-step optimization in an online adaptive radiotherapy (ART) procedure. Methods: 10 lung carcinoma patients were chosen randomly to produce IMRT plans by one-step and two-step algorithms respectively, and the prescribed dose was set to 60 Gy to the planning target volume (PTV) for all patients. To simulate inter-fractional target deformation, four specific cases were created by systematic anatomy variation, including a target superior shift of 0.5 cm, 0.3 cm contraction, 0.3 cm expansion and 45-degree rotation. Based on these four anatomy deformations, adapted plans, regenerated plans and non-adapted plans were created to evaluate the quality of adaptation. Adapted plans were generated automatically by using the one-step and two-step algorithms respectively to optimize the original plans, and regenerated plans were created manually by experienced physicists. Non-adapted plans were produced by recalculating the dose distribution based on the corresponding original plans. The deviations among these three plans were statistically analyzed by paired T-test. Results: In the PTV superior shift case, adapted plans had significantly better PTV coverage with the two-step algorithm than with the one-step one, and there was a significant difference in V95 between adapted and non-adapted plans (p=0.0025). In the target contraction deformation, with almost the same PTV coverage, the total lung received a lower dose using the one-step algorithm than the two-step algorithm (p=0.0143, 0.0126 for V20 and Dmean, respectively). In the other two deformation cases, no significant differences were observed for either optimization algorithm. Conclusion: In geometry deformations such as target contraction, with comparable PTV coverage, the one-step algorithm gave better OAR sparing than the two-step algorithm. Conversely, adaptation using the two-step algorithm had higher efficiency and accuracy when the target was displaced. We want to thank Dr. Lei Xing and Dr. Yong Yang of the Stanford University School of Medicine for this work. This work was jointly supported by NSFC (61471226), Natural Science Foundation for Distinguished Young Scholars of Shandong Province (JQ201516), and China Postdoctoral Science Foundation (2015T80739, 2014M551949).

  2. Feasibility of using two-dimensional array dosimeter for in vivo dose reconstruction via transit dosimetry.

    PubMed

    Chung, Heeteak; Li, Jonathan; Samant, Sanjiv

    2011-04-08

    Two-dimensional array dosimeters are commonly used for pretreatment quality assurance procedures, which makes them highly desirable for measuring transit fluences for in vivo dose reconstruction. The purpose of this study was to determine whether in vivo dose reconstruction via transit dosimetry using a 2D array dosimeter is possible. To test the accuracy of measuring transit dose distributions with a 2D array dosimeter, we evaluated it against measurements made using ionization chamber and radiochromic film (RCF) profiles for various air gap distances (from the exit side of the solid water slabs to the detector: 0 cm, 30 cm, 40 cm, 50 cm, and 60 cm) and solid water slab thicknesses (10 cm and 20 cm). The backprojection dose reconstruction algorithm was described and evaluated. The agreement between the ionization chamber and RCF profiles for the transit dose distribution measurements ranged from −0.2% to 4.0% (average 1.79%). Using the backprojection dose reconstruction algorithm, we found that, of the six conformal fields, four had a 100% gamma index passing rate (3%/3 mm gamma index criteria), and two had passing rates of 99.4% and 99.6%. Of the five IMRT fields, three had a 100% passing rate, and two had passing rates of 99.6% and 98.8%. It was found that a 2D array dosimeter can be used for backprojection dose reconstruction for in vivo dosimetry.

  3. A new scanning device in CT with dose reduction potential

    NASA Astrophysics Data System (ADS)

    Tischenko, Oleg; Xu, Yuan; Hoeschen, Christoph

    2006-03-01

    The amount of x-ray radiation currently applied in CT practice is not utilized optimally. A portion of the radiation traversing the patient is either not detected at all or is used ineffectively. The reason lies partly in the reconstruction algorithms and partly in the geometry of the CT scanners designed specifically for these algorithms. In fact, the reconstruction methods widely used in CT are intended to invert data that correspond to ideal straight lines. However, the collection of such data is often not accurate due to the likely movement of the source/detector system of the scanner in the time interval during which all the detectors are read. In this paper, a new design of the scanner geometry is proposed that is immune to the movement of the CT system and will collect all radiation traversing the patient. The proposed scanning design has the potential to reduce the patient dose by a factor of two. Furthermore, it can be used with existing reconstruction algorithms, and it is particularly suitable for OPED, a new robust reconstruction algorithm.

  4. Comparison of Body Weight Trend Algorithms for Prediction of Heart Failure Related Events in Home Care Setting.

    PubMed

    Eggerth, Alphons; Modre-Osprian, Robert; Hayn, Dieter; Kastner, Peter; Pölzl, Gerhard; Schreier, Günter

    2017-01-01

    Automatic event detection is used in telemedicine-based heart failure disease management programs, supporting physicians and nurses in monitoring patients' health data. We analysed the performance of automatic event detection algorithms for prediction of HF related hospitalisations or diuretic dose increases. The Rule-of-Thumb (RoT) and Moving Average Convergence Divergence (MACD) algorithms were applied to body weight data from 106 heart failure patients of the HerzMobil-Tirol disease management program. The evaluation criteria were based on the Youden index and ROC curves. Analysis of data from 1460 monitoring weeks with 54 events showed a maximum Youden index of 0.19 for MACD and RoT with a specificity > 0.90. Comparison of the two algorithms on real-world monitoring data showed similar results regarding total and limited AUC. An improvement in sensitivity might be possible by including additional health data (e.g. vital signs and self-reported well-being), because body weight variations are obviously not the only cause of HF related hospitalisations or diuretic dose increases.
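Both detectors are simple enough to sketch on a daily weight series. The thresholds below (2 kg gain over 3 days for RoT, a 0.5 kg fast-minus-slow EMA gap for MACD, and the EMA spans) are illustrative assumptions; the paper's tuned parameters are not reproduced here:

```python
import numpy as np

def ema(series, span):
    """Exponential moving average with smoothing factor 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out = [series[0]]
    for v in series[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return np.array(out)

def macd_alarm(weights, fast=7, slow=30, threshold_kg=0.5):
    """Flag days where the fast-minus-slow EMA difference exceeds an
    illustrative threshold, indicating sustained weight gain."""
    macd = ema(weights, fast) - ema(weights, slow)
    return macd > threshold_kg

def rule_of_thumb_alarm(weights, gain_kg=2.0, window_days=3):
    """Classic rule of thumb: flag a weight gain above gain_kg relative
    to the minimum of the preceding window_days days."""
    w = np.asarray(weights, float)
    alarm = np.zeros(len(w), bool)
    for i in range(window_days, len(w)):
        alarm[i] = w[i] - w[i - window_days:i].min() > gain_kg
    return alarm
```

A stable 80 kg series triggers neither detector, while a 3 kg jump trips both.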

  5. Stereotactic, Single-Dose Irradiation of Lung Tumors: A Comparison of Absolute Dose and Dose Distribution Between Pencil Beam and Monte Carlo Algorithms Based on Actual Patient CT Scans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen Huixiao; Lohr, Frank; Fritz, Peter

    2010-11-01

    Purpose: Dose calculation based on pencil beam (PB) algorithms has shortcomings in predicting dose in tissue heterogeneities. The aim of this study was to compare dose distributions of clinically applied non-intensity-modulated 15-MV plans for stereotactic body radiotherapy between voxel Monte Carlo (XVMC) calculation and PB calculation for lung lesions. Methods and Materials: To validate XVMC, one treatment plan was verified in an inhomogeneous thorax phantom with EDR2 film (Eastman Kodak, Rochester, NY). Both measured and calculated (PB and XVMC) dose distributions were compared regarding profiles and isodoses. Then, 35 lung plans originally created for clinical treatment by PB calculation with the Eclipse planning system (Varian Medical Systems, Palo Alto, CA) were recalculated by XVMC (investigational implementation in PrecisePLAN [Elekta AB, Stockholm, Sweden]). Clinically relevant dose-volume parameters for target and lung tissue were compared and analyzed statistically. Results: The XVMC calculation agreed well with film measurements (<1% difference in lateral profile), whereas the deviation between PB calculation and film measurements was up to +15%. On analysis of 35 clinical cases, the mean dose, minimal dose, and coverage dose value for 95% volume of gross tumor volume were 1.14 ± 1.72 Gy, 1.68 ± 1.47 Gy, and 1.24 ± 1.04 Gy lower by XVMC compared with PB, respectively (prescription dose, 30 Gy). The volume covered by the 9 Gy isodose of lung was 2.73% ± 3.12% higher when calculated by XVMC compared with PB. The largest differences were observed for small lesions circumferentially encompassed by lung tissue. Conclusions: Pencil beam dose calculation overestimates dose to the tumor and underestimates lung volumes exposed to a given dose consistently for 15-MV photons. The degree of difference between XVMC and PB is tumor size and location dependent. Therefore, XVMC calculation is helpful to further optimize treatment planning.

  6. Estimating the uncertainty of calculated out-of-field organ dose from a commercial treatment planning system.

    PubMed

    Wang, Lilie; Ding, George X

    2018-06-12

    Therapeutic radiation to cancer patients is accompanied by unintended radiation to organs outside the treatment field. It is known that model-based dose algorithms have limitations in calculating out-of-field doses. This study evaluated the out-of-field dose calculated by the Varian Eclipse treatment planning system (v.11 with the AAA algorithm) in realistic treatment plans, with the goal of estimating the uncertainties of calculated organ doses. Photon beam phase-space files for the TrueBeam linear accelerator were provided by Varian. These were used as incident sources in EGSnrc Monte Carlo simulations of radiation transport through the downstream jaws and MLC. Dynamic movements of the MLC leaves were fully modeled based on treatment plans using IMRT or VMAT techniques. The Monte Carlo calculated out-of-field doses were then compared with those calculated by Eclipse. The dose comparisons were performed for different beam energies and treatment sites, including head-and-neck, lung, and pelvis. For 6 MV (FF/FFF), 10 MV (FF/FFF), and 15 MV (FF) beams, Eclipse underestimated out-of-field local doses by 30%-50% compared with Monte Carlo calculations when the local dose was <1% of the prescribed dose. The accuracy of out-of-field dose calculations using Eclipse is improved when the collimator jaws are set at the smallest possible aperture for the MLC openings. The Eclipse system consistently underestimates out-of-field dose by a factor of 2 for all beam energies studied at local dose levels below 1% of the prescribed dose. These findings are useful in providing information on the uncertainties of out-of-field organ doses calculated by the Eclipse treatment planning system. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  7. Commissioning and validation of COMPASS system for VMAT patient specific quality assurance

    NASA Astrophysics Data System (ADS)

    Pimthong, J.; Kakanaporn, C.; Tuntipumiamorn, L.; Laojunun, P.; Iampongpaiboon, P.

    2016-03-01

    Pre-treatment patient-specific quality assurance (QA) of advanced treatment techniques such as volumetric modulated arc therapy (VMAT) is one of the important QA procedures in radiotherapy, and a fast and reliable dosimetric device is required. The objective of this study was to commission and validate the performance of the COMPASS system for dose verification of the VMAT technique. The COMPASS system is composed of an array of ionization detectors (MatriXX) mounted to the gantry using a custom holder, and software for the analysis and visualization of QA results. We validated the COMPASS software for basic and advanced clinical applications. For the basic clinical study, simple open fields of various sizes were validated in a homogeneous phantom. For the advanced clinical application, fifteen prostate and fifteen nasopharyngeal cancer VMAT plans were studied. The treatment plans were measured with the MatriXX. The doses and dose-volume histograms (DVHs) reconstructed from the fluence measurements were compared to the TPS-calculated plans. In addition, the doses and DVHs computed using the collapsed cone convolution (CCC) algorithm were compared with Eclipse TPS plans calculated using the Analytical Anisotropic Algorithm (AAA), according to the dose specified in ICRU 83 for the PTV.

  8. Percentage depth dose evaluation in heterogeneous media using thermoluminescent dosimetry

    PubMed Central

    da Rosa, L.A.R.; Campos, L.T.; Alves, V.G.L.; Batista, D.V.S.; Facure, A.

    2010-01-01

    The purpose of this study is to investigate the influence of a lung heterogeneity inside a soft tissue phantom on percentage depth dose (PDD). PDD curves were obtained experimentally using LiF:Mg,Ti (TLD‐100) thermoluminescent detectors and applying the Eclipse treatment planning system algorithms Batho, modified Batho (M‐Batho or BMod), equivalent TAR (E‐TAR or EQTAR), and anisotropic analytical algorithm (AAA) for a 15 MV photon beam and field sizes of 1×1, 2×2, 5×5, and 10×10 cm². Monte Carlo simulations were performed using the DOSRZnrc user code of EGSnrc. The experimental results agree with Monte Carlo simulations for all irradiation field sizes. Comparisons with Monte Carlo calculations show that the AAA algorithm provides the best simulations of PDD curves for all field sizes investigated. However, even this algorithm cannot accurately predict PDD values in the lung for field sizes of 1×1 and 2×2 cm². An overdosage in the lung of about 40% and 20% is calculated by the AAA algorithm close to the soft tissue/lung interface for 1×1 and 2×2 cm² field sizes, respectively. It was demonstrated that differences of 100% between Monte Carlo results and the responses of the Batho, modified Batho, and equivalent TAR algorithms may exist inside the lung region for the 1×1 cm² field. PACS number: 87.55.kd
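A PDD curve is simply a measured depth-dose normalized to the dose at the depth of maximum dose (dmax). A minimal sketch of that normalization, independent of any particular detector or algorithm:

```python
import numpy as np

def percentage_depth_dose(depths_cm, doses):
    """Normalize a measured depth-dose curve to its maximum, giving the
    PDD in percent, and report the depth of maximum dose (dmax)."""
    doses = np.asarray(doses, float)
    pdd = 100.0 * doses / doses.max()          # PDD(d) = D(d) / D(dmax) * 100
    d_max = depths_cm[int(np.argmax(doses))]   # depth where the maximum occurs
    return pdd, d_max
```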

  9. A multi-institutional study of independent calculation verification in inhomogeneous media using a simple and effective method of heterogeneity correction integrated with the Clarkson method.

    PubMed

    Jinno, Shunta; Tachibana, Hidenobu; Moriya, Shunsuke; Mizuno, Norifumi; Takahashi, Ryo; Kamima, Tatsuya; Ishibashi, Satoru; Sato, Masanori

    2018-05-21

    In inhomogeneous media, there is often a large systematic difference in the dose between the conventional Clarkson algorithm (C-Clarkson) for independent calculation verification and the superposition-based algorithms of treatment planning systems (TPSs). These treatment site-dependent differences increase the complexity of the radiotherapy planning secondary check. We developed a simple and effective method of heterogeneity correction integrated with the Clarkson algorithm (L-Clarkson) to account for the effects of heterogeneity in the lateral dimension, and performed a multi-institutional study to evaluate the effectiveness of the method. In the method, a 2D image reconstructed from computed tomography (CT) images is divided according to lines extending from the reference point to the edge of the multileaf collimator (MLC) or jaw collimator for each pie sector, and the radiological path length (RPL) of each line is calculated on the 2D image to obtain a tissue maximum ratio and phantom scatter factor, allowing the dose to be calculated. A total of 261 plans (1237 beams) for conventional breast and lung treatments and lung stereotactic body radiotherapy were collected from four institutions. Disagreements in dose between the on-site TPSs and a verification program using the C-Clarkson and L-Clarkson algorithms were compared. Systematic differences with the L-Clarkson method were within 1% for all sites, while the C-Clarkson method resulted in systematic differences of 1-5%. The L-Clarkson method showed smaller variations. This heterogeneity correction integrated with the Clarkson algorithm would provide a simple evaluation within the range of -5% to +5% for a radiotherapy plan secondary check.
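The core quantity in the L-Clarkson correction is the radiological path length (RPL) computed along rays traced from the reference point across a 2D CT-derived density image. A hypothetical sketch of the RPL for a single ray, assuming a relative-density array and nearest-neighbour sampling (not the authors' verification program):

```python
import numpy as np

def radiological_path_length(density, start, end, n_samples=200):
    """Approximate the water-equivalent path length along a ray through a
    2D relative-density image: mean sampled density times geometric
    length (nearest-neighbour lookup at uniformly spaced sample points)."""
    start = np.asarray(start, float)
    end = np.asarray(end, float)
    # Uniformly spaced sample points from start to end (row, col coordinates).
    pts = start + np.linspace(0.0, 1.0, n_samples)[:, None] * (end - start)
    rows = np.clip(np.round(pts[:, 0]).astype(int), 0, density.shape[0] - 1)
    cols = np.clip(np.round(pts[:, 1]).astype(int), 0, density.shape[1] - 1)
    return density[rows, cols].mean() * np.linalg.norm(end - start)
```

In a unit-density phantom the RPL reduces to the geometric length, which makes the sketch easy to sanity-check.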

  10. Evaluation of methods to produce an image library for automatic patient model localization for dose mapping during fluoroscopically guided procedures

    NASA Astrophysics Data System (ADS)

    Kilian-Meneghin, Josh; Xiong, Z.; Rudin, S.; Oines, A.; Bednarek, D. R.

    2017-03-01

    The purpose of this work is to evaluate methods for producing a library of 2D-radiographic images to be correlated to clinical images obtained during a fluoroscopically-guided procedure for automated patient-model localization. The localization algorithm will be used to improve the accuracy of the skin-dose map superimposed on the 3D patient-model of the real-time Dose-Tracking-System (DTS). For the library, 2D images were generated from CT datasets of the SK-150 anthropomorphic phantom using two methods: Schmid's 3D-visualization tool and Plastimatch's digitally-reconstructed-radiograph (DRR) code. Those images, as well as a standard 2D-radiographic image, were correlated to a 2D-fluoroscopic image of a phantom, which represented the clinical-fluoroscopic image, using the Corr2 function in Matlab. The Corr2 function takes two images and outputs the relative correlation between them, which is fed into the localization algorithm. Higher correlation means better alignment of the 3D patient-model with the patient image. In this instance, it was determined that the localization algorithm will succeed when Corr2 returns a correlation of at least 50%. The 3D-visualization tool images returned 55-80% correlation relative to the fluoroscopic image, which was comparable to the correlation for the radiograph. The DRR images returned 61-90% correlation, again comparable to the radiograph. Both methods prove to be sufficient for the localization algorithm and can be produced quickly; however, the DRR method produces more accurate grey-levels. Using the DRR code, a library at varying angles can be produced for the localization algorithm.
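MATLAB's corr2 returns the 2D Pearson correlation coefficient of two equally sized images. A NumPy equivalent of the metric the localization algorithm relies on (a sketch of the standard definition, not the authors' pipeline):

```python
import numpy as np

def corr2(a, b):
    """2D correlation coefficient, as in MATLAB's corr2: Pearson
    correlation of the mean-subtracted, flattened images."""
    a = np.asarray(a, float) - np.mean(a)
    b = np.asarray(b, float) - np.mean(b)
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())
```

Perfectly aligned identical images return 1.0, a contrast-inverted image returns -1.0, and misaligned images fall in between, which is what the >= 50% success criterion thresholds.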

  11. Study of Variation in Dose Calculation Accuracy Between kV Cone-Beam Computed Tomography and kV fan-Beam Computed Tomography

    PubMed Central

    Kaliyaperumal, Venkatesan; Raphael, C. Jomon; Varghese, K. Mathew; Gopu, Paul; Sivakumar, S.; Boban, Minu; Raj, N. Arunai Nambi; Senthilnathan, K.; Babu, P. Ramesh

    2017-01-01

    Cone-beam computed tomography (CBCT) images are presently used for geometric verification in daily patient positioning. In this work, we compared CBCT images with conventional fan-beam CT (FBCT) images in terms of image quality and Hounsfield units (HUs). We also compared the dose calculated using CBCT with that of FBCT. Homogeneous RW3 plates and a Catphan phantom were scanned by FBCT and CBCT. In the RW3 and Catphan phantoms, percentage depth dose (PDD), profiles, isodose distributions (for intensity modulated radiotherapy plans), and calculated dose-volume histograms were compared. The HU difference was within ±20 HU (central region) and ±30 HU (peripheral region) for the homogeneous RW3 plates. In the Catphan phantom, the difference was within ±20 HU in both the central and peripheral areas. The HU differences were within ±30 HU for all HU ranges from -1000 to 990 in phantom and patient images. In treatment plans with simple symmetric and asymmetric fields, the dose difference (DD) between the CBCT plan and the FBCT plan was within 1.2% for both phantoms. In intensity modulated radiotherapy (IMRT) treatment plans, for different target volumes, the difference was <2%. This feasibility study investigated HU variation and dose calculation accuracy between FBCT- and CBCT-based planning and validated inverse planning algorithms with CBCT. In our study, we observed a larger deviation of HU values in the peripheral region compared to the central region. This is due to ring artifacts and scatter contributions, which may prevent the use of CBCT as the primary imaging modality for radiotherapy treatment planning. The reconstruction algorithm needs to be modified further to improve the image quality and the accuracy of HU values. However, our study with TG-119 and intensity modulated radiotherapy test targets shows that CBCT can be used for adaptive replanning, as the recalculation of dose with the anisotropic analytical algorithm is in full accord with conventional planning CT except in the build-up regions. Patient images with CBCT have to be carefully analyzed for any artifacts before using them for such dose calculations. PMID:28974864

  12. An individualized strategy to estimate the effect of deformable registration uncertainty on accumulated dose in the upper abdomen

    NASA Astrophysics Data System (ADS)

    Wang, Yibing; Petit, Steven F.; Vásquez Osorio, Eliana; Gupta, Vikas; Méndez Romero, Alejandra; Heijmen, Ben

    2018-06-01

    In the abdomen, it is challenging to assess the accuracy of deformable image registration (DIR) for individual patients, due to the lack of clear anatomical landmarks, which can hamper clinical applications that require high-accuracy DIR, such as adaptive radiotherapy. In this study, we propose and evaluate a methodology for estimating the impact of uncertainties in DIR on calculated accumulated dose in the upper abdomen, in order to aid decision making in adaptive treatment approaches. Sixteen liver metastasis patients treated with SBRT were evaluated. Each patient had one planning and three daily treatment CT scans. Each daily CT scan was deformably registered 132 times to the planning CT scan, using a wide range of parameter settings for the registration algorithm. A subset of ‘realistic’ registrations was then objectively selected based on distances between mapped and target contours. The underlying 3D transformations of these registrations were used to assess the corresponding uncertainties in voxel positions and delivered dose, with a focus on accumulated maximum doses in the hollow OARs, i.e. esophagus, stomach, and duodenum. The number of realistic registrations varied from 5 to 109, depending on the patient, emphasizing the need for individualized registration parameters. Considering the realistic registrations for all patients, the 99th percentile of the voxel position uncertainties was 5.6 ± 3.3 mm. This translated into a variation (difference between the 1st and 99th percentiles) in accumulated Dmax in hollow OARs of up to 3.3 Gy. For one patient, a violation of the accumulated stomach dose outside the uncertainty band was detected. The observed variation in accumulated OAR doses related to registration uncertainty emphasizes the need to investigate the impact of this uncertainty for any DIR algorithm prior to clinical use for dose accumulation. The proposed method for assessing, on an individual patient basis, the impact of uncertainties in DIR on accumulated dose is in principle applicable to any DIR algorithm that allows variation in registration parameters.

  13. TU-AB-303-08: GPU-Based Software Platform for Efficient Image-Guided Adaptive Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, S; Robinson, A; McNutt, T

    2015-06-15

    Purpose: In this study, we develop an integrated software platform for adaptive radiation therapy (ART) that combines fast and accurate image registration, segmentation, and dose computation/accumulation methods. Methods: The proposed system consists of three key components: (1) deformable image registration (DIR), (2) automatic segmentation, and (3) dose computation/accumulation. The computationally intensive modules, including DIR and dose computation, have been implemented on a graphics processing unit (GPU). All required patient-specific data, including the planning CT (pCT) with contours, daily cone-beam CTs, and the treatment plan, are automatically queried and retrieved from their own databases. To improve the accuracy of DIR between pCT and CBCTs, we use the double force demons DIR algorithm in combination with iterative CBCT intensity correction by local intensity histogram matching. Segmentation of the daily CBCT is then obtained by propagating contours from the pCT. The daily dose delivered to the patient is computed on the registered pCT by a GPU-accelerated superposition/convolution algorithm. Finally, computed daily doses are accumulated to show the total delivered dose to date. Results: Since the accuracy of DIR critically affects the quality of the other processes, we first evaluated our DIR method on eight head-and-neck cancer cases and compared its performance. Normalized mutual information (NMI) and normalized cross-correlation (NCC) were computed as similarity measures; our method produced an overall NMI of 0.663 and NCC of 0.987, outperforming conventional methods by 3.8% and 1.9%, respectively. Experimental results show that our registration method is more consistent and robust than existing algorithms, and also computationally efficient. Computation time at each fraction was around one minute (30-50 seconds for registration and 15-25 seconds for dose computation). Conclusion: We developed an integrated GPU-accelerated software platform that enables accurate and efficient DIR, auto-segmentation, and dose computation, thus supporting an efficient ART workflow. This work was supported by NIH/NCI under grant R42CA137886.

  14. A qualitative and quantitative analysis of radiation dose and image quality of computed tomography images using adaptive statistical iterative reconstruction.

    PubMed

    Hussain, Fahad Ahmed; Mail, Noor; Shamy, Abdulrahman M; Suliman, Alghamdi; Saoudi, Abdelhamid

    2016-05-08

    Image quality is a key issue in radiology, particularly in a clinical setting where it is important to achieve accurate diagnoses while minimizing radiation dose. Some computed tomography (CT) manufacturers have introduced algorithms that claim significant dose reduction. In this study, we assessed CT image quality produced by two reconstruction algorithms provided with GE Healthcare's Discovery 690 Elite positron emission tomography (PET) CT scanner. Image quality was measured for images obtained at various doses with both conventional filtered back-projection (FBP) and adaptive statistical iterative reconstruction (ASIR) algorithms. A standard CT dose index (CTDI) phantom and a pencil ionization chamber were used to measure the CT dose at 120 kVp and an exposure of 260 mAs. Image quality was assessed using two phantoms. CT images of both phantoms were acquired at tube voltage (kV) of 120 with exposures ranging from 25 mAs to 400 mAs. Images were reconstructed using FBP and ASIR ranging from 10% to 100%, then analyzed for noise, low-contrast detectability, contrast-to-noise ratio (CNR), and modulation transfer function (MTF). Noise was 4.6 HU in water phantom images acquired at 260 mAs/FBP 120 kV and 130 mAs/50% ASIR 120 kV. The large objects (frequency < 7 lp/cm) retained fairly acceptable image quality at 130 mAs/50% ASIR, compared to 260 mAs/FBP. The application of ASIR for small objects (frequency > 7 lp/cm) showed poor visibility compared to FBP at 260 mAs and even worse for images acquired at less than 130 mAs. ASIR blending more than 50% at low dose tends to reduce contrast of small objects (frequency > 7 lp/cm). We concluded that dose reduction and ASIR should be applied with close attention if the objects to be detected or diagnosed are small (frequency > 7 lp/cm). Further investigations are required to correlate the small objects (frequency > 7 lp/cm) to patient anatomy and clinical diagnosis.
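Of the reported metrics, CNR is the simplest to reproduce: the absolute difference between the object and background ROI means, divided by the background standard deviation (the image noise). A minimal sketch, with ROI selection assumed to be done elsewhere:

```python
import numpy as np

def contrast_to_noise_ratio(roi, background):
    """CNR: |mean(object ROI) - mean(background ROI)| divided by the
    background standard deviation, taken as the noise estimate."""
    roi = np.asarray(roi, float)
    bg = np.asarray(background, float)
    return abs(roi.mean() - bg.mean()) / bg.std()
```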

  15. Technical Report: Evaluation of peripheral dose for flattening filter free photon beams.

    PubMed

    Covington, E L; Ritter, T A; Moran, J M; Owrangi, A M; Prisciandaro, J I

    2016-08-01

    To develop a comprehensive peripheral dose (PD) dataset for the two unflattened beams of nominal energy 6 and 10 MV for use in clinical care. Measurements were made in a 40 × 120 × 20 cm³ (width × length × depth) stack of solid water using an ionization chamber at varying depths (dmax, 5, and 10 cm), field sizes (3 × 3 to 30 × 30 cm²), and distances from the field edge (5-40 cm). The effects of the multileaf collimator (MLC) and collimator rotation were also evaluated for a 10 × 10 cm² field. Using the same phantom geometry, the accuracy of the analytic anisotropic algorithm (AAA) and the Acuros dose calculation algorithm was assessed and compared to the measured values. The PDs for both the 6 flattening filter free (FFF) and 10 FFF photon beams were found to decrease with increasing distance from the radiation field edge and with decreasing field size. The measured PD was higher for 6 FFF than for 10 FFF for all field sizes and depths. The impact of collimator rotation was not found to be clinically significant when used in conjunction with MLCs. The AAA and Acuros algorithms both underestimated the PD, with average errors of -13.6% and -7.8%, respectively, for all field sizes and depths at distances of 5 and 10 cm from the field edge, but the average error increased to nearly -69% at greater distances. Given the known inaccuracies of peripheral dose calculations, this comprehensive dataset can be used to estimate the out-of-field dose to regions of interest such as organs at risk, electronic implantable devices, and a fetus. While collimator rotation was not found to significantly decrease PD when used in conjunction with MLCs, results are expected to be machine model and beam energy dependent. It is not recommended to use a treatment planning system to estimate PD, due to the underestimation of the out-of-field dose and the inability to calculate dose at extended distances because of the limits of the dose calculation matrix.

  16. Independent calculation of monitor units for VMAT and SPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xin; Bush, Karl; Ding, Aiping

    Purpose: Dose and monitor units (MUs) represent two important facets of a radiation therapy treatment. In current practice, verification of a treatment plan is commonly done in the dose domain, in which a phantom measurement or forward dose calculation is performed to examine the dosimetric accuracy and the MU settings of a given treatment plan. While it is desirable to verify the MU settings directly, a computational framework for obtaining the MU values from a known dose distribution has yet to be developed. This work presents a strategy to independently calculate the MUs from a given dose distribution of volumetric modulated arc therapy (VMAT) and station parameter optimized radiation therapy (SPORT). Methods: The dose at a point can be expressed as a sum of contributions from all the station points (or control points). This relationship forms the basis of the proposed MU verification technique. To proceed, the authors first obtain the matrix elements that characterize the dosimetric contribution of the involved station points by computing the doses at a series of voxels, typically on the prescription surface of the VMAT/SPORT treatment plan, with unit MU settings for all the station points. An in-house Monte Carlo (MC) software is used for the dose matrix calculation. The MUs of the station points are then derived by minimizing the least-squares difference between doses computed by the treatment planning system (TPS) and those of the MC for the selected set of voxels on the prescription surface. The technique is applied to 16 clinical cases with a variety of energies, disease sites, and TPS dose calculation algorithms. Results: For all plans except the lung cases with large tissue density inhomogeneity, the independently computed MUs agree with those of the TPS to within 2.7% for all the station points. In the dose domain, no significant difference between the MC and Eclipse Anisotropic Analytical Algorithm (AAA) dose distributions is found in terms of isodose contours, dose profiles, gamma index, and dose-volume histogram (DVH) for these cases. For the lung cases, the MC-calculated MUs differ significantly from those of the treatment plan computed using AAA. However, the discrepancies are reduced to within 3% when the TPS dose calculation algorithm is switched to a transport equation-based technique (Acuros™). Comparison in the dose domain between the MC and Eclipse AAA/Acuros calculations yields conclusions consistent with the MU calculation. Conclusions: A computational framework relating the MU and dose domains has been established. The framework not only enables direct verification of the MU values of the involved station points of a VMAT plan in the MU domain, but also provides a much-needed mechanism to adaptively modify the MU values of the station points in accordance with a specific change in the dose domain.
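The least-squares step at the heart of this framework solves d = A·m for the station-point MUs m, given the unit-MU dose matrix A and the TPS doses d at the selected voxels. A minimal sketch using an ordinary (unconstrained) least-squares solver; the authors' Monte Carlo-based implementation may differ, e.g. by constraining MUs to be nonnegative:

```python
import numpy as np

def solve_station_mus(dose_matrix, target_dose):
    """Least-squares MU recovery: dose_matrix[i, j] is the dose at voxel i
    delivered by station point j with 1 MU; target_dose holds the TPS
    doses at the selected voxels. Returns one MU value per station point."""
    mus, *_ = np.linalg.lstsq(dose_matrix, target_dose, rcond=None)
    return mus
```

With more voxels than station points the system is overdetermined, and the solver returns the MU vector minimizing the residual dose mismatch.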

  17. Assessment of phase based dose modulation for improved dose efficiency in cardiac CT on an anthropomorphic motion phantom

    NASA Astrophysics Data System (ADS)

    Budde, Adam; Nilsen, Roy; Nett, Brian

    2014-03-01

    State-of-the-art automatic exposure control modulates the tube current across view angle and Z based on patient anatomy for use in axial full-scan reconstructions. Cardiac CT, however, uses a fundamentally different image reconstruction that applies a temporal weighting to reduce motion artifacts. This paper describes a phase-based mA modulation that goes beyond axial and ECG modulation; it uses knowledge of the temporal view weighting applied within the reconstruction algorithm to improve dose efficiency in cardiac CT scanning. Using physical phantoms and synthetic noise emulation, we measure how knowledge of sinogram temporal weighting and the prescribed cardiac phase can be used to improve dose efficiency. First, we validated that a synthetic CT noise emulation method produced realistic image noise. Next, we used the CT noise emulation method to simulate mA modulation on scans of a physical anthropomorphic phantom, where a motion profile corresponding to a heart rate of 60 beats per minute was used. The CT noise emulation method matched noise to lower-dose scans across the image within 1.5% relative error. Using this noise emulation method to simulate modulating the mA while keeping the total dose constant, image variance was reduced by an average of 11.9% on a scan with 50 msec padding, demonstrating improved dose efficiency. Radiation dose reduction in cardiac CT can be achieved while maintaining the same level of image noise through phase-based dose modulation that incorporates knowledge of the cardiac reconstruction algorithm.
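Synthetic low-dose noise emulation typically injects zero-mean Gaussian noise whose variance makes up the difference between the full-dose and target-dose noise levels, exploiting the roughly 1/mA scaling of quantum noise variance. A sketch of that general idea under those stated assumptions; the paper's emulation method may differ in detail (e.g. correlated noise insertion in the sinogram domain):

```python
import numpy as np

def emulate_lower_dose(image, sigma_full, mA_full, mA_target, rng=None):
    """Emulate a lower-tube-current scan by adding Gaussian noise.
    Assuming noise variance scales ~1/mA, the added noise needs
    sigma_add = sigma_full * sqrt(mA_full / mA_target - 1)."""
    rng = np.random.default_rng(rng)
    sigma_add = sigma_full * np.sqrt(mA_full / mA_target - 1.0)
    return image + rng.normal(0.0, sigma_add, size=np.shape(image))
```

Halving the mA of a scan with 10 HU noise adds another 10 HU of independent noise, for about 14.1 HU total, consistent with the quadrature sum.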

  18. STABILITY OF THE NEUTRON DOSE DETERMINATION ALGORITHM FOR PERSONAL NEUTRON DOSEMETERS AT DIFFERENT RADON GAS EXPOSURES.

    PubMed

    Mayer, Sabine; Boschung, Markus; Butterweck, Gernot; Assenmacher, Frank; Hohmann, Eike

    2016-09-01

    Since 2008 the Paul Scherrer Institute (PSI) has been using a microscope-based automatic scanning system for assessing personal neutron doses with a dosemeter based on PADC. This scanning system, known as TASLImage, includes a comprehensive characterisation of tracks. The distributions of several specific track characteristics such as size, shape and optical density are compared with a reference set to discriminate tracks of alpha particles and non-track background. Due to the dosemeter design at PSI, it is anticipated that radon should not significantly contribute to the creation of additional tracks in the PADC detector. The present study tests the stability of the neutron dose determination algorithm of the personal neutron dosemeter system in operation at PSI at different radon gas exposures. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Automatic treatment plan re-optimization for adaptive radiotherapy guided with the initial plan DVHs.

    PubMed

    Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Graves, Yan Jiang; Gautier, Quentin; Mell, Loren; Zhou, Linghong; Jia, Xun; Jiang, Steve

    2013-12-21

    Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. In the re-planning process, a manual trial-and-error approach to fine-tuning planning parameters is time-consuming and usually considered impractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal intervention. In ART, prior information from the original plan is available, such as the dose-volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates the following two loops. The inner loop is the traditional fluence map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose, with a set of fixed voxel weighting factors. In the outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those in the original plan. The process is repeated until the DVH curves are acceptable or the maximum number of iterations is reached. The whole algorithm is implemented on a GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of the treatment course. Compared with the DVH curves in the original plan, the DVH curves in the resulting plan using our algorithm with 30 iterations are better for almost all structures. 
The re-optimization process takes about 30 s using our in-house optimization engine.
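
    The two-loop structure described above (fixed-weight quadratic fluence map optimization inside, DVH-driven voxel re-weighting outside) can be sketched as follows; the toy dose matrix, step sizes, and the specific weight-update rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def reoptimize(dose_matrix, prescribed, reference_dvh_dose,
               n_outer=30, n_inner=60, step=0.1):
    """Toy sketch of DVH-guided two-loop re-planning.

    Inner loop: gradient descent on the quadratic penalty
        0.5 * sum_i w_i * (D_i(x) - p_i)**2,  D = dose_matrix @ x, x >= 0.
    Outer loop: voxel weights w_i are increased where the current dose
    exceeds the dose received at the same DVH rank in the reference plan."""
    n_vox, n_beam = dose_matrix.shape
    w = np.ones(n_vox)
    x = np.zeros(n_beam)
    for _ in range(n_outer):
        for _ in range(n_inner):          # inner: fixed-weight quadratic FMO
            d = dose_matrix @ x
            grad = dose_matrix.T @ (w * (d - prescribed))
            x = np.maximum(x - step * grad, 0.0)   # keep beamlets nonnegative
        d = dose_matrix @ x
        rank = np.argsort(np.argsort(d))  # DVH comparison by sorted-dose rank
        excess = d - np.sort(reference_dvh_dose)[rank]
        w = w * (1.0 + 0.5 * np.clip(excess, 0.0, None))  # boost hot voxels
    return x
```
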

  20. Harmony search optimization for HDR prostate brachytherapy

    NASA Astrophysics Data System (ADS)

    Panchal, Aditya

    In high dose-rate (HDR) prostate brachytherapy, multiple catheters are inserted interstitially into the target volume. The process of treating the prostate involves calculating and determining the best dose distribution to the target and organs-at-risk by means of optimizing the time that the radioactive source dwells at specified positions within the catheters. It is the goal of this work to investigate the use of a new optimization algorithm, known as Harmony Search, in order to optimize dwell times for HDR prostate brachytherapy. The new algorithm was tested on 9 different patients and also compared with the genetic algorithm. Simulations were performed to determine the optimal value of the Harmony Search parameters. Finally, multithreading of the simulation was examined to determine potential benefits. First, a simulation environment was created using the Python programming language and the wxPython graphical interface toolkit, which was necessary to run repeated optimizations. DICOM RT data from Varian BrachyVision was parsed and used to obtain patient anatomy and HDR catheter information. Once the structures were indexed, the volume of each structure was determined and compared to the original volume calculated in BrachyVision for validation. Dose was calculated using the AAPM TG-43 point source model of the GammaMed 192Ir HDR source and was validated against Varian BrachyVision. A DVH-based objective function was created and used for the optimization simulation. Harmony Search and the genetic algorithm were implemented as optimization algorithms for the simulation and were compared against each other. The optimal values for Harmony Search parameters (Harmony Memory Size [HMS], Harmony Memory Considering Rate [HMCR], and Pitch Adjusting Rate [PAR]) were also determined. Lastly, the simulation was modified to use multiple threads of execution in order to achieve faster computational times. 
Experimental results show that the volume calculation implemented in this thesis was within 2% of the values computed by Varian BrachyVision for the prostate, within 3% for the rectum and bladder, and within 6% for the urethra. The calculated dose differed from BrachyVision by only 0.38%. Isodose curves were also generated and were found to be similar to those of BrachyVision. The comparison between Harmony Search and the genetic algorithm showed that Harmony Search was over 4 times faster when compared over multiple data sets. The optimal Harmony Memory Size was found to be 5 or lower, the Harmony Memory Considering Rate was determined to be 0.95, and the Pitch Adjusting Rate was found to be 0.9. Finally, the multithreading results showed that for computation-intensive tasks such as optimization and dose calculation, the threads of execution scale with the number of processors, achieving a speed increase proportional to the number of processor cores. In conclusion, this work showed that Harmony Search is a viable alternative to existing algorithms for HDR prostate brachytherapy optimization. Coupled with the optimal parameters for the algorithm and a multithreaded simulation, this combination can significantly decrease the time spent in the clinic on time-intensive optimization problems such as brachytherapy, IMRT, and beam-angle optimization.
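
    A generic Harmony Search minimizer exposing the three parameters studied above (HMS, HMCR, PAR) might look as follows; in the brachytherapy setting the variables would be dwell times and the objective a DVH-based cost. This is an illustrative sketch, not the thesis implementation.

```python
import random

def harmony_search(objective, n_dim, bounds, hms=5, hmcr=0.95, par=0.9,
                   bw=0.05, n_iter=2000, seed=0):
    """Minimize `objective` (list of floats -> cost) by Harmony Search.

    Each iteration builds a new harmony: per variable, with probability
    HMCR pick a value from memory (then with probability PAR pitch-adjust
    it by up to +/- bw), otherwise draw uniformly at random; the new
    harmony replaces the worst one in memory if it is better."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(n_dim)] for _ in range(hms)]
    costs = [objective(h) for h in memory]
    for _ in range(n_iter):
        new = []
        for d in range(n_dim):
            if rng.random() < hmcr:                  # memory consideration
                v = memory[rng.randrange(hms)][d]
                if rng.random() < par:               # pitch adjustment
                    v += bw * (2.0 * rng.random() - 1.0)
            else:                                    # random selection
                v = rng.uniform(lo, hi)
            new.append(min(max(v, lo), hi))
        c = objective(new)
        worst = max(range(hms), key=costs.__getitem__)
        if c < costs[worst]:                         # replace worst harmony
            memory[worst], costs[worst] = new, c
    best = min(range(hms), key=costs.__getitem__)
    return memory[best], costs[best]
```
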

  1. Evaluation of lens dose from anterior electron beams: comparison of Pinnacle and Gafchromic EBT3 film.

    PubMed

    Sonier, Marcus; Wronski, Matt; Yeboah, Collins

    2015-03-08

    Lens dose is a concern during the treatment of facial lesions with anterior electron beams. Lead shielding is routinely employed to reduce lens dose and minimize late complications. The purpose of this work is twofold: 1) to measure dose profiles under large-area lead shielding at the lens depth for clinical electron energies via film dosimetry; and 2) to assess the accuracy of the Pinnacle treatment planning system in calculating doses under lead shields. First, to simulate the clinical geometry, EBT3 film and 4 cm wide lead shields were incorporated into a Solid Water phantom. With the lead shield inside the phantom, the film was positioned at a depth of 0.7 cm below the lead, while a variable thickness of solid water, simulating bolus, was placed on top. This geometry was reproduced in Pinnacle to calculate dose profiles using the pencil beam electron algorithm. The measured and calculated dose profiles were normalized to the central-axis dose maximum in a homogeneous phantom with no lead shielding. The resulting measured profiles, functions of bolus thickness and incident electron energy, can be used to estimate the lens dose under various clinical scenarios. These profiles showed a minimum lead margin of 0.5 cm beyond the lens boundary is required to shield the lens to ≤ 10% of the dose maximum. Comparisons with Pinnacle showed a consistent overestimation of dose under the lead shield, with discrepancies of ~ 25% occurring near the shield edge. This discrepancy was found to increase with electron energy and bolus thickness and decrease with distance from the lead edge. Thus, the Pinnacle electron algorithm is not recommended for estimating lens dose in this situation. The film measurements, however, allow for a reasonable estimate of lens dose from electron beams and for clinicians to assess the lead margin required to reduce the lens dose to an acceptable level.

  2. Method for simulating dose reduction in digital mammography using the Anscombe transformation.

    PubMed

    Borges, Lucas R; Oliveira, Helder C R de; Nunes, Polyana F; Bakic, Predrag R; Maidment, Andrew D A; Vieira, Marcelo A C

    2016-06-01

    This work proposes an accurate method for simulating dose reduction in digital mammography starting from a clinical image acquired with a standard dose. The method developed in this work consists of scaling a mammogram acquired at the standard radiation dose and adding signal-dependent noise. The algorithm accounts for specific issues relevant in digital mammography images, such as anisotropic noise, spatial variations in pixel gain, and the effect of dose reduction on the detective quantum efficiency. The scaling process takes into account the linearity of the system and the offset of the detector elements. The inserted noise is obtained by acquiring images of a flat-field phantom at the standard radiation dose and at the simulated dose. Using the Anscombe transformation, a relationship is created between the calculated noise mask and the scaled image, resulting in a clinical mammogram with the same noise and gray level characteristics as an image acquired at the lower radiation dose. The performance of the proposed algorithm was validated using real images acquired with an anthropomorphic breast phantom at four different doses, with five exposures per dose and 256 nonoverlapping ROIs extracted from each image, as well as with uniform images. The authors simulated lower-dose images and compared these with the real images. The authors evaluated the similarity between the normalized noise power spectrum (NNPS) and power spectrum (PS) of simulated images and real images acquired with the same dose. The maximum relative error was less than 2.5% for every ROI. The added noise was also evaluated by measuring the local variance in the real and simulated images. The relative average error for the local variance was smaller than 1%. A new method is proposed for simulating dose reduction in clinical mammograms. In this method, the dependency between image noise and image signal is addressed using a novel application of the Anscombe transformation. 
NNPS, PS, and local noise metrics confirm that this method is capable of precisely simulating various dose reductions.
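
    The core idea, scaling the linearized image to the lower dose and then injecting the missing quantum noise in the Anscombe domain where it is approximately signal-independent, can be sketched as below. This simplification assumes Poisson-dominated noise and omits the flat-field-derived noise mask, detector offset/gain, and DQE modeling of the published method.

```python
import numpy as np

def anscombe(x):
    """Anscombe transform: approximately variance-stabilizes Poisson data
    so the transformed noise has ~unit variance."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (an exact unbiased inverse is more involved)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

def simulate_reduced_dose(image, dose_fraction, rng=None):
    """Simulate an acquisition at `dose_fraction` (r) of the original dose.

    `image` is assumed linearized, offset-free and Poisson-dominated.
    Scaling by r carries only ~r of the unit variance into the Anscombe
    domain, so the missing quantum noise there is Gaussian with
    variance (1 - r)."""
    if rng is None:
        rng = np.random.default_rng(0)
    r = dose_fraction
    y = anscombe(image * r)
    y_noisy = y + rng.normal(0.0, np.sqrt(1.0 - r), size=np.shape(image))
    return inverse_anscombe(y_noisy)
```

    For a Poisson image with mean 1000, a half-dose simulation should yield mean and variance both near 500, as expected for Poisson statistics at the reduced dose.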

  3. The effects of small field dosimetry on the biological models used in evaluating IMRT dose distributions

    NASA Astrophysics Data System (ADS)

    Cardarelli, Gene A.

    The primary goal in radiation oncology is to deliver lethal radiation doses to tumors while minimizing dose to normal tissue. IMRT has the capability to increase the dose to the targets and decrease the dose to normal tissue, increasing local control, decreasing toxicity, and allowing for effective dose escalation. This advanced technology does, however, present complex dose distributions that are not easily verified. Furthermore, the dose inhomogeneity caused by the non-uniform dose distributions seen in IMRT treatments has prompted the development of biological models attempting to characterize the dose-volume effect in the response of organized tissues to radiation. Dosimetry of small fields can be quite challenging when measuring dose distributions for the high-energy X-ray beams used in IMRT. Proper modeling of these small field distributions is essential to reproducing accurate dose for IMRT. This evaluation was conducted to quantify the effects of small field dosimetry on IMRT plan dose distributions and on four biological model parameters. The four biological models evaluated were: (1) the generalized Equivalent Uniform Dose (gEUD), (2) the Tumor Control Probability (TCP), (3) the Normal Tissue Complication Probability (NTCP) and (4) the Probability of uncomplicated Tumor Control (P+). These models are used to estimate local control, survival, complications and uncomplicated tumor control. This investigation compares three distinct small field dose algorithms. Dose algorithms were created using film, a small ion chamber, and a combination of ion chamber measurements and small field fitting parameters. Given the uncertainties in small field dosimetry and the dependence of biological models on dose-volume information, this examination quantifies the effects of small field dosimetry techniques on radiobiological models and recommends pathways to reduce the errors in using these models to evaluate IMRT dose distributions. 
This study demonstrates the importance of valid physical dose modeling prior to the use of biological modeling. The success of using biological function data, such as hypoxia, in clinical IMRT planning will greatly benefit from the results of this study.
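
    Of the four models, the gEUD has the simplest closed form, gEUD = (sum_i v_i * d_i^a)^(1/a) over the differential DVH; a small sketch:

```python
import numpy as np

def geud(doses, volumes, a):
    """Generalized equivalent uniform dose from a differential DVH:
    gEUD = (sum_i v_i * d_i**a)**(1/a), with fractional volumes v_i.
    Large positive `a` approaches the maximum dose (serial organs);
    a = 1 gives the mean dose; negative `a` emphasizes cold spots
    (appropriate for targets)."""
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()                      # normalize to fractional volumes
    d = np.asarray(doses, dtype=float)
    return (v * d ** a).sum() ** (1.0 / a)
```

    For two equal-volume bins at 2 and 4 Gy, a = 1 returns the mean (3 Gy), while a = 20 approaches the maximum dose.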

  4. Evaluation of the dose calculation accuracy for small fields defined by jaw or MLC for AAA and Acuros XB algorithms.

    PubMed

    Fogliata, Antonella; Lobefalo, Francesca; Reggiori, Giacomo; Stravato, Antonella; Tomatis, Stefano; Scorsetti, Marta; Cozzi, Luca

    2016-10-01

    Small field measurements are challenging due to the physical characteristics arising from the lack of charged particle equilibrium, the partial occlusion of the finite radiation source, and the detector response. These characteristics can be modeled in the dose calculations in treatment planning systems. The aim of the present work is to evaluate the MU calculation accuracy for small fields, defined by jaw or MLC, for the anisotropic analytical algorithm (AAA) and Acuros XB algorithms, relative to output measurements on the beam central axis. Single point output factor measurements were acquired with a PTW microDiamond detector for 6 MV beams and for 6 and 10 MV unflattened beams generated by a Varian TrueBeam STx equipped with a high-definition MLC. Fields defined by jaw or MLC apertures were set; jaw-defined: 0.6 × 0.6, 0.8 × 0.8, 1 × 1, 2 × 2, 3 × 3, 4 × 4, 5 × 5, and 10 × 10 cm2; MLC-defined: from 0.5 × 0.5 cm2 to the maximum field defined by the jaws, in 0.5 cm steps, with jaws set to 2 × 2, 3 × 3, 4 × 4, 5 × 5, and 10 × 10 cm2. MU calculations were obtained with a 1 mm grid in a virtual water phantom for the same fields, for the AAA and Acuros algorithms implemented in the Varian Eclipse treatment planning system (version 13.6). Configuration parameters such as the effective spot size (ESS) and the dosimetric leaf gap (DLG) were varied to find the best parameter setting. Differences between calculated and measured doses were analyzed. Agreement better than 0.5% was found for field sizes equal to or larger than 2 × 2 cm2 for both algorithms. A dose overestimation was present for smaller jaw-defined fields, with the best agreement, averaged over all energies, of 1.6% and 4.6% for a 1 × 1 cm2 field calculated by AAA and Acuros, respectively, for a configuration with ESS = 1 mm in both the X and Y directions for AAA, and ESS = 1.5 and 0 mm in the X and Y directions for Acuros. Conversely, a calculated dose underestimation was found for small MLC-defined fields, with the best agreement, averaged over all energies, of -3.9% and 0.2% for a 1 × 1 cm2 field calculated by AAA and Acuros, respectively, for a configuration with ESS = 0 mm in both directions for both algorithms. With optimal settings applied in the algorithm configuration phase, the agreement of Acuros calculations with measurements reached 3% for MLC-defined fields as small as 0.5 × 0.5 cm2. Similar agreement was found for AAA for fields as small as 1 × 1 cm2.

  5. Application of low-tube current with iterative model reconstruction on Philips Brilliance iCT Elite FHD in the accuracy of spinal QCT using a European spine phantom.

    PubMed

    Wu, Yan; Jiang, Yaojun; Han, Xueli; Wang, Mingyue; Gao, Jianbo

    2018-02-01

    To investigate the repeatability and accuracy of quantitative CT (QCT) measurement of bone mineral density (BMD) at low mAs using the iterative model reconstruction (IMR) technique, based on a phantom model. A European spine phantom (ESP) was selected and measured on the Philips Brilliance iCT Elite FHD machine 10 times. Data were transmitted to the QCT PRO workstation to measure the BMD (mg/cm3) of the ESP (L1, L2, L3). Scanning parameters: tube voltage of 120 kV; tube current-time products for the five groups A-E of 20, 30, 40, 50 and 60 mAs, respectively. Reconstruction: all data were reconstructed using filtered back projection (FBP), hybrid iterative reconstruction (iDose4, levels 1-6) and IMR (levels 1-3). ROIs were placed in the middle of the L1, L2 and L3 spine phantom sections in each group. CT values, noise and contrast-to-noise ratio (CNR) were measured and calculated. One-way analysis of variance (ANOVA) was used to compare BMD values across mAs settings and IMR levels. Radiation dose [volume CT dose index (CTDIvol) and dose-length product (DLP)] was positively correlated with tube current. In L1, with low BMD, different mAs settings under FBP showed P<0.05, indicating a statistically significant difference in ESP BMD. For the other iterative algorithms, different mAs settings under the same algorithm showed P>0.05, indicating no difference in BMD. P>0.05 was also observed among the BMD values of the spine phantom in L1, L2 and L3 under the same mAs combined with the various iterative reconstructions. The BMD in L1 varied greatly under FBP reconstruction, with less variation under IMR [1] and IMR [2]. The BMD of L2 varied more under FBP reconstruction, with less variation under IMR [2]. The BMD of L3 varied greatly under FBP reconstruction and varied less under all levels of iDose4 and under IMR [2]. In addition, as mAs increased, the CNRs of the various algorithms continued to increase. Among them, the CNR of the FBP algorithm was the lowest and the CNR of the IMR [3] algorithm the highest. Repeated QCT measurements of BMD in the ESP showed that BMD changes in L1-L3 varied least with the IMR [2] algorithm. It is therefore recommended to scan at 120 kV and 20 mAs combined with the IMR [2] algorithm. In this way, the BMD of the spine can be accurately measured by QCT while radiation dose is significantly reduced and imaging quality improved.

  6. A medical image-based graphical platform -- features, applications and relevance for brachytherapy.

    PubMed

    Fonseca, Gabriel P; Reniers, Brigitte; Landry, Guillaume; White, Shane; Bellezzo, Murillo; Antunes, Paula C G; de Sales, Camila P; Welteman, Eduardo; Yoriyaz, Hélio; Verhaegen, Frank

    2014-01-01

    Brachytherapy dose calculation is commonly performed using the Task Group-No 43 Report-Updated protocol (TG-43U1) formalism. Recently, a more accurate approach has been proposed that can handle tissue composition, tissue density, body shape, applicator geometry, and dose reporting either in media or water. Some model-based dose calculation algorithms are based on Monte Carlo (MC) simulations. This work presents a software platform capable of processing medical images and treatment plans, and preparing the required input data for MC simulations. The A Medical Image-based Graphical platfOrm-Brachytherapy module (AMIGOBrachy) is a user interface, coupled to the MCNP6 MC code, for absorbed dose calculations. The AMIGOBrachy was first validated in water for a high-dose-rate (192)Ir source. Next, dose distributions were validated in uniform phantoms consisting of different materials. Finally, dose distributions were obtained in patient geometries. Results were compared against a treatment planning system including a linear Boltzmann transport equation (LBTE) solver capable of handling nonwater heterogeneities. The TG-43U1 source parameters are in good agreement with literature with more than 90% of anisotropy values within 1%. No significant dependence on the tissue composition was observed comparing MC results against an LBTE solver. Clinical cases showed differences up to 25%, when comparing MC results against TG-43U1. About 92% of the voxels exhibited dose differences lower than 2% when comparing MC results against an LBTE solver. The AMIGOBrachy can improve the accuracy of the TG-43U1 dose calculation by using a more accurate MC dose calculation algorithm. The AMIGOBrachy can be incorporated in clinical practice via a user-friendly graphical interface. Copyright © 2014 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  7. Treatment intensification with insulin degludec/insulin aspart twice daily: randomized study to compare simple and step-wise titration algorithms.

    PubMed

    Gerety, Gregg; Bebakar, Wan Mohamad Wan; Chaykin, Louis; Ozkaya, Mesut; Macura, Stanislava; Hersløv, Malene Lundgren; Behnke, Thomas

    2016-05-01

    This 26-week, multicenter, randomized, open-label, parallel-group, treat-to-target trial in adults with type 2 diabetes compared the efficacy and safety of treatment intensification algorithms with twice-daily (BID) insulin degludec/insulin aspart (IDegAsp). Patients randomized 1:1 to IDegAsp BID used either a 'Simple' algorithm (twice-weekly dose adjustments based on a single pre-breakfast and pre-evening meal self-monitored plasma glucose [SMPG] measurement; IDegAsp[BIDSimple], n = 136) or a 'Stepwise' algorithm (once-weekly dose adjustments based on the lowest of 3 pre-breakfast and 3 pre-evening meal SMPG values; IDegAsp[BIDStepwise], n = 136). After 26 weeks, the mean change from baseline in glycated hemoglobin (HbA1c) with IDegAsp[BIDSimple] was noninferior to that with IDegAsp[BIDStepwise] (-15 mmol/mol versus -14 mmol/mol; 95% confidence interval [CI] upper limit, <4 mmol/mol) (baseline HbA1c: 66.3 mmol/mol IDegAsp[BIDSimple] and 66.6 mmol/mol IDegAsp[BIDStepwise]). The proportion of patients who achieved HbA1c <7.0% (<53 mmol/mol) at the end of the trial was 66.9% with IDegAsp[BIDSimple] and 62.5% with IDegAsp[BIDStepwise]. Fasting plasma glucose levels were reduced with each titration algorithm (-1.51 mmol/L IDegAsp[BIDSimple] versus -1.95 mmol/L IDegAsp[BIDStepwise]). Weight gain was 3.8 kg with IDegAsp[BIDSimple] versus 2.6 kg with IDegAsp[BIDStepwise], and rates of overall confirmed hypoglycemia (5.16 versus 8.93 episodes per patient-year of exposure [PYE]) and nocturnal confirmed hypoglycemia (0.78 versus 1.33 episodes per PYE) were significantly lower with IDegAsp[BIDStepwise] than with IDegAsp[BIDSimple]. There were no significant differences in insulin dose increments between groups. Treatment intensification with IDegAsp[BIDSimple] was noninferior to IDegAsp[BIDStepwise]. Both titration algorithms were well tolerated; however, the more conservative stepwise algorithm led to less weight gain and fewer hypoglycemic episodes. Clinicaltrials.gov: NCT01680341.
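
    The 'Simple' rule above maps a single pre-meal SMPG reading to a dose change; a sketch of that structure is below, with thresholds and dose steps that are hypothetical placeholders, not the trial's actual titration table.

```python
def simple_titration_adjustment(smpg_mmol_l):
    """Illustrative 'Simple'-style titration rule: one pre-meal SMPG reading
    (mmol/L) maps to a dose change in units. The target range and step
    sizes here are hypothetical placeholders for illustration only."""
    if smpg_mmol_l < 4.0:        # below target range: reduce dose
        return -2
    if smpg_mmol_l <= 5.0:       # within target range: no change
        return 0
    return 2                     # above target range: increase dose
```
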

  8. Pharmacogenomics of warfarin in populations of African descent.

    PubMed

    Suarez-Kurtz, Guilherme; Botton, Mariana R

    2013-02-01

    Warfarin is the most commonly prescribed oral anticoagulant worldwide, despite its narrow therapeutic index and the notorious inter- and intra-individual variability in the dose required for the target clinical effect. Pharmacogenetic polymorphisms are major determinants of warfarin pharmacokinetics and pharmacodynamics and are included in several warfarin dosing algorithms. This review focuses on warfarin pharmacogenomics in sub-Saharan peoples, African Americans and admixed Brazilians. These 'Black' populations differ in several aspects, notably in their extent of recent admixture with Europeans, a factor which impacts the frequency distribution of the pharmacogenomic polymorphisms relevant to warfarin dose requirements. Whereas a small number of polymorphisms in VKORC1 (3673G > A, rs9923231), CYP2C9 (alleles *2 and *3, rs1799853 and rs1057910, respectively) and arguably CYP4F2 (rs2108622) may capture most of the pharmacogenomic influence on warfarin dose variance in White populations, additional polymorphisms in these and other genes (e.g. CALU rs339097) increase the predictive power of pharmacogenetic warfarin dosing algorithms in the Black populations examined. A personalized strategy for initiation of warfarin therapy, allowing for improved safety and cost-effectiveness in populations of African descent, must take into account their pharmacogenomic diversity, as well as socioeconomic, cultural and medical factors. Accounting for this heterogeneity in algorithms that are 'friendly' enough to be adopted by warfarin prescribers worldwide requires gathering information from trials at different population levels, but also demands a critical appraisal of the racial/ethnic labels that are commonly used in the clinical pharmacology literature but do not accurately reflect genetic ancestry and population diversity. © 2012 The Authors. British Journal of Clinical Pharmacology © 2012 The British Pharmacological Society.
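
    Pharmacogenetic warfarin dosing algorithms of the kind discussed above typically take the form of a regression on the square root of the weekly dose, with decrements per variant allele; the sketch below shows that shape with hypothetical placeholder coefficients (published algorithms such as the IWPC model supply fitted values and further covariates, e.g. height, weight and amiodarone use).

```python
def pg_weekly_dose(age, vkorc1_a_alleles, cyp2c9_star2, cyp2c9_star3):
    """Illustrative shape of a pharmacogenetic dosing regression: a linear
    model on sqrt(weekly dose) with per-allele decrements. ALL coefficients
    below are hypothetical placeholders, not fitted values."""
    sqrt_dose = (5.6
                 - 0.01 * age                 # dose falls with age
                 - 0.8 * vkorc1_a_alleles     # VKORC1 -1639A alleles (0-2)
                 - 0.5 * cyp2c9_star2         # CYP2C9*2 alleles (0-2)
                 - 0.9 * cyp2c9_star3)        # CYP2C9*3 alleles (0-2)
    return max(sqrt_dose, 0.0) ** 2           # mg/week
```

    The square-root scale keeps predicted doses positive and compresses the effect of covariates at high doses, which is why several published algorithms adopt it.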

  9. Trajectory optimization for dynamic couch rotation during volumetric modulated arc radiotherapy

    NASA Astrophysics Data System (ADS)

    Smyth, Gregory; Bamber, Jeffrey C.; Evans, Philip M.; Bedford, James L.

    2013-11-01

    Non-coplanar radiation beams are often used in three-dimensional conformal and intensity modulated radiotherapy to reduce dose to organs at risk (OAR) by geometric avoidance. In volumetric modulated arc radiotherapy (VMAT) non-coplanar geometries are generally achieved by applying patient couch rotations to single or multiple full or partial arcs. This paper presents a trajectory optimization method for a non-coplanar technique, dynamic couch rotation during VMAT (DCR-VMAT), which combines ray tracing with a graph search algorithm. Four clinical test cases (partial breast, brain, prostate only, and prostate and pelvic nodes) were used to evaluate the potential OAR sparing for trajectory-optimized DCR-VMAT plans, compared with standard coplanar VMAT. In each case, ray tracing was performed and a cost map reflecting the number of OAR voxels intersected for each potential source position was generated. The least-cost path through the cost map, corresponding to an optimal DCR-VMAT trajectory, was determined using Dijkstra’s algorithm. Results show that trajectory optimization can reduce dose to specified OARs for plans otherwise comparable to conventional coplanar VMAT techniques. For the partial breast case, the mean heart dose was reduced by 53%. In the brain case, the maximum lens doses were reduced by 61% (left) and 77% (right) and the globes by 37% (left) and 40% (right). Bowel mean dose was reduced by 15% in the prostate only case. For the prostate and pelvic nodes case, the bowel V50 Gy and V60 Gy were reduced by 9% and 45% respectively. Future work will involve further development of the algorithm and assessment of its performance over a larger number of cases in site-specific cohorts.
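
    The least-cost-path step described above can be sketched with a plain Dijkstra search over a cost map whose columns are successive control points and whose rows are candidate couch angles; the grid geometry and the smoothness constraint here are illustrative assumptions, not the authors' exact formulation.

```python
import heapq

def least_cost_trajectory(cost_map, max_step=1):
    """Dijkstra search for a least-cost trajectory through `cost_map`
    (rows = candidate couch angles, columns = successive control points).
    From each node the path moves to the next column, changing row index
    by at most `max_step` (a smoothness constraint). Returns the row
    indices of the optimal trajectory and its total cost."""
    n_rows, n_cols = len(cost_map), len(cost_map[0])
    INF = float("inf")
    dist = [[INF] * n_cols for _ in range(n_rows)]
    prev = {}
    pq = []
    for r in range(n_rows):                      # any starting couch angle
        dist[r][0] = cost_map[r][0]
        heapq.heappush(pq, (dist[r][0], r, 0))
    while pq:
        d, r, c = heapq.heappop(pq)
        if d > dist[r][c] or c == n_cols - 1:    # stale entry or terminal
            continue
        for dr in range(-max_step, max_step + 1):
            nr = r + dr
            if 0 <= nr < n_rows:
                nd = d + cost_map[nr][c + 1]
                if nd < dist[nr][c + 1]:
                    dist[nr][c + 1] = nd
                    prev[(nr, c + 1)] = (r, c)
                    heapq.heappush(pq, (nd, nr, c + 1))
    end = min(range(n_rows), key=lambda r: dist[r][n_cols - 1])
    path, node = [], (end, n_cols - 1)
    while node:
        path.append(node[0])
        node = prev.get(node)
    return path[::-1], dist[end][n_cols - 1]
```
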

  10. Semimechanistic Bone Marrow Exhaustion Pharmacokinetic/Pharmacodynamic Model for Chemotherapy-Induced Cumulative Neutropenia.

    PubMed

    Henrich, Andrea; Joerger, Markus; Kraff, Stefanie; Jaehde, Ulrich; Huisinga, Wilhelm; Kloft, Charlotte; Parra-Guillen, Zinnia Patricia

    2017-08-01

    Paclitaxel is a commonly used cytotoxic anticancer drug with potentially life-threatening toxicity at therapeutic doses and high interindividual pharmacokinetic variability. Thus, drug and effect monitoring is indicated to control dose-limiting neutropenia. Joerger et al. (2016) developed a dose individualization algorithm based on a pharmacokinetic (PK)/pharmacodynamic (PD) model describing paclitaxel and neutrophil concentrations. The algorithm was prospectively compared against standard dosing in a clinical trial (Central European Society for Anticancer Drug Research Study of Paclitaxel Therapeutic Drug Monitoring; 365 patients, 720 cycles) but did not substantially improve neutropenia. This might be caused by misspecifications in the PK/PD model underlying the algorithm, in particular the lack of consideration of the observed cumulative pattern of neutropenia and of the platinum-based combination therapy, both of which impact neutropenia. This work aimed to externally evaluate the original PK/PD model for potential misspecifications and to refine it while considering the cumulative neutropenia pattern and the combination therapy. An underprediction was observed for the PK (658 samples), and the PK parameters were re-estimated using the original estimates as prior information. Neutrophil concentrations (3274 samples) were overpredicted by the PK/PD model, especially for later treatment cycles, when the cumulative pattern aggravated neutropenia. Three different modeling approaches (two from the literature and one newly developed) were investigated. The newly developed model, which implemented the bone marrow hypothesis semiphysiologically, was superior. This model further included an additive effect for the toxicity of carboplatin combination therapy. 
Overall, a physiologically plausible PK/PD model was developed that can be used for dose adaptation simulations and prospective studies to further improve paclitaxel/carboplatin combination therapy. Copyright © 2017 by The American Society for Pharmacology and Experimental Therapeutics.
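
    The semimechanistic model class referred to above is commonly built on the Friberg transit-compartment structure; a one-step Euler sketch of that generic structure follows (the paper's refined model adds a bone-marrow exhaustion component and a carboplatin effect term, both omitted here; all parameter values are illustrative).

```python
def friberg_step(state, conc, dt, ktr=0.05, circ0=5.0, gamma=0.2, slope=0.5):
    """One Euler step of the classic Friberg transit-compartment neutropenia
    model: a proliferating pool, three maturation transit compartments, and
    circulating neutrophils with feedback (circ0/circ)**gamma on
    proliferation. state = [prol, t1, t2, t3, circ]; `conc` is drug
    concentration with a linear inhibitory effect `slope`."""
    prol, t1, t2, t3, circ = state
    edrug = slope * conc                       # linear drug effect
    feedback = (circ0 / max(circ, 1e-9)) ** gamma
    dprol = ktr * prol * (1.0 - edrug) * feedback - ktr * prol
    dt1 = ktr * (prol - t1)                    # maturation chain
    dt2 = ktr * (t1 - t2)
    dt3 = ktr * (t2 - t3)
    dcirc = ktr * (t3 - circ)                  # circulating neutrophils
    derivs = [dprol, dt1, dt2, dt3, dcirc]
    return [s + dt * d for s, d in zip(state, derivs)]
```

    At baseline (all compartments at circ0 and no drug) the system is at steady state; any positive drug concentration first depresses the proliferating pool, and the dip propagates through the transit chain with delay, reproducing the lag between dosing and the neutrophil nadir.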

  11. Median prior constrained TV algorithm for sparse view low-dose CT reconstruction.

    PubMed

    Liu, Yi; Shangguan, Hong; Zhang, Quan; Zhu, Hongqing; Shu, Huazhong; Gui, Zhiguo

    2015-05-01

    It is known that lowering the X-ray tube current (mAs) or tube voltage (kVp) and simultaneously reducing the total number of X-ray views (sparse view) is an effective means to achieve low-dose computed tomography (CT) scanning. However, the image quality obtained with conventional filtered back-projection (FBP) usually degrades due to excessive quantum noise. Although sparse-view CT reconstruction via total variation (TV) regularization, in the scanning protocol of reduced X-ray tube current, has been demonstrated to achieve significant radiation dose reduction while maintaining image quality, noticeable patchy artifacts still exist in the reconstructed images. In this study, to address the problem of patchy artifacts, we propose a median prior constrained TV regularization that retains image quality by introducing an auxiliary vector m in register with the object. Specifically, the approximate action of m is to draw, in each iteration, each object voxel toward its own local median, aiming to improve low-dose image quality with sparse-view projection measurements. Subsequently, an alternating optimization algorithm is adopted to optimize the associated objective function. We refer to the median prior constrained TV regularization as "TV_MP" for simplicity. Experimental results on digital phantoms and a clinical phantom demonstrated that the proposed TV_MP with appropriate control parameters can not only ensure a higher signal-to-noise ratio (SNR) of the reconstructed image but also improve its resolution compared with the original TV method. Copyright © 2015 Elsevier Ltd. All rights reserved.
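
    The median-prior idea, an auxiliary image m that draws each voxel toward its local median between data-fidelity updates, can be sketched as below; the TV gradient step of the full TV_MP scheme is omitted for brevity, and the update rule is illustrative rather than the paper's exact alternating algorithm.

```python
import numpy as np

def local_median(img, k=3):
    """Voxel-wise median over a k x k neighborhood (edge-padded)."""
    p = k // 2
    padded = np.pad(img, p, mode="edge")
    shifts = [padded[i:i + img.shape[0], j:j + img.shape[1]]
              for i in range(k) for j in range(k)]
    return np.median(np.stack(shifts), axis=0)

def median_prior_reconstruct(A, b, shape, n_iter=50, beta=0.3):
    """Illustrative sparse-view reconstruction with a median prior:
    alternate a gradient step on the data fidelity ||Ax - b||^2 with a
    pull of each voxel toward the auxiliary image m, its 3 x 3 local
    median. A is the system (projection) matrix acting on the flattened
    image; b is the measured sinogram."""
    x = np.zeros(shape)
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 1e-12)   # safe step size
    for _ in range(n_iter):
        r = A @ x.ravel() - b                          # data-fidelity residual
        x = (x.ravel() - step * (A.T @ r)).reshape(shape)
        m = local_median(x)                            # auxiliary median image
        x = (1.0 - beta) * x + beta * m                # draw voxels toward m
        x = np.clip(x, 0.0, None)                      # nonnegativity
    return x
```
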

  12. ZEPrompt: An Algorithm for Rapid Estimation of Building Attenuation for Prompt Radiation from a Nuclear Detonation

    DTIC Science & Technology

    2014-01-01

    and 50 kT, to within 30% of the first-principles code (MCNP) for complicated cities and 10% for simpler cities. Subject terms: radiation transport. Contents include the use of MCNP for dose calculations, MCNP open-field absorbed dose calculations, and the MCNP urban model.

  13. Development and Evaluation of a New Air Exchange Rate Algorithm for the Stochastic Human Exposure and Dose Simulation Model

    EPA Science Inventory

    between-home and between-city variability in residential pollutant infiltration. This is likely a result of differences in home ventilation, or air exchange rates (AER). The Stochastic Human Exposure and Dose Simulation (SHEDS) model is a population exposure model that uses a pro...

  14. Impact of radiation attenuation by a carbon fiber couch on patient dose verification

    NASA Astrophysics Data System (ADS)

    Yu, Chun-Yen; Chou, Wen-Tsae; Liao, Yi-Jen; Lee, Jeng-Hung; Liang, Ji-An; Hsu, Shih-Ming

    2017-02-01

    The aim of this study was to quantify the difference between measured and calculated radiation attenuation obtained using two algorithms and to identify the influence of couch attenuation on patient dose verification. We performed eight tests of couch attenuation with two photon energies, two longitudinal couch positions, and two rail positions. The couch attenuation was calculated using a radiation treatment planning system, and the measured and calculated attenuations were compared. We also performed 12 verifications of head-and-neck and rectum cases using a Delta phantom. The dose deviation (DD), distance to agreement (DTA), and gamma index of the pencil-beam convolution (PBC) verifications were nearly the same. The agreement was least consistent for the anisotropic analytical algorithm (AAA) without the couch for the head-and-neck case, in which the DD, DTA, and gamma index were 74.4%, 99.3%, and 89%, respectively; for the rectum case, the corresponding values were 56.2%, 95.1%, and 92.4%. We suggest that dose verification be performed using all three metrics simultaneously: DD, DTA, and the gamma index.
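    The DD/DTA/gamma metrics above combine a dose-difference criterion with a distance-to-agreement criterion. A minimal 1-D global gamma pass-rate sketch (brute-force minimum search, illustrative 3%/3 mm criteria) might look like:

```python
import numpy as np

def gamma_pass_rate(ref, meas, coords, dd=0.03, dta=3.0):
    """Global gamma pass rate for 1-D dose profiles.
    dd: dose-difference criterion as a fraction of the reference maximum;
    dta: distance-to-agreement criterion in mm."""
    norm = dd * ref.max()
    passed = []
    for d_m, x_m in zip(meas, coords):
        # gamma at one measured point: minimum combined dose/distance metric
        g = np.sqrt(((ref - d_m) / norm) ** 2 +
                    ((coords - x_m) / dta) ** 2).min()
        passed.append(g <= 1.0)
    return float(np.mean(passed))
```

    Identical profiles pass everywhere (rate 1.0), while a uniformly scaled profile fails near the peak, which illustrates why gamma is reported alongside raw DD and DTA.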

  15. SU-F-J-198: A Cross-Platform Adaptation of An a Priori Scatter Correction Algorithm for Cone-Beam Projections to Enable Image- and Dose-Guided Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, A; Casares-Magaz, O; Elstroem, U

    Purpose: Cone-beam CT (CBCT) imaging may enable image- and dose-guided proton therapy, but is challenged by image artefacts. The aim of this study was to demonstrate the general applicability of a previously developed a priori scatter correction algorithm to allow CBCT-based proton dose calculations. Methods: The a priori scatter correction algorithm used a plan CT (pCT) and raw cone-beam projections acquired with the Varian On-Board Imager. The projections were initially corrected for bow-tie filtering and beam hardening and subsequently reconstructed using the Feldkamp-Davis-Kress algorithm (rawCBCT). The rawCBCTs were intensity normalised before rigid and deformable registrations were applied to map the pCTs to the rawCBCTs. The resulting images were forward projected onto the same angles as the raw CB projections. The two sets of projections were subtracted from each other, Gaussian and median filtered, then subtracted from the raw projections and finally reconstructed into the scatter-corrected CBCTs. For evaluation, water-equivalent path length (WEPL) maps (from anterior to posterior) were calculated on different reconstructions of three data sets (CB projections and pCT) of three parts of an Alderson phantom. Finally, single-beam spot-scanning proton plans (0–360 deg gantry angle in steps of 5 deg; using PyTRiP) treating a 5 cm central spherical target in the pCT were re-calculated on scatter-corrected CBCTs with identical targets. Results: The scatter-corrected CBCTs resulted in sub-mm mean WEPL differences relative to the rigid registration of the pCT for all three data sets. These differences were considerably smaller than those achieved with the regular Varian CBCT reconstruction algorithm (1–9 mm mean WEPL differences). Target coverage in the re-calculated plans was generally improved using the scatter-corrected CBCTs compared with the Varian CBCT reconstruction.
    Conclusion: We have demonstrated the general applicability of a priori CBCT scatter correction, potentially opening the way for CBCT-based image- and dose-guided proton therapy, including adaptive strategies. Research agreement with Varian Medical Systems, not connected to the present project.
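    The WEPL maps used for evaluation integrate relative stopping power along the beam direction. A toy sketch, under the simplifying assumptions of a uniform voxel grid and a parallel anterior-posterior beam traversing axis 0 of a 2-D slice:

```python
import numpy as np

def wepl_map(rsp, voxel_mm):
    """Water-equivalent path length seen by a parallel beam traversing
    axis 0 (anterior to posterior) of a 2-D relative-stopping-power slice:
    cumulative relative stopping power times voxel size, per column."""
    return rsp.sum(axis=0) * voxel_mm
```

    For water (relative stopping power 1.0) the WEPL simply equals the geometric path length; denser voxels inflate it proportionally, which is why scatter-induced CT-number errors translate directly into WEPL errors.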

  16. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, H; Tachibana, R

    2015-06-15

    Purpose: Lung SBRT planning has shifted to a volume prescription technique. However, point-dose agreement is still verified by independent dose verification at the secondary check. Volume dose verification is more affected by inhomogeneity correction than the point-dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with and without the correction and the SMU was assessed. Results: In the point-dose evaluation at the center of the GTV, the difference showed a systematic shift (4.5% ± 1.9%) in comparison with the AC with inhomogeneity correction; on the other hand, there was good agreement of 0.2% ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2% ± 5.1%) but also for the GTV (8.0% ± 5.1%) compared with the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5% ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
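    The mean-dose comparison above reduces to a one-line calculation once the two dose grids and a structure mask are available. A hedged sketch with illustrative names (not the SMU software's API):

```python
import numpy as np

def mean_dose_deviation(test_dose, ref_dose, mask):
    """Percent deviation of a verification calculation's mean structure dose
    from the reference (TPS) mean dose, e.g. inside a GTV or PTV mask."""
    t = test_dose[mask].mean()
    r = ref_dose[mask].mean()
    return 100.0 * (t - r) / r
```

    The same pattern evaluates either point dose (a single-voxel mask) or volume dose (a full-structure mask), which is the distinction the study investigates.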

  17. A new concept of pencil beam dose calculation for 40-200 keV photons using analytical dose kernels.

    PubMed

    Bartzsch, Stefan; Oelfke, Uwe

    2013-11-01

    The advent of widespread kV cone-beam computed tomography in image-guided radiation therapy and special therapeutic applications of keV photons, e.g., in microbeam radiation therapy (MRT), require accurate and fast dose calculations for photon beams with energies between 40 and 200 keV. Multiple photon scattering originating from Compton scattering and the strong dependence of the photoelectric cross section on the atomic number of the interacting tissue render these dose calculations far more challenging than those established for corresponding MeV beams. This is why analytical models of kV photon dose calculation developed so far fail to provide the required accuracy, and one has to rely on time-consuming Monte Carlo simulation techniques. In this paper, the authors introduce a novel analytical approach for kV photon dose calculations with an accuracy almost comparable to that of Monte Carlo simulations. First, analytical point-dose and pencil-beam kernels are derived for homogeneous media and compared to Monte Carlo simulations performed with the Geant4 toolkit. The dose contributions are systematically separated into contributions from the relevant orders of multiple photon scattering. Moreover, approximate scaling laws for the extension of the algorithm to inhomogeneous media are derived. The comparison of the analytically derived dose kernels in water showed excellent agreement with the Monte Carlo method: calculated values deviate by less than 5% from Monte Carlo derived dose values for doses above 1% of the maximum dose. The analytical structure of the kernels allows adaptation to arbitrary materials and photon spectra in the given energy range of 40-200 keV. The presented analytical methods can be employed in a fast treatment planning system for MRT. In convolution-based algorithms, dose calculation times can be reduced to a few minutes.
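    A convolution-based dose engine of the kind mentioned in the last sentence superposes a pencil-beam kernel over the field fluence. The direct-sum toy below (symmetric kernel assumed, so correlation equals convolution) is only a sketch of the principle, not the authors' analytical kernel model:

```python
import numpy as np

def pencil_beam_dose(fluence, kernel):
    """Dose plane as a 2-D convolution of the field fluence with a
    (symmetric) pencil-beam dose kernel, direct-sum implementation."""
    ky, kx = kernel.shape
    pad = np.pad(fluence, ((ky // 2, ky // 2), (kx // 2, kx // 2)))
    out = np.empty_like(fluence, dtype=float)
    for i in range(fluence.shape[0]):
        for j in range(fluence.shape[1]):
            # correlation equals convolution for a symmetric kernel
            out[i, j] = np.sum(pad[i:i + ky, j:j + kx] * kernel)
    return out
```

    A single-pencil (delta) fluence reproduces the kernel itself, and total energy is conserved for interior pencils; in practice the direct sum is replaced by an FFT-based convolution for speed.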

  18. SU-F-T-449: Dosimetric Comparison of Acuros XB, Adaptive Convolve in Intensity Modulated Radiotherapy for Head and Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uehara, R; Tachibana, H

    Purpose: Several publications have focused on dose calculation in lung for the new dose calculation algorithm Acuros XB (AXB). AXB could also contribute to dose calculation for high-density media such as bone and dental prostheses, rather than only in lung. We compared the dosimetric performance of AXB and Adaptive Convolve (AC) in head-and-neck IMRT plans. Methods: In a phantom study, the difference in depth profile between AXB and AC was evaluated using Kodak EDR2 film sandwiched between tough-water phantoms; a 6 MV x-ray beam from the TrueBeam was used for irradiation. In a patient study, 20 head-and-neck IMRT plans that had been clinically approved in Pinnacle3 were transferred to Eclipse. The dose distribution was recalculated using AXB in Eclipse while maintaining the AC-calculated monitor units and the MLC sequence planned in Pinnacle. Subsequently, the dose-volumetric data obtained using the two different calculation algorithms were compared. Results: In the phantom evaluation, the shallow area ahead of the build-up region showed over-dose for AXB and under-dose for AC, respectively. In the patient plans, AXB showed more hot spots than AC, especially around the high-density media, in terms of the PTV (max difference: 4.0%) and OARs (max difference: 1.9%). Compared with AC, there were larger dose deviations in steep dose-gradient regions and a higher skin dose. Conclusion: In head-and-neck IMRT plans, AXB and AC show different dosimetric performance for regions inside the target volume around high-density media, in steep dose-gradient regions, and at the skin surface. Even an inhomogeneous anthropomorphic phantom has limitations for assessing skin dose and complex anatomic conditions. Thus, there is the potential for an increase of hot spots in AXB, and an underestimation of dose at substance boundaries and in skin regions in AC.

  19. Direct plan comparison of RapidArc and CyberKnife for spine stereotactic body radiation therapy

    NASA Astrophysics Data System (ADS)

    Choi, Young Eun; Kwak, Jungwon; Song, Si Yeol; Choi, Eun Kyung; Ahn, Seung Do; Cho, Byungchul

    2015-07-01

    We compared the treatment planning performance of RapidArc (RA) vs. CyberKnife (CK) for spinal stereotactic body radiation therapy (SBRT). Ten patients with spinal lesions who had been treated with CK were re-planned with RA, which consisted of two complete arcs. Computed tomography (CT) and volumetric dose data of CK, generated using the Multiplan (Accuray) treatment planning system (TPS) and the Ray-trace algorithm, were imported into the Varian Eclipse TPS in DICOM format, and the data were compared with the RA plan by using an analytical anisotropic algorithm (AAA) dose calculation. The optimized dose priorities for both the CK and the RA plans were similar for all patients. The highest priority was to provide enough dose coverage to the planned target volume (PTV) while limiting the maximum dose to the spinal cord. Plan quality was evaluated with respect to PTV coverage, conformity index (CI), high-dose spillage, intermediate-dose spillage (R50% and D2cm), and maximum dose to the spinal cord, which are criteria recommended by the RTOG 0631 spine and 0915 lung SBRT protocols. The mean CI ± SD values of the PTV were 1.11 ± 0.03 and 1.17 ± 0.10 for RA and CK (p = 0.02), respectively. On average, the maximum dose delivered to the spinal cord in CK plans was approximately 11.6% higher than that in RA plans, and this difference was statistically significant (p < 0.001). High-dose spillages were 0.86% and 2.26% for RA and CK (p = 0.203), respectively. Intermediate-dose spillage characterized by D2cm was lower for RA than for CK; however, R50% was not statistically different. Even though both systems can create highly conformal volumetric dose distributions, the current study shows that RA demonstrates lower high- and intermediate-dose spillages than CK. Therefore, RA plans for spinal SBRT may be superior to CK plans.
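    The CI and R50% metrics referenced from the RTOG protocols are simple isodose-volume ratios. A voxel-counting sketch from a dose grid and a PTV mask (illustrative definitions; protocol details and exact metric definitions vary) might be:

```python
import numpy as np

def rtog_plan_metrics(dose, ptv_mask, rx):
    """RTOG-style conformity/spillage metrics from a dose grid and PTV mask:
    CI   = prescription-isodose volume / PTV volume,
    R50% = half-prescription-isodose volume / PTV volume."""
    v_ptv = ptv_mask.sum()
    ci = (dose >= rx).sum() / v_ptv
    r50 = (dose >= 0.5 * rx).sum() / v_ptv
    return ci, r50
```

    A CI near 1.0 indicates the prescription isodose tightly conforms to the target, while a large R50% flags intermediate-dose spillage into surrounding tissue.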

  20. A comparison of TPS and different measurement techniques in small-field electron beams.

    PubMed

    Donmez Kesen, Nazmiye; Cakir, Aydin; Okutan, Murat; Bilge, Hatice

    2015-01-01

    In recent years, small-field electron beams have been used for the treatment of superficial lesions, which requires small circular fields. However, when using very small electron fields, significant dosimetric problems may occur. In this study, dose distributions and outputs of circular fields with diameters of 5 cm and smaller, for nominal energies of 6, 9, and 15 MeV from the Siemens ONCOR linac, were measured and compared with data from a treatment planning system that uses the pencil-beam algorithm for electron beam calculations. All dose distribution measurements were performed using Gafchromic EBT film; these measurements were compared with data obtained from the Computerized Medical Systems (CMS) XiO treatment planning system (TPS), using the gamma-index method in the PTW VeriSoft software program. Output measurements were performed using Gafchromic EBT film, an Advanced Markus ion chamber, and thermoluminescent dosimetry (TLD). Although the pencil-beam algorithm is used to model electron beams in many clinics, there is little detailed information in the literature about its use. As the field size decreased, the point of maximum dose moved closer to the surface. The output factors measured with the different detectors were consistent with one another; however, their differences from the TPS values reached a maximum of 42% for 6 and 15 MeV and 32% for 9 MeV. When the dose distributions from the TPS were compared with the Gafchromic EBT film measurements, the results were consistent for fields of 2 cm diameter and larger, but the outputs for fields of 1 cm diameter and smaller were not. In CMS XiO TPS calculations using the pencil-beam algorithm, the dose distributions of electron treatment fields created with a 1-cm-diameter circular cutout were not appropriate for patient treatment, and the pencil-beam algorithm is not suitable for monitor unit (MU) calculations in small-field electron dosimetry.
Copyright © 2015 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
